The DMPTool 2 project has two advisory boards: one for researchers and one for administrative users, such as librarians, research offices, and IT professionals. The role of both is to provide feedback and guidance on the development of the DMPTool as well as on our outreach and communication efforts. May was a busy month for both sets of board members, as we held our second (virtual) meeting with each advisory board. We thought we’d highlight here some of the things we talked about at these informative and enlightening meetings.
For both boards, we shared examples of the functionality, roles, and wireframes under development for the next phase of the DMPTool. For both groups we highlighted the new functionality that will allow researchers to truly collaborate with others on the development of a DMP. For the Administrative Advisory Board, we focused in large part on the new functionality for institutions: the ability to customize resource templates (such as local links and help text), requirement templates for institutions that need to set up their own DMP requirements, and new types of institutional roles such as editors. This was our first chance to share these features with people outside of the project team, and we were thrilled to hear a positive response to these developments, as well as feedback and suggestions. (Also, watch this space, as we’ll be highlighting much of this functionality over the next few weeks!)
In response to questions from the first meetings with the advisory boards, we began to investigate some of the tool’s usage patterns in more depth. While we have always collected usage statistics, we haven’t done much in-depth exploration or segmentation of them – for example, tracking the number of repeat users or understanding spikes in usage of the tool. Working with this data and hearing the questions from board members is helping us better understand not only the types of data that would interest the institutions that use the tool, but also how to begin measuring the tool’s impact. For example, is the number of repeat users within a year a strong metric of success? How many researchers apply multiple times for funding within a year? The advisory boards are helping us think critically about these issues.
We are lucky to have such engaged advisory boards, and we encourage you to share your thoughts with board members as well as with us directly!