DMPTool unavailable for several hours, Saturday Jan 12 and Sunday Jan 13

Due to system maintenance, the DMPTool will be unavailable for periods on Saturday January 12 2013 and Sunday January 13 2013.

  • Saturday, January 12, 2013, 10:00am to 2:00pm PST (18:00 to 22:00 UTC)
  • Sunday, January 13, 2013, 1:00pm to 4:00pm PST (21:00 to 00:00 UTC, January 14)
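As a sanity check on the announced times (not part of the announcement itself), the PST-to-UTC conversion can be verified with Python's standard zoneinfo module; January falls in Pacific Standard Time, which is UTC-8:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Maintenance windows as announced, in US Pacific time (PST in January).
pacific = ZoneInfo("America/Los_Angeles")
windows = [
    (datetime(2013, 1, 12, 10, 0, tzinfo=pacific),   # Sat 10:00am PST
     datetime(2013, 1, 12, 14, 0, tzinfo=pacific)),  # Sat  2:00pm PST
    (datetime(2013, 1, 13, 13, 0, tzinfo=pacific),   # Sun  1:00pm PST
     datetime(2013, 1, 13, 16, 0, tzinfo=pacific)),  # Sun  4:00pm PST
]

for start, end in windows:
    # Convert each endpoint to UTC; note the Sunday window ends at
    # 00:00 UTC on January 14, since 4:00pm PST + 8 hours crosses midnight.
    utc_start = start.astimezone(timezone.utc)
    utc_end = end.astimezone(timezone.utc)
    print(utc_start.strftime("%Y-%m-%d %H:%M UTC"), "->",
          utc_end.strftime("%Y-%m-%d %H:%M UTC"))
```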

During this time, the website will be unavailable and access to data management plans will be blocked.

This maintenance window is necessary to upgrade core infrastructure in the California Digital Library’s production data center. It is possible that service to the DMPTool will be restored before the scheduled end of the outage, but this cannot be guaranteed.

Please contact us at uc3@ucop.edu with any questions or concerns. We apologize for the inconvenience.

New article in International Journal of Digital Curation

The DMPTool and DMPOnline teams are pleased to announce a new article outlining the visions, strategies, and future developments for the respective projects. Access it here: http://www.ijdc.net/index.php/ijdc/article/view/225

Citation: Sallans, A., & Donnelly, M. (2012). DMP Online and DMPTool: Different Strategies Towards a Shared Goal. International Journal of Digital Curation, 7(2), 123–129.

Abstract:

This paper provides a comparative discussion of the strategies employed in the UK’s DMP Online tool and the US’s DMPTool, both designed to provide a structured environment for research data management planning (DMP) with explicit links to funder requirements. Following the Sixth International Digital Curation Conference, held in Chicago in December 2010, a number of US institutions partnered with the Digital Curation Centre’s DMP Online team to learn from their experiences while developing a US counterpart. DMPTool arrived in beta in August 2011 and released a production version in November 2011. This joint paper will compare and contrast use cases, organizational and national/cultural characteristics that have influenced the development decisions, outcomes achieved so far, and planned future developments.

DMPTool Demo at AGU 2012

If you are headed to the AGU meeting next week, consider attending the “Data Management 101 for the Early Career Scientist” workshop being run by ESIP. In addition to getting lots of great information on how best to manage your data, Amber Budden from DataONE will be demoing the DMPTool.

When: Tuesday December 4th, 12:20-1:40 PM
Where: Marriott Marquis, Salons 5-6, San Francisco

Abstract:
Don’t let the changing face of science leave you behind. The proliferation of data from all sources requires today’s scientists to be knowledgeable about sound data management practices. In this free workshop, everything the modern data-producing scientist needs to know about data management will be introduced. We will discuss the rationale for data management, provide an overview for managing science data, and demonstrate use of two popular data management planning tools from IEDA and DataONE. This workshop is open to all AGU attendees. Students and early career scientists are especially encouraged to attend. For details on the outline and for links to all of our Data Management Short Course modules see: http://esipfed.org/AGUDataManagement101

New funder: Gulf of Mexico Research Initiative

The Gulf of Mexico Research Initiative (GoMRI) studies the effects of oil and dispersant on the ecosystems of the Gulf of Mexico, and offers research grants to scientists investigating the environmental impacts and public health implications of petroleum pollution. As part of these grants, GoMRI requires researchers to provide data management plans. The DMPTool now supports GoMRI’s data management requirements: you’ll see the Gulf of Mexico Research Initiative included in the dropdown list of funders when you log in. As part of this initiative, GoMRI is creating a research database where scientists can deposit their data, called the Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC). As they state:

Scientists and researchers of GoMRI are encouraged to submit their data onto recognized national data centers. However, if one cannot be identified, GRIIDC will host these data for long-term archival and to facilitate data discovery.

They will also provide other services, such as generating DOIs and maintaining a data registry.

DMPTool monthly report, Oct 2012

Use stats

October was our biggest month ever. 375 new users logged into the DMPTool in October, and 336 plans were created. There is now a total of 3,466 users, and they’ve created almost 3,000 plans. Two more universities customized the DMPTool for their researchers, bringing the total to 28, and 65 institutions have configured their campus single sign-on for the DMPTool. See the map of our participating organizations: http://bit.ly/L85sKj

As the graph at the bottom shows, the most DMPs are created for these funders:

  • NIH
  • NSF Biological Sciences directorate
  • NSF Directorate for Social, Behavioral and Economic Sciences

Here’s a graph of overall use through Oct 2012:

Also, here’s a graph of use by funder through Oct 2012:

NSF will consider research data as important as publications

The NSF has released a new Proposal and Award Policies and Procedures Guide (PAPPG), effective January 14, 2013. This guide includes updates to the Grant Proposal Guide (GPG) as well as the Award and Administration Guide (AAG).

Among the various changes is a significant one to the Biographical Sketches portion of the proposal. Researchers are no longer asked to list just their “relevant publications”; after January, they will be asked to list their “relevant products.” To quote the new guide:

Chapter II.C.2.f(i)(c), Biographical Sketch(es), has been revised to rename the “Publications” section to “Products” and amend terminology and instructions accordingly. This change makes clear that products may include, but are not limited to, publications, data sets, software, patents, and copyrights.

As Carly Strasser points out, this is “great news for those of us trying to get data the recognition it deserves.” Sherry Lake of the University of Virginia, resident requirements analyst here at the DMPTool, has examined the new requirements and determined that no changes will be needed to the DMPTool. Any data management plan that you create using the DMPTool today will meet these new NSF requirements. We are working to stay abreast of developments in the rapidly changing area of data management and US federal agencies.

DMPTool monthly report, Sept 2012

News

There was a service outage on September 19–20. The DMPTool service was unavailable from approximately 6:30pm on Sept 19 until 9am on Sept 20. Among other problems, we did not have adequate monitoring in place, so the data center staff were unaware of the outage. We have corrected this and now have 24/7 monitoring in place.

Use stats

We continue to add users: 250 new users logged into the DMPTool in September. There is now a total of almost 3,100 users, and they’ve created 2,645 plans. Three more universities customized the DMPTool for their researchers, bringing the total to 26, and over 60 institutions have configured their campus single sign-on for the DMPTool. See the map of our participating organizations: http://bit.ly/L85sKj

Here’s a graph of overall use through Sept 2012:

New publication

Starr, Joan; Willett, Perry; Federer, Lisa; Horning, Claudia; and Bergstrom, Mary Linn (2012) “A Collaborative Framework for Data Management Services: The Experience of the University of California,” Journal of eScience Librarianship: Vol. 1: Iss. 2, Article 7. doi:10.7191/jeslib.2012.1014
Available at: http://escholarship.umassmed.edu/jeslib/vol1/iss2/7

This article describes the full suite of services provided by CDL (including the DMPTool) in the context of the research data lifecycle.

DMPTool unavailable Sunday Sept 23

Due to system maintenance, the DMPTool will be unavailable from 12am (midnight) until 9:00am (PDT) Sunday September 23 2012 (07:00 to 16:00 UTC). During this time, the website will be unavailable and access to data management plans will be blocked.

This maintenance window is necessary to upgrade core infrastructure in the California Digital Library’s production data center, and will affect several CDL services, including the DMPTool, the Merritt Repository Service, and the Unified Digital Format Registry (UDFR). It is possible that service to the DMPTool will be restored before the scheduled end of the outage, but this cannot be guaranteed.

Please contact us at uc3@ucop.edu with any questions or concerns. We apologize for the inconvenience.

DMPTool user stats, August 2012

Total use of the DMPTool as of August 31, 2012:

  • 2841 unique users
  • 2448 data management plans
  • 62 institutions using single sign-on
  • 25 institutions that have customized the DMPTool for their users

A couple of things caught my eye. First, for the first time there were more plans created in a single month (253) than first-time users (226). I think this means that users are coming back to create more plans, which would be a good sign. Second, there is a small but growing number of users from outside the US, including Europe and Asia. Since the DMPTool supports only US grants so far, I suppose they’re using it to create data management plans for other purposes.

We now have participating organizations from 30 states and the District of Columbia. See the map.

Here’s a graph showing overall use as of the end of August:

De-Mystifying Data Management Requirements

This week the DMPTool Partners are pleased to offer a guest post from the authors of a recent article on data management plan requirements in Issues in Science and Technology Librarianship. In the post below, Dianne Dietrich (Cornell), Trisha Adamus (Syracuse), Alison Miner (Syracuse), and Gail Steinhart (Cornell) offer a summary and a bit of perspective on their recent article, and what it means to the community of interest around data management. Let us know what you think. – Andrew Sallans

Contributed by Dianne Dietrich (Cornell), Trisha Adamus (Syracuse), Alison Miner (Syracuse), and Gail Steinhart (Cornell)

Those of us who have run information sessions on the NSF Data Management Plan requirement (http://www.nsf.gov/eng/general/dmp.jsp) have probably heard the participants express some level of anxiety about its implications. Perhaps you’re familiar with comments like these: Will I be required to make all of my data available on the web in perpetuity? Where will I put all of this data? Who will pay for storage after the grant is finished? (http://data.research.cornell.edu/nsf-data-management-plan-faq) The answers to these questions aren’t always straightforward, especially since the NSF requirements are relatively general.

The NSF requirement isn’t the only one researchers face, however: many federal funding agencies have data management requirements for PIs. What does this landscape look like? What can we learn from examining a range of data policies? In investigating over two dozen funder policies, we observed that many were quite general, like the NSF-wide policy, but others were more specific about aspects such as data publication options. The more specific policies tended to come from units within agencies (http://science.nasa.gov/earth-science/earth-science-data/data-information-policy/), and this makes sense: a closer connection to a discipline provides the opportunity to give more concrete examples of accepted metadata standards, for instance. Strong policies can provide researchers with the tools to share data in ways that make sense for them and their research, while vaguer policies might seem limiting. For instance, we noted that few policies provided a thorough description of embargoes, an option that might alleviate some concerns about sharing research data openly.

Of course, this landscape is continually evolving. Several of the policies we looked at had been around for a number of years and had been revised one or more times. One way to approach a consultation on writing a data management plan is to tell researchers to start by describing their current data management practices, look to see how what they’re already doing aligns with their funder’s data policy, and then plan to fill the gaps. The more information funders receive about what actual needs are, the better they will be able to understand the gaps in this area, and the better positioned they will be to adapt policies to the needs of their research communities. We hope that our survey helps both communities as data management policies evolve.

De-Mystifying the Data Management Requirements of Research Funders
by Dianne Dietrich, Cornell University, Trisha Adamus, Syracuse University, Alison Miner, Syracuse University, and Gail Steinhart, Cornell University