Introduction

Data sharing and data management are becoming increasingly important topics. Evidence of their benefits continues to accumulate, such as increased citation rates for research papers with associated shared datasets (; ). A growing number of funding bodies, such as the NIH and the Wellcome Trust (; ), as well as several journals (), have introduced policies that require research data to be shared (). To be able to share data, both now and in the future, datasets not only need to be preserved, but also need to be comprehensible and usable for others. To ensure these qualities, research data needs to be managed (), and data repositories can play a role in maintaining the data in a usable structure (). However, using a data repository does not guarantee that the data is usable, since not every repository uses the same procedures and quality metrics, such as applying proper metadata tags (). As many repositories have not yet adopted generally accepted standards, it can be difficult for researchers to choose the right repository for their dataset ().

Several organisations, including funding agencies, academic publishers, and data organisations, provide researchers with lists of supported or recommended repositories, e.g. BioSharing (). These lists vary in length, in the number and type of repositories they include, and in their selection criteria for recommendation. In addition, recommendations for data and data sharing are emerging, such as the FAIR Data Principles: guidelines to establish a common ground for all data to be Findable, Accessible, Interoperable, and Reusable (). Some data repositories, such as the UK Data Service (), are beginning to incorporate the FAIR principles into their policies, as are several funders, such as the EU Horizon 2020 programme and the NIH (; ). Lists of recommended repositories and guidelines such as these can help researchers decide how and where to store and share their data.

In addition to lists of recommended repositories, there are a number of schemes that specifically certify the quality of data repositories. One of the first of these certification schemes is the Data Seal of Approval (DSA), with the objective ‘to safeguard data, to ensure high quality and to guide reliable management of data for the future without requiring the implementation of new standards, regulations or high costs’ (). Building upon the DSA certification, but with more elaborate and detailed guidelines (), are the Network of Expertise in Long-Term Storage of Digital Resources (NESTOR) and the ISO 16363 standard/Trusted Data Repository (TDR). DSA, NESTOR, and TDR form a three-step framework for data repository certification (). The ICSU-WDS membership incorporates guidelines from the DSA, NESTOR, and Trustworthy Repositories Audit & Certification (TRAC), among others, for its data repository framework (). Furthermore, the TRAC guidelines were used as a basis for the ISO 16363/TDR guidelines ().

Given the multitude of recommendations and certification schemes, we set out to map the current landscape, to compare criteria, and to analyse which repositories are recommended and certified by different parties. This paper is structured as follows: first, we investigate which repositories have been recommended and certified by different organisations. Next, we analyse the criteria used by organisations recommending repositories and the criteria used by certification schemes, and derive a set of shared criteria for recommendation and certification. Lastly, we explore what this tells us about the overlap between recommendations and certifications.

Methods

1. Lists of repositories

To examine which repositories are being recommended, we looked at the recommendations of 17 different organisations, including academic publishers, funding agencies, and data organisations. These are all the recommendation lists currently available on the BioSharing website under the Recommendations tab (), together with those found through a web search using the term “recommended data repositories”. The lists were compiled by the American Geophysical Union (AGU n.d.), BBSRC (), BioSharing (), COPDESS (), DataMed (), Elsevier (), EMBO Press (), F1000Research (), GigaScience (), NIH (), PLOS (), Scientific Data (), Springer Nature/BioMed Central (which share the same list) (), Web of Science (), Wellcome Trust (), and Wiley (Wiley n.d.). All lists, including links to the online lists, were compiled into one list to compare recommendations (https://doi.org/10.17632/zx2kcyvvwm.1). Not all data repositories indexed by the Web of Science’s Data Citation Index (DCI) were included, as there is no publicly available list of all repositories indexed by the DCI; recommended repositories were therefore retrieved through individual searches. The repositories indexed by Re3Data were not included in our list of recommended repositories, as Re3Data functions as “a global registry of research data repositories” () and thus does not recommend repositories. However, Re3Data was used to verify each repository’s status, persistent identifiers, and obtained certifications.
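A minimal Python sketch of this compilation step is shown below. It is not the workflow used to produce the published dataset; the file and column names are assumptions for illustration only.

```python
# A minimal sketch (not the authors' actual workflow) of the compilation step:
# merge per-organisation recommendation lists into one presence/absence table
# and count how often each repository is recommended. File and column names
# are assumptions; Springer Nature and BioMed Central share a single list.
import pandas as pd

ORGANISATIONS = [
    "AGU", "BBSRC", "BioSharing", "COPDESS", "DataMed", "Elsevier",
    "EMBO_Press", "F1000Research", "GigaScience", "NIH", "PLOS",
    "Scientific_Data", "SpringerNature_BMC", "Web_of_Science",
    "Wellcome_Trust", "Wiley",
]

frames = []
for org in ORGANISATIONS:
    # Hypothetical input: one CSV per organisation with a single
    # "repository" column listing the recommended repositories.
    df = pd.read_csv(f"{org}.csv")
    df["organisation"] = org
    frames.append(df)

long_table = pd.concat(frames, ignore_index=True)

# One row per repository, one column per organisation (1 = recommended).
matrix = pd.crosstab(long_table["repository"], long_table["organisation"])

# Number of organisations recommending each repository.
matrix["n_recommendations"] = matrix.sum(axis=1)
print(matrix.sort_values("n_recommendations", ascending=False).head(10))
```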

1.2 Certified repositories

For our analysis of data repository certification schemes, we examined five schemes: the DSA, ICSU-WDS, NESTOR, TRAC, and ISO 16363/TDR. These schemes were chosen because they are either used for certification (DSA, ICSU-WDS, NESTOR, and ISO 16363/TDR) or used as a self-assessment check for repositories (TRAC). The DSA (), ICSU-WDS (), and NESTOR () all provide lists of certified repositories on their respective websites; ISO 16363/TDR certification has not yet been awarded (). We consulted the websites of the five certification schemes to see which repositories they had certified, and compiled the results into one list (https://doi.org/10.17632/zx2kcyvvwm.1).

After composing the list of recommended repositories, we investigated which criteria are used to determine a recommendation or certification, and whether there is overlap between recommended and certified repositories and between the criteria used (Figure 1).

  • 2.1 Compiled list of criteria for recommendation
  • 2.2 Clustered criteria into the “Recommended Criteria Cluster”
  • 2.3 Compiled list of criteria for certification
  • 2.4 Clustered criteria into the “Certification Criteria Cluster”
  • 3.2 Compared and merged steps 2.2 and 2.4 to create the umbrella categories
Figure 1 

Flowchart of the methodology used to create the umbrella categories.

These steps will be discussed in turn.

2. Criteria used for recommendation and certification

2.1 Criteria for recommendation

To understand the motivation behind specific recommendations, we looked at the organisations’ selection criteria for their lists of recommended repositories. Four of the 17 organisations published such criteria online alongside their lists: BioCADDIE (DataMed), F1000Research, Scientific Data/Springer Nature (SD/SN), and Web of Science (WoS). The criteria for recommended repositories of the Research Data Alliance (RDA) () were also included in this analysis. Although the RDA does not maintain a list of recommended repositories, we included its criteria to balance the number of organisations providing recommendation criteria against the number of certification schemes providing certification criteria. The criteria of the five organisations were then compiled into one list (https://doi.org/10.17632/zx2kcyvvwm.1).

We categorised the criteria into 15 subheadings: Recognition, Mission, Transparency, Certification, Interface, Legal, Access, Structure, Retrievability, Preservation/Persistence, Curation, Persistent Identifier, Citability, Language, and Diversity of Data. These subheadings were derived from recurring and shared subjects across the different criteria lists. We then filtered out repetitions and criteria unique to a single organisation (namely Language and Diversity of Data), leaving 13 subheadings, to create the “Recommended Criteria Cluster”. A sketch of this filtering step follows.
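The sketch below illustrates the filtering step. The organisation-to-subheading mapping is hypothetical and heavily abbreviated; it only shows how subheadings that occur in a single list are dropped.

```python
# A minimal sketch of the filtering step: subheadings that occur in only one
# organisation's criteria list (e.g. Language, Diversity of Data) are dropped.
# The mapping below is hypothetical and heavily abbreviated.
from collections import Counter

criteria_by_org = {
    "BioCADDIE":     {"Recognition", "Access", "Persistent Identifier"},
    "F1000Research": {"Access", "Preservation/Persistence", "Citability"},
    "SD/SN":         {"Recognition", "Curation", "Preservation/Persistence"},
    "WoS":           {"Access", "Citability", "Diversity of Data"},
    "RDA":           {"Certification", "Persistent Identifier", "Language"},
}

# Count in how many organisations' lists each subheading appears.
counts = Counter(sub for subs in criteria_by_org.values() for sub in subs)

# Keep only subheadings shared by at least two organisations.
recommended_criteria_cluster = sorted(s for s, n in counts.items() if n >= 2)
print(recommended_criteria_cluster)
# -> ['Access', 'Citability', 'Persistent Identifier',
#     'Preservation/Persistence', 'Recognition']
```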

2.3 Criteria for certification

We consulted the relevant websites to obtain the criteria used by the DSA, ICSU-WDS, NESTOR, TRAC, and ISO 16363/TDR certification schemes and compiled a list of all certification criteria of the five schemes. The criteria for the DSA (), ICSU-WDS (), and NESTOR () were found on their respective websites. The criteria for TRAC were found on the website of the Center for Research Libraries (CRL) (), and the criteria for ISO 16363/TDR were found through the Primary Trustworthy Digital Repository Authorisation Body, on the website of the Consultative Committee for Space Data Systems ().

2.4 Certification Criteria Cluster

These criteria were categorised into 14 subheadings: Recognition, Mission, Transparency, Certification, Interface, Legal, Access, Indexation, Structure, Retrievability, Preservation/Persistence, Curation, Persistent Identifier, and Citability. These subheadings were derived from recurring and shared subjects across the different lists. Criteria that did not match any of the 14 subheadings because they were too specific to a single scheme were grouped under a “miscellaneous” subheading. The subheadings were then reorganised, removing repetitions and criteria unique to one certification scheme, and reworded into 11 categories: Community, Mission, Providence, Organisation, Technical Structure, Legal and Contractual Compliance, Accessibility, Data Quality, Retrievability, Responsiveness, and Preservation. We named this list the “Certification Criteria Cluster”.

To see whether there was overlap between the lists of recommended and certified repositories, we gathered the lists of repositories issued by the certifying organisations and compiled these results into one list (available at https://doi.org/10.17632/zx2kcyvvwm.1). We then calculated the number of times each repository was recommended by the different organisations, as well as the percentage of recommended repositories with and without certification.
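A minimal sketch of this overlap calculation is given below, using placeholder repository names; the actual lists are available in the compiled dataset cited above.

```python
# A minimal sketch of the overlap calculation. The repository names here are
# placeholders; the actual lists are in the compiled dataset cited above.
recommended = {"ArrayExpress", "GenBank", "PANGAEA", "ICPSR", "FlyBase"}
certified = {
    "DSA": {"ICPSR", "SomeArchive"},
    "ICSU-WDS": {"ICPSR", "AnotherDataCentre"},
}

# Repositories holding at least one certification.
certified_any = set().union(*certified.values())
overlap = recommended & certified_any

pct = 100 * len(overlap) / len(recommended)
print(f"{len(overlap)} of {len(recommended)} recommended repositories "
      f"({pct:.1f}%) hold a certification")

# With the counts reported in the Results (13 certified out of 242
# recommended), the same calculation gives 13 / 242 ≈ 5.4%.
```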

3.2 Criteria of recommendation and certification

We compared the Recommended Criteria Cluster and the Certification Criteria Cluster by looking at commonalities and recurrences between the two sets of broader headings and their constituents. We matched headings such as Community with Recognition, Access with Accessibility, and Technical Structure with Interface, to derive a higher-order cluster of categories, which we call “umbrella categories”. This was done to create broader terms that are relevant to both recommendation and certification criteria, and thereby to further ease the process of qualifying repositories. Headings that were identical in both clusters, such as Mission and Retrievability, were left as is.
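As an illustration, this matching can be expressed as a simple lookup. The pairings follow the text above and the first column of Table 1; they reflect our reading of the table rather than a definitive specification.

```python
# Illustrative lookup for the matching step: each umbrella category maps to a
# (recommendation-cluster heading, certification-cluster heading) pair. The
# pairings follow the text and Table 1 and are our reading, not a definitive
# specification.
UMBRELLA = {
    "Mission":                          ("Mission", "Mission"),
    "Community/Recognition":            ("Recognition", "Community"),
    "Legal and Contractual Compliance": ("Legal", "Legal and Contractual Compliance"),
    "Access/Accessibility":             ("Access", "Accessibility"),
    "Technical Structure/Interface":    ("Interface", "Technical Structure"),
    "Retrievability":                   ("Retrievability", "Retrievability"),
    "Preservation":                     ("Preservation/Persistence", "Preservation"),
}

def umbrella_for(heading: str) -> str:
    """Return the umbrella category that a recommendation- or
    certification-cluster heading maps to."""
    for umbrella, pair in UMBRELLA.items():
        if heading in pair:
            return umbrella
    raise KeyError(f"no umbrella category defined for {heading!r}")

print(umbrella_for("Recognition"))          # -> Community/Recognition
print(umbrella_for("Technical Structure"))  # -> Technical Structure/Interface
```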

3.3 Repositories by discipline

Next, we classified every recommended repository into one of five disciplines, to analyse whether a relationship exists between certification, recommendation, and discipline. The disciplines we identified – General/Interdisciplinary, Health and Medicine, Life Sciences, Physical Sciences, and Social Sciences and Economics – were based on the focus disciplines provided on the websites of Scientific Data/Springer Nature, PLOS, Elsevier, and Web of Science, as these organisations provided comparable lists of disciplines.

Results

1. Lists of repositories

The lists of recommended repositories of all 17 organisations were compiled, together with the lists of certified repositories, into one list (https://doi.org/10.17632/zx2kcyvvwm.1) to allow further analysis.

2. Criteria used for recommendation and certification

Based on the criteria used by the different organisations to recommend repositories, we organised the criteria into 15 subheadings. After the initial creation of the subheadings, we removed repetitions and criteria specific to a single organisation, which reduced the set to 13 subheadings. The resulting list is the “Recommended Criteria Cluster” (Supplementary Table 1).

Similarly, based on the list of criteria of the certification schemes, we created 14 subheadings. After re-examination and refinement, 11 subheadings remained. These 11 subheadings and their criteria form the “Certification Criteria Cluster” (Supplementary Table 2).

3. Overlap between recommendation and certification

In total, we found 242 repositories that were recommended by publishers, funding agencies, and/or community organisations; the distribution of recommendations is depicted in Figure 2. The largest group (88) was recommended by only a single organisation; ArrayExpress was mentioned most frequently, being recommended by 12 of the 17 organisations.

Figure 2 

Pie-chart showing the number of repositories that are recommended by 1 to 17 organisations. The highest number of recommendations received is 12.

Looking at recommended repositories with certification, only 13 of the 242 recommended repositories held any kind of certification: six were certified by the DSA only, six by the ICSU-WDS only, and one, the Inter-University Consortium for Political and Social Research (ICPSR), by both the DSA and ICSU-WDS; none were certified by NESTOR, TRAC, or ISO 16363/TDR (Figure 3). Of the 50 repositories that were recommended most often, only PANGAEA was certified.

Figure 3 

Pie-chart showing the percentage of recommended repositories that have obtained one of the following certifications: DSA, ICSU-WDS, and both DSA and ICSU-WDS.

To arrive at broadly applicable terms, we grouped the criteria and the broader categories of the Recommended Criteria Cluster and the Certification Criteria Cluster into seven recurring umbrella categories, listed in Table 1.

Table 1

Table listing 7 common terms, referred to as “umbrella categories”, based on a comparison between the Recommended Criteria Cluster and the Certification Criteria Cluster. The second column describes the shared meaning of the umbrella category, followed by differences in characteristics of recommended repository criteria and repository certification scheme criteria.

Umbrella Categories | Shared | Recommended Repository Criteria | Repository Certification Scheme Criteria
Mission | Explicit mission statement in providing long-term responsibility, persistence, and management of data(sets) | – | –
Community/Recognition | – | Evidence of use by downloads or citations from an identifiable and active user community | Understand and meet the needs of the designated and defined target community
Legal and Contractual Compliance | Repository operates within a legal framework/Ensures compliance with legal regulations | When applicable, have contractual regulations governing the protection of human subjects | Contracts and agreements maintained with relevant parties on relevant subjects
Access/Accessibility | Public access to the scientific/repository designated community | Anonymous referees (including peer-reviewers) have access to the data before public release as indicated by policies | –
Technical Structure/Interface | The software system supports data organisation and searchability by both humans and computers | The interface is intuitive and mobile user-friendly | The technical (infra)structure is appropriate, protective, and secure
Retrievability | Data need to have enough metadata. All data receive a persistent identifier | – | –
Preservation | Long-term and formal preservation/succession plan for the data, even if the repository ceases to exist | If the data are retracted, the persistent identifier needs to be maintained | Preservation of data information properties and metadata

When we looked at the discipline that each recommended repository covers, we found that the most common focus discipline is Life Sciences with 115 repositories, followed by Physical Sciences with 80, Health and Medicine with 26, General/Interdisciplinary with 12, and Social Sciences and Economics with 9 (https://doi.org/10.17632/zx2kcyvvwm.1).

Discussion

In our analysis of the criteria, we identified seven categories based on the aspects shared between the criteria for recommendation and the criteria for certification. These categories give an indication of the common requirements observed in this study and suggest that a repository needs:

  1. To have an explicit mission statement;
  2. To have a defined user community;
  3. To operate within a legal framework;
  4. To be accessible to the designated user community;
  5. To have an adequate technical structure;
  6. To ensure data retrievability;
  7. To ensure long-term preservation of data.

In comparing the lists of recommended and certified repositories, we found a strong discrepancy: only 13 of the 242 recommended repositories have obtained any form of certification. This gap is also apparent in the fact that, of the 50 most often recommended repositories, only one is certified. There are several possible explanations for this gap between recommended and certified repositories.

Firstly, repository certifications find their origins in specific disciplines and domains. For instance, the DSA has a background in the social sciences and humanities, NESTOR is formed by a network of museums, archives, and libraries, and the ICSU-WDS mainly operates in the earth and space science domain (). When looking at the repository domains, we found that the 50 most recommended repositories are almost all linked to the natural sciences (life, physical, and health sciences); the only exceptions in the top 50 are three interdisciplinary repositories. This points to a contrast between the domains of recommended repositories and the domains targeted by certifications. Repositories operating within the life sciences might be less aware of certifications, while, conversely, certifiers might tailor standards that are less focused on the requirements of life science repositories.

A second reason might be the dynamics of the data sharing field; guidelines and practices are constantly evolving and being evaluated. While data repositories such as FlyBase have been active since the early 1990s (), certification schemes tend to be much more recent. The DSA is one of the oldest certifications, created in 2005 and becoming internationally applicable in 2009 (). Changing data sharing dynamics are also reflected in the collaboration between certifying organisations and the merging of different guidelines. For instance, the partnership between the DSA and ICSU-WDS has brought together two certification schemes from diverse backgrounds and disciplines (). Today, these schemes are still being developed and improved.

The only certifications that have been obtained by the repositories in our analysis are the DSA and ICSU-WDS. NESTOR had certified two repositories at the time of writing, neither of which was included in the examined recommendation lists. Neither TRAC nor ISO 16363/TDR has officially certified any repositories. In the case of the TRAC guidelines, this is because they are ‘meant primarily for those responsible for auditing digital repositories […] seeking objective measurement of the trustworthiness of their repository’ () and are intended as a self-assessment check for repositories. ISO 16363/TDR is a follow-up based on TRAC: it is an audit standard supported by the Primary Trustworthy Digital Repository Authorisation Body. Besides being considered, together with the DSA and NESTOR, a detailed way to evaluate a digital repository (), repositories can be accredited against ISO 16363/TDR by the ANSI-ASQ National Accreditation Board, although so far this has not happened ().

Another possible cause of the gap between certification and recommendation is that most organisations do not ask repositories for certification. Of the five organisations whose criteria we used to create the Recommended Criteria Cluster, only the RDA requires repositories to have a certification. The RDA states that a “trustworthy” repository is one ‘that untertake [sic] regularly quality assessments successfully such as Data Seal of Approval/World Data Systems’ (RDA). If organisations do not ask for certification, repositories may have little incentive to work towards the requirements of certification schemes.

A limitation of this study is that we clustered the lists of all 17 organisations under the header of recommended repositories, although the way in which this recommendation happens differs. In some cases the listed repositories are described as “supported” or “approved” (Elsevier, F1000Research), as encouraged for their ability to make data reusable (NIH), as meeting access, preservation, and stability requirements (Scientific Data/Springer Nature), or as having received funding or investment (BBSRC, Wellcome Trust), or the list simply contains generally recognised repositories (PLOS). This may partly explain the limited overlap between the lists, as different considerations might have played a role depending on the intention of the organisation. We note that there might be additional lists available that were not found using our search methodology.

Conclusion

The umbrella categories we identified for data repositories indicate that publishers, community organisations, and certification schemes largely agree on what constitutes a high-quality repository. Yet this agreement is not reflected in the relationship between recommended and certified repositories: fewer than 6% of the recommended repositories (13 of 242) have obtained some form of certification. Programmes such as the European Union’s Horizon 2020 tell their grantees that, when choosing a repository, ‘[p]reference should be given to certified repositories which support open access where possible’ (), increasing the focus on certified repositories. This focus could create an incentive for repositories, in particular recommended but uncertified ones, to become certified.

To ensure certification schemes are known to repositories, we suggest that domains collaborate in creating or maintaining certification schemes. While we see value in having different levels of certification, the recent DSA-WDS partnership provides a good example of collaboration between domains. If repositories and certification schemes work together towards common standards, this can provide clarity and improve data management and quality. It would improve the data landscape if, collectively, we ensure that recommended repositories also obtain certification.

Further research could examine whether the criteria for repository certification have actually contributed to better data. It would be of interest to study whether certification leads to more “FAIR” data, and whether data in a certified repository are used and cited more. Another possible topic of study is whether these criteria can be applied at the level of the individual dataset. This could also lead to the development of (semi-)automated tools that check whether datasets comply with certification standards before or during submission to a repository. Our umbrella categories might help define the categories against which datasets are judged.

In conclusion, we suggest that the seven umbrella categories we identified could form a common ground for the standardisation of data repository requirements. These categories are multifaceted and could therefore support and improve data management, data quality, and the transparency of the services provided by data repositories. Our results complement previous work by Dobratz et al. (2010), which concludes that general standards for repositories are complex, that standards are used as guidelines instead, and that there is a need for a specific standard. Our umbrella categories could provide the basis for such a standard. We therefore suggest that researchers take these criteria into account when choosing a repository, and that recommending organisations consider converging on these standards.

Additional Files

The additional files for this article can be found as follows:

Supplementary Table 1

Table “Recommended Criteria Cluster” showing the 13 main criteria used by organisations recommending repositories, with separate individual characteristics for each. The occurrence of a criterion in the information provided by an organisation is indicated by an “X”. DOI: https://doi.org/10.5334/dsj-2017-042.s1

Supplementary Table 2

Table “Certification Criteria Cluster” showing the 11 main criteria used by organisations certifying repositories, with separate individual characteristics for each. The occurrence of a criterion in the information provided by a certification scheme is indicated by an “X”. DOI: https://doi.org/10.5334/dsj-2017-042.s2