
Canvassing interest and collaborations for a data-review form

  • Creator
    Discussion
  • #68790

    Dear All,
    I would like to reach out to this community with an idea that I hope fits the group’s charter: to create a Data Review Report that could form part of the scholarly communication in a similar way to peer reviews. Some basic details are included below.
     
    Background:
    As this group’s charter points out, “The prevalence of research data policies from institutions and research funders is increasing”, meaning researchers are seeing more, and sometimes confusing, information and recommendations on how best to share data. This can result in authors making a wide variety of things available in an attempt to meet those policies. Policies are often kept vague so that they can cover the broad range of subjects and disciplines a publisher serves.
    In order both to educate authors and researchers and to provide constructive feedback about data sharing and availability, I believe the scientific publishing world needs a standardised process of data review, carried out on all manuscripts in a similar way to the peer-review process. Eventually, once it becomes streamlined and common practice, it may be folded into the peer-review process itself.
     
    The (rough) plan:
    To create a spreadsheet-based, structured data-review form to help a (data) reviewer assess the availability of the data discussed in a manuscript. It is envisaged that a data review would be carried out either in parallel to, or in advance of, the normal peer review of a manuscript.
    The goal of a data review is to assess the availability of ALL data discussed in the manuscript. This includes data generated as part of the study as well as data previously generated and re-used in the study. A key factor here will be the definition of what counts as “data”; for me, it is any electronic item required to reproduce the work, including software and scripts.
    The data-review-report generated could form a part of the scholarly communication in a similar way to that of peer reviews.
     
    Desired outcomes:
    – Reviewers get guidance on how to check manuscripts for the availability of “data” (including what counts as “data”).
    – Authors get guidance from reviewers on what is expected/required to enable reproducibility.
    – The community gets more reproducible (and FAIRer) research publications.
    – If the data-review-report is published, the reviewers can get credit for their work on the review.
    – The reports could become a valuable resource for policymakers to see where data-sharing is going right and where it still needs work.
     
    Please bear in mind that the idea is at an early stage and there is much scope for input from interested parties. I have made a start on a spreadsheet-based form that I am happy to share with anyone interested. The idea is that it provides guidance to the reviewer on what to look for, and structures their findings in a simple table so that authors can address any issues highlighted.
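    To give a flavour of the kind of table I have in mind, here is a very rough sketch (in Python, purely illustrative; the column names are placeholders of my own and very much open for discussion, not the final form):
 
    # Rough, illustrative sketch of the fields a data-review row might capture.
    # The column names below are placeholders, not a finalised standard.
    import csv

    REVIEW_COLUMNS = [
        "data_item",         # e.g. "RNA-seq raw reads", "analysis scripts"
        "described_in",      # where in the manuscript the item is discussed
        "repository",        # where the item is (or should be) deposited
        "identifier",        # DOI / accession number, if any
        "accessible",        # yes / no / on request
        "licence",           # licence or terms of reuse, if stated
        "reviewer_comment",  # what, if anything, the authors need to address
    ]

    def new_review_sheet(path):
        """Write an empty data-review sheet with the draft column headers."""
        with open(path, "w", newline="") as handle:
            csv.writer(handle).writerow(REVIEW_COLUMNS)

    new_review_sheet("data_review_template.csv")
 
    A reviewer would add one row per data item mentioned in the manuscript, and the completed sheet would become the data-review report.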
     
    Feel free to contact me directly (email in footer) to discuss this further.
     
    Kind regards
    Chris
     
    Chris Hunter
    Lead BioCurator, GigaDB
    GigaScience, BGI-HK
    Email: chris at gigasciencejournal.com
    Tel: (44)07429063514
    ORCID: 0000-0002-1335-0881
    Web: www.gigadb.org
     

  • Author
    Replies
  • #89932

    Hi Chris,
    We (the American Economic Association) have been using a template report form (not a checkbox form) for the past 2 years (about 1,200 reports prepared so far) – https://github.com/AEADataEditor/replication-template/blob/main/REPLICAT….
    It is light on data checks, heavy on data provenance and computational checks. It is geared at a team of undergraduate replicators, not seasoned researchers, to use while checking replication packages.
    To encourage better compliance, a number of us have put together and published a template “README”, which is meant to be used by researchers to document their work and which in turn can be checked: http://doi.org/10.5281/zenodo.4319999 and https://social-science-data-editors.github.io/template_README/. It includes the usual data availability statement (but no code availability statement, since at least at AEA journals providing code is not optional).
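    Because the README is standardised, parts of the check lend themselves to automation. A very rough sketch (not our actual tooling; the expected file names are just assumptions) of what such a check might look like:
 
    # Sketch only -- not production tooling. Flags whether a replication
    # package contains the basic pieces a standardised README makes checkable.
    from pathlib import Path

    EXPECTED = ["README.md", "data", "code"]  # assumed layout; adjust as needed

    def check_package(package_dir):
        """Report which expected items are present in a replication package."""
        root = Path(package_dir)
        return {name: (root / name).exists() for name in EXPECTED}

    if __name__ == "__main__":
        for name, present in check_package(".").items():
            print(f"{name}: {'found' if present else 'MISSING'}")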
    Happy to discuss further with anybody interested.
    Lars

    Lars Vilhuber, Economist
    Cornell University, Executive Director, Labor Dynamics Institute
    and ILR School – Department of Economics
    American Economic Association – Data Editor
    Journal of Privacy and Confidentiality – Managing Editor
    ***@***.*** | http://lars.vilhuber.com/
    p: +1.607-330-5743 | https://twitter.com/larsvil
    Assistant: ***@***.*** | +1.607-255-2744
