Draft minutes and action summary

02 Dec 2016

Dear all,
The draft minutes from today's call are available here: https://www.rd-alliance.org/group/rda-organisational-assembly/wiki/oab-n...
I have also created the list of actions (quite a big one this time - we need to implement them!):
* Previous ones:
  - Mark: Work more on adopter registration (who's adopted what outputs) - Open.
  - Fotis: Subscribe all industrial members to the private OAB wiki space after they accept (invites sent) - Open.
* New ones:
  - Action Fotis: Include the strategy papers for discussion in the January meeting and inform all members. Action: All members to read the papers.
  - Action: Raphael to clarify with Rainer what information he is looking for from the OA/OAB on the secretariat needs.
  - Action Lynn: Provide a list of all the groups for review and circulate it to the OAB list - CLOSED.
  - Action Fotis: Get more information on the work and timeline for the update of the TAB LOG output by RDA Europe.
  - Action Fotis: Link the V&E document to the OA page, and also to the "get involved/organisational membership" page with some highlights.
  - Action: The OAB sub-group on industry (Amy, Juan, Fab, Ross) to plan work on industrial engagement, talking to industrial members and liaising with the Hilary group on industry.
  - Action Mark: Follow up with Laurel Haak (ORCID) to better connect RDA with ORCID, including possible automatic registration of reviews under the ORCID and/or RDA profile.
  - Action Fotis: Prepare a Doodle for the next meeting for the week of 9 January 2017.
  - Action: Fotis, with the help of the secretariat, to prepare a calendar with some options for the next OA f2f meeting in Barcelona so that the OAB can decide on those.
All the best,
Fotis
---
Fotis Karayannis, Dr. Eng.
RDA secretariat-OAB liaison
ATHENA Research Center
Phone: +30 211 1206 431
Mobile: +30 6945 878784
Skype: fotis71
Twitter: fkarayan


    Author: Jamie Shiers

    Date: 02 Dec, 2016

    Ciao Fotis and all,
    On 02 Dec 2016, at 03:07, fkara2 <***@***.***-innovation.gr> wrote:
    Mark: Work more on adopter registration (who's adopted what outputs)- Open.
    This is as good a place as any to hang my current (repeated) thoughts on.
    I’ll start with the “case study”.
    Certification of Digital Repositories is something on the RDA roadmap. (In the loosest sense of the term).
    Quite a few people interpret this as binary: you pass a metric (and there are often many) or you fail it.
    A suggestion I came across recently was to use a scale from 0-4: not started, slightly compliant, etc.
    How much is this worth? A Facebook like? A pint of Guinness?
    How much is it worth in a concrete case, e.g. how much time & effort might it save in the certification of a major DR?
    You can write the rest of this story yourself(ves).
    The point is (again) that focusing only on adoption of “RDA outputs” misses a lot of the “value”.
    (I tell myself every time never to bother writing a message like this but then I do. But not forever).
    Happy Winter Solstice (NH) and all that.
    Cheers, Jamie


    Author: Hans Pfeiffenberger

    Date: 03 Dec, 2016

    Dear Jamie,
    and all
    Unfortunately, I could not be present this Thursday, so I apologize in
    case I repeat any of your discussion.
    I am not sure I really get the drift of your argument, but here is my
    take on certification:
    There are lots of "repositories" "out there" in actual use - most of
    which do not really qualify ... according to which criteria? And who
    would need to do the vetting? (I ask this, among other functions, as
    editor of ESSD.) A few obvious examples:
    - if providing persistent identifiers is a criterion, that seems
    relatively easy to see - until some repo claims to provide one which I
    do not know, or I do not know the policy (persistence) associated with it
    - repos should be able to ensure and prove the integrity of data - how
    should I know they do?
    - repos should have a succession plan in case they are disbanded (see
    the recent announcement about CDIAC http://cdiac.ornl.gov ) - how
    would I find that plan and assess its credibility?
    etc.
    Also, since funders increasingly require data to be put in a
    trustworthy repo, credible certification of said trustworthiness is
    indispensable, or else there will be lots of people printing money for
    claiming to take care of data...
    About your question "to use a scale from 0 - 4: ... How much is this
    worth?". I was present at the discussion when such a scheme was
    introduced at Data Seal of Approval.
    First: The Seal (certification) was granted only upon reaching a
    certain minimum of points per criterion and then, as long as full
    compliance was not reached, a repo was required to gain one more
    point per year (or so) until it did - or it would lose the seal.
    Meanwhile the regulation on this is: "Note also that compliance levels
    1 and 2 can be valid for internal self-assessments, while
    certification may be granted if some guidelines are considered to be
    at level 3—in the implementation phase—"
    https://assessment.datasealofapproval.org/media/files/DSA_booklets/Core_...
    What was the rationale for not requiring full compliance upfront?
    There were (and still are) lots of otherwise good and important repos
    "out there" which couldn't reach full DSA certification (not to speak
    of ISO 16 363). For these this stepwise approach was supposed to
    provide a scale to climb out of their misery ... and it was
    envisaged that the leeway (such as accepting the level "we have a
    plan") would be reduced over time (as indeed it was).
    In other words, you always need a realistic way to go from A to B,
    especially if B is an ideal.
    best,
    Hans
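The stepwise scheme Hans describes - certification only once every guideline reaches a minimum level, with lower levels valid only for internal self-assessment - could be sketched roughly as follows. The criterion names, level wordings, and threshold here are illustrative assumptions, not the actual DSA guidelines:

```python
# Illustrative sketch of a DSA-style stepwise compliance check.
# Assumed level meanings (paraphrased, not official wording):
#   0 = not considered, 1 = planned, 2 = being written up,
#   3 = in the implementation phase, 4 = fully implemented.

CERTIFICATION_THRESHOLD = 3   # per the quoted rule, level 3 may suffice

def assessment_status(levels: dict[str, int]) -> str:
    """Classify a repository's self-assessment.

    `levels` maps a guideline name to its compliance level (0-4).
    """
    if all(v >= CERTIFICATION_THRESHOLD for v in levels.values()):
        return "eligible for certification"
    if all(v >= 1 for v in levels.values()):
        return "valid internal self-assessment only"
    return "not started on some guidelines"

example = {
    "persistent identifiers": 4,
    "data integrity": 3,
    "succession plan": 2,
}
print(assessment_status(example))  # → valid internal self-assessment only
```

The "one more point per year until full compliance" rule would then be a separate check on successive yearly assessments; the sketch above only classifies a single snapshot.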


    Author: Jamie Shiers

    Date: 03 Dec, 2016

    Dear Hans,
    I also didn’t take part in Thursday’s meeting.
    I was trying to give a concrete example of things other than “outputs” that had measurable value.
    In my direct experience, an idea such as this can have value as one goes through the self-certification steps (according to whatever guidelines that repository chooses) prior to an external audit.
    It is unlikely that, on day 1, any repository that has not already considered the relevant criteria will fully satisfy all of them. Using this scale, you can easily see where work needs to be invested, as well as track progress towards “conformance”. (I have done this for CERN and we have started discussing it internally - the people involved seemed to think it helped.)
    At CERN, we are working towards ISO 16363 conformance for both scientific data and our “digital memory”. We target completing this prior to the next update of the European Strategy for Particle Physics, around 2019/2020. We have a concrete plan for addressing this, but that is secondary to the main argument: it is not just “outputs” that have value. (This was described at iPRES, but not the 0-4 scale, which I was not aware of at the time.)
    (The OA(B), after all, regularly discusses Value and Engagement: the former likely leading to the latter.)
    Bonne soiree, Jamie
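The gap analysis Jamie describes - using the 0-4 scale to see where work needs to be invested - might be tabulated along these lines. The criteria below are hypothetical examples, not taken from any actual audit:

```python
# Hypothetical 0-4 self-assessment gap report: for each criterion,
# show how far the current level is from full compliance (4),
# sorted so the biggest gaps come first.

TARGET = 4  # full compliance on the 0-4 scale

def gap_report(scores: dict[str, int]) -> list[tuple[str, int]]:
    """Return (criterion, gap-to-target) pairs, largest gap first."""
    gaps = [(name, TARGET - level) for name, level in scores.items()]
    return sorted(gaps, key=lambda pair: -pair[1])

scores = {
    "bit preservation": 3,
    "succession plan": 0,
    "documented workflows": 2,
}
for criterion, gap in gap_report(scores):
    print(f"{criterion}: {gap} level(s) to go")
```

Repeating the report at intervals, as in the self-certification steps prior to an external audit, would make progress towards "conformance" visible per criterion.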


    Author: Ross Wilkinson

    Date: 04 Dec, 2016

    Hi All, An interesting discussion on value…
    I agree that simply saying whether an output has been implemented or not does not capture value. I do like the outputs to be concrete, but feel that, for organisations, simply “implementing an output” misses value.
    Staying with the same example, the existence of the output - the Core Trustworthy Data Repository Requirements - has been helpful in Australia.
    ANDS benefits because it can ask our project partners who are working on Trusted Data Repository Services to use this output as a reference.
    The implementers of Trusted Data Repository Services, six at present, have an internationally agreed approach for measuring their performance. As Hans points out, not all might meet all requirements fully, but they have a pathway.
    Thus the effect of the output is beyond implementation or not, and really quite significant in this case.
    Clearly knowing a set of organisations that have been “influenced” by an RDA output is very good - and not quite so easily captured by “adoption”. As well as influence, it would be good to solicit stories of how the influence has occurred.
    This is quite relevant to our technical outputs, but is probably even more important for our social outputs.
    Finally, to share with you a story on impact. In Australia this year over 1,000 people participated in our “23 Things for data librarians” - derived directly from the RDA output - quite an impact!
    ….Ross
    -----------
    Australian National Data Service
    Research Data Alliance
    M: +61 419 53 41 63 | T: +61 3 9902 0598
