Munin 2018 Day 1 through the eyes of @LuytenBram @atmire

1. Martin Donnelly: Open Science Approaches at the University of Edinburgh

1.1. Institution stats

1.1.1. 13.8k staff

1.1.2. ~40k students

1.1.3. Largest HEI in Scotland

1.1.4. Mission: creation, dissemination and curation of knowledge

1.2. Library Research Support

1.3. Research Data Service

1.3.1. Cross cuts multiple information services directorates

1.3.2. Supporting the RDM agenda

1.4. Edinburgh's policy approach

1.4.1. Formal commitments to integrity, OA, DMPs, open data

1.4.2. Publication policy is 8 years old. Institution moving away from supporting hybrid journals (pre plan S)

1.4.3. 2011 data management policy doesn't emphasize on Open

1.4.4. Both policies to be updated this year where a theme will likely be OPEN AS THE DEFAULT

1.5. Benefits of/drivers for Openness

1.5.1. The integrity argument is very hard to argue against!

1.6. Replication challenges

1.7. Current state of play

1.7.1. Library research support taking the lead

1.7.2. ...

1.8. LERU Roadmap

1.9. ED RDM Roadmap

1.10. Challenge: stay compliant with GDPR

2. Mikhail Popov (Sales Executive at RSC): Transition, not revolution

2.1. Royal society of chemistry

2.2. Effect of converting a specific journal to OA

2.3. RSC Advances

2.3.1. Multidisciplinary high quality journal

2.3.2. Open Access since early 2017

2.3.3. Used to be the biggest journal in chemistry and the second biggest in the world after PLOS ONE

2.3.4. China accounted for 48% of articles in 2017

2.3.5. History

2.3.5.1. 2011 2 year free access period

2.3.5.2. 2016 Conversion to gold OA

2.3.5.3. 2017 no longer part of RSC Gold, content freely accessible

2.3.6. Significant DECLINE of submissions

2.3.6.1. dropped to fewer than 60 submissions per day

2.3.6.2. only 6.6k publications in 2017, after 13k in 2016

2.3.6.3. Acceptance rate the same, but volume of publications went down

2.3.7. Acceptance rate around 40%

2.3.7.1. No quantitative change

2.3.8. The drop correlated strongly with the flip to OA.

2.3.9. Submissions increased to other titles, but RSC Advances declined

2.3.10. Still experiencing more loss of submissions

2.3.11. £750 APC

2.3.11.1. Also full APC waiver

2.3.12. Industry standard CC licensing

2.3.13. Strong declines across all countries

2.3.13.1. Doesn't matter whether it's from a country with strong OA support.

2.3.14. Norway actually went up by 8%!

2.3.15. We run a serious risk of losing authorship to competing journals

2.3.15.1. Some countries not supporting (gold) OA?

2.3.15.2. Funding tied up with bigger publishers?

2.3.16. We don't want to see the same declines in submissions for all our journals

2.3.17. RSC Advances market share loss

2.3.18. Flip to OA had no positive effect on readership

2.4. Transparent hybrid model (Read & Publish)

2.5. Conclusions

2.5.1. We need support of research intensive countries

2.5.2. Conversion to OA will mean a decline in submissions

2.5.3. Journals will not convert en masse while this decline persists

2.5.4. Researcher academic freedom should not be limited

2.5.5. Models employing hybrid are sustainable

3. Demmy Verbeke, OA Support at KU Leuven

3.1. OA before 2018

3.1.1. focus on green OA

3.1.2. obligation to deposit, no obligation to publish in OA

3.1.3. focus on AAMs of journal articles

3.1.4. Overall approach relatively soft

3.1.5. Gold OA journals and monographs of Leuven University Press

3.1.6. Officially no support for for-profit Gold OA

3.1.7. In reality, APC payments outside of the library amount to at least ~€380k

3.1.7.1. This is on top of the €9M KUL collection budget they already spend

3.2. Why Fair OA?

3.2.1. Ethical reasons

3.2.2. Academic reasons

3.2.3. Financial reasons

3.3. How Fair OA?

3.3.1. for-profit OA

3.3.1.1. Commercial publishers

3.3.1.1.1. ~€380k spend

3.3.1.2. Fake publishers

3.3.1.2.1. ~€30k spend

3.3.1.3. Short term solution that will ultimately be unsustainable

3.3.2. non-profit OA

3.3.2.1. green OA

3.3.2.1.1. self archiving, typically delayed

3.3.2.1.2. no answer to the financial reasons, because it costs money to run a repository and we still need to pay for the publishing that precedes the deposit

3.3.2.2. black OA

3.3.2.2.1. Provides a financial solution!

3.3.2.3. fair (gold) OA

3.3.2.3.1. non-commercial academic publishers: mission driven rather than profit driven

3.3.2.3.2. OA through cost effective APCs/BPCs or through subsidy model (aka diamond/platinum)

3.3.2.3.3. Will tick all the boxes

3.4. KU Leuven fund for Fair OA

3.4.1. Monographs

3.4.1.1. BPCs for OA monographs with Leuven university press

3.4.2. Articles

3.4.2.1. For non-profit publishers

3.4.2.2. fair character guaranteed

3.4.2.3. DOAJ in combination with WoS and VABB

3.4.2.4. 8 months in: 28 applications, 16 approved, averaging €734 per APC including VAT

3.4.2.5. Not super effective yet, but a great conversation starter on why applications are being rejected.

3.4.2.6. Practically: if a researcher asks for less than €1,000, it is approved without question. If they ask for more, they must provide a justification

3.4.2.6.1. That justification for more than €1,000 never comes.

3.5. BPCs

3.5.1. Open to all authors, not only KU Leuven authors

3.5.2. 8 months old, 13 book projects approved

3.6. Question: it's not profit per se that is bad, right? If good services are provided at a fair price, what's the problem?

3.6.1. If scholars determine the rules it's better

3.6.2. For commercial companies, the profit motivation will ultimately compromise/hurt scholarship.

4. OpenUP - Measuring Research Impact: Concepts, Methods, Limitations and Solutions (Frontiers Media)

4.1. Metrics providers today

4.1.1. Short term

4.1.1.1. Altmetric

4.1.1.2. Plum

4.1.1.3. Crossref event data (see the query sketch after this list)

4.1.2. Medium term

4.1.2.1. Clarivate analytics

4.1.2.2. Google Analytics

4.1.3. Longer term

4.1.3.1. Lens.org

4.1.3.2. Dimensions
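
For reference (not from the talk): Crossref Event Data, listed above, exposes a public REST API, and a minimal query sketch follows. The endpoint, the parameters (obj-id, rows, mailto) and the response field names reflect my reading of the public v1 API, and the DOI used is a hypothetical placeholder.

```python
# Minimal sketch: fetch altmetric-style events for one DOI from Crossref Event Data.
# Endpoint, parameter and field names assume the public v1 API; the DOI is a placeholder.
import requests

API = "https://api.eventdata.crossref.org/v1/events"

def events_for_doi(doi, rows=25):
    # obj-id takes the DOI in URL form; mailto politely identifies the caller.
    params = {
        "obj-id": f"https://doi.org/{doi}",
        "rows": rows,
        "mailto": "you@example.org",
    }
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    message = resp.json()["message"]
    return message["total-results"], message["events"]

if __name__ == "__main__":
    total, events = events_for_doi("10.1234/example-doi")
    print(f"{total} events found")
    for ev in events:
        # Each event records which source saw it (Twitter, Wikipedia, ...) and when.
        print(ev.get("source_id"), ev.get("relation_type_id"), ev.get("occurred_at"))
```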

4.2. Authors

4.2.1. Mappet Walker

4.2.2. Stephanie Oeben

4.2.3. Richard Walker

4.3. OpenUP objectives

4.3.1. dissemination

4.3.2. Peer review

4.3.3. impact measurement

4.4. impact measurement

4.4.1. Feedback from researchers: citations are not impact

4.4.2. Impact of Basic research

4.4.2.1. society improvements because we have a better understanding

4.4.3. Impact of Applied research

4.4.3.1. Only possible because of the basic research FIRST

4.4.4. Research metrics GUIDE my research and career

4.5. Limitations of current metrics

4.5.1. Limited view of the reality of scientific impact

4.5.1.1. distorted incentives

4.5.1.2. easily gamed

4.5.1.3. incentive for publication bias

4.5.1.4. counts only publications (no data, software, ...)

4.5.2. Small number of metrics providers, no role for scientific communities

4.5.3. Lack of easy reproducibility

4.6. Impact Data Services Platform

4.6.1. Open Metrics

4.6.1.1. Data collection

4.6.1.2. Increased coverage

4.6.1.3. Standards based, and methods to link data

4.7. Next steps

4.7.1. Continue conceptual framework, design, collaborators and network

4.7.2. Implement through H2020

4.8. Question: do you think there are metrics that can't be gamed?

5. Timon Oefelein (Springer Nature): Current practices in data sharing and challenges ahead

5.1. Importance of verification

5.1.1. Economist How Science Goes Wrong

5.1.2. Community perspective

5.1.2.1. 52% of survey respondents say there is a reproducibility crisis

5.2. Current data sharing practices amongst researchers

5.2.1. We are not counting/checking sharing via email etc

5.2.2. Only durable ways of sharing, e.g. through repositories

5.3. The largest chunk of survey respondents came from the US and Europe

5.4. Survey results

5.4.1. Everyone generally finds it important that data is discoverable

5.4.2. Everyone wants data to be shared with them, but they don't massively share themselves yet.

5.4.3. Most respondents have not heard of FAIR Data in 2018

5.4.4. Sharing is on the rise, 62% of respondents sharing in 2018

5.5. What is Springer Nature doing

5.5.1. Research data helpdesk

5.5.2. Recommended repositories

5.5.3. Data policies

5.5.4. Helping to set standards

5.5.5. Implement citation and linking practices

5.5.6. Research Data Service

5.5.7. New: Data availability reporting

5.5.8. Credit for data sharing via badges

5.6. Key developments

5.6.1. Researchers citing Data via DataCite Standard

5.6.2. Interconnectivity of repositories DANS/InterMed
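
As a side note on the DataCite citation point above (my own illustration, not from the talk): a data citation in the commonly recommended DataCite style lists creator, year, title, publisher/repository and DOI. The sketch below assembles one from hypothetical placeholder metadata.

```python
# Minimal sketch: assemble a DataCite-style data citation from metadata fields.
# All metadata values are hypothetical placeholders; the field order follows the
# commonly recommended "Creator (Year). Title. Publisher. Identifier" pattern.
record = {
    "creator": "Doe, J.",
    "year": 2018,
    "title": "Example survey dataset on data sharing",
    "publisher": "Example Data Repository",
    "doi": "10.1234/example",
}

citation = (f"{record['creator']} ({record['year']}). {record['title']}. "
            f"{record['publisher']}. https://doi.org/{record['doi']}")
print(citation)
```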

6. Nigel Gilles Yoccoz (UiT): The replication and reproducibility crises: origins and consequences for studies of ecology and evolution

6.1. Terminology of reproducibility is still undecided

6.1.1. methods reproducibility

6.1.2. results reproducibility

6.1.3. inferential reproducibility

6.2. Nature experiment. Asking one specific question about one dataset, to different people

6.2.1. There is no "unique" or "correct" way of analysing data.

6.3. Transferability

6.3.1. Can your model be used in another place or another time?

6.4. Not every statistically "significant" result is notable (see the small simulation at the end of this section).

6.5. Scientific progress is hurt by overemphasis of novelty

6.5.1. For example, statements that a replication of an existing study is not a candidate for publication.

6.5.2. But if replications are not published, who would bother carrying them out?
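
To make the earlier points in this section concrete (there is no single "correct" analysis, and not every "significant" result is notable), here is a small self-contained simulation of my own, not from the talk: on data with no true group difference, trying a handful of equally defensible analysis choices and keeping the best one pushes the false-positive rate above the nominal 5%.

```python
# Own illustration: analytic flexibility on null data.
# With no true group difference, picking the "best" of several defensible
# analyses (raw vs. log-transformed, with or without trimming the top 10%)
# yields "significant" results more often than the nominal 5%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def p_values_for_one_dataset(n=30):
    a = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    b = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # same distribution: the null is true
    pvals = []
    for x, y in [(a, b), (np.log(a), np.log(b))]:     # raw vs. log-transformed
        for trim in (False, True):                    # keep vs. drop the top 10% as "outliers"
            if trim:
                x, y = x[x < np.quantile(x, 0.9)], y[y < np.quantile(y, 0.9)]
            pvals.append(stats.ttest_ind(x, y, equal_var=False).pvalue)
    return pvals

n_sim = 2000
one_analysis = sum(p_values_for_one_dataset()[0] < 0.05 for _ in range(n_sim)) / n_sim
best_of_four = sum(min(p_values_for_one_dataset()) < 0.05 for _ in range(n_sim)) / n_sim
print(f"false positives, one pre-specified analysis: {one_analysis:.1%}")  # close to 5%
print(f"false positives, best of four analyses:      {best_of_four:.1%}")  # noticeably higher
```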

7. Kenneth Ruud

7.1. Univ strategy 2009-2013

7.1.1. OA should be preferred dissemination

7.1.2. OA policy approved by board in 2010

7.2. 2017 UiT status

7.2.1. Gold OA ~20%

7.2.2. Total OA ~60%

7.2.3. Non-gold OA overtook gold OA

7.2.4. Big shift between 2016 and 2017 as funders threatened to withhold funding in case of non-compliance

7.3. Implementing DORA

7.3.1. Main 2 things

7.3.1.1. Being evaluated by the quality of work you have done, not the publication channel you selected

7.3.1.2. We should look more broadly at different types of contribution to science

7.3.1.2.1. Look at the WHOLE researcher's contribution

7.3.1.3. In that sense, it's not controversial to sign this declaration.

7.3.2. 3 institutions in Norway have signed it

7.3.2.1. Research council of Norway has ALSO signed it. Key that the funder is on board.

7.3.3. What does it take?

7.3.3.1. Hiring processes

7.3.3.1.1. DORA statements included both in policy as well as in announcements of positions

7.3.3.1.2. 4 documents to change

7.3.3.2. Evaluation of PhD theses

7.3.3.2.1. In Norway: pass or no pass, no grading

7.3.3.3. Distribution of prizes and research funding

7.3.3.3.1. Evaluations of project proposals

7.3.3.3.2. Change texts of announcements of prizes and research funding instruments

7.3.3.3.3. Again, work upstream with external funders and their announcements as well.

7.3.3.4. Sabbaticals

7.3.3.4.1. Potentially as often as every 5th year

7.3.3.4.2. People need to qualify for this by earning a minimum number of publication points.

7.3.3.4.3. We need to find other measures to qualify for sabbaticals.

7.3.4. The Norwegian publication indicator

7.3.4.1. Evaluation on quality of all possible channels of dissemination.

7.3.4.2. Some journals are ranked as of "higher" quality than others - a "level 2" channel. For example, recognition of a journal as the "best" in its field.

7.3.4.2.1. Still problematic with DORA, because this is exactly judging the venue where you publish and not the quality of what you publish

7.3.4.2.2. This indicator was never meant as an instrument to evaluate individuals. But a tool to distribute funding between INSTITUTIONS

7.3.4.3. Leads to a definition of publication "points" for an article (see the rough points sketch after this subsection)

7.3.4.4. Basically a national assessment, replacing and decoupled from the impact factor.

7.3.4.5. Question: Is this based on an expert committee? Is it possible to compare this to other countries

7.3.4.5.1. Answer: one of the basic principles is that level 2 journals should only account for 20% of the output. So if more publications get into these level 2 journals, the selection will become stricter.

7.3.4.5.2. In some fields, there is a very strong correlation with impact factor BUT not in others.

7.3.4.5.3. Other countries that adopt the indicator basically copy the lists rather than making the entire effort of establishing these committees themselves.
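
A rough sketch of how channel-based publication points might be computed follows; this is my own simplification, not the official indicator. The point values (1 for a level 1 journal article, 3 for level 2) and the straight fractionalisation by author share are assumptions for illustration, and the ~20% principle mentioned above concerns how much of a field's output level 2 channels may cover, not a per-institution quota.

```python
# Illustrative simplification of channel-based publication points (not the
# official Norwegian indicator): level 2 journal articles earn more points
# than level 1, and each article's points are split across its authors.
POINTS = {1: 1.0, 2: 3.0}   # assumed point values for journal articles

def article_points(level, n_authors, our_authors):
    # Fractionalise by our institution's share of the author list (simplification).
    return POINTS[level] * our_authors / n_authors

publications = [
    # (channel level, total authors, authors from our institution)
    (1, 4, 2),
    (2, 3, 1),
    (1, 1, 1),
]

total = sum(article_points(*p) for p in publications)
level2_share = sum(1 for lvl, *_ in publications if lvl == 2) / len(publications)
print(f"total publication points: {total:.2f}")
print(f"share of output in level 2 channels: {level2_share:.0%}")  # cf. the ~20% principle
```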

7.3.5. We can change all the paperwork, but how do we change CULTURE?

7.3.6. TODO

7.3.6.1. Still need to determine which research work, other than scientific articles alone, to assess

7.3.6.1.1. We need to be more explicit and specific here.

7.3.6.2. Follow work currently ongoing in EUA

7.3.6.3. Funders, also at EU level, need to be on board

7.3.7. Take home messages

7.3.7.1. Implementing DORA is easier than implementing OA

7.3.7.2. You cannot hope for OA transformation without adoption of DORA, or something similar

7.3.7.3. DORA implements what researchers want, but perhaps do not comply with in practice

7.3.7.4. Basing ourselves on proxies of quality is BAD PRACTICE

7.4. Question: Evaluation processes are so different across disciplines. In History we have always been looking at quality

7.4.1. If you want to know how to implement DORA, learn from the humanities

7.5. Question: is it difficult to advocate for DORA?

7.5.1. It's easier than the switch to open access, because DORA more straightforwardly reflects what people want.

8. Victoria Tsoukala

8.1. Open Science, Excellent Science

8.2. European Commission

8.2.1. EU Legislation

8.2.2. Legislates together with other Community institutions, such as the Parliament and the Council

8.2.3. ...

8.3. Why Open Science

8.3.1. Better ROI for R&I investments

8.3.2. Faster circulation of new ideas

8.3.2.1. 22 million SMEs

8.3.3. More transparency in the system

8.3.4. ...

8.4. How do we support open science?

8.4.1. Funding programmes requirements

8.4.2. 2012 recommendation on scientific information

8.4.2.1. Often we can only RECOMMEND to member states but rely on member states for actual implementation and enforcement

8.4.3. 2016 council conclusions

8.4.4. Stakeholders and Expertise

8.4.4.1. Open Science policy platform

8.4.5. Long standing support starting 2006, with pilot in FP7

8.4.6. INTENSIFICATION: now starting to prepare open science in FP9 (Horizon Europe)

8.4.7. PSI Directive revision

8.5. Holistic Policy Agenda - Scope and Ambitions

8.5.1. Open Data

8.5.2. European Science Cloud

8.5.2.1. 2016 announced

8.5.2.2. Problems

8.5.2.2.1. Fragmented access

8.5.2.2.2. Limited cross-disciplinary access

8.5.2.2.3. Non interoperable services

8.5.2.2.4. Closed data

8.5.2.3. Goal: Seamless and unified access for researchers

8.5.2.3.1. Federation of existing services at European level

8.5.2.3.2. It's not simply technical, it's a socio-technical process. Change of current processes and habits.

8.5.2.4. 6 action lines. Launched last week in Vienna

8.5.2.4.1. Architecture

8.5.2.4.2. Data

8.5.2.4.3. Services

8.5.2.4.4. Access & Interface

8.5.2.4.5. Rules

8.5.2.4.6. Governance

8.5.3. Altmetrics

8.5.4. Future of scholarly communications

8.5.5. Reward systems

8.5.6. ...

8.5.7. 2018 Data Package

8.5.7.1. PSI Directive revision

8.5.7.1.1. Data produced by publicly funded research should be open.

8.5.7.1.2. Re-use of research data will be free

8.5.7.1.3. The feedback of the parliament on this has been very good.

8.5.7.1.4. To be seen how this will be transposed in member state laws.

8.5.7.2. Draft guidance on private sector data sharing in B2B and B2C contexts

8.5.7.3. 2018 revised recommendation on access and preservation of scientific information

8.5.7.3.1. Fine-tuning of the initial recommendation, making it even a stronger instrument

8.6. The Open Research Europe Publishing Platform (ORE)

8.6.1. The RFP resulted in a non-award; a new procedure will be launched very soon, in the coming weeks.

8.6.2. The Commission is committed to providing such a platform, so it's going to happen

8.6.3. Help H2020 beneficiaries to comply with the strict OA mandate ... at NO COST TO THEM

8.6.4. Improve uptake of OA in H2020

8.6.4.1. We're close to 70% but want to get to 100%

8.6.5. Support sharing of preprints, open peer review and post publication commenting

8.6.6. New generation metrics

8.6.7. Explore the economic side of OA publishing

8.6.7.1. transparency

8.6.7.2. cost effectiveness

8.6.7.3. sustainability

8.6.8. As a funder, we want to be more involved in the publishing activity.

8.7. Horizon Europe to operate under open science premise (FP9)

8.7.1. Stronger OA instrument

8.7.2. Status quo: trilogue on regulation of HE; articles 2, 10 and 35

8.7.3. Authors/beneficiaries should retain enough rights so that they can open up their data/results.

8.7.4. Problem: researchers think that research data management automatically includes full open data. But those are separate.

8.7.4.1. They could opt out on open access, but still need to do proper data management.

8.7.5. Stronger sanctions for non-compliance

8.7.5.1. directed at institutions, not researchers

8.8. Plan S - Making open access a reality by 2020

8.8.1. 13 national research funders associated, including the Norwegian one

8.8.2. Implementation guidance

8.8.3. It is a political commitment by funders, and has the potential to be a strong coordination mechanism

8.8.4. Principles

8.8.5. Main paths

8.8.5.1. Publication in OA journals or platforms without embargo

8.8.5.2. Deposit in OA repositories (VoR or AAM) without embargo

8.8.5.3. Publish in "Hybrid" journals under transformative agreements

8.8.5.3.1. participating funders can be more strict and drop this path

8.8.6. Proposed policy for Horizon Europe re: Plan S

8.8.6.1. Legal text in regulation articles 2, 10 and 35

8.8.6.1.1. Good news/aligned

8.8.6.2. Model grant agreement

8.8.6.2.1. Provisions for Horizon Europe are stronger in some aspects, both to publications and data

8.8.6.3. There is no intrinsic conflict

8.9. Promoting an EOSC in practice

8.10. FAIR data access group - turning FAIR into reality

8.11. Forthcoming report: vision on the future of scholarly communication

9. Corina Logan: Bullied into Bad Science (BIBS)

9.1. www.bulliedintobadscience.org

9.1.1. Times article that exploded

9.2. Leading individuals and institutions in adopting open practices to improve research rigor

9.3. People who are not bullied are able to

9.3.1. READ, UNDERSTAND and VERIFY

9.3.2. GENERATE and DISSEMINATE RESEARCH

9.4. JISC Collections Deal

9.4.1. The 5-year deal didn't address any of the points that had to be negotiated

9.4.2. Rampage formed of people who were explicit about the bad Elsevier contract

9.4.3. Our efforts to change Cambridge failed

9.4.3.1. Senior leadership tended to look at other institutions (Oxford, ...)

9.5. Based on this failure, a campaign started targeting ECRs

9.5.1. Petition with 9 points

9.5.2. Signatures sent to parliament

9.6. Exploitative route

9.6.1. We can format our own papers; I can't see why this should still cost so much as a service.

9.7. Ethical route

9.7.1. Ethical framework

9.7.1.1. Knowledge as common good

9.7.2. By choosing where to publish, where to put your money, you have an impact as well. Example: publishing in PeerJ because of the conviction that they are improving research rigor.

9.7.3. It's also easier and faster, so I can focus more on my research

9.8. Open Science MOOC

9.9. Gatto: Publishing data gets you 5-50% more citations?

9.10. Peer Community In

9.10.1. Managing board member of PCI now

9.10.1.1. Budget was €12,000 per year, with half of it spent on travelling to meetings to see each other. It's basically free to do this.

9.10.2. Pre-registration of plans/analysis BEFORE executing the study, to avoid p-hacking

9.10.3. Bram: sounds very much like peerageofscience

9.10.4. We now review & work for ourselves instead of working for a publisher.

9.11. Editors4BetterResearch

9.12. Make my own lab transparent and open

9.12.1. Preregistered hypotheses in Github

9.12.2. How can the workflows be automated as much as possible?

9.12.3. Selection pressure for people: require evidence/willingness to engage in open practices.

9.13. Implicit biases

9.13.1. Women are not more RISK averse than men

9.13.2. Discover your implicit biases

10. Oliver Zendel: Testing Open Peer Review (OPR) for Conferences

10.1. Hot CRP Conference Review Software

10.1.1. Specific branch

10.2. Fun fact: Peer review only came up after the second world war.

10.2.1. Before that: editorial control

10.3. Flaws in traditional peer review

10.3.1. Single/Double blind

10.3.2. Editors have to hand-pick reviewers

10.3.3. Process is hidden from authors

10.4. OPR Features

10.4.1. Open identity

10.4.2. Open participation

10.4.3. Open Interaction

10.4.3.1. direct discussions between authors and reviewers

10.4.4. Open Final-version comments

10.4.5. Open pre-review

10.4.6. Open Report

10.4.7. Open Platforms

10.5. Testing at two venues

10.5.1. EMVA Forum 2017

10.5.2. eHealth 2018 Conference

10.5.2.1. biggest concern

10.5.2.1.1. Too much positivity (17% agreed)

10.5.2.2. very well received

10.5.2.3. "whitewashing" of reviews to avoid backlash

10.5.2.4. layman reviews could make the reviewer look bad

10.6. Summary and Outlook

10.6.1. Testing OPR is still difficult

10.6.2. OPR generally well received

10.6.3. Many ways to improve traditional peer review

10.6.4. OPR branch of CMS is freely available

10.6.5. Recommendation: double blind at the BEGINNING, and allow people to WITHDRAW reviews

11. Gerit Pfuhl - Accelerating open science: the collaborative replications and education project (CREP)

11.1. Pre-registration of replications

11.1.1. Simultaneously solves

11.1.1.1. File drawer (all results published)

11.1.1.2. HARKing (impossible)

11.1.1.3. P-hacking (impossible)

11.1.1.4. Low power
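
To illustrate the "low power" point above with a small simulation of my own (not from the talk): with a genuine medium-sized effect (Cohen's d = 0.5) and 20 participants per group, a standard two-sample t-test detects the effect only about a third of the time, which is part of why single small studies are unreliable and replications matter.

```python
# Own illustration of low statistical power: a real but medium-sized effect
# (Cohen's d = 0.5) with n = 20 per group is detected only a minority of the time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, d, n_sim = 20, 0.5, 5000

hits = 0
for _ in range(n_sim):
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(d, 1.0, n)        # true effect of d standard deviations
    if stats.ttest_ind(control, treated).pvalue < 0.05:
        hits += 1

print(f"power at n={n} per group, d={d}: {hits / n_sim:.0%}")   # roughly one third
```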

11.2. Replication projects in teaching

11.3. CREP Workflow

11.4. Accelerated CREP

11.5. OSF Preprints

11.6. Open Data

11.7. Open Materials

11.8. CREP OSF Site