
Frontiers' Collaborative Review

Transcript
Page 1: Frontiers' Collaborative Review

Frontiers' Collaborative Review

June 21, 2015

Marie Soulière, Ph.D. | Program Manager

Page 2: Frontiers' Collaborative Review

Frontiers Collaborative Review

- Back to basics

- How it actually works

- Future engagement and latest developments

Page 3: Frontiers' Collaborative Review

How different is it?

Key principles

- Collaborative – unites the authors and reviewers in a direct online dialogue

- Objective – peer review concentrates on technical rather than subjective concerns

Compare: BioMed Central's 'inclusive' peer review; PLOS ONE focuses on technical concerns.

- Transparent – reviewer and editor names are acknowledged on published papers

BioMed Central, BMJ and F1000Research publish names, full reports and pre-publication discussion; Copernicus runs Interactive Public Peer Review.

- Rigorous – questionnaires for each article type, with a focus on the validity of the research

Rubriq uses a ScoreCard; Peerage of Science uses pre-submission questionnaires.

Page 4: Frontiers' Collaborative Review

How efficient is it?

[Bar chart: average review time in days per field, ranging from 25 to 105 days across fields]

Average review time (submission to acceptance) per field at Frontiers

Data as of June 2015

Page 5: Frontiers' Collaborative Review

How efficient is it?

[Histogram: Review time analysis – frequency of papers by time from submission to acceptance (days)]

50% of all papers: 37-92 days
75% of all papers: 21-122 days

Average time from submission to acceptance for all Frontiers Journals: 86 days
Median time from submission to acceptance for all Frontiers Journals: 52 days

Data as of June 2015
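The "50% of all papers" and "75% of all papers" ranges read as central percentile ranges around the median. A minimal sketch of how such figures can be computed, assuming a plain list of per-paper review times in days (illustrative Python, not Frontiers' tooling):

```python
import statistics

def review_time_summary(days):
    """Summarise review times (days from submission to acceptance)."""
    octiles = statistics.quantiles(days, n=8, method="inclusive")
    return {
        "average": statistics.mean(days),
        "median": statistics.median(days),
        # central 50% of papers: 25th to 75th percentile
        "central_50": (octiles[1], octiles[5]),
        # central 75% of papers: 12.5th to 87.5th percentile
        "central_75": (octiles[0], octiles[6]),
    }

# Example with made-up review times (days), not Frontiers data
print(review_time_summary([25, 40, 52, 60, 75, 86, 90, 120, 150]))
```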

Page 6: Frontiers' Collaborative Review

How efficient is it?

PLOS ONE (2013): Average 134 days, Median* 104 days

PNAS (as of 2015): Average 152 days

Frontiers (as of 2015): Average 86 days, Median* 52 days

Open Biology, Royal Society (as of 2015): Average 96 days

eLife (2014): Median* 90 days

Journal of Applied Mathematics, Hindawi (2013): Median* 88 days

Data as of June 2015

Page 7: Frontiers' Collaborative Review

How efficient is it?

Data on iterations in the Review Forum (RF):

Average # comments posted in RF by participants: 4
Median # comments posted in RF by participants: 3

Average # manuscript resubmissions: 1.8
Median # manuscript resubmissions: 1

Collaborative? Yes. But not too much.

Data as of June 2015

Page 8: Frontiers' Collaborative Review

Positive author and reviewer feedback

Page 9: Frontiers' Collaborative Review

New Online Review Forum (2014)

Page 10: Frontiers' Collaborative Review
Page 11: Frontiers' Collaborative Review
Page 12: Frontiers' Collaborative Review

Current and future developments

- Plagiarism check for ALL submissions (iThenticate program)

- Conflicts of interest between editors and authors

- Final validation stage (all files received, permissions for figures or ethics committee approvals provided, no COI or plagiarism issues left)

- Policy on authorship changes and disputes


Page 16: Frontiers' Collaborative Review

Current and future developments

- Affiliation list selection during submission and on the Loop network

- Manuscript length check at submission

- FundRef

- Crossmark

- New Associate Editor tab in the Review Forum

- Article-type Review Questionnaires per Program

- Algorithm for Associate Editor and Review Editor invitations (illustrative sketch below)

As of June 2015
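The deck does not describe how the invitation algorithm chooses candidates; the following is a purely illustrative sketch (not Frontiers' actual implementation) of how keyword overlap and a semantic similarity score could be blended to rank Review Editors for the top-10 or top-20 automatic invitations. All names, fields and weights are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Editor:
    name: str
    keywords: set            # expertise keywords from the editor's profile
    recent_assignments: int  # crude workload proxy

def keyword_score(ms_keywords, editor):
    """Jaccard overlap between manuscript keywords and editor keywords."""
    union = ms_keywords | editor.keywords
    return len(ms_keywords & editor.keywords) / len(union) if union else 0.0

def rank_editors(ms_keywords, editors, semantic_score, top_n=10):
    """Rank candidate editors by a blend of keyword overlap and a semantic
    similarity score (e.g. from abstract embeddings); the weights and the
    workload penalty are arbitrary, for illustration only."""
    def score(ed):
        return (0.5 * keyword_score(ms_keywords, ed)
                + 0.5 * semantic_score(ed)
                - 0.05 * ed.recent_assignments)
    return sorted(editors, key=score, reverse=True)[:top_n]
```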

Page 17: Frontiers' Collaborative Review

Review Editor Smart invitations: Case study Human Neuroscience

% acceptance of manual invitations sent by month to Review Editors (RE) or external Reviewers (REV)

[Bar charts, Sept to Jan:
Acceptance rate of manual invitations to REV: 28.6%, 24.3%, 28.1%, 27.7%, 20.6%
Acceptance rate of manual invitations to RE: 22.7%, 17.7%, 23.3%, 19.7%, 18.0%]

Data as of May 2015

Page 18: Frontiers' Collaborative Review

RE Smart invitations

KPI: decrease the "not my expertise" declination rate for automatic invitations sent by month (baseline: number of declined invitations)

[Bar charts, Sep-14 to May-15, keyword-based vs semantic matching:
Top-10 invitations: 18.9%, 22.3%, 23.3%, 24.6%, 14.2%, 17.6%, 13.4%, 21.4%, 14.0%
Top-20 invitations: 23.1%, 29.3%, 29.0%, 40.6%, 21.6%, 22.3%, 20.9%, 20.2%, 22.4%]

Data as of May 2015

Page 19: Frontiers' Collaborative Review

RE Smart invitations

KPI: increase the % of Accepted and Interested responses to automatic invitations sent by month (baseline: total invitations sent in the month)

[Bar charts, Sep-14 to May-15, keyword-based vs semantic matching, two series per panel (Accepted, Interested):
Top-10 invitations: 1.0%, 2.4%, 2.5%, 1.9%, 3.7%, 5.6%, 2.7%, 4.0%, 6.4% and 0.2%, 0.6%, 1.0%, 1.3%, 0.2%, 1.0%, 0.2%, 2.0%, 0.6%
Top-20 invitations: 1.4%, 1.2%, 1.5%, 0.9%, 1.9%, 1.9%, 2.4%, 1.5%, 4.0% and 0.0%, 0.3%, 0.8%, 1.4%, 0.3%, 0.6%, 0.5%, 0.4%, 1.9%]

*Max acceptance rate is 20% (2 reviewers reached)

Data as of May 2015
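Both of the invitation KPIs above are monthly rates over invitation outcomes; a minimal sketch of the bookkeeping, assuming each automatic invitation is logged with its month and outcome (hypothetical record layout), could be:

```python
from collections import Counter, defaultdict

def monthly_invitation_kpis(invitations):
    """Compute per-month KPI rates from automatic-invitation records.

    Each record is assumed to look like {"month": "2015-01", "outcome": "accepted"},
    with outcome one of: accepted, interested, declined_no_expertise,
    declined_other, no_response (hypothetical labels)."""
    by_month = defaultdict(Counter)
    for inv in invitations:
        by_month[inv["month"]][inv["outcome"]] += 1

    kpis = {}
    for month, counts in sorted(by_month.items()):
        total_sent = sum(counts.values())
        total_declined = counts["declined_no_expertise"] + counts["declined_other"]
        kpis[month] = {
            # Page 19 KPI: Accepted / Interested over all invitations sent in the month
            "accepted_rate": counts["accepted"] / total_sent,
            "interested_rate": counts["interested"] / total_sent,
            # Page 18 KPI: "not my expertise" share of declined invitations
            "not_expertise_rate": (counts["declined_no_expertise"] / total_declined
                                   if total_declined else 0.0),
        }
    return kpis
```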

Page 20: Frontiers' Collaborative Review

RE Smart invitations

KPI: decrease the % of manuscripts delayed in review assignment (baseline: number of manuscripts submitted by month)

[Bar chart, keyword-based vs semantic matching, month-end Nov-14 to May-15: 15.0%, 13.0%, 16.0%, 13.0%, 11.8%, 12.4%, 10.6%]

Data as of May 2015

Page 21: Frontiers' Collaborative Review

Associate Editor assignment: Case study Microbiology

Acceptance rate of the Associate Editor selected by the author at submission, by manuscript submission month:
Jan-15: 28.7%, Feb-15: 21.2%, Mar-15: 33.3%, Apr-15: 28.3%, May-15: 22.8%

Reasons given by the preferred Associate Editor for declining the invitation (% per month, Jan-15 to May-15):
No expertise: 29%, 22%, 56%, 30%, 29%
No time: 68%, 72%, 38%, 67%, 50%
Others & COI: 3%, 6%, 6%, 3%, 21%

Data as of May 2015

Page 22: Frontiers' Collaborative Review

Questions?