Information for ICRA Reviewers
Reviewer Timeline for ICRA 2017
| Date | Milestone |
| --- | --- |
| 15 September 2016 | Submission deadline |
| 2 October 2016 | AEs begin to assign papers to reviewers |
| 15 October 2016 | Deadline for AEs to assign papers to reviewers and confirm |
| 12 November 2016 | Deadline for reviewers to submit reviews |
| 30 November 2016 | Deadline for AE final reports (AEs may request reviewers to revise reviews before this deadline) |
| 9 December 2016 | Deadline for Editor endorsements of AE reports |
| 9 December 2016 | Deadline for AEs to nominate reviewers for the Best Reviewer Award |
| 6-7 January 2017 | Senior PC Meeting |
| 15 January 2017 | Paper acceptance notification |
| 25 February 2017 | Final paper submission deadline |
Overview of Review Process and Reviewer Responsibilities
The ICRA Conference Editorial Board (CEB) consists of one Editor-in-Chief, approximately 15 Editors, approximately 350 Associate Editors (AEs), and several thousand Reviewers. Each submitted paper is assigned to one Editor and to one of the AEs whom that Editor supervises. The AE is responsible for obtaining at least two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the review outcome. The Editors are responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible. The task of the Reviewers is to provide a high-quality assessment of each paper assigned to them. This assessment includes a statement of the reviewer's own confidence, ratings on several criteria for the paper, an optional confidential statement to the CEB, and a detailed review describing the contribution of the paper and justifying the overall assessment.
This page focuses on issues for Reviewers.
Conflicts of Interest
A Reviewer is deemed to have a conflict of interest in a submitted paper if:
A. he or she is a (co-)author of the paper; or
B. one (or more) of the authors of the paper:
B.i. is, or has been, a student or advisor of the Reviewer, or
B.ii. has co-authored a paper or closely collaborated in a research project with the Reviewer within the previous five years, or
B.iii. is employed at the same institution (at the Department or Division level) as the Reviewer; or
C. there are any other circumstances that may create the appearance that the Reviewer might be biased in evaluating the paper.
Scoring Guidelines for Reviewers
The reviews help to maintain the quality of ICRA. Each review is required to meet a threshold of 1,200 non-whitespace characters, the purpose of which is to ensure useful and constructive feedback for the authors. Please refrain from filling this space with meaningless or repeated text. Reviewers should respect the time the authors have invested and should not accept a review assignment unless they are willing to provide a meaningful review. The letter-grade system used for scoring papers is described below.
| Grade | Recommendation | Description |
| --- | --- | --- |
| A / 5.0 | Definitely accept | Top 15% of accepted ICRA papers, an excellent paper. I advocate and will fight for acceptance. |
| B+ / 4.5 | Accept | A great paper. I will strongly argue for acceptance. |
| B / 4.0 | High Borderline | I am leaning to accept. This paper should be accepted, although I would not be upset if it were rejected. |
| B- / 3.5 | Borderline | I am undecided; I would not be upset if it were accepted or rejected. |
| C / 3.0 | Low Borderline | I am leaning to reject. This paper should be rejected, although I would not be upset if it were accepted. |
| C- / 2.5 | Reject | The paper needs substantial improvements. I will strongly argue for rejection. |
| D / 2.0 | Definitely Reject | The paper is trivial, wrong, or already known. It is clearly below ICRA conference quality; I assume no further discussion is needed. |
| U / 1.0 | Inappropriate | This does not fit the conference or its standards. |
If you suspect plagiarism or multiple submission (sometimes referred to as self-plagiarism), please notify your AE but proceed to review the paper in the standard way. Lack of citation of prior work should be noted in your review, but reviewers should not directly accuse authors of "plagiarism". Other members of the CEB and, if necessary, IEEE committees will determine whether plagiarism has occurred.