IEEE Robotics and Automation Society

Information for ICRA Associate Editors

The Time Line for ICRA 2014

15 September 2013: Submission deadline
27 September 2013: All papers assigned to AEs
10 October 2013: Deadline for AEs to assign papers to reviewers
10 October 2013: Deadline for AEs to summarily reject papers
2 November 2013: Reviews due
19 November 2013: Deadline for AE final reports
27 November 2013: Deadline for Editor endorsements
15-16 December 2013: Senior PC Meeting (Editor attendance not needed)
15 January 2014: Paper acceptance notification

Editorial Board Structure

The Conference Editorial Board is organized into a three-layer hierarchy. The Editor-in-Chief oversees the entire Board. There are several Editors, each of whom handles a distinct set of submitted papers. The keywords for ICRA are partitioned into groups, each of which is the responsibility of exactly one Editor. When a paper is submitted to ICRA (via PaperPlaza), the author's choice of first keyword determines to which Editor the paper is assigned. The Editor then assigns the paper to an AE based on the specific keywords chosen by the AEs.

Getting Started

The review process for ICRA is managed using the PaperPlaza system. PaperPlaza provides a wide variety of tools to help AEs manage the review process. Reviewer assignments, review entry, AE reporting, and final decisions are all managed using PaperPlaza.

To access the system, go to the PaperPlaza page, click Start and then Log in. If you have forgotten your login information, you can retrieve it using the PIN management page.

It may be useful to spend a few minutes looking over the help pages, and in particular the Associate Editor's FAQ at the PaperPlaza site: PaperPlaza Help Page.

Standard Review Process

The CEB has a strict review quality policy: every submitted paper must receive at least two substantial reviews. The Associate Editor should reject any sub-standard review he or she receives, e.g. one that is too short or too shallow. It is also the AE's responsibility to ensure that reviews are constructive and do not disparage the authors' efforts, even when they have to be negative or very negative.

Avoid assigning reviewers from the same institution as the AE. While it is not explicitly forbidden to ask other AEs to perform reviews, they are likely to decline, since they will be quite busy managing their own paper load.

Selecting appropriate reviewers is critical to a quality and timely review process. Candidate reviewers may include leading researchers in the topic area, colleagues who are familiar with the topic, authors of papers on the reference list, or authors of relevant papers previously presented at ICRA or published in other high-profile journals and conferences.

A good mix of senior and junior reviewers is desirable, as they provide reviews from different perspectives and at different levels of detail. Extensive use of student reviewers should be avoided, unless they are experienced, senior-level graduate students whose review work is overseen by an advisor or mentor. A mix of reviewers from different geographical regions is also desirable. A key to securing reviewers is to align their interests with ours: the paper should be on a topic that they would genuinely like to see or to comment on. Having reviewers who are interested in reading the manuscript reduces the need for repeated reminder messages and avoids having to find last-minute additional reviewers.

Note that it is not acceptable for you to be one of the reviewers for a paper that you are handling as AE. Doing so provides too few independent opinions of the paper and is not consistent with a quality review process. If you have difficulty getting reviewers to return their reviews on time, you can ask another AE working with your Editor to help with reviews for your papers, returning the favor to the AEs who help you. Your Editor can provide contact information for other AEs who can help. However, this option should be used only when other reviewers are not responsive to their review responsibilities.

Review requests are handled by the PaperPlaza system. You begin by compiling your own list of reviewers. Follow the link Reviewers on your workspace page to do this. You can find details about this process in the PaperPlaza help pages: Compiling my reviewer list.

IMPORTANT NOTICE: PaperPlaza has a very convenient feature to help you choose your reviewers, which however has the potential for misuse. The feature lets you retrieve a list of registered users whose keywords match the paper at hand, so that a reviewer can be recruited with a single click. The misuse arises when the AE selects a reviewer whom the AE does not know and whose expertise is not established. Junior members of the community, or people who are associated with a keyword by chance or by mistake, may then be asked to review a paper in a field where they are not competent. This is clearly unacceptable, and ultimately counterproductive in terms of time, as the reviewer will probably decline or will provide a poor-quality review.

Also, please note that you should not add new users to the system unless you are certain that they are not already registered. Multiple PINs for a single user can cause significant difficulty and confusion.

After you have compiled your list of reviewers, a review can be requested by following the Reviews link for a submission, then clicking the Request a review link. You may edit the standard form letter as you please. You must click Send to actually generate the e-mail invitation to the reviewer.

Do not enlist more than 4 reviewers per paper. Doing so needlessly concentrates too much reviewer time on a single paper. The reviewer pool is limited, and we must not waste its collective time by soliciting an excessive number of reviews per paper.

Once you have 2-4 accepted reviewers for each paper, you should send these reviewers frequent reminders of the due date. The review schedule is very tight and must be followed closely to ensure a quality review process.

Once you have collected all reviews, you should prepare and submit your AE Review Summary Report. In your report, briefly summarize the reviews (please avoid copying excerpts of the reviews into your report, as the authors will see the reviews anyway), state your own opinion of the paper, and note any special circumstances that may apply. For example, if two reviews are in serious conflict, the AE Report should resolve the conflict if possible. If neither review is substantive, the AE Review Summary Report should give a solid rationale for the decision to be taken. If the reviews disagree, please do not merely give the average as your rating; as AE, your role is to settle the conflict.

The text part of your report will be seen by the authors: in this text, please do not indicate the decision you recommend, only the motivations for your recommendation. In some cases the Senior Program Committee may change the recommended decision, and we want to avoid confusing the authors. You will also give a recommended score, which is not visible to authors. Once you have submitted your report, the Editor overseeing your reviews will be able to amend your report if needed, based on calibration across multiple papers. The Editor will also be able to override your score if he or she believes it is not consistent with other similarly-reviewed papers.

In recommending your score, please consider that papers rated "A", "B+", or "B" are likely accepts, while those rated "D" are likely rejects. Papers rated "B-", "C", or "C-" are borderline: it would be useful if you provide your opinion (in the confidential comments) on whether the paper should be accepted or not. Avoid placing too many papers in the borderline ratings. Since you are the expert on the paper and the issues raised by the reviewers, you should try to lean the paper either towards acceptance or towards rejection, rather than leaving it in the middle. Note that "U" is not "E"; that is, it is not the fifth grade on the technical score scale. Rather, "U" stands for "Unsuitable" and should be used as a flag to signal "problem" papers, e.g. papers that are out of scope, suspected of partial plagiarism, incomplete, etc.

Note that once you have submitted your AE report, no further reviews can be obtained or entered into the system. When you submit your report, all pending reviews are canceled automatically. If you made a mistake or need to change the report, you must ask the Editor to reopen the paper. 

Summary Rejection

In certain special cases, an AE may recommend that a paper be rejected without sending the paper for review. In such cases, the AE writes a summary review giving the rationale for this decision, and gives the paper an unsatisfactory rating. This is done via the usual AE Review Summary Report mechanism (described above). The Editor assigned to the paper, together with the Editor-in-Chief, will then make the final determination as to whether the paper should be rejected without further review, or should go through the formal review process.

A paper should be rejected without review in any of the following cases:

  • It clearly makes no novel contribution to the state of the art.
  • It contains significant technical errors.
  • The paper has been published previously (i.e., it is identical or nearly identical to previously published work by the same authors).
  • The paper plagiarizes previously published research by other authors (please see below).

A paper should not be rejected without review merely because it makes only an incremental contribution, fails to report real-world experiments, or is poorly written. Further, a paper should not be summarily rejected because the AE feels its subject lies outside the scope of ICRA (this judgment is left to the Program Committee). If there is any doubt as to the decision, the paper should be sent for review.

No more than 10-15% of submitted papers will be rejected without review.

Plagiarism

IEEE defines plagiarism as the reuse of someone else's prior ideas, processes, results, or words without explicitly acknowledging the original author and source. It is important for all IEEE authors to recognize that plagiarism in any form, at any level, is unacceptable and is considered a serious breach of professional conduct, with potentially severe ethical and legal consequences (source: Section "8.2 Publication Guidelines" of the IEEE PSPB Operations Manual, "Guidelines for Adjudicating Different Levels of Plagiarism.")

Plagiarism cases involve serious accusations, which should be dealt with carefully. IEEE has clear policies to follow.

When a possible plagiarism case is detected by a CEB reviewer, he or she should inform the Associate Editor who assigned the review, the Editor, and the CEB Editor-in-Chief. While informing the editorial chain and ultimately the CEB Editor-in-Chief is mandatory, IEEE strongly recommends confidentiality throughout this process.

The iThenticate tool is used to detect overlaps between the submitted paper and other published documents. The output of this tool (available under "CrossCheck") should be used to evaluate whether possible plagiarism issues are present. Bibliographic overlaps, small matches, etc. should be ignored. If concerns remain, the paper should be flagged as a possible case of plagiarism (following the link associated with the paper).

For submissions that overlap with previously published papers by the same authors (or with papers simultaneously submitted elsewhere), the IEEE rules require that the submission cite the prior work and clearly state how the submitted work differs from the previous publication or simultaneous submission. There are no hard and fast rules for how much overlap with prior publications is too much; somewhere in the neighborhood of 35% or more of previously published material is a rough limit, although each submission has to be judged on its own merits. Note that overlaps with prior workshop papers (by the same authors) are usually not considered problematic, since IEEE supports the evolutionary publication paradigm of workshop papers being improved to conference papers, which are then improved to journal papers.

When a plagiarism case is detected, and the AE, Editor, and Editor-in-Chief concur that it is a genuine case, the CEB stops the review process and marks the paper for summary rejection. The Conference will send a summary rejection message with a stern comment, referring the authors to IEEE policies and warning of possible further actions (in serious cases of plagiarism, IEEE Central will contact the authors directly). The EiC submits the case, with all available evidence, to an ad-hoc committee for follow-up actions at the IEEE level, as plagiarism cases usually involve more than one publication and are of great concern to the Society.

Award candidates

ICRA features many awards:

  • Best Automation Paper (est. 1997)
  • Best Conference Paper (est. 1993)
  • Best Manipulation Paper, sponsored by Ben Wegbreit (est. 2000)
  • Best Student Paper (est. 1988)
  • Best Video Proceedings Award (est. 1992)
  • Best Vision Paper (sponsored by Ben Wegbreit) (est. 2000)
  • KUKA Service Robotics Best Paper Award (est. 2008)
  • Intuitive Surgical Best Medical Robotics Paper (est. 2009)
  • Best Cognitive Robotics Award (est. 2009)

The ICRA Program Chair has invited the CEB to designate a number of outstanding papers from which the Senior Program Committee will draw finalists for Conference Awards. Accordingly, Associate Editors are asked to identify which papers in their assignments they would consider as potential candidates for an award. To do so, please use the "Confidential comments to the Program Committee" textbox in your AE Review Summary Report form. It is not necessary to identify which award a paper would be a suitable candidate for. The actual selection of finalists for each award will be made by the Senior Program Committee, using your inputs.

Editor Endorsement of AE reports

The AE's recommendation and score, expressed in the Review Summary Report, will be reviewed by the supervising Editor, who is responsible for checking that the quality standards of the review process (including the number and depth of reviews, the quality of the AE's report, etc.) have been met. The Editor will update the score and report as needed, based on calibration across multiple papers. Any confidential remarks regarding the report or reviews should be noted in the confidential comments field. The authors will see the finalized Review Summary Report prepared by the AE and Editor (but not the score); thus, the report should be carefully written to convey the main issues raised during the review process.

Conflict of Interests

A CEB Editor is deemed to have a conflict of interest in a submitted paper if he or she is a (co-)author of the paper. 

A CEB Associate Editor or a CEB Reviewer is deemed to have a conflict of interest in a submitted paper if

A. he or she is a (co-)author of the paper; or
B. one (or more) of the authors of the paper:

B.i. is, or has been, a student or advisor of that person, or
B.ii. has co-authored a paper or has closely collaborated in a research project with that person in the previous five years, or
B.iii. is employed at the same institution (at the Department or Division level) as that person; or

C. there are any other circumstances which may create an appearance that the person might have a bias in the evaluation of the paper.

All COIs should be reported to the Editor-in-Chief.

Invited Sessions

Invited session proposals and invited papers will be handled as follows:

The AE in charge of the Invited Session Proposal (ISP) is also in charge of all its "Invited Papers" (IPs), i.e. papers submitted with a link to that ISP.

All papers linked to an ISP are considered and reviewed just like any other contributed paper; that is, the AE will obtain two independent reviews and draft a Review Summary Report based purely on the technical merits of the individual paper. Reviewers should not even be informed of the underlying invited session proposal.

The AE in charge of the ISP will draft an overall report recommending acceptance or rejection of the session itself. This decision shall take into account the timeliness of the proposal, its quality, organization, and expected impact, as well as the quality of the papers submitted for that session. It might thus happen that a good session proposal is recommended for rejection because the associated papers are of poor quality. Likewise, a poor session proposal should be recommended for rejection even if the associated papers are all of good quality.

Good papers submitted as invited papers, whose session is eventually turned down by the IPC, will nonetheless be accepted and presented in regular sessions. Good session proposals for which only a few good invited papers were submitted and accepted may be complemented by the IPC with other accepted papers in the area. These aspects of how sessions are formed pertain to the IPC and need not concern the AEs.