IEEE Robotics and Automation Society

IROS Venues and Archive

IROS 2015 Hamburg, Germany
28 September-3 October
General Chair: Jianwei Zhang
Program Chair: Alois Knoll

IROS 2014 Chicago, IL USA
14-18 September
General Chair: Kevin Lynch
Program Chair: Lynne Parker

IROS 2013 Tokyo, Japan
3-8 November
General Chair: Shigeki Sugano
Program Chair: Makoto Kaneko

IROS 2012 Vilamoura, Algarve, Portugal
7-12 October 
General co-Chairs: Anibal T. de Almeida, Urbano Nunes
Program Chair: Eugenio Guglielmelli

IROS 2011 San Francisco, CA USA
25-30 September 
General Chair: Oussama Khatib
Program Chair: Gaurav Sukhatme

IROS 2010 Taipei, Taiwan
18-22 October 
General Chair: Ren C. Luo
Program Chair: Huei-Yung Lin

IROS 2009 St. Louis, MO, USA
11-15 October 
General Chair: Ning Xi
Program Chair: Zhidong Wang

IROS 2008 Nice, France
22-26 September 
General co-Chairs: Raja Chatila, Jean-Pierre Merlet 
Program Chair: Christian Laugier

IROS 2007 San Diego, CA, USA
29 October-2 November

IROS 2006 Beijing, China
9-15 October 

IROS 2005 Edmonton, Canada
2-6 August 
General Chair: Max Meng
Program Chair: Hong Zhang

IROS 2004 Sendai, Japan
28 September-2 October 

IROS 2003 Las Vegas, NV, USA
27-31 October 

IROS 2002 Lausanne, Switzerland
30 September-4 October
General Chair: Roland Siegwart
Program Chair: Christian Laugier

IROS 2001 Maui, HI USA
29 October-3 November

IROS 2000 Takamatsu, Japan
30 October-5 November 

IROS 1999 Kyongju, Korea
17-21 October 

IROS 1998 Victoria, Canada
13-17 October 

IROS 1997 Grenoble, France
7-11 September 

IROS 1996 Osaka, Japan
4-8 November 

IROS 1995 Pittsburgh, PA USA
5-9 August 

IROS 1994 Munich, Germany
12-16 September  

IROS 1993 Tokyo, Japan
26-30 July  

IROS 1992 Raleigh, NC USA
7-10 July  

IROS 1991 Osaka, Japan
3-5 November 

IROS 1990 Ibaraki, Japan
3-6 July  

IROS 1989 Tsukuba, Japan
4-6 September 

IROS 1988 Tokyo, Japan
31 October-2 November 

IROS Conference Paper Review Board

The IROS Conference Paper Review Board (CPRB) is organized in the same way as the ICRA CEB. There is one Editor-in-Chief (EiC), up to 17 Editors (EDs), and more than 200 Associate Editors (AEs). Each paper to be reviewed will be assigned to one Editor and to one of the AEs whom that Editor supervises. The AE is responsible for obtaining a minimum of two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the recommended decision regarding the paper's acceptance and presentation type. The AEs also help to identify papers to be considered for awards. The Editors are responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible.

The ultimate responsibility for the conference content, acceptance rate, symposia, etc. rests with the Program Committee. The purpose of the CPRB is to ensure a high quality, robust, reliable and consistent review process.


Wolfram Burgard

Conference Paper Review Board, Editor-in-Chief 2014-2016


University of Freiburg
Department of Computer Science
Freiburg, Germany
http://www.informatik.uni-freiburg.de/~burgard/

 

Nancy Amato

Conference Paper Review Board, Editor-in-Chief 2011-2013


Texas A&M University
Parasol Laboratory, Algorithms & Applications Group
College Station (TX), United States


 

2014 CPRB Editors

Alois Knoll, TU Munich, Germany (Profile AK)
Adriana Tapus, ENSTA-ParisTech, France (Profile AT)
Danica Kragic, KTH, Sweden (Profile DK)
Dezhen Song, TAMU, USA (Profile DS)
Eiichi Yoshida, AIST, Japan (Profile EY)
Fumihito Arai, Nagoya University, Japan (Profile FA)
Giuseppe Oriolo, Università di Roma La Sapienza, Italy (Profile GO)
Jose A. Castellanos, University of Zaragoza, Spain (Profile JC)
Jan Peters, TU Darmstadt, Germany (Profile JP)
Jean-Pierre Merlet, INRIA Sophia Antipolis, France (Profile JPM)
Jing Xiao, University of North Carolina at Charlotte, USA (Profile JX)
Maren Bennewitz, University of Freiburg, Germany (Profile MB)
Maria Gini, University of Minnesota, USA (Profile MG)
Shinichi Hirai, Ritsumeikan University, Japan (Profile SH)
Simon Lacroix, LAAS, France (Profile SL)
Sang-Rok Oh, KIST, Korea (Profile SRO)
Siddhartha Srinivasa, Carnegie Mellon University, USA (Profile SS)
Tatsuo Arai, Osaka University, Japan (Profile TA)

 

2013 CPRB Editors

Wolfram Burgard - University of Freiburg, Germany (Profile WB)
Martin Buss - TU Munich, Germany (Profile MB)
Edwardo Fukushima - Tokyo Institute of Technology, Japan (Profile EF)
Maria Gini - University of Minnesota, USA (Profile MG)
Masashi Konyo - Tohoku University, Japan (Profile MK)
Danica Kragic - KTH, Sweden (Profile DK)
Simon Lacroix - LAAS, France (Profile SL)
Cecilia Laschi - Scuola Superiore Sant'Anna, Italy (Profile CL)
Kevin Lynch - Northwestern University (Profile KL)
Alison Okamura - Stanford University, USA (Profile AO)
Sang-Rok Oh - KIST, Korea (Profile SRO)
Adriana Tapus - ENSTA-ParisTech, France (Profile AT)
Ning Xi - Michigan State University, USA (Profile NX)
Jing Xiao - University of North Carolina at Charlotte, USA (Profile JX)
Mark Yim - University of Pennsylvania, USA (Profile MY)
Eiichi Yoshida - AIST, Japan (Profile EY)
Hongbin Zha - Peking University, China (Profile HZ)

 2013 Associate Editors

 

 2012 CPRB Editors

Wolfram Burgard, University of Freiburg, Germany;
Gordon Cheng, Technical University of Munich, Germany;
Paolo Fiorini, University of Verona, Italy;
Li-Chen Fu, National Taiwan University, Taiwan;
Maria Gini, University of Minnesota, USA;
David Hsu, National University of Singapore, Singapore;
Cecilia Laschi, Scuola Superiore Sant'Anna, Italy;
Marcia O'Malley, Rice University, USA;
Sang-Rok Oh, KIST, Korea;
Mihoko Otake, Tokyo University, Japan;
Satoshi Tadokoro, Tohoku University, Japan;
Adriana Tapus, ENSTA-ParisTech, France;
Ning Xi, Michigan State University, USA;
Jing Xiao, University of North Carolina at Charlotte, USA;
Mark Yim, University of Pennsylvania, USA

2012 IROS Associate Editors

2012 Videos Editor

João Barreto, University of Coimbra, Portugal

2012 Videos Associate Editors

 

2011 CPRB Editors

I-Ming Chen, Nanyang Technological University, Singapore
Alessandro De Luca, Universita di Roma "La Sapienza", Italy
Chad Jenkins, Brown University, USA
Danica Kragic, Royal Institute of Technology, Sweden
Nikos Papanikolopoulos, University of Minnesota, USA
Frank Park, Seoul National University, Korea
Lynne Parker, University of Tennessee, Knoxville, USA
Shigeki Sugano, Waseda University, Japan
Frank van der Stappen, University of Utrecht, The Netherlands

2011 IROS Associate Editors

Information for IROS Associate Editors

IROS Conference Review Board (ICRB) Timeline for 2013

22 March 2013 Submission deadline
29 March 2013 Papers assigned to AEs
5 April 2013 Deadline for AEs to assign papers to reviewers
5 April 2013 Deadline for AEs to summarily reject papers
1 May 2013 Deadline for reviewers to submit reviews
17 May 2013 Deadline for AE final reports
29 May 2013 Deadline for Editor endorsements of AE reports
June 2013 Executive PC Meeting (ICRB Editor-in-Chief attends)

 

Overview of Review Process and Associate Editor Responsibilities

The IROS Conference Review Board (ICRB) is organized in the same way as the ICRA CEB. There is one Editor-in-Chief (EiC), 17 Editors (EDs), and more than 200 Associate Editors (AEs). Each paper to be reviewed will be assigned to one Editor and to one of the AEs whom that Editor supervises. The AE is responsible for obtaining a minimum of two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the recommended decision regarding the paper's acceptance and presentation type. The AEs also help to identify papers to be considered for awards. The Editors are responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible.

Assignment of Papers to Editors and AEs:
For IROS 2013, the keywords are partitioned into sets, each of which is the responsibility of one Editor. When a paper is submitted to the conference (via PaperPlaza), the author's choice of keywords is used to determine the Editor and AE to which the paper is initially assigned. This initial assignment, done automatically by the system, is then reviewed by the Editor-in-Chief and the Editors to avoid conflicts of interest (COI), to improve the match between the expertise of the AE and the paper, and to balance the load across the AEs. Finally, the AEs must review the papers assigned to them and inform the supervising Editor of any COI they may have with their assigned papers.
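The keyword-partition mechanism above can be sketched as follows. This is a hypothetical illustration only: the editor codes and keyword-to-Editor mapping below are examples, not the actual IROS 2013 partition, and the real system additionally applies manual COI checks and load balancing.

```python
# Hypothetical sketch of the keyword-based initial assignment described above.
# The keyword-to-Editor partition here is illustrative, not the real one.
from collections import Counter

KEYWORD_TO_EDITOR = {
    "Motion and Path Planning": "KL",
    "Kinematics": "KL",
    "Computer Vision": "DK",
    "Visual Learning": "DK",
    "Humanoid Robots": "EY",
}

def initial_editor(paper_keywords):
    """Pick the Editor whose keyword set matches most of the paper's keywords."""
    votes = Counter(
        KEYWORD_TO_EDITOR[kw] for kw in paper_keywords if kw in KEYWORD_TO_EDITOR
    )
    return votes.most_common(1)[0][0] if votes else None

print(initial_editor(["Computer Vision", "Visual Learning", "Kinematics"]))  # DK
```

Because the keywords are a partition (each keyword belongs to exactly one Editor's set), a paper with keywords from several sets needs a tie-breaking rule; a simple majority vote is used here for illustration.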

AE Tasks and Responsibilities:
Once papers have been assigned and the AE has been notified to get started, the AE should follow these steps:

  1. As soon as possible, the AE should check the papers they have been assigned and make sure that they do not have a conflict of interest (COI) with any of them. If a COI exists, they should notify the managing Editor and the EiC immediately so that the paper can be assigned to another AE.
  2. By the posted deadline (typically 1 week after paper assignments are made), the AE should either:
       • identify papers to be recommended for summary rejection (rejected without review), or
       • determine appropriate reviewers and request reviews from them in PaperPlaza.
  3. The AE should monitor the review process as reviews come in. If reviews are not substantial enough or are not prepared in an appropriate way, the AE should ask the reviewer to provide an improved review. In extreme cases, the AE may need to request an additional review if a reviewer does not provide a review or does not provide an appropriate one. If difficult situations arise, the AE should consult with the managing Editor and/or the EiC.
  4. After a sufficient number of quality reviews (at least 2) have been obtained, the AE should prepare the AE report providing their recommendation on the paper. This should also include a recommendation on the presentation type (regular oral presentation or multimedia presentation) if the paper is to be accepted, and a recommendation on whether the paper should be considered for an award.
  5. After the AE report has been submitted, the supervising Editor will review the AE recommendation and either provide the Editor Endorsement or suggest modifications.

Note about Organized Sessions: From the perspective of AEs and Editors, papers submitted to organized sessions will be handled in the same way as any other paper. The decision regarding whether an Organized Session Proposal (ISP) will be accepted, and if so, which papers will be included in it, will be handled by the Senior Program Committee (SPC).

 

Getting Started

The review process for IROS is managed using the PaperPlaza system. PaperPlaza provides a wide variety of tools to help AEs manage the review process. Reviewer assignments, review entry, AE reporting, and final decisions are all managed using PaperPlaza.

To access the system, go to the PaperPlaza page, click Start and then Log in. If you have forgotten your login information, you can retrieve it using the PIN management page.

It may be useful to spend a few minutes looking over the help pages, and in particular the Associate Editor's FAQ at the PaperPlaza site: PaperPlaza Help Page.

 

Summary Rejection

In certain special cases, an AE may recommend that a paper be rejected without sending the paper for review. In such cases, the AE writes a brief summary review giving the rationale for this decision, and gives the paper an Unsatisfactory rating. This is done via the usual AE Report mechanism (described below). The Editor assigned to the paper will then make the final determination as to whether the paper should be rejected without further review, or should go through the formal review process.

A paper should be rejected without review in any of the following cases:

  • It clearly makes no novel contribution to the state of the art.
  • It contains significant technical errors.
  • The paper has been published previously (i.e., the paper is identical, or nearly identical, to previously published work by the same authors).
  • The paper plagiarizes previously published research by other authors.

A paper should not be rejected without review merely because it makes only an incremental contribution, fails to report real-world experiments, or is poorly written. Further, a paper should not be summarily rejected because the AE feels its subject lies outside the scope of IROS (this judgment is left to the Program Committee). If there is any doubt as to the decision, the paper should be sent for review.

No more than 10-15% of submitted papers will be rejected without review.

 

Selecting Appropriate Reviewers

The ICRB has a strict review quality policy: every submitted paper must receive at least two substantial reviews. An AE should reject any sub-standard review received (e.g., one that is too short or shallow). It is also the AE's responsibility to ensure that reviews are constructive and do not diminish the authors' efforts, even when they have to be negative or very negative.

Selecting appropriate reviewers is critical to a high-quality and timely review process. The most important criterion is that the reviewer has the appropriate expertise. Candidate reviewers may include leading researchers in the topic area, colleagues who are familiar with the topic, authors of papers on the reference list, and authors of relevant papers previously presented at IROS or in other high-profile journals and conferences. Generally, a key to securing reviewers is to align their interests with ours: the paper is on a topic that they would really like to see or to say something about. Having reviewers who are interested in reading the manuscript reduces the need to repeatedly send reminder messages, and avoids the need for AEs to write detailed comments themselves to supplement shallow and off-the-point reviews.

In addition to having appropriate expertise, the following provides some other requirements and guidelines in selecting reviewers.

  • The reviewer cannot have a conflict of interest with the paper.
  • An AE cannot provide a review for the paper.
  • At most one of the reviewers for a paper can be at the same institution as the AE for the paper.
  • A mix of reviewers from different geographical regions is also desirable.
  • While it is not explicitly forbidden to ask other AEs to perform reviews, it is likely that they will decline, since they will be quite busy managing their own paper load.
  • A good mix of senior and junior reviewers is desirable as they provide reviews from different perspectives and at different levels of detail.

IMPORTANT NOTICE: PaperPlaza has a very nice feature to help you choose your reviewers, which however has a potential for misuse. The feature is that you can get a list of registered users whose keywords match the paper at hand, so that a reviewer can be recruited with a single click. The feature is abused when an AE selects a reviewer whom the AE does not know, and whose expertise is not proven. It may thus happen that junior members of the community (e.g., undergraduate students), or simply people associated with a keyword by chance or mistake, are asked to review in a field where they are not competent. This is clearly unacceptable, and ultimately counterproductive in terms of time, as the reviewer will probably decline, or the Editors will discard the review.

Also, please note that you should not add new users to the system unless you are certain that they are not already registered. Multiple PINs for a single user can cause significant difficulty and confusion.

 

Requesting Reviews in PaperPlaza

Review requests are handled by the PaperPlaza system. After you have compiled your list of reviewers, a review can be requested by following the Reviews link for a submission, then clicking the Request a review link. You may edit the standard form letter as you please. You must click Send to actually generate the e-mail invitation to the reviewer.

You can find details about this process in the PaperPlaza help pages: Compiling my reviewer list.

 

Preparing and Submitting an AE Report

Once you have collected all reviews, you should prepare and submit your AE report. You can find details about the report process in the PaperPlaza Help pages available after you have logged in. Please note that we will not be using the rebuttal process.

We would like the AE reports to have a common structure that:

  • First, summarizes the paper topic/contribution (could be brief, even just 1-2 sentences).
  • Next, notes the main strengths and weaknesses of the paper, summarizing the main points from the reviews and any additional issues identified by the AE. Please avoid copying excerpts of the reviews into your report, as authors will see the reviews.
  • Then, notes any special circumstances that may apply. For example, if two reviews are in serious conflict, the AE Report should resolve the conflict if possible. If neither review is substantive, the AE Report should give a solid rationale for the decision to be taken.

Here are some guidelines and reminders on what to put (or not to put) in your report and how to determine your scores:

  • Please do not charge or mention plagiarism in the comments to the authors. That should appear in the confidential comments only.
  • Please do not mention a decision (reject or accept) in the part of the AE report that is meant for the authors; give only the motivations for your recommendation. The reason is that your recommendation goes to the Program Committee, and while the committee usually agrees with the AE, in some cases it may change the recommended decision; we want to avoid confusing the authors if that happens.
  • You will give a recommended score, which is not visible to authors. If the reviews disagree, please do not merely give the average as your rating; as AE, your role is to settle the conflict. In recommending your score, please consider that "A" means "definitely accept", "B+" means "accept", "B" means "high borderline", "C" means "low borderline", and "C-" and "D" mean "reject". Ratings of "B-" are discouraged and should not be used. Notice that "U" is not "F", i.e., it is not the next lowest grade on the technical score scale. Rather, "U" stands for "Unsuitable", and should be used as a flag to signal "problem" papers - e.g., out of scope, suspected of partial plagiarism, incomplete, etc.
  • Note that once you have submitted your AE report, no further reviews can be obtained or entered into the system. When you submit your report, all pending reviews are canceled automatically. If you made a mistake or otherwise need to change the report, you must ask the Profile Editor to reopen the paper.

 

Requesting Reviewers to Revise Reviews

Request revision of the review. After checking a review that has been submitted, this link may be used to request that the reviewer prepare and submit a revised review. The form letter "Request to submit a revised review" is available for this purpose; this is a conference-wide form letter that may be individualized on the Personalized form letters page. Revision of a review may be requested if the status of the submission is Under review or Decision pending. When the request is sent to the reviewer, the status of the review is reset to 'Saved' so that the reviewer may re-access and resubmit the review. The usual option to save a review is not available to the reviewer for such reviews.

Restoring reviews to original after revision requested. To restore a review for which revision was requested to the original review, click on the link "Revision requested on..." for the review in question. This opens an information box with a link to view the original review and another link to restore the original review.

NOTE: Disposition of unreceived revised reviews. Reviews for which a revision was requested but not received are treated by the system as reviews that were requested but not submitted. If they are not restored to the original review, they are automatically canceled when the Review summary report is submitted for the submission.

 

AE Recommendation of Presentation Type (Oral or Interactive)

All papers accepted to IROS 2013 will undergo the same review process and will be allocated the same number of pages in the proceedings. Each accepted paper will be presented at the conference in one of two formats: a regular oral presentation, or an interactive presentation in a multimedia session, somewhat similar to a poster presentation but with multimedia capabilities. The determination of the presentation type will be based on which format best suits that particular paper.

Hence, when making a recommendation on the disposition of the paper, AEs will also be asked to provide a recommendation on the type of presentation that is best suited for that paper.

 

AE Recommendation for Award Candidates

IROS 2013 features a number of awards. The ICRB is asked to help in the selection of these awards. In particular, the ICRB will designate a number of outstanding papers from which the Senior Program Committee will draw finalists for Conference Awards. Accordingly, Associate Editors are asked to identify which papers in their assignments they would consider potential candidates for an award. To do so, please use the "Confidential comments to the Program Committee" textbox in your AE Report form. It is not necessary to identify which award a paper would be a suitable candidate for. The actual selection of finalists for each award will be made by the Senior Program Committee, using your inputs.

 

Editor Endorsement of AE reports

The AE's recommendation, expressed in the report, will be reviewed by the supervising Editor, who is responsible for checking that the quality standards of the review process have been met (including the number and depth of reviews, and the significance and decisiveness of the AE's report). The Editor will issue a brief statement for each recommendation, endorsing the correctness and completeness of the reviewing procedure.

Editors will also overview and endorse the processing of award candidates by the AEs.

 

Conflict of Interests

An ICRB Editor, Associate Editor, or Reviewer is deemed to have a conflict of interest with a submitted paper if:
A. he or she is a (co-)author of the paper; or
B. one (or more) of the authors of the paper:

  • is, or has been, a student or advisor of that person, or
  • has co-authored a paper or has closely collaborated in a research project with that person in the previous five years, or
  • is employed at the same institution (at the Department or Division level) as that person; or

C. there are any other circumstances which may create an appearance that the person might have a bias in the evaluation of the paper.

 

Plagiarism

Plagiarism cases involve serious accusations, which should be dealt with carefully. IEEE has clear policies to follow. IEEE defines plagiarism as the reuse of someone else's prior ideas, processes, results, or words without explicitly acknowledging the original author and source. It is important for all IEEE authors to recognize that plagiarism in any form, at any level, is unacceptable and is considered a serious breach of professional conduct, with potentially severe ethical and legal consequences (source: Section "8.2 Publication Guidelines" of the IEEE PSPB Operations Manual, "Guidelines for Adjudicating Different Levels of Plagiarism." )

CrossCheck database and iThenticate tool
IROS 2013 has access to the CrossCheck database, an initiative to prevent scholarly and professional plagiarism. Every submission will receive a plagiarism similarity score. The score and scan reports are generated by an external provider (iThenticate); the scan reports are stored on the iThenticate servers and are not downloaded to the conference submission system servers. Eventually the reports are deleted from the iThenticate servers at a time determined by conference and provider policy, after which they are no longer available.

IMPORTANT: It is not possible to draw any conclusion from the iThenticate numerical score alone. Unfortunately, the iThenticate algorithms produce a number of false positives. One issue is that the score is cumulative, so that, e.g., 1% similarity with each of 40 papers is shown as 40% similarity. Another issue is that there may be large similarity without plagiarism. For example, if an author has a version of their paper as a technical report or in a public dropbox somewhere, it might get a very high (e.g., 99% or 100%) similarity score. Hence, the detailed report must be examined to determine whether there is indeed a case of plagiarism. Also, some papers will not scan properly due to font problems.
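The cumulative nature of the score can be illustrated with a small calculation. This is an assumed, simplified model of the scoring behavior described above, for illustration only:

```python
# Illustration (assumed scoring behavior): per-source overlaps are summed,
# so many tiny overlaps can produce an alarming-looking total score.
per_source_overlap = [0.01] * 40          # 1% similarity with each of 40 papers
cumulative_score = sum(per_source_overlap)
print(f"{cumulative_score:.0%}")          # reported as 40%
```

No single source here overlaps by more than 1%, yet the headline number is 40%, which is why the detailed per-source report, not the total, must drive any plagiarism judgment.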

The iThenticate reports are available to you as an Associate Editor and to your Editor, but they are NOT available to reviewers. To access the report for a particular paper:

  • click on the "Workspace" link
  • On your Workspace, there is a column labeled "Plagiarism scan"; it shows the percentage overlap determined.
  • Click on the "Go to the plagiarism scan page" link for the paper in the "Plagiarism scan" column
  • Click on the "View" link in the "Report" column to see the plagiarism report. You should review the report for all of your papers, and pay particular attention if the score is 40% or higher.
  • Also on that page you can set one of two plagiarism flags, "Possible case of plagiarism" if you think there is plagiarism, and "Plagiarism report needs to be followed up" if you are not sure but think it should be followed up. If you determine that you need to set either of these flags, please also email your Editor to let them know to take a look at this paper.

Self-plagiarism. A paper is self-plagiarized if it includes substantial overlap with the authors' previously published paper(s) and the previously published paper is not cited in the references and/or the contribution of the current paper over those other papers is not described in the current submission, both of which are required by IEEE policy.

The AE should be able to determine if self-plagiarism is a concern by reviewing the paper and the plagiarism report. If this is considered to be the case, then the AE should set one of the plagiarism flags and inform their managing Editor. If the Editor agrees, then they will inform the Editor-in-Chief who will also review the paper and the report. If the EiC concurs, then the paper is a candidate for summary rejection and the AE will be asked to prepare a report describing the reason for the summary rejection.

The Process
Notes:

  • The similarity score and iThenticate report are available to AEs and to Editors, but not to reviewers.
  • Discretion and confidentiality are extremely important. Reviewers, AEs, and Editors should not discuss the details or names of potential plagiarism cases with anyone other than the persons above them in the chain; e.g., an AE could discuss a case with their supervising Editor or the Editor-in-Chief, but not with another Editor or AE.
  • Comments related to plagiarism should be in the confidential comments for the PC and should not be mentioned in the comments for the authors. It will be up to the plagiarism process to determine what actions to take and what to report to the authors.

AE Procedures:

  1. For all papers whose similarity score is 40% or above, the AE should review the paper and the iThenticate report.
  2. The AE should provide a report regarding their findings in the confidential comments of the AE report.
  3. The AE should indicate their determination regarding potential plagiarism in the plagiarism report section of the AE report.
  4. If the AE indicated "plagiarism report needs to be followed up" or "possible case of plagiarism", then they should alert their Editor.
  5. Unless instructed otherwise by their Editor or the Editor-in-Chief, the AE should follow the review process normally for the paper by obtaining reviews and making a recommendation for acceptance based on the reviews and their own technical evaluation of the paper. It is important that this is done so that the paper can be treated fairly if the plagiarism alert is determined to be unfounded.

Plagiarism may also be spotted and reported by reviewers (recall that they do not have access to the iThenticate report). If a reviewer detects a potential case of plagiarism, they should document their concerns in the confidential comments portion of their review and alert their AE. It is important that the reviewer be factual in their remarks, and that as much detailed evidence as possible be provided - for instance, a copy of the supposedly plagiarized paper with the copied parts highlighted. Note also that freely available software exists that can detect plagiarism automatically; if such a tool was used, details on the query and its outcomes would also be useful. Based on this information, the AE should handle the reviewer's alert in the same fashion as an alert prompted by the iThenticate report.

 

Organized Sessions and Organized Papers

Organized session proposals and organized papers will be handled as follows:

  • Organized Session Proposals (ISPs) will be reviewed by the Executive Program Committee (EPC).
  • Organized papers (those linked to an ISP) will be considered and reviewed just like any other contributed paper - that is, the AE will get two independent reviews, and draft a recommendation purely based on the technical merits of the individual paper. Reviewers should not even be informed of the underlying organized session proposal.
  • Good papers submitted as organized, whose session is eventually turned down by the SPC, will nonetheless be accepted and presented in regular sessions. Good session proposals for which only a few good papers were submitted as organized and accepted by the IPC might be integrated by the SPC with other accepted papers in the area. These aspects of how sessions will be formed pertain to the SPC, and should not concern the AEs or Editors.

IROS Keywords

Profile Keyword
AO Medical Robots and Systems
AO Rehabilitation Robotics
AO Telerobotics
AO Soft-tissue Modeling
AO Surgical Robotics
AO Teleoperated Surgical Systems
AO Medical Systems, Healthcare and Assisted Living
AT Human-Robot Interaction
AT  Voice, Speech Synthesis  and Recognition
AT  Physical Human-Robot Interaction 
AT  Human Performance Augmentation 
AT  Gesture, Posture, Social Spaces and Facial Expressions
AT  Computer-assisted Diagnosis and Therapy
AT  Humanoid Ethics and Philosophy 
AT  Learning from Demonstration 
CL Biologically Inspired Robots
CL  Biomimetics
CL  Flexible Arms
CL  Evolutionary Robotics
CL Neurorobotics
DK Computer Vision
DK  Visual Learning
DK  Gripper Hand design 
DK  Perception for Grasping and Manipulation 
EF Legged Robots
EF  Wheeled Robots 
EF  Mechanism Design 
EF  Dynamics 
EF  Surveillance Systems 
EF  Robotics in Hazardous Fields 
EF  Energy and Environment Monitoring and Management 
EF  Mining and Demining
EY Humanoid Robots
EY  Humanoid and Bipedal Locomotion 
EY  Motion Control 
EY  Human-Humanoid Interaction 
EY  Self-Organized Robot Systems
EY  Manipulation and Compliant Assembly
EY  Humanoid Interaction
EY  Compliant Assembly
EY  Multifingered Hands 
EY  Humanitarian Technology for Energy, Environment and Safety
EY  Swarm Robots
EY  Motion and Trajectory Generation 
EY  Task Planning 
HZ Localization
HZ Recognition
HZ  Visual Navigation 
HZ  Human Detection and Tracking 
HZ  Omnidirectional Vision 
JX Grasping
JX  Intelligent Transportation Systems
JX  Collision Detection and Avoidance 
JX  Dexterous Manipulation 
JX  Manipulation Planning and Control
JX  Path Planning for Manipulators
JX  Mobile Manipulation
JX Nonholonomic Motion Planning
JX Contact Modeling
JX Cooperative Manipulators
JX Integrated Planning and Control
JX Robotics in Construction
KL Motion and Path Planning
KL  Kinematics
KL  Joint/Mechanism
KL  Parallel Robots
KL  Sensor Networks 
KL  Underactuated Robots 
KL  Parts Feeding and Fixturing
KL  Communication-aware Sensor and Motion Planning
KL  Factory Automation
KL  Integrated Task and Motion Planning 
KL  Manufacturing and Production Systems 
KL  Variable Stiffness Actuator Design and Control 
KL  Smart Infrastructures 
MB Learning and Adaptive Systems
MB  Robot Safety
MB  Adaptive Control
MB  AI Reasoning Methods 
MB  Force Control 
MB  Formal Methods in Robotics and Automation 
MB  Cognitive Human-Robot Interaction 
MB  Human Centered Planning and Control 
MB  Human Centered Automation 
MB  Human and Humanoid Skills/Cognition/Interaction 
MG Distributed Robot Systems
MG  Software and Architecture
MG  Cooperating Robots 
MG  Autonomous Agents
MG  Planning, Scheduling and Coordination 
MG  Control Architectures and Programming 
MG  Animation and Simulation 
MG  Agent-Based Systems 
MG  Behavior-Based Systems 
MG  Programming Environment 
MG  Architectures, Protocols and Middle-ware for Networked Robots 
MG  Multi-Robot Coordination
MG  Performance Evaluation and Benchmarking 
MK Haptics and Haptic Interfaces
MK  Force and Tactile Sensing
MK  Search and Rescue Robots
MK  Compliance and Impedance Control 
MK  Virtual Reality and Interfaces
MK  Hydraulic/Pneumatic Actuators 
MK  Smart Actuators 
MY Aerial Robotics
MY  New Actuators for Robotics 
MY  Cellular and Modular Robots 
MY  Climbing Robots
MY  Redundant Robots
MY  Tendon/Wire Mechanism
MY  Green Manufacturing 
NX Visual Servoing
NX Visual Tracking
NX Calibration and Identification
NX Micro/Nano Robots
NX Industrial Robots
NX Micro-Manipulation
NX Networked Robots
NX Neural and Fuzzy Control
NX Nano Manipulation
NX Intrusion Detection, Identification and Security
NX Nano Assembly
NX Nano Automation
NX Nano Manufacturing
SL Mapping
SL  Marine Robotics 
SL  Field Robots 
SL  Space Robotics and Automation
SL  Failure Detection and Recovery 
SL  Reactive and Sensor-Based Planning 
SL  Robotics in Agriculture and Forestry 
SL  Unmanned Aerial Vehicles 
SL  Unmanned Aerial Systems 
SL  Sensor-Based Planning 
SRO Service Robots
SRO  Cloud Robotics 
SRO  Robot Companions and Social Human-Robot Interaction
SRO  Domestic Robots and Home Automation 
SRO  Education Robotics 
SRO  Personal Robots 
SRO  Automation in Life Sciences: Biotechnology, Pharmaceutical and Health care 
SRO  Entertainment Robotics 
SRO Brain Machine Interface
SRO Networked Teleoperation
SRO Ubiquitous Robotics
SRO Intelligent Toys
WB SLAM
WB Sensor Fusion
WB Path Planning for Multiple Mobile Robots or Agents
WB Navigation
WB Range Sensing
WB Sonars

Information for IROS Editors

IROS Conference Review Board (ICRB) Timeline for 2013

22 March 2013 Submission deadline
24 March 2013 Papers placed in profiles and initial AE assignments available for Editors to review
26 March 2013 Deadline for Editors to review papers in profile and AE assignments to papers
29 March 2013 Papers assigned to AEs
5 April 2013 Deadline for AEs to assign papers to reviewers
5 April 2013 Deadline for AEs to summarily reject papers
10 April 2013 Deadline for Editors to endorse/revise AE summary rejection recommendations
1 May 2013 Deadline for reviewers to submit reviews
17 May 2013 Deadline for AE final reports
29 May 2013 Deadline for Editor endorsements of AE reports
June 2013 Executive PC Meeting (ICRB Editor-in-Chief attends)

 

Overview of Review Process and Editor Responsibilities

The IROS Conference Review Board (ICRB) is organized in the same way as the ICRA CEB. There is one Editor-in-Chief (EiC), 15 Editors (EDs), and more than 200 Associate Editors (AEs). Each paper to be reviewed will be assigned to one Editor and to one of the AEs whom that Editor supervises. The AE will be responsible for obtaining a minimum of two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the recommended decision regarding the paper's acceptance and presentation type. The AEs will also help to identify papers to be considered for awards. The Editors will be responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible.

This page focuses on issues for Editors. The process from the perspective of AEs is noted on the IROS ICRB: Information for IROS Associate Editors page.

Assignment of Papers to Editors and AEs:
Keywords are partitioned into sets, each of which is the responsibility of one Editor. When a paper is submitted to the conference (via PaperPlaza), the author's choice of keywords will be used to determine the Editor and AE to which the paper will initially be assigned. This initial assignment, which is done automatically by the system, is then reviewed by the Editor-in-Chief and the Editors to avoid conflicts of interest (COIs), to improve the match between the AE's expertise and the paper, and to balance the load across the AEs. Finally, the AEs must review the papers assigned to them and inform the supervising Editor of any COI they may have with their assigned papers.
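The keyword-based routing above can be pictured with a small sketch. This is purely illustrative: PaperPlaza's actual assignment logic is not described here, and the profile codes and keyword sets below are taken loosely from the keyword table in this document.

```python
# Illustrative sketch only: PaperPlaza's real assignment logic is not public.
# Profile codes and keywords loosely mirror the keyword table in this document.
PROFILE_KEYWORDS = {
    "SL": {"Mapping", "Marine Robotics", "Field Robots"},
    "WB": {"SLAM", "Sensor Fusion", "Navigation", "Range Sensing"},
}

def initial_profile(author_keywords):
    """Pick the profile whose keyword set overlaps most with the author's choices."""
    chosen = set(author_keywords)
    return max(PROFILE_KEYWORDS, key=lambda p: len(PROFILE_KEYWORDS[p] & chosen))

print(initial_profile(["SLAM", "Sensor Fusion", "Mapping"]))  # → WB (2 matches vs. 1)
```

The automatic step only produces a starting point; as the paragraph above notes, the EiC, Editors, and AEs then adjust it by hand for COIs, expertise, and load balancing.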

Editor Tasks and Responsibilities:
1. The Editors should review the keyword assignments proposed by the EiC and notify the EiC of any problems.
2. After the keywords have been set, the Editors should start recruiting AEs whose expertise matches their assigned area. As AEs agree, the Editor should send their names, affiliations, and PaperPlaza PINs to the EiC, and the EiC will send them an official invitation from PaperPlaza.
3. After submissions close, the EiC will do an initial balancing of papers in profiles and will use the PaperPlaza tool to do an initial assignment of AEs to papers. After the Editors have been notified that this is done, they should review the papers in their profile for conflicts of interest (COIs) with themselves or their AEs. They should notify the EiC of any COI they have with papers in their profile so those papers can be moved to another profile, and they should move papers between AEs in their profile to address any known COIs. Editors should also look for papers that are better handled in another profile, or that are duplicate or "empty" submissions. All of these cases should be reported to the EiC, who can take care of them.
4. First week after papers released to the AEs:

  • The Editors should reassign papers within their profiles when AEs alert them to AE COIs or because the paper does not match their expertise. If the Editor cannot handle the paper within their profile, they should notify the EiC so it can be moved to another Editor.
  • AEs have 1 week to assign reviewers for their papers. The Editors should monitor that process, and prompt any AEs that are not doing this in a timely fashion or who are not selecting appropriate reviewers.
  • AEs also have 1 week to recommend summary rejection (i.e., rejection without review) for any of their papers. Editors should review those recommendations, including the AE report that will be sent back to the authors, and make a decision on each.

5. Second Week after papers released to AEs: During the second week of reviewing, the Editors should review any summary rejection recommendations made by AEs and determine if they agree with them. If so, they should notify the EiC. If not, they should notify the AE and ask them to send the paper for review.
6. During Reviewing Period:

  • Generally, Editors should keep an eye on their AEs and make sure they are making progress, e.g., requesting new reviews if reviews are cancelled or if disparate or low-quality reviews are received, and writing and submitting AE reports as reviews come in. The Editor should help the AE with any difficult situations.
  • When potential plagiarism cases are reported, the Editor should review them and make a determination as to next steps, informing the EiC as needed. Additionally, the Editor should review all papers with similarity scores of 40 or higher and prompt AEs to review them in more detail if they are not already.

7. After AE reports are submitted: After the AE report has been submitted, it will be reviewed by the supervising Editor, who will be responsible for checking that the quality standards of the review process (including the number and depth of reviews, and the significance of the AE's report, avoiding indecisiveness) have been met. The Editor will complete a brief report in PaperPlaza for each paper they handle, whereby the correctness and completeness of the reviewing procedure is endorsed.

Editors will also oversee and endorse the identification of award candidates by the AEs.


Note about Organized Sessions and Organized Papers: From the perspective of AEs and Editors, papers submitted to organized sessions will be handled in the same way as any other paper. The decision regarding whether an Organized Session Proposal (ISP) will be accepted, and if so, which papers will be included in it, will be handled by the Executive Program Committee (EPC).

 

Getting Started, (Re) Assigning Papers to AEs, and Monitoring the Reviewing Process with "Digests"

The review process for IROS is managed using the PaperPlaza system. PaperPlaza provides a wide variety of tools to help AEs manage the review process. Reviewer assignments, review entry, AE reporting, and final decisions are all managed using PaperPlaza.

To access the system, go to the PaperPlaza page, click Start and then Log in. If you have forgotten your login information, you can retrieve it using the PIN management page.

It may be useful to spend a few minutes looking over the help pages, and in particular the Associate Editor's FAQ at the PaperPlaza site: PaperPlaza Help Page.

(Re)Assigning Papers to AEs: A simple way to (re)assign a paper to one of your AEs or change the assignment is as follows:

  • click on the "Workspace" link
  • click on your profile designation (your initials)
  • click on the "Details" link for the paper you want to move
  • click on the "Assign" link
  • you should see radio buttons from which you can select "None" or one of your AEs

Monitoring the Review Process: A useful way to monitor the review process and see the ratings and text for the reviews and the AE report is to use "Digests" as follows:
Tools > Program > Digests
On the digest page, select the items of information you are interested in and then hit the "Submit" button at the bottom of the page. The resulting spreadsheet will appear at the top of the page.

 

Editor Reports - Endorsement of AE Reports

The Editor needs to review and endorse all AE recommendations. The Editor is responsible for checking that the quality standards of the review process (including the number and depth of reviews, the significance of the AE's reports, avoiding indecisiveness, etc.) have been met.

Support has been added to PaperPlaza so that the Editor reports can be completed in the system. For each paper you handle, you can submit a report by clicking on the "Report" link next to the paper's information in your workspace. The report has a similar format to the AE report. You will be required to complete the following parts of the report.

  • A rating of A, B+, B, C, C-, D, or U. Note that you should not use the rating of B-; the Editor recommendation should help to distinguish the papers that received low-borderline ratings from those that received high-borderline ratings from the AEs.
  • An indication of whether you think the paper is a potential award candidate.
  • An indication of whether you think the paper would be suitable for interactive presentation (called multimedia in the form).
  • The plagiarism report section
  • The confidential comments to the Program committee. If you agree with the AE's report and recommendation, you can simply say "ok".

If you have already submitted your Editor report for a paper and wish to change it, please send the Editor-in-Chief the paper number and ask them to change the status of the paper to be under review so you can re-edit it.

 

Plagiarism

Plagiarism cases involve serious accusations, which should be dealt with carefully. IEEE has clear policies to follow. IEEE defines plagiarism as the reuse of someone else's prior ideas, processes, results, or words without explicitly acknowledging the original author and source. It is important for all IEEE authors to recognize that plagiarism in any form, at any level, is unacceptable and is considered a serious breach of professional conduct, with potentially severe ethical and legal consequences (source: Section "8.2 Publication Guidelines" of the IEEE PSPB Operations Manual, "Guidelines for Adjudicating Different Levels of Plagiarism." )

CrossCheck database and iThenticate tool
IROS has access to the CrossCheck database, an initiative to prevent scholarly and professional plagiarism. Every submission will receive a plagiarism similarity score. The score and scan reports are generated by an external provider (iThenticate), and the scan reports are stored on the iThenticate servers rather than downloaded to the conference submission system servers. Eventually the reports are deleted from the iThenticate servers at a time determined by conference and provider policy, after which they are no longer available.

IMPORTANT: It is not possible to draw any conclusion from the iThenticate numerical score alone. Unfortunately, the iThenticate algorithms produce a number of false positives. One issue is that the score is cumulative, so that, e.g., a 1% similarity with each of 40 papers is shown as 40% similarity. Another issue is that there may be large similarity without plagiarism. For example, if an author has posted a version of their paper as a technical report or in a public Dropbox somewhere, the submission might get a very high similarity score (e.g., 99% or 100%). Hence, the detailed report must be examined to make sure that there is indeed a case of plagiarism. Also, some papers will not scan properly due to font problems.
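As a concrete illustration of the cumulative-score pitfall, the sketch below (hypothetical numbers, not real iThenticate output) shows how forty 1% overlaps and a single 40% overlap produce the same headline score:

```python
def cumulative_score(per_source_overlaps):
    """Sum per-source similarity percentages into a single headline score,
    the way a cumulative similarity score is reported."""
    return sum(per_source_overlaps)

many_small = [1.0] * 40   # 1% overlap with each of 40 sources (likely boilerplate)
one_large  = [40.0]       # 40% overlap with a single source (worth scrutiny)

# Both cases report the same 40% headline score, which is why the
# detailed report must be read before drawing any conclusion.
assert cumulative_score(many_small) == cumulative_score(one_large) == 40.0
```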

The iThenticate reports are available to you as Editor and to your AEs, but they are NOT available to reviewers. You can access the report for a particular paper, or prepare a digest containing the reports for all the papers in your profile, as follows.

To see the report on a particular paper:

  • Click on the "Details" link for a paper, either by putting the paper number in the "Go to" link or from your "Workspace".
  • Click on the "CrossCheck" link; the similarity score percentage will be displayed.
  • Click on the "View" link in the "Report" column to see the plagiarism report.
  • Also on that page you can set one of two plagiarism flags: "Possible case of plagiarism" if you think there is plagiarism, and "Plagiarism report needs to be followed up" if you are not sure but think it should be followed up. The AEs can also set these flags.

To create a digest for all the papers in your profile:

  • Tools > Digest
  • Select the submission types so that all types are included (you need at least "Contributed paper" and "Organized Session paper"; selecting all types will include both)
  • Select the items you want in the digest, including at least "Nr" (paper number), "Plagiarism score (%)" (Plagiarism similarity score from Crosscheck), and "Plagiarism flag".
  • Hit the "Submit" button to create the digest.

This will give you a summary of all the papers in your profile, which you can process as you wish (e.g., sort by similarity score, identify flags). This makes it easy to identify the papers with scores of 40 or higher, all of which should be inspected by you (and the AEs).

Self-plagiarism. The definition of self-plagiarism is that the paper includes substantial overlap with another of the author's published papers, and that the previously published paper is not cited in the references and/or the contribution of the current paper over those other papers is not described in the current submission, both of which are required by IEEE policy.

The AE should be able to determine if self-plagiarism is a concern by reviewing the paper and the plagiarism report. If this is considered to be the case, then the AE should set one of the plagiarism flags and inform their managing Editor. If the Editor agrees, then they should inform the Editor-in-Chief who will also review the paper and the report. If the EiC concurs, then they will inform the Editor and AE that the paper is a candidate for summary rejection and will request that the AE prepare a report describing the reason for the summary rejection.

The Process

Notes:

  • The similarity score and iThenticate report are available to AEs and to Editors, but not to reviewers.
  • Discretion and confidentiality are extremely important. The reviewers, AEs, and Editors should not discuss the details or names of potential plagiarism cases with anyone other than the persons above them in the chain, e.g., an AE could discuss with their supervising Editor or the Editor-in-Chief but not with another Editor or AE.
  • Comments related to plagiarism should be in the confidential comments for the PC and should not be mentioned in the comments for the authors. It will be up to the plagiarism process to determine what actions to take and what to report to the authors.

AE Procedures:

  1. For all papers whose similarity score is 40% or above, the AE should review the paper and the iThenticate report.
  2. The AE should provide a report regarding their findings in the confidential comments of the AE report.
  3. The AE should indicate their determination regarding potential plagiarism in the plagiarism report section of the AE report.
  4. If the AE indicated "plagiarism report needs to be followed up" or "possible case of plagiarism", then they should alert their Editor.
  5. Unless instructed otherwise by their Editor or the Editor-in-Chief, the AE should follow the review process normally for the paper by obtaining reviews and making a recommendation for acceptance based on the reviews and their own technical evaluation of the paper. It is important that this is done so that the paper can be treated fairly if the plagiarism alert is determined to be unfounded.

Plagiarism may also be spotted and reported by reviewers (recall that they do not have access to the iThenticate report). If a reviewer detects a potential case of plagiarism, they should document their concerns in the confidential comments portion of their review and should alert their AE. It is important that the reviewer is factual in their remarks, and that as much detailed evidence as possible is provided, for instance a copy of the allegedly plagiarized paper with the copied parts highlighted. It should also be noted that freely available software exists that can detect plagiarism automatically; if such software was used, details of the query and its outcomes would also be useful. The AE should then use the information provided by the reviewer in the same fashion as they would if the alert had been prompted by the iThenticate report.

Editor Procedures:

  1. For all papers whose similarity score is 40% or above, whether or not they have been flagged by the AE, the Editor should review the paper and the iThenticate report, and then should review the AE's plagiarism report for accuracy and completeness.
  2. The Editor should provide a report regarding their findings in the confidential comments of the Editor report. If they agree with the AE report and have nothing to add to it, then they can simply note that in their comments.
  3. The Editor should indicate their determination regarding potential plagiarism in the plagiarism report section of the Editor report. If they agree with the AE's determination, they should record the same determination in their report.
  4. If the Editor indicated "plagiarism report needs to be followed up" or "possible case of plagiarism", then they should alert the Editor-in-Chief.

 

Organized Sessions and Organized Papers

Organized session proposals and organized papers will be handled as follows:

  • Organized Session Proposals (ISPs) will be reviewed by the Executive Program Committee (EPC).
  • Organized papers (those linked to an ISP) will be considered and reviewed just like any other contributed paper - that is, the AE will get two independent reviews, and draft a recommendation purely based on the technical merits of the individual paper. Reviewers should not even be informed of the underlying organized session proposal.
  • Good papers submitted as organized, whose session is eventually turned down by the EPC, will nonetheless be accepted and presented in regular sessions. Good session proposals for which only a few good papers were submitted as organized and accepted by the IPC might be integrated by the EPC with other accepted papers in the area. These aspects of how sessions will be formed pertain to the EPC, and they should not concern the AEs or Editors.