IEEE Robotics and Automation Society IEEE

IROS Venues and Archive

IROS 2015 Hamburg, Germany
28 September-3 October
General Chair: Jianwei Zhang
Program Chair: Alois Knoll

IROS 2014 Chicago, IL USA
14-18 September
General Chair: Kevin Lynch
Program Chair: Lynne Parker

IROS 2013 Tokyo, Japan
3-8 November
General Chair: Shigeki Sugano
Program Chair: Makoto Kaneko

IROS 2012 Vilamoura, Algarve, Portugal
7-12 October 
General co-Chairs: Anibal T. de Almeida, Urbano Nunes
Program Chair: Eugenio Guglielmelli

IROS 2011 San Francisco, CA USA
25-30 September 
General Chair: Oussama Khatib
Program Chair: Gaurav Sukhatme

IROS 2010 Taipei, Taiwan
18-22 October 
General Chair: Ren C. Luo
Program Chair: Huei-Yung Lin

IROS 2009 St. Louis, MO, USA
11-15 October 
General Chair: Ning Xi
Program Chair: Zhidong Wang

IROS 2008 Nice, France
22-26 September 
General co-Chairs: Raja Chatila, Jean-Pierre Merlet 
Program Chair: Christian Laugier

IROS 2007 San Diego, CA, USA
29 October-2 November

IROS 2006 Beijing, China
9-15 October 

IROS 2005 Edmonton, Canada
2-6 August 
General Chair: Max Meng
Program Chair: Hong Zhang

IROS 2004 Sendai, Japan
28 September-2 October 

IROS 2003 Las Vegas, NV, USA
27-31 October 

IROS 2002 Lausanne, Switzerland
30 September-4 October
General Chair: Roland Siegwart
Program Chair: Christian Laugier

IROS 2001 Maui, HI USA
29 October - 3 November

IROS 2000 Takamatsu, Japan
30 October-5 November 

IROS 1999 Kyongju, Korea
17-21 October 

IROS 1998 Victoria, Canada
13-17 October 

IROS 1997 Grenoble, France
7-11 September 

IROS 1996 Osaka, Japan
4-8 November 

IROS 1995 Pittsburgh, PA USA
5-9 August 

IROS 1994 Munich, Germany
12-16 September  

IROS 1993 Tokyo, Japan
26-30 July  

IROS 1992 Raleigh, NC USA
7-10 July  

IROS 1991 Osaka, Japan
3-5 November 

IROS 1990 Ibaraki, Japan
3-6 July  

IROS 1989 Tsukuba, Japan
4-6 September 

IROS 1988 Tokyo, Japan
31 October-2 November 

IROS Conference Paper Review Board

The IROS Conference Paper Review Board (CPRB) is organized in the same way as the ICRA CEB. There is one Editor-in-Chief (EiC), up to 17 Editors (EDs), and more than 200 Associate Editors (AEs). Each paper to be reviewed is assigned to one Editor and to one of the AEs that Editor supervises. The AE is responsible for obtaining a minimum of two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the recommended decision regarding the paper's acceptance and presentation type. The AEs also help to identify papers to be considered for awards. The Editors are responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible.

The ultimate responsibility for the conference content, acceptance rate, symposia, etc. rests with the Program Committee. The purpose of the CPRB is to ensure a high quality, robust, reliable and consistent review process.

Wolfram Burgard

Conference Paper Review Board, Editor-in-Chief 2014-2016


University of Freiburg
Department of Computer Science
Freiburg, Germany


Nancy Amato

Conference Paper Review Board, Editor-in-Chief 2011-2013


Texas A&M University
Parasol Laboratory, Algorithms & Applications Group
College Station (TX), United States


2014 CPRB Editors

Alois Knoll, TU Munich, Germany (Profile AK)
Adriana Tapus, ENSTA-ParisTech, France (Profile AT)
Danica Kragic, KTH, Sweden (Profile DK)
Dezhen Song, TAMU, USA (Profile DS)
Eiichi Yoshida, AIST, Japan (Profile EY)
Fumihito Arai, Nagoya University, Japan (Profile FA)
Giuseppe Oriolo, Università di Roma La Sapienza, Italy (Profile GO)
Jose A. Castellanos, University of Zaragoza, Spain (Profile JC)
Jan Peters, TU Darmstadt, Germany (Profile JP)
Jean-Pierre Merlet, INRIA Sophia Antipolis, France (Profile JPM)
Jing Xiao, University of North Carolina at Charlotte, USA (Profile JX)
Maren Bennewitz, University of Freiburg, Germany (Profile MB)
Maria Gini, University of Minnesota, USA (Profile MG)
Shinichi Hirai, Ritsumeikan University, Japan (Profile SH)
Simon Lacroix, LAAS, France (Profile SL)
Sang-Rok Oh, KIST, Korea (Profile SRO)
Siddhartha Srinivasa, Carnegie Mellon University, USA (Profile SS)
Tatsuo Arai, Osaka University, Japan (Profile TA)


2013 CPRB Editors

Wolfram Burgard - University of Freiburg, Germany (Profile WB)
Martin Buss - TU Munich, Germany (Profile MB)
Edwardo Fukushima - Tokyo Institute of Technology, Japan (Profile EF)
Maria Gini - University of Minnesota, USA (Profile MG)
Masashi Konyo - Tohoku University, Japan (Profile MK)
Danica Kragic - KTH, Sweden (Profile DK)
Simon Lacroix - LAAS, France (Profile SL)
Cecilia Laschi - Scuola Superiore Sant'Anna, Italy (Profile CL)
Kevin Lynch - Northwestern University (Profile KL)
Alison Okamura - Stanford University, USA (Profile AO)
Sang-Rok Oh - KIST, Korea (Profile SRO)
Adriana Tapus - ENSTA-ParisTech, France (Profile AT)
Ning Xi - Michigan State University, USA (Profile NX)
Jing Xiao - University of North Carolina at Charlotte, USA (Profile JX)
Mark Yim - University of Pennsylvania, USA (Profile MY)
Eiichi Yoshida - AIST, Japan (Profile EY)
Hongbin Zha - Peking University, China (Profile HZ)

2013 Associate Editors


2012 CPRB Editors

Wolfram Burgard, University of Freiburg, Germany;
Gordon Cheng, Technical University of Munich, Germany;
Paolo Fiorini, University of Verona, Italy;
Li-Chen Fu, National Taiwan University, Taiwan;
Maria Gini, University of Minnesota, USA;
David Hsu, National University of Singapore, Singapore;
Cecilia Laschi, Scuola Superiore Sant'Anna, Italy;
Marcia O'Malley, Rice University, USA;
Sang-Rok Oh, KIST, Korea;
Mihoko Otake, Tokyo University, Japan;
Satoshi Tadokoro, Tohoku University, Japan;
Adriana Tapus, ENSTA-ParisTech, France;
Ning Xi, Michigan State University, USA;
Jing Xiao, University of North Carolina at Charlotte, USA;
Mark Yim, University of Pennsylvania, USA

2012 IROS Associate Editors

2012 Videos Editor

João Barreto, University of Coimbra, Portugal

2012 Videos Associate Editors


2011 CPRB Editors

I-Ming Chen, Nanyang Technological University, Singapore
Alessandro De Luca, Università di Roma "La Sapienza", Italy
Chad Jenkins, Brown University, USA
Danica Kragic, Royal Institute of Technology, Sweden
Nikos Papanikolopoulos, University of Minnesota, USA
Frank Park, Seoul National University, Korea
Lynne Parker, University of Tennessee, Knoxville, USA
Shigeki Sugano, Waseda University, Japan
Frank van der Stappen, Utrecht University, The Netherlands

2011 IROS Associate Editors

ICRA Venues and Archive

ICRA 2018 Brisbane, Australia
21-26 May
General Chair: Alex Zelinsky
Program Chair: Peter Corke

ICRA 2017 Singapore
29 May - 2 June
General Chair: I-Ming Chen
Program Chair: Yoshihiko Nakamura

ICRA 2016 Stockholm, Sweden
16-21 May
General Chair: Danica Kragic
Program Chair: Antonio Bicchi

ICRA 2015 Seattle, WA, USA
26-30 May
General Chair: Lynne Parker
Program Chair: Nancy Amato

ICRA 2014 Hong Kong, China
31 May - 5 June
General Chair: Ning Xi
Program Chair: Bill Hamel

ICRA 2013 Karlsruhe, Germany
6-10 May
General Chair: Rüdiger Dillmann
Program Chair: Markus Vincze

ICRA 2012 Minneapolis, MN USA
14-18 May
General Chair: Nikos Papanikolopoulos
Program Chair: Paul Oh

ICRA 2011 Shanghai, China
9-13 May
General Chair: Zexiang Li
Program Chair: Yuan Fang Zheng

ICRA 2010 Anchorage, AK USA
3-8 May
General Chair: Wesley Snyder
Program Chair: Vijay Kumar

ICRA 2009 Kobe, Japan
12-17 May
General Chair: Kazuhiro Kosuge
Program Chair: Katsushi Ikeuchi

ICRA 2008 Pasadena, CA USA
19-23 May
General co-Chairs: Maja Matarić, Paul Schenker
Program co-Chairs: Stefan Schaal, Gaurav S Sukhatme

ICRA 2007 Rome, Italy
10-14 April
General co-Chairs: Paolo Dario, Alessandro De Luca
Program Chair: Bruno Siciliano

ICRA 2006 Orlando, FL USA
15-19 May
General co-Chairs: Norman Caplan, C.S. George Lee

ICRA 2005 Barcelona, Spain
18-22 April
General Chair: Alícia Casals
Program Chair: Rüdiger Dillmann

ICRA 2004 New Orleans, LA USA
26 April - 1 May
General co-Chairs: T.J. Tarn, Toshio Fukuda
Program Chair: Kimon Valavanis

ICRA 2003 Taipei, Taiwan
14-19 September (Delayed from May due to concerns about SARS epidemic)
General Chair: Ren C. Luo
Program Chair: Li-Chen Fu

ICRA 2002 Washington DC, USA
11-15 May
General Chair: William Hamel
Program Chair: Anthony A. Maciejewski

ICRA 2001 Seoul, Korea
21-26 May
General Chair: Wook Hyun Kwon
Program Chair: Beom Hee Lee

ICRA 2000 San Francisco, CA USA
24-28 April
General Chair: Brian S. Carlisle
Program Chair: Oussama Khatib

ICRA 1999 Detroit, MI USA
10-15 May
General Chair: Hadi Akeel

ICRA 1998 Leuven, Belgium
16-21 May
General Chair: Georges Giralt

ICRA 1997 Albuquerque, NM USA
20-25 April
General Chair: Ray Harrigan

ICRA 1996 Minneapolis, MN USA
22-28 April
General Chair: Norman Caplan

ICRA 1995 Nagoya, Japan
21-27 May
General Chair: Toshio Fukuda

ICRA 1994 San Diego, CA USA
8-13 May
General Chair: William Gruver

ICRA 1993 Atlanta, GA USA
2-6 May
General Chair: Wayne Book

ICRA 1992 Nice, France
12-14 May
General Chair: Giuseppe Menga

ICRA 1991 Sacramento, CA USA
9-11 April
General Chair: T.C. (Steve) Hsia

ICRA 1990 Cincinnati, OH, USA
13-18 May
General Chair: Richard A. Volz

ICRA 1989 Scottsdale, AZ USA
14-19 May
General Chair: George Bekey

ICRA 1988 Philadelphia, PA USA
24-29 April
General Chair: T. Pavlidis

ICRA 1987 Raleigh, NC USA
31 March - 3 April
General Chair: Y.C. 'Larry' Ho

ICRA 1986 San Francisco, CA USA
7-10 April
General Chair: Antal (Tony) Bejczy

ICRA 1985 St Louis, MO USA
25-28 March
General Chair: K.S. Fu
Program Chair: T. Lozano-Perez

1984: ICR (Int'l Conf. on Robotics, sponsored by the Computer Society Technical Committee on Robotics) Atlanta
13-15 March
General Chair: John Jarvis

Information for IROS Associate Editors

IROS Conference Review Board (ICRB) Timeline for 2013

22 March 2013 Submission deadline
29 March 2013 Papers assigned to AEs
5 April 2013 Deadline for AEs to assign papers to reviewers
5 April 2013 Deadline for AEs to summarily reject papers
1 May 2013 Deadline for reviewers to submit reviews
17 May 2013 Deadline for AE final reports
29 May 2013 Deadline for Editor endorsements of AE reports
June 2013 Executive PC Meeting (ICRB Editor-in-Chief attends)


Overview of Review Process and Associate Editor Responsibilities

The IROS Conference Review Board (ICRB) is organized in the same way as the ICRA CEB. There is one Editor-in-Chief (EiC), 17 Editors (EDs), and more than 200 Associate Editors (AEs). Each paper to be reviewed is assigned to one Editor and to one of the AEs that Editor supervises. The AE is responsible for obtaining a minimum of two high-quality reviews for each paper they handle, and for preparing an AE recommendation that explains the recommended decision regarding the paper's acceptance and presentation type. The AEs also help to identify papers to be considered for awards. The Editors are responsible for reviewing and endorsing the work done by the AEs on the papers for which they are responsible.

Assignment of Papers to Editors and AEs:
For IROS 2013, the keywords are partitioned into sets, each of which is the responsibility of one Editor. When a paper is submitted to the conference (via PaperPlaza), the author's choice of keywords is used to determine the Editor and AE to whom the paper is initially assigned. This initial assignment, which is done automatically by the system, is then reviewed by the Editor-in-Chief and the Editors to avoid conflicts of interest (COI), to improve the match between the AE's expertise and the paper, and to balance the load across the AEs. Finally, the AEs must review the papers assigned to them and inform the supervising Editor of any COI they may have with their assigned papers.
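The keyword-driven initial assignment can be pictured with a small sketch. The keyword-to-editor table, the editor names, and the majority-vote rule below are purely illustrative assumptions; PaperPlaza's actual matching logic is not public.

```python
# Hypothetical sketch of keyword-based initial editor assignment.
# The keyword partition and tie-breaking rule are illustrative only.

KEYWORD_TO_EDITOR = {
    "SLAM": "Editor-A",
    "Motion Planning": "Editor-A",
    "Grasping": "Editor-B",
    "Humanoids": "Editor-B",
}

def assign_editor(paper_keywords):
    """Pick the editor whose keyword set matches the paper most often."""
    votes = {}
    for kw in paper_keywords:
        editor = KEYWORD_TO_EDITOR.get(kw)
        if editor is not None:
            votes[editor] = votes.get(editor, 0) + 1
    if not votes:
        return None  # no keyword match: needs manual assignment by the EiC
    return max(votes, key=votes.get)

print(assign_editor(["SLAM", "Motion Planning", "Grasping"]))  # Editor-A
```

The subsequent manual review by the EiC and Editors (COI checks, load balancing) would then adjust this automatic first pass.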

AE Tasks and Responsibilities:
Once papers have been assigned and the AE has been notified to get started, the AE should follow these steps:

  1. As soon as possible, the AE should check the papers they have been assigned and make sure that they do not have a conflict of interest (COI) with any of them. If a COI exists, they should notify the managing Editor and the EiC immediately so that the paper can be assigned to another AE.
  2. By the posted deadline (typically 1 week after paper assignments are made), the AE should either:
     • identify papers to be recommended for summary rejection (rejected without review), or
     • determine appropriate reviewers and request reviews from them in PaperPlaza.
  3. The AE should monitor the review process as reviews come in. If reviews are not substantial enough or are not prepared in an appropriate way, the AE should ask the reviewer to provide an improved review. In extreme cases, the AE may need to request an additional review if a reviewer does not provide a review or does not provide an appropriate one. If difficult situations arise, the AE should consult with the managing Editor and/or the EiC.
  4. After a sufficient number of quality reviews (at least two) have been obtained, the AE should prepare the AE report giving their recommendation on the paper. This should also include a recommendation on the presentation type (regular oral presentation or multimedia presentation) if the paper is to be accepted, and a recommendation on whether the paper should be considered for an award.
  5. After the AE report has been submitted, the supervising Editor will review the AE recommendation and either provide the Editor Endorsement or suggest modifications.

Note about Organized Sessions: From the perspective of AEs and Editors, papers submitted to organized sessions will be handled in the same way as any other paper. The decision regarding whether an Organized Session Proposal (ISP) will be accepted, and if so, which papers will be included in it, will be handled by the Senior Program Committee (SPC).


Getting Started

The review process for IROS is managed using the PaperPlaza system. PaperPlaza provides a wide variety of tools to help AEs manage the review process. Reviewer assignments, review entry, AE reporting, and final decisions are all managed using PaperPlaza.

To access the system, go to the PaperPlaza page, click Start and then Log in. If you have forgotten your login information, you can retrieve it using the PIN management page.

It may be useful to spend a few minutes looking over the help pages, and in particular the Associate Editor's FAQ at the PaperPlaza site: PaperPlaza Help Page.


Summary Rejection

In certain special cases, an AE may recommend that a paper be rejected without sending the paper for review. In such cases, the AE writes a brief summary review giving the rationale for this decision, and gives the paper an Unsatisfactory rating. This is done via the usual AE Report mechanism (described below). The Editor assigned to the paper will then make the final determination as to whether the paper should be rejected without further review, or should go through the formal review process.

A paper should be rejected without review in any of the following cases:

  • It clearly makes no novel contribution to the state of the art.
  • It contains significant technical errors.
  • The paper has been published previously (i.e., the paper is identical to, or nearly identical to, previously published work by the same authors).
  • The paper plagiarizes previously published research by other authors.

A paper should not be rejected without review merely because it makes only an incremental contribution, because it fails to report real-world experiments or because of poor writing quality. Further, a paper should not be summarily rejected because the AE feels its subject lies outside the scope of IROS (this judgment is left for the Program Committee). If there is any doubt as to the decision, the paper should be sent for review.

No more than 10-15% of submitted papers will be rejected without review.


Selecting Appropriate Reviewers

The ICRB has a strict review-quality policy: every submitted paper must receive at least two substantial reviews. An AE should reject any sub-standard review received, e.g., one that is too short or shallow. It is also the AE's responsibility to see that reviews are constructive and do not diminish the authors' efforts, even when they have to be negative or very negative.

Selecting appropriate reviewers is critical for a quality and timely review process. The most important criterion is that the reviewer have the appropriate expertise. Candidate reviewers may include leading researchers in the topic area, colleagues who are familiar with the topic, authors of papers on the reference list, and authors of relevant papers previously presented at IROS or in other high-profile journals and conferences. Generally, a key to securing reviewers is to align their interests with ours: the paper is on a topic they would really like to see or to say something about. Having reviewers who are interested in going over the manuscript reduces the need for repeated reminder messages, and avoids the need for AEs to write detailed comments themselves to supplement shallow and not-to-the-point reviews.

In addition to having appropriate expertise, the following are some other requirements and guidelines for selecting reviewers.

  • The reviewer cannot have a conflict of interest with the paper.
  • An AE cannot provide a review for the paper.
  • At most one of the reviewers for a paper can be at the same institution as the AE for the paper.
  • A mix of reviewers from different geographical regions is also desirable.
  • While it is not explicitly forbidden to ask other AEs to perform reviews, it is likely that they will decline, since they will be quite busy managing their own paper load.
  • A good mix of senior and junior reviewers is desirable as they provide reviews from different perspectives and at different levels of detail.
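As an illustration only, the hard constraints above can be captured as a simple validity check over a candidate reviewer list. The data layout and the name-based COI test below are hypothetical simplifications, not PaperPlaza functionality.

```python
def valid_reviewer_set(reviewers, ae, author_names):
    """Check a candidate reviewer list against the guidelines above.

    Each reviewer (and the AE) is a dict with 'name' and 'institution';
    author_names lists the paper's authors. Matching on author names is
    a crude stand-in for a real conflict-of-interest check.
    """
    same_institution = 0
    for r in reviewers:
        if r["name"] in author_names:        # reviewer has a COI with the paper
            return False
        if r["name"] == ae["name"]:          # the AE cannot review the paper
            return False
        if r["institution"] == ae["institution"]:
            same_institution += 1
    # at most one reviewer may share the AE's institution
    return same_institution <= 1
```

The softer guidelines (geographic mix, seniority mix) are preferences rather than hard rules, so they are left out of the check.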

IMPORTANT NOTICE: PaperPlaza has a very convenient feature to help you choose your reviewers, which however has potential for misuse. The feature is that you can get a list of registered users whose keywords match the paper at hand, so that a reviewer can be recruited with a single click. The abuse of this feature occurs when the AE selects a reviewer whom the AE does not know and whose expertise is not proven. It may thus happen that junior members of the community (e.g., undergraduate students), or simply people associated with a keyword by chance or mistake, are asked to review in a field where they are not competent. This is clearly unacceptable, and ultimately counterproductive in terms of time, as the reviewer will probably decline, or the Editors will discard the review.

Also, please note that you should not add new users to the system unless you are certain that they are not already registered. Multiple PINs for a single user can cause significant difficulty and confusion.


Requesting Reviews in PaperPlaza

Review requests are handled by the PaperPlaza system. After you have compiled your list of reviewers, a review can be requested by following the Reviews link for a submission, then clicking the Request a review link. You may edit the standard form letter as you please. You must click Send to actually generate the e-mail invitation to the reviewer.

You can find details about this process in the PaperPlaza help pages: Compiling my reviewer list.


Preparing and Submitting an AE Report

Once you have collected all reviews, you should prepare and submit your AE report. You can find details about the report process in the PaperPlaza Help pages available after you have logged in. Please note that we will not be using the rebuttal process.

We would like the AE reports to follow a common structure that:

  • First, summarizes the paper topic/contribution (could be brief, even just 1-2 sentences).
  • Next, notes the main strengths and weaknesses of the paper, summarizing the main points from the reviews and noting any additional issues noted by the AE. Please avoid copying excerpts of the reviews in your report, as authors will see the reviews.
  • Then, notes any special circumstances that may apply. For example, if two reviews are in serious conflict, the AE Report should resolve the conflict if possible. If neither review is substantive, the AE Report should give a solid rationale for the decision to be taken.

Here are some guidelines and reminders on what to put (or not to put) in your report and on determining your scores:

  • Please do not charge or mention plagiarism in the comments to the authors. That should appear in the confidential comments only.
  • Please do not mention a decision (reject or accept) in the part of the AE report that is meant for the authors; give only the motivations for your recommendation. The reason is that your recommendation is a recommendation to the Program Committee, and while it usually agrees with the AE, in some cases it may change the recommended decision, and we want to avoid confusing the authors if that happens.
  • You will give a recommended score, which is not visible to authors. If the reviews disagree, please do not merely give the average as your rating; as AE, your role is to settle the conflict. In recommending your score, please consider that "A" means "Definitely Accept", "B+" means "Accept", "B" means "High Borderline", "C" means "Low Borderline", and "C-" and "D" mean "Reject". Ratings of "B-" are discouraged and should not be used. Notice that "U" is not "F", i.e., it is not the next-lowest grade on the technical score scale. Rather, "U" stands for "Unsuitable", and should be used as a flag to signal "problem" papers, e.g., out of scope, suspected of partial plagiarism, incomplete, etc.
  • Note that once you have submitted your AE report, no further reviews can be obtained or entered into the system. When you submit your report, all pending reviews are canceled automatically. If you made a mistake or otherwise need to change the report, you must ask your Editor to reopen the paper.


Requesting Reviewers to Revise Reviews

Request revision of the review. After checking a review that has been submitted, this link may be used to request that the reviewer prepare and submit a revised review. The form letter "Request to submit a revised review" is available for this purpose; this is a conference-wide form letter that may be individualized on the Personalized form letters page. Revision of a review may be requested if the status of the submission is Under review or Decision pending. When this request is sent to the reviewer, the status of the review is reset to 'Saved' so that the reviewer may re-access and resubmit the review. The usual option to save a review is not available to the reviewer for such reviews.

Restoring reviews to the original after revision is requested. To restore a review for which revision was requested to the original review, click on the "Revision requested on..." link for the review in question. This opens an information box with a link to view the original review and another link to restore it.

NOTE: Disposition of unreceived revised reviews. Reviews for which a revision was requested but not received are treated by the system as reviews that were requested but not submitted. If they are not restored to the original review, they are automatically canceled when the Review summary report is submitted for the submission.


AE Recommendation of Presentation Type (Oral or Interactive)

All papers accepted to IROS 2013 will undergo the same review process and will be allocated the same number of pages in the proceedings. Each accepted paper will be presented at the conference in one of two formats: a regular oral presentation, or an interactive presentation in a multimedia session, somewhat similar to a poster presentation but with multimedia capabilities. The determination of the presentation type will be based on which format is best suited to that particular paper.

Hence, when making a recommendation on the disposition of the paper, AEs will also be asked to provide a recommendation on the type of presentation that is best suited for that paper.


AE Recommendation for Award Candidates

IROS 2013 features a number of awards. The ICRB is asked to help in the selection of these awards. In particular, the ICRB will designate a number of outstanding papers from which the Senior Program Committee will draw finalists for Conference Awards. Accordingly, Associate Editors are asked to identify which papers in their assignments they would consider as potential candidates for an award. To do so, please use the "Confidential comments to the Program Committee" textbox in your AE Report form. It is not necessary to identify which award a paper would be a suitable candidate for. The actual selection of finalists for each award will be made by the Senior Program Committee, using your inputs.


Editor Endorsement of AE reports

The AE's recommendation, expressed in the report, will be reviewed by the supervising Editor, who is responsible for checking that the quality standards of the review process (including the number and depth of reviews, and the significance of the AE's report, avoiding indecisiveness) have been met. The Editor will issue a brief statement for each recommendation, endorsing the correctness and completeness of the reviewing procedure.

Editors will also oversee and endorse the processing of award candidates by the AEs.


Conflict of Interests

An ICRB Editor, Associate Editor, or Reviewer is deemed to have a conflict of interest with a submitted paper if
A. he or she is a (co-)author of the paper; or
B. one (or more) of the authors of the paper:

  • is, or has been, a student or advisor of that person, or
  • has co-authored a paper or has closely collaborated in a research project with that person in the previous five years, or
  • is employed at the same institution (at the Department or Division level) as that person;

C. there are any other circumstances which may create an appearance that the person might have a bias in the evaluation of the paper.



Plagiarism

Plagiarism cases involve serious accusations, which should be dealt with carefully. IEEE has clear policies to follow. IEEE defines plagiarism as the reuse of someone else's prior ideas, processes, results, or words without explicitly acknowledging the original author and source. It is important for all IEEE authors to recognize that plagiarism in any form, at any level, is unacceptable and is considered a serious breach of professional conduct, with potentially severe ethical and legal consequences (source: Section 8.2 "Publication Guidelines" of the IEEE PSPB Operations Manual, "Guidelines for Adjudicating Different Levels of Plagiarism").

CrossCheck database and iThenticate tool
IROS 2013 has access to the CrossCheck database, an initiative to prevent scholarly and professional plagiarism. Every submission will receive a plagiarism similarity score. The score and scan reports are generated by an external provider (iThenticate); the scan reports are stored on the iThenticate servers and are not downloaded to the conference submission system servers. Eventually the reports are deleted from the iThenticate servers at a time determined by conference and provider policy, after which they are no longer available.

IMPORTANT: It is not possible to draw any conclusion from the iThenticate numerical score alone. Due to the way the iThenticate algorithms produce their output, there will be a number of false positives. One issue is that the score is cumulative, so that, e.g., a 1% similarity with each of 40 papers is shown as 40% similarity. Another issue is that there may be large similarity and still no plagiarism: for example, if an author has posted a version of their paper as a technical report or in a public Dropbox somewhere, it might get a very high similarity score (e.g., 99% or 100%). Hence, the detailed report must be examined to determine whether there is indeed a case of plagiarism. Also, some papers will not scan properly due to font problems.
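The "cumulative score" pitfall described above can be shown in miniature: summing many small per-source overlaps produces a total that looks like one large overlap. The numbers here are made up purely for illustration.

```python
# Illustrative only: a cumulative similarity total sums per-source overlaps,
# so 40 tiny 1% overlaps report the same headline number as one 40% overlap.

per_source_overlap = [1.0] * 40          # 1% overlap with each of 40 sources
cumulative = sum(per_source_overlap)     # the single reported similarity score
print(cumulative)                        # 40.0 — alarming-looking, yet no single
                                         # source overlaps by more than 1%
```

This is why the detailed per-source report, not the headline percentage, is what the AE must actually read.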

The iThenticate reports are available to you as an Associate Editor and to your Editor, but they are NOT available to reviewers. To access the report for a particular paper:

  • Click on the "Workspace" link.
  • On your Workspace, there is a column labeled "Plagiarism scan", which shows the percentage overlap determined.
  • Click on the "Go to the plagiarism scan page" link for the paper in the "Plagiarism scan" column.
  • Click on the "View" link in the "Report" column to see the plagiarism report. You should review the report for all of your papers, and pay particular attention if the score is 40% or higher.
  • Also on that page you can set one of two plagiarism flags: "Possible case of plagiarism" if you think there is plagiarism, and "Plagiarism report needs to be followed up" if you are not sure but think it should be followed up. If you set either of these flags, please also email your Editor to let them know to take a look at the paper.

Self-plagiarism. A paper is self-plagiarized if it includes substantial overlap with another of the author's published papers, and either the previously published paper is not cited in the references or the contribution of the current paper over that other paper is not described in the current submission, both of which are required by IEEE policy.

The AE should be able to determine if self-plagiarism is a concern by reviewing the paper and the plagiarism report. If this is considered to be the case, then the AE should set one of the plagiarism flags and inform their managing Editor. If the Editor agrees, then they will inform the Editor-in-Chief who will also review the paper and the report. If the EiC concurs, then the paper is a candidate for summary rejection and the AE will be asked to prepare a report describing the reason for the summary rejection.

The Process

  • The similarity score and iThenticate report are available to AEs and to Editors, but not to reviewers.
  • Discretion and confidentiality are extremely important. Reviewers, AEs, and Editors should not discuss the details or names of potential plagiarism cases with anyone other than the persons above them in the chain; e.g., an AE may discuss with their supervising Editor or the Editor-in-Chief, but not with another Editor or AE.
  • Comments related to plagiarism should be in the confidential comments for the PC and should not be mentioned in the comments for the authors. It will be up to the plagiarism process to determine what actions to take and what to report to the authors.

AE Procedures:

  1. For all papers whose similarity score is 40% or above, the AE should review the paper and the iThenticate report.
  2. The AE should provide a report regarding their findings in the confidential comments of the AE report.
  3. The AE should indicate their determination regarding potential plagiarism in the plagiarism report section of the AE report.
  4. If the AE indicated "plagiarism report needs to be followed up" or "possible case of plagiarism", then they should alert their Editor.
  5. Unless instructed otherwise by their Editor or the Editor-in-Chief, the AE should follow the review process normally for the paper by obtaining reviews and making a recommendation for acceptance based on the reviews and their own technical evaluation of the paper. It is important that this is done so that the paper can be treated fairly if the plagiarism alert is determined to be unfounded.

Plagiarism may also be spotted and reported by reviewers (recall that they do not have access to the iThenticate report). If a reviewer detects a potential case of plagiarism, they should document their concerns in the confidential comments portion of their review and should alert their AE. It is important that the reviewer is factual in their remarks, and that as much detailed evidence as possible is provided; for instance, a copy of the supposedly plagiarized paper with the copied parts highlighted. It should also be noted that freely available software exists that can detect plagiarism automatically; if such a tool was used, details on the query and its outcomes would also be useful. The AE should then use the information provided by the reviewer in the same way as if the alert had been prompted by the iThenticate report.


Organized Sessions and Organized Papers

Organized session proposals and organized papers will be handled as follows:

  • Organized Session Proposals (ISPs) will be reviewed by the Executive Program Committee (EPC).
  • Organized papers (those linked to an ISP) will be considered and reviewed just like any other contributed paper - that is, the AE will get two independent reviews, and draft a recommendation purely based on the technical merits of the individual paper. Reviewers should not even be informed of the underlying organized session proposal.
  • Good papers submitted as organized, whose session is eventually turned down by the SPC, will nonetheless be accepted and presented in regular sessions. Good session proposals, for which only a few good papers were submitted as organized and accepted, might be integrated by the SPC with other accepted papers in the area. These aspects of how sessions are formed pertain to the SPC and should not concern the AEs or Editors.

Contact RAS

For a list of robotics and automation experts, please visit the RA Expert listing.

Contact the IEEE Robotics & Automation Society Staff.

Kathy Colabaugh

IEEE RAS Program Specialist

Piscataway (NJ), United States
+1 732 562 3906


Rachel O. Warnick

IEEE RAS Program Specialist

Piscataway (NJ), United States
+1 732 562 6585