RAS Summer School on Experimental Methodology, Performance Evaluation and Benchmarking in Robotics

From 14 Sep, 2015 08:00 until 18 Sep, 2015 18:00

As the complexity of robotic and embodied intelligent systems grows, adhering to sound experimental approaches and benchmarking procedures becomes increasingly necessary. We believe it is of fundamental importance that the next generation learns good experimental, performance-evaluation and benchmarking methods as part of its technical education. We will build on our experience in organizing workshops on these topics and on the tradition of holding robotics summer schools at the same venue. The field is now mature enough to convey the accumulated knowledge and know-how in a one-week summer school format, presenting the relevant topics to graduate students in a consistent and systematic way.

Students will learn how to implement approaches with rigorous experimental methodology and how to apply appropriate benchmarking procedures to compare practical results against standard accepted references; they will learn how experimental results can be made replicable and refutable on the one hand, and quantitatively comparable according to community-endorsed metrics on the other. By the end, a student should be able to provide an educated response to the following questions:
• How should research be performed and reported in order to allow replication of results?
• How should competitions be designed in order to maximize their contribution to robotics research and the industrial exploitation of results?
• To what extent, and for measuring which functionalities and capabilities, are benchmarks useful and possible?
• Which balance of result replication, benchmarking, challenges and competitions would allow a more objective evaluation of results and, consequently, an objective assessment of the state of the art in the various subfields of this domain of research?

List of topics
• Design of Experiments in Robotics
• Execution of Experiments in Robotics
• Experimental scenarios to evaluate performance, demonstrate generality, and measure robustness
• Well-grounded experimental methods
• Replication of Experiments in Robotics
• Relationship between benchmarking and replication of experiments with robots
• Reporting experiments in Robotics
• Examples of Good Experimental Practice
• Evaluation of Experimental Robotics Work
• Evaluation and Benchmarking in HRI
• Comparison of experimental methodology in neurosciences and in Robotics
• Comparison of experimental methodology in Biology and in Robotics
• Benchmark standardization
• Benchmarking autonomy, cognition and intelligence
• Scalable autonomy measurements
• Metrics for sensory motor coordination and visual servoing effectiveness and efficiency
• Performance metrics based on Shannon entropy related measures
• Performance metrics based on dynamical systems methods
• Performance modeling of the relationship between a task and the environment where it is performed
• Performance Metrics for Response Robotics
• Success metrics in bio-inspired Robotics
• Design of Robotics competitions
• Design of Robotics challenges
• Integration of experimental methods, benchmarking, challenges and competitions for a better evaluation of results
• Epistemological issues

Call for Participation

Register here: http://www.ieee-raspebras2015.com/node/37

For more information see: http://ieee-raspebras2015.org/

Or email: ieee.raspebras2015@gmail.com

Organizing Committee
Angel P. del Pobil - Universitat Jaume I, Spain
Fabio Bonsignorio - Scuola Superiore S. Anna, Italy
Enric Cervera - Universitat Jaume I, Spain
Elena Messina - NIST, USA
John Hallam - University of Southern Denmark
Antonio Morales - Universitat Jaume I, Spain
Ester Martínez-Martín - Universitat Jaume I, Spain
Angel Duran - Universitat Jaume I, Spain
Gabriel Recatala - Universitat Jaume I, Spain
Pedro Sanz - Universitat Jaume I, Spain
Raul Marin - Universitat Jaume I, Spain
Marco Antonelli - Universitat Jaume I, Spain
Javier Felip - Universitat Jaume I, Spain

Partial funding provided by IEEE RAS
