Call for Abstracts: AGAINST-20
AGAINST robot dystopias: thinking through the ethical, legal and societal issues of robotics and automation (AGAINST-20)
A full-day ICRA 2020 workshop, 31 May 2020 or 4 June 2020
Submission deadline: 23 March 2020
Intelligent robots promise great value to society, but they can also be seen as threats: job loss, increased social inequality, loss of transparency and privacy, age/race/gender bias in training data or design, autonomous warfare, excessive delegation of decision-making, and more. This workshop aims to put forth a diversity of viewpoints from experts not only in robotics and AI but also in disciplines close to the real-world use of robots: social science, law, economics, and philosophy. Together we want to explore possible outcomes of the increased uptake of intelligent robots, and contribute to ethical frameworks and good practices. The workshop builds on its previous edition, AGAINST-19 at ICRA 2019, which focused on bias and discrimination. This edition extends the scope to other topics of ethics and responsible innovation in robotics, while keeping its focus on spreading awareness and a critical mindset in our robotics community regarding the ethical issues of the technologies we design.
The workshop will provide a venue for both theoretical and technical discussions which interleave philosophy/economics/social-science talks with technical talks.
We welcome the submission of extended abstracts (max 2 pages) related to the workshop's topics of interest. Both unpublished original contributions and previously published work may be submitted. We specifically encourage submission of position papers that address the ethical issues involved in robot design, deployment and use; as well as the techniques which will be required to avoid or alleviate these issues. These papers will be used to initiate discussion on the topics during the workshop, and will be presented both as short oral presentations and posters.
We accept various kinds of papers:
1) Discussion papers (discussing an ethical/societal/legal issue related to automation and robotics)
2) Method papers (introducing a technical method to address or to account for an ethical concern)
3) User study papers (investigating user perceptions or interactions with a system)
4) Special: Dystopia papers*
*Dystopia papers are inspired by the Re-coding Black Mirror Workshop (https://kmitd.github.io/recoding-black-mirror/). They are provocative papers that explicitly discuss how science fiction dystopias from a movie or TV series (e.g. Black Mirror) could take place in practice, and how they could be avoided. The authors should pick an episode or movie and discuss what kind of methods, research, and/or societal and legal interventions would need to come about to avoid such a scenario. These should still be high-quality academic contributions, albeit with a different touch to their motivation and storyline.
Topics of interest:
- Social and societal issues in robotics and automation
- Fairness, Accountability and Transparency in robotics and automation
- Explainability in robotics and automation
- Privacy in robotics and automation
- Ethics of robotics and automation
- Ethical algorithms and autonomous systems
- Bias in robot perception / design / interaction
- Law and governance of robotics and automation
- Economics of robotics and automation
Important dates:
- Paper submission deadline: 23 March 2020
- Author notification: 19 April 2020
- Camera-ready submission: 4 May 2020
- Workshop: 31 May or 4 June 2020
Manuscripts should use the IEEE ICRA two-column format, with a maximum of 2 pages. A PDF copy of the manuscript should be submitted through the dedicated EasyChair website.
Accepted contributions will be presented both as short oral presentations and as posters. Each oral presentation should be up to 3 minutes, highlighting the key findings of the research and identifying the corresponding poster to the audience. The papers will be posted on this website unless otherwise requested by the authors.
This workshop is sponsored by the IEEE RAS Technical Committee on Human-Robot Interaction & Coordination.