Alois C. Knoll is a computer scientist and professor at the Technical University of Munich (TUM). He teaches and conducts research in the fields of autonomous and embedded systems, robotics, and artificial intelligence. In 2011, he founded the interdisciplinary course "Robotics, Cognition, and Intelligence" at TUM, which has become one of the largest MSc programs in TUM's CS department. In 2007 he became a member of ISTAG, the EU's highest advisory board for information technology. In this function, he was involved in the development of the EU flagship projects and was an author of the report "European Challenges and Flagships 2020". In 2009, he co-founded "fortiss", the Munich Institute for Software, which has since been transformed into a state institute of Bavaria. He coordinated the project ECHORD++, an initiative to bring the robotics industry and universities together to speed up robot technology's route to market. He has been head of the sub-project "Neurorobotics" of the EU flagship project "Human Brain Project". Since 2011 he has been a PI at TUMCREATE, a joint venture of NTU and TUM-Asia in Singapore. His focus there is on modeling, simulation, and optimization for infrastructure, both in methodological development and in the construction of practical systems.
Talk # 1
Introduction to the Neurorobotics Platform of the EU Human Brain Project
Tentative Talk Content: In conjunction with future computing, robotics research plays multiple significant roles in the Human Brain Project (HBP). Closed-loop studies: it links the real world with the "virtual world of simulation" by connecting physical sensors (e.g., cameras) in the real world to a simulated brain. Brain-derived products: it links brain research to information technology by taking scientific results (e.g., data and models of behaviour) obtained in brain research and refining them to a readiness level at which they can be used by commercial companies and rapidly turned into new categories of products. Virtualised brain research: it links information technology to brain research by designing new tools for brain researchers, with which they can design experiments and then carry them out in simulation. We envision that the unique integration of these three paths will lead to widespread, mutually beneficial cross-fertilization and research acceleration through the two-way inspiration of the involved disciplines.
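The closed-loop studies described above connect a physical sensor to a simulated brain whose output drives an actuator, which in turn changes the next sensor reading. The sketch below illustrates that loop with a toy stand-in for each component; all names and dynamics are illustrative assumptions, not the actual Neurorobotics Platform APIs.

```python
class ToyBrain:
    """Stand-in for a simulated brain: a leaky integrator whose
    activation level becomes a motor command. Purely illustrative."""

    def __init__(self, leak=0.8, gain=0.5):
        self.state = 0.0
        self.leak = leak
        self.gain = gain

    def step(self, stimulus):
        # Integrate the sensory stimulus with decay, emit a motor command.
        self.state = self.leak * self.state + self.gain * stimulus
        return self.state


def closed_loop(target=1.0, steps=50):
    """One closed-loop experiment: a 'camera' measures the distance to a
    target, the brain converts it to a velocity command, the 'robot' moves,
    and the next camera reading reflects the new position."""
    brain = ToyBrain()
    position = 0.0
    for _ in range(steps):
        stimulus = target - position   # sensor reading (error signal)
        command = brain.step(stimulus)
        position += 0.1 * command      # actuate the robot
    return position
```

Because the sensor reading depends on the robot's own motion, the simulated brain is embedded in exactly the kind of perception-action loop the platform is built to study.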
Zhijun Li received the Ph.D. degree in mechatronics from Shanghai Jiao Tong University, P. R. China, in 2002. From 2003 to 2005, he was a postdoctoral fellow in the Department of Mechanical Engineering and Intelligent Systems, The University of Electro-Communications, Tokyo, Japan. From 2005 to 2006, he was a research fellow in the Department of Electrical and Computer Engineering, National University of Singapore, and at Nanyang Technological University, Singapore. Since 2017, he has been a Professor in the Department of Automation, University of Science and Technology of China. Since 2016, he has been Co-Chair of the Technical Committee on Biomechatronics and Bio-robotics Systems (B2S), IEEE Systems, Man and Cybernetics Society, and of the Technical Committee on Neuro-Robotics Systems, IEEE Robotics and Automation Society. He serves as an Editor-at-Large of the Journal of Intelligent & Robotic Systems and as an Associate Editor of several IEEE Transactions. He was the General Chair and Program Chair of the 2016 and 2017 IEEE Conference on Advanced Robotics and Mechatronics, respectively. Dr. Li's current research interests include service robotics, teleoperation systems, nonlinear control, and neural network optimization.
Talk # 1
Development of Key Technology for Wearable Robots and Its Applications
As the aged population increases, many kinds of wearable robots have attracted much attention. In this talk, we present several wearable robots designed to support activities of daily living (ADL). The developed service robots include an upper-limb exoskeleton, a lower-limb exoskeleton, intelligent robotic wheelchairs, a wheeled inverted-pendulum transporter, an interactive tour-guide robot, neural prostheses, and bionic hands. Various advanced control approaches, including skill-transfer techniques, intelligent control, and bio-feedback control, have been proposed and verified on these platforms.
Talk # 2
Human Cooperative Exoskeleton
The development of robotic systems capable of sharing the load of heavy tasks with humans has been one of the primary objectives of robotics research. At present, so-called wearable robots, a class of robotic systems that are worn and directly controlled by the human operator, attract strong interest in the robotics community toward this objective. Wearable robots, together with powered orthoses that exploit robotic components and control strategies, can also represent an immediate resource for restoring manipulation and/or walking functionalities. The talk deals with wearable robotic systems capable of providing different levels of functional and/or operational augmentation to human beings for specific functions or tasks. Prostheses, powered orthoses, and exoskeletons are described for upper-limb, lower-limb, and whole-body structures.
Dr. Chun-Yi Su is a Professor of Mechanical, Industrial and Aerospace Engineering at Concordia University, Montreal, QC, Canada, and holds the Concordia University Research Chair in Control. He received his Ph.D. degree in control engineering from the South China University of Technology in 1990. He joined Concordia University in 1998, after a seven-year stint at the University of Victoria, Victoria, BC, Canada. He has also held visiting positions, including Chair Professor of the Chang Jiang (Cheung Kong) Scholars Program in China and JSPS Invitation Fellow in Japan. His research covers control theory and its applications to robotic systems. Dr. Su has authored or coauthored over 250 journal publications in robotics, mechatronics, and control that have resulted in over 10,500 citations. He has delivered over 80 invited talks, including plenary conference presentations and seminars, and is a recipient of several best conference paper awards. He has served on the editorial boards of several journals, including the IEEE Transactions on Automatic Control and the IEEE Transactions on Control Systems Technology, and on the organizing committees of many conferences.
Talk # 1
Impedance Control of Powered Exoskeletons for Human-Cooperative Manipulations Using Biological Signals
Powered exoskeletons have attracted much attention over the past few years, with applications ranging from military use to patient rehabilitation. Exoskeletons are developed to augment human muscular force and endurance; they can not only cooperate with humans but also assist or supplement human motion. It is therefore essential that the developed exoskeletons exhibit biological behavior and performance. For human joints, one of the important features is the physical properties of the musculotendinous system and its resultant impedance. However, the impedance profiles of human joints vary substantially during motion, so exoskeletons should respond and adapt to these profiles accordingly. This talk presents methods to develop adaptive impedance control of exoskeletons using biological signals. First, a reference musculoskeletal model of the human upper limb is developed and experimentally calibrated to match the operator's motion behavior. Then, an impedance algorithm is proposed that transfers stiffness from the human operator through surface electromyography (sEMG) signals, which is then used to design the optimal reference impedance model. To verify the proposed approach, an actual implementation has been carried out with a real robotic exoskeleton and a human operator.
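The stiffness-transfer step can be illustrated with a minimal sketch: a normalized sEMG activation level is mapped to a joint stiffness, which then parameterizes a standard impedance control law. The linear mapping, gain values, and function names below are hypothetical simplifications for illustration, not the speaker's actual algorithm.

```python
import math


def semg_to_stiffness(activation, k_min=5.0, k_max=50.0):
    """Map a normalized sEMG activation level (0..1) to a joint stiffness
    in Nm/rad. A linear map is the simplest assumption; real systems
    calibrate this relationship experimentally."""
    a = min(max(activation, 0.0), 1.0)  # clamp to the valid range
    return k_min + a * (k_max - k_min)


def impedance_torque(q, dq, q_ref, dq_ref, stiffness,
                     damping_ratio=0.7, inertia=1.0):
    """Impedance control law tau = K (q_ref - q) + D (dq_ref - dq),
    with the damping gain chosen to realize the given damping ratio."""
    damping = 2.0 * damping_ratio * math.sqrt(stiffness * inertia)
    return stiffness * (q_ref - q) + damping * (dq_ref - dq)
```

Because the stiffness gain follows the operator's measured muscle activation, the exoskeleton stiffens when the operator does, which is the essence of the stiffness-transfer idea.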
Talk # 2
Development of Mind Control System for Robotic Manipulators Fused with Visual Technology
With the advances in robot control (RC) systems, the relationship between humans and robots has become increasingly intimate, and many human-robot collaboration systems have been developed. However, it is hard for a disabled person to operate a robot because of lost motion capacity or reduced sensing ability. This talk summarizes the development of an intelligent shared control system for a robotic manipulator commanded by the user's mind. The target objects are detected by a vision system and then displayed to the user in a video in which they are fused with flickering diamonds. Through analysis of the evoked electroencephalography (EEG) signals, a brain-computer interface (BCI) is developed to infer the exact object required by the user. These results are then transferred to the shared control system, which uses visual servoing (VS) techniques to achieve accurate object manipulation. Extensive experimental studies are performed to verify the performance of the developed mind control system.
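The BCI selection step resembles SSVEP-style decoding: each candidate object flickers at a distinct frequency, and the decoder picks the object whose flicker frequency dominates the EEG spectrum. The sketch below scores each stimulus frequency with the Goertzel algorithm; it is a simplified assumption about the pipeline, not the actual system described in the talk.

```python
import math


def goertzel_power(x, fs, f):
    """Spectral power of signal x (sampled at fs Hz) near frequency f,
    computed with the Goertzel algorithm (no FFT required)."""
    n = len(x)
    k = int(round(n * f / fs))        # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for sample in x:
        s0 = sample + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2


def infer_target(eeg, fs, stim_freqs):
    """Return the index of the flicker frequency (and hence the target
    object) with the highest spectral power in the EEG channel."""
    scores = [goertzel_power(eeg, fs, f) for f in stim_freqs]
    return scores.index(max(scores))
```

The inferred index would then be handed to the shared controller, which servos the manipulator toward the corresponding object detected by the vision system.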