SONARO - Smarte Objektübernahme und -übergabe für die nutzerzentrierte mobile Assistenzrobotik (Smart Object Takeover and Handover for User-Centered Mobile Assistance Robotics)
- Duration: 01.04.2019 - 31.12.2021
- Funding: Free State of Thuringia from the European Social Fund (ESF)
- Applicant: Thüringer Zentrum für Maschinenbau (ThZM)
- Project partners:
  - TU Ilmenau, FG Neuroinformatik und Kognitive Robotik (Prof. Horst-Michael Groß, Dr. Steffen Müller)
  - TU Ilmenau, FG Qualitätssicherung und Industrielle Bildverarbeitung (Prof. Gunther Notni)
  - Hochschule Schmalkalden, FG Eingebettete Diagnosesysteme (Prof. Andreas Wenzel)
  - Gesellschaft für Fertigungstechnik und Entwicklung Schmalkalden e.V. (Prof. Frank Barthelmä)
- Project coordinator: Prof. Dr. Horst-Michael Groß, Dr.-Ing. Steffen Müller
- Project homepage: www.sonaro-projekt.de
The development of applications for Industry 4.0 and Smart Health is characterized by the increasing use of intelligent interactive systems for human-machine interaction (HMI). In the field of Smart Health, for example, assistance robots will support nursing staff in typical care activities and act as assistants with a handyman function. Similar assistance functions based on intelligent handling activities will also find their way into other fields, such as industrial manufacturing (e.g., assembly) or the skilled trades, provided the assistance robots can be designed so that users and robots work together cooperatively and highly efficiently, hand in hand, according to the current situation.
Smart handover of objects between human and robot currently offers great potential for method development in various subtasks, from the perception of people and objects, through the recognition of the need for assistance, to the planning and execution of the robotic grasping and handover movements. Although there has been intensive international research on individual sub-tasks of collaborative assistance robotics in recent years, no research work is known to date that treats the handover scenario in its entirety, as the SONARO research group does.
Against this background, novel solutions for smart object handover and takeover are to be researched for human-robot collaboration (HRC); these are essential for the further development of interactive assistance robotics and clearly go beyond the current state of the art. They should allow assistance robots, when interacting with humans, to adapt their actions (motion paths, motion patterns, interaction speed, grasping positions) to the person and their current activity, thus becoming socially acceptable. In the intended smart takeover/handover scenario, this means that during an object takeover the assistance robot can recognize the holding pose and grasping position of the handover person's hand on the object, and can then safely grasp and take over the object in an alternative way without endangering the human (e.g., by touching the human's hand). The object must subsequently be transported safely to another actor and handed over without endangering that person. The dynamic processes required for this call for non-contact monitoring and analysis of the shared interaction space and of the interaction partner's current activity. Thus, in addition to the evaluation of distances, movements, and structures (human, robot, transfer object, hand position) in the interaction space and on the object, the resulting motion planning and the real-time reaction of the entire system require high spatio-temporal resolution at low latency.
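The core safety constraint described above, grasping the object at an alternative position that avoids the human hand, can be illustrated with a minimal sketch. All names and thresholds here are hypothetical and not taken from the SONARO project; the idea is simply to filter candidate grasp points on the object by their clearance from the detected hand position:

```python
import math

# Hypothetical minimum clearance between the human hand and the gripper (meters).
SAFETY_CLEARANCE_M = 0.10

def select_grasp(candidates, hand_position):
    """Pick the candidate grasp point farthest from the hand, provided it
    respects the safety clearance; return None if no safe grasp exists."""
    safe = [c for c in candidates if math.dist(c, hand_position) >= SAFETY_CLEARANCE_M]
    if not safe:
        return None  # no grasp point is safe; the robot must wait or re-plan
    return max(safe, key=lambda c: math.dist(c, hand_position))

# Illustrative 3D points: the hand at the origin, three candidate grasps on the object.
hand = (0.0, 0.0, 0.0)
candidates = [(0.05, 0.0, 0.0), (0.20, 0.0, 0.0), (0.12, 0.05, 0.0)]
print(select_grasp(candidates, hand))  # → (0.2, 0.0, 0.0)
```

In a real system, this check would run continuously against the perceived hand pose, so that a grasp is re-planned whenever the hand moves.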
To make this objective a reality, novel user-centered HRC system solutions with highly dynamic collaboration concepts are required, in which the decisions of the HRC system are dynamically adapted to the current situation, in particular to the user's current movements in the shared interaction space and their object-holding pose. The associated overall process of human-robot object takeover and handover is shown in the figure. This requires the exploration of novel methods for user-centered interaction in four key areas:
- Perception of hand posture and reliable discrimination of hands and objects to be handed over (situation recognition)
- Robust recognition of objects to be picked up/handed over and determination of their position (object recognition and position determination)
- Precise self-localization of the robot and its manipulators in space based on accurate environment modeling (space localization)
- User-centered navigation and grasping motion control for object transfer (user-centered navigation and grasping)
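The four key areas above can be read as stages of one processing pipeline, where grasp execution may only start once perception and localization have succeeded. The following sketch is purely illustrative (the stage names and state fields are assumptions, not the project's actual architecture):

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical stages mirroring the four research areas.
class Stage(Enum):
    SITUATION_RECOGNITION = auto()   # hand posture, hand/object discrimination
    OBJECT_RECOGNITION = auto()      # object identity and position
    SPACE_LOCALIZATION = auto()      # robot/manipulator self-localization
    NAVIGATION_AND_GRASP = auto()    # motion planning and grasp execution

@dataclass
class HandoverState:
    hand_pose_known: bool = False
    object_pose_known: bool = False
    robot_localized: bool = False
    grasp_planned: bool = False

def step(state: HandoverState, stage: Stage) -> HandoverState:
    """Advance the handover pipeline by one stage (illustrative only)."""
    if stage is Stage.SITUATION_RECOGNITION:
        state.hand_pose_known = True
    elif stage is Stage.OBJECT_RECOGNITION:
        state.object_pose_known = True
    elif stage is Stage.SPACE_LOCALIZATION:
        state.robot_localized = True
    elif stage is Stage.NAVIGATION_AND_GRASP:
        # grasping is gated on all perception and localization results
        if state.hand_pose_known and state.object_pose_known and state.robot_localized:
            state.grasp_planned = True
    return state

state = HandoverState()
for stage in Stage:  # Enum iteration follows definition order
    state = step(state, stage)
print(state.grasp_planned)  # → True
```

The gating in the last stage reflects the requirement that motion planning reacts to the perceived situation rather than running open-loop.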