DynCoMM: Dynamic Collaborative control of Mobile Manipulators for complex picking
Name of demonstration
DynCoMM: Dynamic Collaborative control of Mobile Manipulators for complex picking
Main objective
DynCoMM is an advanced robotic solution for the external control of mobile platforms performing dynamic operations in collaborative scenarios. The objective is to control and synchronize both the application operation and the platform dynamics simultaneously, increasing the productivity and flexibility of potential applications. Three prominent aspects must therefore be considered: analysis of the environment and process, integral collaborative control of the mobile manipulator, and application integration. The DynCoMM solution includes computer vision algorithms for both the application and the reconstruction of the environment. The developed Real-Time (RT) external control system uses the obtained information to perform complex picking applications in industrial collaborative scenarios. More specifically, the demonstrator focuses on complex manufacturing processes where manual operations are still needed, addressing technological challenges in robotics and vision-based Human-Robot Collaboration (HRC).
Short description
The project presents a novel solution for mobile manipulators, embedding a robot manipulator in a mobile platform to combine their benefits. The combination of the two systems increases productivity, as the mechanism is synchronized to operate in RT with the manufacturing-line workers, creating a human-machine collaborative environment. This high level of coordination between the mobile manipulator and the human worker is achieved by integrating a novel artificial vision framework designed to recognize the environment in RT and decide the mobile platform's position, so that human safety is maintained without halting production.
This demonstration has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 825196.
Owner of the demonstrator
Aldakin
Responsible person
Oihane Mayo Ijurra
Engineer, R&D Department
E-mail: o.mayo@aldathink.es
NACE
M71.2 - Technical testing and analysis
Keywords
Robotics, Vision System, Machine Learning, Motion Planning.
Benefits for the users
Cost efficiency: Factories obtain a high degree of automation, increasing their efficiency and profitability in the long term.
Ergonomics: In the collaborative environment presented, operators are aided by the mobile platform in their daily tasks. This will reduce worker absenteeism, as the mobile platform handles the most physically demanding operations, increasing the factory's competitiveness.
Flexibility and agility: The complexity of manufactured products has increased in recent years, while their life cycles have shortened. This situation requires developing novel technological solutions and processes in constant adaptation, in order to maintain production stability without reducing product quality or increasing costs.
Innovation
Novel mobile platform whose robotic manipulator and mobile base are coordinated through an RT artificial vision algorithm.
New platform capable of performing dynamic manipulation of complex pieces in collaborative and hazardous environments.
Flexible and easy integration of RT controllers and computer vision solutions.
Risks and limitations
Failure in the integration of software modules and developments.
Inaccurate specifications of the use-case scenario or demonstrator platform.
Inaccuracies in the reconstruction of systems through computer vision.
Synchronization inaccuracies between the robot and platform systems.
Loss of consortium partnership.
Failure to achieve industrial relevance.
Real-Time limitations of commercial hardware.
Failure to complete the demonstration.
Failure to manage the project effectively.
Technology readiness level
6 - Safety approved sensors and systems are commercially available
Sectors of application
Cross-sectoral approach through mechanical operation enhancement.
Patents / Licenses / Copyrights
Hardware / Software
Hardware:
KMR
Laptop
Computer Vision Cameras
External CPU
KUKA iiwa 14 R820
Intel RealSense D455
Intel RealSense LiDAR L515
Software:
ROS
TRINITY Modules
Sunrise
Photos
Depth-sensor Safety Model for HRC
Depth-based safety model for human-robot collaboration: generates three different spatial zones in the shared workspace.
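The idea of the three spatial zones can be illustrated with a short sketch. The zone thresholds, function names, and the mapping from zone to speed scale below are illustrative placeholders, not the demonstrator's published implementation; it is assumed only that a depth sensor provides 3-D points on the human and that the robot slows down as the human approaches.

```python
import numpy as np

# Illustrative zone thresholds in metres; the demonstrator's real values
# are not published, so these numbers are placeholders.
STOP_DIST = 0.5      # inner zone boundary: halt the manipulator
WARN_DIST = 1.5      # middle zone boundary: reduce speed

def min_human_distance(human_points, robot_pos):
    """Minimum Euclidean distance from detected human points to the robot.

    human_points: (N, 3) array of 3-D points belonging to the human,
                  e.g. segmented from a depth-camera frame.
    robot_pos:    (3,) array, robot reference position in the same frame.
    """
    return float(np.linalg.norm(human_points - robot_pos, axis=1).min())

def zone_and_speed(distance):
    """Map a separation distance to one of three zones and a speed scale."""
    if distance < STOP_DIST:
        return "stop", 0.0          # inner zone: safety stop
    if distance < WARN_DIST:
        # middle zone: scale speed linearly between the two thresholds
        scale = (distance - STOP_DIST) / (WARN_DIST - STOP_DIST)
        return "warning", scale
    return "safe", 1.0              # outer zone: full speed
```

A controller loop would call `zone_and_speed(min_human_distance(...))` on every depth frame and apply the returned scale to the commanded velocity.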
Object Classification
A deep convolutional neural network (CNN) is used to classify and sort objects, providing a robust and fast approach.
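The module's actual network architecture is not published; the following is a generic minimal CNN classifier of the kind described, with the layer sizes, class count, and input resolution chosen purely for illustration.

```python
import torch
import torch.nn as nn

class PartClassifier(nn.Module):
    """Small CNN for sorting picked parts into classes.

    Illustrative architecture only; the demonstrator's real network
    and class set are not published.
    """
    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 32x32 -> 16x16
        )
        self.head = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

# One RGB image batch (N, C, H, W); argmax gives the predicted class.
logits = PartClassifier()(torch.randn(1, 3, 64, 64))
predicted = logits.argmax(dim=1)
```

In deployment the network would be trained on labelled images of the parts to pick, and the predicted class would drive the sorting action.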
Object Detection
The object detection module is used to perceive the changing environment and modify the system's actions accordingly.
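One simple way to perceive a change in the scene with the depth sensors listed above is to difference the current depth frame against an empty-scene reference. This is a placeholder sketch, not the demonstrator's published detector; the function name and the change threshold are assumptions.

```python
import numpy as np

def detect_object(depth, background, min_change=0.05):
    """Detect a new object by differencing against an empty-scene depth map.

    depth, background: 2-D depth images in metres (same shape).
    min_change:        minimum depth difference (m) treated as an object.
    Returns an axis-aligned bounding box (row0, col0, row1, col1),
    or None if nothing changed. Placeholder logic for illustration.
    """
    mask = np.abs(depth - background) > min_change
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())
```

The returned bounding box could then be used to update the motion planner or trigger a re-grasp, keeping the system's actions consistent with the changing environment.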
Trainings
To learn more about the solution, click on the link below to access the training on the Moodle platform.
Dynamic collaborative control of mobile manipulators for complex picking