The mobile manipulator developed in this project will bridge the gap between humans and robots in industrial environments. The proposed solution integrates mobile manipulators into hazardous manufacturing lines, where the novel platform gives them the flexibility to perceive the environment and navigate through it.
The project presents a novel solution for mobile manipulation, mounting a robot manipulator on a mobile platform to combine the benefits of both. Combining the two actuators increases productivity, as the system is synchronised to operate in real time (RT) alongside the manufacturing-line workers, creating a human-machine collaborative environment. This high level of coordination between the mobile manipulator and the human worker is achieved by integrating a novel artificial-vision framework that recognises the environment in RT and decides where to position the mobile platform so that human safety is maintained without halting production.

Mobile manipulators typically operate in two disengaged working modes, mobile platform and robot manipulator, which are not executed simultaneously. In a collaborative scenario in particular, the synchronised operation of both systems is critical for complex dynamic applications, yet standard controllers and libraries, such as KUKA's Sunrise OS, are not prepared for the required level of synchronisation.

A dynamic-picking advanced mobile manipulator within a human-robot collaboration (HRC) environment will improve industrial production processes by solving one of their main bottlenecks: handling the components coming out of the manufacturing process. The system developed in the DynCoMM project will therefore automate previously manual and complex picking processes, increasing the productivity of the manufacturing chain and improving the working conditions of employees, who will carry out their tasks in a collaborative environment in which repetitive, high-cadence operations are performed by cobots.
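The coordination described above can be pictured as a single control loop that consumes the vision output and drives the mobile base and the manipulator together, rather than switching between two disengaged modes. The sketch below is purely illustrative and not part of the DynCoMM codebase: all names (HumanObservation, choose_base_offset, the safety radius, the placeholder sensor values) are hypothetical stand-ins for the project's own vision framework and KUKA/Sunrise interfaces.

```python
import time
from dataclasses import dataclass

# Illustrative sketch of a synchronised control cycle: the same loop
# repositions the mobile base and keeps the arm picking, instead of
# alternating between a "platform" mode and a "manipulator" mode.

@dataclass
class HumanObservation:
    distance_m: float   # distance from the platform to the closest worker
    approaching: bool   # True if the worker is moving towards the robot

SAFETY_RADIUS_M = 1.5   # assumed separation threshold, for illustration only

def detect_humans() -> HumanObservation:
    """Stand-in for the real-time vision framework (placeholder values)."""
    return HumanObservation(distance_m=1.2, approaching=True)

def choose_base_offset(obs: HumanObservation) -> float:
    """Decide how far the mobile platform should back away, in metres."""
    if obs.distance_m < SAFETY_RADIUS_M:
        # Retreat just enough to restore the separation distance.
        return SAFETY_RADIUS_M - obs.distance_m
    return 0.0

def move_base(offset_m: float) -> None:
    print(f"base: retreating {offset_m:.2f} m")

def continue_picking() -> None:
    print("arm: continuing pick-and-place")

def control_cycle() -> None:
    obs = detect_humans()
    offset = choose_base_offset(obs)
    if offset > 0.0:
        # Reposition the base while the arm keeps tracking its pick target,
        # so production is adjusted locally rather than halted.
        move_base(offset)
    continue_picking()

if __name__ == "__main__":
    for _ in range(3):      # a few cycles of the illustrative loop
        control_cycle()
        time.sleep(0.1)     # stand-in for the real-time cycle period
```

The point of the sketch is the structure, not the numbers: safety decisions from the vision side and motion commands to both subsystems live in one synchronised cycle.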
FSTP Name: DynCOMM
Beneficiary Lead: ELECTROTECNICA ALAVESA SL (www.aldakin.com), Spain
Beneficiary 2: Video Systems Srl (videosystems.it/en/), Italy
Beneficiary 3: Ikerlan S.Coop. (www.ikerlan.es/), Spain
Technology Area: Human-Robot Collaboration
End User: Cross-sectorial application; collaborative environments
Start Date - End Date: 01/11/2021 - 31/08/2022
Duration: 11 months
FSTP Funding: 287,676.29 €
TRL Level at Start: 4
TRL Level at End: 6
Number of early adopters raised: -