VASTERAS, Sweden—Although robots have successfully tackled a variety of manufacturing tasks, many assembly applications still suffer from complex, time-consuming programming. A European Union-funded initiative hopes to change that by enabling a nonexpert user to transfer a new manual assembly task to a collaborative robot in less than a day.

The goal of the SARAFun project is to equip industrial robots with the perception, learning and reasoning abilities needed to take on tedious assembly line jobs while dramatically increasing production speed.

Engineers are using ABB’s dual-arm YuMi robot. The R&D version of the machine is equipped with cutting-edge sensory and cognitive capabilities, as well as the reasoning abilities required to plan and execute an assembly task.

“A disruptive game changer, SARAFun’s assembly robot can significantly change industrial manufacturing all over the world and encourage a re-evaluation of assembly [applications],” claims Ioannis Mariolis, Ph.D., an engineer heading up the project. “Products with short life cycles entail frequent changes. Unlike today’s robots that know only their nominal task, this smart robot for bimanual assembly of small parts is not limited in its ability to deal with regular changes on the production line.”

“[Our] system is built around the concept of a robot capable of learning and executing assembly tasks, such as insertion or folding, demonstrated by a human instructor,” explains Mariolis. “After analyzing the demonstration task, the robot generates and executes its own assembly program. Based on the human instructor’s feedback, as well as sensory feedback from vision, force and tactile sensors, the robot can progressively improve its performance in terms of speed and robustness.”
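That learn-execute-refine cycle can be pictured as a simple loop. The Python sketch below is purely illustrative; the names (learn_from_demonstration, SensorFeedback, refine) are hypothetical stand-ins, not SARAFun's actual software, and the thresholds are placeholder values.

```python
from dataclasses import dataclass

# Hypothetical types for illustration only; SARAFun's components are not a public API.

@dataclass
class SensorFeedback:
    vision_error_mm: float    # part misalignment reported by the 3D vision system
    peak_force_n: float       # peak contact force measured during insertion
    slip_detected: bool       # tactile slip event while gripping

@dataclass
class AssemblyProgram:
    speed: float = 0.05       # m/s; the first attempt is deliberately conservative
    force_limit_n: float = 5.0

def learn_from_demonstration(demo_frames) -> AssemblyProgram:
    """Analyze a recorded human demonstration and produce an initial program."""
    return AssemblyProgram()

def execute_on_robot(program: AssemblyProgram) -> SensorFeedback:
    """Placeholder for running the program and reading vision/force/tactile sensors."""
    return SensorFeedback(vision_error_mm=0.3, peak_force_n=2.1, slip_detected=False)

def refine(program: AssemblyProgram, fb: SensorFeedback, instructor_ok: bool) -> AssemblyProgram:
    """Progressively trade caution for speed when an execution succeeds cleanly."""
    success = instructor_ok and not fb.slip_detected and fb.peak_force_n < program.force_limit_n
    program.speed *= 1.1 if success else 0.8
    return program

# Learn once from the demonstration, then iterate: execute, gather feedback, refine.
program = learn_from_demonstration(demo_frames=[])
for _ in range(5):
    feedback = execute_on_robot(program)
    program = refine(program, feedback, instructor_ok=True)
```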

“Maintaining a humanlike reach for assembly of small parts within a very small space is critical to minimizing the footprint on the factory floor,” adds Mariolis. “It also enables the robot to be installed at workstations currently used only by humans.”

Mariolis and his colleagues have already completed many demonstrations that mimic manual assembly tasks using technologies such as 3D vision sensors, grasp planning, slip detection, and motion and force control.
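To give a sense of one of those building blocks, slip detection from tactile data is commonly implemented as a threshold on how quickly the tangential (shear) load changes relative to the grip force. The sketch below assumes that style of heuristic and is not SARAFun's actual algorithm; the threshold and sample values are made up for illustration.

```python
def slip_detected(shear_forces_n, grip_force_n, dt_s=0.01, ratio_threshold=5.0):
    """Flag slip when shear force changes much faster than the current grip can hold.

    shear_forces_n: recent tangential force samples from the tactile sensor (N)
    grip_force_n:   current normal (grip) force (N)
    Illustrative heuristic only; thresholds would be tuned per gripper and part.
    """
    if len(shear_forces_n) < 2 or grip_force_n <= 0:
        return False
    shear_rate = abs(shear_forces_n[-1] - shear_forces_n[-2]) / dt_s  # N/s
    return shear_rate / grip_force_n > ratio_threshold

# Example: a sudden jump in shear force while gripping lightly triggers a slip event.
print(slip_detected([0.8, 1.6], grip_force_n=2.0))  # True
```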

“The result is a flexible assembly program that can adapt to the working environment without specific planning by the user,” claims Mariolis. “Compared with state-of-the-art technology, the simple graphical interface is much easier for nonexperts to use. Tested on cell phone parts and emergency stop buttons, the system successfully learned assembly tasks in less than a day.”