Human-robot cooperative handling of flexible beams is an important task in, for instance, the manufacturing of extruded aluminum products. Motivated by this need, a method for visual tracking of the motion of a flexible beam was developed in order to control a robot.

Figure: The estimated position of the free end of the flexible beam is used to control the robot end-effector.

Approach

In this project, computer vision was used to estimate the beam states. This is in contrast to earlier work by other researchers, which commonly uses force sensors on the robot end-effector. Two cameras were used in order to obtain a good 3D estimate of the beam states.
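A minimal sketch of how a tracked beam point could be triangulated from the two camera views is shown below. The projection matrices and pixel coordinates are placeholders, and OpenCV's triangulatePoints is used purely for illustration; it is not necessarily the exact routine used in the project.

    # Sketch: triangulating one tracked beam point from two calibrated cameras.
    # P1, P2 are the 3x4 projection matrices of the two cameras (assumed known
    # from calibration); uv1, uv2 are the pixel coordinates of the same point.
    import numpy as np
    import cv2

    def triangulate_point(P1, P2, uv1, uv2):
        """Return the 3D position of a point seen at pixel uv1 in camera 1
        and uv2 in camera 2."""
        pts1 = np.asarray(uv1, dtype=float).reshape(2, 1)
        pts2 = np.asarray(uv2, dtype=float).reshape(2, 1)
        X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous point
        return (X_h[:3] / X_h[3]).ravel()                # Euclidean 3D point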

The stream of images was processed by a particle filter that estimated the current beam state, which was then used to control the robot.
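As an illustration, one step of a generic bootstrap particle filter is sketched below. The beam state is assumed to be a vector per particle, a simple random-walk model stands in for the actual beam dynamics, and measure_likelihood() is a hypothetical function that scores a predicted state against the current camera images; none of these details are taken from the project code.

    # Sketch of one particle-filter step (predict, weight, resample, estimate).
    import numpy as np

    def particle_filter_step(particles, weights, images, process_noise, measure_likelihood):
        # Predict: a random-walk model stands in for the beam dynamics here.
        particles = particles + np.random.normal(0.0, process_noise, particles.shape)

        # Update: weight each particle by how well it explains the current images.
        weights = weights * np.array([measure_likelihood(p, images) for p in particles])
        weights = weights / weights.sum()

        # Resample when the effective sample size drops below half the particle count.
        if 1.0 / np.sum(weights**2) < 0.5 * len(weights):
            idx = np.random.choice(len(weights), size=len(weights), p=weights)
            particles = particles[idx]
            weights = np.full(len(weights), 1.0 / len(weights))

        # Point estimate used for robot control: the weighted mean state.
        estimate = np.average(particles, axis=0, weights=weights)
        return particles, weights, estimate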

Figure: Conceptual sketch of the system.

Dynamic Modeling

The flexible beam was modeled using the Absolute Nodal Coordinate Formulation (ANCF). Some example ANCF code implemented in Python can be found on the Code snippets page.
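For reference, the snippet below sketches the shape functions and position interpolation of a standard two-node, gradient-deficient ANCF cable element of length L. The nodal vector stacks the absolute positions and slopes (dr/dx) of both nodes; this mirrors the textbook formulation and is not the exact code from the Code snippets page.

    # Sketch: ANCF cable element shape functions and position interpolation.
    import numpy as np

    def ancf_shape_functions(xi, L):
        """Cubic shape functions of the ANCF cable element, xi in [0, 1]."""
        s1 = 1 - 3*xi**2 + 2*xi**3
        s2 = L * (xi - 2*xi**2 + xi**3)
        s3 = 3*xi**2 - 2*xi**3
        s4 = L * (-xi**2 + xi**3)
        return np.array([s1, s2, s3, s4])

    def beam_position(xi, L, e):
        """Absolute position of the beam axis at xi, given the nodal vector
        e = [r1, dr1/dx, r2, dr2/dx] with 3D positions and slopes."""
        s = ancf_shape_functions(xi, L)
        e = np.asarray(e).reshape(4, 3)  # one row per shape function
        return s @ e                     # 3D point on the beam axis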

Experimental results

The goal of this project was to develop a method for human-robot cooperative handling of flexible beams. Videos from two experiments that demonstrate this are shown below.

Reconstruction of motion

In this experiment, the free end of the flexible beam was moved in a known pattern to test whether the algorithm was able to reconstruct the beam trajectory. The figure below shows the reconstructed trajectory of the beam as its free end was moved in a circle; the robot did not move during this experiment. Points near the free end are plotted in blue and points near the stationary end in red.

Figure: Reconstruction of the circular motion.
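A sketch of how such a plot could be produced from the estimator output is given below. The array names are placeholders, and the coloring rule (blue toward the free end, red toward the stationary end) is only meant to mimic the figure above.

    # Sketch: plotting reconstructed beam points, colored by position along the beam.
    import numpy as np
    import matplotlib.pyplot as plt

    def plot_reconstruction(points, arc_length):
        """points: (N, 3) reconstructed positions; arc_length: (N,) distance from
        the stationary end, used only to pick the color of each point."""
        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")
        colors = np.where(arc_length > arc_length.mean(), "blue", "red")
        ax.scatter(points[:, 0], points[:, 1], points[:, 2], c=colors, s=5)
        ax.set_xlabel("x [m]")
        ax.set_ylabel("y [m]")
        ax.set_zlabel("z [m]")
        plt.show()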