Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have developed a system designed to grip tools with an appropriate amount of force for a given activity, such as writing with a pen.
The system, dubbed Series Elastic End Effectors, or SEED, uses soft bubble grippers with embedded cameras to map how the grippers deform, over a six-dimensional space, as they apply force to a tool.
With six degrees of freedom, the tool can be moved in a range of directions, including left and right and up and down. The closed-loop controller, a self-regulating system that maintains a desired state without manual intervention, uses SEED's visual and tactile feedback to adjust the robot arm's position.
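The broad shape of such a closed loop can be sketched in a few lines: sense the contact wrench, compare it with the commanded one, and nudge the arm's pose to shrink the difference. The sketch below is only an illustrative, admittance-style reading of that idea; the function names and gain are hypothetical, not taken from the SEED paper.

```python
import numpy as np

def control_step(sensed_wrench, commanded_wrench, pose, gain=0.01):
    """One iteration of a simple 6-DOF closed loop (illustrative only).

    sensed_wrench:    6D force/torque estimated from the bubble-gripper images.
    commanded_wrench: desired 6D force/torque, e.g. how hard to press a pen.
    pose:             current end-effector pose [x, y, z, roll, pitch, yaw].
    """
    error = commanded_wrench - sensed_wrench  # how far the contact force is off
    return pose + gain * error                # nudge the arm to shrink the error
```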
“We’ve been heavily relying on the work of Mason, Raibert, and Craig on what we call a hybrid force position controller,” says Hyung Ju Suh, a PhD student in electrical engineering and computer science at MIT, a CSAIL affiliate, and lead author of the SEED paper.
“That’s the idea, that if you actually had three dimensions to move in when you’re writing on a chalkboard, you want to be able to control position on some of the axes, while controlling force on the other axis.”
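In the classic Raibert-and-Craig formulation that Suh refers to, a diagonal selection matrix decides which axes are position-controlled and which are force-controlled. The snippet below is a schematic reading of the chalkboard example, with illustrative names and gains rather than the paper's actual controller.

```python
import numpy as np

# Selection matrix S: 1 = position-controlled axis, 0 = force-controlled axis.
# Writing on a vertical chalkboard: control x and z position (moving along the
# board) while regulating force along y (pressing into the board).
S = np.diag([1, 0, 1])

def hybrid_command(pos_error, force_error, kp=1.0, kf=0.05):
    """Combine the two feedback terms, each acting only on its own axes."""
    I = np.eye(3)
    return S @ (kp * pos_error) + (I - S) @ (kf * force_error)
```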
Rigid-bodied robots have inherent limitations here; softness and compliance give the grippers the ability to deform, which means the bot can sense the interaction between the tool and the hand.
With SEED, at every execution step the robot senses a recent 3D image from the grippers, tracking in real time how they are shifting shape around an object. These images are used to reconstruct the position of the tool, and the bot uses a learned model to map that position to the measured force.
The learned model is built from the robot's previous experience, in which it perturbs a force-torque sensor to figure out how rigid the bubble grippers are. Once the robot has sensed the force, it compares it with the force the user commands and adjusts itself accordingly, if need be, moving in the direction that brings the exerted force toward the commanded value, all over 6D space.
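One simple way to picture that learned mapping is a stiffness model: if the bubbles deform roughly linearly, the wrench on the tool is approximately a stiffness matrix times the measured displacement, and that matrix can be fitted from pairs of (displacement, force-torque reading) gathered during the calibration pushes. The sketch below is a least-squares version of that idea under a linear-elasticity assumption, not the paper's exact model.

```python
import numpy as np

def fit_stiffness(displacements, wrenches):
    """Fit K so that wrench ≈ K @ displacement, from calibration data.

    displacements: (N, 6) tool displacements estimated from the
                   bubble-gripper images during calibration pushes.
    wrenches:      (N, 6) matching force-torque sensor readings.
    """
    # Least squares solves displacements @ K_T ≈ wrenches; transpose to get K.
    K_T, *_ = np.linalg.lstsq(displacements, wrenches, rcond=None)
    return K_T.T

def estimate_wrench(K, displacement):
    """At run time the force-torque sensor is gone; the fitted K stands in for it."""
    return K @ displacement
```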
During one of the assigned tasks, which involved using a squeegee, SEED applied the right amount of force to wipe up liquid from a flat surface, where baseline methods had difficulty getting the sweep right.
When asked to put pen to paper, the bot effectively wrote out “MIT,” and it also applied the necessary force to drive in a screw.
While SEED knows that it needs to command a force or torque for a given task, if it grasps too hard the object will slip, meaning there is an upper limit on the force it can exert.
Currently, the system assumes a very specific tool geometry: the tool has to be cylindrical, and there are limits on how well the bot generalises when it encounters new shapes. Forthcoming work may involve generalising the framework to different shapes so that it can handle arbitrary tools on the fly.