Founded by John Espey, Organic Electrics designs and builds robots based on insects and other living organisms. Indeed, the small size, power efficiency and complex behaviors of insects make the creatures ideal models for next-gen robotics.
“Prominent robotics researchers like Rodney Brooks, Randall D. Beer, Joseph Ayers and Mark Tilden have been making this claim for decades. One of the key elements of their research has been to examine the neural circuitry and behavior of insects to program robots,” Espey wrote in a recent blog post published on InnoEcho.
Image Credit: John Espey, Organic Electrics
“This circuitry relies principally on sensors triggering a cascade of motor control. Insects act until their environment signals a different behavior. In essence, the environment controls the robot.”
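The sensor-driven cascade Espey describes can be sketched as a simple reactive control loop. The sensor names, thresholds and behaviors below are hypothetical, chosen only to illustrate how the environment, rather than a central plan, selects the robot's next action:

```python
# Minimal reactive controller: each rule maps a sensor condition
# directly to a motor command; no internal plan or world model is kept.
def reactive_step(sensors):
    """Pick a motor command from environmental signals alone.

    `sensors` is a dict of hypothetical readings:
    light (0-1), obstacle_left / obstacle_right (bool).
    """
    if sensors["obstacle_left"]:
        return "turn_right"   # environment interrupts forward motion
    if sensors["obstacle_right"]:
        return "turn_left"
    if sensors["light"] > 0.5:
        return "forward"      # phototaxis: move toward light
    return "wander"           # default exploratory behavior

# The same loop yields different actions in different environments:
print(reactive_step({"light": 0.8, "obstacle_left": False, "obstacle_right": False}))
print(reactive_step({"light": 0.8, "obstacle_left": True, "obstacle_right": False}))
```

Because every action is triggered by a current sensor reading, "releasing" such a robot into a new environment changes its behavior without reprogramming, which is the sense in which the environment controls the robot.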
As Espey notes, giving up direct control is an essential step in creating more autonomous ‘bots.
“We can influence the behavior of a robot because of its design, but once it is released and active, it’s free,” he explained. “A truly intelligent self-sufficient robot can’t touch base with its creator or owner to make decisions; we’ll need to trust the robot to act on its own.”
That’s why, says Espey, researchers should start small and simple by designing autonomous insect ‘bots, which he defines as “our own symbiotic organism.”
“The development of electromechanical insects will greatly benefit from open source craftsmanship,” he concluded.
“We’re sharing knowledge globally, cheap microscopic electronic components are readily available, and advanced super-materials are now in hardware stores. Robots will change from heavy, power-hungry machines into delicate and elegant creatures.”
Lensless smart sensor (LSS) co-inventor Dr. Patrick R. Gill expressed similar sentiments during a recent interview with Rambus Press in Sunnyvale.
“Flying insects employ a wide range of sensory-motor responses to help them navigate the world and avoid potentially dangerous obstacles,” he said. “For example, the work of Harald Esch demonstrates how bees use optic flow to measure how far they should travel in a certain direction.”
Gill defines optic flow as the visual change caused by the relative motion of an observer and a visual scene.
“As a bee cruises past trees and grass in search of a flower, it manages to keep track of how much visual change it experiences along the way. When the bee returns to the hive to communicate the flower’s location by performing a waggle dance, the duration of the waggle encodes not distance, but rather, how much optic flow to expect on the way to the ‘target’ flower,” he continued.
“More specifically, the orientation of the waggle encodes the direction to fly. Meaning, bees following the same course fly straight until they experience a like amount of optic flow. They then stop and search for nectar.”
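Gill's description of flow-based distance estimation can be illustrated with a toy visual odometer: the bee (or robot) integrates the magnitude of optic flow each frame and stops when the accumulated total reaches the waggle-encoded target. The per-frame flow values and threshold below are invented for illustration:

```python
def flew_far_enough(flow_frames, target_flow):
    """Integrate per-frame optic-flow magnitude; report whether the
    accumulated visual change reaches the target (waggle-encoded) amount."""
    total = 0.0
    for frame_flow in flow_frames:
        total += frame_flow
        if total >= target_flow:
            return True   # stop and begin searching for nectar
    return False

# Dense scenery (large per-frame flow) reaches the target in less distance
# than sparse scenery, so the dance encodes expected visual change,
# not meters traveled.
print(flew_far_enough([2.0, 2.5, 3.0], target_flow=7.0))  # True
print(flew_far_enough([0.5, 0.5, 0.5], target_flow=7.0))  # False
```

This is why the waggle duration encodes "how much optic flow to expect" rather than raw distance: the same flight path always produces roughly the same accumulated flow, regardless of the flier's speed.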
Similarly, says Gill, a lightweight insect ‘bot could be equipped with optic flow sensors, allowing it to safely explore a designated sector of terrain before returning to its assigned point of origin.
Optic flow also helps flying ‘bots avoid obstacles, land smoothly and reliably maintain a straight heading.
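Maintaining a straight heading from optic flow is often done with the flow-balancing strategy observed in honeybees flying through corridors: steer away from the side reporting the larger flow magnitude, since greater flow means that side is closer. The sketch below assumes scalar left/right flow readings and a simple proportional controller; the gain and sign convention are illustrative:

```python
def heading_correction(flow_left, flow_right, gain=0.1):
    """Proportional steering command from the left/right optic-flow
    imbalance. A positive return value steers right (away from the
    closer left side); negative steers left; zero holds course."""
    return gain * (flow_left - flow_right)

print(heading_correction(3.0, 1.0))  # drifting toward the left wall: steer right
print(heading_correction(2.0, 2.0))  # balanced flow: hold course
```

In practice the flow magnitudes would come from left- and right-facing optic flow sensors of the kind Gill describes, with the gain tuned to the vehicle's turn dynamics.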
As Gill points out, lensless smart sensor (LSS) technology is fully capable of measuring optic flow, even when the sensor is fitted into unusually small form factors. Optic flow is one of the first LSS sensing functions Rambus has demonstrated (for instance, at Mobile World Congress 2014), in part because it is computationally light yet performs robustly in real-world conditions.
“For applications like tiny flying robots, LSS technology could potentially enable sensing in one of the smallest form factors and power budgets available,” he added.