Rambus scientists Patrick Gill and Thomas Vogelsang recently presented a paper titled “Lensless Smart Sensors: Optical and Thermal Sensing for the Internet of Things” at the 2016 VLSI Symposia.
As Julien Happich of the EE Times explains, Rambus lensless smart sensors (LSS) are based on a phase anti-symmetric diffraction grating (tuned for either optical or IR thermal sensing) mounted directly on top of a conventional imaging array and co-designed with computational algorithms that extract relevant information from the imaged scene.
“The grating is very thin and boasts a wide field of view, up to 120°, and the resulting imaging sensor is almost flat, only a few hundred micrometers separate the grating from the image sensor,” writes Happich. “The raw sensed image is encoded by the grating structure, calling for dedicated reconstruction algorithms and image processing, but in some applications such as range-finding or eye-tracking, it may not even be necessary to reconstruct a full image. Instead, extracting distance measurements may suffice and the particular phase anti-symmetric diffraction structure makes it very simple.”
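To give a sense of what such a reconstruction step can look like, below is a minimal sketch of a generic Wiener-style deconvolution in Python. It is not the algorithm described in the paper; it simply assumes the grating’s point spread function (PSF) has been measured in a calibration step and that the raw capture can be approximated as a convolution of the scene with that PSF plus noise.

```python
import numpy as np

def wiener_reconstruct(raw, psf, snr=100.0):
    """Recover a scene estimate from a grating-encoded raw capture.

    raw : 2-D array of sensor readings (grating-encoded image).
    psf : 2-D array, the calibrated point spread function of the grating.
    snr : assumed signal-to-noise ratio used as a regularization term.
    """
    # Zero-pad the PSF to the sensor size and move its peak toward the origin
    # so the FFT-based convolution model lines up with the raw data.
    pad = np.zeros_like(raw, dtype=float)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

    H = np.fft.fft2(pad)   # frequency response of the grating
    G = np.fft.fft2(raw)   # spectrum of the encoded capture
    # Wiener filter: invert H where it is strong, damp it where noise dominates.
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * G))
```

As the quote notes, range-finding and eye-tracking applications may skip this full-image step entirely and work directly on the encoded data.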
According to Happich, light arriving from the left and right of the anti-symmetric boundary (at the center of the grating) cancels along a “curtain” of low intensity directly beneath the boundary. Meanwhile, the stereoscopic shift between the point spread functions (PSFs) of a point light source viewed through two gratings can be used to determine the distance of that light source with much greater accuracy than would be feasible with lens-based stereoscopy.
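To make the range-finding idea concrete, here is an illustrative sketch based on a simple similar-triangles model: two gratings sit a baseline apart and a small gap above the pixel array, and the two PSFs of a point source spread slightly farther apart as the source moves closer. The model itself, the 300 µm grating-to-sensor gap, and the helper function are assumptions for illustration only; the 1.86 mm baseline is taken from the prototype described in the next paragraph.

```python
# Illustrative similar-triangles model (an assumption, not the paper's method):
# a point source at distance z produces two PSFs separated by roughly
# baseline * (1 + gap / z); the extra shift (disparity) shrinks with distance,
# so z can be estimated from the measured disparity.

BASELINE_M = 1.86e-3     # grating separation quoted for the prototype (1.86 mm)
GRATING_GAP_M = 300e-6   # assumed grating-to-sensor gap ("a few hundred micrometers")

def range_from_disparity(disparity_m: float) -> float:
    """Estimate source distance from the extra PSF shift beyond the baseline."""
    return BASELINE_M * GRATING_GAP_M / disparity_m

# Example: an extra shift of about 1.1 micrometers corresponds to roughly 0.5 m.
print(range_from_disparity(1.1e-6))  # ~0.507
```

In this toy geometry the disparity is well below the pixel pitch, which is why the sharp, well-characterized PSFs produced by the phase anti-symmetric grating matter: they allow the shift to be localized to a small fraction of a pixel.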
To demonstrate this concept, the Rambus scientists designed an ultra-low-power 2×2 mm² image sensor with a 128×128 pixel array and image change detection circuitry integrated on the same die. A pair of identical phase gratings was then mounted over apertures on the shared pixel array. The gratings were only 1.86 mm apart, each within an aperture 55 μm in diameter, a configuration that enables distances of up to 50 cm to be measured with an error of less than 8%.
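Continuing the same illustrative model, the snippet below derives the prototype’s pixel pitch from the quoted dimensions and shows why, for a fixed disparity-measurement uncertainty, the relative range error grows with distance, which is why an error bound is quoted together with a maximum range. The grating-to-sensor gap and the disparity uncertainty are hypothetical values chosen only to roughly echo the quoted 8% figure; they are not numbers from the paper.

```python
# Error-propagation sketch for z = B * t / d (illustrative model only):
# |dz| / z = z * sigma_d / (B * t), so the relative error scales linearly with z.

SENSOR_SIDE_M = 2e-3
PIXELS = 128
PIXEL_PITCH_M = SENSOR_SIDE_M / PIXELS   # ~15.6 micrometers per pixel

def relative_range_error(z_m: float, sigma_d_m: float,
                         baseline_m: float = 1.86e-3,
                         grating_gap_m: float = 300e-6) -> float:
    """Relative range error for a given disparity uncertainty (illustrative)."""
    return z_m * sigma_d_m / (baseline_m * grating_gap_m)

# Example: a hypothetical 0.09-micrometer disparity uncertainty gives ~8% at 0.5 m.
print(PIXEL_PITCH_M)                       # 1.5625e-05
print(relative_range_error(0.5, 0.09e-6))  # ~0.081
```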
As we’ve previously discussed on Rambus Press, the wide field of view and compact, flat form factor of such a stereoscopic lens-free sensor make the technology a good match for wearable eye-tracking applications in smart goggles or head-mounted displays. In addition, say the authors, the light sources could be near-IR emitters integrated into the periphery of the glasses frame.
“We do see this as a better, more accurate replacement for head-mounted eye tracking systems like those found in virtual and augmented reality systems [currently on the market]. We estimate we can reduce the cost of the optics in a system significantly,” Paul Karazuba, Director of Product Marketing for Imaging at Rambus, told the EE Times.
“For LSS in head-mounted eye tracking systems, we would likely use anywhere from CIF (352×288 pixels) to VGA (640×480 pixels), and for LSS in anti-collision or SLAM, we would use anywhere from CIF to 1Mp, application-dependent.”
As Karazuba confirms, Rambus has already begun exploring licensing opportunities with select OEMs in various target markets.
“As part of this effort, we are securing a qualified third-party manufacturing chain that will allow our licensees to easily source LSS modules. We also offer licensees the option to manufacture LSS modules in their own facilities – if they have proper production capabilities to support LSS. The sensor resolution will be application- and market-dependent,” he concluded.