Discover the Chemical Composition of Everyday Stuff…With a Smartphone Camera

Our smartphones can do a lot—compute, pin down our location, sense motion and orientation, send and receive wireless signals, take photographs and video. What if you could also learn exactly what chemical components were present in any object? A new invention out of Israel aims to enable just that.

“The tricorder is no longer science fiction,” a recent Tel Aviv University (TAU) article declared. While a number of devices in recent years have inspired similar comparisons, maybe this one is a little closer.

Created by TAU engineering professor David Mendlovic and doctoral student Ariel Raz, the technology is an intimate combination of innovative hardware and software. The former, a microelectromechanical systems (MEMS) optical component, is mass producible and compatible with existing smartphone cameras.

The component is a kind of miniature filter that would allow smartphone cameras to take hyperspectral images that record the spectrum of light present in every pixel of the image. Software then creates a spectral map and compares it to a database of spectral “fingerprints” associated with substances.

“The optical element acts as a tunable filter and the software—an image fusion library—would support this new component and extract all the relevant information from the image,” said Professor Mendlovic.
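The article doesn't detail the matching algorithm, but the core idea of comparing a pixel's measured spectrum against a database of fingerprints can be sketched with a standard technique like the spectral angle mapper, which treats each spectrum as a vector and ranks matches by the angle between them. Everything below is illustrative: the band values and material names are made up, not real measurements or part of Mendlovic's system.

```python
import math

# Hypothetical spectral "fingerprints": reflectance sampled at a few
# wavelength bands. Values are illustrative, not real measurements.
FINGERPRINTS = {
    "water":       [0.05, 0.04, 0.03, 0.02, 0.01],
    "chlorophyll": [0.08, 0.15, 0.06, 0.45, 0.50],
    "plastic":     [0.30, 0.32, 0.31, 0.33, 0.35],
}

def spectral_angle(a, b):
    """Angle (radians) between two spectra; smaller means more similar.

    Using the angle rather than raw distance makes the comparison
    insensitive to overall brightness (illumination) differences.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

def classify_pixel(spectrum, fingerprints=FINGERPRINTS):
    """Return the material whose fingerprint best matches the spectrum."""
    return min(fingerprints,
               key=lambda name: spectral_angle(spectrum, fingerprints[name]))

# A pixel whose spectrum closely tracks the chlorophyll fingerprint:
measured = [0.07, 0.14, 0.07, 0.44, 0.48]
print(classify_pixel(measured))  # → chlorophyll
```

A production system would run this per pixel across the whole hyperspectral cube and against a far larger fingerprint library, which is exactly why the database Mendlovic mentions is the critical missing piece.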

The result? Point a handheld computing device at an object and learn its composition.

The technology behind hyperspectral imaging isn’t new. The USGS’s Landsat satellites, for example, have used a similar digital imaging technique to analyze the Earth’s surface from space for decades.

The Israeli device is notable, however, because it exemplifies a more general trend in sensors: What was once large, costly, and the sole domain of states is now tiny, affordable, and in our pockets. And the interesting part is that we don’t know exactly how each new miniature sensor will be used.

Once incorporated into smartphones and opened to app developers, old sensors rapidly find new niches. Motion sensors, for example, are now commonly used in sleep tracking apps. GPS doesn’t just locate you on a map, it also enables your phone to automatically provide local weather, time, or the nearest bus stop.

What would a tricorder-like hyperspectral camera allow mobile devices to do? It would obviously be a fun novelty, great for analyzing that cocktail at happy hour. But depending on the accuracy of the device, applications range further than pure fun. Health apps and handheld diagnostic devices come to mind.

Currently, to keep track of what you’re eating, you have to manually enter each food. What if a simple photo of your plate were enough to analyze its nutritional content? It could be a great tool to spot dangerous ingredients for those with food allergies. But because the camera only records light reflected from the surface of opaque objects, you’d never know what was lurking beneath the surface. Probably not worth the risk.

Farmers might use a drone with a miniature hyperspectral camera to monitor crops. Industrial workers with smartglasses (or robots) might use a hyperspectral camera to view the chemical composition of their surroundings in augmented reality, confirming all is well or warning of invisible hazards.

These applications are likely unimaginative compared to what may arise after developers take a look.

“A long list of fields stand to gain from this new technology,” says Mendlovic. “We predict hyperspectral imaging will play a major role in consumer electronics, the automotive industry, biotechnology, and homeland security.” Obviously, we aren’t there yet. But soon perhaps.

One critical piece, yet to be fully worked out, is providing a large enough database of spectral signatures of everyday (and not so everyday) materials. Mendlovic says his team is in talks with other organizations to help analyze images, and they are also speaking to smartphone and wearable device makers and car companies. They recently showed off a demonstration system and anticipate a prototype this June.

And perhaps we can see the greater potential by looking beyond the individual sensor and seeing how it converges with other sensors to create an all-in-one, tricorder-like device. It might prove widely useful for regular folks, scientists, doctors, and starship captains (of course) to study our bodies and environments.

Image Credit: DaMa-Studio/YouTube

Jason Dorrier
Jason is editorial director of Singularity Hub. He researched and wrote about finance and economics before moving on to science and technology. He's curious about pretty much everything, but especially loves learning about and sharing big ideas and advances in artificial intelligence, computing, robotics, biotech, neuroscience, and space.