Researchers at the University of California, San Diego, led by Professor Michael Sailor, have been looking for ways to give the ubiquitous cellphone a greater level of usefulness and make it less of a pricey way to seem cooler than you actually are. Enter the Smellphone.
This new nose-simulating technology relies on a silicon sensor that changes color depending on the types of airborne chemicals it encounters. The pattern of colors produced varies with both the amount and the nature of the substance detected, and Professor Truong Nguyen, an electrical engineer at the Jacobs School of Engineering, is currently writing algorithms that will allow the cellphone to identify what chemical it is “smelling”.
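The article doesn't describe how those algorithms work, but the basic idea of matching a color pattern against known chemical signatures can be sketched with a simple nearest-neighbor comparison. Everything here is hypothetical: the reference RGB values, the function names, and the assumption that the sensor reading reduces to a single averaged color.

```python
import math

# Hypothetical reference "color signatures": averaged RGB readings the
# silicon sensor might produce for each known chemical. These values are
# illustrative only, not real sensor data.
REFERENCE_SIGNATURES = {
    "methyl salicylate": (210, 80, 45),
    "toluene": (60, 150, 200),
}

def identify_chemical(reading, signatures=REFERENCE_SIGNATURES):
    """Return the known chemical whose reference signature is nearest
    (Euclidean distance in RGB space) to the camera's sensor reading."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(signatures, key=lambda name: distance(reading, signatures[name]))
```

So a reading of `(200, 90, 50)` would be classified as methyl salicylate, since it sits far closer to that signature than to toluene's. A real system would presumably compare full spatial color patterns rather than a single averaged color, but the matching principle is the same.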
Silicon smelling sensor
The identification process is accomplished by way of a megapixel camera using a new type of supermacro lens developed by Rhevision, Inc., a California technology company. This lens uses fluids to change its focus rather than mechanical parts, making it much more akin to an animal’s eye than a traditional camera lens. This fluidity, combined with the ever-decreasing size of megapixel cameras, allows all of the silicon sensor’s data to be captured and analyzed in one shot.
In theory, this technology will allow first responders to the scene of a chemical attack or spill to determine what, if any, toxins are present in the air. Networked properly, it could give a disaster response department the ability to map out the spread of a deadly chemical, or a group of teenage girls the ability to identify and avoid that guy who just refuses to wear deodorant.
Either way, the applications are almost endless. Currently, this technology is several years from hitting the market, but the prototype can already distinguish between two chemicals – methyl salicylate, which is a mustard-gas simulator, and toluene, a gasoline additive.
Now if they could just add a “The Price is Right” loser sound effect for when a deadly substance is encountered, it would be funny and functional. Perhaps “smelltones” could become big business.