Vision Through 3D Sound
The VTS project was established to enable visually impaired people to live a happier, more fulfilling life, with richer awareness of the environment around them.
VTS uses a mini-ITX computer small enough to clip on a belt or carry in a handbag or pocket. It consists of a variety of sensors (IR, laser, ultrasonic and visual) providing a 360° field of "view".
The main differences between this system and other available solutions are:
· Fully customizable
· As mobile as the user
· Relatively cheap compared to similar devices
· Can be mass produced within Canada
· Works in any environment
· 30+ hour battery life, rechargeable in just under 2 hours
The device communicates through sound signals, but can be customized to communicate by words or by other means. The front sensor needs to be placed at eye level, by clipping it onto sunglasses or a baseball cap. The two side sensors need to be placed at waist height for accurate readings. The sensors are wireless, allowing full mobility.
The sensors weigh only 100 grams each, and the computer unit, with all batteries and equipment, weighs just under 350 grams, with dimensions similar to those of a modern cell phone.
The system can distinguish between different materials, such as metal, concrete or people, which may be beneficial when getting used to a new living or working environment.
· Can sense differences in elevation (e.g. curbs or sidewalks)
· Will indicate rough walking terrain
· Observes moving objects and calculates the possibility of a collision
· Can follow painted road markings, guiding the user in a straight line
· Will give a distinct warning if a collision is imminent on the current track
· Creates real-time 3D sound from all the surrounding objects
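The collision check above can be illustrated with a standard closest-point-of-approach calculation. This is only a sketch under assumed inputs (a tracked object's relative position and velocity from the sensors); the actual VTS algorithm is not published:

```python
import math

def closest_approach(rel_pos, rel_vel):
    """Time and distance of closest approach for an object moving at
    constant relative velocity. rel_pos/rel_vel are 2-D tuples in
    metres and metres per second, with the user at the origin."""
    px, py = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:                       # object stationary relative to user
        return 0.0, math.hypot(px, py)
    t = max(0.0, -(px * vx + py * vy) / speed_sq)  # time of minimum separation
    d = math.hypot(px + vx * t, py + vy * t)       # separation at that time
    return t, d

def collision_warning(rel_pos, rel_vel, radius=0.5, horizon=5.0):
    """Warn when the closest approach falls inside a safety radius
    (metres) within a look-ahead horizon (seconds). Both thresholds
    are illustrative values, not VTS parameters."""
    t, d = closest_approach(rel_pos, rel_vel)
    return d < radius and t <= horizon

# An object 4 m directly ahead, closing at 1 m/s, triggers a warning:
print(collision_warning((0.0, 4.0), (0.0, -1.0)))   # True
```

A real system would run this per tracked object on every sensor update and grade the warning signal by time-to-approach.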
The system uses two earplug-sized headphones for communication, allowing a near-surround experience while still letting the user hear all the noises around them. The warning "alphabet" consists of 300 different signals, but was developed in such a way that the user can become fully accustomed to it in about a week.
VTS always runs in real time and takes just one minute to become fully operational once all the sensors are in place. It follows the user's head movements, much as fully functioning eyes would: when the user turns his or her head in any direction, the signal changes to reflect the objects being scanned in that direction.
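The head-following behaviour can be sketched as a constant-power stereo pan driven by the object's bearing relative to the head's yaw. This is a hypothetical illustration (real 3D audio would use HRTFs), with all function names and the fold-forward simplification assumed for the example:

```python
import math

def stereo_gains(obj_bearing_deg, head_yaw_deg):
    """Left/right ear gains for an object at obj_bearing_deg (world
    frame) heard by a user whose head faces head_yaw_deg. Uses a
    constant-power pan law; objects behind the head fold forward
    (a simplification for this sketch)."""
    # Relative bearing wrapped into -180..180 degrees, then radians.
    rel = math.radians((obj_bearing_deg - head_yaw_deg + 180) % 360 - 180)
    pan = math.sin(rel)                      # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

# Object straight ahead: equal gain in both ears.
# Object 90° to the user's right: almost all gain in the right ear.
# Turn the head toward the object and the pan re-centres, which is the
# "signal changes as the head turns" behaviour described above.
```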
The system can also be taught particular objects by material and dimensions, and guide the user to them once they are in range (e.g. remotes, keys, chairs, glasses, etc.).
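Matching a taught object against a sensor observation could look like the following. The profile format, field names and tolerance are all assumptions for illustration, not the VTS data model:

```python
def matches(taught, observed, tol=0.15):
    """True when an observed object's material matches a taught profile
    and each measured dimension is within a relative tolerance (15%
    here, an arbitrary choice for the sketch)."""
    if taught["material"] != observed["material"]:
        return False
    return all(
        abs(t - o) <= tol * t
        for t, o in zip(taught["dims_cm"], observed["dims_cm"])
    )

# A taught profile for a set of keys, then a near-identical observation:
keys = {"material": "metal", "dims_cm": (6.0, 2.5, 0.3)}
print(matches(keys, {"material": "metal", "dims_cm": (6.2, 2.4, 0.3)}))  # True
```

Once a candidate matches, the guidance signal would simply reuse the directional audio channel already used for obstacles.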
The technology was originally developed for surveillance purposes, for use on mobile surveillance robots. We are currently working on making the system more user friendly and more suitable for less technically inclined people, while improving functionality and practicality. More information on the technology can be found at www.rotoconcept.com, and detailed technical information is available on request.
Invention Status: Almost ready for production
Keywords: vision robotics visual targeting uav ugv