Tanvas, a company redefining touchscreen technology, unveiled TanvasTouch, a surface haptic feedback technology, at CES 2017. The technology lets you feel what you see on a touch display. The company aims to replicate the human sense of touch by adding the ability to feel texture on a screen.
TanvasTouch Applications
At CES, the company demonstrated a prototype Tanvas Pad bolted on top of a reconfigured Nexus 9. One of the applications let users drag a finger through a virtual pool with a pebble floor, reports The Verge. The haptic feedback made it feel as if the water ripples followed one's fingertips.
Apart from this, the pebbles underneath produced a distinct bump between every other stone, the report stated. The company also demonstrated other surfaces resembling wooden bridges, cobblestones, and grass.
Another interesting application was built with partner Bonobos, an apparel company. The app showed two pairs of pants, one cotton and one corduroy, and users could actually feel the fabric. The screen couldn't fully replicate the real thing, but it gave a fair idea of how smooth each fabric is.
How TanvasTouch Works
Ordinary haptic feedback relies heavily on vibration. For example, when typing on a smartphone, the pressed key vibrates to give the impression of a physical button. TanvasTouch works from a similar idea but instead uses electrostatic forces for more precise, localized responses.
Tanvas uses electrostatics to adjust the friction between a fingertip and the touch surface, explains Engadget. The modulated friction lets your finger feel texture as it moves across the touchscreen. Though not perfectly accurate, the sensation changes depending on the object under your finger. Early adopters have used the technology for textiles, but the potential of TanvasTouch is far broader: retail, consumer electronics, gaming, accessibility for visually impaired users, and more.
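The principle can be sketched in code. The following is a minimal, purely illustrative Python example, not Tanvas's actual API: it assumes an on-screen texture is represented as a 2D map of target friction levels, and that the electrostatic layer's drive amplitude is looked up from the fingertip's position as it slides across the screen. All names here (`make_corduroy_map`, `drive_amplitude`, the linear voltage mapping) are hypothetical.

```python
def make_corduroy_map(width, height, ridge_period=4):
    """Hypothetical texture map: alternating high/low friction ridges,
    loosely inspired by the corduroy fabric in the Bonobos demo."""
    return [
        [0.8 if (x // ridge_period) % 2 == 0 else 0.2 for x in range(width)]
        for y in range(height)
    ]

def drive_amplitude(friction_map, x, y, max_voltage=100.0):
    """Map the target friction under the fingertip to an actuator
    drive level (a simple linear mapping, purely illustrative)."""
    return friction_map[y][x] * max_voltage

# Slide a fingertip left to right along one row: the drive level
# alternates between high and low, which the moving finger would
# perceive as ridges rather than a uniformly smooth glass surface.
texture = make_corduroy_map(16, 8)
levels = [drive_amplitude(texture, x, 3) for x in range(16)]
print(levels)
```

The key point the sketch captures is that the effect only exists for a moving finger: friction is only felt against relative motion, which matches the description of feeling texture as the finger travels across the screen.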
Tanvas was founded in 2011 by haptics pioneers Ed Colgate and Michael Peshkin and is headquartered in Chicago. The team says the technology emerged from ten years of research at the Neuroscience and Robotics Lab (NxR) at Northwestern University.
Stay tuned to TheBitbag for updates on TanvasTouch.