Researchers at the University of California, Santa Barbara have developed a new display technology that allows users to both see and feel on-screen graphics. The system uses tiny pixels that expand outward when illuminated, creating bumps that can be detected by touch. This advancement could lead to high-definition visual-haptic touch screens for applications such as automobiles, mobile devices, or intelligent architectural walls.
The research was led by Max Linnander, a PhD candidate in the RE Touch Lab under mechanical engineering professor Yon Visell. Their findings were published in Science Robotics.
Linnander explained how the project began: “The question was simple enough: Could the light that forms an image be converted into something that can be felt?” Visell added, “We didn’t know if it was feasible. The possibility that it might be impossible — and the very idea of enabling people to ‘feel light’ — made the question irresistible.”
After nearly a year of theoretical work and computer simulations, the team started building prototypes. Progress was slow until December 2022, when Linnander demonstrated a working prototype to Visell. Linnander described the moment: “I’d been working on this for a year. I was going to leave for the airport in a few hours, and I had just gotten my latest prototype up and running.” Visell recalled his experience with the device: “I put my finger on the pixel and felt a clear tactile pulse whenever the light flashed. That was a special moment — the moment we knew the core idea could work.”
The technology is based on thin display surfaces with arrays of millimeter-sized optotactile pixels. Each pixel is controlled by projected light from a low-power laser using optical addressing. The pixel includes an air-filled cavity and a suspended graphite film; when illuminated, this film heats up quickly, causing air expansion that pushes out the top surface by up to one millimeter.
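The actuation principle described above can be sketched with a simple constant-pressure gas-expansion calculation. The geometry and temperature rise below are assumptions chosen for illustration only; they are not the parameters reported by the researchers.

```python
import math

# Illustrative constant-pressure expansion of a sealed pixel cavity.
# All geometry and temperatures here are assumed for illustration;
# they are not the UCSB team's measured device parameters.
T0 = 293.0          # ambient air temperature, K
dT = 200.0          # assumed temperature rise of the cavity air, K
radius = 0.5e-3     # cavity radius, m (millimeter-scale pixel)
depth = 0.5e-3      # cavity depth, m

area = math.pi * radius**2
v0 = area * depth                 # initial cavity volume, m^3
dv = v0 * dT / T0                 # Charles's law: V grows in proportion to T
deflection = dv / area            # rise of a flat "piston" membrane, m

print(f"volume change: {dv:.2e} m^3")
print(f"membrane rise: {deflection * 1e3:.2f} mm")
```

Under these assumed numbers the membrane rises a few tenths of a millimeter, which is consistent in order of magnitude with the up-to-one-millimeter displacement the article describes. The real device's deflection depends on membrane stiffness and heat loss, which this sketch ignores.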
This process happens rapidly enough for dynamic graphics—such as contours or moving shapes—to appear continuous both visually and through touch, similar to standard video displays.
Because power and control are provided via light alone, these displays do not require embedded wiring or electronics inside each pixel. A scanning laser illuminates each pixel briefly at high speed.
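A rough time budget shows why a scanning laser can address every pixel fast enough for video-like refresh. The frame rate below is an assumption; the pixel count is the order of magnitude quoted in the article.

```python
# Back-of-envelope dwell-time budget for a scanning laser that visits
# every pixel once per frame. The frame rate is assumed for illustration;
# it is not a specification from the paper.
pixels = 1500        # independently addressable pixels (order of magnitude from the article)
frame_rate = 30.0    # assumed video-like refresh rate, Hz

dwell_us = 1e6 / (frame_rate * pixels)   # microseconds of laser time per pixel
print(f"{dwell_us:.1f} microseconds per pixel per frame")
```

At these assumed numbers each pixel receives roughly 22 microseconds of illumination per frame, well within the capability of commodity laser-scanning hardware such as galvanometer mirrors.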
The researchers demonstrated devices with more than 1,500 independently controllable pixels, exceeding the pixel counts of previous tactile displays, and suggested that larger formats are possible using current projector technologies.
User studies showed that participants could locate individual pixels with millimeter precision through touch alone and could reliably perceive moving graphics and patterns.
Visell noted historical precedents for converting light into mechanical action: “In the 19th century, Alexander Graham Bell and others used focused sunlight, modulated by blades of a rotating fan, to excite sound in air-filled test tubes.” He said similar physical principles now underpin their digital display technology.
Potential applications include automotive touchscreens simulating physical controls, electronic books featuring tangible illustrations, or mixed-reality architectural surfaces combining digital visuals with tactile feedback.
Visell summarized the achievement: whatever the future may hold, the technology his team has invented embodies a simple, intriguing idea — “anything you see, you can also feel.”