That touch feels real

From consumer electronics to medical applications, the sense of touch is being redefined as research mimics complex sensations.
In the years that John Rogers has been researching soft electronic devices, he has tried to improve their haptic feedback — the touch sensations they deliver to the wearer. As a Professor of Materials Science and Biomedical Engineering at Northwestern University in Illinois, U.S., he knew how to build soft, skin-compatible electronic systems. But when it came to haptic functionality, what he lacked were microactuators that could go beyond simply 'poking' the skin or 'vibrating' on it. Could there be a way to mimic human touch more closely? In March 2025, he found the answer.
In a paper published in Science (bit.ly/haptics-freedom), Rogers and a group of engineers from the university detail a technology for advanced haptic interfaces containing microactuators with full freedom of motion to mimic complex sensations on the skin. The actuators can impose not just vertical force on the skin but also pull, push, twist, and shear to activate the different mechanoreceptors that exist at various depths within the skin. "When you touch an object, your skin isn't only deformed in the out-of-plane direction, but it could be sheared and twisted at the same time," says Rogers.
The forces come in combinations across different time scales, from static to vibrational deformations of the skin at frequencies of hundreds of hertz and, in some cases, even beyond. "Only then can you replicate the full richness of sensations that are associated with real physical touch," he says.
The actuators he has developed use three coils nested together. When current is applied to any of the coils in different magnitudes and time scales, it produces a complex magnetic field that can impose forces on the skin in controlled directions. "Using arrays of coils, we cover larger areas of the body, and then with time-coordinated electronics, we can deliver currents to each of the three coils associated with each actuator to create a spatiotemporal sensation of touch," says Rogers.
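The coil arrangement Rogers describes can be pictured with a toy model. The sketch below (in Python, using an invented 3x3 coupling matrix, not the authors' actual electromagnetics) captures the core idea: each actuator's force on the skin is a combination of the three coil currents, so mixing currents mixes normal and shear forces.

```python
# A toy model, not the published design: treat the force each actuator
# exerts on the skin as a linear combination of its three coil currents.
import numpy as np

# Invented coupling matrix: rows are force components (normal, shear-x,
# shear-y); columns are the three nested coils. Values are placeholders.
COUPLING = np.array([
    [1.0, 0.2, 0.2],   # out-of-plane (poking) force
    [0.0, 0.8, -0.3],  # in-plane shear along x
    [0.0, 0.3, 0.8],   # in-plane shear along y
])

def skin_force(currents):
    """Force vector on the skin for one triple of coil currents."""
    return COUPLING @ np.asarray(currents, dtype=float)

# Driving only the first coil mostly pokes the skin inward...
poke = skin_force([1.0, 0.0, 0.0])
# ...while mixing all three adds shear, approximating a drag or twist.
twist = skin_force([0.5, 1.0, 1.0])
print(poke, twist)
```

Time-varying versions of these current triples, coordinated across an array of actuators, are what produce the spatiotemporal sensations Rogers describes.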
HAPTICS AS AN ENABLER
For a long time, research in haptics lagged behind work on visual and auditory perception. Industry experts see haptics as an enabler: it refines the experience of a core technology and eases its use.
Consumer electronics was a major driver of haptic evolution. Haptics got significant attention when Apple launched the Taptic Engine to refine touch feedback in its iPhone 6s series in 2015. It started with the demand to make gadgets thinner and waterproof, which meant physical buttons had to go. Touchscreens forced engineers to develop a variety of tactile feedback to help users navigate the operating system. The eccentric rotating mass motors behind the generic vibrations in phones were replaced by linear actuators that could deliver more nuanced taps, with customised patterns for each operation.
Despite these developments, haptics was often an afterthought to the sensory experience. It is similar to how we interact with the world: the largest bandwidth in the brain is reserved for the visual cortex, followed by the auditory. True, life would be harder to navigate without touch; yet, most of our sensory processing is still dedicated to the eyes and the ears.
Technologically, too, it made sense to focus on audio-visual: you could build freestanding electronic devices that did not physically interface with the human body for operation.
"You're transmitting photons or sound waves, so the technology can be largely decoupled. But there aren't good choices in artificial muscles for force delivery," says Rogers. Getting technology comfortably integrated with the body, especially over large areas, is a big challenge and is not immediately compatible with how electronic circuits or displays are fabricated.
THE HEALING TOUCH
But as human-computer interfaces and soft electronics evolved, the healthcare industry began exploring haptics. From medical monitoring devices to robot-assisted surgeries, new technologies relied on the precision of haptic feedback. Haptics was no longer just about the sensation on one's fingertips but moved on to cover larger areas of the body.
These applications interest Rogers. He has used this technology for people with diabetic neuropathy, who have lost sensation in the soles of their feet. They lose the ability to walk with a normal gait unless they are looking at their feet because they can't feel the ground. His team is collaborating with a rehab clinic to offer sensory substitution.
His team installs arrays of pressure sensors in the soles of patients' shoes, quantifying the pressure distribution across the feet as they walk. These sensors communicate that information in real time to haptic actuators mounted on the chest or upper legs. "We have done this for over a dozen patients now, and they have begun to feel their feet through their chest," says Rogers.
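The shape of that sensor-to-actuator mapping can be sketched in a few lines. Everything here is an illustrative assumption (the function name, the three-sensor layout, the pressure scale): each reading is normalised and used as a drive level for a corresponding actuator worn elsewhere on the body.

```python
# An illustrative sketch of sensory substitution, with assumed names and
# scales: foot-pressure readings are normalised into [0, 1] drive levels
# for haptic actuators worn on the chest or upper leg.

def to_actuator_levels(pressures, max_pressure=100.0):
    """Clamp each raw pressure reading (arbitrary units) to a [0, 1] drive level."""
    return [min(max(p / max_pressure, 0.0), 1.0) for p in pressures]

# Heel strike: high pressure under the heel, little under the toes,
# so the actuator standing in for the heel is driven hardest.
levels = to_actuator_levels([85.0, 20.0, 5.0])
print(levels)
```

Run continuously as the patient walks, a mapping like this lets pressure under the foot be felt, in real time, somewhere the skin still has sensation.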

Haptic technologies also provide feedback and training to surgeons and medical professionals operating devices remotely. The da Vinci series of robotic systems, used in hospitals in India, allows teleoperation, helping surgeons feel the forces exerted by robotic instruments as they interact with a patient's tissues.
Along similar lines, Merkel Haptic Systems, an Indian Institute of Technology (IIT) Madras-incubated company in Chennai, collaborates with IVF clinics to help doctors train on simulations before performing injections in real time, potentially avoiding mishaps born of inexperience. Another of its devices is a similar injection simulator, used to train clinicians in administering epidurals to women in labour.
The start-up is a spinoff of the institute's Touch Lab, founded in 2005 by M. Manivannan, a Professor of Biomedical Engineering and Applied Mechanics. The lab has been pushing research into enhancing human-machine interactions by studying the underlying human haptics, from skin biomechanics to motor control, and developing machine haptics.
Manivannan teaches psychophysics — the relationship between tactile sensations and how the brain perceives them. "We study how human perception can be measured," he says. Engineers measure in SI (International System) units — length, weight, charge — but how can human perception be measured quantitatively? "Just like these SI units, we have units for human perception. It's called the just-noticeable difference (JND). For example, what's the smallest dot you can see? What's the smallest change in temperature that you can feel? It may vary from person to person, but we standardise it," he says.
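One classical way to quantify a JND is Weber's law, under which the smallest noticeable change in a stimulus is a roughly constant fraction of its current intensity. The sketch below assumes a placeholder Weber fraction of 0.1; real values are measured per person and per modality and then standardised, as Manivannan notes.

```python
# Weber's law sketch: the just-noticeable difference (JND) grows in
# proportion to stimulus intensity. The Weber fraction of 0.1 is an
# assumed placeholder, not a measured value.

def jnd(intensity, weber_fraction=0.1):
    """Smallest change in intensity a person can reliably notice."""
    return weber_fraction * intensity

def is_noticeable(intensity, change, weber_fraction=0.1):
    """True if a change crosses the JND threshold at this intensity."""
    return abs(change) >= jnd(intensity, weber_fraction)

# Against a 500 g weight, adding 30 g falls below the assumed threshold,
# while adding 60 g crosses it.
print(is_noticeable(500, 30), is_noticeable(500, 60))
```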
BEYOND TEXTURES
In 2022, the lab patented an interactive touch-active display that allowed users to feel the textures and shapes of images shown on a screen through static electricity emitted from each pixel. "We recently hosted an IAS officer who was visually challenged. He mentioned that in his line of work, he needed to read a lot of maps, and a technology like this would be useful," says Manivannan. His next project is for a global professional services company, developing digital twins of skin textures across ages and genders to form a catalogue for dermatological research and training.
While he has developed feedback along a few dimensions of haptics, such as frictional and vibrational forces, others need more research: how do you add the sensation of temperature, softness, or wetness? That would mean integrating thermal actuators with mechanical ones. Sensing wetness could also help surgical training systems replicate the slippery nature of blood. All these technologies need a deeper study of skin biomechanics and its thermal characteristics — how heat propagates from the skin's surface through its depths and laterally.
To realise this, Manivannan aims to popularise the branch of perception engineering. "When you buy a phone, you go through its technical specifications. Similarly, when you are building technology for humans, you need to measure human perception to modify it," he says, adding that the field will be at the intersection of neuroscience and engineering. "It will put humans back at the centre of technological advancements," he says.