Researchers at Queen Mary University of London have reported that humans can detect buried objects through sand without directly touching them, a sensory ability previously documented only in shorebirds. The findings, announced in November 2025, describe what the team calls “remote touch,” a pressure-based detection method that could represent a seventh human sense. If confirmed by further study, the discovery would challenge long-held assumptions about the limits of human perception and open new lines of inquiry in fields from search-and-rescue operations to robotics.
What Red Knots Taught Scientists About Remote Touch
The concept of remote touch did not originate in a human lab. It was first described in red knots, small shorebirds that probe wet sand with their bills to find mollusks and other prey buried beneath the surface. A foundational study in red knot foraging argued that these birds detect objects by sensing pressure gradients in wet sediment, created when pore water flowing through sand encounters a buried obstacle and is forced to change direction. The bird does not need to physically contact the prey item. Instead, it reads the disturbance in water pressure around the object, a mechanism distinct from ordinary touch.
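The physics can be illustrated with a simplified model. Treating the pore-water flow as steady Darcy flow past an impermeable sphere, the pressure disturbance takes a dipole-like form that falls off with the square of distance from the obstacle. The sketch below is a toy model, not the mechanism reported in the red knot literature, and every parameter value in it is invented for illustration:

```python
import math

def pressure_disturbance(r, theta, a=0.005, dp0=1.0):
    """Relative pore-pressure disturbance near an impermeable sphere
    of radius a (meters) sitting in a uniform Darcy flow, using a
    simplified dipole form: delta_p = dp0 * (a/r)**2 * cos(theta),
    valid only outside the obstacle (r >= a)."""
    if r < a:
        raise ValueError("point lies inside the obstacle")
    return dp0 * (a / r) ** 2 * math.cos(theta)

# Disturbance directly upstream (theta = 0) of a 1 cm buried object,
# sampled at increasing probe distances (all values illustrative).
for r_cm in (0.6, 1.0, 2.0, 4.0):
    dp = pressure_disturbance(r_cm / 100, theta=0.0)
    print(f"{r_cm:4.1f} cm -> relative disturbance {dp:.4f}")
```

The inverse-square falloff is the key point: the signal is strongest close to the object but extends measurably beyond it, which is what makes detection at a distance possible at all.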
The anatomical hardware behind this ability is equally specialized. Red knots possess dense clusters of Herbst corpuscles in their bill tips, sensory structures that are finely tuned to detect vibrations and pressure shifts. A peer-reviewed analysis linking bill-tip receptors to foraging ecology clarified how remote touch differs from direct contact: the bird senses the object’s influence on its environment rather than the object itself. This distinction is central to the entire concept, because it means the detection happens at a distance, mediated by the physics of granular media rather than by skin-to-surface contact.
In shorebirds, this system is tightly integrated with behavior. Red knots probe repeatedly in patches of sand, sampling the pressure field in much the same way a bat samples echoes. Over many generations, birds that were better at reading those pressure fields would have had a survival advantage in coastal habitats where prey is hidden. Remote touch in this context is not a curiosity; it is a core foraging strategy that shapes bill morphology, habitat choice, and even migration routes.
Humans Show a Comparable Ability Without Specialized Anatomy
The Queen Mary team set out to test whether humans, who lack anything resembling a shorebird’s bill-tip sensors, could still perform this kind of indirect detection. According to the university’s own report, participants in the study were able to sense a hidden cube buried in sand before physically touching it. The research, described as the first evidence of human remote touch, frames the ability as a potential seventh sense alongside sight, hearing, taste, smell, touch, and proprioception.
Participants were asked to move a finger slowly through a tray of sand while blindfolded, eliminating visual cues. In many trials, they reported detecting the presence of a small buried object before any direct contact occurred, often describing a subtle change in resistance or a faint sense of “something there” in the sand flow. The consistency of these reports, combined with performance above chance, led the researchers to infer that humans can pick up on pressure disturbances in granular material, much like shorebirds do in wet sediment.
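What "performance above chance" means can be made concrete with a standard one-sided binomial test: if a blindfolded participant were merely guessing, how likely would their hit count be? The trial counts below are hypothetical, not figures from the Queen Mary study:

```python
from math import comb

def binomial_p_value(successes, trials, p_chance=0.5):
    """One-sided probability of scoring at least `successes` correct
    detections in `trials` attempts if the participant were guessing
    with per-trial success probability p_chance."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical example: 32 correct detections out of 40 trials
# against a 50% guessing baseline.
p = binomial_p_value(32, 40)
print(f"p = {p:.6f}")  # far below 0.05, so guessing is an unlikely explanation
```

This is the basic logic behind inferring a real sensory channel from behavioral data: the pattern of correct responses has to be improbable under pure chance before any mechanism is worth proposing.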
“Remarkably, the results revealed a comparable ability to that seen in shorebirds, despite humans lacking the specialized beak structures,” the researchers stated. That phrasing is striking. Red knots have spent millions of years evolving dense mechanoreceptor arrays in their bills. Humans have fingertips with high nerve density, but nothing analogous to the Herbst corpuscles that give shorebirds their edge. The fact that human participants performed at a level the team described as comparable suggests that remote touch may rely less on specialized anatomy and more on a general sensitivity to pressure changes in granular material, one that evolution may have preserved across a wider range of species than previously assumed.
The research also indicates that this ability is most relevant in situations where sight is impaired. That framing matters because it positions remote touch not as a parlor trick but as a functional sensory channel that activates when the dominant sense, vision, is unavailable. Think of a firefighter searching through rubble in smoke, or a diver feeling through silt on a river bottom. In those scenarios, the ability to detect an object before making direct contact could carry real survival value, helping avoid hazards such as sharp metal, unstable debris, or entangling cables.
At the same time, the work is preliminary. The study shows that people can perform the task, but it does not yet map the neural circuitry involved. It is not clear whether the brain is using existing tactile pathways in a novel way, or whether there are underappreciated mechanosensory channels that become especially active in granular environments. Future experiments using brain imaging or nerve recordings will be needed to clarify how the signal travels from fingertip to cortex.
Why Most Coverage Gets the “Seventh Sense” Framing Wrong
Popular accounts of this research have largely treated the seventh-sense label as settled science, but that framing deserves scrutiny. The traditional five senses are a simplification dating back to Aristotle. Modern neuroscience already recognizes well-established senses beyond the classic five, including proprioception (awareness of body position), thermoception (temperature sensing), and nociception (pain detection). Adding remote touch to the list would not make it the seventh sense in any strict numerical order.
The Queen Mary team’s use of the phrase appears designed to communicate the novelty of the finding to a general audience rather than to claim a fixed position in a sensory hierarchy. As a piece of science communication, “seventh sense” signals that this is something outside everyday experience, without requiring readers to wade through technical terminology. But if taken literally, it can obscure the more interesting question: how flexible is the human sense of touch, and how many distinct information channels does it actually contain?
A more productive way to think about the result is as evidence that human haptic perception is broader than textbook models suggest. The underlying mechanism, detecting pressure disturbances transmitted through a medium like sand, is a form of indirect mechanical sensing. Whether it deserves its own categorical label or is better understood as an extension of existing touch sensitivity is a question the current data alone cannot resolve. The researchers have demonstrated the phenomenon; the neurological pathway responsible for it in humans remains an open question that future imaging and psychophysical studies could address.
There is also a philosophical dimension. Senses are not just biological facts; they are conceptual tools for organizing experience. As scientists uncover more specialized detection capabilities (magnetic fields in birds, polarized light in insects, and now remote touch in humans), the neat list of five becomes harder to defend. The real story here may be less about numbering senses and more about accepting that human perception is modular, context-dependent, and still only partially mapped.
Robots Are Already Chasing the Same Ability
The engineering world has been working on a parallel problem: how to build machines that can identify objects buried in sand or soil. A robotics project at MIT produced a device called the Digger Finger, a slender tactile sensor designed to probe granular material for hidden items. The device uses a soft, camera-based GelSight sensor to detect shape and texture as it pushes through material, capturing detailed deformation patterns on its surface.
The associated outreach from the research team emphasized how difficult it is for robots to identify objects when they cannot rely on cameras or direct visual feedback. In the announcement of the Digger Finger work, the challenge is framed as a fundamental limitation of current robotic systems: without clear lines of sight, most machines struggle to distinguish between sand, stones, and delicate targets such as wires or buried devices. The Digger Finger addresses this by using tactile imaging to reconstruct the shapes it encounters, effectively giving the robot a kind of artificial fingertip that can interpret pressure distributions in sand.
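The core idea of tactile imaging can be sketched as a thresholding step: the sensor returns a per-pixel deformation map of its soft surface, and pixels pressed in beyond a noise floor are treated as contact with a buried object. This is a toy illustration of the general principle, not the Digger Finger's actual processing pipeline, and all data and thresholds are invented:

```python
def contact_mask(deformation, noise_floor=0.2):
    """Turn a per-pixel deformation map (arbitrary units) into a
    binary contact mask, keeping only pixels deformed more deeply
    than the sensor's noise floor."""
    return [[1 if d > noise_floor else 0 for d in row]
            for row in deformation]

# Toy 4x4 deformation map: a small object pressed into the gel surface.
depth = [
    [0.0, 0.1, 0.1, 0.0],
    [0.1, 0.6, 0.7, 0.1],
    [0.1, 0.5, 0.8, 0.1],
    [0.0, 0.1, 0.1, 0.0],
]
for row in contact_mask(depth):
    print(row)
```

The resulting mask outlines the object's footprint, which is exactly the kind of shape cue a robot needs when cameras are useless underground.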
The convergence between the biological and engineering stories is striking. Shorebirds detect prey by reading pressure fields in wet sediment. Humans, according to the Queen Mary study, can sense buried objects by feeling how sand flows and compresses around their fingers. Roboticists, facing the same physical constraints, are building sensors that interpret subtle changes in force and deformation as a probe moves underground. In each case, the key is not direct contact with the object but the way the surrounding medium transmits information about that object’s presence.
This convergence also hints at practical applications. If humans can be trained to enhance their remote touch sensitivity, they might work more effectively with tactile robots in low-visibility environments, from disaster zones to planetary exploration. Conversely, insights from human performance could guide the design of next-generation tactile sensors, suggesting which pressure cues are most informative and how to process them.
For now, the discovery of human remote touch remains an intriguing piece of the larger puzzle of perception. It reinforces the idea that our sensory world is richer than we typically assume, extending beyond the familiar categories taught in school. As researchers continue to explore how we interact with complex materials like sand, soil, and rubble, they are likely to find that the boundary between touch and its “seventh sense” cousins is far more porous than any simple list can capture.
*This article was researched with the help of AI, with human editors creating the final content.*