
Eye-tracking has quietly moved from research labs into phones, headsets, and even experimental wearables, turning a classic science fiction trope into something you can actually buy or watch in action. What once sounded like fantasy, from controlling machines with a glance to mapping the mind through eye movements, is now being built into consumer devices and cutting-edge neurotechnology. I see a technology stack emerging that treats the human gaze as both a user interface and a diagnostic signal, with profound implications for how we work, play, and communicate.
From lab curiosity to everyday interface
For decades, eye-tracking was a niche research tool, used to study reading, attention, and neurological disease, but it is now being reimagined as a core interface for robots and smart systems. One project, the EyeRobot, uses robotic eyes to mimic human gaze patterns so engineers can systematically test how well tracking systems follow subtle movements, a setup that helps refine accuracy in both healthy volunteers and people with disorders that affect eye control. By treating the robot’s gaze as a controllable benchmark, researchers can push systems to handle the messy reality of blinks, micro-saccades, and head motion, which is essential if eye-based control is going to be reliable outside the lab.
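That benchmarking logic is easy to picture in code. Below is a minimal sketch, my own illustration rather than anything published by the EyeRobot team, of scoring a tracker's estimates against the robot's known gaze angles, assuming both arrive as synchronized samples in degrees.

```python
import math

def angular_errors(ground_truth, tracked):
    """Per-sample angular error (degrees) between the robot's commanded gaze
    directions and the eye tracker's estimates.

    Each sample is a (horizontal_deg, vertical_deg) pair."""
    errors = []
    for (gt_h, gt_v), (tr_h, tr_v) in zip(ground_truth, tracked):
        # Euclidean distance in angle space is a common small-angle approximation
        errors.append(math.hypot(tr_h - gt_h, tr_v - gt_v))
    return errors

def summarize(errors):
    """Report mean error (accuracy) and sample-to-sample scatter (precision)."""
    mean = sum(errors) / len(errors)
    variance = sum((e - mean) ** 2 for e in errors) / len(errors)
    return {"accuracy_deg": mean, "precision_deg": math.sqrt(variance)}

# Hypothetical synchronized samples: robot gaze vs. tracker estimates
robot_gaze = [(0.0, 0.0), (5.0, 0.0), (5.0, 5.0), (-10.0, 2.0)]
tracker_out = [(0.3, -0.2), (4.6, 0.1), (5.4, 4.7), (-9.5, 2.4)]

print(summarize(angular_errors(robot_gaze, tracker_out)))
```

Reporting mean error as accuracy and scatter as precision mirrors how tracking systems are typically characterized, and a robotic ground truth is exactly what makes those numbers trustworthy.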
What stands out to me is how this kind of work turns the eye into a bridge between neuroscience and engineering, rather than a one-way sensor bolted onto a screen. The EyeRobot is described as useful not only for building better tracking hardware but also for probing how gaze patterns change in health and disease, which hints at future diagnostics that might spot early signs of conditions like dementia or concussion simply by watching how someone looks at a scene. That dual role, as both controller and biomarker, is already visible in the way robotic eyes and tracking systems are evaluated for precision in clinical as well as consumer contexts.
Phones that follow your gaze
The leap from research rigs to smartphones is where the sci-fi comparison starts to feel literal, because it means eye-tracking is arriving in a device that already sits at the center of daily life. At Mobile World Congress, Honor showed how its Magic6 Pro could respond to a user’s gaze, letting people trigger actions on the phone or connected devices simply by looking at on-screen elements. Instead of tapping a button to answer a call or launch an app, the system detects where your eyes land and interprets that as intent, effectively turning the display into a field of invisible switches that light up only for you.
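Honor has not published how its gaze feature is implemented, but the dwell-selection idea it demonstrates can be sketched in a few lines: map each gaze sample to an on-screen element, and fire that element's action only after the eyes have rested on it for a threshold duration. The element names, coordinates, and timings below are illustrative assumptions, not the Magic6 Pro's actual values.

```python
# Minimal dwell-based gaze selection sketch (illustrative, not Honor's code).
DWELL_SECONDS = 0.8  # assumed dwell threshold before a look counts as intent

# Hypothetical on-screen targets: name -> (x, y, width, height) in pixels
TARGETS = {
    "answer_call": (100, 1800, 300, 150),
    "open_camera": (500, 1800, 300, 150),
}

def hit_test(x, y):
    """Return the target under the gaze point, if any."""
    for name, (tx, ty, w, h) in TARGETS.items():
        if tx <= x <= tx + w and ty <= y <= ty + h:
            return name
    return None

def process_gaze(samples):
    """samples: iterable of (timestamp_s, x, y). Yields triggered target names."""
    current, since = None, None
    for t, x, y in samples:
        target = hit_test(x, y)
        if target != current:
            current, since = target, t  # gaze moved to a new element, restart timer
        elif current is not None and t - since >= DWELL_SECONDS:
            yield current
            since = t  # require another full dwell before re-triggering

# Example: the gaze rests on the answer button for roughly a second
stream = [(0.00, 150, 1850), (0.40, 160, 1860), (0.85, 155, 1855)]
print(list(process_gaze(stream)))  # -> ['answer_call']
```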
In my view, this kind of feature is less about novelty and more about testing how comfortable people are with gaze as a mainstream input, especially when it is paired with the phone’s existing sensors and AI. If a handset can reliably infer what you want from a glance, it can start to control smart home gear, cars, or wearables without forcing you to juggle menus or voice commands, which is exactly the promise behind demonstrations of Honor’s eye-tracking tech on the Magic6 Pro. The risk, of course, is that the same data could be mined to infer what content holds your attention or when you are emotionally vulnerable, so the race to commercialize gaze control will need to be matched by equally serious rules about how those signals are stored and shared.
Futuristic research that treats gaze as a control signal
Behind the polished phone demos, researchers are building systems that treat eye movements as a universal remote for the physical world, not just for screens. One project, covered by Jan under the headline Researchers Unveil Futuristic Eye Tracking Technology That Sounds Like Something from Science Fiction, uses high-precision tracking to let a user operate Internet of Things devices with nothing more than a directed look, effectively turning lamps, thermostats, or speakers into gaze-activated endpoints. The framing leans on science fiction, but the underlying idea is straightforward: if the system knows exactly what object you are looking at, and can authenticate that it is really you, then a short dwell of your eyes can stand in for a button press or spoken command.
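The coverage does not spell out the implementation, but the control loop it implies can be sketched as follows: resolve the gaze direction to the nearest registered device and, once the user is authenticated and the dwell threshold is crossed, issue that device a command. The device names, angles, and tolerance below are assumptions for illustration only.

```python
import math

# Hypothetical room map: device -> (azimuth_deg, elevation_deg) from the user's seat
DEVICES = {
    "desk_lamp": (-30.0, 5.0),
    "thermostat": (15.0, 10.0),
    "speaker": (60.0, 0.0),
}
ANGULAR_TOLERANCE_DEG = 8.0  # assumed matching window around each device

def device_under_gaze(gaze_az, gaze_el):
    """Return the registered device closest to the gaze direction, if any."""
    best, best_err = None, ANGULAR_TOLERANCE_DEG
    for name, (az, el) in DEVICES.items():
        err = math.hypot(gaze_az - az, gaze_el - el)
        if err < best_err:
            best, best_err = name, err
    return best

def on_dwell(gaze_az, gaze_el, user_authenticated):
    """Called once the dwell threshold is reached; issues a toggle command."""
    target = device_under_gaze(gaze_az, gaze_el)
    if target and user_authenticated:
        print(f"toggle -> {target}")  # stand-in for a real IoT command

on_dwell(-28.0, 6.5, user_authenticated=True)  # -> toggle -> desk_lamp
```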
What I find striking is how this research treats the environment itself as a canvas for interaction, rather than confining gaze to a single display. The same project is described as part of a broader push to integrate eye-based control into Internet of Things devices, which means the tracking hardware and software have to cope with cluttered rooms, moving people, and variable lighting. That is a much harder problem than following a cursor on a monitor, yet it is exactly what would be needed for a future in which you could dim a specific lamp, start a robot vacuum, or arm a security system simply by looking at it. The ambition is captured in that headline about tracking technology that sounds like something from science fiction, a label that reflects both the technical challenge and the cultural resonance of machines that respond to our gaze.
DIY sci-fi: lasers, backpacks, and creator experiments
While institutional labs refine algorithms and sensor arrays, independent creators are stress-testing the same ideas in far more theatrical ways. In a widely shared clip, Hacksmith has turned sci-fi into reality with a dual-laser backpack that uses eye-tracking to control where the beams point, a setup that looks like something ripped from a comic book but is grounded in the same core principle of mapping gaze to precise coordinates. The rig straps onto the user, tracks their eye movements, and then steers the lasers accordingly, turning a glance into a targeting command that feels both exhilarating and slightly unsettling to watch.
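Hacksmith has not released the backpack's firmware, but the core transform the video demonstrates, from where the eyes point to where the beams point, can be sketched as a clamped angle mapping. The servo limits below are placeholder assumptions, and the sketch ignores the offset between the wearer's eyes and the shoulder-mounted emitters.

```python
def gaze_to_beam_angles(gaze_az_deg, gaze_el_deg,
                        pan_limit_deg=45.0, tilt_limit_deg=30.0):
    """Map the eye's direction (azimuth/elevation relative to straight ahead)
    to pan/tilt commands for a beam-steering mount, clamped to its travel.

    Illustrative only; the limits are placeholder assumptions, not the
    backpack's real servo specs."""
    pan = max(-pan_limit_deg, min(pan_limit_deg, gaze_az_deg))
    tilt = max(-tilt_limit_deg, min(tilt_limit_deg, gaze_el_deg))
    return pan, tilt

# The wearer glances 20 degrees right and 5 degrees up
print(gaze_to_beam_angles(20.0, 5.0))    # -> (20.0, 5.0)
print(gaze_to_beam_angles(80.0, -50.0))  # out-of-range glance is clamped -> (45.0, -30.0)
```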
As a journalist, I see these kinds of prototypes as cultural barometers, because they show how quickly the public imagination jumps from subtle interface tweaks to overtly weapon-like applications. The fact that the device is billed explicitly as an eye-controlled dual laser backpack underlines how eye-tracking can just as easily be paired with tools that cut or burn as with accessibility aids or productivity apps. Another version of the clip, circulated as futuretech content from Hacksmith, reinforces that the spectacle is part of the appeal, yet it also serves as a reminder that regulators and designers will have to think hard about where gaze-controlled hardware should and should not be deployed, especially in public spaces.
Blurring the line between vision, dreams, and communication
The most radical frontier for eye-related technology is not in how we control devices while awake, but in how we might communicate from within altered states of consciousness. Researchers have combined electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) to detect lucid dream activity, using those brain signals to identify when a person becomes aware that they are dreaming. The work is framed as a neurotechnology breakthrough that may have just turned science fiction into reality, because it hints at the possibility of sending messages between the waking world and the dream world for human connection, a concept that has long been a staple of speculative fiction.
Although the primary sensors in that study are EEG and fNIRS, the logic is closely related to eye-tracking, since lucid dreamers often use deliberate eye movements as a way to signal to researchers that they know they are dreaming. In practical terms, that means the same ecosystem of tools that can read gaze during wakefulness could eventually be paired with brain monitoring to build richer two-way channels between internal experience and external devices. The study's pairing of EEG and fNIRS to detect lucid dream states sits alongside more public-facing summaries that present the same work as a neurotechnology breakthrough, one that edges us closer to reading and responding to the mind in real time.
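To make that eye-signalling idea concrete, here is a toy sketch of detecting a deliberate left-right-left-right glance sequence in a normalized horizontal gaze or EOG trace; the threshold and encoding are my own assumptions, not the study's pipeline.

```python
def detect_lrlr(horizontal_trace, threshold=0.8):
    """Detect a deliberate left-right-left-right (LRLR) eye signal in a
    normalized horizontal trace (-1 = hard left, +1 = hard right).

    Illustrative sketch of the signalling convention lucid-dream studies use;
    the threshold and encoding are assumptions."""
    pattern = []
    for sample in horizontal_trace:
        if sample <= -threshold:
            direction = "L"
        elif sample >= threshold:
            direction = "R"
        else:
            continue
        # Record only changes of direction, so a held glance counts once
        if not pattern or pattern[-1] != direction:
            pattern.append(direction)
    return "LRLR" in "".join(pattern)

trace = [0.0, -0.9, -0.95, 0.85, 0.9, -0.9, 0.88, 0.1]
print(detect_lrlr(trace))  # -> True
```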
When I connect these threads, from Jan's gaze-controlled Internet of Things research to the EyeRobot, the Magic6 Pro, Hacksmith's laser backpack, and EEG-driven dream experiments, I see a single trajectory: our eyes are becoming both a steering wheel for the digital world and a diagnostic port into our inner lives. That dual role is what makes the technology feel so uncanny, because it collapses the distance between looking, intending, and acting that we are used to in traditional interfaces. The challenge now is to decide how far we want to push that collapse, and what safeguards we demand, before the ability to control and interpret gaze becomes as ubiquitous as the cameras already staring back at us.