Sports Photographer Captures the Action on the Field

The next revolution in sports viewing is not a new rulebook or a streaming app; it is a camera that seems to float through the action as if gravity no longer applies. By combining ultra-light rigs with real-time computer vision, a Norwegian startup is turning live matches into something that feels closer to a video game than a traditional broadcast. If it works at scale, this “weightless” perspective could reshape how I, and millions of other fans, experience everything from tennis to the NFL.

From Norwegian lab project to tennis proving ground

The story starts with Muybridge, a Norwegian startup that set out to rethink how cameras move around athletes instead of locking them to tripods, rails, or drones. Founded by Håkon Espeland and Anders Tomren in 2020, the company has spent nearly five years building real-time computer vision tools that let its system understand where players, lines, and balls are at any moment, then steer lightweight cameras through that space as if they were weightless. That foundation, according to reporting on how the company was founded, is what lets the camera glide through rallies without colliding with players or equipment.

Tennis has been the ideal sandbox for this experiment, because the court is compact, the lines are fixed, and the ball’s path is brutally unforgiving of tracking errors. Muybridge’s system has already “rocked” pro tennis by dropping cameras to angles that used to be impossible, hugging the baseline or sliding just above the net while still keeping the full geometry of the court in view. One report notes that tennis has been an effective launchpad for the company’s technology, and that when operators lowered the cameras to the lowest safe point, the result was a perspective that felt like standing inside the rally rather than watching from the stands, a shift captured in detail in coverage of how tennis became its first big test.

How a “weightless” camera actually works

What makes this system feel so radical is not just that the camera is light; it is that the rig behaves more like a digital character than a piece of metal. Muybridge’s platform uses computer vision to build a live 3D model of the court and everyone on it, then calculates safe paths for the camera to move through that space in real time. The company’s engineers have described how this real-time processing lets the camera appear to float, adjusting its trajectory on the fly as players sprint, slide, or dive, which is why early adopters describe it as a kind of Norwegian magic trick rather than a conventional broadcast tool.
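To make that idea concrete, here is a minimal sketch of the kind of per-frame collision check such a system would need: given tracked player positions from the vision pipeline, a desired camera waypoint is pushed out of a safety bubble around each player. Everything here is an illustrative assumption, not Muybridge’s actual code; the function name, the safety radius, and the coordinates are invented for the demo.

```python
# A minimal, hedged sketch of collision-aware waypoint adjustment for a
# "weightless" camera. All names and numbers are assumptions for illustration.
import numpy as np

SAFE_RADIUS = 2.0  # metres to keep between lens and any player (assumed)

def adjust_waypoint(target: np.ndarray, players: np.ndarray) -> np.ndarray:
    """Push a desired camera position out of each player's safety bubble.

    target:  (3,) desired camera position in court coordinates
    players: (N, 3) tracked player positions from the vision pipeline
    """
    pos = target.astype(float).copy()
    for p in players:
        offset = pos - p
        dist = np.linalg.norm(offset)
        if dist < SAFE_RADIUS:
            # Slide the camera radially outward to the edge of the bubble;
            # if it sits exactly on a player, push it straight up instead.
            direction = offset / dist if dist > 1e-6 else np.array([0.0, 0.0, 1.0])
            pos = p + direction * SAFE_RADIUS
    # Note: this is a single greedy pass; a real planner would optimize a
    # whole trajectory, not one waypoint at a time.
    return pos

# Example: a camera diving toward the baseline while two players rally.
players = np.array([[0.0, 11.9, 0.9], [0.5, 11.0, 0.9]])
waypoint = np.array([0.3, 11.5, 0.8])  # would pass too close to both players
print(adjust_waypoint(waypoint, players))
```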

This approach builds on a long lineage of camera innovations that tried to free the lens from static positions. The Louma crane, for instance, was celebrated for creating a camera that, as one historical account puts it, was freed from its earthly constraints and could move fluidly through space, rising to great heights or dropping down low, obedient to the operator like a remote-controlled mechanical bird. That earlier leap, documented in an exploration of the obedient crane, relied on physical arms and counterweights. Muybridge’s twist is to replace most of that hardware with software, letting algorithms, not steel, do the heavy lifting.

Beyond tennis: from dugouts to F1 garages

Once a system like this proves itself in tennis, the obvious question is where it goes next, and Muybridge’s ambitions are not subtle. Reporting on the company’s roadmap notes that the startup has already targeted soccer, hockey, and F1 as prime candidates, along with use cases far from stadiums, such as emergency medicine. In soccer, a weightless rig could skim just above the grass, tracking a winger’s run from the touchline to the box, while in hockey it could slide along the boards at puck level, giving viewers a view that traditional jib arms or cable-suspended cameras cannot match, a vision that aligns with descriptions of what Muybridge plans to tackle next.

The same reporting makes clear that the company is already experimenting with perspectives that used to be off limits, such as letting coaches in a baseball dugout see the field from the dugout’s own vantage point, or giving F1 engineers a camera that can move around a car in the garage without bulky rigs. In early tests, when operators dropped the cameras to their lowest safe angle, the footage felt less like a broadcast and more like a player’s-eye view, which is exactly the kind of immersion leagues are chasing as they compete with gaming and social media for attention. That is why I see this technology less as a niche gadget and more as a new grammar for how live sport is framed.

3D clones, AI capture, and the race for immersion

The weightless camera is arriving at the same moment as a broader wave of volumetric and AI-driven capture that is trying to turn every game into a manipulable 3D scene. One viral demonstration from April showed how Arcturus uses a ring of cameras to record athletes from every angle, then reconstructs them as 3D models that can be replayed from any viewpoint, a process described as a brand-new type of AI capture technology. In that demo, Arcturus places cameras around the field, captures the action, and then lets editors or fans fly a virtual camera through the reconstructed play, a workflow that the company itself has framed as a potential Arcturus-driven reinvention of sports media.
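The replay trick is easier to see with a little geometry: once a play exists as a cloud of 3D points, any “new angle” is just a projection through a virtual lens. The sketch below uses standard pinhole-camera math on synthetic data; it assumes nothing about Arcturus’s real pipeline beyond the basic idea of a movable virtual camera, and every name and number in it is made up for the demo.

```python
# Toy illustration of volumetric replay: project a reconstructed point
# cloud through an arbitrary virtual camera. Generic geometry, not any
# vendor's actual pipeline.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 0.0, 1.0])):
    """Build a world-to-camera rotation and translation for a virtual lens."""
    fwd = target - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    down = np.cross(fwd, right)            # camera y-axis (image down)
    R = np.stack([right, down, fwd])       # rows: camera axes in world frame
    t = -R @ eye
    return R, t

def project(points, R, t, f=800.0, cx=640.0, cy=360.0):
    """Project world points into pixel coordinates with a pinhole model."""
    cam = points @ R.T + t
    cam = cam[cam[:, 2] > 0]               # keep points in front of the lens
    return np.stack([f * cam[:, 0] / cam[:, 2] + cx,
                     f * cam[:, 1] / cam[:, 2] + cy], axis=1)

# Fly the virtual camera in a circle around a (synthetic) reconstructed athlete.
athlete = np.random.default_rng(0).normal([0, 0, 1], 0.3, size=(500, 3))
for angle in np.linspace(0, 2 * np.pi, 8, endpoint=False):
    eye = np.array([4 * np.cos(angle), 4 * np.sin(angle), 1.5])
    R, t = look_at(eye, target=np.array([0.0, 0.0, 1.0]))
    pixels = project(athlete, R, t)
    print(f"angle {angle:4.2f} rad -> {len(pixels)} visible points")
```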

Traditional broadcasters are experimenting with similar ideas using fixed rigs. In one 3D system highlighted in April coverage, cameras are placed around the edges of baseball stadiums to capture real-time action and generate a 3D digital clone of the game, which can then be used to create new angles for fans or analytical tools for coaches. That same report notes that teams can use the resulting data-rich clone to strategize and manage their players more effectively, turning the field into a living dataset as well as a spectacle, a dual role described in detail in an explainer on how those cameras work.
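The analytics side is simple to illustrate. If the digital clone yields a 3D position for each player in every frame, basic kinematics such as sprint speed fall out of a few lines of arithmetic. The sketch below runs on synthetic data; the frame rate and the track are assumptions, not details from the report.

```python
# Hedged sketch of one analytics use for a "digital clone": per-frame 3D
# positions turn directly into kinematics. Synthetic data; assumed names.
import numpy as np

FPS = 30  # assumed capture rate of the stadium camera ring

def sprint_speeds(track: np.ndarray, fps: int = FPS) -> np.ndarray:
    """Per-frame speed (m/s) from an (N, 3) sequence of positions."""
    deltas = np.diff(track, axis=0)               # displacement per frame
    return np.linalg.norm(deltas, axis=1) * fps   # metres per second

# Synthetic baserunner: 90 ft (27.43 m) down the line in about 4 seconds.
frames = int(4.0 * FPS)
track = np.zeros((frames, 3))
track[:, 0] = np.linspace(0.0, 27.43, frames)     # straight-line run
speeds = sprint_speeds(track)
print(f"peak speed: {speeds.max():.2f} m/s, mean: {speeds.mean():.2f} m/s")
```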

What the NFL and remote production can teach Muybridge

Major leagues are already proving that fans and officials will accept machine-guided views when the payoff is clarity. The NFL’s decision to install Sony’s Hawk-Eye system in all 30 of its stadiums is a prime example, with the league arguing that the multi-camera setup will save time over traditional chain measurements while improving accuracy on first-down calls. The same system, which has long been used in tennis, cricket, soccer, and rugby, triangulates the ball’s position using a network of synchronized lenses, a process detailed in reporting on how Sony built Hawk-Eye for the NFL.
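Triangulation itself is textbook geometry. The sketch below implements the generic linear (DLT) method for recovering a 3D point from two or more calibrated views; it is not Sony’s proprietary Hawk-Eye code, and the camera matrices are invented for the demo.

```python
# Textbook linear triangulation (DLT), the geometric core of multi-camera
# ball tracking. Generic method on synthetic cameras, not Hawk-Eye itself.
import numpy as np

def triangulate(proj_mats, pixels):
    """Recover a 3D point from its pixel coordinates in several cameras.

    proj_mats: list of (3, 4) camera projection matrices P = K[R|t]
    pixels:    list of (u, v) observations, one per camera
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        rows.append(u * P[2] - P[0])   # each view contributes two
        rows.append(v * P[2] - P[1])   # linear constraints on the point
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)        # least-squares null vector of A
    X = vt[-1]
    return X[:3] / X[3]                # dehomogenize

# Two synthetic cameras observing the point (1, 2, 10).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])
X_true = np.array([1.0, 2.0, 10.0, 1.0])
pix = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P1, P2)]
print(triangulate([P1, P2], pix))     # ~ [1, 2, 10]
```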

At the same time, production companies are rethinking where the people behind these systems actually sit. One analysis of remote production trends notes that, whether in compact flypacks, production vans, or centralized control rooms, the right technology now lets directors and technicians work far from the venue while still elevating the emotional impact of the pictures. That shift, framed as a key part of how immersive storytelling is redefining the playing field, suggests that a weightless camera could be piloted from a control room hundreds of miles away, its movements choreographed alongside virtual graphics and 3D replays, a possibility that aligns with how remote workflows are evolving.
