Morning Overview

Startups pitch optical metamaterials to speed up AI data centers

Semiconductor giant Marvell Technology has agreed to acquire Celestial AI, a startup building optical interconnects for artificial intelligence infrastructure, in a deal structured around cash, stock, and performance-based milestone shares. The acquisition, disclosed in a federal securities filing, arrives just months after Celestial AI closed a $250 million funding round to scale its photonic chip platform. Separately, a Duke University spinout called Neurophos recently raised $7.2 million to develop metamaterial-based optical AI chips. Together, these moves signal that photonics, from optical interconnects to metamaterial-based chips, is attracting serious capital as a potential fix for the data transfer and energy bottlenecks that threaten to slow AI’s rapid expansion.

Why Copper Wiring Hits a Wall

Training large AI models requires moving enormous volumes of data between processors, memory, and storage at speeds that traditional copper-based electrical interconnects struggle to sustain. As clusters of graphics processing units scale into the tens of thousands, the wiring that links them becomes a chokepoint. Electrical signals lose energy as heat, degrade over distance, and consume growing amounts of power, all of which drive up operating costs and limit how fast models can learn.

Optical interconnects replace copper paths with light, which can carry far more data per second over longer distances while generating less waste heat. The concept is not new in telecommunications, but adapting it for the tight confines of a data center chip package is an engineering challenge that has kept photonic solutions on the margins of mainstream computing until recently. The surge in AI workloads has changed the calculus. If electrical links cannot keep pace with the next generation of AI accelerators, the entire training pipeline slows down regardless of how powerful the processors themselves become.
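The arithmetic behind that chokepoint is easy to sketch. The snippet below multiplies per-link bandwidth by energy per bit to estimate interconnect power at cluster scale; the picojoule-per-bit figures, link speed, and cluster size are illustrative assumptions for the sake of the comparison, not measurements from any vendor or from the companies discussed here.

```python
# Back-of-envelope: interconnect power at cluster scale.
# All figures below are illustrative assumptions, not vendor specs.

def link_power_watts(bandwidth_gbps: float, energy_pj_per_bit: float) -> float:
    """Power drawn by one link: (bits/second) * (joules/bit)."""
    bits_per_second = bandwidth_gbps * 1e9
    joules_per_bit = energy_pj_per_bit * 1e-12
    return bits_per_second * joules_per_bit

# Assumed ballpark figures: ~5 pJ/bit for a long-reach electrical link
# versus ~1 pJ/bit for an optical one, each running at 800 Gb/s.
electrical = link_power_watts(800, 5.0)  # watts per electrical link
optical = link_power_watts(800, 1.0)     # watts per optical link

# Across a hypothetical 10,000-GPU cluster with 8 links per GPU,
# the per-link difference compounds into a large power gap.
links = 10_000 * 8
print(f"electrical: {electrical * links / 1000:.0f} kW")
print(f"optical:    {optical * links / 1000:.0f} kW")
```

Even with generous assumptions, the gap scales linearly with link count, which is why interconnect energy per bit becomes a first-order cost as clusters grow.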

Celestial AI’s Photonic Fabric Platform

Celestial AI built its pitch around a technology it calls Photonic Fabric, a platform designed to move data optically between compute, memory, and networking components inside AI systems. What distinguishes the approach from earlier photonic efforts is its claimed compatibility with standard semiconductor manufacturing and 2.5D packaging processes, according to the company’s recent funding announcement. That compatibility matters because it means data center operators could, in theory, integrate optical links without overhauling their existing chip assembly lines or adopting entirely new fabrication tools.

The company secured $250 million in funding earlier this year to scale the platform toward production. That round placed Celestial AI among the more heavily backed photonic startups in the AI hardware space, giving it capital to hire engineers, build prototypes, and negotiate with foundry partners. The size of the raise also reflected investor confidence that the AI infrastructure market would reward optical solutions capable of slotting into current manufacturing workflows rather than demanding a clean-sheet redesign.

Celestial AI’s vision centers on decoupling memory from compute while keeping data movement efficient. By using light to shuttle information among disaggregated resources, the company aims to let system designers pool memory and accelerators more flexibly and keep utilization high. If successful, that could reduce the number of underused chips idling in racks because they are starved for bandwidth, a growing concern as model sizes and dataset volumes continue to climb.

Marvell’s Acquisition Bet

Marvell Technology, a major supplier of data infrastructure semiconductors, moved to buy Celestial AI through an Agreement and Plan of Reorganization filed with the U.S. Securities and Exchange Commission. The deal’s structure, combining upfront cash, Marvell shares, and additional milestone-contingent shares, suggests that the final price tag depends partly on whether Celestial AI’s technology hits specific technical or commercial targets after closing.

That milestone structure is telling. It implies Marvell views the Photonic Fabric platform as promising but not yet proven at the scale its largest customers, such as hyperscale cloud providers and AI chip designers, would require. By tying a portion of the consideration to future performance, Marvell limits downside risk while still securing access to a technology that could differentiate its product line if optical interconnects become standard in next-generation AI clusters.

For Celestial AI’s investors, the acquisition converts a high-risk venture bet into a partial exit with continued upside tied to milestones. For Marvell, it adds an optical interconnect capability that complements its existing portfolio of custom silicon, electro-optics, and networking chips sold to cloud and enterprise data center operators. The strategic logic is straightforward. If AI training infrastructure shifts toward photonic data movement, Marvell wants to own the technology rather than license it from a competitor.

The deal also underscores how established chip companies increasingly rely on acquisitions to keep pace with rapid shifts in AI hardware. Building an in-house photonics program from scratch can take years and require specialized expertise that is in short supply. Buying a startup with a focused team and working prototypes gives Marvell a faster route to market, even if some of the technical and manufacturing risks remain unresolved.

Neurophos and the Metamaterial Angle

While Celestial AI focused on optical data transport, a younger startup is targeting a different layer of the problem. Neurophos, spun out of Duke University, raised $7.2 million to develop optical AI chips that use metamaterials (engineered structures with properties not found in natural materials) to perform computation with light rather than electricity.

The distinction between Neurophos and Celestial AI is worth understanding. Celestial AI uses photonics primarily to move data between conventional electronic processors more efficiently. Neurophos aims to perform the AI computation itself optically, which could eliminate some of the energy-intensive electronic processing steps altogether. If the approach works at scale, it would represent a more radical departure from current chip architectures, one that processes information with light rather than merely transporting it that way.

At $7.2 million, the Neurophos raise is a small fraction of Celestial AI’s $250 million round, reflecting the earlier stage of the technology and the higher technical risk involved. Metamaterial-based optical computing remains largely a laboratory achievement, and translating it into chips that can be manufactured reliably and cheaply enough for data center deployment is a challenge that has stalled previous photonic computing efforts over the past two decades.

Still, the investment signals that venture capitalists and strategic backers see potential in approaches that rethink not just how data moves but how it is processed. If optical computing can deliver meaningful gains in performance per watt, it could complement interconnect-focused platforms like Photonic Fabric, with light-based processors linked by light-based networks inside future AI systems.

What Stands Between Lab and Data Center

The common thread linking these companies is a shared bet that light will replace electrons for critical tasks inside AI infrastructure. But the gap between a working prototype and a product that hyperscale operators will actually deploy remains wide. Optical components must meet strict standards for yield, reliability, thermal performance, and cost before they can be integrated into the dense, power-hungry racks that dominate modern data centers.

Manufacturing is one obstacle. Photonic devices often require materials and structures that do not map neatly onto the highly optimized processes used for mainstream CMOS electronics. Celestial AI’s emphasis on compatibility with standard manufacturing flows is an attempt to sidestep that problem, but proving it out at volume will be crucial. If yields fall or packaging proves too complex, the economic advantages of optical links could erode quickly.

Another challenge is systems integration. Swapping out copper for optical links is not just a matter of replacing one cable with another. It can change thermal profiles, board layouts, signal integrity assumptions, and even software-level communication patterns. Data center operators are notoriously conservative about adopting new technologies that might jeopardize uptime, so photonic solutions must demonstrate clear, measurable gains in bandwidth and energy efficiency to justify the operational risk.

There is also a standards question. Today’s AI clusters rely on well-understood electrical interconnect protocols and form factors. For photonics to move beyond niche deployments, industry players will need to converge on interoperable specifications that let components from different vendors work together. Large chipmakers like Marvell, with broad customer bases and influence over networking roadmaps, are in a better position to push such standards than standalone startups.

Despite these hurdles, the economics of AI may leave little choice but to experiment with light. As models grow, the cost of moving data is consuming a larger share of total system power and capital expenditure. If optical interconnects and metamaterial-based computing can bend those curves, even incrementally, they could become essential tools for keeping AI’s growth on a sustainable trajectory.

The Marvell and Celestial AI deal and Neurophos’s seed round capture different points along that spectrum: one focused on near- to medium-term gains in data movement, the other on longer-range bets in optical computation. Both reflect a broader realization that simply adding more GPUs will not be enough. To keep pushing the frontiers of AI, the industry may have to rewire, and ultimately rethink, the physics underpinning its most basic infrastructure.


*This article was researched with the help of AI, with human editors creating the final content.