Panthalassa Raises $140M Series B Led by Peter Thiel for Wave-Powered Floating AI Data Centers (May 4, 2026)
Oregon-based Panthalassa raised a $140M Series B led by Peter Thiel's Founders Fund to manufacture and deploy floating, wave-powered AI inference data centers — with the first Ocean-3 nodes targeted for the northern Pacific in 2026 and commercial rollout in 2027.
Portland, Oregon-based Panthalassa announced a $140 million Series B led by Peter Thiel's Founders Fund to finish a pilot manufacturing facility near Portland and deploy its first wave-powered floating AI data centers in the northern Pacific later this year, with commercial rollout targeted for 2027.
What Happened
Panthalassa, founded in 2016 by CEO Garth Sheldon-Coulson (a former Bridgewater Associates senior investment associate) and Chief Innovation Officer Brian Moffat (previously a wave-energy engineer at Spindrift Energy), confirmed via Business Wire that the round was led by Founders Fund and brings the company's total raised to roughly $210 million.
The Series B added a striking lineup of new investors: John Doerr, Marc Benioff's TIME Ventures, Max Levchin's SciFi Ventures, Susquehanna Sustainable Investments, Hanwha Group, Anthony Pratt, Fortescue Ventures, Future Positive, WTI, Nimble Partners, Super Micro Computer, Sozo Ventures, Figma CEO Dylan Field, Planetary VC, Leblon Capital, Resilience Reserve, Portland Seed Fund and the Intrepid Oregon Fund. Returning backers Founders Fund, Gigascale Capital, Lowercarbon Capital, Unless and WovenEarth all participated.
Key Details
- Round size and valuation: $140M Series B, with TechLoy and data-center industry trackers reporting a roughly $1B post-money valuation. Total funding to date is about $210M.
- How the technology works: Each "lollipop-shaped" Ocean-3 node has a buoyant spherical head connected to a long submerged vertical tube. As waves pass, the node bobs up and down, oscillating seawater through the tube and driving turbines that generate electricity onboard. Power is consumed in place, not transmitted back to shore.
- What runs on the node: AI inference workloads on already-trained models. Output is sent to land via low-Earth-orbit satellite as inference tokens, not as electricity. Cold seawater is used directly for cooling the hardware.
- Use of proceeds: Complete the pilot manufacturing facility near Portland, Oregon, and deploy the first Ocean-3 pilot node series in the northern Pacific in 2026, with commercial deployments planned for 2027.
- Strategic backer: Hardware vendor Super Micro Computer's participation hints at the server platform that will sit inside the floating nodes.
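To put the wave-power bullet above in rough numerical context: Panthalassa has not published the Ocean-3 node's conversion efficiency or capture width, but the standard deep-water wave energy flux formula gives a feel for the raw resource the design taps. The sea-state numbers below (3 m significant wave height, 10 s energy period) are illustrative assumptions for a stormy northern-Pacific site, not company figures.

```python
import math

def wave_energy_flux(h_s: float, t_e: float,
                     rho: float = 1025.0, g: float = 9.81) -> float:
    """Deep-water wave energy flux in watts per metre of wave crest:
    P = rho * g**2 * H_s**2 * T_e / (64 * pi).
    h_s: significant wave height (m), t_e: energy period (s)."""
    return rho * g**2 * h_s**2 * t_e / (64 * math.pi)

# Assumed sea state for illustration only (not Panthalassa data):
p = wave_energy_flux(h_s=3.0, t_e=10.0)
print(f"{p / 1000:.0f} kW per metre of wave crest")  # about 44 kW/m
```

At tens of kilowatts per metre of crest, even a modest capture width and conversion efficiency could plausibly feed a rack-scale inference load, which is presumably the bet the Ocean-3 design is making.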
What Developers and Users Are Saying
Reaction across Hacker News and the data-center subreddits has been a familiar mix of fascination and skepticism. The bull case, repeated by several top commenters, is that ocean waves are an underused, near-constant power source and that running inference at the energy source — instead of moving the energy to a grid-tied data center — sidesteps the grid bottleneck that is currently capping AI build-outs in Virginia, Phoenix and Dublin. The Tom's Hardware report frames Panthalassa as part of a broader trend of off-grid AI infrastructure, alongside nuclear-microreactor startups and the much-publicized space-based data-center experiments.
The bear case, voiced loudly on r/datacenter and by ESG-focused analysts, is operational: salt-water corrosion, biofouling, repair logistics in open seas, and the latency and bandwidth realities of LEO-satellite backhaul versus terrestrial fiber. Several commenters noted that even with low-Earth-orbit links, real-time training is implausible at sea — so Panthalassa's bet is fundamentally on inference, not training, becoming the dominant compute workload.
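On the latency point, the physics gives a useful floor. Assuming a typical LEO constellation altitude of roughly 550 km (an assumption; Panthalassa has not named its satellite provider), the minimum round-trip propagation time for a single bent-pipe hop can be sketched as:

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def leo_rtt_floor_ms(altitude_km: float = 550.0) -> float:
    """Physical lower bound on round-trip time over one bent-pipe LEO
    hop (ground -> satellite -> ground, for both request and response),
    ignoring queuing, processing, and inter-satellite routing."""
    one_direction_s = 2 * altitude_km * 1000 / C_M_PER_S  # up + down
    return 2 * one_direction_s * 1000                     # there and back, ms

print(f"{leo_rtt_floor_ms():.1f} ms minimum")  # about 7.3 ms at 550 km
```

Real-world figures are far higher once ground-station routing, queuing, and multiple satellite hops are included, which is why commenters see batch inference, not interactive traffic, as the natural fit.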
What This Means for Developers
If Ocean-3 nodes ship on schedule, developers building AI applications get another differentiated inference endpoint to target in 2027 — one whose pitch is green, off-grid AI inference rather than the lowest p99 latency. Expect Panthalassa to expose its capacity through a fairly standard inference API, with marketing focused on tokens-per-watt and carbon intensity rather than raw throughput. For application developers in latency-sensitive paths (real-time agents, voice, streaming), terrestrial GPU clusters will remain the right choice; for batch and asynchronous AI workloads where carbon and energy cost dominate, sea-based compute could become a credible procurement option alongside nuclear-powered and hydro-powered AI data centers.
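If tokens-per-watt and carbon intensity become the headline procurement metrics, the comparison developers would run is straightforward. The sketch below uses entirely hypothetical numbers (a 5 kW server emitting 1,000 tokens/s, a 400 gCO2/kWh grid, near-zero-carbon wave power) purely to show the shape of the calculation — none of these figures come from Panthalassa.

```python
def gco2_per_million_tokens(tokens_per_s: float, power_w: float,
                            gco2_per_kwh: float) -> float:
    """Carbon intensity of inference in grams CO2 per million tokens.
    Energy per token (J) = power / throughput; 1 kWh = 3.6e6 J,
    so kWh per million tokens = power_w / tokens_per_s / 3.6."""
    kwh_per_mtok = power_w / tokens_per_s / 3.6
    return kwh_per_mtok * gco2_per_kwh

# Hypothetical comparison, illustration only:
grid = gco2_per_million_tokens(1_000, 5_000, 400)  # fossil-heavy grid
wave = gco2_per_million_tokens(1_000, 5_000, 10)   # near-zero-carbon source
print(f"grid: {grid:.0f} gCO2/Mtok, wave: {wave:.0f} gCO2/Mtok")
```

Under these assumptions the energy source, not the hardware, dominates the carbon line item — which is exactly the framing a wave-powered vendor would lead with for batch workloads.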
For infrastructure and platform teams, the more immediate signal is that AI compute is starting to follow the energy, not the other way round. Procurement strategies built around a small set of hyperscaler regions may need to flex to include power-first vendors — wave, nuclear, geothermal — that route through novel network paths.
What's Next
Panthalassa says the pilot manufacturing facility near Portland will be completed with this round, followed by initial Ocean-3 deployments in the northern Pacific in 2026 for inference benchmarking and manufacturing process refinement. Commercial deployments are scheduled for 2027. Full investor roster and technical specifications are on the company's site at panthalassa.com.
Sources
- Business Wire — primary press release with the full investor list and quotes
- GeekWire — regional coverage with founder background and pilot facility details
- Tom's Hardware — technical framing of the Ocean-3 node design and AI infrastructure context
- Data Center Dynamics — data-center industry perspective and commercial-rollout timeline
- ESG Today — climate and sustainability framing of the round
- Tech Startups — additional cross-reference on funding totals and company history