Sony AI's Project Ace Becomes First Autonomous Robot to Beat Elite and Professional Table-Tennis Players (April 23, 2026)
Sony AI's Project Ace, published in Nature on April 23, 2026, is the first autonomous robot to beat elite and professional table-tennis players in competitive matches.
Sony AI has published Project Ace, an autonomous robot that beat elite and professional table-tennis players in competitive matches. The research, announced in a Sony AI press release and published as a cover article in Nature, marks the first time a robot has reached human expert level in a popular real-world competitive sport.
What Happened
Sony AI announced "a major breakthrough in real-world artificial intelligence and robotics" with the publication of Project Ace, a non-humanoid robot built around an eight-jointed arm on a mobile base, an array of nine cameras (including event-based vision sensors), and a model-free reinforcement-learning control system. The work was simultaneously published in Nature as "Outplaying elite table tennis players with an autonomous robot" (Sony AI et al., 2026).
Ace was tested in three rounds. In the initial round against seven elite amateurs, Ace won 3 of 5 matches against each opponent, returning more than 75% of shots and handling balls with up to 450 rad/s of spin. In December 2025, Ace beat one professional outright and split matches with another. In a March 2026 round against three new professionals, Ace beat each of them at least once.
Key Details
- First-of-its-kind result — Ace is the first autonomous robotic system to beat elite and professional players in a popular real-world competitive sport, according to Sony AI's announcement.
- Hardware — Eight degrees of freedom in the arm, mounted on a mobile base, with nine cameras including event-based vision sensors that report only pixel changes (microsecond latency).
- Software — A new high-speed perception system feeding a model-free reinforcement-learning control policy; no scripted shot library.
- Spin handling — Returns shots with up to 450 rad/s of spin, the regime where most prior table-tennis robots failed.
- Peer-reviewed — The methodology and full match results are documented in a Nature cover paper, not a press-release-only claim.
- Public demo — Sony AI published an official Project Ace explainer video on its YouTube channel the same day.
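Sony AI has not released the perception code, so the following is only a toy sketch of what "report only pixel changes" means in practice: an event camera emits per-pixel brightness-change events rather than full frames, and a downstream policy can consume them after binning into sparse count frames. The helper name, frame size, and event format here are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def events_to_frame(events, height=64, width=64):
    """Bin a batch of (x, y, polarity) pixel-change events into a
    2-channel count frame: channel 0 = brightness-up (ON) events,
    channel 1 = brightness-down (OFF) events."""
    frame = np.zeros((2, height, width), dtype=np.float32)
    for x, y, polarity in events:
        channel = 0 if polarity > 0 else 1
        frame[channel, y, x] += 1.0
    return frame

# Toy stream: three ON events clustered near one spot (e.g. a moving
# ball edge) and one stray OFF event elsewhere.
events = [(3, 5, +1), (3, 6, +1), (4, 5, +1), (10, 10, -1)]
frame = events_to_frame(events)
print(frame[0].sum(), frame[1].sum())  # 3.0 1.0
```

Because only changing pixels generate events, a fast-moving ball produces a dense, low-latency trace while the static background produces almost nothing, which is why this sensor class suits microsecond-scale tracking.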
What Developers and Users Are Saying
Reaction in the robotics and ML community has been a mix of genuine excitement and reasonable skepticism. TechRadar quoted one external researcher saying the result "totally blew my mind" and called it "a major robotics turning point." On Reddit's r/MachineLearning and r/robotics, top comments praised the integration of event-based vision with model-free RL, noting that this is one of the first publicized industrial uses of event cameras at this scale. The most upvoted critical comments argued the comparison is not perfectly fair: Ace uses nine fixed cameras for full-table coverage, while a human is limited to two eyes on a moving head, and the robot is a tabletop arm rather than a humanoid. Sony AI's own blog acknowledges the system is a research platform and not a consumer product.
What This Means for Developers
For ML and robotics teams, Project Ace is a high-profile demonstration that model-free reinforcement learning plus event-based vision can hit production-grade latency on a fast-moving real-world task. The Nature paper documents the training setup, sensor pipeline, and reward shaping in enough detail to replicate the approach on adjacent problems — bin picking, drone flight, manipulation under uncertainty — without proprietary data. Sony AI has framed Ace as a research platform rather than a product release, but the underlying perception and control stack is exactly the kind of work that tends to seed the next wave of industrial robot startups.
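As a minimal, hypothetical illustration of the "model-free" part (not Ace's actual training code, which is unpublished), the REINFORCE policy-gradient update below improves a softmax policy on a toy paddle-angle task from reward signals alone, with no learned model of ball flight or spin. The task, reward values, and hyperparameters are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a ball-return task: 3 discrete paddle angles,
# each with a fixed expected return rate. Angle 1 is best.
EXPECTED_RETURN = np.array([0.2, 0.9, 0.4])

logits = np.zeros(3)        # softmax policy parameters
baseline, lr = 0.0, 0.1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for _ in range(2000):
    probs = softmax(logits)
    action = rng.choice(3, p=probs)
    reward = EXPECTED_RETURN[action]        # observed reward only:
    # no dynamics model of the ball is ever built ("model-free").
    grad = -probs
    grad[action] += 1.0                     # d log pi(action) / d logits
    baseline += 0.05 * (reward - baseline)  # running-average baseline
    logits += lr * (reward - baseline) * grad

probs = softmax(logits)
print(probs.round(2))  # probability mass concentrates on action 1
```

The same structure scales up in the real systems this paragraph describes: replace the bandit with a physics-rich environment, the logits with a neural network, and the scalar reward with shaped terms for returning the ball, and the policy still learns purely from trial-and-error reward rather than an explicit model.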
What's Next
Sony AI says Project Ace "will be applied to broader real-world challenges in physical AI" rather than commercialized as a table-tennis trainer. The Nature paper and the Project Ace site at ace.ai.sony are the canonical references. A follow-up tournament round and additional benchmark data are expected over the coming year, and Sony AI has hinted that components of the perception and control pipeline will be released as research artifacts.
Sources
- Sony AI press release — primary source, April 23, 2026.
- Nature: Outplaying elite table tennis players with an autonomous robot — peer-reviewed cover paper.
- Sony AI blog: Inside Project Ace — technical deep dive.
- Project Ace research site — videos, match data, and authorship.
- TechRadar coverage — external researcher reaction.
- Fortune coverage — context for non-technical readers.
- Interesting Engineering coverage — additional technical detail.