We are building a physics-native foundation model for Physical AI and robotics: an architecture that bakes physical dynamics directly into the network. An order of magnitude fewer parameters. Runs on edge hardware.
We collect a large-scale proprietary dataset of human dexterous manipulation with our custom hardware: finger joint positions, contact forces, distributed pressure, arm kinematics, synchronized egocentric video. Growing continuously.
Dexterity data projects down to any gripper — it trains today’s robots better and tomorrow’s dexterous hands in full. One dataset, every embodiment, every timeline.
Prometheus — first-generation world model on real-world manipulation data. Human-to-robot transfer.
Vulcanus — full-scale foundation model on 100,000+ hours. Cross-embodiment transfer. Factory pilots.
Seraphim — production model on 500,000+ hours. Continuous learning. Deployed at scale.
One architecture, five domains. Robotic control 88% CEM at 913K params. Vision 81.03% at 2.26M. Sudoku 97.2% at 120K. Language 1.182 BPB at 18.5M. Maze SOTA generalisation at 40K. No domain-specific modifications.
All of physics stands on one framework: write down an energy function, identify the symmetries, the dynamics follow. The Metriplector is built on the same foundation. Every layer separates conservation from dissipation — the two forces that govern every physical process. The architecture doesn't learn physics from data. It starts with physics and learns the rest.
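The conservation/dissipation split described above can be illustrated with the standard metriplectic formulation. This is a minimal sketch of the general idea under our own assumptions, not the actual Metriplector layer: an antisymmetric operator J conserves the energy E, a symmetric positive-semidefinite operator M can only dissipate it, so dE/dt = −∇E·M∇E ≤ 0 by construction.

```python
import numpy as np

# Metriplectic split (illustrative, not the Metriplector itself):
#   dx/dt = (J - M) @ grad_E(x)
# J antisymmetric       -> conserves E in continuous time
# M symmetric PSD       -> can only remove energy, never add it

def grad_E(x):
    """Gradient of the quadratic energy E(x) = 0.5 * x @ x."""
    return x

J = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # antisymmetric: conservative rotation
M = np.array([[0.0, 0.0],
              [0.0, 0.5]])    # symmetric PSD: damping

def step(x, dt=0.01):
    return x + dt * (J - M) @ grad_E(x)

x = np.array([1.0, 0.0])      # a damped harmonic oscillator
E0 = 0.5 * x @ x
for _ in range(2000):
    x = step(x)
E1 = 0.5 * x @ x
print(E0, E1)                 # E1 < E0: only M dissipates energy
```

Setting M to zero recovers a purely conservative system; the split keeps the two roles separate by construction rather than asking the network to discover them.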
Five unrelated domains respond to the same primitive. The physical principles governing the universe may also govern how intelligent systems should be built.
Former Google DeepMind AI engineers and theoretical physics PhDs. Physics-native neural architectures, from research to production.
Whether you're a researcher who thinks in energy landscapes, an engineer who wants to build what comes after transformers, or an investor backing fundamental breakthroughs — we want to hear from you.
peter@spheroid.ai