Trim
Trim is building an AI foundation model that simulates real-world physical systems over time, using a custom Galerkin-type linear-attention transformer architecture that the company claims scales linearly in compute, versus the polynomial or exponential scaling of traditional solvers.
Executive Summary
Trim is a YC W25-backed seed-stage company building what it calls a foundation model for physics simulation, using a Galerkin-type linear-attention transformer to sharply reduce compute costs for PDE-based simulations. The market timing is genuinely good: the AI physics simulation space is attracting serious capital ($155M to PhysicsX, $100M to Neural Concept) and growing at a 16–21% CAGR, and the founder has authentic domain motivation as the youngest NRC-licensed reactor operator in the US. The core architectural claims are technically grounded in real academic work but remain unvalidated by independent benchmarks, peer review, or paying customers, and the company has no disclosed business model, no co-founder, and only 3 employees in total. The single biggest risk is a crowded, well-funded competitive field: Emmi AI, BeyondMath, and PhysicsX are all pursuing overlapping "foundational AI physics model" positioning with far more capital and headcount than Trim currently has.
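Trim has not published its architecture, so the details of its model are unknown; the linear-scaling claim, however, maps onto a well-known trick from the academic literature on Galerkin-type attention (softmax-free attention that reassociates the matrix product). A minimal sketch, with all variable names and the normalization choice assumed for illustration:

```python
import numpy as np

def galerkin_attention(Q, K, V):
    """Softmax-free, Galerkin-type linear attention.

    Standard attention forms the n x n matrix (Q @ K.T) first,
    costing O(n^2 * d). Dropping the softmax lets us reassociate
    as Q @ (K.T @ V), where K.T @ V is only d x d, so the cost
    becomes O(n * d^2) -- linear in sequence length n.
    """
    n = Q.shape[0]
    context = (K.T @ V) / n   # d x d, built in O(n * d^2)
    return Q @ context         # n x d, also O(n * d^2)

# Illustrative usage with random data (hypothetical sizes).
rng = np.random.default_rng(0)
n, d = 1024, 16
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = galerkin_attention(Q, K, V)
print(out.shape)  # (1024, 16)
```

Because there is no softmax, the result is mathematically identical to computing `(Q @ K.T) @ V / n` the expensive way; the savings come purely from the order of multiplication. Whether Trim's production model realizes this asymptotic advantage at useful accuracy on real PDE workloads is exactly what independent benchmarks would need to show.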