Sift Raises $42M to Build the Missing Data Layer for Physical AI
Sift has raised a $42 million Series B led by StepStone, with participation from GV, Riot Ventures, Fika Ventures, and CIV, bringing total funding to $67 million.
The company is targeting a growing gap in AI infrastructure: the ability to make physical machines understandable to AI systems.
As AI moves beyond software into real-world environments, systems like rockets, satellites, and autonomous vehicles generate millions of sensor data points per second across audio, video, logs, and telemetry. But unlike data produced in software environments, this output is largely unstructured and difficult to interpret.
Sift’s platform aims to solve this by transforming raw machine data into structured, queryable formats that both engineers and AI models can use. In effect, it acts as an “observability layer” for hardware, bringing a level of visibility that software systems have developed over the past two decades.
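To make the idea concrete, here is a minimal sketch of what "structuring raw machine data" can look like. Everything in it is hypothetical: the record fields, function names, and text format are illustrative assumptions, not Sift's actual schema or API.

```python
from dataclasses import dataclass

@dataclass
class TelemetryPoint:
    """One structured sensor reading (illustrative schema)."""
    timestamp: float   # seconds since stream start
    channel: str       # e.g. "engine.temp", "bus.voltage"
    value: float

def parse_raw(lines):
    """Turn raw 'timestamp,channel,value' text lines into structured
    records, skipping malformed lines instead of failing the stream."""
    points = []
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue
        try:
            points.append(TelemetryPoint(float(parts[0]), parts[1], float(parts[2])))
        except ValueError:
            continue
    return points

def query(points, channel, threshold):
    """A toy anomaly check: readings on one channel above a threshold."""
    return [p for p in points if p.channel == channel and p.value > threshold]

raw = [
    "0.001,engine.temp,412.5",
    "0.002,bus.voltage,28.1",
    "corrupted frame ####",
    "0.003,engine.temp,455.0",
]
points = parse_raw(raw)
hot = query(points, "engine.temp", 430.0)
print(len(points), len(hot))  # 3 structured points, 1 above threshold
```

Once machine output is in a form like this, both engineers and AI models can filter, aggregate, and flag it programmatically, which is the visibility gap the article describes.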
Founded by former SpaceX engineers, the company is already working with organizations such as ULA, Astranis, and K2 Space, supporting systems that operate at fleet scale rather than as isolated machines. The shift from managing single assets to operating constellations of hundreds or thousands of systems is driving demand for automation in monitoring, anomaly detection, and performance validation.
With the new funding, Sift plans to expand its engineering team and platform capabilities as more industries move toward AI-controlled hardware systems across defense, space, manufacturing, and autonomy.
TheMarketAI Take
We’ve written before that physical AI is fundamentally different from software AI. Large language models benefit from abundant, structured data. Physical systems do not. Instead, they produce messy, high-frequency, multi-modal data that is difficult to interpret and even harder to scale.
Sift is tackling a less visible but critical layer of that problem: making the physical world legible to AI.
Before AI can control machines, it needs to understand them. That requires translating raw sensor output into structured data pipelines — something that, until now, has largely been handled manually or through fragmented tools.
This reinforces a broader theme in Physical AI: progress may not come from better models alone, but from infrastructure that bridges the gap between real-world complexity and machine understanding.
The physical part of AI remains both the hardest and most interesting frontier. Companies like Sift are betting that whoever builds the data layer wins.