SambaNova has introduced the SN50, a new artificial intelligence inference chip designed to support agentic AI workloads in production environments.

What Capabilities Does the SN50 Offer?
The company said Tuesday the SN50 delivers up to five times the compute performance of competing accelerators and offers up to a threefold reduction in total cost of ownership compared with graphics processing unit-based systems. The processor is built on SambaNova’s reconfigurable dataflow unit architecture and is engineered to support high-concurrency, low-latency inference at scale. It can connect up to 256 accelerators through a multi-terabyte-per-second interconnect, enabling faster time-to-first-token and support for larger batch sizes.
SambaNova said the chip’s three-tier memory architecture is designed to support models with more than 10 trillion parameters and context lengths exceeding 10 million tokens, targeting enterprise deployments that require larger, more complex reasoning models. SoftBank will be the first customer to deploy the SN50 in its next-generation AI data centers in Japan, powering low-latency inference services for enterprise and sovereign customers across the Asia-Pacific region.
How Does the Intel Collaboration Fit In?
Alongside the chip launch, SambaNova announced a planned multi-year collaboration with Intel focused on delivering AI inference systems that combine Intel Xeon processors, GPUs and networking technologies with SambaNova systems.
“By combining Intel’s leadership in compute, networking, and memory with SambaNova’s full-stack AI systems and inference cloud platform, we are delivering a compelling option for organizations looking for GPU alternatives to deploy advanced AI at scale,” said Kevork Kechichian, executive vice president and general manager for Intel’s data center group.
The collaboration is expected to include cloud expansion, integrated infrastructure development and joint go-to-market efforts through Intel’s enterprise and partner channels.
The partnership was unveiled as SambaNova disclosed that it has raised more than $350 million in an oversubscribed Series E financing round led by Vista Equity Partners and Cambium Capital, with participation from Intel Capital and other investors.
SambaNova said the proceeds will be used to expand SN50 manufacturing capacity, scale its AI cloud platform and deepen enterprise software integrations as it prepares to ship the new chip later this year.

