
Untether AI
Ultra-efficient AI inference processor for edge devices.
| Date | Investors | Amount | Round |
|---|---|---|---|
| - | investor investor | €0.0 | round |
| - | investor | €0.0 | round |
| - | investor | €0.0 | round |
| - | investor investor | €0.0 | round |
| - | investor investor | €0.0 | round |
| - | investor investor investor investor | €0.0 | round |
| - | * | N/A | Acquisition |
| Total Funding | | 000k | |

Untether AI is a technology startup that specializes in making AI inference workloads faster, more efficient, and more cost-effective through at-memory computing. This approach places the compute element directly next to the memory cells, which significantly increases compute density and accelerates AI inference across a range of neural networks, including vision, natural language processing, and recommendation engines.
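Much of the appeal of at-memory computing comes from data movement: fetching an operand from off-chip DRAM costs far more energy than the arithmetic performed on it. The sketch below illustrates this with commonly cited ballpark per-operation energy figures; these numbers are illustrative assumptions, not Untether AI's published measurements.

```python
# Illustrative per-operation energy figures in picojoules (pJ).
# These are commonly cited ballpark estimates, NOT Untether AI's numbers.
DRAM_READ_PJ = 640.0   # fetch a 32-bit word from off-chip DRAM
LOCAL_SRAM_PJ = 5.0    # fetch the same word from adjacent on-chip memory
MAC_PJ = 3.0           # one 32-bit multiply-accumulate

def energy_per_mac(fetch_pj: float, mac_pj: float = MAC_PJ) -> float:
    """Energy for one MAC plus one operand fetch, in picojoules."""
    return fetch_pj + mac_pj

far = energy_per_mac(DRAM_READ_PJ)    # compute far from memory
near = energy_per_mac(LOCAL_SRAM_PJ)  # compute next to memory
print(f"{far:.0f} pJ vs {near:.0f} pJ: about {far / near:.0f}x less energy per MAC")
```

Under these assumed figures, keeping operands in memory adjacent to the compute element cuts per-MAC energy by roughly two orders of magnitude, which is the basic argument for the architecture.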
Untether AI's primary market is businesses that rely heavily on AI and need high-performance computing for inference tasks. The company's at-memory architecture is built into its runAI200® devices and tsunAImi® accelerator cards, both of which are currently available for purchase.
Untether AI's business model revolves around the sale of these high-performance computing devices. The tsunAImi® accelerator card, for instance, delivers over 2 PetaOps per card for demanding inference tasks. Packing that much processing power into a standard PCI-Express form factor and power envelope makes the card an attractive option for businesses seeking to optimize their AI workloads.
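To put 2 PetaOps in context, a rough back-of-envelope estimate follows; the per-inference operation count and utilization factor are illustrative assumptions, not vendor figures.

```python
# Back-of-envelope throughput estimate for a 2 PetaOps accelerator card.
# Workload cost and utilization below are illustrative assumptions.
CARD_PEAK_OPS = 2e15     # 2 PetaOps, the card's headline figure
OPS_PER_INFERENCE = 8e9  # ~8 GOPs, a rough ResNet-50-class estimate
UTILIZATION = 0.5        # assume half of peak is sustained in practice

def inferences_per_second(peak_ops: float, ops_per_inf: float, util: float) -> float:
    """Sustained inferences/sec at the given fraction of peak ops."""
    return peak_ops * util / ops_per_inf

rate = inferences_per_second(CARD_PEAK_OPS, OPS_PER_INFERENCE, UTILIZATION)
print(f"~{rate:,.0f} inferences/sec under these assumptions")
```

Even at 50% assumed utilization, a single card would sustain on the order of a hundred thousand inferences per second for a workload of this size, which is why a dense PCI-Express form factor matters.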
In simple terms, Untether AI makes money by selling its computing devices to businesses that need to run AI applications more efficiently and cost-effectively.
Keywords: Untether AI, at-memory computing, AI inference workloads, compute density, neural networks, runAI200® devices, tsunAImi® accelerator cards, high-performance computing, PCI-Express form factor, AI workload optimization.