Metagenomi (MGX.O), a California-based biotech innovator, has started using Amazon Web Services’ Inferentia AI chips to speed up research into next-generation gene-editing tools, the company said Wednesday.
The move highlights a new use case for Amazon's in-house AI hardware, which since its launch has primarily powered chatbots and large language models. Metagenomi said the Inferentia platform provided a significant cost edge, performing the same workloads as Nvidia's GPUs at roughly half the cost.
Metagenomi uses AI to discover and optimize proteins that can precisely deliver genetic material into cells, potentially enabling treatments for genetic diseases. “We generated over a million protein variants from a rare enzyme class,” said Chris Brown, the company’s head of discovery. “Inferentia allowed us to cast a wider net without compromising speed or cost.”
Amazon’s Inferentia chips, introduced in 2019, were designed for low-cost, high-efficiency AI inference and now represent a growing part of Amazon’s strategy to challenge Nvidia’s dominance in AI hardware.
For Metagenomi, the partnership underscores how AI and biotech are converging, transforming how genetic tools are discovered and refined. Analysts say the collaboration could set a precedent for life sciences firms leveraging cloud-based AI to cut costs and expand research capabilities.