Cohere has pulled in another $100 million as an extension to its recently oversubscribed funding round, lifting its valuation to $7 billion. The raise was accompanied by a new strategic partnership with chipmaker AMD, a clear signal that the firm's bet is less on picking winners and losers in the generative AI wars than on building infrastructure capable of competing at speed.
A speedy add to an already oversubscribed round
The extension comes on the heels of a $500 million round that valued Cohere at $6.8 billion, lifting the company's valuation again just weeks later. The Business Development Bank of Canada (BDC) and Nexxus Capital Management join the add-on as new backers, with an emphasis on data residency and national AI capacity, themes that are increasingly decisive for regulated enterprises and public-sector buyers.

Established in 2019 by Aidan Gomez, a co-author of the influential "Attention Is All You Need" paper that introduced the Transformer architecture, Cohere has always had its eyes on the enterprise.
Rather than chasing consumer chat interfaces, the company pitches deploy-anywhere models, security-first tooling and tight integrations with existing data stacks.
Why an AMD deal is important now for Cohere and customers
Securing a deeper partnership with AMD (already one of Cohere's investors) achieves two strategic objectives: diversifying compute supply and sharpening the company's enterprise value proposition. In a market dominated by Nvidia's GPUs, second sources are not just nice to have; they are working hedges against supply interruptions and price swings, as well as engines of performance-per-dollar gains.
AMD's Instinct lineup, led by the MI300 family, has seen real-world adoption at cloud providers and research centers. MLCommons' MLPerf benchmarks have shown steady gains in both training and inference over recent cycles, narrowing the gap on selected workloads. Combined with AMD's open software stack, ROCm, and broadening support across PyTorch, Triton and major inference runtimes, the ecosystem now looks credible for production deployments rather than pilots only.
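For teams weighing that claim, the practical test is whether existing GPU code paths run unchanged. Here is a minimal sketch, assuming a ROCm build of PyTorch, which exposes AMD accelerators through the familiar torch.cuda API:

```python
# Minimal check that a ROCm build of PyTorch sees AMD accelerators.
# On ROCm builds, the torch.cuda API is backed by HIP, so code written
# for the CUDA path runs largely unchanged.
import torch

print("HIP runtime:", torch.version.hip)            # None on CUDA/CPU builds
print("Accelerators visible:", torch.cuda.device_count())

if torch.cuda.is_available():
    x = torch.randn(4096, 4096, device="cuda")      # lands on the AMD GPU under ROCm
    y = x @ x.T
    print("Matmul ran on:", y.device)
```

That portability, rather than raw peak numbers, is what makes a second silicon source realistic for teams with existing CUDA-era codebases.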
For enterprise purchasers, this matters in pragmatic ways: less developer friction as ROCm matures, shorter project timelines thanks to greater accelerator availability, and a tighter grip on total cost of ownership (TCO) alongside reduced single-vendor risk.
If Cohere tunes its models for AMD hardware as part of the deal, through kernel-level tweaks, memory optimizations or quantization paths, customers could see higher throughput for retrieval-augmented generation, summarization and code-assist scenarios.
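To make "quantization path" concrete, here is a minimal sketch using PyTorch's dynamic int8 quantization on a stand-in feed-forward block. It is illustrative only, not Cohere's or AMD's actual optimization pipeline; the kernel-level and memory work mentioned above would sit much deeper in the stack.

```python
# Dynamic int8 quantization of a stand-in feed-forward block.
# Linear layers are swapped for int8 equivalents, shrinking weights
# and typically speeding up inference on supported hardware.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.GELU(),
    nn.Linear(2048, 512),
)

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

tokens = torch.randn(1, 128, 512)  # (batch, sequence, hidden)
with torch.inference_mode():
    out = quantized(tokens)
print(out.shape)  # torch.Size([1, 128, 512])
```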
There is also an energy angle many CIOs notice. Cluster-level power efficiency increasingly drives decisions about what gets deployed in data centers with constrained power and cooling. Vendors that deliver better tokens-per-watt on actual workloads, not just benchmarks, will win contracts over the next several years. Expect Cohere to lean on that story with AMD.
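The arithmetic behind the tokens-per-watt and performance-per-dollar arguments is simple to run in-house. A back-of-envelope sketch with made-up placeholder numbers (not measured figures for any specific accelerator):

```python
# Back-of-envelope comparison of tokens-per-watt and serving cost.
# All figures below are placeholders; substitute results from your own
# workload benchmarks and instance pricing.

def tokens_per_watt(tokens_per_second: float, avg_power_watts: float) -> float:
    """Throughput normalized by sustained power draw."""
    return tokens_per_second / avg_power_watts

def cost_per_million_tokens(tokens_per_second: float, hourly_rate_usd: float) -> float:
    """Rough serving cost per 1M generated tokens at a given hourly price."""
    tokens_per_hour = tokens_per_second * 3600
    return hourly_rate_usd / tokens_per_hour * 1_000_000

# Hypothetical cluster A vs. cluster B under the same inference workload.
for name, tps, watts, rate in [("cluster_a", 12_000, 5_600, 98.0),
                               ("cluster_b", 10_500, 4_300, 72.0)]:
    print(f"{name}: {tokens_per_watt(tps, watts):.2f} tok/W, "
          f"${cost_per_million_tokens(tps, rate):.2f} per 1M tokens")
```

The cluster with the lower headline throughput can still win on both metrics once power and price are in the denominator, which is exactly the pitch a second silicon source needs to land.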
Enterprise-first and the AI sovereignty push
Cohere's go-to-market has not been to herd customers onto public endpoints; it emphasizes private, controllable deployments, whether in VPCs, on-prem or hybrid footprints.

That posture aligns with an expanding regulatory lens. The EU AI Act, industry-specific rules in financial services and healthcare, and government procurement standards are all pushing vendors to demonstrate provenance, minimize data exposure and provide clear audit trails.
Sovereignty is now a buying factor, not just a buzzword. With BDC's involvement and, through Nexxus, an investor base focused on Latin America and Iberia, Cohere can plausibly court markets where local control over data and models is a matter of national interest. Expect that to be packaged as regional deployments, private tuning and IP that runs inside customer environments.
Competitive context: carving out its own space
Rivals are chasing consumer mindshare and astronomical private valuations, while Cohere is trying to make the quieter enterprise lane its own.
Reliability, predictable latency and security posture are the areas procurement teams probe in grueling bake-offs and proof-of-value pilots before scaling. It's a different sort of race: less about viral products, more about compliance, SLAs and cost curves.
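A flavor of what those bake-offs measure, as a minimal sketch: tail-latency percentiles against a stand-in call_model function, which is hypothetical and stands in for whatever vendor client is actually under test.

```python
# Measure p50/p95 latency over repeated calls to a model endpoint.
# call_model is a stub that simulates network plus inference time;
# swap in the real client call for the system being evaluated.
import random
import statistics
import time

def call_model(prompt: str) -> str:
    time.sleep(random.uniform(0.05, 0.25))
    return "response"

latencies = []
for _ in range(200):
    start = time.perf_counter()
    call_model("Summarize the attached contract in three bullet points.")
    latencies.append(time.perf_counter() - start)

latencies.sort()
p50 = statistics.median(latencies)
p95 = latencies[int(0.95 * len(latencies)) - 1]
print(f"p50 {p50 * 1000:.0f} ms, p95 {p95 * 1000:.0f} ms over {len(latencies)} calls")
```

In practice these numbers get collected per workload and per region, because a p95 that looks fine in a pilot can blow past an SLA once traffic is real.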
The AMD deal is also a positioning play. As rivals cozy up to Nvidia, Cohere is signaling multi-cloud, multi-silicon optionality. For customers, that can mean faster access to capacity and better negotiating leverage, both of which matter for programs that need steady training and inference windows every quarter rather than opportunistic bursts of capacity whenever GPUs happen to be available.
What to watch next as Cohere expands AMD partnership
The near-term questions are executional: how fast Cohere can tune its models for AMD clusters, whether it delivers the performance-per-dollar customers care about, and how quickly its private deployment stack keeps pace with new regulation.
Customer wins in regulated industries, where sovereignty, auditability and uptime matter more than usual, are worth tracking as well.
Bottom line: a $7 billion valuation and a chip deal that expands compute access give Cohere new momentum. If it can translate that into faster delivery and lower per-unit costs for enterprise deployments, the company has a chance to punch well above its weight in a market where infrastructure strategy is increasingly shaping who wins the next wave of AI adoption.
