CoreWeave signed the CoreWeave Anthropic Deal, a four-year agreement with Anthropic worth $2.5 billion, on April 12, 2026. The pact provides 50,000 NVIDIA H200 GPUs for AI model training. Shares rose 4.2% to $152 in Nasdaq after-hours trading that day, per Nasdaq data.
CoreWeave Anthropic Deal Details
CoreWeave statements indicate initial capacity totals 50,000 GPUs. The company plans to expand capacity to 100,000 GPUs by the end of 2027. Deployment begins next quarter from data centers in New Jersey and Texas.
Anthropic, developer of the Claude large language models, confirmed the partnership in a blog post dated April 12, 2026. CoreWeave has built its GPU cloud platform exclusively on NVIDIA hardware since its founding in 2019. Each NVIDIA H200 GPU features 141 GB of HBM3e memory, enabling large-scale AI workloads, according to NVIDIA specifications.
CoreWeave generated $1.8 billion in revenue from AI GPU rentals in 2025, according to its SEC filings. The company invested $1.2 billion in data center expansions during that year. These investments support growing demand for high-performance computing in artificial intelligence.
Financial Impact
CoreWeave completed its initial public offering in late 2025 at an $18 billion valuation. Shares closed at $145 before the announcement on April 12, 2026, per Nasdaq data. The CoreWeave Anthropic Deal contributes $625 million in annual bookings, the $2.5 billion total value divided by the four-year term.
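The annualized bookings figure follows from straight-line arithmetic over the contract term; a minimal sketch (variable names are illustrative, not from any CoreWeave disclosure):

```python
# Annual bookings = total contract value spread evenly over the term.
total_value_usd = 2_500_000_000  # $2.5 billion, reported deal value
term_years = 4                   # reported contract length

annual_bookings_usd = total_value_usd / term_years
print(f"${annual_bookings_usd:,.0f} per year")  # $625,000,000 per year
```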
Maria Chen, managing director at Goldman Sachs, raised her price target to $180 per share on April 13, 2026. Chen highlighted CoreWeave's 300% year-over-year revenue growth in Q1 2026. The company reported quarterly revenue of $450 million and a 25% profit margin in its Q1 2026 earnings release.
CoreWeave commands 15% of the AI GPU cloud rental market as of Q1 2026, according to Synergy Research Group data. Hyperscalers hold a combined 60% market share. Specialized providers trail with single-digit percentages.
CoreWeave forecasts 50% compound annual revenue growth through 2028, per its April 2026 investor presentation. The company projects 2027 EBITDA of $800 million. These figures reflect expanding demand for AI infrastructure.
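For context, compounding the projected 50% growth rate from the reported 2025 revenue base implies the following trajectory. This is a back-of-envelope sketch; the company's presentation does not publish these intermediate figures:

```python
# Implied revenue path at a 50% compound annual growth rate,
# starting from the reported 2025 revenue of $1.8 billion.
base_revenue_usd = 1.8e9  # 2025 AI GPU rental revenue
cagr = 0.50               # forecast compound annual growth rate

for year in range(2026, 2029):
    projected = base_revenue_usd * (1 + cagr) ** (year - 2025)
    print(f"{year}: ${projected / 1e9:.2f}B")
```

At that rate, implied 2028 revenue would be roughly $6.1 billion, more than triple the 2025 base.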
Competitive Position
Amazon Web Services leads AI cloud providers with 32% market share in Q1 2026, per Synergy Research Group. Microsoft Azure follows at 25%. Google Cloud holds 12%.
Anthropic CEO Dario Amodei discussed the risks of depending on hyperscalers in a 2025 TechCrunch interview. CoreWeave offers custom GPU configurations and lower latency tailored to AI workloads. These features differentiate it from general-purpose cloud giants.
CoreWeave secured priority access to NVIDIA's next-generation Blackwell GPUs. Deliveries begin in Q3 2026. The platform reached 92% GPU utilization in Q1 2026, exceeding the industry average of 80%, per CoreWeave metrics.
CoreWeave Platform Advances
CoreWeave's Kubernetes-based platform enables multi-tenant AI training at scale. Internal benchmarks show 20% faster inference speeds compared to Amazon Web Services, according to CoreWeave tests released on April 12, 2026.
Anthropic trains its Claude AI models on these dedicated clusters. CoreWeave data centers consumed 500 megawatts of power in Q1 2026, per company filings. New facilities partner with nuclear energy providers to ensure reliable power supply amid rising electricity demands for AI.
AI Cloud Infrastructure and Stock Performance
CoreWeave trades at 12 times forward sales, below the peer average of 20 times, per Wedbush Securities analyst Dan Ives on April 13, 2026. U.S. Federal Trade Commission statements indicate ongoing antitrust probes into AI cloud practices.
CoreWeave raised $1.1 billion in its 2024 Series C funding round from Fidelity Investments and Magnetar Capital. Trading volume on April 12, 2026, reached three times the 30-day average, per Nasdaq data.
Gartner projects global AI spending will hit $200 billion by 2027, with infrastructure accounting for 40% of total outlays. The CoreWeave Anthropic Deal positions the company to gain share in AI cloud infrastructure through specialized GPU offerings.
