What's new on Article Factory and the latest in the generative AI world

Microsoft Won’t Stop Buying AI Chips From Nvidia and AMD Even After Launching Its Own, Nadella Says

Microsoft has begun deploying its home‑grown Maia 200 AI inference chip in its data centers while confirming it will continue purchasing GPUs from Nvidia and AMD. CEO Satya Nadella emphasized the company’s ongoing partnerships and the need to stay ahead in AI hardware. The Maia 200, designed for high‑performance AI model inference, will first be used by Microsoft’s Superintelligence team as they develop frontier models, and will also support OpenAI models on Azure. The move reflects a hybrid strategy of building in‑house chips while leveraging external suppliers. Read more →

OpenAI Secures Multi-Year $10B Compute Deal with Cerebras

OpenAI announced a multi-year agreement with AI chipmaker Cerebras to deliver 750 megawatts of compute power from this year through 2028. The partnership, valued at over $10 billion, aims to accelerate real‑time inference and improve response times for OpenAI’s customers. Cerebras’ low‑latency hardware will complement OpenAI’s existing compute portfolio, providing faster, more natural interactions for AI applications. Both companies highlighted the strategic fit, noting that the deal strengthens OpenAI’s infrastructure while showcasing Cerebras’ advanced chip technology. Read more →

Amazon in Early Talks to Invest Up to $10B in OpenAI

Amazon is reportedly in preliminary discussions to invest as much as $10 billion in OpenAI, a deal that would see the AI lab using Amazon's AI chips. The potential agreement could value OpenAI at more than $500 billion. The move follows Amazon's earlier $8 billion investment in Anthropic and its recent launch of the Trainium chip series. OpenAI's shift to a for‑profit model has opened the door to new partnerships beyond its long‑time backer Microsoft. The proposed investment would add to a series of circular deals linking AI startups with hardware and cloud providers. Read more →

Amazon Negotiates $10B Investment in OpenAI and Provision of Trainium Chips

Amazon is in advanced discussions with OpenAI about a $10 billion investment that would also see the AI startup adopt Amazon's Trainium chips and increase its use of Amazon Web Services. The proposed deal could lift OpenAI's valuation past $500 billion while deepening the company's reliance on Amazon's cloud and hardware infrastructure. At the same time, OpenAI continues to expand relationships with other technology partners, including Microsoft, NVIDIA, Oracle and AMD, prompting investors to scrutinize the circular nature of its financing and compute arrangements. Read more →

Trump Announces Approval for Nvidia's H200 AI Chips to Reach Approved Chinese Customers

President Donald Trump announced that Nvidia has received White House approval to export its H200 AI GPU chips to "approved customers" in China and other nations. The deal includes a 25 percent U.S. cut of sales, up from the 15 percent previously demanded. The H200 chips are an upgrade to the scaled‑down H20 models already allowed for export, though they remain less powerful than Nvidia's flagship Blackwell GPUs. Trump framed the policy as a boost for American jobs, manufacturing, and taxpayers, while noting that Nvidia’s U.S. customers are already moving forward with Blackwell chips. Senators have urged continued restrictions on China’s access to the most advanced AI chips. Read more →

U.S. Commerce Department Grants Nvidia Approval to Export H200 AI Chips to China

The U.S. Department of Commerce has authorized Nvidia to ship its H200 artificial‑intelligence chips to approved Chinese customers. The approval allows the company to sell chips that are roughly 18 months old, with the United States taking a 25% share of the sales. The decision comes amid bipartisan legislative efforts to block advanced AI chip exports and follows a series of policy shifts on U.S. chip sales to China. Lawmakers have introduced the Secure and Feasible Exports (SAFE) Chips Act, which would prohibit such exports for up to 30 months. Read more →

Anthropic CEO Dario Amodei Warns of AI Bubble Risks and Competitive Overreach

Anthropic chief executive Dario Amodei told reporters that the AI sector faces a complex risk environment that could resemble a bubble. While bullish on the technology’s potential, he cautioned that some rivals may take imprudent bets, especially around the timing of economic value, data‑center investment, and chip depreciation. Amodei highlighted the uncertainty of revenue growth, the need for disciplined risk management, and the danger of “YOLO‑style” strategies that could jeopardize companies’ financial health. Read more →

Amazon Says Its Trainium AI Chip Is Already a Multi‑Billion‑Dollar Business

Amazon executives highlighted the rapid growth of the company’s Trainium AI chip line, noting that Trainium2 is already generating multi‑billion‑dollar revenue with over a million chips in production and more than 100,000 customers. The chip’s price‑performance edge has attracted major partners such as Anthropic, which is using hundreds of thousands of chips for its Project Rainier. At AWS re:Invent, Amazon unveiled the next‑generation Trainium3, promising four times the speed and lower power usage, underscoring Amazon’s ambition to challenge Nvidia’s dominance in the AI‑hardware market. Read more →

Microsoft, NVIDIA Commit Up to $15B to Back Anthropic in New AI Partnership

Microsoft and NVIDIA have announced a joint investment of up to $15 billion to support Anthropic, the maker of Claude AI models. Microsoft will invest up to $10 billion while NVIDIA contributes up to $5 billion. In return, Anthropic will purchase $30 billion in Azure cloud capacity and may add up to one gigawatt of additional compute. The partnership also includes integration of Claude models into Microsoft Foundry, and collaboration with NVIDIA to optimize Anthropic’s models for NVIDIA hardware and future architectures. The deal arrives as Microsoft loosens its exclusivity with OpenAI and follows recent large cloud contracts with Amazon. Read more →

Google CEO Warns of AI Bubble Risks While Emphasizing Company’s Full‑Stack Advantage

Alphabet’s market strength is being driven by investor confidence in its ability to rival OpenAI’s ChatGPT and to produce AI‑focused chips that can compete with Nvidia. CEO Sundar Pichai acknowledges that no firm is immune to a potential AI bubble burst, but argues Google’s “full stack” of chips, YouTube data, models and frontier research gives it a unique edge. He cautions users against blindly trusting AI outputs, notes ongoing accuracy concerns, and highlights the massive energy demands of AI that have slowed progress toward Alphabet’s 2030 net‑zero goal. Pichai still describes AI as the most profound technology and stresses that adaptation will create new opportunities. Read more →

Microsoft Leverages OpenAI's Chip Designs to Bolster Its Semiconductor Efforts

Microsoft is turning to its AI partner OpenAI to address challenges in its chip strategy. Under a revised agreement, Microsoft will gain full access to OpenAI’s custom AI chip designs, which are being co‑developed with Broadcom, and secure intellectual‑property rights to those designs. The deal also preserves Microsoft’s access to OpenAI’s models through 2032 while excluding OpenAI’s consumer‑hardware initiatives. Executives describe the move as a pragmatic response to the high cost and complexity of building cutting‑edge AI chips, positioning OpenAI’s expertise as a key accelerator for Microsoft’s semiconductor ambitions. Read more →

Qualcomm Unveils AI200 and AI250 Chips to Challenge Nvidia

Qualcomm announced the AI200 and AI250 processors, built on its Hexagon neural processing technology, to target AI inference workloads and compete with Nvidia's dominance. The AI200, slated for release next year, offers 768 GB of RAM and can be combined in racks of up to 72 chips. The AI250, planned for 2027, promises a generational leap in efficiency and lower power consumption. Saudi Arabia's Public Investment Fund‑backed AI firm Humain will use both chips in its upcoming AI datacenters, signaling Qualcomm's push into the high‑performance AI market. Read more →

OpenAI Secures Major Chip Partnerships with AMD and Broadcom, Heightening Dependence on TSMC

OpenAI has signed significant agreements with AMD and Broadcom to supply AI chips and accelerators for its data‑center deployments. The AMD deal focuses on large‑scale GPU production, while Broadcom will provide custom AI accelerators and Ethernet systems. Both partnerships underline OpenAI's strategy to diversify its silicon supply and reduce reliance on a single vendor. However, the majority of the chips are still expected to be fabricated by Taiwan Semiconductor Manufacturing Company (TSMC), whose capacity constraints and geographic concentration remain a critical bottleneck for the broader AI and semiconductor industries. Read more →

NVIDIA Unveils First U.S.-Made Blackwell Wafer at TSMC Phoenix Facility

NVIDIA announced the production of its first Blackwell wafer at TSMC’s Phoenix, Arizona fab, marking a major step in domestic chip manufacturing for its AI hardware. The Blackwell platform, touted as more powerful and up to 25 times more cost‑ and energy‑efficient than its predecessor, is positioned for volume production and aims to strengthen ties with major AI players while reducing exposure to tariffs and geopolitical risks. CEO Jensen Huang highlighted the significance of the milestone for U.S. semiconductor leadership and noted ongoing plans to expand U.S. manufacturing capacity and invest heavily in AI infrastructure. Read more →

OpenAI Partners With Broadcom To Deploy Custom AI Chips

OpenAI announced a partnership with Broadcom to design and deploy custom AI accelerators. The collaboration will see OpenAI creating specialized hardware while Broadcom handles manufacturing and integration into server racks that include its Ethernet, PCIe and optical connectivity products. Both companies say the effort will improve performance and efficiency for large‑scale AI models such as ChatGPT and the new Sora 2 video generator. Executives highlighted the partnership as a step toward broader AI infrastructure that benefits businesses and consumers, reinforcing a trend of major tech firms joining forces on AI hardware. Read more →

OpenAI Teams Up with Broadcom to Build Custom AI Chips

OpenAI announced a partnership with Broadcom to develop its own custom AI accelerator chips, aiming to diversify its compute supply and lessen reliance on Nvidia. The collaboration targets the creation of up to 10 gigawatts of bespoke AI hardware for OpenAI's data centers, with deployment slated for the second half of 2026 and completion by the end of 2029. The move follows prior multi‑gigawatt deals with AMD and Nvidia and reflects a broader industry push toward in‑house chip design to secure compute capacity for advanced models. Read more →

OpenAI Partners with Broadcom and Other Tech Leaders to Build Multi‑Gigawatt AI Compute Infrastructure

OpenAI is accelerating its AI compute ambitions by teaming with Broadcom to design and deploy custom AI accelerators, targeting a rollout of 10 gigawatts of chips by 2029. The partnership adds to existing agreements with NVIDIA, AMD and Oracle, which together promise tens of gigawatts of additional capacity. CEO Sam Altman has outlined a longer‑term vision of 250 gigawatts of compute over eight years, a scale that would require multi‑trillion‑dollar financing. The combined deals signal OpenAI’s push to secure the hardware foundation needed for its next generation of models while diversifying its supply chain across several chip makers. Read more →

AMD Strikes Major AI Chip Deal with OpenAI, Offering Share Purchase Option

AMD and OpenAI have entered a multi‑year agreement that lets OpenAI buy up to 160 million AMD shares at a nominal price while securing a substantial supply of AMD AI chips. The deal adds roughly 6 gigawatts of AMD‑based compute capacity to OpenAI’s growing AI infrastructure, complementing its existing relationships with Nvidia and its own silicon efforts. Industry sources say the partnership does not alter OpenAI’s broader compute strategy, including its work with Microsoft and Broadcom. Read more →

OpenAI Strikes Major Deal with AMD for AI Chip Supply

OpenAI announced a strategic partnership with AMD to procure AMD's Instinct GPUs for large‑scale AI infrastructure. The agreement envisions deploying up to six gigawatts of compute power, with the first gigawatt slated for the second half of 2026. In addition to the hardware purchase, OpenAI will acquire 160 million AMD shares at a nominal price, potentially giving it a ten‑percent stake in the chipmaker. The deal positions AMD as a core compute partner and underscores OpenAI’s push to diversify its hardware sources beyond its existing relationships. Read more →

Huawei Ascend 950, Nvidia H200, and AMD Instinct MI300: Head‑to‑Head AI Chip Comparison

A side‑by‑side look at three leading AI accelerators: Huawei's Ascend 950 series, Nvidia's H200 (GH100 Hopper), and AMD's Instinct MI300 (Aqua Vanjaram). The comparison covers architecture, process technology, transistor counts, die size, memory type and capacity, bandwidth, compute performance across FP8, FP16, FP32 and FP64, and target scenarios such as large‑scale LLM training, inference, and high‑performance computing. Availability timelines differ, with each vendor positioning its chip for data‑center and HPC workloads. Read more →