The US will permit Nvidia to export its H200 AI processors to approved customers in China, according to an announcement by President Donald Trump.
Trump noted that a 25% fee would be collected on these sales.
The announcement was made on Truth Social, where Trump also said that he had informed Chinese President Xi Jinping about the decision and received a positive response.
Trump indicated the US Commerce Department is finalising the details of the arrangement.
The same policy will apply to other US AI chip manufacturers, including Advanced Micro Devices (AMD) and Intel.
The White House confirmed that the fee would be 25%, higher than a previously discussed rate of 15%, reported Reuters.
Trump did not specify how many chips would be exported or detail the conditions, noting only that shipments would take place “under conditions that allow for continued strong National Security.”
He added: “We will protect National Security, create American Jobs, and keep America’s lead in AI,” and clarified that Nvidia’s Blackwell and Rubin chips would not be part of the deal.
It remains uncertain whether this decision will lead to new sales in China, as Beijing has advised local companies not to use US technology, according to Reuters.
A statement from Nvidia said: “Offering H200 to approved commercial customers, vetted by the Department of Commerce, strikes a thoughtful balance that is great for America.”
Meanwhile, a recently released report by the non-partisan think tank Institute for Progress (IFP) found that the H200 is nearly six times more powerful than the H20.
The H20 is the most advanced AI chip that can legally be sold to China, following the Trump administration’s reversal of its brief ban on those exports earlier this year.
According to IFP, the Blackwell chip currently used by US AI companies is roughly 1.5 times faster than the H200 for training AI models and five times faster for inference, the stage at which trained models are deployed in real-world applications.
Nvidia’s own data indicates that, for some tasks, Blackwell can be up to ten times faster than the H200.
