Amazon Web Services (AWS) and Amazon-backed AI developer Hugging Face have teamed up to offer developers a custom AI chip designed to deliver cost-effective performance for AI inference workloads.

Inferentia2 will be available to Hugging Face users through Hugging Face's developer hub, which is widely used to access open-source models such as Meta's Llama 3.

The chip will allow developers to run customised open-source AI models to power their own software.

“… When you find a model in Hugging Face you are interested in, you can deploy it in just a few clicks on Inferentia2,” read a blog post by Hugging Face. 

Users will be billed automatically according to the amount of compute time used.
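The blog post describes a click-through deployment flow on the hub; for developers who prefer code, a broadly similar result can be achieved with Hugging Face's optimum-neuron library, which compiles Hub models for Inferentia2. The sketch below is illustrative only and is not drawn from the announcement: the model ID, compilation settings and generation parameters are assumptions, and it presumes an AWS inf2 instance with the Neuron SDK installed.

```python
# Illustrative sketch only (not from the announcement): loading an open-source
# model from the Hugging Face Hub and compiling it for Inferentia2 with the
# optimum-neuron library. Assumes an AWS inf2 instance with the Neuron SDK and
# `pip install optimum[neuronx]`; the model ID and parameters are placeholders.
from transformers import AutoTokenizer
from optimum.neuron import NeuronModelForCausalLM

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # hypothetical choice of Hub model

# export=True triggers ahead-of-time compilation for the Neuron cores;
# batch size, sequence length and core count are example values.
model = NeuronModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    batch_size=1,
    sequence_length=2048,
    num_cores=2,
    auto_cast_type="fp16",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Run inference on the compiled model.
inputs = tokenizer("What is AWS Inferentia2?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```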

In a 2024 tech sentiment survey by research and analysis company GlobalData, AI was ranked the most disruptive technology by businesses. 

More than 40% of businesses participating in the survey stated that AI had already begun to disrupt their industry, and a further 13% believed that AI would begin to disrupt their business in the next 12 months.

GlobalData forecasts that the total AI market will be worth more than $1.04trn by 2030, representing a compound annual growth rate of 39% from 2023.
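As a rough sanity check (not stated in the article), a 39% compound annual growth rate reaching roughly $1.04trn in 2030 implies a 2023 base of around $100bn; the short calculation below works backwards from the quoted forecast figures.

```python
# Back-of-the-envelope check of the forecast figures quoted above (illustrative
# only): a 39% CAGR over 2023-2030 (seven compounding years) reaching ~$1.04trn
# implies a 2023 market size of roughly $100bn.
value_2030 = 1.04e12   # forecast 2030 AI market value, USD
cagr = 0.39            # compound annual growth rate
years = 2030 - 2023    # seven years of compounding

implied_2023 = value_2030 / (1 + cagr) ** years
print(f"Implied 2023 AI market size: ${implied_2023 / 1e9:.0f}bn")  # ~ $104bn
```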

AI chips are also predicted to account for 30% of all next-generation chip manufacturing by 2030, according to GlobalData forecasts.