Chips with everything
It's all about the numbers
Time for a catch-up!
Whilst a lot of the focus this week may have been on the impact that talk of tariffs has had on the stock prices of the “Magnificent Seven”, the AI sphere continued to announce new models and features, particularly in relation to processing chips.
Computer chips even became a key talking point in discussions about the trade war between the USA and China, and how that might play out in both countries in terms of both capacity and capabilities. Interestingly, it was revealed that Jensen Huang, the CEO of Nvidia, had secured an agreement from President Trump to allow the continued export of H20 chips to China.
Big Sharks vs. small fish
It’s only fair that we should start by congratulating Microsoft on reaching the grand old age of 50 this week. They celebrated in style with a slew of announcements about upgraded capabilities in the Copilot suite, all of which are worth a mention. First there was “Memory”, which, as you can guess, enables it to remember things like past interactions and preferences – this matters if it’s going to act as your “personal assistant”. Next was “Vision”, which allows it to see and interact with the apps you’re using on your device, both providing help and making use of them. That ties into the last tool, “Actions”, which draws the other parts together to let Copilot act in true agentic fashion, carrying out whatever steps are required to achieve a goal you set.
Not to be outdone, both Amazon and Meta released news of new or upgraded LLMs / AI models. Amazon’s new generation of foundation models are all multi-modal, handling voice, text and vision. Meta, meanwhile, announced new iterations of Llama, with the Scout version offering a context window of 10 million tokens – both incredible and now affordable.
The issue of costs also featured this week, as it was revealed that the cost of inference tokens (the ones used when you ask something) has fallen from about $20 USD per million to about 7 US cents per million. To put that into context, it’s a bit like going out to buy a 10-year-old, second-hand Vauxhall Corsa and discovering that you can afford a brand-new Ferrari 296 GTB.
OpenAI was also in the news regarding costs, as it revealed that it currently has 20 million paid subscribers generating income of $415 million USD per month, with revenue having grown 30% in the space of 3 months. As a result, it predicts income of $12.7 billion USD in 2025. So it came as no surprise when rumours emerged of a possible collaboration worth $500 million USD with Jony Ive’s io Products company. Ive was the designer who worked with Steve Jobs to create the iPhone and other iconic Apple products. If the rumours are true, he’s working on a new AI-enabled phone … but one without a screen.
But as the title indicates, the main news seemed to be all about processing chips. Amazon, Google and Nvidia all announced new chips, available either now or in the coming year. TSMC (Taiwan Semiconductor Manufacturing Company) also announced a 42% increase in revenue over the last 3 months, which highlights why tariffs will inevitably affect AI development, as decisions are made about where production takes place relative to likely marketplaces.
Legislation, policy and other news
Probably the most attention-grabbing news this week was the UK government’s announcement of a plan to use AI to help law enforcement predict possible murderers! Yes, that does sound exactly like the film Minority Report.
Synthesia also announced that it is teaming up with Shutterstock, using their video library to train its software. What does that have to do with policy? Well, interestingly, Synthesia is compensating the people who appear in the existing videos with stock in the company. Given the controversy caused by Meta using pirated copies of books, this can only be seen as a positive step forward in terms of ethics.
Stay informed. Stay critical. And wherever possible—stay ahead.
Regards
Tom Carter