AI in 10
The most important AI story—explained in 10 minutes.
Every day, I break down the biggest AI story in just 10 minutes - what it is, why it matters, and how you can actually use it. No tech jargon, just AI made simple.
OpenAI Breaks Free: Custom Chip Threatens Nvidia's AI Monopoly
Referenced Links:
OpenAI Official Website
OpenAI News & Updates
Claude AI Assistant
Perplexity AI Search
Ollama Local AI Tools
Want to go deeper with AI? A community of professionals is learning AI together right now at aihammock.com — show notes, links, tools, and real conversations about how to actually use AI in your life.
Welcome to AI in 10. I'm Chuck Getchell, and every day I break down the biggest AI story in just 10 minutes. What it is, why it matters, and how you can actually use it.
SPEAKER_01: So OpenAI just dropped a bombshell that could change everything about how we access AI, and honestly, it's about time. They're developing proprietary silicon in partnership with Broadcom, and this isn't just another tech announcement. This is OpenAI declaring independence from the entire AI supply chain, which is like your favorite restaurant deciding to grow its own food instead of getting price gouged by every supplier in town.

Here's what happened. For years, OpenAI has been renting massive computing power from companies like NVIDIA to run ChatGPT and all their other AI models. Think of NVIDIA as the landlord, and OpenAI has been paying rent that keeps going up and up. Well, they just decided to buy their own house. The new chip is custom built specifically for running AI models, not one of the jack-of-all-trades chips that NVIDIA makes. This is more like a surgeon's scalpel compared to a Swiss Army knife. It does one thing, but it does it with absolute maximum efficiency.

But here's where it gets really interesting. This custom chip development is just one piece of their broader AI infrastructure strategy. They're working on custom silicon while also maintaining partnerships with companies like AMD through major multi-year deals for MI450 chips. They're also planning massive chip factories as part of ambitious projects that could involve partners like G42 and SoftBank. And they're exploring nuclear-powered approaches to running AI infrastructure. Yes, you heard that right. Nuclear-powered AI. Because apparently regular electricity bills weren't expensive enough.

Let me break down why this matters to you right now. First, the cost issue. When OpenAI controls their own hardware, they don't have to pay NVIDIA's premium prices anymore. Those savings could flow directly to us. Imagine if ChatGPT Plus dropped from $20 a month to $10. Or if the free version got significantly more powerful. That's the kind of change we're talking about.
Right now, OpenAI is spending enormous amounts of money just to keep the lights on. Every time you ask ChatGPT a question, they're paying for that processing power. It's like running a taxi company but having to rent every single car from the same expensive dealer. Eventually you buy your own fleet.

The timing here is fascinating too. This announcement comes as we see OpenAI moving at breakneck speed with their development efforts, and it comes right after we covered those viral AI agent tools that got everyone excited about AI doing actual work for us. Speaking of which, if you're just getting started with all this, my AI Explained course walks you through everything in about 30 bite-sized videos, because understanding this stuff is becoming less optional every day.

But let's talk about what this really means for your daily life. Faster, cheaper AI processing means AI assistants become way more practical. Right now, if you want AI to help you plan your entire week, organize your photos, and write your emails, it can feel slow and expensive. With custom chips designed specifically for these tasks, we're talking about AI that responds instantly and costs pennies instead of dollars to run.

This also means AI gets embedded into more everyday tools. Your photo editing app, your work software, even your car's navigation system all become smarter when the underlying AI processing gets cheaper and faster. It's like when smartphones got fast enough that every app could suddenly do things that seemed impossible just a few years earlier.

Now there's a job market angle here that we need to address honestly. When AI gets this much more capable and accessible, it's going to change how work gets done. Not eliminate jobs entirely, but definitely change them. The person who learns to work alongside these AI tools is going to have a massive advantage over someone who ignores them.
Think about it this way: right now, if you're a marketing manager, you might spend half your day on routine tasks like scheduling social media posts, analyzing basic data, or writing routine emails. With faster, cheaper AI, those tasks become automated. That doesn't mean you lose your job. It means you get to focus on strategy, creativity, and building relationships. The human stuff that actually matters.

But here's what you can actually do with this information today. First, start paying attention to OpenAI's updates. They're moving fast, and when their custom silicon starts powering their systems, you'll likely see new features and capabilities rolling out quickly. Sign up for their newsletter at openai.com or follow them on social media.

Second, start experimenting with AI agents right now. Don't wait for the next generation of chips to get comfortable with AI doing tasks for you. Try tools like Claude or Perplexity to automate some of your daily routine. Ask Claude to help you plan your week. Use Perplexity to research a project you're working on. Get comfortable with the idea of AI as your digital assistant, because it's about to get a lot more powerful.

Third, diversify your AI toolkit. Don't put all your eggs in one basket. While OpenAI is building their sovereign system, other companies are doing similar things. Try different AI tools for different tasks. Some excel at writing, others at analysis, others at creative work. Figure out what works best for your specific needs.

And here's something practical you can do this week. Pick one routine task that takes you 30 minutes every day. Maybe it's organizing your inbox, planning meals, or updating a spreadsheet. Spend an hour this weekend figuring out how AI can help you with that task. Even with today's technology, you'll probably cut that 30 minutes down to five. When custom silicon and similar chips hit the market, that five minutes might become 30 seconds.

The bigger picture here is fascinating.
We're watching the AI industry mature in real time. For years it's been like the Wild West: everyone scrambling to rent the same expensive equipment and hoping the prices don't kill them. Now the major players are building their own infrastructure. That's what mature industries do. It's like watching the internet grow up all over again.

This sovereignty push isn't just about OpenAI, either. China's DeepSeek just released a model that runs on Chinese-made chips, specifically to avoid depending on NVIDIA. Google has their TPU chips, Amazon has their Trainium processors, and everyone's building their own hardware because everyone realizes that controlling your own technology stack is the key to long-term success.

For us as users, this competition is fantastic news. When companies aren't all fighting over the same limited, expensive resources, they can focus on making their products better and cheaper. It's basic economics, and it works in our favor.

The nuclear power angle is worth thinking about too. AI processing requires enormous amounts of electricity. By building dedicated nuclear-powered data centers, companies like Microsoft and OpenAI are ensuring they have reliable, clean power for their AI systems. That means more consistent service for us, and it's actually better for the environment than running these systems on coal or natural gas.

Now, as I always say, I'm not a financial advisor, but I can tell you that understanding these shifts is becoming crucial for anyone who wants to stay competitive in their career. The companies and individuals who adapt early to these new AI capabilities are going to have significant advantages.

The key insight here is that AI is moving from being a nice-to-have tool to being core infrastructure. Just like you expect your smartphone to work reliably, we're heading toward a world where you'll expect AI assistance to be fast, cheap, and always available.
Custom silicon developments and similar advances are the building blocks that make that possible. What we're really witnessing is the foundation being laid for an AI-everywhere world, where every app, every website, every device has intelligence built in, where asking for help with complex tasks feels as natural as sending a text message, and where the cost of that intelligence drops so low that it becomes essentially free to use.

The companies building their own chips, their own factories, and their own power sources are positioning themselves to be the infrastructure providers for that world. And the people who start using these tools effectively today are positioning themselves to thrive in that world.

This is why I keep saying that AI represents the greatest opportunity of our lifetime. Not because it's going to solve every problem, but because it's going to amplify human capability in ways we're just beginning to understand. And developments like custom AI silicon are what make that amplification accessible to regular people like you and me.

The takeaway here is simple. The AI revolution isn't coming, it's here. And the companies smart enough to control their own destiny by building their own infrastructure are going to be the ones that shape what that revolution looks like for the rest of us.
SPEAKER_00: That's today's AI in 10. If you want to go deeper and learn AI with a community of people just like you, join us at aihammock.com. I'll see you tomorrow, my friends.