January 1, 2025

The 2024 AI advancements that changed everything


2024 was an absolute rollercoaster for AI — and if you blinked, you probably missed half of it. Even those of us who eat, sleep, and breathe AI had trouble keeping up with the pace of announcements.

But the biggest AI story in 2024 isn’t product releases. It’s how we moved from incremental improvements to leaps in capabilities.

Four breakthrough moments changed everything:

  • Video AI evolved from a party trick into a professional tool
  • Voice AI finally started feeling like talking to a human
  • Visual AI learned to see the world through our eyes
  • And most importantly, AI learned to actually think

Here's your no-BS guide to how AI transformed in 2024.

Video: Natural and human-looking, with cinematography controls

If you saw AI-generated videos in early 2024, you probably weren't impressed. Remember that viral video of Will Smith eating spaghetti? Choppy animations, weird artifacts, more tech demo than useful tool. But if you gave up on AI videos in January, you’ve missed a lot.

The video AI landscape was transformed by two major releases from the biggest labs, OpenAI and Google:

  1. Sora (OpenAI): December's general release delivered something unprecedented – AI-generated video with natural motion and coherent narratives that looks far more like filmed, human footage than earlier versions.

  2. Veo 2 (Google): While Sora pushed realism forward, Google focused on control. Veo 2 brought professional-grade cinematography tools and physics simulation that made AI-generated video practical for commercial use.

These releases solved three critical problems that were holding back AI video adoption:

  • Reliability: Generated videos now consistently match the prompt (no more random glitches or bizarre interpretations)
  • Control: Precise tools for adjusting composition, lighting, and camera movement
  • Cost-effectiveness: Generation times dropped from hours to minutes, cutting compute costs and making regular use practical

What this means for you:

  • If you're a content creator: Start experimenting with AI-generated video now. The learning curve is getting shorter, and the capabilities are expanding weekly.
  • If you're in marketing: These tools can dramatically reduce your content creation costs – but you'll need to develop new workflows.
  • If you're a video production company: Time to adapt. Consider how AI can augment your existing services rather than viewing it as competition.
  • If you're in corporate training: The cost-benefit ratio for video content just shifted dramatically in your favor.

Voice: Natural conversation, available to anyone

In early 2024, talking to AI meant awkward pauses and robotic responses. By year's end, voice AI had transformed from a clunky interface into something that actually feels like talking to a person.

OpenAI's Advanced Voice Mode (AVM) preview in May introduced the ability to respond in real time (no more awkward waiting), handle mid-conversation interruptions, and speak with natural patterns and emotional awareness.

Within months, Google had announced similar functionality called Gemini Live, and Apple had revamped Siri – and even integrated ChatGPT.

But the most revolutionary development came in mid-December: 1-800-CHATGPT.

Pick up any phone, dial 1-800-CHATGPT, talk to AI. No apps. No accounts. No tech setup. Just a phone call. This lowers the barrier to AI access even further:

  • Fewer technical requirements for AI access
  • More accessible to older adults and tech-hesitant users
  • Works with any phone, anywhere
  • Brings AI to areas with limited internet access

What this means for you:

  • If you're a business leader: Voice AI is becoming essential for customer-facing businesses, particularly in customer service, hospitality, and any business with 24/7 high call volumes
  • If you're a developer or product manager: Focus on conversational interfaces and make them first-class experiences, not just an afterthought

Vision: When AI learned to see

This year, AI learned to see through our eyes, rather than relying on uploaded screenshots.

Two launches in May changed this:

  • Google's Project Astra: AI could now look “through” your mobile phone camera or VR glasses and interpret those images
  • OpenAI's camera integration: The ChatGPT app can see through your camera or watch your screen while talking to you – very similar to Project Astra

The key advancement: Instead of uploading images, AI can now see through your device's camera in real-time. Point your phone at anything, and AI understands what it's looking at instantly.

What this means for you:

  • If you're a developer or product manager: Start integrating camera-based AI features for tasks like visual search, object recognition, or real-time translation in your products
  • If you're a business owner: Use visual AI to simplify customer support (e.g., let customers point their camera at a product issue instead of describing it)
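To make the "point your camera at it" pattern concrete, here's a minimal sketch of how a captured frame might be packaged for a multimodal chat API. The message shape follows OpenAI's chat-completions image-input format, but the model name and the support question are illustrative assumptions – adapt both to whichever API you actually use:

```python
import base64

def build_vision_request(image_bytes: bytes, question: str, model: str = "gpt-4o"):
    """Package a camera frame and a question into a chat-style payload.

    The content structure mirrors OpenAI's chat-completions image input
    (a text part plus a base64 data-URL image part); no network call is
    made here -- this only builds the request body.
    """
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": question},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{encoded}"},
                    },
                ],
            }
        ],
    }

# Example: a placeholder frame plus a customer-support question
payload = build_vision_request(
    b"\xff\xd8placeholder-jpeg-bytes",
    "What product is this, and does it look damaged?",
)
print(payload["messages"][0]["content"][0]["text"])
```

From here, the payload would be sent to the provider's chat endpoint with your API key; the same structure works for a visual-search or object-recognition prompt by swapping the question text.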

Reasoning: When AI learned to think

The most transformative moment in AI happened in September: AI started to think like humans do.

OpenAI’s o1 doesn't just pattern-match or remix existing knowledge – it reasons through problems step by step, like a human expert would.

The difference is profound:

  • Old AI: "Based on historical data, strategy A typically works best"
  • New AI: "Let's analyze this situation's unique factors, consider potential approaches, and think through second-order effects"

This is a fundamental shift in what AI can do (we wrote all about it when it launched). And at the end of last week, OpenAI announced its next model – o3. It’s expected to become broadly available in January and approach AGI-level reasoning. We’ll do a full review once we have a chance to play with it, but it’s safe to say that AI is only going to get sharper and more nuanced in 2025.

What this means for you:

  • If you're a business leader: These tools can now help with genuine strategic thinking, not just data analysis
  • If you're in consulting: AI will accelerate your analysis and research (don’t worry, your judgment and client relationships are still safe for now)
  • If you're a knowledge worker: Stop using AI as a fancy search engine and start learning how to prompt for deep problem-solving

What this all means for 2025

Let's zoom out for a moment. Any one of these breakthroughs would have made 2024 a landmark year for AI. Together, they represent something bigger: the moment AI transformed from a fascinating technology into an indispensable tool. It’s not all roses and rainbows. These advances aren’t perfect, but they’re getting damn good.

How to shape your 2025 strategy around the new AI:

  • For businesses: The question is no longer if you'll use AI, but how quickly you can adapt to competitors who are already using it effectively.
  • For professionals: AI isn't coming for your job – it's available to augment parts of everyone's job. The winners will be those who learn to leverage these tools strategically.
  • For developers: The playing field has shifted from building basic AI capabilities to creating specialized applications of these new core technologies.
  • For society: We need to move past debating whether AI is going to make a “real” impact and start having serious conversations about how to use it.

If you've been waiting for the right moment to take AI seriously, this is it. Join us in 2025.

Greg Shove
Chase Ballard