AWS sits at the heart of the generative AI boom, powering everything from LLM training runs to global-scale inference. In this sweeping conversation, AWS CEO Matt Garman discusses the future of work, engineering, open vs. closed models, and why agentic workflows—not just raw models—will be where the next wave of value is created.
Garman is bullish on the economic upside of AI, skeptical of doomer narratives, and refreshingly candid about infrastructure bottlenecks, engineering culture, and how Amazon uses its own silicon to support customer choice.
01:00 – White Collar Bloodbath or Utopia?
Garman’s optimistic view on AI, jobs, and productivity.
04:15 – Hiring in the Age of AI
Why more productivity doesn’t mean fewer people.
08:52 – 80% of AWS Developers Use AI
How AI is changing developer workflows at Amazon.
12:55 – Should You Still Study Engineering?
Advice for the next generation in a fast-changing tech landscape.
15:46 – Infrastructure Bottlenecks
From silicon to power, what’s actually constraining AI growth?
18:12 – Where AI Usage Is Growing Most
Training matters, but inference drives demand.
20:05 – AWS Silicon Strategy
Graviton, Trainium, Inferentia, and why Annapurna Labs matters.
27:57 – Serving the Model Ecosystem
Bedrock, specialization, and what AWS looks for in new models.
33:39 – Open vs. Closed Source
How AWS views the trade-offs and partnerships.
36:24 – Will AWS Build a Frontier Model?
On Nova, customer choice, and competition with partners.
41:33 – Benchmarks Are Breaking
Why standardized evaluations may not matter much longer.
47:13 – The Future of Agents
The biggest opportunity in AI—and how AWS is enabling it.