The Shift to Agent-Driven API Consumption | Speakeasy’s Sagar Batchu
AI agents are knocking. Is your API ready to answer?
Think building killer APIs like Stripe or Twilio is purely a technical feat? Think again.
This week, host Andrew Zigler dives into why crafting successful APIs is fundamentally an organizational challenge, demanding internal excellence long before code is deployed. Sagar Batchu, co-founder and CEO of Speakeasy, joins us to unpack the crucial role of ownership and design principles, and why even internal APIs benefit from being treated as first-class products. Discover how Conway’s Law dictates your API’s fate and why achieving that coveted developer experience starts with your company’s structure.
Getting these foundations right is more critical than ever, as the landscape now faces a dramatic shift with the rise of LLMs and agentic AI. That shift stands to multiply API usage and introduces the concept of “Agent Experience” (AX), demanding even greater rigor around clear standards, documentation, and metrics like “Time to First 200,” the time it takes a new consumer to make its first successful API call. Sagar also explores the emerging Model Context Protocol (MCP) and explains why focusing on developer productivity today prepares you for the AI-driven future of tomorrow.
"As these new interfaces come online, [there's] even more of a demand to actually do the traditional things well. There's just more API consumers now and there's more API calls and there's gonna be more usage.” —Sagar Batchu
The Download
The Download is your code to success, one line at a time. 🖥️
1. Politeness is pricey when chatting with AI 💰
Turns out, saying “thank you” to ChatGPT could cost tens of millions of dollars in electricity, according to OpenAI CEO Sam Altman. While he suggests this might be “money well spent,” the hidden costs of AI politeness raise eyebrows. On one hand, being courteous could help the AI learn about human behavior; on the other, it’s a reminder that even basic pleasantries come with an environmental price tag. So, do we keep being polite, or is it time to get straight to the point?
Read: It costs nothing to be polite unless you're talking to ChatGPT - then it costs tens of millions
2. Google’s A2A protocol: AI agents finally get to chat 🗣️
Google has just launched the Agent2Agent (A2A) protocol, aimed at enabling seamless communication and collaboration between AI agents across diverse platforms. With contributions from more than 50 technology partners, A2A allows agents to securely exchange information and coordinate actions without being tied to a specific vendor or framework. As we move towards a multi-agent ecosystem, A2A could redefine how AI integrates into everyday workflows; a minimal discovery sketch follows the link below.
Read: Announcing the Agent2Agent Protocol (A2A)
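For a rough sense of how A2A discovery works, here is a minimal Python sketch that fetches a remote agent’s self-describing “Agent Card” from its well-known URL. The host is hypothetical, and the path and field names reflect our reading of the published spec, so treat this as an illustration rather than a reference implementation.

```python
# Sketch of A2A-style discovery: read a remote agent's Agent Card.
# The host is hypothetical; the /.well-known/agent.json path and field
# names follow our reading of the A2A announcement, not a tested client.
import json
import urllib.request

AGENT_HOST = "https://agent.example.com"  # hypothetical remote agent

def fetch_agent_card(host: str) -> dict:
    """Download and parse the agent's self-describing Agent Card."""
    with urllib.request.urlopen(f"{host}/.well-known/agent.json") as resp:
        return json.load(resp)

if __name__ == "__main__":
    card = fetch_agent_card(AGENT_HOST)
    print(card.get("name"), "-", card.get("description"))
    for skill in card.get("skills", []):
        print("  can do:", skill.get("name"))
```

Once an agent has read another agent’s card, it knows what that agent can do and where to send task requests, which is the vendor-neutral interoperability the protocol is after.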
See what AI-powered dev productivity actually looks like ✨ (sponsored)
The future of developer productivity is here—and it’s already shipping. Join LinearB this week for a 30-minute live demo showcasing our newest AI features in action:
🔍 AI Code Reviews that catch issues early
📝 Automated PR Descriptions that save devs from writing them manually
📊 Smart Iteration Summaries that turn retros into instant insights
You’ll get real examples, workflow tips, and live Q&A with the experts building this tech. Want to see how AI is changing the game for high-performing teams?
3. Unlocking the future of AI with the Model Context Protocol (MCP) 🔑
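To make the protocol concrete, here is a minimal sketch of an MCP server exposing a single tool, written against the official Python SDK’s FastMCP helper. The server name and the get_order_status tool are hypothetical stand-ins for whatever API you already run, and the FastMCP API shown reflects the SDK’s early releases, so it may evolve.

```python
# Minimal MCP server sketch using the official Python SDK ("mcp" package).
# The server name and tool are hypothetical; a real server would wrap your API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("order-lookup")  # hypothetical server name

@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Return the current status of an order, given its ID."""
    # Placeholder response; in practice this would call your existing API.
    return f"Order {order_id}: shipped"

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio so an MCP-capable agent can call it
```

An MCP-capable client, such as Claude Desktop or an agent framework, can then list the server’s tools and call get_order_status without any bespoke glue code, which is exactly the kind of standardization the episode digs into.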
The Model Context Protocol (MCP) is stepping in to unify the chaotic landscape of AI tool interactions. Introduced in late 2024, MCP provides a standardized way for AI agents to communicate with external tools and APIs, moving beyond the fragmented approach developers currently face. Think of it as the universal translator for AI workflows—allowing agents to autonomously select and chain tools, while also incorporating human oversight when needed. As we explore the potential of MCP, the question looms: will it become the go-to standard for AI integration, or will it face hurdles that keep it from mainstream adoption?