What is Agent Experience (AX)?

Developer Experience (DX) defined the last decade of API design. Documentation, SDKs, interactive consoles, getting-started guides — all optimized for a human developer sitting at a keyboard.

That developer is increasingly being replaced by an AI agent.

The Shift

When a human uses an API, they read the docs, understand the concepts, write code, test it, iterate. The process takes hours or days, but it happens once. After that, the integration is built.

When an agent uses an API, it reads the spec, makes the call, and moves on — potentially hundreds of times per day, across dozens of APIs. There’s no “setup phase.” Every interaction starts from the spec.

This changes what matters:

DX Priority               AX Priority
Readable docs             Parseable specs
Getting-started guides    Auth configuration
Interactive console       Endpoint signatures
Code examples             Parameter constraints
Conceptual explanations   Minimal token footprint

DX asks: “Can a developer understand this?” AX asks: “Can an agent use this correctly, cheaply, and fast?”

Who’s Talking About AX

We didn’t invent the term. Agent Experience is emerging across the industry:

Netlify published on “Agent Experience” as the next evolution of web platform design — how infrastructure needs to adapt for AI agent consumers alongside human developers.

Nordic APIs has covered the API-to-agent gap, focusing on how traditional API design patterns fail when the consumer is a language model rather than a human programmer.

Speakeasy frames their SDK generation tools in terms of agent readiness — making APIs consumable by AI systems, not just human developers.

Stytch has written about designing auth flows for agent consumption — a specific and critical piece of the AX puzzle.

The pattern is consistent: teams building API infrastructure are realizing their human-optimized interfaces are suboptimal for the fastest-growing consumer category.

The AX Stack

Good Agent Experience isn’t one thing. It’s a stack:

Layer 1: Spec Optimization
The API spec itself needs to be agent-efficient. Compression, structured auth declarations, minimal-but-sufficient descriptions. This is what LAP addresses — taking specs from an average of 87K tokens to under 12K.
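LAP's actual compression algorithm isn't shown in this post, but the idea can be sketched: keep the call-critical fields, drop the conceptual prose. Everything below is illustrative — the field names and the rough chars-per-token heuristic are assumptions, not LAP's format.

```python
import json

def compress_operation(op: dict) -> dict:
    """Reduce an OpenAPI-style operation to call-critical fields."""
    return {
        "method": op["method"],
        "path": op["path"],
        # The first sentence of the description is usually enough for an agent.
        "summary": op.get("description", "").split(". ")[0],
        "params": [
            {"name": p["name"], "type": p["type"], "required": p.get("required", False)}
            for p in op.get("parameters", [])
        ],
        "auth": op.get("security", []),
    }

def rough_tokens(obj) -> int:
    """Crude token estimate: roughly 1 token per 4 characters of JSON."""
    return len(json.dumps(obj)) // 4

verbose = {
    "method": "GET",
    "path": "/v1/customers/{id}",
    "description": (
        "Retrieves a Customer object. Customers represent the people and "
        "businesses you sell to, and are central to billing, invoicing, and "
        "subscription management. See the Customers guide for the full "
        "conceptual model and lifecycle details."
    ),
    "parameters": [{"name": "id", "type": "string", "required": True}],
    "security": [{"api_key": []}],
}

compact = compress_operation(verbose)
print(rough_tokens(verbose), "->", rough_tokens(compact))
```

The human-facing description survives as a single summary sentence; the parameters, types, and auth declaration survive intact.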

Layer 2: Discovery
Agents need to find the right API for the task. Registries, search indices, categorization. Our registry at registry.lap.sh indexes 1,500+ APIs with search by name, format, and category.
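The registry's actual query API isn't documented here, so this is a toy in-memory sketch of the same idea — filtering entries by name, format, and category, with made-up example entries:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RegistryEntry:
    name: str
    format: str     # e.g. "openapi" or "lap"
    category: str   # e.g. "payments" or "messaging"

REGISTRY = [
    RegistryEntry("stripe", "lap", "payments"),
    RegistryEntry("twilio", "lap", "messaging"),
    RegistryEntry("sendgrid", "openapi", "messaging"),
]

def search(query: str = "",
           format: Optional[str] = None,
           category: Optional[str] = None) -> list:
    """Substring match on name; exact match on format and category."""
    return [
        e for e in REGISTRY
        if query in e.name
        and (format is None or e.format == format)
        and (category is None or e.category == category)
    ]

print([e.name for e in search(category="messaging")])  # ['twilio', 'sendgrid']
```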

Layer 3: Packaging
A raw spec isn’t enough. Agent skills package the spec with auth config, usage instructions, and installation metadata. The skill format bridges the gap between “here’s an API” and “an agent can use this right now.”
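As a hypothetical sketch of what a skill package might bundle — the actual skill format isn't specified in this post, so the field names here are illustrative:

```python
REQUIRED_FIELDS = {"spec", "auth", "instructions", "metadata"}

def make_skill(spec: dict, auth: dict, instructions: str,
               name: str, version: str) -> dict:
    """Bundle a compressed spec with everything an agent needs to use it."""
    return {
        "spec": spec,                  # the compressed API spec
        "auth": auth,                  # where the key lives, which header carries it
        "instructions": instructions,  # short usage note for the agent
        "metadata": {"name": name, "version": version},
    }

def validate_skill(skill: dict) -> bool:
    """A skill is installable only if all required fields are present."""
    return REQUIRED_FIELDS <= skill.keys()

skill = make_skill(
    spec={"paths": {"/v1/send": {"post": {}}}},
    auth={"type": "apiKey", "header": "Authorization", "env": "SERVICE_API_KEY"},
    instructions="Call POST /v1/send with a JSON body containing 'to' and 'text'.",
    name="example-messaging",
    version="0.1.0",
)
print(validate_skill(skill))  # True
```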

Layer 4: Runtime
How the agent actually interacts with the API — request construction, auth header injection, error handling, retry logic. This layer is mostly handled by agent frameworks today (LangChain, Claude tool use, etc.) but will increasingly be standardized.
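A minimal sketch of the runtime layer: auth header injection plus retry with exponential backoff. The transport is faked so the example runs offline; a real framework would put an HTTP client behind the same interface.

```python
import time

def call_with_retry(send, request: dict, api_key: str, max_attempts: int = 3) -> dict:
    """Inject the auth header, then retry transient 5xx failures with backoff."""
    request = {**request, "headers": {**request.get("headers", {}),
                                      "Authorization": f"Bearer {api_key}"}}
    delay = 0.01
    for attempt in range(1, max_attempts + 1):
        response = send(request)
        # Retry only transient server errors; surface everything else.
        if response["status"] < 500 or attempt == max_attempts:
            return response
        time.sleep(delay)
        delay *= 2  # exponential backoff

# Fake transport: fail twice with 503, then succeed.
attempts = []
def flaky_send(req):
    attempts.append(req)
    return {"status": 503} if len(attempts) < 3 else {"status": 200, "body": "ok"}

result = call_with_retry(flaky_send, {"method": "GET", "path": "/v1/ping"}, "secret-key")
print(result["status"], len(attempts))  # 200 3
```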

Why DX ≠ AX

It’s tempting to think good DX automatically means good AX. It doesn’t. Some of the best-documented APIs in the world (Stripe, Twilio, Salesforce) are among the most expensive for agents to use — precisely because their specs are optimized for human comprehension.

A beautifully written endpoint description that helps a developer understand the conceptual model behind a resource is pure waste for an agent. The agent needs: endpoint, method, parameters, types, auth. Everything else is cost.

This isn’t an argument against good documentation. Humans still read docs. But the spec that serves humans and the spec that serves agents should be different artifacts — or at least, the agent should get a compressed view.

The Numbers

Our benchmarks quantify the AX gap:

  • Full specs (DX-optimized): $0.37/run, 48K tokens, 0.824 success rate
  • LAP specs (AX-optimized): $0.24/run, 23K tokens, 0.851 success rate

Better AX is cheaper and more effective. The DX-optimized spec actively hurts agent performance by diluting the context with irrelevant information.
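A back-of-envelope calculation using the benchmark numbers above shows how the gap compounds at agent scale. The 100-calls-per-day figure is an illustrative assumption, not a benchmark result:

```python
# Per-run figures from the benchmarks above.
full = {"cost": 0.37, "tokens": 48_000}  # DX-optimized spec
lap  = {"cost": 0.24, "tokens": 23_000}  # AX-optimized spec

calls_per_day = 100  # illustrative assumption
daily_savings = (full["cost"] - lap["cost"]) * calls_per_day
token_reduction = 1 - lap["tokens"] / full["tokens"]

print(f"${daily_savings:.2f}/day saved, {token_reduction:.0%} fewer tokens")
# $13.00/day saved, 52% fewer tokens
```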

What API Providers Should Do

If you maintain a public API, consider your agent consumers:

  1. Publish a LAP version of your spec. Two commands: pip install lapsh && lapsh compile your-spec.yaml. Host it alongside your full spec.

  2. Audit your descriptions. Are they for understanding or for calling? Agent-facing descriptions should be one line: what the endpoint does, not how to think about it.

  3. Declare auth explicitly. Many specs bury auth requirements in descriptions or external docs. Put them in securitySchemes where tools can find them.

  4. Minimize response schemas. Agents rarely need the full response structure upfront. Top-level fields are usually sufficient.

  5. Track agent usage. If you have API analytics, segment by user-agent. You might be surprised how much of your traffic is already agent-driven.
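Item 3 in particular is easy to check mechanically. A small audit sketch, assuming an OpenAPI 3.x spec loaded as a dict: auth counts as "declared" only if it lives in components.securitySchemes and is referenced by a security requirement, not buried in prose.

```python
def auth_declared(spec: dict) -> bool:
    """True if the spec declares auth where tools can find it."""
    schemes = spec.get("components", {}).get("securitySchemes", {})
    return bool(schemes) and bool(spec.get("security"))

good = {
    "components": {"securitySchemes": {"api_key": {"type": "apiKey",
                                                   "name": "X-API-Key",
                                                   "in": "header"}}},
    "security": [{"api_key": []}],
}
# Auth described only in prose — invisible to an agent parsing the spec.
bad = {"info": {"description": "Pass your API key in the X-API-Key header."}}

print(auth_declared(good), auth_declared(bad))  # True False
```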

The Decade Ahead

DX isn’t going away. Humans will keep building integrations. But the balance is shifting. As agent-powered products move from demos to production, the APIs that win will be the ones that work well for both audiences.

Agent Experience is the new competitive advantage. The APIs that are cheapest and easiest for agents to use will get the most agent traffic — and agent traffic is growing exponentially.


Start optimizing for agents:

Visit lap.sh to learn how LAP makes your APIs agent-ready. Or jump straight in: pip install lapsh.