
From Actions to Intent

2025

Most traditional apps are built around user actions. You tap a button, something predictable happens. It’s a one-to-one relationship. If you want to log a food item, the app might offer one button to scan a barcode, another to enter a barcode number by hand, and another to upload a photo. Each feature is siloed and mapped directly to a specific action.

This action-based model made sense when computers needed exact instructions. But with the rise of AI-native apps, that model is rapidly becoming outdated.

In AI-native experiences, we move from action-based to intent-based design.

Instead of requiring the user to choose the exact path, the system understands the user’s intent — “I want to log this food” — and figures out how to get it done. Whether you describe it in text, snap a photo, or say it out loud, the system interprets your intent and chooses the right tools behind the scenes.
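A sketch of what that routing might look like, assuming a hypothetical food-logging app (all names here are illustrative): the user expresses an intent in any form, and a single entry point decides which tool handles it. The classifier below is a toy keyword matcher standing in for what would really be an LLM or NLU model.

```python
# Hypothetical intent-based routing for a food-logging app.
# classify_intent and TOOLS are illustrative names, not a real API.

def classify_intent(user_input: str) -> str:
    """Map free-form input to an intent, not to a specific button."""
    words = user_input.lower().split()
    if any(w in words for w in ("log", "ate", "had")):
        return "log_food"
    if any(w in words for w in ("photo", "picture")):
        return "log_food"  # a meal photo expresses the same intent
    return "unknown"

def log_food(user_input: str) -> str:
    # Behind the scenes the system would pick the right tool
    # (barcode lookup, image recognition, text parsing).
    return f"Logged: {user_input}"

TOOLS = {"log_food": log_food}

def handle(user_input: str) -> str:
    """One entry point for any modality; the intent decides the path."""
    intent = classify_intent(user_input)
    tool = TOOLS.get(intent)
    return tool(user_input) if tool else "Sorry, I didn't understand."
```

Note the contrast with the action-based model: there is no one-button-per-feature mapping, just one surface that interprets what the user wants.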

This changes everything.

AI-native apps don’t just execute predefined actions. They behave more like agents — autonomous, flexible, capable of multi-step reasoning. They’re not bound by rigid workflows. They adapt.

And yet, we still see many products bolt AI onto old action-based interfaces without rethinking the experience. It’s like putting a self-driving engine into a horse-drawn carriage.

To build truly AI-native apps, product teams need to shift their mindset:

Don’t ask “What can the user do?”
Ask “What does the user want to achieve?”

Design around intent, not interface. That’s the future.