Show, Don’t Tell: Enhancing Communication with AI Tools

“Show, don’t tell” are words I live by.

Recently, I put this into practice when prototyping a UI change for Adapt, Jeavio’s LLM-powered knowledge platform. Instead of writing requirements docs, I tried something different.

I uploaded a screenshot to Claude, described the changes I wanted, and got back a working React component – turning what typically takes days of back-and-forth into a clear demonstration in under an hour.

Over the last few months, we’ve enhanced Adapt’s capabilities to handle complex queries like:
“Summarize last week’s meeting notes and identify action items.”
The platform breaks these down into three parts (sketched in code after the list):
☑ Actions (what to do)
🗄️ Resources (what to use)
🔎 Constraints (how to filter)
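
To make that breakdown concrete, here is a minimal TypeScript sketch of how a parsed query might be represented. The ParsedQuery name and its fields are my own illustration, not Adapt’s actual schema.

```typescript
// Hypothetical shape for a parsed query; names are illustrative, not Adapt's real schema.
interface ParsedQuery {
  actions: string[];     // what to do
  resources: string[];   // what to use
  constraints: string[]; // how to filter
}

// Example: "Summarize last week's meeting notes and identify action items."
const example: ParsedQuery = {
  actions: ["summarize", "identify action items"],
  resources: ["meeting notes"],
  constraints: ["time range: last week"],
};

console.log(example);
```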

While this approach has made Adapt very powerful, it also makes it harder for users to trace how a given prompt produced a particular set of outputs.

Inspired by GitHub Copilot’s inline explanations, I wanted Adapt to provide similar transparency about its reasoning. Using Claude’s Artifacts feature, I quickly created and shared a high-fidelity prototype with my team, showing how this could work.
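
For a sense of what the prototype demonstrated, here is a rough TSX sketch of a reasoning panel that surfaces the breakdown alongside an answer. This is not the component Claude generated; the ReasoningPanel name and its props are assumptions made for illustration.

```tsx
import React from "react";

// Illustrative sketch only: a panel that shows how a prompt was interpreted.
// Not the actual Adapt prototype code; names and props are hypothetical.
interface ReasoningPanelProps {
  prompt: string;
  actions: string[];
  resources: string[];
  constraints: string[];
}

export function ReasoningPanel({
  prompt,
  actions,
  resources,
  constraints,
}: ReasoningPanelProps) {
  return (
    <aside>
      <h3>How this answer was produced</h3>
      <p>Prompt: “{prompt}”</p>
      <ul>
        <li>Actions: {actions.join(", ")}</li>
        <li>Resources: {resources.join(", ")}</li>
        <li>Constraints: {constraints.join(", ")}</li>
      </ul>
    </aside>
  );
}
```

Rendering the three parts next to the answer is the whole point: a user can see at a glance what Adapt decided to do, which sources it drew on, and how it filtered them.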