šŸ”‘ Key AI Reads for October 22, 2025

Issue 20 • Walmart adds ChatGPT as a commerce channel, Claude Skills make a splash, UX considerations for long-running AI tasks, understanding when to use knowledge graphs, Figma's ChatGPT integration

Publication note: I am on vacation this coming week; The AI UX Dispatch will return on Wednesday, November 5.

AI Commerce

Walmart brings shopping directly into ChatGPT

Walmart is partnering with OpenAI to enable direct shopping through ChatGPT using the new Instant Checkout feature. Starting this fall, users will be able to browse and purchase Walmart and Sam's Club products, including apparel, entertainment, and packaged food, all within the chat interface. Shoppers' existing Walmart accounts will link automatically to ChatGPT, making purchases as simple as clicking a "buy" button. Fresh food is notably excluded from the initial rollout.

Ben Thompson notes that what makes this move particularly interesting is Walmart's willingness to cede some control. Walmart is the same company that still refuses to accept Apple Pay in stores because it wants to own the customer relationship and transaction data. But Walmart appears to be betting that ChatGPT represents a fundamentally different commerce channel—one where customers are discovering products they didn't know existed, rather than searching for items they already have in mind. For Walmart, as a distant second-place player in e-commerce, gaining access to a new customer base may be worth incurring a small transaction fee. The question now is: will Amazon follow suit or try to compete with its own chatbot?

Walmart on ChatGPT | Ben Thompson ($)
⚔ Quick Read (5 minutes)

Frontier Models

Claude Skills might just be better than MCP

Anthropic just launched Claude Skills, a strikingly straightforward approach to giving Claude new abilities. A "skill" is simply a folder containing instructions (usually just a Markdown file) that tells Claude how to perform specific tasks—like following brand guidelines or working with Excel files. What makes this elegant is efficiency: Claude scans available skills at the start of a session using only a few dozen tokens per skill, then loads the full details only when needed. The skills that power Claude's document creation features (PDFs, Word docs, spreadsheets) are now publicly available in Anthropic's GitHub repo, and anyone can create their own.

The real story here might be how Skills compare to MCP (Model Context Protocol). While MCP has generated enormous buzz since its November 2024 launch, it has a significant limitation: GitHub's MCP server alone consumes tens of thousands of tokens of context. In contrast, because Skills rely on Claude's coding environment and CLI tools, they're token-efficient and remarkably easy to share, often as just a single Markdown file. Even better, nothing prevents other AI models from using Skills, making them potentially more universal than MCP servers. Tech blogger Simon Willison predicts Skills will spark a "Cambrian explosion" that will make this year's MCP rush look modest by comparison.
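
To make the token-efficiency point concrete, here is a minimal Python sketch of the progressive-disclosure pattern described above: read only each skill's short name/description header at session start, and load the full instructions only when a task actually calls for them. The folder layout, frontmatter fields, and helper names below are illustrative assumptions, not Anthropic's implementation.

```python
from pathlib import Path

SKILLS_DIR = Path("skills")  # hypothetical layout: skills/<skill-name>/SKILL.md

def read_frontmatter(skill_md: Path) -> dict:
    """Read only the short header (e.g. name/description) from a SKILL.md.

    This is the cheap step: a few dozen tokens per skill, done once per session.
    """
    meta, in_header = {}, False
    for line in skill_md.read_text(encoding="utf-8").splitlines():
        if line.strip() == "---":
            if in_header:
                break  # end of header; stop before the full instructions
            in_header = True
            continue
        if in_header and ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

def scan_skills() -> dict[str, dict]:
    """Build a lightweight index of available skills at session start."""
    if not SKILLS_DIR.exists():
        return {}
    return {
        folder.name: read_frontmatter(folder / "SKILL.md")
        for folder in SKILLS_DIR.iterdir()
        if (folder / "SKILL.md").exists()
    }

def load_skill(name: str) -> str:
    """The expensive step: pull a skill's full instructions only when needed."""
    return (SKILLS_DIR / name / "SKILL.md").read_text(encoding="utf-8")

if __name__ == "__main__":
    index = scan_skills()  # cheap: headers only
    print("Available skills:", list(index))
    # Later, when a task matches a skill's description:
    # full_instructions = load_skill("brand-guidelines")
```

The shape of the pattern is the point: a tiny index up front, full detail on demand, which is the intuition behind why a shelf of skills costs far less context than a set of always-loaded MCP tool definitions.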

Designing for AI

Slow AI: Designing for tasks that take hours or days

AI agents and powerful AI tools can now run for hours or even days to complete complex workflows, bringing back the "zombie UX" of batch processing from the earliest days of computing. Jakob Nielsen explores the unique design challenges this creates: users forget what they asked for during long waits, feel anxious about whether the AI is working correctly, and may fall prey to the sunk cost fallacy, letting a long-running task continue even when they can see it is producing poor results.

His proposed solutions include:

  • Upfront clarification before long runs begin

  • "Conceptual breadcrumbs" that show the AI's reasoning at key milestones

  • Progressive disclosure of partial results so users can course-correct mid-run

  • Context-reboarding summaries that help users recall what they requested

Nielsen borrows principles from mainframe computing and factory production lines—including checkpointing for crash recovery, breaking work into small restartable units, and providing detailed progress indicators. The article provides thorough coverage of design patterns that we'll need as AI capabilities continue to advance rapidly.
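
For engineers building the plumbing behind these patterns, here is a rough Python sketch of the checkpointing idea: break the long job into small restartable units, persist progress after each one, and resume from the last completed unit after a crash. The checkpoint file, step list, and `run_step` function are illustrative assumptions, not taken from the article.

```python
import json
from pathlib import Path

CHECKPOINT = Path("job_checkpoint.json")  # hypothetical checkpoint file

def run_step(step: str) -> str:
    """Stand-in for one small, restartable unit of the long-running job."""
    return f"result of {step}"

def run_job(steps: list[str]) -> dict:
    """Run steps in order, saving a checkpoint after each completed unit."""
    # Resume from a previous run if a checkpoint exists.
    state = json.loads(CHECKPOINT.read_text()) if CHECKPOINT.exists() else {"done": {}}
    for i, step in enumerate(steps, start=1):
        if step in state["done"]:
            continue  # already completed before the restart
        state["done"][step] = run_step(step)
        CHECKPOINT.write_text(json.dumps(state))  # persist progress so a restart can resume here
        print(f"[{i}/{len(steps)}] finished: {step}")  # detailed progress indicator
    return state["done"]

if __name__ == "__main__":
    results = run_job(["gather sources", "draft outline", "write sections", "final review"])
```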

Slow AI: Designing user control for long tasks
šŸ” Long Read (29 minutes) | šŸ’”Bookmark for Reference

Knowledge Graphs

Knowledge graphs: When are they worth it?

The AI world is divided into two camps on knowledge graphs: those who think they're essential for every generative AI application, and those who think they're completely overhyped. David Bechberger, who's spent the last two years building GraphRAG systems and agentic memory solutions, says both camps are wrong. Knowledge graphs excel at multi-hop reasoning, connecting concepts across multiple sources to answer "why" and "how are these connected" questions, but traditional vector-based RAG performs better for single-hop queries that need detailed information from a single source.

The reality is that most real-world queries are heavily skewed toward simple lookups that vector retrieval handles perfectly well. Bechberger's seen companies spend months longer than planned getting graph-based systems to production, only to have end users notice no appreciable difference. His advice: before choosing any architecture, analyze a sample of your actual user queries to see if you're handling the kinds of questions that benefit from connected knowledge.
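
A first pass at that kind of query audit doesn't need to be fancy. Here is a small, purely illustrative Python sketch that tags a sample of user queries as likely single-hop or multi-hop based on simple phrasing cues; the cue list and the sample queries are assumptions, and a real audit would mean reading the queries and checking them against your corpus.

```python
# Illustrative heuristic: flag queries whose phrasing suggests multi-hop,
# relationship-style questions vs. simple single-source lookups.
MULTI_HOP_CUES = (
    "why", "how are", "related to", "connected", "compare",
    "relationship between", "across", "impact of",
)

def classify(query: str) -> str:
    q = query.lower()
    return "multi-hop" if any(cue in q for cue in MULTI_HOP_CUES) else "single-hop"

def audit(queries: list[str]) -> None:
    counts = {"single-hop": 0, "multi-hop": 0}
    for q in queries:
        counts[classify(q)] += 1
    total = len(queries)
    for label, n in counts.items():
        print(f"{label}: {n}/{total} ({n / total:.0%})")

if __name__ == "__main__":
    sample = [
        "What is our refund policy for enterprise customers?",
        "Why did churn spike after the March pricing change?",
        "How are the billing service and the invoicing API connected?",
        "What's the SLA for the EU region?",
    ]
    audit(sample)
```

If the multi-hop share turns out to be small, that is a point in favor of sticking with plain vector retrieval.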

AI and the Design Process

Figma launches ChatGPT integration for FigJam diagrams

Figma now offers a ChatGPT app that can generate FigJam diagrams directly from chat conversations. Users can create flowcharts, sequence diagrams, state diagrams, and Gantt charts by simply mentioning Figma in their ChatGPT prompts and uploading photos, drawings, or PDFs. The integration is powered by Figma's MCP (Model Context Protocol) server and is available now to logged-in ChatGPT users outside the EU. Figma positions this functionality as helpful for turning sketches into collaborative diagrams, visualizing software architecture, and mapping product roadmaps.

I tested this functionality this week with some process documentation I wanted visualized as a flow diagram. After connecting my Figma account to ChatGPT, it successfully generated basic diagrams with each process step represented as a node in FigJam. The results weren't perfect: it struggled with branching logic and added some unnecessary branches, but it gave me a solid starting point. Instead of facing a blank canvas, I had objects I could quickly refine into a more accurate and usable diagram.

That’s it for this week.

Thanks for reading, and see you next issue with more curated AI/UX news and insights. šŸ‘‹

All the best, Heidi
