Interactivity

Extended overthinking

On AI, slot machines, and forgetting how to craft pixels. I’ve grown tired of AI lately… of what it does to me. Every new model, every new tool, every new workflow. Yes, they’re powerful. Yes, they let us do more. But there’s this weird disconnect growing between me and the things I make. When everything is instant, it starts feeling like a slot machine. Except you always win. Which makes it more addictive. And somehow… also boring? And you can’t stop. Social media keeps telling us to ship more, fa…

Show HN: Sushidata – automating the painful parts of competitor and VoC research

Hi HN, A few months ago we noticed a pattern. Every GTM, product, and marketing team we talked to had the same problem. They were drowning in external data from Reddit, Discord, Slack communities, competitor sites, and social channels. But turning all of that noise into something structured and useful took an enormous amount of time. We watched people spend days copying screenshots into spreadsheets, tagging posts, and checking competitor websites by hand. We were doing the same thing ourselves an…

Hidden cost of AI prototypes, leadership myths, how designers use AI

Weekly curated resources for designers — thinkers and makers. “Prototypes are no longer as special as they once were. They’re now the bare minimum. But this speed comes with limitations. Many AI-generated prototypes are never meant to survive past the moment they’re validated. They do their job in a meeting or a user test, but then they’re rebuilt by engineering or even trashed. But the prototypes didn’t “fail,” they were just created with a different intention and outcome.” The hidden cost of AI p…

What designers can learn from the first iPhone moment of AI

Fifteen years ago, the iPhone killed Flash and nearly erased an entire design discipline. Continue reading on UX Collective »

Designers, we should be killing it right now

Designers should be thriving in the age of AI. Here’s why we aren’t, why it’s probably our fault, and how we can fix it. Much has been said about the future of design in the age of AI. Some think the role will disappear completely. Others say only super-seniors will survive. And yet others say it’s all just a blip in time and there will be no fundamental change. I think all are wrong. For two reasons: designers are naturally attuned to adapt to technological…

Something big “might” be happening

Why the future of design belongs to creative thinkers and problem solvers

No, VR can’t make you walk in others’ shoes

The shallowness of the “empathy machine.” I once saw a “poverty simulation” designed to help raise funds and “put people in others’ shoes.” I felt… weird. Can you imagine someone going through a 10-minute fancy VR experience and suddenly claiming they understand the struggles? VR allows users to step into any experience from a first-person perspective. Some people call it an “empathy machine” and argue that it could influence decision-makers. I bought into it at f…

On craft and connivence

Lessons in design, from a USB drive to the age of AI, and beyond.

Surveillance by default, consent by assumption

How consumer security products turn physical presence into assumed consent. When Amazon’s Ring aired its Search Party advertisement during the Super Bowl, it presented a reassuring narrative: neighbours’ cameras cooperate, a missing dog is found, and communal order is restored. The unease the advert prov…

UX questionnaires. Is it rocket science?

Mission control didn’t run on intuition. Neither should your design process.

AI as Art Director: Can Machines Develop Taste?

AI can mimic style but not taste. As machines start acting like art directors, they can generate infinite beauty — yet none of it means anything. This article dives into why true taste requires emotion, risk, and rebellion — things no algorithm will ever feel.

A Designer’s Guide To Eco-Friendly Interfaces

I’ve spent over two decades in the trenches of user experience design. I remember the transition from table-based layouts to CSS, the pivot to responsive design when the iPhone launched, and the rise of the “attention economy.” But as we navigate 2026, the industry is facing its most significant shift yet. We are moving past the era of “design at any cost” into the era of Sustainable UX. It’s not something most designers, myself included, thought about until I was prompted by hearing about this as…

Pixels of the Week – February 22, 2026

This edition looks at AI hype pressure in design and dev work and why chatbot-first UX often fails. Also: an amazing pink moth, embroidery insect art inspiration, and a reminder to use prefers-reduced-motion for your animations.

Why most AI products fail before the first user interaction

Most AI features fail because they start with hype, not humans. Most AI products fail before the first user interaction because they don’t solve a real user problem. That may sound dramatic, but I keep hearing the same sentence in executive rooms: “We need an AI feature. Our competitor just launched one.” And just like that, features are built out of fear of being left behind rather than from a clear understanding of what users actually need. And this isn’t just my…

Show HN: Create an onboarding flow on Flutter in 5 min

Hey Flutter devs! If you've shipped apps before, you know how important it is to have an efficient and polished onboarding flow. It's the first thing users see and often the reason they leave. You've probably first focused on the core of your app, what makes it different. And now, you want to push it to the store, but you know you have to build an onboarding flow... and it's a little painful. Onboarding flows are deceptive. They are super easy to build technically, but very diffi…

Why is real-world ASR still ~85% when lab models claim >95%?

Curious to hear what approaches people are taking, what the bottlenecks are, and whether anyone here is pushing toward the goal of "AI that understands you, the first time." I've been diving into the gap between benchmark ASR performance and real-world speech. Models like Whisper and Deepgram show impressive >95% accuracy in ideal conditions. But in the wild — accents, noisy environments, emotional speech, code-switching, overlapping speakers — accuracy often drops sharply, ofte…
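For context on how these percentages are usually computed: ASR “accuracy” typically means 1 − word error rate (WER), where WER counts substitutions, deletions, and insertions against a reference transcript. A minimal sketch of the standard word-level Levenshtein computation (the example transcripts below are made up for illustration):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length.

    Computed via word-level Levenshtein distance; assumes a non-empty reference.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + sub,  # match or substitution
            )
    return d[len(ref)][len(hyp)] / len(ref)

# "the" deleted plus "on" -> "off" substituted: 2 errors over 4 reference words.
print(wer("turn the lights on", "turn lights off"))  # 0.5
```

A model that is 95% accurate in this sense still gets one word in twenty wrong, which is part of why a few extra points of WER in noisy, accented, or overlapping speech feel so much worse in practice.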

The craft of the instruction

Writing AI prompts isn’t just a new technical skill — it’s how we can make our own thinking visible. In this collage: baby photo in front of Frank Lloyd Wright’s Fallingwater in Mill Run, Pennsylvania; “Towers on String — Variant Dispersed” by Haegue Yang, 2013; example of a writing instruction with corresponding output. Image credit: personal photograph (Fallingwater, 2005); Haegue Yang, “Towers on String — Variant Dispersed,” 2013, Henry Art Museum; original writing instruction document by author…

Show HN: UseWhisper.dev – AI Code Reviewer (please test and roast it)

Hey HN! I built UseWhisper.dev — an AI code reviewer that analyzes your code diffs, PRs, or snippets and returns review feedback instantly. It runs in the browser with no signup required, and is meant to give developers quick second opinions on logic, style, security, and best practices.

https://usewhisper.dev

What it does:
- Paste a diff, GitHub PR link, or code snippet
- Get line-by-line intelligent feedback
- Suggestions on readability, errors, anti-patterns
- No login, minimal UI, fast responses

W…

Show HN: utils.live – Developer utilities that run entirely in your browser

I kept opening different websites for simple dev tasks — formatting JSON, encoding Base64, testing regex patterns. Each one had ads, signup walls, or sent my data to a server. I wanted a single place where everything runs client-side with nothing leaving my browser. Each tool is a stateless pure function defined with Zod schemas. The schemas validate input at runtime and also generate the UI automatically — editor language, form fields, and output format are all inferred from the schema shape. To…
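The schema-driven pattern described here — a pure function paired with a schema that both validates input at runtime and tells the UI layer what to render — can be sketched roughly as follows. This is an illustrative Python analogue, not the site’s actual TypeScript/Zod code; `Tool`, `Field`, and `validate` are hypothetical names invented for the example:

```python
import base64
import json
from dataclasses import dataclass
from typing import Callable

# Each tool is a stateless pure function plus a schema. The schema is used
# twice: to validate input at runtime, and to drive UI generation (the "kind"
# of each field tells the front end which editor/form control to render).

@dataclass(frozen=True)
class Field:
    name: str
    kind: str  # e.g. "text" or "json"; the UI infers the editor from this

@dataclass(frozen=True)
class Tool:
    name: str
    inputs: list[Field]
    run: Callable[[dict], str]  # pure function: input dict -> output string

def validate(tool: Tool, payload: dict) -> dict:
    """Runtime validation against the tool's schema, before run() is called."""
    for field in tool.inputs:
        if field.name not in payload:
            raise ValueError(f"missing input: {field.name}")
        if field.kind == "json":
            json.loads(payload[field.name])  # must at least parse as JSON
    return payload

b64_tool = Tool(
    name="base64-encode",
    inputs=[Field("text", "text")],
    run=lambda p: base64.b64encode(p["text"].encode()).decode(),
)

payload = validate(b64_tool, {"text": "hello"})
print(b64_tool.run(payload))  # aGVsbG8=
```

The appeal of the design is that adding a tool means writing one pure function and one schema; validation and UI fall out automatically, and nothing requires a server round-trip.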

Ask HN: Anyone doing production image editing with image models? How?

Hey HN — I’m building an app where users upload “real life” clothing photos (ex. a wrinkly shirt folded on the floor). The goal is to transform that single photo into a clean, ecommerce-style image of the garment. One key UX requirement: the output needs to be a PNG with transparency (alpha) so we can consistently crop/composite the garment into an on-rails UI (cards, outfit layouts, etc.). Think “subject cutout that drops cleanly into templates.” My current pipeline looks like: 1. User-uploa…