When data is the difference between aid reaching people or not, design has to be invisible — and the intelligence has to be immediate.
UNDP's data had always existed — displacement tracking, nutrition assessments, climate analysis, risk registers. Humanity Hub makes it instantly accessible through AI search, interactive maps, and structured insights — for every agency in the field.
The first version established the core concept — AI search as the primary interface. Testing and stakeholder feedback pushed it toward something more personalised and contextually rich.
The shift from V1 to final wasn't cosmetic — it came from testing with real UNDP users who needed to feel the platform knew their context, not just their query. Personalisation and trending content reduced the cognitive load of starting a session from scratch.
On most product teams, a PM defines the feature set. They write the brief, scope the requirements, prioritise the roadmap. The designer executes within that structure.
This project had no PM. I sat across from UNDP stakeholders and had to do both jobs simultaneously — ask the right questions to surface requirements, synthesise what I heard into a feature set, propose a product direction, and then design it.
It demanded a different kind of rigour — less about executing well and more about framing the right problem to begin with. Getting that wrong at the start means designing the wrong thing beautifully.
I got it right by staying curious longer than was comfortable, and by resisting the urge to propose solutions before I understood the problem.
The feature that changed everything: promptable maps. Type a natural language question — see the answer plotted on a map of Somalia. Refugee camps, UNICEF project sites, risk zones — all accessible through a sentence.
The interactive prototype did the work words couldn't. Seeing spatial data respond to a natural language query — in real time — changed the conversation in the room. That's when the project got its momentum.
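The mechanics behind that moment can be sketched in a few lines. This is a hypothetical illustration, not the production code: a natural-language query is reduced to a structured filter (here a toy keyword matcher standing in for the AI layer), and matching sites come back as GeoJSON that a map component can plot directly. All types, site names, and coordinates below are invented for the example.

```typescript
// Illustrative sketch of the promptable-map flow. The real system's
// schema and AI parsing are not shown here; this is the shape of the idea.

type Site = { name: string; category: string; lat: number; lon: number };

// Invented sample data standing in for the platform's spatial datasets.
const SITES: Site[] = [
  { name: "Camp A", category: "refugee-camp", lat: 2.05, lon: 45.32 },
  { name: "Project B", category: "project-site", lat: 9.56, lon: 44.06 },
];

// Toy "intent parser": a real implementation would call an LLM or NLU
// service to turn the sentence into the same structured filter.
function parseQuery(query: string): { category?: string } {
  const q = query.toLowerCase();
  if (q.includes("camp")) return { category: "refugee-camp" };
  if (q.includes("project")) return { category: "project-site" };
  return {};
}

// Returns a GeoJSON FeatureCollection the map layer can render as-is.
function promptableMapLayer(query: string) {
  const filter = parseQuery(query);
  const matches = SITES.filter(
    (s) => !filter.category || s.category === filter.category
  );
  return {
    type: "FeatureCollection",
    features: matches.map((s) => ({
      type: "Feature",
      geometry: { type: "Point", coordinates: [s.lon, s.lat] },
      properties: { name: s.name, category: s.category },
    })),
  };
}

console.log(promptableMapLayer("show refugee camps").features.length); // 1
```

The design point is the contract: the AI layer's only job is to emit a structured filter, so the map never has to understand language, and the language model never has to understand geometry.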
Humanity Hub isn't just a search tool. It's a platform — with user management, organisation access controls, usage analytics, system health monitoring, and AI-generated insights about how the platform itself is being used.
The Admin Dashboard gives UNDP super admins visibility across all 214 users, 5 agencies (UNICEF, WFP, UNHCR, IOM, UNDP), and real-time search activity — including AI-surfaced signals like which agencies are most engaged and why.
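The engagement signal behind a dashboard like this is, at its core, an aggregation over raw search events. The sketch below is hypothetical (the event shape and data are invented, not the production schema): it rolls search events up into per-agency counts, which is the kind of structured summary an AI layer could then turn into a plain-language insight.

```typescript
// Illustrative sketch: aggregate raw search events into per-agency
// activity counts for an admin dashboard. Event shape is invented.

type SearchEvent = { agency: string; userId: string; query: string };

// Count search events per agency.
function engagementByAgency(events: SearchEvent[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of events) {
    counts.set(e.agency, (counts.get(e.agency) ?? 0) + 1);
  }
  return counts;
}

// Invented sample events for demonstration.
const demo = engagementByAgency([
  { agency: "UNICEF", userId: "u1", query: "nutrition assessments" },
  { agency: "WFP", userId: "u2", query: "food security" },
  { agency: "UNICEF", userId: "u3", query: "displacement tracking" },
]);
console.log([...demo.entries()]); // UNICEF: 2, WFP: 1
```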
Humanity Hub was the first project where I didn't hand off designs and wait. I shipped front-end code directly — using AI-assisted development to move at a speed that kept pace with the engineering team.
When you can raise a PR yourself, the conversation between design and engineering changes. Nuances that live in interaction details — timing, state transitions, edge cases — make it into the product because the designer is there to put them in.
It's a different kind of craft. And it made the product better.
The hardest design work isn't in Figma. It's in the room with stakeholders, asking the questions that define what you're going to build. Get that wrong and no amount of craft fixes it downstream.
Describing a promptable map would never have worked. Showing it to sceptical stakeholders did. In complex AI products, the interactive prototype is how you earn belief from people who've been burned by demos before.
When I started shipping front-end code, the quality of implementation improved. The gap between design intent and shipped product closed. That gap is where good design goes to die — and you can choose to close it.