Working in AI, caring about people

January 2026

On building intelligent systems without losing sight of who they serve

I work in AI. I say this plainly because the phrase has become so loaded — with hype, with dread, with a thousand conference talks about the future — that it barely means anything anymore. So let me say what it means to me: I spend my days building systems that try to understand language, intention, and context, and I spend my evenings wondering whether we are doing it well enough, and for the right reasons.

The thing about artificial intelligence is that every interesting problem in it is, at bottom, a human problem. What does it mean to understand a sentence? That is not an engineering question — it is a question about empathy, about the vast unspoken context that humans carry into every conversation. When someone types a query at two in the morning, they are not just entering text. They are tired, or worried, or curious in a way that no embedding will fully capture. The best systems I have worked on are the ones that acknowledge this gap honestly rather than pretending it does not exist.

There is a tendency in this field to be dazzled by capability. A model can now write poetry, summarize legal documents, generate code. And these capabilities are genuinely remarkable. But capability without care is just power, and power without direction tends to serve whoever is already holding it. I am more interested in the quiet applications — the tool that helps a first-generation student draft a college essay they are proud of, the interface that lets a non-technical person ask a real question and get a real answer. These are not the demos that go viral, but they are the moments where the technology justifies itself.

What keeps me grounded is the people. Not users as an abstraction, not DAUs or retention curves, but the actual individuals on the other side of the screen. I have watched someone's face when a tool finally does what they needed it to do — not something impressive, just something right. That moment is small and unrepeatable and it is the entire point. If I ever lose sight of it, I hope someone will tap me on the shoulder and tell me to stop.

I do not think AI needs to be warm. I think it needs to be honest. Honest about what it knows and does not know. Honest about the fact that it is a tool built by people with biases and blind spots. Honest about the places where a human should step in. The most human thing we can build into these systems is not a friendly tone — it is the humility to say, I am not sure, and here is why.

So yes, I work in AI. But my work is about people. It always has been. The models are just the material — like wood, like clay, like ink. What matters is what you make with them, and whether the person who encounters it feels seen.