To Give and Get a Gift That Reveals
What you want is hidden—even from you, especially from you.
Three times now, I’ve unwrapped essentially the same gift. One was from an ex-girlfriend’s mother, another from my favorite teacher, another from a friend of a friend. Super Thinking and Thinking Like a Freak and Think Again—all of them good books, sure, and no knocks on the authors, but all circling the same premise: here’s a catalog of named cognitive tools, organized for quick retrieval, so you can Think Better™.
Each came from someone who knows me, likes me, and—wanting to be fair about this—was trying. They noticed something obvious about me: that I think a lot, out loud, at length, in public, perhaps too often, sometimes past the point where polite company has tried to move on to lighter subjects.
But the gift landed wrong every time, and it took me three tries to understand why.
These books don’t treat thinking as something I do about things that I care deeply about. They treat thinking as a hobby or a personality feature. Here are 309 mental models. Here are 99 biases. And the implicit portrait is: here’s a guy who collects frameworks the way some people collect vinyl or vintage watches. An obsessive thinker.
Which—maybe, yes, fine. But the label lands the way “intense” or “stocky” or “bubbly” lands: a description that isn’t wrong, exactly, but that you’d never choose as the caption under your own photograph. It doesn’t make me feel seen. It’s not how I see myself, anyway. It makes me feel flattened, simplified, reduced to a single visible attribute by someone standing at a distance, safely outside intimacy.
The books I’d actually want are by definition impossible for me to articulate clearly, because if I could, I might have sought them out already. I haven’t found them because I haven’t found the idea yet that would make me want them. That’s what my obsessive thinking really is: seeking. A good gift of a book would be one that notices that when I obsessively bring up certain ideas in conversation over and over, I’m like a truffle pig with his face in the dirt—I can smell what I’m looking for, somewhere around here, but I haven’t found it, and something in my brain won’t give up the scent until I do. A great gift giver understands that seeking and, if they can’t find it for me, puts real thought into what might help me find it.
I’ve gotten gifts like that and nearly fell in love on the spot. I hope you have experienced it too: someone who knew you well, who’d been paying a kind of attention you hadn’t paid yourself, handed you something and you felt that disorienting shock of recognition. Yes. This. Exactly this. You couldn’t have found it even if you searched for it. You couldn’t have typed it into a prompt. It’s not that you were being coy, or playing hard to get with some algorithm, but your desire didn’t yet exist in a retrievable form. It was latent, unformed: a pattern in your life that someone standing outside you—but close enough to care—could read before you could.
The gap is between the gift that pattern matches and the gift that reveals. The gift that makes sense on paper—matches your interests, fits your personality, is aligned with your purchase history—is an algorithmic gift, the kind that can be thoughtfully brought to you by targeted ads.
But the gift that reveals, that’s something different.
When we talk about the jobs AI will take, most assume the hard part of economic life is production. Making the thing. Executing the task. Generations of life in a world reconfigured by industrial reality have persuaded us that this is what work is.
But production was never the hardest part.
The real challenge in making products for others is knowing what to make, for whom, and why this rather than that. Our laptops have become astonishing production tools. They can write, code, design, analyze, generate. But what they still cannot do reliably is tell you what is worth producing in the first place.
Economists already have a language for part of this problem. Friedrich Hayek’s famous argument against socialist central planning was, at bottom, a knowledge argument: the information a central planner needs to allocate a society’s resources efficiently is dispersed across millions of minds, locally held, constantly updating, and never fully available to any one of those minds. Centralizing that knowledge is therefore impossible.
But Hayek did not anticipate artificial minds that could—with enough data and compute—centralize far more of that knowledge than any human mind ever could. With such a machine, the local-knowledge problem could be mitigated. Maybe, but maybe not: even if we solve the computational problem, we still run into the question of whether all socially relevant knowledge is the kind that can be assembled at all.
Here Michael Polanyi builds on Hayek: some knowledge cannot be centralized because it cannot first be fully articulated—not even by the person who possesses it. Tacit knowledge, Polanyi says, is knowledge not fully available in propositional form. It is embodied, situational, aesthetic, relational. It is not always the sort of thing a person could simply upload. It is not legible to artificial minds because it is not even legible to the human mind in which it sits.
To give gifts that reveal requires access to exactly that kind of knowledge—something I haven’t made legible, something I may not be able to make legible, something that might surface only in the right conversation at the right time with the right person.
Sometimes the relevant knowledge is not hidden preference data waiting to be retrieved. It is a want that does not yet exist in articulate form, existing if at all as the potentiality of a want. It emerges only through encounter. The customer does not always know what he wants in the way a database entry knows its field value. Sometimes he knows only by recognizing it when it arrives.
To be sure, recommendation algorithms really do surface latent preferences. Spotify finds songs you never would have searched for, and you love them; Amazon predicts purchases you did not know you were about to make, and it is often right. The algorithm examines millions of people who behaved like you, infers what comes next in the pattern, and serves it up. You hear the song and feel that jolt of recognition. But it found you something new by pattern-matching you against your behavioral cousins, the millions of people whose choices help predict yours, and pushing you toward the center of the cluster. The “new” it gives you is more of what you already want.
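The mechanism is simple enough to sketch in a few lines of toy Python. This is not any company’s actual system, just the nearest-neighbor logic underneath it, with invented users and ratings; notice that the recommendation can only ever be something your cousins already chose.

```python
# Toy collaborative filter: each user is a dict of item ratings.
# All names and numbers here are invented for illustration.
from math import sqrt

ratings = {
    "you":  {"folk": 5, "jazz": 4},
    "ana":  {"folk": 5, "jazz": 5, "bossa": 5},
    "ben":  {"folk": 4, "jazz": 4, "bossa": 4},
    "carl": {"metal": 5, "punk": 5},
}

def cosine(a, b):
    """Similarity between two rating vectors, via their shared items."""
    shared = set(a) & set(b)
    dot = sum(a[i] * b[i] for i in shared)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user, k=2):
    me = ratings[user]
    # Your "behavioral cousins": the k users most similar to you.
    cousins = sorted(
        (u for u in ratings if u != user),
        key=lambda u: cosine(me, ratings[u]),
        reverse=True,
    )[:k]
    # Score items you haven't rated by how much your cousins liked them.
    scores = {}
    for u in cousins:
        w = cosine(me, ratings[u])
        for item, r in ratings[u].items():
            if item not in me:
                scores[item] = scores.get(item, 0.0) + w * r
    return max(scores, key=scores.get) if scores else None

print(recommend("you"))  # → bossa: the center of your cluster
```

Carl’s metal and punk never surface, no matter how much he loves them: with no overlap, his similarity to you is zero. The algorithm extrapolates your cluster; it cannot stage a collision with someone genuinely unlike you.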
The gift that reveals comes from collision with something genuinely different and distinct. Another person—carrying unarticulated knowledge of their own, half-formed patterns of their own, an angle of vision you cannot fully access or simulate. When your opacity meets theirs, something can emerge that was not sitting fully formed in either person’s head a moment before. That something is truly something new.
AI may be the most sophisticated legibility engine ever built. It reads, categorizes, models, predicts, making each of us legible to an unprecedented degree. It’s a mirror, an increasingly brilliant mirror, that reflects your patterns back to you with staggering fidelity. The mirror can surprise you with things you had not noticed before, but it cannot reveal anything not already in the room. Another person can, precisely because they aren’t you. Their partial, imperfect understanding of you can mismatch your own in ways that illuminate.
The most precious knowledge, for each of us, is often this tacit knowledge, fleeting and unstable and only partially legible, revealed in moments of real encounter.
Steve Jobs said that the customer doesn’t know what they want until you show it to them. Most read that as license for the kind of visionary arrogance it might take to Think Different, but I see now that it’s the opposite: a plain description of the role of the entrepreneur.
When the customer tells you exactly what they want, you simply build to spec. You build the faster horse. But the most valuable wants are often hidden in tacit knowledge that does not yet exist in articulate form, wants that emerge only when entrepreneur and customer discover them together.
The entrepreneur’s job, always but especially now, is not to give the customer what they want, but to create the conditions in which the customer discovers it. To give them the gift that reveals.
Every value chain still terminates in a person who must be known—not in the data-aggregation sense, but in the way only another partial, imperfect, stake-bearing person can know them. The customer at the end of the chain is not merely a recipient. The customer is a discoverer.
And the job AI cannot replace is the one who stages that discovery.
