Evan Byrne

Software Engineer @ CRD

Dublin, Ireland


The Lure & The Landscape

April 16, 2026 · Evan Byrne
Marketing AI
CATCHING THE FISHERMAN, NOT THE FISH

The Lure

East of Lake Shore Drive, one spring quarter, John-Henry Pezzuto waited for his writing class to begin. The class was to be taught by the one and only Larry McEnerney, the world-renowned writing expert and then director of the University of Chicago’s writing program. Despite McEnerney’s considerable reputation, John mustered up the courage to ask a valuable question: did he have any books on writing to recommend? Typical of any academic, Larry began with a story seemingly unrelated to Pezzuto’s original question. His father, a hobbyist fisherman, had once asked him something along the lines of “How do you sell a lot of fish bait?” Larry offered the logical answer: “It has to be good and catch a lot of fish!” His father corrected him: “To make a lure that sells well, you need to make a lure that catches a lot of fishermen. Fishermen buy the lures, not fish.”

The exchange continued with Larry asking, “Do you know how to write a writing book that sells well?” Pezzuto offered the same kind of non-answer Larry had once given his father, mumbling something about how you need to write a book that appeals to a lot of bad writers. No, Larry said. “You need to write a book that writing teachers like, because they are going to make their students buy copies.” It was time to go inside. “No,” Larry concluded, “I don’t have any writing books to recommend.”

This story captures something essential about how markets actually work. Something we prefer not to think about too directly. There is a gap between the person who buys a thing and the problem the thing claims to solve. The writing book isn’t optimized for teaching writing; it’s optimized for appealing to the people who assign writing books. The lure isn’t optimized for catching fish; it’s optimized for catching the eye of fishermen standing in the aisle of a sporting goods store.

This isn’t cynicism. It’s simply a recognition of where value is actually being created and captured. The lure-maker knows his customer. The writing-book author knows hers. The question, for the rest of us, is whether we can see clearly enough to know which side of the transaction we’re on.

The reason this is difficult is that our minds aren’t built for it. Psychologists have a name for the phenomenon: construal level theory. We think about distant things abstractly and near things concretely. A vacation planned six months out is a feeling, a vibe, an escape. A vacation happening tomorrow is a checklist of logistics, a series of small anxieties, a realisation that the beach hotel has a two-star rating you somehow overlooked. The version of your life where you own a cabin in the woods is peace and simplicity. The version where you actually have to maintain one is a parade of contractors and property taxes and a four-hour drive every time the pipes freeze.

Marketing has always exploited this gap between the distant and the near. The product in the advertisement exists at maximum psychological distance: it is potential, promise, transformation. The product in your hands is just a thing. The $200 blender will not, it turns out, make you the kind of person who drinks green smoothies at 6 a.m. It will sit on your counter, then in your cabinet, then in a garage sale. You weren’t buying a blender. You were buying a vision of yourself, and the blender was just the prop.

The landscape is barren, but the horizons are always full. You can see the city — towers, structures, signs of civilization and shelter. You walk towards it. But as you get closer the shapes begin to resolve: some of them are real, a small outpost perhaps, but most of what you saw was mirage. Heat shimmer and the atmosphere playing tricks with light and distance. You arrive at the outpost. It’s useful. There’s water, some shelter and provisions. But it’s not the city. It’s not what you saw — or thought you saw. But again from this new vantage point, you look further into the distance, and there is another city on the horizon. Larger this time, more defined, more real-looking than the last. Surely that one is real.

This is the structure of a certain kind of marketing — one that doesn’t require explicit lies, just the strategic management of distance. You never quite reach the thing you were sold, but you’re always close enough to something useful that you can’t call it fraud. The outpost has water in it. You can’t say you were deceived. And there’s always a next version, a next release, a next threshold that will finally deliver on the original promise. The city is always just past the next horizon.

The Vibes

We see this pattern playing out vividly in the space of artificial intelligence today, and more narrowly in LLMs and agents. The retail investor believes the promise, and this isn’t unjustified. Whether it’s the CEO, the vibe coder, or the respected paragon of knowledge coming out to say “This will change everything,” the average person would be naive to assume smoke with no fire, or, more accurately in this cycle, a mushroom cloud without a nuke.

And they’ve seen the magic. In many ways LLMs have fundamentally changed what people reach for. Whether drafting a paper introduction, using it as a more interactive oracle, getting personal advice, or figuring out how to respond to a Hinge prompt. Basically anything that requires some level of cognitive depth. Don’t we all like to hit the immediate answer button? The thing is trained on the entire corpus of human knowledge. Wouldn’t you like to know how Plato would advise responding to that prompt?

But the problem is that the people who are most excited tend to be the furthest from the actualities. They haven’t seen the Klarna debacle, the hallucination cases, or the quietly returned enterprise contracts. Sometimes the lure doesn’t catch the fish.

When it comes to the frontline responders, also known these days as software engineers, the first time they watch an LLM one-shot a PRD you might as well frame their face beside Oppenheimer’s at the nuclear test. Once you’ve experienced this, it’s hard not to believe the technology is going to pay dividends. But has it? Every person you meet who uses agents to assist with development tends to tell you more about the wonders than about the horrors of slop running wild. “I feel so much faster.” “This would have taken me ages.” But is this the reality? As with most things, what people say and what actually happens often don’t match up. That seemed to be the case in last year’s METR study, where developers predicted they would be 24% faster but ended up 19% slower.

However, I’m not going to sit here and write that things haven’t changed since then. A follow-up attempted to re-run the study, but by then developers refused to work without AI, even for $50 an hour. And there was indeed an estimated 18% speedup, though the confidence interval was too wide to be reliable. So we now even have a case of developer lock-in. Developers have reached a new era where they refuse to return to bearskins and talons. The code monkey has been liberated, and he doesn’t want to go back into the cage.

The Reality

In many ways I’m the chimp who was never in the cage. I started a Master’s in Computer Science in 2023, pivoting from Psychology. The earliest versions of ChatGPT were already out, and I used them to help me learn, write, and debug. They weren’t trustworthy for quality work, but they were useful enough that I never went back, even as I found them nearly useless for research. I reached the outpost and drank the water, but I keep finding myself at a new horizon: is this really it? And sometimes, with a kind of melancholy, I wonder where I’d be without it. Would I be faster? Better? Will the habits I’ve built, reading documentation cover to cover, coding by hand on side projects, resisting the shortcut, even matter? Or will they be rendered irrelevant the moment I go back to work and hand a spec to an agent?

There is a thoughtful excitement and belief about what these harnesses and models can do. I share this excitement. Not in the sense of forgetting syntax and lobotomizing myself while the agent runs off to ruin the codebase, but in the sense of developing taste while being freed to think through harder problems. Driving a powerful engine rather than turning the crank.

Sometimes we get what we wish for, but it’s not always what we want. People who market and sell understand this pipeline. It’s not a flaw of the system; it is the system. The lure doesn’t catch the fish, it catches the fisherman. For the most part the fisherman goes home happy, because occasionally he catches something. So will we be happy to occasionally catch something? Or will we strive for a relationship where the agency that’s most valued is the one that drives the boat?

Beir Bua