From Betting Slips to Infinite Scroll: A Designer's Confession

Early in my career, I worked at a betting company. Not for long — a contract, maybe six months. But one meeting has stayed with me for over a decade.
We were reviewing a new gambling format. The product review was positive — engaged users, high session times, good retention numbers. Everything the dashboard said was working.
Then someone flagged a problem: too many people were winning.
Not winning big. Winning enough to stay engaged, to keep playing, to feel like the system was fair. The format was too generous. And that was the issue — not because the company was losing money overall, but because the win rate was high enough that users might actually feel in control.
The verdict was swift: the format couldn't ship as-is. The economics required that users lose. Not sometimes. Systematically. By design.
The room moved on. I sat with it.
The Memory That Came Back
I left that company and moved into product design and UX. Built things I'm proud of — learning platforms, travel tools, enterprise software. The betting gig became a footnote.
Then a few years ago, I started reading about social media engagement design. Variable ratio reinforcement schedules. Infinite scroll mechanics. The deliberate manufacture of uncertainty — will the next swipe show something amazing or nothing at all? — to keep you scrolling.
The memory of that product review came back hard. Because the mechanics are identical.
A slot machine works on variable ratio reinforcement: unpredictable rewards at unpredictable intervals. The uncertainty is the hook. You pull the lever because the next one might be the one.
Infinite scroll works the same way. You keep scrolling because the next post might be the one that's interesting, funny, outrageous, or relevant. The feed is designed so you never know. The uncertainty is the feature.
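The parallel can be made concrete with a toy simulation. This is my own sketch, not any platform's actual code: both the lever and the feed reduce to the same variable-ratio schedule, where each action pays off with some fixed probability and the gap between rewards is unpredictable. The function name and the 8% hit rate are illustrative assumptions.

```python
import random

def variable_ratio_session(actions: int, hit_rate: float, seed: int = 0) -> list[int]:
    """Return the indices of the actions on which a reward landed.

    Each action (a lever pull, a scroll flick) is an independent draw
    with probability `hit_rate`. That is all a variable-ratio schedule is.
    """
    rng = random.Random(seed)
    return [i for i in range(actions) if rng.random() < hit_rate]

# Same schedule, different label: only the framing distinguishes them.
lever_pulls = variable_ratio_session(actions=100, hit_rate=0.08, seed=1)
scroll_flicks = variable_ratio_session(actions=100, hit_rate=0.08, seed=2)

# The gaps between rewards are irregular; that irregularity is the hook.
gaps = [b - a for a, b in zip(lever_pulls, lever_pulls[1:])]
print("rewards at:", lever_pulls)
print("gaps between rewards:", gaps)
```

Run it a few times with different seeds and the reward positions jump around while the long-run rate stays fixed, which is exactly the property that keeps a hand on the lever or a thumb on the screen.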
The betting company was at least honest about what it was doing. The product was literally a gambling product. Nobody pretended otherwise.
Social media platforms use the same psychology, target the same neurological pathways, and create the same compulsive behaviours — but call it "engagement" and "user experience."
The Patterns Are Everywhere
Once you see it, you can't unsee it. Pull-to-refresh mimics a slot machine lever. Notification badges create urgency through intermittent rewards. Streak mechanics — Snapchat, Duolingo, GitHub — exploit loss aversion to keep you coming back not because you want to, but because you're afraid of losing your streak.
Instagram's algorithmic feed deliberately withholds content from people who would engage with it, then surfaces it later to maximise the dopamine hit of discovery. TikTok's For You page is essentially a real-time optimisation engine calibrating the perfect ratio of content you love, content you tolerate, and content that surprises you — the exact recipe for addictive variable reinforcement.
These aren't bugs. They're the product. Built by people like me — designers who understand cognitive psychology and user behaviour and are really, really good at applying it.
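The ratio-calibration idea above can be sketched in a few lines. Again, this is my own illustration under assumed proportions, not any platform's real recommender: the feed draws each item from a deliberately engineered mix of content you love, tolerate, and are surprised by.

```python
import random

# Assumed, illustrative proportions; real systems tune these continuously.
FEED_MIX = {"love": 0.5, "tolerate": 0.35, "surprise": 0.15}

def next_item(rng: random.Random) -> str:
    """Draw one feed item bucket according to the engineered mix."""
    r = rng.random()
    cumulative = 0.0
    for bucket, share in FEED_MIX.items():
        cumulative += share
        if r < cumulative:
            return bucket
    return bucket  # guard against floating-point rounding at the top end

rng = random.Random(42)
session = [next_item(rng) for _ in range(20)]
print(session)
```

The order of any given session is unpredictable; the proportions are not. That combination, a fixed distribution hidden behind surprising individual draws, is the variable-reinforcement recipe in feed form.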
Where Complicity Starts
Here's the uncomfortable part. I've never designed a feature I'd describe as deliberately addictive. But I've designed features that maximise engagement, reduce friction, increase session time, and improve retention. I've celebrated when the metrics went up.
Is there a meaningful difference?
I think there is — but the line is blurrier than our industry pretends. "Good UX" and "addictive UX" share a lot of the same toolkit. The difference is intent and consequence, and those are harder to measure than click-through rates.
The March 2026 verdict in the Epic Games v. Apple case included expert testimony about app store designs that specifically exploit children's cognitive vulnerabilities — limited impulse control, susceptibility to social pressure, incomplete understanding of monetary transactions. These aren't fringe concerns anymore. They're entering legal proceedings.
What's Changing
Something is shifting. Slowly, unevenly, but perceptibly.
The EU's Digital Services Act now requires platforms to explain algorithmic curation and offer chronological alternatives. The UK's Online Safety Act puts duty-of-care obligations on platforms for the first time. Apple and Google have both introduced screen time tools — though cynics might note they're profiting from the problem and selling the cure.
More interesting to me is what's happening inside the design community. I'm seeing more designers push back on dark patterns in product reviews. More ethical design frameworks showing up in job specs. More conversations about whether "engagement" is actually the right metric to optimise for.
It's not enough yet. But it's more than nothing.
Where I Land
I'm not writing this from a position of moral authority. I designed engagement features for a betting company. I've celebrated rising session times. I've prioritised conversion rates over whether a user actually needed what they were being nudged toward.
What I can say is that awareness changes behaviour. The more I've understood about how these patterns work — the psychology, the reinforcement schedules, the deliberate manufacture of compulsive loops — the more carefully I've designed around them.
I don't think the answer is banning infinite scroll or criminalising variable reinforcement. Those are tools, and tools can be used well or badly. But I do think designers have a responsibility to understand the consequences of what they build — not just the intended consequences, but the second- and third-order effects.
The betting company was honest about the transaction. You put money in, the house takes a cut, and the maths ensures you lose over time. Everyone knows the deal.
Social media's transaction is less visible. You put attention in, the platform takes engagement, and the algorithm ensures you stay longer than you intended. The cost isn't money. It's time, focus, and — if you believe the research — mental health.
We should be at least as uncomfortable with that as we were with the betting slips.