Incentives are a Hell of a Drug

People tend to follow their incentive structures far more strongly than they follow any innate sense of morality or a loose goal structure they’ve set up for themselves. I’m not claiming to be above this either, and I don’t think anyone reasonably can. We all follow what we’re most strongly incentivized to do the vast majority of the time. In some sense, economics is the study of incentives and how people follow them. No wonder it’s called the dismal science.1

I don’t think all hope is lost here though. Even if we’re doomed to follow our incentives, we can still attempt to structure them in a way that aligns with our highest values. Most people are strongly incentivized not to murder, and it isn’t just because they’ll get punished. They’ll also feel really bad (just ask Raskolnikov). It’ll take an emotional toll, as well as a deeper psychological toll. Most people’s formulation of themselves as a good person involves not murdering.

Let’s back away from such extreme cases though, and examine the everyday choices we have between different paths. Maybe you have a choice between staying longer at work and making more money, or staying longer at home and spending more time with family. My argument is that you’ll do whichever you’re more strongly incentivized to do (to the best of your knowledge). Whichever path takes you closer to your implicit, underlying goals, is overwhelmingly likely to be the path you’ll take (and why wouldn’t it be?).

Again, this doesn’t leave out the possibility of morality. If you have a strong sense of your own moral system, you’re going to have incentives to follow through on it, just as much as if you had a strong sense that your highest value was “make as much money as possible”. It feels good to stick to our highest values, and if we aren’t incentivized by our highest values, then what are we incentivized by?

I think the trouble comes when we conflate our “highest values” with something we’d like to believe in, but don’t really believe in. You can ask people to donate to charity instead of spending it on social outings, but if they truly get more value in their life by going out with friends…you can guess where that money’s going. Having fun with friends isn’t a bad thing, but there’s an opportunity cost for every such action. It’s entirely possible to value both—maybe if you had a spare million dollars, you would allocate some fraction of it to increasingly extravagant fun with friends, but also give some to charity. In the real world though, we have such limited resources that we’re making such choices all the time, and they’re aligned with what we really value, not with what we say we value.

All kinds of incentives factor into our decisions. I’ve already discussed some of the most obvious, like money or morality. There are plenty of others though, like social costs to being seen as different, or levels of physical exhaustion, or pleasing others so they like us more, or feeding a habit. We are balancing all of these all the time. Because we don’t have infinite time to weigh all these options appropriately, the choices we eventually do end up making are merely a rough approximation to our truest values.

I used to have a terrible habit of biting my nails. It wasn’t like I woke up every morning and decided that’s what I wanted to do that day, or that it aligned with my higher purpose in life. It was just a habit, and one that I had tried to kick with varying degrees of success. What eventually ended up working (and working far better than I expected) was buying a little bottle of what amounts to nail polish with a very bitter taste to it. Even just the threat of that awful taste made me more conscientious, and when I absentmindedly slipped into the habit, an awful bitterness followed immediately. Within a month, I didn’t need to use it anymore and the habit was broken.

The very best disincentives will be strong, immediate, and automatic. Obviously, a weak disincentive won’t propel you very far. “If I bite my nails, I’ll have one piece of broccoli instead of ice cream” doesn’t work at all—it’s not enough skin in the game. Immediacy also matters. If your disincentive is too abstract (e.g. at the end of the week, count up infractions and then do some penance for them) it’s not going to apply in medias res, and so won’t provide the proper feedback you need to understand the connection between bad habit and disincentive. Finally, automaticity helps loads. One tip for nailbiters I’ve since seen is that they should wear a rubber band around their wrist, and snap it once they notice any biting activity. This completely fails to stop the absentminded biter—by definition they won’t be paying attention until it’s too late!

It’s a bit unfortunate we don’t all have a device that can notice any bad habit we’re partaking in and release a bitter taste immediately (although you can only begin to imagine the nefarious purposes certain entities would use it for). Nailbiting is kind of lucky in that the physical act itself can be altered in a way that makes it strongly, immediately, automatically disincentivized.

There have been a million proposals for ways to adjust habits that don’t have this sort of easy fix. For building good habits, a usual suggestion is to build them into a daily routine, so that the good habit is triggered by something you already were going to do anyways. For avoiding bad habits, you can try clearing your house of triggers for them, or the rubber band method, or just mentally noting. Whatever you’re doing is at core a readjustment of your personal incentives, whether adding rewards or punishments, or replacing with something else entirely.

Social pressure is a bit of a mixed bag. Some advice says to tell everyone you know when you’re starting a new activity, so they’ll hold you to it, with a disincentive to disappoint them. Other advice says you should instead keep your goals to yourself, so you don’t incentivize “I told people” instead of “I did the thing”. I usually lean towards the second one here. Unless your friends have strong reasons to hold you to something, they’re likely to be too forgiving. Misaligning incentives is a bigger problem.

Just as what we say we value can differ from what we actually value, what we believe we’re being incentivized to do can differ from our true external incentives. If you confuse some reward you’ve elected to give yourself with the true underlying task you wanted to accomplish, you’re less likely to complete the task even if you’re more likely to get the reward. The extreme example of this is “wireheading”: hacking into your reward centers to give yourself the greatest possible amount of pleasure. If you’re sitting around blissed out all day, you’re not going to be making much progress on your true goals.

Beyond just self-improvement, I think the incentives-based view of the world can provide some pretty good insight into why people act the way they do. “Follow the money” is easily generalized to “follow the incentives”, which works better (although it’s harder to follow) because a lot of what we value is non-monetary. If you have a goal you strongly wish to accomplish above all else, your best bet is to set up an incentive structure that leads you towards it, and then act as you would otherwise act in the face of complicated or conflicted information. Let the incentives guide you home.

  1. The original reason is that Thomas Carlyle was dismissing the economics of his day (Carlyle argued that the emancipation of slaves would be bad for them). The phrase has more to do with his dispute with Mill than with Malthus’ dismal predictions, as has been widely spread. Of course, the idea that slavery could be ended by sheer force of supply and demand is an example of incentives gone right.
