Picture this: you’re driving in your car and you hit someone. Don’t worry, you did it by accident. You’re also a good person, so you know you’re responsible for your victim’s further wellbeing in some way. How would you, the software engineer, design what should happen next, in a general sense? How would you give the driver the chance to act responsibly and amend their wrongdoing?
Chances are you’d give the victim some form of money (plus interest as a sign of good will). Now, how long should they receive those benefits? Some months, a few years? How about until the end of their life? Keep in mind you might have crippled them in such a way that they can’t work anymore.
We don’t have to wonder about such a long-term punishment. That’s exactly what China has. You hit a person with your car, you pay for them until they die. But when they’re already underneath your car, their death can come pretty quickly. Especially if you back over them a few times.
And that’s exactly what happens. The rule turned drivers into killers because they knew a funeral was cheaper than a lifetime payout. I’ve heard of people willing to drive for a good price, but those drivers are downright road-killing their way to a bargain.
This got me churning about what behaviors we’re driving with our software. Our outcomes are nowhere near as drastic, but nonetheless the approach we take has an effect on users. Often in ways we don’t expect.
Take the Apple Watch. While it’s a device I’d leave leeching off a hipster’s wrist come the end of the world, people have lost weight because of it. It’s made a real difference because they’ve tried to fill out those activity rings whenever their wrist watcher demands it. That fundamental opinion about how you should live your life caused those behavior changes.
I’m still not convinced about outsourcing that to a giant company. Especially when their proposed solution is nothing but pop behaviorism. You know behaviorism, the carrot and stick approach. It lurks in those nicely animated badges on the watch you get for being such a good boy. Now stand! Good boy.
It seems when the desire is strong enough, people are willing to become rats in a maze. They want to be healthy. To know they’re secure years down the road. That’s what causes Chinese drivers to react in ways that follow an almost predestined path. They eject civility out the sunroof, because it’s down to a you-or-me moment.
How do we handle that in our products? What if the idea we’ve seeded takes people on an altogether wrong turn? Do we play into those quick-fix solutions? Or do we solve the problem in a sustainable way?
Because usually what happens with a pop behaviorism solution is: the experiment ends. Whether that’s the research done on rats or your time in school. Once you run out of carrots to punish and sticks to please with — or was it the other way around? — the test subjects find their original determination a dry well. You’ve been trained so long to care for the aftermath. After a while, you’re only in it for the beatings and blessings. Though the Apple Watch makes the experiment last forever. There’s always another little fix incoming.
That’s the range of ramifications we’re dealing with. The habits born of our perspective, in software or in general, can change lives — or end them abruptly. It is our grand deception. We want people to go along with our vision, but in turn we cause unforeseen rifts and ripples. Be careful which side you choose. It has a strong pull:
“Point of view is worth 80 IQ points.” – Alan Kay