Justin Fox conducted an excellent interview with the psychologist Gerd Gigerenzer on why intuition is, as a rule, better than analytical reasoning in a world of uncertainty. The key is the distinction between risk and uncertainty, which most people fail to make. Economics and finance textbooks make the same mistake (even while talking about uncertainty), and so, consequently, do financial markets: they compute risk on the basis of predictable hazards but ignore uncertainty, that is, the genuine unpredictability of the future. That is why they cannot foresee any financial crisis. Gigerenzer therefore advocates heuristics over probability theory, a position that in the past led to fierce intellectual disputes about the nature of human rationality. But reality has shown that Gigerenzer is closer to the truth.
Gigerenzer, who is currently working on a project for the Bank of England, argues that for sound decision-making it makes more sense to rely on rules of thumb than on complex risk-calculation models. More precisely, for large banks that take on very risky investments, it is better to have a simple rule forbidding them to finance projects where leverage exceeds 10:1, whereas for small banks, which face less risky business, complex risk models can add useful information about whether a project is justified.
But a lot of your research over the years has shown people making mistakes.
Just imagine, a few centuries ago, who would have thought that everyone would be able to read and write? Now, today, we need risk literacy. I believe if we teach young people, children, the mathematics of uncertainty, statistical thinking, instead of only the mathematics of certainty – trigonometry, geometry, all beautiful things that most of us never need – then we can have a new society which is more able to deal with risk and uncertainty.
By teaching people how to deal with uncertainty, do you mean taking statistics class, studying decision theory?
If you’re in a world where you can calculate the risk, then statistical thinking is enough, and logic. If you go into a casino and play roulette, you can calculate how you will lose in the long run. But most of our problems are about uncertainty. So, for instance, in the course of the financial crisis, it was said that banks play in the casino. If only that were true — then they could calculate the risks. But they play in the real world of uncertainty, where we do not know all the alternatives or the consequences, and the risks are very hard to estimate because everything is dynamic, there are domino effects, surprises happen, all kinds of things happen.
Risk modeling in the banks grew out of probability theory.
Right, and that’s the reason why these models fail. We need statistical thinking for a world where we can calculate the risk, but in a world of uncertainty, we need more. We need rules of thumb called heuristics, and good intuitions. That distinction is not made in most of economics and most of the other cognitive sciences, and people believe that they can model or reduce all uncertainty to risk.
You tell a story that I guess is borrowed from Nassim Taleb, about a turkey. What’s the problem with the way that turkey approached risk management?
Assume you are a turkey and it’s the first day of your life. A man comes in and you believe, “He kills me.” But he feeds you. Next day, he comes again and you fear, “He kills me,” but he feeds you. Third day, the same thing. By any standard model, the probability that he will feed you and not kill you increases day by day, and on day 100, it is higher than any before. And it’s the day before Thanksgiving, and you are dead meat. So the turkey confused the world of uncertainty with one of calculated risk. And the turkey illusion is probably not so often in turkeys, but mostly in people.
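The turkey's rising confidence can be sketched with a standard model of this kind. The interview does not name one, but Laplace's rule of succession is a natural assumption: after n days of being fed, the estimated probability of being fed tomorrow is (n + 1) / (n + 2), which climbs every day and peaks just before Thanksgiving.

```python
def p_fed_tomorrow(n_fed_days: int) -> float:
    """Laplace's rule of succession: estimated probability that the
    farmer feeds (rather than kills) the turkey tomorrow, after
    n_fed_days consecutive days of being fed."""
    return (n_fed_days + 1) / (n_fed_days + 2)

# Confidence grows monotonically with each uneventful day.
for day in (1, 10, 100):
    print(f"day {day:3d}: P(fed tomorrow) = {p_fed_tomorrow(day):.3f}")
```

The model is internally consistent and still fatally wrong, because the process generating the data is not stationary: day 100 is the day before Thanksgiving. That is the turkey illusion — treating uncertainty as calculable risk.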
What kind of rule of thumb would help a person, or a turkey, in that sort of situation?
Let’s use people for that. For instance, with value-at-risk and the other standard models that rating agencies used before the crisis in 2008 — the same thing happened there. The confidence increased year by year, and shortly before the crisis, it was highest. These types of models cannot predict any crisis, and have missed every one. They work when the world is stable. They’re like an airbag in your car that works all the time, except when you have an accident.
So we need to go away from probability theory and investigate smart heuristics. I have a project with the Bank of England called simple heuristics for a safer world of finance. We study what kind of simple heuristics could make the world safer. When Mervyn King was still the governor, I asked him which simple rules could help. Mervyn said start with no leverage ratio above 10 to one. Most banks don’t like this idea, for obvious reasons. They can do their own value-at-risk calculations with internal models and there is no way for the central banks to check that. But these kinds of simple rules are not as easy to game. There are not so many parameters to estimate.
Here’s a general idea: In a big bank that needs to estimate maybe thousands of parameters to calculate its value-at-risk, the error introduced by these estimates is so big that you should make it simple. If you are in a small bank that doesn’t do big investments, you are in a much safer and more stable mode. And here, the complex calculations may actually pay. So, in general, if you are in an uncertain world, make it simple. If you are in a world that’s highly predictable, make it complex.
What about the role of intuition and gut feelings in all of this? Clearly, in business, that’s a big issue.
Gut feelings are tools for an uncertain world. They’re not caprice. They are not a sixth sense or God’s voice. They are based on lots of experience, an unconscious form of intelligence.
I’ve worked with large companies and asked decision makers how often they base an important professional decision on that gut feeling. In the companies I’ve worked with, which are large international companies, about 50% of all decisions are in the end gut decisions.
But the same managers would never admit this in public. There’s fear of being made responsible if something goes wrong, so they have developed a few strategies to deal with this fear. One is to find reasons after the fact. A top manager may have a gut feeling, but then he asks an employee to find facts the next two weeks, and thereafter the decision is presented as a fact-based, big-data-based decision. That’s a waste of time, intelligence, and money.
Read the full interview by Justin Fox in the Harvard Business Review.