It's funny when professionals in one field make amateur mistakes in another.
When economists without any training in software design or large-scale programming start writing large computer programs -- elaborate econometrics, say, or complicated simulations -- they tend to make what programmers consider rookie mistakes. Not programming errors, exactly; rather, they miss out on several decades of accumulated wisdom on how to organize large programming endeavors: reusable code, modular design, data structures shared across problems, appropriate documentation -- the basics, ignored.
Of course, when the roles are reversed -- when engineers and scientists start butting into economics and business problems -- a similar situation arises. A good physicist I know makes a complete fool of himself every time he writes about economics, committing basic mistakes that Econ 101 students are taught to avoid.
The main difference seems to be this: while many economists and business modelers will appreciate programmers' advice on how to handle large-scale projects better, most non-economists and non-business researchers are unwilling to consider the possibility that there's actual knowledge behind the pronouncements of economists (and some business researchers).
Which is why I find this tweet so funny:
It's funny because it's true: most of the time, technocratic pronouncements by technologists and scientists are either examples of the ceteris paribus fallacy (also known as the one-step-lookahead problem) or would only work within a thoroughgoing command-and-control economy.
The ceteris paribus fallacy is the assumption that when we change something in a complex system, the only effects are local. (Ceteris paribus is Latin for "all else being equal.") A common example is taxes: suppose that a group of people make $1M each and their tax rate is 30%. A one-step thinker might believe that increasing that tax rate to 90% would net an additional $600k per person in revenue (from $300k to $900k). That assumes that nothing else changes other than the tax rate. In reality, the higher rate would likely give the people in the group an incentive to shift time from paid production to leisure, shrinking the pool of income to be taxed.
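The contrast between the one-step projection and the behavioral reality can be sketched in a few lines of code. All the specific numbers here (the income, and especially the elasticity of taxable income) are illustrative assumptions, not estimates from the post:

```python
# Illustrative sketch of the ceteris paribus fallacy in tax projections.
# The elasticity value is an assumption chosen for illustration only.

def static_revenue(income, rate):
    """Naive one-step projection: taxable income doesn't respond to the rate."""
    return income * rate

def behavioral_revenue(income, old_rate, new_rate, elasticity=0.4):
    """Taxable income shrinks as the net-of-tax share (1 - rate) falls,
    using the standard elasticity-of-taxable-income functional form."""
    adjusted_income = income * ((1 - new_rate) / (1 - old_rate)) ** elasticity
    return adjusted_income * new_rate

income = 1_000_000
print(static_revenue(income, 0.30))                    # revenue at the old 30% rate
print(static_revenue(income, 0.90))                    # naive projection at 90%
print(round(behavioral_revenue(income, 0.30, 0.90)))   # much less, once people shift to leisure
```

With these toy numbers the behavioral projection comes in far below the naive $900k, because the incentive to earn taxable income collapses as the rate approaches 90%. The exact gap depends entirely on the assumed elasticity; the point is only that "nothing else changes" is doing all the work in the one-step calculation.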
The need for a command-and-control economy to implement many of the quick-fix solutions of technologists and scientists comes from the law of unintended consequences. Essentially an elaboration on the ceteris paribus fallacy, the law says that the creativity of 15-year-old boys looking for pictures of naked women online cannot be matched by the designers of adult filters: for any filtering attempt done purely in the internet domain (i.e., without using physical force in the real world or the threat of it -- without the police and court system), workarounds will pop up almost immediately.
Consider the case of subsidies for mixing biodiesel-like fuels with oil in industrial furnaces. Designed to lower the consumption of oil, the subsidy had the opposite effect when paper companies started adding oil to furnaces that had previously burned only wood-chip byproducts, in order to qualify for it. Pundits from the right and the left jumped on International Paper and others and screeched for legislative punishment; but the companies were just following the law -- a law that did not consider all its consequences.
Because people game rule systems to fit their own purposes (the purposes of the people living under the rules, not the purposes of the people who make them), mechanism design in the real world is very difficult, prone to error, and almost never works as intended. Therefore, in most cases the only way a one-step solution can work is by mandating the outcome: by using force to impose it rather than by changing incentives so that the outcome becomes desirable to the people. *
So, it is funny to see people we know to be smart and knowledgeable in their field make rookie mistakes when talking about economics and business; but we should keep in mind that many others take up the mantle of "science" or "technology" to assert power over us in matters for which they have no real authority or competence.
That's why that tweet is both funny and sad. Because it's true.
* Thaler and Sunstein's book Nudge suggests using psychology to solve the problems of mechanism design. My main objection to Nudge is that I don't trust those who would change our behavior to give up if the nudges didn't work. In the words of Andy Stern of the SEIU: "first we try the power of persuasion; then, if that doesn't work, we use the persuasion of power."
I fear that the Nudge argument would be used to sell the outcome to people ("sure, we want people to eat veggies, but we're just making it a little more work to get the chocolate mousse -- don't worry") and, once the outcome was sold, the velvet glove would come off the iron fist ("tax on chocolate," "ban chocolate," and eventually a "war on chocolate dealers").
David Friedman wrote a much better critique of Nudge and its connection with slippery slopes here.