Reckoning With Competitive Capitalism

“There exists an obvious fact that seems utterly moral:
namely, that a man is always prey to his truths”

Albert Camus, The Myth of Sisyphus and Other Essays (1955)

I wrote a post about 2½ years ago (Aug. 31, 2017) with the same title as this one. It referred to University of Connecticut law professor James Kwak’s book Economism, which warns against “the pernicious influence of economism in contemporary society.” Prof. Kwak defines “economism” as “a distorted worldview based on a misleading caricature of economic knowledge,” and makes the case that free market ideology is guilty of it:

“The competitive market model can be a powerful tool, but it is only a starting point in illuminating complex real-world issues, not the final word. In the real world, many other factors complicate the picture, sometimes beyond recognition.”

As we’ve seen, free market economic theory is based on the assumption of a “pure” capitalist state. Prof. Kwak calls for a new approach that meets the complex challenges of real life:

“Real change will not be achieved by mastering the details of marginal costs and marginal benefits, but by constructing a new, controlling narrative about how the world works.”

“Reckoning” means “a narrative account” and “a settling of accounts,” as in “Day of reckoning.”[1] A reckoning on economic policy therefore begins with an examination of whether the prevailing ideology actually delivers what it theoretically promises. Honest reckoning is hard, because the neural circuits of our brains are predisposed to maintain the status quo and resist change to both individual and cultural belief systems. The difficulty is amplified when fundamentalist ideology is at play, because reckoning threatens historical cultural mythology, which is tantamount to sacrilege.

 “History is powerful. George Santayana’s warning that ‘those who cannot remember the past are condemned to repeat it’ rings true because the past influences the present.

“Unfortunately, history’s power does not depend on its accuracy:  A widely believed historical lie can have as much impact as a historical truth.

“President John F. Kennedy explained to Yale’s graduating class of 1962 that ‘the great enemy of the truth is very often not the lie — deliberate, contrived, and dishonest —  but the myth — persistent, persuasive, and unrealistic. Too often we hold fast to the clichés of our forebears…. We enjoy the comfort of opinion without the discomfort of thought.’”

The Founding Myth, by Andrew L. Seidel (2019)

Change that breaks with predominant ideologies and historical cultural myths requires more than individual changes of opinion:  it needs shifts in cultural belief and practice, and a willingness to learn from history. The odds are stacked against it, for reasons Pulitzer Prize-winning war correspondent Chris Hedges describes in War is a Force That Gives Us Meaning (2014):

“Every society, ethnic group or religion nurtures certain myths, often centered around the creation of the nation or the movement itself. These myths lie unseen beneath the surface, waiting for the moment to rise ascendant, to define and glorify followers or members in times of crisis. National myths are largely benign in times of peace…. They do not pose a major challenge to real historical study or a studied tolerance of others in peacetime.

“But national myths ignite a collective amnesia in war. They give past generations a nobility and greatness they never possessed…. They are stoked by the entertainment industry, in school lessons, stories, and quasi-historical ballads, preached in mosques, or championed in absurd historical dramas that are always wildly popular during war.

“Almost every group, and especially every nation, has such myths. These myths are the kindling nationalists use to light a conflict.

“Archeology, folklore, and the search for what is defined as authenticity are the tools used by nationalists to assail others and promote themselves. They dress it up as history, but it is myth.

“Real historical inquiry, in the process, is corrupted, assaulted, and often destroyed. Facts become as interchangeable as opinions. Those facts that are inconvenient are discarded or denied. The obvious inconsistencies are ignored by those intoxicated with a newly found sense of national pride, and the exciting prospect of war.”

All of this makes the Business Roundtable’s Statement on the Purpose of a Corporation and the World Economic Forum’s Davos Manifesto (we looked at them last time) all the more remarkable, since they defy four decades of the prevailing economic myth that “The [sole] social responsibility of business is to increase its profits.”

On the other hand, a recent administrative order imposing work requirements on food stamp recipients offers an equally remarkable example of myth-driven policy-making. According to ABC News (Dec. 4, 2019), proponents say the move will “restore the dignity of work to a sizable segment of our population” — clearly a nod to the cultural myth that anybody with enough gumption (and enough education, funded by the newly nationalized student loan industry) can work their way out of poverty, and if they don’t, it’s their own fault. As we’ve seen, data to support this way of thinking has long been absent, but the myth prevails, and never mind that “all the rule change does is strip people from accessing the benefit,” that the food stamp program “is intended to address hunger and not compel people to work,” and that “those affected are impoverished, tend to live in rural areas, often face mental health issues and disabilities.”

Economism was published on January 10, 2017, just shy of three years ago as I write this. Today’s “Reckoning” post was inspired by a Time Magazine cover story last month:  How the Elites Lost Their Grip: in 2019, America’s 1% behaved badly and helped bring about a reckoning with capitalism (Dec. 2-9, 2019). We’ll look at what it says about economic reckoning next time.

[1] Etymology Online.

Belief in the Free Market

1909 painting The Worship of Mammon by Evelyn De Morgan.
https://en.wikipedia.org/wiki/Mammon

We saw last time that Milton Friedman and his colleagues at the Chicago School of Economics promoted the free market with fundamentalist zeal — an approach to economics that Joseph Stiglitz said was based on “religious belief.” Turns out that using religious-sounding language to talk about believing in capitalism isn’t as far-fetched as it sounds on first hearing.

In the history of ideas, the “Disenchantment” refers to the idea that the Enlightenment ushered in an era when scientific knowledge would displace religious and philosophical belief. Reason, rationality, and objectivity would make the world less magical, spiritual, and subjective, and therefore “disenchanted.” You don’t need to know much history to know the Disenchantment never really played out — at least, certainly not in America.

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard—letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts. The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.

“Why are we like this?

“The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.

“America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.”

Fantasyland:  How America Went Haywire, a 500-Year History, Kurt Andersen (2017)[1]

Villanova professor Eugene McCarraher makes the case that capitalism stepped up to fill the belief void created by Disenchantment enthusiasts, and became the new world religion.

“Perhaps the grandest tale of capitalist modernity is entitled ‘The Disenchantment of the World’. Crystallised in the work of Max Weber but eloquently anticipated by Karl Marx, the story goes something like this: before the advent of capitalism, people believed that the world was enchanted, pervaded by mysterious, incalculable forces that ruled and animated the cosmos. Gods, spirits and other supernatural beings infused the material world, anchoring the most sublime and ultimate values in the ontological architecture of the Universe.

“In premodern Europe, Catholic Christianity epitomised enchantment in its sacramental cosmology and rituals, in which matter could serve as a conduit or mediator of God’s immeasurable grace. But as Calvinism, science and especially capitalism eroded this sacramental worldview, matter became nothing more than dumb, inert and manipulable stuff, disenchanted raw material open to the discovery of scientists, the mastery of technicians, and the exploitation of merchants and industrialists.

“Discredited in the course of enlightenment, the enchanted cosmos either withered into historical oblivion or went into the exile of private belief in liberal democracies…. With slight variations, ‘The Disenchantment of the World’ is the orthodox account of the birth and denouement of modernity, certified not only by secular intellectuals but by the religious intelligentsia as well.”

Mammon:  Far from representing rationality and logic, capitalism is modernity’s most beguiling and dangerous form of enchantment, Aeon Magazine (Oct. 22, 2019)

Prof. McCarraher develops his ideas further in his book The Enchantments of Mammon: How Capitalism Became the Religion of Modernity (2019). This is from the Amazon book blurb:

“If socialists and Wall Street bankers can agree on anything, it is the extreme rationalism of capital. At least since Max Weber, capitalism has been understood as part of the “disenchantment” of the world, stripping material objects and social relations of their mystery and sacredness. Ignoring the motive force of the spirit, capitalism rejects the awe-inspiring divine for the economics of supply and demand.

“Eugene McCarraher challenges this conventional view. Capitalism, he argues, is full of sacrament, whether or not it is acknowledged. Capitalist enchantment first flowered in the fields and factories of England and was brought to America by Puritans and evangelicals whose doctrine made ample room for industry and profit. Later, the corporation was mystically animated with human personhood, to preside over the Fordist endeavor to build a heavenly city of mechanized production and communion. By the twenty-first century, capitalism has become thoroughly enchanted by the neoliberal deification of ‘the market.’”

Economic theories — capitalism, Marxism, socialism — are ideologies:  they’re based on ideas that can’t be proven scientifically; they require belief. Thinkers like Kurt Andersen and Eugene McCarraher both use the term “dangerous” in connection with economic belief because of the fundamentalist dynamics that invariably accompany ideological belief, secular or otherwise. We’ll look at that next time.

[1] The book is another case of American history as we never learned it. For the shorter version, see this Atlantic article.

Economic Fundamentalism

We saw last time that the goal of Chicago School free market economics was to promote “noncontaminated capitalism,” which in turn would generate societal economic utopia:

“The market, left to its own devices, would create just the right number of products at precisely the right prices, produced by workers at just the right wages to buy those products — an Eden of plentiful employment, boundless creativity and zero inflation.”

The Shock Doctrine:  The Rise of Disaster Capitalism, Naomi Klein (2017)

To the School’s free market advocates, these ideas were pure science:

“The starting premise is that the free market is a perfect scientific system, one in which individuals, acting on their own self-interested desires, create the maximum benefits for all. It follows ineluctably that if something is wrong with a free-market economy — high inflation or soaring unemployment — it has to be because the market is not truly free.”

The Shock Doctrine

The scientific method requires that theories be falsifiable:  you have to be able to objectively prove them wrong.

“The philosopher Karl Popper argued that what distinguishes a scientific theory from pseudoscience and pure metaphysics is the possibility that it might be falsified on exposure to empirical data. In other words, a theory is scientific if it has the potential to be proved wrong.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.

But how do you prove an economic theory based on “noncontaminated capitalism” in an economically contaminated world?

“The challenge for Friedman and his colleagues was not to prove that a real world market could live up to their rapturous imaginings…. Friedman could not point to any living economy that proved that if all ‘distortions’ were stripped away, what would be left would be a society in perfect health and bounteous, since no country in the world met the criteria for perfect laissez-faire. Unable to test their theories in central banks and ministries of trade, Friedman and his colleagues had to settle for elaborate and ingenious mathematical equations and computer models.”

The Shock Doctrine

Mathematical equations and computer models aren’t the same as empirical data collected in the real (“contaminated”) world. If falsifiability is what separates scientific knowledge from belief-based ideology, then Friedman’s free market theory is the latter. Some scientists are worried that this spin on scientific theorizing has become too prevalent nowadays:

 “In our post-truth age of casual lies, fake news and alternative facts, society is under extraordinary pressure from those pushing potentially dangerous antiscientific propaganda – ranging from climate-change denial to the anti-vaxxer movement to homeopathic medicines. I, for one, prefer a science that is rational and based on evidence, a science that is concerned with theories and empirical facts, a science that promotes the search for truth, no matter how transient or contingent. I prefer a science that does not readily admit theories so vague and slippery that empirical tests are either impossible or they mean absolutely nothing at all…. For me at least, there has to be a difference between science and pseudoscience; between science and pure metaphysics, or just plain ordinary bullshit.”

But Is It Science?

The Chicago School believed so ardently in the free market theory that its instructional approach took on the dynamics of belief-based indoctrination:

“Frank Knight, one of the founders of Chicago School economics, thought professors should ‘inculcate’ in their students the belief that each economic theory is ‘a sacred feature of the system,’ not a debatable hypothesis.”

The Shock Doctrine

This dynamic applies to every ideology that can’t be falsified — that can’t be tested against empirical evidence. The ideology then becomes a fundamentalist belief system:

“Like all fundamentalist faiths, Chicago School economics is, for its true believers, a closed loop. The Chicago solution is always the same:  a stricter and more complete application of the fundamentals.”

The Shock Doctrine

Journalist Chris Hedges describes the dynamics of “secular fundamentalism” in I Don’t Believe in Atheists. (The book’s title is too clever for its own good — a later version adds the subtitle “The Dangerous Rise of the Secular Fundamentalist.”)

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others…. The core belief systems of these secular and religious antagonists are identical.”

Thus we have Nobel Prize-winning economist Milton Friedman famously saying, “Underlying most arguments against the free market is a lack of belief in freedom itself” — a statement entirely in keeping with the Mont Pelerin Society’s idealistic Statement of Aims, which we looked at last time.

And thus we also have Nobel Prize-winning economist Joseph Stiglitz countering with his thoughts about economics in a contaminated (“pathological”) world:

“The advocates of free markets in all their versions say that crises are rare events, though they have been happening with increasing frequency as we change the rules to reflect beliefs in perfect markets. I would argue that economists, like doctors, have much to learn from pathology. We see more clearly in these unusual events how the economy really functions. In the aftermath of the Great Depression, a peculiar doctrine came to be accepted, the so-called ‘neoclassical synthesis.’ It argued that once markets were restored to full employment, neoclassical principles would apply. The economy would be efficient. We should be clear: this was not a theorem but a religious belief.”

As we also saw last time, historical socialism and communism join free market capitalism in their fundamentalist zeal. In fact, some think that economics in general has become today’s dominant cultural form of belief-based thinking. More on that next time.

If You Like This, You Might Like…

I created a new blog. I want to tell you about it, and invite you to follow it.

I’ve spent the past ten years writing books, blogs, and articles on technology, jobs, economics, law, personal growth, cultural transformation, psychology, neurology, fitness and health… all sprinkled with futurism. In all those seemingly unrelated topics, I’ve been drawn to a common theme:  change. One lesson stands out:

Beliefs create who we are individually and collectively.
The first step of change is to be aware of them.
The second step is to leave them behind.

Beliefs inform personal and collective identity, establish perspective, explain biases, screen out inconsistent information, attract conforming experience, deflect non-conforming information and experience, and make decisions for us that we only rationalize in hindsight. Those things are useful:  beliefs tame the wild, advance civilization, help us locate our bewildered selves, and draw us into protective communities.

We need that to survive and thrive.  But if we’re after change, beliefs can be too much of a good thing. They make us willfully blind, show us only what we will see and hide what we won’t. They build our silos, sort us into polarities, close our minds, cut us off from compassion, empathy, and meaningful discourse.

We need to become iconoclasts.

The Online Etymology Dictionary says that “iconoclast” literally meant “breaker or destroyer of images,” originally referring to religious zealots who vandalized icons in Catholic and Orthodox churches because they were “idols.” Later, the meaning was broadened to “one who attacks orthodox beliefs or cherished institutions.”

Our beliefs are reflected, transmitted, and reinforced in our religious, national, economic, and other cultural institutions. These become our icons, and we cherish them, invest them with great dignity, revere them as divine, respect them as Truth with a capital T, and fear their wrath if we neglect or resist them. We confer otherworldly status on them, treat them as handed down from an untouchable level of reality that supersedes our personal agency and self-efficacy. We devote ourselves to them, grant them unquestioned allegiance, and chastise those who don’t bow to them alongside us.

Doing that, we forget that our icons only exist because they were created out of belief in the first place. In the beginning, we made them up. From there, they evolved with us. To now and then examine, challenge, and reconfigure them and the institutions that sustain them is an act of creative empowerment — one of the highest and most difficult gifts of being human.

Change often begins when that still small voice pipes up and says, “Maybe not. Maybe something else is possible.” We are practiced in ignoring it; to become an iconoclast requires that we listen, and question the icons that warn us not to. From there, thinking back to the word’s origins, I like “challenge” better than “attack.”  I’m not an attacker by nature, I’m an essayist — a reflective, slow thinker who weighs things and tries to make sense of them. I’m especially not a debater or an evangelist — I’m not out to convince or convert anyone, and besides, I lack the quick-thinking mental skillset.

I’m also not an anarchist, libertarian, revolutionary… not even a wannabe Star Wars rebel hero, cool as that sounds. I was old enough in the ’60s to party at the dawning of the Age of Aquarius, but then it failed like all the other botched utopias — exposed as one more bogus roadmap claiming to chart the way back to the Garden.

Sorry, but the Garden has been closed for a long, long time.

A friend used to say, “Some open minds ought to close for business.” Becoming an iconoclast requires enough open-mindedness to suspend the status quo long enough to consider that something else is possible. That isn’t easy, but it is the essential beginning of change, and it can be done.

Change needs us to be okay with changing our minds.

All the above is what I had in mind when I created Iconoclast.blog. I am aware of its obvious potential for inviting scoffing on a good day, embarrassment and shaming on a worse, and vituperation, viciousness, trolling, and general spam and nastiness on the worst. (Which is why I disabled comments on the blog, and instead set up a Facebook page that offers ample raving opportunity.) Despite those risks, I plan to pick up some cherished icons and wonder out loud what might be possible in their absence. If you’re inclined to join me, then please click the follow button. I would enjoy the company.

There’s No Such Thing as a Free Lunch — True or False?

Images: “free lunch” quotes from Milton Friedman and Stephen Hawking.

We can assume that the pros and cons of a universal basic income (UBI) have been thoroughly researched and reasonably analyzed, and that each side holds its position with utmost conviction.

We can also assume that none of that reasonableness and conviction will convert anyone from one side to the other, or win over the uncommitted. Reason doesn’t move us:  we use it to justify what we already decided, based on what we believe. See “Why Facts Don’t Change Our Minds,” The New Yorker (February 2017) and “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

History doesn’t guide us either — see Why We Refuse to Learn From History, from Big Think, and Why Don’t We Learn From History?, from military historian Sir Basil Henry Liddell Hart. The latter is full of conventional wisdom:

“The most instructive, indeed the only method of learning to bear with dignity the vicissitudes of fortune, is to recall the catastrophes of others.

“History is the best help, being a record of how things usually go wrong.

“There are two roads to the reformation for mankind— one through misfortunes of their own, the other through the misfortunes of others; the former is the most unmistakable, the latter the less painful.

“I would add that the only hope for humanity, now, is that my particular field of study, warfare, will become purely a subject of antiquarian interest. For with the advent of atomic weapons we have come either to the last page of war, at any rate on the major international scale we have known in the past, or to the last page of history.”

Good advice, maybe, but we’ve heard it before, and besides, most of us would rather make our own mistakes.

If reasoned analysis and historical perspective don’t inform our responses to radically new ideas like UBI, then what does? Many things, but cultural belief is high on the list. Policy is rooted in culture, culture is rooted in shared beliefs, and beliefs are rooted in history. Cultural beliefs shape individual bias, and the whole belief system becomes sacred in the culture’s mythology. Try to subvert cultural beliefs, and the response is outrage and entrenchment.

All of which means that each of us probably had a quick true-or-false answer to the question in this week’s blog post title, and was ready to defend it with something that sounded reasonable. Our answer likely signals our knee-jerk response to the idea of UBI. The “free lunch” — or, more accurately, “free money” — issue appears to be the UBI Great Divide:  get to that point, and you’re either pro or con, and there’s no neutral option. (See this for more about where the “no free lunch” phrase came from.[1])

The Great Divide is what tanked President Nixon’s UBI legislation. The plan, which would have paid a family of four $1,600/year (equivalent to $10,428 today), was set to launch in the midst of an outpouring of political self-congratulation and media endorsement, only to be scuttled by a memo from a White House staffer that described the failure of a British UBI experiment 150 years earlier. UBI apparently was in fact a free lunch, with no redeeming social purpose; thus its fate was sealed.

As it turns out, whether the experiment failed or not was lost in a 19th-century fog of cultural belief, which enabled opponents of the experiment to pounce on a bogus report about its impact to justify passing the Poor Law Amendment Act of 1834 — which is what they wanted to do anyway. The new Poor Law was that era’s version of workfare, and was generated by the worst kind of scarcity mentality applied to the worst kind of scarcity. Besides creating the backdrop to Charles Dickens’ writing, the new Poor Law’s philosophical roots still support today’s welfare system:

“The new Poor Law introduced perhaps the most heinous form of ‘public assistance’ that the world has ever witnessed. Believing the workhouses to be the only effective remedy against sloth and depravity, the Royal Commission forced the poor into senseless slave labor, from breaking stones to walking on treadmills….”

From “The Bizarre Tale Of President Nixon’s Basic Income Plan.”

If UBI is a free lunch, then it’s an affront to a culture that values self-sufficiency. If it isn’t, then it requires a vastly different cultural value system to support it. The former believes that doing something — “making a living” at a job — is how you earn your daily bread. The latter believes you’re entitled to sustenance if you are something:  i.e., a citizen or member of the nation, state, city, or other institution or community providing the UBI. The former is about activity, the latter is about identity. This Wired article captures the distinction:

“The idea [of UBI] is not exactly new—Thomas Paine proposed a form of basic income back in 1797—but in this country, aside from Social Security and Medicare, most government payouts are based on individual need rather than simply citizenship.”

UBI is about “simply citizenship.” It requires a cultural belief that everybody in the group shares its prosperity.  Cultural identity alone ensures basic sustenance — it’s a right, and that right makes Poor Laws and workfare obsolete.

The notion of cultural identity invites comparison between UBI and the “casino money” some Native American tribes pay their members. How’s that working? We’ll look at that next time.

[1] Yes, Milton Friedman did in fact say it, although he wasn’t the only one. And in a surprising twist, he has been criticized for advocating his own version of UBI.