Can Capitalism Buy Happiness? [2]

We’ve been looking at the zero-sum economy’s winners and losers — the new “meritocracy” vs. the “precariat” and the Millennials.

We’ve also seen that winners and losers find common ground in higher education, where students of all stripes are increasingly stressed to the point of mental ill-health — not by the demands of higher learning, but by the enveloping culture of hyper-competitive capitalism.

One predictable response has been for the established, older, prosperous, and powerful to wag the shame finger and tell the kids to quit whining and buck up:

“Student protests and demands for better mental health services are frequently dismissed in the press. ‘We just can’t cope with essay deadlines, and tests stress us out, moan snowflake students,’ read a headline in the Daily Mail in November 2017. In September 2018, the Times described today’s students as ‘Generation Snowflake’ and suggested that ‘helicopter parents’ had ‘coddled the minds’ of young people.”

“The way universities are run is making us ill: inside the student mental health crisis,” The Guardian (Sept. 27, 2019).

Truth is, we just don’t like to talk about mental illness, and if we regard it at all, we tend to shoo it away as a personal problem or character flaw. Plus, there are enduring cultural myths that capitalism and its marketplace are “free,” and that anyone can make it with enough gumption. Together, these attitudes foster the “snowflake” judgment.

Mental illness is ultimately about a clash between the “reality” of the individual deemed to be mentally ill and the “reality” of the prevailing culture.[1] Conventional thinking sides with the culture, and uses pharmaceutical and other therapeutic interventions to realign the individual. As a result, the list of economic stressors is accepted as part of the culture’s normal life, to which individuals are expected to conform.

Meanwhile, viewed on its own terms — outside of its cultural context — the list itself is long and dismaying. For example:

  • There has been a forty-year drought in middle-class real income growth, with most households drifting downward while an economic elite soars at the top.
  • The share of Americans who count as poor or low-income by federal standards is approaching 50% — meaning they have limited or no access to what were historically considered “public goods” such as shelter and sustenance, education and healthcare, etc.
  • Public safety nets have been replaced by the privatization of essential services. The social services that remain are expensive for the government to administer and demeaning and counter-productive for recipients.
  • Soaring educational costs mean soaring, strangling student loans.
  • Runaway housing costs have put conventional home ownership out of reach for the lower economic classes.
  • With the rise of the “rentier” economy, the general public must increasingly pay capital holders for the use and enjoyment of essential resources and intellectual property.
  • Upward mobility for the lower 90% is now a thing of the past (the “glass ceiling”), while the top 10% is protected against drifting downward (the “glass floor”).
  • Touted “job creation” is mostly “gig economy” contract work, with no assurance of sustainability and no benefits such as healthcare or retirement.
  • Prospects for sustainable income are bleak, and the new job market demands the “hustle,” the “grind,” and the monetization of everything in a state of “total work.”
  • Meanwhile, GDP “growth” largely reflects production shifted not just offshore but to intelligent machines; the benefits accrue to capital holders, not wage-earners.
  • These job trends have bred social isolation and an unfulfilled struggle to find meaning and purpose at work.
  • A new generation of huge and powerful “corporate nation-states” now challenges conventional notions of national sovereignty, democracy, and policy-making.
  • The same is true of “philanthrocapitalism” and “social entrepreneurship.”

And there’s more.

While “snowflake” judgments turn a blind eye, a counter-commentary has emerged over the past several years that looks at the list systemically: it examines how the capitalistic over-culture creates social mental ill-health, which is then transmitted to the individual. That is, it asks whether the culture’s assimilation of contemporary capitalistic belief and practice has become toxic to the point that it is making both society and its individual members sick. This is a huge shift in perspective, and one we’ll explore further.

[1] For more on how cultural beliefs create collective reality, you might take a look at this article, which evaluates mental health diagnosis and treatment in light of the Cartesian worldview that still dominates the western world: i.e., the dualistic thinking that separates the natural world, which can be known scientifically, from the realm of soul or spirit, which can’t. I have talked about how cultural beliefs create social reality in prior blog series in this forum, and I also address it in my other blog.

If You Like This, You Might Like…

I created a new blog. I want to tell you about it, and invite you to follow it.

I’ve spent the past ten years writing books, blogs, and articles on technology, jobs, economics, law, personal growth, cultural transformation, psychology, neurology, fitness and health… all sprinkled with futurism. In all those seemingly unrelated topics, I’ve been drawn to a common theme: change. One lesson stands out:

Beliefs create who we are individually and collectively.
The first step of change is to be aware of them.
The second step is to leave them behind.

Beliefs inform personal and collective identity, establish perspective, explain biases, screen out inconsistent information, attract conforming experience, deflect non-conforming information and experience, and make decisions for us that we only rationalize in hindsight. Those things are useful: beliefs tame the wild, advance civilization, help us locate our bewildered selves, and draw us into protective communities.

We need that to survive and thrive. But if we’re after change, beliefs can be too much of a good thing. They make us willfully blind, showing us only what we will see and hiding what we won’t. They build our silos, sort us into polarities, close our minds, and cut us off from compassion, empathy, and meaningful discourse.

We need to become iconoclasts.

The Online Etymology Dictionary says that “iconoclast” originally meant “breaker or destroyer of images,” referring to the religious zealots who vandalized icons in Catholic and Orthodox churches because they considered them “idols.” Later the meaning broadened to “one who attacks orthodox beliefs or cherished institutions.”

Our beliefs are reflected, transmitted, and reinforced in our religious, national, economic, and other cultural institutions. These become our icons, and we cherish them, invest them with great dignity, revere them as divine, respect them as Truth with a capital T, and fear their wrath if we neglect or resist them. We confer otherworldly status on them, treat them as handed down from an untouchable level of reality that supersedes our personal agency and self-efficacy. We devote ourselves to them, grant them unquestioned allegiance, and chastise those who don’t bow to them alongside us.

Doing that, we forget that our icons only exist because they were created out of belief in the first place. In the beginning, we made them up. From there, they evolved with us. To now and then examine, challenge, and reconfigure them and the institutions that sustain them is an act of creative empowerment — one of the highest and most difficult gifts of being human.

Change often begins when that still, small voice pipes up and says, “Maybe not. Maybe something else is possible.” We are practiced in ignoring it; becoming an iconoclast requires that we listen, and that we question the icons that warn us not to. Thinking back to the word’s origins, I like “challenge” better than “attack.” I’m not an attacker by nature; I’m an essayist — a reflective, slow thinker who weighs things and tries to make sense of them. I’m especially not a debater or an evangelist — I’m not out to convince or convert anyone, and besides, I lack the quick-thinking mental skillset.

I’m also not an anarchist, libertarian, or revolutionary… not even a wannabe Star Wars rebel hero, cool as that sounds. I was old enough in the ’60s to party at the dawning of the Age of Aquarius, but then it failed like all the other botched utopias — exposed as one more bogus roadmap claiming to chart the way back to the Garden.

Sorry, but the Garden has been closed for a long, long time.

A friend used to say, “Some open minds ought to close for business.” Becoming an iconoclast requires enough open-mindedness to suspend the status quo long enough to consider that something else is possible. That isn’t easy, but it is the essential beginning of change, and it can be done.

Change needs us to be okay with changing our minds.

All the above is what I had in mind when I created Iconoclast.blog. I am aware of its obvious potential for inviting scoffing on a good day, embarrassment and shaming on a worse one, and vituperation, viciousness, trolling, and general spam and nastiness on the worst. (Which is why I disabled comments on the blog and instead set up a Facebook page that offers ample raving opportunity.) Despite those risks, I plan to pick up some cherished icons and wonder out loud what might be possible in their absence. If you’re inclined to join me, please click the follow button. I would enjoy the company.

There’s No Such Thing as a Free Lunch — True or False?

(Images: Milton Friedman and Stephen Hawking on the “free lunch” question.)

We can assume that the pros and cons of a universal basic income (UBI) have been thoroughly researched and reasonably analyzed, and that each side holds its position with utmost conviction.

We can also assume that none of that reasonableness and conviction will convert anyone from one side to the other, or win over the uncommitted. Reason doesn’t move us: we use it to justify what we already decided, based on what we believe. See “Why Facts Don’t Change Our Minds,” The New Yorker (February 2017), and “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

History doesn’t guide us either — see “Why We Refuse to Learn From History” from Big Think, and “Why Don’t We Learn From History?” by military historian Sir Basil Henry Liddell Hart. The latter is full of conventional wisdom:

“The most instructive, indeed the only method of learning to bear with dignity the vicissitude of fortune, is to recall the catastrophes of others.

“History is the best help, being a record of how things usually go wrong.

“There are two roads to the reformation for mankind— one through misfortunes of their own, the other through the misfortunes of others; the former is the most unmistakable, the latter the less painful.

“I would add that the only hope for humanity, now, is that my particular field of study, warfare, will become purely a subject of antiquarian interest. For with the advent of atomic weapons we have come either to the last page of war, at any rate on the major international scale we have known in the past, or to the last page of history.”

Good advice, maybe, but we’ve heard it before, and besides, most of us would rather make our own mistakes.

If reasoned analysis and historical perspective don’t inform our responses to radically new ideas like UBI, then what does? Many things, but cultural belief is high on the list. Policy is rooted in culture, culture is rooted in shared beliefs, and beliefs are rooted in history. Cultural beliefs shape individual bias, and the whole belief system becomes sacred in the culture’s mythology. Try to subvert cultural beliefs, and the response is outrage and entrenchment.

All of which means that each of us probably had a quick true-or-false answer to the question in this week’s blog post title, and was ready to defend it with something that sounded reasonable. Our answer likely signals our knee-jerk response to the idea of UBI. The “free lunch” issue (or, more accurately, the “free money” issue) appears to be the UBI Great Divide: get to that point, and you’re either pro or con, with no neutral option. (See this for more about where the “no free lunch” phrase came from.[2])

The Great Divide is what tanked President Nixon’s UBI legislation. The plan, which would have paid a family of four $1,600 a year (equivalent to $10,428 today), was set to launch amid an outpouring of political self-congratulation and media endorsement, only to be scuttled by a memo from a White House staffer describing the failure of a British UBI experiment 150 years earlier. UBI, it seemed, was in fact a free lunch with no redeeming social purpose; thus its fate was sealed.
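
As a rough sanity check on that inflation adjustment (my arithmetic, not the article’s): the two figures imply a price-level multiplier of about 6.5, which is consistent with a standard CPI adjustment from 1969 to the mid-2010s, assuming approximate CPI-U annual averages of 36.7 for 1969 and 240 for 2016:

\[
\frac{\$10{,}428}{\$1{,}600} \approx 6.5,
\qquad
\$1{,}600 \times \frac{\mathrm{CPI}_{2016}}{\mathrm{CPI}_{1969}} \approx \$1{,}600 \times \frac{240}{36.7} \approx \$10{,}460
\]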

As it turns out, whether the experiment failed or not was lost in a 19th-century fog of cultural belief, which enabled opponents of the experiment to pounce on a bogus report about its impact to justify passing the Poor Law Amendment Act of 1834 — which is what they wanted to do anyway. The new Poor Law was that era’s version of workfare, generated by the worst kind of scarcity mentality applied to the worst kind of scarcity. Besides creating the backdrop to Charles Dickens’s writing, the new Poor Law’s philosophical roots still support today’s welfare system:

“The new Poor Law introduced perhaps the most heinous form of ‘public assistance’ that the world has ever witnessed. Believing the workhouses to be the only effective remedy against sloth and depravity, the Royal Commission forced the poor into senseless slave labor, from breaking stones to walking on treadmills….”

From “The Bizarre Tale of President Nixon’s Basic Income Plan.”

If UBI is a free lunch, then it’s an affront to a culture that values self-sufficiency. If it isn’t, then it requires a vastly different cultural value system to support it. The former believes that doing something — “making a living” at a job — is how you earn your daily bread. The latter believes you’re entitled to sustenance if you are something: i.e., a citizen or member of the nation, state, city, or other institution or community providing the UBI. The former is about activity; the latter is about identity. This Wired article captures the distinction:

“The idea [of UBI] is not exactly new—Thomas Paine proposed a form of basic income back in 1797—but in this country, aside from Social Security and Medicare, most government payouts are based on individual need rather than simply citizenship.”

UBI is about “simply citizenship.” It requires a cultural belief that everybody in the group shares its prosperity. Cultural identity alone ensures basic sustenance — it’s a right, and that right makes Poor Laws and workfare obsolete.

The notion of cultural identity invites comparison between UBI and the “casino money” some Native American tribes pay their members. How’s that working? We’ll look at that next time.

[2] Yes, Milton Friedman did in fact say it, although he wasn’t the only one. And in a surprising twist, he has been criticized for advocating his own version of UBI.