Archeconomics

I made up the term “archeconomics.” I’m using “arch” in the sense of “first principles” — e.g., as in “archetype.” An “arch” is the larger version of the smaller expressions of itself — e.g., not just a villain but an arch-villain, not just an angel but an archangel. Life goes big when an arch-something is at work:  experience expands beyond circumstance, meaning magnifies, significance is exaggerated.

Archeconomics is therefore the larger story behind economics.

I ended last week’s post by referring to the larger story behind the rentier economy. As usually happens when I’m on a research trail, several commentaries have appeared in my various feeds lately that look beyond the usual opinionated mash of current events and instead address over-arching ideas and issues. All of them deal in one way or another with the current status and possible future of the liberal worldview — an arch-topic if there ever was one.

The term “liberal” in this context doesn’t refer to political liberal vs. conservative, but rather to historical liberalism, which among other things gave us post-WWII neoliberal economics. Mega-bestselling author Yuval Noah Harari describes this kind of liberalism in his latest book 21 Lessons for the 21st Century:

“In Western political discourse the term “liberal” is sometimes used today in a much narrower sense, to denote those who support specific causes such as gay marriage, gun control, and abortion rights. Yet most so-called conservatives also embrace the broad liberal worldview.

“The liberal story cherishes human liberty as its number one value. It argues that all authority ultimately stems from the free will of individual humans, as expressed in their feelings, desires, and choices. In politics, liberalism believes that the voter knows best. It therefore upholds democratic elections. In economics, liberalism maintains that the customer is always right. It therefore hails free-market principles. In personal matters, liberalism encourages people to listen to themselves, be true to themselves, and follow their hearts — as long as they do not infringe on the liberties of others. This personal freedom is enshrined in human rights.”

If you read Harari’s books Sapiens and Homo Deus, you have a sense of what you’ll find in 21 Lessons, but I found it worth reading on its own terms. Two recent special magazine editions also take on the fate of liberalism:  “Is Democracy Dying?” from The Atlantic and “A Manifesto for Renewing Liberalism” from The Economist. The titles speak for themselves, and both are offered by publications with nearly two centuries of liberal editorial perspectives.

Another historical liberal offering from a conservative political point of view is “How Trumpism Will Outlast Trump,” from Time Magazine. Here’s the article’s précis:

“These intellectuals are committed to a new economic nationalism … They’re looking past Trump … to assert a fundamental truth: whatever you think of him, Donald Trump has shown a major failing in the way America’s political parties have been serving their constituents. The future of Trump’s revolution may depend on whether this young group can help fix the economy.”

Finally, here’s a trio of offerings that invoke environmental economics — the impact of the global ecology on global economics being another archeconomics topic. The first is a scientific study published last week that predicted significant environmental degradation within a surprisingly short time. Second is an article about the study that wants to know “Why We Keep Ignoring Even the Most Dire Climate Change Warnings.” Third is last week’s announcement that the winner of this year’s Nobel Prize in Economics is an environmental economist.

Some or all of those titles should satisfy if you’re in the mood for some arch-reading.

Next time, we’ll return to plain old economics, with a look at how the low-income social strata are faring in all the dust-up over rentiers and economic inequality, robotics and machine learning, and the sagging paycheck going to human labor.

The Matthew Effect

“For to everyone who has will more be given, and he will have abundance;
but from him who has not, even what he has will be taken away.”

The Gospel of Matthew 25:29, Revised Standard Version

Economists call it the Matthew Effect or the Matthew Principle. Columbia sociologist Robert K. Merton used the former when he coined the term[1] by reference to its Biblical origins.[2] The more pedestrian version asserts that the rich get richer while the poor get poorer.

According to the Matthew Effect, social capital is better caught than taught, better inherited than achieved. That notion is borne out by current economic and demographic data[3] showing that the only children with a statistically relevant shot at experiencing a better standard of living than their parents are the ones born with a silver spoon in their mouths — or, as David Graeber says in Bullshit Jobs, the ones “from professional backgrounds,” where they are taught essential social capital mindsets and skills “from an early age.”[4]

Statistics are susceptible to ideological manipulation, but bell curves conceptualize trends into observable laws of societal thermodynamics. The Matthew Effect bell curve says it’s harder to get to the top by following the Horatio Alger path:  you’re starting too many standard deviations out; your odds are too low. On the other hand, if you start in the center — i.e., you’re born into the top — odds are you’ll stay there.
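Merton’s point is as much mechanical as moral: advantage compounds. Here’s a minimal simulation sketch — a generic rich-get-richer (Pólya-urn-style) process with made-up starting values, my illustration rather than anything from Merton’s papers:

```python
import random

# A minimal rich-get-richer sketch (a Polya-urn-style process).
# Starting wealths and the number of rounds are made-up values.
def matthew_effect(wealth, rounds=1000, seed=42):
    rng = random.Random(seed)
    for _ in range(rounds):
        # Each new unit of wealth lands on a person with probability
        # proportional to current holdings: "to everyone who has
        # will more be given."
        winner = rng.choices(range(len(wealth)), weights=wealth)[0]
        wealth[winner] += 1
    return wealth

# Ten people; person 0 starts with twice everyone else's stake.
print(matthew_effect([2] + [1] * 9))
```

Run it with a few different seeds: person 0’s one-unit head start translates, on average, into double everyone else’s final wealth — nobody cheats, nobody works harder, the process simply feeds the biggest pile.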

That might depend, however, on how long your forebears have been members of the club. Globetrotting wealth guru Jay Hughes has spoken and written widely of the concept of “shirt sleeves to shirt sleeves in three generations.” According to the aphorism, if the first generation of a family follows the Horatio Alger path to wealth, there’s a 70% chance the money will be gone by the end of the third generation, which means the social capital will be gone as well. That first generation might defy the odds through hard work and luck, but odds are they won’t create an enduring legacy for their heirs.

[image: guy in a suit driving a tractor]

My own law career was an exercise in another folk expression of the Matthew Effect:  “you can take the boy out of the country but you can’t take the country out of the boy.” (No, that’s not me in the photo — I just thought it made the point nicely.) My career finally hit its stride when I created a small firm serving “millionaire next door” clients — farmers, ranchers, and Main Street America business owners who became financially successful while remaining in the social milieu where they (and I) began. Nearly all of those families created their wealth during the post-WWII neoliberal economic surge, and are now entering the third generation. I wonder how many are experiencing the shirt sleeves aphorism.

Curiously, my transition out of law practice was also dominated by social capital considerations — in particular, a social capital misfiring. I had a big idea and some relevant skills (i.e., some relevant human capital — at least other people thought so), but lacked the social capital and failed to make the personal transformation essential to my new creative business venture.[5]

[image: rocky field]

In fact, it seems the Matthew Effect might be a larger theme in my life, not just my legal career. In that regard, I was surprised to find yet another one of my job stories in Bullshit Jobs. This one was about a townie who took a job as a farm laborer. His job included “picking rocks,” which involves tackling a rocky field with a heavy pry bar, sledge hammer, pick axe, spade, and brute strength, in an effort to remove the large rocks and make it tillable. I’d had that job, too. I was a teenager at the time, and it never occurred to me that it might be “completely pointless, unnecessary, or pernicious” (Graeber’s definition), which is how the guy in the book felt about it. In fact, when I told my parents over dinner about my first day of picking rocks, my dad was obviously so proud I thought he was going to run out and grill me a steak. Evidently I’d passed some kind of rite of passage.

Picking rocks is just part of what you do if you work the land, and there’s nothing meaningless about it. I enjoyed it, actually — it was great training for the upcoming football season. I can scarcely imagine what my law career and life might have been like if I’d felt the same way about my first years of legal work as I did about picking rocks.

The Matthew Effect has far-reaching social, economic, legal, and ethical implications for the legal profession, where social capital is an important client- and career-development asset. Next time we’ll look at another lawyer who, like David Boies, rose from humble origins to superstar status, and whose story brings a whole new set of upward mobility issues to the table.

 

[1] Merton was originally trying to describe how it is that better-known people get credit for things their subordinates do — for example, professors taking credit for the work of their research assistants — the professors enriching their credentials at the expense of their minions’ hard and anonymous work. Merton might just as well have been talking about law partners taking credit for the work of paralegals, law clerks, and associates.

[2] As for why “Matthew” when the other Synoptic Gospels (Mark and Luke) have the same verse, I suspect that’s partly because Matthew is the first book in the New Testament canon, but it may also substantiate a derivative application of Merton’s law by U of Chicago super-statistician Stephen Stigler, known as Stigler’s Law of Eponymy, which holds that “no scientific discovery is named after its original discoverer.” I.e., later arrivals collect the accolades the “original discoverer” never did. In that regard, Mark’s gospel is believed to have been written first, with Matthew’s and Luke’s coming later and deriving from it. That would make Mark the true original discoverer. That this economic phenomenon is not called the “Mark Effect” is therefore another example of Stigler’s law.

[3] See, e.g., the “Fading American Dream” graph and the “Geography of Upward Mobility in America” map in this NPR article.

[4] The phenomenon has been widely reported. See this study from Stanford and our trio of new Meritocrats from a few weeks back. The first was Richard V. Reeves, with his book Dream Hoarders and his Brookings Institution monograph Saving Horatio Alger (we looked at those last time). The second was philosopher Matthew Stewart, author of numerous books and a recent article for The Atlantic called “The 9.9 Percent Is the New American Aristocracy.” The third was Steven Brill, founder of The American Lawyer and Court TV, author of the book Tailspin: The People and Forces Behind America’s Fifty-Year Fall — and Those Fighting to Reverse It and also the writer of a Time Magazine feature called “How Baby Boomers Broke America.”

[5]  I’ve told that story elsewhere, and won’t repeat it here, but if you’re interested in more on this issue, a look at that particular social capital disaster might be illustrative. See my book Life Beyond Reason:  A Memoir of Mania.

Utopia For Realists Cont’d.

“Like humor and satire, utopias throw open the windows of the mind.”

Rutger Bregman

Continuing with Rutger Bregman’s analysis of utopian thinking that we began last week:

“Let’s first distinguish between two forms of utopian thought. The first is the most familiar, the utopia of the blueprint. Instead of abstract ideals, blueprints consist of immutable rules that tolerate no discussion.

“There is, however, another avenue of utopian thought, one that is all but forgotten. If the blueprint is a high-resolution photo, then this utopia is just a vague outline. It offers not solutions but guideposts. Instead of forcing us into a straitjacket, it inspires us to change. And it understands that, as Voltaire put it, the perfect is the enemy of the good. As one American philosopher has remarked, ‘any serious utopian thinker will be made uncomfortable by the very idea of the blueprint.’

“It was in this spirit that the British philosopher Thomas More literally wrote the book on utopia (and coined the term). More understood that utopia is dangerous when taken too seriously. ‘One needs to believe passionately and also be able to see the absurdity of one’s own beliefs and laugh at them,’ observes philosopher and leading utopia expert Lyman Tower Sargent. Like humor and satire, utopias throw open the windows of the mind. And that’s vital. As people and societies get progressively older they become accustomed to the status quo, in which liberty can become a prison, and the truth can become lies. The modern creed — or worse, the belief that there’s nothing left to believe in — makes us blind to the shortsightedness and injustice that still surround us every day.”

Thus the lines are drawn between utopian blueprints grounded in dogma vs. utopian ideals arising from sympathy and compassion. Both begin with good intentions, but the pull of entropy is stronger with the former — at least, so says Rutger Bregman, and he’s got good company in Sir Thomas More and others. Blueprints require compliance, and their purveyors are zealously ready to enforce it. Ideals, on the other hand, inspire creativity, and creativity requires acting in the face of uncertainty, living with imperfection, responding with resourcefulness and resilience when best intentions don’t play out, and a lot of just plain showing up and grinding it out. I have a personal bias for coloring outside the lines, but I must confess that my own attempts to promote utopian workplace ideals have given me pause.

For years, I led interactive workshops designed to help people creatively engage with their big ideas about work and wellbeing — variously tailored for CLE ethics credits or for general audiences. I realized recently that, reduced to their essence, they employed the kinds of ideals advocated by beatnik-era philosopher and metaphysician Alan Watts. (We met him several months ago — he’s the “What would you do if money were no object?” guy.)

[image: Alan Watts cartoon]

The workshops generated hundreds of heartwarming “this was life-changing” testimonies, but I could never quite get over the nagging feeling that most of the participants hadn’t achieved escape velocity, and that come next Monday they would be back to the despair of “But everybody knows you can’t earn any money that way.”

I especially wondered about the lawyers, for whom “I hate my job but love my paycheck” was a recurrent theme. The post-WWII neoliberal economic tide floated the legal profession’s boat, too, but prosperity has done little for lawyer happiness and well-being. True, we’ve seen substantial quality-of-life changes in the profession recently (which I’ve blogged about in the past), but most have been around the edges, while lawyers’ overall workplace reality remains a bulwark of what one writer calls the “over-culture” — the overweening force of culturally accepted norms about how things are and should be — and the legal over-culture has stepped in line with the worldwide workplace trend of favoring wealth over a sense of meaning and value.

Alan Watts’ ideals were widely adopted by the burgeoning self-help industry, which also rode the neoliberal tide to prosperous heights. Self-help tends to be long on inspiration and short on grinding, and sustainable creative change requires large doses of both. I served up both in the workshops, but still wonder if they were just too… well, um… beatnik… for the legal profession. I’ll never know — the guy who promoted the workshops retired, and I quit doing them. If nothing else, writing this series has opened my eyes to how closely law practice mirrors worldwide economic and workplace dynamics. We’ll look more at that in the coming weeks.

Utopia


“Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.”

John Maynard Keynes

We met law professor and economics visionary James Kwak a few months ago. In his book Economism: Bad Economics and the Rise of Inequality (2017), he tells this well-known story about John Maynard Keynes:

“In 1930, John Maynard Keynes argued that, thanks to technological progress, the ‘economic problem’ would be solved in about a century and people would only work fifteen hours per week — primarily to keep themselves occupied. When freed from the need to accumulate wealth, the human life would change profoundly.”

This passage is from Keynes’ 1930 essay, “Economic Possibilities for our Grandchildren”:

“I see us free, therefore, to return to some of the most sure and certain principles of religion and traditional virtue–that avarice is a vice, that the exaction of usury is a misdemeanor, and the love of money is detestable, that those walk most truly in the paths of virtue and sane wisdom who take least thought for the morrow. We shall once more value ends above means and prefer the good to the useful. We shall honour those who can teach us how to pluck the hour and the day virtuously and well, the delightful people who are capable of taking direct enjoyment in things, the lilies of the field who toil not neither do they spin.”
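The arithmetic behind the prediction was straightforward compounding: Keynes reckoned in the same essay that the standard of life would be “between four and eight times as high” a hundred years on. Here’s a back-of-the-envelope sketch — the growth rates and the 48-hour baseline are my illustrative assumptions, not Keynes’ calculations — of how that compounding cashes out as a shorter workweek:

```python
# Back-of-the-envelope version of Keynes' compounding argument.
# The growth rates and the 48-hour baseline are illustrative
# assumptions, not figures from the essay.
def implied_workweek(baseline_hours, annual_growth, years=100):
    # If output per hour compounds at annual_growth, this many hours
    # of work buy the 1930 standard of living `years` later.
    return baseline_hours / (1 + annual_growth) ** years

for g in (0.014, 0.021):  # roughly 4x and 8x output per hour in a century
    print(f"{g:.1%}/yr -> {(1 + g) ** 100:.1f}x output, "
          f"{implied_workweek(48, g):.0f}-hour week")
```

On those illustrative assumptions, a 48-hour week shrinks to somewhere between six and twelve hours — close enough to Keynes’ fifteen that the compounding itself was plausible; the miss lay in what we did with the gains.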

The timing of Keynes’ essay is fascinating:  he wrote it in the immediate wake of the 1929 stock market crash, as the Great Depression was rolling out. Today, it seems as though his prediction wasn’t just out of time, it was just plain wrong. Plus, it was undeniably utopian — which for most of us is usually a warning sign. Someone says “utopia,” and we automatically hear “dystopia,” which is where utopias usually end up after “reproduc[ing] many of the same tyrannies that people were trying to escape: egoism, power struggles, envy, mistrust and fear.” “Utopia, Inc.,” Aeon Magazine.

[image: commune family]

It’s just another day in paradise
As you stumble to your bed
You’d give anything to silence
Those voices ringing in your head
You thought you could find happiness
Just over that green hill
You thought you would be satisfied
But you never will…

The Eagles

To be fair, the post-WWII surge truly was a worldwide feast of economic utopia, served up mostly by the Mont Pelerin Society and other champions of neoliberal ideology. If they didn’t create the precise utopia Keynes envisioned, that’s because even the best ideas can grow out of time:  a growing international body of data, analysis, and commentary indicates that continued unexamined allegiance to neoliberalism is rapidly turning postwar economic utopia into its opposite.

But what if we actually could, if not create utopia, then at least root out some persistent strains of dystopia — things like poverty, lack of access to meaningful work, and a lopsidedly unequal income distribution? Kwak isn’t alone in thinking we could do just that, but getting there from here will require more than a new ideology to bump neoliberalism aside. Instead, we need an entirely new economic narrative, based on a new understanding of how the world works:

“Almost a century [after Keynes made his prediction], we have the physical, financial, and human capital necessary for everyone in our country to enjoy a comfortable standard of living, and within a few generations the same should be true of the entire planet. And yet our social organization remains the same as it was in the Great Depression:  some people work very hard and make more money than they will ever need, while many others are unable to find work and live in poverty.

“Real change will not be achieved by mastering the details of marginal costs and marginal benefits, but by constructing a new, controlling narrative about how the world works.”

Rooting out the persistent strains of economic dystopia in our midst will require a whole new way of thinking — maybe even some utopian thinking. If we’re going to go there, we’ll need to keep our wits about us. More on that next time.

Race Against the Machine

For the past several years, two MIT big thinkers[1] have been the go-to authorities in the scramble to explain how robotics, artificial intelligence, and big data are revolutionizing the economy and the working world. Their two books were published four and six years ago — so yesterday in the world of technology — but they were remarkably prescient when written, and have not diminished in relevance. They are:

Race Against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy (2012)

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (2014)

Click here for a chapter-by-chapter digest of The Second Machine Age, written by an all-star cast of economic commentators. Among other things, they acknowledge the authors’ view that neoliberal capitalism has not fared well in its dealings with the technological juggernaut, but that in the absence of a better alternative, we might as well continue to ride the horse in the direction it’s going.

While admitting that history (not human choice) is “littered with unintended… side effects of well-intentioned social and economic policies,” the authors cite Tim O’Reilly[2] in urging that we push forward with technology’s momentum rather than cling to the past or present. They suggest that we should let the technologies do their work and just find ways to deal with it. They are “skeptical of efforts to come up with fundamental alternatives to capitalism.”

David Rotman, editor of the MIT Technology Review, cites The Second Machine Age extensively in an excellent longer article, “How Technology Is Destroying Jobs.” Although the article is packed with contrary analysis and opinion, the following excerpts emphasize what many might consider the shadowy side of the street (compared to the sunny side we looked at in the past couple of posts). I added the headings below to emphasize that many of the general economic themes we’ve been talking about also apply to the specific dynamics of the job market.

It used to be that economic growth — including wealth creation — also created more jobs. It doesn’t work that way any more. Perhaps the most damning piece of evidence, according to Brynjolfsson, is a chart that only an economist could love. In economics, productivity—the amount of economic value created for a given unit of input, such as an hour of labor—is a crucial indicator of growth and wealth creation. It is a measure of progress. On the chart Brynjolfsson likes to show, separate lines represent productivity and total employment in the United States.

For years after World War II, the two lines closely tracked each other, with increases in jobs corresponding to increases in productivity. The pattern is clear: as businesses generated more value from their workers, the country as a whole became richer, which fueled more economic activity and created even more jobs. Then, beginning in 2000, the lines diverge; productivity continues to rise robustly, but employment suddenly wilts. By 2011, a significant gap appears between the two lines, showing economic growth with no parallel increase in job creation. Brynjolfsson and McAfee call it the “great decoupling.” And Brynjolfsson says he is confident that technology is behind both the healthy growth in productivity and the weak growth in jobs.
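To see how the two lines can come apart, here’s a toy calculation of my own — hypothetical numbers, not the data behind Brynjolfsson’s chart:

```python
# Toy illustration of the decoupling. Productivity is output per hour
# of labor, so output can grow briskly while hours and jobs barely
# move. All numbers are hypothetical.
def productivity(output, labor_hours):
    return output / labor_hours

# Decade A: growth comes mostly from adding workers.
# Decade B: growth comes mostly from automation.
out_a, hours_a, jobs_a = 1000.0, 100.0, 50
out_b, hours_b, jobs_b = 1600.0, 105.0, 51

print(f"productivity: {productivity(out_a, hours_a):.1f} -> "
      f"{productivity(out_b, hours_b):.1f}")  # 10.0 -> 15.2
print(f"jobs: {jobs_a} -> {jobs_b}")          # up only 2%
```

Output grows 60% while jobs grow 2%: productivity soars, employment wilts, and nothing in the arithmetic forces the two lines back together.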

A rising economic tide no longer floats all boats. The result is a skewed allocation of the rewards of growth away from jobs — i.e., economic inequality. The contention that automation and digital technologies are partly responsible for today’s lack of jobs has obviously touched a raw nerve for many worried about their own employment. But this is only one consequence of what Brynjolfsson and McAfee see as a broader trend. The rapid acceleration of technological progress, they say, has greatly widened the gap between economic winners and losers—the income inequalities that many economists have worried about for decades.

“[S]teadily rising productivity raised all boats for much of the 20th century,” [Brynjolfsson] says. “Many people, especially economists, jumped to the conclusion that was just the way the world worked. I used to say that if we took care of productivity, everything else would take care of itself; it was the single most important economic statistic. But that’s no longer true.” He adds, “It’s one of the dirty secrets of economics: technology progress does grow the economy and create wealth, but there is no economic law that says everyone will benefit.” In other words, in the race against the machine, some are likely to win while many others lose.

That robots, automation, and software can replace people might seem obvious to anyone who’s worked in automotive manufacturing or as a travel agent. But Brynjolfsson and McAfee’s claim is more troubling and controversial. They believe that rapid technological change has been destroying jobs faster than it is creating them, contributing to the stagnation of median income and the growth of inequality in the United States.

Meanwhile, technology is taking over the jobs that are left — blue collar, white collar, and even the professions. [I]mpressive advances in computer technology—from improved industrial robotics to automated translation services—are largely behind the sluggish employment growth of the last 10 to 15 years. Even more ominous for workers, the MIT academics foresee dismal prospects for many types of jobs as these powerful new technologies are increasingly adopted not only in manufacturing, clerical, and retail work but in professions such as law, financial services, education, and medicine.

Technologies like the Web, artificial intelligence, big data, and improved analytics—all made possible by the ever increasing availability of cheap computing power and storage capacity—are automating many routine tasks. Countless traditional white-collar jobs, such as many in the post office and in customer service, have disappeared.

New technologies are “encroaching into human skills in a way that is completely unprecedented,” McAfee says, and many middle-class jobs are right in the bull’s-eye; even relatively high-skill work in education, medicine, and law is affected.

We’ll visit the shadowy side of the street again next time.

[1] Erik Brynjolfsson is director of the MIT Center for Digital Business, and Andrew McAfee is a principal research scientist at MIT who studies how digital technologies are changing business, the economy, and society.

[2] According to his official bio on his website, Tim O’Reilly “is the founder and CEO of O’Reilly Media, Inc. His original business plan was simply ‘interesting work for interesting people,’ and that’s worked out pretty well. O’Reilly Media delivers online learning, publishes books, runs conferences, urges companies to create more value than they capture, and tries to change the world by spreading and amplifying the knowledge of innovators.”

Brave New (Jobs) World

“The American work environment is rapidly changing.
For better or worse, the days of the conventional full-time job
may be numbered.”

The above quote is from a December 5, 2016 Quartz article that reported the findings of economists Lawrence Katz (Harvard) and Alan Krueger (Princeton, former chairman of the White House Council of Economic Advisers) that 94% of all US jobs created between 2005 and 2015 were temporary, “alternative work” — with the biggest increases coming from freelancers, independent contractors, and contract employees (who work at a business but are paid by an outside firm).

These findings are consistent with what we looked at last time:  how neoliberal economics has eroded institutional support for the conventional notion of working for a living, resulting in a more individuated approach to the job market. Aeon Magazine recently offered an essay on this topic:  “The Quitting Economy: When employees are treated as short-term assets, they reinvent themselves as marketable goods, always ready to quit.” Here are some samples:

“In the early 1990s, career advice in the United States changed. A new social philosophy, neoliberalism, was transforming society, including the nature of employment, and career counsellors and business writers had to respond. (Emphasis added.)

“US economic intellectuals raced to implement the ultra-individualist ideals of Friedrich Hayek, Milton Friedman and other members of the Mont Pelerin Society… In doing so… they developed a metaphor – that every person should think of herself as a business, the CEO of Me, Inc. The metaphor took off, and has had profound implications for how workplaces are run, how people understand their jobs, and how they plan careers, which increasingly revolve around quitting.

“The CEO of Me, Inc. is a job-quitter for a good reason – the business world has come to agree with Hayek that market value is the best measure of value. As a consequence, a career means a string of jobs at different companies. So workers respond in kind, thinking about how to shape their career in a world where you can expect so little from employers. In a society where market rules rule, the only way for an employee to know her value is to look for another job and, if she finds one, usually to quit.”

I.e., tooting your own résumé horn is not so much about who you worked for as about what you did while you were there. And once you’re finished, don’t get comfortable, get moving. (This recent Time/Money article offers help for creating your new mobility résumé.)

A couple years ago I blogged here about a new form of law firm entirely staffed by contract attorneys. A quick Google search revealed that the trend toward lawyer “alternative” staffing has been gaining momentum. For example:

This May 26, 2017 Above the Law article reported a robust market for more conventional associate openings and lateral partner hires, but included this caveat:

“The one trend that we see continue to stick is the importance of the personal brand over the law firm brand, and that means that every attorney should really focus on how they differentiate themselves from the pack, regardless of where they hang their shingle.”

Upwork offers “Freelance Lawyer Jobs.” “Looking to hire faster and more affordably?” their website asks. “Tackle your next Contract Law project with Upwork – the top freelancing website.”

Flexwork offers “Flexible & Telecommuting Attorney Jobs.”

Indeed posts “Remote Contract Attorney Jobs.”

And on it goes. Whether you’re hiring or looking to be hired, you’d do well to be schooled in the Brave New World of “alternative” jobs. For a further introduction, check out these articles on the “Gig Economy” from Investopedia and McKinsey. For more depth, see:

The Shift:  The Future of Work is Already Here (2011), by Lynda Gratton, Professor of Management Practice at London Business School, where she directs the program “Human Resource Strategy in Transforming Companies.”

Down and Out in the New Economy: How People Find (or Don’t Find) Work Today (2017), by Indiana University Anthropology Professor Ilana Gershon — the author of the Aeon article quoted above.

Next time, we’ll begin looking at three major non-human players in the new job marketplace:  artificial intelligence, big data, and robotics. They’re big, they’re bad, and they’re already elbowing their way into jobs long considered “safe.”

Capitalism on the Fritz

“In November 2008, as the global financial crash was gathering pace, the 82-year-old British monarch Queen Elizabeth visited the London School of Economics. She was there to open a new building, but she was more interested in the assembled academics. She asked them an innocent but pointed question. Given its extraordinary scale, how was it possible that no one saw it coming?

“The Queen’s question went to the heart of two huge failures. Western capitalism came close to collapsing in 2007-2008 and has still not recovered. And the vast majority of economists had not understood what was happening.”

That’s from the Introduction to Rethinking Capitalism (2016), edited by Michael Jacobs and Mariana Mazzucato.[1] The editors and authors review a catalogue of chronic economic “dysfunction” that they trace to policy-makers’ continued allegiance to neoliberal economic orthodoxy even as it has been breaking down over the past four decades.

Before we get to their dysfunction list, let’s give the other side equal time. First, consider an open letter from Warren Buffett published in Time last week. It begins this way:

“I have good news. First, most American children are going to live far better than their parents did. Second, large gains in the living standards of Americans will continue for many generations to come.”

Mr. Buffett acknowledges that “The market system… has also left many people hopelessly behind,” but assures us that “These devastating side effects can be ameliorated,” observing that “a rich family takes care of all its children, not just those with talents valued by the marketplace.” With this compassionate caveat, he is definitely bullish on America’s economy:

“In the years of growth that certainly lie ahead, I have no doubt that America can both deliver riches to many and a decent life to all. We must not settle for less.”

So, apparently, is our Congress. The new tax law is a virtual pledge of allegiance to the neoliberal economic model. Barring a significant pullback of the law (which seems unlikely), we now have eight years to watch how its assumptions play out.

And now, back to Rethinking Capitalism’s dysfunction list (which I’ve seen restated over and over in my research):

  • Productivity and wages no longer move in tandem — wages lag behind.
  • This has been going on now for several decades,[2] during which (inflation-adjusted) living standards for the majority of households have been flat.
  • This is a problem because consumer spending accounts for over 70% of U.S. GDP. What hurts consumers hurts the whole economy.
  • What economic growth there has been is mostly the result of spending fueled by consumer and corporate debt. This is especially true of the post-Great Recession “recovery.”
  • Meanwhile, companies have been increasing production through increased automation — most recently through intelligent machines — which means getting more done with fewer employees.
  • That means the portion of marginal output attributable to human (wage-earner) effort is less, which causes consumer incomes to fall (a toy version of this arithmetic appears just after the list).
  • The job marketplace has responded with new dynamics, featuring a worldwide rise of “non-standard” work (temporary, part-time, and self-employed).[3]
  • Overall, there has been an increase in the number of lower-paid workers and a rise in intransigent unemployment — especially among young people.
  • Adjusting to these new realities has left traditional wage-earners with feelings of meaninglessness and disempowerment, fueling populist backlash political movements.
  • In the meantime, economic inequality (both wealth and income) has grown to levels not seen since pre-revolution France, the days of the Robber Barons, and the Roaring Twenties.
  • Economic inequality means that the shrinking share of compensation paid out in wages, salaries, bonuses, and benefits has been dramatically skewed toward the top of the earnings scale, with much less (both proportionately and absolutely) going to those at the middle and bottom. [4]
  • Increased wealth at the top doesn’t translate into enough additional consumer spending by the top 20% to offset the lost demand (spending) of the lower 80% of income earners — the gap gets filled by consumer debt instead.
  • Instead, increased wealth at the top end is turned into “rentable” assets — e.g., real estate, intellectual property, and privatized holdings in what used to be the “commons” — which both drives up their value (cost) and the rent derived from them. This creates a “rentier” culture in which lower income earners are increasingly stressed to meet rental rates, and ultimately are driven out of certain markets.
  • Inequality has also created a new working-class system, in which a large share of workers are in precarious/uncertain/unsustainable employment and earning circumstances.
  • Inequality has also resulted in limitations on economic opportunity and social mobility — e.g., there is a new kind of “glass floor/glass ceiling,” below which the top 20% are unlikely to fall and above which the bottom 80% are unlikely to rise.
  • In the meantime, the social safety nets that developed during the post-WWII boom (as Buffett’s “rich family” took care of “all its children”) have been largely torn down since the advent of “workfare” in the ’80s and ’90s, leaving those at the bottom and middle more exposed than ever.
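As promised above, here’s a toy version of the wage-share arithmetic running through these bullets — hypothetical numbers of my own, not figures from Rethinking Capitalism:

```python
# Toy wage-share arithmetic behind the bullets above.
# All numbers are hypothetical, not from Rethinking Capitalism.
def wage_share(wage_bill, output):
    # Fraction of total output paid out as wages.
    return wage_bill / output

# Output grows 50% while the wage bill grows only 10%.
before = wage_share(wage_bill=600.0, output=1000.0)  # 0.60
after = wage_share(wage_bill=660.0, output=1500.0)   # 0.44

print(f"wage share: {before:.0%} -> {after:.0%}")
```

A slide in the wage share like that, set against consumer spending at roughly 70% of GDP, is exactly the demand gap the list says has been papered over with household debt.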

The editors of Rethinking Capitalism believe that “These failings are not temporary, they are structural.” That conclusion has led some to believe that people like Warren Buffett are seriously misguided in their continued faith in Western capitalism as a reliable societal institution.

More on that next time.

[1] Michael Jacobs is an environmental economist and political theorist; at the time the book was published, he was a visiting professor at University College London. Mariana Mazzucato is an economics professor at the University of Sussex.

[2] “In the US, real median household income was barely higher in 2014 than it had been in 1990, though GDP had increased by 78 percent over the same period. Though beginning earlier in the US, this divergence of average incomes from overall economic growth has now become a feature of most advanced economies.” Rethinking Capitalism

[3] These have accounted for “half the jobs created since the 1990s and 60 per cent since the 2008 crisis.” Rethinking Capitalism

[4] “Meanwhile, those at the very top of the income distribution have done exceedingly well… In the US, the incomes of the richest 1 per cent rose by 142 per cent between 1980 and 2013 (from an average of $461,910, adjusted for inflation, to $1,119,315) and their share of national income doubled, from 10 to 20 per cent. In the first three years of the recovery after the 2008 crash, an extraordinary 91 per cent of the gains in income went to the richest one-hundredth of the population.” Rethinking Capitalism