Humor: The Borowitz Report

NEW HAVEN (The Borowitz Report)—After a report from the Yale Center on Climate Change Communication showed that the term “climate change” elicits relatively little concern from the American public, leading scientists are recommending replacing it with a new term: “You will be burnt to a crisp and die.”

Other terms under consideration by the scientists include “your cities will be ravaged by tsunamis and floods” and “earth will be a fiery hellhole incapable of supporting human life.”

Scientists were generally supportive of the suggestions, with many favoring the term “your future will involve rowing a boat down a river of rotting corpses.”

“Any of these terms would do a better job conveying the urgency of the problem,” Tracy Klugian, a spokesperson for the newly renamed Yale Center for Oh My God Wake Up You Assholes, said.

Naked Capitalism on For-Profit College Ripoffs

For-Profit Colleges as Factories of Debt

Posted on September 22, 2014

Yves here. The American higher education system has been sucking more and more of the economic life out of the children who supposedly represent our best and brightest, the ones with the intelligence and self-discipline to do well enough to be accepted at college.

But even though the press has given some attention to how young adults, and sometimes their hapless parent/grandparent co-signers, can wind up carrying huge millstones of debt, there’s been comparatively less focus on the for-profit segment of the market. While their students constitute only 13% of the total college population, they account for 31% of student loans. Why such a disproportionately high debt load? As this post explains, the for-profit colleges are master predators, seeking out vulnerable targets like single mothers who will do what they think it takes to set themselves up to land a middle class job. This is the new American lower-class version of P.T. Barnum’s “a sucker is born every minute.” These social aspirants are easy to exploit because they haven’t gotten the memo that the American Dream is dead.

By Hannah Appel and Astra Taylor. Originally published at TomDispatch. Hannah Appel is a mother, activist, and assistant professor of anthropology at UCLA. Her work looks at the everyday life of capitalism and the economic imagination. She has been active with Occupy Wall Street since 2011.

Astra Taylor is a writer, documentary filmmaker (including Zizek! and Examined Life), and activist. Her book, The People’s Platform: Taking Back Power and Culture in the Digital Age (Metropolitan Books), was published in April. She helped launch the Occupy offshoot Strike Debt and its Rolling Jubilee campaign.

Imagine corporations that intentionally target low-income single mothers as ideal customers. Imagine that these same companies claim to sell tickets to the American dream — gainful employment, the chance for a middle class life. Imagine that the fine print on these tickets, once purchased, reveals them to be little more than debt contracts, profitable to the corporation’s investors, but disastrous for its customers. And imagine that these corporations receive tens of billions of dollars in taxpayer subsidies to do this dirty work. Now, know that these corporations actually exist and are universities.

Over the last three decades, the price of a year of college has increased by more than 1,200%. American higher education has long been associated with upward mobility, but with student loan debt quadrupling between 2003 and 2013, it’s time to ask whether education alone can really move people up the class ladder. This is a question of obvious relevance for low-income students and students of color.

As Cornell professor Noliwe Rooks and journalist Kai Wright have reported, black college enrollment has increased at nearly twice the rate of white enrollment in recent years, but a disproportionate number of those African-American students end up at for-profit schools. In 2011, two of those institutions, the University of Phoenix (with physical campuses in 39 states and massive online programs) and the online-only Ashford University, produced more black graduates than any other institutions of higher education in the country. Unfortunately, a recent survey by economist Rajeev Darolia shows that for-profit graduates fare little better on the job market than job seekers with high school degrees; their diplomas, that is, are a net loss, offering essentially the same grim job prospects as if they had never gone to college, plus a lifetime debt sentence.

Many students who enroll in such colleges don’t realize that there is a difference between for-profit, public, and private non-profit institutions of higher learning. All three are concerned with generating revenue, but only the for-profit model exists primarily to enrich its owners. The largest of these institutions are often publicly traded, nationally franchised corporations legally beholden to maximize profit for their shareholders before maximizing education for their students. While commercial vocational programs have existed since the nineteenth century, for-profit colleges in their current form are a relatively new phenomenon that began to boom with a series of initial public offerings in the 1990s, followed quickly by deregulation of the sector as the millennium approached. Bush administration legislation then weakened government oversight of such schools, while expanding their access to federal financial aid, making the industry irresistible to Wall Street investors.

While the for-profit business model has generally served investors well, it has failed students. Retention rates are abysmal and tuitions sky-high. For-profit colleges can be up to twice as expensive as Ivy League universities, and routinely cost five or six times the price of a community college education. The Medical Assistant program at for-profit Heald College in Fresno, California, costs $22,275. A comparable program at Fresno City College costs $1,650. An associate degree in paralegal studies at Everest College in Ontario, California, costs $41,149, compared to $2,392 for the same degree at Santa Ana College, a mere 30-minute drive away.

Exorbitant tuition means students, who tend to come from poor backgrounds, have to borrow from both the government and private sources, including Sallie Mae (the country’s largest originator, servicer, and collector of student loans) and banks like Chase and Wells Fargo. A whopping 96% of students who manage to graduate from for-profits leave owing money, and they typically carry twice the debt load of students from more traditional schools.

Public funds in the form of federal student loans have been called the “lifeblood” of the for-profit system, providing on average 86% of revenues. Such schools now enroll around 10% of America’s college students, but take in more than a quarter of all federal financial aid — as much as $33 billion in a single year. By some estimates it would cost less than half that amount to directly fund free higher education at all currently existing two- and four-year public colleges. In other words, for-profit schools represent not a “market solution” to increasing demand for the college experience, but the equivalent of a taxpayer-subsidized subprime education.

Pushing the Hot Button, Poking the Pain

The mantra is everywhere: a college education is the only way to climb out of poverty and create a better life. For-profit schools allow Wall Street investors and corporate executives to cash in on this faith.

Publicly traded schools have been shown to have profit margins, on average, of nearly 20%. A significant portion of these taxpayer-sourced proceeds is spent on Washington lobbyists to keep regulations weak and federal money pouring in. Meanwhile, these debt factories pay their chief executive officers $7.3 million in average yearly compensation. John Sperling, architect of the for-profit model and founder of the University of Phoenix, which serves more students than the entire University of California system or all the Ivy League schools combined, died a billionaire in August.

Graduates of for-profit schools generally do not fare well. Indeed, they rarely find themselves in the kind of work they were promised when they enrolled, the kind of work that might enable them to repay their debts, let alone purchase the commodity-cornerstones of the American dream like a car or a home.

In the documentary “College Inc.,” produced by PBS’s investigative series Frontline, three young women recount how they enrolled in a nursing program at Everest College on the promise of $25-$35 an hour jobs on graduation. Course work, however, turned out to consist of visits to the Museum of Scientology to study “psychiatrics” and visits to a daycare center for their “pediatrics rotation.” They each paid nearly $30,000 for a 12-month program, only to find themselves unemployable because they had been taught nearly nothing about their chosen field.

In 2010, an undercover investigation by the Government Accountability Office tested 15 for-profit colleges and found that every one of them “made deceptive or otherwise questionable statements” to undercover applicants. These recruiting practices are now under increasing scrutiny from 20 state attorneys general, Senate investigators, and the Consumer Financial Protection Bureau (CFPB), amid allegations that many of these schools manipulate the job placement statistics of their graduates in the most cynical of ways.

The Iraq and Afghanistan Veterans of America, an organization that offers support in health, education, employment, and community-building to new veterans, put it this way in August 2013: “Using high-pressure sales tactics and false promises, these institutions lure veterans into enrolling into expensive programs, drain their post-9/11 GI Bill education benefits, and sign up for tens of thousands of dollars in loans. The for-profits take in the money but leave the students with a substandard education, heavy student loan debt, non-transferable credits, worthless degrees, or no degrees at all.”

Even President Obama has spoken out against instances where for-profit colleges preyed upon troops with brain damage: “These Marines had injuries so severe some of them couldn’t recall what courses the recruiter had signed them up for.”

As it happens, recruiters for such schools are manipulating more than statistics. They are mining the intersections of class, race, gender, inequality, insecurity, and shame to hook students. “Create a sense of urgency. Push their hot button. Don’t let the student off the phone. Dial, dial, dial,” a director of admissions at Argosy University, which operates in 23 states and online, told his enrollment counselors in an internal email.

A training manual for recruiters at ITT Tech, another multi-state and virtual behemoth, instructed its employees to “poke the pain a bit and remind them who else is depending on them and their commitment to a better future.”  It even included a “pain funnel” — that is, a visual guide to help recruiters exploit prospective students’ vulnerabilities. Pain was similarly a theme at Ashford University, where enrollment advisors were told by their superiors to “dig deep” into students’ suffering to “convince them that a college degree is going to solve all their problems.”

An internal document from Corinthian Colleges, Inc. (owner of Everest, Heald, and Wyotech colleges) specified that its target demographic is “isolated,” “impatient” individuals with “low self-esteem.”  They should have “few people in their lives who care about them and be stuck in their lives, unable to imagine a future or plan well.”

These recruiting strategies are as well funded as they are abhorrent. When an institution of higher learning is driven primarily by the needs of its shareholders, not its students, the drive to get “asses in classes” guarantees that marketing budgets will dwarf whatever is spent on faculty and instruction. According to David Halperin, author of Stealing America’s Future: How For-Profit Colleges Scam Taxpayers and Ruin Students’ Lives, “The University of Phoenix has spent as much as $600 million a year on advertising; it has regularly been Google’s largest advertiser, spending $200,000 a day.”

At some schools, the money put into the actual education of a single student has been as low as $700 per year. The Senate’s Health, Education, Labor, and Pensions Committee revealed that 30 of the for-profit industry’s biggest players spent $4.2 billion — or 22.7% of their revenue — on recruiting and marketing in 2010.

Subprime Schools, Swindled Students

In profit paradise, there are nonetheless signs of trouble. Corinthian Colleges, Inc., for instance, is under investigation by several state and federal agencies for falsifying job-placement rates and lying to students in marketing materials. In June, the Department of Education discovered that the company was on the verge of collapse and began supervising a search for buyers for its more than 100 campuses and online operations. In this “unwinding process,” some Corinthian campuses have already shut down. To make matters worse, this month the Consumer Financial Protection Bureau announced a $500 million lawsuit accusing Corinthian of running a “predatory lending scheme.”

As the failure of Corinthian unfolds, those who understood it to be a school — namely, its students — have been left in the lurch. Are their hard-earned degrees and credits worthless?  Should those who are enrolled stay put and hope for the storm to pass or jump ship to another institution? Social media reverberate with anxious questions.

Nathan Hornes started the Facebook group “Everest Avengers,” a forum where students who feel confused and betrayed can share information and organize. A 2014 graduate of Everest College’s Ontario, California, branch, Nathan graduated with a 3.9 GPA, a degree in Business Management, and $65,000 in debt. Unable to find the gainful employment Everest promised him, he currently works two fast-food restaurant jobs. Nathan’s dreams of starting a record label and a music camp for inner city kids will be deferred even further into some distant future when his debts come due: a six-month grace period expires in October and Nathan will owe $380 each month on federal loans alone. “Do I want to pay bills or my loans?” he asks. Corinthian has already threatened to sue him if he fails to make payments.

Asked to explain Corinthian’s financial troubles, Trace Urdan, a market analyst for Wells Fargo Bank, Corinthian’s biggest equity investor, argued that the school attracts “subprime students” who “can be expected — as a group — to repay at levels far lower than most student loans.” And yet, as Corinthian’s financial woes mounted, the corporation stopped paying rent at its Los Angeles campuses and couldn’t pay its own substantial debts to lenders, including Bank of America, from whom it sought a debt waiver.

That Corinthian can request debt waivers from its lenders should give us pause. Who, one might ask, is the proper beneficiary of a debt waiver in this case? No such favors will be done for Nathan Hornes or other former Corinthian students, though they have effectively been led into a debt trap with an expert package of misrepresentations, emotional manipulation, and possibly fraud.

From Bad Apples to a Better System, or Everest Avenged

As is always the case with corporate scandals, Corinthian is now being described as a “bad apple” among for-profits, not evidence of a rotten orchard. The fact is that for-profits like Corinthian exemplify all the contradictions of the free-market model that reformers present as the only solution to the current crisis in higher education: not only are these schools 90% dependent on taxpayer money, but tenure doesn’t exist, there are no faculty unions, most courses are offered online with low overhead costs, and students are treated as “customers.”

It’s also worth remembering that at “public” universities, it is now nearly impossible for working class or even middle class students to graduate without debt. This sad state of affairs — so the common version of the story goes — is the consequence of economic hard times, which require belt tightening and budget cuts. And so it has come to pass that strapped community colleges are now turning away would-be enrollees who wind up in the embrace of for-profits that proceed to squeeze every penny they can from them and the public purse as well. (All the while, of course, this same tale provides for-profits with a cover: they are offering a public service to a marginalized and needy population no one else will touch.)

The standard narrative that, in the face of shrinking tax revenues, public universities must relentlessly raise tuition rates turns out, however, to be full of holes. As political theorist Robert Meister points out, this version of the story ignores the complicity of university leaders in the process. Many of them were never passive victims of privatization; instead, they saw tuition, not taxpayer funding, as the superior and preferred form of revenue growth.

Beginning in the 1990s, universities, public and private, began working ever more closely with Wall Street, which meant using tuition payments not just as direct revenue but also as collateral for debt-financing. Consider the venerable but beleaguered University of California system: a 2012 report out of its Berkeley branch, “Swapping Our Futures,” shows that the whole system was losing $750,000 each month on interest-rate swaps — a financial product that promised lower borrowing costs, but ended up draining the U.C. system of already-scarce resources.

In the last decade, its swap agreements have cost it over $55 million and could, in the end, add up to a loss of $200 million. Financiers, as the university’s creditors, are promised ever-increasing tuition as the collateral on loans, forcing public schools to aggressively recruit ever more out-of-state students, who pay higher tuitions, and to raise the in-state tuition relentlessly as well, simply to meet debt burdens and keep credit ratings high.

Instead of being the social and economic leveler many believe it to be, American higher education in the twenty-first century too often compounds the problem of inequality through debt-servitude. Referring to student debt, which has by now reached $1.2 trillion, Meister suggests, “Add up the lifetime debt service that former students will pay on $1 trillion, over and above the principal they borrow, and you could run a very good public university system for what we are paying capital markets to fund an ever-worsening one.”
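Meister’s arithmetic is easy to check with a standard loan amortization formula. The sketch below is purely illustrative: the 6% rate and 10-year term are assumed round numbers, not figures from the article.

```python
# Back-of-the-envelope check on Meister's point: total interest paid over the
# life of student debt. The rate and term are illustrative assumptions.

def total_interest(principal: float, annual_rate: float, years: int) -> float:
    """Total interest on a fully amortizing loan with monthly payments."""
    r = annual_rate / 12                      # monthly interest rate
    n = years * 12                            # number of monthly payments
    payment = principal * r / (1 - (1 + r) ** -n)
    return payment * n - principal

# $1.2 trillion at an assumed 6% over a 10-year standard repayment term:
print(f"${total_interest(1.2e12, 0.06, 10) / 1e12:.2f} trillion in interest")
# -> roughly $0.40 trillion paid to capital markets, on top of the principal
```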

You Are Not a Loan

The big problem of how we finance education won’t be solved overnight. But one group is attempting to provide both immediate aid to students like Nathan Hornes and a vision for rethinking debt as a systemic issue. On September 17th, the Rolling Jubilee, an offshoot of Occupy Wall Street, announced the abolition of a portfolio of debt worth nearly $4 million originating from for-profit Everest College. This granted nearly 3,000 former students no-strings-attached debt relief.

The authors of this article have both been part of this effort. To date, the Rolling Jubilee has abolished nearly $20 million of medical and educational debt by taking advantage of a little-known trade secret: debt is often sold to debt collectors for mere pennies on the dollar. A medical bill that was originally $1,000 might sell to a debt collector for 4% of its sticker price, or $40. This allowed the Rolling Jubilee project to make a multi-million dollar impact with a budget of approximately $700,000 raised in large part through small individual donations.
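The leverage here is simple division. A minimal sketch, using the illustrative 4% price from the example above (actual secondary-market prices vary by debt type and age):

```python
# How far a fixed budget goes when debt sells for pennies on the dollar.
# The 4% price is the article's illustrative figure, not a market quote.

def face_value_abolished(budget: float, price_rate: float) -> float:
    """Face value of debt a budget can buy at a fractional secondary price."""
    return budget / price_rate

print(face_value_abolished(40, 0.04))       # a $40 purchase retires $1,000
print(face_value_abolished(700_000, 0.04))  # ~$17.5 million on a $700,000 budget
# close to the ~$20 million the Jubilee reports; actual prices were evidently
# a bit lower than 4 cents on the dollar for some of the portfolios
```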

The point of the Rolling Jubilee is simple enough: we believe people shouldn’t have to go into debt for basic needs. For the last four decades, easy access to credit has masked stagnating wages and crumbling social services, forcing many Americans to debt-finance necessities like college, health care, and housing, while the creditor class has reaped enormous rewards. But while we mean the Jubilee’s acts to be significant, we know it is not a sustainable solution to the problem at hand. There is no way to buy and abolish all the odious debt sloshing around our economy, nor would we want to. Given the way our economy is structured, people would start slipping into the red again the minute their debts were wiped out.

The Rolling Jubilee instead raises a question: If a ragtag group of activists can find a way to provide immediate relief to even a few thousand defrauded students, why can’t the government?

The Consumer Financial Protection Bureau’s lawsuit against Corinthian Colleges, Inc. is a good first step, but it only applies to specific private loans originating after 2011, and it will likely take years to play out. Until it’s resolved, students are still technically on the hook and many will be harassed by unscrupulous debt collectors attempting to extract money from them while they still can. In the meantime, the Department of Education (DOE) — which has far greater purview than the CFPB — is effectively acting as a debt collector for a predatory lender, instead of using its discretionary power to help students. Why didn’t the DOE simply let Corinthian go bankrupt, as often happens to private institutions, and so let the students’ debts become dischargeable?

Such debt discharge is well within the DOE’s statutory powers. When a school under its jurisdiction has broken state laws or committed fraud it is, in fact, mandated to offer debt discharge to students. Yet in Corinthian’s opaque, unaccountable unwinding process, the Department of Education appears to be focused on keeping as many of these predatory “schools” open as possible.

No less troubling, the DOE actually stands to profit off Corinthian’s debt payments, as it does from all federally secured educational loans, regardless of the school they are associated with. Senator Elizabeth Warren has already sounded the alarm about the department’s conflict of interest when it comes to student debt, citing an estimate that the government stands to rake in up to $51 billion in a single year on student loans. As Warren points out, it’s “obscene” for the government to treat education as a profit center.

Can there be any doubt that funds reaped from the repayment of federally backed loans by Corinthian students are especially ill-gotten gains? Nathan Hornes and his fellow students should be the beneficiaries of debt relief, not further dispossession.

Unless people agitate, no reprieve will be offered. Instead there may be slaps on the wrist for a few for-profit “bad apples,” with policymakers presenting possible small reductions in interest rates or income-based payments for student borrowers as major breakthroughs.

We need to think bigger. There is an old banking adage: if you owe the bank $1,000, the bank owns you; if you owe the bank $1 million, you own the bank. Individually, student debt is an incapacitating burden. But as Nathan and others are discovering, as a premise for collective action, it can offer a new kind of leverage. Debt collectives, effectively debtors’ unions, may be the next stage of anti-austerity organizing. Collective action offers many possibilities for building power against creditors through collective bargaining, including the power to threaten a debt strike. Where for-profits prey on people’s vulnerability, isolation, and shame, debt collectives would nurture feelings of strength, solidarity, and outrage.

Those who profit from education fear such a transformation, and understandably so. “We ask students to make payments while in school to help them develop the discipline and practice of repaying their federal and other loan obligations,” a Corinthian Colleges spokesman said in response to the news of CFPB’s lawsuit.

It’s absurd: a single mother working two jobs and attending online classes to better her life is discipline personified, even if she can’t always pay her loans on time. The executives and investors living large off her financial aid are the ones who need to be taught a lesson. Perhaps we should collectively demand that as part of their punishment these predators take a course in self-discipline taught by their former students.

Naked Capitalism: Economics of the 1%

The Deficit Disaster That Never Was

Posted on September 18, 2014

By John Weeks, the author of Economics of the 1%: How mainstream economics serves the rich, obscures reality and distorts policy (Anthem Press). Originally published at Triple Crisis

Some older readers might recall that during 2010-2013 politicians and the media manifested great anxiety over the unmanageable level of the deficit and a disastrously high public debt. Prominent among the deficit/debt Cassandras were a Republican Congressman by the name of Paul Ryan and neo-Ayn-Randian Senator Rand Paul. (Paul Ryan, Rand Paul—could they be the same person cleverly occupying the House and the Senate simultaneously? The possibility cannot be ruled out.)

Representative Ryan contemplates the fiscal cliff in 2012? (Detail from the painting “Wanderer above the Sea of Fog.”) Turns out there was nothing for him to see.

President Obama himself felt sufficiently moved by the Cassandra-ite chorus to plunge whole-heartedly into the daunting task of cleaning up the budget mess (which he caused, as we all know). All of this desperate scrambling to reduce the burgeoning deficit and debt sought to postpone the evil day when “financial markets” would wreak their havoc on the spendthrift Obama government.

I revisit this angst over deficits and debts because, surprising as it may seem, the deficit is mysteriously going away. The chart below demonstrates what Arthur Conan Doyle might have titled “the curious incident of the deficit that failed to bite.” In late 2009, the federal deficit rose to the annual equivalent of almost $1.5 trillion, nearly 11% of gross domestic product (GDP). Three years later the deficit fell to $1 trillion (end of 2012), and at the end of last month it barely exceeded $500 billion (just 3% of GDP). It seems on track to drop to $400 billion by the end of this year (see Bloomberg News).

So what happened to drag the deficit back from the “fiscal cliff” (check out my Real News Network interview on this supposed “cliff”)? Did painful but courageous cuts in “entitlements” and other expenditures prevent the U.S. government from swan-diving off the edge? The answer is NO; expenditure cuts did not reduce the deficit. As the chart below shows, federal expenditure in 2014 is hardly different from what it was at the end of 2009, and the deficit is $1 trillion lower!

There is a simple explanation. The federal deficit is down by one trillion because revenue is up by one trillion. This revenue increase did not occur because tax rates were raised. In 2007, just before the crisis hit the fan (so to speak), federal revenues were 17.9% of GDP, and the Congressional Budget Office projects the share for this year at 17.6%.

Federal budget deficit and expenditure by quarter, 2001-2014 (billions of dollars)

The deficit is down because GDP is higher. By definition, increases in GDP (national income) occur because household and business incomes rise. With no change in tax rate, federal revenue increases when these incomes rise. The deficit declined because the economy expanded, not because “entitlements” or any other expenditure was cut.
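The arithmetic behind that claim fits in one line: with spending flat and revenue a roughly constant share of GDP, the deficit shrinks as GDP grows. A minimal sketch, using rounded illustrative figures rather than official budget data:

```python
# Deficit = spending - revenue, with revenue a fixed share of GDP.
# The 17.7% share is rounded from the article; the GDP and spending figures
# (in trillions) are illustrative, not official statistics.

REVENUE_SHARE = 0.177

def deficit(spending: float, gdp: float) -> float:
    """Federal deficit in trillions, holding the revenue share of GDP fixed."""
    return spending - REVENUE_SHARE * gdp

# Same spending, higher GDP -> a much smaller deficit, with no tax-rate change.
print(deficit(spending=3.5, gdp=14.4))  # ~0.95 trillion (recession-scale GDP)
print(deficit(spending=3.5, gdp=17.0))  # ~0.49 trillion (2014-scale GDP)
```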

Unless the president does something extremely foolish such as dropping the odd trillion dollars on another war (and “dropping” seems the appropriate word), I can confidently predict that the federal deficit will change to a surplus by the end of his term.

Continuing to set the record straight, I point out that the failure to increase federal spending has prolonged the recession and stagnation of the U.S. economy. Yes, you read that correctly—increase. Since revenue rises with growth, we could have the present deficit, 3% of GDP, with a much higher level of household income and lower unemployment had expenditure not flat-lined during 2009-2014. More expenditure, more growth, which would have prevented the deficit from rising as a share of GDP.

My argument is not some expenditure variant of what the first Bush correctly called “voodoo economics” (aka Reaganomics), which alleged that cutting tax rates would not reduce tax revenue. The second Bush managed definitively to refute this right wing flight of fantasy when he cut taxes for the rich. (Surprise! Revenue fell.) In order to reduce deficits absolutely without changing tax rates, an increase in public expenditure must provoke an increase in private investment.

However, this is an indirect effect. Public spending increases employment and household incomes. The subsequent increase in household demand can stimulate more private investment as the idle production capacity left by the recession is absorbed. We have many examples of this process by which public expenditure “crowds in” private investment, but it does not occur automatically. What does occur automatically is a rise in tax revenue when public expenditure increases private incomes.
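A toy multiplier model captures this chain. The marginal propensity to consume below is an assumption chosen for illustration; only the revenue share echoes the figures earlier in the post:

```python
# Toy Keynesian sketch of the expenditure-growth-tax linkage described above.
# MPC is an assumed illustrative value; TAX_SHARE is rounded from the article.

MPC = 0.6           # fraction of each after-tax dollar of income that is spent
TAX_SHARE = 0.177   # federal revenue as a share of income

def income_multiplier(mpc: float, tax_share: float) -> float:
    """Total income created per dollar of public spending."""
    return 1 / (1 - mpc * (1 - tax_share))

extra_spending = 100.0  # billions
extra_income = extra_spending * income_multiplier(MPC, TAX_SHARE)  # ~197.6
extra_revenue = TAX_SHARE * extra_income                           # ~35.0

# The automatic revenue rebound means the net hit to the deficit is well
# below the headline spending increase:
print(round(extra_spending - extra_revenue, 1))  # ~65.0, not 100.0
```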

Reactionaries dismiss this expenditure-growth-tax linkage by saying it is nothing but “Keynesian economics.” I wonder, when they are told that water runs downhill do they shrug and say, “just Newtonian nonsense”?

As a result of the lies by reactionaries about approaching deficit disaster, and the equally reactionary failure of the Obama government to challenge those lies, the recovery of the U.S. economy has been slow and halting. GDP actually fell in the first quarter of this year (then recovered in the second). The chart below shows the consequence for those in and those out of work. Six years after the recession began, inflation adjusted private weekly earnings have risen by only 4%, and total unemployment is still over a percentage point higher than in 2007. Not since the depression of the 1930s has unemployment declined so slowly.

Percentage point differences from the first quarter of 2008, Real Average Weekly Earnings (RAWE) in the private sector and the unemployment rate, 2008-2014

The power of the fake-economic (“fakeonomic”) ideology that serves the 1% so well is impressive. Simple, easily available statistics unfiltered by any sleight-of-hand faux-technical manipulations tell a clear story. The fiscal deficit of the United States resulted from the contraction of national income at the end of the 2000s. Reckless speculation due to foolish deregulation of financial corporations caused the contraction. Yet we are told that savage cuts in social expenditures are necessary to prevent those same corporations from speculating on public bonds.

There is a word in Yiddish, “chutzpah.” The favorite way to show the meaning of the word is to tell the story of a boy who murders his mother and father, then pleads to the court for leniency because he is an orphan. Now we have a better example. The financial crooks who caused the collapse of the economy and the explosion of the deficit warn us to expect speculation against the public debt unless we cut the few publicly funded social benefits that we have.

Now, that’s chutzpah.

Sources: Department of the Treasury (http://fms.treas.gov/mts/index.html), Economic Report of the President 2014 (http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=ERP), and Bureau of Labor Statistics (http://www.bls.gov/data/).

Naked Capitalism: ISIS and the Saudi Connection

Matt Stoller: The Solution to ISIS Is the First Amendment

Posted on September 17, 2014

Yves here. This post focuses on ISIS as a symptom of what is wrong with US policy-making. One way of reading it is as an introduction to the role of Saudi Prince Bandar and the sway that the Saudis have had over US policy for decades. This obvious fact is curiously airbrushed out of most American coverage of Middle Eastern politics. Israel is depicted as having a lock over US policy, when in fact the US is capable of pulling Israel’s choke chain. For instance, a key development mentioned in passing in a new Real News Network story is that the US has clearly signaled its unwillingness to support continued Israeli hyperaggression against the Palestinians, which appears to have been a gambit to secure domestic support for continued high defense spending. Obama disallowed a shipment of antitank missiles to Israel. As we’ve said for years, the way for the US to rein Israel in would be to halt or delay the supply of critical military parts. This is a more frontal version of precisely that sort of approach.

As Kissinger said, the US does not have an ideology, only interests. Our most important geopolitical interest has been and continues to be oil. US corporations simply could not function if they did not have access to cheap oil. Saudi light crude is and remains the largest, most readily accessible pool of the most valuable crude. Oh, and the country with the second biggest proven reserves of light sweet crude is Iraq.

If you want to get a handle on the politics of the Middle East, the linchpin is the US-Saudi relationship. The long-standing deal is simple: the Saudi princes keep oil prices in check in return for US support for keeping them in power. The de facto discount against what the Saudis could make if they choked supply back to get better prices is protection money.

However, this relationship currently looks like a dysfunctional marriage where it’s clear there will be no divorce because there is no prenup in place, making the cost and uncertainty of a break-up too high for the partners. The Saudis are upset with the US because we haven’t attacked Iran. In fact, we have done the Saudis a great favor by not going beyond sanctions, since Iran would retaliate rapidly, in force, against Saudi refineries and other oil infrastructure and would close the Strait of Hormuz. The Saudis are also mightily aggrieved that the US has not gone into Syria…yet.

As Stoller explains, ISIS started out as and arguably still is Prince Bandar’s private army, which explains how well financed and professional they have proven to be. This sort of barely-one-step-removed operation is hardly uncommon in the Middle East. For instance, Qatar funds the Muslim Brotherhood. So one way to read Stoller’s post is as an introduction to Prince Bandar. Stoller also calls for more open discussion of US foreign policy and of the ever-rising cost and increasing difficulty of maintaining our empire. Unfortunately, that also means looking at the implications of life with more costly oil. There are far too many powerful people who stand to lose if that were to come into play faster than it absolutely has to, which means propaganda and dissimulation are likely to continue to be the order of the day.

By Matt Stoller, who writes for Salon and has contributed to Politico, Alternet, The Nation, and Reuters. You can reach him at stoller (at) gmail.com or follow him on Twitter at @matthewstoller. Originally published at Medium

As the elite panic about ISIS — the Islamic State of Iraq and the Levant — continues apace, it’s worth looking at how violations of the First Amendment have allowed this group to flourish, and just generally screw up US policy-making. The gist of the problem is that Americans have been lied to for years about our foreign policy, and these lies have now created binding policy constraints on our leaders which make it impossible to eliminate groups like ISIS.

Let’s start by understanding what ISIS actually is. First, ISIS is a brutal fascistic movement of radical Sunni militants, well-armed and well-trained, and bent on the establishment of an Islamic Caliphate throughout the Middle East. Second, it may also be and almost certainly was an arm of a wealthy Gulf state allied with the United States. This contradiction probably doesn’t surprise you, but if it does, that’s only because it cuts against a standard narrative of good guys and bad guys peddled by various foreign policy interests. The reality is that in post-colonial lands “ally” and “enemy” are often meaningless terms — it’s better to describe interests. A good if overly romanticized Hollywood illustration of this dynamic is the movie Charlie Wilson’s War, about the secret collaboration between Saudi Arabia, Egypt, Pakistan, Israel, and the CIA to undermine the Soviets in Afghanistan. This foreign policy apparatus is usually hidden in plain sight, known to most financial, political, military, and corporate elites but not told to the American public.

ISIS, like Al Qaeda, is an armed and trained military group. Guns and training cost money, and this money came from somewhere. There are two Gulf states that finance Sunni militants — Qatar and Saudi Arabia. Both states use financial power derived from oil to build armed terrorist groups which then accomplish aims that their states cannot pursue openly. This occasionally slips out into the open. German Development Minister Gerd Mueller recently blamed Qatar, for instance, for financing ISIS. Qatar itself swiftly denied the charges and claimed it only funds Jabhat al-Nusra. Al-Nusra is the other radical Al Qaeda offshoot militant group fighting in Syria. In other words, Qatar denied funding ISIS by saying it funds Al-Qaeda. It’s a sort of ‘we fund the bad guys who want to kill Americans but not the really bad guys who behead them on social media,’ a non-denial denial by geopolitical psychopaths.

Steve Clemons, one of the few members of Washington’s foreign policy establishment who sometimes speaks clearly about what is actually going on with the American empire, believes Qatar. According to his sources, while the Qataris funded the radical group Al-Nusra in Syria, “ISIS has been a Saudi project.” Clemons goes further, and discusses a very important American and Saudi figure, Prince Bandar bin Sultan, then the head of Saudi Arabia’s intelligence services and a former ambassador to the United States (as well as a Washington, DC socialite). Clemons writes, “ISIS, in fact, may have been a major part of Bandar’s covert-ops strategy in Syria.”

In other words, ISIS got its start in Syria as part of the Arab Spring uprising, and it was financed by Saudi Arabia to go up against Assad. The Gulf states were using Syria to fight a proxy war against Iran, and the precursor of ISIS was one of their proxies in that war. It’s hard to imagine that today ISIS isn’t at least tacitly tolerated by a host of countries in the region, though its goodwill from neighboring countries may be running out. Today, ISIS may be self-sustaining, though it’s quite possible that money is still coming from conservative wealthy individuals in the Gulf states, money which originally comes from the West in the form of oil purchases.

In other words, Middle Eastern politics, and much of Western politics, is organized around oil money. In its economic consequences, the oil gusher of Saudi Arabia was similar to the Chinese trade in the 19th century that led to the opium wars. In that episode, the British bought tea from China, but China didn’t want anything but precious metals from England, leading to a drain of what was then reserve currency to China. This wasn’t sustainable, so England attacked China in what was known as ‘the opium wars’ and forced the government to allow them to trade opium, which addicted large segments of the Chinese population (and eventually led to today’s drug war). Revenue from opium then balanced the cost of tea. The money that went from England to China was ‘recycled’ back to England by the opium trade. International monetary arrangements require such recycling, though it does not have to be so brutal.

In the 1970s, Saudi Arabia had something the West wanted — oil — but it didn’t want that much from the West. So we used a different kind of recycling arrangement (detailed by Tim Mitchell in his exquisite book Carbon Democracy). Saudi Arabia got dollars, and those dollars piled up in Western banks like Citigroup, which started lending that money out to South American countries in the early 1980s. There were several other mechanisms to recycle what were called “petrodollars.” The arms trade really picked up in the 1970s, and continues today. Gulf states buy a lot of fancy weapons, which moves some of the dollars back to the West. They also have huge sovereign wealth funds, and buy Western corporations, banks, real estate, and assets, as well as the politicians that come with all of that. This ‘recycles’ dollars back out of the Middle East — Saudi Arabia returns some dollars, and in return it gets power and influence in the US.

Foreign policy in the Gulf states is also organized around petrodollars. The Saudis don’t have to fight externally, they can simply fund terrorism against those they dislike. The Saudi state, like all states, isn’t a coherent whole, but a set of elites that interact with each other. There are thousands of ‘princes’ who basically just get oil income, but any of them can act independently and many of them do. It’s a bit like the CIA doing things without the President’s explicit permission; there’s a reason it was the CIA, the Saudis, and the Israelis financing the Taliban jointly in the 1980s. This has benefits, because then the Saudi state can have constructive ambiguity around its own role in financing terrorism. It also risks blowback, in that groups like ISIS or Al Qaeda can decide to take on the Saudi establishment itself.

The relationship between Saudi Arabia and the United States is the most important diplomatic and military relationship that we have. The Saudis are the slush fund for whatever the US wants to do when it doesn’t want that activity on the books. Saudi Arabia also fulfills an important role in the oil markets akin to that of the IMF in the international financial markets, by managing its oil surplus to ensure financial and economic stability. This means that shifts in a mercurial theocratic kingdom, where the Saudi monarch is in his 80s and most of the population is young, poor, and extremely religiously conservative, can turn world politics on a dime.

Right now, the Saudi government is still attempting to manage fallout from the war in Iraq and the Arab Spring uprisings, as well as the vacuum of power left when the United States withdrew from Iraq. It’s likely that at certain points it funded ISIS as one part of that strategy. Now you might think that the Saudi government financing a terrorist group with stated aims to attack the United States is a one-off, an accident. I mean, the United States funded the Taliban in Afghanistan against the USSR in the 1980s. But you would be wrong.

For some reason, attacking the United States seems to be a goal of certain elements of the Saudi Arabian government and financial establishment. Parts of the Saudi government helped organize the attacks on the United States on 9/11 (or at least that’s what Lawrence Wright of the New Yorker alleges). That, at least, was apparently one conclusion of the “Joint Inquiry into Intelligence Community Activities Before and After the Terrorist Attacks of September 2001,” better known as the 9/11 Commission Report. Why wasn’t this front-page news? Because this particular portion of the report, these 28 pages, was classified. Periodically members of Congress gripe about this. As early as 2003, Senators were demanding the Bush administration declassify this section of the report — Chuck Schumer, who has a security clearance and can read the report, pretty much said outright that Saudi Arabia was behind the attacks.

Former Senator Bob Graham continues to complain about the public being kept in the dark. Who in particular in the Saudi government? I don’t know precisely who, but people in the United States government certainly do. It probably has something to do with the Saudi Arabian prince who associated with the hijackers being ‘spirited’ out of the US days after the attacks, even as all planes were grounded. Somehow, the FBI ‘mishandled’ the investigation of this prince and his companions.

The Saudi Ambassador to the US at this point was that same geopolitical sage we are already familiar with through his covert strategy with ISIS: Prince Bandar bin Sultan. Bandar, a colorful Talleyrand-like arms dealer and diplomat who deals with terrorist groups and DC power players alike, is so close to the Bush family that his nickname is ‘Bandar Bush’. The rumors I’ve heard in DC are that his house was a weird fanciful scene where food was served on gold plates. DC journalist Mark Leibovich wrote about the importance of Bandar to the Washington scene, “a most sacred of Official Washington shrines”:

At the memorial service, Barbara sat over near Ken Duberstein, a vintage Washington character in his own right, who did a brief stint as the White House chief of staff during the checked-out final months of Ronald Reagan’s second term. Duberstein and Mitchell are old friends. Jews by religion and local royalty by acclamation, they once shared a memorable erev Yom Kippur — the holiest night on the Jewish calendar — at a most sacred of Official Washington shrines: the McLean, Virginia, mansion of Prince Bandar bin Sultan, Saudi Arabia’s ambassador to the United States, and his wife, Haifa.

Yup, the Saudi who funded radical Sunni Muslim group ISIS once hosted Reagan’s former Chief of Staff to erev Yom Kippur festivities.

Prince Bandar’s dazzling hosting abilities in the DC social scene were an important part of his geopolitical arsenal. But just because he painted his private jet in Dallas Cowboys colors, and just because his son was a guest of Jerry Jones for the NFL draft, doesn’t mean he abandoned the other more traditional Saudi tools for geopolitical statecraft, such as supporting Muslim extremists that would engage in violent attacks on Westerners. It turns out that money for the 9/11 hijackers may have flowed through Bandar’s wife’s account at Riggs bank. Riggs was a haven for money launderers and dictators, and was controlled by the Allbritton family, “dear friends” of Ronald Reagan. It was also an instrument of CIA policy, “which included top current and former Riggs executives receiving U.S. government security clearances.” This relationship “could complicate any prosecution of the bank’s officials, according to private lawyers and former prosecutors.” The Allbritton family later created Politico, which was arguably the most influential political publication in DC from 2008–2010.

In other words, the Saudi ambassador, who may have funneled money to 9/11 hijackers, also advised the Bush administration on U.S. foreign policy, and had deep and profitable relationships with U.S. media, banking, and political elites. He was also a social luminary in DC. This helped lay the foundation for the American foreign policy establishment consensus position, often forged at think tanks funded by foreign governments. From there, this consensus emanated outward into Politico-like publications, and then outward onto the television networks and into the homes of the remaining Americans willing to pay attention to an infantilized deceptive version of American foreign policy.

And so, almost immediately after the attacks, Saddam Hussein became the designated bad guy and the Bush administration, supported by the entire Republican Party, foreign policy establishment, and a substantial chunk of Democrats (Bill and Hillary Clinton and John Kerry, for starters), prepared for war in Iraq. The Bush administration alluded many times to a supposed link between 9/11 and Hussein, which was a ludicrous conspiracy theory, but an acceptable one because it served the interests of the Bush administration and a coddled foreign policy elite. But rather than expose the entire secret deal by which elites conducted a shadow foreign policy through Saudi petrodollars, most journalists told Americans that Saddam Hussein had to go. Those journalists who didn’t subscribe to this, especially on TV, were fired.

As we’ve by now noticed, America has been on a glide path of dishonest policy-making since 9/11. One can imagine a different way of doing this. Imagine if the public had known that it was elements of the Saudi government who actually supported this attack. Imagine if they knew of the incredibly tight intertwining of Saudi elites with US elites, the Saudi extra-constitutional slush fund, petrodollar terrorism diplomacy, the long alliance with theocracy, and so forth. There would have had to be a reckoning for this mess of contradictions. Perhaps the public would have endorsed this deal. Perhaps the public would have accepted cheap gasoline in return for, as Ken Silverstein calls it, “The Secret World of Oil.” Rick Perlstein, in the book The Invisible Bridge, showed how the public tried to reckon with Vietnam, but then decided to turn away from truth in the 1970s, and to Ronald Reagan’s narrative of an America without flaws or limits. Perhaps that’s what would have happened, again, after 9/11.

But the public never got the chance for a reckoning. As in the 1970s, we never got a chance to understand the real costs of our geopolitical arrangements, and to examine alternatives. That was left to the fringes, for another ten years or so. Instead, what happened was a mixture of propaganda and censorship.

The propaganda was the organization of the culture for war with Iraq, perhaps the least popular introduction of a war since World War I. This included not only the social salons of Bandar Bush, but also Colin Powell’s show at the UN, false claims of evidence of Weapons of Mass Destruction by the Intelligence Community, and a shifting rationale for the invasion of Iraq.

It also included the explicit destruction of the reputation of anyone attempting to understand 9/11 in the context of ‘blowback’, or what the CIA calls the consequence of secret foreign policy moves. Andrew Sullivan, for example, led the way on Sept. 16, 2001. He wrote:

The middle part of the country — the great red zone that voted for Bush — is clearly ready for war. The decadent left in its enclaves on the coasts is not dead — and may well mount a fifth column.

A whole series of neoconservative-leaning pundits — Mickey Kaus, Tom Friedman, Bill Kristol, Jeff Jarvis, Glenn Reynolds — bathed in this slur-fest. Anyone casting doubt on the official version of 9/11, some variant of ‘they hate us for our freedom’, was dubbed a conspiracy theorist and subjected to McCarthyite smears. Even as late as 2009, Van Jones resigned under pressure from the Obama White House, ostensibly because he lent credibility to 9/11 conspiracy theorists by signing a petition he says he didn’t understand. Clearly the theory that the Bush administration orchestrated 9/11 is ridiculous, but the campaign to elevate having stupid opinions, or even associating with someone with stupid opinions, into something closer to treason was just that — a campaign.

But the other part of the 9/11 narrative, aside from propaganda, was censorship. In America it’s not popular to talk about censorship, because it’s presumed that we don’t have it, as such. There are no rooms full of censors who choose what goes into newspapers, and what doesn’t. Our press is free. It’s right there in the First Amendment: “Congress shall make no law… prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press…”

Somehow, though, Senators, Congressmen, and intelligence officials are not supposed to talk about those 28 pages of the 9/11 Commission report which are classified. And why not? Well, because according to President Bush (and now President Obama), doing so would compromise “national security”. But what, exactly, is censorship, if not a prohibition on individuals speaking about certain topics? Traditionally, First Amendment law gives the highest protection to political speech, allowing for certain restrictions on commercial speech (like false advertising). But there is no higher form of speech than political speech, and there is no more important form of political speech than the exposition of wrongdoing by the government. So how is this not censorship?

This is by design. As Senator Daniel Patrick Moynihan put it in a commission about the classification system in 1997, “It is now almost routine for American officials of unquestioned loyalty to reveal classified information as part of ongoing policy disputes—with one camp “leaking” information in support of a particular view, or to the detriment of another—or in support of settled administration policy. In the process, this degrades public service by giving a huge advantage to the least scrupulous players.” He continued, “Excessive secrecy has significant consequences for the national interest when, as a result, policymakers are not fully informed, government is not held accountable for its actions, and the public cannot engage in informed debate.”

What all this means is that the reality of ISIS and what this group seeks is opaque to the public, and to policymakers not clued into the private salons where the details of secrets can be discussed. Even among those policymakers, the compartmentalized national security establishment means that no one really grasps the whole picture. The attempt to get the US into a war in Syria a year ago was similarly opaque. The public cannot make well-informed decisions about national security choices because information critical to such choices is withheld from them. It is withheld from them at the source, through the classification-censorship process, then by obfuscations in the salons and think tanks of DC and New York, and then finally through the bottleneck of the mass media itself.

This is what happened after 9/11, a lack of an informed debate due to propaganda, media control, and a special kind of censorship. Our policy on ISIS is the price for such ignorance. Polling shows Americans want something done on ISIS, but they have no confidence that what is being done will work. This is a remarkably astute way to see the situation, because foreign policy since 9/11 has been a series of geopolitical duct-tape fixes and costly disasters. Despite the layers of gauze and grime pulled over our foreign policy viewfinder, the public itself is aware that whatever we’re doing ain’t working.

Adopting a realistic policy on ISIS means a mass understanding of who our allies actually are and what they want, as well as their leverage points against us and our leverage points on them. I believe Americans are ready for an adult conversation about our role in the world and the nature of the fraying American order, rather than more absurd and hollow bromides about American exceptionalism.
Until that happens, Americans will not be willing to pay any price for a foreign policy, and rightfully so. Fool me once, shame on you. And so forth.

Unwinding the classified state, and beginning the adult conversation put off for seventy years about the nature of American power, is the predicate for building a global order that can drain the swampy brutal corners of the world that allow groups like ISIS to grow and thrive. To make that unwinding happen, we need to start demanding the truth, not what ‘national security’ tells us we need to know. The Constitution does not mention the words ‘national security’, it says ‘common defense.’ And that means that Americans should be getting accurate information about what exactly we are defending.

Naked Capitalism: Throwing American Workers Under the Bus

US Corporate Executives to Workers: Drop Dead

Posted on September 16, 2014

The Washington Post has a story that blandly supports the continued strip-mining of the American economy. Of course, in the Versailles that the nation’s capital has become, this lobbyist- and big-ticket-political-donor-supporting point of view no doubt seems entirely logical.

The guts of the article:

Three years ago, Harvard Business School asked thousands of its graduates, many of whom are leaders of America’s top companies, where their firms had decided to locate jobs in the previous year. The responses led the researchers to declare a “competitiveness problem” at home: HBS Alumni reported 56 separate instances where they moved 1,000 or more U.S. jobs to foreign countries, zero cases of moving that many jobs in one block to America from abroad, and just four cases of creating that many new jobs in the United States. Three in four respondents said American competitiveness was falling.

Harvard released a similar survey this week, which suggested executives aren’t as glum about American competitiveness as they once were…

Companies don’t appear any more keen on American workers today, though. The Harvard grads are down on American education and on workers’ skill sets, but they admit they’re just not really engaged in improving either area. Three-quarters said their firms would rather invest in new technology than hire new employees. More than two-thirds said they’d rather rely on vendors for work that can be outsourced, as opposed to adding their own staff. A plurality said they expected to be less able to pay high wages and benefits to American workers.

The researchers who conducted the study call that a failure on the part of big American business. They say the market will eventually force companies to correct course and invest in what they call the “commons” of America’s workforce. “We think this mismatch is, at some fundamental sense, unsustainable,” Michael Porter, one of the professors behind the studies, said in an interview this week.

But what if it’s not?

Why, if you were a multinational corporation, would you feel a need to correct that mismatch? Why would you invest in American workers? Why would you create a job here?

At what point does it become a rational business decision for American companies to write off most Americans?

It’s hard to know where to begin with this. First, Harvard Business School is hardly a bastion of socialist thinking. Porter and his colleagues are correct to call out the short-sightedness of the incumbents of America’s C-suites. And there’s nary a mention of the role of the long-overvalued dollar, the result of the lesson that China and the Asian tigers learned in the wake of the 1997 Asian crisis: keep your currency pegged low and run a big trade surplus, so that your foreign exchange war chest is large enough that you are never again subject to the tender ministrations of the IMF.

But second, and more worrisome, is a vastly larger intellectual failure on the part of the Washington Post and even the Harvard investigators. They’ve completely lost sight of whose interests are at work. The HBS grads are looting the American economy for their own personal profit. Making better products and developing new markets is hard and it takes time for that effort to pay off. Cutting costs is easy. Getting a pop in the price of your stock due to investors’ belief that offshoring and outsourcing will lower costs is even easier.

It’s far from a given, particularly at this juncture, that more outsourcing and offshoring is good for anyone aside from top executives, well-placed middle managers, and the various intermediaries of the outsourcing industry (yes, there is such a thing). As we’ve been writing for years, even in the 1990s executives at companies that sent operations overseas were telling us that the business case was weak, but that they went ahead regardless to please Wall Street. Chief investment officers have said flatly that outsourcing is overrated as a cost saver.

In the early 2000s, we heard regularly from contacts at McKinsey that their clients had become so short-sighted that it was virtually impossible to get investments of any sort approved, even ones that on paper were no-brainers. Why? Any investment has an expense component, meaning some of its costs will be reported as expenses on the income statement rather than capitalized on the balance sheet. (If, say, $3 million of a $10 million project has to be expensed immediately, reported earnings that quarter drop by $3 million no matter how attractive the eventual returns; the figures here are purely illustrative.) Companies were so loath to do anything that might blemish quarterly earnings that they shunned even remarkably attractive projects rather than accept a short-term dip in profits.

And even when these projects do lower costs (as opposed to merely transferring income from lower-paid workers to the middle managers who must do more coordinating, and to senior executives), there are hidden costs in the form of extended supply chains, lost flexibility, and the ceding to vendors of the opportunity to develop expertise. In other words, even when profits improve, the improvement typically comes at the price of making the company more fragile.

This unwillingness to invest represents a massive failure of capitalists to do their job. US-style short-termism has become all too common around the globe. As yours truly and Rob Parenteau wrote for the New York Times in 2010:

For instance, IMF and World Bank studies found a reduced reinvestment rate of profits in many Asian nations following the 1998 crisis. Similarly, a 2005 JPMorgan report noted with concern that since 2002, US corporations on average ran a net financial surplus of 1.7 percent of GDP, in contrast with an average deficit of 1.2 percent of GDP over the preceding forty years. Historically, companies as a whole ran financial surpluses (meaning that, in aggregate, they saved rather than expanded) in economic downturns, not in expansion phases.

The big culprit in America is that public companies are obsessed with quarterly earnings. Investing in future growth often reduces profits in the short term. The enterprise has to spend money, say on additional staff or extra marketing, before any new revenue comes in the door. And for bolder initiatives like developing new products, the up-front costs can be considerable (marketing research, product design, prototype development, legal expenses associated with patents, lining up contractors). Thus a fall in business investment short-circuits a major driver of growth in capitalist economies.

Companies, while claiming they maximize shareholder value, increasingly prefer to pay their executives exorbitant bonuses, or issue special dividends to shareholders, or engage in financial speculation. They turn their backs on the traditional role of a capitalist: to find and exploit profitable opportunities to expand his activities.

Some may argue that lower investment rates are the result of poor prospects, but the data does not support that view. Corporate profits have risen as a share of GDP since the early 1980s, reaching unprecedented levels right before the global financial crisis took hold. Even now, US profit margins are nearly two thirds of the way back to their prior cyclical high, despite a subpar recovery.

More than four years later, those sorry trends have continued, with profits now at a record share of GDP. But the top brass has been handsomely rewarded for this behavior. They’ve discovered that the more they squeeze workers, both here and abroad, the more they can keep for themselves.

And in a bit of unintended irony, the Washington Post features a headline describing yet another way they’ve succeeded in making the greater public subsidize their profits: How the Postal Service subsidizes cheap Chinese goods. As readers know, the sources of corporate welfare are legion. Walmart’s super-low wages are subsidized by $6.2 billion a year in public assistance, a significant portion of its $17 billion in reported 2013 profits. And that’s before you factor in the value of the state and local tax breaks Walmart extracts by pitting communities against each other when planning new store locations. These concessions are particularly dubious given that Walmarts don’t create jobs: they destroy them (by putting smaller retailers out of business) and steal them (by siphoning retail sales, and hence employment, from neighboring communities).

While Walmart is the poster child of subsidized profits, the Bentonville giant has plenty of company, including large financial firms (so heavily subsidized and backstopped as to not be properly considered private companies), Big Pharma (a huge beneficiary of decades of NIH-funded research) and Big Auto (which has played the “pit communities against each other” game as adeptly as Walmart in securing subsidies for moving plants into union-hostile Sunbelt states).

Unfortunately, Porter appears to have characterized the problem accurately when he depicts the attitude of these self-serving executives as a looting of the commons of labor, meaning much of America. And the precedents of the early industrial period show that this can be a sustainable strategy until workers finally rebel. The Bolshevik revolution, which was at bottom a peasant revolt, came more than a century after the enclosure movement began its successful program of turning independent yeoman farmers into desperate factory wage-slaves. So while history suggests that capitalists will eventually push workers beyond their breaking point, that rupture can be a very long time in coming.

Trickle-Down Nonsense

The following is from the blog Economics Online Tutor, well worth a visit for the entire post:

Fallacies of Republican Party and Trickle Down Economic Policies

Trickle down economics is the main source of the problems the economy has faced over the past 30 years or longer. Trickle down economics is nothing more than wealth redistribution to the very rich.

No matter how progressive the income tax rates are, the wealthy never lose wealth to the tax rates, and the poor never gain wealth from them. The system redistributes wealth upward, always; when the tax rates on the rich are lowered, it simply redistributes at a faster rate. The question becomes not whether the rich deserve more, but how much their new gains should be taxed.

In the past the economy simply worked better when income tax rates were more progressive. They have become much less progressive, through lower rates at the top and more loopholes, and the overall economy has suffered for it. These changes are the product of “trickle down” thinking, and in a practical sense the shift from “how it used to be” to “how it is now” is by definition a redistribution. Trickle down economics redistributes wealth to the rich.

The result is that the gap between the wealthy and everybody else has grown tremendously under these policies; the American dream of upward mobility is available to fewer people. The middle class is shrinking. The poor are becoming poorer. These changes mean less economic activity, fewer jobs, less money for the basic purchases that create the incentives for investment and job growth, and more people living in poverty in the richest nation in the world. We have the resources to eliminate this poverty, but we choose, through economic policies based on a misguided philosophy and on fear tactics that falsely characterize the alternatives, to maintain and compound these problems.

The first pie chart in the first image shows clearly that prior to the widespread implementation of trickle down economic policies in the Reagan administration, the rich (shown here as the top 10% of income earners) received more of the income gains per person than everybody else. This is consistent with market incentives for innovation, investment, and risk-taking under all market-based economic schools of thought. But the top 10% did not receive everything: they shared one-third of all income gains among 10% of earners; the lower 90% shared the other two-thirds. The rich gained more, but everybody gained. Just as economic theory would suggest. By having gains throughout the different classes of earners, the economy could keep moving ahead with consumption, investments, savings, and wages.
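To make “more per person” concrete, here is a back-of-the-envelope check using only the one-third/two-thirds split quoted above (the decile shares come from the chart; the per-person normalization is simple arithmetic, not a figure from the source):

\[
\frac{\text{gain per person, top 10\%}}{\text{gain per person, bottom 90\%}} = \frac{(1/3) \div 0.10}{(2/3) \div 0.90} = \frac{10/3}{20/27} = 4.5
\]

So in the pre-Reagan distribution, each earner in the top decile captured roughly four and a half times the gain of each earner in the bottom 90%: a large edge, but far from everything.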

But over time, under policies consistent with trickle down economics, that has all changed. The second pie chart in that first image shows that the gains have ALL gone to the top 10%, with the top 1% receiving most of them. The bottom 90% has received NO gains; this group has actually lost income. The result is that the largest segment of the economy has lost purchasing power. The standard of living of most Americans has either declined or been maintained only through borrowing. Notice that the period this chart covers immediately precedes the Great Recession, which was brought on largely by credit bubbles. Without purchasing power, people spend less and save less, which means less investment and less production in the economy.

These charts, and many similar ones, have been circulating around Facebook pages. They get shared often, and information about who originated them sometimes gets lost in the process. The data they use is sourced, and it is consistent with what is known from many sources, both private and government. To see more such images, check out the photo albums on the Facebook page for this website.

[Image: Sanders]

Charts on Fixing Economic Inequality

UC Berkeley’s Haas Institute for a Fair and Inclusive Society and the Economic Policy Institute released this report on six approaches to broadening our economy. This chart gives a summary:

[Infographic: Haas Institute, “Inequality Is Not Inevitable”]

Naked Capitalism on Dynasty Capitalism

All in the Family: How the Koch Brothers, Sheldon Adelson, Sam Walton, Bill Gates, and Other Billionaires Are Undermining America

Posted on September 12, 2014 by

Yves here. Even though monied dynasties have long had outsized influence in the US, Steve Fraser contends that billionaires and their scions like the Koch brothers, the Walton heirs, and Sheldon Adelson wield far more power than their predecessors and are in the process of remaking America.

By Steve Fraser, the author of Wall Street: America’s Dream Palace. His next book, The Age of Acquiescence: The Life and Death of American Resistance to Organized Wealth and Power, will be published by Little Brown in February. He is a writer, historian, and co-founder of the American Empire Project. Originally published at TomDispatch

George Baer was a railroad and coal mining magnate at the turn of the twentieth century.  Amid a violent and protracted strike that shut down much of the country’s anthracite coal industry, Baer defied President Teddy Roosevelt’s appeal to arbitrate the issues at stake, saying, “The rights and interests of the laboring man will be protected and cared for… not by the labor agitators, but by the Christian men of property to whom God has given control of the property rights of the country.”  To the Anthracite Coal Commission investigating the uproar, Baer insisted, “These men don’t suffer. Why hell, half of them don’t even speak English.”

We might call that adopting the imperial position.  Titans of industry and finance back then often assumed that they had the right to supersede the law and tutor the rest of America on how best to order its affairs.  They liked to play God.  It’s a habit that’s returned with a vengeance in our own time.

The Koch brothers are only the most conspicuous among a whole tribe of “self-made” billionaires who imagine themselves architects or master builders of a revamped, rehabilitated America. The resurgence of what might be called dynastic or family capitalism, as opposed to the more impersonal managerial capitalism many of us grew up with, is changing the nation’s political chemistry.

Our own masters of the universe, like the “robber barons” of old, are inordinately impressed with their ascendancy to the summit of economic power.  Add their personal triumphs to American culture’s perennial love affair with business — President Calvin Coolidge, for instance, is remembered today only for proclaiming that “the business of America is business” — and you have a formula for megalomania.

Take Jeff Greene, otherwise known as the “Meltdown Mogul.”  Back in 2010, he had the chutzpah to campaign in the Democratic primary for a Florida senate seat in a Miami neighborhood ravaged by the subprime mortgage debacle — precisely the arena in which he had grown fabulously rich.  In the process, he rallied locals against Washington insiders and regaled them with stories of his life as a busboy at the Breakers Hotel in Palm Beach.  Protected from the Florida sun by his Prada shades, he alluded to his wealth as evidence that, as a maestro of collateralized debt obligations, no one knew better than he how to run the economy he had helped to pulverize.  He put an exclamation point on his campaign by flying off in his private jet only after securely strapping himself in with his gold-plated seat buckles.

Olympian entrepreneurs like Greene regularly end up seeing themselves as tycoons-cum-savants.  When they run for office, they do so as if they were trying to get elected to the board of directors of America, Inc.  Some will brook no interference with their will.  Property, lots of it, in a society given over to its worship, becomes a blank check: everything is permitted to those who have it.

Dream and Nightmare

This, then, is the indigenous romance of American capitalism.  The man from nowhere becomes a Napoleon of business and so a hero because he confirms a cherished legend: namely, that it’s the primordial birthright of those lucky enough to live in the New World to rise out of obscurity to unimaginable heights.  All of this, so the legend tells us, comes through the application of disciplined effort, commercial cunning and foresight, a take-no-prisoners competitive instinct, and a gambler’s sang froid in the face of the unforgiving riskiness of the marketplace.  Master all of that and you deserve to be a master of our universe.  (Conversely, this is the dark fairy tale that nineteenth century Gilded Age anti-capitalist rebels knew as “the Property Beast.”)

What makes the creation of the titan particularly confounding is that it seems as if it shouldn’t be so.  Inside the colorless warrens of the counting house and factory workshop, a pedestrian preoccupation with profit and loss might be expected to smother all those instincts we associate with the warrior, the statesman, and the visionary, not to mention the tyrant.  As Joseph Schumpeter, the mid-twentieth century political economist, once observed, “There is surely no trace of any mystic glamour” about the sober-minded bourgeois.  He is not likely to “say boo to a goose.”

Yet the titan of capitalism overcomes that propensity.  As Schumpeter put it, he transforms himself into the sort of man who can “bend a nation to his will,” use his “extraordinary physical and nervous energy” to become “a leading man.”   Something happens through the experience of commercial conquest so intoxicating that it breeds a willful arrogance and a lust for absolute power of the sort for which George Baer hankered.  Call it the absolutism of self-righteous money.

Sheldon Adelson, Charles and David Koch, Sam Walton, Rupert Murdoch, Linda McMahon, or hedge fund honchos like John Paulson and Steven Cohen all conform in one way or another to this historic profile.  Powers to be reckoned with, they presume to know best what we should teach our kids and how we should do it; how to defend the country’s borders against alien invasion, revitalize international trade, cure what ails the health-care delivery system, create jobs where there are none, rejigger the tax code, balance the national budget, put truculent labor unions in their place, and keep the country on the moral and racial straight and narrow.

All this purported wisdom and self-assurance is home bred.  That is to say, these people are first of all family or dynastic capitalists, not the faceless men in suits who shimmy their way up the greased pole that configures the managerial hierarchies of corporate America.  Functionaries at the highest levels of the modern corporation may be just as wealthy, but they are a fungible bunch, whose loyalty to any particular outfit may expire whenever a more attractive stock option from another firm comes their way.

In addition, in our age of mega-mergers and acquisitions, corporations go in and out of existence with remarkable frequency, morphing into a shifting array of abstract acronyms.  They are carriers of great power, but without an organic attachment to distinct individuals or family lineages.

Instead, dynasts of yesteryear and today have created family businesses or, as in the case of the Koch brothers and Rupert Murdoch, taken over ones launched by their fathers, to which they are fiercely devoted. They guard their business sanctuaries by keeping them private, wary of becoming dependent on outside capital resources that might interfere with their freedom to do what they please with what they’ve amassed.

And they think of what they’ve built up not so much as a pile of cash, but as a patrimony to which they are bound by ties of blood, religion, region, and race.  These attachments turn ordinary business into something more transcendent.  They represent the tissues of a way of life, even a philosophy of life.  Its moral precepts about work, individual freedom, family relations, sexual correctness, meritocracy, equality, and social responsibility are formed out of the same process of self-invention that gave birth to the family business.  Habits of methodical self-discipline and the nurturing and prudential stewardship that occasionally turns a modest competency into a propertied goliath encourage the instinct to instruct and command.

There is no Tycoon Party in the U.S. imposing ideological uniformity on a group of billionaires who, by their very nature as übermensch, march to their own drummers and differ on many matters.  Some are philanthropically minded, others parsimonious; some are pietistic, others indifferent.  Wall Street hedge fund creators may donate to Obama and be card-carrying social liberals on matters of love and marriage, while heartland types like the Koch brothers obviously take another tack politically.  But all of them subscribe to one thing: a belief in their own omniscience and irresistible will.

There at the Creation

Business dynasts have enacted this imperial drama since the dawn of American capitalism — indeed, especially then, before the publicly traded corporation and managerial capitalism began supplanting their family capitalist predecessors at the turn of the twentieth century.  John Jacob Astor, America’s first millionaire, whose offices were once located on Manhattan Island where Zuccotti Park now stands, was the most literal sort of empire builder.  In league with Thomas Jefferson, he attempted to extend that president’s “empire for liberty” all the way to the western edge of the continent and push out the British.  There, on the Oregon coast, he established the fur-trading colony of Astoria to consolidate his global control of the luxury fur trade.

In this joint venture, president and tycoon both failed.  Astor, however, was perfectly ready to defy the highest authority in the land and deal with the British when it mattered most.  So when Jefferson embargoed trade with that country in the run-up to the War of 1812, the founder of one of the country’s most luminous dynasties simply ran the blockade.  An unapologetic elitist, Astor admired Napoleon, assumed the masses were not to be left to their own devices, and believed deeply that property ought to be the prerequisite for both social position and political power.

Traits like Astor’s willfulness and self-sufficiency cropped up frequently in the founding generation of America’s “captains of industry.” Often they were accompanied by a chest-thumping braggadocio and thumb-in-your-eye irreverence.  Cornelius Vanderbilt, called by his latest biographer “the first tycoon,” was known in his day as “the Commodore.”  Supposedly, he warned someone foolish enough to challenge his supremacy in the steamboat business: “I won’t sue you, I’ll ruin you.”

Or take “Jubilee” Jim Fisk. He fancied himself an admiral but wasn’t one, and after the Civil War, when caught plundering the Erie Railroad, boasted that he was “born to be bad.”   Later on, when a plot he hatched to corner the nation’s supply of gold left him running from the law, Jim classically summed up the scandal this way: “Nothing lost save honor.”

More than a century before Mitt Romney and Bain Capital came along, Jay Gould, a champion railroad speculator and buccaneering capitalist, scoured the country for companies to buy, loot, and sell.  Known by his many detractors as “the Mephistopheles of Wall Street,” he once remarked, when faced with a strike against one of his railroads, that he could “hire one half of the working class to kill the other half.”

George Pullman, nicknamed “the Duke” in America’s world of self-made royalty, wasn’t shy about dealing roughly with the rowdy “mob” either.  As a rising industrialist in Chicago in the 1870s, he — along with other young men from the city’s new manufacturing elite — actually took up arms to put down a labor insurgency and financed the building of urban armories, stocked with the latest artillery, including a new machine gun marketed as the “Tramp Terror.” (This was but one instance among many of terrorism from above by the forces of “law and order.”)

However, Pullman was better known for displaying his overlordship in quite a different fashion.  Cultivating his sense of dynastic noblesse oblige, he erected a model town, which he aptly named Pullman, just outside Chicago. There residents not only labored to manufacture sleeping cars for the nation’s trains, but were also tutored in how to live respectable lives — no drinking, no gambling, proper dress and deportment — while living in company-owned houses, shopping at company-owned stores, worshipping at company churches, playing in company parks, reading company-approved books in the company library, and learning the “three Rs” from company schoolmarms.  Think of it as a Potemkin working class village, a commercialized idyll of feudal harmony — until it wasn’t.  The dream morphed into a nightmare when “the Duke” suddenly began to slash wages and evict his “subjects” amid the worst depression of the nineteenth century.  This, in turn, provoked a nationwide strike and boycott, eventually crushed by federal troops.

The business autocrats of the Gilded Age could be rude and crude like Gould, Vanderbilt, and Fisk or adopt the veneer of civilization like Pullman.  Some of these “geniuses” of big business belonged to what Americans used to call the “shoddy aristocracy.”  Fisk had, after all, started out as a confidence man in circuses and Gould accumulated his “start-up capital” by bilking a business partner.  “Uncle” Daniel Drew, top dog on Wall Street around the time of the Civil War (and a pious one at that, who founded Drew Theological Seminary), had once been a cattle drover. Before bringing his cows to the New York market, he would feed them salt licks to make sure they were thirsty and then fill them with water so they would make it to the auction block weighing far more than their mere flesh and bones could account for.  He bequeathed America the practice of “watered stock.”

Not all the founding fathers of our original tycoonery, however, were social invisibles or refugees from the commercial badlands.  They could also hail from the highest precincts of the social register.  The Morgans were a distinguished banking and insurance clan going all the way back to colonial days.  J.P. Morgan was therefore to the manor born.  At the turn of the twentieth century, he functioned as the country’s unofficial central banker, meaning he had the power to allocate much of the capital that American society depended on.  Nonetheless, when asked about bearing such a heavy social responsibility, he bluntly responded, “I owe the public nothing.”

This sort of unabashed indifference to the general welfare was typical and didn’t end in the new century.  During the Great Depression of the 1930s, the managements of some major publicly owned corporations felt compelled by a newly militant labor movement and the shift in the political atmosphere that accompanied President Franklin Roosevelt’s New Deal to recognize and bargain with the unions formed by their employees.  Not so long before, some of these corporations, in particular United States Steel, had left a trail of blood on the streets of the steel towns of Pennsylvania and Ohio when they crushed the Great Steel Strike of 1919.  But times had changed.

Not so, however, for the adamantine patriarchs who still owned and ran the nation’s “little steel” companies (which were hardly little).  Men like Tom Girdler of Republic Steel resented any interference with their right to rule over what happened on their premises and hated the New Deal, as well as its allies in the labor movement, because they challenged that absolutism.  So it was that, on Memorial Day 1937, 10 strikers were shot in the back and killed while picketing Girdler’s Chicago factory.

The Great U-Turn

By and large, however, the middle decades of the twentieth century were dominated by modern concerns like U.S. Steel, General Motors, and General Electric, whose corporate CEOs were more sensitive to the pressures of their multiple constituencies.  These included not only workers, but legions of shareholders, customers, suppliers, and local and regional public officials.

Publicly held corporations are, for the most part, owned not by a family, dynasty, or even a handful of business partners, but by a vast sea of shareholders. Those “owners” have little if anything to do with running “their” complex companies.  This is left to a managerial cadre captained by lavishly rewarded chief executives.  Their concerns are inherently political, but not necessarily ideological.  They worry about their brand’s reputation, have multiple dealings with a broad array of government agencies, look to curry favor with politicians from both parties, and are generally reasonably vigilant about being politically correct when it comes to matters of race, gender, and other socially sensitive issues.  Behaving in this way is, after all, a marketing strategy that shows up where it matters most — on the bottom line.

Over the last several decades, however, history has done a U-turn.  Old-style private enterprises of enormous size have made a remarkable comeback.  Partly, this is a consequence of the way the federal government has encouraged private enterprise through the tax code, land-use policy, and subsidized finance.  It is also the outcome of a new system of decentralized, flexible capitalism in which large, complex corporations have downloaded functions once performed internally onto an array of outside, independent firms.

Family capitalism has experienced a renaissance.  Even giant firms are now often controlled by their owners the way Andrew Carnegie once captained his steel works or Henry Ford his car company.  Some of these new family firms were previously publicly traded corporations that went private.  A buy-out craze initiated by private equity firms hungry for quick turn-around profits, like Mitt Romney’s infamous Bain Capital, lent the process a major hand. This might be thought of as entrepreneurial capitalism for the short-term, a strictly finance-driven strategy.

But family-based firms in it for the long haul have also proliferated and flourished in this era of economic turbulence.  These are no longer stodgy, technologically antiquated outfits, narrowly dedicated to churning out a single, time-tested product.  They are often remarkably adept at responding to shifts in the market, often highly diversified in what they make and sell, and — thanks to the expansion of capital markets — they now enjoy a degree of financial independence not unlike that of their dynastic forebears of the nineteenth century, who relied on internally generated resources to keep free of the banks.  They have been cropping up in newer growth sectors of the economy, including retail, entertainment, energy, finance, and high tech.  Nor are they necessarily small-fry mom-and-pop operations.  One-third of the Fortune 500 now fall into the category of family-controlled.

Feet firmly anchored in their business fiefdoms, family patriarchs loom over the twenty-first-century landscape, lending it a back-to-the-future air.  They exercise enormous political influence.  They talk loudly and carry big sticks.  Their money elects officials, finances their own campaigns for public office, and is reconfiguring our political culture by fertilizing a rain forest of think tanks, journals, and political action committees.  A nation which, a generation ago, largely abandoned its historic resistance to organized wealth and power has allowed this newest version of the “robber baron” to dominate the public arena to a degree that might have astonished even John Jacob Astor and Cornelius Vanderbilt.

The Political Imperative

That ancestral generation, living in an era when the state was weak and kept on short rations, didn’t need to be as immersed in political affairs.  Contacting a kept senator or federal judge when needed was enough.  The modern regulatory and bureaucratic welfare state has extended its reach so far and wide that it needs to be steered, if not dismantled.

Some of our new tycoons try doing one or the other from off-stage through a bevy of front organizations and hand-selected candidates for public office.  Others dive right into the electoral arena themselves.  Linda McMahon, who with her husband created the World Wrestling Entertainment empire, is a two-time loser in senate races in Connecticut.  Rick Scott, a pharmaceutical entrepreneur, did better, becoming Florida’s governor.  Such figures, and other triumphalist types like them, claim their rise to business supremacy as their chief credential, often their only credential, when running for office or simply telling those holding office what to do.

Our entrepreneurial maestros come in a remarkable range of sizes and shapes.  On style points, “the Donald” looms largest.  Like so many nineteenth century dynasts, his family origins are modest.  A German grandfather arriving here in 1885 was a wine maker, a barber, and a saloonkeeper in California; father Fred became the Henry Ford of homebuilding, helped along by New Deal low-cost housing subsidies.  His son went after splashier, flashier enterprises like casinos, luxury resorts, high-end hotels, and domiciles for the 1%.  In all of this, the family name, splashed on towers of every sort and “the Donald’s” image — laminated hair-do and all — became his company’s chief assets.

Famous for nothing other than being very rich, Trump feels free to hold forth on every conceivable subject of public import from same-sex marriage to the geopolitics of the Middle East.  Periodically, he tosses his hat into the electoral arena.  But he comports himself like a clown.  He even has a game named after himself: “Trump — The Game,” whose play currency bears Donald’s face and whose lowest denomination is $10 million.  No wonder no one takes his right-wing bluster too seriously.  A modern day “Jubilee Jim Fisk,” craving attention so much he’s willing to make himself ridiculous, the Donald is his own reality TV show.

Rupert Murdoch, on the other hand, looks and dresses like an accountant and lives mainly in the shadows.  Like Trump, he inherited a family business.  Unlike Trump, his family pedigree was auspicious.  His father was Sir Keith, a media magnate from Melbourne, Australia, and Rupert went to Oxford.  Now, the family’s media influence straddles continents, as Rupert attempts — sometimes with great success — to make or break political careers and steer whole political parties to the right.

The News Corporation is a dynastic institution of the modern kind in which Murdoch uses relatively little capital and a complex company structure to maintain and vigorously exercise the family’s control.  When the Ford Motor Company finally went public in 1956, it did something similar to retain the Ford family’s dominant position.  So, too, did Google, whose “dual-class share structure” allowed its founders Larry Page and Sergey Brin to continue calling the shots.  Murdoch’s empire may, on first glance, seem to conform to American-style managerial corporate capitalism, apparently rootless, cosmopolitan, fixed on the bottom line. In fact, it is tightly tethered to Murdoch’s personality and conservative political inclinations and to the rocky dynamics of the Murdoch succession.  That is invariably the case with our new breed of dynastic capitalists.

Sheldon Adelson, the CEO of the Las Vegas Sands Corporation and sugar daddy to right-wing political wannabes from city hall to the White House, lacks Murdoch’s finesse but shares his convictions and his outsized ambition to command the political arena.  He’s the eighth richest man in the world, but grew up poor as a Ukrainian Jew living in the Dorchester neighborhood of Boston.  His father was a cab driver and his mother ran a knitting shop.  He went to trade school to become a court reporter and was a college drop-out.  He started several small businesses that failed, winning and losing fortunes.  Then he gambled and hit the jackpot, establishing lavish hotels and casinos around the world.  When he again lost big time during the global financial implosion of 2007-2008, he responded the way any nineteenth century sea dog capitalist might have: “So I lost twenty-five billion dollars.  I started out with zero… [there is] no such thing as fear, not to any entrepreneur.  Concern, yes.  Fear, no.”

A committed Zionist, Adelson was once a Democrat.  But he jumped ship over Israel and because he believed the party’s economic policies were ruining the country.  (He’s described Obama’s goal as “a socialist-style economy.”)  He established the dark-money group Freedom’s Watch as a counterweight to George Soros’s Open Society and to MoveOn.org.  According to one account, Adelson “seeks to dominate politics and public policy through the raw power of money.”  That has, for instance, meant backing Newt Gingrich in the Republican presidential primaries of 2012 against Mitt Romney, whom he denounced as a “predatory capitalist” (talk about the pot calling the kettle black!), and not long after, funneling cash to candidate Romney.

Free Markets and the Almighty

Charles and David Koch are perfect specimens of this new breed of family capitalists on steroids.  Koch Industries is a gigantic conglomerate headquartered in the heartland city of Wichita, Kansas.  Charles, who really runs the company, lives there.  David, the social and philanthropic half of this fraternal duopoly, resides in New York City.  Not unlike George “the Duke” Pullman, Charles has converted Wichita into something like a company city, where criticism of Koch Industries is muted at best.

The firm’s annual revenue is in the neighborhood of $115 billion, generated by oil refineries, thousands of miles of pipelines, paper towels, Dixie cups, Georgia-Pacific lumber, Lycra, and Stainmaster carpet, among other businesses.  It is the second largest privately owned company in the United States.  (Cargill, the international food conglomerate, comes first.)  The brothers are inordinately wealthy, even for our “new tycoonery.”  Only Warren Buffett and Bill Gates are richer.

While the average businessman or corporate executive is likely to be pretty non-ideological, the Koch brothers are dedicated libertarians.  Their free market orthodoxy makes them adamant opponents of all forms of government regulation.  Since their companies are among the top 10 air polluters in the United States, that also comports well with their material interests — and the Kochs come by their beliefs naturally, so to speak.

Their father, Fred, was the son of a Dutch printer who settled in Texas and started a newspaper.  He later became a chemical engineer and invented a better method for converting oil into gasoline.  In one of history’s little jokes, he was driven out of the industry by the oil giants who saw him as a threat.  Today, Koch Industries is sometimes labeled “the Standard Oil of our time,” an irony it’s not clear the family would appreciate.  After a sojourn in Joseph Stalin’s Soviet Union (of all places), helping train oil engineers, Fred returned stateside to set up his own oil refinery business in Wichita.  There, he joined the John Birch Society and ranted about the imminent Communist takeover of the government.  In that connection he was particularly worried that “the colored man looms large in the Communist plan to take over America.”

Father Fred raised his sons in the stern regimen of the work ethic and instructed the boys in the libertarian catechism. This left them lifelong foes of the New Deal and every social and economic reform since.  That included not only predictable measures like government health insurance, social security, and corporate taxes, but anything connected to the leviathan state.  Even the CIA and the FBI are on the Koch chopping block.

Dynastic conservatism of this sort has sometimes taken a generation to mature.  Sam Walton, like many of his nineteenth-century analogs, was not a political animal.  He just wanted to be left alone to do his thing and deploy his power over the marketplace.  So he stayed clear of electoral and party politics, although he implicitly relied on the racial, gender, and political order of the old South, which kept wages low and unions out, to build his business in the Ozarks.  After his death in 1992, however, Sam’s heirs entered the political arena in a big way.

In other respects Sam Walton conformed to type.  He was impressed with himself, noting that “capital isn’t scarce; vision is” (although his “one stop shopping” concept was already part of the retail industry before he started Walmart).  His origins were humble.  He was born on a farm in Kingfisher, Oklahoma.  His father left farming for a while to become a mortgage broker, which in the Great Depression meant he was a farm re-possessor for Metropolitan Life Insurance.  Sam did farm chores, then worked his way through college, and started his retail career with a small operation partly funded by his father-in-law.

At every juncture, the firm’s expansion depended on a network of family relations.  Soon enough, his stores blanketed rural and small-town America.  Through all the glory years, Sam’s day began before dawn as he woke up in the same house he’d lived in for more than 30 years.  Then, dressed in clothes from one of his discount stores, off he went to work in his red Ford pick-up truck.

Some dynasts are pietistic and some infuse their business with religion.  Sam Walton did a bit of both.  In his studiously modest “life style,” there was a kind of outward piety.  Living without pretension, nose to the grindstone, and methodically building up the family patrimony has for centuries carried a sacerdotal significance, leaving aside any specific Protestant profession of religious faith.  But there was professing as well.  Though not a fundamentalist, he was a loyal member of the First Presbyterian Church in Bentonville, Arkansas, where he was a “ruling elder” and occasionally taught Sunday school (something he had also done in college as president of the Burall Bible Class Club).

Christianity would play a formative role in his labor relations strategy at Walmart.  His employees — “associates,” he dubbed them — were drawn from an Ozark world of Christian fraternity which Walmart management cultivated.  “Servant leadership” was a concept designed to encourage workers to undertake their duties serving the company’s customers in the same spirit as Jesus, who saw himself as a “servant leader.”

This helped discourage animosities in the work force, as well as blunting the — to Walton — dangerous desire to do something about them through unionizing or responding in any other way to the company’s decidedly subpar working conditions and wages.  An aura of Christian spiritualism plus company-scripted songs and cheers focused on instilling company loyalty, profit-sharing schemes, and performance bonuses constituted a twentieth century version of Pullman’s town idyll.

All of this remained in place after Sam’s passing.  What changed was the decision of his fabulously wealthy relatives to enter the political arena.  Walton lobbying operations now cover a broad range of issues, including lowering corporate taxes and getting rid of the estate tax entirely, as his heirs subsidize mainly Republican candidates and causes.  Most prominent of all have been the Walton efforts to privatize education through vouchers or by other means, often enough turning public institutions into religiously affiliated schools.

Wall Street has never been known for its piety.  But the tycoons who founded the Street’s most lucrative hedge funds — men like John Paulson, Paul Tudor Jones II, and Steve Cohen, among others — are also determined to upend the public school system.  They are among the country’s most powerful proponents of charter schools.  Like J.P. Morgan of old, these men grew up in privilege, went to prep schools and the Ivy League, and have zero experience with public education or the minorities who tend to make up a large proportion of charter school student bodies.

No matter.  After all, some of these people make several million dollars a day.  What an elixir!  They are joined in this educational crusade by fellow business conquistadors of less imposing social backgrounds like Mark Zuckerberg, who has ensured that Facebook will remain a family domain even while “going public.”  Another example would be Bill Gates, the most celebrated of a brace of techno-frontiersmen who — legend would have it — did their pioneering in homely garages, even though the wonders they invented would have been inconceivable without decades of government investment in military-related science and technology.  What can’t these people do, what don’t they know?  They are empire builders and liberal with their advice and money when it comes to managing the educational affairs of the nation.  They also benefit handsomely from a provision in the tax code passed during the Clinton years that rewards them for investing in “businesses” like charter schools.

Our imperial tycoons are a mixed lot.  They range from hip technologists like Zuckerberg to heroic nerds like Bill Gates, and include yesteryear traditionalists like Sam Walton and the Koch brothers.  What they share with each other and their robber baron ancestors is a god-like desire to create the world in their image.

Watching someone play god may amuse us, as “the Donald” can do in an appalling sort of way.  It is, however, a dangerous game with potentially deadly consequences for a democratic way of life already on life support.
