Naked Capitalism: Matt Taibbi on Perp Walks


Matt Taibbi on Democracy Now: Banks Admit to Crimes, Pay $5 Billion, And Still No One Goes to Jail

Posted on May 22, 2015

Matt Taibbi discussed the latest round of bank settlements on Democracy Now. This time, Citigroup, JPMorgan Chase, Barclays and Royal Bank of Scotland pleaded guilty to conspiring to manipulate foreign exchange rates in dollars and euros. While this settlement was a step forward by virtue of the banks admitting to specific criminal acts, yet again the institutions were fined while the executives went unpunished. And we again have to listen to the intelligence-insulting claims that the top brass were victims of bad actors.

On a related issue, reader Glenn M e-mailed us:

Perhaps I missed it, but I have seen no news stories regarding the final results of Eric Holder’s 90-day ultimatum back in February to finally land indictments against Wall Streeters for the mortgage crisis.
According to my calculations, this “ultimatum” expired May 17.

There do not appear to have been any stories noting the failure of this ultimatum, which was announced with great fanfare.

Here is a link from the day of:
Eric Holder launches 90-day crusade against bank leaders

From the article:

After years of pressure from some lawmakers, civic leaders and Occupy Wall Street protesters, the country’s No. 1 law enforcer said Tuesday he has instructed many of his 93 federal prosecutors to review any residential mortgage fraud case they have brought against a financial institution stemming from the 2008 financial crisis to see if any executive could be held accountable for the company’s actions.

Both civil and criminal cases will be on the table, Holder said.

The prosecutors have been given three months to report their findings to Washington.

And predictably, this bold talk was yet another show for the rubes. Nothing took place and no report was made public. Some Democrats in the Senate, such as Sherrod Brown and Elizabeth Warren, threatened to slow-walk the confirmation of Loretta Lynch unless cases were filed against individuals. But as an insider tersely put it, “Dems voted for Lynch and lost their leverage.”

Naked Capitalism: Krugman Stopped Short


Bill Black: Krugman is Half Right

Posted on May 18, 2015

By Bill Black, the author of The Best Way to Rob a Bank is to Own One and an associate professor of economics and law at the University of Missouri-Kansas City. Jointly published with New Economic Perspectives

Paul Krugman has a nice column entitled “Fraternity of Failure” dated May 15, 2015.

In Bushworld, in other words, playing a central role in catastrophic policy failure doesn’t disqualify you from future influence. If anything, a record of being disastrously wrong on national security issues seems to be a required credential.

But refusal to learn from experience, combined with a version of political correctness in which you’re only acceptable if you have been wrong about crucial issues, is pervasive in the modern Republican Party.

Krugman moves from foreign policy to economic policy and sees the same fraternity of failure among Republican economists.

Take my usual focus, economic policy. If you look at the list of economists who appear to have significant influence on Republican leaders, including the likely presidential candidates, you find that nearly all of them agreed, back during the “Bush boom,” that there was no housing bubble and the American economic future was bright; that nearly all of them predicted that the Federal Reserve’s efforts to fight the economic crisis that developed when that nonexistent bubble popped would lead to severe inflation; and that nearly all of them predicted that Obamacare, which went fully into effect in 2014, would be a huge job-killer.

Given how badly these predictions turned out — we had the biggest housing bust in history, inflation paranoia has been wrong for six years and counting, and 2014 delivered the best job growth since 1999 — you might think that there would be some room in the G.O.P. for economists who didn’t get everything wrong. But there isn’t. Having been completely wrong about the economy, like having been completely wrong about Iraq, seems to be a required credential.

What’s going on here? My best explanation is that we’re witnessing the effects of extreme tribalism. On the modern right, everything is a political litmus test. Anyone who tried to think through the pros and cons of the Iraq war was, by definition, an enemy of President George W. Bush and probably hated America; anyone who questioned whether the Federal Reserve was really debasing the currency was surely an enemy of capitalism and freedom.

It doesn’t matter that the skeptics have been proved right. Simply raising questions about the orthodoxies of the moment leads to excommunication, from which there is no coming back. So the only “experts” left standing are those who made all the approved mistakes. It’s kind of a fraternity of failure: men and women united by a shared history of getting everything wrong, and refusing to admit it. Will they get the chance to add more chapters to their reign of error?

Krugman’s explanation is compelling, except that it ignores the rival fraternity of failure inhabited by economists and finance “experts” who support the New Democrats and New Labour. Krugman chooses three economic prediction issues to discuss: was there a housing bubble, would the Fed’s liquidity programs in response to the crash cause hyper-inflation, and would Obamacare be a huge job-killer? He faces tight word count limits and one can only discuss a few examples in any column. The three examples he chose are certainly legitimate. But the examples he chose focus on Republican errors, are not the most important economic errors, and are not as clean as he implies in purportedly differentiating between Republican error and Democratic Party economist success.

The Housing Bubble

Some economists who tend to support the Democratic Party, like Dean Baker, got the housing bubble correct – and early. But, overwhelmingly, economists who tend to support the Democratic Party either got the bubble wrong or made minimal efforts to warn about the bubble and call for public sector actions to burst it.

The reason the vast majority of economists, regardless of political affiliation, got the bubble wrong has next to nothing to do with partisan “tribalism.” The reason they got it wrong is that orthodox economists of all political persuasions believed in economic myths that had been falsified by white-collar criminologists 75 years ago. For a bubble to occur, market prices for that good (and for a vast array of financial derivatives for which that good constitutes the “underlying”) must systematically move in the wrong direction away from the “efficient” price – for many years and (in the case of housing) by over $1 trillion cumulatively. Orthodox economists believed in the efficient market hypothesis and “knew” such bubbles were impossible.

The most logical explanation for such a bubble is a rational explanation – widespread “accounting control fraud” by lenders and loan purchasers. Orthodox economists make a standard assumption of rational behavior, including by criminals. But orthodox economists have a primitive tribal taboo against the “f” word – fraud. When it comes to bubbles, therefore, orthodox economists overwhelmingly simply assume mass irrationality. They would rather drop their most cherished assumptions about economic behavior than admit the reality that there are elite white-collar criminals and that their crimes can become epidemic when the incentive structures are so perverse that they produce a criminogenic environment.

This problem of dogma is compounded by the problem that the orthodox responses to a bubble are clumsy, slow, and awful in terms of their “collateral damage” to the Nation, particularly those most in need. Basically, the orthodox response is to throw the economy into a recession – hoping to kill off the bubble and reduce the severity of the eventual recession it would have caused when it collapsed on its own. Worse, the orthodox policy recommendation to avoid future bubbles is permanent monetary austerity and higher interest rates to deter bubbles – producing rolling recessions and weak growth. Indeed, orthodox economists are so dubious of their ability to correctly identify a bubble and so cognizant of the grave harms and risks posed by trying to use orthodox responses to bubbles that their standard recommendation is to do nothing even if they suspect that a bubble is developing.

The vastly better response to a bubble like the housing bubble is unknown to orthodox economists and is never taught to students. The answer is to (1) put the fraudulent lenders and loan purchasers out of business by vigorous supervision, enforcement, and prosecution and (2) to limit their growth by effective regulation and bans on loan products most conducive to fraud. The regulators and prosecutors must break the “Gresham’s” dynamic that can make fraud epidemic.

We did this as regulators in 1984-1986 and deliberately burst the developing real estate bubbles in the Southwest before they could cause any national, much less international, economic crisis. It was certainly not pain free and we were aided by passage of the 1986 tax reform act that ended some of the most perverse real estate tax incentives, but it worked brilliantly and represents even today the most successful policy intervention against a bubble. The same strategy would have prevented the Great Recession, but the anti-regulators refused to follow our lead.

Inflation and Monetary Stimulus

This was an odd formulation by Krugman because it ignored fiscal stimulus, which is where economists who tend to support the Democratic Party had their greatest success relative to economists who almost invariably support the Republican Party. Krugman’s point on the “hyper-inflation” crazy-hawks is correct, but it would have been even more forceful had Krugman brought in fiscal policy.

Obamacare as a Faux Job Killer

This too could have been explained in a manner that would have added support to Krugman’s thesis. The key point to recall, which Krugman knows but did not mention here, is that Obamacare is a direct steal from a far-right-wing “think” tank that overwhelmingly supports Republicans. Further, Republican economists frequently supported the plan, and Mitt Romney famously was the first to implement it when he was Governor of Massachusetts. Now, however, one would risk being torn apart at a Republican gathering by announcing support for the ultra-right-wing health insurance plan. Krugman, years ago, explained many of the defects of Romneycare/Obamacare. Of late he has also explained that while Romney/Obamacare is a poorly designed plan because of the dogmas of its right-wing drafters, it is also much better than “no care.” “No care” was the prior system.

Republican economists act like virtually all Republicans in constantly inventing myths and horror stories about Romney/Obamacare. At this juncture, the problem is plainly one of ethics. The Republican economists are so eager to curry favor with Republican politicians that they lie as a matter of routine when they try to picture a hard-right-wing health insurance program as a near-Communist conspiracy.

But What If We Looked at Democratic Economists on the Most Critical Issues?

The single most important economic issue of the last three decades has been the three “de’s” – deregulation, desupervision, and de facto decriminalization. On that issue, which has driven our three most recent financial scandals, economists who are prominent New Democrat supporters have been disastrous. Yes, the Republican economists are often even worse. On the most important economic issue the dominant economists of both parties have simply formed rival fraternities with the same membership requirement – a track record of failure.

The Garn-St Germain Act that prompted the federal v. state regulatory race to the bottom that generated the savings and loan (S&L) debacle was drafted by a Republican economist (Dick Pratt), but it enjoyed broad bipartisan support and faced no opposition from economists associated with the Democratic Party. Not a single outside academic economist associated with the Democratic Party supported our struggle to reregulate, resupervise, and prosecute the industry and the elite frauds driving the S&L debacle. Democrats and Republicans formed a bipartisan effort to defeat our efforts against the fraud epidemic. Speaker Wright led the opposition in the House and four of the five Senators who did Charles Keating’s bidding in his jihad against our efforts to reregulate the industry were Democrats.

Of the two Democratic economists President Reagan sought to appoint to leadership positions running our regulatory agency, one (George Benston) was Keating’s man and a ferocious proponent of the three “de’s.” The other (Larry White) was always instinctually a fraud-denier and an opponent of regulation. To White’s credit, he overcame these instincts in a number of important decisions, but he was never able to overcome his dogmas sufficiently to become a leader against the fraud epidemics raging in the industry.

There are many problems with the three “de’s,” but the paramount problem is that they can create criminogenic environments that produce epidemics of “control fraud” that cause catastrophic damage. But orthodox economists of both parties share the primitive tribal taboo against the “f” word. Orthodox economists of both parties share the same absurd standard assumptions that (implicitly) exclude fraud (it is impossible if there is perfect information and people act rationally).

Orthodox economists of both parties are proudly mono-disciplinary. They virtually never read any of the sophisticated modern research on white-collar criminology even though James Galbraith has explained (in his article prompted by a Krugman column) that our predictive success far exceeds economists’ predictive strength. Similarly, George Akerlof and Paul Romer explained in their 1993 article on “looting” that the S&L regulators got the fraud epidemic right from the beginning while the economists missed it.

Orthodox economists of both parties use almost exclusively econometric and modelling techniques that must produce systematic, massive error in the presence of epidemics of accounting control fraud. The mathematics on this is indisputable, and these economists are proud to a point well beyond arrogance about their purported quantitative expertise. Why then do they continue to use almost exclusively a failed methodology that they know must produce systematic, massive error in the presence of fraud? They also know that under their standard assumption about human behavior the bankers’ quants should design the models to systematically and dramatically overvalue assets (by ignoring fraud) in order to maximize the quants’ (and their senior officers’) compensation.

Krugman is better on this subject than many economists. He famously got the California energy crisis correct by being willing to believe that Enron and its co-conspirators had formed a cartel to restrict supply. Cartels are another area in which economists supporting both parties were long insane. They claimed, contrary to all known experience (and Adam Smith’s famous warning), that cartels were, at worst, so ephemeral, with a half-life similar to Ununoctium, that they were not worth worrying about. Criminologists never made this mistake, but why would an economist read the criminology literature to learn about crimes like fraud and cartels?

It was the Rubinites in the U.S. and New Labour in the UK, using orthodox economics as their weapon, that constituted the Schwerpunkt of the assault against effective regulation under President Bill Clinton and Vice President Al Gore and Prime Ministers Tony Blair and Gordon Brown. It is Clinton’s destruction of Glass-Steagall and his squashing of Brooksley Born’s effort to protect us from fraud in financial derivatives, through passage of the Commodity Futures Modernization Act of 2000, that gets the most attention, but the desupervision of the financial industry and the embrace of the regulatory race to the bottom were far more destructive. It was under Clinton and Gore’s assault that the underwriting rule we used in 1991 to drive liar’s loans out of the S&L industry was replaced with a guideline deliberately crafted to be unenforceable and useless. It was Clinton and Gore who began and primarily “accomplished” cutting the FDIC staff by more than three-quarters and the OTS staff of S&L regulators by more than half. I have just completed a series of articles detailing how Blair and Brown championed the City of London “winning” the regulatory race to the bottom by destroying the last vestiges of financial regulation, supervision, enforcement, and prosecutions in the UK.

Similarly, it was the Rubinites in the U.S. and the Blairites in the UK who spread the economic nonsense that a government with a sovereign currency (like the U.S. and the UK) was just like a consumer household. Under this myth, government deficits were immoral and harmful while budget surpluses demonstrated superior morality and were desirable. Blair and Brown turned the City into the financial cesspool of the world by championing the regulatory race to the bottom (which the City “won”), produced massive epidemics of control fraud that caused a financial crisis and threw the UK into the Great Recession, and then “bled the patient” through self-destructive and economically illiterate austerity while Labour’s economists generally remained silent about how insane these policies were or even provided support.

It was President Obama, having seen the catastrophe set in motion by the Rubinites (yes, greatly exacerbated by President Bush and his “wrecking crew”), who decided to place the Rubinites in power in his administration and to supplement them with Republican failures like Ben Bernanke and Timothy Geithner (who dropped his party affiliation to set up such promotions).

Here is how I explained Obama’s and Bush’s deliberate embrace of officials with a track record of failure on April 3, 2009, in my first interview with Bill Moyers.

WILLIAM K. BLACK: These are all people who have failed. Paulson failed, Geithner failed. They were all promoted because they failed, not because…

BILL MOYERS: What do you mean?

WILLIAM K. BLACK: Well, Geithner has, was one of our nation’s top regulators, during the entire subprime scandal, that I just described. He took absolutely no effective action. He gave no warning. He did nothing in response to the FBI warning that there was an epidemic of fraud. All this pig in the poke stuff happened under him. So, in his phrase about legacy assets. Well he’s a failed legacy regulator.

BILL MOYERS: But he denies that he was a regulator. Let me show you some of his testimony before Congress. Take a look at this.

TIMOTHY GEITHNER: I’ve never been a regulator, for better or worse. And I think you’re right to say that we have to be very skeptical that regulation can solve all of these problems. We have parts of our system that are overwhelmed by regulation.

BILL MOYERS: Overwhelmed by regulation! It wasn’t the absence of regulation that was the problem, it was despite the presence of regulation you’ve got huge risks that build up.

WILLIAM K. BLACK: Well, he may be right that he never regulated, but his job was to regulate. That was his mission statement.


WILLIAM K. BLACK: As president of the Federal Reserve Bank of New York, which is responsible for regulating most of the largest bank holding companies in America. And he’s completely wrong that we had too much regulation in some of these areas. I mean, he gives no details, obviously. But that’s just plain wrong.

The point I was explaining was that prominent politicians frequently feel they can gain politically by taking policy positions that are destructive but popular. These politicians want to be able to trot out their economist in such circumstances and have him (more rarely, her) say something that is economic nonsense that the economist fabricates to make the politician’s terrible policy argument sound like it accords with economic history. Ethical economists, therefore, tend to be disfavored by prominent politicians. One of the inherent consequences of presenting nonsense economic positions is that the economist’s predictions will fail – repeatedly. An economist who is willing to repeat a fabricated economic claim that he knows to be false to support his politician’s latest insane policy proposal will have a record of failure that prominent politicians will love: this economist is willing to lie, repeatedly, for me when I need him to lie.

My proposal to Bill Moyers was that we try the opposite strategy. Fire the persistent failures and the cheats and hire people with a track record for integrity and getting things right.

WILLIAM K. BLACK: Now, going forward, get rid of the people that have caused the problems. That’s a pretty straightforward thing, as well. Why would we keep CEOs and CFOs and other senior officers that caused the problems? That’s facially nuts. That’s our current system.

So stop that current system. We’re hiding the losses, instead of trying to find out the real losses. Stop that, because you need good information to make good decisions, right? Follow what works instead of what’s failed. Start appointing people who have records of success, instead of records of failure. That would be another nice place to start. There are lots of things we can do. Even today, as late as it is. Even though they’ve had a terrible start to the administration. They could change, and they could change within weeks. And by the way, the folks who are the better regulators, they paid their taxes. So, you can get them through the vetting process a lot quicker.

Black and Krugman on Choosing Failures and the Resultant “Reign of Error”

In addition to identifying the same dynamic of deliberately choosing failures that I explained in 2009, which Krugman has now fittingly named the “Fraternity of Failure,” I notice that the same May 15, 2015 column used a phrase I had used just two days earlier in an article in my series of columns on New Labour.

Note to Blair: it would be a truly excellent thing for the world if financial regulators were to “always err on the side of caution” and to have only “one-way pressures” “to guard the public interest” rather than to aid and abet the City banksters’ “reign of error” and fraud. The fact that Blair felt that (mythical) UK financial regulators devoted “to guard[ing] the public interest” were a disaster tells you all you need to know about how deeply he was in the banksters’ pocket even before they made him “filthy rich” (in the immortal words of Blair’s Red Tory strategist, Peter Mandelson).

Krugman’s column aptly uses the same “reign of error” phrase in an analogous context.

It doesn’t matter that the skeptics have been proved right. Simply raising questions about the orthodoxies of the moment leads to excommunication, from which there is no coming back. So the only “experts” left standing are those who made all the approved mistakes. It’s kind of a fraternity of failure: men and women united by a shared history of getting everything wrong, and refusing to admit it. Will they get the chance to add more chapters to their reign of error?

(A criminologist’s tip: contrary to all the cop shows on TV in which the boss pronounces “there is no such thing as a coincidence,” coincidences such as these are common. Anyone who understands statistics understands why.)
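The statistical point is the same one behind the classic birthday problem (a sketch added here for illustration; the numbers are standard probability results, not from the original post): among just 23 people, the odds that at least two share a birthday already exceed 50 percent, so a matching phrase across two writers drawing on a shared vocabulary is nothing remarkable.

```python
from math import prod

def p_shared_birthday(n: int, days: int = 365) -> float:
    """Probability that at least two of n people share a birthday.

    Computed as 1 minus the probability that all n birthdays are distinct,
    assuming birthdays are independent and uniformly distributed.
    """
    p_all_distinct = prod((days - k) / days for k in range(n))
    return 1 - p_all_distinct

# With only 23 people, a shared birthday is already more likely than not.
print(f"{p_shared_birthday(23):.3f}")  # prints 0.507
```

The intuition carries over directly: the number of *pairs* of people (or writers, or phrases) grows quadratically, so "coincidental" matches become likely far sooner than naive intuition suggests.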

Naked Capitalism on Perpetual War


William J. Astore: The American Military Uncontained, Chaos Spread, Casualties Inflicted, Missions Unaccomplished

Posted on May 15, 2015

Yves here. This post is an important, sobering description of how US military overreach became institutionalized.

By William J. Astore, a retired lieutenant colonel (USAF) who edits the blog The Contrary Perspective. Originally published at TomDispatch

It’s 1990. I’m a young captain in the U.S. Air Force.  I’ve just witnessed the fall of the Berlin Wall, something I never thought I’d see, short of a third world war.  Right now I’m witnessing the slow death of the Soviet Union, without the accompanying nuclear Armageddon so many feared.  Still, I’m slightly nervous as my military gears up for an unexpected new campaign, Operation Desert Shield/Storm, to expel Iraqi autocrat Saddam Hussein’s military from Kuwait.  It’s a confusing moment.  After all, the Soviet Union was forever (until it wasn’t) and Saddam had been a stalwart U.S. friend, his country a bulwark against the Iran of the Ayatollahs.  (For anyone who doubts that history, just check out the now-infamous 1983 photo of Donald Rumsfeld, then special envoy for President Reagan, all smiles and shaking hands with Saddam in Baghdad.)  Still, whatever my anxieties, the Soviet Union collapsed without a whimper and the campaign against Saddam’s battle-tested forces proved to be a “cakewalk,” with ground combat over in a mere 100 hours.

Think of it as the trifecta moment: Vietnam syndrome vanquished forever, Saddam’s army destroyed, and the U.S. left standing as the planet’s “sole superpower.”

Post-Desert Storm, the military of which I was a part stood triumphant on a planet that was visibly ours and ours alone.  Washington had won the Cold War.  It had won everything, in fact.  End of story.  Saddam admittedly was still in power in Baghdad, but he had been soundly spanked.  Not a single peer enemy loomed on the horizon.  It seemed as if, in the words of former U.N. ambassador and uber-conservative Jeane Kirkpatrick, the U.S. could return to being a normal country in normal times.

What Kirkpatrick meant was that, with the triumph of freedom movements in Central and Eastern Europe and the rollback of communism, the U.S. military could return to its historical roots, demobilizing after its victory in the Cold War even as a “new world order” was emerging.  But it didn’t happen.  Not by a long shot.  Despite all the happy talk back then about a “new world order,” the U.S. military never gave a serious thought to becoming a “normal” military for normal times.  Instead, for our leaders, both military and civilian, the thought process took quite a different turn.  You might sum up their thinking this way, retrospectively: Why should we demobilize or even downsize significantly or rein in our global ambitions at a moment when we can finally give them full expression?  Why would we want a “peace dividend” when we could leverage our military assets and become a global power the likes of which the world has never seen, one that would put the Romans and the British in the historical shade?  Conservative columnist Charles Krauthammer caught the spirit of the moment in February 2001 when he wrote, “America is no mere international citizen. It is the dominant power in the world, more dominant than any since Rome. Accordingly, America is in a position to reshape norms, alter expectations, and create new realities. How? By unapologetic and implacable demonstrations of will.”

What I didn’t realize back then was: America’s famed “containment policy” vis-à-vis the Soviet Union didn’t just contain that superpower — it contained us, too.  With the Soviet Union gone, the U.S. military was freed from containment.  There was nowhere it couldn’t go and nothing it couldn’t do — or so the top officials of the Bush administration came into power thinking, even before 9/11.  Consider our legacy military bases from the Cold War era that already spanned the globe in an historically unprecedented way.  Built largely to contain the Soviets, they could be repurposed as launching pads for interventions of every sort.  Consider all those weapon systems meant to deter Soviet aggression.  They could be used to project power on a planet seemingly without rivals.

Now was the time to go for broke.  Now was the time to go “all in,” to borrow the title of Paula Broadwell’s fawning biography of her mentor and lover, General David Petraeus.  Under the circumstances, peace dividends were for wimps.  In 1993, Madeleine Albright, secretary of state under Bill Clinton, caught the coming post-Cold War mood of twenty-first-century America perfectly when she challenged Joint Chiefs Chairman Colin Powell angrily over what she considered a too-cautious U.S. approach to the former Yugoslavia. “What’s the point of having this superb military that you’re always talking about,” she asked, “if we can’t use it?”

Yet even as civilian leaders hankered to flex America’s military muscle in unpromising places like Bosnia and Somalia in the 1990s, and Afghanistan, Iraq, Libya, Pakistan, and Yemen in this century, the military itself has remained remarkably mired in Cold War thinking.  If I could transport the 1990 version of me to 2015, here’s one thing that would stun him a quarter-century after the collapse of the Soviet Union: the force structure of the U.S. military has changed remarkably little.  Its nuclear triad of land-based ICBMs, submarine-launched SLBMs, and nuclear-capable bombers remains thoroughly intact.  Indeed, it’s being updated and enhanced at mind-boggling expense (perhaps as high as a trillion dollars over the next three decades).  The U.S. Navy?  Still built around large, super-expensive, and vulnerable aircraft carrier task forces.  The U.S. Air Force?  Still pursuing new, ultra-high-tech strategic bombers and new, wildly expensive fighters and attack aircraft — first the F-22, now the F-35, both supremely disappointing.  The U.S. Army?  Still configured to fight large-scale, conventional battles, a surplus of M-1 Abrams tanks sitting in mothballs just in case they’re needed to plug the Fulda Gap in Germany against a raging Red Army.  Except it’s 2015, not 1990, and no mass of Soviet T-72 tanks remains poised to surge through that gap.

Much of our military today remains structured to meet and defeat a Soviet threat that long ago ceased to exist.  (Occasional sparring matches with Vladimir Putin’s Russia in and around Ukraine do not add up to the heated “rumbles in the jungle” we fought with the Soviet leaders of yesteryear.)  And it’s not just a matter of weaponry.  Our military hierarchy remains wildly and unsustainably top-heavy, with a Cold War-style cupboard of generals and admirals, as if we were still stockpiling brass in case of another world war and a further expansion of what is already uncontestably the largest military on the planet.  If you had asked me in 1990 what the U.S. military would look like in 2015, the one thing I wouldn’t have guessed was that, in its force structure, it would look basically the same.

This persistence of such Cold War structures and the thinking that goes with them is a vivid illustration of military inertia, the plodding last-war conservatism that is a common enough phenomenon in military history.  It’s also a reminder that the military-industrial-congressional-complex that President Dwight Eisenhower first warned us about in 1961 remains in expansion mode more than half a century later, with its taste for business as usual (meaning, among other things, wildly expensive weapons systems).  Above all, though, it’s an illustration of something far more disturbing: the failure of democratic America to seize the possibility of a less militarized world.

Today, it’s hard to recapture the heady optimism of 1990, the idea that this country, as after any war, might at least begin to take steps to demobilize, however modestly, to become a more peaceable land.  That’s why 1990 should be considered the high-water mark of the U.S. military.  At that moment, we were poised on the brink of a new normalcy — and then it all began to go wrong.  To understand how, it’s important to see not just what remained the same, but also what began to change and just how we ended up with today’s mutant military.

Paramilitaries Without, Militaries Within, Civilian Torturers, and Assassins Withal

Put me back again in my slimmer, uniformed 1990 body and catapult me for a second time to 2015.  What do I see in this military moment that surprises me?  Unmanned aerial vehicles, or drones, for sure.  Networked computers everywhere and the reality of a military preparing for “cyberwar.”  Incessant talk of terrorism as America’s chief threat.  A revival, however haltingly, of counterinsurgency operations, or COIN, a phenomenon abandoned in Vietnam with a stake through its heart (or so I thought then).  Uncontrolled and largely unaccountable mass surveillance of civilian society that in the Cold War era would have been a hallmark of the “Evil Empire.”

More than anything, however, what would truly have shocked the 1990 version of me is the almost unimaginable way the military has “privatized” in the twenty-first century.  The presence of paramilitary forces (mercenary companies like DynCorp and the former Blackwater, now joined with Triple Canopy in the Constellis Group) and private corporations like KBR doing typical military tasks like cooking and cleaning (what happened to privates doing KP?), delivering the mail, and mounting guard duty on military bases abroad; an American intelligence system that’s filled to the brim with tens of thousands of private contractors; a new Department of Defense called the Department of Homeland Security (“homeland” being a word I would once have associated, to be blunt, with Nazi Germany) that has also embraced paramilitaries and privatizers of every sort; the rapid rise of a special operations community, by the tens of thousands, that has come to constitute a vast, privileged, highly secretive military caste within the larger armed forces; and, most shocking of all, the public embrace of torture and assassination by America’s civilian leaders — the very kinds of tactics and techniques I associated in 1990 with the evils of communism.

Walking about in such a world in 2015, the 1990-me would truly find himself a stranger in a strange land.  This time-traveling Bill Astore’s befuddlement could, I suspect, be summed up in an impolite sentiment expressed in three letters: WTF?

Think about it.  In 2015, so many of America’s “trigger-pullers” overseas are no longer, strictly speaking, professional military.  They’re mercenaries, guns for hire, or CIA drone pilots (some on loan from the Air Force), or warrior corporations and intelligence contractors looking to get in on a piece of the action in a war on terror where progress is defined — official denials to the contrary — by body count, by the number of “enemy combatants” killed in drone or other strikes.

Indeed, the very persistence of traditional Cold War structures and postures within the “big” military has helped hide the full-scale emergence of a new and dangerous mutant version of our armed forces.  A bewildering mish-mash of special ops, civilian contractors (both armed and unarmed), and CIA and other intelligence operatives, all plunged into a penumbra of secrecy, all largely hidden from view (even as they’re openly celebrated in various Hollywood action movies), this mutant military is forever clamoring for a greater piece of the action.

While the old-fashioned, uniformed military guards its Cold War turf, preserved like some set of monstrous museum exhibits, the mutant military strives with great success to expand its power across the globe.  Since 9/11, it’s the mutant military that has gotten the lion’s share of the action and much of the adulation — here’s looking at you, SEAL Team 6 — along with its ultimate enabler, the civilian commander-in-chief, now acting in essence as America’s assassin-in-chief.

Think of it this way: a quarter-century after the end of the Cold War, the U.S. military is completely uncontained.  Washington’s foreign policies are strikingly military-first ones, and nothing seems to be out of bounds.  Its two major parts, the Cold War-era “big” military (still very much alive and kicking) and the new-era military of special ops, contractors, and paramilitaries, seek to dominate everything.  Nuclear, conventional, unconventional, land, sea, air, space, cyber, you name it: all realms must be mastered.

Except that it can’t master the one realm that matters most: itself.  And it can’t deliver the one thing such an uncontained military was supposed to guarantee: victory, not in a single place anywhere on Earth.

Loaded with loot and praised to the rafters, America’s uncontained military has no discipline and no direction.  It never has to make truly tough choices, like getting rid of ICBMs or shedding its obscenely bloated top ranks of officers or cancelling redundant weapon systems like the F-35.  It just aims to do it all, just about everywhere.  As Nick Turse reported recently, U.S. special ops touched down in 150 countries between 2011 and 2014.  And the results of all this activity have been remarkably repetitive and should by now be tragically predictable: lots of chaos spread, lots of casualties inflicted, and in every case, mission unaccomplished.

The Future Isn’t What It Used to Be

Say what you will of the Cold War, at least it had an end.  The overriding danger of the current American military moment is that it may lack one.

Once upon a time, the U.S. military was more or less tied to continental defense and limited by strong rivals in its hegemonic designs.  No longer.  Today, it has uncontained ambitions across the globe and even as it continually stumbles in achieving them, whether in Iraq, Afghanistan, Yemen, or elsewhere, its growth is assured, as our leaders trip over one another in continuing to shower it with staggering sums of money and unconditional love.

No military should ever be trusted and no military should ever be left uncontained.  Our nation’s founders knew this lesson.  Five-star general Dwight D. Eisenhower took pains in his farewell address in 1961 to remind us of it again.  How did we as a people come to forget it?  WTF, America?

What I do know is this: Take an uncontained, mutating military, sprinkle it with unconditional love and plenty of dough, and you have a recipe for disaster.  So excuse me for being more than a little nervous about what we’ll all find when America flips the calendar by another quarter-century to the year 2040.

Jim Hightower: Save the Post Office from the Honchos


Stop Postal Executives from Destroying our Postal Service

Monday, May 11, 2015   |   Posted by Jim Hightower

When a big-name retailer finds its sales in a slow downward spiral, the geniuses in the executive suite often try to keep their profits up by cheapening their product and delivering less to customers.

To see how well this strategy works, look no further than the declining sales at Walmart and McDonald’s. When the geniuses in charge of these behemoths applied the cut-back strategy, their slow decline turned into a perilous nose-dive. You’d think their experience would keep other executives from making the same mistake, but here comes an even bigger – and much more important – retail behemoth saying, “We have to cut to survive.”

That’s the pronouncement last year by the honcho of the US Postal Service, which has been eliminating employees, closing facilities, and reducing services for years. Each new round of reductions drives away more customers, which causes clueless executives to prescribe more cuts. In a January decree, USPS virtually eliminated overnight delivery of first-class mail, and it’s now planning to close or consolidate 82 regional mail processing plants. This means fewer workers handling the nation’s growing load of mail, creating further delays in delivery. The answer to this, say the slaphappy executives, is – guess what? – to cut even more “service” out of postal service. They want to close hundreds of our local post offices and eliminate Saturday mail delivery (which is one of USPS’ major competitive advantages).

This is Jim Hightower saying… Fed up with the deliberate degradation of this vital public service, postal workers themselves are putting forth a vision and innovative plan not merely for USPS to survive, but to thrive. With more than 70 other national groups, they’ve forged “A Grand Alliance to Save Our Public Postal Service.” To be part of its actions, go to:

“APWU Asks Union Members to Build Support for Postal Bills,” American Postal Workers Union, February 19, 2015.


Humor: The Borowitz Report

Scientists: Earth Endangered by New Strain of Fact-Resistant Humans



MINNEAPOLIS (The Borowitz Report) – Scientists have discovered a powerful new strain of fact-resistant humans who are threatening the ability of Earth to sustain life, a sobering new study reports.

The research, conducted by the University of Minnesota, identifies a virulent strain of humans who are virtually immune to any form of verifiable knowledge, leaving scientists at a loss as to how to combat them.

“These humans appear to have all the faculties necessary to receive and process information,” Davis Logsdon, one of the scientists who contributed to the study, said. “And yet, somehow, they have developed defenses that, for all intents and purposes, have rendered those faculties totally inactive.”

More worryingly, Logsdon said, “As facts have multiplied, their defenses against those facts have only grown more powerful.”

While scientists have no clear understanding of the mechanisms that prevent the fact-resistant humans from absorbing data, they theorize that the strain may have developed the ability to intercept and discard information en route from the auditory nerve to the brain. “The normal functions of human consciousness have been completely nullified,” Logsdon said.

While reaffirming the gloomy assessments of the study, Logsdon held out hope that the threat of fact-resistant humans could be mitigated in the future. “Our research is very preliminary, but it’s possible that they will become more receptive to facts once they are in an environment without food, water, or oxygen,” he said.


Naked Capitalism on Our Shaky Banking System


Mr. Market Says Dodd-Frank Isn’t Working

Posted on May 11, 2015 by

Yves here. I am not convinced that breaking up big banks solves the “too big to fail” problem: hedge fund LTCM nearly brought down the financial system in 1998, and Continental Illinois, the #4 bank in 1984 and comparatively small and simple by modern standards, took seven years to resolve. Nevertheless, breaking up the banks would be a big step in the right direction. One advantage is not just reducing the size of firms but increasing their diversity. Andrew Haldane of the Bank of England warned that one of the big sources of instability in our modern financial system is monoculture, in which the biggest firms pursue virtually identical strategies and use similar risk models (not just VaR, but FICO in their retail businesses) and trading approaches. Similarly, having more specialized players also means more contested regulatory demands, as players with different customers and products jockey for regulatory advantage.

By Alexander Arapoglou, a professor of finance at the University of North Carolina’s Kenan-Flagler Business School, who has been a derivatives trader and head of risk management worldwide for various global financial institutions, and Jerri-Lynn Scofield, who has worked as a securities lawyer and a derivatives trader

The objective of the 2010 Dodd-Frank legislation and other post-financial crisis regulatory reforms was to make the too big to fail banks so safe that they could not fail. Has this goal been achieved? The rating agencies answer no. If these agencies were convinced that the plethora of new rules – including increased capital requirements – had led big banks to achieve unquestioned credit standing, their bonds would be rated AAA.

Instead, bond ratings for the top 5 US financial institutions now hover around single A. These ratings scream to anyone who’s listening that the too big to fail institutions may indeed fail. Mr. Market agrees: debt issued by too big to fail banks currently trades at prices consistent with their credit ratings.

Reforms undertaken since the demise of Bear Stearns and Lehman Brothers have failed on several fronts. The too big to fail banks are not fail-safe. They are more brittle and less able to act as shock absorbers than they were before. Holders of their shares are not deploying capital efficiently and small business is starved of financing. Where did regulatory policy take a wrong turn?

The missteps began during the financial crisis, when regulators were faced with two choices. The first would have turned back the clock on deregulation and re-erected walls between securities sales and trading, asset management, and commercial and retail banking. After such restructuring, smaller, more specialized financial firms would pose little risk to the rest of the economy if any failed.

The alternative approach– the one that was followed– was to accept that there were institutions that were too big to fail. Making these giant institutions safer became the regulatory priority. New rules were enacted and more aggressive and intrusive regulatory scrutiny mandated so, it was hoped, to prevent financial institutions from shooting themselves in the foot.

As a result, bank examiners have moved into the offices of many financial firms full time. They attend board meetings and drive business priorities by asking questions. Decisions are scrutinized lest they encourage untoward risk taking. The list of well-intentioned ideas goes on and on. But to what end?

This matters because since 2008, too big to fail institutions have become much larger, posing an even greater risk to the economy than they did before. Bigger banks continue to increase their market share, the number of community banks has declined by 40%, and since June 2010, 500 of these have failed outright.

As banks have got bigger, many market participants— including at least one bank CEO– have conceded that these behemoths have become too big to manage. Goldman Sachs has reported that even JP Morgan Chase, one of the most profitable big banks, would be worth more broken into parts than kept whole– as is true with many conglomerates. The profitability of smaller, more narrowly-focused financial institutions usually exceeds that of institutions that follow a universal banking model, partly due to requirements that systemically important institutions maintain extra capital and overhead.

So how are we left? Dodd-Frank has sidestepped dealing with the central problem – a concentration of systemic risk that hangs over the real economy. Bank shareholders are worse off. Excessive capital requirements have burdened the economy without any offsetting increase in safety. The benefits to the broader economy of greater competition and better distribution have been forfeited without any offsetting gain. The corporate bond market has lost liquidity, adding costs and risk to the overall system for financing jobs, pensions, university endowments and insurance. And the decline of community banks has fallen hardest on small businesses, America’s biggest employer.

If regulators continue to stumble down the wrong regulatory road, their next step might be to limit competition further, raising prices to guarantee bank profitability. This approach would certainly be safe, yet it would be costly, not only in terms of the cost of bank services, but it would also stymie new business formation.

There is an alternative: not to turn the clock back blindly, but to examine first what it was that made the Depression-era Glass-Steagall financial structure so robust. The regulators of that time quite rightly focused on preventing conflicts of interest and financial contagion. Their solution was to prevent the otherwise inevitable consolidation and oligopoly that follows when single firms are allowed to offer universal banking services by instead splitting up the largest financial firms and confining them to separate lines of business. More modern experts on regulation, such as Senator Elizabeth Warren, just last week again endorsed the general Glass-Steagall approach.

That framework wasn’t perfect; while it separated the domestic securities business from domestic banking, large banks continued to have securities operations in London and Tokyo. Yet while such regulations were firmly in place, the broader economy was insulated from boom-bust financial cycles generated by bank failures.

A modern version would be more far reaching. The most important banking function is the clearing function. If the financial system crashes, paychecks and payments for industrial and other supplies are “lost” and the entire economy would grind to a halt. Thus, as a first step, clearing and custody functions should be separated from everything else. They must be protected and restructured so as to best survive any future systemic shock.

But reform must go further. Trading of derivatives, securities and foreign exchange involves significantly more risk than the rest of banking. These operations should also be segregated — and more completely than in the watered-down Volcker rule that the Federal Reserve has more or less indefinitely deferred. Likewise, asset management activities should be conducted in distinct companies to avoid self-dealing. Brokerage functions should be spun off to avoid conflicts. Here, Eliot Spitzer well understood more than a dozen years ago that allowing one firm to undertake investment banking and sell securities sparked practices that inflated the firm’s bottom line at the expense of its brokerage customers. Many current regulators still have failed to absorb this lesson. Mergers and acquisitions activity should be made apart from lending decisions.

Breaking up the biggest banks would eliminate the too big to fail problem. Yet that wouldn’t be the only gain. If Goldman is right, this approach would benefit bank shareholders as well. A better plan where everyone benefits…. What’s not to like?


LUV News: Monsanto’s Killer Weed Killer


Monsanto Is Threatening the Safety of Children With Its Toxic Weed Killers

by Mary Ellen Kustin, Soren Rundquist / Environmental Working Group

Genetically engineered crops, or GMOs, have led to an explosion in growers’ use of herbicides, with the result that children at hundreds of elementary schools across the country go to class close by fields that are regularly doused with escalating amounts of toxic weed killers.

GMO corn and soybeans have been genetically engineered to withstand being blasted with glyphosate – an herbicide that the World Health Organization recently classified as “probably carcinogenic to humans.” The proximity of many schools to fields blanketed in the chemical puts kids at risk of exposure. But it gets worse.

Overreliance on glyphosate has spawned the emergence of “superweeds” that resist the herbicide, so now producers of GMO crops are turning to even more harmful chemicals. First up is 2,4-D, a World War II-era defoliant that has been linked to non-Hodgkin lymphoma, Parkinson’s disease and reproductive problems. Young children are especially vulnerable to it.

A new EWG interactive map shows the amounts of glyphosate sprayed in each U.S. county and tallies the 3,247 elementary schools that are located within 1,000 feet of a corn or soybean field and the 487 schools that are within 200 feet. Click on any county on the map to see how much GMO corn and soy acreage has increased there as well as the number of nearby elementary schools.

The 15 states outlined on the map across the center of the country are the ones where the Environmental Protection Agency has approved the use of Dow AgroSciences’ Enlist Duo – a combination of glyphosate and 2,4-D – on GMO corn and soybeans engineered to tolerate both weed killers.

The chart shows the 10 states with the most elementary schools within 1,000 feet of a corn or soybean field. These states account for 53 percent of the total acreage planted with genetically engineered corn and soy. EPA has approved the use of Enlist Duo in seven of them.


The inescapable connection between GMO crops and increased use of toxic herbicides is one reason why many people want to know whether the products they buy contain GMOs. Polls show that more than 90 percent of consumers favor labeling GMOs, but without a mandatory labeling law, they have no way to know for sure.


EWG approximated school locations using the ESRI landmark shapefile for schools, derived from the U.S. Geological Survey Geographic Names Information System – Schools layer. These are considered the best available data for school locations. The data were filtered to the best of EWG’s knowledge to include only locations whose attributed name reflects an operating elementary school, but they may inadvertently include some free-standing school administrative offices or buildings that formerly housed schools but are now in other use.

Zones within 200 feet and 1,000 feet of each school were delineated using the school’s point location in the ESRI data, not the physical footprint of the school grounds. As a result, EWG’s analysis may over- or under-estimate the exact distance of school grounds to the boundaries of nearby corn or soybean fields. School locations were evaluated for proximity to the boundaries of corn and soybean fields as delineated in the USDA 2013 cropland data layer (30-meter resolution).
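As a rough illustration of the proximity test described above, the core operation is a point-to-boundary distance check. This is my sketch, not EWG’s actual GIS workflow: EWG used ESRI point data and the USDA cropland layer, while the coordinates and field shape below are invented, planar, and measured in feet.

```python
# Toy version of the school-to-field proximity check described above.
# Assumes a flat plane in feet and an axis-aligned rectangular field;
# real GIS analysis works with projected coordinates and raster field data.

def distance_to_field(school, field):
    """Distance in feet from a school point (x, y) to the nearest edge
    of a rectangular field given as (xmin, ymin, xmax, ymax)."""
    x, y = school
    xmin, ymin, xmax, ymax = field
    dx = max(xmin - x, 0, x - xmax)  # 0 if x lies within the field's x-span
    dy = max(ymin - y, 0, y - ymax)  # 0 if y lies within the field's y-span
    return (dx * dx + dy * dy) ** 0.5

school = (0.0, 0.0)                     # hypothetical school location
field = (150.0, -500.0, 5000.0, 500.0)  # hypothetical soybean field

d = distance_to_field(school, field)
print(d <= 1000)  # within the 1,000-foot zone
print(d <= 200)   # also within the 200-foot zone
```

Note that, exactly as EWG cautions, a point location for the school rather than its physical footprint can over- or under-estimate the true distance from the school grounds to the field boundary.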

EWG acknowledges that spatial analyses of this kind may include some level of error (such as incorrect or outdated school or crop field locations or boundaries) even with standard, best available data sources. EWG welcomes information to revise and correct any locational errors in the underlying data.

Data on estimated glyphosate use were drawn from the U.S. Geological Survey’s Estimated Annual Agricultural Pesticide Use for Counties of the Conterminous United States (2008-2012 & 1992-2009). According to the USGS, “Pesticide use estimates from this study are suitable for making national, regional, and watershed assessments of annual pesticide use, however the reliability of estimates generally decreases with scale.”

Data on the acreage of genetically modified corn and soybeans were assembled by extrapolating from county-planted acreage using state percentages of biotech varieties by crop, as reported by the USDA. For corn, state level “herbicide resistant” + “stacked gene” varieties were used to extrapolate county-level planted acreage. If a state was not specifically listed in the USDA NASS Acreage Report, the category “Other” was used in the extrapolation. For soybeans, the state-level “all biotech varieties” was used to extrapolate planted acres at the county level. If a state was not specifically listed in the USDA NASS Acreage Report, the category “Other” was used in the county extrapolation.
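The extrapolation described above is simple arithmetic: multiply county planted acres by the state’s biotech share for that crop, falling back to the “Other” category for unlisted states. Here is a hedged sketch; the shares and acreages below are invented for illustration, and the variable names are mine, not USDA’s.

```python
# Sketch of the county-level GMO acreage extrapolation described above.
# All figures are invented; real shares come from the USDA NASS Acreage Report.

# For corn, the share is "herbicide resistant" + "stacked gene" varieties;
# for soybeans, "all biotech varieties". "Other" stands in for states
# not listed individually in the report.
state_biotech_share = {
    ("Iowa", "corn"): 0.76 + 0.14,  # herbicide-resistant + stacked-gene
    ("Iowa", "soybeans"): 0.97,     # all biotech varieties
    ("Other", "corn"): 0.85,
    ("Other", "soybeans"): 0.93,
}

def gmo_acres(county_planted_acres, state, crop):
    """Extrapolate county GMO acreage from the state biotech percentage,
    falling back to the "Other" category when the state isn't listed."""
    share = state_biotech_share.get(
        (state, crop), state_biotech_share.get(("Other", crop)))
    return county_planted_acres * share

print(gmo_acres(120_000, "Iowa", "corn"))   # ~108,000 acres (0.90 share)
print(gmo_acres(50_000, "Kansas", "corn"))  # falls back to "Other" (0.85 share)
```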

Naked Capitalism: Milking the Poor

Notes for an Elite Playbook: The Self-Licking Ice Cream Cone

By Lambert Strether of Corrente.

Recently, in “Control Fraud and For-Profit “Universities” (Et Tu, Bill Clinton?)” I took Bill Black’s formula for accounting control fraud, modified it, and showed how the modification could be used to describe the business practices of two for-profit universities.

The “Self-Licking Ice Cream Cone” Defined

The phrase “self-licking ice cream cone” was first used by S. Pete Worden, in the Proceedings of the 7th Cambridge Workshop on Cool stars, stellar systems, and the sun (!), 1992, who uses the Space Shuttle as an example:

“The Self-Licking Ice Cream Cone”

Since NASA effectively works for the most porkish part of Congress, it is not surprising that their programs are designed to maximize and perpetuate jobs programs in key Congressional districts. The Space Shuttle-Space Station is an outrageous example. Almost two-thirds of NASA’s budget is tied up in this self-licking program. The Shuttle is an unbelievably costly way to get to space at $1 billion a pop. The Space Station is a silly design. Yet, this Station is designed so it can only be built by the Shuttle and the Shuttle is the only way to construct the Station. Furthermore, the Shuttle has to be “improved” to support the Station with a new solid rocket motor which is to be built, you guessed it, in the District of the Chairman of the House Appropriations Committee. Since there are tens of thousands of jobs tied up in these programs and most of NASA’s budget as well, there is not only no money to get out of this endless do-loop, there are positive political pressures to make sure we don’t get out.

Ben Brody gives a useful definition and a second example in his article, “The definitive glossary of modern US military slang” (hat tip, Another Word for It):

A military doctrine or political process that appears to exist in order to justify its own existence[1], often producing irrelevant indicators of its own success. For example, continually releasing figures on the amount of Taliban weapons seized, as if there were a finite supply of such weapons. While seizing the weapons, soldiers raid Afghan villages, enraging the residents and legitimizing the Taliban’s cause.

Note that both writers begin with a sense of puzzlement and outrage: Self-Licking Ice Cream Cones (SLICCs) lack justification, by definition, yet exist. Now let’s take a moment to critique both Brody’s definition and Worden’s usage. (I know it seems churlish to critique the inventor of a term for his own usage of it, but the interests of science are paramount.)

Brody’s definition is both too broad and too shallow. It’s too broad, because what system — especially a political and/or economic system — does not appear to exist in order to justify its own existence? But it’s too shallow, since although Brody presents what I would call a correct fact set, he doesn’t make the obvious conclusion explicit. He writes: “Soldiers raid Afghan villages, enraging the residents and legitimizing the Taliban’s cause,” but he doesn’t conclude that the raids are creating more Taliban (creating more raids (creating more Taliban (….))), in the iterative process Chalmers Johnson called blowback, which is in fact a feedback loop. However, Brody does notice the corrupt — not “irrelevant” — metrics[2] that instrument the system: at each turn round the loop, we capture more weapons, so we must be succeeding, right? (Brody also limits usage of the term to the military, which I do not propose to do.)

Worden’s usage includes the loop that Brody misses; he calls it, in good scientific FORTRAN style, a “do loop.” But Worden’s terminology is revealing, since a “do loop” is not necessarily a feedback loop, as Brody’s is. A system with feedback loops inserts its results back into itself; it’s recursive (like the negative feedback loop that connects a thermostat to a furnace, or the positive feedback of a squealing microphone or a Minskyian “deviation amplifying system”). Worden’s implicit definition, then, is also too broad and too shallow. Too broad (like Brody’s) because the world is full of systems that go round and round and round, stable, because of negative feedback, like my furnace (hopefully). And too shallow because, absent the notion of positive feedback, we can’t model the kind of feedback that can make a bad situation worse (and then worse (and then worse (….))).
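The distinction between negative and positive feedback can be sketched in a few lines of code. The simulation below is mine, not Worden’s or Brody’s, and every constant in it is invented for illustration: the first loop corrects against its error and converges, like the thermostat; the second feeds its output back with the same sign and runs away, like blowback.

```python
# Negative vs. positive feedback: a minimal sketch.
# All constants are illustrative, not drawn from either author.

def thermostat(temp, setpoint=20.0, gain=0.5, steps=20):
    """Negative feedback: the error is fed back with opposite sign,
    so temperature converges toward the setpoint."""
    for _ in range(steps):
        temp += gain * (setpoint - temp)  # correction opposes the error
    return temp

def blowback(taliban=100, raids_per_fighter=0.1, recruits_per_raid=2, steps=10):
    """Positive feedback: each raid creates more fighters, which provoke
    more raids, which create more fighters; growth reinforces itself."""
    for _ in range(steps):
        raids = taliban * raids_per_fighter
        taliban += raids * recruits_per_raid
    return taliban

print(round(thermostat(5.0), 2))  # settles at the setpoint, 20.0
print(round(blowback()))          # grows geometrically, without bound
```

A plain “do loop” could describe either; only the sign of the feedback tells you whether the system is a furnace or a squealing microphone.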

So I would like to propose a different definition:

“Solutions” that amplify, to a rentier’s profit, the very “problem” they claim to solve.

This approach has the merit of including the feedback loop but making it positive (“amplify”), connoting false justification (via the irony quotes shrouding “solutions” and “problems”), and including a metric (profit![3]) that’s far more appropriate than weapons counts or jobs, which are mere proxies for profit. In addition, I say “rentier”[4] since, again by definition, a SLICC is about business and not industry; hence the false justifications. A business model that sold bottled water to people after polluting their wells would be the mother of all SLICCs.

Ferguson as a Self-Licking Ice Cream Cone

The ills of the Ferguson law enforcement system are well-known[5], but let’s take a quick look at them through the SLICC frame. (Quartz has a fine article “by the numbers.”) NPR summarizes:

A new report released the week after 18-year-old Michael Brown was shot and killed in Ferguson helps explain why. ArchCity Defenders, a St. Louis-area public defender group, says in its report that more than half the courts in St. Louis County engage in the “illegal and harmful practices” of charging high court fines and fees on nonviolent offenses like traffic violations — and then arresting people when they don’t pay. The report singles out courts in three communities, including Ferguson.

Last year, Ferguson collected $2.6 million in court fines and fees. It was the city’s second-biggest source of income of the $20 million it collected in revenues.

Earlier this year, in the series Guilty and Charged, NPR’s investigations unit found that the practices in Ferguson are common across the country. The series reported that nationwide, the costs of the justice system are billed increasingly to defendants and offenders, and that this creates harsher treatment of the poor. Because people with money can pay their hundreds or thousands of dollars in fines and fees right away, they are usually done with the court system.

People who can’t pay their fines and fees go on payment plans. But then there are extra fees, sometimes interest — 12 percent on felonies in Washington state — and, if poor people fall behind on payments, they may go to jail. Courts often ignore laws, Supreme Court rulings and protections that outlaw the equivalent of debtors prisons.

In Ferguson, Harvey says going to court creates more anger. The system, he says, favors people who can hire a lawyer. But poorer defendants simply take a guilty plea.

“And then if you can’t pay all the fines at once, they put you on a pay docket, and that just means [you] come to the court once a month and pay a certain dollar amount or explain why you haven’t paid,” Harvey says.

But the ticket may be in a far-away court that’s not easy to get to in a region with sometimes spotty public transportation. If someone doesn’t pay, a warrant can be issued for their arrest.

Jeff Smith, an assistant professor at the New School and a former Missouri state senator from St. Louis, says Ferguson “facilitates a debtors prison” because of the high number of arrest warrants that get issued when people don’t pay. When people go to jail, they sometimes lose their jobs.

“They get caught in this downward spiral, and it happens to a lot of people. This stuff accumulates,” he says.

“It’s a risk to go to the store,” says [Better Family Life CEO] “Outside of that community, it’s a risk to go to any educational institution, to get a job, to go for job interviews. Especially since most of the jobs are maybe 5 to 10 miles away. So some of them just don’t even try anymore.”

Mother Jones describes the same dynamic:

Court fines for minor infractions tend to snowball. For example, drivers accumulate points for speeding, rolling through stop signs, or driving without insurance. You can pay to wipe your record, which is pricey. If you can’t afford to, and rack up enough points, your license will be suspended and your insurance costs will probably jump. Need to get to work? If you’re caught driving with a suspended license, your court fines increase, you gain more points, and your suspension is lengthened. That’s how rolling through a stop sign could end up costing you your job, messing up your degree plans, and more.
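The arithmetic of that snowball is easy to sketch. In the hypothetical model below, the only number taken from the reporting above is the 12 percent annual interest rate, which NPR cites for felonies in Washington state; the fine amounts, the flat late fee, and the payment sizes are all invented for illustration.

```python
# Hypothetical sketch of how a fine compounds for someone who can only
# pay a little each month. The 12% annual rate is the felony rate NPR
# cites for Washington state; all other figures are illustrative.

def months_to_pay_off(fine, monthly_payment, annual_rate=0.12, late_fee=25.0):
    """Return the months needed to clear a fine when each month on the
    pay docket adds interest plus a flat fee; None if the debt never shrinks."""
    monthly_rate = annual_rate / 12
    balance, months = fine, 0
    while balance > 0:
        balance += balance * monthly_rate + late_fee  # interest and fee accrue
        balance -= monthly_payment
        months += 1
        if months > 600:  # fifty years: effectively unpayable
            return None
    return months

print(months_to_pay_off(500, 100))  # -> 7: a payer with slack clears it quickly
print(months_to_pay_off(500, 30))   # -> None: accruals match the payment exactly
```

At $30 a month, the payment only covers the interest and the fee, so the balance never moves: that stalemate, plus an arrest warrant for any missed month, is the downward spiral in miniature.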

So, the problem — or “problem” — is revenue, right? Well, no, not exactly. Emerson Electric is a corporation with $24 billion in revenue, and its property tax valuation is “rock bottom.” And the solution — or “solution” — is turning law enforcement into a business, just as Reason’s Robert Poole advocated, right? How’s that working out?

And in the middle — between the “problem” and the “solution” — we’ve got the “amplification,” the positive feedback, where the “solution” makes the “problem” worse (the “snowball,” as Mother Jones calls it; the “downward spiral,” as NPR has it). I can see several:

1) The downward spiral of those arrested: They pay interest on their fines, and if they fail to make a payment, they’re arrested and imprisoned, from which they can escape only by making more payments;

2) The downward spiral of the community: People can’t risk “going to the store,” let alone getting a job or an education. Even leaving the human element aside, I can’t imagine that’s good for property and hence property taxes, or for sales taxes (which Ferguson apparently has).

3) The downward spiral of the municipality: And then, of course, there’s the Michael Brown shooting and subsequent events, which, again leaving the human element aside, can’t be good for revenues either (absent some future real estate development along West Florissant, and it would be very interesting to know which insiders know which properties, if any, are up for that).

Given that Ferguson was turned by its local elites into a giant debtors prison with “law enforcement” transformed into a collection agency — and with the debtors disproportionately black — I don’t see how policymakers could have imagined that anything other than what happened would happen. An explosion is often the outcome of a feedback system in runaway mode.

Oh, and the rentiers, and that pesky metric, profit. Forbes:

According to its 2014 Comprehensive Annual Financial Report (CAFR), the 21,000-person city has $25.9 million in total debt, or well over $1,000 per resident. About $3 million of this came from building an aquatic park; another $6.2 million is for tax increment financing bonds on a private mixed-use redevelopment project. The city also gave employees 6% raises that year, and like many others, grants defined-benefit pensions to retirees. … Money was spent on a boondoggle here and a subsidy there; employee pay raises were granted in an era of private sector wage stagnation; and the city spent over 5% of its $17.8 million budget on interest.
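As a quick back-of-the-envelope check, the Forbes figures quoted above are internally consistent (this uses only the numbers in the quote):

```python
# Sanity check on the Forbes figures quoted above.
total_debt = 25.9e6   # "$25.9 million in total debt"
residents = 21_000    # "the 21,000-person city"
budget = 17.8e6       # "$17.8 million budget"

per_resident = total_debt / residents
print(round(per_resident))    # ~1233: indeed "well over $1,000 per resident"
print(round(budget * 0.05))   # 890000: 5% of the budget, a floor on annual interest
```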

My point here is merely to show that Ferguson fits my definition of a SLICC — “to a rentier’s profit” — and not to quarrel with or justify the funding decisions of the Ferguson town government. That said, Emerson Electric is mysteriously absent from the Forbes discussion, and the “boondoggles” look like the sort of boondoggles any desperate small town makes, in the attempt to right itself; no worse than anywhere else. Moreover, the 2008 financial collapse is also mysteriously absent. Bloomberg:

Violent unrest that captured global attention is revealing Ferguson, Missouri, as a city still struggling to mend its finances more than five years after the end of the longest U.S. recession since the 1930s. … Ferguson acknowledged in its budget last year that “the recovery has been extraordinarily slow” and it has struggled to collect revenue. After 2007, the city lost almost $1.5 million annually in sales taxes and hasn’t fully recovered, according to the document.

So, whatever a solution for Ferguson might be, we can be sure that the “solution” is not law enforcement for profit.

Conclusion and Exhortation

Readers, I wonder if you can give more examples of Self-Licking Ice Cream Cones? And more importantly, can you refine the definition? Are Accounting Control Fraud and Self-Licking Ice Cream Cones really commensurate?

Even more importantly, can you propose other plays? I think reverse engineering a playbook out of observed elite behavior would be very useful; we might be able to skate, as it were, where the hockey puck is going to be, instead of where it is.


[1] Wikipedia’s definition is similar and has similar weaknesses, besides being even more teleological: “A self-perpetuating system that has no purpose other than to sustain itself.”

[2] Tarak Barkawi in Al Jazeera:

Neoliberalism, with its audit culture and fetish for short term quantitative indicators, is a mass production facility for self-licking cones. Everywhere bottom line measures of “efficiency” shape the activities of organisations and determine career advancement, selecting the kind of people and personalities who prosper in the system.

[3] The archetypal play — often cited during the era — could be South Park’s famous “Underpants Gnomes” (non-) business plan. Consider this a template:

The process, as explained by the gnomes, goes something like this:

Step 1. Collect underpants.

Step 2. ?????

Step 3. PROFIT

Of course, our elites tend to have a much firmer grasp on Step 2, and its substeps, than the Underpants Gnomes do; the Playbook is there to flesh that part out (besides substituting some good for the Gnomes’ primitive accumulation of underpants).

[4] Rentier: “[A] person living on income from property or investments.” Both of the examples I give below involve state action. Holders of government bonds are rentiers by definition, and benefit at least from municipal bonds in Ferguson, as well as from conflict investment.

[5] Marginal Revolution, to their credit, was strong on this issue: see “Ferguson and the Modern Debtor’s Prison.” However, it would have been nice to have some acknowledgement of how libertarians created the ideological justifications for the system whose outcomes they now decry.

[6] Autocoprophagous gets at the same idea, but as a word, it’s just too fancy. I myself would go so far as to define the creation of SLICCs as the very definition of corruption — far more so than cash in a white envelope, or even a job for a family member.

Counterpunch: Our Robber-Baron Society



There are two economies in the USA.  The official one, in which capitalism is wonderful, with money falling from the trees on everyone—you learn this from any college economics 101 course. 

And there is the real economy, in which the people who work the hardest, sweating and bleeding on the lower economic rungs of the working class, die the youngest from stress—and those who inherit from robber barons live long stress-free lives in splendor off the sweat and blood of the working class, never so much as lifting a finger from womb to tomb.

The early capitalists were smarter, actually paying some of the taxes themselves and allowing the wages of the workers, who generate the wealth, to increase to a near living wage, even toward the bottom rungs of the “economic ladder.” That is, until 1973, when greed took over; wages have dropped since then for average workers, and for those at the minimum wage, since 1968. What happened to the money? The capitalists, those who live on investments, have it, and are taking a larger share of the pie.

The early capitalists were closer to the French revolution and could see what the masses can do to an aristocracy, but today’s plutocrats are ignorant of guillotines, and more ruthless in their greed.

Mike Whitney explores the current state of the economy, following.  —Jack Balkwill, LUV News

Obama’s “No Growth, No Jobs, No Recovery” Economy Gives Up The Ghost


The world’s biggest economy ground to a standstill in the first quarter of 2015, wracked by massive job losses in the oil sector, falling personal consumption, weak exports and droopy fixed investment. Real gross domestic product (GDP), the value of the production of goods and services in the US, increased at an abysmal annual rate of just 0.2 percent in Q1 ’15 according to the Bureau of Economic Analysis, demonstrating conclusively that 6 years of zero rates and Large-Scale Asset Purchases (LSAP) — which have enriched stock speculators, inflated the largest asset-price bubble in history, and exacerbated inequality to levels not seen since the Gilded Age — have done nothing to improve the real economy, boost demand or reduce unemployment. As the BEA data illustrates, the US economy is basically DOA, a victim of criminal congressional negligence and Central Bank chicanery.

From the BEA release: “The deceleration in real GDP growth in the first quarter reflected a deceleration in PCE, downturns in exports, in nonresidential fixed investment, and in state and local government spending, and a deceleration in residential fixed investment that were partly offset by a deceleration in imports and upturns in private inventory investment and in federal government spending.”

Translation: The economy is in the shitter. Consumers aren’t spending because the crap-ass jobs they landed after the crisis pay half as much as the jobs they lost when Wall Street blew up the financial system. Personal savings are up and spending is down because households face an uncertain future where pensions are being trimmed and Social Security is under attack. Also, spending is impacted by the historic low (employment) participation rate which indicates that joblessness is much higher than the government’s phony numbers suggest. When workers are unemployed they don’t spend, activity drops, and the economy tanks. It’s that simple. Today’s data just confirms what most people already know, that the economy stinks and that they’re being ripped off by a voracious oligarchy that’s stacked the deck in their favor.

The US economy is stuck in the mud because our bought-and-paid-for congress has relinquished all authority and handed over the management of the economy to the industry-controlled Federal Reserve. Whereas our current budget deficits are in the range of 2 percent per annum, the government should be spending a lot more to compensate for the slowdown in private sector spending and investment. In the past, the congress and president would initiate sensible Keynesian fiscal stimulus programs to keep the economy sputtering along while households repaired their balance sheets or businesses struggled with weak demand. Those tried-and-true remedies have been jettisoned for the new monetarist orthodoxy that requires that all the nation’s wealth be filtered through the Wall Street casino so that the pampered thieves who destroyed the country with their mortgage-securities-Ponzi-scam be further rewarded for their insatiable greed.

Manufacturing, retail sales, MBA purchase applications, business investment etc, are all in the toilet. There’s a very good chance the economy is already in recession which will undoubtedly send stocks even higher since every proclamation of bad news generates a buying frenzy by clever speculators who anticipate that the Fed will continue to extend the zero rates and easy money to infinity.

It’s worth noting that the economy had been hanging on by the skin of its teeth mainly due to strong activity in the oil patch, where credit expansion, intensive corporate investment, and high-paying jobs (which supported 4 additional jobs in the local economy!) contributed more than $200 billion per year to GDP. Now domestic oil production is in deep distress. Layoffs recently surpassed the 100,000 milestone (See: Oil Layoffs Hit 100,000 and Counting, Wall Street Journal) and borrowing has dried up. Economist Warren Mosler explains the impact the cutbacks in domestic oil have had on GDP in this video from RT that I have transcribed:

“The price drop in oil has turned out to be the unambiguous negative that we had talked about before… where income saved by one consumer is lost by another. For every dollar not spent by one consumer, another doesn’t get it. You’re just left with the collapse in capital expenditures (business investment). It turns out, there was about $150 billion borrowed in the sector last year, driving what modest growth we had last year. Since that disappeared, all the numbers have been going straight down. Unless something steps up to the plate to replace the lost borrowing-to-spend from chasing $100 oil, I see no hope whatsoever.” (Warren Mosler Interview, RT)

Economic recovery requires credit expansion, business investment and jobs. All three of these were severely impacted by Obama’s goofy plan to push down oil prices in order to destroy the Russian economy. Here’s a brief summary:

“John Kerry, the US Secretary of State, allegedly struck a deal with King Abdullah in September under which the Saudis would sell crude at below the prevailing market price. That would help explain why the price has been falling at a time when, given the turmoil in Iraq and Syria caused by Islamic State, it would normally have been rising.” (Stakes are high as US plays the oil card against Iran and Russia, Larry Elliott, Guardian)

As indicated by today’s ghastly GDP data, Obama not only shot himself in the foot, he might have blown off his whole leg. Aside from the colossal growth in private inventories — which will be a drag on future growth — today’s report was nothing short of a disaster.

Jim Hightower on Smearing Environmentalists


“Dr. Evil” turns out to be “Dr. Silly”

Friday, April 24, 2015   |   Jim Hightower

Imagine a political campaign against environmentalists that’s so negative, so ridiculously slanted and downright dirty, that it actually repulsed executives of some of America’s biggest fracking corporations.

Wow – it’s got to take a big wad of ugly to gag a fracker! But in the gross world of political rancor, few cough up hairballs as foul as those produced by Rick Berman. He specializes in taking secret funding from major corporations to publicly slime environmentalists, low-wage workers, and anyone else perceived by his corporate clients as enemies.

So last year, Berman was in Colorado Springs, at a meeting of Big Oil frackers about his down and dirty plan to smear and ridicule the grassroots enviros who’ve dared to oppose the fracking of Colorado’s land, water, people, and communities. Dubbing the campaign “Big Green Radicals,” the Berman team revealed that their PR firm had dug into the personal lives of Sierra Club board members, looking for tidbits to embarrass them. Gut it up, Berman cried out to the executives, “You can either win ugly or lose pretty.” The Little Generalissimo then urged them to pony up some $3 million for his assault, saying they should “think of this as an endless war,” adding pointedly, “and you have to budget for it.”

Unfortunately for the sleaze peddler, one appalled energy executive recorded his crude pitch and leaked it to the media. “That you have to play dirty to win,” the executive explained, “just left a bad taste in my mouth.” Even Anadarko, an aggressive fracking corporation with 13,000 fracked wells in the Rockies, publicly rejected Berman’s political play, telling the New York Times: “It does not align with our values.”

Berman likes to be called “Dr. Evil,” but he’s so coarse, strident, bombastic, and clownish that he’s become known as “Dr. Silly.”

“Hard-Nosed Advice From Veteran Lobbyist: ‘Win Ugly or Lose Pretty’,” October 30, 2014.