The Clintons and the Fed Are Gasping Over the April Issue of Harper’s
By Pam Martens: March 19, 2015
Andrew Cockburn, Washington Editor of Harper’s
Hillary Clinton just can’t catch a break. As her self-inflicted imbroglio over erasing 30,000 emails involving her time as Secretary of State continues to command press attention, the April issue of Harper’s Magazine is focusing gasp-worthy attention on the “loan-sharking” business that Bill Clinton, as President, assisted in transforming into the too-big-to-fail Citigroup that played a leading role in bringing the country to the brink of financial collapse in 2008.
Janet Yellen’s Fed can’t be too happy either about the revelations. The Fed just gave Citigroup a clean bill of health last week under its so-called rigorous stress tests and is allowing the bank to spend like a drunken sailor, raising its dividend 400 percent with permission to buy back as much as $7.8 billion of its own stock. The qualitative portion of the Fed’s stress test is said to examine both a bank’s risk controls and its internal culture. Citigroup remains under multiple criminal investigations for money laundering and involvement in rigging currency markets. Apparently, in the Fed’s eyes, this is now de rigueur on Wall Street.
Harper’s six-page article jolts the reader with nugget after nugget unearthed from Citigroup’s unseemly history – facts that both the Clintons and the Fed would no doubt prefer to stay buried. This chronicle of greed, excess, and regulatory hubris is written by Andrew Cockburn, Harper’s Washington Editor and a man well credentialed to do it justice. Cockburn and his wife, Leslie Cockburn, co-produced the documentary American Casino, which provided an in-depth look at the players behind the 2008 financial collapse. Cockburn’s father, Claud Cockburn, was on the scene in 1929, covering the epic crash for the London Times. (His memoir of Black Thursday on Wall Street in 1929 can be read here.)
The Harper’s article is subtitled “The catastrophic incompetence of Citigroup,” obviously a tongue-in-cheek assessment, since Cockburn meticulously documents serial charges of crime at Citigroup that amount to a business model.
Cockburn traces the history of how Sandy Weill parlayed Commercial Credit through a series of mergers that, thanks to the repeal of the Glass-Steagall Act by President Clinton, culminated in the too-big-to-fail Citigroup. The banking behemoth replicated the exact model that brought on the 1929 crash and Great Depression by holding savings deposits while being allowed to gamble with the deposits in wild speculations on Wall Street.
The Glass-Steagall Act, which had successfully protected bank savings deposits from 1933 until its repeal by Clinton in 1999, had prevented FDIC-insured institutions from merging with investment banks on Wall Street. Despite the ravaging effects of the 2008 crash, the Glass-Steagall Act has not been restored, although two bills were introduced in Congress to do just that: the Return to Prudent Banking Act of 2015 and the Elizabeth Warren–inspired 21st Century Glass-Steagall Act.
Cockburn neatly captures where today’s Citigroup culture found its roots, writing:
“Weill had recently been eased out from Shearson Lehman/American Express, a financial conglomerate he had helped to build. Eager to get back in the game, he bought a Baltimore firm called Commercial Credit. In the view of Weill and his protégé, Jamie Dimon [now CEO at JPMorgan Chase], their new acquisition was in the beneficent business of supplying ‘consumer finance’ to ‘Main Street America.’ Their office receptionist, Alison Falls, thought otherwise. Overhearing their conversation at work one day, she called out, ‘Hey, guys, this is the loan-sharking business. Consumer finance is just a nice way to describe it.’
“Falls had it right. Commercial Credit made loans to poor people at predatory interest rates. Strapped to pay off their loans, borrowers were encouraged to refinance, with added fees each time. Gail Kubiniec, who was then an assistant sales manager at the company’s branch office in Tonawanda, New York, remembers that the basic aim was to lend money to ‘people uneducated about credit. You could take a five-hundred-dollar loan and pack it with extra items like life insurance—that was very lucrative. Then you could roll it over with more extra items, then reroll the new loan, and the borrower would go on paying and paying and paying.’ ”
Cockburn prints an excerpt from an affidavit that Kubiniec eventually filed with the Federal Trade Commission in 2001 about the practices of Commercial Credit, which had by then changed its name to CitiFinancial:
“I and other employees would often determine how much insurance could be sold to a borrower based on the borrower’s occupation, race, age, and education level. If someone appeared uneducated, inarticulate, was a minority, or was particularly old or young, I would try to include all the coverages CitiFinancial offered. The more gullible the consumer appeared, the more coverages I would try to include in the loan.”
If Kubiniec’s story has a ring of familiarity, it’s likely because you read Alayne Fleischmann’s allegations against JPMorgan Chase concerning the subprime housing bust in Matt Taibbi’s investigative report in the November issue of Rolling Stone.
Other revelations are equally cringe-worthy. Cockburn writes that Irzen Octa, a Citibank credit card customer in Jakarta, Indonesia, “was beaten to death in 2011 by Citi’s collection agents when he visited a bank branch to discuss his account.”
Cockburn also details how Citigroup this past December succeeded in sneaking in language allowing banks like itself to keep trillions of dollars in derivatives on the books of their FDIC-insured depository units rather than their investment banking units – thus putting the taxpayer back on the hook for future derivative implosions. The provision was inserted into a must-pass appropriations bill needed to keep the government running, and its passage effectively repealed a key provision of the Dodd-Frank financial reform legislation.
Against that backdrop, consider this statement from Michael Greenberger, former Director of Trading and Markets for the Commodity Futures Trading Commission from 1997 to 1999, which appears in the Cockburns’ American Casino:
“On December 15, 2000, around 7 o’clock, Phil Gramm, Republican Senator of Texas, and Chair of the Senate Finance Committee, walked to the floor of the Senate and introduced a 262-page bill as a rider to the 11,000 page appropriation bill, which excluded from regulation the financial instruments that are probably most at the heart of the present meltdown. He not only excluded them from all Federal regulation, but he excluded them from State regulation as well, which is important because these instruments could be viewed as gambling instruments.”
When is enough ever going to be enough when it comes to Wall Street?
LETTER FROM WASHINGTON — From the April 2015 issue
Saving the Whale, Again
The catastrophic incompetence of Citigroup
By Andrew Cockburn
In the late fall of 1970, a forty-five-foot sperm whale beached itself on the Oregon coast and expired. Local authorities, puzzling over how best to dispose of the huge rotting carcass, decided to blow it up, trusting that seabirds and other scavengers would consume any remains not carried out to sea. A half-ton of dynamite was accordingly packed around the whale and detonated, but things did not go as planned. Instead of the intended tidy dissolution, huge chunks of decaying blubber rained down far and wide, destroying property and inflicting a noxious stench throughout the landscape.
That fiasco, a financial-industry lobbyist suggested to me recently, was the perfect metaphor for Citigroup, the megabank described by one leading Wall Street analyst as “the Zelig of financial recklessness,” involved in every speculative catastrophe of the past few decades. Here, after all, was another beached leviathan perpetually threatening to die, leaving a nondisposable corpse, unless the rest of us keep it alive by pouring water over it.
Back in 2008, this potent threat elicited hurried bailouts in the trillions of dollars to save Citigroup from its latest debacle. The bank had placed enormous bets on risky derivatives that had gone very, very wrong — a prime cause, many argued, of the overall crash. In hopes of warding off a repeat disaster, Congress passed the Dodd–Frank Act, in 2010, which, among other corrective measures, banned taxpayer-insured banks from trading the more toxic varieties of derivatives, notably credit-default swaps. The law stipulated that such trades should be “pushed out” to uninsured affiliates, thereby forcing the firms to assume the risk themselves.
All the major banks chafed at this restriction, but Citigroup took the lead in overturning it. Its eagerness is best explained by the fact that while the other Wall Street behemoths are currently tapering their derivatives trading, Citi has been expanding its own. As of September 2014, its portfolio of potentially lethal financial instruments had a notional value of $70 trillion.1 So as Congress rushed to vote on a “must-pass” spending bill a few months ago, Citigroup lobbyists enlisted a pliable legislator to insert a provision eliminating the push-out rule.
1 Notional value is the “face amount” of the contract. For example, ABC Company might purchase a credit-default swap that will pay $100 million if XYZ Company defaults on its debt. The notional value of the swap is $100 million, even if the instrument itself is trading at a fraction of that amount.
Dennis Kelleher, of the financial-reform group Better Markets, pithily summarized the issue for me. “The push-out rule said you can do all the derivatives trading you want,” he noted. “You just can’t shift your losses to the American people.” By inserting its stealth provision, the banking giant ensured that “taxpayers are now on the hook for high-risk derivatives trading. That’s why Citigroup drafted it, that’s why Citigroup spent a fortune on lawyers and lobbyists and campaign contributions to make it happen.”
Congressional leaders in both parties made sure that Citigroup got its way. Republicans, with the exception of a dwindling band of Tea Party stalwarts, were enthusiastic in their support. Democrats were more sheepish, with the president himself publicly decrying the measure even as he lobbied Congress to pass the spending bill itself. Even so, the megabank’s maneuvers generated widespread outrage, and Elizabeth Warren seized the moment.
“Enough is enough!” she declared in an impassioned speech on the Senate floor, denouncing Citigroup’s coup. Comparing the bank’s power to that of the Democratic and Republican parties, she highlighted Citigroup’s “unprecedented” grip on the Obama Administration, citing seven current or recent high-level policymakers with close ties to the firm. Her roll call included Jacob Lew, a former director of the Office of Management and Budget — “also a Citi alum,” said Warren, “but I’m double-counting here, because now he’s the Secretary of the Treasury.”
Sheila Bair, who was chair of the Federal Deposit Insurance Corporation (FDIC) from 2006 to 2011, confirms Warren’s assessment, citing her own experiences on the inside. “They intimidate you,” she told me recently, referring to the big financial institutions. “I think this has been a big problem with this administration. You see all these former Citi people influencing government, and you’re afraid to voice opinions that are critical of them or different from their views.”
Multitrillion-dollar derivatives trades may have little direct impact on ordinary Americans, unless and until they bring down the economy, as they did in 2008. But other recent Citigroup initiatives will have more immediate effects. According to the Federal Reserve, 52 percent of Americans are unable to lay their hands on as little as $400 in an emergency. Instead, millions of people in urgent need turn to consumer-loan companies, which charge high interest rates. Among the leaders in this field is OneMain Financial, a Citigroup subsidiary, whose website declares its dedication to the penniless consumer: “Your needs. Your goals. Your dreams.™”
Intent on shedding consumer-related subsidiaries in order to concentrate on trading, Citi has for some time been planning to sell OneMain. To hit its target price of $4 billion, however, Citi needed to boost the company’s already substantial profit margin, which was up 31 percent in 2013 — and the way to do that was to persuade state legislatures to loosen restrictions on interest rates. This usurer-relief campaign has been increasingly successful, with lawmakers in Arizona, Florida, Indiana, Kentucky, Missouri, and North Carolina buying the argument that lenders such as OneMain actually “work with their customer,” as demonstrated by low default rates.
OneMain “definitely led the lobbying effort in North Carolina,” Chris Kukla, senior vice president at the Center for Responsible Lending, told me. He said the loan company was “pretty aggressive” in collecting its money. When a borrower does default, companies like OneMain “back up a truck to the house and take the furniture and the TV set.” However, the company much prefers to keep customers on the hook by repeatedly and expensively refinancing their loans — which helps to explain the low default rates.
Citi’s efforts paid off in June 2013, when the North Carolina legislature raised the ceiling on interest rates. By Kukla’s calculation, the revised law has made the situation for borrowers much worse. The interest on an average loan of about $3,000 has risen from slightly more than 20 percent to 30 percent; borrowing that money costs the company itself just three percent, at most.
OneMain is part of Citigroup thanks to a Wall Street dealmaker named Sandy Weill, who realized the stunning possibilities of this kind of business back in 1986. At the time, Weill had recently been eased out from Shearson Lehman/American Express, a financial conglomerate he had helped to build. Eager to get back in the game, he bought a Baltimore firm called Commercial Credit. In the view of Weill and his protégé, Jamie Dimon, their new acquisition was in the beneficent business of supplying “consumer finance” to “Main Street America.” Their office receptionist, Alison Falls, thought otherwise. Overhearing their conversation at work one day, she called out, “Hey, guys, this is the loan-sharking business. ‘Consumer finance’ is just a nice way to describe it.”
Falls had it right. Commercial Credit made loans to poor people at predatory interest rates. Strapped to pay off their loans, borrowers were encouraged to refinance, with added fees each time. Gail Kubiniec, who was then an assistant sales manager at the company’s branch office in Tonawanda, New York, remembers that the basic aim was to lend money to “people uneducated about credit. You could take a five-hundred-dollar loan and pack it with extra items like life insurance — that was very lucrative. Then you could roll it over with more extra items, then reroll the new loan, and the borrower would go on paying and paying and paying.”
Weill considered these practices a “platform” on which his company could grow — and indeed, Commercial Credit stock rose 40 percent in his first year. Not only did this boost his already considerable personal fortune, it enriched his loyal team, the members of which would one day reach commanding heights on Wall Street. Dimon is now the head of JPMorgan Chase. Charles Prince served first as CEO and then as chairman of Citigroup. Robert Willumstad became president of Citigroup and later headed American International Group, where he oversaw the insurer’s spectacular crash in 2008.
By 1988, Commercial Credit was generating enough profit for Weill to take over Primerica, a much bigger company involved in insurance, stockbroking, and other financial services. Three years later, however, a Forbes article reported that “the insurance operations are a can of worms,” and that Weill’s ambitions were still being underwritten by his Baltimore-based cash cow. “Primerica does have one crown jewel,” the article noted, “the company Sandy Weill started with: Commercial Credit.”
Weill bought the venerable Travelers Insurance in 1993, at which point his empire had assets of $100 billion. That same year, he acquired the Shearson Lehman brokerage house (the latest iteration of the company that had ejected him back in 1986). As deal followed deal, Weill fixed his eye on Citicorp, a huge commercial bank with billions of dollars in customer deposits. The fact that such a merger would be against the law was of no consequence. This was, after all, the Clinton–Greenspan era, when a rising tide of corruption was lifting anything on Wall Street that could float, however rotten.
The law that would have blocked the merger was the Glass–Steagall Act, passed in the depths of the Great Depression and prompted by the catastrophic speculations of none other than Citi (i.e., the National City Bank, as it was known at the time). Under the leadership of Charles “Sunshine Charley” Mitchell, the bank had vigorously embraced “cross-selling”: lending money to investors to buy shares of companies in which the bank itself held stakes. Those funds vaporized in the 1929 meltdown. “Mitchell more than any fifty men is responsible for this stock crash,” said Senator Carter Glass of Virginia soon after the market plummeted.
Glass, along with Representative Henry B. Steagall of Alabama, sponsored the eponymous law that decreed a rigid separation between commercial banks, which manage deposit accounts for individuals and businesses, and investment banks, which facilitate the buying and selling of stocks, bonds, and other financial instruments. Glass–Steagall should have barred Weill from getting his hands on Citicorp. Instead, he got provisional clearance for the merger from Alan Greenspan at the Federal Reserve. Once the deal was consummated, in 1998, Weill moved to secure a repeal of the irksome legislation — an easy task, given the enthusiastic support he received from President Bill Clinton and Treasury Secretary Robert Rubin, a former co-chair of Goldman Sachs. Glass–Steagall was duly repealed a year later. A beaming Clinton, extolling the repeal of “antiquated laws,” signed the bill with Weill at his side. By then Rubin had already become co-chairman of Citigroup, as the merged entity was called, garnering a total of $126 million in compensation over the following nine years.
“These guys are excellent at politics,” Arthur Wilmarth, a professor specializing in banking law at the George Washington University Law School, told me. “Look at how they persuaded Clinton, Greenspan, and Rubin to do their bidding. But they’re lousy at running their own business.”
Bair agrees, insisting that Citigroup was “really a cobbled-together series of acquisitions. I think they relied too much on their government connections, as opposed to managing the bank well.” Even before the merger, Citicorp had a historic record of bad bets stretching all the way back to the War of 1812: one of the bank’s founding directors made an investment in a licensed privateer, only to see the ship sail out of New York Harbor and disappear without a trace. Since then, the firm has repeatedly brought itself to the brink of ruin, making a slew of foolhardy loans to corporations during the 1970s and to developing countries during the 1980s.
Under Weill, however, the merged firm set new records for reckless gambles and fraud. It was Citigroup that helped to cook Enron’s books, disguising $4 billion worth of loans on the balance sheet as operating cash flow. Citigroup’s executives apparently understood what they were doing, but carried on regardless — the payoff being the $200 million in fees earned from the energy-trading firm before it collapsed amid bankruptcy and criminal charges. (As it turned out, crime did not pay, at least not for Citigroup’s stockholders, since the firm ended up shelling out $100 million in civil penalties to the SEC and $3.7 billion to settle claims by Enron investors.)
Equally favored as a client was the WorldCom communications conglomerate. Jack Grubman, Citi’s star telecom analyst, served as an adviser to Bernard Ebbers, WorldCom’s CEO, while relentlessly touting the company’s stock to unwitting investors. For his services, Grubman received more than $67.5 million between 1999 and 2002 — hardly excessive compensation, considering that he had helped Citigroup to generate almost $1.2 billion in fees from WorldCom and other communications firms. Subsequent events followed their normal course. WorldCom declared bankruptcy, Ebbers went to jail, Grubman paid a $15 million fine and was banned from the securities industry for life, and Citigroup settled a WorldCom investors’ suit for $2.6 billion and paid a $300 million fine to the SEC. None of Citigroup’s senior executives suffered any penalty.2
2 In a striking example of the law of unintended consequences, Grubman’s promotion of telecom led to huge overcapacity in the industry — which became a boon to the U.S. military after 9/11 for use in drone operations, among other things.
As Weill and his associates scaled the heights of New York society, contributing to such worthy causes as the refurbishment of Carnegie Hall, they retained their loan-shark business, which they renamed CitiFinancial in the wake of the big merger. For Gail Kubiniec, who continued to work for the firm as an assistant sales manager, little else changed. As the great housing bubble of the new millennium got under way, however, she noticed increased demands from management to push high-interest home mortgages.
“I felt those house values were inflated,” Kubiniec told me recently. In addition, the fact that “people didn’t always understand about making timely payments” worked to the company’s advantage. A late payment was an opportunity. “The hammer would come down,” she recalled. “You’d call them and call them to get them to come in and refinance” — at which point more fees could be tacked on to the loan. Finally, disgusted with the high-pressure tactics inflicted on poor clients, Kubiniec decided to “hang up,” as she put it.
In a devastating affidavit filed with the Federal Trade Commission in 2001, Kubiniec laid bare the sleazy practices at the heart of CitiFinancial’s business model, such as “Rocopoly Money” — quarterly bonuses for employees based on the number of existing borrowers they could lure into new loans:
I and other employees would often determine how much insurance could be sold to a borrower based on the borrower’s occupation, race, age, and education level. If someone appeared uneducated, inarticulate, was a minority, or was particularly old or young, I would try to include all the coverages CitiFinancial offered. The more gullible the consumer appeared, the more coverages I would try to include in the loan.
Such revelations may have been embarrassing, and moderately expensive: Citi ended up paying $240 million in penalties and legal settlements. They made little difference, however, to the company’s operations. As Kubiniec pointed out to me, these fines amounted to “pennies” compared with the firm’s consumer-loan profits — more than $4 billion between 2002 and 2003, a nice percentage of the $33 billion in overall profits hauled in by Citigroup during those years. As part of the settlement, CitiFinancial pledged to reform its abusive lending practices, but there was little change in the way the sales force marketed its loans.
Still, the fitful attention from regulatory agencies began to irritate Weill, making his life “extraordinarily difficult,” as he later recalled. In 2003, he resigned as CEO of Citigroup, bequeathing control to Prince, the lawyer he had found at the loan-shark firm in Baltimore. (Weill retained the office of chairman until 2006.)
Prince certainly had the merit of knowing a great deal about Citigroup’s checkered past. He also had a powerful supervisor in Rubin, the affable, media-friendly operator who had not only greased the wheels for the repeal of Glass–Steagall but also helped to fend off regulatory curbs on risky speculation. Now, as chairman of the Citigroup executive committee, with a $15 million annual paycheck, the former treasury secretary was ready to provide guidance on boosting earnings, profits, and, of course, executive bonuses. One colleague described Rubin as “the Wizard of Oz behind Citigroup. . . . He certainly was the guy deferred to on key strategic decisions and certain key business decisions vis-à-vis risk.”
Despite Citi’s recent troubles with Enron and WorldCom, Rubin urged Prince to dive into even riskier waters by amping up proprietary trading — using the firm’s own money to bet on market movements, often with complex financial instruments. Thanks to the 1998 merger, these bets could now be made using Citibank depositors’ funds, which were helpfully insured by the FDIC. Furthermore, in the wake of the Commodity Futures Modernization Act, a toxic piece of legislation signed by Clinton in his final days in office, riskier forms of speculation — notably credit-default swaps — were now exempt from regulation and oversight.
As the housing bubble continued to inflate, opportunities for “prop trading” were becoming more lucrative by the day, powered by subprime mortgages that CitiFinancial and other bottom-grazing lenders were selling to poor people, especially African Americans. In particular, Citi’s sales force pushed adjustable-rate mortgages, which offered borrowers a low interest rate that later adjusted upward. In the blunt words of Bair, such loans “were purposefully designed to be unaffordable, to force borrowers into a series of refinancings and the fat fees that went along with them.”
This, of course, was the Commercial Credit business model. The idea was to maneuver poor borrowers into debt bondage, now rendered even more attractive because Wall Street had devised ways to securitize the designed-to-fail subprime loans. The loans were packaged into bundles of mortgage-backed securities, which were then repackaged into collateralized debt obligations (C.D.O.’s), which were sliced into interest-bearing tranches according to their presumed credit-worthiness. These C.D.O.’s could then be chopped into ever more abstruse instruments that were increasingly divorced from reality. Asked who constituted the market for such exotic stuff, an anonymous trader in the 2009 documentary American Casino gave the only possible answer: “Idiots.”
As other banks started to see big returns from the C.D.O. bonanza, Rubin felt increased pressure to join the party. Accordingly, in early 2005, the Wizard helped Prince persuade the Citigroup board to take on much more risk. The firm’s C.D.O. production soared, doubling to $35 billion between 2005 and 2007. This river of cash had a suitably tonic effect on senior-executive bonuses. In 2006 alone, Tom Maheras, the chairman of Citigroup’s investment bank, was awarded $34 million in salary and bonuses, and his colleagues and subordinates received similarly lavish amounts.
But the pyramid of profit rested on a narrow point: the borrowers cajoled into loans they couldn’t afford by the aggressive sales teams at CitiFinancial and other subprime lenders. Well before Maheras and his associates received their bonus checks, the market had turned. Home sales peaked in the summer of 2005 before starting a steady, then steepening, slide. By spring 2007, subprime borrowers were defaulting on their loans and losing their homes to foreclosure at an accelerating rate. The bubble was bursting, but Citi’s management was in denial. “I think our performance is going to last much longer than the market turbulence does,” a defiant Prince declared in August of that year.
Eager to ensure adequate supplies of subprime debt for the C.D.O. machine, Citi took over the notoriously abusive lender Ameriquest in September 2007. As C.D.O.’s became harder to sell, the firm’s traders joined the idiots and began hoarding their own bogus creations, while relentlessly pumping out more for a market that no longer wanted them.
Others on Wall Street were waking up to what was happening. Goldman Sachs, “a ruthless shop,” in Wilmarth’s words, had reaped billions from marketing C.D.O.’s, and it continued to do so. But as early as 2006, it began to short (that is, bet against) C.D.O.’s it sold to credulous customers. Citi, meanwhile, held on blindly to its deteriorating portfolio.
Prince was forced out at the end of 2007, after the bank admitted to $10 billion in losses on subprime loans and C.D.O.’s — a figure that would balloon to $40 billion by the end of 2008. He had taken home $158 million in cash and stock during the previous four years. His replacement, a hedge-fund manager named Vikram Pandit, collected $165 million when Citi obligingly bought his fund, which went belly up a few months after the purchase.
The executive shake-up made no difference to the firm’s cratering fortunes. By November 2008, Citigroup was insolvent. But it knew where to turn for help. Bair laughed as she recalled how Rubin was lauded at the time for arranging Citi’s latest round of bailouts — “like that was his job as titular head of the organization, to make sure the government took care of them.”
Given the outcome, that might not have been such a bad business plan. Three successive bailouts at the height of the crisis pumped a total of $45 billion in taxpayer money into the firm, along with $306 billion in loan guarantees, not to mention more than $2.5 trillion in low-cost loans from the Federal Reserve. Regulators also turned a blind eye to such little matters as Citigroup’s lies to investors and the SEC, in late 2007, about the bank’s $39 billion exposure to subprime losses. “While financial fraud of this magnitude would typically be worthy of jail time, the SEC delivered minor slaps on the wrist to just two individuals,” Pam Martens, a Wall Street money manager for twenty-one years and subsequently an acerbic commentator on the industry, later wrote. “Citigroup paid the pittance of $75 million.”
The multitrillion-dollar bailouts generated public revulsion against all the major banks and were an important factor in the rise of the Tea Party. But in the view of key decision-makers, including Bair, the bailouts were largely about Citigroup. “The over-the-top generosity,” she told me, “was driven in part by the desire to help Citi and cover up its outlier status.” In other words, everyone was showered with money to distract attention from the one bankrupt institution that was seriously in need of it.
As the world of finance had grappled with the deepening crisis in the summer of 2008, the country at large remained oblivious to the drama, focusing instead on the presidential race, in which Barack Obama and John McCain were running neck and neck. On September 15, the day Lehman Brothers filed for bankruptcy, McCain was almost two points ahead in the polls. But the collapse of the nation’s fourth-largest investment bank woke voters to the reality of the crash. They swung over to the “change candidate,” handing Obama an overwhelming victory in November. Three days after the vote, the president-elect appeared onstage in Chicago to discuss his economic policy. At his side was Rubin.
Unsurprisingly, the new administration was soon well stocked with Citigroup alumni such as Jacob Lew, who was appointed chief operating officer of the State Department. He had most recently been the chief financial officer of Citi’s Alternative Investments unit, a prop-trading group that lost $509 million in the first quarter of 2008 alone. Lew’s 2006 Citi employment contract provides useful insight into at least one of the ways in which big business infiltrates government. The contract stipulated that if Lew left the company, he would lose his “guaranteed incentive and retention award,” amounting to about $1.5 million in 2008 — unless he departed in order to accept a “full-time high-level position with the United States government or regulatory body.” In other words, Citigroup was effectively paying Lew to take a government job, in which he would either direct policy or regulate Citigroup.
Joining Lew in Washington as deputy national security adviser for international economic affairs was Michael Froman, a Harvard acquaintance of Obama’s who had introduced the president to Rubin in 2004. Froman had more recently headed the Emerging Markets Strategy division at Citigroup, pocketing more than $7.4 million in 2008, even as taxpayers were pouring billions into the failing firm.
Yet another friend of Citi’s was moving into an even more potent position. As head of the immensely powerful New York Federal Reserve Bank, Timothy Geithner had stood resolutely in Citi’s corner, his loyalty perhaps enhanced by a call he had received from Weill in November 2007, as the crisis was gathering speed. Prince had just been fired. “What would you think of running Citi?” Weill reportedly asked him.3 It seems fair to say that Geithner gave the firm little cause for complaint in the following months — deriding, for example, Bair’s suggestion that bankruptcy proceedings be started for its insolvent commercial bank. “Tim seemed to view his job as protecting Citigroup from me,” Bair later wrote in her memoir, “when he should have been worried about protecting the taxpayers from Citi.”
3 Geithner demurred after thinking hard about the offer — which may have been less than straightforward, since Rubin had already decided on Pandit, a move he would certainly have discussed with Weill.
Once installed at Treasury, Geithner had the more important task of protecting Citigroup from the president of the United States, since Obama had sensibly concluded that the whale should be broken up and disposed of. “Okay, so we do Citigroup and we do it thoroughly and well,” the president told his advisers in March 2009. But only Treasury had the bureaucratic resources to dismantle such an enormous financial carcass, and Geithner showed no interest in handling the job. According to the journalist Ron Suskind, he simply ignored the president’s directive, and Obama let the matter drop.
Freed from the threat of termination, Citigroup returned to business as usual. Subprime foreclosures were still ripping through communities across the country, peaking in 2010. Cara Stretch, a foreclosure-prevention specialist at St. Ambrose, a Baltimore housing-aid center just two miles from CitiFinancial headquarters, was working overtime, helping desperate homeowners to hang on to their houses. CitiFinancial, she recalls, was “impossible to deal with,” totally resistant to loan modifications and eager only to collect or foreclose.4
4 Stretch’s clients were at least luckier than Irzen Octa, a Citibank credit card customer in Jakarta, Indonesia. Octa was beaten to death in 2011 by Citi’s collection agents when he visited a bank branch to discuss his account.
By this time most firms had abandoned the bubble-era practice of handing out shady housing loans, then securitizing and reselling them. But not CitiMortgage, the company’s other, supposedly upmarket mortgage arm. CitiMortgage carried right on selling mortgages it had every reason to believe were unlikely to be repaid — and it did so all the way through 2011. Since the private market had dried up, the firm’s most frequent customer was the Federal Housing Administration, which meant that the American taxpayer was getting it in the neck once again.
CitiMortgage had every reason to know it was moving fraudulent paper, at least as of March 2011. That was when Sherry Hunt, a quality-control officer at the company’s headquarters in O’Fallon, Missouri, explained to the H.R. department that CitiMortgage was processing and selling thousands of such loans, and had even set up a “quality rebuttal group” to ensure that as few loans as possible got rejected, however questionable. Nothing came of her complaint, so Hunt forwarded her copious documentation to the U.S. Attorney in Manhattan, who promptly brought suit against Citi. So damning was the evidence that the bank not only caved, paying $158.3 million to settle the charges, it even admitted that it had done something wrong, a rarity in such cases.
Such unseemly revelations about Citigroup, along with those of its fellow banks, evoked a vehement reaction from Wilmarth. “You had systematic fraud at the origination stage,” he told me, “then you had systematic fraud at the securitization stage, then you had systematic fraud at the foreclosure stage. At what point do we consider these institutions to have become effectively criminal enterprises?”
Naturally, no such charges were ever brought against Citigroup or its peers. Critics complained that the banks were considered “too big to jail.” Or, as Attorney General Eric Holder ponderously phrased it in March 2013, “I am concerned that the size of some of these institutions becomes so large that it does become difficult for us to prosecute them when we are hit with indications that if you do prosecute, if you do bring a criminal charge, it will have a negative impact on the national economy, perhaps world economy.”
Nevertheless, there was general agreement in Washington that something had to be done to prevent another such fiasco. An initiative by Senators Sherrod Brown and Ted Kaufman to break up the big banks was speedily crushed, confirming a reflective comment from Senator Richard Durbin of Illinois: “The banks own this place.” Instead, after extensive labors, Congress assembled the 2,300-page Dodd–Frank Wall Street Reform and Consumer Protection Act — a huge revenue-spinner for the battalions of lobbyists and lawyers deployed to whittle down anything deemed hurtful to Wall Street.
They failed, however, to stop the push-out rule, which was inserted by Senator Blanche Lincoln of Arkansas. Lincoln had previously been deemed “reliable” by Wall Street, but in 2010 she was facing a tough primary battle against a union-backed opponent — hence her opportunistic swing to the left. Thereafter, attempts to delete Lincoln’s rule became an almost annual event in the congressional calendar, indicating just how important a cause this was for the banks in general, and for Citi in particular.
Pandit, having overseen an 89 percent decline in Citi’s stock price during his tenure, was shown the door in October 2012. His send-off: a paltry $6.7 million, which showed just how far things had declined for failed executives since the financial crash. His successor, Michael Corbat, had spent almost his entire career at Citigroup. Soon after his appointment as CEO, he announced that one of his goals was to “stop destroying our shareholders’ capital.” He hoped Citigroup “served a social purpose” and later added that he wanted banking to be thought of as “boring.”
The past few years, however, suggest that Citi culture has not changed much. Corbat has moved to shrink the firm’s consumer business, closing bank branches — even in important markets such as Dallas and Houston — in favor of stepping up speculative trading operations. The effort to unload OneMain, with its associated crusade on behalf of extortionate interest rates, is part of that same initiative.
So eager is Citigroup to be seen as a dynamic trading concern that it has been offering Citibank depositors the opportunity to trade in the riskiest arena of all: foreign-currency exchange. Citi FX solicits customers who have a minimum balance of $10,000 to wade into FOREX trading. It also offers them thirty-three-to-one leverage, meaning amateur traders can wager with just 3 percent down — a $330,000 bet on a $10,000 deposit. Clicking on the Citi FX “Risk Disclosure” link reveals that this is a purely in-house operation, and reminds visitors that “when you lose money trading, your national bank is making money on such trades.” In other words, the customer is betting against the house, exactly as in a casino. Pam Martens, who unearthed this shabby initiative, wonders whether a “U.S.-subsidized bank that is attempting to restore its reputation after a decade of outrageous missteps” should be enticing retail clients into currency trading, which she describes as a “surefire way to lose their money.”
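The leverage arithmetic is worth checking: at thirty-three-to-one, the margin requirement is 1/33, roughly 3 percent, so a $10,000 deposit can control about $330,000 in notional currency exposure. A minimal sketch of that arithmetic (the figures are the article’s; the function itself is illustrative, not Citi’s):

```python
def fx_exposure(deposit: float, leverage: float) -> dict:
    """Notional exposure and margin rate implied by a leveraged FX account."""
    return {
        "max_notional": deposit * leverage,  # largest position the deposit supports
        "margin_rate": 1 / leverage,         # fraction the trader must put down
    }

# The article's example: a $10,000 minimum balance at 33:1 leverage.
pos = fx_exposure(10_000, 33)
print(pos["max_notional"])                 # 330000
print(round(pos["margin_rate"] * 100, 1))  # 3.0 (percent down)
```

At that exposure, a one percent adverse move on the full position costs $3,300, a third of the deposit, which is the sense in which Martens calls retail currency trading a surefire way to lose money.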
While introducing customers to the wonders of FOREX may not be risky for the bank, ratcheting up derivative trades, especially when other megabanks such as JPMorgan Chase are backing off, is, as Bair expressed to me, “quite alarming.” Yet it does help to explain Citi’s determination to quash the push-out rule in spite of the terrible PR. (Rep. Kevin Yoder, the obscure Kansas Republican delegated by the lobbyists to insert Citigroup’s provision, found his Facebook page erupting with abusive comments, one of the more printable of them calling him a “pathetic waste of a slime mold.”)
Within days of killing the push-out rule, Citigroup bought the commodity- and energy-trading arm of Credit Suisse, an adventurous move in view of the ongoing collapse in global oil prices. Even more troubling is the heavy investment Citigroup, along with JPMorgan and Wells Fargo, has made in collateralized loan obligations — this decade’s C.D.O.’s, which consist of high-yield, high-risk junk bonds sliced into tranches. A high proportion of such junk was issued by energy firms and snapped up in massive quantities by Wall Street largely on the back of the oil-shale boom, now deflating at a precipitous rate. “I think those bonds are already on the edge of the cliff,” says Martens.
Watch out for falling blubber.
© 2014 Harper’s Magazine.
The Citi That Always Cheats:
Two Hundred Years Are Enough
As it celebrates its 200th anniversary, the time has come for Citigroup to be put to sleep. Citigroup is a model for the Wall Street greed, fraud and failure that have devastated the lives of millions of people.
In the late 1990s, Citigroup chief Sanford Weill spearheaded the lobbying campaign to repeal the 1933 Glass-Steagall Act, which barred banks from speculating with depositors’ accounts. As a souvenir, Sandy Weill received the White House pen with which the repeal was signed into law.
As a result, depositor banks like Citi were able to participate in the enormous housing market and other speculations of the 2000s. Citigroup was and remains the largest single holder of the subprime mortgages used in creating and selling the toxic securities that triggered the 2007-2009 crash.
During this time, Citigroup economists found a new name for the philosophy that has always guided the company’s policies. They called it “plutonomy” – an economy built to meet the demands and whims of the richest 1 percent. A 2005 memo by a Citigroup global strategist reads: “We project that the plutonomies (the U.S., UK, and Canada) will likely see even more income inequality, disproportionately feeding off a further rise in the profit share in their economies, capitalist-friendly governments, more technology-driven productivity, and globalization… Since we think the plutonomy is here, is going to get stronger… It is a good time to switch out of stocks that sell to the masses and back to the plutonomy basket.”
Citigroup is no stranger to bailouts. In the 1970s, Citi led the banking industry’s charge into knowingly lending US-backed Latin American dictatorships more than they could repay. When major Latin American countries defaulted in 1982, Citi received a backdoor bailout from the IMF and wealthy countries like the US.
Wherever there is misery and crisis, Citi and its Wall Street siblings see the opportunity to profit. Citi subsidiary Banamex in Mexico is best known for its massive involvement in laundering money for the illegal drug industry.
Where Are Our Demands?
We’d like to imagine we could make demands of such an organization.
We’d like to appeal to the consciences of Wall Street executives who receive million-dollar bonuses for failure, while they foreclose on underwater mortgages and throw families out on the street.
We’d like to imagine Citigroup would put an immediate moratorium on foreclosures, and help the people it has made homeless.
We’d like to imagine Citigroup would throw open the books on any and all of its wrongdoing in recent decades, just because it would be the right thing to do.
We’d like to imagine Citigroup will pay full restitution to the investors it defrauded, and deliver a refund to the taxpayers who rescued it, along with a giant-size Thank-You card.
We’d like to imagine that Citigroup CEO Vikram Pandit will finally make good on his promise to meet with Occupy Wall Street members.
We’d like to imagine that former Citigroup CEO Sandy Weill will apologize for spearheading the deregulation of 1999 that helped cause the crash of 2008, and that he will return the White House pen with which the repeal of Glass-Steagall was signed into law. We want to put it in a future Museum of Corporate Crime.
We love our Mets, so we’d love it if Citi would take its disreputable name and logo away from the new Shea Stadium, since they’re not actually paying for any of it. The taxpayers gave Citigroup $45 billion in the same year that Citi Field opened.
Incidentally, the Mets and Yankees between them received more than $1.1 billion in public subsidies to build their new stadiums. Therefore the new Mets ballpark should be called Taxpayer Field.
We apologize if some of our imagined demands seem to be in a humorous vein, or academic. Given the human suffering, a moratorium on foreclosures and debt relief for individuals would certainly be the most important, life-or-death issues. But we also think the history is essential to remember.
We can always submit our appeals to the Citigroup executives and call these our “demands.” We can stand outside the building and chant, “Stop your bets – Free the Mets – Vikram must forgive the debts!” But we’ve learned enough from 200 years of Citigroup history to know how its executives will respond to our demands.
We have seen that the banks that caused the 2007-2008 crash – like Citigroup and Goldman Sachs and Bank of America – were rewarded for it. With help from our captured government, today they are bigger, more powerful and more dangerous than ever.
As long as it’s profitable, Wall Street won’t stop committing fraud. Vikram Pandit won’t do the right thing just because we ask nicely. Citigroup won’t surrender to reason without a peaceful and powerful political struggle of the people.
Four years ago, Citigroup and other major Wall Street banks caused the most dramatic financial crash since 1929. The institutions responsible for the resulting world-wide misery are a public danger. They should have been liquidated. The executives should have been fired, and subjected to criminal investigation. What happened instead? Banks bailed out by the public continue to profit from predatory actions, and We the People continue to pay the price. Where money controls politics, Wall Street is above the law…
Washington Misses Bigger Picture of New Chinese Bank
by Jim Lobe
Bibi Netanyahu’s election, persistent violence through much of the Middle East and North Africa, and intensified efforts to forge a nuclear deal between the P5+1 and Iran topped the news here in Washington this week. But a much bigger story in terms of the future order of global politics was taking place in Europe and Beijing.
The story was simply this: virtually all of the closest European allies of the United States, beginning with Britain, defied pressure from Washington by deciding to apply for founding membership in the Asian Infrastructure Investment Bank (AIIB). This Chinese initiative could quickly rival the World Bank and the Asia Development Bank as a major source of funding for big development projects across Eurasia.
The new bank, which offers a serious multilateral alternative to the Western-dominated international financial institutions (IFIs) established in the post-World War II order, is expected to attract about three dozen initial members, including all of China’s Asian neighbors (with the possible exception of Japan). Australia, Russia, Saudi Arabia, and other Gulf states are also likely to join by the March 31 deadline set by Beijing for prospective co-founders to apply. Its $50 billion in initial capital is expected to double with the addition of new members, and that amount could quickly grow given China’s $3 trillion in foreign-exchange reserves. More details about the bank can be found in a helpful Q&A here at the Council on Foreign Relations website.
Along with the so-called BRICS bank—whose membership so far is limited to Brazil, Russia, India, China and South Africa—the AIIB poses a real “challenge to the existing global economic order,” which, of course, Western nations have dominated since the establishment of the International Monetary Fund (IMF) and the World Bank in the final days of World War II. As one unnamed European official told The New York Times, “We have moved from the world of 1945.”
That Washington’s closest Western allies are now racing to join the AIIB over U.S. objections offers yet more evidence that the “unipolar moment” celebrated by neoconservatives and aggressive nationalists 25 years ago and then reaffirmed by the same forces after the 2003 Iraq invasion is well and truly over. And yet, these same neoconservatives continue to insist that—but for Obama’s weakness and defeatism—the United States remains so powerful that it really doesn’t have to take account of anyone’s interests outside its borders except, maybe, Israel’s.
(That Washington’s closest Western allies are now racing to join the new bank over U.S. objections could also presage a greater willingness to abandon the international sanctions regime against Iran if Washington is seen as responsible for the collapse of the P5+1 nuclear negotiations with Tehran. Granted, Iran’s economy—and its potential as a source of investment capital—is itsy-bitsy compared to China.)
Indeed, commentators are depicting US allies’ decision to join the AIIB (see here, here, and here as examples) as a debacle for U.S. diplomacy. The Wall Street Journal editorial board has predictably blamed Obama for the defeat, calling it a “case study in declining American influence” (although it also defended Washington’s decision against joining and accused Britain of “appeasing China for commercial purposes”).
What the Journal predictably didn’t mention was a key reason why the administration did not seek membership in the new bank: there was virtually no chance that a Republican-dominated Congress would approve it. Indeed, one reason Beijing launched its initiative and so many of our allies in both Asia and Europe have decided to join is their frustration with Republicans in Congress who have refused to ratify a major reform package designed to give developing countries, including China, a little more voting power on the Western-dominated governing boards of the IMF and the World Bank. The Group of 20 (G20) biggest economic powers actually proposed this reform in 2010, and it doesn’t even reduce Washington’s voting power, which gives it an effective veto over major policy changes in both institutions. As a result of this intransigence, the United States is the only G20 member that has failed to ratify the reforms, effectively blocking their implementation. As noted by a New York Times editorial Friday,
Congress bears considerable blame for refusing to pass legislation to shift voting power more fairly among IMF member states, including China. China’s move to create the new development bank is part of the price being paid for that obstruction.
Indeed, Treasury Secretary Jacob Lew made this point implicitly in testimony this week in which he also restated U.S. reservations about the AIIB:
Our continued failure to approve the IMF quota and governance reforms is causing other countries, including some of our allies, to question our commitment to the IMF and other multilateral institutions that we worked to create and that advance important US and global economic and security interests.
…The IMF reforms will help convince emerging economies to remain anchored in the multilateral system that the United States helped design and continues to lead.
Now, of course, China would probably have created the AIIB on its own even if the Congress had ratified the IMF package. But the repeated congressional refusal to do so gave the Europeans (who have supported the reforms despite the fact that they would lose the most voting power if the reforms were implemented) and other U.S. allies an additional reason to join up. And none of this absolves the Obama administration of its own diplomatic failure to persuade its allies to hold back. Or the administration might have tried a different strategy: joining the Bank and then trying to get Congress on board. Surprisingly, Sen. Tom Cotton (R-AR), the upper chamber’s new neoconservative heartthrob, told an audience at the Hudson Institute this past week that it would have been better for the U.S. to sign up so as to gain some influence over the Bank’s operations and policies. (However, one of Cotton’s reported sugar-daddies, Sheldon Adelson, has always been loath to alienate Beijing’s leaders for fear they could interfere with his lucrative casino interests in Macao.) After all, as more than one commentator has noted, the Obama administration has long argued that Beijing should assume a leadership role in global affairs commensurate with its wealth and geo-strategic importance.
Given all the negative commentary by Asia and development specialists, it’s still possible that Obama may reconsider U.S. opposition to membership before the March 31 deadline, as suggested by Elizabeth Economy of the Council on Foreign Relations and others.
But the main point here is that official Washington—including Republicans in Congress and the mainstream media—is not paying adequate attention to major shifts in the global order and how isolated the United States has become vis-à-vis the “international community,” especially its most important allies. With the United States still stuck shoulder-deep in the Middle East, the vaunted “pivot” to the Pacific looks increasingly hollow, especially with a Republican Congress agitating to dig us in even deeper by, for example, sticking slavishly by a Netanyahu-led Israel and trying to sabotage an Iran deal. Active Republican resistance to even modest moves advocated by the administration on global warming is also harming our credibility with allies, as well as others, as it has since George W. Bush renounced the Kyoto Protocol. One could go on and on. It’s very difficult to exercise global “leadership” when you’ve isolated yourself from the rest of the world and fail to take account of how much the world has changed from that much-cherished “unipolar moment.”
The end of capitalism has begun
Without us noticing, we are entering the postcapitalist era. At the heart of further change to come is information technology, new ways of working and the sharing economy. The old ways will take a long while to disappear, but it’s time to be utopian
The red flags and marching songs of Syriza during the Greek crisis, plus the expectation that the banks would be nationalised, revived briefly a 20th-century dream: the forced destruction of the market from above. For much of the 20th century this was how the left conceived the first stage of an economy beyond capitalism. The force would be applied by the working class, either at the ballot box or on the barricades. The lever would be the state. The opportunity would come through frequent episodes of economic collapse.
Instead over the past 25 years it has been the left’s project that has collapsed. The market destroyed the plan; individualism replaced collectivism and solidarity; the hugely expanded workforce of the world looks like a “proletariat”, but no longer thinks or behaves as it once did.
If you lived through all this, and disliked capitalism, it was traumatic. But in the process technology has created a new route out, which the remnants of the old left – and all other forces influenced by it – have either to embrace or die. Capitalism, it turns out, will not be abolished by forced-march techniques. It will be abolished by creating something more dynamic that exists, at first, almost unseen within the old system, but which will break through, reshaping the economy around new values and behaviours. I call this postcapitalism.
As with the end of feudalism 500 years ago, capitalism’s replacement by postcapitalism will be accelerated by external shocks and shaped by the emergence of a new kind of human being. And it has started.
Postcapitalism is possible because of three major changes information technology has brought about in the past 25 years. First, it has reduced the need for work, blurred the edges between work and free time and loosened the relationship between work and wages. The coming wave of automation, currently stalled because our social infrastructure cannot bear the consequences, will hugely diminish the amount of work needed – not just to subsist but to provide a decent life for all.
Second, information is corroding the market’s ability to form prices correctly. That is because markets are based on scarcity while information is abundant. The system’s defence mechanism is to form monopolies – the giant tech companies – on a scale not seen in the past 200 years, yet they cannot last. By building business models and share valuations based on the capture and privatisation of all socially produced information, such firms are constructing a fragile corporate edifice at odds with the most basic need of humanity, which is to use ideas freely.
Third, we’re seeing the spontaneous rise of collaborative production: goods, services and organisations are appearing that no longer respond to the dictates of the market and the managerial hierarchy. The biggest information product in the world – Wikipedia – is made by volunteers for free, abolishing the encyclopedia business and depriving the advertising industry of an estimated $3bn a year in revenue.
Almost unnoticed, in the niches and hollows of the market system, whole swaths of economic life are beginning to move to a different rhythm. Parallel currencies, time banks, cooperatives and self-managed spaces have proliferated, barely noticed by the economics profession, and often as a direct result of the shattering of the old structures in the post-2008 crisis.
You only find this new economy if you look hard for it. In Greece, when a grassroots NGO mapped the country’s food co-ops, alternative producers, parallel currencies and local exchange systems they found more than 70 substantive projects and hundreds of smaller initiatives ranging from squats to carpools to free kindergartens. To mainstream economics such things seem barely to qualify as economic activity – but that’s the point. They exist because they trade, however haltingly and inefficiently, in the currency of postcapitalism: free time, networked activity and free stuff. It seems a meagre and unofficial and even dangerous thing from which to craft an entire alternative to a global system, but so did money and credit in the age of Edward III.
New forms of ownership, new forms of lending, new legal contracts: a whole business subculture has emerged over the past 10 years, which the media has dubbed the “sharing economy”. Buzzwords such as the “commons” and “peer-production” are thrown around, but few have bothered to ask what this development means for capitalism itself.
I believe it offers an escape route – but only if these micro-level projects are nurtured, promoted and protected by a fundamental change in what governments do. And this must be driven by a change in our thinking – about technology, ownership and work. So that, when we create the elements of the new system, we can say to ourselves, and to others: “This is no longer simply my survival mechanism, my bolt hole from the neoliberal world; this is a new way of living in the process of formation.”
The 2008 crash wiped 13% off global production and 20% off global trade. Global growth became negative – on a scale where anything below +3% is counted as a recession. It produced, in the west, a depression phase longer than in 1929-33, and even now, amid a pallid recovery, has left mainstream economists terrified about the prospect of long-term stagnation. The aftershocks in Europe are tearing the continent apart.
The solutions have been austerity plus monetary excess. But they are not working. In the worst-hit countries, the pension system has been destroyed, the retirement age is being hiked to 70, and education is being privatised so that graduates now face a lifetime of high debt. Services are being dismantled and infrastructure projects put on hold.
Even now many people fail to grasp the true meaning of the word “austerity”. Austerity is not eight years of spending cuts, as in the UK, or even the social catastrophe inflicted on Greece. It means driving the wages, social wages and living standards in the west down for decades until they meet those of the middle class in China and India on the way up.
Meanwhile in the absence of any alternative model, the conditions for another crisis are being assembled. Real wages have fallen or remained stagnant in Japan, the southern Eurozone, the US and UK. The shadow banking system has been reassembled, and is now bigger than it was in 2008. New rules demanding banks hold more reserves have been watered down or delayed. Meanwhile, flushed with free money, the 1% has got richer.
Neoliberalism, then, has morphed into a system programmed to inflict recurrent catastrophic failures. Worse than that, it has broken the 200-year pattern of industrial capitalism wherein an economic crisis spurs new forms of technological innovation that benefit everybody.
That is because neoliberalism was the first economic model in 200 years the upswing of which was premised on the suppression of wages and smashing the social power and resilience of the working class. If we review the take-off periods studied by long-cycle theorists – the 1850s in Europe, the 1900s and 1950s across the globe – it was the strength of organised labour that forced entrepreneurs and corporations to stop trying to revive outdated business models through wage cuts, and to innovate their way to a new form of capitalism.
The result is that, in each upswing, we find a synthesis of automation, higher wages and higher-value consumption. Today there is no pressure from the workforce, and the technology at the centre of this innovation wave does not demand the creation of higher consumer spending, or the re-employment of the old workforce in new jobs. Information is a machine for grinding the price of things lower and slashing the work time needed to support life on the planet.
As a result, large parts of the business class have become neo-luddites. Faced with the possibility of creating gene-sequencing labs, they instead start coffee shops, nail bars and contract cleaning firms: the banking system, the planning system and late neoliberal culture reward above all the creator of low-value, long-hours jobs.
Innovation is happening but it has not, so far, triggered the fifth long upswing for capitalism that long-cycle theory would expect. The reasons lie in the specific nature of information technology.
We’re surrounded not just by intelligent machines but by a new layer of reality centred on information. Consider an airliner: a computer flies it; it has been designed, stress-tested and “virtually manufactured” millions of times; it is firing back real-time information to its manufacturers. On board are people squinting at screens connected, in some lucky countries, to the internet.
Seen from the ground it is the same white metal bird as in the James Bond era. But it is now both an intelligent machine and a node on a network. It has an information content and is adding “information value” as well as physical value to the world. On a packed business flight, when everyone’s peering at Excel or Powerpoint, the passenger cabin is best understood as an information factory.
But what is all this information worth? You won’t find an answer in the accounts: intellectual property is valued in modern accounting standards by guesswork. A study for the SAS Institute in 2013 found that, in order to put a value on data, neither the cost of gathering it, nor the market value or the future income from it could be adequately calculated. Only through a form of accounting that included non-economic benefits, and risks, could companies actually explain to their shareholders what their data was really worth. Something is broken in the logic we use to value the most important thing in the modern world.
The great technological advance of the early 21st century consists not only of new objects and processes, but of old ones made intelligent. The knowledge content of products is becoming more valuable than the physical things that are used to produce them. But it is a value measured as usefulness, not exchange or asset value. In the 1990s economists and technologists began to have the same thought at once: that this new role for information was creating a new, “third” kind of capitalism – as different from industrial capitalism as industrial capitalism was from the merchant and slave capitalism of the 17th and 18th centuries. But they have struggled to describe the dynamics of the new “cognitive” capitalism. And for a reason. Its dynamics are profoundly non-capitalist.
During and right after the second world war, economists viewed information simply as a “public good”. The US government even decreed that no profit should be made out of patents, only from the production process itself. Then we began to understand intellectual property. In 1962, Kenneth Arrow, the guru of mainstream economics, said that in a free market economy the purpose of inventing things is to create intellectual property rights. He noted: “precisely to the extent that it is successful there is an underutilisation of information.”
You can observe the truth of this in every e-business model ever constructed: monopolise and protect data, capture the free social data generated by user interaction, push commercial forces into areas of data production that were non-commercial before, mine the existing data for predictive value – always and everywhere ensuring nobody but the corporation can utilise the results.
If we restate Arrow’s principle in reverse, its revolutionary implications are obvious: if a free market economy plus intellectual property leads to the “underutilisation of information”, then an economy based on the full utilisation of information cannot tolerate the free market or absolute intellectual property rights. The business models of all our modern digital giants are designed to prevent the abundance of information.
Yet information is abundant. Information goods are freely replicable. Once a thing is made, it can be copied/pasted infinitely. A music track or the giant database you use to build an airliner has a production cost; but its cost of reproduction falls towards zero. Therefore, if the normal price mechanism of capitalism prevails over time, its price will fall towards zero, too.
For the past 25 years economics has been wrestling with this problem: all mainstream economics proceeds from a condition of scarcity, yet the most dynamic force in our modern world is abundant and, as hippy genius Stewart Brand once put it, “wants to be free”.
There is, alongside the world of monopolised information and surveillance created by corporations and governments, a different dynamic growing up around information: information as a social good, free at the point of use, incapable of being owned or exploited or priced. I’ve surveyed the attempts by economists and business gurus to build a framework to understand the dynamics of an economy based on abundant, socially-held information. But it was actually imagined by one 19th-century economist in the era of the telegraph and the steam engine. His name? Karl Marx.
The scene is Kentish Town, London, February 1858, sometime around 4am. Marx is a wanted man in Germany and is hard at work scribbling thought-experiments and notes-to-self. When they finally get to see what Marx is writing on this night, the left intellectuals of the 1960s will admit that it “challenges every serious interpretation of Marx yet conceived”. It is called “The Fragment on Machines”.
In the “Fragment” Marx imagines an economy in which the main role of machines is to produce, and the main role of people is to supervise them. He was clear that, in such an economy, the main productive force would be information. The productive power of such machines as the automated cotton-spinning machine, the telegraph and the steam locomotive did not depend on the amount of labour it took to produce them but on the state of social knowledge. Organisation and knowledge, in other words, made a bigger contribution to productive power than the work of making and running the machines.
Given what Marxism was to become – a theory of exploitation based on the theft of labour time – this is a revolutionary statement. It suggests that, once knowledge becomes a productive force in its own right, outweighing the actual labour spent creating a machine, the big question becomes not one of “wages versus profits” but who controls what Marx called the “power of knowledge”.
In an economy where machines do most of the work, the nature of the knowledge locked inside the machines must, he writes, be “social”. In a final late-night thought experiment Marx imagined the end point of this trajectory: the creation of an “ideal machine”, which lasts forever and costs nothing. A machine that could be built for nothing would, he said, add no value at all to the production process and rapidly, over several accounting periods, reduce the price, profit and labour costs of everything else it touched.
Once you understand that information is physical, and that software is a machine, and that storage, bandwidth and processing power are collapsing in price at exponential rates, the value of Marx’s thinking becomes clear. We are surrounded by machines that cost nothing and could, if we wanted them to, last forever.
In these musings, not published until the mid-20th century, Marx imagined information coming to be stored and shared in something called a “general intellect” – which was the mind of everybody on Earth connected by social knowledge, in which every upgrade benefits everybody. In short, he had imagined something close to the information economy in which we live. And, he wrote, its existence would “blow capitalism sky high”.
With the terrain changed, the old path beyond capitalism imagined by the left of the 20th century is lost.
But a different path has opened up. Collaborative production, using network technology to produce goods and services that only work when they are free, or shared, defines the route beyond the market system. It will need the state to create the framework – just as it created the framework for factory labour, sound currencies and free trade in the early 19th century. The postcapitalist sector is likely to coexist with the market sector for decades, but major change is happening.
Networks restore “granularity” to the postcapitalist project. That is, they can be the basis of a non-market system that replicates itself, which does not need to be created afresh every morning on the computer screen of a commissar.
The transition will involve the state, the market and collaborative production beyond the market. But to make it happen, the entire project of the left, from protest groups to the mainstream social democratic and liberal parties, will have to be reconfigured. In fact, once people understand the logic of the postcapitalist transition, such ideas will no longer be the property of the left – but of a much wider movement, for which we will need new labels.
Who can make this happen? In the old left project it was the industrial working class. More than 200 years ago, the radical journalist John Thelwall warned the men who built the English factories that they had created a new and dangerous form of democracy: “Every large workshop and manufactory is a sort of political society, which no act of parliament can silence, and no magistrate disperse.”
Today the whole of society is a factory. We all participate in the creation and recreation of the brands, norms and institutions that surround us. At the same time the communication grids vital for everyday work and profit are buzzing with shared knowledge and discontent. Today it is the network – like the workshop 200 years ago – that they “cannot silence or disperse”.
True, states can shut down Facebook, Twitter, even the entire internet and mobile network in times of crisis, paralysing the economy in the process. And they can store and monitor every kilobyte of information we produce. But they cannot reimpose the hierarchical, propaganda-driven and ignorant society of 50 years ago, except – as in China, North Korea or Iran – by opting out of key parts of modern life. It would be, as sociologist Manuel Castells put it, like trying to de-electrify a country.
By creating millions of networked people, financially exploited but with the whole of human intelligence one thumb-swipe away, info-capitalism has created a new agent of change in history: the educated and connected human being.
This will be more than just an economic transition. There are, of course, the parallel and urgent tasks of decarbonising the world and dealing with demographic and fiscal timebombs. But I’m concentrating on the economic transition triggered by information because, up to now, it has been sidelined. Peer-to-peer has become pigeonholed as a niche obsession for visionaries, while the “big boys” of leftwing economics get on with critiquing austerity.
In fact, on the ground in places such as Greece, resistance to austerity and the creation of “networks you can’t default on” – as one activist put it to me – go hand in hand. Above all, postcapitalism as a concept is about new forms of human behaviour that conventional economics would hardly recognise as relevant.
So how do we visualise the transition ahead? The only coherent parallel we have is the replacement of feudalism by capitalism – and thanks to the work of epidemiologists, geneticists and data analysts, we know a lot more about that transition than we did 50 years ago when it was “owned” by social science. The first thing we have to recognise is: different modes of production are structured around different things. Feudalism was an economic system structured by customs and laws about “obligation”. Capitalism was structured by something purely economic: the market. We can predict, from this, that postcapitalism – whose precondition is abundance – will not simply be a modified form of a complex market society. But we can only begin to grasp at a positive vision of what it will be like.
I don’t mean this as a way to avoid the question: the general economic parameters of a postcapitalist society by, for example, the year 2075, can be outlined. But if such a society is structured around human liberation, not economics, unpredictable things will begin to shape it.
For example, the most obvious thing to Shakespeare, writing in 1600, was that the market had called forth new kinds of behaviour and morality. By analogy, the most obvious “economic” thing to the Shakespeare of 2075 will be the total upheaval in gender relationships, or sexuality, or health. Perhaps there will not even be any playwrights: perhaps the very nature of the media we use to tell stories will change – just as it changed in Elizabethan London when the first public theatres were built.
Think of the difference between, say, Horatio in Hamlet and a character such as Daniel Doyce in Dickens’s Little Dorrit. Both carry around with them a characteristic obsession of their age – Horatio is obsessed with humanist philosophy; Doyce is obsessed with patenting his invention. There can be no character like Doyce in Shakespeare; he would, at best, get a bit part as a working-class comic figure. Yet, by the time Dickens described Doyce, most of his readers knew somebody like him. Just as Shakespeare could not have imagined Doyce, so we too cannot imagine the kind of human beings society will produce once economics is no longer central to life. But we can see their prefigurative forms in the lives of young people all over the world breaking down 20th-century barriers around sexuality, work, creativity and the self.
The feudal model of agriculture collided, first, with environmental limits and then with a massive external shock – the Black Death. After that, there was a demographic shock: too few workers for the land, which raised their wages and made the old feudal obligation system impossible to enforce. The labour shortage also forced technological innovation. The new technologies that underpinned the rise of merchant capitalism were the ones that stimulated commerce (printing and accountancy), the creation of tradeable wealth (mining, the compass and fast ships) and productivity (mathematics and the scientific method).
Present throughout the whole process was something that looks incidental to the old system – money and credit – but which was actually destined to become the basis of the new system. In feudalism, many laws and customs were actually shaped around ignoring money; credit was, in high feudalism, seen as sinful. So when money and credit burst through the boundaries to create a market system, it felt like a revolution. Then, what gave the new system its energy was the discovery of a virtually unlimited source of free wealth in the Americas.
A combination of all these factors took a set of people who had been marginalised under feudalism – humanists, scientists, craftsmen, lawyers, radical preachers and bohemian playwrights such as Shakespeare – and put them at the head of a social transformation. At key moments, though tentatively at first, the state switched from hindering the change to promoting it.
Today, the thing that is corroding capitalism, barely rationalised by mainstream economics, is information. Most laws concerning information define the right of corporations to hoard it and the right of states to access it, irrespective of the human rights of citizens. The equivalent of the printing press and the scientific method is information technology and its spillover into all other technologies, from genetics to healthcare to agriculture to the movies, where it is quickly reducing costs.
The modern equivalent of the long stagnation of late feudalism is the stalled take-off of the third industrial revolution, where instead of rapidly automating work out of existence, we are reduced to creating what David Graeber calls “bullshit jobs” on low pay. And many economies are stagnating.
The equivalent of the new source of free wealth? It’s not exactly wealth: it’s the “externalities” – the free stuff and wellbeing generated by networked interaction. It is the rise of non-market production, of unownable information, of peer networks and unmanaged enterprises. The internet, French economist Yann Moulier-Boutang says, is “both the ship and the ocean” when it comes to the modern equivalent of the discovery of the new world. In fact, it is the ship, the compass, the ocean and the gold.
The modern day external shocks are clear: energy depletion, climate change, ageing populations and migration. They are altering the dynamics of capitalism and making it unworkable in the long term. They have not yet had the same impact as the Black Death – but as we saw in New Orleans in 2005, it does not take the bubonic plague to destroy social order and functional infrastructure in a financially complex and impoverished society.
Once you understand the transition in this way, the need is not for a supercomputed Five Year Plan – but a project, the aim of which should be to expand those technologies, business models and behaviours that dissolve market forces, socialise knowledge, eradicate the need for work and push the economy towards abundance. I call it Project Zero – because its aims are a zero-carbon-energy system; the production of machines, products and services with zero marginal costs; and the reduction of necessary work time as close as possible to zero.
Most 20th-century leftists believed that they did not have the luxury of a managed transition: it was an article of faith for them that nothing of the coming system could exist within the old one – though the working class always attempted to create an alternative life within and “despite” capitalism. As a result, once the possibility of a Soviet-style transition disappeared, the modern left became preoccupied simply with opposing things: the privatisation of healthcare, anti-union laws, fracking – the list goes on.
If I am right, the logical focus for supporters of postcapitalism is to build alternatives within the system; to use governmental power in a radical and disruptive way; and to direct all actions towards the transition – not the defence of random elements of the old system. We have to learn what’s urgent, and what’s important, and that sometimes they do not coincide.
The power of imagination will become critical. In an information society, no thought, debate or dream is wasted – whether conceived in a tent camp, prison cell or the table football space of a startup company.
As with virtual manufacturing, in the transition to postcapitalism the work done at the design stage can reduce mistakes in the implementation stage. And the design of the postcapitalist world, as with software, can be modular. Different people can work on it in different places, at different speeds, with relative autonomy from each other. If I could summon one thing into existence for free it would be a global institution that modelled capitalism correctly: an open-source model of the whole economy – official, grey and black. Every experiment run through it would enrich it, and it would have as many datapoints as the most complex climate models.
The main contradiction today is between the possibility of free, abundant goods and information; and a system of monopolies, banks and governments trying to keep things private, scarce and commercial. Everything comes down to the struggle between the network and the hierarchy: between old forms of society moulded around capitalism and new forms of society that prefigure what comes next.
Is it utopian to believe we’re on the verge of an evolution beyond capitalism? We live in a world in which gay men and women can marry, and in which contraception has, within the space of 50 years, made the average working-class woman freer than the craziest libertine of the Bloomsbury era. Why do we, then, find it so hard to imagine economic freedom?
It is the elites – cut off in their dark-limo world – whose project looks as forlorn as that of the millennial sects of the 19th century. The democracy of riot squads, corrupt politicians, magnate-controlled newspapers and the surveillance state looks as phoney and fragile as East Germany did 30 years ago.
All readings of human history have to allow for the possibility of a negative outcome. It haunts us in the zombie movie, the disaster movie, in the post-apocalyptic wasteland of films such as The Road or Elysium. But why should we not form a picture of the ideal life, built out of abundant information, non-hierarchical work and the dissociation of work from wages?
Millions of people are beginning to realise they have been sold a dream at odds with what reality can deliver. Their response is anger – and retreat towards national forms of capitalism that can only tear the world apart. Watching these emerge, from the pro-Grexit left factions in Syriza to the Front National and the isolationism of the American right, has been like watching the nightmares we had during the Lehman Brothers crisis come true.
We need more than just a bunch of utopian dreams and small-scale horizontal projects. We need a project based on reason, evidence and testable designs, that cuts with the grain of history and is sustainable by the planet. And we need to get on with it.
A New Plan for American Cities To Free Themselves of Wall Street’s Control
Arbitrary financial fees are sucking cities and states dry. But they can change the terms if they band together and bargain collectively.
BY SAQIB BHATTI
In August 2014, the Los Angeles City Council debated whether to call for the renegotiation of the city’s financial deals. A report by the labor-community coalition Fix L.A. found that the city had spent more than twice as much on banking fees in fiscal year 2013 as it had on street services.
To try to balance its budget, Los Angeles had enacted hundreds of millions of dollars in cuts over the previous five years. City jobs had been slashed by 10 percent, flood control procedures had been cut back, crumbling sidewalks were not repaired and alleys were rarely cleared of debris. Sewer inspections ceased entirely; the number of sewer overflows doubled from 2008 to 2013.
The campaign slogan wrote itself: “Invest in our streets, not Wall Street!”
At the city council debate, Timothy Butcher, a worker with the Bureau of Street Services, got up and said, “I don’t know a whole lot about high finance. I’m just a truck driver. But I do know, if I go to a bank and they give me a bad deal, I don’t deal with that bank any more. And I don’t understand why the city can’t use the same kind of concept on some of these big banks, saying, ‘Hey, help us out or, you know, we’re not going to deal with you any more.’ ”
The City Council approved the resolution unanimously.
It was a blow against both the austerity agenda and the iron grip of Wall Street on American cities. State and local governments in the United States rely on Wall Street firms to put together bond deals, manage their investments and provide financial services. For this, banks charge billions of dollars in fees each year. Public officials believe they have little choice but to cough up. When there are revenue shortfalls, cities typically impose austerity measures and cut essential community services, but Wall Street gets a free pass—payments to banks are considered untouchable.
Public officials assume (wrongly) that financial fees are set in stone because they are based on so-called market rates. However, market rates aren’t preordained by God. Banks set them, and public finance officials simply don’t demand anything substantively lower.
So, what if cities took a page from the labor movement and bargained collectively over interest rates and other financial deals?
The simple reason why anti-union politicians are waging a war on collective bargaining by workers is that it works: There is power in numbers. The basic idea behind such bargaining is to shift the balance of power in the employer-employee relationship and empower workers to negotiate with owners on a more equal footing.
But collective bargaining does not have to be limited to the workplace. Student organizations such as United Students Against Sweatshops have forced university administrations to negotiate over labor standards for their merchandise vendors. Consumer unions press retailers over issues like pricing and safety standards. Community organizations are able to negotiate community-benefit agreements with major corporations in their cities and win benefits such as local hiring policies and community investment standards.
Similarly, public finance officials in cities, states and school districts across the country could apply collective bargaining practices to their financial relationships with Wall Street. While there is no established mechanism for them to do so, there are some creative options worth exploring. For example, cities could establish a nonprofit or publicly funded agency to set guidelines for municipal finance deals and refuse to do business with any bank that does not comply. (More on this later.)
This may sound pie-in-the-sky, but the reality is that American taxpayer dollars are a tremendous source of bargaining power. Just three U.S. cities—New York, Los Angeles and Chicago—together with their related agencies and pension funds, do nearly $600 billion of business with Wall Street every year, more than the gross domestic product of Sweden. Wall Street wants a piece of that action. If it has to jump through a few hoops to get it, it will. This gives public officials the leverage to demand lower interest rates and fairer terms, freeing up scarce funds for community services like parks, libraries and schools.
Over the last few decades, the banking industry has shifted its profit model away from interest. Big banks’ profits now rely heavily on fees—the money charged for creating loans, packaging them into securities, selling them and servicing them. This structure incentivizes banks to push more complex and expensive deals, like adjustable-rate mortgages and variable-rate bonds, that require fees and add-ons.
Banking fees do not have to bear any relationship to the actual cost of providing services. Banks charge whatever they can get away with, which is why fees have shot up as banks have consolidated and customers’ choices have narrowed. For example, in 2007, Bank of America raised its ATM fee for non-customers from $2 to $3. In all likelihood, the bank’s costs hadn’t suddenly risen 50 percent, despite a spokesperson’s claim that the fee hike would offset “significant” expansion and upgrade of its machines. Banks also arbitrarily raised prices on credit enhancements for municipal borrowers after the financial crash.
For cities and states, which deal in large dollar amounts, this nickel-and-diming hits particularly hard. A 1 percent fee on a $200 million bond is a lot more money than a 1 percent fee on a $200,000 mortgage. That explains why the city of Los Angeles paid $334 million in publicly disclosed fees for financial services in fiscal year 2013, according to the Fix L.A. report. This amount did not include principal or interest on any debt, and neither did it include fees that are not publicly disclosed, like the astronomical fees hedge funds and private equity firms charge pension funds to manage investments.
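The scale gap is worth making concrete. A minimal sketch of the arithmetic, using only the figures quoted above (the flat 1 percent rate is illustrative, not an actual fee schedule):

```python
# Same nominal 1 percent fee, very different dollar amounts.
FEE_RATE = 0.01  # illustrative flat rate from the paragraph above

bond_fee = FEE_RATE * 200_000_000   # fee on a $200 million municipal bond
mortgage_fee = FEE_RATE * 200_000   # fee on a $200,000 home mortgage

print(f"Bond fee:     ${bond_fee:,.0f}")      # $2,000,000
print(f"Mortgage fee: ${mortgage_fee:,.0f}")  # $2,000
```

At identical percentage rates, the city pays a thousand times more in absolute dollars, which is why small differences in negotiated fee rates translate into tens of millions for a large municipal borrower.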
In Illinois, a preliminary analysis by researchers at the Service Employees International Union (SEIU)—full disclosure: where I used to work—found that the state’s pension funds spent approximately $400 million in publicly disclosed fees in 2014 alone. New York City Comptroller Scott Stringer has released a report showing that nearly all of the returns from the city’s five pension funds over the past 10 years—approximately $2.5 billion—have been eaten up by fees. An investigation by the International Business Times found that New Jersey’s pension funds paid more than $600 million in financial fees in 2014.
Every dollar that banks collect in fees from state and local governments and pension funds is a dollar not going toward essential neighborhood services. It’s not just the streets and sewers of Los Angeles. Illinois is teetering on the edge of a government shutdown. Already, Gov. Bruce Rauner has slashed funding for college scholarships for low-income students, taken a hatchet to vital healthcare programs like Medicaid, and cut state funding for CeaseFire, a highly regarded violence-prevention program with a proven track record.
Most public officials still resist acknowledging that these fees are a problem. When Gov. Rauner tried to cut the municipal share of state income tax revenue by 50 percent this spring, the Illinois House of Representatives responded with a first-of-its-kind resolution urging the state to match any such cuts with proportional cuts to financial-service fees. SEIU also proposed a reduction of financial-service fees during its contract negotiations for state workers, but this was roundly rejected by the Rauner administration.
Of course, Rauner has personally profited from these fees in the past. Before deciding to run for office, he was the managing director of GTCR LLC, a private equity firm that did business with Illinois pension funds.
But even public finance officials who don’t have direct industry ties typically drag their feet on fee reductions. The Los Angeles City Council’s efforts to pressure banks into renegotiating or terminating costly financial deals were met with stiff resistance from the city’s financial officers.
There are a number of reasons why finance staff can be reluctant, if not obstructionist, in efforts to curtail banking fees. One is the revolving door between public finance jobs and Wall Street. Another is the fact that public officials can be outflanked by smooth-talking bankers making dishonest and deceptive sales pitches. But perhaps the biggest reason is that officials truly believe they got the best deal they could. Los Angeles’s finance staff point out that even though they paid $334 million in fees in 2013 alone, they actually did better than many of their peers.
When Councilmember Paul Koretz called for a vote on the motion in Los Angeles, he skewered the City Administrative Officer’s (CAO) office, saying: “Our lack of success in negotiating thus far could partly be a factor of CAO saying that, ‘Hey, this is a fine deal and we’ve done as well on this as anything else we could do.’ ”
Changing the rules
Under the current system, Wall Street sets the rules of the game and public officials think they have no choice but to play on those terms. They may negotiate around the margins and get a fee lowered by half a percentage point, but they do not typically push back on the illogic of the underlying fee structures.
Cities that consider taking a stand against Wall Street are routinely told that if they do, their credit ratings will be downgraded, and banks and investors will stop doing business with them. In reality, the public finance officials who claim they have no choice but to pay high fees and accept onerous terms from Wall Street banks are like elephants afraid of mice. The notion that Wall Street could sustain a prolonged boycott against a city or state as punishment goes against the very nature of banking. U.S. taxpayer dollars are among the largest pools of capital in the world. If there is money to be made, there will always be a bank that will step in to get that business.
Similarly, threats about credit rating downgrades are baseless. Rating agencies are concerned with a borrower’s ability to pay back its bondholders. If anything, negotiating lower fees with banks would free up money and make cities and states less likely to default.
Some cities and states are already blazing the trail. In 2010, then-Massachusetts State Treasurer Timothy Cahill moved state deposits out of Bank of America, Citigroup and Wells Fargo because the banks’ credit card operations did not comply with the state’s usury law, which caps interest rates at 18 percent.
In 2012, the city of Oakland initiated a boycott of Goldman Sachs because the bank refused to renegotiate a deal that had put the city on the losing side of a risky interest-rate bet costing $4 million in annual fees and payments.
And earlier this year, the Board of Supervisors of Santa Cruz County, Calif., voted not to do any new business for the next five years with banks convicted of felonies. The boycott affects the five banks, including JPMorgan Chase and Citigroup, that pleaded guilty to illegally rigging foreign exchange rates.
These actions are first steps. However, they would be significantly more effective if cities and states joined together. When Oakland—a mid-sized city of 400,000 people—boycotted Goldman Sachs, Goldman didn’t flinch. But if several cities, states and school districts banded together and threatened a boycott, the banking behemoth would be forced to take notice.
Power in numbers
In an ideal world, the federal government would establish standards for protecting state and local officials against predatory financial deals. In the same way that there is a Consumer Financial Protection Bureau, there is a dire need for a Municipal Financial Protection Bureau whose top priority would be to protect taxpayers’ interests. Even though there are already agencies with oversight over municipal finance—such as the Municipal Securities Rulemaking Board and the Securities and Exchange Commission—protecting cities and states from abuse is not their priority. And they have close ties to the financial services industry.
Because federal regulation has proven woefully inadequate, and the chances of effective congressional action in the near future are slim to none, cities and states need to step up.
If just New York, Los Angeles and Chicago banded together and threatened to withhold their collective $600 billion of potential annual business with Wall Street, they wouldn’t have to simply accept the so-called market rates. They have enough bargaining power to set their own.
Together, they could refuse to sign contracts that prevent them from publicly disclosing fees. If they also get their state governments and pension funds on board, they could alter fee structures for things like bond underwriting. They could require any bank that pitches products to sign a fiduciary agreement, meaning they are legally required to put taxpayer interests ahead of their own.
Santa Cruz County Supervisor Ryan Coonerty has already said he is reaching out to other jurisdictions across the country to urge them to join in refusing to do business with felonious banks. If public officials were to coordinate their demands and present a unified front, they could force the banks to take them seriously.
My organization, the Roosevelt Institute’s ReFund America Project, works with community-labor coalitions in cities nationwide that are calling for a reduction in bank fees and an end to predatory municipal finance deals. Last summer, ReFund America and Local Progress—a network linking local elected officials with unions and progressive groups—led a small meeting called “A Progressive Vision for Municipal Finance.” We brought together organizers, policy experts and public officials to discuss various proposals for fixing municipal finance. Among those present were four city councilmembers and three representatives from mayors’ offices. These officials expressed strong interest in developing a bargaining vehicle that would allow cities to take collective action to stand up to Wall Street.
One idea was the creation of a nonprofit or public agency to set municipal finance guidelines. Individual cities and states could subscribe to these guidelines and the agency would in effect become the gatekeeper for banks wishing to do business with them. The more subscribers the agency had, the more bargaining power it would hold. Strict controls would help ensure the agency remained scrupulously independent of Wall Street. That organization could even be the precursor to a national Municipal Financial Protection Bureau.
People over profit
Together, American cities, states and pension funds hold untold power. If they flex their muscles and organize around coordinated demands, they can radically transform taxpayers’ relationship with Wall Street.
In 2012, a community leader from Oakland attended the Goldman Sachs shareholder meeting in New York City and urged CEO Lloyd Blankfein to renegotiate its interest rate swap with the city to avoid library closures and layoffs. He said it was “an issue of morality.” Blankfein responded, “No, I think it’s a matter of shareholder assets.”
This is the mentality that led Rolling Stone’s Matt Taibbi to call Goldman Sachs “a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money.”
It’s not just Goldman. All of municipal finance has become an extractive industry, pumping billions away from the communities that need them most. Morality is an externality that financial firms seldom concern themselves with. The financial sector’s fee-based business model is designed to maximize profits, not to protect taxpayers.
Banks may not have a moral compass, but their business contracts with our state and local governments can and should. After all, our cities, states and school districts are not simply fodder for Wall Street’s insatiable greed. Our elected leaders have a duty to protect us from predatory financial practices. Cities and states can force banks to charge drastically lower fees, do away with arbitrary fee structures and eliminate onerous terms that divert billions of dollars away from the most vulnerable members of our society into bonus checks for our nation’s wealthiest few.
Governors in states like Wisconsin, Michigan and Illinois are waging war on collective bargaining and telling taxpayers that empowering public-sector unions robs state coffers, but the real drain on public treasuries is the billions in fees paid to banks every year. And unlike money that goes into workers’ pockets, most of these fees are not recycled back into the local economy but sent to offshore tax havens or invested in complex financial schemes. The irony is that collective bargaining is one of the most effective tools available to public officials who truly want to do right by taxpayers—and cast off Wall Street’s tentacles.
http://www.nytimes.com/2015/09/21/busin ... tests.html
(video at link)
Drug Goes From $13.50 a Tablet to $750, Overnight
Specialists in infectious disease are protesting a gigantic overnight increase in the price of a 62-year-old drug that is the standard of care for treating a life-threatening parasitic infection.
The drug, called Daraprim, was acquired in August by Turing Pharmaceuticals, a start-up run by a former hedge fund manager. Turing immediately raised the price to $750 a tablet from $13.50, bringing the annual cost of treatment for some patients to hundreds of thousands of dollars.
“What is it that they are doing differently that has led to this dramatic increase?” said Dr. Judith Aberg, the chief of the division of infectious diseases at the Icahn School of Medicine at Mount Sinai. She said the price increase could force hospitals to use “alternative therapies that may not have the same efficacy.”
Turing’s price increase is not an isolated example. While most of the attention on pharmaceutical prices has been on new drugs for diseases like cancer, hepatitis C and high cholesterol, there is also growing concern about huge price increases on older drugs, some of them generic, that have long been mainstays of treatment...
Martin Shkreli (born April 1, 1983) is an Albanian-American hedge fund manager and entrepreneur specializing in healthcare businesses. He is a co-founder of MSMB Capital Management and of Retrophin, Inc., and the founder of Turing Pharmaceuticals AG.
He is a co-founder and was the Chief Executive Officer of Retrophin LLC, a biotechnology firm founded in 2011.
In September 2015, he was criticized by several public health organizations for obtaining manufacturing licenses on old, out-of-patent, life-saving medicines including pyrimethamine (brand name Daraprim) and increasing the prices of the drugs in the US, sometimes by more than 5000%. Pyrimethamine is listed in the WHO Model List of Essential Medicines, a list of the most important medications needed in a basic health system. He was accused of manipulating the price and taking these basic drugs out of reach of millions of needy patients worldwide.
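The “more than 5000%” figure can be sanity-checked from the two per-tablet prices quoted in the article above; a quick back-of-the-envelope sketch (prices from the text, the calculation is mine):

```python
# Per-tablet prices for Daraprim quoted in the NYT article above.
old_price = 13.50   # USD per tablet before the Turing acquisition
new_price = 750.00  # USD per tablet afterward

multiple = new_price / old_price                        # how many times more expensive
pct_increase = (new_price - old_price) / old_price * 100  # percent increase

print(f"{multiple:.1f}x the old price")   # ~55.6x
print(f"{pct_increase:.0f}% increase")    # ~5456%, i.e. "more than 5000%"
```

So the per-tablet jump alone is consistent with the 5000%-plus claim, before even considering multi-tablet daily regimens that push annual treatment costs into the hundreds of thousands of dollars.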
In 2000, Shkreli was a college intern and then clerk at Jim Cramer's Cramer, Berkowitz, & Co. After four years at Cramer Berkowitz, he held jobs at UBS and Intrepid Capital Management before starting his first hedge fund, Elea Capital Management, in 2006. Shkreli launched MSMB Capital Management (named after the two founding Portfolio Managers, Martin Shkreli and Marek Biestek) in 2009....
"Every time I saw a bumper sticker which said, 'Where's my bailout?' it hurt," he told Capital Download.
That is a rare admission of emotion. Former Treasury Secretary Tim Geithner once described Bernanke as the Buddha of central banking, his demeanor impassive even during disaster. In the 2011 HBO movie Too Big To Fail, actor Paul Giamatti won a Screen Actors Guild award for his portrayal of the Fed chairman as restrained in all things, from his soft-spoken speech to his choice of oatmeal for breakfast. In comparison, Treasury Secretary Hank Paulson seemed almost volcanic.
He hasn't seen the movie. "I read the book, but I like to say I saw the original, so it wasn't necessary to see the movie," he says.
Will he ever watch it? "If there's nothing else on," he shrugs, "maybe so."
“It would have been my preference to have more investigation of individual action, since obviously everything that went wrong or was illegal was done by some individual, not by an abstract firm. And so in that respect I think that there should have been more accountability at the individual level,” Bernanke said in an interview with USA Today on Sunday.
While all the major Wall Street firms – J.P. Morgan Chase (JPM), Goldman Sachs (GS), Citigroup (NYSE: C), Bank of America (NYSE: BAC), Wells Fargo (WFC), etc. – have agreed to pay fines now totaling in the hundreds of billions stemming from widespread fraud that occurred ahead of the 2008 meltdown, not a single top Wall Street executive was ever criminally charged.
The former Fed chair… blamed ongoing political hostility targeting the Fed on the Fed’s inability to properly communicate what it was doing at the time and why the bailouts were necessary.
“The Fed is not a law enforcement agency,” he said. “The Department of Justice and others are responsible for that, and a lot of their efforts have been to indict or threaten to indict financial firms. Now a financial firm is of course a legal fiction. It’s not a person. You can’t put a financial firm in jail.”
At a Parliamentary Committee hearing a few years ago I asserted, boldly, that global interest rates were at their lowest-ever levels. A wise colleague challenged me afterwards: “How do you know they weren’t lower in Babylonian times?” Several exhausted research assistants later I can report that, luckily, I was on safe ground. Interest rates appear to be lower than at any time in the past 5000 years...
SEC settles with hedge fund billionaire Steven Cohen
By Renae Merle January 8 at 3:59 PM
NEW YORK — Billionaire Steven A. Cohen has been in the crosshairs of federal prosecutors for nearly a decade. His hedge fund, SAC Capital, was once one of the most powerful on Wall Street, managing more than $15 billion for investors and producing stellar returns for years.
But prosecutors suspected that SAC’s success was too good to be true.
U.S. Attorney Preet Bharara in Manhattan once called Cohen’s hedge fund a “veritable magnet for market cheaters.” When, in 2013, SAC agreed to pay $1.2 billion to settle charges that it tolerated rampant insider trading, it was one of the highest-profile successes in the government’s aggressive push against insider trading.
Still, connecting Cohen, one of the richest people in the world, directly to those misdeeds has remained elusive. And on Friday, the Securities and Exchange Commission essentially conceded. The Wall Street watchdog settled its nearly three-year-old civil case against Cohen, who was accused of failing to properly supervise employees, with no financial penalty.
Instead, Cohen’s new firm, Point72, which manages his $10 billion personal fortune, must hire an independent consultant to make sure it complies with securities laws. Once at risk of being banned from the industry for life, Cohen can begin managing others’ money again in 2018, under the agreement.
“It’s the ultimate slap on the wrist, if he’s smart in two years he [Cohen] will be back managing money,” said Gene Murphy, a white-collar defense attorney at Murphy & Hourihane in Chicago.
It is a remarkable turnaround for one of the most recognizable people on Wall Street. Even as he lived under federal investigation, Cohen remained an influential figure, with industry insiders closely following which stocks he bought and sold. When Cohen spent $155 million on a Pablo Picasso painting titled “Le Reve” just days after SAC settled with federal regulators, it was covered in the local tabloid press.
“Inevitably, some will ask why I agreed to settle,” Cohen said in a letter to Point72 employees obtained by The Washington Post. “The longer the pending litigation lingered, the more it distracted from the world-class Firm that we are building.”
“When SAC pled guilty, I vowed that what happened to SAC would never happen to Point72,” Cohen said in the letter to employees. “It is a testament to your perseverance, talent, and focus that we not only survived an event that would have ended most firms, but we thrived in the wake of it.”
The settlement is a humbling end to one of the government’s most high-profile cases. The allegations against Cohen weakened last year when prosecutors were forced to drop their case against former SAC employee Michael Steinberg. Steinberg, a longtime Cohen confidant, had been found guilty of trading on illegal tips involving technology stocks. But prosecutors abandoned that case and several others after an appeals court decision made it more difficult for authorities to pursue certain kinds of wrongful trading cases.
For a while, the SEC appeared poised to continue to pursue Cohen by focusing on his alleged failure to supervise another employee, Mathew Martoma, who was convicted last year of what prosecutors described as the most lucrative insider-trading scheme ever. Martoma is appealing his conviction and Cohen did not acknowledge wrongdoing in Friday’s settlement.
“Insider trading cases have always been difficult to make, but with the [appeals court ruling], the SEC was dealt an incredibly tough hand, and the deal with Cohen is probably the best it could have hoped for under the circumstances,” said Jordan Thomas, a partner at Labaton Sucharow and a former Justice Department trial lawyer.
Russia Breaking Wall St Oil Price Monopoly
By F. William Engdahl
January 13, 2016 "Information Clearing House" - "NEO" - Russia has just taken significant steps that will break the present Wall Street oil price monopoly, at least for a huge part of the world oil market. The move is part of a longer-term strategy of decoupling Russia’s economy and especially its very significant export of oil, from the US dollar, today the Achilles Heel of the Russian economy.
In late November, the Russian Energy Ministry announced that it would begin test-trading of a new Russian oil benchmark. While this might sound like small beer to many, it’s huge. If successful, and there is no reason why it won’t be, the Russian crude oil benchmark futures contract traded on Russian exchanges will price oil in rubles and no longer in US dollars. It is part of a de-dollarization move that Russia, China and a growing number of other countries have quietly begun.
The setting of an oil benchmark price is at the heart of the method major Wall Street banks use to control world oil prices. Oil is the world’s largest commodity in dollar terms. Today, the price of Russian crude oil is referenced to what is called the Brent price. The problem is that the Brent field, along with other major North Sea oil fields, is in major decline, meaning that Wall Street can use a vanishing benchmark to leverage control over vastly larger oil volumes. The other problem is that the Brent contract is controlled essentially by Wall Street and the derivatives manipulations of banks like Goldman Sachs, Morgan Stanley, JPMorgan Chase and Citibank.
The ‘Petrodollar’ demise
The sale of oil denominated in dollars is essential for the support of the US dollar. In turn, maintaining demand for dollars by world central banks for their currency reserves to back foreign trade of countries like China, Japan or Germany, is essential if the United States dollar is to remain the leading world reserve currency. That status as world’s leading reserve currency is one of two pillars of American hegemony since the end of World War II. The second pillar is world military supremacy.
US wars financed with others’ dollars
Because all other nations need to acquire dollars to buy imports of oil and most other commodities, a country such as Russia or China typically invests the trade surplus dollars its companies earn in US government bonds or similar US government securities. The only other candidate large enough, the euro, has been seen as riskier since the 2010 Greek crisis.
That leading reserve role of the US dollar, since August 1971 when the dollar broke from gold-backing, has essentially allowed the US Government to run seemingly endless budget deficits without having to worry about rising interest rates, like having a permanent overdraft credit at your bank.
That in effect has allowed Washington to create a record $18.6 trillion federal debt without major concern. Today the ratio of US government debt to GDP is 111%. In 2001, when George W. Bush took office and before trillions were spent on the Afghan and Iraq “War on Terror,” US debt to GDP was just half that, at 55%. The glib expression in Washington is that “debt doesn’t matter,” the assumption being that the world (Russia, China, Japan, India, Germany) will always buy US debt with its trade surplus dollars. The ability of Washington to hold the lead reserve currency role, a strategic priority for Washington and Wall Street, is vitally tied to how world oil prices are determined.
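The two figures above, the $18.6 trillion debt and the 111% debt-to-GDP ratio, imply a rough GDP estimate; the following is my own back-of-the-envelope arithmetic, not a number from the article:

```python
debt = 18.6e12   # federal debt cited in the article, in USD
ratio = 1.11     # 111% debt-to-GDP ratio, also from the article

# GDP consistent with those two numbers (an implied value, not a quoted one)
implied_gdp = debt / ratio
print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")  # ~$16.8 trillion

# In 2001 the ratio was 55%, so it roughly doubled over the period,
# meaning debt grew about twice as fast as GDP.
print(f"Ratio change since 2001: {1.11 / 0.55:.2f}x")  # ~2.02x
```

The implied GDP of roughly $16.8 trillion matches US GDP in the mid-2010s, so the article's two figures are at least internally consistent.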
In the period up until the end of the 1980s, world oil prices were determined largely by real daily supply and demand. It was the province of oil buyers and oil sellers. Then, in the 1980s, Goldman Sachs bought the small Wall Street commodity brokerage J. Aron, with its eye set on transforming how oil is traded in world markets.
It was the advent of “paper oil”: oil traded in futures contracts independent of delivery of physical crude, easier for the large banks to manipulate based on rumors and derivative-market skullduggery. A handful of Wall Street banks dominated oil futures trades and knew just who held what positions, a convenient insider role that is rarely mentioned in polite company. It was the beginning of transforming oil trading into a casino where Goldman Sachs, Morgan Stanley, JPMorgan Chase and a few other giant Wall Street banks ran the crap tables.
In the aftermath of the roughly 400% rise in the price of OPEC oil in a matter of months following the October 1973 Yom Kippur War, the US Treasury sent a high-level emissary to Riyadh, Saudi Arabia. In 1975, US Treasury Assistant Secretary Jack F. Bennett was sent to Saudi Arabia to secure an agreement with the monarchy that Saudi and all OPEC oil would be traded only in US dollars, not Japanese yen, German marks or any other currency. Bennett then went on to take a senior job at Exxon. The Saudis got major military guarantees and equipment in return, and from that point, despite major efforts by oil-importing countries, oil to this day is sold on world markets in dollars, with the price set by Wall Street via control of the derivatives or futures exchanges, such as the Intercontinental Exchange (ICE) in London, the NYMEX commodity exchange in New York, and the Dubai Mercantile Exchange, which sets the benchmark for Arab crude prices. All are owned by a tight-knit group of Wall Street banks: Goldman Sachs, JPMorgan Chase, Citigroup and others. At the time, Secretary of State Henry Kissinger reportedly stated, “If you control the oil, you control entire nations.” Oil has been at the heart of the Dollar System since 1945.
Russian benchmark importance
Today, prices for Russian oil exports are set according to the Brent price as traded in London and New York. With the launch of Russia’s benchmark trading, that is due to change, likely very dramatically. The new contract for Russian crude in rubles, not dollars, will trade on the St. Petersburg International Mercantile Exchange (SPIMEX).
The Brent benchmark contract is used presently to price not only Russian crude oil; it sets the price for over two-thirds of all internationally traded oil. The problem is that North Sea production of the Brent blend has declined to the point that today only about 1 million barrels per day of Brent blend production sets the price for 67% of all internationally traded oil. The Russian ruble contract could make a major dent in the demand for oil dollars once it is accepted.
Russia is the world’s largest oil producer, so the creation of a Russian oil benchmark independent of the dollar is significant, to put it mildly. In 2013 Russia produced 10.5 million barrels per day, slightly more than Saudi Arabia. Because natural gas is mainly used domestically, fully 75% of Russia’s oil can be exported. Europe is by far Russia’s main oil customer, buying 3.5 million barrels a day, or 80% of total Russian oil exports. The Urals Blend, a mixture of Russian oil varieties, is Russia’s main exported oil grade. The main European customers are Germany, the Netherlands and Poland. To put Russia’s benchmark move into perspective, the other large suppliers of crude oil to Europe, Saudi Arabia (890,000 bpd), Nigeria (810,000 bpd), Kazakhstan (580,000 bpd) and Libya (560,000 bpd), lag far behind Russia. As well, domestic production of crude oil in Europe is declining quickly. Oil output from Europe fell just below 3 million barrels per day in 2013, following steady declines in the North Sea, which is the basis of the Brent benchmark.
End to dollar hegemony good for US
The Russian move to price its large oil exports in rubles on the new benchmark at the St. Petersburg International Mercantile Exchange, exports that flow especially to Western Europe and increasingly to China and Asia via the ESPO pipeline and other routes, is by no means the only move to lessen countries’ dependence on the dollar for oil. Sometime early next year China, the world’s second-largest oil importer, plans to launch its own oil benchmark contract. Like Russia’s, China’s benchmark will be denominated not in dollars but in Chinese yuan. It will be traded on the Shanghai International Energy Exchange.
Step-by-step, Russia, China and other emerging economies are taking measures to lessen their dependency on the US dollar, to “de-dollarize.” Oil is the world’s largest traded commodity and it is almost entirely priced in dollars. Were that to end, the ability of the US military industrial complex to wage wars without end would be in deep trouble.
Perhaps that would open some doors to more peaceful ideas, such as spending US taxpayer dollars on repairing the horrendous deterioration of basic US economic infrastructure. The American Society of Civil Engineers in 2013 estimated that $3.6 trillion of basic infrastructure investment is needed in the United States over the next five years. They report that one out of every nine bridges in America (more than 70,000 across the country) is deficient. Almost one-third of the major roads in the US are in poor condition. Only 2 of 14 major ports on the eastern seaboard will be able to accommodate the super-sized cargo ships that will soon be coming through the newly expanded Panama Canal. There are more than 14,000 miles of high-speed rail operating around the world, but none in the United States.
That kind of basic infrastructure spending would be a far more economically beneficial source of real jobs and real tax revenue for the United States than more of John McCain’s endless wars. Investment in infrastructure, as I have noted in previous articles, has a multiplier effect in creating new markets, generating economic efficiencies and tax revenues of some $11 for every dollar invested as the economy becomes more efficient.
A dramatic decline in the role of the dollar as world reserve currency, if coupled with a Russian-style domestic refocus on rebuilding America’s economy rather than outsourcing everything, could go a long way toward rebalancing a world gone mad with war. Paradoxically, de-dollarization, by denying Washington the ability to finance future wars through the purchase of US Treasury debt by Chinese, Russian and other foreign bond buyers, could be a valuable contribution to genuine world peace. Wouldn’t that be nice for a change?