AlterNet | Are Mega-Corporations and Wall Street Killing Electronic Dance Music?
By Julianne Escobedo Shepherd | Posted on July 1, 2012; printed on July 8, 2012
Over the last two years, electronic music has become bigger across the United States than at any point in its history, even at the height of the rave era in the 1990s. For lifelong fans, its sudden rise has been astonishing. For years, even though house and techno were born in the American Midwest, those of us stranded stateside looked on as electronic music became a staple of European pop culture, while we were left seeking out underground clubs and boutique record stores, feeling niche-ier than ever. But now dance music is so mainstream that the corporate powers that be have rebranded it as “electronic dance music,” or EDM, a term self-respecting dance music fans tend to despise.
As “EDM” spreads, it seems that it could even supplant hip-hop as the country’s dominant youth culture. The evidence is in the music: producer/DJs like Skrillex and Deadmau5 pull millions of dollars in fees, and have become godheads for young fans obsessed with the deep wobble of dubstep. Meanwhile, classic R&B and rap stars like Usher, Rihanna and Nicki Minaj currently rule the Billboard top 10 with singles that sound suspiciously like techno and house.
Where the zeitgeist has changed, so has the money. Gone are the underground warehouse raves of two decades ago (unless you know where to look!). The leading promoters of dance music events are the selfsame huge corporate entities that push the term EDM—and are, some dance fans say, robbing the music of its soul for their own end. (As corporations do.)
This year, the Electric Daisy Carnival, a festival held in Las Vegas, attracted an unprecedented 140,000 people a day. (The slowest day: 90,000.) But the main purveyor of corporate EDM is a name you may have heard of: Live Nation Entertainment, the ginormous entity born of a much-disputed merger between Live Nation promotions (formerly Clear Channel Entertainment) and Ticketmaster, forming what some still see as a monopoly.
This week, Live Nation purchased Hard Events, one of the bigger quasi-underground promotions companies, which throws parties across the country and also puts on a yearly rave cruise called Holy Ship! that’s been a hot ticket among dance fans despite its full-vacation cost. The New York Times:
Big money is flowing into electronic dance music. In the latest example of corporate interest in this once-ignored market, Live Nation Entertainment said on Tuesday that it had acquired Hard Events, a Los Angeles company that has put on popular festivals and concerts across North America.
Yet such investments are fueling fears that a bubble is taking hold in the world of electronic dance music, or E.D.M., jeopardizing the creative and commercial health of the music. The issue has been intensely debated inside the music business, and recently some of the genre’s stars have sounded alarms as well.
Meanwhile, from early June, a piece titled “A Concert Mogul is Betting on Electronic Dance Music” detailed how Live Nation’s former mastermind, Robert F.X. Sillerman, is planning to float a whopping $1 billion into the market, investing in already established, smaller concert promoters across the country and letting them bring in the money.
As with every genre before it, corporate influence in a previously untouched genre is cause for handwringing, not least because dilution of the music seems inevitable the bigger it gets. Live Nation’s acquisition of Hard was especially upsetting because of Hard’s reputation for supporting underground as well as marquee artists, and for respecting the purity of the experience. For instance, the upcoming lineup for Hard Summer 2012, in Los Angeles, places big names like Skrillex, Nero and Boys Noize alongside more boutique artists like Araabmuzik, Buraka Som Sistema and Brenmar. With the influence of Live Nation, one wonders whether this sort of juxtaposition will continue to flourish, or whether ticket prices will simply become even less affordable.
Deadmau5, for one, seemed verklempt. From the New York Times article:
“E.D.M. has turned into a massively marketed cruise ship, and it’s sinking fast,” the D.J. and producer Deadmau5 wrote on his Tumblr page on Tuesday. “All I’m trying to do is put on my life jacket and swim as far away from this shipwreck as fast as I can.”
Some dance fans might find Deadmau5’s freakout pretty rich—he’s on the cover of Rolling Stone right now, and many believe his large-scale, somewhat ham-fisted take on the already ham-fisted genre of dubstep is part of the problem. On the other hand, if a man who has benefited from the creation of EDM (TM!) is feeling this kind of distress, one wonders if all of us should be worried.
Dance music, like hip-hop before it, is accompanied by a very specific and largely organic culture. Sometimes fueled, notoriously, by body-high drugs like ecstasy and MDMA, it’s got an even subtler code to it that goes beyond the broad strokes of party drugs, and spans fashion to slang to micro-genres. (Dance music all over the world is constantly splitting into micro-genres.) If a niche culture suddenly explodes into a massive moneymaker—one Wall Street is increasingly dabbling with—what happens to that culture?
Respected and (relatively) inveterate electronic music critic Philip Sherburne has been pondering these things from the vantage of someone who’s been involved for decades. (One of his claims to fame: he coined the term “microhouse” in 2001, describing a then-popular, mostly German take on house music that seemed to model itself on ping-pong balls echoing across Styrofoam.) In a recent piece for Spin, Sherburne predicts:
As Spinal Tap's lovably cutthroat Bobbi Flekman once explained, "Money talks, and bullshit walks." But if the big money's not careful, it could witness an exodus of the dance-music faithful from the playground being constructed for them with all the haste of a Nevada high-rise. Not that the major labels and mega-agencies don't have a place in the equation, but electronic dance music always has been, fundamentally, a culture of independent labels and intractable fans, and that's not likely to change any time soon. We may be headed for a split in the scene, as the big investors create a gated community designed to their own specifications, and the strong-willed masses make their own rave in the vacant lot adjacent.
It’s an optimistic take, and he goes on to point out that corporate rock didn’t wipe out the underground. (I would counter that hip-hop, with its knotty combination of race and marketing, didn’t exactly escape its corporatization unscathed.) The problem with a potential corporate split between mainstream EDM and other, non-corporate strains of dance music is that in some cases the music and the fans stop speaking to each other, and you end up with a viciously divided scene like the underground-vs.-mainstream rap arguments of the late 1990s. (Which were incredibly tiring and are only just starting to heal.) Others tend to agree; from an item at Global Dance Music called “Electronic Dance Music Continues to Go Corporate”:
Until a genre garners mainstream attention and popularity, independent promoters are entirely dedicated to keeping a particular scene running. Interest from large corporate heads provides added financial stability and cushion to independent promoters with varying degrees of involvement, the absolute brand evangelists, but with little to no experience. Let's call promoters the pioneers of the game; does a corporation fit?
The financial details of the deal between Live Nation and Hard Events were not disclosed to the public, and while the EDM community continues to worry about corporate involvement in a once-underground genre, our option resides in watching a developing fickle genre unfold.
Unfolding, indeed. For now, the Hard Festival lineup remains true to its past and its fanbase; meanwhile, niche-ier, independent dance parties continue to rule the underground, from LA to Detroit, Chicago to Miami, New York to London, Distrito Federal to Buenos Aires, Soweto to Goa, as surely as if it were 1992. But in the meantime, dance music fans should keep watch.
As the Future of Music Coalition points out, there’s a chasm of difference between what we consider a “better dance music experience,” and what a white-collar corporate mogul considers a better dance music experience. Especially when his favorite musician is Bob Dylan.
Julianne Escobedo Shepherd is an associate editor at AlterNet and a Brooklyn-based freelance writer and editor. Formerly the executive editor of The FADER, her work has appeared in VIBE, SPIN, New York Times and various other magazines and websites.
© 2012 Independent Media Institute. All rights reserved.
View this story online at: http://www.alternet.org/story/156101/
New York Times | A Concert Mogul Is Betting on Electronic Dance Music
By Ben Sisario | Posted on June 5, 2012
The man who corporatized the concert industry is back, and he wants to dance.
Robert F. X. Sillerman, the media executive who transformed the live music business in the 1990s by combining regional concert promoters into the nationwide powerhouse that became Live Nation, has returned to the business with the first of what he expects will be a string of investments in electronic dance music, the industry’s latest trend.
Echoing his strategy in the concert business, Mr. Sillerman is pursuing independent companies that put on dance festivals, D.J. parties and other events where the crowds might range from a few hundred people to tens of thousands. He said in an interview on Monday that his first acquisition was Disco Productions, a Louisiana company that was founded by a rave promoter, Donnie Estopinal, and puts on events throughout the country.
Mr. Sillerman, 64, said that in addition to that deal he was in negotiations with up to 50 other companies, and had tentative agreements with about 15 of them. He declined to disclose terms of the Disco Productions deal, but said that he expected his new company — which is called SFX Entertainment, reviving the name of his earlier concert business — to spend $1 billion on acquisitions within a year, and that he wanted to take it public this summer.
The plan for SFX, Mr. Sillerman said, is still being formulated but will involve using the Internet to connect fans of dance music. If his strategies from the 1990s are a guide, he might also want to deliver this aggregated audience to major advertisers and marketers.
“There’s a wave of interest in attending concerts that have less to do with the specific music and more to do with the experience attached to the music,” he said, referring to the immersive appeal of many large-scale dance events. “Our thought is that the experience of attending an individual event can be perpetuated and made better by connecting the people, not just when they’re consuming the entertainment but when they’re away from it.”
When Mr. Sillerman, who made his fortune in radio, turned to the concert world in the mid-’90s, it was dominated by independent promoters with regional fiefs. Promoters like Michael Cohl, who worked with the Rolling Stones and U2, and who is now the lead producer of “Spider-Man: Turn Off the Dark” on Broadway, were also beginning to put on global mega-tours with corporate sponsorships.
Over a few years, SFX spent $1.2 billion to buy dozens of regional promoters and combined them into a national organization. In 2000, Clear Channel Communications bought SFX for $4.4 billion. Clear Channel later spun off its concert division into Live Nation, which in 2010 merged with Ticketmaster to form Live Nation Entertainment.
“Bob changed the game that had been in place for decades,” said Josh Baron, the editor of the music magazine Relix and co-author of the book “Ticket Masters: The Rise of the Concert Industry and How the Public Got Scalped.”
He added, “He brought together promoters who were archrivals, and he brought Wall Street to the rock business.”
Mr. Sillerman’s corporate approach to the concert industry has had its detractors, who say that it has led to higher prices for consumers and contributed to a hypercompetitive bidding process that has made events far riskier to put on.
Live Nation has also had problems. Since going public in 2005, it has never turned an annual profit. But Mr. Sillerman said he believed that the basic business strategy behind combining multiple concert companies was sound. “The fundamental premise that combining businesses that are inefficient and making them more efficient always makes sense,” he said.
Mr. Sillerman’s investments are only the latest in a wave of corporate interest in dance music. Long considered a stable if marginal genre, dance has lately entered the mainstream as never before, drawing more than 100,000 fans to festivals like Electric Daisy Carnival in Las Vegas and Ultra in Miami. This week Electric Daisy is sponsoring a business conference in Las Vegas called EDMbiz, before the festival returns this weekend.
Live Nation has also been moving aggressively into dance music. Last month it bought a major British festival promoter, Cream Holdings, and last week it announced that it would be putting on a two-night dance event, Sensation, at the new Barclays Center in Brooklyn in October.
Mr. Sillerman — whose own tastes lean to Bob Dylan and Paul Simon — said that after setting the overall corporate strategy for the new SFX he expected to leave regional promoters like Mr. Estopinal to put on events the way they saw fit. But the popularity of the dance genre and the promise of connecting audiences on the Internet, he said, had enormous profit potential.
“I’m confident we’ll do an excellent job empowering these kids to be as good as they can be,” he said of the promoters he expected to bring into the fold. “I’m also confident that we will create a better experience for the fans. Can we monetize that? If we can, this will dwarf the first SFX. That’s the whole game.”
The Net’s Good Old Boys: Hacking the Arpanet
By Geoff Dutton
On January 4, 2018 @ 2:00 am
It’s hard to imagine now, but there was a time before the Internet, a time when computers took up more space than the acolytes who tended to their needs. In the 70s I was one such boffin, a postgrad hacking away in a university R&D lab. Computers then were still quite dear, and so we made do with terminals that sucked electrons from the teat of a minicomputer several blocks away through fiber cable.
Our digital host had recently been hooked up to the Arpanet, the Internet’s predecessor, giving us real-time access to several dozen academic, government, and military computers scattered across the US. We used it to chat and exchange files and email with people we knew here and there, but mostly we wasted time and bandwidth psyching out the robot psychotherapist Eliza and playing text-based games like Adventure and Hunt the Wumpus, just like today’s youth do but more primitively.
DoD’s Advanced Research Projects Agency (ARPA) had funded the network to develop a prototype military communications system. They let scientists play with it and observed what they were up to (how carefully, nobody without an appropriate security clearance can really say). For us geeks, it was a cozy play-space with a few thousand presumably collegial users. No spam, no malware, no ads, no Web, and so it would remain for another dozen years. But it did not remain free of intruders for long.
Soon it became evident that strangers were snooping around the Net: mystery hackers who, after a guy calling himself the Lone Ranger befriended them and ratted them out to the FBI, turned out to be a bunch of teenagers. An odd collection of middle- and high-school misfits, they traded exploits on dial-up hosts, phreaking long-distance calls to reach Arpanet portals with purloined credentials they passed around. An Atari 400 or a Radio Shack TRS-80 and a modem was all the hardware a kid needed to sneak in, swipe logins (most people used default passwords or none at all), read files and emails, and, for a lark, change file and account names and passwords to whatever they felt like. Few of these kids ever met face-to-face. They bonded through online bulletin boards, the social media of the day, hidden behind screen names. At a tender age, these kids instinctively knew that real business gets transacted under the radar.
It was innocent fun for the “whiz kids,” as they were then labeled. Had ransomware been in circulation then, they wouldn’t have used it; they were explorers, not thieves. Their only ill-gotten gains were slices of computer time and free phreaked phone calls. Things are different now that everyone and everything is online. Empires are at stake, or so key players believe, and that makes it so. After all, did not the military create the Internet in service to empire?
Today, as perhaps then, the underlying task of the Internet is surveillance, but nobody who does it admits to it, any more than corporate news media admit that maximizing ad revenue is their main concern. And now that Net Neutrality is a lost cause, muffling anti-establishment messages, enforcing ignorance, and shaping opinions will become much easier to do. They just want us to have such a good time indulging in online pastimes that we stop caring who’s filtering facts and following our mousetracks.
By the 80s, some of us saw this coming. Not so much all the mining, trading, and manipulating of personal data, but what seemed like a dash to create Big Brother, antennae on down. In this century, every netizen is quickly becoming inured not just to the personal data complex sorting through our lives, preferences, and opinions, but to being tracked wherever we may go. As Max Barry writes in Lexicon,
…I don’t care that much if these organizations want to know where I go and what I buy. But what bothers me is how HARD they’re working for all that data, how much money they’re spending, and how they never admit that’s what they want. It means that information must really be valuable for some reason, and I just wonder to who and why.
I hardly recall the cyber-experts asking that. Seldom did the computer science literature of that formative era broach network technology’s totalitarian potential. The high priests endlessly discussed computer-system vulnerabilities and how to prevent them, but mostly focused on fending off intrusions from hackers and criminals, not spooks, the military, or law enforcement. Even the acolytes who knew why the DoD was building the Arpanet did not seem particularly interested in discussing any undesirable consequences. As Matt Novak remarks in an article about Arpanet in PaleoFuture:
When you look at how the early internet was used by the intelligence and defense communities, you see that our internet infrastructure was never the Wild West. It was built deliberately and strategically. Some of the earliest uses of the ARPANET were for monitoring the military activities of America’s adversaries, decades before most people even knew what networked computing was.
Novak compiled an animation that visualizes the Arpanet extending its tentacles over its 20 years of existence. Halfway through, by 1980, the Arpanet had proven itself. By then it was already clogged with civilian email messages, something DoD disapproved of. Understanding that Arpanet was inherently insecure, the Pentagon established its own communications network (MILNET), as did the NSA with its COINS II network (Community On-line Information System), built to internally share intelligence data away from prying eyes. As these networks were based on and had gateways to the Arpanet, they were similarly open to compromise.
The Internet wasn’t designed to be secure; quite the opposite, its code and protocols are rife with holes and backdoors, deliberately put there for reasons that might or might not have to do with government surveillance. A frequent and well-informed commenter on Bruce Schneier’s security blog notes (12/16/2017):
… most Internet vulnerabilities at the protocol and standards layers have been there since day one. Because they were quite deliberately built in from day zero.
It was almost certainly not done maliciously but to “solve problems within resource constraints” that no longer apply.
The thing is nobody wants to spend money to solve these problems, usually portrayed with the excuse of “don’t break legacy systems” as it’s the almost perfect “get out clause”. As well as the biggest soirce [sic] of not just technical debt but building in security vulnerabilities across the board.
The Arpanet was hacked not long after it came out. See this timeline to read up on the highlights. Some of its vulnerabilities stem from its architects’ failure to credit malicious actors with the cleverness to find their way in. Others developed over time as engineers applied patch upon patch, desperately trying to keep up with the Internet’s burgeoning size and its growing traffic of online transactions, big data, and multimedia. In essence, there was no roadmap and no anticipation of criminal use or how to thwart it, compounded by a general reluctance on the part of IT vendors, their customers, and government agencies to assume the cost and complexity of securing their systems.
Of course, spy agencies are happy to exploit the security holes and seem to snoop everywhere they care to. The question here is, did the architects of Arpanet deliberately load their standards, protocols, software, and machinery with features that would allow the government to intercept traffic and penetrate computers on the Net, and did they disregard security flaws when these were brought to their attention?
That’s quite possible. After the NSA seeded doubt in 2012, companies around the world refused to purchase network routers from Chinese maker Huawei, concerned that they concealed “back doors” in their code that would let a knowledgeable hacker log on to intercept or interject network packets. This after Der Spiegel broke news that NSA had infiltrated Huawei’s IT system to retrieve source code for their products, as well as “a list of 1,400 customers as well as internal documents providing training to engineers on the use of Huawei products, among other things.”
There’s no direct evidence that the NSA slipped spyware into Huawei equipment, but in 2014 the Intercept revealed NSA documents indicating that the agency was “interdicting” shipments of US-made networking equipment destined overseas in order to inject its own code before buyers took delivery. A network router or switch may contain 30 million lines of code, making it far from easy for a customer to verify that it’s spook-free. And should the NSA’s bugs be discovered, the manufacturer’s reputation and stock price will certainly take hits.
We can only speculate whether there are similar trapdoors in routers sold in the US market, but it almost doesn’t matter. NSA has many other ways to sniff out your packets to archive your messages for future reference, along with data on your computer’s Internet traffic generously supplied by your ISP. And for that robust surveillance capability, give thanks to Bobby Inman and the unsung architects of the Arpanet who paved the way.
Don’t know Bobby Inman? Read “The Net’s Good Old Boys, Part 2: If We only Knew Then,” coming up next.
For further reading
Wired Magazine 5/9/06: Ex-NSA Chief Assails Bush Taps
DefenseTech News 5/9/06: Ex-NSA Chief Blasts Taps, Calls for CIA Breakup
Max Barry, Lexicon, Penguin Books, 2013
Der Spiegel 3/22/14: NSA Spied on Chinese Government and Networking Firm
CNET News 5/12/14: NSA reportedly installing spyware on US-made hardware
Washington Post 5/30/15: Net of Insecurity: A Flaw in the Design
Washington Post 6/22/15: Net of Insecurity: A disaster foretold — and ignored
Gizmodo 2/20/15: A History of Internet Spying, Part 2
Gizmodo 8/15/15: The Secret Project to Turn the Internet into an Anti-Soviet Spy Network
PC World 11/17/15: How Cisco is trying to keep NSA spies out of its gear
Article printed from www.counterpunch.org: https://www.counterpunch.org
URL to article: https://www.counterpunch.org/2018/01/04 ... e-arpanet/
Bishop warns of 'evil internet'
Saturday, 8 April, 2000, 12:02 GMT 13:02 UK
The internet has the potential to destroy society, the Archbishop of York has warned. Archbishop David Hope said that computer "wizardry" was in danger of creating a "society without a soul".
"This technology is something that could ultimately devour us," he said in an interview with Conservatism, the quarterly journal of the Conservative Christian Fellowship. The archbishop's comments follow a Church of England report that warned that society should wake up to the ethical and spiritual implications of the internet.
In February, the archbishop of Canterbury, Dr George Carey, warned about what he saw as the perils of internet use, saying it could be exclusive and isolating.
'No social interaction'
Dr Hope expressed concern at the way the internet could limit levels of human interaction. "I fear that we are becoming a nation which simply sits in front of a television screen and orders its lives at the press of a button or mouse," he said. "The danger is in having all this wizardry in individual homes which people never leave and where there is, as a result, no social interaction. Like all these developments, there is that which has the potential for good, and that which has the potential for evil. There is in the internet the potential for destroying ourselves."
Chris Wright, chairman of the group Christians on the Internet, said he was sympathetic to Dr Hope's comments. But he warned against Christians failing to become involved in the development of the new technology.
"He has pointed out the dangers of the internet just like there are great dangers in books and other communication mediums. I think there is an even greater danger, though, in being afraid of it," he said. "Just like the Church is deeply involved in work in areas such as red light districts in towns and cities and working amongst the dregs of society, we need to be involved in the internet and using our influence for the good."
Before she set out in her carriage to St Paul's for her Diamond Jubilee in 1897, Victoria sent a telegram from the Buckingham Palace cable room: "Thank my beloved people. May God bless them." It was immediately transmitted to the St Martin's Le Grand Central Telegraph Office.
By the time Victoria had been helped into her carriage in the palace courtyard, the message had reached Tehran.
As her carriage rolled past the massed cavalrymen and Londoners on the Mall, her message had reached Ottawa, all the West African colonies down to the Cape of Good Hope, every British Caribbean island and the Queen's beloved India. Before Victoria had clattered under Admiralty Arch, her message had reached Melbourne and Wellington, the edge of the empire and the end of the world.
Friends and enemies alike couldn't quite fit the empire in their heads. The New York Times wrote of the occasion, "we are part, and a great part, of the Greater Britain which seems so plainly destined to dominate this planet." Women's groups in Brooklyn sang God Save the Queen. In France, Le Figaro declared that Rome had been 'equalled, if not surpassed' and even the Germans described Victoria's empire as 'practically unassailable'.
This is an astonishing level of technological superiority for its time. No other nation came close. The Empire, at its peak, almost qualifies as a Breakaway Civilisation of the Richard Dolan type. And it was the communications infrastructure that kept the Empire and its navy unassailable.
The World Google Controls and Surveillance Capitalism
By Julian Vigo
December 17, 2018 @ 1:49 am
I have been following the scandal of the UK’s Investigatory Powers Act (also known as the Snoopers’ Charter) and Holland’s Sleepwet, and their relationship to encroaching government powers over private data, privacy, data collection, surveillance, and free speech, for several years now. Very much related to these bills, created ostensibly to protect us from “terrorism,” is Google’s encroaching power over our lives, including the freedom of expression protected by most national laws, not to mention EU and UN charters, around the planet today.
When the Internet became a tool for communication and research in the late 1980s (usually through universities and research institutes) and was later rendered public through commercial Internet service providers (ISPs) in 1991, most people were slow to catch on. Initially, I was inculcated into Internet culture by virtue of being a graduate student at New York University, where I came to depend on the computer labs to churn out papers when not using friends’ computers. I still remember Archie, Telnet, and line-mode browsers before the release of ViolaWWW. By the mid-1990s students were curious about hypertext via Memex and Xanadu, while many others made personal webpages that they would write in HTML with the help of on- or offline instructions. The concept of a free website builder had not yet emerged, and everything was very much ad hoc: individuals figuring out how to fiddle with HTML as if it were a late-20th-century Mini Cooper under whose hood the user could play around. And yes, there were the flashing bright lights that every webpage seemed to embrace, as if willing an epileptic seizure on everyone who visited.
These were the golden days of the Internet, when anything was acceptable, including the esthetically challenging: old-school graphics and simple layouts with repeating background images that defy any description. These were the days when websites were entirely about content, such that if you wanted to read up on the Klingon Language Institute, presentation was tertiary, if a concern at all. Even by the mid-1990s most businesses had not caught onto the potential of the Internet for marketing, public relations, and advertising. The money for publicity still flowed largely through traditional channels, and companies did not think that people would be using the Internet for commerce, much less research.
In 1995, when the NSF (National Science Foundation) began charging a fee for registering domain names, there were only 120,000 registered domain names. By 1998, this number rose to over 3 million. And while Amazon started in 1994, the birth of eBay a year later kicked off e-commerce definitively. Still, most businesses did not actively incorporate the Internet into their structures, and the cost of building a website was not even an afterthought for most, given the Internet-on-a-shoestring approach that many of us ran with. I was working on my PhD at this time and finding that my ability to learn languages was directly applicable to computer languages; I was able to volunteer for friends and even carve out a living writing web pages and building early e-commerce sites. Web designers in Manhattan were quickly becoming desirable and well paid as we rolled towards the new millennium, with more and more businesses and individuals realizing the potential of the Internet.
The thing is, until 2000 the Internet existed for most people as a virtual encyclopedia, a news reference, an information center for checking cinema times. There were even early prototypes of Skype and messengers like ICQ, where peer-to-peer communications were viewed as a novelty. I had my first Internet conversation from my apartment in Park Slope with a man living at the foot of Mount Kilimanjaro. The Internet was an information highway, unregulated and quite flexible considering the kinds of technology it was slowly replacing. Privacy schmivacy, right?
However, since 9/11 specifically, and more recently around a series of culture wars, we have seen how governments around the planet have, from the beginning of the new millennium, tightened ship and set out various legal initiatives that make it possible for them to spy on their citizens. The US can be credited with fomenting such legislation, which claims to do one thing (secure the “homeland”) while in reality doing something quite different. So 45 days after 9/11, the Patriot Act, a vile piece of legislation that resulted in the disappearance of over 14,000 Muslim men within the United States, was born. The residual force of the Patriot Act lay in the fact that it made it easier for the US government to spy on its citizens, with the government issuing National Security Letters (NSLs) without the need for a judge to sign off. The Patriot Act gave a new twist to McCarthyism, since it put the power of the law into the hands of 43,000 law-enforcement agents who had access to phone records collected through the NSLs. While most people today are aware of the importance of Edward Snowden’s and Julian Assange’s efforts to challenge the US government’s illegal acts of espionage on its own citizenry and its illegal acts of violence, what many do not remember is how the Global War on Terror (GWOT) instigated much of the legislation that rolled out enormous powers to Homeland Security, which dismantled the INS (Immigration and Naturalization Service) and put immigration in the same bracket as terrorism.
From the US to the EU, one thing has become painfully clear to me in recent months: free speech, freedom of conscience, and privacy are all under threat by big tech companies like Facebook, Twitter, and Google. In fact, these companies are far more the enemy of the people than the NSA (National Security Agency) or GCHQ, the UK’s Government Communications Headquarters. Snowden has said as much, noting that he and his colleagues in the NSA were at the very least subject to some degree of democratic oversight, while companies like Google and Facebook, as we saw with Zuckerberg’s testimony to Congress this past spring, maintain a business model that perfectly combines capitalism with surveillance, all of it perfectly unregulated.
In 2014, John Bellamy Foster and Robert W. McChesney introduced the term “surveillance capitalism” in Monthly Review, an independent socialist magazine. They trace its inception to the post-war architecture that combined the vehicle of sales, framed within a Madison Avenue-centered corporate marketing revolution, with the creation of a permanent state of war headed by the Pentagon, where the Cold War was buttressed by arms and fictional nuclear preparedness on the one hand and shop-’til-you-drop consumerism on the other. The military-industrial complex and the marketing of society, according to Foster and McChesney, constituted the two principal surplus-absorption mechanisms until the financial crisis of the 1970s, when a third vector of surplus absorption was added: financialization, which supplemented the system as the previous two mechanisms waned:
Each of these means of surplus absorption were to add impetus in different ways to the communications revolution, associated with the development of computers, digital technology, and the Internet. Each necessitated new forms of surveillance and control. The result was a universalization of surveillance, associated with all three areas of: (1) militarism/imperialism/security; (2) corporate-based marketing and the media system; and (3) the world of finance.
It is hard to do such a brilliant article justice, but suffice it to say that Foster and McChesney give an excellent history showing that the hunt for Edward Snowden was nothing new. They chronicle a long history dating back to the “Army Files” (also known as CONUS) scandal, in which the Army had been spying on and keeping files on over seven million U.S. citizens through the use of over 1,500 plainclothes agents. It was because of the CONUS scandal that Americans came to know of ARPANET, the precursor to today’s Internet, where these secret files on Americans were kept and where the “limitless storage of data” proved a threat to a healthy democracy.
Surveillance capitalism is now part of our everyday lives, where even the follow-up quality-control questionnaires and all the privacy boxes we are asked to tick feed a larger private-sector databank of information. The problem is that most people think such information is “harmless” and of little consequence to their safety or privacy. But surveillance capitalism, as Foster and McChesney show us, can go much further than any government surveillance:
Like advertising and national security, it had an insatiable need for data. Its profitable expansion relied heavily on the securitization of household mortgages; a vast extension of credit-card usage; and the growth of health insurance and pension funds, student loans, and other elements of personal finance. Every aspect of household income, spending, and credit was incorporated into massive data banks and evaluated in terms of markets and risk. Between 1982 and 1990 the average debt load of individuals in the United States increased by 30 percent and with it the commercial penetration into personal lives.
So now, with the government having the private sector do its bidding in farming the information of its “client base,” business was making a killing while private individuals went further into debt and lost their freedom of privacy. Alongside individuals being stripped of their democratic right to privacy came the erosion of freedom of speech, recently cemented by the “redrafting” of NAFTA, whereby major corporations like Google, Facebook, and Twitter were positioned to be the main beneficiaries of what is now called the United States-Mexico-Canada Agreement (USMCA):
These big tech companies have been trying to reinvoke the immunity they previously held under Section 230 of the Communications Decency Act through NAFTA (North American Free Trade Agreement) renegotiations. And last month they were successful, as NAFTA’s substitute, the United States-Mexico-Canada Agreement (USMCA), will now extend the immunity Congress had earlier provided with Section 230 of the Communications Decency Act of 1996 (CDA) into neighboring North American countries. Not only is this a gift to the tech industry, but it is a complete paradox. The tech industry lobbied heavily to regain Section 230 immunity by invoking “free expression” for its users while simultaneously taking on the policing of free speech on its platforms. In short, big tech’s request for absolute immunity, in light of its use of Section 230 to justify political bias and censorship, reveals a troubling present for free speech on the net.
Over the past year there has been an unprecedented amount of thought policing on social media, where Facebook and Twitter now enforce rules that penalize users for “fake news” and other thought crimes, and where both platforms closed down hundreds of political media pages just before November’s midterm elections. Censorship is now commonplace on these platforms, just as Google is once again facing a fresh wave of criticism from human rights groups over its plan to launch a censored search engine in China, a project called Dragonfly. In an eerie twist on the democratization of information that was once predicted in the early 1990s with the public launch of the Internet, we are now seeing how information, in the wrong hands, is not only not progressive but proving to be quite dangerous.
The masses of people playing Candy Crush and using Viber on their mobiles are overwhelmingly unaware of their participation in data mining and of how that participation poses a danger to a healthy democracy. We need to stay informed about the encroachment of big business and social media corporations into our private lives and the depths to which the private sector can farm information. In the end, who controls this information and how it is employed is another and far grimmer question that we must ask, even at the risk of uncovering terrifying and inexorable truths.
Monopoly-Finance Capital, the Military-Industrial Complex, and the Digital Age
by John Bellamy Foster and Robert W. McChesney
(Jul 01, 2014)
John Bellamy Foster is editor of Monthly Review and professor of sociology at the University of Oregon. Robert W. McChesney is the Gutgsell Endowed Professor in the Department of Communication at the University of Illinois. They are the coauthors of The Endless Crisis: How Monopoly-Finance Capital Creates Stagnation and Upheaval from the USA to China (Monthly Review Press, 2012).
The United States came out of the Second World War as the hegemonic power in the world economy. The war had lifted the U.S. economy out of the Great Depression by providing the needed effective demand in the form of endless orders for armaments and troops. Real output rose by 65 percent between 1940 and 1944, and industrial production jumped by 90 percent.1 At the immediate end of the war, due to the destruction of the European and Japanese economies, the United States accounted for over 60 percent of world manufacturing output.2 The very palpable fear at the top of society as the war came to a close was that of a reversion to the pre-war situation in which domestic demand would be insufficient to absorb the enormous and growing potential economic surplus generated by the production system, thereby leading to a renewed condition of economic stagnation and depression.
Assistant Secretary of State Dean Acheson declared in November 1944 before the Special Congressional Committee on Postwar Economic Policy and Planning, that if the economy slipped back to where it was before the war “it seems clear that we are in for a very bad time, so far as the economic and social position of the country is concerned. We cannot go through another ten years like the ten years at the end of the twenties and the beginning of the thirties [i.e., the Stock Market Crash and the Great Depression], without the most far-reaching consequences upon our economic and social system.” Acheson made it clear that the difficulty was not that the economy suffered from a lack of productivity, but rather that it was too productive. “When we look at the problem we may say it is a problem of markets. You don’t have a problem of production. The United States has unlimited creative energy. The important thing is markets.”3
Postwar planners in industry and government moved quickly to stabilize the system through the massive promotion of a sales effort in the form of a corporate marketing revolution based in Madison Avenue, and through the creation of a permanent warfare state, dedicated to the imperial control of world markets and to fighting the Cold War, with its headquarters in the Pentagon. The sales effort and the military-industrial complex constituted the two main surplus-absorption mechanisms (beyond capitalist consumption and investment) in the U.S. economy in the first quarter-century after the Second World War. After the crisis of the 1970s, a third added surplus-absorption mechanism, financialization, emerged, propping up the underlying system of accumulation as the stimulus provided by the sales effort and militarism waned. Each of these means of surplus absorption were to add impetus in different ways to the communications revolution, associated with the development of computers, digital technology, and the Internet. Each necessitated new forms of surveillance and control. The result was a universalization of surveillance, associated with all three areas of: (1) militarism/imperialism/security; (2) corporate-based marketing and the media system; and (3) the world of finance.
The Warfare State
Soon after the war a new Pentagon capitalism was formed in Washington. A crucial element in the post-Second World War economy of the United States was the creation of the warfare state, rooted in a military-industrial complex. On April 27, 1946, General Dwight D. Eisenhower, chief of staff of the Army, issued a “Memorandum for Directors and Chiefs of War Department General and Special Staff Divisions and Bureaus and the Commanding Generals of the Major Commands” on the subject of “Scientific and Technological Resources as Military Assets.” Seymour Melman later referred to this memo as the founding document of what President Eisenhower—in his famous January 17, 1961 farewell address to the nation—was to call the “military-industrial complex.” In this memo General Eisenhower emphasized that a close, continuing contractual relationship be set up between the military and civilian scientists, technologists, industry, and the universities. “The future security of the nation,” he wrote, “demands that all those civilian resources which by conversion or redirection constitute our main support in time of emergency be associated closely with the activities of the Army in time of peace.” This required an enormous expansion of the national security system, bringing civilian scientists, industry, and contractors within this expanding and secretive arm of government. “Proper employment of this [civilian] talent requires that the [given] civilian agency shall have the benefit of our estimates of future military problems and shall work closely with Plans and the Research Development authorities. A most effective procedure is the letting of contracts for aid in planning. The use of such a procedure will greatly enhance the validity of our planning as well as ensure sounder strategic equipment programs.” Eisenhower insisted that scientists should be given the greatest possible freedom to conduct research but under conditions increasingly framed by the “fundamental problems” of the military.
A crucial aspect of this plan, Eisenhower explained, was for the military state to be able to absorb large parts of the industrial and technological capacity of the nation in times of national emergency, so that they become “organic parts of our military structure…. The degree of cooperation with science and industry achieved during the recent [Second World] war should by no means be considered the ultimate;” rather, the relationship should expand. “It is our duty,” he wrote, “to support broad research programs in educational institutions, in industry, and in whatever field might be of importance to the Army. Close integration of military and civilian resources will not only directly benefit the Army, but indirectly contribute to the nation’s security.” Eisenhower therefore called for “the utmost integration of civilian and military resources and…securing the most effective unified direction of our research and development activities”—an integration that he said was already “being consolidated in a separate section on the highest War Department level.”4
Eisenhower’s emphasis in 1946 on an organic integration of the military with civilian science, technology, and industry within a larger interactive network was not so much opposed to, as complementary with, the vision of a warfare economy, based on military Keynesianism, emanating from the Truman administration. The Employment Act of 1946 created the Council of Economic Advisers charged with presenting an annual report on the economy and organizing the White House’s economic growth policy. The first chairman of the Council of Economic Advisers was Edwin Nourse, famous for his role in the 1934 publication of the Brookings Institution study, America’s Capacity to Produce, which pointed to the problem of market saturation and excess productive capacity in the U.S. economy. The vice chairman was Leon Keyserling, who was to emerge as the foremost proponent of military Keynesianism in the United States. In 1949 Nourse stepped down and Keyserling replaced him. Meanwhile, the National Security Council was created with the passage of the National Security Act of 1947 (which also created the CIA). Together, the Council of Economic Advisers and the National Security Council were to construct the foundation of the U.S. warfare state. Truman formed the ultra-shadowy National Security Agency (NSA) in 1952 as an arm of the military charged with conducting clandestine electronic monitoring of potential foreign (and domestic) subversive activities.5
In 1950 Paul H. Nitze, director of the Department of State’s Policy Planning Staff under Acheson, was given the leading role in drafting National Security Council Report 68 (NSC-68), which established an overall U.S. geopolitical grand strategy for waging the Cold War and global imperialism. Significantly, NSC-68 saw a great boost to government spending as a crucial element in preventing economic stagnation: “There are grounds for predicting that the United States and other free nations will within a period of a few years at most experience a decline in economic activity of serious proportions unless more positive government programs are developed than are now available.” This provided an added justification, beyond geopolitical concerns, for a massive rearmament based on military Keynesian “guns and butter” principles. The economic analysis of NSC-68 was the result of direct consultations that Nitze had with Keyserling, who was to exert a strong influence on the report.
NSC-68 raised the possibility of a greatly expanded U.S. economy, based on the experience of the Second World War, in which increased military procurement and sustained domestic consumption were seen as fully compatible in the context of a full employment economy, but not obtainable otherwise. Such an economy could provide both guns and butter. “The United States,” the report said, “could achieve a substantial absolute increase in output and could thereby increase the allocation of resources to a build-up of economic and military strength of itself and its allies without suffering a decline in its real standard of living.” Indeed, “in an emergency the United States could devote 50 percent of its gross national product” to military expenditures, foreign assistance, and investment—“or five to six times as much as at present.” The report strongly stressed that the huge rearmament program being advocated did not require any hard choices economically, as it “might not result in a real decrease in the standard of living” but could even produce the opposite:
The economic effects of the program might be to increase the gross national product by more than the amount being absorbed for additional military and foreign assistances purposes. One of the most significant lessons of our World War II experience was that the American economy, when it operates at a level approaching full efficiency [full capacity], can provide enormous resources for purposes other than civilian consumption while simultaneously providing for a high standard of living. After allowing for price changes, personal consumption expenditures rose by about one-fifth between 1939 and 1944, even though the economy had in the meantime increased the amount of resources going into Government by $60–$65 billion (in 1939 prices).6
Keyserling, in his capacity as chairman of the Council of Economic Advisers, was asked to provide an economic assessment of NSC-68, despite his direct input into the report itself. In a memorandum that he wrote on December 8, 1950, he indicated that the planned buildup of expenditure on national security for 1952 envisioned in NSC-68 was well below the capacity of the economy. It would reach only 25 percent of national output in 1952, whereas national security expenditures had risen to 42 percent in 1944. Although likely cutting into domestic consumption, “the general civilian consumption standards which would be possible under the proposed programs could hardly be described as severe,” while overall output and employment in the economy would increase.7
NSC-68 called for a more than tripling of military spending. The rearmament strategy advocated in the report was couched primarily in Cold War terms, as a means of promoting the so-called “Containment” doctrine announced by Truman in March 1947, and only secondarily in terms of the economy.8 But the two objectives were seen as congruent. In April 1950, two months before the United States entered the Korean War, Business Week declared that the calls for increased government spending, particularly on the military, were the result of “a combination of concern over tense Russian relations and a growing fear of a rising level of unemployment here at home.”9 This reflected the general character of the political economy of the Cold War. As Harry Magdoff ironically noted at the end of his Age of Imperialism in 1969: “Just as the fight against Communism helps the search for profits, so the search for profits helps the fight against Communism. What more perfect harmony of interests could be imagined?”10
The NSC-68 plan for rearmament was soon implemented for the U.S. political economy, with the shift to continuing high military expenditures made possible by the Korean War. By the time that war was brought to an end a much larger military system was in place. Although Eisenhower made efforts to cut military spending after the war, it was to remain “more than three times higher than it was before NSC-68 and the Korean conflict.”11 In 1957, at the beginning of Eisenhower’s second term, military spending was 10 percent of U.S. GDP.12 This reflected the rise of a warfare state, which Scott Nearing, writing in Monthly Review in 1964, defined as a state “which uses war and the threat of war as the decisive instruments of its foreign policy. In a warfare state the body politic places at the top of its list of state activities, planning for war, preparing for war, and waging war when opportunity offers.”13
Already by the end of the Korean War the new warfare state was deeply entrenched. As Eisenhower’s first defense secretary, Charles Erwin Wilson (sometimes referred to as “General Motors Wilson,” as a former president of General Motors, and to distinguish him from Charles E. Wilson [see below]), was to tell Congress, the ascendancy of the military, once in place, was virtually irreversible: “One of the most serious things about this defense business is that so many Americans are getting a vested interest in it: properties, business, jobs, employment, votes, opportunities for promotion and advancement, bigger salaries for scientists and all that. It is a troublesome business…. If you try to change suddenly you get into trouble…. If you shut the whole business off now, you will have the state of California in trouble because such a big percentage of the aircraft industry is in California.”14 Indeed, what had already been put into place to a considerable degree was what the president of General Electric and executive vice chairman of the War Production Board, Charles E. Wilson (sometimes referred to as “General Electric Wilson”), had strenuously lobbied for in 1944: the maintenance of a permanent war economy, in which “an industrial capacity for war, and a research capacity for war” were linked to the state and the armed forces.15
In all of this the role of military spending as a means of creating effective demand was obvious to economists and business alike. Harvard economist Sumner Slichter noted at a banking convention in late 1949 that given the level of Cold War expenditures, a return to conditions of severe depression was “difficult to conceive.” Military spending, he explained, “increases the demand for goods, helps sustain a high level of employment, accelerates technological progress and helps the country to raise its standard of living.” U.S. business’s view of the heightened military budget, as reflected in the sentiments expressed in the U.S. corporate media, was ecstatic. Celebrating the development of the hydrogen bomb in 1954, U.S. News and World Report wrote: “What H-bomb means to business. A long period…of big orders. In the years ahead, the effects of the new bomb will keep on increasing. As one appraiser puts it: ‘The H-bomb has blown depression-thinking out the window.’”16
On the left, Paul A. Baran and Paul M. Sweezy’s classic work, Monopoly Capital, published in 1966, saw militarism and imperialism as motivated first and foremost by the needs of the U.S. empire, and secondly by its role (along with the sales effort) as one of the two main absorbers—beyond capitalist consumption and investment—of the rising economic surplus generated by the economy. All other options for government stimulus spending ran into political roadblocks established by powerful corporate interests. Civilian government spending as a percentage of GDP, excluding transfer payments, Baran and Sweezy argued, had reached its “outer limits” by the late 1930s, when civilian government consumption and investment had risen to 14.5 percent in 1938–1939—a proposition that has remained true ever since, with civilian government spending (consumption and investment) standing at 14 percent of GDP in 2013. (That, however, exaggerates the government’s maintenance of a commitment to “social welfare,” as prisons and domestic policing have come to provide an outsized share of “civilian” government spending in the past three decades.) Consequently, military spending was viewed as more variable than civilian government spending, more readily turned to by the system as a means for “pump-priming” the economy.17
Nevertheless, military spending, Baran and Sweezy argued, faced its own contradictions, and was “not a perfectly free variable through manipulation of which the leaders of the oligarchy can maintain the right head of steam in the economic engine.” The main limitations were of course the total destructiveness of war itself, which meant that a Third World War between the major powers had to be avoided. Open warfare was therefore mainly directed at the periphery of the imperialist world economy, with the United States maintaining a “global military machine to police a global empire,” including over a thousand military bases abroad by the mid-1960s, as a means of propelling U.S. forces around the world.
This reality was bound to generate increased resistance, as in the case of Vietnam, both in the periphery and amongst the U.S. population.18 Indeed, the open revolt of the U.S. ground troops in Vietnam by the early 1970s (along with protests at home) all but forced the military to abandon the military draft as impractical for the types of Third World invasions and occupations that had become standard—compelling it to turn, instead, to a professional army.19 The invasions of the past two decades would have faced much greater popular resistance if they had required a draft to field the armed forces.
Inherent in such attempts to police a world empire were two requirements: First, a widespread propaganda campaign to make empire appear benevolent, necessary, essentially democratic, inherently “American,” and therefore unquestionable in legitimate debate. For an empire, the flip side of propaganda is popular ignorance. Vietnam’s “greatest contribution,” according to Defense Secretary Robert McNamara in its immediate aftermath, was teaching the U.S. government that in the future it was essential “to go to war without arousing the public ire.” McNamara said this was “almost a necessity in our history, because this is the kind of war we’ll likely be facing for the next fifty years.”20 Here the U.S. news media do yeoman’s work legitimizing the imperial system and obstructing popular understanding at every turn. Second, there is the stick to go with the propaganda carrot—a heavy reliance on covert intervention in the periphery and domestic surveillance and oppression.
The Sales Effort
The sales effort headquartered in Madison Avenue was to be the main success story of U.S. monopoly capitalism in the 1950s, and a key means of absorbing economic surplus. Outside of capitalist luxury consumption, the sales effort absorbed economic surplus chiefly by means of what Baran and Sweezy called “profits by deduction,” giving higher wages to workers (or to a relatively privileged element of the working class) and then manipulating them to buy largely wasteful conveniences and unnecessary, ultimately unsatisfying, packaged goods of all kinds. The end result was to chain most people to their jobs without improving their real standard of living or position vis-à-vis the means of production.21 Production, as Thorstein Veblen anticipated in the 1920s, became more and more about the manufacturing of “saleable appearances” rather than genuine use values.22 In the postwar years a qualitatively new phase of consumer capitalism emerged based, as Martin Mayer wrote in 1958 in Madison Avenue, on “a tripartite business, composed of clients (the companies which make the branded products and pay to advertise them), agencies (which prepare and place the ads), and media (the newspapers, magazines, broadcasting stations—each an individual medium for advertising—which carry the message to the public).”23 Beyond advertising itself was the much larger realm of corporate marketing, involving such areas as targeting, motivation research, product design, sales promotion, and direct marketing.24
Marketing evolved quickly in its period of greatest advance in the 1950s into a highly organized system of customer surveillance, targeting propaganda, and psychological manipulation of populations. Consumer savings during the Second World War had grown enormously and the “Ad Men” of Madison Avenue became almost synonymous with the new “consumer culture” of the 1950s aimed at the promotion of innumerable, supposedly distinct brands. The result was an encouragement of high levels of consumer spending and a general lifting of the economy, as workers were conditioned to see themselves as consumers in all their non-working hours, reinforcing their dependence on their jobs while feeding the economic juggernaut. In this way the sales effort emerged as the dominant process governing the entire cultural apparatus of monopoly capitalism.25
There is no doubt that the growth of marketing expenditures in the 1950s, with advertising jumping in nominal terms from $3 billion in 1929, to $10 billion in 1957, to $12 billion in 1962, served to expand total effective demand in the economy, creating new employment and markets, and stimulating investment in new product lines, while also encouraging prodigious amounts of commercial waste in superfluous packaging, product obsolescence, the production of useless goods foisted on consumers, etc. The entire marketing system constituted “a relentless war against saving and in favor of consumption.”26 By the late 1950s, U.S. annual advertising spending was about 20–25 percent of military spending. And since advertising has always been a small part of overall marketing expenditures—the total size of which is, however, notoriously difficult to measure since it permeates all aspects of the system—the surplus-absorbing effect of the entire sales effort during the so-called “golden age” of the 1950s and ‘60s was likely roughly comparable to that of military spending as a means of surplus absorption, particularly in those years when an actual war was not taking place.27
The tremendous growth of marketing in these years was inseparable from the consolidation of monopoly capitalist accumulation. Price competition no longer occupied the central place in the competitive structure of the economy, as oligopolies operating in tandem through a process of indirect collusion ensured that the general price level went only one way—up. Instead, the oligopolistic rivalry that increasingly prevailed in the economy took the form of what came to be known as “monopolistic competition,” in which the competitive struggle was mainly over market share for particular brands, and thus centered on the sales effort. As welfare economist Tibor Scitovsky observed: “The secular rise in advertising expenditures is a sign of a secular rise of profit margins and decline of price competition.” In Baran and Sweezy’s analysis “price competition” had “largely receded as a means of attracting the public’s custom,” yielding “to new [wasteful] ways of sales promotion: advertising, variation of the products’ appearance and packaging, ‘planned obsolescence,’ model changes, credit schemes, and the like.”28
The corporation that spent the most on advertising in the United States in the 1950s was General Motors, then the largest corporation in the world, which had pioneered in product differentiation based on cosmetic model changes (such as chrome or tailfins). It built into its cars both (physical) product obsolescence and psychological obsolescence, and was the price leader in the industry—with the other giant automakers readily falling in line and sharing in the loot.
The largest marketer of packaged goods in the United States, and (next to General Motors) the largest purchaser of advertising, was Procter & Gamble. The company manufactured soaps, cleaners, and detergents such as Ivory, Tide, Cheer, Camay, Oxydol, Cascade, Comet, Joy, and Lava; Crest and Gleem toothpastes; Crisco shortening; Jif peanut butter; and many other branded products. Procter & Gamble is credited with having invented modern brand management beginning with Neil McElroy’s famous May 13, 1931 internal corporate memorandum. Dismayed by having the job of promoting Camay soap as a subsidiary product in an environment dominated by Procter & Gamble’s own Ivory soap, McElroy proposed that Procter and Gamble’s various brands be managed by separate teams and marketed as completely distinct businesses, within a context of product differentiation in which the brands were targeted at different consumer markets. Later, as president of Procter & Gamble, McElroy embraced the soap opera, developing programing that was designed to be conducive to commercialism first and foremost, based on constant repetition both of story lines and product pitches. Procter & Gamble also emerged as a pioneer in conducting market research aimed at its potential customers. In addition, McElroy established large-scale “blue sky” scientific research laboratories at Procter & Gamble where the researchers were relatively free to explore new ideas with respect to consumer products.29
Procter & Gamble’s considerable success in the 1950s in integrating advertising and programming in private broadcasting could be seen as symbolizing the triumph of commercialism in the U.S. media system in the post-Second World War era. “As early as the general advent of radio in the 1920s,” Herb Schiller was to write in Mass Communications and Empire, “and deepening with the introduction of television in the late 1940s and early 1950s, the electronic apparatus has been largely at the disposal of the business system and the ‘national advertiser’ in particular…. The comprehensive employment of sophisticated communication facilities and ancillary services such as surveys, to the instruction and persuasion of consumers, is the foremost identifying feature of developed capitalism…. Scarcely a cultural space remains…that is outside the commercial web.”30 The government readily handed over the airwaves for free to corporations, while maintaining only the most minimal regulatory structure aimed primarily at protecting rather than restraining commercial privileges.31
The Military Industrial Complex and ARPANET
After nine years heading Procter & Gamble, McElroy agreed to become Eisenhower’s new Secretary of Defense. On October 4, 1957, the defense secretary nominee was in Huntsville, Alabama, touring the Redstone Arsenal, the Army’s rocket program, and conversing with German émigré Wernher von Braun, considered the founder of modern rocketry, when news of the Soviet launching of Sputnik arrived. Five days later McElroy was sworn in as secretary of defense with all of Washington discussing the question of Soviet technological dominance. The launch of Sputnik II a month later only increased the pressure on the Eisenhower administration. After conferring with Ernest O. Lawrence, a major figure in the Manhattan Project, McElroy proposed the launching of a centralized agency for advanced scientific research projects, drawing on a broad network of scientific talent in universities and corporate manufacturing firms across the country. On November 20, 1957, he went to Capitol Hill for the first time and presented his idea of a “single manager” for all defense research, which would initially focus on ballistic missile, satellite, and space research and development programs, but which would have clear contracting authority and an unlimited, unconstrained research agenda. On January 7, 1958, Eisenhower asked Congress to provide startup funds for the new Advanced Research Projects Agency (ARPA). McElroy chose Roy Johnson, a vice president of General Electric, as the first ARPA director.
Right away ARPA set the goal of the militarization of space, including global surveillance satellites, communications satellites, and strategic orbital weapons systems, plus a moon mission. However, following the creation of the National Aeronautics and Space Administration (NASA) in the late summer of 1958, the civilian space programs were gradually stripped away from ARPA; and by 1959 most of its military space programs, along with the larger part of its funds, were also gone. Johnson resigned. However, rather than abolishing ARPA, McElroy, before leaving the Defense Department and returning as CEO of Procter & Gamble in 1959, revised ARPA’s charter to make it more clearly a blue sky technology operation of the Department of Defense, superseding all of the armed forces. ARPA (renamed the Defense Advanced Research Projects Agency or DARPA in 1972) worked on developing anti-ballistic missile systems, and on Transit, the predecessor to the Global Positioning System (GPS). Its most remarkable work in its early years, though, was associated with the development of packet-switching digital communications technology, incorporating the insights of engineer Paul Baran at the Rand Corporation, which led to the original Internet and the packet satellite network. In the 1980s DARPA concentrated on the promotion of Ronald Reagan’s Star Wars initiative in what has been called the Second Cold War. In the 1990s and early 2000s it was to develop technologies of digital surveillance in close alliance with the NSA, along with military drone technology.32
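The packet-switching concept credited to Baran can be illustrated with a toy sketch: a message is broken into numbered packets that may travel independently and arrive out of order, then be reassembled at the destination. This is an illustration only; the function names are invented, and the actual protocols descended from this work (e.g., TCP/IP) are far more elaborate.

```python
# Toy model of packet switching: split, scramble, reassemble.
# Hypothetical names; not a real network protocol.

def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Split a message into numbered packets of at most `size` characters."""
    count = (len(message) + size - 1) // size  # ceiling division
    return [(i, message[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Reorder packets by sequence number and rejoin the payload."""
    return "".join(chunk for _, chunk in sorted(packets))

if __name__ == "__main__":
    packets = packetize("Packets may travel by independent routes.")
    packets.reverse()  # simulate out-of-order arrival
    print(reassemble(packets))
```

Because each packet carries its own sequence number, no single fixed circuit is needed between sender and receiver, which was the resilience argument behind the original design.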
It was with the appointment in 1961 of ARPA’s third director, Jack P. Ruina, a scientist who was formerly a deputy assistant director of the Air Force, that the organization became a major force in computer research. Ruina purchased a massive Q-32 computer from the Air Force to allow ARPA to research military command and control issues. Ruina brought in J.C.R. Licklider of MIT, a behavioral scientist and computer programmer, to run ARPA’s command and control and behavioral science divisions. Licklider created contractual relations with the best computer scientists at universities across the country, and introduced an internal culture that focused on the idea of networking based on interconnected computers. Over the course of the 1960s ARPA became the center of work on computer networking, resulting by the early 1970s in the creation of ARPANET, the precursor of today’s Internet.
The product of the Eisenhower administration, ARPA existed alongside hundreds of other defense agencies formed in the Truman and Eisenhower years, yet it alone was conceived as the scientific-technological apex of the rapidly developing military-industrial complex. Under Eisenhower, at McElroy’s instigation, the United States invaded Soviet air space with its U-2 spy plane, shot down by the Soviets in May 1960, and became engaged in counterinsurgency operations in Indochina and elsewhere.33 The military policy of his administration remained expansive. Yet, Eisenhower’s farewell address to the nation on January 17, 1961, showed his own second thoughts, uncertainty, ambivalence, and even fear at what had been created. Eisenhower pointed to the fact that the United States had developed “a permanent armaments industry of vast proportions…. We annually spend on military security more than the net income of all United States corporations.” He went on to urge that the government “guard against the acquisition of unwarranted influence…by the military industrial complex,” and to warn that society could become “captive of a scientific technological elite” under circumstances where “the power of money is ever present.”
Eisenhower’s warnings were deliberately vague. He did not define the “military-industrial complex,” using the term only once in his speech. Yet, his comments were directed at the reality of the military-technological-corporate complex that he had himself played the leading role in instituting beginning in 1946, and that had been massively extended in his years in the White House. By 1962, 56.2 percent of the sales of the electronics industry in the United States were going to the military and the closely allied civilian space industry.34
The Vietnam War Era and Domestic Surveillance
The peak years of economic growth and near-full employment in the 1950s and ‘60s coincided with the years of the Korean and Vietnam Wars. Although these wars were fought under slogans of the “Containment of Communism” and the “Defense of the Free World,” the real purpose in the case of both conflicts was to maintain the security of the world capitalist economy and U.S. hegemony in the face of forces seeking to break free. Yet if the geopolitics of empire and the Cold War were first and foremost in motivating these wars, the fact that they also required huge bursts of military spending that lifted the whole economy was not, as we have seen, lost on the dominant political-economic forces, and indeed entered directly into the calculations of the power elite.
Such a system of military-imperial dominance and capital accumulation naturally creates not only its own external enemies but its “internal enemies” as well—which in the eyes of the power structure consists of all those opposed to capitalism and the warfare state, along with all those forces in society that are seen as potentially disruptive. A warfare state thus naturally evolves into a surveillance state.
The growth in the late 1950s and ‘60s of social protest, first over civil rights, and later the anti-Vietnam War movement and other causes, led to a massive increase in the military and quasi-military (or secret police) surveillance of the U.S. population. The years 1970–1971 saw the emergence of the “Army Files” (or CONUS) scandal, when it was revealed that the Army had been spying on and keeping dossiers on over seven million U.S. citizens. These dossiers were originally housed in its Investigative Records Library—with most of the files kept in a steel room, two stories high and half-a-block long—at Fort Holabird, Maryland. Along with these dossiers were satellite files, including a “vast subversives file” on civil rights and anti-war protestors and separate file cabinets devoted to incidents involving “civil disturbances” more generally, or dissent within the Army. In 1967 the military had completed construction of a secret national teletype service to allow rapid communication of intelligence gathered on the population. The Counterintelligence Analysis Branch was in charge of the construction of a huge Compendium, combining information from the surveillance files with the object of computerizing the data. Surveillance was carried out on participants in the Poor People’s March on Washington in 1968, visitors to Martin Luther King, Jr.’s grave, black nationalists, socialist organizations, and those engaged in anti-war demonstrations of more than twenty people across the entire country. The Army had 1,500 plainclothes agents, working out of three hundred offices.35
In the continuing Congressional investigations into the Army intelligence files, and its subversives file in particular—which the Army said had been destroyed—it was later discovered that the data had been transmitted to the NSA,
via the ARPANET, a computer network connecting more than 50 government agencies and universities throughout the country. The network is funded by the Department of Defense Advanced Research Projects Agency (ARPA)….The information, according to intelligence sources, was transferred and stored at the headquarters of the National Security Agency (NSA), at Fort Meade, Maryland. The Army files were transmitted on the ARPANET in about January 1972, sources say, more than two years after the material—and the data banks maintained at the [Army’s] Fort Holabird facility—were ordered destroyed.36
For many Americans this was the first indication that such a thing as ARPANET existed. Already in the 1970s the NSA was thus implicated in using the early proto-Internet system as part of its surveillance of the U.S. public. Stung by such revelations, Senator Sam Ervin, best known for his role as chairman of the Senate Watergate Committee, but long involved in the Army Files investigation, delivered a speech at MIT in April 1975 declaring that the danger to privacy had accelerated due to the presence of computers which allowed “limitless storage of data, and retrieval at lightning-like speed.”37 The Senate investigations into the Army surveillance of the population and its databases caused University of Michigan law professor Arthur R. Miller to declare, as early as 1971, before the Senate Subcommittee on Constitutional Rights, chaired by Ervin:
Whether he knows it or not, each time a citizen files a tax return, applies for life insurance or a credit card, seeks government benefits, or interviews for a job, a dossier is opened under his name and an informational profile is sketched. It has now reached the point at which whenever we travel on a commercial airline, reserve a room at one of the national hotel chains, or rent a car we are likely to leave distinctive electronic tracks in the memory of a computer—tracks that can tell a great deal about our activities, habits, and associations when collated and analyzed. Few people seem to appreciate the fact that modern technology is capable of monitoring, centralizing, and evaluating these electronic entries—no matter how numerous they may be—thereby making credible the fear that many Americans have of a womb-to-tomb dossier on each of us.
Even though the threat to our informational privacy is growing constantly, most Americans remain unaware of the extent to which federal agencies and private companies are using computers and microfilm technology to collect, store, and exchange information about the activities of private citizens. Rarely does a day go by without the existence of some new data bank being disclosed…. Consider the information practices of the United States Army. Early this year it was revealed that for some time Army intelligence systematically was keeping watch on the lawful political activity of a number of groups and preparing “incident” reports and dossiers on individuals engaging in a wide range of legal protests.38
The 1970s also revealed the FBI’s massive surveillance and movement-disruption program, COINTELPRO (an acronym for Counterintelligence Program). Between 1956 and 1975 the FBI, under J. Edgar Hoover, engaged in a wide array of surveillance and illegal activities (break-ins, forgeries, agent-provocateur actions, wrongful imprisonment, and violence) modeled after earlier actions taken against the Communist Party—directed at dissident groups, including socialist organizations, civil rights leaders, journalists, and New Left war critics. These actions were seen as “justified” by the FBI in cases where groups, such as the Socialist Workers Party, ran candidates for public office who supported causes like “Castro’s Cuba and integration…in the South.” New Left groups were targeted on the basis that they commonly “urge revolution” and “call for the defeat of the United States in Vietnam.”39
Under the codename Project MINARET, during the Johnson and Nixon years the NSA tapped the electronic communications of leading U.S. critics of the war, including over 1,600 U.S. citizens who were put on the NSA watch list. Among the individuals targeted were such figures as Martin Luther King, Jr., Whitney Young, Eldridge Cleaver, Stokely Carmichael, Jane Fonda, Tom Hayden, and Muhammad Ali. Beyond these, the NSA watch list also included such prominent establishment figures as U.S. Senators Frank Church and Howard Baker, New York Times columnist Tom Wicker, and Washington Post columnist Art Buchwald. The revelations on the NSA’s Project MINARET together with COINTELPRO led to the passage of the Foreign Intelligence Surveillance Act of 1978, which limited the powers of the federal government to conduct surveillance of U.S. citizens.40
In the early 1970s the NSA launched Project ECHELON, conducted jointly with Britain, Canada, Australia, and New Zealand (collectively known as the Five Eyes), aimed at the interception of civilian telecommunications conveyed by means of communication satellites. As William Blum wrote in Rogue State in 2005, “the ECHELON system works by indiscriminately intercepting huge quantities of communications and using computers to identify and extract messages of interest from the unwanted ones. Every intercepted message—all the embassy cables, the business deals, the sex talk, the birthday greetings—is searched for key words, which could be anything the searchers think might be of interest.” The NSA’s listening base in England encompassed 560 acres. Aside from collecting national security information, the NSA has been involved in commercial espionage on behalf of corporations, including stealing technology. In 1994 the NSA and the CIA turned over data that caused the European consortium Airbus Industrie to lose lucrative international contracts to its U.S. competitors.41
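The keyword-scanning approach Blum describes can be sketched in miniature: bulk traffic is scanned and only keyword hits are retained for analysts. The watch list and messages below are invented for the example; nothing here reflects actual ECHELON code.

```python
# Illustrative sketch of dictionary-based keyword filtering over
# intercepted traffic. Watch list and messages are hypothetical.

WATCH_LIST = {"embassy", "missile", "launch"}

def flag_messages(messages: list[str]) -> list[str]:
    """Return only the messages containing at least one watch-list word."""
    hits = []
    for msg in messages:
        words = set(msg.lower().split())
        if words & WATCH_LIST:  # any overlap with the watch list
            hits.append(msg)
    return hits

if __name__ == "__main__":
    traffic = [
        "happy birthday from all of us",
        "the embassy cable arrives tomorrow",
        "confirm missile launch window",
    ]
    print(flag_messages(traffic))
```

The point of such a design is exactly the one Blum emphasizes: interception is indiscriminate, and selectivity is applied only afterward, in software.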
Financialization, Data Mining, and Cyberwar
Following the winding down and end of the Vietnam War, the U.S. economy entered an economic crisis, which developed into a long period of deepening stagnation, characterized by declining real economic growth rates and rising unemployment and underemployment.42 If military spending and an expanded Madison Avenue-based sales effort were the main added factors allowing for the absorption of economic surplus in the 1950s and ‘60s, their stimulative effect lessened in the 1980s and after, despite sharp increases in consumer credit (including credit cards) to boost the sales effort, and despite the Second Cold War unleashed by Reagan, inflating military spending. Reagan promoted a de facto military Keynesianism, lowering taxes primarily on corporations and the rich while giving a big boost to military spending. This included his expensive Star Wars program of anti-missile defense in which DARPA was to play a leading part. Attacks on labor unions, wages, and civilian government spending on behalf of workers and the poor became more severe, ushering in the age of neoliberalism.
A light was briefly shone on the scale and illegality of Reagan-era warfare state and secret government activities with the exposure of the Iran-Contra Affair in Washington. It led to the conviction on August 7, 1990, of Reagan’s National Security Advisor, Admiral John Poindexter, on five counts of lying to Congress and obstructing the investigations of Congressional Committees into Iran-Contra, involving the illegal selling of arms to Iran as a means of secretly funding the Contras waging war on the Nicaraguan government. (The convictions were later overturned on the grounds that witnesses against him may have been influenced by his testimony to Congress, for which he had been given immunity.)
At the same time, Poindexter was also caught in another scandal through his authorship of National Security Decision Directive (NSDD)-145 (signed by Reagan). NSDD-145 would have centralized control over all computer databases in the United States, allowing the military to examine private computer databases for “sensitive but unclassified information”—making the NSA a computer czar. Faced with an outcry from private industry, and in the midst of the fallout over Iran-Contra—both of which focused on Poindexter—NSDD-145 was withdrawn. After a period working for Syntek, a private firm contracting with DARPA, Poindexter reemerged in 2002 as the head of the Information Awareness Office in DARPA, designed to implement the technological basis for the Total Information Awareness (TIA) Program, to be carried out by the NSA, and directed at aggregating and analyzing all digitalized communications of the U.S. population. The Defense Department itself described it as creating a “virtual centralized grand database” on all electronic transmissions. One of the big contractors for the TIA program was Booz Allen Hamilton, a giant defense contractor. The head of the intelligence business at Booz Allen, Mike McConnell (former NSA director in the George H.W. Bush administration and later director of national intelligence under George W. Bush), was a close associate of Poindexter. Congress intervened to defund the program (then renamed Terrorism Information Awareness) in 2003, with the intention of closing it down completely—after a scandal arose from its development of an online futures trading market speculating on terrorist attacks, drawing attention to Poindexter and TIA.43
However, it was neoliberal financialization, even more than the warfare state, that characterized the Reagan era. With economic surplus no longer finding sufficient profitable outlets in what economists called the “real economy,” more and more money capital flowed into speculation in the financial sector. Meanwhile, decades of imperial expansion, particularly in the Vietnam War period, had created a huge overhang of dollars abroad in the form of what came to be called the “Eurodollar market,” generating a growing demand from abroad for outlets for this surplus money capital within the U.S. economy. Financial institutions responded to this increased demand for speculative products by creating an endless array of new speculative instruments in the form of various kinds of futures, options, and derivatives. The U.S. and the world economy saw a skyrocketing growth of speculative activity, visible in the growth of debt leverage—with financial corporate debt rising from around 10 percent of U.S. GDP in 1970 to over 40 percent in 1990, and continuing to soar thereafter.44 Not only did this help absorb surplus through the growing expenditures on fixed investment (chiefly business structures and computers) and employment (a growing army of financial analysts) in the real economy, but the speculative increase in the value of financial assets increased the wealth of the capitalist class independently of production, with a certain percentage of this increased financial wealth being spent on luxury goods, thereby effectively absorbing surplus and stimulating the economy.
As early as May 1983, in an article entitled “Production and Finance” in Monthly Review, Harry Magdoff and Paul M. Sweezy described the massive long-term shift to an economy in which a huge “financial superstructure” dominated over the underlying production system. The result was the advent of an economy seemingly permanently prone to financial bubbles. Such an economy was unstable and parasitic in the extreme, with constant fears of financial meltdown, and hence a growing role of central bankers as lenders of last resort, intervening periodically to prop up an increasingly fragile financial system. Sweezy was later to refer to this as “the financialization of the capital accumulation process.”45
Alan Greenspan, appointed chair of the Federal Reserve Board by Reagan in 1987, presided over two decades of rapid financial expansion, made possible by frequent interventions of the Federal Reserve Board to provide greater liquidity as the lender of last resort, and by an increasingly deregulated market environment in which to operate. All of this increased Wall Street’s power in Washington, to the point where it has come to dominate governance at the upper levels, in a manner even greater than that enjoyed by manufacturers in the immediate postwar years.46 This then accelerated policies promoting financialization.
Financialization was spectacularly enhanced by high-speed computer networks, which became critical mechanisms for the newly created speculative markets, and no small amount of financial chicanery.47 But financialization’s encouragement of surveillance capitalism went far deeper. Like advertising and national security, it had an insatiable need for data. Its profitable expansion relied heavily on the securitization of household mortgages; a vast extension of credit-card usage; and the growth of health insurance and pension funds, student loans, and other elements of personal finance. Every aspect of household income, spending, and credit was incorporated into massive data banks and evaluated in terms of markets and risk. Between 1982 and 1990 the average debt load of individuals in the United States increased by 30 percent, and with it the commercial penetration into personal lives. As Christian Parenti wrote in his 2003 book, The Soft Cage, “the records produced by credit cards, bankcards, discount cards, Internet accounts, online shopping, travel receipts and health insurance all map our lives by creating digital files in corporate databases.”48 By 2000, as Michael Dawson reported in The Consumer Trap, nearly all major corporations in the United States were building huge databases, and were linked to data mining enterprises. “Symmetrical Research was advertising services such as its Advanced Analytic Solutions, which promised corporate clients ‘the power of one of the world’s most advanced marketing data analytics teams, with proprietary tools enabling the statistical analysis of…[data of the size of] the 35 terabyte Mastercard data set.’ A terabyte…is one trillion units of computerized information.”49
The largest data broker in the United States today, the marketing giant Acxiom, has 23,000 computer servers processing in excess of 50 trillion data transactions annually. It keeps on average some 1,500 data points on more than 200 million Americans, in the form of “digital dossiers” on each individual, attaching to each person a thirteen-digit code that allows them to be followed wherever they go, combining online and offline data on individuals. Much of the data is now gleaned from social media, such as Facebook. Acxiom organizes this information into “premium proprietary behavioral insights.” Each person is also placed in one of seventy lifestyle clusters, focusing particularly on class, spending habits, and geographical location. Acxiom sells this data (giving varying access to its data banks) to its customers, which include twelve of the top fifteen credit-card issuing companies; seven of the top ten retail banks; five of the top ten insurance companies; six of the top ten brokerage firms; eight of the top ten media/telecommunication companies; seven of the top ten retailers; eleven of the top fourteen global automakers; and three of the top ten pharmaceutical firms. Its clients include about half of the largest one-hundred corporations in the United States.
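How a broker might reduce such dossiers to a handful of lifestyle clusters can be suggested with a deliberately simplified sketch. The fields, thresholds, and cluster names below are all invented for illustration; Acxiom’s actual models, reportedly built on some 1,500 data points per person, are proprietary.

```python
# Hypothetical sketch of assigning a consumer profile to a coarse
# "lifestyle cluster" from a few data points. All fields and cluster
# names are invented; real data-broker segmentation is far richer.

def assign_cluster(profile: dict) -> str:
    """Map a consumer dossier to a crude lifestyle segment."""
    income = profile.get("income", 0)
    urban = profile.get("urban", False)
    if income > 150_000:
        return "affluent-urban" if urban else "affluent-suburban"
    if income > 60_000:
        return "middle-market"
    return "value-shopper"

if __name__ == "__main__":
    dossier = {"income": 72_000, "urban": True, "owns_car": True}
    print(assign_cluster(dossier))  # "middle-market"
```

Once every individual carries such a label, the label itself (rather than the raw data) is what gets sold to credit-card issuers, retailers, and insurers as a targeting product.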
Since September 2001 Acxiom has worked closely at sharing data with the FBI, the Pentagon, and Homeland Security. In 2001, Acxiom appointed General Wesley Clark, the former NATO Supreme Allied Commander in Europe in the Kosovo War and a future U.S. presidential candidate, to its board of directors. The company paid Clark over $800,000 as a lobbyist, primarily in relation to the Department of Defense and Homeland Security. Through Clark, Acxiom began working with Poindexter’s DARPA-based TIA, helping set up the technological systems for total surveillance of the U.S. and global population.50
CBS’s 60 Minutes reported in March 2014 that clicking on the New York Times website can mean that more than a dozen third parties are “on the page that are essentially tracking your movements.” Most of the 50 million people who downloaded the “Brightest Flashlight Free” app onto their smartphones did not recognize that “the companies that gave them to you for free were using the apps to track your every movement and pass it along to other companies.” The iPhone app “Path Social,” which was ostensibly designed to help people share photos and memories with their friends, tapped into users’ digital address books and contact lists, taking all of that information. The data broker firm Epsilon has a marketing database containing more than 8 billion consumer transactions. The data broker firm Choicepoint, now part of the data giant Reed Elsevier, maintains 17 billion records on businesses and individuals, which it has sold to around 100,000 clients, including numerous government agencies.51
Financial institutions themselves sell such data. Forbes magazine wrote in 2013 that “in most aspects of our lives, companies and marketers can freely collect details about us and sell to whomever they like without restriction.” However, financial institutions, it pointed out, were legally prohibited in most cases from directly selling such information. Nevertheless, Forbes explained that many financial institutions do market their data in various ways, and some 27 percent violate all aspects of the legal regulations.52
Financialization—or the long-term growth of speculation on financial assets relative to GDP—meant the intrusion of finance into all aspects of life, requiring new extensions of surveillance and information control as forms of financial risk management. As the economy became more financialized, it became increasingly vulnerable to financial meltdowns, increasing risk perceptions on the part of investors and the perceived need for risk management, encryption of data, and security.
Today fears of cyberwar aimed at financial institutions, the entire financial system, and the military system are at the top of national security concerns. McConnell, who had left his job at Booz Allen to become director of national intelligence in 2007 under George W. Bush, informed the president that “If the 9/11 perpetrators had focused on a single U.S. bank through cyberattack, and it had been successful, it would have had an order of magnitude greater impact on the U.S. economy than the physical attack.” Secretary of the Treasury Henry Paulson, former CEO of Goldman Sachs, agreed. Bush was so alarmed that within a short time the Comprehensive National Cybersecurity Initiative (2008) was in place, which greatly expanded the NSA’s authority to carry out surveillance on the Internet domestically, leading to the construction of its $1.5 billion data center in Utah.53 Leon Panetta, U.S. defense secretary under Obama, warned that a cyberattack on the U.S. financial system might be the “next Pearl Harbor.” In July 2011 Barack Obama signed an executive order declaring that the infiltration of financial markets by transnational criminal organizations constituted a national emergency. Symantec, a cybersecurity firm, estimated in 2010 that three-quarters of “phishing” attacks designed to get people to give up financial data were not aimed at individuals but were directed at the financial sector.54
In addition to hackers breaking into databases, large-scale attacks on entire security systems are feared. The sudden drop in the stock market on May 6, 2010, attributed to high-speed algorithmic trading, was thought to prefigure a new possible form of cyberwar aimed at dragging reeling markets down further using short-selling, options, and swaps—a kind of “force multiplier” in military-speak. Hackers using malicious code to crash or jam whole networks can mobilize botnets, or robotic networks of hundreds of thousands of machines. According to Mortimer Zuckerman, chairman and editor-in-chief of U.S. News & World Report, writing in the Wall Street Journal, digitalized systems are extraordinarily vulnerable to attack: “the average [offensive] malware has about 175 lines of code, which can attack defense software using between 5 million and 10 million lines of code.” The U.S./Israeli-developed “Stuxnet” worm aimed at Iran, which reportedly infiltrated the computers controlling Iranian nuclear centrifuge facilities, is seen as an indication of the scale and precision with which cyberattacks can now demobilize whole systems.55
The Internet and Monopoly Capital
ARPANET was connected only to those universities and their computer science departments that had Department of Defense funding and security clearances. With the success of the system, computer science departments at universities and private industry were all eager to be connected to the network. This resulted in the creation by the National Science Foundation of the Computer Science Research Network (CSNET), which consisted of ARPANET, a Telenet system, and PhoneNet for email. Soon other private networks were created. In 1985 the National Science Foundation funded five supercomputer centers across the country to form the backbone of a larger NSFNET, which brought universities generally, along with private corporations, into what had merged into a much wider Internet with a common protocol, producing a massive growth of users who could access it through personal computers, via Internet Service Providers.
ARPANET ceased operations in 1989. In the early 1990s the World Wide Web was developed, leading to an astronomical increase in users, and the rapid commercialization of the Internet. Three key developments followed: (1) In 1995 NSFNET was privatized, and NSFNET itself decommissioned, with the backbone of the system being controlled by private Internet Service Providers;56 (2) the Telecommunications Act of 1996 introduced a massive deregulation of telecommunications and media, setting the stage for further concentration and centralization of capital in these industries;57 (3) the Financial Services Modernization Act of 1999, promoted by Federal Reserve Chairman Alan Greenspan, Treasury Secretary Robert Rubin, and Deputy Treasury Secretary Lawrence Summers under the Clinton administration, deregulated the financial sector in an attempt to feed the financial bubble that was developing.58 These three elements coalesced into one of the biggest merger waves in history, known as the dot-com or New Economy bubble. The ongoing concentration of capital was thus given a huge boost in the technology and finance sectors, leading to ever greater levels of monopoly power.
The dot-com bubble burst in 2000. But by that time a virtual Internet cartel had emerged, despite all the rhetoric of “friction-free capitalism” by Bill Gates and others.59 By the end of the decade the Internet had come to play a central role in capital accumulation, and the firms that ruled the Internet were almost all “monopolies,” in the sense in which economists use the term. This did not mean that these firms sold 100 percent of an industry’s output, but rather that they sold enough of it to control the price of the product and how much competition they would face. (Even John D. Rockefeller’s Standard Oil monopoly at its peak controlled just over 80 percent of the market.) By 2014, three of the four largest U.S. corporations in market valuation—Apple, Microsoft, and Google—were Internet monopolies. Twelve of the thirty most valuable U.S. corporations were media giants and/or Internet monopolies, including Verizon, Amazon, Disney, Comcast, Intel, Facebook, Qualcomm, and Oracle. These firms used network effects, technical standards, patent law, and good old-fashioned barriers to entry to lock in their market power, and they used their gushers of monopoly profits to broaden their digital empires. With this economic power comes immense political power, such that these firms face no threat from regulators in Washington. To the contrary, the U.S. government is little short of a private army for the Internet giants as they pursue their global ambitions.60
The major means of wealth generation on the Internet, and through proprietary platforms such as apps, is the surveillance of the population, allowing a handful of firms to reap the lion’s share of the gains from the enormous sales effort in the U.S. economy. The digitalization of surveillance has radically changed the nature of advertising. The old system, in which advertisers purchased ad space or time in media in the hope that the user would notice the advertisement while seeking out news or entertainment, is becoming passé. Advertisers no longer need to subsidize journalism or media content production to reach their target audiences. Instead, they can pinpoint their desired audience down to the individual and locate them wherever they are online (and often wherever they are in physical space), thanks to ubiquitous surveillance. The premise of the system is that there is no effective privacy. The consequence is that the commercial system of media content production, especially journalism, is in collapse, with nothing in the wings to replace it.
These monopolistic corporate entities readily cooperate with the repressive arm of the state in the form of its military, intelligence, and police functions. The result is to enhance enormously the secret national security state, relative to the government as a whole. Edward Snowden’s revelations of the NSA’s Prism program, together with other leaks, have shown a pattern of a tight interweaving of the military with giant computer-Internet corporations, creating what has been called a “military-digital complex.”61 Indeed, Beatrice Edwards, the executive director of the Government Accountability Project, argues that what has emerged is a “government-corporate surveillance complex.”62
This extends beyond the vast private contractor network to “secret collaboration” with the main Internet and telecom companies.63 Notable examples of partly cooperative, partly legally coerced sharing of data include:
A 2009 report by the NSA’s inspector general leaked by Snowden stated that the NSA has built collaborative relationships with over “100 companies.”64
Microsoft provided the NSA with pre-encryption “back door” access to its popular Outlook.com email portal, to its Skype Internet phone calls and chat (with its 663 million global users), and to SkyDrive, Microsoft’s cloud storage system (which has 250 million users). The Snowden files show that Microsoft actively collaborated with the NSA. Glenn Greenwald writes: “Microsoft spent ‘many months’ working to provide the government easy access to that [the SkyDrive] data.” The same was the case for Skype, while in the case of Outlook.com it took only a few months for Microsoft and the NSA, working together, to ensure the agency’s complete access.65
The NSA paid $10 million to the computer security company RSA to promote a back door in its encryption products. The NSA devised a flawed formula for generating the random numbers used in encryption, which RSA inserted into its software tool Bsafe, designed to enhance security in personal computers and other digital products.66
AT&T voluntarily sold metadata on phone calls to the CIA for over $10 million a year in connection with the latter’s counterterrorism investigations.67
Verizon (and likely AT&T and Sprint as well) provided the NSA with metadata on all calls in its (their) systems, both within the United States and between the United States and other countries. Such metadata has been supplied to the NSA under both the Bush and Obama administrations.68
Microsoft, Google, Yahoo, and Facebook turned over the data from tens of thousands of their accounts on individuals every six months to the NSA and other intelligence agencies, with a rapid rise in the number of accounts turned over to the secret government.69
In 2012 DARPA Director Regina Dugan left her position to join Google. During her period as director, DARPA had been at the forefront of drone research, presenting the first prototype demonstrations in the early 1990s. However, the outgrowth of this in the deployment of General Atomics Aeronautical Systems’ Predator drones in warfare did not occur until the late 1990s in the Kosovo War, with Clark as the Supreme Allied Commander. The first use of such drones for global, extra-territorial assassination, outside a field of war—now a staple of Obama’s “anti-terrorism” strategy—took place in 2002.70 In the opening years of this century DARPA extended its research to developing drones that could provide mobile wi-fi capabilities. Dugan’s move to Google in the private sector—at a time when she was under governmental investigation for giving hefty DARPA contracts to RedXDefense, a bomb-detection corporation that she had co-founded and partly owned—was connected to Google’s interest in developing high-altitude drones with wi-fi delivery capabilities. In 2014 Google announced that it was buying Titan Aerospace, a U.S.-based start-up that builds drones cruising at the very edge of the atmosphere. Facebook meanwhile bought the UK corporation Ascenta, which specializes in making high-altitude solar drones. Such drones would allow the spread of the Internet to new areas. The goal was to capitalize on a new military technology and create larger global Internet monopolies, while expanding the military-digital complex.71
By 2005–2007 broad estimates suggested that U.S. marketing expenditures (defined fairly narrowly) were running at about $1 trillion a year; real (both acknowledged and unacknowledged) military expenditures at about $1 trillion annually; and FIRE (finance, insurance, and real estate) expenditures at approximately $2.5 trillion.72 In the digital age, these three sectors of the political economy, each of which arose parasitically on the production base of the economy, were increasingly connected in a web of technology and data sharing. As the most advanced technologies (usually military developed) went private, many of those involved in the warfare economy, such as DARPA’s Dugan, were in a position to exploit the knowledge and connections that they had accumulated by shifting to the private sector, crossing fairly easily from one system of security and surveillance to another.
A kind of linguistic convergence mirrored the centralized structure of monopoly-finance capital in the age of digital surveillance, with “securitization” increasingly standing simultaneously for a world dominated by: (1) financial derivatives trading, (2) a network of public and private surveillance, (3) the militarization of security-control systems, and (4) the removal of judicial processes from effective civilian control.73
Total Information Awareness, Prism, and Snowden
Close watchers of the U.S. empire recognized that Congress’s attempt to close down Poindexter’s TIA Program had been only partly successful. Faced with Congressional opposition, DARPA and the NSA shifted the program to private industry, where a deeper level of secrecy existed because government accountability was weaker. As Chalmers Johnson wrote in his Dismantling the Empire in 2010:
However, Congress’s action did not end the “total information awareness” program. The National Security Agency secretly decided to continue it through its private contractors. The NSA easily persuaded SAIC [Science Applications International Corporation] and Booz Allen Hamilton to carry on with what Congress had declared to be a violation of the privacy rights of the American public—for a price. As far as we know, Admiral Poindexter’s “Total Information Awareness Program” is still going strong today.74
Such a transfer was more readily carried out given that McConnell, in his capacity as director of the intelligence business at Booz Allen, was already contracting with Poindexter and the Total Information Awareness program. Hence program design, technology, and funding could be readily shifted out of the government into the shadowy world of military contracting. It remained linked to the NSA and its overall super-secret, post-9/11 operation for the domestic surveillance of all Americans. Known in official documents as the “President’s Surveillance Program,” it was referred to by intelligence insiders simply as “The Program.” It was carried out under the supervision of NSA Director General Michael V. Hayden until 2005, when he moved on to become director of the CIA. Hayden’s replacement was the single-minded General Keith Alexander, whose motto was “Collect It All.” Alexander stepped down as head of the NSA in March 2014, in the midst of the Snowden revelations, and was succeeded by Admiral Mike Rogers.75
The relation between the intelligence establishment and the private contracting industry is a revolving door. McConnell, Bush’s director of national intelligence, is once again at Booz Allen, now as vice chairman; James Clapper, Obama’s current director of national intelligence, is a former Booz Allen executive. Booz Allen is majority owned by the Carlyle Group, which specializes in private equity investment and ownership of military contractors. The Carlyle Group has been involved in some of the largest leveraged buyouts, and has long had a close relationship with the Bush family.76
The Snowden files clearly reveal that while Poindexter’s TIA program within DARPA was being defunded by an irate Congress, the NSA had already commenced its own related secret program, part of the President’s Surveillance Program, beginning shortly after 9/11 with Boundless Informant, a warrantless wiretapping program directed at both telephony and email. It took considerably longer to get Prism, which (like Poindexter’s TIA) was directed at total Internet surveillance, up and running, since this required both new technology and cooperation with the major Internet corporations. The technological development and much of the actual surveillance work was to be increasingly centered in Booz Allen and other private contractors. Although the NSA itself has as many as 30,000 employees, it relies on a larger workforce of some 60,000 employed by private contractors.77
In May 2013, Edward Snowden, a middle-level technician at Booz Allen Hamilton who had access to 1.7 to 1.8 million documents, placed large numbers of NSA documents on several thumb drives and fled the country for Hong Kong. From there he courageously revealed the magnitude of NSA spying on the U.S. and global populations.78 Snowden provided documentary evidence, in the form of an NSA PowerPoint presentation, indicating that the NSA, in its own words, had managed to gain “direct access”—i.e., independent of all intermediaries—to practically all data circulating on the Internet within the U.S. sphere. It also gained access to data from the mobile phones of hundreds of millions of Americans as well as populations abroad—operating through Boundless Informant, Prism, and other secret projects within “The Program.” According to one NSA slide, nine technology companies (Microsoft, Apple, Google, Yahoo, Facebook, YouTube, PalTalk, Skype, AOL) had all signed up and become, in some sense, corporate partners with Prism. The slide states that the data is collected “directly from the servers of these U.S. Service Providers.”79 The NSA acquisitions director, in a document provided by Snowden, indicated that this back door allowed the NSA access to hundreds of millions of user accounts. According to Snowden himself, speaking from Hong Kong:
The US government co-opts US corporate power to its own ends. Companies such as Google, Facebook, Apple and Microsoft all get together with the NSA. [They] provide the NSA direct access to the back ends of all of the systems you use to communicate, to store data, to put things in the cloud, and even just to send birthday wishes and keep a record of your life. They give [the] NSA direct access, so that they don’t need to oversee, so they can’t be held liable for it.80
Snowden explained that even a middle-level technician in a private corporation engaged in intelligence, such as himself, could tap into the data of any individual in the United States:
While they may be intending to target someone associated with a foreign government or someone they suspect of terrorism, they are collecting your communications to do so. Any analyst at any time can target anyone. Any selector, anywhere. Whether these communications may be picked up depends on the range of the sensor networks and the authorities an analyst is empowered with. Not all analysts have the ability to target everybody. But I, sitting at my desk, certainly had the authority to wiretap anyone, from you, to your accountant, to a federal judge, and even the president, if I had a personal email [address].81
The Snowden documents reveal that increasingly the NSA did not need the active cooperation of the major Internet and telecom firms but could tap directly into their systems. By 2010, as a result of its BULLRUN and EDGEHILL programs, the NSA had made huge progress in breaking almost any encryption, using supercomputers to crack the algorithms that are the building blocks of encryption, thus hacking into nearly all messages. Further, the documents show that the NSA put a back door into the cyberspace security norms established by the National Institute of Standards and Technology. The NSA claims that it has been able to put “design changes” into commercial encryption that leave the security appearing intact while remaining open to NSA penetration.82 As the Washington Post explained, the NSA does not infiltrate server databases. Rather it gets “‘data on the fly.’ The NSA and GCHQ [Britain’s Government Communications Headquarters] do not break into user accounts that are stored on Yahoo and Google computers. They intercept the information as it travels over fiber optic cables from one data center to another.” The NSA is also working with its British counterpart, GCHQ, to intercept the private clouds of Yahoo and Google, which use private fiber optic highways outside the public Internet to protect their data.83
The NSA has access to more than 80 percent of international telephone calls, for which it pays the U.S. telecom monopolies hundreds of millions of dollars a year. And it has broken into Internet data abroad.84 By these means it has spied even on the heads of state of its allies.
The government and the corporate media sought to brand Snowden as a traitor. Two leading figures seeking to discredit Snowden in the media circuit are Clark, who invariably fails to disclose his own role in surveillance capitalism (having left Acxiom he is now on the advisory board of the cyber-intelligence corporation Tiversa), and McConnell (who downplays the continuous revolving door that has allowed him to move back and forth between the U.S. intelligence establishment and Booz Allen). Both have claimed that Snowden has compromised the security of the United States, by letting the population of the country and the world know the extent to which their every move is under surveillance.85
The Snowden revelations bewildered a U.S. population already struggling with numerous intrusions into their private lives, and ubiquitous surveillance. Dissident hackers associated with Anonymous and Wikileaks, and courageous whistle-blowers, like Snowden and Chelsea (formerly Bradley) Manning—the twenty-five-year-old soldier who released hundreds of thousands of classified documents—have been fighting the secret government-corporate security state.86 Numerous organizations have been struggling for free speech and privacy rights in the new surveillance capitalism.87 The population as a whole, however, has yet to perceive the dangers to democracy in an environment already dominated by a political system best characterized as a “dollarocracy,” and now facing a military-financial-digital complex of unbelievable dimensions, data mining every aspect of life—and already using these new technological tools for repression of dissident groups.88
So far the Snowden revelations have mainly disturbed the elites, making it clear that monopolistic corporations, and particularly the intelligence community, are able to penetrate into the deepest secrets at every level of society. Employees in some private corporations working for the NSA have the ability to hack into most corporate data. The most likely result of all of this is a coming together of giant firms with the security apparatus of government, at the expense of the larger population.
Meanwhile the likelihood of cyberwar increases, threatening the entire capitalist system, and the U.S. empire itself. Ironically, the very structure of imperialism has increased security threats. (And, of course, the threat of cyberwar will be used as a justification for reducing individual rights and noncommercial values online ever more.) The global labor arbitrage, by means of which multinational corporations based in the United States and elsewhere take advantage of low wages in other countries, means that most production of computer hardware, including chips, is now done abroad, primarily in Asia.89 A critical concern of the U.S. Defense Department (which purchases 1 percent of the world’s integrated circuit production) has become the insertion of malware into the circuitry of chips and computer devices themselves, raising the possibility that critical weapons could be programmed to malfunction at a certain time, or to arm or disarm on command. Hacked circuits could be used to bring down financial as well as defense systems. DARPA has nine contracts out to private corporations seeking to develop the means for dealing with these vulnerabilities.90
Nevertheless, such vulnerabilities are truly inescapable in today’s hyper-imperialist system growing out of the contradictions of monopoly-finance capital. Its very economic exploitation of the world population, as well as its own, has left the U.S. imperial system open to attack, producing ever greater attempts at control. These are signs of a dying empire. To prevent total human and planetary disaster it is necessary that the vox populi be heard once again and that the empire go. The digital revolution must be demilitarized and subjected to democratic values and governance, with all that entails. There is no other way.
Richard B. DuBoff, Accumulation and Power (Armonk, NY: M.E. Sharpe, 1989), 91.
William H. Branson, “Trends in United States International Trade and Investment Since World War II,” in Martin Feldstein, ed., The American Economy in Transition (Chicago: University of Chicago Press, 1980), 183.
Dean Acheson, quoted in William Appleman Williams, The Tragedy of American Diplomacy (New York: Dell, 1962), 235–36.
General Dwight D. Eisenhower, “Memorandum for Directors and Chiefs of War Department General and Special Staff Divisions and Bureaus and the Commanding Generals of the Major Commands; Subject: Scientific and Technological Resources as Military Assets,” April 1946. Published as Appendix A in Seymour Melman, Pentagon Capitalism (New York: McGraw Hill, 1971), 231–34.
“‘No Such Agency’ Spies on the Communications of the World,” Washington Post, June 6, 2013, http://washingtonpost.com.
U.S. State Department, Foreign Relations of the United States, 1950. National Security Affairs; Foreign Economic Policy, vol. 1, http://digital.library.wisc.edu, 258–61, 284–86.
S. Nelson Drew, ed., NSC-68: Forging the Strategy of Containment; With Analyses by Paul H. Nitze (Washington, DC: National Defense University, 1994), 117; “The Narcissism of NSC-68,” November 12, 2009, http://econospeak.blogspot.com.
Dean Acheson, Present at the Creation (New York: W.W. Norton, 1987), 377; Thomas H. Etzold and John Lewis Gaddis, Containment: Documents on American Policy and Strategy, 1949–50 (New York: Columbia University Press, 1978), chapter 7; Institute for Economic Democracy, “NSC-68, Master Plan for the Cold War,” http://ied.info; Fred Block, “Economic Instability and Military Strength: The Paradoxes of the Rearmament Decision,” Politics and Society 10, no. 35 (1980): 35–58.
Business Week, April 15, 1950, 15, quoted in Harold G. Vatter, The U.S. Economy in the 1950s (New York: W.W. Norton, 1963), 72.
Harry Magdoff, The Age of Imperialism (New York: Monthly Review Press, 1969), 200–201.
Lynn Turgeon, Bastard Keynesianism: The Evolution of Economic Thinking and Policymaking Since World War II (Westport, CT: Greenwood Press, 1996), 13; Noam Chomsky, Necessary Illusions (Boston: South End Press, 1989), 183.
Paul A. Baran and Paul M. Sweezy, Monopoly Capital (New York: Monthly Review Press, 1966), 152.
Scott Nearing, “World Events,” Monthly Review 16, no. 2 (June 1964): 122.
Quoted in Fred J. Cook, The Warfare State (New York: Macmillan, 1962), 165–66.
“WPB Aide Urges U.S. to Keep War Set-Up,” New York Times, January 20, 1944; Charles E. Wilson, “For the Common Defense,” Army Ordnance 26, no. 143 (March–April 1944): 285–88.
Slichter and U.S. News and World Report, quoted in Cook, The Warfare State, 171.
Bureau of Economic Analysis, “National Income and Product Accounts,” Table 1.1.5 (Gross Domestic Product), and Table 3.9.5 (Government Consumption Expenditures and Gross Investment), http://bea.gov; Baran and Sweezy, Monopoly Capital, 161, 207–13; John Bellamy Foster and Robert W. McChesney, “A New New Deal under Obama?,” Monthly Review 60, no. 9 (February 2009): 1–11; Hannah Holleman, Robert W. McChesney, John Bellamy Foster, and R. Jamil Jonna, “The Penal State in an Age of Crisis,” Monthly Review 61, no. 2 (June 2009): 1–17.
Baran and Sweezy, Monopoly Capital, 191, 206, 213–17.
For an excellent discussion of this, see Andrew J. Bacevich, Breach of Trust: How Americans Failed Their Soldiers and Their Country (New York: Metropolitan Books, 2013), 48–79.
Barbara W. Tuchman, The March of Folly: From Troy to Vietnam (New York: Random House, 1984), 326.
See Paul A. Baran and Paul M. Sweezy, “Some Theoretical Implications,” Monthly Review 64, no. 3 (July–August 2012): 45–58; John Bellamy Foster, The Theory of Monopoly Capitalism, new edition (New York: Monthly Review Press, 2014), xiv–xviii.
Thorstein Veblen, Absentee Ownership and Business Enterprise in Recent Times (New York: Augustus M. Kelley, 1964), 300.
Martin Mayer, Madison Avenue (New York: Harper, 1958), 13–14.
See Michael Dawson, The Consumer Trap (Urbana: University of Illinois Press, 2005).
On the concept of the cultural apparatus, see John Bellamy Foster and Robert W. McChesney, “The Cultural Apparatus of Monopoly Capital,” Monthly Review 65, no. 3 (July–August 2013): 1–33.
Baran and Sweezy, Monopoly Capital, 118–28.
Advertising spending, as noted above, was $10 billion in 1957, while annual military spending in the Eisenhower administration was $40–$50 billion. On the latter figure see Turgeon, Bastard Keynesianism, 13.
Baran and Sweezy, Monopoly Capital, 115–17.
Dennis Daye, “Great Moments in Branding: Neil McElroy Memo,” June 12, 2009, http://brandingstrategyinsider.com; Mayer, Madison Avenue, 26; Editors of Advertising Age, The House that Ivory Built (Lincoln, IL: National Textbook Co., 1988), 20–21, 158; Katie Hafner and Matthew Lyon, Where Wizards Stay Up Late (New York: Simon and Schuster, 1996), 14.
Herbert I. Schiller, Mass Communications and American Empire (Boulder: Westview Press, 1992), 8–9.
See the detailed critique of the Federal Communications Commission in this respect in Monthly Review in the late 1950s: Leo Huberman and Paul M. Sweezy, “Behind the FCC Scandal,” Monthly Review 9, no. 12 (April 1958): 401–11.
Hafner and Lyon, Where Wizards Stay Up Late, 14–21, 255; L. Parker Temple III, Shades of Gray: National Security and the Evolution of Space Reconnaissance (Reston, VA: American Institute of Aeronautics and Astronautics, 2005), 132–33, 142, 146, 192–200, 208–18, 233, 242.
Helen Bury, Eisenhower and the Cold War Arms Race (New York: I.B. Tauris, 2014), 205; William Conrad Gibbons, The U.S. Government and the Vietnam War: Executive and Legislative Roles and Relationships; Part IV: July 1965–January 1968 (Princeton: Princeton University Press, 1995), 3–4.
President Dwight D. Eisenhower, “President Eisenhower’s Farewell to the Nation.” Published as Appendix B in Melman, Pentagon Capitalism, 235–39; Charles E. Nathanson, “The Militarization of the American Economy,” in David Horowitz, ed., Corporations and the Cold War (New York: Monthly Review Press, 1969), 209.
Christopher H. Pyle, Military Surveillance of Civilian Politics, 1967–1970 (New York: Garland Publishing, 1986), 69–81, and “Military Intelligence Overkill,” in Sam J. Ervin et al., Uncle Sam is Watching You: Highlights from the Hearings of the Senate Subcommittee on Constitutional Rights (Washington, DC: Public Affairs Press, 1971), 74–147; Christopher H. Pyle, “Be Afraid, Be Very Afraid, of Spying by U.S. Army,” December 5, 2002, http://bintjbeil.com; Seth F. Kreimer, “Watching the Watchers: Surveillance, Transparency, and Political Freedom in the War on Terror,” University of Pennsylvania Journal of Constitutional Law 133 (September 2004): 138–44; Frank J. Donner, The Age of Surveillance (New York: Alfred A. Knopf, 1980), 287–320.
“Computers Carried Army Files; MIT Investigation Underway,” The Tech, April 11, 1975, http://tech.mit.edu; Hafner and Lyon, Where Wizards Stay Up Late, 231; Gibbons, The U.S. Government and the Vietnam War, 854.
“Ervin Discusses Privacy,” The Tech, April 11, 1975, http://tech.mit.edu.
Arthur R. Miller, “The Surveillance Society,” in Ervin et al., Uncle Sam is Watching You, 25–26.
FBI COINTELPRO documents quoted (and displayed) in Noam Chomsky, “Introduction,” in Nelson Blackstock, ed., COINTELPRO: The FBI’s Secret War on Political Freedom (New York: Pathfinder, 1988), 15–16, 25–33.
Matthew M. Aid and William Burr, “Secret Cold War Documents Reveal NSA Spied on Senators,” Foreign Policy, September 25, 2013, http://foreignpolicy.com.
William Blum, Rogue State (Monroe, ME: Common Courage, 2005), 271–74, and “Anti-Empire Report #118,” June 26, 2013, http://williamblum.org.
See John Bellamy Foster and Robert W. McChesney, The Endless Crisis (New York: Monthly Review Press, 2012).
William Safire, “You Are a Suspect,” New York Times, November 14, 2002, http://nytimes.com; Shane Harris, The Watchers: The Rise of America’s Surveillance State (New York: Penguin, 2010), 194–235, and “Total Recall,” Foreign Policy, June 19, 2013, http://foreignpolicy.com; “Threats and Responses,” New York Times, July 29, 2003, http://nytimes.com; “Pentagon Prepares a Future Market on Terror Attacks,” New York Times, July 29, 2003; Whitfield Diffie and Saul Landau, Privacy on the Line (Cambridge, MA: The MIT Press, 1998), 66–67; “Chief Takes Over New Agency to Thwart Attacks on U.S.,” New York Times, February 13, 2002; White House, National Security Decision Directive Number 145, “National Policy on Telecommunications and Automated Information Security Systems,” September 17, 1984, http://fas.org; Chalmers Johnson, Dismantling the Empire (New York: Henry Holt, 2010), 104–5.
Fred Magdoff and John Bellamy Foster, “Stagnation and Financialization: The Nature of the Contradiction,” Monthly Review 66, no. 1 (May 2014): 9.
Harry Magdoff and Paul M. Sweezy, “Production and Finance,” Monthly Review 35, no. 1 (May 1983): 1–13; Paul M. Sweezy, “More (or Less) on Globalization,” Monthly Review 49, no. 4 (September 1997): 1–4.
See Nomi Prins, All The Presidents’ Bankers: The Hidden Alliances that Drive American Power (New York: Nation Books, 2014).
See Michael Lewis, Flash Boys (New York: W.W. Norton, 2014).
Christian Parenti, The Soft Cage: Surveillance in America (New York: Basic Books, 2003), 91–92, 96.
Dawson, The Consumer Trap, 51.
CBS 60 Minutes, “The Data Brokers: Selling Your Personal Information,” March 9, 2014, http://cbsnews.com; “Never Heard of Acxiom?,” Fortune, February 23, 2004, http://money.cnn.com.
CBS 60 Minutes, “The Data Brokers”; “Never Heard of Acxiom?”; Lois Beckett, “Everything We Want to Know About What Data Brokers Know About You,” Propublica, September 13, 2013, https://propublica.org; U.S. Senate, Staff Report for Chairman [Jay] Rockefeller, Office of Oversight and Investigations Majority Staff, Committee on Commerce, Science, and Transportation, “A Review of the Data Broker Industry,” December 18, 2013, 29, http://commerce.senate.gov; Alice E. Marwick, “How Your Data Are Being Deeply Mined,” New York Review of Books, January 9, 2014, http://nybooks.com.
“What Chase and Other Banks Won’t Tell You About Selling Your Data,” Forbes, October 17, 2013, http://forbes.com.
Harris, The Watchers, 322–29.
“Financial Terrorism: The War on Terabytes,” Economist, December 31, 2011, http://economist.com.
Ibid.; Mortimer Zuckerman, “How to Fight and Win the Cyberwar,” Wall Street Journal, December 6, 2010, http://online.wsj.com; James Bamford, “The Secret War,” Wired, June 12, 2013, http://wired.com.
Hafner and Lyon, Where Wizards Stay Up Late, 242–56; Robert W. McChesney, Digital Disconnect (New York: New Press, 2013), 102–4.
On the Telecommunications Act of 1996, see Robert W. McChesney, The Problem of the Media (New York: Monthly Review Press, 2004), 51–56.
John Bellamy Foster and Hannah Holleman, “The Financial Power Elite,” Monthly Review 62, no. 1 (May 2010): 1–19.
Bill Gates, The Road Ahead (New York: Viking, 1995), 171, 241–42, and “Keynote Address,” in O’Reilly Associates, ed., The Internet and Society (Cambridge, MA: Harvard University Press, 1997), 32; Michael Dawson and John Bellamy Foster, “Virtual Capitalism,” in Robert W. McChesney, Ellen Meiksins Wood, and John Bellamy Foster, eds., Capitalism and the Information Age (New York: Monthly Review Press, 1998), 51–67.
McChesney, Digital Disconnect, 103–37.
McChesney, Digital Disconnect, 158.
Beatrice Edwards, The Rise of the American Corporate Security State (San Francisco: Berrett-Koehler, 2014), 41 (reprinted in this issue, 54); Mark Karlin, “Six Reasons to Be Afraid of the Private Sector/Government Security State” (interview with Beatrice Edwards), Truthout, May 16, 2014, http://truth-out.org.
Glenn Greenwald, No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State (New York: Henry Holt, 2014), 114.
Luke Harding, The Snowden Files (New York: Vintage, 2014), 202.
“Revealed: The NSA’s Secret Campaign to Crack, Undermine Internet Security,” ProPublica/New York Times, September 5, 2013, http://propublica.org; “Microsoft Handed the NSA Access to Encrypted Messages,” Guardian, July 11, 2013, http://theguardian.com; Greenwald, No Place to Hide, 112–15.
“Exclusive: Secret Contract Tied NSA and Security Industry Pioneer,” Reuters, December 20, 2013, http://reuters.com.
“C.I.A. Is Said to Pay AT&T for Call Data,” New York Times, November 7, 2013, http://nytimes.com.
Glenn Greenwald, “NSA Collecting Phone Records of Millions of Verizon Customers Daily,” Guardian, June 6, 2013, http://theguardian.com; “CIA is Said to Pay AT&T for Call Data,” New York Times, November 7, 2013, http://nytimes.com; Electronic Frontier Foundation, “NSA Spying on Americans,” https://eff.org.
“Microsoft, Facebook, Google, and Yahoo Release US Surveillance Requests,” Guardian, February 3, 2014, http://theguardian.com.
Larry Greenemeier, “The Drone Wars,” Scientific American, September 2, 2011, http://scientificamerican.com.
“Why Facebook and Google Are Buying Into Drones,” Guardian, April 20, 2014, http://theguardian.com; Denise Young, “The Edge of Possibility: Regina Dugan,” Virginia Tech Magazine 35, no. 4, Summer 2013, http://vtmag.vt.edu; Alan McDuffie, “Darpa Turns Aging Surveillance Drones Into Wi-Fi Hotspots,” Wired, April 14, 2014, http://wired.com.
“U.S. Marketing Spending Exceeded $1 Trillion in 2005,” Metrics 2.0, June 26, 2006, http://metrics2.com; John Bellamy Foster, Hannah Holleman, and Robert W. McChesney, “The U.S. Imperial Triangle and Military Spending,” Monthly Review 60, no. 5 (October 2008): 1–19; U.S. Bureau of Economic Analysis, Survey of Current Business, May 2008, 43, http://bea.gov.
Max Haiven, “Financialization and the Cultural Politics of Securitization,” Cultural Politics 9, no. 3 (2013): 239–62.
Johnson, Dismantling the Empire, 104–5.
Frontline, “United States of Secrets,” May 13, 2014, http://pbs.org; Ryan Lizza, “State of Deception,” New Yorker, December 16, 2013, http://newyorker.com; Greenwald, No Place to Hide, 95–97; Electronic Frontier Foundation, “How the NSA’s Domestic Spying Program Works,” https://eff.org.
“Booz Allen, the World’s Most Profitable Spy Organization,” Bloomberg Businessweek, June 20, 2013, http://businessweek.com; “Booz Allen Executive Leadership: John M. (Mike) McConnell, Vice Chairman,” accessed May 30, 2014, https://boozallen.com.
↩Greenwald, No Place to Hide, 101; Glenn Greenwald and Ewen MacAskill, “Boundless Informant,” Guardian, June 11, 2013, http://theguardian.com.
↩Greenwald, No Place to Hide, 48; “Ex-NSA Chief Details Snowden’s Hiring at Agency, Booz Allen,” Wall Street Journal, February 4, 2014, http://online.wsj.com.
↩Greenwald, No Place to Hide, 108.
↩Harding, The Snowden Files, 197–99.
↩Harding, The Snowden Files, 204.
↩Harding, The Snowden Files, 208–14.
↩“How We Know the NSA had Access to Internal Google and Yahoo Cloud Data,” November 4, 2013, http://washingtonpost.com; Electronic Frontier Foundation, “How the NSA’s Domestic Spying Program Works.”
↩Harding, The Snowden Files, 203.
↩James Ridgeway, “Wesley Clark Remains Cagey on the Stump,” Village Voice, January 13, 2004, http://villagevoice.com; “Tiversa Advisory Board: General Wesley Clark,” accessed May 30, 2014, http://tiversa.com; “Ex-NSA Chief Details Snowden’s Hiring at Agency, Booz Allen.”
↩“Similarities Seen in Leaks by Snowden, Manning,” Baltimore Sun, June 10, 2013, http://articles.baltimoresun.com.
↩On such groups see Heidi Boghosian, Spying on Democracy (San Francisco: City Light Books, 2013), 265–89.
↩John Nichols and Robert W. McChesney, Dollarocracy (New York: Nation Books, 2013).
↩On the global labor arbitrage see Foster and McChesney, The Endless Crisis, 137–54.
↩Adam Rawnsley, “Can Darpa Fix the Cybersecurity Problem from Hell?,” Wired, August 5, 2011, http://wired.com.
JackRiddler » Wed May 15, 2019 8:04 am wrote:
This is something like the third large wave of this kind of propaganda from the old corporate media against the Internet. They have never stopped. In the beginning (and still today) it was all about the Wild West Internet, home to nothing but child pornography and terrorist bomb wholesalers and, most unspeakably evil of all, people who download Hollywood product. Action must be taken!
Up to ten years ago it was all about the scandal that there were bloggers at all: that they were writing for free (hey, I agree with that one, pay me!!!), that they had the same rights and protections as newspaper journalists even though they could write whatever they liked, and, presumptuously enough, that loads of them actually were journalists, doing a better job of research and reporting than most of the old newspaper crew. (That's not how the old crew put it, of course.)
Now, of course, almost all of the remaining newspaper staff ARE BLOGGERS. What used to be considered blogging activity makes up 90% of what's left of their output. They do more Twitter than bother to call sources. (And it's okay, since most of 'em are paid shit.)
Of course, the multifunctional #Russiagate has also been an epic campaign for censorship on behalf of the old corporate media, and the pressure on the social media corporations to return to their NSA roots and add some FBI has begun to yield results.
Also, they get to displace their own responsibility for Trump. Fucking history textbooks will be written that explain the election of Trump as a result of social media encouraging "polarization" and "filter bubbles" and enabling "conspiracy theory" (which never played a role in American politics before, seriously, and please don't confuse #Russiagate with a conspiracy theory!), plus the influence of ads from a penny-ante Russian spam-storefront company, and possibly contain not a single word about the absolutely central role of NBC and CNN in predetermining Trump's victory within the GOP. Cambridge Analytica, if it is mentioned at all, will be made to look like some kind of minor sidekick dog of the teeny-tiny tail of the Petrograd spam front.