Surveillance

Moderators: DrVolin, 82_28, Elvis, Jeff

Re: Surveillance

Postby Grizzly » Thu Feb 18, 2016 10:59 am

Heard someone somewhere say: stop calling it 'surveillance', which is a watered-down word, like 'intelligence'. Call it what it is: SPYING. Anyway, after listening to that John Young SoundCloud interview somebody posted here the other day, I'm more apt to believe him than Snowden, even after a long time of not really trusting Cryptome. I have come to believe him (John Young), and to think that Snowden is prolly the left-wing arm of the CIA, NSA, whatever alphabet agency... good cop/bad cop bullshit.

Addendum, because posting from my phone is annoyingly sloppy.
Last edited by Grizzly on Thu Feb 18, 2016 11:36 pm, edited 2 times in total.
If Barthes can forgive me, “What the public wants is the image of passion Justice, not passion Justice itself.”
Grizzly
 
Posts: 1879
Joined: Wed Oct 26, 2011 4:15 pm
Blog: View Blog (0)

Re: Surveillance

Postby Grizzly » Thu Feb 18, 2016 11:28 pm

Prosecution argues that turning on your phone means you consent to being tracked
http://www.phonearena.com/news/Prosecut ... ed_id78259

Back in 2014, Baltimore cops were looking for one Kerron Andrews. A warrant was issued for Mr. Andrews as the cops were looking to arrest him for attempted murder. And while the cops did not request an approval to use a device called the Hailstorm to find their man, they employed it anyway. The Hailstorm is a tracking tool similar to the Stingray, in that it intercepts and collects bulk data headed to a cell tower. The data can be used to help find someone's location. It also records all of the person's phone calls.

The Hailstorm did lead to Mr. Andrews' whereabouts, and he was arrested. But the judge, after discovering that the police used the Hailstorm without approval, said that the cops had violated the defendant's Fourth Amendment right against unreasonable search and seizure. The judge granted a request by the defense to suppress the evidence collected by the Hailstorm.

The state has appealed the decision, and in its filing it presents a legal theory that is disturbing. The prosecution says that since every cellphone sports an off switch, Andrews' decision to have his phone turned on indicated he was consenting to be tracked.

If the appeals court goes along with this argument and allows the evidence collected by the Hailstorm to be used during the trial, it will mean that as far as the cops are concerned, you are giving up your privacy each time you press that power button on your handset and turn it on.

As soon as we hear how the appeals court rules, we will let you know. In the meantime, better keep your finger off the power button on your phone if you don't want the authorities to know what you are up to.


[image]

Thanks for the tip!
Last edited by Grizzly on Fri Feb 19, 2016 12:11 am, edited 1 time in total.

Re: Surveillance

Postby Grizzly » Thu Feb 18, 2016 11:44 pm

[image]


Re: Surveillance

Postby Grizzly » Sat Feb 20, 2016 3:01 am

What are the upcoming cultural and political implications of gathering affective data on large populations?



Re: Surveillance

Postby identity » Mon Mar 14, 2016 8:28 am

via Frankfurter Allgemeine Zeitung (hat tip to cryptogon!):

Google as a Fortune Teller
The Secrets of Surveillance Capitalism

Governmental control is nothing compared to what Google is up to. The company is creating a wholly new genus of capitalism, a systemic coherent new logic of accumulation we should call surveillance capitalism. Is there nothing we can do?


Google surpassed Apple as the world’s most highly valued company in January for the first time since 2010. (Back then each company was worth less than 200 billion. Now each is valued at well over 500 billion.) While Google’s new lead lasted only a few days, the company’s success has implications for everyone who lives within the reach of the Internet. Why? Because Google is ground zero for a wholly new subspecies of capitalism in which profits derive from the unilateral surveillance and modification of human behavior. This is a new surveillance capitalism that is unimaginable outside the inscrutable high velocity circuits of Google’s digital universe, whose signature feature is the Internet and its successors. While the world is riveted by the showdown between Apple and the FBI, the real truth is that the surveillance capabilities being developed by surveillance capitalists are the envy of every state security agency. What are the secrets of this new capitalism, how do they produce such staggering wealth, and how can we protect ourselves from its invasive power?

“Most Americans realize that there are two groups of people who are monitored regularly as they move about the country. The first group is monitored involuntarily by a court order requiring that a tracking device be attached to their ankle. The second group includes everyone else…”

Some will think that this statement is certainly true. Others will worry that it could become true. Perhaps some think it’s ridiculous. It’s not a quote from a dystopian novel, a Silicon Valley executive, or even an NSA official. These are the words of an auto insurance industry consultant intended as a defense of “automotive telematics” and the astonishingly intrusive surveillance capabilities of the allegedly benign systems that are already in use or under development. It’s an industry that has been notoriously exploitative toward customers and has had obvious cause to be anxious about the implications of self-driving cars for its business model. Now, data about where we are, where we’re going, how we’re feeling, what we’re saying, the details of our driving, and the conditions of our vehicle are turning into beacons of revenue that illuminate a new commercial prospect. According to the industry literature, these data can be used for dynamic real-time driver behavior modification triggering punishments (real-time rate hikes, financial penalties, curfews, engine lock-downs) or rewards (rate discounts, coupons, gold stars to redeem for future benefits).

Bloomberg Businessweek notes that these automotive systems will give insurers a chance to boost revenue by selling customer driving data in the same way that Google profits by collecting information on those who use its search engine. The CEO of Allstate Insurance wants to be like Google. He says, “There are lots of people who are monetizing data today. You get on Google, and it seems like it’s free. It’s not free. You’re giving them information; they sell your information. Could we, should we, sell this information we get from people driving around to various people and capture some additional profit source…? It’s a long-term game.”

Who are these “various people” and what is this “long-term game”? The game is no longer about sending you a mail order catalogue or even about targeting online advertising. The game is selling access to the real-time flow of your daily life ––your reality–– in order to directly influence and modify your behavior for profit. This is the gateway to a new universe of monetization opportunities: restaurants who want to be your destination. Service vendors who want to fix your brake pads. Shops who will lure you like the fabled Sirens. The “various people” are anyone and everyone who wants a piece of your behavior for profit. Small wonder, then, that Google recently announced that its maps will not only provide the route you search but will also suggest a destination.

The goal: to change people’s actual behavior at scale

This is just one peephole, in one corner, of one industry, and the peepholes are multiplying like cockroaches. Among the many interviews I’ve conducted over the past three years, the Chief Data Scientist of a much-admired Silicon Valley company that develops applications to improve students’ learning told me, “The goal of everything we do is to change people’s actual behavior at scale. When people use our app, we can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad. We can test how actionable our cues are for them and how profitable for us”.

The very idea of a functional, effective, affordable product as a sufficient basis for economic exchange is dying. The sports apparel company Under Armour is reinventing its products as wearable technologies. The CEO wants to be like Google. He says, "If it all sounds eerily like those ads that, because of your browsing history, follow you around the Internet, that's exactly the point--except Under Armour is tracking real behavior and the data is more specific… making people better athletes makes them need more of our gear.” The examples of this new logic are endless, from smart vodka bottles to Internet-enabled rectal thermometers and quite literally everything in between. A Goldman Sachs report calls it a “gold rush,” a race to “vast amounts of data.”

The assault on behavioral data

We’ve entered virgin territory here. The assault on behavioral data is so sweeping that it can no longer be circumscribed by the concept of privacy and its contests. This is a different kind of challenge now, one that threatens the existential and political canon of the modern liberal order defined by principles of self-determination that have been centuries, even millennia, in the making. I am thinking of matters that include, but are not limited to, the sanctity of the individual and the ideals of social equality; the development of identity, autonomy, and moral reasoning; the integrity of contract, the freedom that accrues to the making and fulfilling of promises; norms and rules of collective agreement; the functions of market democracy; the political integrity of societies; and the future of democratic sovereignty. In the fullness of time, we will look back on the establishment in Europe of the “Right to be Forgotten” and the EU’s more recent invalidation of the Safe Harbor doctrine as early milestones in a gradual reckoning with the true dimensions of this challenge.

There was a time when we laid responsibility for the assault on behavioral data at the door of the state and its security agencies. Later, we also blamed the cunning practices of a handful of banks, data brokers, and Internet companies. Some attribute the assault to an inevitable “age of big data,” as if it were possible to conceive of data born pure and blameless, data suspended in some celestial place where facts sublimate into truth.

Capitalism has been hijacked by surveillance

I’ve come to a different conclusion: The assault we face is driven in large measure by the exceptional appetites of a wholly new genus of capitalism, a systemic coherent new logic of accumulation that I call surveillance capitalism. Capitalism has been hijacked by a lucrative surveillance project that subverts the “normal” evolutionary mechanisms associated with its historical success and corrupts the unity of supply and demand that has for centuries, however imperfectly, tethered capitalism to the genuine needs of its populations and societies, thus enabling the fruitful expansion of market democracy.

Surveillance capitalism is a novel economic mutation bred from the clandestine coupling of the vast powers of the digital with the radical indifference and intrinsic narcissism of the financial capitalism and its neoliberal vision that have dominated commerce for at least three decades, especially in the Anglo economies. It is an unprecedented market form that roots and flourishes in lawless space. It was first discovered and consolidated at Google, then adopted by Facebook, and quickly diffused across the Internet. Cyberspace was its birthplace because, as Google/Alphabet Chairperson Eric Schmidt and his coauthor, Jared Cohen, celebrate on the very first page of their book about the digital age, “the online world is not truly bound by terrestrial laws…it’s the world’s largest ungoverned space.”

While surveillance capitalism taps the invasive powers of the Internet as the source of capital formation and wealth creation, it is now, as I have suggested, poised to transform commercial practice across the real world too. An analogy is the rapid spread of mass production and administration throughout the industrialized world in the early twentieth century, but with one major caveat. Mass production was interdependent with its populations who were its consumers and employees. In contrast, surveillance capitalism preys on dependent populations who are neither its consumers nor its employees and are largely ignorant of its procedures.

Internet access as a fundamental human right

We once fled to the Internet as solace and solution, our needs for effective life thwarted by the distant and increasingly ruthless operations of late twentieth century capitalism. In less than two decades after the Mosaic web browser was released to the public enabling easy access to the World Wide Web, a 2010 BBC poll found that 79% of people in 26 countries considered Internet access to be a fundamental human right. This is the Scylla and Charybdis of our plight. It is nearly impossible to imagine effective social participation ––from employment, to education, to healthcare–– without Internet access and know-how, even as these once flourishing networked spaces fall to a new and even more exploitative capitalist regime. It’s happened quickly and without our understanding or agreement. This is because the regime’s most poignant harms, now and later, have been difficult to grasp or theorize, blurred by extreme velocity and camouflaged by expensive and illegible machine operations, secretive corporate practices, masterful rhetorical misdirection, and purposeful cultural misappropriation.

Taming this new force depends upon careful naming. This symbiosis of naming and taming is vividly illustrated in the recent history of HIV research, and I offer it as analogy. For three decades scientists aimed to create a vaccine that followed the logic of earlier cures, training the immune system to produce neutralizing antibodies, but mounting data revealed unanticipated behaviors of the HIV virus that defy the patterns of other infectious diseases.

HIV research as analogy

The tide began to turn at the International AIDS Conference in 2012, when new strategies were presented that rely on a close understanding of the biology of rare HIV carriers whose blood produces natural antibodies. Research began to shift toward methods that reproduce this self-vaccinating response. A leading researcher announced, “We know the face of the enemy now, and so we have some real clues about how to approach the problem.”

The point for us is that every successful vaccine begins with a close understanding of the enemy disease. We tend to rely on mental models, vocabularies, and tools distilled from past catastrophes. I am thinking of the twentieth century’s totalitarian nightmares or the monopolistic predations of Gilded Age capitalism. But the vaccines we’ve developed to fight those earlier threats are not sufficient or even appropriate for the novel challenges we face. It’s like we’re hurling snowballs at a smooth marble wall only to watch them slide down its façade, leaving nothing but a wet smear: a fine paid here, an operational detour there.

An evolutionary dead-end

I want to say plainly that surveillance capitalism is not the only current modality of information capitalism, nor is it the only possible model for the future. Its fast track to capital accumulation and rapid institutionalization, however, has made it the default model of information capitalism. The questions I pose are these: Will surveillance capitalism become the dominant logic of accumulation in our time, or will it be an evolutionary dead-end –– a toothed bird in capitalism’s longer journey? What will an effective vaccine entail?

A cure depends upon many individual, social, and legal adaptations, but I am convinced that fighting the “enemy disease” cannot begin without a fresh grasp of the novel mechanisms that account for surveillance capitalism’s successful transformation of investment into capital. This has been one focus of my work in a new book, Master or Slave: The Fight for the Soul of Our Information Civilization, which will be published early next year. In the short space of this essay, I’d like to share some of my thoughts on this problem.

Fortune telling and selling

New economic logics and their commercial models are discovered by people in a time and place and then perfected through trial and error. Ford discovered and systematized mass production. General Motors institutionalized mass production as a new phase of capitalist development with the discovery and perfection of large-scale administration and professional management. In our time, Google is to surveillance capitalism what Ford and General Motors were to mass-production and managerial capitalism a century ago: discoverer, inventor, pioneer, role model, lead practitioner, and diffusion hub.

Specifically, Google is the mothership and ideal type of a new economic logic based on fortune telling and selling, an ancient and eternally lucrative craft that has exploited the human confrontation with uncertainty from the beginning of the human story. Paradoxically, the certainty of uncertainty is both an enduring source of anxiety and one of our most fruitful facts. It produced the universal need for social trust and cohesion, systems of social organization, familial bonding, and legitimate authority, the contract as formal recognition of reciprocal rights and obligations, and the theory and practice of what we call “free will.” When we eliminate uncertainty, we forfeit the human replenishment that attaches to the challenge of asserting predictability in the face of an always-unknown future in favor of the blankness of perpetual compliance with someone else’s plan.

Only incidentally related to advertising

Most people credit Google’s success to its advertising model. But the discoveries that led to Google’s rapid rise in revenue and market capitalization are only incidentally related to advertising. Google’s success derives from its ability to predict the future – specifically the future of behavior. Here is what I mean:

From the start, Google had collected data on users’ search-related behavior as a byproduct of query activity. Back then, these data logs were treated as waste, not even safely or methodically stored. Eventually, the young company came to understand that these logs could be used to teach and continuously improve its search engine.

The problem was this: Serving users with amazing search results “used up” all the value that users created when they inadvertently provided behavioral data. It’s a complete and self-contained process in which users are ends-in-themselves. All the value that users create is reinvested in the user experience in the form of improved search. In this cycle, there was nothing left over for Google to turn into capital. As long as the effectiveness of the search engine needed users’ behavioral data about as much as users needed search, charging a fee for service was too risky. Google was cool, but it wasn’t yet capitalism –– just one of many Internet startups that boasted “eyeballs” but no revenue.

Shift in the use of behavioral data

The year 2001 brought the dot.com bust and mounting investor pressures at Google. Back then advertisers selected the search term pages for their displays. Google decided to try and boost ad revenue by applying its already substantial analytical capabilities to the challenge of increasing an ad’s relevance to users –– and thus its value to advertisers. Operationally this meant that Google would finally repurpose its growing cache of behavioral data. Now the data would also be used to match ads with keywords, exploiting subtleties that only its access to behavioral data, combined with its analytical capabilities, could reveal.

It’s now clear that this shift in the use of behavioral data was an historic turning point. Behavioral data that were once discarded or ignored were rediscovered as what I call behavioral surplus. Google’s dramatic success in “matching” ads to pages revealed the transformational value of this behavioral surplus as a means of generating revenue and ultimately turning investment into capital. Behavioral surplus was the game-changing zero-cost asset that could be diverted from service improvement toward a genuine market exchange. Key to this formula, however, is the fact that this new market exchange was not an exchange with users but rather with other companies who understood how to make money from bets on users’ future behavior. In this new context, users were no longer an end-in-themselves. Instead they became a means to profits in a new kind of marketplace in which users are neither buyers nor sellers nor products. Users are the source of free raw material that feeds a new kind of manufacturing process.

While these facts are known, their significance has not been fully appreciated or adequately theorized. What just happened was the discovery of a surprisingly profitable commercial equation –– a series of lawful relationships that were gradually institutionalized in the sui generis economic logic of surveillance capitalism. It’s like a newly sighted planet with its own physics of time and space, its sixty-seven hour days, emerald sky, inverted mountain ranges, and dry water.

A parasitic form of profit

The equation: First, the push for more users and more channels, services, devices, places, and spaces is imperative for access to an ever-expanding range of behavioral surplus. Users are the human nature-al resource that provides this free raw material. Second, the application of machine learning, artificial intelligence, and data science for continuous algorithmic improvement constitutes an immensely expensive, sophisticated, and exclusive twenty-first century “means of production.” Third, the new manufacturing process converts behavioral surplus into prediction products designed to predict behavior now and soon. Fourth, these prediction products are sold into a new kind of meta-market that trades exclusively in future behavior. The better (more predictive) the product, the lower the risks for buyers, and the greater the volume of sales. Surveillance capitalism’s profits derive primarily, if not entirely, from such markets for future behavior.

While advertisers have been the dominant buyers in the early history of this new kind of marketplace, there is no substantive reason why such markets should be limited to this group. The already visible trend is that any actor with an interest in monetizing probabilistic information about our behavior and/or influencing future behavior can pay to play in a marketplace where the behavioral fortunes of individuals, groups, bodies, and things are told and sold. This is how in our own lifetimes we observe capitalism shifting under our gaze: once profits from products and services, then profits from speculation, and now profits from surveillance. This latest mutation may help explain why the explosion of the digital has failed, so far, to decisively impact economic growth, as so many of its capabilities are diverted into a fundamentally parasitic form of profit.

Unoriginal Sin

The significance of behavioral surplus was quickly camouflaged, both at Google and eventually throughout the Internet industry, with labels like “digital exhaust,” “digital breadcrumbs,” and so on. These euphemisms for behavioral surplus operate as ideological filters, in exactly the same way that the earliest maps of the North American continent labeled whole regions with terms like “heathens,” “infidels,” “idolaters,” “primitives,” “vassals,” or “rebels.” On the strength of those labels, native peoples, their places and claims, were erased from the invaders’ moral and legal equations, legitimating their acts of taking and breaking in the name of Church and Monarchy.

We are the native peoples now whose tacit claims to self-determination have vanished from the maps of our own behavior. They are erased in an astonishing and audacious act of dispossession by surveillance that claims its right to ignore every boundary in its thirst for knowledge of and influence over the most detailed nuances of our behavior. For those who wondered about the logical completion of the global processes of commodification, the answer is that they complete themselves in the dispossession of our intimate quotidian reality, now reborn as behavior to be monitored and modified, bought and sold.

The process that began in cyberspace mirrors the nineteenth century capitalist expansions that preceded the age of imperialism. Back then, as Hannah Arendt described it in The Origins of Totalitarianism, “the so-called laws of capitalism were actually allowed to create realities” as they traveled to less developed regions where law did not follow. “The secret of the new happy fulfillment,” she wrote, “was precisely that economic laws no longer stood in the way of the greed of the owning classes.” There, “money could finally beget money,” without having to go “the long way of investment in production…”

“The original sin of simple robbery”

For Arendt, these foreign adventures of capital clarified an essential mechanism of capitalism. Marx had developed the idea of “primitive accumulation” as a big-bang theory –– Arendt called it “the original sin of simple robbery” –– in which the taking of lands and natural resources was the foundational event that enabled capital accumulation and the rise of the market system. The capitalist expansions of the 1860s and 1870s demonstrated, Arendt wrote, that this sort of original sin had to be repeated over and over, “lest the motor of capital accumulation suddenly die down.”

In his book The New Imperialism, geographer and social theorist David Harvey built on this insight with his notion of “accumulation by dispossession.” “What accumulation by dispossession does,” he writes, “is to release a set of assets…at very low (and in some instances zero) cost. Overaccumulated capital can seize hold of such assets and immediately turn them to profitable use…It can also reflect attempts by determined entrepreneurs…to ‘join the system’ and seek the benefits of capital accumulation.”

Breakthrough into “the system”

The process by which behavioral surplus led to the discovery of surveillance capitalism exemplifies this pattern. It is the foundational act of dispossession for a new logic of capitalism built on profits from surveillance that paved the way for Google to become a capitalist enterprise. Indeed, in 2002, Google’s first profitable year, founder Sergey Brin relished his breakthrough into “the system”, as he told journalist Steven Levy:

“Honestly, when we were still in the dot-com boom days, I felt like a schmuck. I had an Internet start-up — so did everybody else. It was unprofitable, like everybody else’s, and how hard is that? But when we became profitable, I felt like we had built a real business.”

Brin was a capitalist all right, but it was a mutation of capitalism unlike anything the world had seen.

Once we understand this equation, it becomes clear that demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the Internet is like asking Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck or a cow to give up chewing. Such demands are existential threats that violate the basic mechanisms of the entity’s survival. How can we expect companies whose economic existence depends upon behavioral surplus to cease capturing behavioral data voluntarily? It’s like asking for suicide.

More behavioral surplus for Google

The imperatives of surveillance capitalism mean that there must always be more behavioral surplus for Google and others to turn into surveillance assets, master as prediction, sell into exclusive markets for future behavior, and transform into capital. At Google and its new holding company called Alphabet, for example, every operation and investment aims at increasing the harvest of behavioral surplus from people, bodies, things, processes, and places in both the virtual and the real world. This is how a sixty-seven hour day dawns and darkens in an emerald sky. Nothing short of a social revolt that revokes collective agreement to the practices associated with the dispossession of behavior will alter surveillance capitalism’s claim to manifest data destiny.

What is the new vaccine? We need to reimagine how to intervene in the specific mechanisms that produce surveillance profits and in so doing reassert the primacy of the liberal order in the twenty-first century capitalist project. In undertaking this challenge we must be mindful that contesting Google, or any other surveillance capitalist, on the grounds of monopoly is a 20th century solution to a 20th century problem that, while still vitally important, does not necessarily disrupt surveillance capitalism’s commercial equation. We need new interventions that interrupt, outlaw, or regulate 1) the initial capture of behavioral surplus, 2) the use of behavioral surplus as free raw material, 3) excessive and exclusive concentrations of the new means of production, 4) the manufacture of prediction products, 5) the sale of prediction products, 6) the use of prediction products for third-order operations of modification, influence, and control, and 7) the monetization of the results of these operations. This is necessary for society, for people, for the future, and it is also necessary to restore the healthy evolution of capitalism itself.

A coup from above

In the conventional narrative of the privacy threat, institutional secrecy has grown, and individual privacy rights have been eroded. But that framing is misleading, because privacy and secrecy are not opposites but rather moments in a sequence. Secrecy is an effect; privacy is the cause. Exercising one’s right to privacy produces choice, and one can choose to keep something secret or to share it. Privacy rights thus confer decision rights, but these decision rights are merely the lid on the Pandora’s Box of the liberal order. Inside the box, political and economic sovereignty meet and mingle with even deeper and subtler causes: the idea of the individual, the emergence of the self, the felt experience of free will.

Surveillance capitalism does not erode these decision rights –– along with their causes and their effects –– but rather it redistributes them. Instead of many people having some rights, these rights have been concentrated within the surveillance regime, opening up an entirely new dimension of social inequality. The full implications of this development have preoccupied me for many years now, and with each day my sense of danger intensifies. The space of this essay does not allow me to follow these facts to their conclusions, but I offer this thought in summary.

Surveillance capitalism reaches beyond the conventional institutional terrain of the private firm. It accumulates not only surveillance assets and capital, but also rights. This unilateral redistribution of rights sustains a privately administered compliance regime of rewards and punishments that is largely free from detection or sanction. It operates without meaningful mechanisms of consent either in the traditional form of “exit, voice, or loyalty” associated with markets or in the form of democratic oversight expressed in law and regulation.

Profoundly anti-democratic power

As a result, surveillance capitalism conjures a profoundly anti-democratic power that qualifies as a coup from above: not a coup d’état, but rather a coup des gens, an overthrow of the people’s sovereignty. It challenges principles and practices of self-determination ––in psychic life and social relations, politics and governance –– for which humanity has suffered long and sacrificed much. For this reason alone, such principles should not be forfeit to the unilateral pursuit of a disfigured capitalism. Worse still would be their forfeit to our own ignorance, learned helplessness, inattention, inconvenience, habituation, or drift. This, I believe, is the ground on which our contests for the future will be fought.

Hannah Arendt once observed that indignation is the natural human response to that which degrades human dignity. Referring to her work on the origins of totalitarianism she wrote, “If I describe these conditions without permitting my indignation to interfere, then I have lifted this particular phenomenon out of its context in human society and have thereby robbed it of part of its nature, deprived it of one of its important inherent qualities.”

So it is for me and perhaps for you: The bare facts of surveillance capitalism necessarily arouse my indignation because they demean human dignity. The future of this narrative will depend upon the indignant scholars and journalists drawn to this frontier project, indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities, and indignant citizens who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, and freedom from uncertainty is no freedom.


Shoshana Zuboff is the Charles Edward Wilson Professor, Emerita, Harvard Business School. This essay was written for a 2016 address at Green Templeton College, Oxford. Her forthcoming book is Master or Slave: The Fight for the Soul of Our Information Civilization, to be published by Eichborn in Germany and Public Affairs in the U.S.
He was disoriented in all three spheres.
Somnolence alternated with excitement.
When not in hell he was convinced he was in Eden.
identity
 
Posts: 483
Joined: Fri Mar 20, 2015 5:00 am
Blog: View Blog (0)

Re: Surveillance

Postby backtoiam » Tue Mar 22, 2016 2:17 pm

The Dangers of New York City’s Public Wi-Fi
Published: March 21, 2016

If you’re a NYC resident and find yourself excited about the city’s public Wi-Fi network known as LinkNYC, you may want to think again.

As the New York Civil Liberties Union (NYCLU) notes:

March 16, 2016 — The city’s new public Wi-Fi network LinkNYC raises several privacy concerns for users, the New York Civil Liberties Union announced today after sending a letter to the Office of the Mayor on Tuesday. CityBridge, the company behind the LinkNYC kiosks that have begun replacing phone booths in Manhattan, retains a vast amount of information about users – often indefinitely – building a massive database that carries a risk of security breaches and unwarranted NYPD surveillance.

“New Yorkers’ private online activities shouldn’t be used to create a massive database that’s within the ready grasp of the NYPD,” said Donna Lieberman, executive director of the NYCLU. “Free public Wi-Fi can be an invaluable resource for this city, but New Yorkers need to know there are too many strings attached.”

LinkNYC, which was publicly launched in January, will eventually grow into a network of 7,500 to 10,000 public kiosks offering fast and free Wi-Fi throughout all five boroughs. The sheer volume of information gathered by this powerful network will create a massive database of information that will present attractive opportunities for hackers and for law enforcement surveillance, and will carry an undue risk of abuse, misuse and unauthorized access.

In order to register for LinkNYC, users must submit their e-mail addresses and agree to allow CityBridge to collect information about what websites they visit on their devices, where and how long they linger on certain information on a webpage, and what links they click on. CityBridge’s privacy policy only offers to make “reasonable efforts” to clear out this massive amount of personally identifiable user information, and even then, only if there have been 12 months of user inactivity. New Yorkers who use LinkNYC regularly will have their personally identifiable information stored for a lifetime and beyond.

Mayor Bill de Blasio’s launch of LinkNYC, which occurred at the same time that NYPD Commissioner Bratton spoke about the Apple controversy, also raised concerns that the NYPD will be requesting information from CityBridge and that CityBridge will be cooperating – troubling because LinkNYC users are not guaranteed notification if the NYPD requests to access their information. And according to its privacy policy, data collected by environmental sensors or cameras at the LinkNYC kiosks may be available to the city or NYPD. In its letter, the NYCLU requests to know if the environmental sensors and cameras will be routinely feeding into any city or NYPD systems, including the controversial Domain Awareness System, and contends that users must be specifically notified if this is the case.


You’ve been warned.

http://www.blacklistednews.com/The_Dang ... 8/Y/M.html
"A mind stretched by a new idea can never return to its original dimensions." Oliver Wendell Holmes
backtoiam
 
Posts: 2101
Joined: Mon Aug 31, 2015 9:22 am
Blog: View Blog (0)

Re: Surveillance

Postby elfismiles » Fri May 20, 2016 9:44 am

@TheAVClub

The exec producers of Person Of Interest suspect Facebook will destroy the world http://avc.lu/26JpAf4



Brian Barrett, Security | Date of Publication: 05.19.16
New Surveillance System May Let Cops Use All of the Cameras

The 30 million or so surveillance cameras peering into nearly every corner of American life might freak you out a bit, but you could always tell yourself that no one can access them all. Until now.

Computer scientists have created a way of letting law enforcement tap any camera that isn’t password protected so they can determine where to send help or how to respond to a crime. “It’s a way to help people take advantage of information that’s out there,” says David Ebert, an electrical and computer engineer at Purdue University.

The system, which is just a proof of concept, alarms privacy advocates who worry that prudent surveillance could easily lead to government overreach, or worse, unauthorized use. It relies upon two tools developed independently at Purdue. The Visual Analytics Law Enforcement Toolkit superimposes the rate and location of crimes and the location of police surveillance cameras. CAM2 reveals the location and orientation of public network cameras, like the one outside your apartment. You could do the same thing with a search engine like Shodan, but CAM2 makes the job far easier, which is the scary part. Aggregating all these individual feeds makes it potentially much more invasive.
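Neither VALET nor CAM2 is publicly available as code, so the following is only a sketch of the underlying idea: given the coordinates of known public cameras and of a reported incident, a great-circle distance computation ranks which feeds are worth pulling up first. The camera IDs and coordinates below are invented for illustration.

```python
import math

def haversine_km(a, b):
    # Great-circle distance between two (lat, lon) points in kilometres.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

# Hypothetical registry of public network cameras (IDs and coordinates invented).
cameras = {
    "cam_a": (40.4237, -86.9212),
    "cam_b": (40.4259, -86.9081),
    "cam_c": (40.3900, -86.8600),
}

incident = (40.4240, -86.9200)  # reported location of an emergency

# Rank feeds by distance so a dispatcher sees the most relevant cameras first.
nearby = sorted((haversine_km(incident, loc), cid) for cid, loc in cameras.items())
print(nearby[0][1])  # cam_a
```

The privacy concern in the article follows directly from how little code this takes: once camera locations are aggregated in one index, cross-referencing them against any event is trivial.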

Purdue limits access to registered users, and the terms of service for CAM2 state “you agree not to use the platform to determine the identity of any specific individuals contained in any video or video stream.” A reasonable step to ensure privacy, but difficult to enforce (though the team promises the system will have strict security if it ever goes online).

“I can certainly see the utility for first responders,” says Dave Maass, an investigative researcher with digital rights group EFF. “But it does open up the potential for some unseemly surveillance.”

Beyond the specter of universal government surveillance lies the risk of someone hacking the system. To Maass, it brings to mind the TV show Person of Interest and its band of vigilantes who tap government cameras to predict and prevent crimes. This is not so far-fetched. Last year, the EFF discovered that anyone could access more than 100 “secure” automated license plate readers. “I think it becomes a very tempting target,” says Gautam Hans, policy counsel at the Center for Democracy & Technology. “Thinking about security issues is going to be a major concern.”

Granted, the system does not tap private feeds, nor does it peer into private spaces like someone’s home. But aggregating this data and mapping it against specific crimes or emergencies is troubling. Hans says there’s no way of knowing when someone violates the terms of service and targets an individual, and the patchwork of regulations governing how agencies can use such technology is no guarantee against government over-reach.

Still, Hans is pragmatic and realizes the Purdue researchers have a noble goal. “At a certain level there’s only so much you can do to prevent the march of technology,” he says. “It’s not the best use of our time to rail against its existence. At a certain point we need to figure out how to use it effectively, or at least with extensive oversight.”

https://www.wired.com/2016/05/new-surve ... e-cameras/
elfismiles
 
Posts: 8214
Joined: Fri Aug 11, 2006 6:46 pm
Blog: View Blog (4)

Re: Surveillance

Postby identity » Sat May 21, 2016 5:12 am

https://www.theguardian.com/technology/2016/may/17/findface-face-recognition-app-end-public-anonymity-vkontakte

Face recognition app taking Russia by storm may bring end to public anonymity



FindFace compares photos to profile pictures on social network Vkontakte and works out identities with 70% reliability

FindFace has amassed 500,000 users in the short time since the launch


Shaun Walker in Moscow

If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.

It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.

In the short time since the launch, FindFace has amassed 500,000 users and processed nearly 3m searches, according to its founders, 26-year-old Artem Kukharenko and 29-year-old Alexander Kabakov.

Kukharenko is a lanky, quietly spoken computer nerd who has come up with the algorithm that makes FindFace such an impressive piece of technology, while Kabakov is the garrulous money and marketing man, who does all of the talking when the pair meet the Guardian.

Unlike other face recognition technology, their algorithm allows quick searches in big data sets. “Three million searches in a database of nearly 1bn photographs: that’s hundreds of trillions of comparisons, and all on four normal servers. With this algorithm, you can search through a billion photographs in less than a second from a normal computer,” said Kabakov, during an interview at the company’s modest central Moscow office. The app will give you the most likely match to the face that is uploaded, as well as 10 people it thinks look similar.

Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”

Some have sounded the alarm about the potentially disturbing implications. Already the app has been used by a St Petersburg photographer to snap and identify people on the city’s metro, as well as by online vigilantes to uncover the social media profiles of female porn actors and harass them.

The technology can work with any photographic database, though it currently cannot use Facebook, because even the public photographs are stored in a way that is harder to access than Vkontakte, the app’s creators say.

But the FindFace app is really just a shop window for the technology, the founders said. There is a paid function for those who want to make more than 30 searches a month, but this is more to regulate the servers from overload rather than to make money. They believe the real money-spinner from their face-recognition technology will come from law enforcement and retail.

Kukharenko and Kabakov have recently returned from the US, and Kabakov was due to travel to Macau and present the technology to a casino chain. The pair claim they have been contacted by police in Russian regions, who told them they started loading suspect or witness photographs into FindFace and came up with results. “It’s nuts: there were cases that had seen no movement for years, and now they are being solved,” said Kabakov.

The startup is in the final stages of signing a contract with Moscow city government to work with the city’s network of 150,000 CCTV cameras. If a crime is committed, the mugshots of anyone in the area can be fed into the system and matched with photographs of wanted lists, court records, and even social networks.

It does not take a wild imagination to come up with sinister applications in this field too: for example, authoritarian regimes able to tag and identify participants in street protests. Kabakov and Kukharenko said they had not received an approach from Russia’s FSB security service, but “if the FSB were to get in touch, of course we’d listen to any offers they had”.

The pair also have big plans for the retail sector. Kabakov imagines a world where cameras fix you looking at, say, a stereo in a shop, the retailer finds your identity, and then targets you with marketing for stereos in the subsequent days.

Again, it sounds a little disturbing. But Kabakov said, as a philosophy graduate, he believes we cannot stop technological progress so must work with it and make sure it stays open and transparent.

“In today’s world we are surrounded by gadgets. Our phones, televisions, fridges, everything around us is sending real-time information about us. Already we have full data on people’s movements, their interests and so on. A person should understand that in the modern world he is under the spotlight of technology. You just have to live with that.”


edited to add this related item:

http://www.theguardian.com/world/2016/apr/14/russian-photographer-yegor-tsvetkov-identifies-strangers-facial-recognition-app

Thursday 14 April 2016 15.27 BST
Elena Cresci


A Russian photographer has proved how easy it is to track down people on social media using facial recognition software.

Yegor Tsvetkov took photos of strangers on St Petersburg’s metro and used a facial recognition app which trawls through profiles on VKontakte, Russia’s biggest social network, to track down their online profiles.

Named “Your Face is Big Data”, the series of photographs shows how powerful facial recognition software has become, to the point that a complete stranger can find you at the click of a button.

Tsvetkov told the Guardian the project aimed to show technology can affect privacy, particularly if you don’t activate the relevant settings on your social media profiles.

“Nobody noticed that I photographed them, but I used a simple camera and I didn’t try to hide it,” he said.

“One girl in the project texted me after the publication and said that it was a bad feeling when she saw herself … but she fully understood my idea.”

The software he used is called FindFace, and was developed by the Moscow-based company N-Tech.Lab.

Launched in February, the app trawls through millions of profiles on VKontakte to find the person you are looking for within seconds.

Tsvetkov showed just how well this software works. When the Guardian ran some of his photographs through the site, the profiles of most of his subjects were easy to locate.

The Guardian has not published any of these photos to protect people’s anonymity.

Facial recognition software has proved problematic for Facebook. In 2011, its commitment to privacy was questioned when it turned on facial recognition software to automatically identify people in photos. In Germany, it was threatened with legal action for violating privacy laws.

Currently, it is not possible to trawl through Facebook using facial recognition and, as of yet, there is no western equivalent of FindFace.
He was disoriented in all three spheres.
Somnolence alternated with excitement.
When not in hell he was convinced he was in Eden.
identity
 
Posts: 483
Joined: Fri Mar 20, 2015 5:00 am
Blog: View Blog (0)

Re: Surveillance

Postby elfismiles » Tue May 24, 2016 11:09 am

I don't know which thread would be best to post this in ... meh, dystopian big brother society thread it is :thumbsup

Why does it have to be an ISRAELI start-up...

Terrorist or pedophile? This start-up says it can out secrets by analyzing faces
By Matt McFarland May 24 at 6:30 AM

Our faces may reveal a lot more about us than we expect. (Kacper Pempel/Reuters)

An Israeli start-up says it can take one look at a person’s face and realize character traits that are undetectable to the human eye.

Faception said it’s already signed a contract with a homeland security agency to help identify terrorists. The company said its technology also can be used to identify everything from great poker players to extroverts, pedophiles, geniuses and white-collar criminals.

“We understand the human much better than other humans understand each other,” said Faception chief executive Shai Gilboa. “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.”

Faception has built 15 different classifiers, which Gilboa said evaluate certain traits with 80 percent accuracy. The start-up is pushing forward, seeing tremendous power in a machine’s ability to analyze images.

Yet experts caution there are ethical questions and profound limits to the effectiveness of technology such as this.

“Can I predict that you’re an ax murderer by looking at your face and therefore should I arrest you?” said Pedro Domingos, a professor of computer science at the University of Washington and author of “The Master Algorithm.” “You can see how this would be controversial.”

Gilboa said he also serves as the company’s chief ethics officer and will never make his classifiers that predict negative traits available to the general public.

The danger lies in the computer system’s imperfections. Because of that, Gilboa envisions governments considering his findings along with other sources to better identify terrorists. Even so, the use of the data is troubling to some.

“The evidence that there is accuracy in these judgments is extremely weak,” said Alexander Todorov, a Princeton psychology professor whose research includes facial perception. “Just when we thought that physiognomy ended 100 years ago. Oh, well.”

Faception recently showed off its technology at a poker tournament organized by a start-up that shares investors with Faception. Gilboa said that Faception predicted before the tournament that four players out of the 50 amateurs would be the best. When the dust settled, two of those four were among the event’s three finalists. To make its prediction, Faception analyzed photos of the 50 players against a Faception database of professional poker players.

There are challenges in trying to use artificial intelligence systems to draw conclusions such as this. A computer that is trained to analyze images will only be as good as the examples it is trained on. If the computer is exposed to a narrow or outdated sample of data, its conclusions will be skewed. Additionally, there’s the risk the system will make an accurate prediction, but not necessarily for the right reasons.

Domingos, the University of Washington professor, shared the example of a colleague who trained a computer system to tell the difference between dogs and wolves. Tests proved the system was almost 100 percent accurate. But it turned out the computer was successful because it learned to look for snow in the background of the photos. All of the wolf photos were taken in the snow, whereas the dog pictures weren’t.
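The dog/wolf anecdote can be reproduced in miniature. In the toy data below, snow in the background correlates perfectly with the "wolf" label, so even the simplest possible model, a one-feature decision stump, picks the snow column and then misclassifies a wolf photographed without snow. All feature names and values are invented for illustration:

```python
import numpy as np

# Toy "photo features": [ear_pointiness, snout_length, snow_in_background].
# Labels: 1 = wolf, 0 = dog. Every wolf photo happens to contain snow.
X_train = np.array([
    [0.9, 0.8, 1.0],   # wolf in snow
    [0.6, 0.9, 1.0],   # wolf in snow
    [0.7, 0.6, 1.0],   # wolf in snow
    [0.8, 0.7, 0.0],   # husky on grass (a pointy-eared dog)
    [0.5, 0.2, 0.0],   # dog on grass
    [0.3, 0.4, 0.0],   # dog on grass
])
y_train = np.array([1, 1, 1, 0, 0, 0])

def best_stump(X, y):
    # Pick the single feature whose "> 0.5" test best predicts the label.
    accs = [((X[:, j] > 0.5).astype(int) == y).mean() for j in range(X.shape[1])]
    return int(np.argmax(accs)), max(accs)

feature, acc = best_stump(X_train, y_train)
# The stump latches onto the snow column (index 2): it alone is 100% "accurate"
# on the training photos, while the animal features are not.

# A wolf photographed indoors at a zoo, with no snow in the frame:
zoo_wolf = np.array([0.9, 0.8, 0.0])
prediction = int(zoo_wolf[feature] > 0.5)   # 0, i.e. "dog": confidently wrong
```

This is exactly the failure mode Domingos describes: the model is "accurate" for reasons that have nothing to do with the thing it claims to measure.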

Also, an artificial intelligence system might zero in on a trait that could be changed by a person — such as the presence of a beard — limiting its ability to make an accurate prediction.

“If somebody came to me and said ‘I have a company that’s going to try to do this,’ my answer to them would be ‘nah, go do something more promising,’ ” Domingos said. “But on the other hand, machine learning brings us lots of surprises every day.”

https://www.washingtonpost.com/news/inn ... ing-faces/
elfismiles
 
Posts: 8214
Joined: Fri Aug 11, 2006 6:46 pm
Blog: View Blog (4)

Re: Surveillance

Postby elfismiles » Sat Jun 11, 2016 8:53 am

DEA Wants Inside Your Medical Records to Fight the War on Drugs
The feds are fighting to look at millions of private files without a warrant, including those of two transgender men who are taking testosterone.
http://www.thedailybeast.com/articles/2 ... drugs.html
elfismiles
 
Posts: 8214
Joined: Fri Aug 11, 2006 6:46 pm
Blog: View Blog (4)

Re: Surveillance

Postby Grizzly » Mon Jun 13, 2016 8:50 am

The NSA wants to monitor pacemakers and other medical devices

http://www.theverge.com/2016/6/11/11910 ... al-devices

The NSA is interested in collecting information from pacemakers and other biomedical devices for national security purposes, according to The Intercept. Richard Ledgett, the agency's deputy director, reportedly said at a conference yesterday that, "We’re looking at it sort of theoretically from a research point of view right now."


More at the link...
If Barthes can forgive me, “What the public wants is the image of passion Justice, not passion Justice itself.”
Grizzly
 
Posts: 1879
Joined: Wed Oct 26, 2011 4:15 pm
Blog: View Blog (0)

Re: Surveillance

Postby elfismiles » Mon Jun 20, 2016 11:05 am

FBI says surveillance cam locations must be kept secret
"Disclosure of even minor details about them may cause jeopardy," bureau says.

by David Kravets - Jun 15, 2016 1:01pm CDT
http://arstechnica.com/tech-policy/2016 ... pt-secret/
elfismiles
 
Posts: 8214
Joined: Fri Aug 11, 2006 6:46 pm
Blog: View Blog (4)


Re: Surveillance

Postby Grizzly » Thu Aug 04, 2016 7:35 pm

More SPYING...

You are under surveillance right now.

Your cell phone provider tracks your location and knows who’s with you. Your online and in-store purchasing patterns are recorded, and reveal if you're unemployed, sick, or pregnant. Your e-mails and texts expose your intimate and casual friends. Google knows what you’re thinking because it saves your private searches. Facebook can determine your sexual orientation without you ever mentioning it.

The powers that surveil us do more than simply store this information. Corporations use surveillance to manipulate not only the news articles and advertisements we each see, but also the prices we’re offered. Governments use surveillance to discriminate, censor, chill free speech, and put people in danger worldwide. And both sides share this information with each other or, even worse, lose it to cybercriminals in huge data breaches.

Much of this is voluntary: we cooperate with corporate surveillance because it promises us convenience, and we submit to government surveillance because it promises us protection. The result is a mass surveillance society of our own making. But have we given up more than we’ve gained? In Data and Goliath, security expert Bruce Schneier offers another path, one that values both security and privacy. He shows us exactly what we can do to reform our government surveillance programs and shake up surveillance-based business models, while also providing tips for you to protect your privacy every day. You'll never look at your phone, your computer, your credit cards, or even your car in the same way again.
If Barthes can forgive me, “What the public wants is the image of passion Justice, not passion Justice itself.”
Grizzly
 
Posts: 1879
Joined: Wed Oct 26, 2011 4:15 pm
Blog: View Blog (0)

Re: Surveillance

Postby Agent Orange Cooper » Fri Aug 05, 2016 2:06 pm

http://www.bloomberg.com/news/articles/ ... ican-adult

This Company Has Built a Profile on Every American Adult
Every move you make. Every click you take. Every game you play. Every place you stay. They’ll be watching you.

Forget telephoto lenses and fake mustaches: The most important tools for America’s 35,000 private investigators are database subscription services. For more than a decade, professional snoops have been able to search troves of public and nonpublic records—known addresses, DMV records, photographs of a person’s car—and condense them into comprehensive reports costing as little as $10. Now they can combine that information with the kinds of things marketers know about you, such as which politicians you donate to, what you spend on groceries, and whether it’s weird that you ate in last night, to create a portrait of your life and predict your behavior.

IDI, a year-old company in the so-called data-fusion business, is the first to centralize and weaponize all that information for its customers. The Boca Raton, Fla., company’s database service, idiCORE, combines public records with purchasing, demographic, and behavioral data. Chief Executive Officer Derek Dubner says the system isn’t waiting for requests from clients—it’s already built a profile on every American adult, including young people who wouldn’t be swept up in conventional databases, which only index transactions. “We have data on that 21-year-old who’s living at home with mom and dad,” he says.

Dubner declined to provide a demo of idiCORE or furnish the company’s report on me. But he says these personal profiles include all known addresses, phone numbers, and e-mail addresses; every piece of property ever bought or sold, plus related mortgages; past and present vehicles owned; criminal citations, from speeding tickets on up; voter registration; hunting permits; and names and phone numbers of neighbors. The reports also include photos of cars taken by private companies using automated license plate readers—billions of snapshots tagged with GPS coordinates and time stamps to help PIs surveil people or bust alibis.
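Bloomberg doesn’t describe idiCORE’s internals, but the basic data-fusion step it implies is record linkage: normalize a few identifying fields into a join key, then merge every source that shares the key into one profile. A deliberately crude sketch with hypothetical records (real systems use far more robust matching):

```python
from collections import defaultdict

# Hypothetical records from three sources; all names and fields are invented.
dmv     = [{"name": "Jane Q. Doe", "address": "12 Elm St",  "vehicle": "blue sedan"}]
voter   = [{"name": "jane q doe",  "address": "12 Elm St.", "party": "independent"}]
coupons = [{"name": "Jane Doe",    "address": "12 elm st",  "ailment": "asthma"}]

def link_key(rec):
    # Crude linkage key: first and last name plus a normalized street address.
    name = rec["name"].lower().replace(".", "").split()
    addr = rec["address"].lower().replace(".", "")
    return (name[0], name[-1], addr)

profiles = defaultdict(dict)
for source in (dmv, voter, coupons):
    for rec in source:
        profiles[link_key(rec)].update(rec)   # fields accumulate under one key

merged = next(iter(profiles.values()))
print(sorted(merged))   # address, ailment, name, party, vehicle
```

The point of the sketch is the asymmetry the article describes: each source on its own is mundane, but once linked under a single key, the merged profile knows your car, your politics, and your health complaints at once.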

IDI also runs two coupon websites, allamericansavings.com and samplesandsavings.com, that collect purchasing and behavioral data. When I signed up for the latter, I was asked for my e-mail address, birthday, and home address, information that could easily link me with my idiCORE profile. The site also asked if I suffered from arthritis, asthma, diabetes, or depression, ostensibly to help tailor its discounts.

CONT'D
Agent Orange Cooper
 
Posts: 449
Joined: Tue Oct 06, 2015 2:44 am
Blog: View Blog (0)
