The creepiness that is Facebook

Moderators: Elvis, DrVolin, Jeff

Re: The creepiness that is Facebook

Postby seemslikeadream » Mon Feb 19, 2018 11:10 am

Facebook Still Lying About Its Role in the 2016 Election

Josh Marshall

I flagged this on Twitter before President Trump started flogging it. But I’m not at all surprised that he did. Because, somewhat to my surprise, it revealed that Facebook seems still to be committed to lying, albeit now more artfully, about its role in the 2016 election and more broadly as a channel of choice for propaganda and misinformation.

First, here’s the tweet I saw from Facebook’s VP of advertising: Rob Goldman …

Rob Goldman
@robjective
Feb 16
Most of the coverage of Russian meddling involves their attempt to effect the outcome of the 2016 US election. I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal.



Rob Goldman‏
@robjective
The majority of the Russian ad spend happened AFTER the election. We shared that fact, but very few outlets have covered it because it doesn’t align with the main media narrative of Tump and the election.
Hard Questions: Russian Ads Delivered to Congress
https://newsroom.fb.com/news/2017/10/ha ... -congress/


There is, as they say, a lot to discuss here. Facebook was a bad actor by complicity in the entire 2016 election Russian interference campaign. As I’ve noted in other posts, it’s an engine built to maximize engagement for ad sales and data collection which operates with no need to price its negative externalities. To pull that out of jargon into more concrete terms, it’s like a factory that is highly profitable in large part because it can dump its toxic waste into the local river. Facebook is designed to do stuff like this. So it’s not some shocking or unexpected occurrence that this happened.

For months, Facebook’s executives, including its founder and CEO not only denied but mocked the idea that its platform had been used to distribute misinformation and propaganda during the 2016 campaign. It came clean only under tremendous pressure, both legal pressure and the pressure of public opinion. It’s laughable and frankly offensive that any executive from Facebook thinks he can lecture anyone on this topic. Facebook had dirty hands in this whole drama and had to be dragged kicking and screaming to any discussion of its responsibility.

Of course there’s no bright line between the goal of electing Trump and the broader goal of sowing discord and confusion. The essence of the whole story is that both overlapped. But note Goldman’s effort to deemphasize the election and argue that the real goal wasn’t electing Donald Trump. Presumably an ad executive wouldn’t feel like he had a particular dog in that fight either way. But it’s clearly a pretty key thing for him. He even apes what amounts to quasi-Trumpian rhetoric in saying the media distorts the story because the facts “don’t align with the main media narrative of Trump and the election.” This is silly. Elections are a big deal. It’s hardly surprising that people would focus on the election, even though the Russian activity has continued since. What is this about exactly? Is Goldman some kind of hardcore Trumper?

I have no idea what Goldman’s politics are. But whatever they are, I don’t think they are what’s driving this. It’s built into Facebook’s business model and of a piece with its corporate culture. The business model of Facebook is universal usage. It doesn’t target one demographic or regional or political audience. The whole point of Facebook is that everybody be on it. Much of its network value is bound up in that universality. Everyone’s on Facebook. Everyone has an account.

This business model has critical political implications. Much like a television network, it can’t be perceived as taking sides in America’s increasingly polarized politics. That could cut it off from a big chunk of its potential audience. This fact has shaped the behavior of all of the tech giants over the last decade but none more so than Facebook. Nothing is more politically divisive today than the question of Russia’s role in the 2016 election. And yet Facebook is implicated in that story. It can’t avoid it, despite trying desperately to do so. Simply put, if Facebook collectively says what is obvious: that Russia decided it wanted to elect Donald Trump President and used Facebook as one tool to do that, it becomes just another part of the ‘fake news’ universally derided by Trump and his supporters.

It’s an understandable dodge in a way. After all, Facebook’s business model requires being Facebook for everyone. But what it tells me is that Facebook is still in the business of lying about its role in the 2016 election. President Trump himself clearly saw immediately that Goldman’s line was an effort to align Facebook with President Trump’s messaging – namely, it wasn’t about electing Trump and anyone who says the contrary is just trying to push “the main media narrative of T[r]ump and the election.”
Donald J. Trump
@realDonaldTrump
Donald J. Trump Retweeted Rob Goldman
The Fake News Media never fails. Hard to ignore this fact from the Vice President of Facebook Ads, Rob Goldman!


In other words, the story of Russia’s effort to elect Donald Trump is just ‘fake news’. Facebook remains part of the problem.
https://talkingpointsmemo.com/edblog/fa ... re-1111934
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby Karmamatterz » Mon Feb 19, 2018 1:48 pm

There is, as they say, a lot to discuss here. Facebook was a bad actor by complicity in the entire 2016 election Russian interference campaign. As I’ve noted in other posts, it’s an engine built to maximize engagement for ad sales and data collection which operates with no need to price its negative externalities. To pull that out of jargon into more concrete terms, it’s like a factory that is highly profitable in large part because it can dump its toxic waste into the local river. Facebook is designed to do stuff like this. So it’s not some shocking or unexpected occurrence that this happened.

For months, Facebook’s executives, including its founder and CEO not only denied but mocked the idea that its platform had been used to distribute misinformation and propaganda during the 2016 campaign. It came clean only under tremendous pressure, both legal pressure and the pressure of public opinion. It’s laughable and frankly offensive that any executive from Facebook thinks he can lecture anyone on this topic. Facebook had dirty hands in this whole drama and had to be dragged kicking and screaming to any discussion of its responsibility.


The author is pretty smug in expecting media companies and tech players to fess up to what they do. So unrealistic it's hard to take Marshall seriously. He is naive, or just posturing as his way of virtue signaling to his audience. Facebook is what it is, a beast that has largely turned an entire generation into a bunch of validation-seeking, emotionally stunted brats believing they are entitled to anything they need/want with a swipe. The author says FB is designed to "do stuff like this" and that it is not shocking that it happened. Then he spends the rest of the article acting like Zuck or others owe him an explanation. LOL. I've sat in conferences with FB executives in the same room hearing their explanations for what happened. It's always bullshit lies. My favorite is when they say they were taken by surprise, or that they were naive in thinking bad actors wouldn't use their platform in such a manner. A complete joke. Nobody should take Facebook, or any tech giant, seriously about anything they say. Unless they are raising the cost of their products/services, then you can believe them.

Calling FB complicit seems to mean they were supposed to be doing something about ads published on their platform in regard to the election. FB does in fact have safeguards in place for ads that are published, but they are more about hate speech, copyright infringement or bogus clickbait scammy stuff. Does anybody in their right mind believe that the media would care about Facebook selling ads to "bad actors" if Hillary had won the election? No way in hell would this investigation have happened or would anybody have cared. Just imagine if the Russian troll job had the goal of getting Hillary elected, and their ad campaigns were successful. There would be little if any discussion about foreign influences. If Hillary had won, and the troll job had helped, AND the Right or Trumpy had called for an investigation, the whole thing would be labeled as a wacko right wing conspiracy.

America was trolled with all the new tech tools they so love. People are indignant that such a thing could happen. The media is complicit in that they are not reporting how many times the U.S. government has done the same thing to other countries. They don't want to expose that shadowy side because of pure laziness, and it would eventually show them to be complicit in all kinds of other shenanigans. The media is at fault for shitty reporting. I won't even call it journalism as most of it isn't. We are to blame for buying into the hype, propaganda and crap that is shoveled in front of our faces. It is our fault that people are not able to discern the difference between actual good, solid journalism and media coverage via agenda-driven strategies to serve ideologies. The media is rarely concerned with the facts and true journalism. Trust the media as much as you trust Facebook.
User avatar
Karmamatterz
 
Posts: 828
Joined: Sun Aug 19, 2012 10:58 pm
Blog: View Blog (0)

Re: The creepiness that is Facebook

Postby Karmamatterz » Mon Feb 19, 2018 4:10 pm

Rather than double post, here is the link to the info on the GIGANTIC media buy the Russian troll factory made with Facebook. Big joke, people. $100k barely gets you in the door with exposure. It's a drop in the bucket and laughable that people think these ads impacted the election. As the article states, probably a third of the ads were never even seen. Anybody that knows how to measure viewability could tell you that.

viewtopic.php?f=8&t=40854&p=651349#p651349
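
For rough context on the scale claim above, here is a minimal back-of-envelope sketch in Python. The CPM and viewability figures are assumptions chosen purely for illustration; they are not numbers from this post or the linked thread.

```python
# Back-of-envelope estimate of what a $100k Facebook ad buy might deliver.
# All rate assumptions below are hypothetical illustrations, not measured values.

ad_spend_usd = 100_000          # total reported spend
assumed_cpm_usd = 6.00          # assumed cost per 1,000 impressions
assumed_viewability = 0.55      # assumed share of served ads actually seen

impressions_served = ad_spend_usd / assumed_cpm_usd * 1_000
impressions_viewed = impressions_served * assumed_viewability

print(f"Impressions served: {impressions_served:,.0f}")
print(f"Impressions viewed: {impressions_viewed:,.0f}")
print(f"Share never seen:   {1 - assumed_viewability:.0%}")
```

Whether that reach is "a drop in the bucket" depends entirely on the assumed rates, which is the point of making them explicit.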
User avatar
Karmamatterz
 
Posts: 828
Joined: Sun Aug 19, 2012 10:58 pm
Blog: View Blog (0)

Re: The creepiness that is Facebook

Postby seemslikeadream » Mon Feb 19, 2018 8:25 pm

Jonathan Albright
Professor and researcher in news, journalism, and #hashtags. Award-nominated data journalist. Media, communication, and technology.
Nov 8, 2017

Image
Instagram, Meme Seeding, and the Truth about Facebook Manipulation, Pt. 1
The last couple of weeks have brought us the first new major revelations about the reach and scope of the IRA media influence campaign. Yet the most important development about the ongoing Facebook investigation isn’t the tenfold increase in the company’s updated estimate of the organic reach of “ads” on its platform.
While the estimate increasing the reach of IRA content from 10 million people to 126 million people is surely a leap, after last week’s testimony, the real question we should be asking is: how did we suddenly arrive at 150 million?
The answer is Instagram.

.........

https://medium.com/berkman-klein-center ... e4d0b61db5
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby seemslikeadream » Sat Mar 17, 2018 10:24 am

BIGGER STORY THAN THE MCCABE FIRING


Russian data launderin’


Facebook bans Trump campaign’s data analytics firm, Cambridge Analytica, for failing to delete data that it had taken inappropriately from users of the social network.

FACEBOOK FAILED TO PROTECT 30 MILLION USERS FROM HAVING THEIR DATA HARVESTED BY TRUMP CAMPAIGN AFFILIATE

“We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis that the entire company was built on.”

Image
Image

Image


Very good, hard-hitting & non-credulous interview with the Cambridge Analytica whistleblower Chris Wylie. It's worth the 16 minutes it takes to watch.

Whistleblower reveals to Channel 4 News data grab of 50 million Facebook profiles by Cambridge Analytica – data firm linked to Trump win

Andy Davies
17 Mar 2018

Home Affairs Correspondent
The British data firm described as “pivotal” in Donald Trump’s presidential victory was behind a ‘data grab’ of more than 50 million Facebook profiles, a whistleblower has revealed to Channel 4 News. In part one of Channel 4 News’ ‘Data, Democracy and Dirty Tricks’ investigation, in an exclusive television interview, Chris Wylie, former Research Director at Cambridge Analytica, tells all. In collaboration with the Observer newspaper.
https://www.channel4.com/news/cambridge ... e-election




seemslikeadream » Wed Oct 25, 2017 11:26 pm wrote:
FACEBOOK FAILED TO PROTECT 30 MILLION USERS FROM HAVING THEIR DATA HARVESTED BY TRUMP CAMPAIGN AFFILIATE
Mattathias Schwartz
March 30 2017, 1:01 p.m.

IN 2014, TRACES of an unusual survey, connected to Facebook, began appearing on internet message boards. The boards were frequented by remote freelance workers who bid on “human intelligence tasks” in an online marketplace, called Mechanical Turk, controlled by Amazon. The “turkers,” as they’re known, tend to perform work that is rote and repetitive, like flagging pornographic images or digging through search engine results for email addresses. Most jobs pay between 1 and 15 cents. “Turking makes us our rent money and helps pay off debt,” one turker told The Intercept. Another turker has called the work “voluntary slave labor.”

The task posted by “Global Science Research” appeared ordinary, at least on the surface. The company offered turkers $1 or $2 to complete an online survey. But there were a couple of additional requirements as well. First, Global Science Research was only interested in American turkers. Second, the turkers had to download a Facebook app before they could collect payment. Global Science Research said the app would “download some information about you and your network … basic demographics and likes of categories, places, famous people, etc. from you and your friends.”
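
To make concrete what that era's permission model allowed, here is a minimal sketch of how a 2014-era Facebook app could have walked from one consenting survey-taker out to their friends' basic profile fields and page likes through the then-current Graph API. The token, API version, endpoints, and field list are assumptions for illustration; the actual Global Science Research app is not publicly documented at this level, and this access was shut off when Facebook retired the old friends permissions.

```python
# Illustrative sketch only: a v1.0-era Graph API client with the old
# friends_likes-style permissions, pulling a user's and their friends' data.
# Endpoints and fields are assumptions, not a reconstruction of the GSR app.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # hypothetical token granted by the survey app


def get_json(path, **params):
    """GET a Graph API path and return the decoded JSON."""
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{GRAPH}/{path}", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()


def harvest(user_id="me"):
    """Collect basic demographics and likes for a user and their friends."""
    rows = [{
        "profile": get_json(user_id, fields="id,name,gender,location"),
        "likes": get_json(f"{user_id}/likes").get("data", []),
    }]
    for friend in get_json(f"{user_id}/friends").get("data", []):
        # Pre-2015, data visible to a user's friends was also exposed to the app.
        rows.append({
            "profile": get_json(friend["id"], fields="id,name,gender,location"),
            "likes": get_json(f"{friend['id']}/likes").get("data", []),
        })
    return rows


if __name__ == "__main__":
    print(f"Collected {len(harvest())} profiles")
```

The key design point the article describes is the last loop: one participant's consent fanned out to data about friends who never interacted with the app.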

“Our terms of service clearly prohibit misuse,” said a spokesperson for Amazon Web Services, by email. “When we learned of this activity back in 2015, we suspended the requester for violating our terms of service.”

Although Facebook’s early growth was driven by closed, exclusive networks at colleges and universities, it has gradually herded users to agree to increasingly permissive terms of service. By 2014, anything a user’s friends could see was also potentially visible to the developers of any app that they chose to download. Some of the turkers noticed that the Global Science Research app appeared to be taking advantage of Facebook’s porousness. “Someone can learn everything about you by looking at hundreds of pics, messages, friends, and likes,” warned one, writing on a message board. “More than you realize.” Others were more blasé. “I don’t put any info on FB,” one wrote. “Not even my real name … it’s backwards that people put sooo much info on Facebook, and then complain when their privacy is violated.”

In late 2015, the turkers began reporting that the Global Science Research survey had abruptly shut down. The Guardian had published a report that exposed exactly who the turkers were working for. Their data was being collected by Aleksandr Kogan, a young lecturer at Cambridge University. Kogan founded Global Science Research in 2014, after the university’s psychology department refused to allow him to use its own pool of data for commercial purposes. The data collection that Kogan undertook independent of the university was done on behalf of a military contractor called Strategic Communication Laboratories, or SCL. The company’s election division claims to use “data-driven messaging” as part of “delivering electoral success.”

SCL has a growing U.S. spin-off, called Cambridge Analytica, which was paid millions of dollars by Donald Trump’s campaign. Much of the money came from committees funded by the hedge fund billionaire Robert Mercer, who reportedly has a large stake in Cambridge Analytica. For a time, one of Cambridge Analytica’s officers was Stephen K. Bannon, Trump’s senior adviser. Months after Bannon claimed to have severed ties with the company, checks from the Trump campaign for Cambridge Analytica’s services continued to show up at one of Bannon’s addresses in Los Angeles.

“You can say Mr. Mercer declined to comment,” said Jonathan Gasthalter, a spokesperson for Robert Mercer, by email.
Image
Facebook Elections signs in the media area at Quicken Loans Arena in Cleveland, Aug. 6, 2015, before the first Republican presidential debate of the 2016 election. Photo: John Minchillo/AP
The Intercept interviewed five individuals familiar with Kogan’s work for SCL. All declined to be identified, citing concerns about an ongoing inquiry at Cambridge and fears of possible litigation. Two sources familiar with the SCL project told The Intercept that Kogan had arranged for more than 100,000 people to complete the Facebook survey and download an app. A third source with direct knowledge of the project said that Global Science Research obtained data from 185,000 survey participants as well as their Facebook friends. The source said that this group of 185,000 was recruited through a data company, not Mechanical Turk, and that it yielded 30 million usable profiles. No one in this larger group of 30 million knew that “likes” and demographic data from their Facebook profiles were being harvested by political operatives hired to influence American voters.

Kogan declined to comment. In late 2014, he gave a talk in Singapore in which he claimed to have “a sample of 50+ million individuals about whom we have the capacity to predict virtually any trait.” Global Science Research’s public filings for 2015 show the company holding 145,111 British pounds in its bank account. Kogan has since changed his name to Spectre. Writing online, he has said that he changed his name to Spectre after getting married. “My wife and I are both scientists and quite religious, and light is a strong symbol of both,” he explained.

The purpose of Kogan’s work was to develop an algorithm for the “national profiling capacity of American citizens” as part of SCL’s work on U.S. elections, according to an internal document signed by an SCL employee describing the research.

“We do not do any work with Facebook likes,” wrote Lindsey Platts, a spokesperson for Cambridge Analytica, in an email. The company currently “has no relationship with GSR,” Platts said.

“Cambridge Analytica does not comment on specific clients or projects,” she added when asked whether the company was involved with Global Science Research’s work in 2014 and 2015.

The Guardian, which was the first to report on Cambridge Analytica’s work on U.S. elections, in late 2015, noted that the company drew on research “spanning tens of millions of Facebook users, harvested largely without their permission.” Kogan disputed this at the time, telling The Guardian that his turker surveys had collected no more than “a couple of thousand responses” for any one client. While it is unclear how many responses Global Science Research obtained through Mechanical Turk and how many it recruited through a data company, all five of the sources interviewed by The Intercept confirmed that Kogan’s work on behalf of SCL involved collecting data from survey participants’ networks of Facebook friends, individuals who had not themselves consented to give their data to Global Science Research and were not aware that they were the objects of Kogan’s study. In September 2016, Alexander Nix, Cambridge Analytica’s CEO, said that the company built a model based on “hundreds and hundreds of thousands of Americans” filling out personality surveys, generating a “model to predict the personality of every single adult in the United States of America.”

Shortly after The Guardian published its 2015 article, Facebook contacted Global Science Research and requested that it delete the data it had taken from Facebook users. Facebook’s policies give Facebook the right to delete data gathered by any app deemed to be “negatively impacting the Platform.” The company believes that Kogan and SCL complied with the request, which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.

In public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. “Our investigation to date has not uncovered anything that suggests wrongdoing,” a Facebook spokesperson told The Intercept.

Facebook appears not to have considered Global Science Research’s data collection to have been a serious ethical lapse. Joseph Chancellor, Kogan’s main collaborator on the SCL project and a former co-owner of Global Science Research, is now employed by Facebook Research. “The work that he did previously has no bearing on the work that he does at Facebook,” a Facebook spokesperson told The Intercept.

Chancellor declined to comment.

Cambridge Analytica has marketed itself as classifying voters using five personality traits known as OCEAN — Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism — the same model used by University of Cambridge researchers for in-house, non-commercial research. The question of whether OCEAN made a difference in the presidential election remains unanswered. Some have argued that big data analytics is a magic bullet for drilling into the psychology of individual voters; others are more skeptical. The predictive power of Facebook likes is not in dispute. A 2013 study by three of Kogan’s former colleagues at the University of Cambridge showed that likes alone could predict race with 95 percent accuracy and political party with 85 percent accuracy. Less clear is their power as a tool for targeted persuasion; Cambridge Analytica has claimed that OCEAN scores can be used to drive voter and consumer behavior through “microtargeting,” meaning narrowly tailored messages. Nix has said that neurotic voters tend to be moved by “rational and fear-based” arguments, while introverted, agreeable voters are more susceptible to “tradition and habits and family and community.”
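
As a rough illustration of the kind of model the likes study describes, here is a minimal sketch that fits a logistic regression on a sparse user-by-page-likes matrix to predict a binary trait. The data is synthetic and the pipeline is a generic stand-in for the approach, not the researchers' actual code or Cambridge Analytica's models.

```python
# Minimal sketch of likes-based trait prediction: logistic regression on a
# sparse user-by-page matrix. Synthetic data; a generic stand-in only.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 5_000, 2_000

# Each row: which pages a user has liked (binary, sparse).
likes = sparse_random(n_users, n_pages, density=0.01, random_state=0, format="csr")
likes.data[:] = 1.0

# Synthetic binary trait loosely correlated with a subset of "signal" pages.
signal_pages = rng.choice(n_pages, size=50, replace=False)
score = likes[:, signal_pages].sum(axis=1).A.ravel()
trait = (score + rng.normal(scale=0.5, size=n_users) > np.median(score)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(likes, trait, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```

The point of the sketch is the structure of the problem: a very wide, very sparse matrix of likes, a simple linear model, and whatever labeled traits the survey supplied.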

Dan Gillmor, director of the Knight Center at Arizona State University, said he was skeptical of the idea that the Trump campaign got a decisive edge from data analytics. But, he added, such techniques will likely become more effective in the future. “It’s reasonable to believe that sooner or later, we’re going to see widespread manipulation of people’s decision-making, including in elections, in ways that are more widespread and granular, but even less detectable than today,” he wrote in an email.


Trump’s circle has been open about its use of Facebook to influence the vote. Joel Pollak, an editor at Breitbart, writes in his campaign memoir about Trump’s “armies of Facebook ‘friends,’ … bypassing the gatekeepers in the traditional media.” Roger Stone, a longtime Trump adviser, has written in his own campaign memoir about “geo-targeting” cities to deliver a debunked claim that Bill Clinton had fathered a child out of wedlock, and narrowing down the audience “based on preferences in music, age range, black culture, and other urban interests.”

Clinton, of course, had her own analytics effort, and digital market research is a normal part of any political campaign. But the quantity of data compiled on individuals during the run-up to the election is striking. Alexander Nix, head of Cambridge Analytica, has claimed to “have a massive database of 4-5,000 data points on every adult in America.” Immediately after the election, the company tried to take credit for the win, claiming that its data helped the Trump campaign set the candidate’s travel schedule and place online ads that were viewed 1.5 billion times. Since then, the company has been de-emphasizing its reliance on psychological profiling.

The Information Commissioner’s Office, an official privacy watchdog within the British government, is now looking into whether Cambridge Analytica and similar companies might pose a risk to voters’ rights. The British inquiry was triggered by reports in The Observer of ties between Robert Mercer, Cambridge Analytica, and the Leave.EU campaign, which worked to persuade British voters to leave the European Union. While Nix has previously talked about the firm’s work for Leave.EU, Cambridge Analytica now denies that it had any paid role in the campaign.
Image
Leave.EU signage is displayed in London on March 5, 2016. Photo: Rex Features/AP Images
In the U.S., where privacy laws are looser, there is no investigation. Cambridge Analytica is said to be pitching its products to several federal agencies, including the Joint Chiefs of Staff. SCL, its parent company, has new offices near the White House and has reportedly been advised by Gen. Michael Flynn, Trump’s former national security adviser, on how to increase its federal business. (A spokesperson for Flynn denied that he had done any work for SCL.)

Years before the arrival of Kogan’s turkers, Facebook founder Mark Zuckerberg tried to address privacy concerns around the company’s controversial Beacon program, which quietly funneled data from outside websites into Facebook, often without Facebook users being aware of the process. Reflecting on Beacon, Zuckerberg attributed part of Facebook’s success to giving “people control over what and how they share information.” He said that he regretted making Beacon an “opt-out system instead of opt-in … if someone forgot to decline to share something, Beacon went ahead and still shared it with their friends.”

Seven years later, Facebook appears to have made the same mistake, but with far greater consequences. In mid-2014, however, Facebook announced a new review process, where the company would make sure that new apps asked only for data they would actually use. “People want more control,” the company said at that time. “It’s going to make a huge difference with building trust with your app’s audience.” Existing apps were given a full year to switch over to have Facebook review how they handled user data. By that time, Global Science Research already had what it needed.
https://theintercept.com/2017/03/30/fac ... affiliate/


Massachusetts launches probe into Cambridge Analytica’s use of Facebook data
BY LUIS SANCHEZ - 03/17/18 02:19 PM EDT

Massachusetts attorney general Maura Healey (D) announced Saturday that her state will launch an investigation into Cambridge Analytica, a data firm used by the Trump campaign during the 2016 election, after Facebook suspended the firm.

"Massachusetts residents deserve answers immediately from Facebook and Cambridge Analytica. We are launching an investigation," Healey tweeted.


Cambridge Analytica was suspended on Friday after reports that it had not fully deleted data it obtained from Cambridge University professor Aleksandr Kogan.

The professor was found to have harvested more than 50 million Facebook profiles from his app, which required a Facebook login, despite only 270,000 having given permission for their data to be harvested, according to a New York Times report Saturday.



About thirty million of the profiles Kogan gave the firm had enough information to create psychographic profiles, the newspaper reported.

Facebook said it discovered that the firm had violated its rules in 2015 and demanded that the firm certify it had destroyed the data it had received. The firm provided the certification.

However, Facebook said it suspended the firm after recent reports came out that said Cambridge Analytica did not destroy all of its data.

“If true, this is another unacceptable violation of trust and the commitments they made. We are suspending SCL/Cambridge Analytica, Wylie and Kogan from Facebook, pending further information,” Facebook Vice President Paul Grewal said in a statement issued Friday.

“Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules. By passing information on to a third party, including SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, he violated our platform policies,” the statement read.

Cambridge Analytica has come under scrutiny for its involvement in the 2016 presidential election. President Trump’s former strategist and chief campaign executive Steve Bannon is a former vice president of the firm.

Special counsel Robert Mueller has reportedly requested all the emails between the firm and the Trump campaign, and the firm’s CEO has reportedly been interviewed by the House Intelligence Committee.
http://thehill.com/policy/technology/37 ... ssion=true
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby seemslikeadream » Sat Mar 17, 2018 6:50 pm

Carole Cadwalladr

Here he is. This is the story @facebook tried to suppress. The Cambridge Analytica whistleblower goes on the record in tomorrow’s @ObsNewReview
Image


Carole Cadwalladr

Yesterday @facebook threatened to sue us. Today we publish this.
Meet the whistleblower blowing the lid off Facebook & Cambridge Analytica.

Image


Perjury is the new normal. Facebook just gave proof to this whopper of a lie.
Image



Today’s Cambridge Analytica story is important. Mercer owns it. Bannon was previously CEO. Flynn consulted for them. Cambridge Analytica contacted Wikileaks re: hacked Clinton emails. Kushner used them for the Trump campaign.


Rebekah Mercer asked Cambridge Analytica CEO if the company could help organize H. Clinton emails that were being released by WikiLeaks.


Rebekah Mercer sits on the board of the Heritage Foundation. Cambridge Analytica sold their data to the Heritage Foundation.

An interesting detail in the NYT article - the lawyer advising Cambridge Analytica, Laurence Levy, worked at the time for Bracewell & Giuliani - and Levy is one of the lawyers that later in 2016 followed Giuliani to Greenberg Traurig
https://www.nytimes.com/2018/03/17/us/p ... paign.html
Image



Cambridge Analytica created a pitch for Russia's 2nd biggest oil co Lukoil & CEO Vagit Alekperov, a former Soviet oil minister & assoc of Putin - while no evidence CA did business w/Lukoil they were briefed on FB, microtargeting, data & election disruption

Image


An important point from @d1gi that today's news is only part of the story - the other part is how Cambridge Analytica merged Facebook info with other data and distribution channels

Let me emphasize this is tip of the iceberg. CA *merged* its FB linked databases with comScore Rentrak in Oct 2016; massive FB Custom Audiences targeting exploit w/Audiences Overlap tool from Jan 2016; massive ad fraud op/millions of fake FB & Instagram accounts to boost likes...

Image



Jonathan Albright
Professor and researcher in news, journalism, and #hashtags. Award-nominated data journalist. Media, communication, and technology.
Jul 30, 2017
Who Hacked the Election? Ad Tech did. Through “Fake News,” Identity Resolution and Hyper-Personalization

Image

Complete graph of ad tech infrastructure, tracking, unique ID, and server technologies from earlier group of “fake news” sites
Identifying the Identifiers:
Several months ago, I captured hundreds of trackers, scripts, and “ad tech” resources that loaded onto my computer as I visited a group of 110 hyper-partisan, parody, hoax, pseudoscience, and propaganda (ie, “fake news”) sites. These sites form part of what I call the “micro-propaganda machine.”
#Election2016: Propaganda-lytics & Weaponized Shadow Trackers

A journey into the behavioral tracking technologies of the right-wing “micro-propaganda machine”
medium.com
Since the issue is still at the center of the “election hacking” and voter “micro-targeting” debates, to better understand the role of this weaponized tracking infrastructure in the news ecosystem, I spent some time filling this network out with more complete data. To do this, I collected an extensive list of all the software, companies, and services that these scripts, cookies, APIs, unique identifiers, content customization services, business intelligence services, and ID resources were “calling home” to from my earlier ad tech “scrape” of the same sites.
This time around, using a set of tools including Threatcrowd, Maltego, and Gephi, and along with some advanced spreadsheet and data viz work, I revisited this group, adding the deep layer of ad tech, content customization and targeting technologies, and A/B testing platforms that this “fake news” behavioral tracking infrastructure is meant to “deliver on.”
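
To show the general shape of that kind of analysis, here is a minimal sketch that builds a site-to-tracker graph from captured third-party request domains and ranks trackers by how many sites load them. The input data is a made-up placeholder, and the actual work described above used tools such as Threatcrowd, Maltego, and Gephi rather than this code.

```python
# Minimal sketch of the site -> third-party-tracker mapping described above:
# build a bipartite graph and rank tracker domains by how many sites call them.
# The observations dict is placeholder data, not the author's actual capture.
import networkx as nx

observations = {  # site -> third-party domains seen loading on it (hypothetical)
    "example-partisan-site.com": {"tracker-a.net", "adtech-b.com", "pixel-c.io"},
    "example-hoax-site.net": {"tracker-a.net", "pixel-c.io"},
    "example-parody-site.org": {"adtech-b.com", "tracker-a.net"},
}

G = nx.Graph()
for site, trackers in observations.items():
    G.add_node(site, kind="site")
    for tracker in trackers:
        G.add_node(tracker, kind="tracker")
        G.add_edge(site, tracker)

# Rank trackers by how many distinct sites load them ("calling home" breadth).
tracker_nodes = [n for n, d in G.nodes(data=True) if d["kind"] == "tracker"]
for tracker in sorted(tracker_nodes, key=G.degree, reverse=True):
    print(f"{tracker}: loaded by {G.degree(tracker)} of {len(observations)} sites")
```

A graph like this (exported to Gephi or similar) is what makes shared ad tech infrastructure across otherwise unrelated sites visible at a glance.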
The data I present here suggests that before we keep pointing fingers at specific countries and tweeting about companies “hacking the election,” as well as to solve the scourge of “fake news,” it might be good to look inward. By this, I mean we should start the quest for transparency in politics with a few firms based in New York City and Silicon Valley.
The Sources
Before the “ad tech is everywhere” and “voter targeting is nothing new” arguments come up, remember: I’m not talking about Slate, Buzzfeed, the National Review, or even #12 top tracker awardee and shell site the Drudge Report, but about a highly coordinated campaign to drive traffic to a list of players such as:
Chicks on the Right: A site without a header (title) on a mobile-unoptimized site front page?
https://medium.com/tow-center/who-hacke ... d4019f705f
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby seemslikeadream » Sat Mar 17, 2018 9:45 pm

The Trump Campaign paid Cambridge Analytica $6 million for a database of 220 million U.S. voters including private details of their personality and activities.


BREAKING: So the Cambridge Analytica Whistleblower has been 'depersonned' by @facebook without any chance to retrieve his contacts or private materials.

Christopher Wylie
@chrisinsilico
Suspended by @facebook. For blowing the whistle. On something they have known privately for 2 years.



Follow-Up Questions For Facebook, Cambridge Analytica and Trump Campaign on Massive Breach
by Justin Hendrix
March 17, 2018


Late on Friday night, Facebook made a surprising announcement. The company said it was suspending the British firm Strategic Communication Laboratories (SCL), and its political data analytics firm, Cambridge Analytica. In 2016, Cambridge Analytica famously played a role in microtargeting messages for Donald Trump’s presidential election, using Facebook data in its models. According to bombshell reports in the New York Times and the Observer this morning, it appears that the firm stole the user information it acquired from Facebook.

A whistleblower—a former Cambridge Analytica employee—presented a dossier of evidence to reporters that, according to the Observer, “includes emails, invoices, contracts and bank transfers that reveal more than 50 million profiles – mostly belonging to registered US voters – were harvested from the site in the largest ever breach of Facebook data.” The story is surprising on a number of levels. It suggests that Alexander Nix, the CEO of Cambridge Analytica, intentionally made misrepresentations in recent testimony to the British Parliament. It implicates the hedge fund billionaire Robert Mercer and his daughter, Rebekah, who together played a major role in the Trump campaign. But more than anything, it calls into question Facebook’s handling of what is clearly a massive breach of user privacy.

Journalists, regulatory bodies and Congress should be ready to ask a number of pressing questions to get to the bottom of exactly what happened. The answers are important: governments around the world are considering how best to regulate technology companies, and this extraordinary incident gets to the heart of the relationship between personal data, microtargeting, dark money and the impact of their combination with unaccountable platforms on the health of democracies.

Here are seven key questions:

1. Why did Facebook take more than two years to inform the public of this massive breach?

News reports suggest the company knew of the breach in 2015. The Intercept published allegations that more than 30 million users were affected in March 2017. Further, as Daily Beast reporter Spencer Ackerman put it on Twitter, “Zero acknowledgement from Facebook throughout 2016 & 2017 that anything was untoward about its relationship with the Trump campaign, despite reportedly knowing in August 2016 that Trump-camp partner Cambridge Analytica possessed tens of millions of illicitly acquired profiles.” When confronted with proof of the breach, the New York Times says “Facebook downplayed the scope of the leak and questioned whether any of the data still remained out of its control.” Why did the company behave this way?

2. Did the Trump campaign or Cambridge Analytica violate campaign finance laws?

The New York Times report states that “whether the [SCL’s] American ventures violated election laws would depend on foreign employees’ roles in each campaign, and on whether their work counted as strategic advice under Federal Election Commission rules.” Bizarrely, Cambridge Analytica told the Times that Nix never had any strategic role in the Trump campaign, despite mountains of evidence he led the company’s efforts. There is plenty of video of him discussing what Cambridge Analytica did for Donald Trump. It beggars belief that he would claim otherwise. Did Cambridge Analytica sell its services to the Trump campaign for fair market value or, by contrast, was it employing its data in its own direct-voter-contact advertising, for example, on behalf of the Mercers? The latter would trigger federal election laws that restrict the participation of foreign nationals in certain kinds of decision-making.

3. Did Trump campaign or Cambridge Analytica employees lie to Congress, or to the British Parliament?

Steve Bannon served on the board of Cambridge Analytica. Jared Kushner and Brad Parscale have each been credited with playing a role in the campaign’s data strategy–and Kushner boasted of his direct involvement and work with Cambridge Analytica. What did they know about Cambridge Analytica’s methodology, and were they at any point aware that the firm was trading on stolen data? Was Bannon, Kushner, and Parscale’s testimony to Congress accurate?

Further, did Alexander Nix lie to Congress or the British Parliament? The New York Times reports, as one example, that “while Mr. Nix has told lawmakers that the company does not have Facebook data, a former employee said that he had recently seen hundreds of gigabytes on Cambridge servers, and that the files were not encrypted.” Will members of Congress raise concerns if they were misled? Will British political representatives be willing to consider whether Nix’s testimony constituted contempt of Parliament?

4. Did Facebook’s failure to disclose this breach to the public and notify its directly affected consumers break any laws?

According to the National Conference of State Legislatures, 48 states have laws regarding notification in the event of breaches. Did Facebook fail to satisfy any of these laws, or any federal statute? The company has had the opportunity to disclose the breach several times, including ahead of and during testimony to lawmakers last year. Facebook says that it is “completely false” to say this involved a “data breach,” but is the company so certain that personal information was not part of the breach? Importantly, did the breach involve personal information of third parties (such as friends of the users who directly interacted with the app)? And is the company claiming the breach does not involve stolen sensitive information using some unusual definition of the word “stolen”? The Attorney General of Massachusetts has already announced that she will launch an investigation.

5. Did any of the Facebook embeds in the Trump campaign know that stolen data was being used for targeting?

A study last year in the journal Political Communication detailed the extent to which Facebook, like other technology companies, went well beyond “promoting their services and facilitating digital advertising buys” to “actively shaping campaign communication through their close collaboration with political staffers.” A 60 Minutes profile of Trump campaign digital director Brad Parscale focused on how Facebook embedded employees to help the campaign use its platform. Did any of these embedded Facebook employees know that the campaign was using stolen Facebook data in its models?

6. Did Facebook have evidence its own employees mishandled this situation? Was any disciplinary action taken?

These events unfolded over the course of years, and while the company is adamant that it has taken steps to ensure its policies are enforced, it raises the question: was anyone at any level of the company disciplined over a breach that saw information about 50 million Americans used for political purposes without their permission? What specifically has the company done to change its policies, access to its data and internal security training to ensure nothing like this can happen again?

7. Did other organizations or individuals exploit these apparent weaknesses, and are there other breaches we do not know about?

Given the number of times that Facebook has said things that turned out to be incomplete or false–such as the ever-expanding disclosure of the number of Americans affected by the Russian disinformation campaign in the 2016 election–why should we believe that this is the only breach of this kind that occurred? It is impossible to know how much Facebook user data has been sold, traded or is just sitting on various third party servers. Think of all the old Facebook games and apps, or any other third party use of Facebook user authentication. It is hard to imagine this is the only incident. How can the company and its senior leadership maintain public trust, and why do they deserve it?

***

News of the Facebook data breach broke just hours after Cambridge Analytica was served with papers in a lawsuit by an American professor, David Carroll, who is seeking more information about how his data was handled via mechanisms available to users to ask such questions under British law. In the United States, there are no similar rules. It is time to seriously question how best to regulate technology companies to give citizens and governments the means to defend themselves from such breaches, and to better understand how data, dark money and politics combine to influence citizens and undermine democracy.

Journalists, regulators, and lawmakers should start asking tough questions. Senator Mark Warner of Virginia, the ranking member of the Senate Intelligence Committee, has asked the CEOs of the technology companies to testify before the 2018 midterm elections. Will it be too late?
https://www.justsecurity.org/54045/foll ... ve-breach/


Cambridge Analytica: links to Moscow oil firm and St Petersburg university

Emma Graham-Harrison Sat 17 Mar 2018 17.59 EDT
Data company gave briefing to Moscow firm Lukoil, and the lecturer who developed the crucial algorithm worked for St Petersburg university

The Twelve colleges building of St Petersburg university Russia
St Petersburg State University, where Aleksandr Kogan taught and received research grants. Photograph: Alamy
Aleksandr Kogan, the Cambridge University academic who orchestrated the harvesting of Facebook data, had previously unreported ties to a Russian university, including a teaching position and grants for research into the social media network, the Observer has discovered. Cambridge Analytica, the data firm he worked with – which funded the project to turn tens of millions of Facebook profiles into a unique political weapon – also attracted interest from a key Russian firm with links to the Kremlin.

Energy firm Lukoil, which is now on the US sanctions list and has been used as a vehicle of government influence, saw a presentation on the firm’s work in 2014. It began with a focus on voter suppression in Nigeria, and Cambridge Analytica also discussed “micro-targeting” individuals on social media during elections.

The revelations come at a time of intense US scrutiny of Russian meddling in the 2016 US presidential election, with 13 Russians criminally charged last month with interfering to help Donald Trump.

In Britain, concerns about Russian propaganda have been mounting, with the prime minister, Theresa May, recently attacking Russia for spreading fake news, accusing Moscow of attempts to “weaponise information” and influence polls.

Lukoil, Russia’s second-largest oil company, discussed with Cambridge Analytica the data company’s powerful social media marketing system, which was already being deployed for Republican Ted Cruz in the US presidential primaries and was later used to back Brexit and Trump.

Alexander Nix, chief executive of Cambridge Analytica, emailed colleagues after initial contacts to say that Lukoil wanted a clearer explanation of “how our services are going to apply to the petroleum business”.

“They understand behavioural micro-targeting in the context of elections (as per your excellent document/white paper) but they are failing to make the connection between voters and their consumers,” he wrote in an email seen by the Observer.

A slide presentation prepared for the Lukoil pitch focuses first on election disruption strategies used by Cambridge Analytica’s parent company, SCL, in Nigeria. They are presented under the heading “Election: Inoculation”, a military term used in “psychological operations” and disinformation campaigns. Other SCL documents show that the material shared with Lukoil included posters and videos apparently aimed at alarming or demoralising voters, including warnings of violence and fraud.

Discussion of services offered by Cambridge Analytica was apparently going right to the top of Lukoil, even though its retail operations in America are a very minor corner of the oil and gas giant’s empire. Asking for a detailed presentation of Cambridge Analyticas’s work in July 2014, Nix told his colleague the document would be “shared with the CEO of the business”.

The chief executive of Lukoil, Vagit Alekperov, is a former Soviet oil minister who has said the strategic aims of Lukoil are closely aligned with those of Russia. “I have only one task connected with politics, to help the country and the company. I’m not close to Mr Putin, but I treat him with great respect,” he told the New York Times.

Cambridge Analytica said an affiliate company had talked to Lukoil Turkey about a loyalty card scheme and proposed a pilot study with a small number of petrol stations there, but the project had not gone ahead. “[The talks] were about potential commercial work in Turkey and did not involve any discussion of political work,” a spokesman said. “Cambridge Analytica and its affiliate companies have not worked in Russia and have not worked for a Russian company or organisation.”

Last month Nix told MPs: “We have never worked with a Russian organisation in Russia or any other company. We do not have any relationship with Russia or Russian individuals.”

That appears to contradict the company documents seen by the Observer, which list Russia as one of the countries where Cambridge Analytica and affiliate companies have clients.

Christopher Wylie, the whistleblower who has come forward to talk to the Observer, said it was never entirely clear what the Russian firm hoped to get from the operation.

“Alexander Nix’s presentation didn’t make any sense to me,” said Wylie, who left Cambridge Analytica soon after the initial meetings. “If this was a commercial deal, why were they so interested in our political targeting?”

Lukoil did not respond to requests for comments.

Kogan, a lecturer who worked with Cambridge Analytica on building up the database of US voters then at the heart of the company’s plans, said he had not had any connection to the Lukoil pitch.

But while he was helping turn Facebook profiles into a political tool he was also an associate professor at St Petersburg State University, taking Russian government grants to fund other research into social media. “Stress, health, and psychological wellbeing in social networks: cross-cultural investigation” was the title of one piece of research. Online posts showed Kogan lecturing in Russian. One talk was called: “New methods of communication as an effective political instrument”.

Cambridge University said academics are allowed to take on outside work but are expected to inform their head of institution, a rule Kogan had complied with. “We understand that Dr Kogan informed his head of department of discussions with St Petersburg University regarding a collaboration; it was understood that this work and any associated grants would be in a private capacity,” a spokesman said.

Apart from that, Kogan appears to have largely kept the work private. Colleagues said they had not heard about the post in St Petersburg. “I am very surprised by that. No one knew,” one academic who asked not to be named told the Observer. Russia is not mentioned in a 10-page CV Kogan posted on a university website in 2015. The CV lists undergraduate prizes and grants of a few thousand dollars and links to dozens of media interviews.

One Cambridge Analytica employee mentioned Kogan’s Russian work in an email to Nix in March 2014 discussing a pitch to a Caribbean nation for a security contract, including “criminal psychographic profiling via intercepts”.

“We may want to either loop in or find out a bit more about the interesting work Alex Kogan has been doing for the Russians and see how/if it applies,” the colleague wrote.

Kogan told the Observer: “Nothing I did on the Russian project was at all related to Cambridge Analytica in any way. No data or models.” His recollection was that the Russia project had started a year after his collaboration with Cambridge Analytica ended.

He said the St Petersburg position emerged by chance on a social visit. A native Russian speaker, Kogan was born in Moldova and brought up in Moscow until he was seven, when his family emigrated to the US, where he later obtained citizenship.

However, he stayed in touch with family friends in Russia and visited regularly. On one trip, he said, he “dropped an email” to the psychology department at St Petersburg.

“We met, had a nice chat, and decided let’s try to collaborate – give me more reason to visit there,” he told the Observer in an email.
https://www.theguardian.com/news/2018/m ... are_btn_tw



This April 2017 thread describes how Joseph Chancellor — Aleksandr Kogan's main collaborator on the Cambridge Analytica/SCL project that harvested Facebook data from unwitting users — was later hired by Facebook.

After he helped steal your data.

... from Facebook.


PSYCHOLOGICAL WARFARE

Posted on September 5, 2017 by Zev Shalev and Tracie McElroy

THE SECRET STORY OF HOW AMERICAN MINDS WERE MANIPULATED DURING THE 2016 ELECTION CAMPAIGN AND HOW DONALD TRUMP’S BACKERS ARE MAKING MONEY OFF IT.
Psychologist Michal Kosinski switched on the TV in his Zurich hotel room on the morning of November ninth. The headlines were all about Donald Trump’s against-all-odds win in the U.S. presidential election. The news filled the Polish-born academic with dread, Kosinski told Das Magazine. The 34-year-old psychologist was sure of one thing: the scientific model he had created helped elect Trump.

Kosinski is a pioneer in the field of psychometry. It’s the science of predicting behavior using a psychological profile gleaned from social media activity. There are five criteria, known as the OCEAN model, which analysts use to define your personality.

Openness to new ideas and learning.
Conscientiousness is a measure of your reliability, promptness and organization.
Extraversion measures openness and sociability.
Agreeableness refers to your kindness, empathy and warmth.
Neuroticism measures your emotional stability and rate of negative emotions.
In a 2012 study, Kosinski and two colleagues at Cambridge University were able to predict a person’s race, political affiliation and sexual orientation by analyzing their social media activity.

While at Cambridge, Kosinski met fellow researcher Aleksandr Kogan. Russian-born Kogan asked about using Kosinski’s method for election manipulation. “The whole thing began to stink,” Kosinski told German Das Magazine.

Kosinski wanted nothing to do with using his research for malevolent means and broke off contact with Kogan. Since then, Kosinski has travelled the world lecturing about the dangers of ‘big data’ manipulation warning it can “threaten the welfare, freedom, or even the lives of men.”

Kogan, who has since changed his name to Alexandr Specter, had very different plans for his colleagues’ research.
Image
Dr. Michal Kosinski (left) met Dr. Aleksandr Spectre (middle) at Cambridge University where they both studied Psychometry. Spectre launched a research study of American voters using Facebook for the parent company of Cambridge Analytica, whose CEO is Alexander Nix (right).

After the invasion of Ukraine, a Russian-born researcher at Cambridge University began harvesting the private data of 30 million U.S. Facebook users for Cambridge Analytica.

In 2014, Kogan founded “Global Science Research”, which was funded by London-based Scientific Communications Laboratory (SCL) and its subsidiary Cambridge Analytica. Kogan’s company hired 185,000 Americans to take an online survey and download a Facebook app. Once downloaded, the app gave Kogan information on 30 million unsuspecting Facebook users who were Facebook friends with the original survey-takers, according to The Intercept. Kogan has used Kosinski’s model to build an algorithm to profile American voters.

In 2015, Kogan’s app was shut down because of complaints about privacy concerns. By then Kogan had gathered data on millions of Americans. “No one in this larger group of 30 million knew that “likes” and demographic data from their Facebook profiles were being harvested by political operatives hired to influence American voters,” according to The Intercept.

SCL has since distanced itself from the use of this data and Kogan’s GSR. “We do not do any work with Facebook likes,” Cambridge Analytica spokesman Lindsey Platts, said in a statement. The company “has no relationship with GSR,” Platts said.
Image

Cambridge Analytica CEO Alexander Nix explains how his company can profile and target any voter in the U.S.

The Trump campaign paid Cambridge Analytica $6 million for a database of 220 million U.S. voters, including private details of their personalities and activities.

In 2016, Cambridge Analytica, under the leadership of CEO Alexander Nix and vice-president Stephen Bannon, helped manipulate the British electorate to vote for Brexit. The previously unknown company had achieved the unthinkable in breaking up the E.U. and then began to turn its attention to U.S. politics.


Donald Trump even tweeted on Aug 18:
Image

Every time you take one of those Facebook quizzes (you know the ones: Which celebrity are you most like? Which Game of Thrones character are you? What country are you really from?), you may think you’re taking an innocuous quiz, but what you’re really doing is giving a data-mining firm like Cambridge Analytica “express consent” to profile you. The company then cross-references that information with other data and creates an OCEAN profile of you that is more accurate than one a partner or family member could provide.

Donald Trump paid Cambridge Analytica $6 million for work on his 2016 campaign. The company’s data scientists were able to identify Trump-leaning voters who were “very low in neuroticism, quite low in openness and slightly conscientious.” Once a voter was profiled, Cambridge Analytica could target his or her specific interests with tailor-made election messages and specific news content (real or fake) designed to work on that voter’s emotions, interests and regional location.
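Mechanically, "figuring out" such a segment is little more than filtering a scored table. A hedged sketch, assuming a table of per-voter OCEAN scores already exists; the scores, thresholds and column names are invented:

# Illustrative only: selecting a persuadable segment from hypothetical OCEAN scores.
import pandas as pd

voters = pd.DataFrame({
    "voter_id":          [101, 102, 103, 104],
    "neuroticism":       [0.15, 0.80, 0.10, 0.40],
    "openness":          [0.20, 0.70, 0.25, 0.90],
    "conscientiousness": [0.55, 0.30, 0.60, 0.45],
})

# "Very low in neuroticism, quite low in openness and slightly conscientious."
segment = voters[
    (voters["neuroticism"] < 0.25)
    & (voters["openness"] < 0.35)
    & (voters["conscientiousness"].between(0.5, 0.7))
]
print(segment["voter_id"].tolist())   # [101, 103] -> voters to get the tailored message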

Think of it as a digital soapbox where the speaker is digitally crafted to appeal to your every fear and desire and to pinpoint exactly where you live. Take, for example, Donald Trump’s focus on the opioid crisis and immigration during the 2016 election campaign. All of those messages were designed to surgically target voters with these seemingly “local” concerns in important swing states.




As Russian hackers targeted state voter rolls, Cambridge Analytica provided the Trump campaign with private voting histories.

Cambridge Analytica assembled a database for the Trump campaign of 220 million voters, known as Project Alamo. The data set included voter registration records, gun ownership records, credit card purchase histories, internet viewing history, car purchases and information from data-mining companies like Experian PLC, Datalogix, Epsilon and Acxiom Corporation.
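Assembling a database like Project Alamo mostly amounts to joining vendor feeds on a common voter key. A toy sketch of that merge, with made-up records standing in for the real voter file and commercial appends:

# Toy sketch of merging a voter file with commercial and modeled data.
import pandas as pd

voter_file = pd.DataFrame({
    "voter_id": [1, 2, 3],
    "registered_party": ["R", "D", "I"],
    "last_voted": [2014, 2012, 2016],
})

commercial_append = pd.DataFrame({          # stand-in for Experian/Acxiom-style feeds
    "voter_id": [1, 2, 3],
    "gun_owner": [True, False, False],
    "credit_segment": ["subprime", "prime", "prime"],
})

personality_model = pd.DataFrame({          # stand-in for modeled OCEAN scores
    "voter_id": [1, 2, 3],
    "neuroticism": [0.2, 0.7, 0.4],
})

alamo = (voter_file
         .merge(commercial_append, on="voter_id")
         .merge(personality_model, on="voter_id"))
print(alamo)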

Theresa Hong was a key digital officer for the Trump campaign. Hong told the BBC the data-set also included information about voting history. “Some of the attributes would be when was the last time they voted, who did they vote for.” In response to a question on how Cambridge Analytica would know all that information, Hong replied, “that’s their secret sauce.”

You’ll recall Russian hackers were able to infiltrate voter rolls in almost every state before the election. “We had to assume that they actually tried to at least rattle the doorknobs on all 50, and we just happened to find them in a few of them,” Michael Daniel, who led the Obama White House effort to secure the vote against Russian intrusions, told Time Magazine.

Add to this, a mysterious server in Trump Tower which is suspected of compiling a massive database of hacked voter information with Russia’s Alfa Bank and the DeVos family-owned Spectrum Health, and you begin to see a clearer picture of how Cambridge Analytica may have gotten all this data. Cambridge Analytica’s parent company was until recently owned by a British-Iranian businessman with ties to Putin-linked Ukrainian oligarch Dmitry Firtash who is wanted for bribery by the U.S. and was allegedly involved in a racketeering scheme with Paul Manafort.

From their digital headquarters in San Antonio, Texas, the Trump campaign placed between 70,000 and 175,000 different pieces of content on Facebook every day, specifically targeting profiles provided by Cambridge Analytica. Trump’s digital advertising chief, Gary Coby, likened the effort to “high-frequency trading” and said Trump used Facebook “like no one else in politics has ever done.” In fact, all of Trump’s $85 million digital budget went into Facebook coffers.
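The daily volume is less mysterious once you see that most of those "pieces of content" are machine-generated permutations of a few headlines, images, calls to action and audiences, tested against one another. A small illustrative sketch of that combinatorial explosion (every name below is invented):

# Illustrative: a handful of creative elements multiplies into many ad variants.
from itertools import product

headlines = ["Headline A", "Headline B", "Headline C"]
images = ["img_1.jpg", "img_2.jpg"]
calls_to_action = ["Learn More", "Donate"]
audiences = ["segment_low_neuroticism", "segment_swing_state_rural"]  # invented names

variants = [
    {"headline": h, "image": i, "cta": c, "audience": a}
    for h, i, c, a in product(headlines, images, calls_to_action, audiences)
]
print(len(variants))  # 3 * 2 * 2 * 2 = 24 variants from just nine creative elements

Scale the element counts up modestly and the variant count quickly reaches the tens of thousands the campaign reported placing each day.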

Much of the targeted content that was delivered to voters was news created by Cambridge Analytica’s sister company Breitbart and by Russian-sponsored sites like Wikileaks, Sputnik and Russia Today. The FBI is investigating the role these news organizations played in Russia’s interference in the U.S. elections, according to McClatchy.

Donald Trump’s key donor Robert Mercer part-owns Cambridge Analytica; Stephen Bannon, Trump’s former senior strategist, was VP of CA; and Michael Flynn, former National Security Adviser, was an adviser to the company.
Image

The Trump administration inked a $496,000 deal with Cambridge Analytica’s parent company SCL while Michael Flynn and Stephen Bannon were still in office, which presents a possible conflict of interest.

Cambridge Analytica is partly owned by New York hedge fund manager and major Trump donor Robert Mercer, according to Politico. Donald Trump’s former senior adviser Stephen Bannon was until recently the company’s vice-president and owned a stake in the company which could be worth as much as $5 Million. Trump’s former National Security Adviser Lt-Gen Michael Flynn, a compromised Russian asset, also disclosed he played an advisory role for Cambridge Analytica.

On February 17, the Trump Administration paid $496,000 upfront to Cambridge Analytica’s parent company SCL in a contract with the U.S. State Department. SCL’s role at the State Department is to “assess the impact of foreign propaganda campaigns and provide intelligence agencies with predictions and insight on emerging threats,” according to the Washington Post.

SCL is also working on a deal with the Pentagon to teach it “how to conduct effective psychological operations,” says the Post. SCL has hired new staffers and opened a new office just up the street from the White House. SCL’s effort is being driven by a “former aide to now-departed national security adviser Michael Flynn, who served as an adviser to the company in the past,” says the Post.

Considering the senior roles Mercer, Bannon and Flynn held within Cambridge Analytica, the new Washington operation of SCL and the State Department deal could well indicate a conflict of interest.

More importantly, the way Cambridge Analytica gained access to some 30 million Facebook accounts without users’ consent, along with private voting records, raises serious privacy concerns.

The office of Special Counsel Robert Mueller is investigating all aspects of a possible conspiracy between Russia and the Trump campaign to sway the election. “That includes investigating the nature of any links between individuals associated with the Trump campaign and the Russian government and whether there was any coordination between the campaign and Russia’s efforts,” Former FBI Director James Comey testified before a congressional committee in March.





seemslikeadream » Thu Mar 30, 2017 7:33 pm wrote:
FACEBOOK FAILED TO PROTECT 30 MILLION USERS FROM HAVING THEIR DATA HARVESTED BY TRUMP CAMPAIGN AFFILIATE
Mattathias Schwartz
March 30 2017, 1:01 p.m.
IN 2014, TRACES of an unusual survey, connected to Facebook, began appearing on internet message boards. The boards were frequented by remote freelance workers who bid on “human intelligence tasks” in an online marketplace, called Mechanical Turk, controlled by Amazon. The “turkers,” as they’re known, tend to perform work that is rote and repetitive, like flagging pornographic images or digging through search engine results for email addresses. Most jobs pay between 1 and 15 cents. “Turking makes us our rent money and helps pay off debt,” one turker told The Intercept. Another turker has called the work “voluntary slave labor.”

The task posted by “Global Science Research” appeared ordinary, at least on the surface. The company offered turkers $1 or $2 to complete an online survey. But there were a couple of additional requirements as well. First, Global Science Research was only interested in American turkers. Second, the turkers had to download a Facebook app before they could collect payment. Global Science Research said the app would “download some information about you and your network … basic demographics and likes of categories, places, famous people, etc. from you and your friends.”

“Our terms of service clearly prohibit misuse,” said a spokesperson for Amazon Web Services, by email. “When we learned of this activity back in 2015, we suspended the requester for violating our terms of service.”

Although Facebook’s early growth was driven by closed, exclusive networks at colleges and universities, it has gradually herded users to agree to increasingly permissive terms of service. By 2014, anything a user’s friends could see was also potentially visible to the developers of any app that they chose to download. Some of the turkers noticed that the Global Science Research app appeared to be taking advantage of Facebook’s porousness. “Someone can learn everything about you by looking at hundreds of pics, messages, friends, and likes,” warned one, writing on a message board. “More than you realize.” Others were more blasé. “I don’t put any info on FB,” one wrote. “Not even my real name … it’s backwards that people put sooo much info on Facebook, and then complain when their privacy is violated.”

In late 2015, the turkers began reporting that the Global Science Research survey had abruptly shut down. The Guardian had published a report that exposed exactly who the turkers were working for. Their data was being collected by Aleksandr Kogan, a young lecturer at Cambridge University. Kogan founded Global Science Research in 2014, after the university’s psychology department refused to allow him to use its own pool of data for commercial purposes. The data collection that Kogan undertook independent of the university was done on behalf of a military contractor called Strategic Communication Laboratories, or SCL. The company’s election division claims to use “data-driven messaging” as part of “delivering electoral success.”

SCL has a growing U.S. spin-off, called Cambridge Analytica, which was paid millions of dollars by Donald Trump’s campaign. Much of the money came from committees funded by the hedge fund billionaire Robert Mercer, who reportedly has a large stake in Cambridge Analytica. For a time, one of Cambridge Analytica’s officers was Stephen K. Bannon, Trump’s senior adviser. Months after Bannon claimed to have severed ties with the company, checks from the Trump campaign for Cambridge Analytica’s services continued to show up at one of Bannon’s addresses in Los Angeles.

“You can say Mr. Mercer declined to comment,” said Jonathan Gasthalter, a spokesperson for Robert Mercer, by email.

Facebook Elections signs in the media area at Quicken Loans Arena in Cleveland, Aug. 6, 2015, before the first Republican presidential debate of the 2016 election. Photo: John Minchillo/AP
The Intercept interviewed five individuals familiar with Kogan’s work for SCL. All declined to be identified, citing concerns about an ongoing inquiry at Cambridge and fears of possible litigation. Two sources familiar with the SCL project told The Intercept that Kogan had arranged for more than 100,000 people to complete the Facebook survey and download an app. A third source with direct knowledge of the project said that Global Science Research obtained data from 185,000 survey participants as well as their Facebook friends. The source said that this group of 185,000 was recruited through a data company, not Mechanical Turk, and that it yielded 30 million usable profiles. No one in this larger group of 30 million knew that “likes” and demographic data from their Facebook profiles were being harvested by political operatives hired to influence American voters.

Kogan declined to comment. In late 2014, he gave a talk in Singapore in which he claimed to have “a sample of 50+ million individuals about whom we have the capacity to predict virtually any trait.” Global Science Research’s public filings for 2015 show the company holding 145,111 British pounds in its bank account. Kogan has since changed his name to Spectre. Writing online, he has said that he changed his name to Spectre after getting married. “My wife and I are both scientists and quite religious, and light is a strong symbol of both,” he explained.

The purpose of Kogan’s work was to develop an algorithm for the “national profiling capacity of American citizens” as part of SCL’s work on U.S. elections, according to an internal document signed by an SCL employee describing the research.

“We do not do any work with Facebook likes,” wrote Lindsey Platts, a spokesperson for Cambridge Analytica, in an email. The company currently “has no relationship with GSR,” Platts said.

“Cambridge Analytica does not comment on specific clients or projects,” she added when asked whether the company was involved with Global Science Research’s work in 2014 and 2015.

The Guardian, which was the first to report on Cambridge Analytica’s work on U.S. elections in late 2015, noted that the company drew on research “spanning tens of millions of Facebook users, harvested largely without their permission.” Kogan disputed this at the time, telling The Guardian that his turker surveys had collected no more than “a couple of thousand responses” for any one client. While it is unclear how many responses Global Science Research obtained through Mechanical Turk and how many it recruited through a data company, all five of the sources interviewed by The Intercept confirmed that Kogan’s work on behalf of SCL involved collecting data from survey participants’ networks of Facebook friends, individuals who had not themselves consented to give their data to Global Science Research and were not aware that they were the objects of Kogan’s study. In September 2016, Alexander Nix, Cambridge Analytica’s CEO, said that the company built a model based on “hundreds and hundreds of thousands of Americans” filling out personality surveys, generating a “model to predict the personality of every single adult in the United States of America.”

Shortly after The Guardian published its 2015 article, Facebook contacted Global Science Research and requested that it delete the data it had taken from Facebook users. Facebook’s policies give Facebook the right to delete data gathered by any app deemed to be “negatively impacting the Platform.” The company believes that Kogan and SCL complied with the request, which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.

In public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. “Our investigation to date has not uncovered anything that suggests wrongdoing,” a Facebook spokesperson told The Intercept.

Facebook appears not to have considered Global Science Research’s data collection to have been a serious ethical lapse. Joseph Chancellor, Kogan’s main collaborator on the SCL project and a former co-owner of Global Science Research, is now employed by Facebook Research. “The work that he did previously has no bearing on the work that he does at Facebook,” a Facebook spokesperson told The Intercept.

Chancellor declined to comment.

Cambridge Analytica has marketed itself as classifying voters using five personality traits known as OCEAN — Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism — the same model used by University of Cambridge researchers for in-house, non-commercial research. The question of whether OCEAN made a difference in the presidential election remains unanswered. Some have argued that big data analytics is a magic bullet for drilling into the psychology of individual voters; others are more skeptical. The predictive power of Facebook likes is not in dispute. A 2013 study by three of Kogan’s former colleagues at the University of Cambridge showed that likes alone could predict race with 95 percent accuracy and political party with 85 percent accuracy. Less clear is their power as a tool for targeted persuasion; Cambridge Analytica has claimed that OCEAN scores can be used to drive voter and consumer behavior through “microtargeting,” meaning narrowly tailored messages. Nix has said that neurotic voters tend to be moved by “rational and fear-based” arguments, while introverted, agreeable voters are more susceptible to “tradition and habits and family and community.”
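Read literally, Nix's claim reduces the "microtargeting" step to a lookup from a voter's dominant trait to a message framing. The rules below are a hypothetical paraphrase of his description, not anything Cambridge Analytica has published:

# Hypothetical sketch of trait-based message framing, paraphrasing Nix's description.
def pick_framing(ocean_scores: dict) -> str:
    """Map a voter's OCEAN scores to a message framing (illustrative rules only)."""
    if ocean_scores["neuroticism"] > 0.7:
        return "rational, fear-based argument"
    if ocean_scores["agreeableness"] > 0.7 and ocean_scores["extroversion"] < 0.4:
        return "tradition, habits, family and community"
    return "generic persuasion message"

print(pick_framing({"neuroticism": 0.8, "agreeableness": 0.5, "extroversion": 0.5}))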

Dan Gillmor, director of the Knight Center at Arizona State University, said he was skeptical of the idea that the Trump campaign got a decisive edge from data analytics. But, he added, such techniques will likely become more effective in the future. “It’s reasonable to believe that sooner or later, we’re going to see widespread manipulation of people’s decision-making, including in elections, in ways that are more widespread and granular, but even less detectable than today,” he wrote in an email.

Donald Trump throws a hat to supporters during a campaign rally on Sept. 15, 2015, in Los Angeles. Photo: Justin Sullivan/Getty Images
Trump’s circle has been open about its use of Facebook to influence the vote. Joel Pollak, an editor at Breitbart, writes in his campaign memoir about Trump’s “armies of Facebook ‘friends,’ … bypassing the gatekeepers in the traditional media.” Roger Stone, a longtime Trump adviser, has written in his own campaign memoir about “geo-targeting” cities to deliver a debunked claim that Bill Clinton had fathered a child out of wedlock, and narrowing down the audience “based on preferences in music, age range, black culture, and other urban interests.”

Clinton, of course, had her own analytics effort, and digital market research is a normal part of any political campaign. But the quantity of data compiled on individuals during the run-up to the election is striking. Alexander Nix, head of Cambridge Analytica, has claimed to “have a massive database of 4-5,000 data points on every adult in America.” Immediately after the election, the company tried to take credit for the win, claiming that its data helped the Trump campaign set the candidate’s travel schedule and place online ads that were viewed 1.5 billion times. Since then, the company has been de-emphasizing its reliance on psychological profiling.

The Information Commissioner’s Office, an official privacy watchdog within the British government, is now looking into whether Cambridge Analytica and similar companies might pose a risk to voters’ rights. The British inquiry was triggered by reports in The Observer of ties between Robert Mercer, Cambridge Analytica, and the Leave.EU campaign, which worked to persuade British voters to leave the European Union. While Nix has previously talked about the firm’s work for Leave.EU, Cambridge Analytica now denies that it had any paid role in the campaign.

Leave.EU signage is displayed in London on March 5, 2016. Photo: Rex Features/AP Images
In the U.S., where privacy laws are looser, there is no investigation. Cambridge Analytica is said to be pitching its products to several federal agencies, including the Joint Chiefs of Staff. SCL, its parent company, has new offices near the White House and has reportedly been advised by Gen. Michael Flynn, Trump’s former national security adviser, on how to increase its federal business. (A spokesperson for Flynn denied that he had done any work for SCL.)

Years before the arrival of Kogan’s turkers, Facebook founder Mark Zuckerberg tried to address privacy concerns around the company’s controversial Beacon program, which quietly funneled data from outside websites into Facebook, often without Facebook users being aware of the process. Reflecting on Beacon, Zuckerberg attributed part of Facebook’s success to giving “people control over what and how they share information.” He said that he regretted making Beacon an “opt-out system instead of opt-in … if someone forgot to decline to share something, Beacon went ahead and still shared it with their friends.”
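The opt-out versus opt-in distinction Zuckerberg is describing comes down to a single default. A minimal sketch of that design choice:

# The Beacon controversy, reduced to a default: what happens when the user says nothing?
from typing import Optional

def share_activity(user_response: Optional[bool], opt_in_required: bool = False) -> bool:
    """Return True if an activity gets shared with the user's friends.
    user_response is None when the user never answered the sharing prompt."""
    if opt_in_required:
        return user_response is True      # opt-in: silence means "do not share"
    return user_response is not False     # opt-out (Beacon's design): silence means "share"

print(share_activity(None))                        # True  -> shared by default
print(share_activity(None, opt_in_required=True))  # False -> nothing shared without consent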

Seven years later, Facebook appears to have made the same mistake, but with far greater consequences. In mid-2014, however, Facebook announced a new review process, where the company would make sure that new apps asked only for data they would actually use. “People want more control,” the company said at that time. “It’s going to make a huge difference with building trust with your app’s audience.” Existing apps were given a full year to switch over to have Facebook review how they handled user data. By that time, Global Science Research already had what it needed.
https://theintercept.com/2017/03/30/fac ... affiliate/



Robert Mercer 7 Billion Reasons to Steal an Election
viewtopic.php?f=8&t=40573


Image


Facebook Must Notify Users Who Had Data Stolen By Cambridge Analytica
Petition by Justin Hendrix

To be delivered to Mark Zuckerberg, CEO, Facebook

Facebook has revealed that in 2015 a University of Cambridge researcher, Aleksandr Kogan, passed user data to third parties, including SCL Group/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, in violation of the company's platform policies. Cambridge Analytica, Kogan and Eunoia certified to Facebook that they destroyed the data, but they did not. Three years later, the public is only now being made aware of this breach.

Cambridge Analytica was a contractor to President Donald Trump's 2016 election campaign. Indeed, its CEO, Alexander Nix, made efforts to contact Wikileaks founder Julian Assange to help distribute hacked DNC emails to help Donald Trump get elected. Because of the sensitive nature of these parties, and because it should be liable to its users for such breaches, Facebook has a duty to inform all users affected by this breach.

Notification should include all users whose data was directly or indirectly shared with these parties, and how that information was used to target them with political advertising by Cambridge Analytica, the Cruz or Trump campaigns, or any other entity that CA did business with.
https://petitions.moveon.org/show.html? ... est_group=


Why would a Russian oil company be targeting American voters?




Cambridge Analytica scrambles to halt Channel 4 exposé

Firm with links to Trump election under pressure after Facebook ban over data harvesting

Matthew Garrahan in New York and Hannah Kuchler in San Francisco
Cambridge Analytica, the data firm alleged to have used the personal information of millions of Facebook users without their knowledge in its work for Donald Trump’s election campaign, is trying to stop the broadcast of an undercover Channel 4 News report in which its chief executive talks unguardedly about its practices.

Channel 4 reporters posed as prospective clients and had a series of meetings with Cambridge Analytica that they secretly filmed — including at least one with Alexander Nix, its chief executive. Channel 4 declined to comment.

Mr Nix referred the FT to Cambridge Analytica’s spokesperson when asked if he was aware of the Channel 4 report, which is due to air this week, according to people briefed on the situation. Cambridge Analytica’s spokesman declined to comment on the undercover Channel 4 report.

The company is under mounting pressure over how it uses personal data in political and election campaign work. Facebook banned it on Friday, claiming it had violated the social network’s rules by failing to delete Facebook user data collected by an app for research purposes.

Press reports on Saturday claimed the company had harvested data from more than 50m profiles mostly belonging to US voters.

Christopher Wylie, a former Cambridge Analytica employee, showed documents to the New York Times, The Observer and Channel 4 News, which the news outlets said detailed a programme that used data from a survey without users’ permission. Some 270,000 users had granted permission for their data to be used for research purposes, not passed to a political data analytics firm, and they may have exposed data from their friends in the process.

Mr Nix recently told a parliamentary select committee that Cambridge Analytica did not use data from Facebook, including Facebook Likes, or any personality modelling.


Alexander Nix, Cambridge Analytica chief executive
Facebook said that in 2015 it discovered that Dr Aleksandr Kogan, a psychology professor at the University of Cambridge, had passed data collected by a personality prediction app that ran on Facebook to Cambridge Analytica.

The “this is your digital life” app was billed as a research app used by psychologists. Some 270,000 people downloaded the app, which used Facebook login details, and gave it consent to access data from their Facebook profiles, including their city, their likes and information about their friends.

In a statement, Cambridge Analytica said it believed Dr Kogan’s company Global Science Research was complying with UK law. It also said the data it had received from GSR was not used during its work in the 2016 US presidential election.

“GSR was a company led by a seemingly reputable academic at an internationally renowned institution who made explicit contractual commitments to us regarding its legal authority to license data to SCL Elections [Cambridge Analytica’s parent]. It would be entirely incorrect to attempt to claim that SCL Elections illegally acquired Facebook data,” the company wrote in a statement.

But a report in The Observer based on an interview with Mr Wylie, the Cambridge Analytica whistleblower, said the number of affected profiles was far larger and more than 50m. The New York Times reported that 30m profiles contained enough data to match users to other records and build psychographic profiles.

Even if Cambridge Analytica no longer holds any Facebook data, it may have used the data to build the algorithms and profiles it now sells to political campaigns around the world and businesses. The Financial Times is among the company’s former clients, having once used Cambridge Analytica for a market research project.
https://www.ft.com/content/7ed1572c-2aa ... 7563b0b0f4
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby seemslikeadream » Mon Mar 19, 2018 10:01 am


UC Irvine.
Nov 21, 2017
Kremlin Propagandist Boasted of His Hacking Efforts, Strongly Implied Colluding With Trump Team in Facebook Posts

A former Duma deputy with close ties to Putin made the posts less than a week after Donald Trump won the election.

On November 12th 2016, just days after Donald Trump was elected President, a Russian man named Konstantin Rykov posted on Facebook detailing how “Donald and I decided to free America and make it great again.” In a two-part series that reads like a fantasy novel, Rykov illustrated in detail the four-year effort to elect Trump.
Excerpts of the Facebook posts are below. Translations via Google Translate and Lorenz Cohen:

Part one

It’s time for wonderful stories. I’ll tell you about (now it’s possible) how Donald Trump and I decided to free America and make it great again. This took us as much as 4 years and 2 more days.
It all started at night from 6 to 7 November 2012.
[Trump] lifted his plane to the sky and flew between New York and DC, calling the whole world through his twitter — to start a march on Washington!
Without a moment’s thought, I wrote him a reply, which sounded like this in Russian: “I’m ready. What should I do?”
Suddenly! There was a thin squeak of warning in the DM.
It was a message from Donald Trump. More precisely a picture. In the picture he was sitting in the armchair of his jet, smiling cheerfully and showing me the thumb of his right hand.


Part two

What was our idea with Donald Trump?
For four years and two days .. it was necessary to get to everyone in the brain and grab all possible means of mass perception of reality. Ensure the victory of Donald in the election of the US President. Then create a political alliance between the United States, France, Russia (and a number of other states) and establish a new world order.
Our idea was insane, but realizable.
In order to understand everything for the beginning, it was necessary to “digitize” all possible types of modern man.
Donald decided to invite for this task — the special scientific department of the “Cambridge University.”
British scientists from Cambridge Analytica suggested making 5,000 existing human psychotypes — the “ideal image” of a possible Trump supporter. Then .. put this image back on all psychotypes and thus pick up a universal key to anyone and everyone.
Then it was only necessary to upload this data to information flows and social networks. And we began to look for those who would have coped with this task better than others.
At the very beginning, the brave and romantic [participants] were not very many: a pair of hacker groups, citizen journalists from WikiLeaks and political strategist Mikhail Kovalev.
The next step was to develop a system for transferring tasks and information, so that no intelligence and NSA could burn it.


Though the posts don’t directly confirm collusion with the Trump campaign, Rykov’s use of “we” and “our” strongly suggests some level of coordination. Short of accessing his Direct Messages, there is no way to verify the claim that Trump sent Rykov a private message on election night 2012. The picture that Rykov included, however, was not an original picture. Analyzing the source code of the Instagram photo confirms that Rykov posted the picture of Trump approximately six hours after Melania Trump posted it on her Twitter:

Image: Instagram source code
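For readers curious what "analyzing the source code" of a post involves, here is a heavily hedged sketch: fetch the page and look for an embedded Unix timestamp. The "taken_at_timestamp" field name is an assumption about how Instagram pages exposed post metadata at the time; the markup has changed repeatedly since.

# Hedged sketch: estimating when a photo was posted by inspecting page source.
# The "taken_at_timestamp" field is an assumption, not a documented, stable API.
import re
import urllib.request
from datetime import datetime, timezone

def posted_at(url: str):
    """Return the post's UTC timestamp if a Unix-time field is found in the page source."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    match = re.search(r'"taken_at_timestamp"\s*:\s*(\d+)', html)
    if match is None:
        return None
    return datetime.fromtimestamp(int(match.group(1)), tz=timezone.utc)

# Comparing the two posts' UTC timestamps establishes which account posted first, e.g.:
# posted_at("https://www.instagram.com/p/EXAMPLE/")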

Though this was not an original image, it is not out of the realm of possibility that Trump sent DMs of the picture to those who were praising him on Twitter. The mention of working with WikiLeaks is especially interesting, given the fact that President Trump’s CIA Director Mike Pompeo has said that WikiLeaks is a “hostile intelligence service” acting as an arm of the Kremlin. A year after the election, it was revealed that Donald Trump Jr. was in contact with WikiLeaks and aided the organization in spreading the contents of the Russian hacking efforts.
Throughout 2015 and 2016, Rykov was a vocal supporter of Donald Trump. He set up a website dedicated to aggregating and promoting positive news on Trump, and was featured as the “Kremlin mouthpiece” in a conservative Washington Examiner piece titled “Putin Loves Donald Trump.” Then-candidate Trump retweeted the article.

Establishing Rykov’s proximity to the Kremlin

Image
Konstantin Rykov has a long and complicated history in Russian politics and popular culture. In the late 1990s and early 2000s, Rykov was something of an online pioneer, creating multiple websites focused on culture, news, and politics. In 2007, as a 28-year-old, Rykov was elected to the Duma as a deputy in Vladimir Putin’s United Russia party. The tech-savvy Rykov quickly became one of Russia’s most influential politicians. He has remained an ardent supporter of Putin, creating numerous websites dedicated to the Russian President. In The Net Delusion: The Dark Side of Internet Freedom, the renowned technology and politics scholar Evgeny Morozov explained that “Russian leaders follow [Rykov’s] lead.”
Rykov’s far-right vision is worldwide. He supported the Scottish independence effort in 2014, Brexit in the UK, Marine Le Pen in France and Donald Trump in the United States. His influential position in the Kremlin was further established in 2015, when Anonymous International released hacked text messages showing continual discussion between Rykov and Putin adviser Timur Prokopenko, the head of the Russian domestic affairs department. The two men discuss a quid pro quo between French far-right leader Marine Le Pen and the Kremlin: Le Pen was to recognize the Russian annexation of Crimea as legitimate in exchange for Russian money in the form of large loans. Le Pen denies knowing Rykov or Prokopenko and maintains her innocence, despite the fact that she did recognize Crimea and subsequently received over 40 million euros in loans from Moscow’s First Czech-Russian Bank. The cash-strapped National Front party had previously been denied loans by French banks.
Further eroding the credibility of Le Pen, an investigative report by the French newspaper Navalny revealed that Rykov owns a $2 million villa in the town of Mougins. He is listed as a tax resident of France, meaning that he must either permanently live in France, work there, or have his main economic interests in France. Rykov claims he lives in Moscow. He did not return requests for comment.

Rykov’s shared associates with Trump

Image

Yulya Alferova and Artem Klyushin with Donald Trump in Moscow, 2013
Rykov is one degree of separation away from Trump in two different areas. First, recall that during Trump’s trip to Moscow in 2013 he was seen with Yulya Alferova nearly every step of the way. She even claimed that Trump was running for President in January of 2015, months before he would announce his run. Alferova’s husband is tech entrepreneur Artem Klyushin who regularly corresponds with Rykov on social media. He was with Alferova and Trump throughout Trump’s time in Moscow. In March of 2016, Klyushin posted pictures with Rykov in what he called a “secret meeting.” The exact topic of the meeting remains unknown.
Secondly, former Fox News producer Jack Hanick was a special guest at a Trump election night party in Moscow organized by Rykov.

Hanick co-founded a conservative Orthodox TV channel in Russia called Tsargrad TV with Putin’s close confidant Konstantin Malofeev. Tsargrad was the only major channel to cover Carter Page’s full speech in Moscow in 2015, and the pro-Trump parties on election night and the Inauguration. Hanick has regularly appeared on his channel, defending the alt-right and denouncing “globalists.” A former associate of Hanick’s, who demanded anonymity, told me that Hanick still occasionally keeps in contact with his Fox News friends.

A regular commentator on Tsargrad is former Russian intelligence officer Leonid Reshetnikov. He is also the former head of the think tank Russian Institute for Strategic Studies (RISS). In April, Reuters reported that Reshetnikov’s RISS was the group responsible for drawing up the plan for the 2016 US election interference campaign.
The revelations about Konstantin Rykov, his confession of working with WikiLeaks and hacker groups, and his shared associations with the Kremlin and Donald Trump are troubling. He is yet another figure in an ever-expanding investigation into possible collusion between the Trump campaign and the Russian government.
https://medium.com/@ScottMStedman/kreml ... 05104965a1


Cambridge Analytica whistleblower says company worked with Corey Lewandowski, Steve Bannon
In a live interview with TODAY’s Savannah Guthrie, Christopher Wylie, a former employee of British company Cambridge Analytica, says the company misused personal Facebook data of some 50 million people to help influence the 2016 presidential election. Wylie says the company met with former Trump campaign manager (and current outside adviser) Corey Lewandowski, former chief strategist Steve Bannon as well as Russian oil companies.

https://www.today.com/video/cambridge-a ... 9326915651


Just to add to this, Bannon was on the (still cryptic) board of Cambridge Analytica. All of his companies—Breitbart, his shadowy production company Glittering Steel (which boosted the Uranium One conspiracy theory in their movie Clinton Cash)—shared a U.S. address with Cambridge.
Mazars and Deutsche Bank could have ended this nightmare before it started.
They could still get him out of office.
But instead, they want mass death.
Don’t forget that.
User avatar
seemslikeadream
 
Posts: 32090
Joined: Wed Apr 27, 2005 11:28 pm
Location: into the black
Blog: View Blog (83)

Re: The creepiness that is Facebook

Postby seemslikeadream » Tue Mar 20, 2018 10:36 am

Vladeck, now a professor at Georgetown Law, said violations of the consent decree could carry a penalty of $40,000 per violation, meaning that if news reports that the data of 50 million people were shared proves true, the company’s possible exposure runs into the trillions of dollars.


Facebook Could Face ‘Trillions’ In Fines: Is This The Beginning Of The End?
therealheisenberg / 1 day ago
Boy, it’s looking like Facebook might be severely fucked.

Shares are down sharply to start the week and as noted early Monday morning, some analysts are already out warning that the data breach fiasco has the potential to spiral out of control.

Image

I’m not seeing anything from the major banks yet, but you can bet some folks are panic-crunching the numbers this morning to try and come up with some kind of back-of-the-envelope calculations for the potential liability here. Raoul Pal’s got some thoughts:

Raoul Pal

@RaoulGMI
I’m wondering whether we are fast approaching the tipping point for $FB and $GOOGL ‘s fall from monopolistic power, as global regulators begin to take notice as to all the (inadvertent?) inappropriate behavior that has taken place. The short selling case is building fast...
3:48 PM - Mar 18, 2018





One person who thinks this could be really – really – bad is David Vladeck, former director of the FTC’s Bureau of Consumer Protection. He teaches at Georgetown Law now, and he spoke to the Washington Post for a piece out Sunday.

Although Facebook insists it didn’t violate a critical consent decree from 2011 that mandates users be informed and requires their permission before their data can be shared, David thinks they might be wrong. Here are a couple of excerpts from that WaPo piece:

Two former federal officials who crafted the landmark consent decree governing how Facebook handles user privacy say the company may have violated that decree when it shared information from tens of millions of users with a data analysis firm that later worked for President Trump’s 2016 campaign.

Such a violation, if eventually confirmed by the Federal Trade Commission, could lead to many millions of dollars in fines against Facebook, said David Vladeck, who as the director of the FTC’s Bureau of Consumer Protection oversaw the investigation of alleged privacy violations by Facebook and the subsequent consent decree resolving the case in 2011. He left that position in 2012.

On Sunday morning, Vladeck said in an interview with The Washington Post that Facebook’s sharing of data with Cambridge Analytica “raises serious questions about compliance with the FTC consent decree.”

[…]

Vladeck, now a professor at Georgetown Law, said violations of the consent decree could carry a penalty of $40,000 per violation, meaning that if news reports that the data of 50 million people were shared proves true, the company’s possible exposure runs into the trillions of dollars.

I’m no Facebook analyst, but I’m going to go out on a limb here and say that if they were fined “trillions of dollars” that might be the beginning of the end for the company.
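For what it's worth, the "trillions" figure is just Vladeck's per-violation penalty multiplied by the reported number of affected users:

# Back-of-the-envelope check of where the "trillions" figure comes from.
penalty_per_violation = 40_000        # dollars per violation, per Vladeck
affected_users = 50_000_000           # profiles reportedly shared

max_exposure = penalty_per_violation * affected_users
print(f"${max_exposure:,}")           # $2,000,000,000,000 -- two trillion dollars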

I mean look, obviously that’s so unlikely to happen that it’s probably not even worth pondering, but then again, given all the recent bad press that Facebook has garnered for its unwitting role in helping bad actors spread propaganda and misinformation, lawmakers are probably at wit’s end with the company at this juncture.

Here’s what Jessica Rich, former deputy director of the Bureau of Consumer Protection, who oversaw the FTC’s privacy program and also led the investigation into Facebook before the 2011 consent decree, said in an e-mail to WaPo:

Depending on how all the facts shake out, Facebook’s actions could violate any or all of [the consent decree] provisions, to the tune of many millions of dollars in penalties. They could also constitute violations of both US and EU laws. Facebook can look forward to multiple investigations and potentially a whole lot of liability here.

Long story short, this looks like a disaster in the making at the worst possible time and not to put too fine a point on it, but when combined with everything we now know about the extent to which the platform was hijacked for nefarious purposes ahead of the election (see our archive on that here), this casts considerable doubt on Mark Zuckerberg’s competency as someone who is capable of running a company that seems to have escaped the lab and is now running amok in the village.

***************

This morning, Sens. Amy Klobuchar, D-Minn., and John Kennedy, R-La., are out calling for Senate Judiciary Committee hearings during which lawmakers could publicly question technology company CEOs. Here’s the letter:

Dear Chairman Grassley:

We write to express serious concern regarding recent reports that data from millions of Americans was misused in order to influence voters, and to urge you to convene a hearing with the CEOs of major technology companies — including Facebook, Google, and Twitter — regarding the security of Americans’ data in light of this significant breach.

Reports indicate that private information from the Facebook profiles of more than 50 million users — representing nearly a quarter of potential U.S. voters in 2016 — was taken to conduct sophisticated psychological targeting for political ads in order to influence voters. The reports further indicate that Facebook knew about this breach more than two years ago and failed to acknowledge it and take swift and meaningful action.

While Facebook has pledged to enforce its policies to protect people’s information, questions remain as to whether those policies are sufficient and whether Congress should take action to protect people’s private information. The Committee considered similar cybersecurity issues in an October hearing featuring testimony from the former chairman and CEO of Equifax. We believe that the Committee should revisit these issues in light of recent events and upcoming elections.

Important questions also remain unanswered about the role of these technology companies in our democracy. Major social media platforms store an enormous amount of data and have a user base larger than all of the major broadcasting companies combined. The remarkable innovation that these companies have championed has changed how we share and collect information. In the process, Facebook, Google, and Twitter have amassed unprecedented amounts of personal data and use this data when selling advertising, including political advertisements. The lack of oversight on how data is stored and how political advertisements are sold raises concerns about the integrity of American elections as well as privacy rights.

Senators from both parties have called for more transparency and accountability from social media platforms in their efforts to guard against interference by foreign actors. Testimony before this Committee and others from current Administration officials, as well as former officials from the Administrations of President George W. Bush and President Obama, has made clear that the threat of foreign interference continues to exist, and that these foreign powers will make similar attempts to interfere in future elections.

It is our view that Senators on the Judiciary Committee should have the opportunity to question the CEOs of technology companies about these critical matters. While this Committee’s Subcommittee on Crime and Terrorism convened a hearing with witnesses representing Facebook, Twitter, and Google in October of 2017, we have yet to hear from the leaders of these companies directly. A hearing featuring testimony with CEOs would provide the Committee the opportunity to hear an update on the progress of these companies’ voluntary measures to combat attempted foreign interference and what is being done to protect Americans’ data and limit abuse of the platforms, as well as to assess what measures should be taken before the next elections.

It is for these reasons that we request that you announce a hearing of the Judiciary Committee at which Senators can publicly question the CEOs of technology companies. We would be happy to discuss this matter with you further and we appreciate your consideration of this request.
https://heisenbergreport.com/2018/03/19 ... f-the-end/


A Hurricane Flattens Facebook



Two weeks ago, Facebook learned that The New York Times, Guardian, and Observer were working on blockbuster stories based on interviews with a man named Christopher Wylie. The core of the tale was familiar but the details were new, and now the scandal was attached to a charismatic face with a top of pink hair. Four years ago, a slug of Facebook data on 50 million Americans was sucked down by a UK academic named Aleksandr Kogan, and wrongly sold to Cambridge Analytica. Wylie, who worked at the firm and has never talked publicly before, showed the newspapers a trove of emails and invoices to prove his allegations. Worse, Cambridge appears to have lied to Facebook about entirely deleting the data.

To Facebook, before the stories went live, the scandal appeared bad but manageable. The worst deeds had been done outside of Facebook and long ago. Plus like weather forecasters in the Caribbean, Facebook has been busy lately. Just in the past month, they’ve had to deal with scandals created by vacuous Friday tweets from an ad executive, porn, the darn Russian bots, angry politicians in Sri Lanka, and even the United Nations. All of those crises have passed with limited damage. And perhaps that’s why the company appears to have underestimated the power of the storm clouds moving in.

On Friday night, the company made its first move, jumping out in front of the news reports to publish its own blog post announcing that it was suspending Cambridge Analytica’s use of the platform. It also made one last stern appeal to ask The Guardian not to use the word “breach” in its story. The word, the company argued, was inaccurate. Data had been misused, but moats and walls had not been breached. The Guardian apparently did not find that argument sympathetic or persuasive. On Saturday its story appeared, “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach.”

The crisis was familiar in a way: Facebook has burned its fingers on issues of data privacy frequently in its 14-year history. But this time it was different. The data leakage hadn’t helped Unilever sell mayonnaise. It appeared to have helped Donald Trump sell a political vision of division and antipathy. The news made it look as if Facebook’s data controls were lax and its executives indifferent. Around the world, lawmakers, regulators, and Facebook users began asking very publicly how they could support a platform that didn’t do more to protect them. Soon, powerful politicians were chiming in and demanding to hear from Zuckerberg.

As the storm built over the weekend, Facebook’s executives, including Mark Zuckerberg and Sheryl Sandberg, strategized and argued late into the night. They knew that the public was hammering them, but they also believed that the fault lay much more with Cambridge Analytica than with them. Still, there were four main questions that consumed them. How could they tighten up the system to make sure this didn’t happen again? What should they do about all the calls for Zuckerberg to testify? Should they sue Cambridge Analytica? And what could they do about psychologist Joseph Chancellor, who had helped found Kogan’s firm and who now worked, of all places, at Facebook?

By Monday, Facebook remained frozen, and Zuckerberg and Sandberg stayed silent. Then, late in the afternoon in Menlo Park, more bad news came. The New York Times reported that Alex Stamos, the company’s well-respected chief of security, had grown dissatisfied with the top of senior management and was planning to exit in a few months. Some people had known this for a while, but it was still a very bad look. You don’t want news about your head of data security bailing when you’re having a crisis about how to secure your data. And then news broke that Facebook had been denied in its efforts to get access to Cambridge Analytica’s servers. The United Kingdom’s Information Commissioner’s Office, which had started an investigation, would handle that.

An all-hands meeting was called for Tuesday, but for some reason it would be led by Facebook’s legal counsel, not its leaders, both of whom have remained deafeningly silent. Meanwhile, the stock had collapsed, chopping $36 billion off the company’s market value on Monday. By mid-Tuesday morning, it had fallen 10 percent since the scandal broke. What the company expected to be a tough summer storm had turned into a Category 5 hurricane.

Walking in the Front Door

The story of how Kogan ended up with data on 50 million American Facebook users sounds like it should involve secret handshakes and black hats. But Kogan actually got his Facebook data by just walking in Facebook’s front door and asking for it. Like all technology platforms, Facebook encourages outside software developers to build applications to run inside it, just like Google does with its Android operating system and Apple does with iOS. And so in November 2013 Kogan, a psychology professor at the University of Cambridge, created an application developer account on Facebook and explained why he wanted access to Facebook’s data for a research project. He started work soon thereafter.

Kogan had created the most anodyne of tools for electoral manipulation: an app based on personality quizzes. Users signed up and answered a series of questions. Then the app would take those answers, mush them together with that person’s Facebook likes and declared interests, and spit out a profile that was supposed to know the test-taker better than he knew himself.

About 270,000 Americans participated. What they didn’t know, however, was that by agreeing to take the quiz and giving Facebook access to their data, they also granted access to many of their Facebook friends’ likes and interests. Users could turn off this setting, but it’s hard to turn off something you don’t know exists and that you couldn’t find if you did. Kogan quickly ended up with data on roughly 50 million people.
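The implied fan-out is easy to sanity-check with the article's own numbers:

# Sanity check on the reported fan-out from quiz-takers to harvested profiles.
installers = 270_000          # people who actually took the quiz
harvested = 50_000_000        # profiles Kogan reportedly ended up with

print(round(harvested / installers))   # ~185 unique friends exposed per quiz-taker

Roughly 185 newly exposed friends per quiz-taker after overlap, which sits comfortably within the typical friend counts of the era.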

About five months after Kogan began his research, Facebook announced that it was tightening its app review policies. For one: Developers couldn’t mine data from your friends anymore. The barn door was shut, but Facebook told all the horses already in the pasture that they had another year to run around. Kogan, then, got a year and a half to do his business. And when the stricter policies went into effect, Facebook promptly rejected version two of his app.

By then Kogan had already mined the data and sold it to Cambridge Analytica, violating his agreement with Facebook and revealing one of the strange asymmetries of this story. Facebook knows everything about its users—but in some ways it knows nothing about its developers. And so Facebook didn’t start to suspect that Kogan had misused its data until it read a blaring headline in The Guardian in December 2015: “Ted Cruz using firm that harvested data on millions of unwitting Facebook users.”

That story passed out of the cycle quickly though, swept away by news about the Iowa caucuses. And so while Facebook’s legal team might have been sweating at the end of 2015, outwardly Zuckerberg projected an air of total calm. His first public statement after the Guardian story broke was a Christmas note about all the books he’d read: “Reading has given me more perspective on a number of topics – from science to religion, from poverty to prosperity, from health to energy to social justice, from political philosophy to foreign policy, and from history to futuristic fiction.”

An Incomplete Response

When the 2015 Guardian story broke, Facebook immediately secured written assertions from Cambridge Analytica, Kogan, and Christopher Wylie that the data had been deleted. Lawyers on all sides started talking, and by the early summer of 2016 Facebook had more substantial legal agreements with Kogan and Wylie certifying that the data had been deleted. Cambridge Analytica signed similar documents, but their paperwork wasn’t submitted until 2017. Facebook’s lawyers describe it as a tortured and intense legal process. Wylie describes it as a pinkie promise. “All they asked me to do was tick a box on a form and post it back,” he told the Guardian.

Facebook’s stronger option would have been to insist on an audit of all of Cambridge Analytica’s machines. Did the data still exist, and had it been used at all? And in fact, according to the standard rules that developers agree to, Facebook reserves that right. “We can audit your app to ensure it is safe and does not violate our Terms. If requested, you must provide us with proof that your app complies with our terms,” the policy currently states, as it did then.

Kogan, too, may have merited closer scrutiny regardless, especially in the context of the 2016 presidential campaign. In addition to his University of Cambridge appointment, Kogan was also an associate professor at St. Petersburg State University, and had accepted research grants from the Russian government.

Why didn’t Facebook conduct an audit—a decision that may go down as Facebook’s most crucial mistake? Perhaps because no audit can ever be completely persuasive. Even if no trace of data exists on a server, it could still have been stuck on a hard drive and shoved in a closet. Facebook’s legal team also insists that an audit would have been time-consuming, and despite the rights granted in a developer contract would have required a court order. A third possible explanation is fear of accusations of political bias. Most of the senior employees at Facebook are Democrats who blanch at allegations that they would let politics seep into the platform.

Whatever the reason, Facebook trusted the signed documents from Cambridge Analytica. That June of 2016, Facebook staff even went down to San Antonio to sit side by side with Trump campaign officials and Cambridge Analytica consultants.

To Facebook, the story seemed to go away. In the year following Trump’s victory, public interest advocates hammered Cambridge Analytica over its data practices, and other publications, particularly The Intercept, dug into its practices. But Facebook, according to executives at the company, never thought to double check if the data was gone until reporters began to call this winter. And then it was only after the story broke that Facebook considered serious action including suing Cambridge Analytica. A lawyer for the company, Paul Grewal, told Wired on Monday evening that “all options are on the table.”

What Comes Next

Of Facebook’s many problems, one of the most confusing appears to be figuring out what to do with Chancellor, who currently works with the VR team. He may know about the fate of the user data, but this weekend the company was debating how forcefully it could ask him since it could be considered a violation of rules protecting employees from being forced to give up trade secrets from previous jobs.

A harder question is when, and how exactly, Zuckerberg and Sandberg should emerge from their bunkers. Sandberg, in particular, has passed through the crucible of the past two years relatively unscathed. Zuckerberg’s name now trends on Twitter when crises hit, and this magazine put his bruised face on the cover. Even Stamos has taken heat during the outcry over the Russia investigation. And a small bevy of brave employees have waded out into the rushing rivers of Twitter, where they have generally been sucked below the surface or swept over waterfalls.

The last and most vexing question is what to do to make Facebook data safer. For much of the past year, Facebook has been besieged by critics saying that it should make its data more open. It should let outsiders audit its data and peer around inside with a flashlight. But it was an excess of openness with developers—and opaque privacy practices—that got the company in trouble here. Facebook tightened up third-party access in 2015, meaning an exact replay of the Cambridge Analytica fiasco couldn’t happen today. But if the company decides to close down even further, then what happens to the researchers doing genuinely important work using the platform? How well can you vet intentions? A possible solution would be for Facebook to change its data retention policies. But doing so could undermine how the service fundamentally works, and make it far more difficult to catch malevolent actors—like Russian propaganda teams—after the fact.

User data is now the foundation of the internet. Every time you download an app, you give the developer access to bits of your personal information. Every time you engage with any technology company—Facebook, Google, Amazon, and so on—you help build their giant database of information. In exchange, you trust that they won’t do bad things with that data, because you want the services they offer.

Responding to a thread about how to fix the problem, Stamos tweeted, “I don’t think a digital utopia where everybody has privacy, anonymity and choice, but the bad guys are magically kept out, can exist.”

At its core, according to a former Facebook executive, the problem is really an existential one. The company is very good at dealing with things that happen frequently and have very low stakes. When mistakes happen, they move on. According to the executive, the philosophy of the company has long been “We’re trying to do good things. We’ll make mistakes. But people are good and the world is forgiving.”

If Facebook doesn’t find a satisfactory solution, it faces the unsavory prospect of heavy regulation. Already in Europe, the General Data Protection Regulation will give people much more insight and control over what data companies like Facebook take, and how it’s used. In the US, senators like Ron Wyden, Mark Warner, and Amy Klobuchar may have the appetite for similar legislation if Facebook’s privacy woes continue.

Facebook will hold its all-hands today, and hope for that inevitable moment when something horrible happens elsewhere and everyone’s attention turns. But it also knows that things might get worse, much worse. The nightmare scenario will come if the Cambridge Analytica story fully converges with the story of Russian meddling in American democracy: if it turns out that the Facebook data harvested by Cambridge Analytica ended up in the hands of Putin’s trolls.

At that point, Facebook will have to deal with yet another devastating asymmetry: data from a silly quiz app, created under obsolete rules, fueling a national security crisis. But those asymmetries are just part of the nature of Facebook today. The company has immense power, and it’s only begun to grapple with its immense responsibility. And the world isn’t as forgiving of Silicon Valley as it used to be.

Facebook and Cambridge Analytica
https://www.wired.com/story/facebook-ca ... -response/

Re: The creepiness that is Facebook

Postby seemslikeadream » Tue Mar 20, 2018 4:08 pm

UK Parliament summons Facebook's Mark Zuckerberg to be questioned over Cambridge Analytica scandal

Lizzy Buchan, Political Correspondent
Tuesday 20 March 2018 19:39 GMT

Social media giant accused of 'misleading' Parliament over risk of companies acquiring users' private data

MPs have summoned Facebook’s Mark Zuckerberg to give evidence over the “catastrophic failure of process” behind the Cambridge Analytica data breach.

Damian Collins, chair of the influential Digital, Culture, Media and Sport Committee (DCMS), said the social media giant had previously given “misleading” evidence to Parliament and “consistently understated the risk” of user data being used without their consent.

In a sternly worded letter, Mr Collins said it was time for the Facebook founder to address MPs over allegations that Cambridge Analytica carried out an illegal data grab on more than 50 million social media profiles in 2014.


It comes after senior figures at Cambridge Analytica were secretly filmed boasting that they could entrap politicians and use former spies to gather information to influence foreign elections.

The London-based political consulting firm was credited with aiding Donald Trump’s 2016 US presidential run and was also employed by the Leave campaign during the EU referendum.

The letter states: “The committee has repeatedly asked Facebook about how companies acquire and hold on to user data from their site, and in particular about whether data had been taken without their consent.

“Your officials’ answers have consistently understated this risk, and have been misleading to the committee.

“It is now time to hear from a senior Facebook executive with the sufficient authority to give an accurate account of this catastrophic failure of process.

“There is a strong public interest test regarding user protection. Accordingly we are sure you will understand the need for a representative from right at the top of the organisation to address concerns.

“Given your commitment at the start of the New Year to ‘fixing’ Facebook, I hope that this representative will be you.”

The row emerged after a whistleblower told The Observer that data on millions of Facebook users had been seized by Cambridge Analytica, and that it had not been destroyed as agreed.

Christopher Wylie, a former research director for the firm, said this data was used to build software that could target voters and influence their political choices.

The UK’s information watchdog has sought a warrant to search the offices of Cambridge Analytica as part of a probe into the use of personal data for political campaigns.

An investigation by Facebook was dramatically halted last night to allow the Information Commissioner’s Office to pursue its inquiry.

Theresa May has expressed concern at the reports and expects Facebook, Cambridge Analytica and all the organisations involved to “cooperate fully”, her spokesman said.

Further claims about Cambridge Analytica emerged on Monday in an undercover investigation by Channel 4 News, which revealed chief executive Alexander Nix discussing entrapment and using ex-spies to dig up dirt on political opponents.

Mr Nix denied the claims to reporters outside his offices on Tuesday, saying that “appearances can be deceptive” when asked if the firm had previously used entrapment.

When asked if CA would abandon its political work, Mr Nix gave no reply but firmly denied he had misled parliament over its use of data, saying “absolutely not”.

A Facebook spokesperson told The Independent: “We have received a letter from the Digital, Culture, Media and Sports Committee and will of course respond by the given deadline.

“In the meantime, we continue to engage with the committee and respond to their requests for information.”
https://www.independent.co.uk/news/uk/p ... 64906.html

Re: The creepiness that is Facebook

Postby seemslikeadream » Wed Mar 21, 2018 12:22 pm

The Cambridge Analytica saga is a scandal of Facebook’s own making
John Harris

This mess was inevitable. Facebook has worked tirelessly to gather as much data on users as it could – and to profit from it

Wed 21 Mar 2018 07.33 EDT Last modified on Wed 21 Mar 2018 11.30 EDT

Facebook CEO Mark Zuckerberg speaks at the F8 summit in San Francisco, California. Photograph: Josh Edelson/AFP/Getty Images

Big corporate scandals tend not to come completely out of the blue. As with politicians, accident-prone companies rarely become that way by accident, and a spectacular crisis can often arrive at the end of a long spell of bad decisions and confidence curdling into hubris. So it is with the tale of Facebook and Cambridge Analytica, and a saga that vividly highlights the awful mess that the biggest player in billions of online lives has turned into.

Four days after a story pursued for over a year by my brilliant Observer colleague Carole Cadwalladr burst open, its plot now feels very familiar: in early 2014, 270,000 people did an online “personality test” that appears to have resulted in information about 50 million of their Facebook friends being passed to the nasty and amoral men featured in Channel 4’s secret filming, which would be in contravention of Facebook’s rules about data being used by third parties for commercial purposes. In the second act, Facebook failed to alert users and took only limited steps to recover and secure the data in question. On Tuesday, as Facebook’s value continued to slide, the plot thickened, with the re-appearance of a whistleblower named Sandy Parakilas, who claimed that hundreds of millions more people were likely to have had similar information harvested by outside companies, and that while he was working for the company between 2011 and 2012, Facebook’s systems for monitoring how such information was used often seemed to barely exist.

[Video: Cambridge Analytica caught in undercover sting boasting about entrapping politicians]

Even if Facebook has since changed its rules to stop third-party apps gaining access to data from people’s friends, all this still goes back to something that remains absolutely fundamental to the company, and that a lot of its users know and yet constantly choose to forget: beneath all the bromides about “bringing the world closer together” gushed out by its founder and CEO Mark Zuckerberg, and the joy of posting your holiday pictures, Facebook’s employees tirelessly work to amass as much data as they can about users and their online friends, and to make vast amounts of money by facilitating micro-targeting by advertisers. (This has had nasty aspects beyond political messaging: it was only last year, for example, that Facebook decisively stopped housing advertisers from excluding certain ethnic groups and disabled people.)

If you use its services as their creators intend and cough up the small details of your life on a daily – or even hourly – basis, Facebook will know all about your family, friends, education, politics, travel habits, taste in clothes, connected devices, and scores of things besides. Its eyes can extend just about everywhere online: to quote from its privacy policy, “We receive data whenever you visit a game, application, or website that uses Facebook Platform or visit a site with a Facebook feature … sometimes through cookies.” And though third-party apps can be restricted from scooping up personal information, we all know what tends to deliver their makers what they want: the fact that most people have no idea how to restrict access to their data, and are subtly enticed to ignore such things.

All this stuff defines Facebook’s raison d’être. Indeed, hinting at its drive for omniscience, Zuckerberg once habitually talked about what Facebook insiders called radical transparency, an idea that partly amounted to an insistence that old ideas about privacy were becoming outmoded. Facebook was leading the way, and this was nothing but a good thing.

“To get people to the point where there’s more openness – that’s a big challenge,” Zuckerberg said. “But I think we’ll do it. I just think it will take time. The concept that the world will be better if you share more is something that’s pretty foreign to a lot of people, and it runs into all these privacy concerns.” (You could write a doctoral thesis about those words: the professed belief in improving the lot of humanity sounding distinctly like window-dressing for the company’s pursuit of endlessly increasing revenues; the seeming impatience summed up in the words “all these privacy concerns”.) In retrospect, talking like that, and encouraging your people to think of a lot of worries about personal confidentiality as increasingly the stuff of the past, was always going to invite disaster.

Facebook’s latest bout of anxiety and what some people call “reputational damage” now dates back at least 18 months. By the end of the US presidential election campaign, its algorithms had ensured that the top fake stories in people’s news feeds were generating more engagement than the most popular real ones. Zuckerberg initially described the claim that Facebook had been instrumental in the victory of Donald Trump as a “pretty crazy idea”, only to recant. Having been scared by Twitter into enthusiastically pushing the idea that Facebook could be a news platform, he then ran in the opposite direction, insisting that its job was to allow people to share “personal moments”. At times, he looks like someone who cannot keep up even with himself.

Facebook sometimes behaves like a government – sending in “auditors” to examine material at the London offices of Cambridge Analytica while the UK information commissioner’s investigators waited for legal permission to do the same thing, and reportedly demanding access to the whistleblower Christopher Wylie’s phone and computer. But at the same time, its bosses defy the most basic expectations of corporate governance. Like Facebook’s COO Sheryl Sandberg, Zuckerberg is still nowhere to be seen: a statement issued on Tuesday said he and Sandberg were “working around the clock to get all the facts and take the appropriate action moving forward”, and that “the entire company is outraged we were deceived”, which is most of the way to being laughable. Were it not for his $70bn fortune, he would arguably inspire pity, rather than anger: it looks like he is in way over his head.

[Video: Everything you need to know about the Cambridge Analytica exposé – video explainer]

Even if the majority of Facebook users still seem content to give it the data it constantly devours, over the past two or three years, a rising chorus of voices has demanded that governments and legislators bring the company to heel. The EU’s General Data Protection Regulation represents a step in the right direction, as does the fact that the Cambridge Analytica scandal is being looked into by the US federal trade commission. The work being done by the Tory MP Damian Collins as the chair of the digital, culture, media and sport select committee is great to see. But even at their most potent, these efforts do not get near questions centred on Facebook’s sheer size, and the possibility of anti-monopoly action that would have to originate on the company’s home turf.

In the US, anti-trust actions only succeed if a supposedly monopolistic company can be found to have affected consumers’ wellbeing in terms of the quality of products and services they can access, the levels of innovation in a given economic sector, and in particular, the prices people have to pay. The fact that Facebook would probably slip free of such criteria surely suggests that the rules are unfit for the online age, and that a different set of considerations ought to be introduced, perhaps built around the power a company wields, relative to its collective competence. In those terms, Zuckerberg and his colleagues are guilty of an epic fail, and everything that now happens to them should follow from it.

• John Harris is a Guardian columnist
https://www.theguardian.com/commentisfr ... ers-profit

Re: The creepiness that is Facebook

Postby seemslikeadream » Sat Mar 24, 2018 1:46 pm

'A grand illusion': seven days that shattered Facebook's facade
The Cambridge Analytica Files

Revelations about the depths of Facebook’s failure to protect our data have finally pulled back the curtain, observers say

Olivia Solon in San Francisco
Sat 24 Mar 2018 05.00 EDT Last modified on Sat 24 Mar 2018 10.27 EDT


One expert said the Cambridge Analytica revelations will finally get people to ‘pay attention not just to Facebook but the entire surveillance economy’. Composite: Bloomberg

“Dumb fucks.” That’s how Mark Zuckerberg described users of Facebook for trusting him with their personal data back in 2004. If the last week is anything to go by, he was right.

Since the Observer reported that the personal data of about 50 million Americans had been harvested from Facebook and improperly shared with the political consultancy Cambridge Analytica, it has become increasingly apparent that the social network has been far more lax with its data sharing practices than many users realised.

As the scandal unfurled over the last seven days, Facebook’s lackluster response has highlighted a fundamental challenge for the company: how can it condemn the practice on which its business model depends?

“This is the story we have been waiting for so people will pay attention not just to Facebook but the entire surveillance economy,” said Siva Vaidhyanathan, a professor of media studies at the University of Virginia.

Since Zuckerberg’s “dumb fucks” comment, Facebook has gone to great lengths to convince members of the public that it’s all about “connecting people” and “building a global community”. This pseudo-uplifting marketing speak is much easier for employees and users to stomach than the mission of “guzzling personal data so we can micro-target you with advertising”.

In the wake of the revelations that Cambridge Analytica misappropriated data collected by Dr Aleksandr Kogan under the guise of academic research, Facebook has scrambled to blame these rogue third parties for “platform abuse”. “The entire company is outraged we were deceived,” it said in a statement on Tuesday.

However, in highlighting the apparent deceit, the company has been forced to shine a light on its underlying business model and years of careless data-sharing practices.

Sure, the data changed hands between the researcher and Cambridge Analytica in apparent violation of Kogan’s agreement with Facebook, but everything else was above board. The amount of data Cambridge Analytica got hold of and used to deliver targeted advertising based on personality types – including activities, interests, check-ins, location, photos, religion, politics, relationship details – was not unusual in the slightest. This was a feature, not a bug.

[Video: Cambridge Analytica whistleblower: 'We spent $1m harvesting millions of Facebook profiles']
‘Extremely friendly to app developers’

There are thousands of other developers, including the makers of the dating app Tinder, games such as FarmVille, as well as consultants to Barack Obama’s 2012 presidential campaign, who slurped huge quantities of data about users and their friends – all thanks to Facebook’s overly permissive “Graph API”, the interface through which third parties could interact with Facebook’s platform.

Facebook opened up in order to attract app developers to its ecosystem at a time when the company was playing catch-up in shifting its business from desktops to smartphones. It was a symbiotic relationship that was critical to Facebook’s growth.
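
As a rough, hypothetical illustration of what "plugging in" to the Graph API meant in practice, here is a minimal sketch in Python using the requests library. The /me and /me/friends endpoints are real Graph API paths, but the token value and the specific fields requested are illustrative assumptions, not a record of what any particular app actually collected.

import requests

# Placeholder token; a real one is issued only after a user authorizes the app
# through Facebook Login.
ACCESS_TOKEN = "USER_ACCESS_TOKEN"
BASE = "https://graph.facebook.com"

# Profile fields the user has agreed to share with the app (illustrative).
me = requests.get(
    f"{BASE}/me",
    params={"fields": "id,name,likes", "access_token": ACCESS_TOKEN},
).json()

# Before Facebook tightened third-party access in 2014-2015, an app with the
# right permissions could also pull data about a user's friends; today
# /me/friends returns only friends who have themselves authorized the same app.
friends = requests.get(
    f"{BASE}/me/friends",
    params={"access_token": ACCESS_TOKEN},
).json()

print(me.get("name"), len(friends.get("data", [])))

The point of the sketch is simply that, once a user clicked authorize, pulling structured profile data was a couple of HTTP requests away, which is why the pre-2014 friends permissions mattered so much.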

“They wanted to push as much of the conversation, ad revenue and digital activity as possible and made it extremely friendly to app developers,” said Jeff Hauser, of the Center for Economic and Policy Research. “Now they are complaining that the developers abused them. They wanted that. They were encouraging it. They may now regret it but they knowingly unleashed the forces that have led to this lack of trust and loss of privacy.”

The terms were updated in April 2014 to restrict the data new developers could get hold of, including people’s friends’ data, but only after four years of access to the Facebook firehose. Companies that plugged in before April 2014 had another year before access was restricted.

“There are all sorts of companies that are in possession of terabytes of information from before 2015,” said Hauser. “Facebook’s practices don’t bear up to close, informed scrutiny nearly as well as they look from the 30,000ft view, which is how people had been viewing Facebook previously.”

Cambridge Analytica claims it helped get Trump elected by using data to target voters on Facebook. Photograph: Win McNamee/AFP/Getty Images

For too long consumers have thought about privacy on Facebook in terms of whether their ex-boyfriends or bosses could see their photos. However, as we fiddle around with our profile privacy settings, the real intrusions have been taking place elsewhere.

“In this sense, Facebook’s ‘privacy settings’ are a grand illusion. Control over post-sharing – people we share to – should really be called ‘publicity settings’,” explains Jonathan Albright, the research director at the Tow Center for Digital Journalism. “Likewise, control over passive sharing – the information people [including third party apps] can take from us – should be called ‘privacy settings’.”

Essentially Facebook gives us privacy “busywork” to make us think we have control, while making it very difficult to truly lock down our accounts.


Facebook is dealing with a PR minefield. The more it talks about its advertising practices, the more the #DeleteFacebook movement grows. Even the co-founder of WhatsApp Brian Acton, who profited from Facebook’s $19bn acquisition of his app, this week said he was deleting his account.

“This is the biggest issue I’ve ever seen any technology company face in my time,” said Roger McNamee, Zuckerberg’s former mentor.

“It’s not like tech hasn’t had a lot of scandals,” he said, mentioning the Theranos fraud case and MiniScribe packing actual bricks into boxes instead of hard drives. “But no one else has played a role in undermining democracy or the persecution of minorities before. This is a whole new ball game in the tech world and it’s really, really horrible.”

Facebook first discovered that Kogan had shared data with Cambridge Analytica when a Guardian journalist contacted the company about it at the end of 2015. It asked Cambridge Analytica to delete the data and revoked Kogan’s apps’ API access. However, Facebook relied on Cambridge Analytica’s word that they had done so.

When the Observer contacted Facebook last week with testimony from a whistleblower stating that Cambridge Analytica had not deleted the data, Facebook’s reaction was to try to get ahead of the story by publishing its own disclosure late on Friday and sending a legal warning in an attempt to prevent publication of the Observer’s bombshell discoveries.

Then followed five days of virtual silence from the company, as the chorus of calls from critics grew louder, and further details of Facebook’s business dealings emerged.

A second whistleblower, the former Facebook manager Sandy Parakilas, revealed that he found Facebook’s lack of control over the data given to outside developers “utterly horrifying”. He told the Guardian that he had warned senior executives at the company that its lax approach to data protection risked a major breach, but that he was discouraged from investigating further.

At around the same time, it emerged that the co-director of the company that harvested the Facebook data before passing it to Cambridge Analytica is a current employee at Facebook. Joseph Chancellor worked alongside Kogan at Global Science Research, which exfiltrated the data using a personality app under the guise of academic research.

Demand for answers

Politicians on both sides of the Atlantic called for answers. In the US, the Democratic senator Mark Warner called for regulation, describing the online political advertising market as the “wild west”.

“Whether it’s allowing Russians to purchase political ads, or extensive micro-targeting based on ill-gotten user data, it’s clear that, left unregulated, this market will continue to be prone to deception and lacking in transparency,” he said.

The Federal Trade Commission plans to examine whether the social networking site violated a 2011 data privacy agreement with the agency over its data-sharing practices.

In the UK, MPs summoned Facebook’s chief executive, Mark Zuckerberg, to give evidence to a select committee investigating fake news.

“I think they are in a very bad situation because they have long benefitted from the tech illiteracy of the political community,” said Hauser.

The backlash spooked investors, wiping almost $50bn off the valuation of the company in two days, although the stock has since rallied slightly.

On Wednesday, Zuckerberg finally broke his silence in a Facebook post acknowledging that the policies that allowed the misuse of data were a “breach of trust between Facebook and the people who share their data with us and expect us to protect it”.


The social network is facing calls for answers from lawmakers on both sides of the Atlantic. Photograph: Josh Edelson/AFP/Getty Images

Facebook’s chief operating officer, Sheryl Sandberg, added her own comment: “We know that this was a major violation of people’s trust, and I deeply regret that we didn’t do enough to deal with it.”

The company will investigate apps that had access to “large amounts of information” before the 2014 changes and audit thousands of apps that show “suspicious activity”. The company will also inform those whose data was “misused”, including people who were directly affected by the Kogan operation.

These actions don’t go far enough, said Vaidhyanathan.

“Facebook has a history of putting on that innocent little boy voice: ‘Oh I didn’t know that I shouldn’t hold the cat by its tail,’” he said. “I think we’re tired of it at this point.”

These problems were pointed out by scholars years ago, said Robyn Caplan, a researcher at Data & Society, but Facebook’s response was slow and insufficient.

“They have been trying to put out a lot of little fires but we need them to build a fire department,” she said.
https://www.theguardian.com/technology/ ... are_btn_tw

Re: The creepiness that is Facebook

Postby seemslikeadream » Mon Mar 26, 2018 11:39 am

Christopher Wylie

Hey @facebook I am not a suspect in the ongoing CA investigation. Because I have been proactively working for months with the @ICOnews and British authorities. Because I am the whistleblower.

FTC confirms it's investigating Facebook
BY HARPER NEIDIG - 03/26/18 10:57 AM EDT
The Federal Trade Commission (FTC) on Monday confirmed that it had opened an investigation into Facebook’s privacy practices following reports that data from 50 million users landed in the hands of a political consulting firm without their consent.

Tom Pahl, the acting FTC bureau chief for consumer protection, said in a statement that the agency would be investigating whether the incident constituted a violation of a 2011 agreement that Facebook signed to settle charges over other privacy concerns.

“Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements,” Pahl said. “Accordingly, the FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook.”
http://thehill.com/policy/technology/38 ... o-facebook

Re: The creepiness that is Facebook

Postby stillrobertpaulsen » Mon Apr 02, 2018 2:57 pm

ICE Used Private Facebook Data to Find and Track Criminal Suspect, Internal Emails Show
Lee Fang

March 26 2018, 12:39 p.m.

Because of errors inserted during editing, the original version of this article contained the mistaken assertion that ICE used private Facebook data to track unauthorized immigrants. The story has been corrected. See below for full correction.

Cambridge Analytica may have had access to the personal information of tens of millions of unwitting Americans, but a genuine debate has emerged about whether the company had the sophistication to put that data effectively to use on behalf of Donald Trump’s presidential campaign.

But one other organization that has ready access to Facebook’s trove of personal data has a much better track record of using such information effectively: U.S. Immigration and Customs Enforcement.

ICE, the federal agency tasked with Trump’s program of mass deportation, uses backend Facebook data to locate and track suspects, according to a string of emails and documents obtained by The Intercept through a public records request. The hunt for one particular suspect provides a rare window into how ICE agents use social media and powerful data analytics tools to find targets.

In February and March of 2017, several ICE agents were in communication with a detective from Las Cruces, New Mexico, to find information about a particular person. They were ultimately able to obtain backend Facebook data revealing a log of when the account was accessed and the IP addresses corresponding to each login. Lea Whitis, an agent with Homeland Security Investigations, the investigative arm of ICE, emailed the team a “Facebook Business Record” revealing the suspect’s phone number and the locations of each login into his account during a date range.

Law enforcement agents routinely use bank, telephone, and internet records for investigations, but the extent to which ICE uses social media is not well known.

A Facebook spokesperson, in a statement, said that ICE does not have any unique access to data:

Facebook does not provide ICE or any other law enforcement agency with any special data access to assist with the enforcement of immigration law. We have strict processes in place to handle these government requests. Every request we receive is checked for legal sufficiency. We require officials to provide a detailed description of the legal and factual basis for their request, and we push back when we find legal deficiencies or overly broad or vague demands for information.

In this case, our records show that ICE sent valid legal process to us in an investigation said to involve an active child predator. We take the enforcement of laws protecting children from child predators very seriously, and we responded to ICE’s valid request with data consistent with our publicly available data disclosure standards. ICE did not identify any immigration law violations in connection with its data request to Facebook in this case.

One of the agents involved in the hunt responded that they could combine the data with “IP address information back from T-Mobile.” Another agent chimed in to say that the agency had sent the phone company an expedited summons for information.

“I am going to see if our Palantir guy is here to dump the Western Union info in there since I know there is a way to triangulate the area he’s sending money from and narrow down time of day etc,” responded Jen Miller, an ICE agent on the email thread.

Palantir is a controversial data analytics firm co-founded by billionaire investor Peter Thiel. The company, which does business with the military and major intelligence agencies, has contracted with ICE since 2014. As journalist Spencer Woodman reported last year, the company developed a special system for ICE to access a vast “ecosystem” of data to facilitate immigration officials in both discovering targets and then creating and administering cases against them.

But there is little public disclosure of how ICE uses the Palantir platform to track individuals. The emails obtained by The Intercept show that private Facebook information is yet another data point for agents in pursuit of a suspect.

“I have not heard of HSI going and getting private information from Facebook. What we’ve been seeing is when they use someone’s Facebook page and print what they’ve been posting to use as evidence to argue that person is a gang member,” said Rachel K. Prandini, staff attorney with the Immigrant Legal Resource Center.

“Photos with friends ICE thinks are gang members, doing hand signs that ICE alleges are gang signs, or wearing clothes that ICE believes indicate gang membership are being pulled from Facebook and submitted as evidence in immigration court proceedings,” Prandini added.

Matthew Bourke, a spokesperson for ICE, emailed The Intercept to say the agency would not “comment on investigative techniques or tactics other than to say that during the course of a criminal investigation, we have the ability to seek subpoenas and court orders to legally compel a company to provide information that may assist in case completion and subsequent prosecution.”

“Court orders are an established procedure that is consistent with all other federal law enforcement agencies. Additionally, investigators can use open-source information that is readily available on various social-media platforms during the course of an investigation,” Bourke added.

The Stored Communications Act provides broad powers for law enforcement to request information from communication service providers, including Facebook. The law delineates a variety of types of data that can be requested, much of it without a court order.

Facebook publishes a semiannual transparency report detailing the number of government requests for user data. The report does not break down which law enforcement agencies are making the requests for data, so it is unclear how many of the requests came from ICE.

The report reveals that from January 2017 through June 2017, Facebook received 32,716 requests for data concerning 52,280 users or accounts. Facebook notes in its report that it complied with 85 percent of the requests and that “approximately 57% of legal process we received from authorities in the U.S. was accompanied by a non-disclosure order legally prohibiting us from notifying the affected users.”
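
For scale, here is a quick back-of-envelope calculation from the figures cited above; a rough sketch in Python, nothing authoritative.

# Figures from the transparency report numbers quoted in this article.
requests_received = 32_716   # US government data requests, Jan-Jun 2017
users_covered = 52_280       # users/accounts those requests referenced
compliance_rate = 0.85       # share of requests Facebook produced some data for

print(round(requests_received * compliance_rate))   # roughly 27,800 requests complied with
print(round(users_covered / requests_received, 2))  # roughly 1.6 accounts per request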

“Occasionally companies will push back, but that’s relatively rare,” says Nathan Wessler, staff attorney with the American Civil Liberties Union’s Speech, Privacy, and Technology Project. “From time to time, companies will push back if they receive what they perceive to be a grossly over-broad request, like a true dragnet request, or if there’s a request for example to unmask the identity of someone who is engaging in First Amendment-protected speech online anonymously. But they — for the vast majority of the requests — they comply.”

“For these subpoenas, it’s trivially easy for ICE or any other law enforcement agencies to issue,” explained Wessler. “They don’t require the involvement of a judge ahead of time. It’s really just a piece of paper that they’ve prepared ahead of time, a form, and they fill in a couple of pieces of information about what they’re looking for and they self-certify what they’re looking for is relevant to an ongoing investigation.”

Facebook is facing increasing demands about how it manages user data. Recent reports from a whistleblower have refocused attention on the Silicon Valley company for allowing Cambridge Analytica, a campaign consulting firm that served groups supporting Trump’s presidential campaign, access to user data.

Though Facebook and its founder Mark Zuckerberg have provided political support for immigration reform efforts in the past, the company has received relatively little scrutiny for its role in ICE’s deportation machine. Last year, ICE agents requested private Facebook data to obtain a cellphone number for an unauthorized immigrant in Detroit who they were pursuing. That number was then tracked through a cell site simulator, a powerful surveillance tool used to vacuum cellphone calls and user location data.

The agency appears to be continuing its search for more powerful technology tools to track and apprehend unauthorized immigrants. Last month, ICE released a request for proposal for a private contractor to provide tools to track targets’ employment data, credit checks, vehicle accident reports, payday loans, and other data sources.

Wessler noted that the use of Facebook data combined with Palantir shows the agency is expanding its reach.

“It speaks to the importance of companies like Palantir to have tremendous ability to amass a great deal of information about people,” Wessler said. “Just because a federal agency can pay for a contract to provide a service doesn’t mean it is a good idea when it’s enabling a massive deportation apparatus without appropriate checks and balances.”

Correction: March 26, 2018
Due to errors by editor Ryan Grim, this story and its headline originally reported that the investigation referred to in the ICE emails targeted an immigrant. The story as filed did not include those errors, or any others. The documents reported on in the story do not establish that the target of the investigation was an immigrant or that the individual was being pursued for immigration violations. The target of the investigation was, according to the documents, based in the New York metropolitan area, while several of the ICE agents on the emails were based in New Mexico. Additionally, this story has been updated to include a comment from Facebook.
"Huey Long once said, “Fascism will come to America in the name of anti-fascism.” I'm afraid, based on my own experience, that fascism will come to America in the name of national security."
-Jim Garrison 1967
User avatar
stillrobertpaulsen
 
Posts: 2414
Joined: Wed Jan 14, 2009 2:43 pm
Location: California
Blog: View Blog (37)

Re: The creepiness that is Facebook

Postby 82_28 » Mon Apr 02, 2018 5:34 pm

Resistance is futile. . .

Facebook Has Been Preparing for #DeleteFacebook for More Than a Decade

There has never been a better time to #DeleteFacebook. So says the movement to ditch the social platform, in light of revelations about its complicity in Cambridge Analytica harvesting 50 million people’s data in 2014.

But Facebook’s nearly 2 billion users have nowhere else to go. That’s because, with a few exceptions, Facebook has managed to squash its competitors, either by cloning or acquiring them—a tactic it’s used to remain relevant and irreplaceable. For the past 14 years, since its inception, Facebook has been preparing for this very moment. And now that it’s here, the company continues to monopolize the way humans interact online.

“I don’t think we’ve seen a meaningful number of people act on that, but, you know, it’s not good,” Zuckerberg told the New York Times of users deleting their accounts. A Reuters survey of 2,237 Americans this month showed that 41 percent of them trust Facebook to lawfully protect their data. According to a recent poll on Blind, an anonymous chat app for technology employees, 31 percent said they planned to delete Facebook.

It’s not uncommon for technology products to copy each other. Silicon Valley is bursting with startups that are virtually indistinguishable, but Facebook’s own brand of mimicry is viewed by some as incompatible with progress—a natural monopoly—leaving too little room for similar ideas to compete. Today, Facebook is a technology behemoth; the Frankenstein’s monster of social media platforms. Not only has the company consumed its competitors, it’s consumed our habits, making it too hard and too inconvenient for the average person to #DeleteFacebook.

I’m not an active Facebook user, but haven’t left for fear of being disconnected—from family, friends, and even people I’ve never met. To older generations, like my mom, who said she’s concerned by Facebook’s actions, but can’t imagine deleting her account, cutting the cord means severing rekindled relationships and familial ties. People who are 55 years and older are the social network’s fastest growing demographic. There are countless ways that Facebook has exploited the psychological effects of Being Online, but that’s not what truly handcuffs us to the site.

In 2009, Facebook purchased the now-defunct FriendFeed, a social media aggregator, for $15 million, plus $32.5 million in stock. Around that time, Facebook was already replicating FriendFeed’s original features, such as the “Like” button, and real-time updates.

“There are still numerous ways FriendFeed beats out Facebook’s News Feed setup,” Techcrunch reported. “One of these is the way stories are ‘floated’ to the top as new users comment on them. And FriendFeed’s system is truly real-time, unlike Facebook’s feed which users have to manually refresh.”

One year later, Facebook paid $40 million for all the social networking patents owned by Friendster, another long-gone, but once beloved, social platform. “It was important that Facebook remove any shadow of a doubt that someone else had the rights to the intellectual property behind its core technology,” wrote Gigaom.

In 2012, it famously paid $1 billion for Instagram after killing what appeared to be its own photo sharing app. Two years later, for $19 billion, it purchased WhatsApp, the wildly popular messaging app and Facebook Messenger competitor.

Then, Facebook spent $150 million in 2013 on a VPN (virtual private network) technology called Onavo. It allows the company to spy on user behavior under the guise of protecting their data, according to the Washington Post. With our browsing habits and app usage, Facebook can ascertain our desires, and fulfil them—whether it’s Snapchat-like filters, or WeChat-like games.

Facebook has acquired a host of diverse companies, either absorbing them, borrowing from them, or simply keeping them under its umbrella: a location-based check-in startup, mobile advertising talent, group messaging technology, a Q&A service, and a social gifting platform, among others—core features of the Facebook we know and use today.

“Our full mission statement is: give people the power to build community and bring the world closer together,” Zuckerberg said at last year’s Facebook Community Summit. “That reflects that we can't do this ourselves, but only by empowering people to build communities and bring people together.”

But is Facebook bringing communities together, or is it making it impossible to build them elsewhere?


https://motherboard.vice.com/en_us/arti ... n-a-decade
