Back in 2014, Baltimore cops were looking for one Kerron Andrews, wanted on a warrant for attempted murder. And while the cops did not request approval to use a device called the Hailstorm to find their man, they employed it anyway. The Hailstorm is a tracking tool similar to the Stingray, in that it intercepts and collects bulk data headed to a cell tower. That data can be used to pinpoint someone's location, and the device also records the person's phone calls.
The Hailstorm did lead to Mr. Andrews' whereabouts, and he was arrested. But after discovering that the police had used the Hailstorm without approval, the judge ruled that they had violated the defendant's Fourth Amendment right against unreasonable search and seizure, and granted a defense motion to suppress the evidence collected by the Hailstorm.
The state has appealed the decision, and its filing presents a disturbing legal theory. The prosecution argues that since every cellphone sports an off switch, Andrews' decision to leave his phone turned on amounted to consent to be tracked.
If the appeals court goes along with this argument and allows the evidence collected by the Hailstorm to be used during the trial, it will mean that as far as the cops are concerned, you are giving up your privacy each time you press that power button on your handset and turn it on.
As soon as we hear how the appeals court rules, we will let you know. In the meantime, better keep your finger off the power button on your phone if you don't want the authorities to know what you are up to.
The Dangers of New York City’s Public Wi-Fi
Published: March 21, 2016
If you’re a NYC resident and find yourself excited about the city’s public Wi-Fi network known as LinkNYC, you may want to think again.
As the New York Civil Liberties Union (NYCLU) notes:
March 16, 2016 — The city’s new public Wi-Fi network LinkNYC raises several privacy concerns for users, the New York Civil Liberties Union announced today after sending a letter to the Office of the Mayor on Tuesday. CityBridge, the company behind the LinkNYC kiosks that have begun replacing phone booths in Manhattan, retains a vast amount of information about users – often indefinitely – building a massive database that carries a risk of security breaches and unwarranted NYPD surveillance.
“New Yorkers’ private online activities shouldn’t be used to create a massive database that’s within the ready grasp of the NYPD,” said Donna Lieberman, executive director of the NYCLU. “Free public Wi-Fi can be an invaluable resource for this city, but New Yorkers need to know there are too many strings attached.”
LinkNYC, which was publicly launched in January, will eventually become a network of as many as 7,500 to 10,000 public kiosks offering fast and free Wi-Fi throughout all five boroughs. The sheer volume of information gathered by this powerful network will create a massive database of information that will present attractive opportunities for hackers and for law enforcement surveillance, and will carry an undue risk of abuse, misuse and unauthorized access.
You’ve been warned.
http://www.blacklistednews.com/The_Dang ... 8/Y/M.html
By Brian Barrett, Security. Published 05.19.16.
New Surveillance System May Let Cops Use All of the Cameras
The 30 million or so surveillance cameras peering into nearly every corner of American life might freak you out a bit, but you could always tell yourself that no one can access them all. Until now.
Computer scientists have created a way of letting law enforcement tap any camera that isn’t password protected so they can determine where to send help or how to respond to a crime. “It’s a way to help people take advantage of information that’s out there,” says David Ebert, an electrical and computer engineer at Purdue University.
The system, which is just a proof of concept, alarms privacy advocates who worry that prudent surveillance could easily lead to government overreach, or worse, unauthorized use. It relies upon two tools developed independently at Purdue. The Visual Analytics Law Enforcement Toolkit superimposes the rate and location of crimes and the location of police surveillance cameras. CAM2 reveals the location and orientation of public network cameras, like the one outside your apartment. You could do the same thing with a search engine like Shodan, but CAM2 makes the job far easier, which is the scary part. Aggregating all these individual feeds makes it potentially much more invasive.
Purdue limits access to registered users, and the terms of service for CAM2 state “you agree not to use the platform to determine the identity of any specific individuals contained in any video or video stream.” A reasonable step to ensure privacy, but difficult to enforce (though the team promises the system will have strict security if it ever goes online).
“I can certainly see the utility for first responders,” says Dave Maass, an investigative researcher with digital rights group EFF. “But it does open up the potential for some unseemly surveillance.”
Beyond the specter of universal government surveillance lies the risk of someone hacking the system. To Maass, it brings to mind the TV show Person of Interest and its band of vigilantes who tap government cameras to predict and prevent crimes. This is not so far-fetched. Last year, the EFF discovered that anyone could access more than 100 “secure” automated license plate readers. “I think it becomes a very tempting target,” says Gautam Hans, policy counsel at the Center for Democracy & Technology. “Thinking about security issues is going to be a major concern.”
Granted, the system does not tap private feeds, nor does it peer into private spaces like someone’s home. But aggregating this data and mapping it against specific crimes or emergencies is troubling. Hans says there’s no way of knowing when someone violates the terms of service and targets an individual, and the patchwork of regulations governing how agencies can use such technology is no guarantee against government over-reach.
Still, Hans is pragmatic and realizes the Purdue researchers have a noble goal. “At a certain level there’s only so much you can do to prevent the march of technology,” he says. “It’s not the best use of our time to rail against its existence. At a certain point we need to figure out how to use it effectively, or at least with extensive oversight.”
https://www.wired.com/2016/05/new-surve ... e-cameras/
Face recognition app taking Russia by storm may bring end to public anonymity
FindFace compares photos to profile pictures on social network Vkontakte and works out identities with 70% reliability
FindFace has amassed 500,000 users in the short time since its launch
Shaun Walker in Moscow
If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.
It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.
In the short time since the launch, FindFace has amassed 500,000 users and processed nearly 3m searches, according to its founders, 26-year-old Artem Kukharenko and 29-year-old Alexander Kabakov.
Kukharenko is a lanky, quietly spoken computer nerd who has come up with the algorithm that makes FindFace such an impressive piece of technology, while Kabakov is the garrulous money and marketing man, who does all of the talking when the pair meet the Guardian.
Unlike other face recognition technology, their algorithm allows quick searches in big data sets. “Three million searches in a database of nearly 1bn photographs: that’s hundreds of trillions of comparisons, and all on four normal servers. With this algorithm, you can search through a billion photographs in less than a second from a normal computer,” said Kabakov, during an interview at the company’s modest central Moscow office. The app will give you the most likely match to the face that is uploaded, as well as 10 people it thinks look similar.
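Searches of the kind Kabakov describes are typically done by reducing each face to a fixed-length embedding vector and comparing vectors rather than pixels. The sketch below is a toy illustration of that idea using random vectors, not FindFace's actual algorithm: matching one query face against an entire database collapses into a single matrix-vector product.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    # Scale each vector to unit length so a dot product equals cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# A toy "database" of 100,000 faces as 128-dimensional unit embeddings.
db = normalize(rng.standard_normal((100_000, 128)))

def top_matches(db, query, k=10):
    """Return indices of the k most similar embeddings by cosine similarity."""
    scores = db @ query                    # one matrix-vector product scores everything
    return np.argpartition(-scores, k)[:k]  # top-k without a full sort

# Query with a slightly perturbed copy of entry 42; it should appear in the results,
# mirroring the app's behavior of returning the best match plus 10 lookalikes.
query = normalize(db[42] + 0.05 * rng.standard_normal(128))
matches = top_matches(db, query)
```

At a billion photographs the same idea is usually sharded across machines and accelerated with approximate nearest-neighbor indexes, which would be one plausible reading of the founders' claim of sub-second search on four ordinary servers.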
Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”
Some have sounded the alarm about the potentially disturbing implications. Already the app has been used by a St Petersburg photographer to snap and identify people on the city’s metro, as well as by online vigilantes to uncover the social media profiles of female porn actors and harass them.
The technology can work with any photographic database, though it currently cannot use Facebook, because even the public photographs are stored in a way that is harder to access than Vkontakte, the app’s creators say.
But the FindFace app is really just a shop window for the technology, the founders said. There is a paid function for those who want to make more than 30 searches a month, but this is more to protect the servers from overload than to make money. They believe the real money-spinner from their face-recognition technology will come from law enforcement and retail.
Kukharenko and Kabakov have recently returned from the US, and Kabakov was due to travel to Macau and present the technology to a casino chain. The pair claim they have been contacted by police in Russian regions, who told them they started loading suspect or witness photographs into FindFace and came up with results. “It’s nuts: there were cases that had seen no movement for years, and now they are being solved,” said Kabakov.
The startup is in the final stages of signing a contract with Moscow city government to work with the city’s network of 150,000 CCTV cameras. If a crime is committed, the mugshots of anyone in the area can be fed into the system and matched against wanted lists, court records, and even social networks.
It does not take a wild imagination to come up with sinister applications in this field too: authoritarian regimes, for example, could tag and identify participants in street protests. Kabakov and Kukharenko said they had not been approached by Russia’s FSB security service, but “if the FSB were to get in touch, of course we’d listen to any offers they had”.
The pair also have big plans for the retail sector. Kabakov imagines a world where cameras fix you looking at, say, a stereo in a shop, the retailer finds your identity, and then targets you with marketing for stereos in the subsequent days.
Again, it sounds a little disturbing. But Kabakov said, as a philosophy graduate, he believes we cannot stop technological progress so must work with it and make sure it stays open and transparent.
“In today’s world we are surrounded by gadgets. Our phones, televisions, fridges, everything around us is sending real-time information about us. Already we have full data on people’s movements, their interests and so on. A person should understand that in the modern world he is under the spotlight of technology. You just have to live with that.”
Thursday 14 April 2016 15.27 BST
A Russian photographer has proved how easy it is to track down people on social media using facial recognition software.
Yegor Tsvetkov took photos of strangers on St Petersburg’s metro and used a facial recognition app which trawls through profiles on VKontakte, Russia’s biggest social network, to track down their online profiles.
Named “Your Face is Big Data”, the series of photographs shows how powerful facial recognition software has become, to the point that a complete stranger can find you at the click of a button.
Tsvetkov told the Guardian the project aimed to show technology can affect privacy, particularly if you don’t activate the relevant settings on your social media profiles.
“Nobody noticed that I photographed them, but I used a simple camera and I didn’t try to hide it,” he said.
“One girl in the project texted me after the publication and said that it was a bad feeling when she saw herself … but she fully understood my idea.”
The software he used is called FindFace, and was developed by the Moscow-based company N-Tech.Lab.
Launched in February, the app trawls through millions of profiles on VKontakte to find the person you are looking for within seconds.
Tsvetkov showed just how well this software works. When the Guardian ran some of his photographs through the site, the profiles of most of his subjects were easy to locate.
The Guardian has not published any of these photos to protect people’s anonymity.
Facial recognition software has proved problematic for Facebook. In 2011, its commitment to privacy was questioned when it turned on facial recognition software to automatically identify people in photos. In Germany, it was threatened with legal action for violating privacy laws.
Currently, it is not possible to trawl through Facebook using facial recognition and, as yet, there is no western equivalent of FindFace.
Terrorist or pedophile? This start-up says it can out secrets by analyzing faces
By Matt McFarland, May 24 at 6:30 AM
Our faces may reveal a lot more about us than we expect. (Kacper Pempel/Reuters)
An Israeli start-up says it can take one look at a person’s face and discern character traits that are undetectable to the human eye.
Faception said it’s already signed a contract with a homeland security agency to help identify terrorists. The company said its technology also can be used to identify everything from great poker players to extroverts, pedophiles, geniuses and white collar-criminals.
“We understand the human much better than other humans understand each other,” said Faception chief executive Shai Gilboa. “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.”
Faception has built 15 different classifiers, which Gilboa said can identify certain traits with 80 percent accuracy. The start-up is pushing forward, seeing tremendous power in a machine’s ability to analyze images.
Yet experts caution there are ethical questions and profound limits to the effectiveness of technology such as this.
“Can I predict that you’re an ax murderer by looking at your face and therefore should I arrest you?” said Pedro Domingos, a professor of computer science at the University of Washington and author of “The Master Algorithm.” “You can see how this would be controversial.”
Gilboa said he also serves as the company’s chief ethics officer and will never make his classifiers that predict negative traits available to the general public.
The danger lies in the computer system’s imperfections. Because of that, Gilboa envisions governments considering his findings along with other sources to better identify terrorists. Even so, the use of the data is troubling to some.
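The base-rate problem is what makes a figure like "80 percent accuracy" so treacherous when the trait being screened for is rare. A quick calculation, using entirely hypothetical numbers rather than anything Faception has published, shows how a mostly-accurate classifier still buries real hits in false alarms:

```python
# Hypothetical numbers for illustration only: suppose a classifier has an
# 80% true positive rate and a 20% false positive rate, and that 1 person
# in 100,000 screened actually has the rare trait.
prevalence = 1 / 100_000
tpr, fpr = 0.80, 0.20

population = 1_000_000  # people screened

true_hits  = population * prevalence * tpr        # people correctly flagged
false_hits = population * (1 - prevalence) * fpr  # innocents flagged anyway

# Probability that a flagged person actually has the trait (Bayes' rule).
precision = true_hits / (true_hits + false_hits)
print(f"{false_hits:,.0f} false alarms for every {true_hits:.0f} real hits")
print(f"P(trait | flagged) = {precision:.5f}")
```

Under these assumptions, nearly everyone the system flags is innocent, which is why treating such a score as one signal among many, as Gilboa suggests, is the only defensible use.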
“The evidence that there is accuracy in these judgments is extremely weak,” said Alexander Todorov, a Princeton psychology professor whose research includes facial perception. “Just when we thought that physiognomy ended 100 years ago. Oh, well.”
Faception recently showed off its technology at a poker tournament organized by a start-up that shares investors with Faception. Gilboa said Faception predicted before the tournament that four of the 50 amateur players would be the best. When the dust settled, two of those four were among the event’s three finalists. To make its prediction, Faception analyzed photos of the 50 players against its database of professional poker players.
There are challenges in trying to use artificial intelligence systems to draw conclusions such as this. A computer that is trained to analyze images will only be as good as the examples it is trained on. If the computer is exposed to a narrow or outdated sample of data, its conclusions will be skewed. Additionally, there’s the risk the system will make an accurate prediction, but not necessarily for the right reasons.
Domingos, the University of Washington professor, shared the example of a colleague who trained a computer system to tell the difference between dogs and wolves. Tests proved the system was almost 100 percent accurate. But it turned out the computer was successful because it learned to look for snow in the background of the photos. All of the wolf photos were taken in the snow, whereas the dog pictures weren’t.
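Shortcut learning of this kind is easy to reproduce. The toy sketch below uses invented features, not the actual dog/wolf study: a one-feature classifier is trained on data where a spurious background cue ("snow") perfectly tracks the label, then evaluated on data where that cue is reversed.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

def make_data(labels, spurious_follows_label):
    # Feature 0: a weak "real" cue (think snout shape), only mildly informative.
    # Feature 1: a spurious cue (snow in the background) that perfectly tracks
    # the label in the training photos but not in the real world.
    real = labels + rng.normal(0, 2.0, size=labels.shape)
    spurious = labels if spurious_follows_label else 1 - labels
    return np.column_stack([real, spurious.astype(float)])

train_y = rng.integers(0, 2, n)  # 1 = wolf, 0 = dog
train_X = make_data(train_y, spurious_follows_label=True)

def fit_best_feature(X, y):
    """Pick the single feature whose mean-threshold rule best fits the data."""
    best = None
    for j in range(X.shape[1]):
        thresh = X[:, j].mean()
        acc = np.mean((X[:, j] > thresh) == y)
        if best is None or acc > best[2]:
            best = (j, thresh, acc)
    return best

feature, thresh, train_acc = fit_best_feature(train_X, train_y)

# Test set: wolves photographed WITHOUT snow (spurious cue reversed).
test_y = rng.integers(0, 2, n)
test_X = make_data(test_y, spurious_follows_label=False)
test_acc = np.mean((test_X[:, feature] > thresh) == test_y)
```

The learner scores perfectly in training by latching onto the snow feature and ignoring the weak real cue, then fails completely once the wolves appear without snow.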
Also, an artificial intelligence system might zero in on a trait that could be changed by a person — such as the presence of a beard — limiting its ability to make an accurate prediction.
“If somebody came to me and said ‘I have a company that’s going to try to do this,’ my answer to them would be ‘nah, go do something more promising,’ ” Domingos said. “But on the other hand, machine learning brings us lots of surprises every day.”
https://www.washingtonpost.com/news/inn ... ing-faces/
The NSA is interested in collecting information from pacemakers and other biomedical devices for national security purposes, according to The Intercept. Richard Ledgett, the agency's deputy director, reportedly said at a conference yesterday: "We’re looking at it sort of theoretically from a research point of view right now."
elfismiles » 20 Jun 2016 15:05 wrote:
FBI says surveillance cam locations must be kept secret
"Disclosure of even minor details about them may cause jeopardy," bureau says.
by David Kravets - Jun 15, 2016 1:01pm CDT
http://arstechnica.com/tech-policy/2016 ... pt-secret/
You are under surveillance right now.
Your cell phone provider tracks your location and knows who’s with you. Your online and in-store purchasing patterns are recorded, and reveal if you're unemployed, sick, or pregnant. Your e-mails and texts expose your intimate and casual friends. Google knows what you’re thinking because it saves your private searches. Facebook can determine your sexual orientation without you ever mentioning it.
The powers that surveil us do more than simply store this information. Corporations use surveillance to manipulate not only the news articles and advertisements we each see, but also the prices we’re offered. Governments use surveillance to discriminate, censor, chill free speech, and put people in danger worldwide. And both sides share this information with each other or, even worse, lose it to cybercriminals in huge data breaches.
Much of this is voluntary: we cooperate with corporate surveillance because it promises us convenience, and we submit to government surveillance because it promises us protection. The result is a mass surveillance society of our own making. But have we given up more than we’ve gained? In Data and Goliath, security expert Bruce Schneier offers another path, one that values both security and privacy. He shows us exactly what we can do to reform our government surveillance programs and shake up surveillance-based business models, while also providing tips for you to protect your privacy every day. You'll never look at your phone, your computer, your credit cards, or even your car in the same way again.
This Company Has Built a Profile on Every American Adult
Every move you make. Every click you take. Every game you play. Every place you stay. They’ll be watching you.
Forget telephoto lenses and fake mustaches: The most important tools for America’s 35,000 private investigators are database subscription services. For more than a decade, professional snoops have been able to search troves of public and nonpublic records—known addresses, DMV records, photographs of a person’s car—and condense them into comprehensive reports costing as little as $10. Now they can combine that information with the kinds of things marketers know about you, such as which politicians you donate to, what you spend on groceries, and whether it’s weird that you ate in last night, to create a portrait of your life and predict your behavior.
IDI, a year-old company in the so-called data-fusion business, is the first to centralize and weaponize all that information for its customers. The Boca Raton, Fla., company’s database service, idiCORE, combines public records with purchasing, demographic, and behavioral data. Chief Executive Officer Derek Dubner says the system isn’t waiting for requests from clients—it’s already built a profile on every American adult, including young people who wouldn’t be swept up in conventional databases, which only index transactions. “We have data on that 21-year-old who’s living at home with mom and dad,” he says.
Dubner declined to provide a demo of idiCORE or furnish the company’s report on me. But he says these personal profiles include all known addresses, phone numbers, and e-mail addresses; every piece of property ever bought or sold, plus related mortgages; past and present vehicles owned; criminal citations, from speeding tickets on up; voter registration; hunting permits; and names and phone numbers of neighbors. The reports also include photos of cars taken by private companies using automated license plate readers—billions of snapshots tagged with GPS coordinates and time stamps to help PIs surveil people or bust alibis.
IDI also runs two coupon websites, allamericansavings.com and samplesandsavings.com, that collect purchasing and behavioral data. When I signed up for the latter, I was asked for my e-mail address, birthday, and home address, information that could easily link me with my idiCORE profile. The site also asked if I suffered from arthritis, asthma, diabetes, or depression, ostensibly to help tailor its discounts.