DrEvil » 02 Feb 2023 22:38 wrote:So... toll roads that don't apply to locals?
So you love the idea of cameras tracking all vehicles at all times to issue automatic fines to the unauthorized vehicles?
Seriously?
Key facts
The United Nations General Assembly has set an ambitious target of halving the global number of deaths and injuries from road traffic crashes by 2030 (A/RES/74/299).
Road traffic injuries are the leading cause of death for children and young adults aged 5-29 years.
Approximately 1.3 million people die each year as a result of road traffic crashes.
More than half of all road traffic deaths are among vulnerable road users: pedestrians, cyclists, and motorcyclists.
93% of the world's fatalities on the roads occur in low- and middle-income countries, even though these countries have approximately 60% of the world's vehicles.
Road traffic crashes cost most countries 3% of their gross domestic product.
Study links ambient PM2.5 and ozone specifically caused by vehicle exhaust emissions to ~361,000 premature deaths worldwide in 2010 and ~385,000 in 2015
On-road diesel vehicles were responsible for nearly half of the health impacts of air pollution from vehicles worldwide in 2015, and two-thirds of impacts in India, France, Germany, and Italy
The global cost of these transportation-attributable health impacts in 2010 and 2015 was approximately US$1 trillion
Exhaust from vehicles is a major source of outdoor air pollution worldwide. The health impacts are immense but unevenly distributed, both geographically and among various segments of the transportation sector, such as light-duty and heavy-duty vehicles, shipping, and off-road machinery.
A new study provides the most detailed picture available to date of the global, regional, and local health impacts attributable to emissions from four transportation subsectors: on-road diesel vehicles, other on-road vehicles, shipping, and non-road mobile engines such as agricultural and construction equipment. The study, by researchers from the International Council on Clean Transportation, George Washington University Milken Institute School of Public Health, and the University of Colorado Boulder, links state-of-the-art vehicle emissions, air pollution, and epidemiological models to estimate health impacts at the global, regional, national, and local levels in 2010 and 2015.
One 20th century mistake made cities horrible, congested, lonely places to live
Larson, an architect by training who now works on futuristic plans to adapt urban centers for the 21st century, said that the overriding mistake of last century was building cities around cars.
"Cars kill innovation," he said. "They lower density, they lead to traffic congestion and parking problems, and waste land for storing cars 24 hours a day."
Up until about 1880, he said, cities functioned more like networked villages. A person likely worked, relaxed, and shopped all within about 20 minutes of their home. Cities of that era of course had plenty of their own challenges, but they functioned as cohesive units. Residents were more likely to interact and function together.
But the modern era, and the introduction of the car, changed that.
"The early modernists saw the future as defined by separate functions: housing, commercial, industry," he said. "So people had these quiet, high quality residential areas, in large part in reaction to tenement slums and the awful living conditions that were quite often found in cities."
Many wealthy, white people moved out to the suburbs and commuted to work in city centers on major highways. Those highways divided up the neighborhoods of those who remained in cities, isolating and ghettoizing many poor and minority populations.
"By the 1950s we were redesigning cities to privilege the needs of machines over the needs of humans," Larson said.
[Graphic: what cities could look like in 100 years. Skye Gould/Samantha Lee/Business Insider]
The result was urban spaces that were more and more crowded but less dense. If you're in a city right now, the space around you is likely a mix of buildings and areas for people to walk and relax, all chopped up by wide streets and parking spaces. All that room given over to cars takes away from and interrupts the spaces human beings use to live and move around. It reduces the number of people who can fit in a square mile, while making that same square mile feel more crowded and uncomfortable. (Cut that space for cars down drastically, and you're left with a much more human city.)
This history won't shock anyone who's paid attention to the history of cities over the course of the last 100-plus years. But it's a key element of the theory that informs trends in urban planning that people like Larson hope will define the 21st century.
A return to the neighborhood-as-village model would see more people packed, hopefully thoughtfully, into cities themselves, but streets and parking areas given over to communal spaces and modes of transport that don't involve hauling tons of steel around on four wheels.
The most vital cities, he said, will offer more personalized, shared transportation options and walkable spaces, and privilege the needs of urbanites over suburbanites. You can already see it happening: Bike shares, roads like Broadway in New York turned over to communal spaces, European cities banning private cars from downtown areas. But cities still have a long way to go.
A number of major cities have joined the "car-free" movement, which aims to reduce air pollution and improve safety among residents.
Most cities that are starting to ban cars are located in Europe, though a few others, such as New York, are making considerable strides.
In addition to implementing outright bans, cities have enacted measures to encourage cycling and make public spaces more pedestrian-friendly.
As small cities successfully implement plans to ditch their vehicles, many large urban areas are determined to follow in their footsteps.
The idea of a car-free city is not without its challenges. Though bikes and public transit are widely available in most cities, cars remain a preferred method of transportation for many urban commuters.
Studies have shown that it's notoriously difficult to change a driver's commuting habits, even when free public transit is involved.
The alternative is high levels of car pollution, which contributes to around 20% of the world's carbon dioxide emissions. An Oxford study found that around 10,000 people die prematurely in Europe each year due to pollution from diesel cars alone.
stickdog99 » Fri Feb 03, 2023 10:58 am wrote:DrEvil » 02 Feb 2023 22:38 wrote:So... toll roads that don't apply to locals?
So you love the idea of cameras tracking all vehicles at all times to issue automatic fines to the unauthorized vehicles?
Seriously?
Gnomad » 04 Feb 2023 05:39 wrote:I don't love them in the slightest, that's not what I am saying. I was just pointing out that getting all worked up about this one instance of trying to reduce congestion in the city is misguided at best.
I don't think we should have any tracking of people at all. And measures like this are not used at all in my country. When congestion tolls were discussed here, vehicle tracking was pretty vehemently opposed.
If a congestion payment system is needed, it should be done without necessitating tracking at the individual level. But in the UK, they already have a huge license plate tracking system and nobody bats an eye. They already did the whole mass-surveillance thing. https://www.police.uk/advice/advice-and ... tion-anpr/
This system includes cameras all along the major roads, as well as cameras mounted on every single police car.
Automatic Number Plate Recognition (ANPR)
We use ANPR (Automatic Number Plate Recognition) technology to help detect, deter and disrupt criminal activity at a local, force, regional and national level. This includes travelling criminals (those using the road network to avoid being caught), organised crime groups and terrorists.
ANPR provides lines of enquiry and evidence in the investigation of crime and is used by forces throughout England, Wales, Scotland and Northern Ireland.
How it works
As a vehicle passes an ANPR camera, its registration number is read and instantly checked against database records of vehicles of interest.
Police officers can stop a vehicle, speak to the occupants and, where necessary, make arrests.
ANPR has proved to be important in the detection of many offences, including locating, for example, people wanted for arrest or missing, witnesses, stolen vehicles, uninsured vehicles and uncovering cases of major crime.
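To make the mechanics concrete, here is a rough Python sketch of that read-and-check step. It is purely illustrative, not the actual national system; the plates, hotlist entries, and field names below are invented.

```python
# Purely illustrative sketch of the ANPR read-and-check step: a camera read is
# a plate string plus time and place, and the "check" is a lookup against a
# hotlist of vehicles of interest. Plates and reasons below are invented.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

HOTLIST = {
    "AB12CDE": "reported stolen",
    "XY98ZZZ": "no insurance on record",
}

@dataclass
class PlateRead:
    plate: str
    camera_id: str
    timestamp: datetime

def check_read(read: PlateRead) -> Optional[str]:
    """Return the reason the plate is flagged, or None if it is not of interest."""
    return HOTLIST.get(read.plate)

read = PlateRead("AB12CDE", "camera-041", datetime.now())
reason = check_read(read)
if reason:
    print(f"ALERT {read.plate} at {read.camera_id} ({read.timestamp:%H:%M}): {reason}")
```

The matching itself is trivial; the contentious part is what goes on the hotlist and, as the next section notes, what happens to all the reads that don't match.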
How data is stored
A record for all vehicles passing by a camera is stored, including those for vehicles that are not known to be of interest at the time of the read.
At present, ANPR cameras nationally submit on average around 60 million ANPR ‘read’ records to national ANPR systems daily.
ANPR data from each police force is stored together with similar data from other forces for one year.
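Back-of-the-envelope, those two figures imply a very large store. A quick sketch of the arithmetic, with the per-record size being my own guess:

```python
# Rough scale of the national ANPR store implied by the published figures
# (60 million reads per day, retained for one year). Record size is a guess.
reads_per_day = 60_000_000
retention_days = 365

records_on_file = reads_per_day * retention_days
print(f"{records_on_file:,} records retained at any one time")   # ~21.9 billion

bytes_per_record = 100   # assumed: plate, timestamp, camera ID, flags
print(f"~{records_on_file * bytes_per_record / 1e12:.1f} TB of raw reads")
```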
Be mad at THAT, not at a congestion reduction measure.
Do you really think they need those few cameras, when they already have a bloody nation-wide 24/7 plate surveillance system, and have had that for years? 60 million records a day, every day, every month, 12 months a year. That net catches pretty much everyone in the more populated areas of the UK. And I bet it is not just the police that have access, it is any and all intelligence services, and probably every influential rich person or politician with a few contacts.
Not to mention what can be freely bought on the market, data-wise, from every single smartphone user's habits (and often, location data as well):
https://www.inverse.com/innovation/tech ... lance-apps...
However, private data brokers also track this kind of data and help surveil citizens — without a warrant. There is a large market for personal data, compiled from information people volunteer, information people unwittingly yield — for example, via mobile apps — and information that is stolen in data breaches. Among the customers for this largely unregulated data are federal, state, and local law enforcement agencies.
Whether or not you pass under the gaze of a surveillance camera or license plate reader, you are tracked by your mobile phone. GPS tells weather apps or maps your location, Wi-Fi uses your location, and cell-tower triangulation tracks your phone. Bluetooth can identify and track your smartphone, and not just for contact tracing, Apple’s “Find My” service, or to connect headphones.
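As a rough illustration of how cell-tower triangulation works in principle (not any carrier's actual method; the tower coordinates and distance estimates below are invented), three towers and three distance estimates are enough to pin a handset to a point:

```python
# Toy trilateration: estimate a phone's position from three towers and rough
# distance estimates (which in practice come from signal timing or strength).
# All coordinates and distances here are invented for illustration.

def trilaterate(towers, dists):
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = dists
    # Subtracting the circle equations pairwise gives a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

towers = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]   # tower positions (km)
dists = [2.236, 2.236, 2.828]                    # estimated distances (km)
print(trilaterate(towers, dists))                # ≈ (2.0, 1.0)
```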
People volunteer their locations for ride-sharing or for games like Pokemon Go or Ingress, but apps can also collect and share locations without your knowledge. Many late-model cars feature telematics that tracks locations — for example, OnStar or Bluelink. All this makes opting out impractical.
The same thing is true online. Most websites feature ad trackers and third-party cookies, which are stored in your browser whenever you visit a site. They identify you when you visit other sites so advertisers can follow you around. Some websites also use keylogging, which monitors what you type into a page before hitting submit. Similarly, session recording monitors mouse movements, clicks, scrolling, and typing, even if you don’t click “submit.”
Ad trackers know when you browsed where, which browser you used, and what your device’s internet address is. Google and Facebook are among the main beneficiaries, but there are many data brokers slicing and dicing such information by religion, ethnicity, political affiliations, social media profiles, income, and medical history for profit.
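A toy sketch of how a third-party tracking cookie ties visits together across unrelated sites (the sites, cookie IDs, and pages here are all made up):

```python
# Toy third-party tracker: the same tracker cookie shows up on every site that
# embeds the tracker, so visits can be joined into one cross-site profile.
from collections import defaultdict

# (tracker_cookie_id, site, page) -- what the tracker sees from its embeds
hits = [
    ("cookie-7f3a", "news.example",   "/politics/article-123"),
    ("cookie-7f3a", "shop.example",   "/cart/running-shoes"),
    ("cookie-7f3a", "health.example", "/symptoms/insomnia"),
    ("cookie-91bb", "news.example",   "/sports"),
]

profiles = defaultdict(list)
for cookie_id, site, page in hits:
    profiles[cookie_id].append((site, page))

for cookie_id, visits in profiles.items():
    print(cookie_id, "->", visits)
```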
...
Nobody expects to be invisible on the streets, at borders, or in shopping centers. But who has access to all that surveillance data, and how long is it stored? There is no single U.S. privacy law at the federal level, and states cope with a regulatory patchwork; only five states — California, Colorado, Connecticut, Utah, and Virginia — have privacy laws.
It is possible to limit location tracking on your phone, but not to avoid it completely. Data brokers are supposed to mask your personally identifiable data before selling it. But this “anonymization” is meaningless since individuals are easily identified by cross-referencing additional data sets. This makes it easy for bounty hunters and stalkers to abuse the system.
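Here is a minimal sketch of why that "anonymization" fails: a device's overnight and daytime locations form a pair that is often unique, and a second data set can put a name to it. All records below are invented.

```python
# Toy re-identification: an "anonymized" location trace still contains a
# home/work location pair, which can be matched against another data set
# (here, an invented voter/property file) to recover a name.

anonymized_trace = {"device": "id-4821", "night_area": "grid-55", "day_area": "grid-12"}

# Hypothetical auxiliary data set linking names to home and work areas.
auxiliary = [
    {"name": "A. Smith", "home": "grid-55", "work": "grid-12"},
    {"name": "B. Jones", "home": "grid-55", "work": "grid-40"},
]

matches = [p["name"] for p in auxiliary
           if p["home"] == anonymized_trace["night_area"]
           and p["work"] == anonymized_trace["day_area"]]
print(matches)   # ['A. Smith'] -- the "anonymous" device now has a name
```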
The biggest risk to most people arises when there is a data breach, which is happening more often — whether it is a leaky app or careless hotel chain, a DMV data sale, a compromised credit bureau, or indeed a data brokering middleman whose cloud storage is hacked.
https://www.techdirt.com/2019/11/26/cal ... -dmv-data/
https://epic.org/issues/consumer-privacy/data-brokers/
Thousands of data brokers in the United States buy, aggregate, disclose, and sell billions of data elements on Americans with virtually no oversight. As the data broker industry proliferates, companies have enormous financial incentives to collect consumers’ personal data, while data brokers have little financial incentive to protect consumer data. For these companies, consumers are the product, not the customer. Companies also maintain information about consumers that is often inaccurate, wrongfully denying them credit, housing, or even a job.
Data brokers collect and aggregate many types of personal information: names, addresses, telephone numbers, e-mail addresses, gender, age, marital status, children, education, profession, income, political preferences, and cars and real estate owned. Data brokers also collect information on an individual’s purchases, where they shop, and how they pay for their purchases. In addition, data brokers collect health information, the sites we visit online, and the advertisements we click on. And thanks to the proliferation of smartphones and wearables, data brokers collect and sell real-time location data.
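A toy sketch of the aggregation step: fragments from different sources keyed on the same identifier merge into one profile (all records below are invented):

```python
# Toy broker-style aggregation: records from different sources that share an
# email address are merged into a single profile. Everything here is invented.
from collections import defaultdict

sources = {
    "retail_loyalty": [{"email": "jane@example.com", "purchases": ["baby formula"]}],
    "public_records": [{"email": "jane@example.com", "home_value": 410_000}],
    "app_sdk":        [{"email": "jane@example.com", "last_location": (40.71, -74.00)}],
}

profiles = defaultdict(dict)
for source, records in sources.items():
    for rec in records:
        key = rec["email"]
        profiles[key].update({k: v for k, v in rec.items() if k != "email"})
        profiles[key].setdefault("sources", []).append(source)

print(profiles["jane@example.com"])
```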
The lack of a comprehensive baseline U.S. privacy law has allowed the data broker industry to build profiles on millions of Americans at great cost to our privacy, civil rights, national security, and democracy. Congress must pass comprehensive privacy legislation and create a U.S. Data Protection Agency to regulate the out-of-control data broker industry.
The Data Broker Industry
Data brokers use secret algorithms to build profiles on every American citizen, regardless of whether the individual even knows that the data broker exists. As such, consumers now face the specter of a “scored society” where they do not have access to the most basic information on how they are evaluated. The data broker industry’s secret algorithms can be used to determine the interest rates on mortgages and credit cards, raise consumers’ interest rates, or deny people jobs. In one instance, a consumer found that his credit score suffered a forty-point hit simply because he requested accurate information about his mortgage. Data brokers even scrape social media and score consumers based on factors such as their political activity on Twitter.
The use of algorithms can also have widespread discriminatory effects. The Equal Credit Opportunity Act (ECOA) prohibits lenders from discriminating in credit decisions. Still, studies have demonstrated that Black and Latino communities have lower credit scores as a group than whites. Current law does not allow consumers or regulators to evaluate these scores to determine whether they violate ECOA. Although consumers have the right to request their credit scores, they do not have the right to know how this score is determined.
https://www.eff.org/deeplinks/2022/06/h ... ation-data
Over the past few years, data brokers and federal military, intelligence, and law enforcement agencies have formed a vast, secretive partnership to surveil the movements of millions of people. Many of the mobile apps on our cell phones track our movements with great precision and frequency. Data brokers harvest our location data from the app developers, and then sell it to these agencies. Once in government hands, the data is used by the military to spy on people overseas, by ICE to monitor people in and around the U.S., and by criminal investigators like the FBI and Secret Service. This post will draw on recent research and reporting to explain how this surveillance partnership works, why it is alarming, and what we can do about it.
Where does the data come from?
Weather apps, navigation apps, coupon apps, and “family safety” apps often request location access in order to enable key features. But once an app has location access, it typically has free rein to share that access with just about anyone.
That’s where the location data broker industry comes in.
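A sketch of that data flow under stated assumptions: the app, the SDK, and the collection endpoint below are all hypothetical, but the shape of the flow is what the reporting describes.

```python
# Hypothetical sketch: the app legitimately asks for location to show weather,
# but an embedded third-party SDK forwards the same fix to a data broker.
# Names and the endpoint are invented.
import json, time

def get_device_location():
    # Stand-in for the platform location API.
    return {"lat": 59.91, "lon": 10.75, "accuracy_m": 12}

def broker_sdk_report(device_ad_id, fix):
    # A real SDK would POST this to the broker's collection endpoint.
    payload = {"ad_id": device_ad_id, "ts": int(time.time()), **fix}
    print("would upload to https://collector.broker.invalid:", json.dumps(payload))

fix = get_device_location()
print(f"Weather near {fix['lat']:.2f},{fix['lon']:.2f}: cloudy")  # the feature the user asked for
broker_sdk_report("ad-id-001122", fix)                            # the side channel the user never sees
```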
.....
https://www.hrw.org/news/2020/05/13/mob ... ovid-19-qa
Governments and the private sector are increasingly relying on data-driven technologies to help contain the novel coronavirus, Covid-19. While some see technological solutions as a critical tool for contact tracing, quarantine enforcement, tracking the spread of the virus, and allocating medical resources, these practices raise significant human rights concerns. Human Rights Watch is particularly concerned about proposals for the use of mobile location data in the Covid-19 response because the data usually contains sensitive and revealing insights about people’s identity, location, behavior, associations, and activities.
Mobile location data programs to combat Covid-19 may not be scientifically necessary and could lead to human rights abuses if they are not equipped with effective safeguards to protect privacy. The long history of emergency measures, such as surveillance measures put in place to counter terrorism, shows that they often go too far, fail to have their desired effect, and, once approved, often outlast their justification.
This Q&A explains the different ways that governments are using mobile location data to respond to Covid-19, the human rights concerns associated with these measures, and human rights standards that should be applied when using such data. It includes illustrative cases, recommendations, and guidelines to help evaluate the human rights risks posed by the use of mobile location data.
Contact Tracing Using Data Provided by Telecommunications Providers
Governments are accessing data from Telcos in contact tracing efforts. In Israel, an emergency regulation approved by the government on March 17 authorized Shin Bet, Israel’s internal security service, to receive, collect, and process “technological data,” including location data, from Telcos without user consent to predict which citizens have been exposed to the virus. Under the program, the health ministry sends alerts to people’s phones ordering them to self-quarantine. The cabinet circumvented the parliament in approving the emergency regulation. Israel’s supreme court later ruled that the government needed to pass a law authorizing such surveillance that “fulfills the principles of privacy protection” or else it would be halted. The health ministry on March 23 also released a voluntary app, ostensibly to back up Shin Bet efforts, to inform people if they have come in contact with an infected person.
In Armenia, the parliament on March 31 passed amendments giving the authorities very broad surveillance powers, which require Telcos to hand over the phone records for all of their customers, including phone numbers and the location, time, and date of their calls and text messages. The authorities can use that data to identify individuals who are infected and should be isolated or close contacts who should self-quarantine, or to monitor individuals in isolation or quarantine.
In Russia, the prime minister on March 20 ordered the communications ministry to design a national system to track people who have been in contact with coronavirus patients, using location data provided by individuals’ mobile phone provider. On April 1, the communications ministry confirmed it had designed the system. The communications ministry has demanded that regional authorities provide lists of mobile phone numbers of people infected with coronavirus, as well as the phone numbers of citizens who are quarantined at home either because they had traveled abroad or had contact with infected people.
In Ecuador, on March 16, the president issued an emergency decree authorizing the government to use data from satellite and mobile telephone platforms to monitor people who tested positive for the virus, those who have been in close contact with someone who tested positive, those who have symptoms, and those subjected to mandatory isolation for having entered the country from abroad.
Bluetooth Contact Tracing
In Singapore, the government on March 20 launched TraceTogether, a Bluetooth-based contact tracing app, to supplement its human contact tracing efforts. When a person is contacted, they are required by law to assist the health ministry in accurately mapping out their movements and interactions to minimize the risk of widespread infection. Data logs are stored on phones in encrypted form, using “cryptographically generated temporary IDs”. However, when a TraceTogether user is a confirmed Covid-19 case and agrees to upload the data log in the app to the health ministry, the health ministry will decrypt the temporary IDs in the user’s app and obtain a list of phone numbers from the uploaded data log.
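A generic sketch of how "cryptographically generated temporary IDs" can work: this is not the actual BlueTrace/TraceTogether protocol, just one way to get IDs that rotate, look random to bystanders, and can still be matched later by whoever holds the key.

```python
# Generic rotating temporary IDs for Bluetooth contact tracing (illustrative,
# not the real TraceTogether design): a secret key plus the current time window
# yields an ID that means nothing to other phones, but the key holder can
# regenerate it later and match it to a user.
import hmac, hashlib, time

SECRET_KEY = b"held-by-the-health-authority"   # hypothetical

def temp_id(user_id: str, window: int) -> str:
    msg = f"{user_id}:{window}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

window = int(time.time() // 900)        # rotate every 15 minutes
broadcast = temp_id("user-42", window)  # what the phone advertises over BLE
print("advertising:", broadcast)

# Later, the authority (holding the key) can check whether a logged ID
# belonged to a given user in a given time window:
print(temp_id("user-42", window) == broadcast)   # True
```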
The European Commission on April 8 adopted a recommendation to pursue a pan-European coordinated approach for the use of mobile applications for contact tracing, among other purposes. The common approach will be guided by privacy and data protection principles, including data minimization and appropriate safeguards such as pseudonymization, aggregation, encryption, and decentralization. It will also be voluntary, with a preference for Bluetooth-based proximity tracing. Further guidance is due to be adopted on the data protection and privacy implications of the use of mobile applications. The European Parliament on April 17 adopted a resolution reinforcing the commission’s recommendation, demanding full transparency so that people can verify the underlying protocol for security and privacy of such apps. In the meantime, a number of European Union countries, including France, Germany, and the Netherlands, are in the process of selecting contact tracing apps.
In Norway, the National Institute of Public Health on April 16 launched a voluntary, self-reporting app that will monitor users’ movements and then ask people to go into quarantine if they have been exposed to someone who tested positive for the coronavirus. When a user is confirmed as having coronavirus, the app will then retrieve their location data and send a text message to every other user who has been within 2 meters of that person for more than 15 minutes, instructing them to go into quarantine.
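The exposure rule itself is simple to state in code. A rough sketch applied to an invented proximity log, not the Norwegian app's actual implementation:

```python
# Rough sketch of the "within 2 meters for more than 15 minutes" rule, applied
# to an invented proximity log. Not the real app's server-side logic.

# (other_user, distance_m, duration_min) for one day of encounters
encounters = [
    ("user-a", 1.5, 22),
    ("user-b", 1.0, 5),
    ("user-c", 3.5, 40),
]

MAX_DIST_M = 2.0
MIN_DURATION_MIN = 15

exposed = [u for u, dist, mins in encounters
           if dist <= MAX_DIST_M and mins > MIN_DURATION_MIN]
print("notify to quarantine:", exposed)   # ['user-a']
```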
Mobile Apps to Enforce Quarantine and Social Distancing Orders
Authorities in cities and provinces across China are using the app Health Code, which was developed by private companies, to make decisions about whom to quarantine and for how long. The app assigns each of its approximately 700 million users one of 3 colors: green enables unrestricted movement, yellow requires 7 days of quarantine, and red requires 14 days of quarantine. To enter buildings, go to the supermarket, use public transport, and move around their neighborhood, people must scan a QR code at a manned checkpoint. However, the rules behind color assignments are secretive, making it difficult for individuals to understand why they were assigned a particular color, or what circumstances might trigger a change of color. The app also collects users’ location data and shares it with the police. Users have complained that the app’s decisions are arbitrary and difficult to appeal; some of them have been confined to their homes for indefinite periods even after serving the quarantine period mandated by the app.
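A toy illustration of the checkpoint flow only, not the real Health Code logic, whose scoring rules are not public; here the gate decision is purely the assigned colour, and the user IDs are invented.

```python
# Toy checkpoint scan: look up the user's assigned colour and gate on it.
# The actual colour-assignment rules are secret; this only shows the flow.
QUARANTINE_DAYS = {"green": 0, "yellow": 7, "red": 14}

# Hypothetical server-side state: user/QR id -> currently assigned colour.
assigned = {"user-001": "green", "user-002": "yellow", "user-003": "red"}

def scan_at_checkpoint(user_id: str) -> bool:
    colour = assigned.get(user_id, "red")   # unknown users treated as red here
    if colour == "green":
        print(f"{user_id}: green -> entry allowed")
        return True
    print(f"{user_id}: {colour} -> entry denied, {QUARANTINE_DAYS[colour]} days of quarantine")
    return False

for uid in ("user-001", "user-002", "user-003"):
    scan_at_checkpoint(uid)
```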
In Turkey, the health minister declared on April 7 that it is mandatory for people infected with Covid-19 to download an app called “Life fits inside the house” as part of the “Pandemic Isolation Tracking Project.” The app follows the movement of people instructed to self-isolate, and if they leave their homes, they receive a warning via SMS and are contacted instantly through automatic call technology and told to return to isolation. Under the program, those who fail to comply with the warning and continue to violate the quarantine are reported to relevant law enforcement and face administrative measures and sanctions, which can include jail time ranging from two months to a year in accordance with Article 195 of the Turkish Penal Code. Human Rights Watch has not yet investigated how widespread the use of the app is in practice and whether the Turkish authorities have made efforts to enforce its use.
In Moscow, the city government in April launched an app to track the movement of coronavirus patients. The app is mandatory for all patients who have been ordered to stay at home. It requests access to the user’s calls, location, camera, storage, network information, sensors, and other data to ensure people do not leave their home while contagious. This app is in addition to the installation of one of the world’s biggest surveillance camera systems equipped with facial recognition technology to ensure that everyone placed under self-quarantine stays off the streets. On April 15, Moscow also introduced a digital permit system for non-essential travel, both on public transport and private vehicles.
Big Data Analytics
In the EU, eight major Telcos have agreed to share anonymized metadata with the European Commission for modelling and predicting the propagation of the coronavirus. An official from the commission said the data will be aggregated and anonymized and that the commission will delete it when the pandemic is over. Still, the European Data Protection Supervisor warned about the possibility of such measures becoming permanent.
In the US, mobile advertising companies, which gather the location data of mobile and internet users to target and sell ads, are reportedly supplying analyses of people’s locations and movements to the CDC and certain state and local governments. In the context of Covid-19, this data sharing arrangement is apparently designed to help the authorities better understand how infections spread and refine public health responses. Much of this arrangement, including how data is collected, shared, anonymized, and analysed, is unknown. It has also been reported that the federal government is building a national coronavirus surveillance system to monitor and forecast rates of infection and hospitalization across the country. It is unclear whether this project is linked to the CDC’s partnership with the mobile advertising industry.
In South Korea, in addition to using cell phone location data, CCTV cameras, and tracking of debit, ATM, and credit cards to identify people infected with coronavirus, the authorities created a publicly available map using aggregate data of infected individuals to allow other people to check whether they may have crossed paths with someone infected with the virus. The platform was officially launched on March 26. Health authorities also send out cell phone notifications containing very detailed information on confirmed cases, including the age, gender, and daily routes infected people took 48 hours before being quarantined. The purpose of the disclosures is to enable potential untraceable contacts (for example, strangers who were in the same restaurant as the confirmed case at the same time) to recognize and prepare for possible infection.
In Ecuador, the president on April 6 announced the SOS Covid tool, which works with information obtained from the emergency service, the ministry of telecommunications, the ministry of health, mobile-service providers, and the Salud EC App (see below) to monitor whether the quarantine is being observed, detect cases, carry out massive tests, and identify areas of risk due to crowding.
Your license plate is the least of your worries.
And yeah, I don't own a "smart"phone at all.
DrEvil » Mon Feb 13, 2023 12:22 am wrote:I like the solution people came up with in Vernor Vinge's Rainbows End: instead of trying to delete data, do the opposite and create insane amounts of false data. If someone tries tracking you, there's five hundred completely different versions of you running around.
https://www.theregister.com/2008/05/16/antiphormlite/
Coding activists have developed an application designed to confound Phorm's controversial behaviour-tracking software by simulating random web-browsing.
The folks behind AntiPhormLite say this means actual browsing habits are buried in noise. The app, which is available free of charge, is designed to poison the anonymised click stream Phorm collects with meaningless junk, thereby (at least in theory) undermining its business model.
Its developers reckon the chaff AntiPhormLite generates would be indistinguishable from genuine surfing. AntiPhormLite works with any browser a user cares to use and includes customised options so that each installation can be configured differently, making countermeasures Phorm might apply more difficult to develop...
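The chaff idea is easy to sketch. The domains and pacing below are placeholders; a real tool like AntiPhormLite would also randomize timing, depth, and topics, and would actually fetch the pages through a normal browser profile.

```python
# Sketch of browsing chaff: generate plausible-looking random page requests so
# real browsing is buried in noise. Domains and pages are placeholders; this
# only prints what would be requested rather than fetching anything.
import random, time

DECOY_SITES = ["news.example", "recipes.example", "sports.example",
               "travel.example", "gardening.example"]
PAGES = ["/", "/latest", "/popular", "/about", "/search?q=weather"]

def decoy_request():
    site = random.choice(DECOY_SITES)
    page = random.choice(PAGES)
    print(f"decoy fetch: https://{site}{page}")

for _ in range(5):
    decoy_request()
    time.sleep(random.uniform(0.1, 0.5))   # human-ish pacing (shortened here)
```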