Privacy: The Big Five II
The film Anon with Clive Owen shows a perhaps not-so-futuristic world, one where everyone's data and whereabouts can be viewed. The plot centers on the main character being puzzled when a person he passes has no data showing; how is that possible? Indeed, as Jack Shafer of Politico wrote: Your “bloat-laden smartphone” harvests and sells personal information, including real-time location, to third parties. Your internet service provider does the same with your online habits. Gizmodo readers know all about how home devices (toothbrushes, smart TVs, Amazon Echos, sleep monitors, coffee makers, thermostats, smart lights, bathroom scales, et al.) spy on them via the internet. Meanwhile, car companies have joined the queue to sell your driver data. We’re living in a Philip K. Dick novel! And they want more... Facebook* and the others have realized that their audience is just beginning, even as those under 25 begin to look to new horizons. According to James Temperton in WIRED, the largest group now joining Facebook is those over 65.
Of course, this is not meant to be directed only at Facebook, for the business of selling your privacy and data is just that, business, and everyone from advertisers to governments wants that data. For advertisers, the goal is to narrow down and target your tastes so that you will buy their products and consume more material things, living up to the word consume(r). But move to another group and it becomes a bit scarier. Take this from Bloomberg Businessweek: Police and sheriff’s departments in New York, New Orleans, Chicago, and Los Angeles have also used it, frequently ensnaring in the digital dragnet people who aren’t suspected of committing any crime. People and objects pop up on the Palantir screen inside boxes connected to other boxes by radiating lines labeled with the relationship: “Colleague of,” “Lives with,” “Operator of [cell number],” “Owner of [vehicle],” “Sibling of,” even “Lover of.” If the authorities have a picture, the rest is easy. Tapping databases of driver’s license and ID photos, law enforcement agencies can now identify more than half the population of U.S. adults. The article was describing Peter Thiel's company, Palantir, a data-mining firm favored by governments, large financial institutions and even the CIA. As the cover illustration noted, it "mines" your hobbies, savings accounts, resumes, ATM usage, friends, web history, emails, printer usage, credit history and more. In China, a new type of credit, social "credit," is becoming a reality; something as simple as punching in one wrong number of your credit card can send your profile into freefall and deny you boarding on a plane or train. Said Foreign Policy about the system, which affects both people and companies (it started in 2013 and is planned for full implementation in 2020): The national credit system planned for 2020 will be an “ecosystem” made up of schemes of various sizes and reaches, run by cities, government ministries, online payment providers, down to neighborhoods, libraries, and businesses, say Chinese researchers who are designing the national scheme. It will all be interconnected by an invisible web of information. But contrary to some Western press accounts, which often confuse existing private credit systems with the future schemes, it will not be a unified platform where one can type in his or her ID and get a single three-digit score that will decide their lives. This caricature of a system that doles out unique scores to 1.4 billion people could not work technically nor politically, says Rogier Creemers, a scholar of Chinese law at the Leiden University Institute for Area Studies in the Netherlands. The system would instead expand and automatize existing forms of bureaucratic control, formalizing the existing controls and monitoring of Chinese citizens.
How is this possible? Are we growing ever closer to Big Brother or Blade Runner? You've likely become aware of the high-definition stadium cams that can easily zero in on your face from across the field (advertisers target you with these cams, watching what you're eating, drinking, or wearing, among other things); but security and satellite cams are far more advanced, something you can see in a recent National Geographic article in which a street camera identifies a face from a distance of three football/soccer fields away. A recent issue of Scientific American describes a simple way to coat your walls at home with nickel paint, install a few electrodes, connect them, and then paint the wall over as normal: an instant sensor that can track your movements, act as a touch pad, and even monitor your nearby appliances; to an outsider there is no difference in the wall's appearance (imagine hotels or offices doing the same). Add to that the 36 million voice-activated devices already in U.S. homes, which, as reporter April Glaser wrote in Slate.com, know "when you wake up, what items you need around the house, the music you like, how many other people live at your house, your eating habits and more." Is your "smart" TV watching you instead of the other way around?
All of this data crunching requires huge amounts of logistics and power, of course...and storage; but one has to remember that in the computational world each storage unit steps up by a factor of 1,024: 1,024 kilobytes = one megabyte; 1,024 megabytes = one gigabyte; 1,024 gigabytes = one terabyte; 1,024 terabytes = one petabyte (the next one up is the exabyte). So a petabyte is a lot of storage (most external storage devices for the home hold only a few terabytes at best)...but when such storage of every photo, email, conversation, medical history and more seems too vast to comprehend, just remember that a single gram of DNA can store 216 petabytes. I threw that in because it's important to bring this all back into perspective. Is this tracking and following and monitoring all getting out of hand, causing us to lose sight of...us? Europe just instituted new rules that can penalize companies that mishandle such personal data with fines as high as 4% of their global revenues (this took effect on May 25, 2018 and is the reason so many sites are notifying you of updates and changes to their user agreements).** For Facebook, that would have meant a fine of about $1.5 billion. Facebook is expected to earn $21 billion on sales of $55 billion, 98% of which comes from what it sells to advertisers, those profiles of its 2.1 billion users. Why target Facebook? Said the New York Review of Books: In 2016, a ProPublica investigation revealed that Facebook’s advertising portal was allowing landlords to prevent African-Americans, Latinos, and other “ethnic affinity” groups from seeing their ads, in apparent violation of the Fair Housing Act and other laws. Facebook blamed advertisers for misusing its algorithm and proposed a better machine-learning algorithm as a solution. The predictable tendency at technology companies is to classify moral failings as technical issues and reject the need for direct human oversight. Facebook’s new tool was supposed to flag attempts to place discriminatory ads and reject them. But when the same journalists checked again a year later, Facebook was still approving the same kinds of biased ads; it remained a simple matter to offer rental housing while excluding such groups as African-Americans, mothers of high school students, Spanish speakers, and people interested in wheelchair ramps. The article also mentioned this: The German startup SearchInk has programmed a handwriting recognition algorithm that can predict with 80 percent accuracy whether a sample was penned by a man or woman. The data scientists who invented it do not know precisely how it does this. The same is true of the much-criticized “gay faces” algorithm, which can, according to its Stanford University creators, distinguish the faces of homosexual and heterosexual men with 81 percent accuracy. They have only a hypothesis about what correlations the algorithm might be finding in photos (narrower jaws and longer noses, possibly).
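To make that scale a bit more concrete, here is a minimal sketch (mine, not from any of the articles cited) of the 1,024-based unit arithmetic above, plus the kind of back-of-the-envelope calculation a "4% of global revenue" rule implies; the revenue figure in the example is an assumed placeholder, not a reported number.

```python
# Illustrative sketch of 1,024-based storage-unit arithmetic and a
# hypothetical "4% of global revenue" fine calculation. All figures are
# placeholders except the 216-petabytes-per-gram-of-DNA claim quoted above.

UNITS = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB"]

def to_bytes(value, unit):
    """Convert a value in the given unit to bytes (each step up is x1,024)."""
    return value * 1024 ** UNITS.index(unit)

def human_readable(n_bytes):
    """Express a byte count in the largest unit that keeps the value >= 1."""
    for unit in reversed(UNITS):
        size = n_bytes / 1024 ** UNITS.index(unit)
        if size >= 1:
            return f"{size:,.1f} {unit}"
    return f"{n_bytes} bytes"

print(human_readable(to_bytes(216, "PB")))    # one gram of DNA: "216.0 PB"
print(to_bytes(1, "PB") / to_bytes(1, "TB"))  # 1024.0 terabytes in a petabyte

# Hypothetical GDPR-style maximum fine: 4% of an assumed global revenue.
assumed_global_revenue = 40_000_000_000       # placeholder, not a reported figure
print(f"Maximum fine: ${0.04 * assumed_global_revenue:,.0f}")
```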
To his credit, CEO Zuckerberg has pledged to put 20,000 of his employees onto reviewing both content and security by the end of the year. Was this due to his Congressional questioning? Perhaps, but last year Facebook spent $11.5 million on lobbying Congressional staff, a figure beaten only by Google's parent, Alphabet, which spent $18 million (the most spent by any company seeking favors from Congress). But let's face it: all of this, for the most part, came with our permission. Those who felt obligated to read the details carefully when signing up with Facebook (its terms of service contract) will likely have spotted the approximately 3,200 words "which contains dozens of links to more information," said an investigative piece in TIME. Said another article in the same magazine: Andrew Przybylski, a psychologist at Oxford University, notes that we don’t yet have robust, peer-reviewed studies on whether screen time is linked to depression or how children’s brains are affected by tech. That’s largely because those vast databases of user behavior owned by big tech firms like Facebook are proprietary. “They own the richest social database that has ever existed, and we can’t touch it,” Przybylski says. “We spend many hours engaged with them, but all the analysis of us happens behind closed doors.” Facebook says it now downgrades viral videos, a change the company says had, by itself, reduced user viewing by 50 million hours daily as of the end of last year.
All of this would appear, in a sense, to be of our own choosing. We want that linking and communication and deals and gossip, even if it means giving up a bit of our privacy. But then there's the issue of things we might not want to expose...such as our medical records. In a piece in the London Review of Books, author Paul Taylor raises the issue of our medical records being sold to firms for use (so far) in drug trials and in studies of the efficacy of certain drugs and vaccines. The data is anonymized (nothing about your name or other identifying personal information) and is thus considered not yours; a similar argument is ongoing in the U.S. over who owns your X-rays and scan results (so far, indications are that they belong to the hospital, not to you). Said the article: Although it may seem at odds with a patient's rights, it is crucial to the data's value: if you ask for consent not everyone will give it, and, worse, the people who do give consent aren't typical, so the data no longer tell you what you need to know. The legal and ethical justification isn't, however, based on the value or the science it enables, but on the idea that since the data is anonymized, the patient no longer has any rights over it.
Okay, all of this writing on what we knowingly or unknowingly are giving up was meant to just rattle the cage a bit; privacy means a lot to some and very little to others. But the issue of our personal habits and being tracked 24/7 was partially summed up by neuroscientist Susan Greenfield (she also happens to be a Baroness) in an earlier article: The notion of privacy, of privation, is shutting something out. We need to cut ourselves off. Everyone seems to think that it's great to be connected and exposed all the time. But what happens when everything is literal and visual? How do you explain a concept like honor when you can't find it on Google Images? The universe of the abstract is inexplicable. The nuance in life disappears. In some ways, it would seem, we are choosing to become anonymous.
*An interesting and more detailed study of the travails of Facebook appeared in WIRED, a story telling how even Facebook's own employees discovered just how far the company could reach into their messages and records, and recounting Mark Zuckerberg's desire to overtake whatever stood in his way, be it Twitter or Rupert Murdoch. Now Facebook has just launched a new dating feature to take on the likes of Tinder, Match and OkCupid, targeted at the 200 million users who have already identified themselves as "single."
**A more detailed look at the law, known as the General Data Protection Regulation (GDPR), can be found in a story in Bloomberg Businessweek.