Why Facebook's '10-Year Challenge' Is A Disaster For Big Data Surveillance

So, did they or didn’t they?

Actually, it doesn’t matter. The important point is that people immediately suspected that they might have. The benefit of the doubt was in short supply despite Facebook’s categorical denials: “The 10-year challenge is a user-generated meme that started on its own, without our involvement,” they said. “It’s evidence of the fun people have on Facebook, and that’s it.”

The world of Big Data has clearly changed in the last year. Cambridge Analytica lifted the gargantuan rock, and brighter light than ever before came shining down on the ugly world of data-trading. And 2018 was also the worst year yet for data breaches. Thousands of incidents. Billions of records. As the IoT multiplies the number of data-capturing and data-storing devices in the world many times over, it’s not going to get any better. Let’s face it, we can’t secure the devices and the data we have now.

In light of Cambridge Analytica and its alleged links to Russia and the Brexit and Trump campaigns, it became clear that data exploitation is a loose thread: it looks small until you start pulling, and pulling, and pulling. In a world of connectivity, a few hundred thousand individuals opened the floodgates to tens of millions of people. We will not soon forget the torturous sight of Facebook CEO Mark Zuckerberg in front of Congress in April, his very public summons to the principal’s office.

“How do you maintain a business model in which users don’t pay for your services?”

“What about the part people are worried about, not the fun part?”

“Are you willing to expand my right to prohibit you from sharing my data?”

“Yes or no. Will you commit to changing all the user default settings to minimize to the greatest extent possible the collection and use of users’ data?”

Zuckerberg answered each of these questions carefully, as one should when tip-toeing through a minefield.

Harvard professor Shoshana Zuboff has dubbed Big Data’s business model ‘surveillance capitalism’. “Nearly every product or service that begins with the word ‘smart’ or ‘personalised’, every internet-enabled device, every ‘digital assistant’, is simply a supply-chain interface for the unobstructed flow of behavioral data on its way to predicting our futures in a surveillance economy,” she told the Guardian’s John Naughton.

How Do I Look Now?

The first social media bandwagon of 2019 looked like more of the same, no different to buckets of icy water or blindfolded teens crashing around their living rooms. Following in the wake of a tidal wave of celebrities, Facebook registered more than 5 million engagements over the course of three days, according to Talkwalker, as people queued to share photographic evidence of their aging since 2009. But then that tweet from Kate O’Neill went viral, sarcastically (her word) asking how “all this data could be mined to train facial recognition algorithms on age progression and age recognition.”
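O’Neill’s point is easy to make concrete. A minimal, purely illustrative sketch of the harvesting step she describes: turning meme posts into supervised before/after training pairs, each labeled with the elapsed time between photos. Everything here (the `TrainingPair` structure, the post-dictionary shape, the field names) is a hypothetical assumption for illustration, not anything Facebook has confirmed doing.

```python
from dataclasses import dataclass

@dataclass
class TrainingPair:
    """One labeled before/after sample harvested from a meme post (hypothetical)."""
    user_id: str
    photo_then: str    # e.g. path or URL of the 2009 image
    photo_now: str     # the 2019 image
    years_elapsed: int # the supervision label for age-progression models

def build_dataset(posts):
    """Turn meme posts into (then, now, gap) training pairs.

    Each post is assumed to look like:
      {"user": "u1", "then": "img_2009.jpg", "now": "img_2019.jpg",
       "then_year": 2009, "now_year": 2019}
    Posts missing either photo are skipped.
    """
    pairs = []
    for p in posts:
        if not p.get("then") or not p.get("now"):
            continue
        pairs.append(TrainingPair(
            user_id=p["user"],
            photo_then=p["then"],
            photo_now=p["now"],
            years_elapsed=p["now_year"] - p["then_year"],
        ))
    return pairs
```

What makes the meme format valuable, on this reading, is exactly this labeling: users volunteer two photos of the same face with a known, fixed time gap between them, which is far cleaner training data than scraping undated profile pictures.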

Maybe we’re changing. Maybe the toddler has learned from experience that the stove gets hot.

In 2009, in an interview with Wired, Mark Zuckerberg said that “when I started Facebook from my dorm room in 2004, the idea that my roommates and I talked about all the time was a world that was more open. We believed that people being able to share the information they wanted and having access to the information they wanted is just a better world… No one wants to live in a surveillance society.”

Earlier this month, the pointed title of a Vanity Fair article pretty well summed up the last 12 months for the social media giant. “Mark Zuckerberg’s 2019 resolution: convince people Facebook isn’t evil.”

Force for good in 2009 to force for evil in 2019.

How’s that for a 10-year challenge?

So, Is The Tide Now Turning?

If the tide is turning, it’s being driven by biometrics and specifically facial recognition. Apparently, this is the step too far. Harvest my likes and clicks. Fine, I guess. But don’t identify my friends and trace my face through social media and into the real world. And, definitely, don’t sell the government the technology to track me out on the streets.

Last week, “a coalition of over 85 racial justice, faith, and civil, human, and immigrants’ rights groups” petitioned Microsoft, Amazon and Google to stop selling facial recognition technology to the US Government. According to Nicole Ozer of California’s ACLU, “we are at a crossroads with face surveillance, and the choices made by these companies now will determine whether the next generation will have to fear being tracked by the government for attending a protest, going to their place of worship, or simply living their lives.”

In December, Google responded to employee and public pressure and confirmed that “Google Cloud has chosen not to offer general-purpose facial recognition APIs before working through important technology and policy questions.” Ironically, the same month the company was successful in having a lawsuit dismissed in Chicago for “collecting, storing and using” biometric (read facial recognition) data without written consent. Earlier in 2018, the company elected not to extend its involvement in the US Defense Department’s Project Maven, a drone AI imaging program. It was no accident that the decision was announced in a meeting with Google employees following an internal petition and resignations over their involvement.

Now, this week, a group of Amazon investors has filed a shareholder resolution asking the company to cease marketing its facial recognition software to law enforcement until an independent assessment confirms that the technology “does not cause or contribute to actual or potential violations of civil and human rights.” Amazon has spent the last year defending its decision to market a real-time facial recognition system to US law enforcement. Just as its online commerce platform democratized retail and AWS democratized Big Data, so Rekognition now wants to do the same for surveillance. It’s this approach that underpins its rumored doorbell that can detect suspicious people who come within sight of its camera.

Meanwhile, Microsoft has taken an industry lead (if its public statements can be taken on trust) in calling for regulation, for policy to address this issue. “We believe it’s important for governments in 2019 to start adopting laws to regulate this technology,” they said in a December blog post. “The facial recognition genie, so to speak, is just emerging from the bottle. Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues. By that time, these challenges will be much more difficult to bottle back up.”

And then there’s Apple, clearly seeking to present themselves as ‘not Google’ when it comes to tracking us through our smart devices. At CES this month, despite not attending, the company splashed the side of a Vegas building with a billboard proclaiming: “what happens on your iPhone stays on your iPhone.” This follows on from Tim Cook very publicly resisting law enforcement requests to help crack Apple devices for user data. “Privacy to us is a human right. It’s a civil liberty,” he has said. The company’s former security chief actually departed for the ACLU last year. “I am part of a growing group of people who believe that some of the issues are in fact not technical but policy,” Jon Callas told Foreign Policy.

The Step Too Far

Marjory Stoneman Douglas school in Florida, where Nikolas Cruz killed seventeen people just under a year ago, has just announced that they will deploy facial recognition technology. "This will be essential in helping to improve our security measures, to track who belongs and quickly alerting who does not belong on campus," said Lori Alhadeff, mother of one of the young victims of the shooting. Fear in US schools is real and widespread: a Pew Research Center survey last year found that 57% of US teens and 63% of their parents worried about a school shooting. Last fall, the Lockport school district in New York State became one of the first to deploy a grant-funded facial recognition system with such incidents in mind.

For Big Data these should have been bricks in the wall of popular acceptance. Perception often trumps reality, and some facial recognition vendors are now claiming that their technologies can detect and contain an incident in real time. The Washington Post pointed out last year that “an expanding web of largely unknown security contractors is marketing face recognition directly to school and community-center leaders, pitching the technology as an all-seeing shield against school shootings like those at Parkland and Santa Fe.” But the ultimate winners of the facial recognition space race for ubiquity and accessibility will not be specialist vendors; it will be the industry giants who build and maintain a lead in AI-driven surveillance, including facial recognition, given their access to unlimited imagery and associated data. The surveillance industry will need to operate with that reality. The likely outcome is AI as a utility that can be used on demand, similar to the existing Rekognition model. The trick will be the deployability of solutions: how data is captured and analyzed. The impediment will be bandwidth and making it work in real time, not data accuracy.

That's been the plan. But, as ever with facial recognition, it’s not going down universally well. This appears to be the technology straw that is threatening to break Big Data’s back. Whereas we have seemingly gone along with everything up until now, the idea that the platforms which dominate our digital lives will now slip out into the physical world, recognizing us as we pass, and selling that information is apparently too much. Finally. And we haven't even started to debate the really controversial subjects like social media scraping and ethnicity identification.

Up until now, the major issue seems to be with the sale of such technologies to law enforcement: as per the criticism of Amazon, Google and US and UK law enforcement. Ironically, the much more pernicious threat is the commercial exploitation of such data: either with the blending of our digital online selves with our physical real-world selves or with commercial entities playing at being law enforcement by compiling watch lists of ‘persons of interest’. Look to Russia and China if you need encouragement to support regulation of commercial exploitation of biometric data.

Dorothy Isn’t In Kansas Anymore

As things stand, only 18% of Americans think the government should strictly limit the use of facial recognition even if it negatively impacts public safety. In the West, we do not live in anything close to a surveillance state, despite the hyperbolic claims of privacy campaigners. And most people recognize that.

But the most valuable market opportunity for Big Data is commercial, not governmental. The realization of this is the first change we will now start to see. This is the battleground for biometric privacy. This is why that viral challenge became so interesting. If I can analyze how people age, where they are, how they evolve, who they know, what they think and feel and like and do, I can sell their data to people who can sell them more.

In October, NYC council member Ritchie Torres introduced a bill that would mandate local businesses to notify customers of the use of biometric technology. “We’re increasingly living in a marketplace where companies are collecting vast quantities of personal data without the public’s consent or knowledge,” he said. “In a free and open society, I have the right to know whether a company is collecting my personal data, why a company is collecting my data, and whether a company will retain my data and for what purpose.”

The issue for Big Data is that people now expect the worst. The mechanics of training AI algorithms and structuring sophisticated data exploitation schemes are out in the open. So, in that regard, it doesn’t matter whether Facebook intended to train a facial recognition engine or not. The damage is arguably the same. The naivety of the last ten years will never be repeated; our innocence can’t be recaptured. So look again at those photos from 2009, from a time before the people pictured realized they were the ultimate product.

So, is this a change or a bump in the road? Too early to tell. But it’s definitely an awakening of sorts. You’ll no doubt be getting the sense it’s all starting to move pretty quickly. And for the first time, Big Data might not have its hands quite so firmly on the steering wheel.

The thing is, though, no one else does either.
