FACIAL RECOGNITION: LAST WEEK TONIGHT WITH JOHN OLIVER [FULL TRANSCRIPT]

John Oliver takes a look at facial recognition technology, how it’s used by private companies and law enforcement, and why it can be dangerous.

Our main story tonight concerns facial recognition, the thing that makes sure my iPhone won’t open unless it sees my face or the face of any toucan, but that is it.

Facial recognition technology has been showcased in TV shows and movies for years. Denzel Washington even discovered a creative use for it in the 2006 action movie Déjà Vu.

We have facial recognition software?
Yeah.
Let’s use it on the bag, cross-match it with all the bags on the south side of the city in the 48 hours leading up to the explosion, all right?
Don’t think that’s ever been used this way.
Look… same day. Bingo!

Bingo indeed, Denzel. With smart, believable plot development like that, it is frankly no wonder that Déjà Vu received such glowing IMDb reviews as “An insult to anybody who finished elementary school,” “Worst movie of all time,” “More like Deja Pooh,” and my personal favorite, a one-star review that reads “Bruce Greenwood, as always, is great and so sexy and there’s a cat who survives.” A review that was clearly written either by Bruce Greenwood, or by that cat.

Now, the technology behind facial recognition has been around for years, but recently, as it’s grown more sophisticated, its applications have expanded greatly. For instance, it’s no longer just humans who can be the targets.

“The i-Farm sensor scans each fish and uses automatic image processing to uniquely identify each individual. A number of symptoms are recognized, including loser fish.”

Yes, loser fish—which by the way, is an actual industry term. Now, that company says it can detect which fish are losers by facial scan—which is important, because can you tell which one of these fish is a loser and which one is a winner? Are you sure about that? Because they’re the same fish. This is why you need a computer.

But the growth of facial recognition, and what it’s capable of, brings with it a host of privacy and civil liberties issues. Because if you want a sense of just how terrifying this technology could be if it becomes part of everyday life, just watch as a Russian TV presenter demonstrates an app called FindFace.

[Russia-24, 2016] “If you find yourself in a cafe with an attractive girl, and you don’t have the guts to approach her—no problem. All you need is a smartphone and the application FindFace. Find new friends… take a picture… and wait for the result. Now you’re already looking at her profile page.”

Burn it all down, burn everything down. I realize that this is a sentence that no one involved in creating the app ever once thought, but just imagine that from a woman’s perspective. You’re going about your day when suddenly you get a random message from a guy you don’t know that says, “Hello, I saw you in a cafe earlier and used the FindFace app to learn your name and contact information. I’ll pick you up from your place at 8:00—don’t worry, I already know where you live.”

But one of the biggest users of facial recognition is, perhaps unsurprisingly, law enforcement. Since 2011 the FBI has logged more than three hundred and ninety thousand facial recognition searches, and the databases law enforcement is pulling from include over one hundred and seventeen million American adults, incorporating, among other things, driver’s license photos from residents of all these states. So roughly one in two of us has had our photo searched this way.

And the police will argue that this is all for the best. In fact, here is an official with the London police explaining why they use it there.

Here in London we’ve had the London Bridge attack, the Westminster Bridge attack. The suspects involved in that, the people that were guilty of those offences, were often known by the authorities. Had they been on some database, had they been picked up by cameras beforehand, we may have been able to prevent those atrocities, and that would definitely be a price worth paying.

Okay, look, it’s hard to come out against the prevention of atrocities—this show is, and always has been, anti-atrocity—but the key question there is: what’s the trade-off? If the police could guarantee that they would prevent all robberies, but the only way to do that was by having an officer stationed in every bathroom watching you every time you take a shit, I’m not sure everyone would agree that it’s worth it, and the people who do might want that for reasons other than preventing crime.

And now is actually a very good time to be looking at this issue, because there are currently serious concerns that facial recognition is being used to identify Black Lives Matter protesters. And if that’s true, it wouldn’t actually be the first time, as this senior scientist at Google, Timnit Gebru, will tell you.

[Timnit Gebru, Senior Research Scientist, Google] There was an example with the Baltimore Police and the Freddie Gray marches, where they used face recognition to identify protesters, and then they tried to link them up with their social media profiles, and then target them for arrest. So right now a lot of people are actually urging people not to put images of protesters on social media, because there are people out there whose job is just to look up these people and target them.

It’s true. During the Freddie Gray protests, police officers used facial recognition technology to look for people with outstanding warrants and arrest them—which is a pretty sinister way to undermine the right to assemble.

So tonight, let’s take a look at facial recognition.

Let’s start with the fact that even as big companies like Microsoft, Amazon, and IBM have been developing it, and governments all over the world have been happily rolling it out, there haven’t been many rules or much of a framework in place for how it is used. In Britain they’ve actually been experimenting with facial recognition zones, even putting up signs alerting you to the fact that you’re about to enter one—which seems polite, but watch what happened when one man decided he didn’t actually want his face scanned.

This man didn’t want to be caught by the police cameras, so he covered his face. But they stopped him and photographed him anyway, and an argument followed.
What’s your suspicion? The fact that he walked past a clearly marked sign…
I would do the same!
… and covered his face.
I would do the same!
I walked past like that—it’s a cold day as well—and as soon as I’ve done that, the police officers asked me […] I’ve got me back up, I said to him fuck off. I’ve got a 90-pound fine, there you go, look at that—thanks lads, 90 pound, well done.

Yeah, that Guy Ritchie character was rightly mad about that. And incidentally, if you are not British and you’re looking at that man, then at me, and wondering how we both came from the same island, let me quickly explain. British people come in two variations: so emotionally stunted that they’re practically comatose, and cheerfully telling large groups of policemen to “fuck off and do one if you’re gonna take a photo of me face.” There’s absolutely nothing in between the two.

And the UK’s by no means alone in building out a system. Australia is investing heavily in a national facial biometric system called “The Capability,” which sounds like the name of a Netflix original movie—although that’s actually perfect if you want people to notice it and think “that seems interesting,” and then forget it ever existed.

And you don’t have to imagine what this technology would look like in the hands of an authoritarian government because China is, unsurprisingly, embracing it in a big way.

[BBC News, 2017] We can match every face with an ID card and trace all your movements back one week in time. We can match your face with your car, match you with your relatives and the people you’re in touch with. With enough cameras we can know who you frequently meet.

That is a terrifying level of surveillance. Imagine the Eye of Sauron, but instead of scouring Middle-earth for the One Ring, he was just really into knowing where all his orcs like to go to dinner.

And some state-funded developers in China seem weirdly oblivious to just how sinister their projects sound.

[Vice: Face in the Crowd (2018), HBO]
[Interviewer] Skynet, what is that?
[Xie Yinan, Vice President and spokesperson, Megvii Technology Co. Ltd.] The Terminator is the favorite film of our founder. So they use the same name, but they want to put something good into the system.
[Interviewer] So, OK, in The Terminator, Skynet is evil, rains down death from the sky, but in China Skynet is good?
[Xie Yinan] Yeah, that’s the difference.

Oh, that’s the difference, is it? You know, it’s not exactly reassuring that you called your massive all-encompassing AI network “Skynet”—but a good version—because it’d be like if the Today Show built a robot journalist, and called it Matt Lauer, but good. “Oh yeah, this one’s completely different!” Sure, he does also have a button under his office desk, but all it does is release lilac air freshener. This is the good version.

The point is, this technology raises troubling philosophical questions about personal freedom, and right now there are also some very immediate practical issues. Because, even though it is currently being used, this technology is still very much a work in progress, and its error rate is particularly high when it comes to matching faces in real time. In fact, in the UK, when human rights researchers watched police put one such system to the test, they found that only 8 out of 42 matches were verifiably correct. And that’s even before we get into the fact that these systems can have some worrying blind spots, as one MIT researcher found out when testing numerous algorithms, including Amazon’s own Rekognition system.

[WGBH, 2018]
At first glance, MIT researcher Joy Buolamwini says, the overall accuracy rate was high, even though all companies better detected and identified men’s faces than women’s. But the error rate grew as she dug deeper.
“Lighter male faces were the easiest to guess the gender on, and darker female faces were the hardest.”
One system couldn’t even detect if she had a face, and the others misidentified her gender. White guy, no problem.

Yeah, white guy no problem—which, yes, is the unofficial motto of history—but it’s not like what we needed right now was for computers to somehow find a way to exacerbate the problem.

And it gets worse. In one test, Amazon’s system even failed on the face of Oprah Winfrey, someone so recognizable her magazine only had to print the first letter of her name, and your brain auto-completed the rest.

And that’s not all. A federal study of more than a hundred facial recognition algorithms found that Asian and African American people were up to a hundred times more likely to be misidentified than white men. So that is clearly concerning. And on top of all of this, some law enforcement agencies have been using these systems in ways they weren’t exactly designed to be used.

In 2017 police were looking for this beer thief. The surveillance image wasn’t clear enough for facial recognition software to identify him. So instead police used a picture of a look-alike, which happened to be actor Woody Harrelson. That produced names of several possible suspects and led to an arrest.

Yeah, they used a photo of Woody Harrelson to catch a beer thief—and how dare you drag Woody Harrelson into this. This is a man who once got drunk at Wimbledon in this magnificent hat, made this facial expression in the stands, and in doing so accidentally made tennis interesting for a day. He doesn’t deserve prison for that, he deserves the Wimbledon trophy.

And there have been multiple instances where investigators have had such confidence in a match that they’ve made disastrous mistakes. A few years back, Sri Lankan authorities mistakenly targeted this Brown University student as a suspect in a heinous crime, which made for a pretty awful finals week.

[Amara Majeed, mistakenly named terror suspect] “On the morning of April 25th, in the midst of finals season, I woke up in my dorm room to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks in my beloved motherland, Sri Lanka.”

That’s terrible. Finals week is already bad enough, what with staying up all night alternating shots of 5-hour Energy and Java Monster Mean Bean while trying to push your brain to remember the differences between Baroque and Rococo architecture, without also waking up to find out that you’ve been accused of terrorism because a computer sucks at faces.

Now, on the one hand these technical issues could get smoothed out over time but, even if this technology eventually becomes perfect, we should really be asking ourselves how much we’re comfortable with it being used by police, by governments, by companies, or indeed by anyone. And we should be asking that right now, because we’re about to cross a major line.

For years many tech companies approached facial recognition with caution. In fact, in 2011 the then-chairman of Google said it was the one technology the company had held back, because it could be used in a very bad way. And think about that: it was a Pandora’s box even for Silicon Valley, the world’s most enthusiastic Pandora’s box openers. And even some of the big companies that have developed facial recognition algorithms have designed it for use on limited data sets, like mug shots or driver’s license photos. But now something important has changed, and it is because of this guy, Hoan Ton-That, and his company Clearview AI, and I’ll let him describe what it does.

[Hoan Ton-That, Founder, Clearview AI] Quite simply, Clearview is basically a search engine for faces, so anyone in law enforcement can upload a face to the system and it finds any other publicly available material that matches that particular face.

OK, so the key phrase there is “publicly available material,” because Clearview says it’s collected a database of 3 billion images. That is larger than any other facial recognition database in the world, and it’s done that by scraping them from public-facing social media like Facebook, LinkedIn, Twitter, and Instagram. So, for instance, its system would theoretically include this publicly available photo of Hoan Ton-That at what appears to be Burning Man. Or this one of him wearing a suit from the exclusive Santa Claus After Dark collection at Men’s Wearhouse. And this very real photo of him shirtless and lighting a cigarette with blood-covered hands—which, by the way, is his music profile photo, because yes, of course he’s also a musician—I can only assume that’s the cover of an album called “Automatic Skip If This Ever Comes Up on a Pandora Station.” And Ton-That’s willingness to do what others have not been willing to do, and that is scrape the whole internet for photos, has made this company a genuine game changer, in the worst possible way. Just watch as he impresses a journalist by running a sample search.

[Clip from CNN Business]
So here’s the photo you uploaded to me…
Mm-hmm
…a headshot from CNN
Mm-hmm
So, the first few images it’s found… it’s found a few different versions of that, that same picture. But now, as we scroll down, we’re starting to see pictures of me that are not from that original image. Wow, oh my god… so this… this photograph is from my local newspaper where I lived in Ireland, and this photo would have been taken when I was like 16.
Wow.
That’s crazy.

Yeah, it is. Although, here is some advice: if there is an embarrassing photo of you from when you were a teenager, don’t run away from it. Make it the center of your television show’s promotional campaign and own it. Use the fact that your teenage years were a hormonal Stalingrad. Harness the pain.

But the notion that someone can take your picture and immediately find out everything about you is alarming enough, even before you discover that over 600 law enforcement agencies have been using Clearview’s service, and that you’re probably in that database even if you don’t know it. If a photo of you has been uploaded to the internet, there is a decent chance that Clearview has it, even if someone uploaded it without your consent, even if you untagged yourself or later set your account to private. And if you’re thinking, “hold on, isn’t this against the terms of service for internet companies?”, you should know that it is.

Clearview actually received cease-and-desist orders from Twitter, YouTube, and Facebook earlier this year, but it has refused to stop, arguing that it has a First Amendment right to harvest data from social media—which is just not at all how the First Amendment works. You might as well argue that you have an Eighth Amendment right to dress up rabbits like John Lennon—that amendment does not cover what I think you think it does. And yet Hoan Ton-That insists that this was all inevitable, so we should all frankly be glad that he’s the one who did it.

[Hoan Ton-That] I think the choice now is not between, like, no facial recognition and facial recognition. It’s between, you know, bad facial recognition and responsible facial recognition, and we want to be in the responsible category.

Well, sure, you want to be, but are you? Because there are a lot of red flags here. For starters, apps he developed before this included one called Trump Hair, which would just add Trump’s hair to a user’s photo, and another called ViddyHo, which phished its own users, tricking them into sharing access to their Gmail accounts and then spamming all their contacts—so I’m not sure that I would want to trust my privacy to this guy. If, however, I was looking for someone to build an app that let me put Ron Swanson’s mustache on my face as my checking account was quietly drained—sure, then he’d be at the top of my list.

And despite Clearview’s repeated reassurances that its product is intended only for law enforcement, as if that were inherently a good thing, he’s already put it in a lot of other people’s hands. Because in addition to users like the DEA and the FBI, he’s also made it available to employees at Kohl’s, Walmart, and Macy’s, which alone has completed more than 6,000 facial searches. And it gets worse, because they’ve also reportedly tried to pitch their service to congressional candidate and white supremacist Paul Nehlen, suggesting that they could help him use “unconventional databases” for “extreme opposition research”—which is a terrifying series of words to share a sentence with “white supremacist.”

Now, Clearview says that that offer was unauthorized, but when questioned about who else he might be willing to work with, Ton-That’s answer hasn’t been reassuring.

There’s some countries that we would never sell to, that are very adverse to the US, for example, like China and Russia, Iran, North Korea. So those are the things that are definitely off the table.
What about countries where they think that being gay should be illegal, that it’s a crime?
So, like I said, you know, we want to make sure that we do everything correctly, mainly focus on the US and Canada. And the interest has been overwhelming, to be honest, just so much interest that, you know, we’re taking it one day at a time.

Yeah, that’s not terribly comforting. When you ask a farmer if he’d let foxes into the henhouse, the answer you hope for is “no,” not “the interest from foxes has been overwhelming, to be honest, just so much interest, so, you know, we’re taking it one day at a time.” And unsurprisingly, reporters for BuzzFeed have found that Clearview has quietly offered its services to entities in Saudi Arabia and the United Arab Emirates, countries that view human rights laws with the same level of respect that Clearview seems to have for Facebook’s terms of service. So facial recognition technology is already here. The question is: what can we do about it?

Well, some are trying to find ways to thwart the cameras themselves.

[Jillian Mayer, YouTube, 2013] Hi guys, it’s me Jillian again, with a new makeup tutorial. Today’s topic is how to hide from cameras.

Okay, first, that’s probably not a scalable solution. And second, I’m not sure if that makes you less identifiable or the most identifiable person on earth—“Officers are on the lookout for a young woman, dark hair, medium build, looks like a mime who went through a shredder.”

Look, clearly what we really need to do is put limits on how this technology can be used, and some locations have laws in place already. San Francisco banned facial recognition last year, but the scope of that is limited to city law enforcement; it doesn’t affect state and federal use, or private companies. Meanwhile, Illinois has a law requiring companies to obtain written permission before collecting a person’s fingerprints, facial scans, or other identifying biological characteristics—and that is good—but we also need a comprehensive nationwide policy, and we need it right now. Because, again, there are worries that it is being used in the protests that we are seeing now. And the good news is that just this week, thanks to those protests and two years of work by activists, some companies did pull back from facial recognition. For instance, IBM says it will no longer develop facial recognition; meanwhile, Amazon said it was putting a one-year hold on working with law enforcement, and Microsoft said it wouldn’t sell its technology to police without federal regulation. But there is nothing to stop those companies from changing their minds if people’s outrage dies down.

And for the record, while Clearview says it’s canceling its private contracts, it’s also said it will keep working with the police, just as it will keep harvesting your photos from the internet.

So if Clearview is gonna keep grabbing our photos, at the very least there may be a way to let them know what you think about that. The next time you feel the need to upload a photo, maybe throw in an extra one for them to collect. Maybe hold up a sign that says “These photos were taken unwillingly, and I’d rather you not be looking at them,” or, if that feels too complicated, just “Fuck Clearview”—that really does get the message across. And remember, these photos are often being searched by law enforcement, so you may want to take this opportunity to talk to the investigators looking through your photos, maybe something like “I don’t look like Woody Harrelson, but while I have your attention: defund the police.” Really, whatever you feel is most important to tell them, you should put it on a sign.

That’s our show, thank you so much for watching, we’ll see you next week. Good night!
