If the government is to regulate AI, who is going to regulate the government?

Google CEO Sundar Pichai (Image credit: Android Central)

Artificial intelligence is an often misunderstood term. It sounds like a claim that machines, whether robots or your washing machine, are sentient and able to think for themselves, but that's not really the case. Machine learning (another misnomer in its own right) is a tool: programmers set up software to recognize a pattern, like a shape, a color, or a specific phrase, and trigger an action whenever it "sees" that pattern again.

Machines are not smart. They are just programmed to recognize patterns.

A great example is how an NVIDIA engineer "taught" one of the company's AI systems by feeding it photos of cats: all sorts of cats in all sorts of situations. Eventually, the machine could recognize a cat in any photo, or even in a live feed. It didn't need any more programming to find a cat, no matter the situation, because it had "learned" what a cat was and what one looked like.
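If you want to picture what that kind of "teaching" looks like in practice, here's a minimal sketch of a cat/not-cat image classifier built with TensorFlow's Keras API. NVIDIA hasn't published the exact setup described above, so the folder names, image size, and network here are illustrative assumptions, not its actual pipeline.

```python
# Minimal sketch of "teaching" a model to recognize cats, using TensorFlow/Keras.
# Assumes a hypothetical folder layout: data/cat/*.jpg and data/not_cat/*.jpg
# (NVIDIA's actual setup isn't public; names and sizes here are illustrative).
import tensorflow as tf

# Load labeled photos from disk, resized to a fixed shape and batched.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data",                  # the two subfolders become the two labels: cat / not_cat
    label_mode="binary",
    image_size=(128, 128),
    batch_size=32,
)

# A tiny convolutional network: it learns visual patterns from the photos,
# then outputs the probability that a new image contains a cat.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),            # scale pixel values to [0, 1]
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # "how cat-like is this photo?"
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# "Teach" the model by showing it the labeled photos over several passes.
model.fit(train_ds, epochs=5)

# Afterward, the same model can score any new image it has never seen before.
```

The point isn't the specific layers. It's that nothing in the code says what a cat is; the model only ever sees labeled photos and adjusts its own parameters until it can tell the two folders apart.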

We've moved well beyond cats. As Google CEO Sundar Pichai mentions in his Financial Times editorial, Google can now predict the weather in India better than a meteorologist can, and some companies and groups have trained machines to take action when a face is recognized.

A cat appears

Yep, that's a cat. (Image credit: NVIDIA)

Identifying a person in a Facebook photo, for example, can let a machine pull up a name, address, phone number, financial information, and an email address. If it's a famous person, it can probably find even more, including things that person would rather not have made public.

You don't need Terminators to do bad things with AI.

This is bad. Maybe it's not the same level of bad as a Terminator moving through time as we see in fictional movies, but still, do you want someone finding things out about you because your friend posted a photo with your face in it on social media?

And that's not the worst of it. AI that has learned exactly what a person looks and sounds like can create an electronic duplicate, called a deepfake, in a photo or video. Imagine a 90-second video of a head of state in some sort of compromising position or saying something off-color, except it's 100% fake and computer-generated, and you couldn't tell it wasn't real.

Sorry everyone, but those Hermione and Harry "adult" movies aren't real. They're just deepfakes.

These are real problems, whether it's someone pulling your credit score and selling it to fly-by-night creditors (don't you hate getting those letters?), a movie star edited into a fake porn film, or a presidential candidate "giving" a fake speech that racks up millions of views on Facebook. Just because AI can detect cancer really well doesn't mean everything done with it will be beneficial.

TensorFlow

(Image credit: Android Central)

There needs to be some sort of oversight. That's obvious. It's also obvious that the companies building the machines or individuals writing software aren't capable of keeping it all in check. But having "the government" be the watchdog is insane.

Governments are created to take care of people but exist to take better care of some people. Even the most benevolent governments in the world are staffed by humans, and humans cannot be trusted to always do the right thing. In a perfect world, it might work, but in the real world, government officials care more about being re-elected than about fixing the potholes in the roads or not starting World War III.

Governments should have to break the law to hurt people. Not make new laws that say it's OK to do it.

These are not the people who should be regulating something that's potentially more powerful than any other tool (or weapon) the world has ever seen. Do you want the Pentagon or the NSA to have technology that can run 24/7 to keep citizens under even more surveillance or to decide who is a threat to our freedom? Or an "enemy" country to have a system that can pick the right time for a first strike and the best way to sow fear and chaos in your daily life? And have it all be OK under the law, because the fox is guarding the henhouse?

I've been reminded that not every government official is evil. E.U. Competition Commissioner Margrethe Vestager is a great example. It is her job to make sure that businesses great and small — including Google, Apple, Amazon, Microsoft, Facebook and Volkswagen — play fairly and follow E.U. law when it comes to data, honesty, and privacy. And to date, she has done an excellent job and made valuable changes.

Volkswagen would have had no issues with truth in advertising if it were headquartered in the U.S.

But things that happen in the E.U. don't always have such a far-reaching effect, especially when it comes to tech that can be weaponized. I don't expect Syria or Libya or the U.S. to take a well-meaning E.U. regulation on AI technology seriously when the heads of those countries know how powerful ignoring that regulation can be. This leads to a world where powerful and, depending on your point of view, aggressive nations have even more power to be aggressive. Or one where countries break their own laws to develop the same kinds of smart weaponry that countries without those laws will.

Google Cloud TPU

(Image credit: Android Central)

An appointed official that the world would listen to, either willingly or by force, could develop rules for how AI can be used both in the private sector and by the world's governments.

An independent official could make the right decisions, but no country would follow them, and an impartial candidate has no chance of getting the job.

Sundar Pichai knows how his words will be perceived, and it's good to hear one of the people responsible for the mess propose regulation. But just saying that a thing that affects every single one of us needs government regulation is almost as terrible as saying nothing at all.

It's obvious that someone needs to take the reins and control who has access to the powerful online servers that can be used for machine learning and what they're allowed to do with that access once they have it. I don't know who that should be, though. One thing I do know is that passing the buck to "the government" means you want someone else to figure it all out for you.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.