When is a photo not a photo?

Camera app on Google Pixel 9 Pro
(Image credit: Andrew Myrick / Android Central)
Android & Chill

One of the web's longest-running tech columns, Android & Chill is your Saturday discussion of Android, Google, and all things tech.

Google is finally going to add metadata that states whether any of its AI tools were used to enhance a photograph. This is 100% a good thing and something Google should have done from the start, even though you can use any of a thousand other tools to create hyperrealistic AI images from any device with a web browser. Every service that can do anything similar needs to do the same.
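If you're curious what a label like this looks like under the hood, here's a minimal sketch that digs the XMP packet out of a JPEG and scans it for an AI-edit credit. XMP is stored as plain XML inside the file, so a simple byte search is enough for a quick look. The specific marker strings below (the IPTC "compositeWithTrainedAlgorithmicMedia" source type and human-readable credits like "Made with Google AI") are my assumptions about what an AI editor might write, not a documented contract from Google.

```python
from pathlib import Path


def find_xmp(jpeg_path: str) -> str | None:
    """Return the raw XMP packet embedded in a JPEG, or None if absent."""
    data = Path(jpeg_path).read_bytes()
    start = data.find(b"<x:xmpmeta")
    if start == -1:
        return None
    end = data.find(b"</x:xmpmeta>", start)
    if end == -1:
        return None
    return data[start : end + len(b"</x:xmpmeta>")].decode("utf-8", errors="replace")


def looks_ai_edited(xmp: str) -> bool:
    """Scan the XMP text for markers an AI editor might leave behind."""
    # Hypothetical markers: the IPTC digital source type for AI composites,
    # plus human-readable credits Pixel phones are reported to write.
    markers = (
        "compositeWithTrainedAlgorithmicMedia",
        "Made with Google AI",
        "Edited with Google AI",
    )
    return any(marker in xmp for marker in markers)


if __name__ == "__main__":
    xmp = find_xmp("photo.jpg")  # assumed local file for the example
    if xmp is None:
        print("No XMP metadata found")
    elif looks_ai_edited(xmp):
        print("AI edit credit present")
    else:
        print("XMP present, but no AI credit found")
```

A real verifier would parse the XML properly (or lean on a tool like exiftool), but the point stands: the label is just data riding along inside the file, and anyone can read it.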

You see a lot of photographers on social media complaining about their photos being flagged the same way. They feel that using tools in Photoshop or any other image manipulation software to make changes that don't alter the main subject isn't really using AI, and that it's unfair to lump their work in with the unhinged stuff that floats around. Maybe they're right, but I don't think so.

I'm not a big photography buff, so mine is an outsider's opinion, but I want more of this sort of label, not less. The way I see it, if your finished result doesn't look exactly like what the eye sees, it's art, not evidence. You may feel differently, but I think we can all agree that photography often does not capture what we actually see. Your Galaxy's Space Zoom couldn't really see the moon, remember?

We had a recent and spectacular example when the Northern Lights appeared over much of the United States and Canada. For a lot of people in northern latitudes, seeing the Aurora Borealis is something that happens every once in a while. For a lot more people who don't live that far north, it felt like something special when our phone cameras could take pictures of them.

Even when we couldn't see them.

Northern Lights captured by the Pixel 9 Pro Fold with Astrophotography Mode

(Image credit: Andrew Myrick / Android Central)

Was that real? Was that a representation of what we could actually see? Or was it something so subtle that a computer algorithm could pick it out of random light data and enhance it? More importantly, does it matter?

Many of the photos I saw did a good job showing the phenomenon. I was lucky enough to see the lights with my own eyes during a trip to Alaska, and what I remember looked a lot like the enhanced images on social media. I've seen other photos of them looking far more spectacular in places like Iceland, and I assume they do look different in different places. I think our phone cameras did a good job showing us what we wanted to see but what was too faint for our eyeballs to pick out.

I loved it. It was so cool that our little pocket computers could fix things in such a way that we could save a slice of life forever, even if it wasn't 100% "real." This is one of the things that make a modern smartphone so damn awesome.

Think about other ways we can make a good picture even better. If my dog is out in the yard doing doggy things and I grab a picture of her looking magnificent, is it OK for me to remove the dandelions I don't have the ambition to weed out? That doesn't really change the picture, does it?

Yes, it does. Again, it doesn't matter though. It made my photo look better to my eyes and is something I will want to keep around to look at later. It helped me create art. Art is what the artist wants it to be, and I don't want to see those dandelions.

30x zoom photo of the moon taken with the Galaxy S22

(Image credit: Derrek Lee / Android Central)

Not all photos should be art, though. If the person you paid to fix your roof did a poor job and you take a photo of it, that photo can't be art; it has to be evidence. Other, even more important photos have to be real, too. There is a huge difference, even though both kinds come from the same hole in the same little computer. If I edit the picture of the shoddy work on my roof, having data that shows it was edited is important, because a judge isn't going to climb a ladder and look at it in person; they might look at the photo details, though.

My aurora pics and photos of my lawn looking pristine aren't for a judge, but they also aren't the same as some AI social media post showing a politician rescuing babies from flood waters. They aren't fake even if they aren't exactly "true," so I can see why photographers don't like the AI tag.

The problem isn't really the AI tag, though; it's that we equate a retouched sunset photo with all the AI nonsense pictures out there. That's a way of thinking society needs to work through until we realize that image editing isn't new and isn't bad; people doing it for the wrong reasons are the problem.

Don't judge someone because Google told Instagram they made small changes so their picture looks better. They aren't the same as your crazy relative who posts made-up meme pictures and conspiracy theories all day on Facebook (we all have one). And don't be mad that the companies giving you AI to make your pictures better aren't afraid to say you used it.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.

  • SeeBeeEss
    Spot on. Hopefully, the world's ne'er-do-wells, fraudsters, and tricksters will have a difficult time circumventing the metadata. Let's face it, though, they are a resourceful lot. 😉