Your Face Is Out There. And Someone Is Already Using It.

I want to talk about something that hit me recently, and I think it should hit you too.

You have probably heard about Grok, the AI built by xAI. A few days ago, people started realizing that Grok was generating explicit, sexual images. Of real people. People who never agreed to that. People who had absolutely zero idea their face was being used that way. It was disturbing to watch unfold online. And what made it worse was how easy it apparently was. A few prompts, a publicly available photo, and suddenly someone's face is attached to something they would be horrified to see.

I sat with that for a while. And then I started thinking about my own digital footprint. My own photos. The ones I posted years ago without a second thought. A profile picture here. A tagged photo from a friend's wedding there. A group shot from a work event that somehow ended up on a company website. And I realized something uncomfortable: I have very little control over what someone could do with those images today.

That feeling is what this piece is about.

We Grew Up Being Told to Share

Think about the internet culture we were handed. Share more. Post more. Build your personal brand. Be authentic online. For years, sharing your face freely was considered normal, even encouraged. LinkedIn wanted a professional headshot. Instagram rewarded consistency. Facebook practically begged you to tag yourself and your friends in every photo ever taken.

We were not thinking about AI image generation in 2012. We were thinking about getting likes.

But here is the thing about data, and photos specifically: once they are out there, they are out there. Screenshots get taken. Images get scraped. Platforms get sold to new owners with different values. What you posted on a platform that promised privacy can end up indexed somewhere you have never heard of.

And the tools that exist today to manipulate those images are breathtaking in their capability. What used to require a professional visual effects artist and hours of work can now be done by almost anyone, in minutes, on a laptop. The barrier is essentially gone.

What Deepfakes and AI Generation Actually Mean for Ordinary People

There is a tendency to think this is a celebrity problem. Famous people, politicians, public figures. And yes, they are disproportionately targeted. But the technology has democratized. It does not care whether you have a million followers or forty-three. All it needs is a clear enough image of your face.

Deepfake pornography is the most talked-about abuse, and for good reason. It is violating in a way that is genuinely hard to put into words. Your face, your identity, your likeness, attached to something deeply intimate and deeply wrong, shared without your knowledge or consent. The psychological damage that causes is real and documented.

But it goes beyond that. AI-generated images can be used to fabricate evidence. To harass someone at their workplace. To blackmail. To impersonate. To build fake profiles for scams. To put your face in a context you were never in: political, criminal, anything. The range of harm is wide.

And right now, the legal frameworks to deal with this are lagging badly behind the technology. Some countries have started legislating against deepfake pornography specifically. But enforcement is slow, jurisdictions are complicated, and by the time any legal remedy arrives, the damage is often already done.

The Hard Truth About Photos You Have Already Posted

Removing photos from the internet is genuinely hard. I want to be upfront about that. You can delete a post from Instagram, but that photo may have already been saved, screenshotted, scraped by a third-party app, or cached somewhere. The original deletion helps, but it is rarely a complete solution.

That said, it matters. Here is what I did and what I would recommend.

Start with a Google search of your own name. Look at the Images tab specifically. You will probably find things you forgot existed. Old forum profile pictures. A photo from a news article. An image from an event page. Note every source.

Then run a reverse image search using your clearest, most widely used photos. Google Images lets you upload a photo and find where else it appears online. TinEye is another solid tool for this. Yandex, surprisingly, has one of the most powerful reverse image search engines available and often surfaces results the others miss.
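If you want to make this checkup repeatable, the three engines above all accept an image URL directly in the address bar. The sketch below builds those search links for a photo you already host publicly. Note the assumptions: the query-string formats shown are unofficial conventions, not documented APIs, and may change at any time, and the image URL used is purely hypothetical.

```python
import urllib.parse
import webbrowser

# Unofficial query-string endpoints for the engines mentioned above.
# These formats are not documented APIs and may change without notice.
ENGINES = {
    "Google": "https://www.google.com/searchbyimage?image_url={}",
    "TinEye": "https://tineye.com/search?url={}",
    "Yandex": "https://yandex.com/images/search?rpt=imageview&url={}",
}

def reverse_search_urls(image_url: str) -> dict:
    """Build one reverse-image-search URL per engine for a publicly hosted photo."""
    encoded = urllib.parse.quote(image_url, safe="")
    return {name: template.format(encoded) for name, template in ENGINES.items()}

if __name__ == "__main__":
    # Hypothetical image URL, for illustration only.
    urls = reverse_search_urls("https://example.com/me/headshot.jpg")
    for engine, url in urls.items():
        print(engine, url)
        # webbrowser.open(url)  # uncomment to open each search in your browser
```

Save it once, swap in your own most widely used photos, and rerunning the checkup takes seconds instead of a round of copy-pasting.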

Once you have a list of where your images live, start requesting removal. Most platforms have a reporting mechanism. For standalone websites, look for a contact email and send a direct request citing privacy concerns. Many site owners will comply, especially smaller ones. Larger platforms have formal processes.

Google has a tool called "Results About You" that lets you request the removal of certain personal information from search results. This does not delete the content from the source website, but it does delist it, which meaningfully reduces discoverability.

For social media, go through your profiles and audit what is public. Ask yourself: does this photo need to be public? For most people, answering that question honestly will significantly reduce the number of publicly visible images.

What About Photos Other People Posted of You?

This is where it gets more complicated. You have rights over your own likeness in many jurisdictions, but exercising those rights requires knowing the photos exist, knowing where they are hosted, and then navigating each platform's individual process for reporting content you appear in but did not post yourself.

Facebook and Instagram allow you to request removal of photos you appear in, and the same goes for Google Photos if they were shared. LinkedIn lets you flag images. X (formerly Twitter) has a process too, though responsiveness varies.

For photos on websites outside of major platforms, it gets harder. You can send a formal request citing data protection regulations. If you are in the European Union or UK, GDPR gives you a genuine right to erasure that carries legal weight. Data protection authorities in those regions can assist if site owners refuse. In the US, the legal landscape is patchier, but many states are beginning to pass their own privacy legislation.

The honest reality is that some photos will be effectively impossible to fully scrub. But reducing the total number of high-quality, publicly accessible images of yourself meaningfully reduces your exposure. It is about raising the effort required to target you.

Going Forward: What a Healthier Relationship With Posting Looks Like

I am not saying stop living your life online. I am saying be deliberate about what goes where.

Before posting a photo of yourself, think about whether it needs to be public or whether it could be shared privately with the people who actually matter. Think about the quality and clarity of the image. A distant, low-resolution photo in a group setting is far less useful for AI manipulation than a sharp, well-lit solo portrait.

Set your social media profiles to private where possible. Review your tagged photos regularly and untag yourself from anything you are uncomfortable with. Ask friends to check with you before posting photos of you.

Do a reverse image search of yourself every few months. Make it a habit, like checking your credit report. You want to catch new appearances of your image early.

If you have professional photos online, consider whether watermarking them is appropriate. If you are a public figure or have a professional presence that requires some photos to be publicly accessible, watermarks create friction for anyone trying to misuse those images.

This Is a Collective Problem That Requires Individual Action Right Now

The platforms should do more. The AI companies should build stronger safeguards. Legislators need to move faster. All of that is true.

But waiting for institutions to protect you is a strategy that has consistently let people down. The Grok situation was a reminder that these tools exist, they are being used, and the people building them are not always prioritizing your safety as a user or as a subject.

Your face is yours. Your likeness is yours. And while the internet has made it easier than ever to share those things freely, it has also made it easier than ever for that sharing to be exploited in ways that can genuinely upend your life.

Taking stock of your digital image footprint is one of the most practical things you can do for your personal safety right now. It is tedious. Some of it will feel futile. But it matters.

Start today. Search your name. Run the reverse image search. Make the removal requests. Audit your profiles.

Because the alternative, finding out the hard way that someone already did something with your image, is a situation you deserve to be ahead of, not behind.