
Fake photo verifier

Verify whether a photograph was edited, generated by AI or reused from another context before you publish it. Forensic analysis in under 15 seconds.

Quick answer

Drop the image into ScanTrace. Three forensic layers return REAL, AI_GENERATED, MANIPULATED or INDETERMINATE with a 0–1 confidence score and a written explanation listing the signals that drove the decision.

Analyze your image now

Create a free account to access the full 3-layer forensic analysis and downloadable PDF certificate.

Get started free — 15 scans/month

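In an editorial pipeline, that verdict is typically used as a publish gate. A minimal sketch of such a gate is below; the JSON field names (`label`, `confidence`, `explanation`) and the 0.8 threshold are illustrative assumptions, not ScanTrace's documented API or defaults.

```python
import json

# Assumed editorial policy: only publish on a confident REAL verdict.
PUBLISH_THRESHOLD = 0.8

def publishable(raw: str, threshold: float = PUBLISH_THRESHOLD) -> bool:
    """Return True only for a REAL verdict at or above the threshold.

    The payload shape here is hypothetical, for illustration only.
    """
    verdict = json.loads(raw)
    return verdict["label"] == "REAL" and verdict["confidence"] >= threshold

demo = '{"label": "REAL", "confidence": 0.93, "explanation": "..."}'
print(publishable(demo))   # confident REAL verdict passes the gate
print(publishable('{"label": "AI_GENERATED", "confidence": 0.99}'))  # blocked
```

Anything below the threshold, including a low-confidence REAL, falls through to manual review rather than auto-publish.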

How fake photos are typically created in 2026

Three families dominate: (1) generative AI via Midjourney, Flux or DALL·E producing entirely fabricated images; (2) local manipulation via Photoshop's Generative Fill, Stable Diffusion inpainting or smartphone "magic editor" features that swap objects without changing the rest of the frame; (3) recontextualisation, the cheapest and most effective — a real photo from 2014 republished as if it were captured today.

How ScanTrace detects each family

Generative images leave statistical fingerprints in the frequency domain, plus EXIF metadata that is either blank or carries a telltale tag such as "Software: Midjourney". Local edits create internal inconsistencies: JPEG ghosts, mismatched noise residues, lighting that does not match the scene's geometry. Recontextualisation is the hardest to catch forensically; it is best detected by reading the EXIF capture date and running a reverse image search.
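The metadata layer of that triage can be sketched in a few lines. This is a simplified illustration, not ScanTrace's detector: tag IDs 305 (Software) and 306 (DateTime) are standard EXIF fields, the generator keyword list is an assumption, and with Pillow you would obtain such a tag map via `Image.open(path).getexif()`.

```python
# Illustrative generator names only; real tooling maintains a larger list.
GENERATOR_HINTS = ("midjourney", "dall", "stable diffusion", "flux")

def triage_exif(exif: dict) -> str:
    """Classify an EXIF tag-id -> value map into a rough triage bucket."""
    software = str(exif.get(305, "")).lower()   # tag 305: Software
    if any(hint in software for hint in GENERATOR_HINTS):
        return "suspect_generated"   # generator left its name behind
    if not exif:
        return "no_metadata"         # blank EXIF: weak signal, common in AI output
    if 306 in exif:                  # tag 306: DateTime
        return "check_capture_date"  # compare against the claimed context
    return "inconclusive"

print(triage_exif({305: "Midjourney v6"}))        # suspect_generated
print(triage_exif({}))                            # no_metadata
print(triage_exif({306: "2014:06:01 12:00:00"}))  # check_capture_date
```

Note that a blank or generator-stamped EXIF block is only a weak signal on its own, since metadata is trivially stripped or forged; that is why it is combined with pixel-level analysis.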

Statistics that justify systematic verification

The Reuters Institute's 2024 Digital News Report found that 59% of internet users globally are concerned about their ability to distinguish real from fake content online. Poynter cataloged a 320% rise in image-based misinformation incidents between 2022 and 2025. Newsrooms that deploy systematic image verification report a 74% drop in published corrections tied to visual content.

| Use case | ScanTrace | Reverse image search | Manual EXIF reader |
| --- | --- | --- | --- |
| Detect AI generation | Yes | No | Partial |
| Detect local Photoshop edits | Yes | No | No |
| Find reused images | Yes (Pro) | Yes | No |
| Single combined verdict | Yes | No | No |

Frequently asked questions

How do I know if a photo is fake?

Run it through a forensic analyzer like ScanTrace. The tool checks pixel-level statistics, EXIF metadata and contextual signals and returns REAL, AI_GENERATED, MANIPULATED or INDETERMINATE in under 15 seconds.

Can it detect Photoshop edits?

Yes. Splice detection looks for inconsistent JPEG quality, noise patterns and lighting between regions. Heavy localised edits (added/removed objects, swapped faces) are caught with high confidence; subtle colour grading is intentionally not flagged.
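The noise-consistency idea behind splice detection can be illustrated with a toy sketch: a patch pasted from another source (or generated and denoised) usually carries a different noise level than the surrounding sensor noise. This is a deliberately crude 1-D illustration with an assumed 3× tolerance ratio, not the detector ScanTrace runs.

```python
import random
from statistics import pstdev

def noise_level(region):
    """Crude noise estimate: stdev of adjacent-sample differences."""
    diffs = [b - a for a, b in zip(region, region[1:])]
    return pstdev(diffs)

def regions_consistent(a, b, ratio=3.0):
    """Flag a possible splice when one region is far noisier than the other."""
    lo, hi = sorted((noise_level(a), noise_level(b)))
    return hi <= ratio * max(lo, 1e-6)

random.seed(0)
camera = [128 + random.gauss(0, 4) for _ in range(500)]    # sensor-like noise
pasted = [128 + random.gauss(0, 0.2) for _ in range(500)]  # denoised/synthetic patch
print(regions_consistent(camera, camera[:250]))  # same source: consistent
print(regions_consistent(camera, pasted))        # mismatch: possible splice
```

Real splice detectors work in 2-D on noise residues, JPEG quantisation traces and lighting geometry together, which is why a global colour grade (which shifts all regions uniformly) is not flagged.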

Does it find images reused from other contexts?

ScanTrace ships with reverse-image-search hooks for the Pro and Newsroom plans. The hash-based cache also catches identical re-uploads instantly.

What file formats are supported?

JPEG, PNG and WebP up to 5 MB on the free plan and 10 MB on Pro and Newsroom.

How is it different from a reverse image search?

Reverse image search tells you where else an image has appeared. ScanTrace tells you whether the image itself is authentic, generated by AI or digitally manipulated — independent of whether it has ever been published.
