TheOthernews
Fact Check & Misinformation

Social media clips of crying U.S. soldiers may be AI-generated. Here’s how to spot them

By nick · April 16, 2026 · 7 Mins Read


“Mom, Dad, checking in,” one person who appears to be a U.S. service member tells the camera. “I’m good, okay? No need to worry about me.”

Surrounded by snow and ice, she continues: “It’s freezing out here and I’m soaked. But for the safety of the American people, and to help make America great again, I’m standing my ground. Do I earn your follow and your thumbs up yet? Stay safe.”

That isn’t a U.S. service member or even a real person. The video was generated with artificial intelligence. 

It’s one of many fake videos showing similar scenarios — U.S. service members crying or in dire conditions. Some show female soldiers in barren locations, in uniform, sniffling or crying. They address their parents directly. Others show male soldiers tearfully addressing their partners.

Fake videos of service members weeping and seeking empathy from viewers have surfaced during other conflicts, including the Russia-Ukraine war, and have continued during the Iran war. Thirteen U.S. service members have been killed as of April 15. 

Creators have a financial incentive to produce emotional content. Viral videos can earn money for the creators through social media platforms’ programs. Or their videos can direct users to websites prompting them to make purchases, or steal their personal information. 

PolitiFact found at least 11 TikTok, Facebook and YouTube accounts that primarily post AI-generated videos of service members, with more than 174,000 followers combined. The fake military videos gained 29.6 million views collectively, with average views per account ranging from 628 to 466,192. Some video labels disclose that they are AI-generated, but even in those cases, people commenting on the videos don’t seem to realize they’re fake.

PolitiFact contacted Meta, YouTube and TikTok about the accounts. The accounts we inquired about became unavailable as of April 15.

A TikTok spokesperson said the platform’s Community Guidelines prohibit AI-generated content that presents misleading information on matters of public importance, such as an active conflict, and that they have removed the accounts we shared.

Facebook removed the accounts we flagged for violating its policies, a spokesperson said, adding that the pages were not monetized. 

YouTube also removed a channel we inquired about, a spokesperson said, for violating their spam policies.

Shannon Razsadin, chief executive officer of the nonprofit organization Military Family Advisory Network, said military families are encountering such videos and questioning what is real. 

“These videos heighten anxiety by presenting scenarios that may not reflect reality, which can compound fear for families already navigating a lot of unknowns,” she said.

Mary Bennett Doty, associate director of programs at We the Veterans & Military Families, said such content adds to inflammatory rhetoric and could deepen division.

Videos show emotional service members talking about their families, fallen soldiers

The accounts often reuse one type of background and script for their videos, typically sticking to videos of only men or only women, or pivoting from one to the other.

(Screenshots from TikTok and YouTube)

One page named “US Soldier Legacy,” for example, contained videos of women crying and talking over the sound of jets, with smoke in the background.

In one video posted by a TikTok account named “Usa Soldier Life,” with more than 764,000 views on TikTok, a man stands in the foreground with a flag-draped coffin in the distance, saying through tears, “I’m gonna miss you brother, I hope, I hope you know how much we love you. I love you, man. Rest easy.”

Other pages primarily show male service members, often holding photos presumably of their loved ones and addressing their partners. One page’s captions say the videos are their last messages to their families. 

Accounts often seek to monetize content 

Many accounts don’t appear to seek money, but some provide a way for viewers to potentially contact the account holders, such as through a telephone number or a website, for reasons that could include selling items or leading people to a phishing scam.

These accounts follow a trend that uses AI to create synthetic “influencers” and other deepfake content related to politics.

For example, one profile of a female service member named “Jessica Foster” that gained 1 million followers on Instagram while posting images of her with President Donald Trump and other political figures was AI-generated. The account linked to a separate page where the profile sold exclusive fetish content.

Accounts like these can make money through viewers’ engagement with the content, or can direct users to other websites that sell products. Daniel Schiff, a Purdue University assistant professor of technology policy, said people risk exposure to cyberattacks and information theft.

“Accounts may post sympathetic or incendiary information to leverage people’s emotions or draw their attention,” he said. “Once that account has enough followers, they may post links to external content, which could range from selling clothing to selling intimate content.” Schiff said many of these accounts are driven by economic motives. 

In one video, the AI-generated character cries and says he’s thinking about home, then promotes a “shop link” in the account’s bio description as he continues crying; the bio, however, contained no such link. One Facebook page with 31,000 followers called “Brave Marine,” featuring similar videos of male soldiers, linked to a website featuring job listings for a maritime company. A telephone number listed for the website’s registration has previously been connected to fraud campaigns.

The content undermines trust in information sources that military families rely on, Razsadin said.

“Many official entities like military branches, helping agencies or military service organizations like ourselves also use social media to communicate verified content to military families,” she said. 

How to identify fake videos of service members

If you have doubts about a video’s authenticity, check the account that posted it. If it consistently posts videos with different people saying the same things, it’s one indicator the videos could be AI-generated.

The profile’s creation date and posting volume also can be a signal. Some accounts we saw were created around the time the Iran war began, and have been posting consistently since.

“Many of these accounts are relatively new and engage in fairly uniform patterns of influence-style posting,” Schiff said. 

Some dubious accounts primarily post attractive young women in uniform. Gregory Daddis, a Texas A&M University history professor who served in the U.S. Army for 26 years, said that even when the women in the videos have muddy or scratched faces, they are still portrayed as attractive.

“Nearly perfectly waxed eyebrows across the board seems telling to me,” he said.

The uniforms also can be a giveaway. In one April 12 video, a female service member said, “Dad, it’s almost Christmas. I miss you so much, but for the safety of the American people, I have to hold the line out here. Could you tap the little red plus on my profile to support me? Um, I love you both. Stay safe.”

Looking closer at her uniform shows her name is gibberish, and the “U.S.” has three periods. 

Illegible text and spelling or grammar mistakes are common in these videos. Daddis said the rank insignia is out of place in several AI-generated videos or features inaccurate symbols. On combat uniforms, insignia appear on a patch in the middle of the chest, but some videos show them to the side or missing entirely.

Some videos still have a watermark indicating they were made with AI. One example is Veo, Google’s AI video creator. Watermarks can be cropped out, but another tell that a video was AI-generated is its length: Veo, for one, can typically make videos only up to eight seconds long. 
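The account-age, posting-rate and clip-length cues described above can be combined into a simple triage heuristic. Below is a minimal sketch in Python; the metadata fields, thresholds and class names are illustrative assumptions for a researcher collecting data manually, not a real platform API, and no single signal here proves a video is AI-generated:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ClipMetadata:
    # Hypothetical fields a researcher might record while reviewing an account
    duration_seconds: float
    account_created: date
    posts_since_creation: int

def suspicion_signals(clip: ClipMetadata, today: date) -> list[str]:
    """Return human-readable red flags; each is a cue for closer review, not proof."""
    flags = []
    # Generators like Veo typically cap clips at roughly eight seconds
    if clip.duration_seconds <= 8:
        flags.append("very short clip (consistent with AI video generators)")
    account_age_days = (today - clip.account_created).days
    # Many flagged accounts were created around the start of the conflict
    if account_age_days < 90:
        flags.append("recently created account")
    # A steady stream of uniform posts from a new account fits the pattern
    if account_age_days > 0 and clip.posts_since_creation / account_age_days > 2:
        flags.append("unusually high posting rate")
    return flags
```

A clip from a weeks-old account that posts several near-identical videos a day would trip all three flags, which is a cue to look at the uniform details and captions more closely.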

“Be cautious of content that relies heavily on emotion but lacks specifics,” Razsadin said.

Staff Writer Maria Briceño contributed to this report.

RELATED: Social media feeds are awash with Iran war misinformation. Here’s how to identify false imagery

 




