Last month, OpenAI released Sora 2. It can create photorealistic videos from a few lines of text. The results look real.
Too real. When you see them, you understand that the internet just crossed a line. The world is about to be flooded with content that looks authentic but isn’t.
I’ve seen this story before, just in a different industry. Real estate was one of the first to adopt automation at scale. Agents now use AI to create listing descriptions, virtual staging, email marketing, and social media content. What used to take days now takes minutes. The problem is that when everyone automates, authenticity becomes one of the last things left for differentiation.
The University of Melbourne and KPMG recently conducted the most extensive global study of its kind, surveying more than 48,000 people across 47 countries. The findings were clear: the more exposure people have to AI, the less they trust what they see. Confidence in AI use drops sharply when there’s no transparency about how or where it’s applied. Across every region, respondents called for stronger regulation and transparent disclosure.
The study confirms what many already sense: trust is eroding, and people increasingly recognize the need to distinguish between AI-generated and human-created content.
The biggest takeaway is that TRUST is the most critical factor in determining whether people accept and embrace AI, outweighing awareness of benefits, risk perception, and technical understanding.
A Data Deluge: Millions of AI-Generated Posts Flood Our Feeds
Social media is inundated with AI-generated content. Millions of synthetic posts appear every week. Moderation systems can’t keep up. Fake images, fake videos, fake profiles. The feed looks human, but most of it isn’t. You don’t have to take my word for it; here is a video that took me three minutes to make. It is me in a futuristic montage, all courtesy of Sora 2 by OpenAI.
My feeling is that when you remove the human source, trust disappears. Misinformation spreads faster, is cheaper to create, and is harder to trace. The public square that once fostered genuine connection now operates on scale rather than substance.
According to the 2025 Edelman Trust Barometer, three out of four respondents (75%) in certain countries—and nearly two-thirds globally—agree it is becoming increasingly difficult to discern whether news comes from respected media organizations or from individuals attempting to deceive the public. This finding, drawn from a survey of over 16,500 people in 26 countries, underscores the growing challenge of credibility and authenticity in today’s media landscape—a problem exacerbated by the broad reach and influence of social media.
You could argue that the constant barrage of synthetic images and posts has made it harder for people to connect and engage, leading to a sense of alienation and erosion of trust, even among long-time users of well-known platforms. For businesses, the inability to prove authenticity means losing customer loyalty and credibility; when users can’t trust what they see, they seek out brands and creators who are transparent about their AI use and prioritize clear, human-driven communication.
Facebook earned trust through exclusivity. In 2004, a verified college email address was required to join. In a LinkedIn article, Madhav Gupta discussed why Friendster and MySpace failed while Facebook flourished: Facebook built its trust on real networks. Everyone was who they said they were. But growth changed that. Fake accounts, misinformation, and engagement bait took over. One could argue that Facebook still has reach, but its credibility is fractured.
LinkedIn built its brand on authenticity. Profiles were tied to careers, and reputation was currency. I was an early adopter; I joined in 2007 and never looked back. Today, AI-generated résumés, auto-posting bots, and synthetic thought leadership are testing that system. The same tool that connected professionals now risks becoming another content mill.
TikTok built trust through raw creativity. People filmed in their bedrooms, not studios. The videos were messy and human. That made them magnetic. Now, AI-generated clips, influencer farms, and algorithmic optimization are crowding out what once made it feel real.
X (formerly Twitter) built its authority on immediacy. You could see events unfold in real time and trust the verified voices reporting them. That changed when verification became a paid feature. The blue check no longer signifies credibility; it signals a transaction. The platform that once built movements now struggles to prove authenticity.
Real Estate's Automation Lesson: When Trust Is Your Only Differentiator
Real estate offers a preview of what's to come. Automation helps agents scale, but it also dilutes expertise. The perfect photos, flawless copy, and smooth marketing hide one truth: it’s easy to fake authority. Buyers and sellers depend on digital impressions to make big financial decisions. That makes transparency more than ethical—it’s economic.
The Flight to Quality
Every boom ends the same way. The internet bubble sent users to trusted brands. The crypto crash forced exchanges to verify identity. The next flight to quality will happen on social media.
A flight to quality refers to a phenomenon where individuals, investors, or businesses move away from risky, uncertain, or lower-quality options and seek out safer, more trustworthy, and higher-quality alternatives—especially during times of disruption or uncertainty.
The winners will be the platforms that prove what’s real. Verification, content provenance, and clear AI disclosure will become the price of admission. Users will pay for trust. Advertisers will demand it. Platforms that ignore it will fade into the noise.
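To make “content provenance” concrete, here is a minimal sketch of the idea in Python: a creator signs a hash of the content together with an explicit AI-disclosure flag, and a platform verifies that record before labeling the post. The function names and the shared-secret scheme are my own simplification for illustration; real provenance standards such as C2PA rely on public-key certificates and much richer manifests.

```python
import hashlib
import hmac
import json

# Simplified illustration of content provenance (hypothetical names).
# Real systems use public-key signatures and certificate chains; a
# shared-secret HMAC is used here only to keep the sketch runnable.

SECRET = b"creator-signing-key"  # stand-in for a creator's private key


def sign_content(content: bytes, ai_generated: bool) -> dict:
    """Create a provenance record: content hash + AI disclosure + signature."""
    record = {
        "sha256": hashlib.sha256(content).hexdigest(),
        "ai_generated": ai_generated,  # explicit AI disclosure
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return record


def verify_content(content: bytes, record: dict) -> bool:
    """Check that the record is untampered and matches the content."""
    claimed = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, record["signature"])
        and claimed["sha256"] == hashlib.sha256(content).hexdigest()
    )


if __name__ == "__main__":
    post = b"Listing video for 123 Main St."
    rec = sign_content(post, ai_generated=True)
    print(verify_content(post, rec))         # True: authentic, AI use disclosed
    print(verify_content(post + b"x", rec))  # False: content altered after signing
```

If a platform can run a check like this on every upload, disclosure stops being an honor system and becomes something it can verify and display.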
AI’s Promise and Its Challenge: Leveling the Playing Field or Undermining Trust?
AI is not the enemy. It levels the field. It gives small businesses, agents, and creators the power to compete with big firms. One person can now produce what once took a team. When used correctly, AI amplifies human creativity rather than replacing it. The problem isn’t the tool; it’s the lack of honesty about how it’s used.
I feel that we may be heading into a new phase, one in which we will not reward volume.
We will reward credibility.
Facebook could rebuild its identity via integrity.
LinkedIn could protect professional authenticity.
TikTok could find a way to separate creativity from mimicry.
X could redefine verification as truth rather than status.
In real estate, the path is the same. Combine AI efficiency with verified human insight. Disclose where automation is used. Keep the human voice visible. The same holds for developers of the next generation of applications: as AI co-pilots become more accessible, the winners will be those who can establish trust the fastest and most effectively.
Action Steps: How Platforms, Creators, and Users Can Rebuild Trust
The moment when quality becomes scarce is already here.
Platforms should invest in identity and provenance verification.
Creators and businesses should disclose when they use AI.
Users should choose trusted sources and demand higher standards.
Sora 2 shows that anyone can make something that looks real. But trust isn’t something you can generate. It’s earned, person by person, post by post.
The future of social media won’t be about who posts the most.
It will be about who we believe.
xoxo.
Maximillian Diez, GP
Twenty Five Ventures
P.S. Stay with me on this journey.
If nothing else, thanks for reading.