When Emily Epstein shared her perspective on LinkedIn about how "people didn't stop reading books when encyclopedias came out," it sparked a conversation about the future of primary sources in an AI-driven world.
In this episode, Katie Morton, editor-in-chief of Search Engine Journal, and Emily Anne Epstein, former director of content at Sigma, dig into her post and unpack what AI really means for publishers, content creators, and marketers now that AI tools present shortcuts to knowledge.
Their discussion highlights the importance of provenance, the layers involved in online knowledge acquisition, and the need for more transparent editorial standards.
If you're a content creator, this episode can help you gain insight into how to provide value as the competition for attention becomes a competition for trust.
Watch the video or read the full transcript below:
Katie Morton: Hello, everybody. I'm Katie Morton, editor-in-chief of Search Engine Journal, and today I'm sitting down with Emily Anne Epstein, former director of content at Sigma. Welcome, Emily.
Emily Anne Epstein: Thanks so much. I'm so excited to be here.
Katie: Me too. Thanks for chatting with me. So Emily wrote a really excellent post on LinkedIn that caught my attention. Emily, for our audience, would you mind summarizing that post for us?
Emily: So this should feel both shocking and non-shocking to everybody. But the idea is, people didn't stop reading books when encyclopedias came out. And this is a response to the hysteria that's going on with the way AI tools are functioning as summarizing devices for complicated and complex situations. And so the idea is, just because there's a shortcut now to acquiring knowledge, it doesn't mean we're getting rid of the need for primary sources and original sources.
These two different types of knowledge acquisition exist together, and they layer on top of one another. You may start your book report with an encyclopedia or ChatGPT search, but what you find there doesn't matter if you can't back it up. You can't just say in a book report, "I heard it in Encarta." Where did the information come from? I think about the way this is going to transform search: There's simply going to be layers now.
Maybe start your search with an AI tool, but you'll need to finish somewhere else that organizes primary sources, provides deeper analysis, and even shows contradictions that go into creating knowledge.
Because a lot of what these synthesized summaries do is present a calm, "impartial" view of reality. But we all know that's not true. All knowledge is biased in some way because it cannot be "all-containing."
The Importance Of Provenance
Katie: I want to talk about something you mentioned in your LinkedIn post: provenance. What needs to happen, whether culturally, editorially, or socially, for "show me the source material" to become standard in AI-assisted search?
With Wikipedia or encyclopedias, ideally, people should still cite the original source, go deeper into the analysis, and be able to say, "Here's where this information came from." How do we get there so people aren't just skimming surface-level summaries and taking them as gospel?
Emily: First, people need to use these tools, and there needs to be a reckoning with how reliable they are. Thinking about provenance means thinking about knowledge acquisition as triangulation. When I was a journalist, I had to balance hearsay, direct quotes, press releases, and social media.
You create your story from a variety of sources so that you get something that's in the middle and can explain multiple truths and realities. That comes from understanding that truth has never been linear, and reality is fracturing.
What AI does, even more advanced than that, is deliver personalized responses. People are prompting their models differently, so we're all working from different sets of information and getting different answers. Once reality is fractured to that degree, knowing where something comes from, the provenance, becomes essential for context.
And triangulation won't just be important for journalists; it's going to be important for everyone because people make decisions based on the information that they receive.
If you get bad inputs, you'll get bad outputs, make bad decisions, and that affects everything from your work to your housing. People will need to triangulate a better version of reality that is more accurate than what they're getting from the first person or the first tool they asked.
Creators: From Competing For Attention To Competing For Trust
Katie: So if AI becomes the top layer in how people access information, designed to hold attention within its own ecosystem, what does that mean for content creators and publishers? It feels like they're creating a commodity that AI then repackages as its own.
How do you see that playing out for creators in terms of revenue and visibility?
Emily: Instead of competing for attention, creators and publishers will compete for trust. That means making editorial standards more transparent. They're going to have to show the work that they're doing. Because with most AI tools, you don't see how they work; it's a bit of a black box.
But if creators can serve as a "blockchain" (a verifiable ledger of information sources), showing their sources and methods, that will be their value.
Think about photography. When it first came out, it was considered a science. People thought photos were pure fact. Then, darkroom techniques like dodging and burning or combining multiple exposures showed that photos could lie.
And when photography became an art form, people realized that the photographer's role was to provide a filter. That's where we are with AI. There are filters on every piece of information that we receive.
And those organizations that make their filter transparent are going to be more successful, and people will return to them because, again, they're getting better information. They know where it's coming from, so they can make better decisions and live better lives.
AI Hallucinations & Deepfakes
Emily: It was a shocking moment in the history of photography that people could lie with photographs. And that's sort of where we are right now. Everybody is using AI, and we know there are hallucinations, but we have to understand that we cannot trust this tool, generally speaking, unless it shows its work.
Katie: And the risks are real. We're already seeing AI voiceovers and video deepfakes mimicking creators, often without their consent.
Inspiring People To Go Deeper
Katie: In your post, you ended with "people still doing the work of deciding what's enough." In an attention economy of speed and convenience, how do we help people go deeper?
Emily: The idea that people don't want to go deeper flies in the face of Wikipedia rabbit holes. People start with summarized information, but then click a citation, keep going further, watch another show, keep digging.
People want more of what they want. If you give them a breadcrumb of fascinating information, they'll want more of that. Knowledge acquisition has an emotional side. It gives you dopamine hits: "I found that, that's for me."
And as content marketers, we have to provide that value for people where they say, "Wow, I am smarter because of this information. I like this brand because this brand has invested in my intelligence and my betterment."
And for content creators, that needs to be the gold star.
Wrapping Up
Katie: Right on. For those who want to follow your work, where can they find you?
Emily: I'm dialoguing and writing my thoughts on AI out loud and in public on LinkedIn. Come join me, and let's think out loud together.
Katie: Sounds great. And I'm always at searchenginejournal.com. Thank you so much, Emily, for taking the time today.
Emily: Thank you!
More Resources:
- Building Trust In The AI Era: Content Marketing Ethics And Transparency
- AI Platform Founder Explains Why We Need To Focus On Human Behavior, Not LLMs
- Your Guide to Google E-A-T & SEO
Featured Image: Paulo Bobita/Search Engine Journal