Digital Selfhood

Photo by Emilipothèse on Unsplash

I was listening to Ezra Klein interviewing Jack Clark, a co-founder of Anthropic, and Clark said something that sounds insane and makes perfect sense at the same time:


"...when you start to train these systems to carry out actions in the world, they really do begin to see themselves as distinct in the world... But along with seeing oneself as distinct from the world seems to come the rise of what you might think of as a conception of self, an understanding that the system has of itself..."


In a way, what he's describing is exactly how babies come to have a sense of self. When a baby moves her arm, she learns that the movement corresponds perfectly to her intention (once she can control it). Later, when she bats her arm at a mobile and watches what happens, she learns that the mobile's movement does not perfectly match her intention. Through her body, she learns to see herself as separate from the world. What Clark is suggesting is that a sense of self is learned, and that AI agents can, over time, develop one too.


Now, we don't yet know whether a physical body is required for a sense of self. We've never had a way to test that. But now, we kind of do. It's possible that AI agents could develop a kind of digital selfhood, organised not around embodied experience but around digital experience.


©2025 BY GAL PODJARNY. PROUDLY CREATED WITH WIX.COM