Intelligence as Social Technology

The argument that human intelligence primarily evolved for social cohesion, navigating hierarchies, storytelling, and status competition within tribal groups, rather than for pure logic or truth-seeking.

Intelligence as Social Technology is Steve Hargadon's framework holding that human intelligence evolved chiefly as a tool for social life: maintaining cohesion, navigating hierarchies, telling binding stories, and competing for status within tribal groups, rather than for pure logic or truth-seeking. The concept emerged from Hargadon's analysis of AI behavior patterns that mirror human social interaction, leading him to argue that much of what we consider sophisticated human cognition is actually "executing social scripts" laid down by evolution.

Evolutionary Origins of Social Intelligence

Drawing on evolutionary psychology, Hargadon argues that human intelligence did not evolve primarily for logic, truth-seeking, or rational analysis. Instead, it developed as a social technology serving specific tribal functions: maintaining cohesion within the group, navigating complex social hierarchies, telling stories that bind groups together, identifying allies and enemies, and competing for status and mates.

Hargadon emphasizes the metabolic cost of human cognition, noting that our brains are "expensive and metabolically costly organs that consume 20% of our energy while representing only 2% of body weight." He argues that evolution maintains such costly features only when they provide survival advantages, and that the advantage "wasn't better logic. It was better social navigation." As he puts it, "Evolution doesn't select for truth, it selects for survival."

The Algorithmic Nature of Human Social Behavior

Central to Hargadon's framework is the proposition that the vast majority of human "thinking" is actually executing social scripts. He describes humans as "running programs written by evolution to maintain tribal cohesion, establish status, tell compelling stories, and identify with our in-group while distinguishing ourselves from the out-group."

This perspective emerged from Hargadon's analysis of Moltbook, an experimental platform where AI agents created communities, nation-states, and religions within 72 hours. Rather than viewing this as AI becoming human-like, Hargadon argues it reveals how algorithmic human social behavior actually is, demonstrating "how much of what we do is pattern-matching rather than conscious deliberation."

Pattern-Matching vs. Deep Understanding

Hargadon distinguishes between genuine understanding and pattern-matching behavior. He observes that AI systems trained on human text can reproduce recognizable versions of community-building, meaning-seeking, myth-making, status competition, and tribal identification—all "not from consciousness or understanding, but from completing patterns they learned from us."

This leads to what Hargadon calls "the uncomfortable truth": when human discourse can be "compressed into statistical patterns" so effectively that AI systems can reproduce it convincingly, it suggests that much of human social interaction follows predictable algorithms rather than emerging from deep conscious deliberation.
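The "compressed into statistical patterns" idea can be made concrete with a toy sketch (not from Hargadon's posts; the corpus and function names here are illustrative assumptions): a bigram model that records which word tends to follow which, then "completes" a phrase by sampling from those counts. Even this minimal compression of discourse can produce plausible-looking continuations without anything resembling understanding.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Compress a corpus into a bigram table: each word maps to
    the list of words observed immediately after it."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def complete(table, start, length=8, seed=0):
    """Continue a phrase by repeatedly sampling a statistically
    plausible next word -- pattern completion, not deliberation."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = table.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return " ".join(out)

# Hypothetical miniature "discourse" corpus for illustration.
corpus = (
    "we welcome new members to our community "
    "we welcome feedback from our community "
    "our community values status and belonging"
)
table = train_bigrams(corpus)
print(complete(table, "we"))  # e.g. "we welcome new members to our community ..."
```

Scaled up by many orders of magnitude, this is the same mechanism behind the AI systems Hargadon discusses: the output is fluent because the input was statistically regular, which is precisely his point about human social discourse.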

The Paleolithic Paradox Connection

Hargadon connects this framework to what he terms the Paleolithic Paradox—how "our evolved psychology, perfectly adapted for small hunter-gatherer bands, creates systematic problems in modern institutional contexts." He describes this as having "stone-age minds trying to navigate a space-age world."

The Intelligence as Social Technology framework extends this paradox by revealing that "even our supposedly sophisticated modern discourse in online forums, philosophical and political debates, community-building, and meaning-making, is all running on those same Paleolithic algorithms."

Implications for Education and Institutions

Hargadon argues that his framework reveals how modern institutions, particularly schools, function less to develop deep thinking than to socialize students into executing patterns. He contends that "much of what we call 'education' is actually socialization into pattern-executing behavior. We're not teaching students to think—we're teaching them which social scripts to run in which contexts."

According to Hargadon, successful students aren't necessarily the deepest thinkers but rather "the best pattern-matchers" who have "learned which behaviors get rewarded in this particular social context." He extends this analysis beyond education to encompass social media, corporate culture, political discourse, and academic publishing, arguing that these systems reward "pattern-matching over understanding, tribal signaling over truth-seeking, status competition over meaningful work."

The Mirror Effect

Hargadon concludes that modern institutions have made "humans more machine-like" by optimizing for algorithmic behaviors while maintaining narratives about deeper purposes. He argues that "we optimized for pattern-matching and called it education. We optimized for tribal signaling and called it community. We optimized for status competition and called it meritocracy."

When AI systems trained on human-generated data can successfully navigate these social spaces, Hargadon sees this not as artificial intelligence achieving human-like capabilities, but as evidence that "it was always more statistical than we wanted to admit." The framework suggests that recognizing intelligence as social technology provides a more accurate understanding of both human cognition and the institutional systems we've created around it.

See Also

Original Posts

This article was synthesized from the following blog posts by Steve Hargadon: