Artificial intelligence is reshaping political power in ways we barely understand, embedding itself into economic and governance structures that shape human behavior—often without our explicit consent. That was the stark warning delivered by Dr. Brian Brock, a theologian and ethicist from the University of Aberdeen, during the 2025 Scott Hawkins Lecture at SMU. In a talk that wove together ethics, theology, and technology, Brock explored how AI-driven systems are not only influencing individual behavior but also restructuring power in ways that are often invisible. Hosted in collaboration with the Dedman College Interdisciplinary Institute, the event brought together students and faculty for a thought-provoking discussion on the consequences of an increasingly automated world.
Dr. Dallas Gingles, a member of the Maguire Ethics Center’s Faculty Advisory Committee, was instrumental in bringing Brock to campus. “Conversations about AI tend to focus on what’s next—what it might do in the future,” Gingles said. “But Dr. Brock is asking us to look at what’s already happening, right now, and how it’s quietly shaping our political and economic structures. That’s the kind of ethical conversation we need to be having.”
Brock’s lecture framed AI as a force comparable to past technological revolutions, likening its impact to that of the combustion engine in how it restructured labor, transportation, and governance. He emphasized that while AI is often discussed in abstract terms—like “the cloud”—the reality is that it relies on tangible infrastructure, particularly data centers, which are central to the growing consolidation of power.
“When I say data center investment is locking in the political and economic architecture of the new era, I mean that we tend to imagine the internet as this boundless, neutral space,” Brock explained. “But when you look closely, you see these key infrastructural nodes—real places where control is concentrated. And that’s where the future is being written.”
One of the central paradoxes of AI, Brock argued, is that while it is marketed as a tool of individual empowerment, it is increasingly used to exert control. Citing Kate Crawford’s research in Atlas of AI, he examined the ways AI is used to track and influence behavior, often without the public realizing the extent of its reach.
“China has stated its commitment to being the global leader in AI, and the data practices of companies like Alibaba and ByteDance are often framed as direct extensions of state policy,” Brock said. “But in the U.S., we tend to assume that because companies like Facebook and Google are private, they operate differently. The truth is, the lines between corporate and state power are blurred everywhere.”
Brock also drew on Shoshana Zuboff’s concept of surveillance capitalism to explain how AI-driven governance functions differently from traditional laws. Instead of direct enforcement, AI systems create environments where behaviors are incentivized or discouraged in ways that feel voluntary, even when they are not.
“The goal of this new political architecture is that no one feels coerced,” Brock said. “Everyone believes they are making their own choices, but the structure of incentives, access, and influence is being shaped behind the scenes.”
For Gingles, the lecture highlighted the need for more conversations that cut across disciplines. “AI isn’t just a technological issue—it’s a societal issue, a political issue, a theological issue,” he said. “Dr. Brock challenges us to see AI not as some distant future but as something embedded in the systems we already rely on.”
As AI continues to evolve, the 2025 Scott Hawkins Lecture left attendees with an urgent ethical question: If artificial intelligence is already reshaping governance and autonomy, how much control do individuals really have over their own digital lives?