The Philosophers Talking About AI: Context, Flow, and the #geekproblem

This only touches on part of the event, as I had to leave early.

I was recently at a talk from the Oxford University series, “The Philosophers Talking About AI”. There were some underlying themes that are deeply relevant to how we think about privacy, information, and our current techno-social mess.

Action vs. Paralysis. The talk opens with the tension between the strong and weak drives of human decision-making. This plays out in a constant oscillation between conversation and paralysis: philosophically, we get stuck, debating endlessly, without acting. And in ethics, this inaction can be dangerous. If we don’t decide and act, we leave the field open for others to impose their decisions on us.

Rethinking Privacy. One of the more nuanced ideas from the talk is a definition of privacy not as secrecy, but as appropriate information flow.

"Privacy is not control, nor hiding – it’s about the right information flowing in the right way."

This is a key shift. Secrecy is often anti-human – it disrupts the flow of information, which is essential to human life and community. Instead, privacy is about appropriateness, about understanding which flows are legitimate in which contexts.

So what determines “appropriateness”? Social context – the idea of contextual integrity. Privacy, then, depends on social spheres, each defined by particular goals, values, and purposes. In each sphere, there are different expectations for how data should flow. These expectations aren’t always formal rules, but norms, often invisible until they’re violated.

The speaker brings in the idea of the transmission principle – that information shouldn’t flow without the right kind of consent or context. While consent matters, it’s not the only thing that legitimizes a flow. There are many transmission predicates in society that allow information to move in meaningful, appropriate, and socially beneficial ways.
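To make this concrete, here is a minimal sketch – my own, not the speaker’s – of what “appropriate flow” could look like if you wrote it down as data: a flow is described by who it is about, who sends it, who receives it, what kind of information it is, and the transmission principle it moves under, and it is then judged against the norms of a social sphere rather than a single consent tick-box. All the names, roles, and norms below are invented for illustration.

```python
# An illustrative sketch (not from the talk) of "contextual integrity" as data:
# a flow is judged against the norms of the social sphere it happens in,
# not against one global consent switch. All names here are made up.

from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    subject: str                 # who the information is about
    sender: str                  # who passes it on
    recipient: str               # who receives it (described by role)
    info_type: str               # what kind of information it is
    transmission_principle: str  # the condition under which it moves


# Each social sphere carries its own norms: which kinds of information may
# flow to which kinds of recipients, and under which transmission principle.
# Consent is only one such principle among many.
CONTEXT_NORMS = {
    "healthcare": {
        ("medical-record", "treating-clinician", "confidentiality"),
        ("medical-record", "patient", "on-request"),
    },
    "friendship": {
        ("personal-news", "friend", "reciprocity"),
    },
}


def is_appropriate(context: str, flow: Flow) -> bool:
    """A flow is appropriate when it matches a norm of its social context."""
    norms = CONTEXT_NORMS.get(context, set())
    return (flow.info_type, flow.recipient, flow.transmission_principle) in norms
```

The point is not the code, it’s the shape: appropriateness lives in the norms of the context, not in a single on/off switch owned by whoever built the platform.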

But here’s the mess: our (post)modern systems, especially those built by geeks, often ignore or misunderstand this. This ties directly into what I often call the #geekproblem. The problem is that geeks, driven by abstract logic and rigid notions of control, block too many flows. They implement blanket rules and dogmatic blocks rather than engaging with messy human norms. Worse, they often start fighting among themselves about which blocks should exist, creating even more social dysfunction.

They don’t see the richness of the social world. They try to “fix” it by hard-coding overly simplified versions of reality into software, creating systems that are brittle, alienating, and too often oppressive.

This has real consequences for the #openweb and our attempts to build alternatives. If we don’t get privacy right – if we don’t understand the role of context and legitimacy in data flows – we’ll just reproduce the same broken #dotcons models we’re trying to replace.

Beyond policy and control. Most privacy policies today are useless. They reduce privacy to a box-ticking exercise, just “terms and conditions” of control. But this is a dead end. Real privacy is contextual. It involves relationships between the subject, the sender, the recipient, and the nature of the information.
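Using the illustrative sketch from earlier, the gap between box-ticking and contextual privacy is easy to show: the same medical record flows appropriately to a treating clinician under confidentiality, and inappropriately to an advertiser, however many “consent” boxes were buried in the terms and conditions. Every label here is hypothetical.

```python
# Continuing the illustrative sketch above: the same medical record,
# two different flows, two different judgements.

clinical = Flow(
    subject="patient",
    sender="gp",
    recipient="treating-clinician",
    info_type="medical-record",
    transmission_principle="confidentiality",
)

marketing = Flow(
    subject="patient",
    sender="health-app",
    recipient="advertiser",
    info_type="medical-record",
    transmission_principle="buried-terms-and-conditions",
)

print(is_appropriate("healthcare", clinical))   # True  - fits the sphere's norms
print(is_appropriate("healthcare", marketing))  # False - no norm legitimises it
```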

To build humane technology, we need to embed all these values into our tools and processes. That means ditching secrecy-as-default, dropping the obsession with control, and embracing appropriate social information flows.

#KISS #Oxford #talk


Discover more from Hamish Campbell

Subscribe to get the latest posts sent to your email.

Leave a Reply

Only people in my network can comment.