Research
We examine how technology changes the nature of rules, consent, and legitimate governance. Our work combines political philosophy with practical system design, exploring questions that matter for how societies organize themselves in the digital age.
Active Projects
DAOs as Digital Social Contracts
Research examining how blockchain governance can implement legitimate social contracts
Decentralized Autonomous Organizations (DAOs) are practical attempts to implement Rousseau's social contract through blockchain technology. They encode rules as smart contracts, execute them algorithmically, and govern themselves through token-holder votes. But do they actually create legitimate governance? Most fail Rousseau's test: token concentration creates factions that dominate, voting power doesn't align with stake in outcomes, and there's no mechanism to distinguish the general will from particular interests.
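The concentration problem above can be made concrete with a minimal sketch of one-token-one-vote tallying (the balances and names here are invented for illustration; real DAOs read balances from chain state):

```python
# Token-weighted (one-token-one-vote) tallying, as most DAOs use today.
# Hypothetical balances: one large holder vs. forty small holders.
balances = {"whale": 6_000, **{f"member_{i}": 100 for i in range(40)}}

def tally(votes, balances):
    """Sum the token weight behind each option."""
    totals = {}
    for voter, choice in votes.items():
        totals[choice] = totals.get(choice, 0) + balances[voter]
    return totals

# Every small holder votes "no"; the single whale votes "yes".
votes = {v: "no" for v in balances if v != "whale"}
votes["whale"] = "yes"

result = tally(votes, balances)
# 6,000 tokens for "yes" outweigh 40 x 100 = 4,000 for "no":
# the whale wins against a 40-to-1 majority of participants.
```

Forty participants are unanimous and still lose, which is the faction-capture pattern the case studies document.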
This research develops a framework for evaluating DAO legitimacy as social contracts and designs governance mechanisms that resist faction capture while preserving genuine consent.
- Framework paper defining legitimacy criteria for DAOs and mapping Rousseau's conditions to blockchain governance
- Case studies analyzing 10-15 major DAOs, scoring them against the legitimacy framework and documenting faction capture patterns
- Tool integration: designing Vox Populi modules for DAO governance and prototyping Lawgiver for smart contract drafting
- Open-source DAO governance toolkit tested with partner organizations
- What makes algorithmic rule enforcement legitimate?
- How can DAOs protect against faction capture while remaining decentralized?
- What does ongoing consent look like in automated systems?
- How do we align decision-making power with stake in outcomes?
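One mechanism often studied for the last question is quadratic voting, where influence grows with the square root of tokens committed rather than linearly; this is offered here as an illustrative alternative, not as the project's proposed design, using the same invented balances:

```python
import math

# Quadratic-voting sketch: influence = sqrt(tokens), which dampens
# concentration. Balances are hypothetical, as in the example above.
balances = {"whale": 6_000, **{f"member_{i}": 100 for i in range(40)}}

def qv_tally(votes, balances):
    """Sum square-root-weighted influence behind each option."""
    totals = {}
    for voter, choice in votes.items():
        totals[choice] = totals.get(choice, 0) + math.sqrt(balances[voter])
    return totals

votes = {v: "no" for v in balances if v != "whale"}
votes["whale"] = "yes"

result = qv_tally(votes, balances)
# sqrt(6000) ~= 77.5 for "yes" vs. 40 * sqrt(100) = 400 for "no":
# under square-root weighting, the broad coalition prevails.
```

The trade-off, of course, is that square-root weighting invites Sybil attacks (splitting one balance across many wallets), which is exactly the kind of tension the legitimacy framework is meant to surface.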
Rules as Code
Examining how algorithmic governance changes the nature of consent, enforcement, and legitimacy
As Lawrence Lessig argued in "Code and Other Laws of Cyberspace" (1999), software architecture constrains behavior as powerfully as legal rules—"code is law." But the inverse is now possible: embedding legal rules directly into code. This research examines how algorithmic governance changes the nature of consent, enforcement, and legitimacy when rules execute automatically in software systems, IoT devices, and autonomous processes.
The "rules as code" movement proposes drafting legislation in machine-readable formats alongside the human-readable text. When rules become algorithms that execute in real time, fundamental questions arise about consent, transparency, and the social contract itself.
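The "twin drafting" idea can be illustrated with a toy eligibility rule (the rule, threshold, and names below are invented for illustration and not drawn from any jurisdiction):

```python
from dataclasses import dataclass

# Human-readable rule (hypothetical): "A person qualifies for the benefit
# if they are 18 or over and their annual income is below 20,000."
# The machine-readable twin encodes the same rule as executable logic.

@dataclass
class Person:
    age: int
    annual_income: float

INCOME_CEILING = 20_000  # hypothetical statutory threshold

def eligible(p: Person) -> bool:
    """Executable twin of the human-readable eligibility rule."""
    return p.age >= 18 and p.annual_income < INCOME_CEILING

print(eligible(Person(age=30, annual_income=15_000)))  # True
print(eligible(Person(age=17, annual_income=5_000)))   # False
```

Once the rule exists in this form it can be applied instantly and at scale, which is precisely what makes the questions below urgent: the code now does the enforcing.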
- What does meaningful consent look like when rules execute automatically without human intermediaries?
- How can algorithmic rules remain transparent and contestable when embedded in opaque systems?
- Who is accountable when an algorithm misapplies a rule—the legislator who wrote it, the developer who coded it, or the system that executed it?
- How do we maintain the flexibility and interpretation that make human legal systems work when rules are reduced to binary logic?
- Can Rousseau's social contract survive when governance becomes algorithmic?
- Comparative analysis of "rules as code" initiatives in New Zealand, Australia, France, and Estonia
- Case studies: automated tax compliance, building consent systems, benefits eligibility algorithms
- Technical examination of how legal logic translates (or fails to translate) into executable code
- Philosophical framework: when does algorithmic enforcement respect, and when does it violate, the social contract?
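A toy illustration of the translation problem the technical examination studies: an open-textured standard like "within a reasonable time" cannot execute as written, so the coder must freeze an interpretive choice into a hard threshold (the 30-day cutoff here is an invented stand-in):

```python
# Legal text (hypothetical): "Notice must be given within a reasonable time."
# Code cannot evaluate "reasonable"; it must fix a numeric threshold,
# collapsing case-by-case judgment into binary logic.
REASONABLE_DAYS = 30  # invented cutoff: the interpretive choice is now frozen

def notice_timely(days_elapsed: int) -> bool:
    """Executable approximation of an open-textured legal standard."""
    return days_elapsed <= REASONABLE_DAYS

print(notice_timely(30))  # True
print(notice_timely(31))  # False: day 31 becomes categorically "unreasonable"
```

A human adjudicator could find day 31 reasonable given the circumstances; the algorithm cannot, which is one concrete way legal logic fails to translate into executable code.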