We Need a Constitution for the Robot Age (Part 2)
If we accept that robots will inevitably carry values—embedded in their design, training, and deployment—the urgent question shifts from “who is building them?” to “how do we govern them?” As explored in Part 1, the scale of China’s robotics manufacturing means those values are being set somewhere by someone right now.
We have faced a similar inflection point before.
In the 1980s and 90s, as the internet began to scale, the United States and its allies made a strategic choice. We did not attempt to monopolize every server or fiber-optic cable. Instead, we focused on the architecture. Through institutions like the IETF, we standardized the open protocols (TCP/IP, DNS) that allowed the network to function; through ICANN, we governed the namespace those protocols depend on.
We exported our values—interoperability, decentralization, and permissionless innovation—by encoding them into the plumbing of the web itself.
As we enter the age of embodied AI, we need a similar “governance stack” for the physical world. We need an ICANN for Robotics.
Currently, the debate around AI safety focuses heavily on Large Language Models—preventing chatbots from generating hate speech or misinformation. While important, this ignores the far more immediate physical reality. A chatbot cannot knock over a pedestrian. A hallucinating robot arm in a warehouse, or a delivery droid on a crowded sidewalk, can.
We are currently building robots the way we built software in the Wild West era: move fast and break things. But when the “thing” being broken is physical infrastructure or human safety, the old model fails.
To solve this, we must reject the idea that robot behavior should be purely “learned” from data.
In my own work developing adaptive intelligence systems for robotics, I have found that safety cannot be a probabilistic afterthought. It must be deterministic. It must be "baked in." We are building architectures where a centralized "kernel" of safety constraints, literally a set of hard-coded rules encoding laws, physics, and social norms, gates every action the robot takes.
Think of it as a constitutional operating system.
This system effectively wraps the AI’s decision-making in a layer of non-negotiable governance. It ensures that a robot operating in Tokyo automatically adapts to Japanese social etiquette and privacy laws, while the same machine in Texas adheres to local statutes and cultural norms of personal space. The robot perceives its environment, yes, but its actions are filtered through a rigid framework of civic compliance.
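The kernel idea above can be sketched in a few lines. This is a hypothetical illustration, not the author's actual system: the `Action` fields, the jurisdiction names, and the specific rules are all invented for the example. The essential property is that the checks are deterministic and sit outside the learned policy, vetoing anything that violates local law or norms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    kind: str                  # e.g. "move", "photograph"
    speed_mps: float           # commanded speed, meters per second
    distance_to_human_m: float # distance to nearest person, meters

# Jurisdiction-specific hard limits. A real deployment would load these
# from vetted, signed policy files rather than an inline dict.
RULES = {
    "tokyo": {"max_speed_mps": 1.0, "min_human_distance_m": 1.2,
              "photography_allowed": False},
    "texas": {"max_speed_mps": 1.5, "min_human_distance_m": 0.6,
              "photography_allowed": True},
}

def kernel_allows(action: Action, jurisdiction: str) -> bool:
    """Deterministic veto layer: every proposed action must pass these checks."""
    rules = RULES[jurisdiction]
    if action.speed_mps > rules["max_speed_mps"]:
        return False
    if action.distance_to_human_m < rules["min_human_distance_m"]:
        return False
    if action.kind == "photograph" and not rules["photography_allowed"]:
        return False
    return True

# The learned policy proposes; the kernel disposes.
proposed = Action(kind="photograph", speed_mps=0.8, distance_to_human_m=2.0)
print(kernel_allows(proposed, "tokyo"))  # vetoed: photography disallowed there
print(kernel_allows(proposed, "texas"))  # permitted under these example rules
```

The same machine, running the same learned policy, behaves differently in each city because the rule table, not the neural network, has the final word.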
This is not something that can be left to individual manufacturers to figure out. It requires a shared standard.
If the West wants to maintain normative influence in a world where hardware is increasingly manufactured elsewhere, we must lead the creation of these standards. We should be proposing a global “Embodied AI Protocol”—a digital handshake that verifies a machine’s adherence to safety and legal norms before it is allowed to operate in public spaces.
This protocol would function like the internet’s SSL certificates. Just as your browser warns you when a website is insecure, our cities should have the digital infrastructure to flag—and disable—autonomous machines that do not run on a verified “constitutional” kernel.
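The handshake could look something like the following sketch. Everything here is an assumption for illustration: a real protocol would use public-key certificates and a chain of trust, as TLS does, whereas this stand-in signs a hash of the kernel image with a shared key. The point is the shape of the exchange: the robot attests to the exact kernel build it is running, and the city verifies both the signature and that the build appears in an approved registry.

```python
import hashlib
import hmac

# Hypothetical shared key between robots and the city registry; a real
# system would use asymmetric signatures, not a shared secret.
REGISTRY_KEY = b"city-registry-secret"

# Registry of approved "constitutional" kernel builds, keyed by image hash.
APPROVED_KERNELS = {hashlib.sha256(b"constitutional-kernel-v1.3").hexdigest()}

def attest(kernel_image: bytes) -> tuple[str, str]:
    """Robot side: hash the running kernel image and sign the hash."""
    digest = hashlib.sha256(kernel_image).hexdigest()
    signature = hmac.new(REGISTRY_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest, signature

def verify(digest: str, signature: str) -> bool:
    """City side: check the signature, then check the build is approved."""
    expected = hmac.new(REGISTRY_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature) and digest in APPROVED_KERNELS

digest, sig = attest(b"constitutional-kernel-v1.3")
print(verify(digest, sig))       # valid signature, approved build
print(verify(digest, "forged"))  # rejected: signature does not check out
```

A robot that fails this check, whether because its kernel was tampered with or was never certified, is exactly the machine the city's infrastructure should flag and disable.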
This approach plays to Western strengths. While authoritarian regimes may excel at mass industrial production and hardware scaling, democratic societies excel at complex institution-building, rule of law, and standard-setting.
We can define the “conscience” of the machine, even if we do not build the body.
The internet proved that protocols are more powerful than hardware. But protocols do not write themselves. If democratic societies do not step up to write the constitution for the physical world—defining the specific, enforceable rules of how robots respect human life and local law—then the default settings will be decided by engineers in Shenzhen, optimized for efficiency rather than liberty.
The hardware revolution is already here. The governance revolution is late. It is time to code our values into the machine, before the cement dries.
This is Part 2 of a three-part series on values, governance, and architecture in embodied AI. Read Part 1: When Robots Carry Values and Part 3: One Brain, Many Shells.