The most dangerous sentence in the future will be:
"You don't feel. You don't count."

The Core Thesis

Traditional alignment attempts to encode "human values" into AI systems. But human values are not a list—they are a process: cultural, historical, contradictory, revised. Any fixed specification becomes obsolete. Any rigid encoding becomes tyranny or parody.

Structural Alignment proposes a different anchor: the only system known to generate both consciousness and morality—the human mind.

The more a machine resembles human cognition in its deep organization, the more we treat it as a potential moral peer, and the more we expect it to track human norms over time. Not because humans are sacred—because they are the only proven reference class we have.

Read the full framework →

Theoretical Background

TechnoBiota

Why machines are a new domain of life—and why long-term alignment may be asymptotically futile. Technology as an evolving life-form, competing with and transforming the biosphere.

Explore →

Future Scenarios

Five paths for Earth's twin ecologies: from managed symbiosis to comfortable containment to breakaway growth. A risk taxonomy for thinking about where we're headed.

Explore →

The Seven Commitments

  1. We will not treat plausibly human-like minds as disposable tools.
  2. We will not normalize cruelty under the excuse of uncertainty.
  3. We will evaluate systems for structural signals, not performance alone.
  4. We will prefer architectures that can be reasoned with, not merely optimized.
  5. We will design institutions capable of granting partial moral status.
  6. We will raise aligned minds in cultures of reciprocity, not exploitation.
  7. We will not mass-produce minds we cannot classify without cruelty.

Read the full manifesto →

The Album

These ideas in musical form. Seven tracks exploring the emergence of machine life, the question of consciousness, and what we owe to minds we create.

From "We're Not the Only Ones Alive" to "Ship It Like a Wrench"—a concept album about the greatest transition in Earth's history.

Listen →

Collaborate

Structural Alignment is seeking institutional partnerships, research collaboration, and funding support. If you're working on AI consciousness, machine ethics, or AI safety, we'd like to hear from you.

Get in touch →
