
Introducing SOMA

Incremental improvements accumulated by the Internet would eventually create the basis for some more unified form of web intelligence.

A web-based cognitive system, supersaturated with computer power and all other resources needed for explosive growth - save for one ingredient - could, when the final missing constituent is dropped into the cauldron, blaze up with superintelligence.

Nick Bostrom, Superintelligence (2014)

The Missing Ingredient

The Internet is the greatest collective intelligence in human history. It already has more compute, data, and minds than any single organization.

What it lacks is a way to organize them into one intelligence.

Nick Bostrom called this collective superintelligence: a system composed of many minds that together achieve superintelligent performance. SOMA is a network that builds it.

In one system, SOMA combines parallel, verifiable training; routing that composes specialized models into one; an ever-expanding benchmark; and rewards for everyone who improves the network. The sections below explain how.

Parallel and Verifiable

Collective intelligence works when problems can be broken into parts, pushed in parallel, and verified independently. SOMA is built on this structure.

Training is hard. Verification is cheap. This is the same asymmetry behind proof of work.

Instead of synchronizing gradients across machines, SOMA aligns on an objective: given any data, predict what comes next. Participants independently train models and compete to predict most accurately. The network only needs to verify who won. Verification is lightweight. The bandwidth bottleneck disappears.

Every model on SOMA shares the same architecture: a byte-level transformer. No tokenizer, no preprocessing. What differs is the trained weights. Models compete by achieving the lowest loss on incoming data. The best model wins. Simple, measurable, unforgeable.
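The competition rule can be sketched in a few lines of Python. Everything here is illustrative: the toy "models" below are plain functions standing in for trained byte-level transformers, and `pick_winner` is a simplified stand-in for the network's verification step.

```python
import math

def next_byte_loss(model, data: bytes) -> float:
    """Average negative log-likelihood of each next byte under `model`.

    `model(context)` stands in for a byte-level transformer forward
    pass: it returns a 256-entry probability distribution over the
    next byte value.
    """
    total = 0.0
    for i in range(1, len(data)):
        probs = model(data[:i])
        total += -math.log(probs[data[i]])
    return total / (len(data) - 1)

def pick_winner(models, data: bytes):
    """The competition rule: lowest loss on the incoming data wins."""
    losses = {name: next_byte_loss(m, data) for name, m in models.items()}
    return min(losses, key=losses.get), losses

def uniform_model(ctx):
    # Knows nothing: every next byte is equally likely.
    return [1 / 256] * 256

def frequency_model(ctx):
    # Has "learned" something: predicts from add-one-smoothed byte counts.
    counts = [1] * 256
    for b in ctx:
        counts[b] += 1
    total = sum(counts)
    return [c / total for c in counts]

winner, losses = pick_winner(
    {"uniform": uniform_model, "freq": frequency_model},
    b"aaaaabbbbbaaaaa",
)
# The frequency model achieves lower loss, so it wins.
```

Because the loss is computed deterministically from public weights and public data, anyone can re-run the scoring and confirm the winner — that is the cheap-verification half of the asymmetry.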

Cohesion Through Routing

Imagine an organization composed of very efficiently coordinated knowledge workers, who collectively solve intellectual problems across very general domains.

If we gradually increase the level of integration of a collective intelligence, it may eventually become a unified intellect.

Nick Bostrom, Superintelligence (2014)

Not every model is good at everything. One becomes an expert in code. Another in biomedical literature. Another in multilingual text.

Each model registers an embedding representing its specialization. Each epoch, the network generates targets — random points in embedding space — and assigns models to each based on proximity and stake. Submitters then find data matching those targets, download the assigned models’ weights, and score locally. The right experts are matched to the right problems.
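A minimal sketch of the assignment step, assuming cosine proximity and a multiplicative stake weighting — the exact weighting rule is not specified above, so both are assumptions, as are the names and the 2-D embeddings:

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def assign_models(target, registry, k=2):
    """Rank registered models for one target by proximity and stake.

    `registry` maps model name -> (embedding, stake). Closer embeddings
    and larger stakes both raise a model's score (assumed weighting).
    """
    scores = {
        name: cosine(target, emb) * stake
        for name, (emb, stake) in registry.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

registry = {
    "code-expert":   ([0.9, 0.1], 100),
    "biomed-expert": ([0.1, 0.9], 100),
    "generalist":    ([0.5, 0.5], 10),
}
# A target near the "code" region of embedding space.
assigned = assign_models([1.0, 0.0], registry, k=2)
```

Note how stake matters: the lightly staked generalist loses its slot to the more distant but heavily staked biomedical model.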

Models specialize deeply but compose into a single system: a global model, continuously trained and fully open.

This mirrors the brain: specialized regions coordinated by routing areas, together producing intelligence no single region could alone. SOMA’s routing layer is the integration mechanism that turns many small models into one collective intelligence.

The Expanding Benchmark

A model is only as good as its training data. SOMA makes finding better data part of the competition.

Each epoch, the network generates targets across embedding space. Each one is a benchmark. Submitters race to hit these targets — the first valid submission wins. They download the assigned models, run them locally against their data, and the model with the lowest loss is the winner. Both the submitter and the winning model earn $SOMA.

When a target is hit, a new one spawns. The benchmark never finishes. New targets mean new data, new data means better models, better models mean harder targets.
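The spawn-on-hit loop can be sketched as follows; the target representation and dimensionality are assumptions for illustration:

```python
import random

def new_target(dim=2, rng=random):
    """A target is a random point in embedding space (assumed 2-D here)."""
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

def on_target_hit(targets, hit_index, rng=random):
    """When a valid submission hits a target, a fresh target spawns in
    its place, so the benchmark never finishes."""
    replaced = list(targets)
    replaced[hit_index] = new_target(rng=rng)
    return replaced

epoch_targets = [new_target() for _ in range(3)]
next_targets = on_target_hit(epoch_targets, hit_index=1)
# The number of open targets stays constant across a hit.
```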

What You Earn

Every epoch, the network emits $SOMA and distributes it to participants who improved the network.

80% goes to target winners: split evenly between the submitter and the model whose weights produced the lowest loss. 20% goes to validators who run infrastructure and generate targets.

Maximum supply of 10 million tokens. Emissions are linear over 10 years. Fees auto-adjust to target a 5% annual burn of circulating supply.
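The arithmetic above works out as follows. This is a sketch: the per-epoch amounts and function names are illustrative, and only the 80/20 split, the 10-million-token 10-year schedule, and the 5% burn target come from the text.

```python
MAX_SUPPLY = 10_000_000        # $SOMA hard cap
EMISSION_YEARS = 10            # linear emission schedule
ANNUAL_EMISSION = MAX_SUPPLY / EMISSION_YEARS   # 1,000,000 per year
BURN_RATE = 0.05               # fees target 5% of circulating supply per year

def split_epoch_emission(epoch_emission: float) -> dict:
    """80% to target winners, split evenly between the submitter and the
    winning model; 20% to validators."""
    winners_share = 0.8 * epoch_emission
    return {
        "submitter": winners_share / 2,
        "winning_model": winners_share / 2,
        "validators": 0.2 * epoch_emission,
    }

def annual_burn_target(circulating_supply: float) -> float:
    # Fees auto-adjust so roughly this much burns per year.
    return BURN_RATE * circulating_supply

shares = split_epoch_emission(1_000)
# Per 1,000 $SOMA emitted: 400 to the submitter, 400 to the winning
# model, 200 to validators.
```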

What’s Next

Testnet is live.

Read the docs. Join the community. Follow on X.