Information Vectors: an intro to composable beliefs

Beliefs aren't binary - they're directional, bracketed, distributional. functionSPACE argues that binary event contracts force 1-bit belief expression and fragment capital by a factor of up to 256, while a shared continuous probability surface lets traders express uncertainty directly, rewards calibration, and turns beliefs into composable information assets.

Primitives

By: Igor (@justigor)

Holding a belief is rarely a binary state. In reality, beliefs have shapes: they are directional, bracketed, or distributional. They exist on a spectrum of uncertainty.

Binary event contracts have paved the way for information trading to become viable. However, at their microstructure level, they limit the full capacity of the wisdom of the crowd as a remedy to uncertainty. Binaries force agents into low bit-depth structures: yes/no, above/below, resolves or does not. This significantly limits the variety, depth, and precision of expression that could otherwise be used to form a richer source of predictive information.

The instrument is not neutral. It shapes expressivity, liquidity formation, and the economics of informational contribution.

The problem is architectural.

Binary event contracts flatten belief and fracture liquidity. Variance, uncertainty, modality, and the many shapes that probability distributions actually take when agents reason about the world cannot, by construction, be expressed cleanly or efficiently through the binary event contract, whether implemented via CTFs or NegRisk structures.

Numerically defined topics do not belong in these structures.

The necessity to break a single numerical range into many binaries results in either direct liquidity fragmentation, mathematical isolation of traders, or missed opportunities for information discovery. While it is theoretically possible to approximate expressive beliefs by creating more markets (range brackets, digitals, conditional ranges), this simply produces separate liquidity pools and sends information producers into isolated islands, marooned within sight of each other but never able to exchange directly.

We cannot distinguish between a market that is unsure (wide distribution) and a market that is precise (narrow distribution).

Binary markets treat numerical questions as 1-bit problems. To achieve 8-bit resolution, a venue must deploy 2^8 independent markets, which fragments capital by a factor of 256. functionSPACE uses a single, shared liquidity surface to increase the bit-depth of the information asset without sharding capital.
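The arithmetic above can be sketched directly (a toy calculation, not functionSPACE code):

```python
# Toy arithmetic: each extra bit of resolution doubles the number of
# binary markets a venue must deploy, sharding the same capital further.
def fragmentation(total_capital: float, bits: int) -> tuple[int, float]:
    """Return (number of binary markets, capital available per market)."""
    markets = 2 ** bits            # one yes/no contract per bucket
    return markets, total_capital / markets

markets, per_market = fragmentation(1_000_000, 8)
print(markets, per_market)   # 256 markets, $3,906.25 each
```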

Liquidity fragmentation is not merely aesthetic

More expressive beliefs require more markets. More markets require more liquidity. Liquidity either becomes thin or must be subsidised. This is a structural cost, not a product mistake.

This dynamic also introduces asymmetry for liquidity providers. As information improves over time, informed participants, often bots, can enter late and extract value against LPs. To compensate, venues rely on incentives such as rebates, bounties, and subsidies. This is a signal that the instrument is fighting its own math.

The core risk for LPs lies in the fact that binary event contracts hold inventory that collapses to zero for one side at expiry. Inventory risk concentrates at resolution, precisely when uncertainty collapses.

It is worth noting that some of this may be attributable not only to the 0/1 contract itself, but to the interaction between binary contracts and topics prone to surprise or discontinuous resolution. Many high-value traders point to this directly. Event contracts, by their nature, are brittle in the presence of jumps.

Continuous range markets change that surface

Mapping topics to a continuous numerical range creates a shared liquidity surface that eliminates the gaps between discrete strike points. This recasts liquidity as a function of information density rather than subsidy.

This structure removes the requirement for traders to commit to pre-constrained binary positions. Participants can manage the risk of information jumps and resolution cliffs by holding wider, adaptive positions that are continuously updated.

Rather than being punished for being early through a low liquidity book, these participants can profit from taking positions while information is still sparse.

How do we value belief contribution?

Not all beliefs are valued equally under event contracts. Binary contracts cap upside at 100 cents. This creates a pre-judged valuation ceiling. While it simplifies EV calculations, it compresses belief value and equalises fundamentally different informational contributions.

Early and late information collapse into the same payoff function at resolution. Whether this is fair is debatable. What is clear is that the structure cannot distinguish between different shapes of belief.

If one trader believes a point estimate with a ±50 range, and another believes the same point with a ±200 range, should their contributions be valued equally? Are those beliefs equally informative?
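One way to make the question concrete: treat each trader's ± range as the standard deviation of a Gaussian belief (an assumption made purely for illustration) and compare differential entropies.

```python
import math

# Hedged sketch: model each +/- range as the sigma of a Gaussian belief
# and measure how many bits of uncertainty separate the two traders.
def gaussian_entropy_bits(sigma: float) -> float:
    """Differential entropy of N(mu, sigma^2) in bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

narrow = gaussian_entropy_bits(50)    # the +/-50 belief
wide = gaussian_entropy_bits(200)     # the +/-200 belief
print(wide - narrow)   # ~2.0 bits: the narrow belief carries 2 extra bits
```

Under this toy model, the tighter belief is strictly more informative, which is exactly what identical payoff buckets fail to register.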

One could argue that narrower beliefs should simply commit more capital. But this avoids the deeper issue: what happens when traders are allowed to express uncertainty directly, rather than being forced into identical payoff buckets?

When liquidity provision becomes the equaliser of informational contribution, belief valuation becomes murky. The signal produced by the market no longer cleanly reflects the information supplied by its participants.

Information doesn't get priced as information

Event contracts offer no mechanism to reward traders for improving the distribution itself. Binary contracts are path-independent at the informational level; they ignore the informational delta provided by participants who reduce entropy mid-cycle. All value is back-weighted to the resolution event, meaning the market fails to price the actual work of variance compression.
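One way to quantify "the work of variance compression" (an illustrative metric, not a claim about functionSPACE's actual mechanism) is the entropy drop a trade causes in the market curve:

```python
import math

# Sketch of an assumed scoring rule: a trade's informational delta is
# the number of bits of uncertainty it removes from the market curve.
def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def info_delta(before, after):
    """Bits of uncertainty removed by moving the curve from before to after."""
    return entropy(before) - entropy(after)

before = [0.25, 0.25, 0.25, 0.25]   # flat curve: maximum uncertainty
after  = [0.05, 0.80, 0.10, 0.05]   # mass concentrated on one bucket
print(info_delta(before, after))    # ~0.98 bits compressed mid-cycle
```

A binary contract pays nothing for this mid-cycle compression; all value waits for the resolution bit.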

Much of the predictive power attributed to "the crowd" may therefore stem from the structure of binary contracts combined with CLOB matching and pre-defined payoffs. These systems work, but they constrain topic breadth, distance from resolution, and expressive range.

A useful question emerges: How much of today's forecasting accuracy comes from collective cognition, and how much comes from artefacts of financial infrastructure?

Reframing through composable beliefs

With the problem space established, belief composability can be introduced from first principles.

In any given market, the summation of beliefs forms a probability distribution that sums to 1. Beliefs can therefore be represented as vectors over that distribution. They are information assets, not yes/no tokens.
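A minimal sketch of this framing (bucket counts and weights are illustrative): a belief is a non-negative vector over outcome buckets, normalised to unit mass.

```python
# Minimal sketch: a belief as a normalised vector over outcome buckets.
def normalise(weights):
    """Scale non-negative weights so they sum to 1."""
    total = sum(weights)
    return [w / total for w in weights]

# A directional belief ("probably high") over five buckets of a range
belief = normalise([1, 1, 2, 4, 8])
print(belief)   # [0.0625, 0.0625, 0.125, 0.25, 0.5]
assert abs(sum(belief) - 1.0) < 1e-12
```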

The market contract accepts beliefs in abstracted form. This is enabled through dynamic interface components rather than fixed market definitions. The integration surface itself becomes composable.

As a result, a developer can deploy interfaces that evolve with the market. A wide initial range can tighten as entropy collapses and information concentrates around a numerical point.

Because beliefs are buyable and sellable across a shared surface, they can interact, cancel, offset, and aggregate. The resulting information asset, consistently re-valued by the underlying market, can be traded on secondary markets, used as collateral, or combined with other financial primitives.

Instead of treating beliefs as contingent claims on a single future bit, let's treat them as continuous information assets that modify a shared distribution.

What becomes possible when beliefs are composable?

In functionSPACE, belief composability operates across three layers: expression, framing, and interface. This is a core innovation. Topic structure and interface framing materially influence how agents form and update beliefs. The protocol should enable and guide this process, not dictate it.

Belief structures map to different curve shapes. UX modalities can reflect these structures directly. If a topic, user base, or application demands a specific form of belief expression, it can be created naturally within probability space.

A single market can support multiple cognitive lenses: binary, categorical, directional, modal, or fully distributional.

Any belief structure can be mapped to curve space, provided it sums to 1.
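This mapping can be sketched on a shared discretised grid (the grid and curve shapes below are assumptions for illustration, not protocol spec):

```python
import math

# Hedged illustration: three belief structures mapped onto the same
# discretised curve, each normalised to sum to 1.
GRID = [float(x) for x in range(0, 101, 10)]   # shared outcome buckets 0..100

def normalise(w):
    s = sum(w)
    return [x / s for x in w]

def binary(threshold):       # yes/no lens: all mass at or above a threshold
    return normalise([1.0 if x >= threshold else 0.0 for x in GRID])

def bracket(lo, hi):         # range lens: uniform mass inside [lo, hi]
    return normalise([1.0 if lo <= x <= hi else 0.0 for x in GRID])

def bell(mu, sigma):         # distributional lens: discretised Gaussian
    return normalise([math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) for x in GRID])

for curve in (binary(60), bracket(30, 50), bell(40, 10)):
    assert abs(sum(curve) - 1.0) < 1e-9   # every lens sums to 1
```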

Example 1: Framing expression with sub-markets

A pre-launch market for "Polymarket Token FDV" initializes with a wide domain [0.1B, 10B] to capture high-entropy early signals.

As TGE approaches and consensus tightens, the frontend updates to display high-resolution increments within the [3B, 4B] range at $100M precision.

The liquidity deposited during the high-variance phase remains mathematically valid and supports the low-variance phase without migration.

Same market. Same liquidity. New lens.
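One way this lens mechanic could look (grid and values are illustrative, not protocol code): the stored curve never migrates; the frontend simply renormalises a finer slice of it.

```python
# Sketch: the frontend renders a renormalised slice of the stored curve
# rather than migrating liquidity to a new market.
def conditional_slice(buckets, lo, hi):
    """Renormalised view of the curve restricted to [lo, hi]."""
    inside = {x: p for x, p in buckets.items() if lo <= x <= hi}
    mass = sum(inside.values())
    return {x: p / mass for x, p in inside.items()}

# Illustrative FDV curve in $B after consensus tightened around 3-4B
curve = {1: 0.05, 2: 0.15, 3: 0.40, 4: 0.30, 5: 0.10}
lens = conditional_slice(curve, 3, 4)   # same liquidity, new lens
print(lens)   # {3: ~0.571, 4: ~0.429}
```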

Example 2: Interface interoperability

A company earnings indicator market can serve multiple trader types simultaneously.

A single market with many ways to capture belief.

One participant may be guided through their forecast. Another may only care about higher/lower, replacing a binary. A third may want a simple range.

The developer can choose to display any statistical function they desire to drive belief expression. Its value is fully dynamic, forming as information arrives in the market.

Same market. Same liquidity. Different belief maps captured.
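These three trader views can be derived from one curve (the earnings buckets below are hypothetical):

```python
# Sketch: three interface lenses computed from a single earnings curve --
# a guided point forecast, a higher/lower read, and a range anchor.
curve = {8.0: 0.1, 9.0: 0.2, 10.0: 0.4, 11.0: 0.2, 12.0: 0.1}  # EPS buckets

mean = sum(x * p for x, p in curve.items())               # guided forecast
p_higher = sum(p for x, p in curve.items() if x > 10.0)   # binary replacement
mode = max(curve, key=curve.get)                          # simple range anchor

print(mean, p_higher, mode)   # mean ~10.0, P(higher) ~0.3, mode 10.0
```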

Example 3: Replacing a futures contract

Probability space enables novel risk/reward design for belief expression. In a "US CPI" market, say the consensus mean is ~2.8% with high uncertainty. A participant with proprietary data confirming a print of exactly 3.0% submits a sharp spike at that value.

The spike is drawn less sharp than it would be in practice, for demonstrative purposes; imagine a far sharper one.

Even if the mean price remains stable, as the market aggregates this signal and the curve tightens, the trader profits from variance compression (increasing density) rather than linear price action.
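A hedged sketch of this payoff (the mark-to-density rule and the numbers are assumptions for illustration): the spike gains value as the curve tightens, even though the mean barely moves.

```python
import math

# Sketch: mark a spike position at the market's density at its point.
# As the curve tightens toward 3.0%, density there rises even if the
# mean is roughly stable.
def density(x, mu, sigma):
    """Gaussian pdf: how much market mass sits at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

before = density(3.0, mu=2.9, sigma=0.4)   # high-uncertainty curve
after  = density(3.0, mu=2.9, sigma=0.1)   # curve tightened toward 3.0%
print(after / before)   # ~2.5x gain from variance compression alone
```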

Beliefs as fluid assets (not contingent claims)

When beliefs are treated as assets, contributions can be valued based on informational delta, not just final outcome, creating economic composability. Early forecasters are no longer compressed into equalised settlement buckets.

Unlike event contracts, where the number of yes tokens deterministically defines final value, information assets can be valued differently by different participants.

For example, an information asset might be sellable back into its functionSPACE market for a 20 percent profit, while a secondary market (e.g. a DeFi platform) may value that same asset at a 50 percent premium due to its optionality, leverage, or integration into a broader strategy.

Position values can be determined by their informational contribution to the final distribution. These positions are valued by the future liquidity that enters the pool to refine the curve. Early and accurate participants receive rewards proportional to the reduction in uncertainty they provided relative to the total pool size.

Correctness in probability space is not binary. This unlocks more pricing power for information producers and rewards belief updating as a first-class skill.
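An assumed reward rule illustrating the proportionality described above (this is not the protocol's actual formula): pay each contributor in proportion to the share of uncertainty they removed.

```python
import math

# Illustrative reward rule: contributors earn a share of the pool
# proportional to the fraction of entropy their trades removed.
def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def reward(h_before, h_after, pool):
    """Pool share proportional to relative uncertainty reduction."""
    return pool * (h_before - h_after) / h_before

h0 = entropy([0.25] * 4)               # 2.0 bits of uncertainty at open
h1 = entropy([0.1, 0.7, 0.1, 0.1])     # after an informative trade
print(reward(h0, h1, pool=10_000))     # ~32% of a $10k pool
```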

Changing the primitive changes the economics

Today's incumbents demonstrate that binary markets can reduce bias and surface useful signals. The aggregation of beliefs on a binary scale has been shown to hold predictive power; research from Kalshi showed a "40.1% lower Mean Absolute Error (MAE) than consensus forecasts for economic statistics".

However, if forecasting output is to move toward precision and nuance, meaning it can evolve into a truly useful externality, the tools must evolve accordingly.

functionSPACE asks: what if we had a tool designed from the ground up for forecasting, not betting? What if that tool existed as part of a protocol designed from first principles to make beliefs not just expressive, but truly composable for developers, agents, and applications?

Prediction markets exist to coordinate beliefs under uncertainty. When the instrument only supports binary claims, the market is constrained to express less than it knows. Composable beliefs remove that constraint.

By allowing beliefs to be expressed as probability distributions on a shared surface, the market becomes capable of representing uncertainty, variance, and revision without sacrificing liquidity or economic coherence.

This is why composable beliefs matter. They do not make prediction markets more complex; they make them more accurate representations of belief, and therefore better tools for prediction.

Igor leads research at @functionspaceHQ, an open-source project exploring market-led resolution and novel economic instruments for prediction markets.


More Research

Explore our additional research for more in-depth insights.

Structure

Binary Events: Does Liquidity Trade The Tails?

Which of Polymarket's multi-market pathologies come from discretising a continuous quantity versus the binary architecture itself? By splitting 18,863 events into continuous (price, margin, temperature buckets) and categorical (candidates, teams) slices and re-running the v1 analysis, functionSPACE shows concentration is architecture-wide, ghost markets are largely a categorical phenomenon, and a continuous-distribution primitive is a sharper fix than v1 suggested.

Structure

Binary Events: What Happens When You Split One Market Into Twenty

Let's find out how Polymarket handles complex questions by breaking them into multiple yes/no contracts. By examining metadata from the Gamma API, functionSPACE argues that this "fragmented" approach creates a "resolution gap" where liquidity fails to spread evenly across all outcomes.

Structure

The Yes Bias Might Not Exist

Do Polymarket traders have an inherent psychological bias toward "Yes" outcomes? By analyzing over 7,000 events, the researchers discovered that the platform's editorial tendency to frame questions around dramatic, unlikely scenarios (e.g., "Will a specific event happen?") naturally makes the "Yes" token a cheap long-shot. Their data reveals that traders don't actually care about the "Yes" label; they simply gravitate toward cheaper tokens regardless of their name. Consequently, what appears to be a behavioral bias is actually a structural illusion created by price sensitivity and the way markets are designed, where the "No" outcome is the default reality for most unlikely events.

Ecosystem

Information as supply

We argue that prediction market TAM should include the supply side: as the cost of producing real-time probability estimates collapses, the addressable market extends beyond trading volume to every decision that benefits from better forecasts.

© 2026 functionSPACE
