## More on Uncertainty

Exclusion space is directly implied by the counting numbers; I didn't have to invent it, I just discovered it by following the signal. It is a natural quantum space, both discrete (the octahedrons) and uncertain. Here I want to say more about the uncertainty.

The first octahedron is a pick from 54 isomers. They are all equally uncertain in terms of which takes up the space, and whichever does therefore excludes the others. That's why it's exclusion space!

As you introduce more octahedrons, any that do not connect to a pre-existing octahedron are isolated, and no communication with them is possible. When an octahedron appears with the possibility of a shared node or node pair (an edge), we get our first real look at uncertainty at work.

There are two possibilities here: 1) as in high-school probability, the uncertainty of the system increases. Roll one die and you have 6 states; roll two and the number of possible states grows to 36. Or 2) the uncertainty decreases, because we already have the pre-instantiation of the first die; the first and second events are not independent. The second octahedron only comes into view WHEN it shares a node or edge, so history is important.
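To make the dice analogy concrete, here is a minimal Python sketch. The equal-value condition below is a hypothetical stand-in for "shares a node or edge", purely to show how conditioning on history shrinks the observable state space:

```python
from itertools import product

# All outcomes of two independent dice: 6 x 6 = 36 joint states.
independent = list(product(range(1, 7), repeat=2))

# Hypothetical dependence: the second die is only "seen" when it
# matches the first (a stand-in for sharing a node or edge).
dependent = [(a, b) for a, b in independent if a == b]

print(len(independent))  # 36 joint states when the rolls are independent
print(len(dependent))    # 6 joint states once history constrains the second roll
```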

Take the example of a shared edge. The uncertainty of the first octahedron drops to 1 in 18 once an edge is known. Similarly, the second octahedron now has only 18 isomers to be uncertain about, not 54. So did the uncertainty of the system increase or decrease? I put it to you that the uncertainty of a system, unlike probability, is measured at its boundary. When there was one octahedron it was 1 in 54; now, with two, the boundary has grown in size, but at each new interface the uncertainty has fallen to 1 in 18. If we add a third to the chain, the lowering of uncertainty will ripple back into the interior. Once the interior has exhausted its connection possibilities it is out of scope and unobservable.
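As a back-of-the-envelope check on the numbers above (54 isomers free-standing, 18 once an edge is fixed, both taken from the text), here is a sketch contrasting the naive multiply-the-possibilities count with the history-aware count. Treating the two constrained choices as independent given the shared edge is my own simplifying assumption:

```python
FREE = 54  # isomers of an unconstrained octahedron (number from the text)
EDGE = 18  # isomers compatible with a known shared edge (number from the text)

# Naive high-school counting: joint possibilities simply multiply.
naive_two = FREE * FREE        # 54 * 54 = 2916

# History-aware counting: the second octahedron only comes into view
# when it shares an edge, leaving 18 compatible isomers on each side
# (assuming the two sides are independent given the edge).
conditioned_two = EDGE * EDGE  # 18 * 18 = 324

print(naive_two, conditioned_two)
print(1 / EDGE)  # uncertainty at each new interface: 1 in 18
```

Either way you count the joint states, the quantity measured at the boundary is the per-interface 1 in 18, which is the point of the paragraph above.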

A bit like a singularity: you only know of it through its event horizon.

## Where from here

There is only one way and that's up. The problem with exclusion space is finding new signal; this is virgin territory, unlike the counting numbers, which have been studied for hundreds of years. The goal is clearly to approach concepts like the quantisation of space, gravity, charge, type and instance, and eventually to confirm the dimensionality of spacetime, derive the Standard Model, and explain mass and energy as we perceive them.

One of my thoughts is that at this level uncertainty behaves like temperature or entropy, so in the long run a sum over histories is essentially thermodynamic. Simply put, the more you have, the less uncertain it all becomes, and so entropy decreases. Which is why time appears to run forward: it's essentially just thermodynamic memory.

## K

Looking at exclusion space in the low numbers is, I suggest, very misleading. We will only really get to know its properties once we use some supercomputing to simulate upward of billions of octahedrons. This means pushing n into unheard-of regions. The universe as we see it today behaves in the region of K, where K is a counting number defined to be larger than any other counting number you can nominate, or become aware of the ability to express or communicate. What happens at K is a mystery. Does the number of solutions to a^{2} + b^{2} = c^{2} become more dense, or less? Are there many isolated clusters of connected octahedrons, or just one big cluster? Do they loop, and is looping common enough at scale to suit spin foam theory?
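The density question can at least be probed at small n today. Here is a minimal brute-force sketch; framing "density" as the number of triples with hypotenuse up to a limit, divided by the limit, is my own choice of measure:

```python
from math import isqrt

def count_triples(limit: int) -> int:
    """Count solutions to a^2 + b^2 = c^2 with 0 < a <= b and c <= limit."""
    count = 0
    for a in range(1, limit + 1):
        for b in range(a, limit + 1):
            c_squared = a * a + b * b
            c = isqrt(c_squared)  # integer square root; exact when c_squared is a perfect square
            if c * c == c_squared and c <= limit:
                count += 1
    return count

# How do the count and the density scale as the limit grows?
for n in (100, 200, 400):
    print(n, count_triples(n), count_triples(n) / n)
```

Extrapolating whatever trend this shows anywhere near K is of course exactly the open question; brute force only sees the shallow end.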

If you have an opinion, please leave a comment.