Knowing is Not a Binary State

As humans, we store information and ideas in complex three-dimensional structures, accounting for nuance, the weight of logic, and auxiliary topics. When we need to use an idea, we slice through that 3D spatial matrix and render it flat, into a sentence, for example, or a decision.

The person on the other side then unpacks what we said into their own mental map, and on we go.

As we started using machines for telecommunication, we reaped all of the advantages with relatively little loss of meaning.

Ironically, the deeper we move into the algorithmic age, the more loss in the digital signal we should expect. By asking machines to extract meaning (to make decisions, for example), we relinquish understanding and set the machines up to fail.

We must not confuse a transistor for a translator.

We are now asking a binary machine (Shannon, 1948) to unpack levels of knowing that are deeply qualitative. We are asking a black-and-white machine to see shades of gray.

Machines, conditionally and unequivocally, operate on a single plane. A single train track, a single hockey rink, one dimension. The meaning comes from the connections between all of those, and from the liminality.

We are the only ones capable of a system view, and it is that bird's-eye view that understands meaning.

Published by Nitzan

I am a designer, writer, and strategist with an interest in machine learning, liminal thinking, and complexity science. In my commercial work I help companies build innovative tools, design better qualitative processes, and lead human-machine collaboration with complexity in mind.
