AI (machine learning, advanced algorithmic tooling) is a wonderful suite of technologies. But when it is applied with nothing but brute force (say, to capture attention, make decisions, or replace craftspeople), unintended things happen (Facebook's ad machine in overdrive, biased algorithms, machine production too rigid to adapt).

The silver bullet for harnessing (and taming) the potential of autonomous algorithms is context. All machines run on averages (Shannon, 1948; Rose, 2016), and all machine learning 'hyper-statisticizes' the average.
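To make the averaging point concrete, here is a minimal sketch (all users and numbers are invented for illustration): a single model that minimizes error over a pooled population collapses onto the population mean, describing "the average user" and no actual one.

```python
# Toy illustration: the constant prediction that minimizes squared
# error over a pooled population is exactly the population mean.
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical users with very different true preferences.
true_preferences = {"ana": 0.9, "bo": 0.5, "cy": 0.1}
samples = {u: rng.normal(p, 0.05, 100) for u, p in true_preferences.items()}

# One pooled model, fit to everyone at once.
pooled = np.concatenate(list(samples.values()))
global_model = pooled.mean()  # lands near 0.5, the "average user"

for user, s in samples.items():
    mse = np.mean((s - global_model) ** 2)
    print(f"{user}: error under the pooled model = {mse:.3f}")
# 'bo' (who happens to be average) is served well; 'ana' and 'cy'
# are not. The model is exactly right about a user who doesn't exist.
```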

The good news is that the decentralized nature of these new tools lets them split into separate instances, one per user (something I have been modeling as plumbers), but it is up to design (and the respective business models) to make use of that capability. If we keep relying on averages, all of these unintended consequences will keep coming after us. We're waking up to the fact that we're not average, and being an individualist in an age of abundance is more affordable than ever.
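A sketch of the alternative, under the same invented setup as above: instead of one shared code-base (real estate), keep one small model instance per user (plumbers), each fit only to that user's own context.

```python
# Same invented setup, now comparing one pooled model against
# per-user model instances.
import numpy as np

rng = np.random.default_rng(0)
true_preferences = {"ana": 0.9, "bo": 0.5, "cy": 0.1}
samples = {u: rng.normal(p, 0.05, 100) for u, p in true_preferences.items()}

# "Real estate": one pooled model shared by all users.
pooled_model = np.concatenate(list(samples.values())).mean()

# "Plumbers": one tiny model per user, fit only to that user's data.
per_user_models = {u: s.mean() for u, s in samples.items()}

for user, s in samples.items():
    pooled_mse = np.mean((s - pooled_model) ** 2)
    local_mse = np.mean((s - per_user_models[user]) ** 2)
    print(f"{user}: pooled {pooled_mse:.3f} vs per-user {local_mse:.3f}")
# Per-user errors sit near the noise floor for everyone, not just
# for whoever happens to resemble the average.
```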

As long as we're building a single code-base (real estate instead of plumbers) and aggregating (averaging) our users' context, we're missing out on the future, held back by linear (assembly-line) thinking.