One may say “the eternal mystery of the world is its comprehensibility.” – Einstein
Something you won’t often find in the present-day penny press, or add to your ready repertoire of elevator pitches, is the unreasonable effectiveness of mathematics. Part of this effectiveness is obviously related to variations of the observer effect: we see what we look for, and we create and improve the tools that suit us best. Yet it remains a baffling feat that a simple formula can be accurate to one part in a million or even a billion. Although deceptively simple, the law of gravity that Newton devised in 1687 is accurate to better than one part in a million, and we only need to be concerned about deviations when looking at the very small or the very large. Maxwell’s equations, devised in 1861, laid the groundwork for the quantum theory that predicts the strength with which an electron interacts with a magnetic field with an accuracy of eight parts in a trillion, as verified in an experiment done in 2006. As guesstimates go, this goes a little beyond a lucky guess. This has not gone unnoticed, and an increasing number of scientists are flirting with the idea that physics is so successfully described by mathematics because the physical world is mathematical. Although one can always construct a framework to describe what is happening, with perhaps the simplest construction being the ability to give some event a name, it appears that “constructability” may have a far deeper reach than usually considered. Instead of mathematics acting merely as a conceptual framework that helps describe a complicated system in terms of simpler systems, increasingly accurate approximate prescriptions at ever more fine-grained levels of detail are uncovering hidden mechanisms underlying mathematics itself. That is, many advances are related to a deepening insight into the mathematical construction within some physical, chemical or even economical context, without introducing many new ideas to the context itself.
Nature appears to follow a similar sort of approximation, forming levels at different minimal scales of simplicity, constructions of irreducible sophistication, and in a way the sciences are simply rediscovering these. They rediscover and reinvent them by reconstruction, and, as with many scientific ideas, Nature has to deal with a vast number of non-sequiturs along the way.
It is said that in mathematics you don’t understand things, you just get used to them. The exact sciences turn out to actually need a certain degree of vagueness, fundamental fuzziness and often even contradiction. With our bias towards exactitude the focus has been on systematic composition, whereas poetry, for example, often uses the logic of analogy. While mathematics is a mixture of systematic composition and correspondence, the latter has been given little attention beyond its role in symbolic representation. And while ‘the laws’ of gestalt theory are widely used in industrial design, no conceptual framework yet exists to gauge their mutual interpretability with neuroaesthetics, such as Ramachandran’s “Eight Laws of Artistic Experience”, or to apply these to analogue computation. In other words, there are quite a few different styles of logic still to be uncovered.
In an age when one could still become an expert in a scientific field by spending a night in a library, the French philosopher Auguste Comte devised a “hierarchy of the sciences”. Moving from the simplest to the more complex, the sciences developed in this order: Mathematics; Astronomy; Physics; Chemistry; Biology; Psychology; Sociology. In this nicely layered model particle physics determines how atoms and molecules behave, these in turn determine the chemical interactions, which in turn determine the biological characteristics, which determine the psychological qualities, and so on. As was the fashion at the time, Comte followed a line from transcendence ‘upwards’ to the more tangible humanities at our scale of existence, as if one level were the logical consequence of the other, nicely layered on top of each other in a vertical fashion… just like society was supposed to be.
However intriguing and valuable an approach, the appearance of the minimal structure of a new ‘layer’ does not mean the previous ‘layer’ ceases. Chemistry doesn’t stop where Biology begins. Perfume can have a clear impact on someone’s psychological mood, electricity obviously has a great societal impact, and in a road system a certain degree of Boolean logic is unavoidable, where an entrance ramp acts as an AND gate and an exit ramp as an OR gate. The ‘lower’ realms continue to pervade the ‘higher’ realms. We’d be better off trying to establish some arrangement of the potential reach of each science, and staying relatively close to the old model we can follow a nested hierarchy of emergent constructions, mixing the Russian nested dolls with the recursive Droste effect. If we think in terms of “constructability” instead of “comprehensibility”, with structural and functional minima signifying the onset of emergent scales in probability space, we get a hierarchy that mixes transcendence with immanence: Biology; Artificial Intelligence; Computing; Semiotics; Logic; Mathematics; Physics; Chemistry; Psychology.
Although this is arguably erroneous to one degree or another, it simply serves to highlight the importance of biology. Even though great progress is being made in mixing physics and mathematics, many phenomena in particle physics and astronomy cannot be explained were it not for insights from theoretical biology. Even if evolution, self-organization and criticality are applicable to astronomy, physics and computing, meaning would be lost by placing these mechanisms outside of biology. The “laws of physics” are better approached as programs or learned behaviors, and 3D space actually appears to be one of the simplest complete frameworks that can arise for any kind of systematic arrangement; it is probably the simplest way in which the whole interactive tapestry of objects can organize itself. Recent research concerning the self-assembling tendencies of variable amounts of 145 different polyhedra (shapes like cubes and pyramids) showed that nearly 70% of the shapes tested produced crystal-like structures even when their environment was as disordered as possible. Some of these structures were highly complicated, with up to 52 particles involved in the pattern that repeated throughout the crystal. Contrary to the typical idea of entropy as an inevitable tendency towards chaos, even in a mathematical simulation there is no other way than to form ever more complex constructions. If we recognize mathematics as a natural science, life may be an inescapable result of how the universe works.
The insights into what kind of world we actually live in have been accelerating greatly during the last century, and it seems the sciences have shifted a level of abstraction to incorporate the importance of the computing sciences. Far-fetched areas being mapped out as we study along include metamathematics, metaphysics and metabiology, and the difference between these three is growing increasingly blurry. One of the deeper insights originates with the theoretical biologist Stuart Kauffman. Setting out to define the fuzzy edge between chemistry and organic life, abiogenesis, he has been gathering increasing evidence that life initially arose as collective autocatalytic sets: collections of molecules, each of which can be created catalytically (a catalyst facilitates a chemical reaction without being consumed by it) by other members within the set, such that as a collective the set is able to catalyze its own production. As a reproductive, functionally self-sustaining whole of structurally self-sustaining parts, this is a likely arrangement for making the jump upwards to organic life. His insight is particularly important as it clearly shows how individual parts can cooperate in a minimal collective to form something that is “greater than the sum of its parts”. Replication already happens with simpler molecules, though, but it needs to be repeatable enough so that this ‘organism’ doesn’t run wild or deplete its immediate surroundings.
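The core of Kauffman’s idea can be sketched computationally. The following is a minimal sketch, assuming an invented toy reaction network (the molecule names and reactions are made up for illustration, not real chemistry), and a simplified pruning loop in the spirit of the RAF (“reflexively autocatalytic and food-generated”) formalism rather than Kauffman’s actual algorithm:

```python
# Toy sketch of a collectively autocatalytic set (after Kauffman).
# The reaction network below is invented for illustration; the pruning
# loop is a simplified variant of the RAF idea, not Kauffman's own code.

def autocatalytic_core(food, reactions):
    """Each reaction is (reactants, products, catalyst). Keep only the
    reactions whose reactants are producible from the food set and whose
    catalyst is itself producible, iterating to a fixed point."""
    rxns = list(reactions)
    while True:
        # closure of the food set under the current reactions,
        # ignoring catalysis for the moment
        reachable = set(food)
        grown = True
        while grown:
            grown = False
            for reactants, products, _ in rxns:
                if set(reactants) <= reachable and not set(products) <= reachable:
                    reachable |= set(products)
                    grown = True
        # drop reactions whose reactants or catalyst are unreachable
        kept = [r for r in rxns if set(r[0]) <= reachable and r[2] in reachable]
        if len(kept) == len(rxns):
            return kept
        rxns = kept

FOOD = {"a", "b"}
RXNS = [
    (("a", "b"), ("ab",), "ba"),   # ab is catalyzed by ba ...
    (("b", "a"), ("ba",), "ab"),   # ... and ba by ab: mutual catalysis
    (("ab", "c"), ("abc",), "a"),  # dead branch: "c" is never available
]

core = autocatalytic_core(FOOD, RXNS)
# the mutually catalytic pair survives; the dead branch is pruned
```

Note the bootstrap: neither `ab` nor `ba` exists at the start, yet as a pair they form a self-producing whole, something “greater than the sum of its parts”.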
Studied by modern giants like Mandelbrot and Wolfram, we can see such frameworks in the light of different mixes of order and disorder. With too much order life can’t emerge, and with too much disorder life doesn’t stick. If we look at Nature again, most of Earth’s history involved mineral formation, until there was such an abundance of atoms and molecules that life as we know it could evolve. Life had been brewing along for some three billion years before it evolved beyond single-cell organisms, until roughly half a billion years ago evolution accelerated and became as diverse and versatile as we now know life to be. None of the observations really explain why life exploded; it seems that the dynamic equilibrium of habitat and inhabitants grew abundantly fertile enough that, when a lower threshold in genetic complexity was reached, it allowed an enormous variety of species to develop. “Quantity has a quality all its own”, as an infamous politician once said.
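The “too much order / too much disorder” trade-off can be sketched with the logistic map, one of the simplest systems studied in this context. The parameter values below are standard textbook choices, not taken from the text:

```python
# Edge-of-chaos sketch with the logistic map x -> r*x*(1-x).
# Low r freezes into a fixed point (too much order), r near 4 is fully
# chaotic (too much disorder), and intermediate r values settle into
# stable periodic cycles, the structured middle ground.

def orbit(r, x=0.2, burn=500, keep=64):
    for _ in range(burn):              # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 6))        # round so converged values collapse
    return out

def distinct_states(r):
    return len(set(orbit(r)))

# distinct_states(2.5) -> 1 (fixed point)
# distinct_states(3.5) -> 4 (period-4 cycle)
# distinct_states(3.99) -> dozens of values (chaos)
```

Life-friendly structure sits in the middle band: enough repetition to persist, enough variation to explore.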
Life in its organic form seems to have emerged in a similar way on the edge of chaos and order, facilitated by the violent conditions of the early earth: nearly uninterrupted volcanic activity and thunderstorms discharging electric currents into the primal soup, where the dominant gasses intermingled in such a way that they created the amino acids, the alphabet of organic life. Furthering the language of organic life, its grammar appears to have developed analogously to “aperiodic crystals”, mildly disordered assemblies of ordered molecules. Single molecules, although their configuration can be quite information-rich, are too small to provide the expressive power of a grammar, so it had to be a collective behavior. A purely crystalline format is far too repetitive to express any higher degree of complexity. Any amorphous solid is too chaotic to express anything. So it had to be a crystal with the right mix of order and disorder, and this turned out to be quasiperiodic crystals, with ordered elements held together in a disordered way. Limestone has long been suspected to act as a placeholder for the development of such behavioral complexity in the form of proteins; yet if limestone acted as memory by preserving a structured imprint, water acted as the processor.
When water is not tightly compressed its outer surface forms a liquid crystal layer. Just as glass is actually an amorphous solid (too disordered to be crystalline), a liquid crystal is a form of ordered fluid. When tightly compressed, the minimal-energy configuration of water is not to arrange itself as single H2O molecules; it becomes an assembly of five H2O molecules, and it will endlessly bond and re-bond so that it is always moving. With its strange properties water amplifies chemical reactivity and variation by a factor of roughly a billion to a trillion. The more we learn of water, the stranger it gets. Water doesn’t do what a normal substance does; it is always a little bit different, often with staggering results. If water acted like a normal substance the weather would probably have come to a standstill long ago. And when you put water together in a sea, under the weight of gravity a normal substance would squeeze the lower layers together into an ice format, but not water: it actually expands a little, which keeps the deep sea at a temperature between 0 and 4°C and makes it push upwards. Water even seems to match the simplest definition of organic life, a “complex adaptive system”. Although water is not organic life by definition, it is difficult to categorize, and it would be more suitable to see it as one of the prime building blocks crossing the bridge between chemical life and organic life.
Water may not be the immediate cause of organic life, but it acts as a participating facilitator, enabling an extension in utilitarian degrees of freedom. Affine enablement of the nearest neighbor in possibilities, nested complementarity in probability space, or, as Kauffman calls it, “the adjacent possible”. Just as evolutionary progress happens in steps, not jumps, a combinatorial reshuffling of existing and newly introduced parts, adjacency implies that these possibilities do not appear out of nowhere: there is a direct line of sight. This visibility requirement indicates that for any semi-closed system this “adjacent possible” expresses a dynamic equilibrium between a system and its immediate environment, the system’s potential energy as it propagates through phase space. Entropy, in the statistical approach thought to be a measure of disorder, may be better suited as an expression of the system’s structural arrangements, the “tensional integrity” of its emergent hierarchy as it reshuffles from actual to potential. In a simple physical setup that is simply the kinetic energy. Nevertheless, visibility means interaction, and even though something ‘new’ may enter the picture, a system is always in touch with its potential. As a result, potential energy is an active shaping force, like water not an immediate cause but a determinate yet unpredictable facilitator, due to the potentially numerous possibilities.
To give an idea of the number of possibilities: the average human body contains roughly 7 billion billion billion (7 × 10^27) atoms, yet we still move around as a whole, so there are structural mechanisms in play which greatly simplify how our parts are arranged and coordinated, such as a proposed mechanism for muscle coordination involving low-intensity electromagnetic cellular interactions with a high degree of quantum coherence, along with biomechanical tensegrity. Even though they act on a ‘higher’ level of complexity, we see ‘lower’-level mechanisms being used in a greatly simplified manner. However, if we take a mechanical look at the different ways we can make a step, hundreds of muscles, bones and tendons are involved, and this gives about a billion times more possibilities than the number of atoms in this universe. Even with a coordinating mechanism in place, the number of possibilities is mind-bogglingly large, but only a very few of these involve a large enough step to break the 8.95 m long jump world record.
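The scale of such combinatorial explosions is easy to check. A back-of-the-envelope sketch, where the figures of 700 muscles and 10^80 atoms are rough order-of-magnitude assumptions rather than exact counts:

```python
# Even a crude model in which each of ~700 muscles is simply "on" or
# "off" during a step yields 2**700 configurations, vastly more than
# the ~10**80 atoms in the observable universe. Both figures are rough
# order-of-magnitude assumptions.

MUSCLES = 700                     # rough count of human muscles
configs = 2 ** MUSCLES            # binary on/off simplification
ATOMS_IN_UNIVERSE = 10 ** 80      # standard rough estimate

print(len(str(configs)))                         # 2**700 has 211 digits
print(configs > 10 ** 130 * ATOMS_IN_UNIVERSE)   # ratio alone exceeds 10**130
```

And this is the crudest possible model; graded muscle activation only widens the gap.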
If we take a bottom-up approach we sometimes encounter assemblies with a level of unity that implies a collectively induced coherence, with emergent attractors in phase space. Emergent attractors appear really strange, but they may be what life is about, and as they steer a system’s behavior towards a certain goal, they seemingly work backwards in time. Sometimes, as with certain forms of quantum error correction, such effects “… cannot be used to go back in time, only to reduce the time between cause and effect a little bit”, although this happens in laboratory conditions which are shielded as much as possible from the rest of reality. However, as every little thing, or event, seems to have its own particular timeline, chains of causal events can split and join, and even though on an individual timeline there is no such thing as retro-causality, consistent with relativity physics the ensemble interpretation of quantum mechanics allows for the possibility of connecting one timeline’s present with another’s past. While it is quite impossible to measure the collective future, it is possible to measure the influence of events that happened at different times in the past. Some unusual experiments have been performed which indicate that we are indeed able to change the past to some extent, as long as the particular local timeline was still in a ‘quantum’ state until the observation that connected it with global history made it definitive. Time may be an illusion, but like a pair of face-to-face mirrors, the further we look, the deeper it gets.
Just as with quantum-mechanical systems, if you drill down to too fine-grained a level of detail you will end up with interference problems, and you cannot assign probabilities to such fine-grained histories. The details cannot be detached from each other; even neighboring probabilities cannot be treated as separate alternatives. The assembly is a minimal structural whole, with spatial coherence and temporal coherence, a wave. Even though it is composed of identifiable discrete parts, like any wave function it loses meaning if you subdivide it further, leaving you with a caricatural sketch. As it turns out, these quasi-classical coherent ensembles are much more prevalent than previously thought, and if we explore the world in a top-down fashion we encounter these self-sustaining complexes which form an irreducible unit: they cannot be split up any further even though they are clearly made up of individual parts. Most of these ‘organisms’ have little to do with the subatomic realm from which quantum fields originate, but the approximate framework itself is already reaching up into a realm named quantum biology.
Many physical ‘mechanisms’ pervade the biological world. The opening of a flower is vital for its reproduction, allowing its own pollen to be taken by small insects or the wind, as well as cross-pollination with pollen from other members of its species. Of the several ways that flowers regulate flower opening, maybe the most elegant one uses osmotic pressure. When the first light rays of the morning sun hit the flower bud they heat up the fluids inside the petal’s cells, making the fluid’s atoms jiggle around more wildly. In turn this causes the cells to expand a little, and by doing so the fluid balance is disrupted, resulting in a negative pressure gradient which causes more fluid to enter the cells. In other words, the petals suck in water, which makes them blow up like balloons, and as the petals expand they unfold and open up the flower bud. Again, we have here a ‘lower’-level mechanism acting as a controller on a ‘higher’ level of complexity. If this is the local “adjacent possible”, then it is not an open-ended combinatorial explosion; it is a functional arrangement simpler than the structural arrangement would suggest, with enough self-sustaining coherence to have this simple mechanism act as an emergent attractor. Another variation causes the tightly folded DNA ribbon inside the plant’s cells to expand and unfold a little, thereby exposing a particular genetic sequence, one that is activated by the incoming light that exactly fits through the opening in the folding structure. This sequence then starts the ‘program’ to produce the chemicals that cause the petals to open up. Once the sun starts setting, the particular light frequency is absorbed in the earth’s atmosphere and doesn’t reach the plant anymore, which causes it to cease production of the needed chemical, and the flower closes in the absence of the stimulant. Like many such mechanisms there is an ‘on’ switch, but no ‘off’ switch.
As Prigogine and others have suggested, life is full of these negentropic mechanisms: (thermodynamically open) dissipative systems with a reproducible steady state, like cyclones, hurricanes, living organisms, or convection (the concerted, collective movement of ensembles of molecules within fluids). Convection has been widely studied as one of the simplest examples of self-organizing nonlinear systems, self-reinforced spatial expansion by group formation. Even though a population of particles starts out with an equal distribution, evenly smeared out, once the particles start grouping, the larger a group becomes, the more surface it has available for its nearest neighbors to attach to. Self-amplifying spatial expansion, just like how most clouds grow.
However simple the workings of such organisms, mechanisms or complexes may seem, the outcome is often unpredictable. When viewed from the perspective of “constructability”, as said, many scientific advances are surprisingly similar in the sense that a deepening of the mathematical construct, which in general reduces the amount of work involved, greatly advances the applicability of a science or technology. Wolfram closely relates this common behavior to “computational irreducibility”, meaning that the only way to figure out what is going to happen is by actually performing each step. One definition of mathematics is the study of the systematic composition of patterns, and even though some patterns may originate from an as yet unexplored logic, it may be clear that for simple mechanisms and complex organisms alike the capability of computation is indistinguishable from their potential evolvability. Wolfram and his team have been making a map of the mathematical universe, a map of more than three million theorems that have been constructed from intermediary theorems and elementary axioms, self-evident assumptions which are accepted as true. Essentially it is a map of all things that turned out to be decidable and provable. Nevertheless, “Mathematics has navigated through these kinds of narrow paths in which you don’t run into rampant undecidability all over the place”, and if one starts to ask mathematical questions at random one soon runs into undecidability. The known mathematical universe has paths following branches into side branches, only to face a sudden intersection where separate branches unify and cross over for no apparent reason. When using computing systems to create and explore the space of all possible theorems, one might find new paths and in due course create such a map of the constructible universe.
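Computational irreducibility is easy to demonstrate with one of Wolfram’s own elementary cellular automata. The sketch below uses Rule 30, whose centre column is famously unpredictable: as far as is known, there is no shortcut to step n other than computing every step before it.

```python
# Rule 30: each new cell is left XOR (centre OR right). Despite this
# one-line rule, the centre column of the evolved pattern behaves like
# a random sequence; the only known way to obtain step n is to run
# all n steps.

def rule30_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

def centre_column(width=101, steps=60):
    cells = [0] * width
    cells[width // 2] = 1          # a single "on" cell in the middle
    out = []
    for _ in range(steps):
        out.append(cells[width // 2])
        cells = rule30_step(cells)
    return out

# the column starts 1,1,0,1,1,1,0,0,... and never settles into a
# short repeating cycle
```

The width is chosen large enough that the wrap-around boundary cannot influence the centre within the simulated steps.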
In the space of all possible evolutionary constructs it is very easy to get complicated results, with evolutionary branches that die down into infinite dullness due to too much order, branches that explode with too much disorder, or branches with some harmonious balance, some even capable of simulating their own evolution.
Nature however has more tricks up her sleeve for mixing ordering and disordering, and there are many ways in which the direction of development is irreversible due to transitions with a certain degree of undecidability, such as the question of the first mover at a crossroads with a car waiting at each of the four roads, where the rule that gives traffic from the right priority ends up in a closed loop with all drivers waiting for each other. Eventually something’s got to give and the traffic starts to flow again, but the actual way this deadlock was resolved doesn’t really matter, as long as it is resolved. Besides undecidability, it turns out there are quite a few of these irreversible ‘crossroads’, such as granular indeterminacies, uncertainties, incompleteness, indecomposability, unpredictability, intractability, indistinguishability, and even things that are maximally unknowable. Fuzziness and information loss may be quite normal in nature, as with the interaction between water and crystals, where water’s pentagonal shape may grip into a hexagonal crystalline lattice as far as the structural elasticity tolerates such misfits, leading to all sorts of impurities; but like sprockets with an inexact fit, these can still gear up evolution. As the study of quasi-crystals shows, life may very well arise from within the undefined cracks of an incomplete space-filling tiling.
Conceptually, this “constructability” is pretty similar to the “mechanical”, “materialist” worldview for which many popular writers seem to blame Newton and Descartes, although its accompanying de-spiritualization is actually a recent mix of the advent of the modern economic sciences with Marx’s historical materialism, the popularization of psychoanalysis in spite of Freud’s strong distrust of unconscious inner drives, and Sartre’s bleak and blasé existential nihilism. However, the world of hard science, such as particle physics, is much closer to Alice in Wonderland, Borges’ unrealities, Bakhtin’s chronotope or Aboriginal Dreamtime. Contrary to modernity’s industrialized hope industry, only an infinitesimally small number of self-appointed gurus are willing to jump up and trade their front-row seat at this spectacle in exchange for some hard currency. Despite that, if there is something our universe does not deserve it is the nihilistic fatalism of the stylish loser. We live in a world where, inside and outside the science laboratories, things appear to move backward in time; where something can reach the finish line before it arrives, yet still cannot arrive before it left; a world where simple molecules can be made to disappear and reappear in a place a hundred miles away. A world where past events can be affected by future influences, as long as their timelines were on different branches with a slightly fuzzy history. A world that gets thicker if you stretch it out. We live in a world where space is a tapestry woven with light and matter as threads and knots, but with noticeable amounts of other sorts of stuff, outside this space-time texture, that appear to be everywhere and nowhere at the same moment. A world far more miraculous than anyone could have expected.
Although evolvability an sich appears to be open-ended, the very fact that an organism is self-sustaining, in its simplest biological format a collective autocatalytic set with a structural and functional circuit, means that it is self-delimiting, self-correcting, self-regulating. Especially where the functional arrangement is simpler than the structural arrangement, information about the flow dynamics can spread more easily and faster than the flow dynamics themselves, and in order to do so it must optimize the flow of information across the system. This typically results in asymptotically periodic behavior. The system, as well as its information flow, displays a tension between two opposing forces: one, caused by discontinuities, is “entropic” and leads to chaos; the other is “energetic” and pulls the system toward an attracting manifold within which the dynamics are periodic. Outside a vanishingly small region, chaos always loses. In other words, most natural processes are cyclic, with a rhythm of their own, like our heartbeat, nasal cycle, sleep cycles, biological clock or breath, though breathing can also be controlled at will, perhaps a remnant of our ancestors’ days as swimming apes. As emergent attractors go, they do seem to be abundant.
If we look at the onset of living systems though, before self-regulation has kicked in, studies on naturally occurring growth curves show that when something spreads over a territory, the curve of territory size versus time is S-shaped: slow initial growth is followed by much faster growth, and finally by slow growth again. Like the periodicity most self-sustaining systems converge to, when path dependence is the prime mechanism the S-curve turns out to be universal. Overshoot-and-collapse behavior is normal for mechanisms with an ‘on’ switch but no ‘off’ switch. Toggle-free growth will always have a certain degree of criticality, where function and structure start moving out of phase, one moving beyond a critical point while the other builds up overcapacity due to some form of inwardly directed elasticity, as with an overheated chocolate drink from the microwave, snow avalanches, landslides or earthquakes. The S-curve is a combination of tree-shaped “invasion” by convection, followed by “consolidation” by diffusion perpendicular to the invasive lines. Zoom in close enough and any interplay between population and environment, inhabitants and habitat, will show these S-curves. Tree-shaped invasion covers the territory with diffusion much faster than line-shaped invasion; not that the latter does not occur, but its “program” is simply less efficient and is outrun by the forking mechanism. Branching out over different scales will cover an area much faster than following an evenly distributed network of channels. Adrian Bejan has made enormous progress in researching these dynamics, and they apply to just about everything, from the self-similarity of capillary blood vessels and the fractalesque branching of the lungs, to river formation, to how ideas spread and memes propagate through ‘the news’.
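The slow-fast-slow shape is exactly what the logistic growth equation dN/dt = rN(1 − N/K) produces. A minimal discrete sketch, where the parameter values are arbitrary illustrative choices:

```python
# Discrete logistic growth: the per-step gain r*N*(1 - N/K) is small
# when N is small, largest near N = K/2, and small again near the
# carrying capacity K, tracing the universal S-curve.

def logistic_series(r=0.5, K=1000.0, n0=1.0, steps=40):
    series, n = [n0], n0
    for _ in range(steps):
        n += r * n * (1 - n / K)   # logistic increment
        series.append(n)
    return series

s = logistic_series()
gains = [b - a for a, b in zip(s, s[1:])]
# growth is monotone, saturates below K, and the largest per-step gain
# falls in the middle of the run, near N = K/2
```

The same three phases (slow invasion, fast expansion, slow consolidation) appear regardless of the specific r and K chosen.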
This logic even recurs in our limbs: the bone structure of our arms branches out from one bone in the upper arm to two in the forearm, to four fingers, creating great freedom of motion, while the four fingers themselves form a unit and, along with the “opposable thumb”, provide excellent grip. Here the forking cascades onwards along the limb towards a buildup of freedom of motion. Forking is more economical; it provides the most result for the least amount of effort. This sort of economics reoccurs everywhere as a converging goal, and is rediscovered time and time again, as the path of least resistance, the principle of least action, the Hamiltonian, Occam’s razor, Leibniz’s principle of sufficient reason, or Kauffman’s adjacent possible.
In a wonderful twist, this sort of economics allows for emergent attractors, such as with convection. For example, the earth is spherical because that allows packing the most stuff into as small a place as possible, a minimal surface with maximum compactness, but this also makes the center of the earth act as an emergent attractor. Out here on the earth’s surface we’re always falling towards the center of gravity, which ensures that most of our activities happen in a very thin layer above and below the surface; it’s not like we can just jump over a traffic jam, although some people act like they can. These attractors also appear on other levels, either by natural evolution or by deliberate interference in a system. Even mild forms of joining different systems can cause enough disturbance to change their behavior, as with the observer effect, where the very act of observing disturbs what you’re observing. Subtlety offers some remedy, but as a real-life example Google has enormous problems with the self-affirming feedback loops of its predictive analysis. Wherever Google’s search facilities focus their attention, they start acting as a self-fulfilling prophecy. If a site appears in the top ten on the first page of search results, it starts attracting more traffic, which makes it appear more popular and thus keeps it high up on the first page of search results. In a world with limited attention, popularity feeds popularity. This is where personalized search results become somewhat problematic, as Google’s commercial model is an advertisement company wrapped around a search engine, so its incentive is to push information towards end-users that is agreeable to its customers, the advertisers… but even without this bias, its search and categorization algorithms try to find and suggest the information thought to be most suitable for the searcher.
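This “popularity feeds popularity” loop is what network scientists call preferential attachment, and its rich-get-richer outcome is easy to simulate. A minimal sketch, where the page counts and click numbers are arbitrary illustrative choices, not data about Google:

```python
# Preferential attachment: each new click goes to a page with
# probability proportional to the clicks it already has. Starting
# from ten perfectly equal pages, a handful end up hoarding most of
# the traffic purely through the feedback loop.

import random

def simulate(pages=10, clicks=10_000, seed=0):
    random.seed(seed)
    counts = [1] * pages                            # everyone starts equal
    for _ in range(clicks):
        page = random.choices(range(pages), weights=counts)[0]
        counts[page] += 1
    return sorted(counts, reverse=True)

# averaged over runs, the single most popular page captures far more
# than the 10% an even split would give it
shares = [simulate(seed=s)[0] / 10_010 for s in range(20)]
print(sum(shares) / len(shares))
```

No page is intrinsically better than any other; the inequality is produced entirely by the feedback, which is the essence of the filter-bubble worry.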
This is effectively creating a “filter bubble” of self-affirming information and if no-one steps outside to get some more information, it’s like living inside a television.
A known characteristic of information-based market segments is that they show an unusual behavior known as ‘increasing returns’. Due to the reduced dependency on physical limitations, knowledge and technologies can be distributed very quickly, e.g. via downloads or television news broadcasts, and due to network effects this creates what is known as ‘path dependence’, a self-sustaining reinforcing feedback loop. These self-magnifications also happen in the news, international politics, investor communities or the fashion industry. Often without an ‘off’ switch, if left uncontrolled these will eventually burst like any speculative bubble. Yet these are all predictable phenomena with predictable events on their own timeline, like little programs, machines or organisms. They are only unavoidable if left on their own; like a balloon with a piece of sticky tape on it, you can stick in a syringe and deflate it in a regulated fashion. For economic bubbles, this means that value can be moved into several other industries. However, the modern-day notion of shareholder value has caused most businesses and industries to lose their natural format. They’re built for growth, and with too much growth they explode, with too little growth they implode; in general, after the sixty-sixth six-sigma overhaul, they cannot handle economic seasons anymore.
What happens is that at a certain moment, when an industry or market segment is forming, there is a strong interplay between environment and population levels, habitat and inhabitants, market and companies, which influences the perceived uniqueness of a service or good. This translates into rarity and imitability. For example, positive rarity is a qualitative, discriminating characteristic where an offering is not too far ahead of a highly cohesive market. Negative rarity would imply the offering is so unique within an incoherent market segment that it has difficulty demonstrating its value and return on investment; the market is so disjointed that the company has difficulty being recognized as a core player in this particular market. The latter can be addressed by e.g. solid support for open standards, by which the offering gains a variety of contextual settings and use cases it would not have in its own right. The interpretation flips around in value when an offering's aim has to do with e.g. security, such as financial messaging networks, where uniqueness and rarity are positive attributes. Negative imitability would be an incoherently organized offering with quantitative differentiation, where sufficiently many equivalent offerings exist on a market dominated by competition, while positive imitability could for example result in de facto market leadership. To move away from the above-mentioned negative imitability, a company can choose to streamline its business and production processes in a well-organized and cost-efficient manner, thus making the supporting organization highly coherent. But overly high coherence leads to another dead end: as simple mass production goes, if the only differentiating factor is price, investing in the newest technology will increase the price, so companies like these simply run until replaced by a newer, more efficient version.
It serves to avoid extremes, and paradoxically increased competition leads to conformity: the whole population of companies turns into a coherent “swarm ball” where most are doing pretty much the same thing. When an industry-wide bubble starts to float and loses touch with reality, individual companies will start looking more closely at their close competition, and usually start mimicking each other's behavior, and this works both ways. Being a ‘fast follower’ saves the costs of being a thought leader, and keeping close to the competition ensures that if a competitor has a hit with some novelty, they can quickly hook into that trend and join the party. But when market dynamics have evolved to the extent that there are only a few main players, these start acting like a unit. As markets don't have an ‘off’ switch, the first companies start to drop out due to “auto-cannibalization”, as their initial business was too far removed from the emergent de facto norm and they have to swallow double the costs to make the switch; this is when the industry starts to deflate and needs to consolidate into a small number of survivors.
Investor speculation only amplifies these mechanics: when an industry starts to grow it causes investors to flock towards it, and the more investors invest somewhere, the more investors it attracts; when an industry starts to deflate, investors will move away, and the more investors exit, the more investors exit. As if market dynamics weren't enough, the current investor climate increases the risk of speculative bubbles. Bubbles are normal though, as there is always a delay between anticipation and response, between introducing a product and its adoption, but it becomes problematic when overshoot-and-collapse behavior expands beyond its natural elasticity. If investors wouldn't rush out of a bubble, it wouldn't burst at all, but as they try to maximize their investment they will linger on as long as possible so that they don't cause a rush out. Trying to win a game that is being defined while it is played, self-fulfilling collective deadlocks dominate the current investor landscape. If the market is ‘life-worthy’, emergent attractors will be at play, and certain phenomena appear to be moving backwards in time, making things happen in the present so that they can happen in the future. Or, applying some negative logic, emergent un-attractors indicate the moment of dissipation, where the disappearance of emergent attractors causes a collectively induced decoherence. Even if it happens outside laboratory conditions, as it is functionally simpler than the structure it acts upon, this can only be a very minor effect, only noticeable in ‘how’ things happen, and to an increasingly lesser extent, ‘when’ and ‘if’. 
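The ‘delay between anticipation and response’ is by itself enough to produce overshoot and collapse. A minimal sketch, using an illustrative delayed-logistic model with made-up parameters (not a market forecast): growth is braked by out-of-date information, so the system shoots past its carrying capacity before correcting.

```python
def delayed_logistic(r=0.5, delay=3, steps=60, x0=0.1):
    """Logistic growth toward capacity 1.0, but the braking term reads
    the state `delay` steps in the past -- response lags anticipation."""
    x = [x0] * (delay + 1)
    for t in range(delay, delay + steps):
        x.append(x[t] + r * x[t] * (1.0 - x[t - delay]))
    return x

smooth = delayed_logistic(delay=0)  # no delay: a plain S-curve, no overshoot
bubble = delayed_logistic(delay=3)  # lagged feedback: overshoot, then collapse
```

With zero delay the trajectory settles smoothly at the capacity; with a three-step lag the same rule overshoots well past it and then crashes back down, purely because the feedback arrives late.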
However, despite its subtle third-order-derivative ‘jerk’ influence, it is clearly noticeable with many investors and company policy makers, in a rather exaggerated way where the positivist variant of magical thinking has become so strong that they simply ignore other input, like the climate change deniers… It goes to show that logic does not have to be rational, or make sense whatsoever.
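For reference, ‘jerk’ is the third time-derivative of position, the rate of change of acceleration. A quick finite-difference sketch with unit time steps makes the hierarchy of derivatives concrete:

```python
def diff(seq):
    """Discrete first difference: a crude derivative for unit time steps."""
    return [b - a for a, b in zip(seq, seq[1:])]

position = [t**3 for t in range(8)]  # cubic motion has constant jerk
velocity = diff(position)            # first derivative
acceleration = diff(velocity)        # second derivative
jerk = diff(acceleration)            # third derivative: [6, 6, 6, 6, 6]
```

For position t³ the jerk comes out constant (6 everywhere), which is why jerk is the natural name for a third-order effect: it does not change where you are or how fast you move, only how your acceleration itself shifts.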
Companies that understand these dynamics though, in particular the interplay of perception, expectations, maturity of the offering, fulfillment and obsolescence, can use the means described above to bypass the usual hurdles when introducing a new product. Apple's introductions of the iPhone and iPad are magnificent examples of a vendor entering an existing market with such brand recognition that it can propagate its reputation into an adjacent market and grab a large piece of the client potential. The S-curve “overshoot and collapse” tendency of a hype or fashion trend is not merely a psychological effect; it is a natural systemic mechanism that appears when a new technology is introduced and tries to ‘settle in’ within a wider population of related technologies.
Recent adventures in artificial intelligence have taught us that precise communication may very well be impossible. Ambiguity, irreducible undecidability, appears to be the norm, as Marvin Minsky states: “It is an illusion to assume a clear and absolute distinction between “expressing” and “thinking,” since expressing is itself an active process that involves simplifying and reconstituting a mental state by detaching it from the more diffuse and variable parts of its context. […] We can tolerate the ambiguity of words because we are already so competent at coping with the ambiguity of thoughts.” Once again the strange and intimate relation of order and disorder is at play here, and as a process that is forever taking shape, a thought may be the smallest of snapshots but is still an event with a minimal duration. For the active listener, participatory communication offers a high degree of mutual interpretability, but without some effort it is easy to get lost somewhere between vagueness and clarity. Life has this incomplete and unfinished quality that allows magic to happen, as if time indeed works backward to boost a civilization forward towards fulfilling its potential, as if many of the “emergent attractors” combine here into irreversible time-like structures, of events that have to happen. If there is a lesson to be learned from recent science, it is that “life wants to happen”. Life is unavoidable; it is woven into everything, even into complicated mathematical constructions. Although one can try to create a hierarchy of sciences, the boundaries are blurry and ambiguous, but it is clear that beyond the Kantian approach lies a world where all logical reasoning is an organization of analogue computations of a reduced biomimetic composition, not necessarily rational.
A vision that started arising in the mid-1960s, around the time the robotic arm started to be adopted, already a decade after the birth of ‘artificial intelligence’, was that “the factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.” This is not happenstance, or an unforeseen utility; this is an emergent attractor that is going to happen sooner or later. Western society has been trying to shift towards a service and knowledge economy since it became clear that manual labor was in decline. With offshoring, many manufacturing tasks have moved to China, while many ICT services have moved to India. This has allowed them to jumpstart their economies towards a Western level with amazing speed. But the next phase is already happening, and ten years from now robotics, whether physical or software-based, will be replacing many of the jobs made possible now, and competitive forces do not allow for an exit. In that sense Western society has also offshored future societal problems to China and India, and as Europe's aging population seems to prepare for a slow retirement, it is up to these societies to come up with a solution. It is not just that China and India own the future; the future owns them.
This article was written without an executive summary on purpose: if you got this far, by now it should be clear that although we cannot fully predict the future, we can invent it.