“We are moving into an era where cities will matter more than states and supply chains will be a more important source of power than militaries — whose main purpose will be to protect supply chains rather than borders. Competitive connectivity is the arms race of the 21st century.”
-- Parag Khanna
A network is made of lines and switches, right?
Much has been said about network scaling effects, including attempts of my own [4-12] ... which compels me to introduce the not-so-frivolous notion of network forces.
These forces are expressed in several laws. I thought initially to put 'forces' and 'laws' in scare quotes here, but I realize they are quite objective, physical emergenta indeed.
In my ''Geodesic by Tauchain'' article from a couple of months ago I emphasized the Huber-Hettinga Law: how the cost of switching literally defines the 'orographic' topology of a network.
The cheaper the routing - the flatter the network.
Expensive switches = hierarchy, verticality, power, control, obedience, centralization, 'world is fiat', sollen, hence borders instead of bridges, limitations instead of stimuli, exclusivity ...
Cheap switching = geodesic society, 'world is flat', horizontality, p2p, decentralization, inclusivity ...
The more vertically centralized a network is, the more it must deplete information - omitting, ignoring calls from the deeps, or even actively suppressing or silencing nodes. Coping with the stream by strangling it. Simply due to lesser capacity, fewer degrees of freedom. Geodesic networks possess higher entropy and are therefore richer. They bolster both the Scrooge and Spawn factors. In other words:
The flatter the network - the richer  it is.
Maybe this is the explanation of why the wealthiest and healthiest societies tend to be those with the greatest economic and political freedom.
Naturally, the Huber-Hettinga Law led me to the elementary-Watson conclusion about the power and value of Tau as the ultimate über-switch. So far so good.
Now let's stare at the Lines. Here comes Nick Szabo.
Nick Szabo - a lawyer AND computer scientist - is a legendary figure from the great 'Archaic era of crypto' - the 1990s, when he, together with the other cypherpunk titans like Tim May, Wei Dai, Bob Hettinga etc. etc., poured in staggering detail the very bedrock foundations of what we now enjoy as Crypto in the post-Satoshi era.
It is THEIR vision come true that we all now live in.
Bitcoin was the detonation of precisely that critical mass of fused thoughts of these very smart people, piled up and compressed by the connective network forces of the early internet.
No, I do not mean Szabo's most famous thing at all - his 1994 coining of the term 'smart contracts'. In fact I deeply and strongly reject the very notion of 'smart contracts' as utter nonsense, even an oxymoron - which is a yuge separate problem, which I suspect I have nailed, and which I'll address in a series of dedicated articles starting in the upcoming weeks...
I mean something much more valuable, what I call the Szabo Law.
When we hear the phrase 'network effects', the first thing that comes to mind is the famous Metcalfe's Law.
''Metcalfe's Law is related to the fact that the number of unique connections in a network of a number of nodes (n) can be expressed mathematically as the triangular number n(n − 1)/2, which is proportional to n^2 asymptotically (that is, an element of O(n^2)).''
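To make the triangular-number count tangible, here is a tiny sketch of my own (not from Metcalfe) - watch how doubling the node count roughly quadruples the number of unique links:

```python
def unique_links(n):
    """Distinct node-to-node connections in a fully meshed network of n nodes."""
    return n * (n - 1) // 2

for n in (10, 20, 100, 200):
    print(n, unique_links(n))

# doubling n roughly quadruples the link count - the O(n^2) growth
print(unique_links(200) / unique_links(100))
```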
In their order of appearance, these network-force laws quantitatively capture the basic properties of a network:
- Huber-Hettinga Law - the cost of switches and routing.
- Metcalfe Law - the number of nodes, i.e. switches defining the number of unique connections or lines.
- Szabo Law - the cost of the lines and connecting.
All these Laws are scaling laws. Before we come back to and continue with Szabo's Law, we have to briefly mention another one:
''So what is "scaling"? In its most elemental form, it simply refers to how systems respond when their sizes change. What happens to cities or companies if their sizes are doubled? What happens to buildings, airplanes, economies, or animals if they are halved? Do cities that are twice as large have approximately twice as many roads and produce double the number of patents? Should the profits of a company twice the size of another company double? Does an animal that is half the mass of another animal require half as much food?''

''... With Dirk Helbing (a physicist, now at ETH Zurich) and his student Christian Kuhnert, and later with Luis Bettencourt (a Los Alamos physicist now an SFI Professor), Jose Lobo (an economist, now at ASU), and Debbie Strumsky (UNC-Charlotte), we discovered that cities, like organisms, do indeed exhibit "universal" power law scaling, but with some crucial differences from biological systems. Infrastructural measures, such as numbers of gas stations and lengths of roads and electrical cables, all scale sublinearly with city population size, manifesting economies of scale with a common exponent around 0.85 (rather than the 0.75 observed in biology). More significantly, however, was the emergence of a new phenomenon not observed in biology, namely, superlinear scaling: socioeconomic quantities involving human interaction, such as wages, patents, AIDS cases, and violent crime all scale with a common exponent around 1.15. Thus, on a per capita basis, human interaction metrics (which encompass innovation and wealth creation) systematically increase with city size while, to the same degree, infrastructural metrics manifest increasing savings. Put slightly differently: with every doubling of city size, whether from 20,000 to 40,000 people or 2M to 4M people, socioeconomic quantities – the good, the bad, and the ugly – increase by approximately 15% per person with a concomitant 15% savings on all city infrastructure-related costs.''
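A back-of-the-envelope sketch of mine, plugging the quote's exponents (0.85 and 1.15) into the power law Y2 = Y1 * (N2/N1)^beta, to see what one doubling of a city does:

```python
def scaled(y1, n1, n2, beta):
    """Power-law scaling: a quantity Y grows as Y1 * (N2/N1)**beta."""
    return y1 * (n2 / n1) ** beta

# normalize a 1M-person city's quantities to 1.0, then double the city
infra = scaled(1.0, 1e6, 2e6, 0.85)   # sublinear: roads, cables, gas stations
socio = scaled(1.0, 1e6, 2e6, 1.15)   # superlinear: wages, patents, crime
print(f"infrastructure x{infra:.2f}, socioeconomic x{socio:.2f}")
# per capita that is a 2**-0.15 factor (savings) and 2**0.15 (gains) per doubling
```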
Which probably comes to denote the sheer size of the network in STEM (space, time, energy, mass). I'm not sure, but I have some strong suspicions about the unity of matter, structure and action which I will expose and share some other time.
What I call Szabo's Law is revealed in his ''Transportation, divergence, and the industrial revolution'' (Oct 16, 2014): similarly to Metcalfe's (''double the population, quadruple the economy''), there is a power-law correlation between the cost of connections - links, lines ... - and the value of the network, too:
''Metcalfe's Law states that a value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is an inverse square of the cost per mile of transportation. Combine this with Metcalfe's Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith's observations: the division of labor (and thus value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he extensively discussed in his Wealth of Nations).''
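The inverse-fourth-power conclusion is easy to verify numerically. A minimal sketch of mine of Szabo's idealized model (reachable nodes ~ 1/cost^2 from the area argument, Metcalfe's n^2 on top of that):

```python
def network_value(cost_per_mile, k=1.0):
    """Idealized model: reachable nodes ~ k / cost^2 (area within economic
    reach), value ~ nodes^2 (Metcalfe), hence value ~ k^2 / cost^4."""
    nodes = k / cost_per_mile ** 2
    return nodes ** 2

v1 = network_value(1.0)
v2 = network_value(0.5)   # halve the cost of transportation ...
print(v2 / v1)            # ... and the potential value grows 16-fold
```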
My encounter with this article of Nick Szabo's was a goosebumps experience, because it coincided with a series of lay rants of mine on Tau's old Zennet IRC chat room that ''computation = communication = transportation''. Sometime in 2016, as far as I remember. :)
Maybe it was the last drop that shaped my conviction that through my dedicated involvement in both Tau and ET3 I'm actually working on ... one and the same project.
For communication, computation and transportation are all modes of state change. Because information is a verb, not a noun. And software is states of hardware.
''Decentralizing the internet is possible only with decentralized physical infrastructure.'' 
Just like the brain is a network computer of neuron nanocomputers, the emergent composite we colloquially call humanity or mankind or economy or society or world ... is a network computer made of all of us billions of humans.
Brains do thought, economies do wealth.
Integrated circuitry upon the face of planet Earth as a motherboard. Literally. Humanity's planet-hardware. Parag Khanna's Connectography explained.
The Earth is definitely not our ultimate chip carrier. Probably there is no limit at all to our culture-upon-nature hardware upgrades. The universe is our computronium and we've been here too short a time and haven't seen far enough. Networking is connectomics. And thus it always also is metabolomics.
Remember my last month's  ''Tauchain the Hanson Engine''?
The series of exponentially shortening growth doubling times looks like it is driven by transportation technological singularities: domestication of the horse, oceanic navigation, the combustion engine ...
In the light of all the net forces summoned above: The planet Earth viewed as a giant computer chip ...
- is itself subject to the relentless network-entropic force of Moore's Law
The network forces accelerate what that wealth computer does.
Two quick examples:
A.: The $1500 sandwich as proof that trade+production is at least thousands of times stronger at sandwich-making than production alone.
B.: The example of Eric Beinhocker in his 2006 ''The Origin of Wealth'' about two contemporary tribes: the Amazonian Yanomami - a stone-age population to this day - and the East Coast Manhattanites. The former are only about 100 times poorer, but the latter enjoy a billions-of-times-bigger choice of things to have.
Tauchain 'threatens' to affect the parameters of ALL the network-force formulae mentioned herewith on a mind-bogglingly big scale.
Simultaneously, orders of magnitude:
- lower switch cost
- higher node count
- lower connection cost
A wealth hypercane  recipe. Perfect value storm. Future ain't what it used to be .
In a recent article of mine I hinted at my strong suspicion that scaling is itself scalable.
''Scaling is a problem. Scaling must be scalable, too. Metascale from here to Eternity.''
No matter what a terrific grower a system is - as per its own internal algorithmic growth-drive rules - it seems inevitable that its growth gets it into entropic mutualization upon impact with some kind of ... downscaler.
Scaling is everything, yeah. But it is quite intuitive, and supported by too big a body of evidence to ignore, that, paradoxically: the faster a thing grows, the sooner its encounter with an external and bigger downscaling factor comes.
This realization, refracted through the prism of our 'reptilian brain' layer and amplified to gargantuan proportions by our inherent social hierarchicity, is the source of the 'Malthusian anxiety' which has led to countless violent deaths throughout human history. Fear is anger, so the emotion that there is only so much to go around, and that the catastrophe of 'running out' of something is imminent, is the major source of what makes us bad to each other.
There is a plethora of examples of very well mathematically and scientifically grounded doomsayer scenarios, and we must admit that they are all correct as per their internal axiomatics - and simultaneously all totally wrong for missing the obvious: the factors of externalities, the properties and opportunities of the medium which is consumed and/or created by this growth, and which transcend the axiomatics. For growth is always 'growth into'. The fact that doomsday scenarios are so compellingly consistent internally is what makes them such a strong and dangerous ideological weapon of mass destruction.
Let's throw in some such problem-solution couples for clarity:
a. the world of 1890s big cities sunk knee-deep into beast-of-burden manure, and the super-apocalyptic projections thereof, VS Tony Seba's 1-pic-worth-1000-words of the NYC carts-vs-cars situation in 1900-1913 ...
b. the grim visions of the whole of Mankind becoming blue-collar telephone switchboard workers, whose number would by now have had to exceed the total world population to achieve the present level of telephonization, or
c. the all-librarians world, where it would take more librarians than the whole of mankind to serve the social memory in paper-and-printed-ink storage mode ...
d. the Club of Rome as the noisiest modern bird of ill omen, with 'projections' based on the same blind extrapolations as the urban seas of shit, or the 'proofs' of the impossibility of connecting or educating or feeding everyone - instigating the mass-destruction fear that ''we are running out of everything and will soon all die'', used as justification for mass atrocities, VS Julian Simon's ''Ultimate Resource'' (1981, 1996). Cf. my accelerando article, and see precisely what the Factory for the succession of better and better Hanson drives over the last few million years has been - from the Blade and the Fire to the Tau - it is the same thing whose identification turned Julian Simon from a fanatical Malthusian into a rationally convinced Cornucopian ... the human mind.
e. the predator-prey model, whose brutal flaw I guess this pseudo-haiku depicts best:
''hawk eat chick -> less chick, human eat chick -> more chick''
for failing to posit and account for the positive feedback loop of predator-over-prey dynamics ...
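The flaw can be shown with a toy growth-rate function (my own sketch, not a calibrated ecology model): in the plain Lotka-Volterra term predators only remove prey, while a husbandry term makes more 'predators' mean more prey:

```python
def prey_growth(prey, pred, husbandry=0.0):
    """Instantaneous prey growth: reproduction minus predation, plus an
    optional husbandry term (the 'predator' also breeds its prey)."""
    return 1.0 * prey - 0.5 * prey * pred + husbandry * pred

# hawks (no husbandry): doubling the predators drives prey growth down
print(prey_growth(10, 2), prey_growth(10, 4))
# humans (husbandry > 0): the same doubling drives prey growth up
print(prey_growth(10, 2, husbandry=6.0), prey_growth(10, 4, husbandry=6.0))
```

The coefficients here are arbitrary illustration values; only the sign flip matters.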
f. The comment of Daryl Oster, founder of the other passion of mine - ET3 - on the so-called 'saturation' of the scalables (exemplified in the field of transportation, which, btw, being communication ... our social structures map onto the mobility systems we have at our disposal ... ):
''... US transportation growth has focused on automobile/roads (and airline/airport) developments. (And this has been VERY good for the US economy.) The reason is that cars/jets offered far better MARKET VALUE than horse/buggy/train transport did 150 years ago. In the mid 1800s, trains displaced muscle power for travel between cities - because trains offered better market value than ox carts. Trains reached 'market saturation' about 1895 to 1905 (becoming 'unsustainable') - however 'market momentum' produced 20 years of 'overshoot'. Cars/jets were far more sustainable than passenger trains and muscle power, and started to displace trains (and finish off horses). By 1916 the US rail network peaked at 270,000 miles (today less than 130,000 miles is in use).

Just like passenger trains hit market saturation, roads/airports are reaching economic limitations. The time is ripe for a market disruption, and all indicators (past and present) say it will NOT come from, or be supported by government or academia -- but from private sector innovations that offer a 10x value improvement (like ET3), AND also offer incentives for most (not all) key industries to participate (like ET3). Automated cars, smart highways, and electronic ride sharing are industry responses that will contribute to overshoot of cars/roads for the next 5-10 years.

The main problem i see with the education system is that academic research and publication on transportation is primarily funded by status quo industries like: railroads and rail equipment manufactures, highway builders, automobile/truck manufactures, engineering firms, etc. -- all who fund research centered on 'improving' the status quo. Virtually all universities (for the last 1k years+) are set up to drive incremental improvements that industry demands, and virtually all paradigm shifts are resisted until AFTER they occur and are first adopted by industry.
Government is the same (for instance in 1905 passing laws to forbid cars that were disrupting horse traffic; or in 1933 passing laws to limit investment in innovation startups to the wealthy (those successful in the status quo)).''
g. Darwinian algo sqrt(n) VS higher algos - like Metcalfe's n^2. It is not precise, it is more metaphorical - meant to indicate the direction or scale of scaling rather than rigorous precision - but ... the former, figuratively speaking, takes 100 times more to put up 10 times more, and the latter takes 10 times more to return 100 times more...
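The asymmetry in numbers - a sketch under the article's own figurative assumption that one regime pays off as sqrt(n) and the other as n^2:

```python
import math

def darwinian(n):
    """Payoff growing as the square root of the input effort."""
    return math.sqrt(n)

def metcalfean(n):
    """Payoff growing as the square of the input effort."""
    return n ** 2

print(darwinian(100) / darwinian(1))   # 100x more input -> only 10x output
print(metcalfean(10) / metcalfean(1))  # 10x more input -> 100x output
```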
h. Barter vs money. See the bottom of page 5, above the bottom-line notes, about the latter:
''simplifies pricing calculations and negotiations from O(n^2) complexity to O(n) complexity''
As a demonstration of how one item out of a scaling barter system emerges as a specialized transactor and accelerator to transcale the barter economy. From within. Endogenously, as always. (Btw, an extremely strong document, where entire books read and internalized stand behind each tight and contentful sentence!)
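The cited O(n^2) -> O(n) reduction, counted directly (a sketch of mine): with barter every pair of goods needs its own exchange rate, while with one good serving as money each other good needs a single price:

```python
def barter_prices(n):
    """Pairwise exchange rates needed among n goods under pure barter."""
    return n * (n - 1) // 2

def money_prices(n):
    """Prices needed once one of the n goods serves as money."""
    return n - 1

for n in (10, 100, 1000):
    print(n, barter_prices(n), money_prices(n))
```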
i. The heat death of the universe VS the realization that the 2nd law - the conservation law for entropy/information - does not allow that; the asymptoticity of the fundamental limits of nature; the fact that max entropy grows faster than/from/due to the actual entropy growth; that entropy is not disorder; and that at the end of the day it is an unbounded, immortal universe ... cause it's all a combinatorial explosion.
j. The Anthropic principle  and the realization that it is extremely hard if not impossible to posit a lifeless universe  ...
k. The Algoverse - my 'psychedelic' vision  of the asymptotic inexorable hierarchy of the Dirac sea  of lower algos which take everything for almost nothing - up towards giving almost everything for almost nothing - Bucky Fuller's runaway Ephemeralization . Algorithms are things. Objects. Structure. Homoousic or consubstantial to their input and output. Things taking things and making things outta the former. Including other algos of course! Stronger ones.
l. The Masa Effect. The Master of SoftBank, seeing how machine productivity is on an imminent course to massively overscale the human client base, and his apparent transcaling solution: to upscale the client base with bots and chips - with the very thing that scales supply in such a too-much way.
m. The Pierre de Latil 1950s and Stanislaw Lem 1960s (copied 1:1 by Tegmark) hierarchy. Of degrees of self-creating freedom of Effectors ...
n. Limits of growth - present at any particular moment and in any finitary setting of rules, but nonexistent in the infinity of rules upgradability. Like a cancer cell trapped in a cage of light vs ... photosynthesis.
o. Ray Kurzweil - static vs exponential thinking .
p. Craig Venter's Human Genome Project, which, when commenced in 1990, was ridiculed as bound to be unbearably expensive and to take centuries to finish - and it did: it cost a fortune unbearable for 1990, and it did take centuries of subjective time as per the initial projection conditions - being completed in the year 2000.
q. Jeff Bezos' vision of a Solar-System-wide Mankind:
''The solar system can easily support a trillion humans. And if we had a trillion humans, we would have a thousand Einsteins and a thousand Mozarts and unlimited, for all practical purposes, resources.''
r. The 'wastefulness' of data centers and crypto-mining colocation facilities ... which is as funny as envying the brain for 'wasting' >25% of the body's energy. (Btw, the tech megatrend is exponentially and relentlessly towards the minimum calculation energy.)
s. The log-scale intuitive measure and smooth straight-line visualization coming out of this quote, which I fished off the net a long time ago:
"The singularities are happening fairly regularly but at an increasing rate, every 500 to 1000 billion man-years (the total sum of the worldwide population over time). The baby boom of the 1950 is about 200 Billion man-years ago."
Oops! Go back to example q. With a population of 1 trln. humans, the 'singularities' would occur once a year?!
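The arithmetic behind that remark, using the quote's figures of 500 to 1000 billion man-years per 'singularity' (an assumption taken from the quote, not a measured constant):

```python
MAN_YEARS_PER_SINGULARITY = 1000e9  # upper figure from the quote
TRILLION_HUMANS = 1e12              # the population from example q.

years_between = MAN_YEARS_PER_SINGULARITY / TRILLION_HUMANS
print(years_between)                # 1.0 - one 'singularity' per year
# with the quote's lower figure of 500 billion: one every six months
print(500e9 / TRILLION_HUMANS)      # 0.5
```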
t. the Tau  !!
I can continue with these examples ... forever [wink] - excuse me if I've bored you - but I think at least that minimum needed to be shown, and it is enough to grok the big picture.
Scaling is the solution. It is a problem too. Its overcoming is what I dub 'Transcaling' for the purposes of this study.
Size matters. Scaling is the way. But the more general question is how a system handles change! This is fundamental enough to sit at the very core of the definition of life and intelligence.
Tauchain is all about change handling!
Now, let's knit the 'blockchain' of all these example threads above into a knot, like the Norns do:
Dear friends, please scroll back to example d. Yes, the human-mind-transcaler thing. The Ultimate Resource thing.
We are the ultimate resource.
We the humans (and soon the whole zoo of our technological imitations and reproductions and transcendences of ourselves ).
We as the-I are strong thinkers and creators - immensely more road lies ahead than has been traveled, yes - but still we, as the-I, are the momentary apex of the Effectoring business in the Known universe ... AND simultaneously we as the-We are mediocre to outright dumb.
We are very far from proper scaling together. The Ultimate Resource is not coherent and is not ... collimated. Scattered dim lights, but not a powerful bright mind-laser. Dispersed fissiles, but not a concentration of critical masses.
We as the-We - paradoxically - persistently find ways to transcale our destinies using the power of the-I, but the-We itself does not handle scaling well at all.
The individual human mind is the unscaled transcaler.
Tau is the upscaler of that transcaler.
I'll introduce herewith another 'poetic' neologism, which occurred to me to depict the scaling properties of a system, after the Scrooge factor of ''Tauchain - Tutor ex Machina''; and it is the:
Spawn  factor
- the capacity and ability of a system to grow through, despite, against, across, from and via the changes. Just as 'cuboid' covers all rectangular things - squares, cubes, tesseracts ... regardless of their dimensionality - the Spawn Factor is to be a generalization of all orders of scaling. A zillion light years from rigor, of course, as I'm at least the same distance from my Leibnizization. For a lawyer to become a mathematician is what it is for a caterpillar to become a butterfly. :) Transcaling.
Tau transcends the infinite regress of orders of: scaling of scaling of scaling ... by being self-referential. Or recursive. 
What is the Spawn factor of Tau?
If you'll let me, I'll illustrate this with a poetic periphrasis of Frank Herbert's famous piece:
I will face my change. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the change has gone there will be nothing. Only I will remain.
Let's build a universe. I realize this blog post is the most 'psychedelic' up to now and for a long time to come, but some 'poetry' never really hurts ...
We discussed already the worldmaker effectoring .
It is a quite ancient but also exponentially growing business ... in all possible forms of science, faith and art. This modeling usually serves to play out what's possible and what's impossible. Gedankenexperiment, yeah, but isn't all thought merely algorithmic, and mere action?
Usually the posited universes are made of variations and combinations of substance/matter, structure/form and action/process rules. Though, the algorithmic component is always the essential ingredient. Yes, the Laws of Physics are a full-fledged, literal algo, too. I have these conjectures that it is impossible to think out, make or discover (which is one and the same thing) a lifeless universe, and that substance-structure-action are inextricable - but these are separate topics to address some other time.
Let's put together our toy-universe out of pure algorithm only. I've never seen such a construct, although the Orbis Tertius is enormous and I bet this vision has occurred gazillions of times in zillions of minds.
It is like an ocean. The primary coin-toss algo which outputs 0s and 1s makes the water. We don't know (yet) if there are even deeper and more fundamental numerical bases for running algos. Most probably the answer is yes, by analogy with the Dirac Sea - the deeps would be made of simpler and weaker algos. The most elementary coin-toss thing makes up the ... probabilistics, perhaps the primordial form of logic. The laws of physics (and of machine learning and of the darwinian evo algo ...) tell the rule-set how to stitch together lots of coin-toss outputs. A hint of inspiration for that: David Deutsch's Constructor theories. The laws of physics as an entropy limitation on the allowed elementary-algo cumulative output. For information is a verb, not a noun - isn't it? A very interesting philosophical perspective on the algorithm as randomness constrictor rises up...
So, if the Algoverse ocean water is made of elementary coin-toss molecules, being ''liquid'' is just another phase or aggregate state .
There is a deep duality between probabilistics and logic - just like the zoo of dualities discovered at an accelerating pace by mathematical physics in the last decades. Probability/statistics we now make by logic; the reverse ... well, nobody has cracked it yet. Not even Kolmogorov. But I bet we will. Most probably the breakthrough will be Ohad Asor name-labeled... To find the know-how to do it the other way round: do logic with probability/statistics. The statistically algorithmic way - not the SAT, brute-force, alchemist way as with NN/ML and other known beasts. This will be nothing less than a full merger of maths/logic/philosophy/thought... and physics. Literally!
Excuse me for the haiku  simplification. It is deliberate due to realization of my grok constraints. :) Regard it as sharing a poetic impression.
Is there a deeper and weaker algo than the digital one - the radix-2, deterministic, unitary one? Intuition says ''yes, of course!'' Like these radix-1 Half-coins of negative and other non-unitary probabilities ... which take two tosses to yield a bit... and there must be a transfinity of lower ones, also transfinities of higher and sideways ones ... which is almost as counter-intuitive as Dirac's bottomless night of negative energy, but I bet just as useful. (Let's not even touch numeral bases of Pi, i, e ... etc.) And let's stick to strictly binary 'water' for our oceanic toy-universe, for the sake of sanity.
The next important notion of the Algoverse oceanic model is the Algorithmic strength - the weakest algo would be the one which takes an infinity of tosses to get a full bit. The strongest?
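The 'half-coins' above are speculative, but there is a concrete classical cousin of a coin that yields less than one full bit per toss: a biased coin plus von Neumann debiasing, which consumes two or more raw tosses per fair output bit. A quick sketch of mine, measuring that 'algorithmic strength' as fair bits per raw toss:

```python
import random

def biased_coin(rng, p=0.3):
    """A 'weak' coin: 1 with probability p, so < 1 bit of entropy per toss."""
    return 1 if rng.random() < p else 0

def von_neumann_bits(tosses):
    """Debias by pairing tosses: (0,1) -> 0, (1,0) -> 1, equal pairs discarded."""
    return [a for a, b in zip(tosses[::2], tosses[1::2]) if a != b]

rng = random.Random(42)
tosses = [biased_coin(rng) for _ in range(10_000)]
bits = von_neumann_bits(tosses)
print(len(bits) / len(tosses))  # ~0.21 fair bits per raw toss for p = 0.3
```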
Algorithmic ephemeralization - essentially to do more with less. Or faster - the Speed Prior ... which is just another way to say 'more'.
Some algos are too strong - QM, M-Theory - they return way too many bits per 'toss'. Their VC dimension converges to infinity. Exponential walls in all directions. Not exactly what Freeman Dyson had in mind ... In our ''mockup'' they could be depicted as too hot. Changing the phase of the elementary algo 'water'. Like:
but because we are all for peaceful use of algorithmic energy - we reject those up here, too - together with the non-unitary statistics down there.
Last piece of the picture - the Algoverse ocean is habitable and inhabited!
By higher algos as life-forms, stronger - but not so strong as to turn the 'water' into roaring steam or plasma.
Examples: Calculi, geometries, algebras ... software. The genetic inter-algo connection should be that calculus came from the heads of Leibniz and Newton and numerous unknown others ... but it is the blind watchmaker of evolution which put those heads together ... (I disagree with Dawkins only in that evolution and design are both algorithms - alternatives, not opposites).
Thus entropically  and combinatorially  algos kinda-sorta come from one another - the stronger from the weaker.
The stronger ones are the life-forms living in that ocean. Cause randomness permeates everything, doesn't it?
Not so far-fetched a metaphor, given the fact that any Effector-ing has a totally algorithmic nature and essence.
How much higher a 'life form' is Tauchain in the Algoverse ocean?
Is it a mere life form or ... life itself, a new organizing principle to reform the whole system?
''Thinking by Machine: A Study of Cybernetics''
by Pierre de Latil 
Published by Houghton Mifflin Company in 1957 (c.1956), Boston.
Foreword by Isaac Asimov (then only 36 years old)! Recommendation by the legendary mathematician and cyberneticist Norbert Wiener (then 62 years old)! ... A true jewel! The book is described as:
A review of "the last ten years' progress in the development of self-governing machines," describing "the principles that make the most complex automatic machines possible, as well as the fundamentals of their construction."
Nineteen fifties!! The midway point between the first digital computer, made by my half-compatriot John Atanasoff, and the internet. Almost a human generation's span between the former, the book, and the latter event. An epoch so deep in the past that even television, air travel, rockets and nukes ... were young then.
Same Kondratieff wave phase, btw, which hints at the historical rhyming of socially important intellectual interests. (On how K-waves imprint on the humanity growth curve - in a series of other posts to come.)
I must admit here that I've never laid my hands and eyes on this book. But it is stamped into my mind and memory via Stanislaw Lem - one of the greatest philosophers of the 20th century, working in the disguise of a sci-fi writer for having been caught on the wrong side of the Iron Curtain.
''Summa Technologiae'' (1964) is a monumental work of Lem's, where most of the issues discussed sound more contemporary today than they did more than half a century ago when it was written - and in many respects we are still in its deep past ...
... Lem reports and discusses the following from the aforementioned Pierre de Latil book:
''As a starting point will serve a graphic chart classifying effectors, i.e., systems capable of acting, which Pierre de Latil included in his book Artificial Thinking [P. de Latil: Sztuczne myślenie. Warsaw 1958]. He distinguishes three main classes of effectors. To the first, the deterministic effectors, belong simple (like a hammer) and complex devices (adding machine, classical machines) as well as devices coupled to the environment (but without feedback) - e.g. automatic fire alarm. The second class, organized effectors, includes systems with feedback: machines with built-in determinism of action (automatic regulators, e.g., steam engine), machines with variable goals of action (externally conditioned, e.g., electronic brains) and self-programming machines (system capable of self-organization). To the latter group belong the animals and humans. One more degree of freedom can be found in systems which are capable, in order to achieve their goals, to change themselves (de Latil calls this the freedom of the "who", meaning that, while the organization and material of his body "is given" to man, systems of that higher type can - being restricted only with respect to the choice of the building material - radically reconstruct the organization of their own system: as an example may serve a living species during biological evolution). A hypothetical effector of an even higher degree also possesses the freedom of choice of the building material from which "it creates itself". De Latil suggests for such an effector with highest freedom - the mechanism of self-creation of cosmic matter according to Hoyle's theory. It is easy to see that a far less hypothetical and easily verifiable system of that kind is the technological evolution.

It displays all the features of a system with feedback, programmed "from within", i.e., self-organizing, additionally equipped with freedom with respect to total self-reconstruction (like a living, evolving species) as well as with respect to the choice of the building material (since a technology has at its disposal everything the universe contains).''
Longish quote, but every word in it is worth it. When I read this as a kid back in the 1980s ... the next, seventh, logically higher effector class immediately came to my mind: the worldmaker!!
The degrees of freedom of all six previous classes in de Latil's classical taxonomy are confined by the rule-set: the local laws of physics.
They are prisoners of a universe. Like birds incapable of reconfiguring their cage into a roomier and cozier one.
If we regard the laws of nature as code or algorithm, my 7th-level effector will be capable of drafting and implementing itself onto newer and stronger algorithmic foundations. (Note the seamlessness between computation and robotics in the de Latil/Lem categorization construct - quite logical indeed, having in mind that software is a state of hardware, that matter-form-action are inextricable from each other; but on this in a series of other posts ...). Unbound?
So, I wonder:
Where, you reckon, is Tauchain placed on de Latil's effectors map?
To zoom out is useful. It puts the event networks of our spacetime in perspective. Including what the great Jorge Luis Borges called the Orbis Tertius:
''ORBIS TERTIUS. "Tertius" (Latin = third) is an allusion to: World 3: the world of the products of the human mind, defined by Karl Popper.''
Poetically stated, ''retrodiction studies'' enable us to get a glimpse of the "clear, cold lines of eternity".
Back in the 20th century, Prof. Robin Hanson put together this extremely insightful and strong document:
Long-Term Growth As A Sequence of Exponential Modes,
The economy grows [see: Footnote]. Unstoppable.
Hanson's unprecedented contribution was to provide us with a systematic orientation tool for how and why the economy grows.
It accelerates. See:
Mode         Doubling    Date Began    Doubles   Doubles   Transition
Grows        Time (DT)   To Dominate   of DT     of WP     CES Power
-----------  ----------  ------------  --------  --------  ----------
Brain size   34M yrs     550M B.C.     ?         "16"      ?
Hunters      224K yrs    2000K B.C.    7.3       8.9       ?
Farmers      909 yrs     4856 B.C.     7.9       7.6       2.4
Industry     6.3 yrs     2020 A.D.     7.2       >9.2      0.094
The model identifies the past economy accelerators as:
- Neural networks, evolving toward a doubling of brain size every 30-ish megayears (hinting that a human level of intelligence is an inevitability, give or take 30 million years around the Now, by virtue of the good old 'coin-toss' Darwinian algorithm alone).
- The human as the top-of-the-food-chain predator since around 2,000,000 B.C. (the human mastering of the Fire and the Blade likely to credit), compressing the doubling time by over two orders of magnitude, down to a quarter of a million years.
- Food production and ecosystem manipulation (or rather the combination of farming, horse domestication and writing as accelerator components), leading to fewer than 40 human generations per economy doubling.
- All we know as division of labor, specialization, systematized Sci-Tech ... industry: the centralized ways of producing and controlling knowledge, leading to another hundredfold-plus compression, down to a mere ~decade of economy doubling time.
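The compression between modes can be checked with a quick back-of-envelope sketch. The doubling times are taken from Hanson's table above; the ~25 years per human generation is my own assumption, not Hanson's:

```python
# Doubling-time (DT) compression between Hanson's growth modes.
# DT values in years, from the table above.
modes = {
    "brain size": 34e6,
    "hunters": 224e3,
    "farmers": 909,
    "industry": 6.3,
}

names = list(modes)
for prev, new in zip(names, names[1:]):
    ratio = modes[prev] / modes[new]  # how much the new mode compressed DT
    print(f"{prev} -> {new}: DT compressed ~{ratio:,.0f}x")

# Assumption: ~25 years per human generation.
print(f"farmers: ~{modes['farmers'] / 25:.0f} generations per doubling")
```

Each transition indeed compresses the doubling time by roughly two orders of magnitude, and the farming mode lands below 40 generations per doubling, as stated above.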
Recommended: digest each Hanson Engine (economy accelerator drive) with Bob Hettinga's 'enzyme':
My observation about networks in general is a rather obvious one when you think about it: our social structures map to our communication structures. As intuitive as it is to understand, this observation provides great insight into where the technology of computer assisted communication will take us in the years ahead.
Connectivity specs as indicator and drive.
Now, when we leave the past and use these models to gaze into the future, the really interesting stuff comes out.
Aside from explaining the overall trajectory of the economy, detected by Brad DeLong in his also-monumental paper, the nucleus of meaning in Robin Hanson's paper is:
Typically, the economy is dominated by one particular mode of economic growth, which produces a constant growth rate. While there are often economic processes which grow exponentially at a rate much faster than that of the economy as a whole, such processes almost always slow down as they become limited by the size of the total economy. Very rarely, however, a faster process reforms the economy so fundamentally that overall economic growth rates accelerate to track this new process. The economy might then be thought of as composed of an old sector and a new sector, a new sector which continues to grow at its same speed even when it comes to dominate the economy.
Visualize: a Petri dish whose size and sugar supply are expanded by the accelerating growth of the bacterial culture within it.
Hanson actually predicted, nearly a quarter of a century ago, ... something that is relentlessly coming:
In the CES model (which this author prefers) if the next number of doubles of DT were the same as one of the last three DT doubles, the next doubling time would be ... 1.3, 2.1, or 2.3 weeks. This suggests a remarkably precise estimate of an amazingly fast growth rate. ... it seems hard to escape the conclusion that the world economy will likely see a very dramatic change within the next century, to a new economic growth mode with a doubling time perhaps as short as two weeks.
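Hanson's arithmetic is easy to retrace with the rounded values from the table above (his exact figures differ slightly, which is why the results land near, not exactly on, his 1.3, 2.1 and 2.3 weeks). The assumption: the next growth mode compresses the doubling time by the same factor, 2 to the 'doubles of DT' power, as one of the last three transitions:

```python
# Back-of-envelope retrace of Hanson's "two weeks" extrapolation.
DT_INDUSTRY_YEARS = 6.3           # current (industry) doubling time
DOUBLES_OF_DT = [7.3, 7.9, 7.2]   # 'Doubles of DT' column, last three modes
WEEKS_PER_YEAR = 52.18

for d in DOUBLES_OF_DT:
    # Next DT = current DT compressed by the same factor as transition d.
    next_dt = DT_INDUSTRY_YEARS * WEEKS_PER_YEAR / 2 ** d
    print(f"doubles of DT = {d}: next doubling time ~ {next_dt:.1f} weeks")
```

All three scenarios put the next economy doubling time on the order of one to two weeks.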
An economy accelerator avalanche is roaring down the slope of time towards us.
A brand new Hanson Engine is about to leave the assembly line.
Tau, is that you?
FOOTNOTE: To wrap the above statements in the flesh of the deep thesaurus of content on which they lie would conservatively consume hundreds of pages, even if only briefed. I promise to come back to these subtopic meaning-expansions (by referring back to here) with a series of posts in the months to come, tying up with the notions of: economy as a network, network as computer, what exactly it processes and outputs, economy (like the universe or life) being an endogenously driven, positive-feedback-loop, self-amplifying, non-equilibrium, entropic, combinatorial-explosion system, wealth as economy-complexity growth in relation to GDP size and the intimate dollars-joules connection in energy intensity, physical and economic limits of growth, self-reinforcing predator-prey models, knowledge as synonymous with skill, economic cycles upon the DeLong curve ... to name a few. Readers' questions and comments will of course help a lot with subtopic prioritization, and will boost understanding (incl. mine). Thank you in advance!
NOTE: I currently have the pleasure and honor to be part of the Tau Team, but this post contains ONLY my personal views.
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.