In a recent article of mine I hinted at my strong suspicion that scaling is itself scalable.
''Scaling is a problem. Scaling must be scalable, too. Metascale from here to Eternity.''
No matter what a terrific grower a system is - as per its own internal algorithmic growth-drive rules - it seems inevitable that its growth will run it into entropic mutualization upon impact with a kind of ... downscaler.
Scaling is everything, yeah. But it is quite intuitive, and supported by too big a body of evidence to ignore, that, paradoxically, the faster a thing grows, the sooner comes its encounter with an external and bigger downscaling factor.
This realization, refracted through the prism of our 'reptilian brain' layer and amplified to gargantuan proportions by our inherent social hierarchicity, is the source of the 'Malthusian anxiety' which has led to countless violent deaths throughout human history. Fear is anger, so the feeling that there is only so much to go around, and that the catastrophe of 'running out' of something is imminent, is the major source of what makes us bad to each other.
There is a plethora of examples of very well mathematically and scientifically grounded doomsayer scenarios, and we must admit that they are all correct as per their internal axiomatics - and simultaneously they are all totally wrong for missing the obvious: the factors of externalities, the properties and opportunities of the medium which is consumed and/or created by this growth, and which transcend the axiomatics. For growth is always 'growth into'. The fact that doomsday scenarios are so compellingly consistent internally is what makes them such a strong and dangerous ideological weapon of mass destruction.
Let's lay out some such problem-solution couples for clarity:
a. the world of the 1890s, with big cities sunk knee-deep into beast-of-burden manure, and the super-apocalyptic projections thereof VS Tony Seba's one-picture-worth-a-thousand-words of the NYC carts-vs-cars situation in 1900-1913 ...
b. the grim visions of the whole of Mankind becoming telephone switchboard workers - whose number, by blind extrapolation, should have exceeded the total world population by now to achieve the present level of telephonization; or
c. the all-librarians world, where it would take more librarians than the whole of mankind to serve our social memory in the paper-and-printed-ink storage mode ...
d. the Club of Rome as the noisiest modern bird of ill omen, with 'projections' based on the same blind extrapolations as the urban seas of manure, or the 'proofs' of the impossibility of connecting or educating or feeding everybody - instigating the mass-destruction fear that ''we are running out of everything and will soon all die'', used as justification for mass atrocities - VS Julian Simon's ''The Ultimate Resource'' (1981, 1996). Cf. my accelerando article, and see what precisely is the Factory for the succession of better and better Hanson drives over the last few million years, from the Blade and the Fire to the Tau. Identifying that same thing is what turned Julian Simon from a fanatical Malthusian into a rationally convinced Cornucopian ... the human mind.
e. the predator-prey model, whose brutal flaw this pseudo-haiku, I guess, depicts best:
''hawk eat chick -> less chick, human eat chick -> more chick''
for failing to posit, and failing to account for, the positive feedback loop of predator-over-prey dynamics ...
f. The comment of Daryl Oster, founder of the other passion of mine - ET3 - on the so-called 'saturation' of scalables (exemplified in the field of transportation which, btw, being communication ... our social structures map onto the mobility systems we have at our disposal):
''... US transportation growth has focused on automobile/roads (and airline/airport) developments. (And this has been VERY good for the US economy.) The reason is that cars/jets offered far better MARKET VALUE than horse/buggy/train transport did 150 years ago. In the mid 1800s, trains displaced muscle power for travel between cities - because trains offered better market value than ox carts. Trains reached 'market saturation' about 1895 to 1905 (becoming 'unsustainable') - however 'market momentum' produced 20 years of 'overshoot'. Cars/jets were far more sustainable than passenger trains and muscle power, and started to displace trains (and finish off horses). By 1916 the US rail network peaked at 270,000 miles (today less than 130,000 miles is in use). Just like passenger trains hit market saturation, roads/airports are reaching economic limitations. The time is ripe for a market disruption, and all indicators (past and present) say it will NOT come from, or be supported by, government or academia -- but from private sector innovations that offer a 10x value improvement (like ET3), AND also offer incentives for most (not all) key industries to participate (like ET3). Automated cars, smart highways, and electronic ride sharing are industry responses that will contribute to overshoot of cars/roads for the next 5-10 years. The main problem I see with the education system is that academic research and publication on transportation is primarily funded by status quo industries like: railroads and rail equipment manufacturers, highway builders, automobile/truck manufacturers, engineering firms, etc. -- all of whom fund research centered on 'improving' the status quo. Virtually all universities (for the last 1k years+) are set up to drive incremental improvements that industry demands, and virtually all paradigm shifts are resisted until AFTER they occur and are first adopted by industry.
Government is the same (for instance in 1905 passing laws to forbid cars that were disrupting horse traffic; or in 1933 passing laws to limit investment in innovation startups to the wealthy (those successful in the status quo)).''
g. The Darwinian algo sqrt(n) VS higher algos - like Metcalfe's n^2. This is not precise; it is more metaphorical, to indicate the direction or scale of scaling rather than rigorous precision, but ... the former, figuratively speaking, takes 100 times more to put up 10 times more, and the latter takes 10 times more to return 100 times more ...
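A minimal numeric sketch of that contrast - assuming only the figurative relation described above (output growing as the square root of effort vs as the square of participants); the function names are mine:

```python
# Figurative comparison from item g: a "Darwinian" system whose output
# grows as the square root of effort vs a "Metcalfe" system whose output
# grows as the square of its participants. Purely illustrative numbers.

def darwinian_output(effort):
    return effort ** 0.5   # output ~ sqrt(effort)

def metcalfe_output(nodes):
    return nodes ** 2      # output ~ nodes^2 (Metcalfe's law)

# 100x more effort yields only 10x more output under sqrt scaling:
print(darwinian_output(100) / darwinian_output(1))   # 10.0

# 10x more participants yield 100x more output under n^2 scaling:
print(metcalfe_output(10) / metcalfe_output(1))      # 100.0
```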
h. Barter vs money. See the bottom of page 5, above the bottom-line notes, about the latter:
simplifies pricing calculations and negotiations from O(n^2) complexity to O(n) complexity
A demonstration of how one item out of a scaling barter system emerges as a specialized transactor and accelerator to transcale the barter economy. From within. Endogenously, as always. (Btw, an extremely strong document, where entire books read and internalized stand behind each tight and contentful sentence!)
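The quoted complexity claim is easy to make concrete - a minimal sketch counting how many price quotes an economy must maintain; the names `barter_prices` and `money_prices` are mine, not from the cited document:

```python
# With pure barter, every pair of goods needs its own exchange rate:
# n*(n-1)/2 pairwise prices, i.e. O(n^2).
def barter_prices(n_goods):
    return n_goods * (n_goods - 1) // 2

# With money, each good needs only one price in the money unit: O(n).
def money_prices(n_goods):
    return n_goods

for n in (10, 100, 1000):
    print(n, barter_prices(n), money_prices(n))
# For 1000 goods: 499500 pairwise barter rates vs just 1000 money prices.
```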
i. The heat death of the universe VS the realization that the 2nd law - the conservation law for entropy/information - does not allow that; the asymptoticity of the fundamental limits of nature; the fact that max entropy grows faster than/from/due to the actual entropy growth; that entropy is not disorder; and that at the end of the day it is an unbounded immortal universe ... because it's all a combinatorial explosion.
j. The Anthropic principle and the realization that it is extremely hard, if not impossible, to posit a lifeless universe ...
k. The Algoverse - my 'psychedelic' vision of the asymptotic inexorable hierarchy of the Dirac sea of lower algos, which take everything for almost nothing - up towards giving almost everything for almost nothing - Bucky Fuller's runaway Ephemeralization. Algorithms are things. Objects. Structure. Homoousic, or consubstantial, with their input and output. Things taking things and making things out of the former. Including other algos, of course! Stronger ones.
l. The Masa Effect. The Master of SoftBank, seeing how machine productivity is on an imminent course to massively overscale the human client base, and his apparent transcaling solution: to upscale the client base with bots and chips - with the very thing that scales supply in such a too-much way.
m. The Pierre de Latil 1950s and Stanislaw Lem 1960s (copied 1:1 by Tegmark) hierarchy. Of degrees of self-creating freedom of Effectors ...
n. Limits of growth - present in any particular moment and in any finitary setting of rules, but nonexistent in the infinity of rules upgradability. Like a cancer cell trapped in a cage of light vs ... photosynthesis.
o. Ray Kurzweil - static vs exponential thinking .
p. Craig Venter's Human Genome Project which, when it commenced in 1990, was ridiculed as bound to be unbearably expensive and to take centuries to finish - and it did: it cost a fortune unbearable by 1990 standards, and it did take centuries of subjective time as per the initial projection conditions, being completed in the year 2000.
q. Jeff Bezos' vision of a Solar-System-wide Mankind:
''The solar system can easily support a trillion humans. And if we had a trillion humans, we would have a thousand Einsteins and a thousand Mozarts and unlimited, for all practical purposes, resources.''
r. The 'wastefulness' of data centers and crypto-mining colocation facilities ... which is as funny as envying the brain for 'wasting' >25% of the body's energy. (Btw, the tech megatrend is exponentially and relentlessly towards the minimum calculation energy.)
s. The log-scale intuitive measure and smooth straight-line visualization coming out of this quote, which I fished off the net a long time ago:
"The singularities are happening fairly regularly but at an increasing rate, every 500 to 1000 billion man-years (the total sum of the worldwide population over time). The baby boom of the 1950s is about 200 billion man-years ago."
Oops! Go back to example q. With a population of 1 trillion humans, the 'singularities' would occur once a year?!
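The back-of-the-envelope behind that exclamation - a sketch assuming the quote's interval of 500-1000 billion man-years between 'singularities' and a rough current population of 8 billion (my assumption):

```python
# Man-years accumulate at (population) per calendar year, so the calendar
# time between "singularities" is interval_man_years / population.
def years_between_singularities(interval_man_years, population):
    return interval_man_years / population

today = 8e9      # rough current world population (assumption)
trillion = 1e12  # Bezos's Solar-System-scale population from example q

# At 8 billion people, a 500-1000 billion man-year interval spans decades:
print(years_between_singularities(500e9, today))      # 62.5 years
print(years_between_singularities(1000e9, today))     # 125.0 years

# At a trillion people, the same interval passes in a year or less:
print(years_between_singularities(500e9, trillion))   # 0.5 years
print(years_between_singularities(1000e9, trillion))  # 1.0 year
```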
t. the Tau  !!
I could continue with these examples ... forever [wink] - excuse me if I've bored you - but I think at least that minimum needed to be shown, and it is enough to grok the big picture.
Scaling is the solution. It is a problem, too. Its overcoming is what I dub 'Transcaling' for the purposes of this study.
Size matters. Scaling is the way. But the more general question is how a system handles change! This is so fundamental as to be at the very core of the definition of life and intelligence.
Tauchain is all about change handling!
Now, let's knit the 'blockchain' of all these example threads above into a knot, like the Norns do:
Dear friends, please scroll back to example d. Yes, the human-mind-transcaler thing. The Ultimate Resource thing.
We are the ultimate resource.
We the humans (and soon the whole zoo of our technological imitations and reproductions and transcendences of ourselves).
We as the-I are strong thinkers and creators - immensely more road lies ahead than has been traveled, yes - but still we, as the-I, are the momentary apex of the Effectoring business in the Known universe ... AND simultaneously, we as the-We are mediocre to outright dumb.
We are very far from properly scaling together. The Ultimate Resource is not coherent and is not ... collimated. Scattered dim lights, not a powerful bright mind-laser. Dispersed fissiles, not a concentration of critical masses.
We as the-We - paradoxically - persistently find ways to transcale our destinies using the power of the-I, but the-We itself does not handle the scaling well at all.
The individual human mind is the unscaled transcaler.
Tau is the upscaler of that transcaler.
I'll introduce herewith another 'poetic' neologism, which occurred to me as depicting the scaling props of a system, after the Scrooge factor of ''Tauchain - Tutor ex Machina'', and it is the:
Spawn  factor
- the capacity and ability of a system to grow through, despite, against, across, from and via changes. Just as the cuboid covers all rectangular things - squares, cubes, tesseracts ... regardless of their dimensionality - the Spawn Factor is meant as a generalization of all orders of scaling. A zillion light years from rigor, of course, as I am at least the same distance from my Leibnizization. For a lawyer to become a mathematician is what it is for a caterpillar to become a butterfly. :) Transcaling.
Tau transcends the infinite regress of orders of: scaling of scaling of scaling ... by being self-referential. Or recursive. 
What is the Spawn factor of Tau?
If you'll let me, I'll illustrate this with a poetic paraphrase of the famous piece of Frank Herbert's:
I will face my change. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the change has gone there will be nothing. Only I will remain.
Size matters. Some people object that it does not matter but rather has meaning. But meaning always matters, so it is the same.
The bigger the problems one solves, the bigger the gains. Big problems require big solutions. We live in a big universe, and our very survival consists in dealing with bigger and bigger problems, which require bigger and bigger solutions to cope.
But nevertheless, building big is hard, so we naturally prefer to create small things which can grow. Small in the sense of being both understandable and affordable to build. So the best fits are small solutions, cheap and easy to make, which scale out, or unfold, or unleash into big means to address big problems. Scaling is everything.
Scaling. Scalable! Scalability !!
The root word 'scale' possesses marvelous riches of meaning in the English language, with lots of poetics inside:
snake-skin epidermals - wisdom, memory, protection, rejuvenation, regeneration, eternity ...
hen to pan (ἓν τὸ πᾶν), "the all is one"
warrior armour - security, defense, power, strength.
weighing scales - a device to measure mass; unit, measure, account.
All very Blockchainy wording, without any shadow of doubt.
The scalability issues can be grokked via the following anecdote:
A bunch of workers on a construction site, and a huge log. The onsite manager commands a few of them to lift and move it. They try and object: ''Too heavy!''. The manager adds more and more workers, until they shout back again: ''Too short!''.
A few real examples - the first two bad, the last three excellent:
[a] I won't name this 'crypto'; I will just say it is named after a mythical element of the universe according to prescientific gnostic imaginations. Its core value proposition is to shovel meaningful computation into a thread of computation whose very value proposition is to be as random, meaningless and unidirectional (hard to do, easy to prove) as possibly possible - the blockchain. The theoretically most expensive form of computation. Visualize: cars and airplanes made of gold and diamonds, burning the most expensive perfumes. Or mass production of electricity by raising trillions of cats and hiring trillions of people to pet them, with a grid of pure gold wires to discharge and collect the electrostatics. Had they chosen the original Satoshi blockchain for their 'experiments' - where the futility of such an attempt would have become instantly clear and died out outright due to the impending unbearable cost - it would of course have been a fairer way to do it, and would have spared Mankind dozens of billions of dollars; but logically they preferred a 'controlled' blockchain of their own. In the sense that the guys with a vested interest in it have the power to hand-drive, stop, restart and vivisect it. The only use of this 'blockchain supercomputer' is ... tokenomics by Layering. Why was it at all necessary for a blockchain advertised as so good it could do all general computation, to be made so hairy and bushy with layered tokens??
[b] Another trio of chaps - I won't mention names again - were truly in awe of Satoshi's creation; so much so that they not just liked it, but wanted it and decided to have it. For themselves. All of it. So they rebelled, and forked out, and provided a 'scaling' errmm ... uhhh ... solution. By increasing the block size. Something Satoshi meditated on, extensively discussed with his disciples, and not accidentally decided to put the brakes on. Very recently the crypto news headlines said that the block-size-increase solution providers are eyeing ... Layering. The very thing they were furiously advocating the block size increase makes unnecessary. Because it is the solution, isn't it? Or maybe it just was. And is not anymore? Well, I'd say that all the so-called 'alts' - providing a rejuvenated clone of Bitcoin, tweaked here and there for a momentary ease of difficulty and transaction fees - suffer from one and the same problem: traveling back in time does not tell you the future.
[c] Let's jump half a century back in time. It is the 1960s. The very making of the internet. Computers are already here, and have scaled up in numbers enough for their networking to become a problem/juice worth the solution/squeeze. The birth of TCP/IP, and the report of its very makers. Of the solution for network scaling. Enjoy the ancient wisdom:
Initially, the TCP managed both datagram transmissions and routing, but as the protocol grew, other researchers recommended a division of functionality into protocol layers. Advocates included Jonathan Postel of the University of Southern California's Information Sciences Institute, who edited the Request for Comments (RFCs), the technical and strategic document series that has both documented and catalyzed Internet development. Postel stated, "We are screwing up in our design of Internet protocols by violating the principle of layering." Encapsulation of different mechanisms was intended to create an environment where the upper layers could access only what was needed from the lower layers. A monolithic design would be inflexible and lead to scalability issues. The Transmission Control Program was split into two distinct protocols, the Transmission Control Protocol and the Internet Protocol.
The layering made the Internet as we know it. By the simple trick of just one node needing to permit another. Unstoppable inclusivity!
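The principle in the quote can be caricatured in a few lines - a toy encapsulation sketch (not real TCP/IP framing; the string format and function names are mine), showing how each layer wraps the payload with its own header and only peels off its own:

```python
# Toy protocol layering: each layer prepends its header on the way down,
# and each layer strips exactly one header on the way up. Upper layers
# never look inside lower-layer headers - the essence of encapsulation.

def encapsulate(payload, headers):
    for h in headers:                  # e.g. transport, then network, then link
        payload = f"{h}|{payload}"
    return payload

def decapsulate(frame, n_layers):
    for _ in range(n_layers):
        _, _, frame = frame.partition("|")  # strip the outermost header
    return frame

frame = encapsulate("hello", ["TCP", "IP", "ETH"])
print(frame)                     # ETH|IP|TCP|hello
print(decapsulate(frame, 3))     # hello
```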
[d] The Mastercoin / Omni Layer :
«A common analogy that is used to describe the relation of the Omni Layer to bitcoin is that of HTTP to TCP/IP: HTTP, like the Omni Layer, is the application layer to the more fundamental transport and internet layer of TCP/IP, like bitcoin».
[e] The Lightning network (LN) :
The Lightning Network is a "second layer" payment protocol that operates on top of a blockchain (most commonly Bitcoin).
Satoshi spoke of 'payment channels' in his masterpiece. Foreseeing the way to scale.
An estimate of the power of LN layering:
''The bitcoin devs accept that eventually larger block sizes will be needed. The current transaction rate isn't going to cut it if people all over the world actually start using bitcoin daily. They estimate that eventually, if everyone in the world uses bitcoin and makes 2 transactions a day, but uses the lightning network, a 133mb blocksize will be needed. Without the lightning network, something like a 200gb (GIGABYTE) size PER BLOCK would be needed to accommodate that much usage.''
Layering upscales it with orders of magnitude higher efficiency.
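A quick sanity check of that claim, using only the two block sizes given in the quoted estimate:

```python
# Figures from the quoted estimate above.
with_lightning = 133e6     # bytes per block with the Lightning Network (133 MB)
without_lightning = 200e9  # bytes per block without it (200 GB)

ratio = without_lightning / with_lightning
print(round(ratio))  # ~1504x - roughly three orders of magnitude
```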
If Bitcoin is the 'first layer' and Omni and Lightning are the 'second layer', I see which one is the 'Zeroth Layer', and I also foresee the inevitability of the merger, or 'Amalgamation', of all second layers over all blockchains, so that the user will be able to transact everything into anything to anybody, without knowing or caring which chain is in use ... I have special nicknames for these and will come back to these topics in a series of future posts.
Enough examples, I reckon.
Postel's sacred Principle of Layering comes from the implementation-levels paradigm,
or Abstraction layering:
''separations of concerns to facilitate interoperability and platform independence''
In other words: delegate the task to that layer of the system which does the particular job best. We can generalize this into The Scaling Commandment. Only one is enough:
''Thou shalt not jam it all into a single layer!''
The Layer Cake architecture is literally ubiquitous across the Universe: biology, semantics, informatics ...
It seems that it is, if not the only way, at least THE way to scale.
Maybe someday we, the Humanity upscaled by Tauchain, will discover more powerful ways to Scale than Layering, but it is all we have for now.
Scaling is a problem. Scaling must be scalable, too.
Metascale from here to Eternity.
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.