“We are moving into an era where cities will matter more than states and supply chains will be a more important source of power than militaries — whose main purpose will be to protect supply chains rather than borders. Competitive connectivity is the arms race of the 21st century.”
-- Parag Khanna
A network is made of lines and switches, right?
Much has been told about network scaling effects, including attempts by myself [4-12] ... which compels me to introduce the not-so-frivolous notion of network forces.
These forces are expressed in several laws. I initially thought to write 'forces' and 'laws' in quotes here, but I realize they are quite objective and physical emergenta, indeed.
In my ''Geodesic by Tauchain'' article from a couple of months ago I emphasized the Huber-Hettinga Law: how the cost of switching literally defines the 'orographic' topology of a network.
The cheaper the routing - the flatter the network.
Expensive switches = hierarchy, verticality, power, control, obedience, centralization, 'world is fiat', sollen, hence borders instead of bridges, limitations instead of stimuli, exclusivity ...
Cheap switching = geodesic society, 'world is flat', horizontality, p2p, decentralization, inclusivity ...
The more vertically centralized a network is - the more it must deplete information: omit and ignore calls from the deeps, or even actively suppress and silence nodes. It copes with the stream by strangling it, simply due to lesser capacity, fewer degrees of freedom. Geodesic networks possess higher entropy and are therefore richer. They bolster both the Scrooge and Spawn factors higher. In other words:
The flatter the network - the richer it is.
Maybe that is the explanation of why the wealthiest-healthiest societies tend to be those with the biggest economic-political freedom.
Naturally the Huber-Hettinga Law led me to the elementary-Watson conclusion about the power and value of Tau as the ultimate über-switch. So far so good.
Now let's stare at the Lines. Here comes Nick Szabo.
Nick Szabo - a lawyer AND computer scientist - is a legendary figure from the great 'Archaic era of crypto' - the 1990s, when he, together with the other cypherpunk titans like Tim May, Wei Dai, Bob Hettinga etc. etc., poured in staggering detail the very baserock foundations of what we now enjoy as Crypto in the post-Satoshi era.
It is THEIR vision come true that we all now live in.
Bitcoin was a detonation of precisely that critical mass of fused thoughts, of precisely these very smart people, piled up and compressed by the connective network forces of the early internet.
No, I do not mean at all Szabo's most famous thing - the 1994 coining of the term 'smart contracts'. In fact I deeply and strongly reject the very notion of 'smart contracts' - as utter nonsense, even as an oxymoron - which is a yuge separate problem, one which I suspect I have nailed, and which I'll address in a series of dedicated articles starting in the upcoming weeks...
I mean something much more valuable, what I call the Szabo Law.
When we hear the phrase 'network effects', the first thing that comes to mind is the famous Metcalfe's Law.
''Metcalfe's Law is related to the fact that the number of unique connections in a network of a number of nodes (n) can be expressed mathematically as the triangular number n(n − 1)/2, which is proportional to n² asymptotically (that is, an element of O(n²)).''
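To make the quadratic growth tangible, here is a toy illustration of my own (in Python, nothing Tau-specific):

```python
# Unique connections among n nodes: n*(n-1)/2, an element of O(n^2).
def unique_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 20, 40, 80):
    print(f"n={n:>2}: links={unique_links(n)}")
# n=10: links=45
# n=20: links=190
# n=40: links=780
# n=80: links=3160  -> every doubling of the nodes roughly quadruples the links
```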
In the above order of appearance, these network-force laws quantify the basic properties of a network:
- Huber-Hettinga Law - the cost of switches and routing.
- Metcalfe Law - the number of nodes, i.e. switches defining the number of unique connections or lines.
- Szabo Law - the cost of the lines and connecting.
All these Laws are scaling laws. Before we come back to and continue on Szabo's Law, we have to briefly mention another one:
''So what is “scaling”? In its most elemental form, it simply refers to how systems respond when their sizes change. What happens to cities or companies if their sizes are doubled? What happens to buildings, airplanes, economies, or animals if they are halved? Do cities that are twice as large have approximately twice as many roads and produce double the number of patents? Should the profits of a company twice the size of another company double? Does an animal that is half the mass of another animal require half as much food?''

''With Dirk Helbing (a physicist, now at ETH Zurich) and his student Christian Kuhnert, and later with Luis Bettencourt (a Los Alamos physicist now an SFI Professor), Jose Lobo (an economist, now at ASU), and Debbie Strumsky (UNC-Charlotte), we discovered that cities, like organisms, do indeed exhibit “universal” power law scaling, but with some crucial differences from biological systems. Infrastructural measures, such as numbers of gas stations and lengths of roads and electrical cables, all scale sublinearly with city population size, manifesting economies of scale with a common exponent around 0.85 (rather than the 0.75 observed in biology). More significantly, however, was the emergence of a new phenomenon not observed in biology, namely, superlinear scaling: socioeconomic quantities involving human interaction, such as wages, patents, AIDS cases, and violent crime all scale with a common exponent around 1.15. Thus, on a per capita basis, human interaction metrics (which encompass innovation and wealth creation) systematically increase with city size while, to the same degree, infrastructural metrics manifest increasing savings. Put slightly differently: with every doubling of city size, whether from 20,000 to 40,000 people or 2M to 4M people, socioeconomic quantities – the good, the bad, and the ugly – increase by approximately 15% per person with a concomitant 15% savings on all city infrastructure-related costs.''
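A back-of-the-envelope sketch of my own (treating the quoted exponents as illustrative constants) shows both regimes at once:

```python
# Power-law scaling Y = Y0 * N**beta, with the exponents quoted above.
# Per-capita values scale as N**(beta - 1): up for beta > 1, down for beta < 1.
BETA_SOCIO = 1.15   # wages, patents, crime ... (superlinear)
BETA_INFRA = 0.85   # roads, cables, gas stations ... (sublinear)

def per_capita(n: int, beta: float) -> float:
    return n ** (beta - 1.0)

for n in (20_000, 40_000, 2_000_000, 4_000_000):
    print(f"N={n:>9,}: socio/capita ~ {per_capita(n, BETA_SOCIO):.3f}, "
          f"infra/capita ~ {per_capita(n, BETA_INFRA):.3f}")
# Every doubling multiplies socio/capita by 2**0.15 and divides infra/capita
# by the same factor - the per-doubling gain and saving West rounds to ~15%.
```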
Which probably comes to denote the sheer size of the network in STEM (space, time, energy, mass). I'm not sure, but I have some strong suspicions about the unity of matter, structure and action, which I will expose and share some other time.
What I call Szabo's Law is revealed in his ''Transportation, divergence, and the industrial revolution'' (Thu, Oct 16, 2014): similarly to Metcalfe's (''double the population, quadruple the economy''), there is a power-law correlation between the cost of connections - or links, or lines - and the value of the network, too:
''Metcalfe's Law states that a value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is an inverse square of the cost per mile of transportation. Combine this with Metcalfe's Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith's observations: the division of labor (and thus value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he extensively discussed in his Wealth of Nations).''
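Szabo's chain of reasoning is compact enough to compute directly. A minimal sketch of my own (not Szabo's code; cost in arbitrary units per mile):

```python
# Szabo's argument, step by step:
#   economic range r ~ 1/cost  ->  reachable area (and node count) ~ r**2
#   Metcalfe: value ~ nodes**2 ->  value ~ cost**-4
def reachable_nodes(cost: float, density: float = 1.0) -> float:
    return density * (1.0 / cost) ** 2

def potential_value(cost: float, density: float = 1.0) -> float:
    return reachable_nodes(cost, density) ** 2

print(potential_value(1.0))  # 1.0   (baseline)
print(potential_value(0.5))  # 16.0  (halve the transport cost -> 16x the value)
```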
My encounter with this article of Nick Szabo's was a goosebumps experience for me, because it coincided with a series of lay rants of mine on Tau's old Zennet IRC chat room that ''computation = communication = transportation''. Somewhere in 2016, as far as I remember. :)
Maybe it was the last drop that shaped my conviction that by my dedicated involvement in both Tau and ET3, I'm actually working on ... one and the same project.
For communication, computation and transportation are modes of state change. Because information is a verb, not a noun. And software is states of hardware.
''Decentralizing the internet is possible only with decentralized physical infrastructure.'' 
Just like the brain is a network computer of neuron nanocomputers, the emergent composite we colloquially call humanity or mankind or economy or society or world ... is a network computer made of all of us billions of humans.
Brains do thought, economies do wealth.
Integrated circuitry upon the face of planet Earth as a motherboard. Literally. Humanity's planet-hardware. Parag Khanna's Connectography explained.
The Earth is definitely not our ultimate chip carrier. Probably there ain't any limit at all to our culture-upon-nature hardware upgrades. The universe is our computronium and we've been here too short and haven't seen far enough. Networking is connectomics. And thus it always also is metabolomics.
Remember my last month's  ''Tauchain the Hanson Engine''?
The series of exponentially shortening growth doubling times looks driven by transportation technological singularities: domestication of the horse, oceanic navigation, the combustion engine ...
In the light of all the network forces summoned above: the planet Earth, viewed as a giant computer chip ...
- is itself subject to the relentless network-entropic force of Moore's Law.
The network forces accelerate what that wealth computer does.
Two quick examples:
A.: The $1500 sandwich, as proof that trade+production is at least thousands of times stronger in sandwich-making than production alone.
B.: The example of Eric Beinhocker in his 2006 ''The Origin of Wealth'' about two contemporary tribes: the Amazonian Yanomami - a stone-age population to this day - and the East Coast Manhattanites. The former are only about 100 times poorer, but the latter enjoy a billions of times bigger choice of things to have.
Tauchain 'threatens' to affect the parameters of ALL the network-force formulae mentioned herewith on a mind-bogglingly big scale.
Simultaneously, by orders of magnitude:
- lower switch cost
- higher node count
- lower connection cost
A wealth hypercane  recipe. Perfect value storm. Future ain't what it used to be .
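Taken together - and purely as a toy model of my own, which none of the cited authors propose in this exact form - the three knobs compose multiplicatively:

```python
# Toy composite of the three network forces above (illustrative only):
#   Metcalfe      : value grows with n*(n-1)/2 pairwise links
#   Szabo         : value grows with link_cost**-4
#   Huber-Hettinga: qualitative in the text; modeled crudely as 1/switch_cost
def toy_value(n: int, link_cost: float, switch_cost: float) -> float:
    return (n * (n - 1) / 2) * link_cost ** -4 / switch_cost

baseline = toy_value(1_000, 1.0, 1.0)
improved = toy_value(10_000, 0.1, 0.1)  # 10x nodes, 10x cheaper links and switches
print(f"{improved / baseline:,.0f}x")   # ~10,000,000x - seven orders of magnitude
```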
More on partial evaluation - How does partial evaluation work and why is it important? By Dana Edwards. Posted on Steemit. December 23, 2017.
I have been asked the question of what partial evaluation is. Partial evaluation is one of the core components behind the Tau Meta Language. In order to understand Tauchain and TML we have to do a little digging to understand not just partial evaluation but, specifically, partial fixed point logic.
Self interpretation and self definition are the core of what makes Tauchain unique, and no other crypto - or really any technology outside of academia - will have that. Partial fixed point logic will be discussed in this blog post along with Futamura's paper on partial evaluation.
Partial Evaluation of Computation Process--An Approach to a Compiler-Compiler
Futamura's paper discusses partial evaluation as an approach to a compiler-compiler. A compiler, as many programmers know, is a lot like a translator. A compiler translates "source code" written in a high-level formal language into "machine code", which is a lower-level language. This translation process is what allows human "programmers" to communicate in a language which the machine can understand without having to speak directly in machine code at the lowest level. With this in mind we can now understand that a compiler-compiler allows humans to describe, define, and compile a compiler, in essence allowing humans to create new programming languages.
Partial evaluation is a means of building a compiler-compiler that allows this. Partial evaluation of a computation starts from the formal description of the semantics of a programming language - which is known as an "interpreter" - and allows that "evaluation procedure" to be used for defining the semantics of a programming language. To simplify it down: the interpreter allows advanced users of TML to define the semantics, also known as the "meaning", of a programming language. This gives the user of TML a lot of flexibility, to in essence define their own programming language and then compile it (compiler-compiler). A meta-compiler is what the paper describes as the ability to compile a particular language, where the partial evaluation process is where the meaning behind the semantics is defined.
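To make this concrete, here is a deliberately tiny hand-rolled specializer (my own illustration in Python, not TML): it partially evaluates exponentiation on a statically known exponent and emits a residual program with the loop unfolded away. The first Futamura projection applies the same move to an interpreter: specialize the interpreter on a fixed source program, and what falls out is, in effect, a compiled program.

```python
# Partial evaluation by hand: the exponent n is static (known now), the
# base x is dynamic (known only at run time). All work depending on n is
# done at specialization time; the residual program has no loop and no n.
def specialize_pow(n: int) -> str:
    body = " * ".join(["x"] * n) if n > 0 else "1"
    return f"def pow_{n}(x):\n    return {body}\n"

src = specialize_pow(3)
print(src)              # def pow_3(x): return x * x * x
ns = {}
exec(src, ns)           # "load" the residual program
print(ns["pow_3"](5))   # 125 - same answer, computed by the specialized code
```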
A programming language is both syntax and semantics. So, for future reference: after parsing (syntax analysis) is complete, the meaning comes from the semantics through "semantic analysis". Partial evaluation pertains to the semantic-analysis portion, which takes place after syntax analysis, otherwise called parsing. So we start with source code, which is input into a parser, which feeds the generator, which at that point receives input from the partial evaluator just prior to the final stage of compilation into "machine code".
The significance of partial evaluation in Tauchain and the interesting features of partial fixed point logic
Partial evaluation is a bit of a tweak on what is usually called a compiler-compiler or parser generator. Partial fixed point logic is where the magic of TML is supposed to happen, and by magic I mean the main selling points such as being self-defining, decidable, etc. A quick Google search for fixed point logic shows that fixed point logic has a role in model checking, which is a critical design feature for Tauchain. We also learn that partial fixed point logic is more expressive on infinite structures than inflationary fixed point logic.
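For a feel of what fixed-point logics compute, here is the textbook example, sketched by me in Python (this one is a least fixed point; partial fixed point logic generalizes to operators that need not be monotone, where the iteration either stabilizes or the result is defined as empty):

```python
# Transitive closure by fixed-point iteration: apply the operator until
# nothing changes. Reachability like this is definable in fixed-point
# logics but not in plain first-order logic.
def transitive_closure(edges: set) -> set:
    closure = set(edges)
    while True:
        step = closure | {(a, d) for (a, b) in closure
                                 for (c, d) in closure if b == c}
        if step == closure:     # fixed point reached
            return closure
        closure = step

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```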
The critical paper comes from Imhof, titled "Logics that define their own semantics". It is almost magical in that in this paper we are presented with a logic which is self-definable by its nature as well as decidable. So the literature is clearly showing that theory backs TML. TML likely will be used to produce a partial fixed point logic solver, which, through the unique properties of this logic, will give us the magical properties promised for Tauchain. To understand partial fixed point logic fully requires a lot more in-depth study, but this blog post will point potential developers in the right direction so that the most basic questions are answered.
What is the significance of this breakthrough? This is the part which is hardest for me, because it opens the door to so many opportunities and so much potential. For example, what can developers do with the ability to inject any logic they wish, whether partial fixed point logic or some other? What does this mean for Tauchain, which can now support multiple logics? What does this mean for Agoras, which can be built using a logic that is decidable yet expressive? And PSPACE complexity - what will this allow for developers?
I'm unable to answer those questions sufficiently, but I think this is a much bigger deal than the mere "Turing complete" status which we commonly see in current state-of-the-art blockchain tech. In a way this represents the next level: a state-of-the-art meta language and compiler-compiler where the flexibility is in the ability to communicate from human to machine.
Futamura, Y. (1999). Partial evaluation of computation process—an approach to a compiler-compiler. Higher-Order and Symbolic Computation, 12(4), 381-391.
Kreutzer, S. (2002, September). Partial fixed-point logic on infinite structures. In CSL (pp. 337-351).
Imhof, H. (1999). Logics that define their own semantics. Archive for Mathematical Logic, 38(8), 491-513.
Source: Original post written by Dana Edwards. Published on Steemit. December 23, 2017.
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.