The Code of Hammurabi (left), ~1754 BC, and digital rain (right), the computer code featured in The Matrix series, ~2000 AD. Composite by Laura Taylor. 
The Zeroth Chapter
There is The Book of Satoshi. Did you know that? It is even for sale on Amazon. :)
"Here is a phenomenal book which contains the complete collected writings of everything Satoshi ever posted about bitcoin."
A treasury of all the public writings of the very Blockchain creator - the pseudonymous Satoshi Nakamoto.
Still unexcelled by anything so far! For almost an entire decade - of our fast time, guys! - corresponding to a full millennium at the previous doubling rate, and to a quarter million years before that, or to 30 million years in prehuman time - as measured in Hanson units, or in log-scale man-years!! It took about as long to decode the entire human genome!
"Impressive. Most impressive."
The First Book of the Crypto Bible. :)
We are witnessing, literally in real time, the powerful coalescence of the Second Book. The Book of Ohad.
Leaving the pompous tone behind ... perhaps it was driven initially by an attempt to compensate for the anxiety of a heavy official responsibility - I have the honor and pleasure of facing the highly non-trivial task of doing Exegesis of the following three milestone documents by Ohad Asor:
A. ''The New Tau'' (Dec 30, 2017, 3:27 PM [updated Dec 30, 2017, 3:28 PM] by Ohad Asor) 
B. ''From Agoras to TML'' (Mar 13, 2018, 2:30 AM [updated Mar 13, 2018, 3:01 AM] by Ohad Asor) 
C. ''The Art of Self-Reference'' (May 27, 2018, 2:09 PM [updated May 27, 2018, 2:13 PM] by Ohad Asor) 
1610s, "explanatory note," from Greek exegesis "explanation, interpretation," from exegeisthai "explain, interpret," from ex "out" (see ex-) + hegeisthai "to lead, guide," from PIE root *sag- "to track down, seek out" (see seek (v.)). Meaning "exposition (of Scripture)" is from 1823. Related: exegetic; exegetical; exegetically.
Scary, eh? Isn't it?
This exercise is not meant to merely translate from Ohadian English into Karovian English - nothing so simplifying and depleting as that. :)
It is aimed at increasing understanding via something ... let's call it explanatory interferometry. A meaning zoom-in technique.
The method chosen is to put emphasis on the main points of the three texts above by extracting them into a few dozen explanatory Chapters. Plus a careful selection of tutorial materials, provided as resources.
No, Ohad will not control or edit me, but the reward is that if I make it, these Exegetic articles will enter the official Tau website blog.
Wish me luck!
And because "Tau is discussion about Tau" - I strongly rely upon your support. Your comments. Let's talk it over, let's argue! - nothing reveals meaning, i.e. grows shared understanding, better than a really good conversation.
The schedule? I intend to beam out the episodes of the Exegesis series on Mondays. Stay tuned!
The power of ambiguity and of ambiguity minimization in communication. By Dana Edwards on Steemit. June 1, 2018.
Formal communication benefits from ambiguity minimization.
So what exactly do I mean by formal communication? When we think of how human beings communicate with machines, it is in a formal language. This formal language requires minimized ambiguity for security analysis (how can we analyze code if we cannot reliably interpret it?). The other constraint is that machines require that if... then... else and similar conditional statements be well defined and unambiguous.
Is it possible to show that a grammar is unambiguous?
To show a grammar is unambiguous you have to argue that for each string in the language there is only one derivation tree. This is how it would be done theoretically speaking.
In computer science, an ambiguous grammar is a context-free grammar for which there exists a string that can have more than one leftmost derivation or parse tree, while an unambiguous grammar is a context-free grammar for which every valid string has a unique leftmost derivation or parse tree. Many languages admit both ambiguous and unambiguous grammars, while some languages admit only ambiguous grammars.
Specifically, we know that deterministic context-free grammars must be unambiguous. So we know unambiguous grammars exist. The strategy, then, is ambiguity minimization with regard to formal languages (such as computer programming languages).
For computer programming languages, the reference grammar is often ambiguous, due to issues such as the dangling else problem. If present, these ambiguities are generally resolved by adding precedence rules or other context-sensitive parsing rules, so the overall phrase grammar is unambiguous. The set of all parse trees for an ambiguous sentence is called a parse forest.
The parse forest is an important concept to note: the set of all possible parse trees for an ambiguous sentence. This concept is key to understanding the strategy of ambiguity minimization. So we can in practice minimize ambiguity, and we know for certain that deterministic context-free languages admit unambiguous grammars - but what does that mean? What are the benefits of unambiguous language in general?
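To make the parse-forest idea concrete, here is a minimal Python sketch that enumerates every parse tree of a token string under the classic ambiguous toy grammar E → E "-" E | NUM (my own illustration, not a grammar from any real programming language):

```python
# Toy ambiguous grammar:  E -> E "-" E | NUM
# Enumerating all parse trees of a sentence yields its "parse forest".

def parse_forest(tokens):
    """Return every parse tree of `tokens` under E -> E '-' E | NUM."""
    if len(tokens) == 1:               # base case: a lone number parses one way
        return [tokens[0]]
    trees = []
    # Try every '-' as the top-level operator; each choice gives new trees.
    for i, tok in enumerate(tokens):
        if tok == "-":
            for left in parse_forest(tokens[:i]):
                for right in parse_forest(tokens[i + 1:]):
                    trees.append(("-", left, right))
    return trees

forest = parse_forest(["1", "-", "2", "-", "3"])
print(len(forest))   # 2 trees: (1-2)-3 and 1-(2-3)
print(forest)
```

Because "1 - 2 - 3" has two distinct trees, the grammar is ambiguous; a precedence or associativity rule would collapse the forest to a single tree.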
A benefit of ambiguity minimization
Simple English is a form of controlled English designed to minimize ambiguity. This is important because by using Simple English to codify rules or write laws, we put them in a language where there is less computational expense (in brain power) to process and interpret the statements.
@omitaylor commented on one of my older blog posts, and in one of her later posts she asked about the topic of love. Specifically, her post was titled: "What Does LOVE Mean To YOU"
Her post highlights the fact that there are different love languages and that we don't all speak the same one. Ambiguity here is actually not a good thing: when someone speaks about love, how do we know they are talking about the same thing? As a result we often seek an agreed-upon, formally defined "love concept" that we all accept as love. This is not trivial to find, and consequently a topic like love is not easy to discuss in any serious manner. Unambiguous - or, to be more precise, ambiguity-minimized - communication would allow Alice to discuss the topic of love with Bob in a way where they both know exactly what the other is referring to in terms of behavioral expectations, emotions/feelings, etc.
If Alice agrees to love Bob then Bob has no way to determine what Alice means unless he and she agree on a mutually defined concept of love. This highlights how agreement requires very good communication and how minimizing ambiguity can be beneficial at least in this example.
Ambiguity minimization makes sense when you are following a principle of computational kindness. That is, if Alice would like to reduce the computational burden on Bob, she can minimize the ambiguity of her sentence. In order for Bob to interpret an ambiguous sentence, he must in essence sort all possible interpretations from most likely to least likely - and before he can even sort, he must first search to find all possible, or at least plausible, interpretations.
This is very computationally expensive for Bob but very cheap for Alice. Alice knows exactly what she means but Bob has no clue what Alice REALLY means.
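Bob's search-then-sort burden can be sketched in a few lines of Python; the sentence, the candidate readings, and their probabilities below are all made up for illustration (not from any real NLP model):

```python
# A toy illustration of the listener's burden: Bob must first SEARCH for the
# plausible readings, then SORT them by likelihood. All data here is invented.

sentence = "I saw her duck"

# Step 1: search - collect the candidate interpretations (with made-up priors).
interpretations = {
    "I watched the duck that she owns": 0.55,
    "I watched her lower her head":     0.40,
    "I used a saw on her duck":         0.05,
}

# Step 2: sort - rank the readings from most to least likely.
ranked = sorted(interpretations.items(), key=lambda kv: kv[1], reverse=True)
for reading, p in ranked:
    print(f"{p:.2f}  {reading}")
```

Alice pays nothing for the ambiguity; Bob pays for every reading he has to find and weigh.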
A benefit of ambiguity
There are other examples where increasing ambiguity could be beneficial, such as when the communication is less than formal, or when sharing a stream of consciousness without turning it into a formal communication. Humor, for example, rides on ambiguity, and a good joke may have multiple layers. Art also leverages ambiguity, because it may be meant to be interpreted twenty different ways, all to produce a certain desired effect.
Ambiguity allows more meaning to be packed into fewer words - in a sense, a sort of compression scheme. If a sentence has multiple possible meanings, the set of meanings is still finite; it is a fixed number of meanings, so theoretically a search can be conducted. In fact this is what a human being does when interpreting natural language, where a sentence can have multiple meanings: a search for all possible interpretations of that sentence. The problem is that this process is computationally expensive, at least for a human being trying to figure out every possible interpretation.
Lawyers, when they do their work, operate with a specific knowledge base of common legal sentences and the interpretations common in their profession; but the rest of us might see a sentence in lawyer-speak and not really know what it means, because we do not know those common interpretations. This is a big problem, of course, because to form an agreement both parties need a common understanding (a kind of knowledge-symmetric understandability) allowing them both to interpret the same sentence to mean the same thing.
Retrodictive archaeology is so tempting. It is about what it was, what it is, what we knew and what we know.
Here I present another time travel glimpse of mine:
February 1998. Global Information Summit*. Japan. Robert Hettinga**, the patriarch of financial cryptography, wrote:
My realization was, if Moore's Law creates geodesic communications networks, and our social structures -- our institutions, our businesses, our governments -- all map to the way we communicate in large groups, then we are in the process of creating a geodesic society. A society in which communication between any two residents of that society, people, economic entities, pieces of software, whatever, is geodesic: literally, the straightest line across a sphere, rather than hierarchical, through a chain of command, for instance.
A network scales according to the capacity of its switches.
Mankind is a network of interlinked humans routed by ... humans.
The network topology*** of society is dictated by our incapacity to switch - similarly to the way penguin society is shaped by their inability to fly.
Running the Sorites paradox**** in reverse - humanity does not form a sand-heap by adding grains, but fractalizes into groupings of up to just a few individuals.*****
A big body of research on discussions persistently brings back the same result: over a threshold of as few as 5 persons, the number of possible social interactions explosively exceeds the participants' capacity to handle the group's information traffic.
Increase the group size and the 'c factor' - the collective intelligence - abruptly implodes, below the individual human level. So long, 'wisdom of the crowd'.
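As a back-of-the-envelope illustration (my own, not the cited research's model), two simple counts show how fast group traffic grows with size: pairwise channels, n(n-1)/2, and the number of possible subgroups of two or more people, 2^n - n - 1:

```python
# Two crude measures of the interaction load inside a group of n people.
# (An illustration of combinatorial explosion, not a model from the research.)

def channels(n):
    """Pairwise communication channels: n choose 2."""
    return n * (n - 1) // 2

def subgroups(n):
    """Subsets of 2 or more members: all subsets minus singletons and empty."""
    return 2**n - n - 1

for n in (3, 5, 8, 12):
    print(f"n={n:2d}  channels={channels(n):3d}  subgroups={subgroups(n)}")
```

Going from 5 to 12 people multiplies the pairwise channels by ~6.6, but the possible subgroup configurations by more than 150 - the "traffic" a human switch must handle outruns the head count almost immediately.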
Hierarchy is the only way we know (up to now) for a society to scale. Centralization as an emergent property of organic switching limitations.
It is fair to say that we have, and have had, upscaling exosomatic prosthetics all along: language, writing, institutions, specialization... But at the end of the day, even with these boosters, social switching remains bottlenecked at just a few humans.
Until recently, that is - because, you know ... computers. Humans are not only lousy switches, but also tremendously expensive ones to make. Computers are the opposite: their performance/cost ratio relentlessly big-bangs.
Moore's law****** is not only about silicon wafers. It is a megatrend from the very dawn of the universe, as Kurzweil noticed******* a long time ago, which goes up and up across all computronium substrata imaginable or possible.
Non-human computation and automated communication promise to break the social scaling barrier.
Here comes Ohad Asor's Tau.********
It is the only project I know of which asks the correct questions and looks into doable solutions for humanity's scaling. And the only meaningful identification and treatment of these problems which seems to lead toward fulfilling Bob Hettinga's Geodesic visions from a few decades ago.
Of course I do not know it all, but let's say that I intensively search the relevant space.
Tau transcends the human switching limitations in a humane way - without amalgamating individuals out of existence, which some other discussed approaches, like direct neural interfacing, seem to inevitably imply. For society is ... human beings.
What are the pragmatics of geodesic vs. hierarchic?
At what game do the 'flat' p2p networks really beat the vertical social configurations?
It is an easy answer. It is pure physics:
A Tauful geodesic society comprises an IMMENSELY richer economy.
Metcalfe's (and Szabo's) law at max!
The combinatorial size of it vastly exceeds the possible arrangements of any traditional social 'pyramid'.
The maximum social diameter becomes ~1.
In fact, it seems quite an ancient archetypal vision, the whole thing:
“Imagine a multidimensional spider’s web in the early morning covered with dew drops. And every dew drop contains the reflection of all the other dew drops. And, in each reflected dew drop, the reflections of all the other dew drops in that reflection. And so ad infinitum.” Alan Watts*********
1. *- http://www.nikkei.co.jp/summit/98summit/english/online/emlasia3.html (the second entry)
2. **- http://nakamotoinstitute.org/the-geodesic-market/
3. ***- https://en.wikipedia.org/wiki/Network_topology
4. ****- https://en.wikipedia.org/wiki/Sorites_paradox
5. *****- https://sheilamargolis.com/2011/01/24/what-is-the-optimal-group-size-for-decision-making/
9. *********- https://en.wikipedia.org/wiki/Indra%27s_net (image from: https://mindfulnessforhealing.com/2012/12/29/weaving-a-tapestry-of-wellness/ )
NOTE: I'm in the Tau Team, but this post expresses only my own associations and interpretations.
For all who are researching Tauchain (TML) to understand how it works, a nice video! By Dana Edwards. Posted on Steemit. March 2, 2018.
This excellent video explains many of the concepts of programming, compilers, partial evaluation, and much much more!
To understand a program, have a look at what makes one up - the concept of a function:
By Wvbailey [Public domain], via Wikimedia Commons
As you can see above, a program takes input. Typically this is structured input (structured information), which is to say the input must be in a certain format and must be processed so that it can be useful to the program, which manipulates that information to produce a relevant output.
By Bin im Garten (Own work) [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
Recursive functions take it even further:
By User:Maxtremus [CC0], via Wikimedia Commons
And the example program above provides the computer with the ability to count.
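In the spirit of the diagrams above, counting can be sketched as a recursive Python function (a minimal Peano-style illustration, not code from the video): the sequence of numbers up to n is the sequence up to n-1, followed by n itself.

```python
# Counting as recursion: the list 0..n is the list 0..n-1 plus n.
# The base case stops the recursion; the recursive step builds on itself.

def count_to(n):
    """Return the list [0, 1, ..., n] by recursing on n - 1."""
    if n == 0:
        return [0]                 # base case
    return count_to(n - 1) + [n]   # recursive step

print(count_to(3))   # [0, 1, 2, 3]
```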
By Function_machine5.png: Wvbailey (talk). The original uploader was Wvbailey at English Wikipedia derivative work: Zerodamage (This file was derived from Function machine5.png:) [Public domain], via Wikimedia Commons
And of course much more
By Petteri Aimonen (Own work) [CC0], via Wikimedia Commons
The Futamura projections are program transformations based on partial evaluation: the first projection specializes an interpreter to a source program, yielding a compiled program; applying the idea again turns the interpreter into a compiler.
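Here is a hand-wavy Python sketch of the first Futamura projection. A real partial evaluator ("mix") would specialize the interpreter's code itself; here plain closure capture stands in for mix, just to show the shape of the idea. The tiny instruction language is invented for the example:

```python
# First Futamura projection, sketched:
#   compiled_program = mix(interpreter, source_program)
# Here closure capture stands in for a real partial evaluator.

def interpret(program, x):
    """Tiny interpreter: a program is a list of ('add', k) / ('mul', k) steps."""
    for op, k in program:
        if op == "add":
            x = x + k
        elif op == "mul":
            x = x * k
    return x

def specialize(interpreter, program):
    """Fix the program argument, leaving a function of the input alone -
    the residual 'compiled' program of the first projection."""
    return lambda x: interpreter(program, x)

source = [("mul", 3), ("add", 1)]        # the program f(x) = 3*x + 1
compiled = specialize(interpret, source) # interpreter + source -> executable
print(compiled(4))                       # 13
```

A genuine partial evaluator would additionally unfold the interpreter's dispatch loop at specialization time, so the residual program contains no interpretive overhead - that removal of the interpretation layer is what makes the result a "compiled" program.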
Learn all you can about the concepts so you can seize the opportunity TML will bring.
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.