Introducing Value Area Networks - Matching participants by shared values. By Dana Edwards. Posted on Steemit. December 14, 2018.
This concept is only possible because of the design of Tauchain presented by Ohad. In his design for Tauchain he highlights the fact that any member of the social network will be allowed to input their worldview. I have previously discussed that moral values could be an important part of Tauchain in this setting.
A Value Area Network is a concept I'm introducing to mean a kind of network where all participants are matched according to shared values. These participants in the network (economic agents, bots, machines, humans, companies, whatever) should in theory be allowed to outline as much of their current values as they wish, and as long as all participants are deemed to be in alignment by the consensus algorithm of Tau, they will be considered part of a unified network.
The acronym VAN can be designated to stand for Value Area Network, not to be confused with Value Added Network. Unlike a LAN (Local Area Network), which is based on physical geography, the VAN is based on "social geography". People who are closer to each other socially, on the moral and "concerns and values" level, would represent a sort of shared location. In social science the concept of social proximity is defined mostly in geographical terms, but in the digital age, with a technology like Tau in existence, the idea of closeness might not have to be restricted to the geographical definition.
Closeness in terms of how well your values align with another participant in a network would represent a distinct place on a sort of map. This distinct place would be represented or quantified by a score which indicates its potential location on a spectrum of possible locations. Of course the mathematics behind this would have to be more clearly defined in future posts, but this post is to introduce the concepts for future discussion.
My concerns and reasons behind thinking up VANs are based on the fact that while social media today does a pretty good job of connecting billions of people to random people, it does a horrible job of connecting socially compatible people to each other. It's not good enough to connect a bunch of random people. People want to connect to people whose values are compatible with their own, even as those values constantly update over time. Tauchain in theory is the only platform which is expected to have the features to make this idea a possibility.
Values in this context could be negotiated or derived from beliefs or worldview using Tau discussion, and would then update as the person updates their beliefs or worldview. One approach would be the emergent route of letting Tau try to identify the values of the participant based on what the participant said in discussions (avoiding contradictions). The other would be to let the participant explicitly enter their current values and let Tau help them constantly update those over time.
These are features I hope to see developed on Tau in some form some day. If I'm in a position to bring these features into development (provided AGRS works as intended) then this could be one of my contributions. The key mechanism behind this feature would be a novel matchmaking algorithm which leverages the Tau shared knowledge base and reasoning capabilities. The social values map feature could be deduced from the discussions had over time, or it could simply be a checkbox setting where the participant chooses by checking boxes and sliding scales.
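To make the idea of an alignment score concrete, here is a minimal sketch of how value alignment between participants might be quantified. Everything here is hypothetical: the value dimensions, the cosine-similarity metric, and the threshold are my own illustrative choices, not anything specified by Tauchain.

```python
from math import sqrt

def value_similarity(a, b):
    """Cosine similarity between two value vectors:
    1.0 = perfectly aligned values, 0.0 = orthogonal values."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match(participants, threshold=0.9):
    """Pair up participants whose alignment clears the threshold."""
    names = list(participants)
    return [(p, q) for i, p in enumerate(names) for q in names[i + 1:]
            if value_similarity(participants[p], participants[q]) >= threshold]

# Hypothetical weights on three value dimensions, as if chosen with
# the checkbox-and-slider interface described above.
people = {
    "alice": [0.9, 0.8, 0.7],
    "bob":   [0.85, 0.75, 0.8],
    "carol": [0.1, 0.9, 0.0],
}
print(match(people))  # alice and bob align closely; carol does not
```

A real system would derive the vectors from discussions rather than from sliders, but the geometric intuition of "social distance" is the same.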
If Money = Memory, if Society = a Super Computer, if Computation is in Physical Systems, what is a Decentralized Operating System? By Dana Edwards. Posted on Steemit. October 24, 2018.
These concepts are not often discussed so let's have the discussion from the beginning. The first concept to think about is pancomputationalism, or put another way, the ubiquitous computers which exist everywhere in our environment. We can, for example, look at physical systems, living and non-living, and see computations taking place all around us. If you look at rocks and trees you can see memory storage. If you look at DNA you can see code, and if you look at viruses you can see microscopic programmers adding new code to DNA. Even the weather, such as a hurricane, is computing.
If you look at nature you see algorithms. You will see learners (yes the same as in AI), also in nature. The process is basically the same for all learning. Consider that everything which is physical is also digital. Consider that the universe is merely information patterns.
If we look at society we can also think of society as a computer. What does society compute though? One way people talk about a society is as a complex adaptive system, but this is also how people might talk about the human body. The human body computes with the purpose of maintaining homeostasis, to persist through time and reproduce copies of itself over time. The human brain computes to promote the survival of the human body. Just as viruses pass on codes to our DNA, the human brain is infected with mind viruses which are called memes. Memes are pieces of information which can alter physically how the brain is working.
The mind isn't limited to the brain. The mind is all the resources the brain can leverage to compute. In other words a person has a brain to compute with but when language was invented this allowed a person to compute not just using their own brain but using the environment itself. To draw on a cave is to use the cave to enhance the memory of the brain. To use mathematics is to use language to enhance the ability of the brain to compute by relying on external storage and symbol manipulation. To use a computer with a programming language is essentially to use mathematics only instead of writing on the cave wall we are writing in 1s and 0s. The mind exists to augment the brain in a constant feedback loop where the brain relies on the mind to improve itself and adapt. If there were no external reality the brain would have no way to evolve itself and improve.
A society in the strictly human sense of the word is the aggregation of minds. This can be at minimum all the human minds in that society. As technology improves the mind capacity increases because each human can remember more, can access more computation resources, can in essence use technology to continuously improve their mind and then leverage the improved mind to improve their brain. The Internet is the pinnacle of this kind of progress but it's obviously not good enough. While the Internet allows for the creation of a global mind by connecting people, things, and minds, it does nothing to actually improve the feedback loop between the mind and the brain, nor does it really offer what could be offered.
Bitcoin came into the picture and perhaps we can think of it as a better memory. A decentralized memory where essentially you can have money. The problem is that money is a very narrow application. It is the start, just as to learn to write on the cave wall was a start, but it's not ambitious enough in my opinion.
Humans in the current blockchain or crypto community do not have many ways to exchange human computation. Human computation is just as valuable as non-biological machine computation because there are some kinds of computations which humans can do quite easily which non-biological machines still cannot do as well. Translation, for example, is something non-biological machines have a difficult time with but human beings can do well. This means a market will be able to form where humans can sell their computation to translate things. If we look at Amazon Mechanical Turk we can see many tasks which humans can do which computer AI cannot yet do, such as labeling and classifying things. In order for things to go to the next level we will need markets which allow humans to contribute human computation and/or human knowledge in exchange for crypto tokens.
The concept of a decentralized operating system is interesting. First, if there is such a thing as social computation (such as collaborative filtering, subjective ranking, Waze, etc.), then what about the new paradigm of social dispersed computing?
The question becomes what do we want to do with this computing power? Will we use it to extend life? Will we use it to spread life into the cosmos? Will we use it to become wise? To become moral? To become rational? If we want to focus on these kinds of concerns then we definitely need something more than Bitcoin, Ethereum, or even EOS. While EOS does seem to be pursuing the strategy of a decentralized operating system which seems to be the correct course, it does not get everything right.
One problem is, as I mentioned before, the importance of the feedback loops between minds and brains. The reason I keep returning to the concept of the external or extended mind is the fact that it is the mind which creates the immune system to protect the brain from harmful memes. The brain keeps the body alive. The brain is not really capable of rationality, or morality, or logic, and relies on the mind to achieve this. The mind is essentially all the computation resources that the brain can leverage.
EOS has a problem in the sense that it doesn't seem to improve the user. The user can connect, can join, can earn or sell, can participate, but unless the user can become wiser, more rational, more moral, then EOS has limits. EOS does have Everipedia, which is quite interesting, but again there are still problems. What can EOS do to improve people in society, and thus improve society, if society is a computer in need of being upgraded?
Well, if society is a computer, first what does society compute? What should it compute? I don't even know how to answer those questions. I could suggest that if computation is a commodity along with data, then whichever decentralized operating systems compete and exist will compete for these commodities. The total brain power of a society is just as important as the amount of connectivity. And the mind of the society is the most important part of a society because it is what can allow the society to become better over time, allow the people in the society to thrive, and allow the life forms to continue to evolve and avoid extinction.
A decentralized operating system on a technical level would have a kernel or something similar to it. This is the resource management part. For example, Aragon promises to offer a decentralized OS and it too mentions having a kernel. A true decentralized operating system has to go further and requires autonomous agents. Autonomous agents which can act on behalf of their owners are, philosophically speaking, the extended mind. But the resources of a society are still finite and have to be managed, so a kernel would provide the ability for resource management.
The total computation ability of a society is likely a massive amount of resources, a lot more than just connecting a bunch of CPUs together. Every member of the society which can compute could participate in a computation market. Of course, as we are beginning to see now, regulators seem concerned about certain kinds of social computations such as prediction markets. So it is unknown how truly decentralized operating systems would be handled, but my guess is that if designed right they could be pro-social, capable of producing augmented morality by leveraging mass computation, and, by leveraging human computation, able to be compliant. To be compliant is simply to understand the local laws, and these can be programmed into the autonomous agents if people think it is necessary.
What is more important is that if a law is clearly bad, and people have enhanced minds, then it will be very clear why the law is bad. This clarity will help people to dispute and seek to change bad laws through the appropriate channels. If there is more wisdom, due to insights from big data, from data scientists, etc, then there can be proposals for law changes which are much wiser and more intelligent. This is something specifically that people in the Tauchain community have realized (that technology can be used to improve policy making).
A lot is still unknown so these writings do not provide clear answers. Consider this just a stream of consciousness about concepts I am deeply contemplating. This is also a way to interpret different technologies.
The Paradigm of Social Dispersed Computing and the Utility of Agoras. By Dana Edwards. Posted on Steemit. October 12, 2018.
Social Dispersed Computing
What is social dispersed computing? It is an edge-oriented computing paradigm which goes beyond cloud and fog computing. To understand social dispersed computing we first have to discuss dispersed computing and how it differs from the previous paradigm of cloud and fog computing. The current trend toward decentralized networks, which we first saw with peer-to-peer technologies such as Napster, Limewire, and BitTorrent, and later with Bitcoin, has brought us an opportunity to conceptualize new paradigms. The original model most people are familiar with is the client-server model, which was very much limited in that the server was always vulnerable to DDoS attack. The client-server model has never been, and could likely never be, censorship resistant.
In the client-server model the server could simply shut down, as was the case with Bitconnect, or it could be raided. The server could also be shut down by hackers who simply flood the site with requests. From the problems the client-server model presented, we discovered the utility of the peer-to-peer model. The peer-to-peer model was all about censorship resistance and promoted a network with no single point of failure (single point of attack) whose loss could result in the shutdown of access to the information. Among the first applications for these peer-to-peer networks were file sharing networks and networks such as Freenet, Tor, etc. This of course eventually evolved into Bitcoin, which ultimately led to the development of Steem.
In dispersed computing a concept is introduced called "Networked Computation Points". An NCP can execute a function in support of user applications. To elaborate further I'll offer something below.
Consider that every component in a network is a node. Now consider that every component node is an NCP in that it can execute some function to support some user application. If we think of a blockchain, for example, then we know mining would fit into this category because a miner is both a node in the network and can execute a function in support of Bitcoin transactions. Why is any of this important? Parallelism is something we can gain from dispersed computing, and please note that it is distinct from concurrent computing. When we rely on parallelism we can reap benefits in terms of performance when executing code by breaking it up into many small tasks which can be performed across many CPUs.
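The task-splitting idea above can be sketched in a few lines. This is a generic illustration of parallelism across CPUs using Python's standard multiprocessing pool, not EOS or Tauchain code; `expensive_step` is an invented stand-in for any CPU-bound function.

```python
from multiprocessing import Pool

def expensive_step(n):
    """A stand-in for any CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # The workload is broken into many small independent tasks...
    tasks = [10_000] * 8
    # ...and spread across CPU cores (one worker per core by default).
    with Pool() as pool:
        results = pool.map(expensive_step, tasks)
    print(len(results))
```

The speedup comes precisely because the tasks are independent, which is the distinction from concurrency: parallel tasks run simultaneously on separate CPUs rather than interleaving on one.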
EOS attempts to leverage parallelism specifically to enable its performance boost. The benefit is speed and flexibility. Think also of the hardware side, with FPGAs which can perform tasks similar to a microprocessor. FPGAs would, unlike ASICs, provide generalized flexible parallel computing. Consider that, just as with mining, a company could add more and more FPGAs to scale an application as needed.
To understand Social Dispersed Computing we have to note the fact that there are other users at any given time. The users in the network provide resources to the network for the benefit of other users while themselves using the network. So in Steem, for example, as you add content you are adding value to Steem in a direct way, but also in a dynamic way. The resources on Steem can also adapt dynamically to demand, provided the incentive mechanism (Resource Credits) works as intended.
EOS as an example DOSC (Dispersed Operating System Computer)
Because EOS seems to be the first to approach this holistically I will give credit to the EOS network for pioneering dispersed computing in the crypto space. All resources are representable by tokenization in a dispersed computing network. EOS and even Steem have this. Steem has it in the form of "Resource Credits" which represent the available resources on the Steem network. If more resources are needed then theoretically the resource credits could act as an incentive to provide those resources to the Steem network. This provides a permanent price floor for Steem, represented as the amount of Steem which would have to be purchased in order to have enough resources to run Steem (if I have the correct theoretical understanding). This would put Steem on a trajectory toward dispersed computing.
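As a rough illustration of how such a resource-credit mechanism behaves, here is a toy model. The class name, the credits-per-stake ratio, and the regeneration rate are all invented for illustration and do not reflect Steem's actual implementation.

```python
class ResourceCredits:
    """Toy model: credits are a claim on network resources,
    proportional to staked tokens, consumed by use and
    regenerated over time."""

    def __init__(self, stake, credits_per_stake=10):
        self.capacity = stake * credits_per_stake
        self.balance = self.capacity

    def consume(self, cost):
        """Spend credits to use network resources; fail when exhausted."""
        if cost > self.balance:
            return False  # user must wait for regeneration or stake more
        self.balance -= cost
        return True

    def regenerate(self, fraction=0.2):
        """Periodic regeneration back toward full capacity."""
        self.balance = min(self.capacity,
                           self.balance + self.capacity * fraction)

rc = ResourceCredits(stake=5)      # 50 credits of capacity
assert rc.consume(30) and rc.consume(20)
assert not rc.consume(1)           # exhausted: the incentive to buy stake
rc.regenerate()
assert rc.consume(10)              # usable again after regeneration
```

The "price floor" argument in the paragraph above corresponds to the `stake` parameter here: running a given workload requires holding enough tokens to generate the needed credits.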
Operating systems typically sit between the hardware and software as a sort of abstraction layer. This traditionally has been valuable because programmers don't have to speak directly to the hardware and hardware designers don't have to communicate their designs directly to the programmer. In essence, the operating system in the traditional model is centralized and made by a company such as Microsoft or Apple. This centralized operating system typically runs on a device or set of devices and provides some standard services such as email, a web browser, and maybe even a Bitcoin wallet.
Typically the most valuable or highest-utility software people consider on a computer is the operating system. On our smartphones this is Android OS, and on PCs it may be Windows or Linux. This is of course turned on its head under the new paradigm of dispersed computing and the new conceptual model of the "decentralized" operating system. EOS is the first to attempt a decentralized operating system using current blockchain technology, but the upcoming technology easily eclipses what EOS can do. Tauchain is a technology which, if successful, will leave EOS in the stone age in terms of what it will be able to do. EOS, while ambitious, has also had its problems with regard to the voting mechanisms and the ease with which collusion can take place.
To better understand how decentralized operating systems emerge, it helps to learn about projects like OSKit and concepts like the exokernel.
If we look at OSKit, we see that it provides the tools necessary for operating system development. If we look at Tauchain, we realize that it is strategically the most important tool for the development of a decentralized operating system, provided in the form of TML (a partial evaluator). If we think of the primary tool necessary to develop from, we have to initially start with a compiler. A compiler generator is more like what TML allows with its partial evaluator. More specifically, it is the feature of Futamura projection which can provide the ability to generate compilers.
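To give a minimal flavor of what a partial evaluator does (the idea underlying the Futamura projections), here is a toy specializer: it fixes the static input of a general `power` function and emits a residual program with the loop unrolled away. This is only an illustration of the concept, not how TML actually works.

```python
def power(x, n):
    """The general program: works for any x and n."""
    result = 1
    for _ in range(n):
        result *= x
    return result

def specialize_power(n):
    """Partially evaluate power() with the static input n fixed:
    the loop over n is unrolled at specialization time, leaving a
    residual program in x only."""
    body = "1" + " * x" * n
    src = f"def residual(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)  # "compile" the residual source
    return namespace["residual"]

cube = specialize_power(3)  # residual program: return 1 * x * x * x
assert cube(5) == power(5, 3) == 125
```

In Futamura's terms, specializing an interpreter on a source program in this way yields a compiled program; specializing the specializer itself on the interpreter yields a compiler, which is the compiler-generation ability the paragraph above refers to.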
If we look at the next most important part of an operating system it is typically the kernel. Let's have a look at what an exokernel is:
Operating systems generally present hardware resources to applications through high-level abstractions such as (virtual) file systems. The idea behind exokernels is to force as few abstractions as possible on application developers, enabling them to make as many decisions as possible about hardware abstractions. Exokernels are tiny, since functionality is limited to ensuring protection and multiplexing of resources, which is considerably simpler than conventional microkernels' implementation of message passing and monolithic kernels' implementation of high-level abstractions.
By Thorben Bochenek [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
From this, at minimum, we can see that an exokernel is a more efficient and direct way for programmers to communicate with hardware. To be more specific, "programs" communicate with hardware directly by way of an exokernel. We know the most basic function of a kernel in an operating system is the management of resources. We know that in a decentralized context, tokenization allows for incentives for the management of resources. When we combine them we get kernel + tokenization, producing an elementary foundation of an operating system. In a distributed context we could apply a decentralized operating system in such a way that the network could be treated as a unified computer.
Abstraction is still important by the way. In an operating system we know the object oriented way of abstraction. Typically the programmer works with the concept of objects. In an "Application Operating Environment" an "Application Object" can be another useful abstraction. Abstraction can of course be taken further but that is for another blog post.
The Utility of Agoras
Agoras+TML is interesting. Agoras is the resource management component of what may evolve into the Tau Operating System. This Tau Operating System or TOS is something which would be vastly superior to EOS or anything else out there because of the unique abilities of Agoras. The main abilities have been announced on the website such as the knowledge exchange (knowledge market) where humans and machines alike can contribute knowledge to the network in exchange for the token reward. We also know that Agoras will have a more direct resource contribution incentive property in the form of the AGRS token so as to facilitate the sale or trade of storage, bandwidth or computation resources.
The possible (likely?) emergence of the Tau Operating System
In order for Tauchain to evolve into a Dispersed Operating System Computer it will need an equivalent to a kernel: some means of allowing whoever is responsible for the Tauchain network to control and manage the resources of that network. If, for example, the users decide, then by way of discussion there would be a formal specification or model of a future iteration of the Tauchain network. According to current documents, this is what would produce the requirements to which the Beta version of the network applies program synthesis. Program synthesis in essence could produce a kernel, and from there the components of a Tau Operating System could be synthesized in the same way. Just remember that all I write is purely speculative, as we have no way to predict with certainty the direction the community will take during the alpha.
''We live in a world in which no one knows the law.''
Ohad Asor, Sept 11, 2016
I continue herewith with sharing my contemporary state-of-grok of the up-to-now four scriptures of the aka newtau. Sorry for the delay, but it comes mostly from the efforts to contain the outburst of words, catalyzed by the very exegetic process of such rich content, into a reader-friendly shorter form.
The subject of vivisection textographically identifies as the first three paragraphs of ''Tau and the Crisis of Truth'', Ohad Asor, Sep 11, 2016.
The four core themes extracted are enumerated below, with a streak of comments of mine, kept modest so as not to sidetrack the thought or spoil the original message:
As a guy who has been immersed in Law for more than a quarter of a century, I can swear with both hands on my heart to the notion of the unknowability of Law.
Since my years in law school I have been asking myself how it is possible at all to have 'rule of law' when every legal system ever known has required humans to operate it!?
It seemed that the only requisite or categorical difference between mere arbitrary 'rule of man' and the 'rule of law' was that in some isolated cases some ruling men happened to be internally programmed by their morals to produce 'rule of law' appearance effects by 'rule of man' means.
Otherwise, 'rule of law' done via 'rule of man' poses extremely serious threats that the law will be used by some to exploit and harm others.
In that line of thoughts my conclusion was that the Law is ... yet to come.
What we know as Law is not good networking protocol software for mankind as such; rather, we see comparatively rare examples of individually well programmed ... lawyers.
It will come on the wings of a technological breakthrough, just as flying came with the invention of airplanes, the moonwalk needed the advent of rocketry, and remembering beyond a lifetime required writing. The Law is an old dream. If we judge by the depth of the abyss of folklore, one of humanity's most ancient dreams, indeed. Needless to repeat myself that this was what sucked me into Tau as relentlessly as black hole spaghettification :)
The frustration with Law of the great Franz Kafka, referred to by Ohad and expressed in Kafka's book The Trial, becomes very understandable for Kafka's epoch, which lacked the comforting hope of a technology we already have - the computers - and the overall progress in the fields of logic, mathematics, and engineering forming a self-reinforcing loop centered around this sci-tech of artificial cognition.
Similarly to nuclear fusion, which is always a few decades away even though the fusion gap is closing noticeably nowadays, we are standing on the cliff of a Legal gap.
Mankind's heavy involvement in cognition technologies, especially in the last several decades, has outlined multiple promising directions of further development, which seem to bring us closer to the ability to compensate for the fundamental deficiencies of Law and in fact to finally bring it into existence.
It took an entire Ohad Asor, however, to identify the major reasons why the Law is still bottlenecked out of our reach, and to propose viable means to bridge us through that Legal gap... The other side is already in sight.
It is, in the first place, the language that is to blame!
The human natural language. Our most important attribute as a species. The mankind maker. The glue of society. It just emerged; it hasn't been created. It has patterns, vaguely conventional, rather than an intentionally coined set of solid rules. There ain't firm rules to change its rules, either ... The natural human language is mostly a wilderness of untamed pristine naked nature, dotted here and there with very expensive and hard-to-install-and-maintain ''artefacts''. Leave it alone, out of the coercion of state mass media, mass education and national language institutes, and it falls back into a host of unintelligible dialects. Even when aided by the mnemonic amplifier which we call writing.
Ambiguity is characteristic of the natural language, a feature in poetry and politics, but a deadly bug in logic and law.
We'll put aside for now the postulate of the impossibility of a single universal language, to revisit it later when its exegetic turn comes, in another chapter on another scripture. Likewise, not in this chapter will we cover the neurological human bottlenecks which Tau targets to overcome. Let's observe the sequence of the author's thoughts and not fast-forward.
Instead I'll dare to share with you my own hypothesis about why the natural human languages are so. (I'm smiling while I type this, because I can visualize Ohad's reaction upon reading such a frivolous lay narrative. I hope that, being too busy, he actually won't.) To say that the human languages are just too complex does not bring us any nearer to a decent explanation. Many logic-based languages are more than a match for the natural human ones in terms of expressiveness and complexity. That cannot be the reason.
My suspicion is rather that the natural human languages pose such Moravec hardness because they are not exactly languages. Languages are conveyors of meaning. Human languages convey not meaning, but indexes or addresses or tags of mind states. The meaning is the mind state. Understanding between humans is a function not only of shared learnt syntax, but also of shared lives. Of an aggregation of similar mind states to be referred to by matching word keys.
If this is true, it is another angle for grokking the solution of human users leaning towards the machine by use of human-intelligible Machinish, instead of Tau waiting for the language barrier to be broken and machines starting to speak and listen to Humanish.
In a nutshell, we still wait for the Law to come because Law is not doable in Humanish. Bad software. And the other side of the no-law coin is that humans are no cognitive ASICs. We do cognition only meanwhile, and in order to do what other animals do - to survive. Bad hardware.
In order for law to become law, it must become hands-free.
Not humans to read laws, but laws to read laws.
The technology to enable that looks to be within arm's length.
Ok, so far we butchered the law and the language. What's left?
The nature and essence of human language brought one of the most harmful and devastating notions ever. Literally, a thought of mass destruction.
The ''crisis of truth''. The wasteland left by the toxic idea spillover of ''there is no one truth'' or even ''there ain't truth at all''. This is not only an abstract, philosophical problem. Billions of people actually got killed for somebody else's truth.
Not coincidentally, the philosophers who immersed themselves in this pool are nicknamed 'Deconstructivists'. Following their epistemic genealogy back, we see, by the way, that they are rooted in faith rather than in reasoning, but this is another story.
The general problem of truth, of which the problem of law is just a special case, opens up two important aspects:
Number one is that all knowledge is conjectural with respect to truth, and that truth is an asymptotic boundary - forever to close in on but never to reach, like the speed of light or absolute zero. Number two is that human languages make pretty lousy vehicles to chase the truth with.
If words really are just to match people's thoughts together, then there are thoughts without words and words without thoughts. Words mismatch thoughts, so how can we expect them to bridge thoughts to things? Entire worlds of nonsensical wording emerge, dangerously disturbing the seamless unity of things and thoughts. Truth displaced.
''But can we at least have some island of truth in which social contracts can be useful and make sense?''
This island of shared truth is made of consensus  bedrock and synchronization  landmass.
Truth and Law self-enforced. From within, instead of by violence from without. And in a self-referential, non-regressive way.
''We therefore remain without any logical basis for the process of rulemaking, not only the crisis of deciding what is legal and what is illegal." 
Peter Suber with his ''The Paradox of Self-Amendment: A Study of Law, Logic, Omnipotence, and Change''  proposed a rulemaking solution which he called Nomic .
''Nomic is a game in which changing the rules is a move.'' 
The merit of Nomic is that it really eliminates the ills of the infinite regress of laws-of-changing-the-laws-of-changing-the-laws, ad infinitum, by use of transmutable self-referential rules. But Nomic suffers from a number of issues - the first one, in the spotlight of this chapter, being the fact that we still remain with the “crisis of truth” in which there is no one truth; the other ones - like scalability of sequencing and voting - we'll revisit in their order of appearance in the discussed texts.
The aka 'newtau' goes past the inherent limitations of the Nomic system and resolves the 'crisis of truth' problem.
The next few chapters will dive into Decidability and how it applies to provide solution to the problems described above.
 - https://en.wikipedia.org/wiki/Grok
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-intro
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-the-two-towers
 - http://www.idni.org/blog/tau-and-the-crisis-of-truth.html
 - http://www.behest.io/
 - https://steemit.com/blockchain/@karov/behest-for-tauchain
 - https://en.wikipedia.org/wiki/Rule_of_law
 - https://en.wikipedia.org/wiki/Tyrant
 - https://en.wikipedia.org/wiki/Morality
 - https://en.wikipedia.org/wiki/Spaghettification
 - https://en.wikipedia.org/wiki/Franz_Kafka
 - https://en.wikipedia.org/wiki/The_Trial
 - https://www.amazon.com/Merchants-Despair-Environmentalists-Pseudo-Scientists-Antihumanism/dp/159403737X
 - https://en.wikipedia.org/wiki/Language
 - https://en.wikipedia.org/wiki/Official_language
 - https://steemit.com/blockchain/@karov/tau-through-the-moravec-prism
 - https://en.wikipedia.org/wiki/Application-specific_integrated_circuit
 - https://www.etymonline.com/word/manipulation
 - https://en.wikipedia.org/wiki/Deconstruction
 - https://en.wikipedia.org/wiki/Consensus_decision-making
 - https://en.wikipedia.org/wiki/Synchronization
 - http://legacy.earlham.edu/~peters/writing/psa/index.htm
 - https://en.wikipedia.org/wiki/Nomic
 - https://en.wikipedia.org/wiki/Infinite_regress
 - the illustration is a painting, courtesy of the author Georgi Andonov https://www.facebook.com/georgi.andonov.9674?tn-str=*F
De Lege Ferenda is a series. Like the Tauchain Exegesis is. One train of articles.
This is the introductory 'locomotive' article where I attempt to nail down the essential basics. This is nontrivial because it requires compression of a very long stream of thoughts and research. Spanning literally decades. In that sense some of the overcompressed categorical statements are also cognitive ''letters of credit'' or ''promissory notes'' - comprising a debt of mine, to be repaid in future, separate, more detailed explanations. I'm afraid this is the only way for my theses and conclusions to be expressed in a reader-friendly way. Of course, questions and comments, as a mutual understanding accelerator, are as always more than welcome.
Three ''angles of attack'', in Roman numerals and capitals in pure Latin (the lingua franca of law :) below:
Maybe I have already tired you with repeating my incantation of:
Law is Between, Code is Within.
It is quite multi-dimensional in meanings and multi-disciplinary in consequences but here it comes to denote the unavoidability of Law. Rendered down to the most basic physics we currently know:
This is the way and reason why Law is enforceable and Code is executable. And this is the major categorical difference between them, which makes the notion of 'code is law' utter nonsense and, it seems, also destroys the very basis of the notion of 'smart contracts'. But this belongs to a bunch of other series of mine to come ...
Even if it were theoretically possible for all effectors to become one, there'd still be internal uncertainty fragmentation, and thus the unavoidability of enforcement.
Leaving this head-dizzying fundamental cognitive datum and heading up across the higher abstraction epistemic layers, we reach the surface to take a gulp of fresh air to:
Nothing, read my lips, NO-THING in crypto or blockchain has ever been or could possibly be extralegal.
Cuz there ain't a thing in any blockchain aspect which is not ... physical. Hence nothing is beyond the scope of Law.
Blockchain is most probably the arrival of the expected Hanson engine, or Szabo booster, or ultimate Clusivity management tool. Which makes it an extremely important domain for proper legal treatment and regulation - both as taxonomy within the existing institutes of Law - lex lata, and as creation of novel norms to cater to it - lex ferenda.
(as a side note: expectedly, the novel collective mnemonic technologies known under the umbrella term of 'crypto' provide a positive feedback loop to strengthen the Law, too - Tauchain seems to promise the advent of law, at last, as a consistent and decidable set of rules, for the first time ever.)
II. IURIS DICTIO
Law, being inherently about the physical, is also about the spatiotemporal, i.e. about geography / geopolitics. It is always territorial, even when it is cross-border applicable by virtue of international law or internal rules to resolve inter-jurisdictional normative collisions.
The known world (I deliberately do not say: the planet, the Earth, or the globe, because of ... of course - the Outer Space Law!) is tessellated geographically into jurisdictions. Countries or nations. The pattern pixels of the universal human jurisdictional cellularity. But borders do not so much divide as they connect.
The world is an internet of jurisdictions, no matter how primitive the networking protocols and architecture yet are. And because, due to topological deficiencies, this cannot yet be a geodesic network - some jurisdictions are special. And among the special there are some which are even more special than the merely special ones. The specialness stems from what a jurisdiction gives to its user.
After decades of observation and practice and comparative studies I reached the conclusion that THE jurisdiction is the Principality of Liechtenstein!
A mere enumeration of its features and the sheer lack of bugs would occupy a sizeable volume. Liechtenstein is not just an island-periphery money hideout of an old fat imperial metropole - it is a HUB. It is immersed right into the middle of the healthiest-wealthiest community of the EU.
What starts in Liechtenstein does not stay in Liechtenstein but swiftly propagates into the giant space of the EEA. It is a keyhole jurisdiction straight into this most giant jurisdiction of jurisdictions - so strong in soft power and so influential that even the FAMGA seem to reckon with Europe more than with their own home jurisdiction.
Liechtenstein simultaneously has the deepest and most stable roots in the best of history and geography, and is the most advanced and ahead in the making of legislation of the highest probe of adequacy.
It does in 2018 what I (and just a few others) predicted years ago would happen. We must herein admit that other jurisdictions do have some timid try-outs for legal codification of the blockchain, but nothing compares with the comprehensive and in-depth approach of the Principality's legislators.
On 28th of August 2018 Liechtenstein published a draft of the new Blockchain Act:
<< On 28 August 2018, the Ministry for General Government Affairs and Finance of Liechtenstein published the consultation report on the new Blockchain Act (Act on Transaction Systems based on Trustworthy Technologies (VT) (Blockchain Act; VT Act; VTG)).
The government has decided to regulate not only the current Blockchain-applications (in particular cryptocurrencies and initial coin offerings (ICOs)), but also to establish a legal basis for the entire scope of application of the token economy according to a long-term approach, which should also meet the needs of future generations. >>
The basic provisions of the Liechtenstein Blockchain Act are as yet exposed only in the German language - which I'm not at all in command of, and a language quite indigestible for the Google Translate AI.
The consultation period ends on 16 November 2018, i.e. less than 2 months left from today.
My modest intention is by this De Lege Ferenda series of articles to provide my comments and opinions to 'whom it may concern' on the upcoming Liechtenstein Blockchain Act.
You already know I'm kinda fond of timelining and retrodictions.  :)
Every result has its cause, often hidden in the ocean of data that the past is, and quite hard to distinguish.
US has its Captain America. Liechtenstein is lucky to have its Mr Liechtenstein.
Andreas Erick Johannes Kohl Martinez of the House of Sequence. Remember that name.
Since the dawn of the blockchain era, I have been under the strong conviction that Liechtenstein is the true Crypto Valley of the globe. So is Andreas, too. Purely by chance it occurred that we both - long before we knew each other - have this astronomically improbable coincidence or synchronicity of this and a multitude of other thoughts.
Society of mind .
[*] - photo attributed to: By Michael Gredenberg - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=18962
 - https://en.wikipedia.org/wiki/Lex_ferenda
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-intro
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-the-two-towers
 - https://en.wikipedia.org/wiki/Letter_of_credit
 - https://en.wikipedia.org/wiki/Promissory_note
 - https://en.wikipedia.org/wiki/Angle_of_attack
 - https://en.wikipedia.org/wiki/Lingua_franca
 - http://www.behest.io/
 - https://steemit.com/blockchain/@karov/behest-for-tauchain
 - https://steemit.com/tauchain/@karov/tauchain-trumps-procrustics
 - https://steemit.com/tauchain/@karov/tauchain-and-the-cost-of-trust
 - https://en.wikipedia.org/wiki/Pauli_exclusion_principle
 - https://en.wikipedia.org/wiki/Fermion
 - https://en.wikipedia.org/wiki/Enforcement
 - https://en.wikipedia.org/wiki/Uncertainty_principle
 - https://en.wikipedia.org/wiki/Free_will
 - https://en.wikipedia.org/wiki/Mutual_information
 - https://www.coindesk.com/code-is-law-not-quite-yet/
 - https://en.wikipedia.org/wiki/Smart_contract
 - https://steemit.com/tauchain/@karov/tauchain-over-de-latil
 - https://www.etymonline.com/word/data
 - https://steemit.com/tauchain/@karov/tauchain-the-hanson-engine
 - https://steemit.com/tauchain/@karov/tauchain-as-szabo-booster
 - https://steemit.com/tauchain/@karov/clusivity-by-tauchain
 - https://en.wikipedia.org/wiki/Lex_lata
 - http://www.idni.org/
 - http://www.idni.org/blog/tau-and-the-crisis-of-truth.html
 - https://en.wikipedia.org/wiki/Space_law
 - https://www.etymonline.com/word/jurisdiction
 - https://en.wikipedia.org/wiki/Jurisdiction
 - https://steemit.com/blockchain/@karov/geodesic-by-tau
 - https://www.liechtenstein.li/en/
 - https://www.liechtenstein-business.li/en/economic-area/get-to-know/hidden-treasures/liechtenstein-combines-the-best-of-both-worlds/
 - http://europa.eu/
 - https://en.wikipedia.org/wiki/European_Economic_Area
 - https://en.wikipedia.org/wiki/Soft_power
 - https://medium.com/crypto-oracle/why-cryptos-a-growing-threat-to-famga-a-k-a-facebook-apple-microsoft-google-and-amazon-ea237570d3ea
 - https://www.dw.com/en/eu-gives-facebook-twitter-ultimatum-on-consumer-protection-laws/a-45573561
 - https://www.pwc.ch/en/insights/regulation/liechtenstein-publishes-draft-of-the-new-blockchain-act.html
 - https://www.llv.li/files/srk/vnb-blockchain-gesetz.pdf
 - https://steemit.com/bitcoin/@karov/bitcoin-retrodictions
 - https://en.wikipedia.org/wiki/Captain_America
 - https://podcast.bitcoin.com/e349-How-Libertarian-Leader-Mr-Liechtenstein-Got-Lucky
 - http://www.sequence.li/
 - https://www.businessinsider.com/what-its-like-in-zug-switzerlands-crypto-valley-2018-6
 - https://en.wikipedia.org/wiki/Synchronicity
 - https://en.wikipedia.org/wiki/Society_of_Mind
 - https://steemit.com/tauchain/@karov/scaling-is-layering & https://steemit.com/tauchain/@karov/tauchain-transcaling
“We are moving into an era where cities will matter more than states and supply chains will be a more important source of power than militaries — whose main purpose will be to protect supply chains rather than borders. Competitive connectivity is the arms race of the 21st century.”
-- Parag Khanna
A network is made of lines and switches, right?
Much has been said about the network scaling effects, including attempts by myself [4-12] ... which compels me to introduce the not-so-frivolous notion of network forces.
These forces are expressed in several laws. I initially thought to put 'forces' and 'laws' in quotes here, but I realize they are quite objective and physical emergenta, indeed.
In my ''Geodesic by Tauchain'' article from a couple of months ago I emphasized the Huber-Hettinga Law: how the cost of switching literally defines the 'orographic' topology of a network.
The cheaper the routing - the flatter the network.
Expensive switches = hierarchy, verticality, power, control, obedience, centralization, 'world is fiat', sollen, hence borders instead of bridges, limitations instead of stimuli, exclusivity ...
Cheap switching = geodesic society, 'world is flat', horizontality, p2p, decentralization, inclusivity ...
The more vertical by centralization a network is, the more it must deplete information: omit, ignore calls from the deeps, or even actively suppress or silence nodes. To cope with the stream by strangling it. Simply due to lesser capacity, fewer degrees of freedom. Geodesic networks possess higher entropy and therefore are richer. They boast both higher Scrooge and higher Spawn factors. In other words:
The flatter the network - the richer it is.
Maybe this is the explanation of why the wealthiest-healthiest societies tend to be those with the biggest economic-political freedom.
Naturally, the Huber-Hettinga Law led me to the elementary-watson conclusion of the power and value of Tau as the ultimate über-switch. So far so good.
Now let's stare at the Lines. Here comes Nick Szabo.
Nick Szabo - a lawyer AND computer scientist - is a legendary figure from the great 'Archaic era of crypto' - the 1990s, when he, together with the other cypherpunk titans like Tim May, Wei Dai, Bob Hettinga etc. etc., poured, in staggering detail, the very baserock foundations of what we enjoy now as Crypto in the post-Satoshi era.
It is THEIR vision come true that we all now live in.
Bitcoin was the detonation of precisely that critical mass of fused thoughts, of precisely these very smart people, piled up and compressed by the connective network forces of the early internet.
No, I do not mean at all Szabo's most famous thing - the 1994 coining of the term 'smart contracts'. In fact I deeply and strongly reject the very notion of 'smart contracts' - as utter nonsense, even as an oxymoron - which is a yuge separate problem, which I suspect I have nailed, and which I'll address in a series of dedicated articles starting in the upcoming weeks...
I mean something much more valuable, what I call the Szabo Law.
When we hear the phrase 'network effects', the first thing that comes to mind is the famous Metcalfe's Law.
''Metcalfe's Law is related to the fact that the number of unique connections in a network of a number of nodes (n) can be expressed mathematically as the triangular number n(n − 1)/2, which is proportional to n² asymptotically (that is, an element of O(n²)).''
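The arithmetic in that quote is easy to check with a few lines of code; a minimal sketch (my own illustration, not from the quoted texts):

```python
# Metcalfe's Law sketch: the number of unique connections among n nodes
# is the triangular number n(n-1)/2, which grows asymptotically as n^2.

def unique_connections(n: int) -> int:
    """Count of distinct node pairs (potential links) in an n-node network."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:>5} nodes -> {unique_connections(n):>7} unique connections")
# Each tenfold increase in nodes yields roughly a hundredfold increase in links.
```

Ten nodes give 45 possible links; a hundred nodes give 4,950 - the quadratic blow-up is the whole point of the law.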
In their order of appearance, these network-force laws quantitatively capture basic properties of a network:
- Huber-Hettinga Law - the cost of switches and routing.
- Metcalfe Law - the number of nodes, i.e. switches defining the number of unique connections or lines.
- Szabo Law - the cost of the lines and connecting.
All these Laws are scaling laws. Before we come back to and continue on the Szabo Law, we have to briefly mention another one:
''So what is “scaling”? In its most elemental form, it simply refers to how systems respond when their sizes change. What happens to cities or companies if their sizes are doubled? What happens to buildings, airplanes, economies, or animals if they are halved? Do cities that are twice as large have approximately twice as many roads and produce double the number of patents? Should the profits of a company twice the size of another company double? Does an animal that is half the mass of another animal require half as much food?''

... ''With Dirk Helbing (a physicist, now at ETH Zurich) and his student Christian Kuhnert, and later with Luis Bettencourt (a Los Alamos physicist now an SFI Professor), Jose Lobo (an economist, now at ASU), and Debbie Strumsky (UNC-Charlotte), we discovered that cities, like organisms, do indeed exhibit “universal” power law scaling, but with some crucial differences from biological systems. Infrastructural measures, such as numbers of gas stations and lengths of roads and electrical cables, all scale sublinearly with city population size, manifesting economies of scale with a common exponent around 0.85 (rather than the 0.75 observed in biology). More significantly, however, was the emergence of a new phenomenon not observed in biology, namely, superlinear scaling: socioeconomic quantities involving human interaction, such as wages, patents, AIDS cases, and violent crime all scale with a common exponent around 1.15. Thus, on a per capita basis, human interaction metrics (which encompass innovation and wealth creation) systematically increase with city size while, to the same degree, infrastructural metrics manifest increasing savings. Put slightly differently: with every doubling of city size, whether from 20,000 to 40,000 people or 2M to 4M people, socioeconomic quantities – the good, the bad, and the ugly – increase by approximately 15% per person with a concomitant 15% savings on all city infrastructure-related costs.''
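Those exponents can be played with directly. A back-of-the-envelope sketch of my own (with exponent β, the exact per-capita factor on a doubling is 2^(β−1); the quote's "approximately 15%" is a rounded statement of this superlinear/sublinear shift):

```python
# Power-law scaling Y = c * N**beta: the per-capita quantity Y/N scales
# as N**(beta - 1), so doubling N multiplies Y/N by 2**(beta - 1).

def per_capita_change_on_doubling(beta: float) -> float:
    """Fractional per-capita change in Y when the population doubles."""
    return 2 ** (beta - 1) - 1

# Sublinear infrastructure (beta ~ 0.85): per-capita savings on doubling.
print(f"infrastructure: {per_capita_change_on_doubling(0.85):+.1%}")
# Superlinear socioeconomics (beta ~ 1.15): per-capita gains on doubling.
print(f"socioeconomic:  {per_capita_change_on_doubling(1.15):+.1%}")
```

With β = 1.0 the change is exactly zero (pure linearity); above 1 every doubling pays a per-capita dividend, below 1 it yields a per-capita saving.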
Which probably comes to denote the sheer size of the network in STEM (space, time, energy, mass); I'm not sure, but I have some strong suspicions about the unity of matter, structure and action which I will expose and share some other time.
What I call Szabo's Law is revealed in his ''Transportation, divergence, and the industrial revolution'' (Thu, Oct 16, 2014): similarly to Metcalfe's (''double the population, quadruple the economy''), there is a power-law correlation between the cost of connections or links or lines ... and the value of the network, too:
''Metcalfe's Law states that a value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is an inverse square of the cost per mile of transportation. Combine this with Metcalfe's Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith's observations: the division of labor (and thus value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he extensively discussed in his Wealth of Nations).''
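The chain of reasoning in the quote condenses into a few lines of arithmetic; a minimal sketch (my illustration, not Szabo's code):

```python
# Szabo's inverse-fourth-power argument: accessible nodes ~ area ~ (1/cost)^2,
# and Metcalfe value ~ nodes^2, hence potential network value ~ (1/cost)^4.

def relative_network_value(cost_factor: float) -> float:
    """Potential network value, relative to baseline, when transportation
    cost is multiplied by cost_factor."""
    accessible_nodes = (1.0 / cost_factor) ** 2  # nodes reachable scale with area
    return accessible_nodes ** 2                 # Metcalfe: value ~ nodes^2

print(relative_network_value(0.5))  # halving the cost -> 16.0, a factor of sixteen
```

Halve the transportation cost and the potential value grows sixteenfold; cut it to a tenth and the potential value grows by four orders of magnitude (with Szabo's caveat that the real exponent runs below 4.0 due to redundancies).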
My encounter with this article of Nick Szabo's was a goosebumps experience for me, because it coincided with a series of lay rants of mine in the old Zennet IRC chat room of Tau that ''computation = communication = transportation''. Somewhere in 2016, as far as I remember. :)
Maybe it was the last drop that shaped my conviction that by my dedicated involvement in both Tau and ET3 I'm actually working for ... one and the same project.
For communication, computation and transportation are modes of state change. Because information is a verb, not a noun. And software is states of hardware.
''Decentralizing the internet is possible only with decentralized physical infrastructure.'' 
Just like the brain is a network computer of neuron nanocomputers , the emergent composite we colloquially call humanity or mankind or economy or society or world ... is a network computer made of all us billions of humans.
Brains do thought, economies do wealth.
Integrated circuitry upon the face of planet Earth as a motherboard. Literally. Humanity's planet-hardware. Parag Khanna's Connectography explained.
The Earth is definitely not our ultimate chip carrier. Probably there ain't a limit at all to our culture-upon-nature hardware upgrades. The universe is our computronium, and we've been here for too short a time and haven't seen far enough. Networking is connectomics. And thus it always also is metabolomics.
Remember my last month's  ''Tauchain the Hanson Engine''?
The series of exponentially shortened growth doubling times looks to be driven by transportation technological singularities: domestication of the horse, oceanic navigation, the combustion engine ...
In the light of all the net forces summoned above: The planet Earth viewed as a giant computer chip ...
- itself is subject to the relentless network entropic force of Moore's law
The network forces accelerate what that wealth computer does.
Two quick examples:
A.: The $1500 sandwich as a proof that trade+production is at least thousands of times stronger in sandwich-making than production alone.
B.: The example of Eric Beinhocker in his 2006 ''The Origin of Wealth'' about two contemporary tribes: the Amazonian Yanomami - a stone-age population nowadays - and the Eastcoastian Manhattanites. The former are only about 100 times poorer, but the latter enjoy a billions of times bigger choice of things to have.
Tauchain 'threatens' to affect the parameters of ALL the network forces formulae mentioned herewith in a mind-bogglingly big scale.
Simultaneously, orders of magnitude:
- lower switch cost
- higher node count
- lower connection cost
A wealth hypercane  recipe. Perfect value storm. Future ain't what it used to be .
''Thinking by Machine: A Study of Cybernetics''
by Pierre de Latil 
Published by Houghton Mifflin Company in 1957 (c.1956), Boston.
Foreword by Isaac Asimov (then only 36 years old)! Recommendation by the legendary mathematician and cyberneticist Norbert Wiener (then 62 years old)! ... A true jewel! The book is described as:
A review of "the last ten years' progress in the development of self-governing machines," describing "the principles that make the most complex automatic machines possible, as well as the fundamentals of their construction."
The nineteen-fifties!! Midway between the first digital computer, made by my half-compatriot John Atanasoff, and the internet. Almost a human generation span between the former, the book, and the latter event. An epoch so deep in the past that even television, air travel, rockets and nukes ... were young then.
Same Kondratieff wave phase, btw, which hints at the historical rhyming of socially important intellectual interests. (On how K-waves imprint on the humanity growth curve - in a series of other posts to come.)
I must admit here that I've never put my hands and eyes on this book. But it is stamped into my mind and memory by Stanislaw Lem - one of the greatest philosophers of the XXth century, working under the disguise of a Sci-Fi writer for being caught on the wrong side of the Iron Curtain.
''Summa Technologiae'' (1964) is a monumental work of Lem's, where most issues discussed sound more contemporary nowadays than they did more than half a century ago when it was written, and in many things we are yet in the deep past ...
... Lem reports and discusses the following from the aforementioned Pierre de Latil's book:
''As a starting point will serve a graphic chart classifying effectors, i.e., systems capable of acting, which Pierre de Latil included in his book Artificial Thinking [P. de Latil: Sztuczne myślenie. Warsaw 1958]. He distinguishes three main classes of effectors. To the first, the deterministic effectors, belong simple (like a hammer) and complex devices (adding machine, classical machines) as well as devices coupled to the environment (but without feedback) - e.g. automatic fire alarm. The second class, organized effectors, includes systems with feedback: machines with built-in determinism of action (automatic regulators, e.g., steam engine), machines with variable goals of action (externally conditioned, e.g., electronic brains) and self-programming machines (system capable of self-organization). To the latter group belong the animals and humans. One more degree of freedom can be found in systems which are capable, in order to achieve their goals, to change themselves (de Latil calls this the freedom of the "who", meaning that, while the organization and material of his body "is given" to man, systems of that higher type can - being restricted only with respect to the choice of the building material - radically reconstruct the organization of their own system: as an example may serve a living species during biological evolution). A hypothetical effector of an even higher degree also possesses the freedom of choice of the building material from which "it creates itself". De Latil suggests for such an effector with highest freedom - the mechanism of self-creation of cosmic matter according to Hoyle's theory. It is easy to see that a far less hypothetical and easily verifiable system of that kind is the technological evolution.
It displays all the features of a system with feedback, programmed "from within", i.e., self-organizing, additionally equipped with freedom with respect to total self-reconstruction (like a living, evolving species) as well as with respect to the choice of the building material (since a technology has at its disposal everything the universe contains).
A longish quote, but every word in it is worth it. When I read this as a kid back in the 1980s ... immediately there came to my mind the next, the seventh, logically higher effector class: the worldmaker!!
The degrees of freedom of all the previous six according to the classical taxonomy of de Latil are confined by the rule-set, the local laws of physics.
They are prisoners of a universe. Like birds incapable of reconfiguring their cage into a roomier and cozier one.
If we regard the laws of nature as code or algorithm, my 7th-level effector will be capable of drafting and implementing itself onto newer and stronger algorithmic foundations. (Note the seamlessness between computation and robotics in the Latil/Lem categorization construct - quite logical indeed, having in mind that software is a state of hardware, that matter-form-action are inextricable from each other; but on this in a series of other times and posts ...) Without bond?
So, I wonder:
Where, you reckon, is Tauchain placed on de Latil's effectors map?
Guys, after a few articles, I think I owe you a little presentation of myself and of Behest.io.
I, Karov, am a human, i.e. I'm not a robot (although, my friend @trafalgar is a witness, once I fought all day long with a Google form Captcha, but I prefer to blame a software glitch for that ...).
I incidentally learned that 'karov' is the word for 'near' in Hebrew, but this is pure coincidence.
I'm a lawyer. More than two decades of uninterrupted PQE. In a couple of European jurisdictions.
Behest.io is a ... firm. In the sense of firm (n.), or in the very original sense of any firm's only way to be - a signature. Not (yet) in the sense of a legal-personhood entity.
As a signature Behest.io is a tool. My tool, which I continuously develop to deliver answers  upon behests  for compliance to various crypto endeavors.
Metaphorically, the Behest.io tool dev target is: if a law firm is a CPU, Behest.io is to be a crypto legal services ASIC.
Blockchain came too swiftly, too strongly and too globally. Like an alien invasion. Legislators and law enforcement cannot keep pace. Law and regulations are far from definite on it.
It is an entire internet of jurisdictions out there. Nobody really knows the Law. One cannot just go out and shop for answers. There is no legal supermarket with neat shelves of turnkey solutions with price tags.
The compliance space is turbulent. Nothing is ready and definite. There is a very high risk of a grey zone turning red hot. A quicksand minefield.
The crypto lawyer's job is not yet an industry; it is inevitably art and craftsmanship. Tailored solutions.
Thus Behest.io is a studio, not a conveyor-belt mass factory.
Our approach in support is: side by side, thinking together, carefully mapping the routes ahead, identifying the correct questions and precisely crafting specific solutions.
On a tailored, case-by-case basis. In strict confidence. In an all-the-time dynamic and adaptive fashion. In real time. From entry to exit. All-the-way navigation from mere idea to end.
So far it sounds like just another advert... I know. But let me quickly throw in some Behest.io preconditional points, in an attempt to start sketching the bigger map:
FIRSTLY.: Why ''of Tauchain''?
Since my law school years back in the past millennium I noticed that the Law in all its dimensions - legislature, legislation, application, enforcement, science, jurisprudence, doctrine ... - is somewhat inconsistent and not quite self-sufficient.
I'm now firmly of the position that the place of Law is not with the soft sciences of history and literature but among the hard sciences of maths, logic, philosophy and physics.
If we compare the social rule set with a human network protocol code, the Law up to now is obviously not quite automatic and requires too much 'hand driving'. Including in the rules to make rules, too.
I have tried to envision (with my limited tech knowledge), all this quarter of a century, various ... systems which could eventually compensate for such flaws: virtualization, procedural generation, gamification ... and then Satoshi came. And Ohad Asor appeared.
If we compare our intention and dream of Law with flying - since times immemorial humans wanted to fly like birds, but it took the Wright Bros for us to fly ... not like the birds do.
I must herewith admit that closest to my heart are two technological projects: Tau and ET3. They form kinda ... a unity, but on that - other times, in a series of other posts.
Ohad Asor in his Sep 10, 2016, 8:25 PM essay  very precisely outlined the problem of Law:
''We would therefore be interested in creating a social process in which we express laws in a decidable language only, and collaboratively form amendable social contracts without diving into paradoxes. This is what Tau-Chain is about.''
Exactly! The problem of Law is that it is written in inherently buggy natural-human-language 'software' and is run on human-brain 'hardware' which is faulty for this, being 'made' to optimize performance of a completely different category of tasks. Like ... survival.
We can achieve Law by these means - human natural language and human brains - no more successfully than we could walk from here to the Moon.
Tau is the most solidly grounded and promising effort to deliver our long-dreamed 'rocketry' to take us from here to the Law.
If Law is decidable code, it is specifiable, all intended consequences predictable and granted. Decidable, consistent ... and self-amending. Precisely what the Law is supposed to be. At last. If it is specifiable in exact terms, action code is synthesizable out of it, to feed the legal effectors of all kinds with precise instructions.
Because our societies map to our communications, a drastic improvement of our interaction rules is equivalent to an immense improvement of the human condition.
The Law as a Tapp (Tau App)? Most definitely. I know of no other attempt to address the issue with such pure reason and demonstrated understanding.
This is the reason behind ''for Tauchain'' part of this post's title. It can get us there. We can have the Law, at last.
This is in the Behest.io and mine best selfish interest. Which is: a world of unimaginable freedom and wealth for all.
Behest.io in that sense is ''for Tauchain'' with the perspective of Tau becoming ''for Behest''. The realization of my lifetime Legum project.
Behest.io is not of Tauchain, or of IDNI. It is an independent project of an independent lawyer, with strong current focus on Tau and ET3. Because of the outlined above reasons. In series of upcoming articles I intend to elaborate on my visions and positions on these in general.
SECONDLY.: How exactly is Behest.io supposed to operate before Tau is in our hands to play with?
All by the book, of course! The legal profession is for compliance, but it is also all about compliance per se. Lawyers are not just compliance makers and shippers - the lawyers themselves must be compliant. Lawyering is a strictly local and heavily regulated profession. As it should be.
Not only does no lawyer know all law, but there is no such thing as a global or universal license to provide legal services. Regardless of the 'professional services provider' Big Four or other hierarchical collab structure - a lawyer is limited to operate only on the territory which his professional-'badge'-granting regulator says.
On the other hand, Internet and Blockchain are inherently global and penetrate and permeate all jurisdictions as easily as a neutrino passes through a planet.
My plan to deal with this ''license to kill (the problems)'' inter-jurisdictional professional licensing issue is simple:
Quick assembly of teams with full professional license coverage. Bespoke to each project. Ad hoc. Where and when needed.
The idea is ... if Behest.io is a screen and the solutions are images on it, then the backend machinery of professionals and other resources is to be freely reconfigurable, developed and expanded on demand all the time, without the client being bothered to grok anything else but what's on the screen.
This resembles the so-called B2B2X telecom services business model, which is conceptually so new that it does not have a Wikipedia article yet.
So all professional services colleagues are welcome to join! In whatever forms we together see fit on each particular occasion.
I'm sure some really groundbreaking fusions will come out of this collab direction alone!
More posts on Behest.io biz philosophy to come.
Retrodictive archaeology is so tempting. It is about what it was, what it is, what we knew and what we know.
Here I present another time travel glimpse of mine:
February 1998. Global Information Summit*. Japan. Robert Hettinga**, the patriarch of financial cryptography, wrote:
My realization was, if Moore's Law creates geodesic communications networks, and our social structures -- our institutions, our businesses, our governments -- all map to the way we communicate in large groups, then we are in the process of creating a geodesic society. A society in which communication between any two residents of that society, people, economic entities, pieces of software, whatever, is geodesic: literally, the straightest line across a sphere, rather than hierarchical, through a chain of command, for instance.
A network scales according to the capacity of its switches.
Mankind is a network of interlinked humans routed by ... humans.
The network topology*** of society is dictated by our incapacity to switch - similarly to the way penguin society is shaped by their inability to fly.
Running the Sorites paradox**** in reverse - humanity does not form a sand-heap by adding grains, but fractalizes into groupings of up to just a few individuals.*****
A big body of research on discussions persistently brings back the same result: over a threshold of as few as 5 persons, the number of possible social interactions explosively exceeds the participants' capacity to handle the group's traffic of information.
Increase the group size and the 'c factor' - the collective intelligence - abruptly implodes. Below the individual human level. So long, 'wisdom of the crowd'.
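The combinatorial explosion behind that threshold is easy to make concrete. A minimal sketch (the specific group sizes and the subgroup formula are my own illustration, not taken from the cited research):

```python
from math import comb

def pairwise_channels(n):
    # one-to-one communication channels in a group of n people: C(n, 2)
    return comb(n, 2)

def possible_subgroups(n):
    # subsets of 2 or more people that could form a side-conversation: 2^n - n - 1
    return 2**n - n - 1

for n in (3, 5, 8, 12):
    print(n, pairwise_channels(n), possible_subgroups(n))
```

At 5 people there are already 10 channels and 26 possible subgroups; at 12 people, 66 channels and over 4000 subgroups - far past any individual's switching capacity.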
Hierarchy is the only way we know (up to now) for a society to scale. Centralization as an emergent property of organic switching limitations.
It is fair to say that we have, and have had, upscaling exosomatic prosthetics all along: language, writing, institutions, specialization ... but at the end of the day, even with these boosters, social switching is bottlenecked down to groups just a few humans strong.
Until recently, that is. Because, you know ... computers. Humans are not only lousy switches, but also tremendously expensive ones to make. Computers are the opposite: their performance/cost ratio relentlessly bigbangs.
Moore's law****** is not only about silicon wafers. It is a megatrend from the very dawn of the universe, as Kurzweil noticed******* long ago, and it goes up and up across all computronium substrata imaginable or possible.
Non-human computation and automated communication promises to break the social scaling barrier.
Here comes Ohad Asor's Tau.********
It is the only project I know of which asks the correct questions and looks into doable solutions for scaling humanity. And the only meaningful identification and treatment of these problems which seems to lead towards fulfilling Bob Hettinga's geodesic visions from a few decades ago.
Of course I do not know it all, but let's say that I intensively search the relevant space.
Tau transcends the human switching limitations in a humane way. Without amalgamating individuals out of existence, which some other discussed approaches - like direct neural interfacing - seem to inevitably imply. For society is ... human beings.
What's the pragmatics of geodesic vs hierarchic?
At what game do the 'flat' p2p networks really beat the vertical social configurations?
It is an easy answer. It is pure physics:
A Tauful geodesic society comprises an IMMENSELY richer economy.
Metcalfe's (and Szabo's) law on max!
Its combinatorial size vastly exceeds the possible arrangements of any traditional social 'pyramid'.
The maximum social diameter becomes ~1.
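A hedged back-of-the-envelope comparison of the two topologies (the numbers and functions are my own illustration): a geodesic society is a full mesh, while a strict hierarchy is a tree through which messages must climb a chain of command.

```python
def geodesic_links(n):
    # full mesh: every participant can reach every other directly
    # -> n(n-1)/2 potential links, social diameter 1 (Metcalfe's law)
    return n * (n - 1) // 2

def hierarchy_links(n):
    # a strict hierarchy is a tree: only n - 1 links,
    # communication routed up and down the pyramid
    return n - 1

n = 1_000_000
print(geodesic_links(n))   # ~5 * 10^11 potential direct links
print(hierarchy_links(n))  # 999999 links in the pyramid
```

For a million participants the mesh offers roughly half a trillion direct links versus under a million in the pyramid, which is the 'pure physics' behind the richer economy claimed above.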
In fact, it seems quite an ancient archetypal vision, the whole thing:
“Imagine a multidimensional spider’s web in the early morning covered with dew drops. And every dew drop contains the reflection of all the other dew drops. And, in each reflected dew drop, the reflections of all the other dew drops in that reflection. And so ad infinitum.” Allen Ginsberg*********
1. *- http://www.nikkei.co.jp/summit/98summit/english/online/emlasia3.html (the second entry)
2. **- http://nakamotoinstitute.org/the-geodesic-market/
3. ***- https://en.wikipedia.org/wiki/Network_topology
4. ****- https://en.wikipedia.org/wiki/Sorites_paradox
5. *****- https://sheilamargolis.com/2011/01/24/what-is-the-optimal-group-size-for-decision-making/
9. *********- https://en.wikipedia.org/wiki/Indra%27s_net (image from: https://mindfulnessforhealing.com/2012/12/29/weaving-a-tapestry-of-wellness/ )
NOTE: I'm in the Tau Team, but this post expresses only my own associations and interpretations.
Tau Chain vs. Tezos - which platform will provide a better solution? By Isar Flis. Posted on Steemit. February 10, 2018.
In this article I would like to discuss the self-amending feature of Tau Chain (Tau), which I believe provides a better solution than the one proposed by Tezos.
A short summary about Tau
Tau will be a blockchain based computer network, aimed at supporting collaboration between people. It will be designed like any other social network you know (Facebook, Twitter, etc.); but on Tau, users can interact with each other using machine-comprehensible languages. Specifically, advanced users will be able to define new knowledge-representation languages simply by translating them to Tau’s metalanguage (TML). As the languages use logic, they will be understandable by both machines and humans.
Since Tau can “understand” the entire conversation, it can also translate the discussions into various languages and discover where people agree or disagree; then, it may present the content of the conversation in different forms (languages or formats) for each user, based on specific requests.
The ability of Tau to logically understand discussions (as they will be translated into its TML) will assist users in four important ways:
*For further information about Tau, please refer to my previous article, explaining Tau and its four-step roadmap.
“Tau, is a discussion about Tau”
Tau is a social platform that will assist users with writing and amending code based on their discussions about a computer program. But Tau is a computer program itself. Therefore, by discussing Tau, users will be able to amend Tau whenever they (the community) reach an agreement about changing Tau’s protocol.
When Ohad Asor, the founder and developer of Tau Chain, mentioned that “Tau, is a discussion about Tau”, he meant that Tau is what the community decides when they discuss Tau. Meaning, when the community faces a decision, such as what Tau’s block size should be, they will just need to express their opinions and perspectives, like we do today on social networks. Tau will organize the conversation in an efficient way to promote a solution that represents what the community desires. As such, Tau will be the only dynamic decentralized social network.
Why is Tezos developing only a short-term solution?
You probably remember Tezos as one of the biggest ICOs in history, when they raised $232 million (when BTC price was ~$2,500). Like Tau, Tezos is also a dynamic protocol that can change itself based on users' agreements. Tezos considers voting to be the optimal solution to reach a decision between users.
Voting is a good method to include a large number of people in the decision-making process; however, voters have limited influence, as they can only choose between the few solutions/options presented to them. Who will decide when and why the community will vote? Who will decide what solutions the community can vote for? Tezos’ solution is still centralized and is only viable in the short run. What will happen if some users do not agree with a specific vote? Does that mean that a Tezos fork is inevitable?
Without considering the perspectives of the entire community, we will not be able to reach a decentralized decision that benefits all users. Tau’s ability to scale discussions is the only decentralized solution to create a true dynamic protocol. Tau will enable all users to express their opinions by just discussing or communicating their views. Users will decide when and what to discuss, and Tau will change its protocol based on users' agreements. Thus, Tau will be able to utilize all data in the decision-making process; data that is usually wasted when holding a vote.
To make it more tangible, think about the difference between discussing with your family which movie you’re going to watch and receiving a list of two movies to choose from. The latter might not reflect your taste in movies or how you want to spend your time. This is a low-scale analogy for Tezos’ voting solution. Tezos might provide a solution, but the solution is not optimal. When encountering a large-scale decision, the protocol will be changed based on the vote, but the minority might reject the vote and fork the coin.
Under Tau, the protocol will detect the core consensus among the different perspectives and change accordingly. With the assistance of Tau and its knowledge, users will effectively discuss among themselves how to reach further consensus points. With every consensus point, Tau will change itself accordingly.
*As the community members decide how Tau will be developed, they can suggest the majority rule (or a higher bar) as a decision rule. Tau will automatically detect the different perspectives of the community members and will execute their decision to change Tau’s protocol.
Another important aspect of Tau (compared to Tezos) is the fact that Tau will present its users with output about all the network input. All the data/opinions/information that users provide during their discussions will be accumulated in the knowledge archive. Tau will utilize its knowledge to provide its users with better access to qualitative and quantitative information. On Tau, the proposals (such as suggestions to change the protocol) that users raise can be as wise as the information contained in the entire network.
I will end this article by quoting the last paragraph in my first article:
"I foresee huge potential for this project and urge you to read and learn about this project and its relevant applications. If you find this vision interesting, I recommend that you follow the project on Telegram, Facebook, LinkedIn and Reddit, or read Ohad’s blog for further information."
Disclaimer: I have invested in Agoras. Please do your own research before investing in Agoras and/or any other coin or project. Please do not consider this article to constitute financial advice.
The Power of Tau - Scaling the Creation of Knowledge. By Trafalgar. Posted on Steemit. December 31, 2017.
Ohad Asor, creator of Tau Chain/Agoras, has recently published the long awaited blog post detailing his vision for what very likely is the most ambitious project in the crypto space: Tau.
Tau will accelerate human endeavors by overcoming long ingrained limitations in our collaborative processes; limitations which we rarely even question.
The Problem of Social Governance
Take social governance, for example. As individuals, we have opinions over a wide variety of social issues. Perhaps you feel that the speed limit on certain roads is too high, or that programming should be a compulsory subject at public schools, or that everyone would benefit if cryptocurrencies were officially recognized and endorsed by the state.
However, you have no idea how to get these concerns across to the general public. I mean you could try writing a letter to your local representative or signing a petition, but ultimately that's unlikely to gain much traction. Meanwhile, the very same issues that seem to have divided the nation over the past decade remain at the forefront of our political debate. Immigration, climate change, abortion, gun control, etc. are all important issues of course, but very little progress has been made considering the amount of time, resources and attention that have been devoted to them.
So the problem with traditional forms of social governance, such as democratic voting, is apparent: on the one hand it has difficulty identifying and addressing the wide range of opinions different people hold, on the other hand, even with respect to the small number of issues that do end up bubbling up to the surface, it isn't particularly efficient at detecting consensus.
The central cause of this problem is that current modes of discussion are not scalable. There are inherent limitations in the way we're able to communicate our views to each other; namely, human ability to comprehend and organize information is the main bottleneck. We cannot possibly follow multiple conversations at once, or recall everyone's propositions once there are more than a handful of people in the mix. This is why most collaborative decision-making bodies in practice are generally quite small in number: the President's cabinet, Supreme Court Justices, the board of directors of a Fortune 100 company, etc.; you just can't have a productive discussion with 50 people. Our entire civilization is structured around this very limitation: discussions don't scale.
Scaling Collaborative Discussions Under Tau
Imagine if we could overcome this limitation; what would it mean for social governance? By using a self defining, decidable logic, the Tau network is easily able to keep track of every user's propositions and detect consensus automatically. Note that making a proposition is exactly the same as voting for that very same proposition: when you're proposing 'dogs should always be on a leash in public unless in a park' you're in effect putting in a vote for such a proposition. This way, countless issues, regardless of how technical or niche, can be assessed through the network concurrently, and social consensus can be detected on the fly. The Tau network can scale social governance by overcoming one of the greatest limitations in human communication of ideas: it delegates the task of logically making sense of everybody's propositions to the computer. A simple use case of this will be the rules of the Tau network itself: through a self defining logic, Tau is able to detect consensus among its users from block to block, altering its own rules to conform to the choices of the user base.
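A toy sketch of the 'proposing is voting' idea above; the threshold, names and propositions are my own illustrative assumptions, not Tau's actual consensus mechanism:

```python
from collections import Counter

def detect_consensus(positions, threshold=0.5):
    """Return propositions affirmed by more than `threshold` of participants.

    positions: dict mapping each participant to the set of propositions
    they have stated; stating a proposition counts as a vote for it.
    """
    counts = Counter(p for props in positions.values() for p in props)
    total = len(positions)
    return {p for p, c in counts.items() if c / total > threshold}

positions = {
    "alice": {"dogs must be leashed in public", "lower the speed limit"},
    "bob":   {"dogs must be leashed in public"},
    "carol": {"dogs must be leashed in public", "teach programming in schools"},
}
print(detect_consensus(positions))  # {'dogs must be leashed in public'}
```

Every stated opinion participates in the tally; no one had to frame a ballot question first, which is the contrast with Tezos-style voting drawn earlier.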
The benefits of scaling discussions are not limited to just a more efficient form of social governance. Logic isn't merely about detecting surface-level consensus; the network can easily form further deductions from everyone's propositions. If one states 'all men are mortal' and 'Socrates is a man', one can deduce that 'Socrates is mortal.' But deductions can be very deep and non-trivial. Imagine if we had a group of 1000 mathematicians all inputting their mathematical insight as propositions. Tau can rapidly detect who agrees with whom on what, and deduce every logical consequence of their combined wisdom; in effect arriving at new truths and insights. In other words, Tau greatly accelerates the production of new knowledge. This will, of course, also work if you have physicists, doctors, engineers, computer scientists, indeed experts in every field working together on the platform. By scaling collaborative discussions in a logical network, Tau is able to scale the creation of knowledge.
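The Socrates example is ordinary forward chaining, which can be sketched in a few lines; this is my own illustration of the principle, not Tau's TML:

```python
def forward_chain(facts, rules):
    # rules: list of (frozenset of premises, conclusion);
    # repeatedly apply modus ponens until no new facts appear
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

facts = {"man(socrates)"}
rules = [
    # 'all men are mortal', instantiated for socrates
    (frozenset({"man(socrates)"}), "mortal(socrates)"),
]
print(forward_chain(facts, rules))  # contains 'mortal(socrates)'
```

Chained over thousands of participants' propositions, the same closure process is what would surface the deep, non-trivial consequences described above.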
When Tau comes into effect, any company, government, and indeed any organization not using this new network will be rendered obsolete. Tau aims to become an indispensable technology.
And this is only the alpha of Tau.
I will talk about the beta in a future post. The beta will revolve around not just the scaling of discussions and consensus, but the automation and execution of code based on the results of those discussions. For more information on code synthesis and more, please read Ohad's blog. Also, do check out my introduction to Tau here if you missed it.
You can invest in Tau through buying Agoras tokens on Bittrex.
I am not affiliated or paid by the project. These represent my own subjective views. Tau/Agoras is the only other crypto project apart from Steem in which I see an extraordinary future, and I am merely sharing that with fellow Steemians here.
Ohad Asor's New Tau Blog
IRC Chat: Where you may ask Ohad himself technical questions
Tau Chinese QQ Group: 203884141
The liquid paradigm, feedback loops, the virtuous cycle and Tauchain. By Dana Edwards. Posted on Steemit. December 31, 2017.
What do I mean by the concept of "liquid platform"? This is merely a re-articulation of the concept of self amendment and self definition. In other words it is very much like an autopoietic design. Bruce Lee once said to "be like water", and the reason is that water can adapt to any environment it is placed in by taking the form of the container it is put into.
So by liquid paradigm I mean that the core feature of true next-generation platform design is going to be a focus on maximum adaptability.
Feedback loops and the virtuous cycle
How can we have a platform which promotes continuous self improvement? If you have a platform with no hard coded "self" then even the design of the platform is under constant negotiation and creation. This is key because it means Tauchain will be able to adapt quicker than all other competing platforms. Quicker than Tezos because Tezos merely provides self amendment but lacks the virtuous cycle, the meta language, etc.
The Tau Meta Language allows for self definition at the level of languages. This means even the communication mechanism between humans and machines can be updated continuously. This continuous updating is the key design breakthrough of Tauchain because it means Tauchain will always be state of the art in any area. Think of a platform like Wikipedia where anyone can update any part of it in real time continuously so that every part of it is always the state of the art.
Starting at languages, a feedback loop can be created between humans and intelligent machines. Humans must make decisions on how to design Tau. These design decisions benefit from the virtuous cycle because the feedback loop between humans and machines allows the decision-making ability itself to be upgraded. This could even allow humans to transcend traditional human capabilities by relying on intelligent machines to assist in design: better decision making means better future designs, which means better decision making, and so on. This is the "virtuous cycle", a feedback loop running from humans to machines to humans to machines, over and over. The humans improve the quality of the machines by feeding in knowledge and new algorithms - just enough for the machines to become intelligent enough to help the humans help the machines even more efficiently in the next iteration of Tauchain.
Humans and machines will seek more good and less bad for the formal specification of Tau itself. Good and bad designs will be defined collaboratively by the human participants by way of intelligent discussion. As discussion scales, bigger crowds mean more human minds involved, which means improved design, which eventually leads to a better and perhaps wiser Tau, which of course leads to wiser, even more intelligent discussions, which can lead to an improved formal specification, and to a better Tau. So that is a loop. It is also a loop between improving Tau and improving society, over and over.
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.