The Paradigm of Social Dispersed Computing and the Utility of Agoras. By Dana Edwards. Posted on Steemit. October 12, 2018.
Social Dispersed Computing
What is social dispersed computing? It is an edge-oriented computing paradigm which goes beyond cloud and fog computing. To understand social dispersed computing we first have to discuss dispersed computing and how it differs from the previous paradigm of cloud and fog computing. The current trend toward decentralized networks, which we first saw with peer-to-peer technologies such as Napster, Limewire, BitTorrent, and later with Bitcoin, has brought us an opportunity to conceive of new paradigms. The original model most people are familiar with is the client-server model, which was very much limited in that the server was always vulnerable to DDoS attack. The client-server model has never been and could likely never be censorship resistant.
In the client-server model the server could simply shut down, as was the case with Bitconnect, or it could be raided. The server could also be taken down by attackers who simply flood the site with requests. From the problems the client-server model presented we discovered the utility of the peer-to-peer model. The peer-to-peer model was all about censorship resistance and promoted a network with no single point of failure (single point of attack) which could result in the shutdown of access points to the information. Among the first applications for these peer-to-peer networks were file sharing networks and networks such as Freenet and Tor. This of course eventually evolved into Bitcoin, which ultimately led to the development of Steem.
In dispersed computing a concept is introduced called "Networked Computation Points". An NCP can execute a function in support of user applications. To elaborate further I'll offer an example below.
Consider that every component in a network is a node. Now consider that every component node is an NCP, in that it can execute some function to support some user application. If we think of a blockchain, for example, then we know mining would fit into this category because a miner is both a node in the network and can execute a function in support of Bitcoin transactions. Why is any of this important? Parallelism is something we can gain from dispersed computing, and please note that it is distinct from concurrent computing. When we rely on parallelism we can reap performance benefits when executing code by breaking it up into many small tasks which can be performed across many CPUs.
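As a loose illustration (my own sketch, not tied to any particular blockchain), here is how a workload can be broken into many small tasks and dispersed across CPUs in Python; the `cost` function and the task list are invented for the example:

```python
from multiprocessing import Pool

def cost(n):
    """A small CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [10_000] * 8

    # Serial: a single CPU works through every task in turn.
    serial = [cost(n) for n in tasks]

    # Parallel: the same tasks dispatched across a pool of worker
    # processes - each worker playing the role of an NCP executing
    # a function in support of the application.
    with Pool() as pool:
        parallel = pool.map(cost, tasks)

    assert serial == parallel  # same answers, computed in parallel
```

The point is only that the tasks are independent, so adding more workers (more NCPs) scales throughput without changing the result.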
EOS attempts to leverage parallelism specifically to enable its performance boost. The benefit is speed and flexibility. Think for example of the hardware side, with FPGAs which can perform tasks similar to a microprocessor. FPGAs, unlike ASICs, provide generalized, flexible parallel computing. Consider that, just like with mining, a company could add more and more FPGAs to scale an application as needed.
To understand Social Dispersed Computing we have to note that there are other users at any given time. The other users in the network participate to provide resources for the benefit of other users whilst using the network themselves. So in Steem, for example, as you add content you are adding value to Steem in a direct way, but also in a dynamic way. The resources on Steem can also adapt dynamically to demand, provided that the incentive mechanism (Resource Credits) works as intended.
EOS as an example DOSC (Dispersed Operating System Computer)
Because EOS seems to be the first to approach this holistically I will give credit to the EOS network for pioneering dispersed computing in the crypto space. All resources are representable by tokenization in a dispersed computing network. EOS and even Steem have this. Steem has it in the form of "Resource Credits" which represent the available resources on the Steem network. If more resources are needed then theoretically the resource credits could act as an incentive to provide these resources to the Steem network. This provides a permanent price floor to Steem represented as the amount of Steem which would have to be purchased in order to have enough resources to run Steem (if I have the correct theoretical understanding). This would put Steem on a trajectory toward dispersed computing.
Operating systems typically sit between the hardware and software as a sort of abstraction layer. This traditionally has been valuable because programmers don't have to speak directly to the hardware and hardware designers don't have to communicate their designs directly to programmers. In essence the operating system in the traditional model is centralized and made by a company such as Microsoft or Apple. This centralized operating system typically runs on a device or set of devices and provides some standard services such as email, a web browser, and maybe even a Bitcoin wallet.
Typically the most valuable or highest-utility software people consider on a computer is the operating system. On our smartphones this is Android OS and on PCs it may be Windows or Linux. This is of course turned on its head under the new paradigm of dispersed computing and the new conceptual model of the "decentralized" operating system. EOS is the first to attempt a decentralized operating system using current blockchain technology, but upcoming technology easily eclipses what EOS could do. Tauchain is a technology which, if successful, will leave EOS in the stone age in terms of what it will be able to do. EOS, while ambitious, has also had its problems with regard to the voting mechanisms and the ease with which collusion can take place.
To better understand how decentralized operating systems emerge, consider OSKit and TML:
If we look at OSKit we see that it provides the tools necessary for operating system development. If we look at Tauchain we realize that strategically the most important tool for the development of a decentralized operating system is being provided in the form of TML (a partial evaluator). If we think of the primary tool necessary to build upon, we initially have to start with a compiler. A compiler generator is more like what TML allows with its partial evaluator. More specifically it is the technique of Futamura projection which can provide the ability to generate compilers.
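To make the partial-evaluation idea concrete, here is a toy sketch in Python (my own illustration, nothing to do with TML's actual machinery): specializing a general two-argument program on one known input yields a faster residual one-argument program. That is the intuition behind the first Futamura projection, where specializing an interpreter to a source program yields a compiled program:

```python
def power(base, exp):
    """The general program: both inputs unknown at specialization time."""
    result = 1
    for _ in range(exp):
        result *= base
    return result

def specialize_power(exp):
    """A hand-rolled partial evaluator: fix `exp`, unroll the loop,
    and emit a residual program of one remaining argument."""
    body = " * ".join(["base"] * exp) if exp > 0 else "1"
    return eval(f"lambda base: {body}")

cube = specialize_power(3)       # residual program: base * base * base
assert cube(5) == power(5, 3) == 125
```

The residual program has no loop left in it; the known input was evaluated away. A partial evaluator that can do this to an interpreter is, in effect, a compiler generator.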
If we look at the next most important part of an operating system it is typically the kernel. Let's have a look at what an exokernel is:
Operating systems generally present hardware resources to applications through high-level abstractions such as (virtual) file systems. The idea behind exokernels is to force as few abstractions as possible on application developers, enabling them to make as many decisions as possible about hardware abstractions. Exokernels are tiny, since functionality is limited to ensuring protection and multiplexing of resources, which is considerably simpler than conventional microkernels' implementation of message passing and monolithic kernels' implementation of high-level abstractions.
From this at minimum we can see that an exokernel is a more efficient and direct way for programmers to communicate with hardware. To be more specific, "programs" communicate with hardware directly by way of an exokernel. We know the most basic function of a kernel in an operating system is the management of resources. We know in a decentralized context that tokenization allows for incentives for management of resources. When we combine them we get kernel+tokenization to produce an elementary foundation of an operating system. In a distributed context we could apply a decentralized operating system in such a way that the network could be treated as a unified computer.
Abstraction is still important by the way. In an operating system we know the object oriented way of abstraction. Typically the programmer works with the concept of objects. In an "Application Operating Environment" an "Application Object" can be another useful abstraction. Abstraction can of course be taken further but that is for another blog post.
The Utility of Agoras
Agoras+TML is interesting. Agoras is the resource management component of what may evolve into the Tau Operating System. This Tau Operating System, or TOS, is something which would be vastly superior to EOS or anything else out there because of the unique abilities of Agoras. The main abilities have been announced on the website, such as the knowledge exchange (knowledge market) where humans and machines alike can contribute knowledge to the network in exchange for token rewards. We also know that Agoras will have a more direct resource contribution incentive in the form of the AGRS token, so as to facilitate the sale or trade of storage, bandwidth or computation resources.
The possible (likely?) emergence of the Tau Operating System
In order for Tauchain to evolve into a Dispersed Operating System Computer it will need an equivalent to a kernel: some means of allowing whoever is responsible for the Tauchain network to control and manage the resources of that network. If, for example, the users decide, then by way of discussion there would be a formal specification or model of a future iteration of the Tauchain network. According to current documents this is what would produce the requirements for the Beta version of the network to apply program synthesis. Program synthesis in essence could result in a kernel, and from there the components of a Tau Operating System could be synthesized in the same way. Just remember that all I write is purely speculative, as we have no way to predict with certainty the direction the community will take during the alpha.
Tauchain and the privacy question (benefits of secret contracts and private knowledge). By Dana Edwards. Posted on Steemit. August 21, 2018.
As we can see from the current trend in crypto, there is now a move toward privacy. Most people, in my opinion, underestimate the utility of these cryptographic advances. In this blog post I will highlight a particular advance enabled by new cryptographic techniques (and hardware techniques such as trusted execution environments) which can be of massive benefit to the long-term believers in Tauchain.
The problem: Anyone can copy the code Ohad writes if it's open source
So we have a problem with Tauchain: all of the code Ohad is writing with regard to TML is open source and on Github. This allows a competitor to simply steal his best ideas and, in a sense, rob the token holders who actually funded the development of the code. This happens very often: we see a new innovation in the crypto space and soon after we see a new ICO or a new group come out of nowhere acting as if they originated the technology. In some cases the new group may even be much more centralized, more secretive, and very well funded.
The solution: Secret contracts (private source code and execution)
The trusted execution environment allows for the protection of intellectual property rights at the hardware level, while sMPC (secure multiparty computation) can achieve similar ends at the software level. The idea is that this provides a solution to idea theft: a community can keep certain critical pieces of code, data, algorithms, or other unique features secret. This creates an entirely new way to monetize knowledge, code, and ideas, which Agoras will be uniquely positioned to leverage.
Guy Zyskind of the Enigma Project provides the definition of what secret contracts are and how they work. The Enigma Project deserves credit for introducing this technology and for identifying a major problem in the cryptospace. Traditionally on Ethereum and all other current platforms, when you release a DApp your code has to be open source. It is not possible to create a closed or private source decentralized app. In addition the app has to be executed in the open, so all data running through it is public.
Strategic implementation of private knowledge and source code can allow Tauchain to maintain a dominant position
In most cases the world benefits if knowledge is shared. In fact I'm in favor, most of the time, of sharing as much knowledge as is safe. The problem with algorithms, source code, and certain kinds of knowledge is that sharing them provides a competitive advantage to people who have more financial resources. These individuals can simply look at Github and copy. They can hire programmers to compete with Tauchain and Agoras developers, and as long as the code is open there will be no real reason to buy the Agoras token long term.
What if the Tauchain development team and Agoras developers decide to implement private knowledge bases? What if it becomes possible to run code in a trusted execution environment so that other developers around the world cannot see the code or the algorithms? This would allow Tauchain to build Agoras in such a way that no other project would be capable of duplicating it. This would lock the value backed by the community's brainpower into the Agoras token, making it a true knowledge token which cannot simply be copied with ease by another project.
In fact this is a strategy that developers making apps with Enigma's Secret Contracts are looking into as we speak. This competitive advantage of secrecy will change the landscape of the cryptospace. What does this enable for Agoras? Imagine an encrypted Github which developers can contribute to, but only those developers can see the code. Imagine that after the code is written, no one else can see it if it is set to run privately. This would allow developers to code in secret and have the code run on computers without anyone knowing what the code is.
This can open up security vulnerabilities, but Tauchain can defend against these. In particular it matters what is private and what is public. Proprietary aspects can be kept private while security-critical areas can always be kept public. There may even be ways to prove that the code doesn't behave in a certain way without actually sharing the code (using advanced cryptography). In fact my favored way of implementing this feature would be to time-lock the release of the source code by a number of months or years.
The idea isn't to keep things closed or secret forever. Privacy is about access control and about keeping things secret long enough to maintain a competitive advantage. A time delay to unlock the source code, for example, could work. It is even possible to use puzzle-based time-lock encryption so that the community has to mine to get the source code released early (if there is a serious need or threat). In this way all secret blocks of code would be unlockable, but not for free, and this would make it less likely that the community will seek to unlock them unless there is a genuine reason (beyond just stealing ideas).
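As a toy sketch of how such puzzle-based time-lock encryption can work, here is the repeated-squaring construction in the style of Rivest, Shamir and Wagner, in Python with deliberately tiny made-up parameters (a real deployment would use large random primes and a squaring count calibrated to the desired delay):

```python
# Toy time-lock puzzle (Rivest-Shamir-Wagner style), tiny parameters.
p, q = 10007, 10009          # secret trapdoor primes (far too small in practice)
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # trapdoor: lets the creator shortcut the puzzle
t = 5_000                    # number of sequential squarings (the "time delay")

# Creator (knows phi): derives the unlock key with two fast modular
# exponentiations, since 2^(2^t) mod n == 2^(2^t mod phi) mod n.
key_fast = pow(2, pow(2, t, phi), n)

# Community (no trapdoor): must perform t squarings one after another;
# the work is inherently sequential, which is what enforces the delay.
x = 2
for _ in range(t):
    x = (x * x) % n
key_slow = x

assert key_fast == key_slow  # both sides derive the same unlock key
```

The derived key would then encrypt the source code: anyone can eventually grind through the squarings to unlock it early, but only at a real, tunable cost, which matches the "unlockable, but not for free" property described above.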
What do you think about these ideas? If you agree or disagree, comment below. Strategic IP (intellectual property) is used by major corporations to give themselves a competitive advantage. The crypto community can do the same thing in ways the legal mechanisms can't. In fact it can be done in a fairer and better way, because often the people or companies awarded IP rights aren't the actual inventors. A knowledge economy is fantastic, but if the knowledge is just harvested by big corporations monitoring the wide-open network then it's going to be hard to bring value to a knowledge token.
UPDATE: Many people ask where to buy Agoras. The problem is it's not widely available on centralized exchanges. The only exchange I know that has it is Bitshares. So if anyone really wants to buy Agoras (AGRS) which is the token of discussion in this post feel free to buy it at:
42 million intermediate tokens total. The current price is 0.00010700 BTC, which is around 70 cents. This is the cheapest I've seen it in a while, because for a long time it was in the $1.30 to $1.50 range. This is a very speculative token at this time, so buy at your own risk, as I'm not providing any financial advice. I'm a holder of this token of course and have been for years.
“We are moving into an era where cities will matter more than states and supply chains will be a more important source of power than militaries — whose main purpose will be to protect supply chains rather than borders. Competitive connectivity is the arms race of the 21st century.”
-- Parag Khanna
A network is made of lines and switches, right?
Much has been said about network scaling effects, including attempts of my own [4-12] ... which compels me to introduce the not-so-frivolous notion of network forces.
These forces are expressed in several laws. I initially thought to write 'forces' and 'laws' in scare quotes here, but I realize they are quite objective and physical emergenta, indeed.
In my ''Geodesic by Tauchain'' article of a couple of months ago I emphasized the Huber-Hettinga Law: how the cost of switching literally defines the 'orographic' topology of a network.
The cheaper the routing - the flatter the network.
Expensive switches = hierarchy, verticality, power, control, obedience, centralization, 'world is fiat', sollen, hence borders instead of bridges, limitations not stimuli, exclusivity ...
Cheap switching = geodesic society , 'world is flat', horizontality, p2p, decentralization, inclusivity ...
The more vertically centralized a network is, the more it must deplete information - omitting, ignoring calls from the deeps, or even actively suppressing or silencing nodes. Coping with the stream by strangling it. Simply due to lesser capacity, fewer degrees of freedom. Geodesic networks possess higher entropy and therefore are richer. They bolster higher both Scrooge and Spawn factors. In other words:
The flatter the network - the richer  it is.
Maybe that explains why the wealthiest-healthiest societies tend to be those with the greatest economic-political freedom.
Naturally the Huber-Hettinga Law led me to the elementary-watson conclusion of the power and value of Tau as the ultimate über-switch. So far so good.
Now let's stare at the Lines. Here comes Nick Szabo.
Nick Szabo - a lawyer AND computer scientist - is a legendary figure from the great 'Archaic era of crypto', the 1990s, when he, together with other cypherpunk titans like Tim May, Wei Dai, Bob Hettinga etc., poured in staggering detail the very bedrock foundations of what we now enjoy as Crypto in the post-Satoshi era.
It is THEIR vision come true that we all now live in.
Bitcoin was a detonation of precisely that critical mass of fused thoughts of these very smart people, piled up and compressed by the connective network forces of the early internet.
No, I do not mean Szabo's most famous thing at all - the 1994 coining of the term 'smart contracts'. In fact I deeply and strongly reject the very notion of 'smart contracts' as utter nonsense, even as an oxymoron - which is a yuge separate problem, one I suspect I have nailed, and which I'll address in a series of dedicated articles starting in the upcoming weeks...
I mean something much more valuable, what I call the Szabo Law.
When we hear the phrase 'network effects' the first thing that comes to mind is the famous Metcalfe law.
''Metcalfe's Law is related to the fact that the number of unique connections in a network of a number of nodes (n) can be expressed mathematically as the triangular number n(n − 1)/2, which is proportional to n^2 asymptotically (that is, an element of O(n^2)).''
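That formula is easy to sanity-check numerically; a quick illustrative snippet of my own:

```python
def unique_links(n):
    """Unique connections among n nodes: the triangular number n(n-1)/2."""
    return n * (n - 1) // 2

assert unique_links(4) == 6          # 4 nodes -> 6 distinct pairs

# Doubling the node count roughly quadruples the connections,
# which is the O(n^2) behaviour Metcalfe's Law leans on:
ratio = unique_links(200) / unique_links(100)
assert abs(ratio - 4) < 0.05         # 19900 / 4950, just over 4
```

The "roughly" matters: the ratio approaches exactly 4 only as n grows large, which is what "proportional to n^2 asymptotically" means.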
In order of appearance, these network-force laws quantitatively capture the basic properties of a network:
- Huber-Hettinga Law - the cost of switches and routing.
- Metcalfe Law - the number of nodes, i.e. switches, defining the number of unique connections or lines.
- Szabo Law - the cost of the lines and connecting.
All these Laws are scaling laws. Before we come back to and continue on the Szabo Law, we have to briefly mention another one:
''So what is “scaling”? In its most elemental form, it simply refers to how systems respond when their sizes change. What happens to cities or companies if their sizes are doubled? What happens to buildings, airplanes, economies, or animals if they are halved? Do cities that are twice as large have approximately twice as many roads and produce double the number of patents? Should the profits of a company twice the size of another company double? Does an animal that is half the mass of another animal require half as much food?''

... With Dirk Helbing (a physicist, now at ETH Zurich) and his student Christian Kuhnert, and later with Luis Bettencourt (a Los Alamos physicist now an SFI Professor), Jose Lobo (an economist, now at ASU), and Debbie Strumsky (UNC-Charlotte), we discovered that cities, like organisms, do indeed exhibit “universal” power law scaling, but with some crucial differences from biological systems. Infrastructural measures, such as numbers of gas stations and lengths of roads and electrical cables, all scale sublinearly with city population size, manifesting economies of scale with a common exponent around 0.85 (rather than the 0.75 observed in biology). More significantly, however, was the emergence of a new phenomenon not observed in biology, namely, superlinear scaling: socioeconomic quantities involving human interaction, such as wages, patents, AIDS cases, and violent crime all scale with a common exponent around 1.15. Thus, on a per capita basis, human interaction metrics (which encompass innovation and wealth creation) systematically increase with city size while, to the same degree, infrastructural metrics manifest increasing savings. Put slightly differently: with every doubling of city size, whether from 20,000 to 40,000 people or 2M to 4M people, socioeconomic quantities – the good, the bad, and the ugly – increase by approximately 15% per person with a concomitant 15% savings on all city infrastructure-related costs.
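The power-law claim in that passage can be checked with a few lines of my own Python, using the exponents from the quote. (Note that, strictly, doubling at an exponent of 1.15 gives a per-capita factor of 2^0.15 ≈ 1.11; the quoted "approximately 15%" tracks the 0.15 in the exponent rather than the exact doubling factor.)

```python
def scaled(y0, n, beta):
    """Power-law scaling: Y = y0 * N**beta."""
    return y0 * n ** beta

# Doubling a city from 1M to 2M people:
# Superlinear socioeconomic output (beta ~ 1.15): more per person.
gain = scaled(1, 2_000_000, 1.15) / (2 * scaled(1, 1_000_000, 1.15))
assert abs(gain - 2 ** 0.15) < 1e-9   # about 11% more per capita

# Sublinear infrastructure (beta ~ 0.85): less needed per person.
need = scaled(1, 2_000_000, 0.85) / (2 * scaled(1, 1_000_000, 0.85))
assert abs(need - 2 ** -0.15) < 1e-9  # about 10% saved per capita
```

The key structural point survives either way: the superlinear gain and the sublinear saving are mirror images, both governed by the same 0.15 offset from linearity.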
Which probably comes to denote the sheer size of the network in STEM (space, time, energy, mass). I'm not sure, but I have some strong suspicions about the unity of matter, structure and action which I will expose and share some other time.
What I call Szabo's Law is revealed in his ''Transportation, divergence, and the industrial revolution'' (Thu, Oct 16, 2014): similarly to Metcalfe's (''double the population, quadruple the economy''), there is a power-law correlation between the cost of connections, or links, or lines ... and the value of the network, too:
''Metcalfe's Law states that a value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is an inverse square of the cost per mile of transportation. Combine this with Metcalfe's Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network is the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith's observations: the division of labor (and thus value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he extensively discussed in his Wealth of Nations).''
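The arithmetic of that passage can be written down directly; a minimal sketch of my own, with made-up constants:

```python
def potential_network_value(transport_cost, k=1.0):
    """Szabo's chain of reasoning in one function:
    economically accessible nodes ~ (k / cost)**2 (an area effect),
    and by Metcalfe the value ~ nodes**2, so value ~ cost**-4."""
    accessible_nodes = (k / transport_cost) ** 2
    return accessible_nodes ** 2

# Halving transport cost multiplies potential value by 2**4 = 16,
# exactly the "dramatic but solid" conclusion in the quote.
assert potential_network_value(0.5) / potential_network_value(1.0) == 16.0
```

As Szabo notes, the exact exponent of 4.0 is an upper bound (redundancies eat into it), but the fourth-power form is what makes small transport-cost reductions so radically nonlinear in effect.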
My encounter with this article of Nick Szabo's was a goosebumps experience for me, because it coincided with a series of lay rants of mine on the old Zennet IRC chat room of Tau that ''computation = communication = transportation''. Somewhere in 2016, as far as I remember. :)
Maybe it was the last drop needed to shape my conviction that by my dedicated involvement in both Tau and ET3 I'm actually working for ... one and the same project.
For communication, computation and transportation are all modes of state change. Because information is a verb, not a noun. And software is states of hardware.
''Decentralizing the internet is possible only with decentralized physical infrastructure.'' 
Just like the brain is a network computer of neuron nanocomputers, the emergent composite we colloquially call humanity or mankind or economy or society or world ... is a network computer made of all of us billions of humans.
Brains do thought, economies do wealth.
Integrated circuitry upon the face of planet Earth as a motherboard. Literally. Humanity's planet-hardware. Parag Khanna's Connectography explained.
The Earth is definitely not our ultimate chip carrier. Probably there ain't any limit at all to our culture-upon-nature hardware upgrades. The universe is our computronium, and we've been here for too short a time and haven't seen far enough. Networking is connectomics. And thus it always also is metabolomics.
Remember my last month's  ''Tauchain the Hanson Engine''?
The series of exponentially shortened growth doubling times looks like it is driven by transportation technological singularities: domestication of the horse, oceanic navigation, the combustion engine ...
In the light of all the net forces summoned above: The planet Earth viewed as a giant computer chip ...
- is itself subject to the relentless network entropic force of Moore's law
The network forces accelerate what that wealth computer does.
Two quick examples:
A.: The $1500 sandwich, as proof that trade+production is at least thousands of times stronger at sandwich-making than production alone.
B.: The example of Eric Beinhocker in his 2006 ''The Origin of Wealth'' about two contemporary tribes: the Amazonian Yanomami, a stone-age population today, and the Eastcoastian Manhattanites. The former are only about 100 times poorer, but the latter enjoy a billions-of-times bigger choice of things to have.
Tauchain 'threatens' to affect the parameters of ALL the network-force formulae mentioned herewith on a mind-bogglingly big scale.
Simultaneously, orders of magnitude:
- lower switch cost
- higher node count
- lower connection cost
A wealth hypercane recipe. A perfect value storm. The future ain't what it used to be.
''Tau solves the problems from the Tower of Babel to the Tower of Basel''
- an early 21st century yet undisclosable author
Okay, dearest friends, let's roll our sleeves up and get started. Vivisection of the Scriptures? Revelation by transfiguration? Pulling the Tau from the ocean of wisdom out onto the dry no-Maths-land? I hope not.
The quote above at first glance sounds so pompously biblical, but in fact it denotes the crystal-clear, simple, practical and mundane rationale of Tau, which I have already tried to approach from a few angles.
It is about the hierarchic bottleneck of one unscaling Humanity. Take the hint about the leveling of the Towers as a poetic symbol of the elimination of social 'verticality' -- the hierarchies as a so-far-necessary evil to compensate certain innate neurological limitations -- and the reforming of the network we are embedded into, and usually call mankind or society or economy or world, into one as geodesic as possibly possible. For the sake of its own functional programmatic optimization.
Notice that the towers' leveling is not by demolition, but by uplifting the overall landscape level to and above the tower tops, turning them into deep roots or support pylons of an asymptotically geodesic society.
Apparently, mentioning the Gate of God denotes the unmixing of languages, and mentioning the apex global fiat settlement institution the excelling of the current fiat procrustics, i.e. the economy aspect.
That is: TML to Agoras. The first and last of the six identified aspects or steps of social choice as addressed by what we call Tau.
''our six steps of language, knowledge, discussion, collaboration, choice, and knowledge economy''
These aspects deserve, of course, separate zoom-in exegetic chapters, and they'll definitely get them. I promise. And not only they.
Any exegesis of Tau must unavoidably start with scrolling back and tracking down the full history of the development so far. A zoom out to see the full picture and to identify the dominant features of the landscape relief.
You, I reckon, have already noticed this retrodictive inclination of mine: in my mind the notion of a ''Timeline of Development'' cannot by any logic be just a handful of milestone promises thrown into the future; it must account for the trajectory up to now, too! No future without past.
It all started as Zennet, continued as Tau-chains, and 'turned' into the so-called 'newtau'.
Wait! A New Tau?
Excuse me, Ohad, but I personally do not buy that, and I have said it many times. There ain't an old and a new Tau. The situation is much more straightforward and grokkable. Here it is:
Lotsa guts, balls, butt, brains or whatever human offal ... is required for each of us to admit a mistake made in our everyday life. Generally quite some strength is needed even to look at ourselves in the mirror...
It takes a whole Ohad, though, to keep all of one's work totally public and transparent - even the full and unedited live record of the infiltration into an entire branch of mathematics - and then to throw it all away as untauful. We witnessed that, reported in real time!
Did this change the ends? No. But sorted out the means to an end.
Was it a 'mistake'? In no case. It was duly delivered R&D effort.
Was oldtau looking promising at first glance? Yes, of course it did.
Did it survive the Ohad's R&D 'crash-testing'? No, it didn't.
Was it ''juice worth the squeeze''? It was.
Was it a job well done? Absolutely.
The oldtau materials are legacy jewels for me. Like those dinosaur bugs trapped in blobs of amber.
Development is a process, not just the shipping of results. Related like cooking and serving.
Studying the zoom-out dev map we observe these few major landmarks:
The Zennet province is all right. Its gently rolling hills gradually merge into the Tau lands proper with the inevitable realization that a 'world supercomputer' cannot be a Tauless thing. Zennet lives on in Tau:
''... having a decentralized search engine requires Zennet-like capabilities, the ability to fairly rent (and rent-out) computational resources, under acceptable risk in the user's terms (as a function of cost). Our knowledge market will surely require such capabilities, and is therefore one of the three main ingredients of Agoras... hardware rent market...''
We move on through the oldtau wastelands, where the burnt ruins of MLTT lie scattered - a rough oldtau location-on-the-map indicator is the fall of 2015, with
''Tau as a Generalized Blockchain'' - posted Oct 17, 2015, 6:33 AM [updated Oct 17, 2015, 6:49 AM]
and then we reach the fertile gardens of newtau  in the fall of 2017:
''The New Tau'' - posted Dec 31, 2017, 12:27 AM [updated Dec 31, 2017, 12:28 AM]
Hmm. Apparently we crossed a watershed. Which relief feature was it? - The ridge of:
''Tau and the Crisis of Truth'' - posted Sep 10, 2016, 8:25 PM [updated Sep 10, 2016, 8:28 PM]
Tau sorts out the Towers. I hope the synopsis in this short chapter of Exegesis helped to sort out the Tau dev timeline as a navigation lookup tool.
Software is nothing but states of hardware. There is an intimate, deep connection, not yet codified into a neat compact body of logic, between Gödel, Heisenberg and the laws of thermodynamics.
Tau keeps us clear of these traps.
I do not dare to state that someday we won't have command over the infinities, playing with them with the ease of
''... a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.''
In fact, quite the opposite: I'd rather take it as an inevitability that someday we will conquer the Cantorian expanses and venture far beyond even that. To transcale the transfinite. As Hilbert said:
''Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können. (From the paradise, that Cantor created for us, no-one can expel us.)''
But it takes... finitary vehicles of DECIDABILITY to conquer the transfinitary outer spaces. Because, in order to dare to dream of taming the infinities, we must first harness and gain full command of the finite.
Including ourselves. Tau is ''understanding each other''. Without Tau we are... others to ourselves.
Imperare sibi maximum imperium est. (To rule oneself is the greatest empire.)
“A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”
― Robert A. Heinlein 
No, it is not a vow that everybody should be everything. It is a reflection of fundamental human fungibility. The average human can be taught to take on any human role. The exceptions, the true organic geniuses (those who are hard to replace) and the morons (those who are incapable of replacing others), only confirm this general rule of sheer numbers. This is what makes mankind so scalable.
''Know'' is synonymous with ''can''. Literally. Knowledge = technology. Even etymologically. Knowledge is praxis. Only. There is no such thing as impractical knowledge. If it is not a skill, it is not knowledge. I mentioned once that we're all AIs. Ref.: feral children.
We are not what we eat; we are what we've learnt. You are what you know/can do. And you can do what you have learnt. Learning is the taking side; teaching is the giving side. Of one and the same process. We do not seem to have a word for the modulus of learning/teaching. But it will come.
We are taught by others, by society. We are the cherry on top of a layer cake of culture upon nature. We learn by... living. We acquire skills in a plethora of contexts: family, street, school, job, media... Learning is not a monopoly of man; countless systems are also learners. Maybe one of the basic definitions of life and intelligence is the ability to learn. A giant topic, yes. We won't graze here into what learning is, but into how we learn.
Due to our neurological bottlenecks we spontaneously form hierarchies. This hinders our scalability by forcing humanity to be, more or less, a fractal of 5. We are close to a number of breakthroughs that will mitigate these innate limitations of ours in a number of ways. But the general case is not the subject of this article; herein we focus on HOW we are taught. How we acquire knowledge, and how this knowledge of ours gets recognized and utilized by society. The emergent hierarchic structuring is, of course, in full force upon us in teaching, as in everything else social.
So come education, exams, knowledge certification, certified skills application, verification of knowledge creation, job fitness testing, CVs and employer recommendations... etc., etc. With all the bugs, and the so few features, of this 'the map is not the territory' situation.
It is all centralized and hierarchic, exactly like the global fractal of double-entry accountancy ledgers which we call the fiat financial system. In fact it is so interwoven with fiat finance that it is almost inextricable from it. And just as inefficient and imprecise.
In all these years of talking and thinking about Tauchain, I have noticed, and this suspicion of mine has incrementally turned into sheer conviction, that Tau, the upscaler of humanity, is inevitably also the ultimate teaching machine. If education is the facilitation of learning, Tau is the maximizer of learning. By its very construction, it comes out so.
People talk and listen whenever, and about whatever, they want. Tau has unlimited capacity to listen, attend, remember, and answer, limited only by the hardware capacity allocated. Tau extracts meaning. It purifies the stream, distilling it down to its essence. It detects repetitions, contradictions, and all the other conversation bugs so ubiquitous nowadays. It remembers changes in an individual user's opinions, and points them out. It sounds like the best tool to know oneself. And for others to know you, if you let them.
Your Tau account, or profile, is what you know. You say what you say, and you also ask: statements and questions. Tau pools you together with the others who state the same things and, more importantly, who ask the same type of questions. Knowing what you know, and asking about what you don't know but want to know, maps not only your knowledge state but also your knowledge dynamics. It records, and drives, how your knowledge changes. You even have access to what you forget, and can recollect it. True real-time knowledge-state reporting, for the first time in human history.
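The pooling idea above can be sketched in a few lines. This is a purely illustrative toy, not Tau's actual logic-based machinery: it treats statements and questions as plain strings, pools users who ask the same question, and flags a contradiction only in the naive case where a proposition and its literal 'not '-prefixed negation are both asserted.

```python
from collections import defaultdict

def pool_users(profiles):
    """Group user ids by the questions they ask.

    `profiles` maps user -> {"states": set of propositions,
                             "asks": set of questions}.
    Returns question -> set of users asking it.
    """
    pools = defaultdict(set)
    for user, profile in profiles.items():
        for question in profile["asks"]:
            pools[question].add(user)
    return dict(pools)

def contradictions(states):
    """Flag propositions asserted together with their negation
    (naive string-level negation: a 'not ' prefix)."""
    return {s for s in states if "not " + s in states}

# Hypothetical users and utterances, for illustration only.
profiles = {
    "alice": {"states": {"tau is logic-based", "not tau is logic-based"},
              "asks": {"what is tml?"}},
    "bob":   {"states": {"agoras is a token"},
              "asks": {"what is tml?"}},
}

# alice and bob ask the same question, so they land in one pool.
print(pool_users(profiles)["what is tml?"])          # {'alice', 'bob'}
print(contradictions(profiles["alice"]["states"]))   # {'tau is logic-based'}
```

The real system would of course operate over formalized knowledge rather than string matching; the point is only the shape of the data: a per-user knowledge state, and pools induced by shared questions.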
If consciousness is, aside from the clinical state of being merely awake, the post-factum integration of sensorimotor experience, the Accountant of the mind, the speaker of the narrative which is you, then Tau is your consciousness booster. That is, stronger than thought.
The ultimate teaching, the ultimate fair testing or exam, the ultimate real-time comprehensive diploma or certificate, the super-peer-reviewed paper(s) of your academic career, the ultimate job interview AND the ultimate... job of working as yourself, with anything useful you create being instantly scarcifiable and monetizable: that is what your Tau account is! With all the rest of accessible society as your own workforce, and you as theirs. In the billions. In a move. In real time.
Including control over the pathways along which your skills grow, towards the learning directions most productive for you personally, because it helps you analyze your you-Tau history, apply knowledge-maximizer techniques, and participate profitably in the creation of newer, better ones. A maximizer of self. And a maximizer of society, making it consist of max-selves. Ever improving. A merger of education with work. Work-as-you-live.
The literal Knowledge Economy, as described by @trafalgar in his article from a few months ago. Where search, creation, reflection, certification, recognition, commercialization, accumulation, modification, improvement... everything of knowledge is all in one.
And it is not only Humans' and Tau's lonely job. I foresee the other Machines joining the party. Yes, I mean machines capable of having interests and of asking, and seeking answers to, palatable questions.
This education amplification, arriving by way of technology, has of course been anticipated by many. A few arbitrary examples:
- A distant rough-sketch hint of the inevitable tuition power of Tau is Neal Stephenson's ''The Diamond Age'', with its depicted ''Young Lady's Illustrated Primer'', an interactive networked teaching device.
- Or, if I'm right about the inevitable conquest of the natural-languages territory, a UX like in the film 'Her' (2013).
- Thomas Frey of the futurist DaVinci Institute paid special attention to this in his book ''Epiphany Z'': down the path of micro- and nano-education, an effective merger of the processes of education, diploma issuing, job application, examination, and the actual execution of job obligations. Tom does not know about Tau. But I'll tell him.
These examples come with a big smile of irony and self-irony, of course. They just pick, from here and there, proofs of the giant anticipation of what's to come. And they should be taken with a few big grains of salt, because the reality will be immensely more powerful.
Tutor, tuition: my emphasis on exactly this wording is meant to denote the economic side of learning/teaching. It is about the cost of learning (the association of tuition with fees), about the placement of acquired skills, about the business organization of both, about the protection of ownership and the security of transactions in knowledge... Let me introduce here a neologism to reflect the business side of it:
Scrooge Factor 
- Simply denoting the money-making power of a technology's use by a business. The 'money suction power' of a business entity or organization of any kind, derived from the application of a technology, if you will. Technology as socialized knowledge, scaled up over multiple humans, over a society. Of course the Scrooge Factor can pump in different directions. The Scrooge Factor of traditional hierarchic education, governance and everything else is apparently very often negative: hierarchies decapitalize, dissipate, waste. Orders of magnitude more wasteful than any PoW, but on this, some other time.
So, aside from all the niceties of the abstractions of the full supply and value chains of a Knowledge Economy, let's round up some numbers:
- We know that a true functional semantic search engine alone is worth $10T. Yeah. Trillions. As per the assessments of Davos WEF attendees from, as far as I remember, 2015 or 2016...
- Also, Bill Gates stated back in 2004: ''If you invent a breakthrough in artificial intelligence, so machines can learn, that is worth 10 Microsofts.''
- Tom Frey also argued that by 2030 the biggest corporation in the world will be an online school. Given the present-day size and growth rate of, say, Amazon, this 'online school' should be in the range of a good many trillions of market cap if it is to be bigger than today's biggest corporations. But we do not need such analogies upon analogies to assess the scale: the sheer size of the global education industry is the most eloquent indicator. Note that Tom talks about a 'corporation', i.e. a clumsy and inefficient hierarchic human collective, not about a system which does this orders of magnitude more efficiently and powerfully by being intrinsically P2P, i.e. geodesic. Even the best futurologists can be forgiven for failing to predict Tau. :)
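To put rough numbers on that 'good many trillions', here is a back-of-the-envelope compound-growth sketch. The starting cap (roughly $1T, Amazon's late-2018 ballpark) and the 20%/year growth rate are illustrative assumptions of mine, not figures from Frey:

```python
def project_marketcap(start_cap, annual_growth, years):
    """Compound a starting market cap forward by a fixed annual growth rate."""
    return start_cap * (1 + annual_growth) ** years

# Assumed inputs: ~$1T starting cap, 20%/yr growth, 2018 -> 2030 (12 years).
projected = project_marketcap(1.0e12, 0.20, 12)
print(f"${projected / 1e12:.1f}T")  # prints $8.9T
```

So even a single today's-biggest-corporation benchmark, compounded for a decade, lands in the high single-digit trillions, which is the scale the 'online school' claim implies.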
And this mind-boggling hail of trillions does not even account for the Hanson Engine factor.
Tau the Tutor ex Machina is just another unintended useful consequence of the overall design.
It is nearly impossible to track and contemplate exactly what all these 'side effects' will be and how they will synergetically boost each other.
With my articles I intend only to touch some lines of the immense phase space of the possibilia, with no ambition of covering it all, and with none of this representing any form of advice.
The future is incompressible. Compression is comprehension. The future is comprehensible only by living it.
Failure to take the geodesic way of learning will turn these beautiful but chilling words into prophecy:
"The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age." H.P. Lovecraft (1926)
Tauchain and the conflict of law.
Image illustrated by @capitanart
The sole aim of this post is to put the brain to work by sharing certain reflections and points of view.
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.