If Money = Memory, if Society = a Super Computer, if Computation is in Physical Systems, what is a Decentralized Operating System? By Dana Edwards. Posted on Steemit. October 24, 2018.
These concepts are not often discussed, so let's have the discussion from the beginning. The first concept to think about is pancomputationalism, or put another way, the ubiquitous computers which exist everywhere in our environment. We can, for example, look at physical systems, living and non-living, and see computations taking place all around us. If you look at rocks and trees you can see memory storage. If you look at DNA you can see code, and if you look at viruses you can see microscopic programmers adding new code to DNA. Even the weather, such as a hurricane, is computing.
If you look at nature you see algorithms. You will also see learners (yes, the same as in AI) in nature. The process is basically the same for all learning. Consider that everything which is physical is also digital. Consider that the universe is merely information patterns.
If we look at society we can also think of society as a computer. What does society compute, though? One way people talk about a society is as a complex adaptive system, but this is also how people might talk about the human body. The human body computes with the purpose of maintaining homeostasis: to persist through time and reproduce copies of itself. The human brain computes to promote the survival of the human body. Just as viruses pass codes into our DNA, the human brain is infected with mind viruses called memes. Memes are pieces of information which can physically alter how the brain works.
The mind isn't limited to the brain. The mind is all the resources the brain can leverage to compute. In other words a person has a brain to compute with but when language was invented this allowed a person to compute not just using their own brain but using the environment itself. To draw on a cave is to use the cave to enhance the memory of the brain. To use mathematics is to use language to enhance the ability of the brain to compute by relying on external storage and symbol manipulation. To use a computer with a programming language is essentially to use mathematics only instead of writing on the cave wall we are writing in 1s and 0s. The mind exists to augment the brain in a constant feedback loop where the brain relies on the mind to improve itself and adapt. If there were no external reality the brain would have no way to evolve itself and improve.
A society, in the strictly human sense of the word, is the aggregation of minds: at minimum, all the human minds in that society. As technology improves, mental capacity increases, because each human can remember more, can access more computation resources, and can in essence use technology to continuously improve their mind and then leverage the improved mind to improve their brain. The Internet is the pinnacle of this kind of progress, but it's obviously not good enough. While the Internet allows for the creation of a global mind by connecting people, things, and minds, it does nothing to actually improve the feedback loop between the mind and the brain, nor does it really offer what could be offered.
Bitcoin came into the picture, and perhaps we can think of it as a better memory: a decentralized memory in which, essentially, you can have money. The problem is that money is a very narrow application. It is a start, just as learning to write on the cave wall was a start, but it's not ambitious enough in my opinion.
Humans in the current blockchain or crypto community do not have many ways in which human computation can be exchanged. Human computation is just as valuable as non-biological machine computation, because there are some kinds of computations which humans can do quite easily but which non-biological machines still cannot do as well. Translation, for example, is something non-biological machines have a difficult time with but human beings do well. This means a market will be able to form where humans can sell their computation to translate things. If we look at Amazon Mechanical Turk we can see many tasks which humans can do but computer AI cannot yet do, such as labeling and classifying data. For things to go to the next level, we will need markets which allow humans to contribute human computation and/or human knowledge in exchange for crypto tokens.
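To make the idea of a human-computation market concrete, here is a minimal sketch in Python of how tasks could be routed to human workers and rewarded with tokens. Everything here (the class names, skill labels, and token amounts) is a hypothetical illustration of the concept, not part of any existing protocol:

```python
from dataclasses import dataclass

@dataclass
class Task:
    task_id: int
    kind: str      # e.g. "translation", "labeling": things humans still do best
    reward: float  # bounty in some hypothetical crypto token

@dataclass
class Worker:
    name: str
    skills: set
    earned: float = 0.0

def match_tasks(tasks, workers):
    """Greedily assign each task to the first worker whose skills cover it,
    crediting that worker with the task's reward."""
    assignments = {}
    for task in tasks:
        for worker in workers:
            if task.kind in worker.skills:
                assignments[task.task_id] = worker.name
                worker.earned += task.reward
                break  # task assigned; move on to the next one
    return assignments
```

A translation task, for example, would be routed to whoever advertises a "translation" skill, and the bounty credited to that worker's balance.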
The concept of a decentralized operating system is interesting. First, if there is such a thing as social computation (collaborative filtering, subjective ranking, Waze, etc.), then what about the new paradigm of social dispersed computing?
The question becomes what do we want to do with this computing power? Will we use it to extend life? Will we use it to spread life into the cosmos? Will we use it to become wise? To become moral? To become rational? If we want to focus on these kinds of concerns then we definitely need something more than Bitcoin, Ethereum, or even EOS. While EOS does seem to be pursuing the strategy of a decentralized operating system which seems to be the correct course, it does not get everything right.
One problem, as I mentioned before, is the importance of the feedback loops between minds and brains. The reason I keep returning to the concept of the external or extended mind is that it is the mind which creates the immune system to protect the brain from harmful memes. The brain keeps the body alive. The brain is not really capable of rationality, morality, or logic on its own, and relies on the mind to achieve these. The mind is essentially all the computation resources that the brain can leverage.
EOS has a problem in the sense that it doesn't seem to improve the user. The user can connect, join, earn, sell, and participate, but unless the user can become wiser, more rational, and more moral, EOS has limits. EOS does have Everipedia, which is quite interesting, but again there are still problems. What can EOS do to improve the people in society, and thus improve society, if society is a computer in need of an upgrade?
Well, if society is a computer, first, what does society compute? What should it compute? I don't even know how to answer those questions. I could suggest that if computation is a commodity, along with data, then whichever decentralized operating systems exist will compete for these commodities. The total brain power of a society is just as important as its amount of connectivity. And the mind of the society is its most important part, because it is what can allow the society to become better over time, allow the people in the society to thrive, and allow its life forms to continue to evolve and avoid extinction.
A decentralized operating system, on a technical level, would have a kernel or something similar to it. This is the resource management part. For example, Aragon promises to offer a decentralized OS, and it too mentions having a kernel. A true decentralized operating system has to go further and requires autonomous agents. Autonomous agents which can act on behalf of their owners are, philosophically speaking, the extended mind. But the resources of a society are still finite and have to be managed, so a kernel would provide the ability to manage resources.
The total computation ability of a society is likely a massive amount of resources, a lot more than just connecting a bunch of CPUs together. Every member of the society who can compute could participate in a computation market. Of course, as we are beginning to see now, regulators seem concerned about certain kinds of social computations, such as prediction markets. So it is unknown how truly decentralized operating systems would be handled, but my guess is that if designed right they could be pro-social, be capable of producing augmented morality by leveraging mass computation, and, by leveraging human computation, be able to be compliant. To be compliant is simply to understand the local laws, and these can be programmed into the autonomous agents if people think it is necessary.
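As a toy illustration of that last point, an autonomous agent could carry a table of encoded local rules and check each proposed action against it before acting. The rule names, regions, and default-deny policy below are my own hypothetical choices, a sketch of the concept rather than a description of any real system:

```python
# Hypothetical rule table: action -> predicate over the agent's region.
RULES = {
    "prediction_market": lambda region: region not in {"restricted-zone"},
    "hardware_rental":   lambda region: True,  # allowed everywhere in this sketch
}

def agent_permits(action: str, region: str) -> bool:
    """Return True only if an encoded rule exists and allows the action.
    Unknown actions are rejected by default (a conservative policy)."""
    rule = RULES.get(action)
    return rule(region) if rule is not None else False
```

The design choice worth noting is the conservative default: an action with no encoded rule is refused, so an agent can only do what its owner has explicitly reasoned about.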
What is more important is that if a law is clearly bad, and people have enhanced minds, then it will be very clear why the law is bad. This clarity will help people to dispute and seek to change bad laws through the appropriate channels. If there is more wisdom, due to insights from big data, from data scientists, etc, then there can be proposals for law changes which are much wiser and more intelligent. This is something specifically that people in the Tauchain community have realized (that technology can be used to improve policy making).
A lot is still unknown so these writings do not provide clear answers. Consider this just a stream of consciousness about concepts I am deeply contemplating. This is also a way to interpret different technologies.
Tauchain: The Social Dispersed Computer introduced as a Social Network? By Dana Edwards. Posted on Steemit. October 12, 2018.
How might a Tau operating system, built on a Tau social dispersed computer, function?
We know from tauchain.org that the first iteration of Tau is to be a discussion platform not too dissimilar from Facebook. Of course, this would simply be the front end, the ''face'', of what could behind the scenes evolve toward a social dispersed computer complete with a dispersed operating system. The resources have to be managed, and a kernel could provide for this in a manner not dissimilar to what we see with EOS. The Agoras (AGRS) token specifically represents ''resources'': it is the tokenization of resources for whatever applications Tauchain will run.
TML provides the basis from which to create the necessary languages to produce a dispersed operating system. Zennet even has an algorithm, which Ohad himself worked on, for calculating resource requirements. All minds will be able to contribute toward the computational resources of Tauchain (at least in theory).
Because of Zennet, there may in fact be no limit to the amount of computation resources which we could throw at the supercomputer. It will of course depend on resource management, which is where a kernel likely comes into play, because any smart apps built to run on Tau will have to ask for resources. Resource management is one of the core functions of a kernel, and of an operating system, which is why I think it is likely that Tauchain will have one. I think the Ethereum route shows the problems with scaling when applications have to compete for resources in a way the network cannot self-manage. CryptoKitties, for example, can lag the whole Ethereum network, and if this is a computer, that means a nonsense app can disrupt more critical apps.
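To illustrate what self-managed resources could mean, here is a toy priority scheduler of the kind a decentralized kernel might run, so that a low-priority "nonsense app" cannot starve a critical one. This is my own minimal sketch under stated assumptions, not Tauchain's or EOS's actual design, and the app names and numbers are made up:

```python
import heapq

class Kernel:
    """Toy resource manager: apps request CPU units, and higher-priority
    requests are served first, so a low-priority app cannot starve
    critical ones (the congestion problem described above)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.queue = []  # min-heap keyed on negated priority

    def request(self, app: str, units: int, priority: int):
        # Negate priority so the highest-priority request pops first.
        heapq.heappush(self.queue, (-priority, app, units))

    def schedule(self):
        """Grant capacity in priority order until it runs out."""
        granted, remaining = [], self.capacity
        while self.queue and remaining > 0:
            _, app, units = heapq.heappop(self.queue)
            give = min(units, remaining)
            granted.append((app, give))
            remaining -= give
        return granted
```

With 10 units of capacity, a greedy low-priority request for all 10 units would be trimmed to whatever is left after a high-priority health app takes its share first.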
A prime example of a potential smart app for Tauchain
An example (which may or may not be feasible) is a health and fitness app. The app, in theory, could allow any user to provide data such as genetic information, blood test results, exercise tracking, blood pressure, blood sugar, and anything else. All of this could provide a feedback loop back to the patient on how to improve their health over time, based on the knowledge of Tau. As technology gets better, users could add more devices to provide more data for a better feedback loop. As technology evolves, FPGAs could be added to meet the demand for calculations, and storage could be rented as well.
An operating system could give priority to this kind of app by load balancing the resources. How would it know to do this? Tau could learn the moral and legal ramifications, and a consensus could emerge that health-related apps deserve premium access to resources because they can save lives.
''We live in a world in which no one knows the law.''
Ohad Asor, Sept 11, 2016
I continue herewith sharing my current state of grok of the up-to-now four scriptures of the so-called 'newtau'. Sorry for the delay, but it comes mostly from the effort to contain the outburst of words, catalyzed by the very exegetic process of such rich content, into a reader-friendly shorter form.
The subject of vivisection textographically identifies as the first three paragraphs of ''Tau and the Crisis of Truth'' (Ohad Asor, Sep 11, 2016).
The four core themes extracted are enumerated below, accompanied by a streak of comments of mine, kept modest so as not to sidetrack the thought and not to spoil the original message:
As a guy who has been immersed in Law for more than a quarter of a century, I can swear with both hands on my heart to the notion of the unknowability of Law.
Since my youth in law school, I have asked myself how it is possible at all to have 'rule of law' when every legal system ever known required humans to operate it!?
It seemed that the only requisite or categorical difference between mere arbitrary 'rule of man' and 'rule of law' was that, in some isolated cases, some ruling men happened to be internally programmed by their morals to produce 'rule of law' appearance effects by 'rule of man' means.
Otherwise, 'rule of law' done via 'rule of man' poses extremely serious threats of law being used by some to exploit and harm others.
In that line of thoughts my conclusion was that the Law is ... yet to come.
What we know as Law is not good networking protocol software for mankind as such; rather, we see comparatively rare examples of individually well programmed... lawyers.
The Law, if it comes, will come on the wings of a technological breakthrough, just as flying came with the invention of airplanes, the moonwalk needed the advent of rocketry, and remembering beyond a single lifetime needed writing. The Law is an old dream. If we judge by the depth of the abyss of folklore, one of humanity's most ancient dreams, indeed. Needless to repeat, this is what sucked me into Tau as relentlessly as black hole spaghettification :)
The frustration with Law of the great Franz Kafka, referred to by Ohad and expressed in his book The Trial, becomes very understandable: Kafka's epoch lacked the comforting hope of a technology which we already have (the computers) and the overall progress in logic, mathematics, and engineering, forming a self-reinforcing loop centered around this sci-tech of artificial cognition.
Similarly to nuclear fusion, which is always a few decades away (though the fusion gap is closing noticeably nowadays), we are standing on the cliff of a Legal gap.
Mankind's heavy involvement in cognition technologies, especially over the last several decades, has outlined multiple promising directions of further development which seem to bring us closer to compensating for the fundamental deficiencies of Law and, in fact, to finally bringing it into existence.
It took a whole Ohad Asor, however, to identify the major reasons why the Law is still bottlenecked out of our reach, and to propose viable means to bridge us across that Legal gap... The other side is already in sight.
It is, in the first place, the language that is to blame!
The natural human language. Our most important attribute as a species. The mankind maker. The glue of society. It just emerged; it wasn't created. It has patterns, vaguely conventional, rather than an intentionally coined set of solid rules. There ain't firm rules to change its rules, either... Natural human language is mostly a wilderness of untamed, pristine, naked nature, dotted here and there with very expensive, hard-to-install-and-maintain ''artifacts''. Leave it alone, away from the coercion of state mass media, mass education, and national language institutes, and it falls back into a host of unintelligible dialects. Even when aided by the mnemonic amplifier which we call writing.
Ambiguity is characteristic of natural language: a feature in poetry and politics, but a deadly bug in logic and law.
We'll put aside for now the postulate of the impossibility of a single universal language, to revisit it later when its exegetic turn comes, in another chapter on another scripture. Likewise, we'll not cover in this chapter the neurological human bottlenecks which Tau targets to overcome. Let's observe the sequence of the author's thoughts and not fast-forward.
Instead, I'll dare to share with you my own hypothesis about why the natural human languages are the way they are. (I'm smiling while I type this, because I can visualize Ohad's reaction upon reading such a frivolous lay narrative. I hope that, being too busy, he actually won't.) To say that human languages are just too complex does not bring us any nearer to a decent explanation. Many logic-based languages are more than a match for the natural ones in terms of expressiveness and complexity. That can't be the reason.
My suspicion is rather that natural human languages pose such Moravec hardness because they are not exactly languages. Languages are conveyors of meaning. Human languages convey not meaning, but indexes, or addresses, or tags of mind states. The meaning is the mind state. Understanding between humans is a function not only of shared learnt syntax, but also of shared lives: of an aggregation of similar mind states to be referred to by matching word keys.
If this is true, it offers another angle for grokking the solution: human users leaning towards the machine by using a human-intelligible Machinish, instead of Tau waiting for the language barrier to be broken and machines starting to speak and listen to Humanish.
In a nutshell, we still await the Law because Law is not doable in Humanish. Bad software. And the other side of the no-law coin is that humans are no cognitive ASICs. We do cognition only in passing, and in order to do what other animals do: survive. Bad hardware.
In order for law to become law, it must become hands-free.
Not humans reading laws, but laws reading laws.
The technology to enable that looks to be within arm's reach.
Ok, so far we have butchered the law and the language. What's left?
The nature and essence of human language brought about one of the most harmful and devastating notions ever. Literally, a thought of mass destruction.
The ''crisis of truth''. The wasteland left by the toxic spillover of the idea that ''there is no one truth'', or even that ''there ain't truth'' at all. This is not only an abstract, philosophical problem. Billions of people actually got killed for somebody else's truth.
It is no accident that the philosophers who immersed themselves in this pool are nicknamed 'Deconstructivists'. Tracing back their epistemic genealogy, we see, by the way, that they are rooted in faith rather than in reasoning, but that is another story.
The general problem of truth, of which the problem of law is just a special case, opens up two important aspects:
Number one: all knowledge is conjectural with respect to truth, and truth is an asymptotic boundary, forever to close in on but never to reach, like the speed of light or absolute zero. Number two: human languages make pretty lousy vehicles to chase the truth with.
If words really serve just to match people's thoughts together, then there are thoughts without words and words without thoughts. Words mismatch thoughts, so how can we expect them to bridge thoughts to things? Entire worlds of nonsensical wording emerge, dangerously disturbing the seamless unity of things and thoughts. Truth displaced.
''But can we at least have some island of truth in which social contracts can be useful and make sense?''
This island of shared truth is made of consensus  bedrock and synchronization  landmass.
Truth and Law self-enforced. From within, instead of by violence from without. And in a self-referential, non-regressive way.
''We therefore remain without any logical basis for the process of rulemaking, not only the crisis of deciding what is legal and what is illegal.''
Peter Suber, with his ''The Paradox of Self-Amendment: A Study of Law, Logic, Omnipotence, and Change'', proposed a rulemaking solution which he called Nomic.
''Nomic is a game in which changing the rules is a move.'' 
The merit of Nomic is that it really eliminates the ills of the infinite regress of laws-of-changing-the-laws-of-changing-the-laws, ad infinitum, by the use of transmutable, self-referential rules. But Nomic suffers from a number of issues: the first one, in the spotlight of this chapter, is the fact that we still remain within the ''crisis of truth'' in which there is no one truth; the other ones, like the scalability of sequencing and voting, we'll revisit in their order of appearance in the discussed texts.
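The core Nomic idea can be sketched in a few lines of Python: the rule set is ordinary mutable data, and the one move is to propose a rule change adopted by majority vote. The class name, rule numbering, and simple-majority threshold are my own illustrative simplifications of Suber's actual game:

```python
class Nomic:
    """Toy Nomic: changing the rules is itself a move."""

    def __init__(self):
        # rule id -> rule text; the initial rule authorizes rule changes
        self.rules = {101: "A rule change is a legal move."}

    def propose(self, rule_id: int, text: str,
                votes_for: int, votes_total: int) -> bool:
        """Adopt (or amend) a rule if a simple majority approves; the new
        rule immediately becomes part of the game governing later moves."""
        if votes_for * 2 > votes_total:
            self.rules[rule_id] = text
            return True
        return False
```

Note that a successful proposal can just as well rewrite rule 101 itself, which is exactly the self-reference that cuts off the regress of laws-for-changing-laws.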
The so-called 'newtau' goes past the inherent limitations of the Nomic system and resolves the 'crisis of truth' problem.
The next few chapters will dive into Decidability and how it applies to provide solution to the problems described above.
 - https://en.wikipedia.org/wiki/Grok
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-intro
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-the-two-towers
 - http://www.idni.org/blog/tau-and-the-crisis-of-truth.html
 - http://www.behest.io/
 - https://steemit.com/blockchain/@karov/behest-for-tauchain
 - https://en.wikipedia.org/wiki/Rule_of_law
 - https://en.wikipedia.org/wiki/Tyrant
 - https://en.wikipedia.org/wiki/Morality
 - https://en.wikipedia.org/wiki/Spaghettification
 - https://en.wikipedia.org/wiki/Franz_Kafka
 - https://en.wikipedia.org/wiki/The_Trial
 - https://www.amazon.com/Merchants-Despair-Environmentalists-Pseudo-Scientists-Antihumanism/dp/159403737X
 - https://en.wikipedia.org/wiki/Language
 - https://en.wikipedia.org/wiki/Official_language
 - https://steemit.com/blockchain/@karov/tau-through-the-moravec-prism
 - https://en.wikipedia.org/wiki/Application-specific_integrated_circuit
 - https://www.etymonline.com/word/manipulation
 - https://en.wikipedia.org/wiki/Deconstruction
 - https://en.wikipedia.org/wiki/Consensus_decision-making
 - https://en.wikipedia.org/wiki/Synchronization
 - http://legacy.earlham.edu/~peters/writing/psa/index.htm
 - https://en.wikipedia.org/wiki/Nomic
 - https://en.wikipedia.org/wiki/Infinite_regress
 - the illustration is a painting, courtesy of the author Georgi Andonov https://www.facebook.com/georgi.andonov.9674?tn-str=*F
Let's use Tauchain to save our own lives and the lives of others: The life saving potential of Tauchain. By Dana Edwards. Posted on Steemit. September 10, 2018.
In this post I'm going to discuss one of the main reasons I want Tauchain to exist. It is a reason I think many, perhaps most, people can relate to. It starts with the question: how can we save our own lives using our own effort? It evolves into the question: how can we save lives in general by augmenting our efforts as much as is technologically feasible?
1 out of 2 people (around 50%) will be diagnosed with invasive cancer
Current statistics reveal that, at the high end, we have a 50% chance of developing cancer in our lifetime. Some recent statistics put it lower (closer to 30%, or in some cases 40%), but this is still very high. The fact is, if we are in a room together, about 1 out of every 3 of us, in the best case, will get cancer someday. And nearly 100% of us will someday know someone who has cancer. So there is a very high chance that someone we care about a lot will develop cancer; do we want to be in a position where we didn't do all we could to be capable of saving their life? It could even be you who develops cancer; wouldn't you want to be able to say you dedicated some of your resources toward finding a cure?
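The "nearly everyone will know someone" claim follows from a little arithmetic. Assuming, simplistically, that lifetime risks are independent, the chance that at least one person in a group of n is affected is 1 - (1 - p)^n:

```python
def p_at_least_one(p_individual: float, n: int) -> float:
    """Probability that at least one of n people is affected, assuming
    each person's lifetime risk p_individual is independent of the others."""
    return 1.0 - (1.0 - p_individual) ** n
```

Even with the conservative 1-in-3 lifetime risk, a room of just 10 people already gives roughly a 98% chance that at least one of them will develop cancer.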
Cancer is one of those global problems that most human beings want to eradicate. It is not politically controversial to want to cure cancer. It is also something that Tauchain can help with because using Tauchain we can scale discussions, define problems in a precise manner, and most importantly leverage the market. The ability to create markets which are smart (meaning which can adapt to regulatory obstacles) is a potentially unique feature of Tauchain.
Some might say that there are already pharmaceutical companies trying to cure cancer or develop anti-aging treatments. Indeed, such companies exist. The problem is that these companies do not have the new business models which Tauchain might make possible. First is the fact that using an ICO you can let future patients/customers own shares in the company. This gives companies which want to create cures the potential to raise the billions of dollars necessary to do expensive trials. In addition, the ability to do research may improve due to the features of Tauchain, making it cheaper to search for new potential drugs or supplements.
The human genome is very complicated and is an area we know very little about. Cancer, too, is something we still have to study. One example of an approach to defeating cancer is immunotherapy, but this again is going to require a lot of research into how to reprogram the immune system to identify and destroy cancer. If everyone can help or contribute in some way, the process becomes much cheaper than it is right now, which means the drug or treatment can potentially be cheaper due to lower R&D cost.
Most people want to live long and healthy lives but we still know very little
We know very little about aging. We have some theories as to what causes it, and even some theories on how to slow it down, but we don't understand the mechanism well enough yet to develop a treatment. By aging I'm referring to the process by which cellular function deteriorates over time. We know, for example, that the risk of getting cancer increases with age. But we are still working on developing biomarkers even to determine the biological age of a person.
What if we could leverage the potential of Tauchain to discover more about the aging process? What if we could collaboratively develop and own an anti-aging pill or treatment? What if we could profit from every pill sold via tokenization? If this sounds good to you, then it might sound good to millions of others who could be encouraged to participate in an ICO to develop a pill to slow the aging process.
The ethical and rational argument
Some people could say that to put an emphasis on saving lives is to seek the greatest good for the greatest number. This emphasis could put Tauchain on a fast track to mainstream adoption, because utility would be measured not just in how profitable it is to hold a token, but in the lives that could potentially be saved. To align the profit motive with saving as many lives as possible is an easy ethical (and rational) argument to make. People who value life will value any technology which saves lives.
Some projects already exist, which I will list below, that are trying to save lives or end aging. These projects did ICOs on Ethereum and so are currently Ethereum-focused. That being said, some of them could still leverage Tauchain regardless of having originally launched on Ethereum. It is also possible for new projects to launch on Tauchain to attempt the same or similar objectives.
What can Tauchain do?
Grunau, G. L., Gueron, S., Pornov, B., & Linn, S. (2018). The Risk of Cancer Might be Lower Than We Think: Alternatives to Lifetime Risk Estimates. Rambam Maimonides Medical Journal, 9(1).
''Tau solves the problems from the Tower of Babel to the Tower of Basel''
- an early 21st century yet undisclosable author
Okay, dearest friends, let's pull our sleeves up and start. Vivisection of the Scriptures? Revelation by transfiguration? Pulling the Tau out of the ocean of wisdom onto the dry no-Maths land? I hope not.
The quote above at first glance sounds pompously biblical, but in fact it denotes the crystal-clear, simple, practical, and mundane rationale of Tau, which I have already tried to approach from a few angles.
It is about the hierarchic bottleneck of an unscaling Humanity. Take the hint about the leveling of the Towers as a poetic symbol of the elimination of social 'verticality' -- the hierarchies, a so-far necessary evil to compensate for certain innate neurological limitations -- and of the reforming of the network we are embedded in, and usually call mankind or society or economy or world, into one as geodesic as possible. For the sake of its own functional, programmatic optimization.
Notice that the leveling of the towers happens not by demolition, but by uplifting the overall landscape to and above the tower tops, turning them into deep roots or support pylons of an asymptotically geodesic society.
Apparently, mentioning the Gate of God (the literal meaning of 'Babel') denotes the unmixing of languages, and mentioning the apex global fiat settlement institution (in Basel) denotes the excelling of the current fiat procrustics, i.e. the economy aspect.
That is: TML to Agoras. The first and the last of the six identified aspects or steps of social choice as addressed by what we call Tau.
''our six steps of language, knowledge, discussion, collaboration, choice, and knowledge economy''
These aspects of course deserve separate zoom-in exegetic chapters, and they'll definitely get them. I promise. And not only they.
Any exegesis of Tau must unavoidably start with scrolling back and tracking down the full history of the development so far: a zoom-out to see the full picture and to identify the dominant features of the landscape relief.
You have, I reckon, already noticed this retrodictive inclination of mine: in my mind, the notion of a ''Timeline of Development'' cannot by any logic be just a handful of milestone promises thrown into the future; it must account for the trajectory up to now, too! No future without past.
It all started as Zennet, continued as Tau-chains, and 'turned' into the so-called 'newtau'.
Wait! A New Tau?
Excuse me, Ohad, but I personally do not buy that, and I have said so many times. There ain't an old and a new Tau. The situation is much more straightforward and grokkable. Here it is:
Lotsa guts, balls, butt, brains or whatever human offal... is required for each of us to admit a mistake made in our everyday life. Generally, quite some strength is needed even to look at ourselves in the mirror...
It takes a whole Ohad, though, to keep all of one's work totally public and transparent, even on the full and unedited live record of the infiltration into an entire branch of mathematics, and then to throw it all away as untauful. We witnessed that reported in real time!
Did this change the ends? No. But sorted out the means to an end.
Was it a 'mistake'? In no case. It was duly delivered R&D effort.
Was oldtau looking promising on first glance? Yes, of course it did.
Did it survive the Ohad's R&D 'crash-testing'? No, it didn't.
Was the juice worth the squeeze? It was.
Was it a job well done? Absolutely.
The oldtau materials are legacy jewels to me. Like those dinosaur-era bugs trapped in blobs of amber.
Development is a process, not just the shipping of results. They are related like cooking and serving.
Studying the zoomed-out dev map, we observe these few major landmarks:
The Zennet province is all right. Its gently rolling hills gradually merge into the Tau lands proper, with the inevitable realization that a 'world supercomputer' cannot be a Tauless thing. Zennet lives on in Tau:
''... having a decentralized search engine requires Zennet-like capabilities, the ability to fairly rent (and rent-out) computational resources, under acceptable risk in the user's terms (as a function of cost). Our knowledge market will surely require such capabilities, and is therefore one of the three main ingredients of Agoras... hardware rent market...''
We move on through the oldtau wastelands, where the burnt ruins of MLTT lie scattered. A rough oldtau location-on-the-map indicator is the fall of 2015, with
''Tau as a Generalized Blockchain'' - posted Oct 17, 2015, 6:33 AM [updated Oct 17, 2015, 6:49 AM]
and then we reach the fertile gardens of newtau in the fall of 2017:
''The New Tau'' - posted Dec 31, 2017, 12:27 AM [updated Dec 31, 2017, 12:28 AM]
Hmm. Apparently we crossed a watershed. Which relief feature was it? The ridge of:
''Tau and the Crisis of Truth'' - posted Sep 10, 2016, 8:25 PM [updated Sep 10, 2016, 8:28 PM]
Tau sorts out the Towers. I hope the synopsis in this short chapter of the Exegesis helped to sort out the Tau dev timeline, as a navigational lookup tool.
Software is nothing but states of hardware. There is an intimate, deep connection, not yet codified into a neat compact of logic, between Gödel, Heisenberg, and the laws of thermodynamics.
Tau keeps us off these traps.
I don't dare to claim that we will never have command over infinities, playing with them with the ease of
''... a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.''
In fact, quite the opposite: I'd rather take it as an inevitability that someday we will conquer the Cantor expanses and venture far even beyond them. To transcale the transfinite. Like Hilbert said:
''Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können. (From the paradise, that Cantor created for us, no-one can expel us.)''
But it takes... finitary vehicles of DECIDABILITY to conquer the transfinitary outer spaces. Because, in order to dare to dream of taming the infinities, we must first harness and gain full command of the finite.
Including ourselves. Tau is ''understanding each other''. Without Tau we are... others to ourselves.
Imperare sibi maximum imperium est. (To command oneself is the greatest command.)
The power of ambiguity and of ambiguity minimization in communication. By Dana Edwards on Steemit. June 1, 2018.
Formal communication benefits from ambiguity minimization.
So what exactly do I mean by formal communication? When we think of how human beings communicate with machines, it is in a formal language. This formal language requires minimized ambiguity for security analysis (how can we analyze code if we cannot reliably interpret it?). The other constraint is that machines require conditional statements such as if... then... else to be well defined and unambiguous.
Is it possible to show that a grammar is unambiguous?
To show a grammar is unambiguous, you have to argue that each string in the language has only one derivation tree. That is how it would be done, theoretically speaking.
In computer science, an ambiguous grammar is a context-free grammar for which there exists a string that can have more than one leftmost derivation or parse tree, while an unambiguous grammar is a context-free grammar for which every valid string has a unique leftmost derivation or parse tree. Many languages admit both ambiguous and unambiguous grammars, while some languages admit only ambiguous grammars.
Specifically, we know that deterministic context-free grammars must be unambiguous, so unambiguous grammars certainly exist. The practical strategy for formal languages (such as computer programming languages) appears to be ambiguity minimization.
For computer programming languages, the reference grammar is often ambiguous, due to issues such as the dangling else problem. If present, these ambiguities are generally resolved by adding precedence rules or other context-sensitive parsing rules, so the overall phrase grammar is unambiguous. The set of all parse trees for an ambiguous sentence is called a parse forest.
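Python's indentation rules make the dangling else impossible to write ambiguously, so as an illustrative sketch (the function names here are ours, purely for illustration) we can encode the two rival parse trees of `if a then if b then s1 else s2` as two separate functions and watch them disagree:

```python
# Two rival parse trees for the dangling-else statement
#   if a then if b then s1 else s2
# encoded as Python functions (names are illustrative only).

def else_binds_inner(a, b):
    # Reading 1: the else attaches to the nearest unmatched `if`,
    # which is how C-family grammars resolve the ambiguity.
    if a:
        if b:
            return "s1"
        else:
            return "s2"
    return None

def else_binds_outer(a, b):
    # Reading 2: the else attaches to the outer `if`.
    if a:
        if b:
            return "s1"
    else:
        return "s2"
    return None

# The two parse trees give different behavior when a is true and b is false:
print(else_binds_inner(True, False))  # s2
print(else_binds_outer(True, False))  # None
```

The disagreement on the input (true, false) is exactly why the reference grammar must pick one reading by rule.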
The parse forest is an important concept to note: all the possible parse trees for an ambiguous sentence, taken together. This concept is key to understanding the strategy of ambiguity minimization. So in practice we can minimize ambiguity, and we know for certain that deterministic context-free languages admit an unambiguous grammar. But what does that mean? What are the benefits of unambiguous language in general?
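To make the parse-forest idea concrete, here is a minimal sketch in Python (all names are our own, not from any parsing library) that enumerates every parse tree for the classic ambiguous grammar E → E '+' E | 'a':

```python
# Hypothetical sketch: enumerate the parse forest for the ambiguous grammar
#   E -> E '+' E | 'a'
# over an input of 'a' tokens joined by '+'.

def parses(tokens):
    """Return all parse trees for a token tuple like ('a', '+', 'a', '+', 'a')."""
    if tokens == ('a',):
        return ['a']  # the single leaf derivation
    trees = []
    # Try every '+' as the top-level operator; each split yields distinct trees.
    for i, tok in enumerate(tokens):
        if tok == '+':
            for left in parses(tokens[:i]):
                for right in parses(tokens[i + 1:]):
                    trees.append((left, '+', right))
    return trees

forest = parses(('a', '+', 'a', '+', 'a'))
print(len(forest))  # 2 -- the grammar is ambiguous: (a+a)+a and a+(a+a)
for tree in forest:
    print(tree)
```

One string, two trees: that is exactly the ambiguity an unambiguous grammar (say, one that forces left-associativity) would eliminate, collapsing the forest back to a single tree.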
A benefit of ambiguity minimization
Simple English is a form of controlled English designed to minimize ambiguity. This is important because codifying rules or writing laws in Simple English puts them in a language where there is less computational expense (in brain power) to process and interpret the statements.
In one of my older blog posts, @omitaylor commented, and in a later post of hers she asked about the topic of love. Specifically, her post was titled: "What Does LOVE Mean To YOU"
Her post highlights the fact that there are different love languages and that we don't all speak the same one. Ambiguity here is actually not a good thing: when someone speaks about love, how do we know they are talking about the same thing we are? As a result we often seek an agreed-upon, formally defined "love concept" that we all recognize as love. This is not trivial to find, and so a topic like love is not easy to discuss in any serious manner. Unambiguous communication, or more precisely communication with minimized ambiguity, would allow Alice and Bob to discuss love in a way where each knows exactly what the other is referring to in terms of behavioral expectations, emotions, feelings, and so on.
If Alice agrees to love Bob, then Bob has no way to determine what Alice means unless the two of them agree on a mutually defined concept of love. This highlights how agreement requires very good communication, and how minimizing ambiguity can be beneficial, at least in this example.
Ambiguity minimization makes sense when you follow a principle of computational kindness: if Alice would like to reduce the computational burden on Bob, she can minimize the ambiguity of her sentences. To interpret an ambiguous sentence, Bob must in essence rank all possible interpretations from most likely to least likely, and before he can even rank them he must first search to find all possible, or at least plausible, interpretations.
This is computationally expensive for Bob but very cheap for Alice: Alice knows exactly what she means, but Bob has no clue what Alice REALLY means.
A benefit of ambiguity
There are other cases where increasing ambiguity can be beneficial, such as when the communication is less than formal, or when sharing a stream of consciousness without turning it into formal communication. Humor, for example, rides on ambiguity, and a good joke may have multiple layers. Art also leverages ambiguity because a work may be meant to be interpreted twenty different ways, all to produce a certain desired effect.
Ambiguity allows more meaning to be packed into fewer words; in a sense it is a sort of compression scheme. Even if a sentence has multiple possible meanings, the set of meanings is still finite, so theoretically a search can be conducted. In fact, this is what a human being does when interpreting a natural-language sentence that can have multiple meanings: they search over all possible interpretations. The problem is that this process is computationally expensive, at least for a human trying to enumerate every possible interpretation of a sentence.
Lawyers, when they do their work, operate with a specific knowledge base of common legal sentences and the interpretations standard in their profession, but the rest of us might see a sentence in lawyer-speak and not really know what it means, because we do not know the common interpretations. This is a big problem, of course, because to form an agreement between two parties, both need a common understanding (a kind of knowledge-symmetric understandability) allowing them to interpret the same sentence to mean the same thing.
Tauchain is a profound project that has taken years of deep research and development. Some of the smartest people I've known on this platform highly recommended it, which is why it has been making me do a few things I've not done in a while now:-
So one of the first things I noticed in #idni's IRC channel was the cool-looking username "naturalog". While I'm pretty sure it just means natural logarithm, could it be natural OG instead? The natural, original gangsta, in casual parlance of course. It turns out that's Ohad Asor's (the founder's) nickname. What a smooth operator. That username is wordplay: a mathematician with street cred. Too bad the Steem username is already taken.
The Natural OG
Reading through the logs, I soon realized that I can trust his words. Why? Other than his experience, I think it's because I'm somewhat the same in nature. Not that I'm a genius with great knowledge and expertise like he is, but I do appreciate subjects like language, semantics, and logic. They're the kind of subjects that I think help shape clear communication, and it shows throughout his replies in the logs.
Many might not know it, but everything I say or type usually takes quite some time, because I try to be careful with words. Sometimes I even spend minutes deciding whether to say "could" instead of "would", amongst all the other nuances of communication. Because what else do we really have between us other than words? This is why writing is almost sacred to me.
The ability to question oneself and one's choice of words is part of our learning process. Why do we really say what we say, or think what we think? I can't speak for everyone, but I expect introspective, lifelong learners to be more trustworthy when it comes to dealing with complex subjects. Plus, the obvious elements of the project seem to speak more about substance than hype:-
So, all things considered, the project is unlikely to be a scam. If you search through the ~28 megabytes of IRC chat logs, you will even find three ultra-rare instances of Ohad Asor, aka naturalog, mentioning "before it was cool". Look at the image below. Knowing his history and experience, I think it's safe to conclude that this dude is a certified OG. The natural OG. Total man crush! I might even ask him for some dating tips once he's done with the bulk of the development.
If those points above are not enough street cred to establish an OG status, check out this section of the chat log below:-
10:39 < Liaomiao> you must know a lot about blockchain architecture if you came up with some of the ideas behind graphene
Just good to know that he might have had some influence on the creation of Graphene, Dan Larimer's creation for BitShares that subsequently shaped the inner workings of both Steem and EOS. Impressive indeed, and a good sign for Tauchain / Idni Agoras. In contrast, at the same age when Ohad Asor was already grinding like an OG, writing production-level software, I was still riding rollercoasters all day in Disneyland, high on sweet carbonated drinks.
So it would seem my investigation into the heart of Tauchain has quickly turned me into a huge admirer and fan of the project. That has never happened to me before to this extent, but I certainly don't mind, given the project's scope and the main developer's character. It's at least a much better story than elevating irrational loonies and sensationalists with no appreciation of well-founded knowledge, which unfortunately is all too common in society these days. If anything will make the world a better place, it's intellectual curiosity, not intellectual dishonesty.
For now, I'm quite happy to have found the natural OG who has been working quietly behind the scenes. So far it seems to me that this could very well be the next big thing besides Steem communities and SMTs. I'll be posting more about the project in time. As always, thanks for reading.
Website - http://www.idni.org
Github - https://github.com/IDNI/tau
Telegram - https://t.me/tauchain
Reddit (with FAQ) - https://www.reddit.com/r/tauchain/
Coinmarketcap entry - https://coinmarketcap.com/currencies/agoras-tokens/
Here's an hour-long interview with Ohad Asor that you might want to check out.
Not to be taken as financial advice.
Virtualization of contracts with TauChain and Agoras. Video from the YouTube channel Educación Financiera Bitcoin Criptomonedas. May 15, 2016.
Subject, verb, and predicate. Taking the language of ontologies to unify the languages of:
* Computer Programs.
* Network Protocols.
Ontologies are expressed in the RDF (Resource Description Framework) language family. IDNI proposes a software client that stores an ontology of local rules. Artificial intelligence, ontology, language, human-readable code, decentralized and equitable democracy.
Fuente / Source: the Educación Financiera Bitcoin y Criptomonedas channel on YouTube.
What is Tau?
Tau is a decentralized network that can amend itself based on the decisions of its users. Tau will provide a platform for users to reach agreements and decisions at the largest scale seen so far. A social platform to reach agreements: Tau is a blockchain-based platform that allows on-the-fly logical consensus detection, which enables it to scale past some of the largest bottlenecks to human advancement, including social governance and knowledge creation. By using a self-defining and decidable logical framework, Tau is the first platform able to gather data voluntarily submitted by its users and logically deduce valuable knowledge over a network secured with blockchain technology. What this means is that, in effect, we can scale collaborative endeavors between thousands of users to greatly accelerate the production of knowledge. The only dynamic decentralized social network: when Tau's community faces a decision to change Tau or its blockchain protocol, its members will just need to express their opinions and perspectives, as we do today on social networks, and Tau will amend itself based on the users' agreement. Considering the perspectives of the entire community (unlike voting) is the only way to reach a decentralized decision that benefits all users. Tau's ability to scale discussions is the only decentralized way to create a truly dynamic protocol.
What is Agoras?
Agoras is a cryptocurrency and an integral platform built over the Tau network, and it will serve as Tau's primary economy. While Tau creates a true knowledge society, Agoras is about creating true monetary knowledge by powering the ecosystem built via Tau. Agoras will be used to execute the applications of Tau: Zennet (a computational resource market), a derivatives trading platform, and further developments to be built as part of Tau's ecosystem. Through the power of Tau, we envision the possibility of fully autonomous businesses operating over the Agoras virtual economy.
Special fields: Language, Knowledge, Economy, Collaboration, Discussion, Choice, Blockchain, Cryptocurrency, Logic, Dynamic Protocol, Decentralized Network, Internet of Languages
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.