What is Tauchain & Why It Could Be One of The Greatest Inventions of All Time (Part 1: Introduction). By Kevin Wong. Posted on Steemit. August 28, 2018.
In anticipation of Tau's demo some time around the end of this year, I'll be publishing a series of articles on Steem leading up to its release and beyond. If you would like to get to know what some of us think is going to be one of the greatest inventions of all time, I'd recommend checking out http://idni.org. It feels like a foundation we've been missing out on building together since the birth of the Internet.
The closest resemblance to this project is the Semantic Web, although some of us would place Tau as far more ambitious in scope, and, oddly, as likely more feasible thanks to its ingenious use of a logic blockchain to power a decentralized social choice platform. I find it impressive how singular the concept actually is, despite the unavoidably lengthy explanations that come paired with the many first-time features Tau will provide.
Without further ado, let's explore this world-changing technology that is currently baking in the oven.
What is Tau?
Let's begin by first checking out the opening of IDNI's website at http://idni.org:-
Tau is a decentralized blockchain network intended to solve the bottlenecks inherent in large scale human communication and accelerate productivity in human collaboration using logic based Artificial Intelligence.
Sounds fairly straightforward at first glance, and to me, it really stands out in the cryptosphere. Billions of people now use the Internet every day, yet we still do not have any effective means of discussing and collaborating without being all over the place. Sure, we may have poured a lot of time and effort into various platforms trying to connect with others, but have things really been any different compared to the time before the Internet?
The speed of information propagation has increased by orders of magnitude, and we can now reach anyone on the planet, but it's still up to us to be present and to process information in our own heads before turning it into relevant knowledge for our networks.
Expanding our social bandwidth.
As it turns out, we have a lot of trouble coming to terms with the chatter of billions of people in cyberspace. The bottlenecks inherent in our human bandwidth remain unsolved even with near-instantaneous communications. From governments to corporations to blockchain communities, we all still face the age-old problem of being unable to scale governance beyond the size of a classroom. It's simply difficult to get our points across to many different people, let alone to make sense of complex long-term discussions and make network-wide decisions collaboratively.
The introduction to The New Tau written by Ohad Asor explains our situation quite accurately:-
Some of the main problems with collaborative decision making have to do with scales and limits that affect the flow and processing of information. Those limits are so widely believed to be inherent in reality that they're mostly not considered possible to overcome. For example, we naturally consider the case in which everyone has a right to vote, but what about the case in which everyone has an equal right to propose what to vote over?
So how is Tau actually going to solve our communications bottleneck? Through a highly bespoke and non-trivial implementation of logic-based Artificial Intelligence (AI). It's worth noting that AI here is more of a marketing buzzword, and it is not of the same variety as commercial implementations of deep machine learning.
The distinction that must be made is that Tau is not the kind of AI that attempts to guess what the world around it is like, including our opinions and the things we say or do. Instead, we take the step of communicating through Tau, and what we choose to communicate will be as definite as computer programs. It can be thought of as a persistent logic companion that helps us scale our reasoning, logic, and bandwidth.
We can take the time to share what we want to share on the Tau network, and most of the logic-based connections and operations will happen in the background over time, even when we're not paying attention in person. Again, the use of the word AI is a misnomer here because it usually paints the picture of AI agents attempting to mimic human autonomy. That's not what Tau is about. In this case, thinking of Tau as just a logic machine should give better clarity on what it actually is.
The power of logic.
To expand, here's the second paragraph found in the opening of IDNI's website that explains Tau's paradigm in logic-based communications, http://idni.org:-
Currently, large scale discussions and collaborative efforts carried out directly between people are highly inefficient. To address this problem, we developed a paradigm which we call Human-Machine-Human communication: the core principle is that the users can not only interact with each other but also make their statements clear to their Tau client. Our paradigm enables Tau to deduce areas of consensus among its users in real time, allowing the network to boost communication by acting as an intermediary between humans. It does so by collecting the opinions and preferences its users wish to share and logically constructing opinions into a semantic knowledge base.
Indeed, Tau will offer a semantic social choice platform where we can discuss and store knowledge in a logical universe that helps us organize information, thereby empowering us in highly relevant ways. If you're worried about privacy, know that Tau is first-and-foremost designed as a local client with local processing and storage. The platform itself will be deployed as a decentralized peer-to-peer network, a place where we can connect and share our knowledge-base with anyone we desire.
The only price to pay for all of this is that we must speak in Tau-comprehensible languages, which can always be added and modified over time. A sophisticated language defined over Tau may closely resemble a natural language, but it is best to think of Tau itself as a machine that speaks only in logic. Fortunately, logical formalism is something we can readily deal with.
So it will be up to us to communicate with our local Tau client in a way that lets it understand our worldviews. When the machine understands what we share completely, in some logical, mathematically verifiable sense, it can then connect our dots with the rest of the Tau network, boosting communication beyond the limits of human bandwidth and scaling our points of discussion, consensus, and collaboration to an arbitrarily large number of participants.
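To make the consensus-deduction idea concrete, here is a minimal sketch. It is entirely my own illustration rather than Tau's actual machinery: each user's worldview is simplified to a set of signed propositions, and the "network" computes the statements everyone asserts identically, plus the propositions where two users directly contradict each other.

```python
from itertools import combinations

# Toy model: a worldview is a set of signed propositions, e.g.
# ("fees_low", True) means the user asserts "fees_low" holds.
worldviews = {
    "alice":   {("fees_low", True), ("big_blocks", True)},
    "bob":     {("fees_low", True), ("big_blocks", False)},
    "charlie": {("fees_low", True)},
}

def consensus(views):
    """Statements asserted identically by every participant."""
    it = iter(views.values())
    agreed = set(next(it))
    for view in it:
        agreed &= view          # keep only shared statements
    return agreed

def disagreements(views):
    """Propositions that two users assert with opposite truth values."""
    out = set()
    for (_, v1), (_, v2) in combinations(views.items(), 2):
        for prop, val in v1:
            if (prop, not val) in v2:
                out.add(prop)
    return out

print(consensus(worldviews))      # {("fees_low", True)}
print(disagreements(worldviews))  # {"big_blocks"}
```

Tau's real client would do this over expressive decidable logics rather than flat proposition sets, but the shape of the operation, intersecting formal worldviews to surface agreement, is the same idea.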
Code and consciousness.
Finally, we look at the last paragraph of Tau's introduction at http://idni.org
Able to deduce consensus and understand discussions, Tau can automatically generate and execute code on consensus basis, through a process known as code synthesis. This will greatly accelerate knowledge production and expedite most large scale collaborative efforts we can imagine in today's world.
Since Tau is a logic blockchain that powers a semantic social choice platform, we can leverage it to have both small and large-scale discussions about program specifications, detect points of consensus, and even generate software in the process. Being able to go from discussions to the realization of decentralized applications would mean inclusive code development for the masses. It's also a unique addition to decentralization that no other blockchain projects have even thought about.
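As a toy illustration of going from discussion to code, here is a sketch in which the "specification" the community settled on is just a list of input/output examples, and "synthesis" is a search through a small candidate library. Tau's actual code synthesis works from logical specifications, not example search, and every name below is invented for illustration.

```python
# Hypothetical candidate library the synthesizer may draw from.
CANDIDATES = {
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
    "negate": lambda x: -x,
}

def synthesize(spec):
    """Return the name of a candidate consistent with every
    (input, output) example in the agreed specification."""
    for name, fn in CANDIDATES.items():
        if all(fn(i) == o for i, o in spec):
            return name
    return None  # no candidate satisfies the spec

# Specification the community agreed on: f(2) = 4 and f(3) = 9.
agreed_spec = [(2, 4), (3, 9)]
print(synthesize(agreed_spec))  # "square"
```

The point is only the pipeline: once a specification is formal and agreed upon, producing a program that satisfies it becomes a mechanical step rather than a human one.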
Now that we may have come to a better understanding of Tau's emphasis on the use of logic in every part of its being, let's revisit the process description found in The New Tau to get closer to knowing what it really is about:-
We are interested in a process in which a small or very large group of people repeatedly reach and follow agreements. We refer to such processes as Social Choice. We identify five aspects arising from them: language, knowledge, discussion, collaboration, and choice about choice. We propose a social choice mechanism by a careful consideration of these aspects.
In short, Tau is a decentralized peer-to-peer network that takes the shape of a social choice platform, and it can become anything we want it to be, as long as that is expressible in its self-defining, decidable logic, FO[PFP], which captures the complexity class PSPACE. This precise specification is required to satisfy the very definition of Tau as seen in the excerpt above. Tau is also intended to be a compiler-compiler.
This takes application-generality in a completely different direction compared to blockchains built specifically with Turing-completeness in mind, like Ethereum. Relevant literature to check out: Finite Model Theory.
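A small taste of why finite model theory matters here: over a finite structure, evaluating a fixed logical sentence is always decidable, because the domain can simply be enumerated. The sketch below, my own toy example, checks the plain first-order sentence "for all x there exists y with E(x, y)" ("every node has an outgoing edge") on a small directed graph; FO[PFP] extends plain first-order logic with a partial fixed-point operator, which is what lets it capture PSPACE on ordered finite structures.

```python
def holds_forall_exists_edge(domain, edges):
    """Evaluate the FO sentence  forall x. exists y. E(x, y)
    over a finite structure by brute-force enumeration."""
    return all(any((x, y) in edges for y in domain) for x in domain)

domain = {0, 1, 2}

edges = {(0, 1), (1, 2), (2, 0)}     # a directed 3-cycle
print(holds_forall_exists_edge(domain, edges))   # True

edges2 = {(0, 1), (1, 2)}            # node 2 has no outgoing edge
print(holds_forall_exists_edge(domain, edges2))  # False
```

Turing-complete languages can express programs whose behavior is undecidable; restricting to a decidable logic over finite structures is exactly what keeps every question about a Tau "program" answerable, at the cost of bounded expressiveness.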
Understanding each other.
While it's all highly technical and difficult to grasp in one sitting, perhaps a better way to begin to truly understand Tau is to spend some time studying its main features. Or just wait for the product release. In any case, I will try to explore these topics in the future if my brain can still handle it.
The more I think about Tau, the more it seems (poetically) like a logical conclusion to the way the Internet works as a protocol. It even lives and breathes logic. Not just any kind of logic, but specifically logics that can define their own semantics and are decidable. Tau is intelligently designed to be a truly dynamic and ever-evolving blockchain.
When the Tau community intends to make changes to the network's code, rules, or protocols, it will simply need to express these opinions and perspectives in a compatible language over the network. The self-defining logic of the Tau blockchain will enable it to detect the consensus among these opinions and automatically amend its own code to reflect that consensus from block to block. Unlike common voting methods, Tau's approach will take into account the perspectives of the entire community, where people are free both to vote and to propose what to vote on in real time. This unique ability makes Tau the only decentralized solution for a truly dynamic protocol.
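Purely as an illustration, and not Tau's actual consensus mechanism, a block-by-block amendment step could be pictured like this: participants both propose parameter changes and vote on every open proposal, and any proposal with strict-majority support is merged into the protocol automatically. All parameter names here are invented.

```python
def amend(params, votes, n_participants):
    """One 'block' of self-amendment: votes maps a proposed
    (parameter, value) change to its yes-count; proposals with
    strict-majority support are merged into the protocol params."""
    new_params = dict(params)
    for (key, value), yes in votes.items():
        if yes > n_participants / 2:   # strict majority accepts
            new_params[key] = value
    return new_params

params = {"max_block_size": 1}
votes = {
    ("max_block_size", 2): 3,  # 3 of 4 participants agree
    ("fee", 0): 1,             # only 1 of 4 agrees
}
print(amend(params, votes, n_participants=4))  # {"max_block_size": 2}
```

The contrast with ordinary governance is that nothing here is privileged: the proposal set itself is open to everyone each block, not just the vote.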
Now you might think: Tau seems like a powerful tool but will it be too difficult to use for most people? There might be some learning curve involved for sure, and it'd be similar to learning a new language in the beginning. Those of us who learn to use it well enough to scale our discussions and collaborative works will likely gain a significant edge over those who are not using the platform. I'd imagine plenty of projects and communities around the world being able to overcome some of their obstacles in development through Tau. Hence, it may be fair to expect that market forces will gravitate towards the platform just like how we're all using the Internet these days.
Until the next post.
I've been thinking about Tau almost every day for many months now, and I will admit that its deeper technicalities are still way out of my league, although I've made sure to describe them broadly as best I can. If you like what I do, please consider sharing this post and voting for my witness account on Steem. For more info, check out my recent witness announcement post.
As always, thanks for reading!
Not to be taken as financial advice.
Always do your own research.
Let's build a universe. I realize this blog post is the most 'psychedelic' one so far, and will be for a long time to come, but some 'poetry' never really hurts...
We have already discussed the worldmaker effectoring.
It is quite an ancient, but also exponentially growing, business... in all possible forms of science, faith, and art. This modeling usually serves to play out what's possible and what's impossible. A Gedankenexperiment, yes, but isn't all thought merely algorithmic, and mere action?
Usually the posited universes are made of variations and combinations of substance/matter, structure/form, and action/process rules. The algorithmic component, though, is always the essential ingredient. Yes, the laws of physics are a full-fledged, literal algo too. I have my conjectures that it is impossible to think out, make, or discover (which is one and the same thing) a lifeless universe, and that substance, structure, and action are inextricable, but these are separate topics for another time.
Let's put together our toy universe out of pure algorithm alone. I've never seen such a construct, although the Orbis Tertius is enormous, and I bet this vision has occurred gazillions of times in zillions of minds.
It is like an ocean. The primary coin-toss algo, which outputs 0s and 1s, makes the water. We don't know (yet) whether there are even deeper and more fundamental numerical bases for running algos. Most probably the answer is yes, by analogy with the Dirac Sea: the deeps would be made of simpler and weaker algos. The most elementary coin-toss thing makes up... probabilistics, perhaps the primordial form of logic. The laws of physics (and of machine learning, and of the Darwinian evolutionary algo...) give the rule-set for how to stitch together lots of coin-toss outputs. A hint of inspiration for that: David Deutsch's constructor theory. The laws of physics as an entropy limitation on the allowed cumulative output of the elementary algo. Information being a verb, not a noun, isn't it? A very interesting philosophical perspective on the algorithm as a randomness constrictor arises...
So, if the Algoverse ocean water is made of elementary coin-toss molecules, being 'liquid' is just another phase, or aggregate state.
There is a deep duality between probabilistics and logic, just like the zoo of dualities discovered at an accelerating pace by mathematical physics over the last decades. Probability and statistics we now do by logic; the reverse... well, nobody has cracked it yet. Not even Kolmogorov. But I bet we will. Most probably the breakthrough will carry Ohad Asor's name... To find the know-how to do it the other way round: to do logic with probability and statistics. The statistical-algorithmic way, not the SAT, brute-force, alchemist way of NN/ML and other known beasts. This would be nothing less than a full merger of maths, logic, philosophy, thought... and physics. Literally!
Excuse the haiku simplification. It is deliberate, out of a realization of my own grokking constraints. :) Regard it as the sharing of a poetic impression.
Is there a deeper and weaker algo than the digital one, the radix-2, deterministic, unitary one? Intuition says 'yes, of course!' Like those radix-1 half-coins of negative and other non-unitary probabilities... which take two tosses to yield a bit... and there must be a transfinity of lower ones, plus transfinities of higher and sideways ones... which is almost as counter-intuitive as Dirac's bottomless night of negative energy, but, I bet, just as useful. (Let's not even touch numeral bases of pi, i, e... etc.) And let's stick to strictly binary 'water' for our oceanic toy universe, for the sake of sanity.
The next important notion of the Algoverse oceanic model is algorithmic strength: the weakest algo would be the one that takes an infinity of tosses to yield a full bit. The strongest?
Algorithmic ephemeralization: essentially, doing more with less. Or faster (the Speed Prior)... which is just another way of saying 'more'.
Some algos are too strong: QM, M-theory. They return far too many bits per 'toss'; their VC dimension diverges to infinity. Exponential walls in all directions. Not exactly what Freeman Dyson had in mind... In our mockup they could be depicted as too hot, changing the phase of the elementary algo 'water'. But because we are all for the peaceful use of algorithmic energy, we reject those up here too, together with the non-unitary statistics down there.
The last piece of the picture: the Algoverse ocean is habitable, and inhabited! By higher algos as life-forms: stronger, but not so strong as to turn the 'water' into roaring steam or plasma.
Examples: calculi, geometries, algebras... software. The genetic inter-algo connection: calculus came from the heads of Leibniz, Newton, and numerous unknown others, but it was the blind watchmaker of evolution that put those heads together... (I disagree with Dawkins only in that evolution and design are both algorithms: alternatives, but not opposites.)
Thus, entropically and combinatorially, algos kinda-sorta come from one another, the stronger from the weaker. The stronger ones are the life-forms living in that ocean. Because randomness permeates everything, doesn't it?
Not so far-fetched a metaphor, given the fact that any effectoring has a totally algorithmic nature and essence.
How high a 'life form' is Tauchain in the Algoverse ocean? Is it a mere life form, or... life itself, a new organizing principle to reform the whole system?
What is the Knowledge Acquisition Bottleneck problem? By Dana Edwards. Posted on Steemit. March 29, 2017.
Now that we know what knowledge representation is, what knowledge bases are, and how a knowledge base is relied upon in a knowledge-based system of artificial intelligence (KR + KB + inference engine), we can move on to discussing one of the open problems.
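For readers who want the KR + KB + inference-engine shape in miniature, here is a sketch with invented facts: the knowledge base holds unary-predicate facts and simple implication rules, and the inference engine forward-chains over them until no new facts can be derived.

```python
# Knowledge base: unary-predicate facts and implication rules.
facts = {("human", "socrates"), ("philosopher", "socrates")}
rules = [
    ("philosopher", "human"),   # philosopher(X) => human(X)
    ("human", "mortal"),        # human(X)       => mortal(X)
]

def forward_chain(facts, rules):
    """Inference engine: apply rules until a fixed point is reached."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            for pred, subj in list(derived):
                if pred == body and (head, subj) not in derived:
                    derived.add((head, subj))
                    changed = True
    return derived

kb = forward_chain(facts, rules)
print(("mortal", "socrates") in kb)  # True
```

Real systems like Cyc use vastly richer representations, but the bottleneck discussed next is the same at every scale: someone still has to author the facts and rules.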
The Knowledge Acquisition Bottleneck problem.
Many people already know about the familiar Byzantine generals problem in computer science. We also know how Nakamoto consensus in Bitcoin provided a novel example of a solution. The Knowledge Acquisition Bottleneck problem is one of the problems plaguing AI, and it is what limits, or seems to limit, the strength of artificial intelligence. One of the main problems in artificial intelligence is that knowledge formation typically requires domain experts who can contribute to the knowledge base. The Cyc project attempted to solve the problem of scaling up the knowledge base but suffers from this bottleneck. The bottleneck can be summarized below [taken from Wagner, 2006]:
The paper from which this summary was pulled "Breaking the Knowledge Acquisition Bottleneck Through Conversational Knowledge Management" also offers a solution called collaborative conversational knowledge management. This is the same solution which Tauchain will attempt to utilize in a more sophisticated way. Tauchain will allow for collaborative theory formation. In the paper this quote explains a key concept:
We see this concept in how Wikipedia works to manage knowledge. We know Wikipedia is indeed not without flaws but it does manage knowledge. In their conclusion we see this quote:
Tauchain by design will be collaborative and allow for collaborative theory formation. This would mean anyone will be able to contribute to the knowledge base with relative ease. In addition, it will have knowledge management properties built in, and if the knowledge acquisition bottleneck problem can be solved then it will have a huge impact. For one, the problems which prevent knowledge based AI from scaling could be resolved if this bottleneck is removed.
DARPA has attempted to solve the Knowledge Acquisition Bottleneck problem using high-performance knowledge bases (HPKBs) and Rapid Knowledge Formation, yet failed. Cyc has attempted to solve the same problem and has failed. The Semantic Web has yet to take off because this problem stands in the way. Will Tauchain succeed where these other attempts have failed? I think it is a strong possibility, which is why I'm excited about the implications should Tauchain successfully be built.
Lenat, D. B., Prakash, M., & Shepherd, M. (1985). CYC: Using common sense knowledge to overcome brittleness and knowledge acquisition bottlenecks. AI magazine, 6(4), 65.
Wagner, C. (2006). Breaking the Knowledge Acquisition Bottleneck Through Conversational Knowledge Management. Information Resources Management Journal, 19(1), 70-83.
Web 1. https://www.quora.com/What-is-knowledge-acquisition-bottleneck
Web 2. http://www.igi-global.com/dictionary/knowledge-acquisition-bottleneck/49991
Web 3: http://www.tauchain.org
Web 4: https://steemit.com/tauchain/@dana-edwards/how-to-become-a-stakeholder-in-agoras-and-indirectly-tauchain
Source: original post written by Dana Edwards, published on Steemit: What is the Knowledge Acquisition Bottleneck problem?
Logo by CapitanArt
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain, and Agoras, and to collaborate in the development of the project.