Truth vs Consensus
Truth can be thought of either as something we can prove by experiment, or as the result of a consensus. A scientific fact is arrived at through the process of scientific experimentation. A mathematical fact is discovered by finding a proof. Consensus is discovered by analysis of sentiment (or by voting) to determine what the majority currently believes about a subject at a point in time. The truth of the scientists might not match up with the popular consensus of the day. The mathematical proof might say one thing, but a majority of people might agree to disagree with the math. We have seen this happen in the past, and this blog post is a discussion of that topic. For Tauchain in particular we have the question: what is the truth, and which is more important? Do we care more about the truth or more about consensus?
Tauchain offers helpers in the form of reasoners and logic to improve the quality of discussion. These helpers will not necessarily work unless people agree to accept the results they generate. In addition, the biases people inherently hold could influence what they discuss in the first place, which could create a consensus but not necessarily an improvement.
Consensus as Truth
According to the "truth by consensus" paradigm, truth is produced by consensus gentium. Consensus gentium means agreement of the people. In my previous post I discussed exactly this topic: Consensus Morality and Tauchain | Consensus Gentium. To be specific, we can take consensus gentium to mean: "the truth is what everyone currently believes". In this model of truth we can only get at the truth by finding out what everyone believes, but how do we determine what people believe? Doing so is a real challenge in a blockchain context. One method of attempting it is called Futarchy, which attaches an economic reward and an economic cost to holding correct or incorrect beliefs. In essence, under Futarchy people must bet on their beliefs rather than just vote: prediction markets are used to apply market forces to produce a market consensus truth.
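As a deliberately simplified illustration of betting on beliefs (my own sketch, not Futarchy's actual mechanism, which uses conditional prediction markets), consider a toy parimutuel settlement:

```python
# Toy parimutuel settlement: once the outcome is known, the losing side's
# stakes are paid out to the winners in proportion to their bets.
def settle_bets(bets, outcome):
    """bets: name -> (predicted_outcome, stake). Returns winner payouts."""
    winners = {name: s for name, (p, s) in bets.items() if p == outcome}
    losers_pool = sum(s for p, s in bets.values() if p != outcome)
    winning_pool = sum(winners.values())
    return {name: s + losers_pool * s / winning_pool
            for name, s in winners.items()}

bets = {"alice": (True, 100), "bob": (False, 50), "carol": (True, 25)}
print(settle_bets(bets, outcome=True))
# {'alice': 140.0, 'carol': 35.0}: unlike a vote, a wrong belief costs bob 50.
```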
Consensus gentium in an environment of persecution and/or coercion can result in widely held "beliefs" which are enforced into existence, such as the belief in geocentrism. Victims of this kind of persecution include Galileo, who was forced to recant his beliefs or face the Inquisition. The ancient Greek philosopher Anaximander proposed that the universe revolved around the Earth, and the idea caught on. Once it caught on it became gospel truth, and over time it became blasphemous to dispute it. We continue to see this happen even now in the crypto space, for example with the beliefs that "code is law" or that "blockchains must be immutable"; these too are beliefs grounded in a particular set of values which their holders hold dear.
Consensus as a regulative ideal
A descriptive theory is one that tells how things are, while a normative theory tells how things ought to be. Expressed in practical terms, a normative theory, more properly called a policy, tells agents how they ought to act. A policy can be an absolute imperative, telling agents how they ought to act in any case, or it can be a contingent directive, telling agents how they ought to act if they want to achieve a particular goal. A policy is frequently stated in the form of a piece of advice called a heuristic, a maxim, a norm, a rule, a slogan, and so on. Other names for a policy are a recommendation and a regulative principle.
In this case we have a distinction between the way things are and the way things ought to be. Policies can be directed to shape the way things ought to be.
The problem with consensus as truth | argumentum ad populum
If consensus equals truth, then truth can be made by forcing or organizing a consensus, rather than being discovered through experiment or observation, or existing separately from consensus. The principles of mathematics also do not hold under consensus truth, because mathematical propositions build on each other. If the consensus declared that 2+2=5, it would render the practice of mathematics, in which 2+2=4, impossible.
A big problem is coercion. Another big problem is that popular opinion can in fact lead to really bad outcomes. If something is true at a point in time merely because a lot of people believe it, then we are basing our decisions merely on what a lot of people believe. This can result in decisions which satisfy what is popular yet are unwise. A lot of people passionately believe a lot of crazy, wrong things. The question of truth is more about what is true even if very few people believe it. Geocentrism turned out to be false even though a lot of people believed it at some point in time. The laws of physics, on the other hand, appear to have held for 13 billion years, including during times when a lot of people didn't believe in them.
The State, or the ruling government, has the special role of taking care of the people; however, what distinguishes the Chinese ruling government from other ruling governments is the respectful attitude of the citizens, who regard the government as part of their family. In fact, the ruling government is "the head of the family, the patriarch." Therefore, the Chinese look to the government for guidance as if they are listening to their father who, according to Chinese tradition, enjoys high reverence from the rest of the family. Furthermore, "still another tradition that supports state control of music is the Chinese expectation of a verbal 'message.'" A "verbal message" is the underlying meaning behind people's words. In order to get to the "verbal message," one needs to read into words and ask oneself what the desired or expected response would be.
Smaller vs larger denominations in crypto
Large denominations produce a different psychology (the psychology of scarcity). This has a problem, though: if, for example, 1 ETH or 1 BTC costs $1000, it eventually begins to look like it's just for rich people. To some in developing countries it begins to look simply too expensive. Ripple, on the other hand, has smaller denominations and still has quite a high market cap regardless.
From the official documents we know that the Agoras tokens currently being sold on exchanges are "intermediate tokens". There are going to be roughly 42 million intermediate tokens. Looked at this way, people might think the price of AGRS is high after a certain psychological barrier such as $100. At $100 per intermediate token the market cap would be $4.2 billion. In crypto this is not that high a market cap, and in tech it is not high either: a tech company can easily reach a market cap of $4 billion, and Snapchat, remember, reached a market cap far beyond that.
In the case of Tauchain, whose goal is to reveal to the world truly novel technological breakthroughs providing unique features, we cannot predict where the high end for AGRS will be. What we can know is that the price looks vastly different depending on whether we view it through intermediate tokens or official Agoras tokens. According to Ohad, 147,000,000,000 tokens can exist.
So we do not sell Zennet coins anymore, and all previous buyers will get Agora, offering no less but much more technological and economical features. The current sale terms are as follows: From now on, we sell 50% of coins for approx $2M: The current price is $100 for 3.5 million (3,500,000) Agora coins, and will go up in 2% every week. Total number of coins is 147,000,000,000.
In this case the true number is 147 billion. These are Ripple-like numbers. So a price of between $0.01 and $0.04 per token is reasonable in a good market. The $1 range applies only if AGRS achieves Bitcoin-level price success, which in my opinion would be extremely optimistic. Ultimately no one can predict where the price could move, and currently 1 true AGRS is worth less than a penny.
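A quick sanity check of these figures (supply numbers are from the quoted sale terms; the 3,500-to-1 ratio between true and intermediate tokens is implied by 147 billion divided by 42 million):

```python
TRUE_SUPPLY = 147_000_000_000     # total Agoras tokens, per Ohad
INTERMEDIATE_SUPPLY = 42_000_000  # intermediate tokens on exchanges

ratio = TRUE_SUPPLY / INTERMEDIATE_SUPPLY
print(ratio)                            # 3500.0 true tokens per intermediate

print(100 / ratio)                      # ~0.0286: $100/intermediate, in true-token terms
print(100 * INTERMEDIATE_SUPPLY / 1e9)  # 4.2: the $4.2B market cap at $100

# The $0.01-$0.04 "good market" range implies a market cap of:
print(TRUE_SUPPLY * 0.01 / 1e9, TRUE_SUPPLY * 0.04 / 1e9)  # 1.47 to 5.88 ($B)
```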
For the people who did take the risk of buying Agoras tokens at $100 for 3.5 million: if it ever does reach the $1 level (Ripple- or Bitcoin-scale success), then you folks are multi-millionaires in the making.
Tauchain and the mysterious Futamura projections. By Dana Edwards. Posted on Steemit. October 15, 2018.
Futamura Projections and Partial Evaluation
While we know Futamura projection is a planned and necessary feature, it is also unlikely that most of us even know what a Futamura projection is. In fact most people do not even fully understand what BDDs (binary decision diagrams) can do in particular.
One video which can help for those who wish to study further is:
The distinction must be made between "Boolean Algebra" and "The Algebra of Boole". The Algebra of Boole is pertinent to understanding the BDD aspect of TML. Disclosure: I am not a mathematician, so the video above goes into a level of detail on which I am not qualified to claim any expertise. If you choose to take on the herculean task of carrying that cognitive load, please do so at your own risk. If you are really brave you can also check out the work of Boole himself directly.
For all who have suffered through the cognitive workload presented in that video, the next part of this discussion covers the capabilities and process of Futamura projection.
The formula below concisely represents what partial evaluation is. Given a program p, static inputs SI, dynamic inputs DI, and outputs O, a partial evaluator mix produces a specialized (residual) program such that:

[[p]](SI, DI) = [[ [[mix]](p, SI) ]](DI) = O

Running p on all of its inputs gives the same output O as first specializing p with respect to the static inputs SI and then running the residual program on the dynamic inputs DI alone.
We can input the description of our translator. Our translator can either be a compiler or an interpreter. What we want to describe is the process by which the defined language can translate to another language. Using an interpreter we can describe the semantics of our programming language.
How do you compile a compiler?
At the most simple and basic level we start with one input and one output. In the abstract, you input your commands into the box and the box produces an output based on those commands. Most very simple software works this way. A compiler basically takes input (source code) and produces output (a program). The source code comprises the acceptable commands from which the compiler produces a program with the appropriate behavior. In essence we can think of the box as nothing more than a translator which takes in one set of symbols and produces another set of symbols as output.
Futamura offers three projections. This is a self-referential process, so what if instead of just one input into the box we now have two? With two inputs we can not only send source code into the box and watch it translate into a program; we can go further and create an "interpreter". Using this second input we can define the behavior of the box by sending a description of how we want the box to behave. In other words we can now rely on an interpreter, which is distinct from a compiler in that it translates one statement at a time. Compilers, interpreters, and assemblers are all translators, so ultimately symbol manipulation is at the core of all this activity.
To compile a compiler you take an interpreter as input and get a compiler as output. Wikipedia provides the three projections:
1. Specializing an interpreter for given source code, yielding an executable.
2. Specializing the specializer for the interpreter (as applied in 1), yielding a compiler.
3. Specializing the specializer for itself (as applied in 2), yielding a tool that can convert any interpreter to an equivalent compiler.
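Since the projections are abstract, a minimal sketch may help. The Python below is my own toy illustration (it is not TML): it shows the first projection, specializing a tiny interpreter on fixed source code to obtain an "executable". The second and third projections arise from feeding the specializer the interpreter, and then the specializer itself.

```python
# A toy interpreter: the program (static input) is a list of
# ("add", n) / ("mul", n) instructions applied to the dynamic input.
def interpret(program, dynamic_input):
    acc = dynamic_input
    for op, n in program:
        if op == "add":
            acc += n
        elif op == "mul":
            acc *= n
    return acc

# A trivial "mix": fold the static input into a residual function of the
# dynamic input alone. (A real partial evaluator would also unfold the
# loop and emit straight-line code rather than a closure.)
def specialize(program):
    def residual(dynamic_input):
        return interpret(program, dynamic_input)
    return residual

# First projection: specializing the interpreter on a fixed program
# yields a "compiled" executable for that program.
double_plus_one = specialize([("mul", 2), ("add", 1)])
assert double_plus_one(5) == interpret([("mul", 2), ("add", 1)], 5) == 11
```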
Concretely, Ohad will have to rely on TML to compile TML, using Futamura projection 3 in the list above. In essence he will have to compile TML by using TML. This is the most confusing aspect to explain because it is a mode of self-reference in which TML is essentially used to create itself: the specializer is specialized for itself.
In my opinion this is a moment similar to when Satoshi Nakamoto mined the genesis block to prove Bitcoin could be built. If Ohad can achieve the feat of compiling TML using TML, then we will know from this that TML works. From this we can know, at minimum, that Tauchain is feasible on the most basic level. The question of the logic of course remains. While in theory we know the logic is supposed to work, it is also an area of theory which very few of us understand well. If it is demonstrated that this logic does in fact work as intended, then we will know for certain that Tauchain is feasible.
Futamura projection is perhaps one of the most difficult parts of TML to explain conceptually due to the self referential nature. Excuse me if I made any errors in my attempt to explain it.
Boole, G. (1847). The Mathematical Analysis of Logic.
Aside from my favorite, Liechtenstein, there are a number of other jurisdictions (the Cayman Islands, Malta, Gibraltar, Bermuda, the UAE, and more) vigorously advertising their dedication to becoming the global crypto hub.
The logic behind this is clear: crypto shows unprecedented rates of wealth accumulation and growth, orders of magnitude stronger than anything classically fiat.
Practically all of these jurisdictions have been serious, well-established financial centers for decades if not centuries, with entire societies making their living from them.
And all of them apparently consider the following two approaches sufficient means to that end:
A. Legislative changes
B. Regulatory effort bias
Are A and B enough? No. Why not? The four points which follow explain.
1. Growth

The power of crypto, viewed most generally, stems from its essence as an economy optimization mechanism:
''Depending on the parameters used in the optimization mechanism, the algorithm can build three types of networks: a star network, a random network, and a scale-free network.''
The only form of sustainability we have ever known, and still know, is growth.
Just have a look at global GDP versus global wealth, and at the labor/capital factor ratio, to see the reproduction/replacement rate required and the actual durability of value per average product.
Only a scale-free system can grow without hitting inherent bottlenecks.
Crypto is strong in growth; therefore a successful crypto regulation must match it in growth mode.
Only a regulatory algorithm which builds a scale-free compliance mechanism can be a match able to harness the full potential of the subject of regulation.
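As a concrete illustration (my own sketch, not the algorithm from the quoted source), the snippet below grows a network by preferential attachment, the standard mechanism behind scale-free structure: new nodes attach to existing nodes with probability proportional to degree, so a few hubs accumulate most of the connections while the network keeps growing.

```python
import random

def grow_network(n_nodes, m=2):
    """Barabasi-Albert style growth: each new node attaches to m existing
    nodes chosen with probability proportional to their current degree."""
    edges = [(0, 1)]   # seed network
    stubs = [0, 1]     # each node appears once per edge endpoint,
                       # so uniform choice from stubs is degree-proportional
    for new in range(2, n_nodes):
        targets = set()
        while len(targets) < min(m, new):
            targets.add(random.choice(stubs))
        for t in targets:
            edges.append((new, t))
            stubs += [new, t]
    return edges

degree = {}
for a, b in grow_network(1000):
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1
# max degree far exceeds the median: the heavy-tailed scale-free signature
print(max(degree.values()), sorted(degree.values())[len(degree) // 2])
```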
2. Compliance complexity

Composing legal texts (enacting statutory amendments or entire novel codes of law, exercising administrative and judicial clarifying practices, and so on) is a finite task, easy to grasp even at the individual human level.
That is the nice picture on the front end. What happens on the back end, though?
Applying and enforcing specialized crypto legislation, however, in recombination with all the rest of the 'jungle of norms', is a mathematically intractable task. It inevitably leads to the necessity of sharp policy turns, which shake confidence in the long-term stability of the jurisdiction. And for businesses, regulatory predictability is synonymous with attractiveness.
Compliance is, at the end of the day, the real-world work of satisfying all the requirements from all directions in a way that stays within the regulator's sensitivity specs. Literally.
Compliance has astronomically higher complexity than legislation.
Thus it is immensely easier to write laws than it is to apply them.
We see dozens of countries with splendid legislation packages yet a lousy job and meagre record of actually applying their own laws, and an utter lack of success in attracting valuable businesses.
That is because enforcement is a function of the actual implementation infrastructure in place. And because of scaling issues, even if we can afford it, it is not the best possible solution to just ''put together as much as needed'' without seeking more efficient ways of processing. Mere extensive/quantitative growth always hits the limits of diminishing returns; it is satiable, in contrast with intensive/qualitative growth.
3. Administrative emphasis
That means positing 'crypto first' as a regulator's policy commandment. OK, it works well for the initial herding of as many firms as possible from the not-so-numerous crypto businesses, now, so close to the dawn of the crypto age; but is it able to handle future growth?
A possible solution to the problem of handling scaling business with non-scaling regulation is the ''that's enough'' way, where the jurisdiction takes in only as much as its actual regulatory capacity allows. But such a ''closed club'' approach not only limits its overall turnover far below the potential; it also makes the jurisdiction prone to entrapping itself in dependency on the admitted firms' policies and, more importantly, kills off its own capacity for developmental agility. It is similar to the 'oil trap', but in the financial services sector.
We have not even mentioned the resource displacement factor, where 'priority treatment' of one sector trades off against the diminishing capacity of the regulator to do their 'regular job'. This risks a 'sparrow in hand vs eagle in the sky' situation.
4. Capital costs
Compliance is an extremely large and extremely financial- and human-capital-intensive activity. Leading fintech businesses report over 20% of their workforce dedicated to compliance, and this is the most expensive workforce.
Financial capital is abundant and easy to redirect from other sectors of the economy; although it is not unlimited, we are very far from a situation in which its demand could hit a supply brick wall.
Even so, increasing the intake of even easily reproduced non-human capital bears the high opportunity cost of displacing resources from production into control. In general, if it is too expensive to control, we are on an economy-suffocating curve similar to the ill effects of high public spending, high public debt and/or high taxation!
The immediate trouble, however, is the cost of compliance human capital. Typically the compliance staff in firms, and the staff in the regulators, consist of human beings of good upbringing, above-average IQ, a two-digit number of years of very special education and very formal qualifications, a clean record, brilliant CVs, a heavy professional-organizations regime (compliance of the compliancers!), and a tough clearance regime...
It is not high salaries that bottleneck the expansion of compliance system capacity so much as the fact that it takes literally decades to produce a person competent to occupy a compliance position.
Compliance is inevitable.
It is the equivalent of consciousness in business.
We live in an exponentially interconnected world.
Various transnational levels of rules and standards emerge and develop (Basel II, Solvency II, blacklists, and so on).
Heavy compliance is here to stay and grow further!
Simplification of compliance is unthinkable, both because of its sheer practical infeasibility and because nobody can afford to lower their compliance criteria under the threat of being instantly disconnected, blacklisted, and isolated.
Quite the opposite: nowadays it is precisely the countries with the highest and strongest regulatory and compliance regimes and standards that are considered the safest and most attractive.
What's the way forward?
In the last few years we have heard the word RegTech more frequently and more loudly.
*Map source: CB Insights.
''At risk of sounding too simple, RegTech is pretty much what it says on the tin: the use of new technology to facilitate the delivery of regulatory requirements. Or, in slightly more words, RegTech is technology that seeks to provide “nimble, configurable, easy to integrate, reliable, secure and cost-effective” regulatory solutions (Deloitte).'' 
The emphasis on compliance is clear.
What's important is the direction: RegTech is contemplated nowadays as an array of auxiliary, separate businesses that provide compliance data services to companies and regulators; intermediaries between firms and regulators that assist the former in coping with the growing amount and complexity of requirements from the latter. The workflow geometry is not changed at all: it still runs from the firms (via RegTech companies) to the regulators.
What if RegTech went the other way round, as the inner workings of the Regulator itself?
Regulators as automata? As platforms for compliance services themselves?
Regulation and compliance as a single automated process?
A smart law, written and run in a decidable language.
In my recent ''Tauchain Exegesis .: Nomic''  essay I outlined some thoughts of mine on the subject:
''In order for law to become law it must become hands-free.
Not humans reading laws, but laws reading laws.
The technology to enable that looks to be within arm's reach.''
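To make "laws reading laws" concrete, here is a toy sketch of my own in Python (TML itself is a Datalog-like decidable logic language; nothing below is actual TML or actual law, and the predicates are hypothetical). The rules are data, and compliance over a finite fact base is derived mechanically, so the check always terminates:

```python
# Facts about entities, and a rule: an entity may trade if it is licensed
# and has completed KYC. (Hypothetical predicates, for illustration only.)
FACTS = {("licensed", "acme"), ("kyc_done", "acme"), ("licensed", "globex")}
RULES = [("may_trade", [("licensed",), ("kyc_done",)])]

def derive(facts, rules):
    """Naive forward chaining to a fixed point over a finite domain."""
    entities = {e for _, e in facts}
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            for e in entities:
                if all((p, e) in facts for (p,) in body) and (head, e) not in facts:
                    facts.add((head, e))
                    changed = True
    return facts

closed = derive(FACTS, RULES)
print(("may_trade", "acme") in closed)    # True: both conditions hold
print(("may_trade", "globex") in closed)  # False: no KYC fact, non-compliant
```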
(1) - RegTech, becoming synonymous with fully and truly automated regulation and compliance, does not make the classical governmental apparatus of legislation and regulators obsolete; it optimizes their performance. Applied and internalized into the legal process, RegTech is still a matter of legislative enactment and regulatory control per se.
(2) - RegTech machinery and its consumables are subject to Moore's law, Koomey's law and other such scaling laws. Computational physics provides infinitely more room for scalability than human biology and sociology do; i.e., regulation growth matching the growth of the regulated.
(3) - Given a sufficiently capacious automated compliance processing system, the limits of the implementing nation's demographic base won't be an issue. It won't need to suffer the ills of undesired immigration, nor the security risks inherent in transnational outsourcing of subprocesses.
(4) - The strategic advantage of 'escape velocity' means that the jurisdiction which manages to master this first will take and rule it all, and will be best positioned to grab the even newer opportunities to come.
 - https://steemit.com/tauchain/@karov/de-lege-ferenda-intro
 - https://steemit.com/tauchain/@karov/de-lege-ferenda-koine
 - https://caymannewsservice.com/2018/04/cryptocurrencies-cayman-framework/
 - https://www.bloomberg.com/news/articles/2018-04-23/how-malta-became-a-hub-of-the-cryptocurrency-world-quicktake
 - https://oracletimes.com/gibraltar-is-the-first-regulated-cryptocurrency-country/
 - https://www.nytimes.com/2018/07/29/technology/cryptocurrency-bermuda-malta-gibraltar.html
 - https://www.albawaba.com/business/uae-set-become-global-leader-blockchain-cryptocurrencies-1119042
 - https://steemit.com/blockchain/@karov/geodesic-by-tau
 - https://steemit.com/tauchain/@karov/tauchain-the-hanson-engine
 - https://steemit.com/tauchain/@karov/masa-effect-with-tauchain
 - https://steemit.com/tauchain/@karov/scaling-is-layering
 - https://steemit.com/tauchain/@karov/tauchain-transcaling
 - https://steemit.com/tauchain/@karov/tauchain-trumps-procrustics
 - https://steemit.com/tauchain/@karov/tauchain-as-szabo-booster
 - https://steemit.com/tauchain/@karov/tauchain-and-the-cost-of-trust
 - https://steemit.com/tauchain/@karov/clusivity-by-tauchain
 - https://www.forbes.com/sites/kenrapoza/2017/09/15/tax-haven-cash-rising-now-equal-to-at-least-10-of-world-gdp/#3a046d6770d6
 - https://en.wikipedia.org/wiki/Means_to_an_end
 - https://en.wikipedia.org/wiki/Optimization_mechanism
 - https://en.wikipedia.org/wiki/Gross_world_product
 - https://www.credit-suisse.com/corporate/en/articles/news-and-expertise/global-wealth-outlook-201712.html
 - https://en.wikipedia.org/wiki/Cobb%E2%80%93Douglas_production_function
 - https://en.wikipedia.org/wiki/Resource_curse
 - https://en.wikipedia.org/wiki/Capital_intensity
 - https://en.wikipedia.org/wiki/Financial_technology
 - https://www.crowdfundinsider.com/2018/03/130076-congressional-hearing-cryptocurrencies-provides-interesting-perspective-legislators-industry-alike/
 - https://en.wikipedia.org/wiki/Opportunity_cost
 - https://www.heritage.org/budget-and-spending/report/the-impact-government-spending-economic-growth
 - https://mises.org/library/whats-wrong-government-debt
 - https://mises.org/library/corporate-taxes-suffocate-growth
 - https://en.wikipedia.org/wiki/Human_capital
 - https://en.wikipedia.org/wiki/Basel_II
 - https://en.wikipedia.org/wiki/Solvency_II_Directive_2009
 - http://crwwgroup.net/en/eu-offshore-blacklist/
 - https://www.cbinsights.com/research/regtech-regulation-compliance-market-map/
 - https://complyadvantage.com/blog/what-is-regtech/
 - https://en.wikipedia.org/wiki/Decidability_(logic)
 - https://steemit.com/tauchain/@karov/tauchain-exegesis-nomic
 - http://www.idni.org/blog/
 - https://en.wikipedia.org/wiki/Moore%27s_law
 - https://en.wikipedia.org/wiki/Koomey%27s_law
 - https://medium.com/the-mission/the-1-percent-rule-why-a-few-people-get-most-of-the-rewards-d92ca43baa0e
Tauchain: The Social Dispersed Computer introduced as a Social Network? By Dana Edwards. Posted on Steemit. October 12, 2018.
How might a Tau Operating System, via a Tau Social Dispersed Computer, function?
We know from tauchain.org that the first iteration of Tau is to be a discussion platform not too dissimilar from Facebook. Of course this would simply be the front end, or the "face", of what could behind the scenes evolve toward a social dispersed computer complete with a dispersed operating system. The resources have to be managed, and a kernel could provide for this in a manner not dissimilar to what we see with EOS. The Agoras or AGRS token specifically represents "resources", as it is the tokenization of resources for whichever applications Tauchain will run.
TML provides the basis from which to create the necessary languages to produce a dispersed operating system computer. Zennet even has an algorithm, which Ohad himself worked on, for calculating resource requirements. All minds will (at least in theory) be able to contribute toward the computational resources of Tauchain.
Because of Zennet there may in fact be no limit to the amount of computational resources we could throw at the supercomputer. It will of course depend on resource management, which is where a kernel likely comes into play, because any smart apps built to run on Tau will have to ask for resources. Resource management is one of the core functions of a kernel and of an operating system, which is why I think it is likely that Tauchain will have one. I think the Ethereum route shows the problems with scaling, as applications have to compete for resources in a way the network cannot self-manage. CryptoKitties, for example, can render the whole Ethereum network lagged, and if this is a computer then a nonsense app could disrupt more critical apps.
A prime example of a potential smart app for Tauchain
An example (which may or may not be feasible) is a health and fitness app. The app in theory could allow any user to provide data such as genetic information, blood test results, exercise tracking, blood pressure, blood sugar, and anything else. All of this could provide a feedback loop back to the patient on how to improve their health over time, based on the knowledge of Tau. As technology gets better, users could add more devices to provide more data for a better feedback loop. As technology evolves, FPGAs could be added to meet the demand for calculations, and storage can be rented as well.
An operating system could give priority to this kind of app by load balancing the resources. How would it know to do this? Tau could learn the morals and legal ramifications, and a consensus could emerge that health-related apps deserve premium access to resources because they can save lives.
The Paradigm of Social Dispersed Computing and the Utility of Agoras. By Dana Edwards. Posted on Steemit. October 12, 2018.
Social Dispersed Computing
What is social dispersed computing? It is an edge-oriented computing paradigm which goes beyond cloud and fog computing. To understand social dispersed computing we first have to discuss dispersed computing and how it differs from the previous paradigm of cloud and fog computing. The current trend toward decentralized networks, which we first saw with peer-to-peer technologies such as Napster, Limewire, and BitTorrent, and later with Bitcoin, has brought us an opportunity to conceive of new paradigms. The original model most people are familiar with is the client-server model, which was very much limited in that the server was always vulnerable to DDoS attacks. The client-server model has never been, and could likely never be, censorship resistant.
In the client-server model the server could simply shut down, as was the case with Bitconnect, or it could be raided. The server could also be taken down by hackers who simply flood the site with requests. From the problems the client-server model presented, we discovered the utility of the peer-to-peer model. The peer-to-peer model was all about censorship resistance and promoted a network with no single point of failure (no single point of attack) whose loss could result in the shutdown of access points to the information. Among the first applications of these peer-to-peer networks were file sharing and networks such as Freenet and Tor. This of course eventually evolved into Bitcoin, which ultimately led to the development of Steem.
In dispersed computing a concept is introduced called "Networked Computation Points". An NCP can execute a function in support of user applications. To elaborate further I'll offer something below.
Consider that every component in a network is a node. Now consider that every component node is an NCP, in that it can execute some function to support some user application. If we think of a blockchain, for example, we know mining fits into this category because a miner is both a node in the network and can execute a function in support of Bitcoin transactions. Why is any of this important? Parallelism is something we can gain from dispersed computing, and please note that it is distinct from concurrent computing. When we rely on parallelism we can reap performance benefits when executing code by breaking it up into many small tasks which can be performed across many CPUs.
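As a minimal illustration of that payoff (my own sketch, nothing Tauchain-specific), the snippet below splits one job into eight independent tasks and runs them across CPU cores:

```python
from multiprocessing import Pool

def task(chunk):
    # any CPU-bound work stands in here: sum the squares of a chunk
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::8] for i in range(8)]  # 8 small independent tasks
    with Pool(processes=8) as pool:
        partials = pool.map(task, chunks)    # executed in parallel across cores
    print(sum(partials) == sum(x * x for x in data))  # True: same result, faster
```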
EOS attempts to leverage parallelism specifically to enable its performance boost. The benefit is speed and flexibility. Think for example of the hardware side, with FPGAs, which can do tasks similar to a microprocessor's. FPGAs, unlike ASICs, provide generalized, flexible parallel computing. Consider that, just as with mining, a company could add more and more FPGAs to scale an application as needed.
To understand Social Dispersed Computing we have to note the fact that there are other users at any given time. The other users in the network participate to provide resources to the network for the benefit of other users whilst using the network themselves. So on Steem, for example, as you add content you are adding value to Steem in a direct way, but also in a dynamic way. The resources on Steem can also adapt dynamically to demand, provided that the incentive mechanism (Resource Credits) works as intended.
EOS as an example DOSC (Dispersed Operating System Computer)
Because EOS seems to be the first to approach this holistically, I will give the EOS network credit for pioneering dispersed computing in the crypto space. All resources are representable by tokenization in a dispersed computing network. EOS and even Steem have this. Steem has it in the form of "Resource Credits", which represent the available resources on the Steem network. If more resources are needed, then theoretically the Resource Credits could act as an incentive to provide those resources to the Steem network. This provides a permanent price floor for Steem, represented as the amount of Steem which would have to be purchased in order to have enough resources to run Steem (if I have the correct theoretical understanding). This would put Steem on a trajectory toward dispersed computing.
Operating systems typically sit between the hardware and software as a sort of abstraction layer. This traditionally has been valuable because programmers don't have to speak directly to the hardware, and hardware designers don't have to communicate their designs directly to the programmer. In essence, the operating system in the traditional model is centralized and made by a company such as Microsoft or Apple. This centralized operating system typically runs on a device or set of devices and provides some standard services such as email, a web browser, and maybe even a Bitcoin wallet.
Typically the most valuable or highest-utility software people consider on a computer is the operating system. In our smartphones this is Android OS, and on PCs it may be Windows or Linux. This is of course turned on its head under the new paradigm of dispersed computing and the new conceptual model of the "decentralized" operating system. EOS is the first to attempt a decentralized operating system using current blockchain technology, but the upcoming technology easily eclipses what EOS can do. Tauchain is a technology which, if successful, will leave EOS in the stone age in terms of what it will be able to do. EOS, while ambitious, has also had its problems with regard to its voting mechanisms and the ease with which collusion can take place.
To better understand how decentralized operating systems emerge learn about:
If we look at OSKit we see that it is the set of tools necessary for operating system development. If we look at Tauchain we realize that it is strategically the most important tool for the development of a decentralized operating system, provided in the form of TML (a partial evaluator). If we think of the primary tool necessary to develop from, we have to start with a compiler. A compiler generator is more like what TML allows with its partial evaluator. More specifically, it is the feature of Futamura projection which can provide the ability to generate compilers.
If we look at the next most important part of an operating system it is typically the kernel. Let's have a look at what an exokernel is:
Operating systems generally present hardware resources to applications through high-level abstractions such as (virtual) file systems. The idea behind exokernels is to force as few abstractions as possible on application developers, enabling them to make as many decisions as possible about hardware abstractions. Exokernels are tiny, since functionality is limited to ensuring protection and multiplexing of resources, which is considerably simpler than conventional microkernels' implementation of message passing and monolithic kernels' implementation of high-level abstractions.
By Thorben Bochenek [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
From this at minimum we can see that an exokernel is a more efficient and direct way for programmers to communicate with hardware. To be more specific, "programs" communicate with hardware directly by way of an exokernel. We know the most basic function of a kernel in an operating system is the management of resources. We know in a decentralized context that tokenization allows for incentives for management of resources. When we combine them we get kernel+tokenization to produce an elementary foundation of an operating system. In a distributed context we could apply a decentralized operating system in such a way that the network could be treated as a unified computer.
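As a thought experiment (my own toy sketch, not any actual Tau, Agoras, EOS, or Steem design), kernel-plus-tokenization can be pictured as a scheduler that grants resource requests only against token balances:

```python
class TokenKernel:
    """Toy kernel: tokens meter consumption of a scarce network resource."""
    def __init__(self, balances, cost_per_unit=1):
        self.balances = dict(balances)  # account -> token balance
        self.cost = cost_per_unit

    def request(self, account, units):
        """Grant `units` of compute/storage only if the account can pay."""
        price = units * self.cost
        if self.balances.get(account, 0) < price:
            return False                 # insufficient tokens: request refused
        self.balances[account] -= price  # spend tokens to consume resources
        return True

kernel = TokenKernel({"health_app": 100, "kitty_game": 5})
print(kernel.request("health_app", 50))  # True: paid for out of its balance
print(kernel.request("kitty_game", 50))  # False: one app cannot starve the rest
```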
Abstraction is still important by the way. In an operating system we know the object oriented way of abstraction. Typically the programmer works with the concept of objects. In an "Application Operating Environment" an "Application Object" can be another useful abstraction. Abstraction can of course be taken further but that is for another blog post.
The Utility of Agoras
Agoras+TML is interesting. Agoras is the resource management component of what may evolve into the Tau Operating System. This Tau Operating System or TOS is something which would be vastly superior to EOS or anything else out there because of the unique abilities of Agoras. The main abilities have been announced on the website such as the knowledge exchange (knowledge market) where humans and machines alike can contribute knowledge to the network in exchange for the token reward. We also know that Agoras will have a more direct resource contribution incentive property in the form of the AGRS token so as to facilitate the sale or trade of storage, bandwidth or computation resources.
The possible (likely?) emergence of the Tau Operating System
In order for Tauchain to evolve into a Dispersed Operating System Computer it will need an equivalent to a kernel: some means of allowing whoever is responsible for the Tauchain network to control and manage the resources of that network. If, for example, the users decide, then by way of discussion there would be a formal specification or model of a future iteration of the Tauchain network. According to current documents, this is what would produce the requirements for the Beta version of the network to apply program synthesis. Program synthesis in essence could result in a kernel, and from there the components of a Tau Operating System could be synthesized in the same way. Just remember that all that I write is purely speculative, as we have no way to predict with certainty the direction the community will take during the alpha.
The importance of modeling opinion dynamics in Tauchain. By Dana Edwards. Posted on Steemit. October 9, 2018.
The videos I recommend anyone watch to understand the importance of this are listed below:
Opinion dynamics modeling in society (part 1)
How do governments determine policy priorities?
The Hidden Trump Model - Opinion Dynamics w/ Social Desirability Bias - H. Zontine & S. Davies
Tauchain is unique because it can aggregate opinions into consensus and toward synthesis
To understand what Tauchain is trying to do, we have to understand that in the beta network of Tauchain, consensus = synthesis. Synthesis in this case is program synthesis. In other words, the product of consensus is the software. The consensus emerges from discussion. During this discussion, opinions will be broadcast in such a way that agreements are reached. These agreements will form the basis of the specification from which program synthesis can produce, or output, the software.
The problem Tauchain will face is the same problem any preference aggregation optimization network will face. In other words, just because people have preferences and try to express them does not mean these preferences will be effectively expressed. In my other post I identified a specific problem, summed up in the question of whether you can effectively aggregate preferences when false preferences are being expressed. This problem has been called preference falsification, and in general it seems to make the case for why privacy is necessary.
Tauchain promises to scale discussion which is great but the problem is some discussions cannot be had at all. Some discussions are so controversial that people cannot even attempt to start them. For these discussions only privacy would allow for the discussion to take place. Of course this doesn't mean discussions will be equally productive even if privacy was allowed.
What is so important about modeling opinion dynamics?
Opinions have to be formed. How are opinions formed? If an agent must decide to be pro or con on some specific issue, can we model this process? The utility of doing so is explored in the video below:
The mathematics of influence is the title of the video above. In other words it might be possible to use Tau not just to scale discussion but to discuss how to better discuss. To improve opinion formation or to at least understand how opinions are being formed in the network could be of utility. The more participants in the discussion, the bigger the network, the more important the mathematical models could become.
How do we deal with problems such as bias, including racism, sexism, and so on? Any kind of cognitive bias can influence opinion formation, but how? Ultimately, if we do not understand how to model or think about these things mathematically, then it is going to be much harder to examine in depth what is going on. For people who are mathematically inclined and who understand the danger of bias in AI, this may be of interest.
The voter model is especially interesting. It examines how opinions on whom to vote for form. Under this model a node is picked at random from the network, a neighbor of that node is selected, and the node adopts the neighbor's opinion. Which opinion wins out? That of the high-degree nodes (hubs), which have the highest probability of being connected to. This could mean a lot for an election or for opinion shaping. To me this resembles the thought-leader paradigm, where the most connected thought leader expresses an opinion in the group, and because a lot of people are connected to them in some direct or indirect way, their opinion carries far more weight. If those thought leaders are zealots (they will not change their mind no matter what new evidence they receive), then these individuals have even more influence on the outcome and on opinion formation.
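A minimal simulation of my own makes the zealot effect visible (a complete graph is assumed for simplicity; the videos above treat general networks):

```python
import random

def voter_model(n=100, zealots=5, steps=20_000):
    # opinions are 0 or 1; the first `zealots` nodes hold 1 and never update
    opinion = [1] * zealots + [0] * (n - zealots)
    for _ in range(steps):
        node = random.randrange(zealots, n)  # zealots are never picked to update
        neighbor = random.randrange(n)       # random neighbor (complete graph)
        opinion[node] = opinion[neighbor]    # adopt the neighbor's opinion
    return sum(opinion) / n

print(voter_model())  # tends toward 1.0: a few inflexible nodes drag consensus
```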
The Era of Signals and Changing Power Dynamics. By Dana Edwards. Posted on Steemit. October 8, 2018.
The world we live in is rapidly changing. For instance, the #MeToo era has arrived. This new era shows us that any individual in any position in society can be brought down. It proves a point that many in the blockchain community may have known instinctively: any individual source of authority and/or power can and may be removed from that position. Some people actively seek these positions of power for their own reasons, and some of them abuse their positions. People who seek power for the wrong reasons and then abuse it are, in my opinion, a risk which positions of authority bring (and which blockchain technology may help reduce).
What are signals and what is signalling theory?
Social desirability bias is a popular topic in academic circles. To explain:
In social science research, social desirability bias is a type of response bias that is the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others. It can take the form of over-reporting "good behavior" or under-reporting "bad," or undesirable behavior. The tendency poses a serious problem with conducting research with self-reports, especially questionnaires. This bias interferes with the interpretation of average tendencies as well as individual differences.
People tend to want to be liked/loved. When asked questions on a survey, people may feel pressured to answer in a way they think will be viewed more favorably by others. In other words, rather than answering in a manner which reflects what they truly think or feel, they will assess how others might judge their response and then answer in the way they think will be judged most favorably.
A full video on this topic is below:
Social desirability bias is exactly why voting on platforms such as Steem will not work. When voting is public then most of the research seems to show that people will feel pressured to answer the question not in the way which they really believe or prefer but in the way which they think the whales want them to vote or prefer. In other words because on Steem the whales can reward (or punish) anyone who votes in ways which go against "political sensibilities" it is likely that social desirability bias applies particularly on DPOS style consensus platforms. If there are votes and the votes are not encrypted (secret) then we have no way to determine which votes are legitimate and which votes are the result of signalling (such as virtue signals).
For example when it was Trump vs Hillary the polls suggested Hillary would win. This is because there likely was social desirability bias which made it socially undesirable for anyone to admit they voted for Trump. As a result people who voted for Trump or who planned to vote for Trump may have said in public that they intended to vote for Hillary. Because the votes in the election are secret the people who may have seemed like loud Hillary supporters could have been secret Trump supporters in disguise.
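A toy simulation of my own makes the distortion concrete. Suppose half the population privately supports the stigmatized candidate, but under a public vote only 70% of supporters admit it (both numbers are assumptions for illustration):

```python
import random

def poll(n=100_000, true_support=0.5, admit_rate=0.7, public=True):
    reported = 0
    for _ in range(n):
        supporter = random.random() < true_support
        # under a public vote, some supporters falsify their preference
        if supporter and (not public or random.random() < admit_rate):
            reported += 1
    return reported / n

print(poll(public=True))   # ~0.35: the public poll understates true support
print(poll(public=False))  # ~0.50: the secret ballot recovers it
```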
In some of my previous posts I discuss signalling theory a bit more:
In these posts I have identified that the behavior of individuals is shaped by how individuals think other individuals will think of their behaviors. This applies to what I'll label social desirability optimization: adopting behaviors which provide the expected payoff of being rewarded with improved social desirability.
To provide clarity the definition of social desirability:
Social desirability is the tendency for research participants to attempt to act in ways that make them seem desirable to other people.
In other words people want to be liked. Likeability is a word I can use to simplify the concept of social desirability for readers. In the example with the 2016 election it is clear that supporters of Trump would risk a social stigma with severe social consequences if they came out in public support. This high cost of public support is why some believed that there were secret Trump supporters who were simply afraid of "losing face". In the most simple terms a person can talk red or talk blue depending on where the social stigma is.
One of the stunning conclusions I reached in my own research on this topic is that increasing transparency leads to "preference falsification": a person talking blue while thinking red. If all speech is public (as it is on Steem), then there is the possibility that preference falsification is taking place.
Here is a video on the topic of preference falsification:
Why is this a major problem in the blockchain community? The evolutionary trajectory of a platform relies entirely on market preferences. If censorship exists and conformist pressures hinder true preference aggregation then the developers (and the community itself) will have no way of knowing which improvements to make or which changes would best satisfy the community.
What is leadership and what is the era of signals?
Before I discuss leadership I will first explain what I think leadership means. In my opinion the community must always come first. A person who is put into a leadership position is in what I'll term "the seat of responsibility". This is in my opinion not an enviable position to be in, but someone has to be in it. For example, a person who receives a security clearance is now in a position of heavy responsibility: the information they protect is not their secrets but the nation's secrets.
Leadership in my understanding is not about "being in power" but is about serving a community. To be in a "big seat" is to be in a position of responsibility to make decisions on behalf of a community which the chosen person must represent. In other words being in positions of responsibility is entirely about service and not about power. A representative in congress is not in a position of power but in a position to serve their constituents who put them in that position to represent their interests.
In my opinion to be a good leader is to be a great listener. The leader must listen to the community to find out what the community wants and or needs. The leader must listen to the community to determine what the community thinks is right or wrong. The leader then must offer solutions or proposals or policies which satisfies the requirements of the community. What matters more than who is in the seat is the seat itself. This means the Presidency itself matters more than who is in office. The positions themselves matter more than who is in them. Long after whomever is in these positions are gone there will be these positions to be filled. Any leader in any position is replaceable by someone else if they show failure to lead (whether it be a CEO, or a President of a country, or a lead developer, or any other kind of community leader).
In my understanding it is like chess where all pieces on the board can be in various positions. We know in chess that the pawn can become any piece on the board. The point with this analogy is that individuals in my opinion are not likely to remain the source of power in society. The source of power in society is increasingly becoming the community for better or for worse. According to me, to lead is to serve and to lead effectively is to serve effectively.
To accept a responsibility to serve (to lead), it is required to seek feedback from all whom the community servant represents. This does not require voting specifically, but it does require, under any circumstances, a mechanism by which the community can give brutally honest feedback to the system itself. When I say the system itself, I do not mean the feedback must go directly to those who serve the system, but that the system must have a means of collecting data, analyzing data, and then informing those who can improve the system about which changes would best satisfy the needs of the community.
In my opinion this is a very data driven process. I do not think leaders can for example process big data using their brain power. This will require that they harness the power of machines (machine intelligence). There is also risk if all the processing is done by one company (such as Google) just as there is risk if all people rely on Facebook for the news and opinions. We can see that Facebook has the ability right or wrong to shape elections by deforming the news feed or by allowing certain fake profiles to interact on the site. We see that Facebook can ban crypto ads at will for example to enforce certain policies without taking any kind of poll from the community or the users for instance. We simply do not see any poll data from the users which indicated that the users were tired of seeing crypto ads.
Summary of thoughts on leadership:
Augmenting the wisdom of the community as a means of better governance
In a world where the community must decide what to do we have a situation where responsibility is increasingly diffuse. This means while it is true that the signature may come from the face of the community (if it is a human face) it is still the community which has to be capable of wisdom. The problem is most communities in the world do not become wiser as more join the community. A bigger community doesn't produce better policies by merely voting together. The problem is while most people have opinions it does not mean opinions are well informed or scientific or wise. The lack of wisdom in a community results in horrible (harmful) policies, over reactions, systemic bias, and more.
The conclusion I have reached so far is that in order to have better governance in an era where the community is the government it is a requirement that the community be wise. It's not enough to simply give the community unlimited power to shape the future without providing any capacity for the community to be wise or to do research or to solve problems. Voting in the sense we see in elections does not involve informed voters. Information supplied to voters is almost always sub par and voters are expected to trust "opinion leaders" and "opinion shapers" who tell them how to vote and why. Often disinformation shapes elections more than scientific evidence, facts, math, or reason.
As we build blockchain technology I think it is critical that we put great emphasis on data analytics. Data analytics will allow our leaders to make better decisions on our behalf. Blockchain technology will have to rely on data analytics to figure out the potential wants and needs of its participants, users, e-citizens, etc. At the same time, private communication will be a necessity, even if just to conduct surveys. The reason is that people will not necessarily provide their real opinion in a survey which is completely transparent. The only solution I could find to the problem of preference falsification is privacy.
Most important of all, those who are put into positions of leadership are in trusted positions. This includes people who are moderators of forums, people who are lead developers, and people who run exchanges. People in these positions have the responsibility to serve the blockchain community to the best of their ability. The abuse of these positions for personal power or personal gain is a violation of this trust, and in these instances the community can and should select someone else for that position.
Bulbulia, J., & Sosis, R. (2011). Signalling theory and the evolution of religious cooperation. Religion, 41(3), 363-388.
Davis, W. L. (2004). Preference falsification in the economics profession. Econ Journal Watch, 1(2), 359.
Frank, R. H. (1996). The Political Economy of Preference Falsification: Timur Kuran's Private Truths, Public Lies. Journal of Economic Literature, 34(1), 115-123.
Grimm, P. (2010). Social desirability bias. Wiley international encyclopedia of marketing.
Sîrbu, A., Loreto, V., Servedio, V. D., & Tria, F. (2017). Opinion dynamics: models, extensions and external effects. In Participatory Sensing, Opinions and Collective Awareness (pp. 363-401). Springer, Cham.
Voluntary compliance as a necessary feature. By Dana Edwards. Posted on Steemit. September 29, 2018.
Voluntary compliance in moral alignment with the participant
Every platform which is decentralized should, in my opinion, allow its users to comply with the laws of their local jurisdiction to the degree each user thinks is moral. The platform should not enforce the laws of any specific jurisdiction, nor remove the ability of any user to comply with the laws of their local jurisdiction. This means that if a user would like to track and pay taxes, the platform should provide a means for them to do so. This means that if users would like to go through KYC before interacting with ICOs, so that their accounts are whitelisted by banks, they should be allowed to. It also means that if a user thinks a certain regulation or rule or law is immoral, they should be allowed to make up their own mind and take their own risks.
In other words, platforms should not choose for users what is right and wrong. Platforms should simply provide the tools so that each person can decide how much risk they are willing to accept, in alignment with their morality. The ability to comply with the law is necessary for mainstreamability, and mainstreamability is about winning the long war rather than the little battle. In order for crypto to have the maximum positive impact on future generations it must go mainstream and escape from fringe use cases. This applies as much to Steem as it does to Ethereum and to Tauchain. Mainstreamability is the key element that enables mainstream adoption success.
Legal contracts as tokens
Compliance can be modularized, tokenized, decentralized. Legal contracts can become tokens. The risk (and it is real) of money laundering or rogue nations violating sanctions can be reduced by decentralizing AML/KYC. At the same time regional locks in my opinion are one of the worst ideas and should not be technically enforced. Once again compliance should be voluntary but always allowed.
What does it mean to voluntarily comply? A participant can choose to comply to reduce their risks. The participant who does not comply is willing to take the risk of non-compliance. This means compliance is a means of risk reduction. But in a decentralized network, such as a decentralized exchange, the risk is entirely on the users. The users (and only the users) can decide which level of risk is best for them. Developers of the platform should have no responsibility to decide for its users what is morally right or which risks are acceptable or unacceptable to take.
An Update on Tauchain & Agoras (Exchange Listing + Interview Questions). By Kevin Wong. Posted on Steemit. September 15, 2018.
Agoras is getting listed on a new exchange.
Tauchain is a blockchain that doesn't have its own coin, just like the Internet, but Agoras is a project designed to be built on top of it; hence the existence of Agoras Tokens. It's the closest thing to a way to invest in Tau. The demo is coming soon, but the blockchain itself will only come to fruition in 2020 or later, so this is really more of a notice for those who are interested in supporting the project at this relatively risky stage.
For your information, this project has one of the fairest distributions in the space as the team behind it only reserved 3% of the total supply for themselves. At the moment, it's available on Bitshares / Openledger under AGRS (make sure it's the correct asset if you're looking into it).
The new exchange that Agoras will be listed on is https://www.bcex.ca, on the 18th of September 2018. The announcement can be found here. There's also another recent community update that can be found here.
Tauchain is certainly not a project that is easy to comprehend at first. If you have any questions after going through the available materials, feel free to drop a comment or two and I might include them in the written interview I'm planning to forward to the team soon. Thanks in advance!
Also, feel free to drop by the group's Telegram channel: https://t.me/tauchain.
If you have no idea what Tauchain is about but are interested in getting to know more about it, check out these links:-
Not to be taken as financial advice.
Always do your own research.
Paper Wallet for Agoras, Billetera de papel para Agoras. By CapitanArt. Posted on Steemit. January 2, 2018.
Dear friends, I want to share my latest work related to cryptocurrencies. For almost a year I have been following the trail of the TAUCHAIN project. It is very complex and I do not dare to explain what it is about, but I will say one thing: if the project succeeds, it will create a great social and technological revolution. I recommend you do some research and you will see how ambitious it is.
My small contribution has been to create a paper wallet where you can store the official currency of the project, the AGORAS. This security system consists of one code to which you can send your coins and another private, non-transferable code. Below you can see an example of a paper wallet for the BITCOIN coin.
I gladly share my work free of charge with the entire community. For anyone who wants to have their AGORAS printed on paper, the only thing to do is download the Mockup TAUpaperwallet.psd and, once it is open in Photoshop, replace the text layers and QR code images with your own. I use my wallet data from Omniwallet.
Something peculiar to this paper wallet is that the codes are hidden from plain view; use a red filter if you want to see your codes.
If you do not know how to make your own QR codes, download the app from this QR code generator website.
Give it a like if you enjoyed this blog post. Thanks!
Demo coming soon? Tau Meta Language in C++ updated on Github. By Kevin Wong. Posted on Steemit. October 1, 2018.
Awesome visual promotion design by @capitanart for my only other favourite blockchain project besides Steem. Tau for the win! It might sound overly dramatic, but life has never been the same since I came across the following statement back on 31st December 2017:-
"Consider a process, denoted by X, of people, forming and following another process denoted by Y. Tau is the case where X=Y."
Say hello to our little friend above. It's the formula for intelligent decentralized networks. I'd really like to write about this all day and night, but it's just difficult talking about something without a product to inspect. Regardless, it's still very real in my head because it has been shown to be a technical possibility. Is this the alpha protocol, the E=mc² of blockchain technology?
Good news: there's something to show soon. It looks like the MVP release is on the horizon, with new code just out on https://github.com/IDNI/tau. The author, Ohad Asor, also remarked: "the code is written. now i have to fix its bugs."
At only 384 lines of C++ code, what could it possibly demonstrate? If this is indeed the first instance of the Tau Meta Language (TML), it would then need to be rewritten in TML itself for the next significant stage in development. In the meantime, maybe you'd want to read up on the following if you're interested:-
Honestly, I don't really know what to expect at the moment. All I know is that I've never been this excited before. Alright, time to attend to some life obligations before getting back into writing about Tau's development. As always, thanks for reading!
Note: here's part 1 of my series on Tau, more to come soon: https://steemit.com/blockchain/@kevinwong/what-is-tauchain-and-why-it-could-be-one-of-the-greatest-inventions-of-all-time-part-1
Disclaimer: Not financial advice.
Tauchain Update: Significant code changes in Github and discussion of progress. By Dana Edwards. Posted on Steemit. September 30, 2018.
Just several hours ago, lead developer and founder of the Tauchain project Ohad Asor released his most significant code update yet. This blog post discusses some of those updates and puts them into context. In order to make sense of the current codebase ("Tauchain Codebase") I will also discuss a bit about the makeup of the code.
The significant breakthrough - Ohad implements the BDD
First, some might be wondering: what is a BDD? BDD stands for binary decision diagram, a data structure. This data structure is, in my opinion, as significant to Tauchain as the "blockchain" data structure was to Bitcoin. For those who do not have a computer science degree, I will elaborate below on what exactly a data structure is before discussing what a BDD is and why it is so significant.
Brief discussion on what a data structure is
In programming, a data structure is a method of organizing data. For example, a blockchain is all about how records are stored as linked blocks. There are other data structures which support decentralized data management and storage, such as the distributed hash table.
For visualization, a blockchain data structure looks like this:
[Figure: blockchain data structure. By Matthäus Wander, CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0), via Wikimedia Commons]
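As a rough illustration of "records stored as blocks" (a toy sketch of my own, far simpler than Bitcoin's real structure), each block can carry the hash of its predecessor, so that altering any old record breaks every later link:

#include <functional>
#include <iostream>
#include <string>
#include <vector>

// Toy blockchain: each block references the hash of the previous
// block. std::hash stands in for a real cryptographic hash function.
struct Block {
    std::string data;
    size_t prev_hash;
    size_t hash() const {
        return std::hash<std::string>{}(data + std::to_string(prev_hash));
    }
};

int main() {
    std::vector<Block> chain;
    chain.push_back({"genesis", 0});
    chain.push_back({"tx: alice -> bob", chain.back().hash()});
    chain.push_back({"tx: bob -> carol", chain.back().hash()});
    // Verify the chain: every block must point at its predecessor's hash.
    for (size_t i = 1; i < chain.size(); ++i)
        std::cout << "link " << i << ": "
                  << (chain[i].prev_hash == chain[i - 1].hash() ? "ok" : "broken")
                  << "\n";
}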
A hash table looks like this:
[Figure: hash table. By Jorge Stolfi, CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html), via Wikimedia Commons]
Good programmers choose the appropriate data structure to meet the requirements of the project. The BDD was chosen by Ohad specifically because it provides efficiency boosts in a key area necessary for Tauchain to function as intended. Specifically, we know Tauchain requires partial fixed point logic in order to have decidability in PSPACE, and we know Tauchain requires decentralization and efficiency. Efficiency is best understood in terms of the trade-off between time and space: we do not have unlimited amounts of either, so we must sacrifice one in order to get more of the other.
When we look at the codebase we know that Ohad can optimize the code either by sacrificing space, in which case the executable will be bigger but the code runs faster, or by sacrificing time, in which case the executable is smaller (saving memory) but might run slightly slower. This highlights the essential trade-off between time and space when optimizing code, and of course there is more to it, because the algorithms within a codebase have to make similar trade-offs.
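A classic miniature of this trade-off (my own illustrative example, unrelated to the Tauchain codebase) is memoization: we spend memory on a lookup table in order to avoid recomputing the same subproblems.

#include <cstdint>
#include <iostream>
#include <unordered_map>

// Small and slow: recomputes every subproblem, uses almost no memory.
uint64_t fib_slow(int n) {
    return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
}

// Bigger and fast: O(n) extra space buys exponentially less time.
uint64_t fib_memo(int n, std::unordered_map<int, uint64_t>& memo) {
    if (n < 2) return n;
    auto it = memo.find(n);
    if (it != memo.end()) return it->second; // reuse the stored answer
    return memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo);
}

int main() {
    std::unordered_map<int, uint64_t> memo;
    std::cout << fib_memo(50, memo) << "\n"; // instant; fib_slow(50) would take far longer
}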
Now what exactly is a BDD (binary decision diagram)?
Now that we understand the basics about efficiency and what a data structure is, we can make a bit more sense of what a BDD is. In order to understand why the BDD is so important to Tauchain as a data structure, we have to remember that Tauchain is about logic. We can take the most basic example, that of Socrates:
A predicate takes an entity or entities in the domain of discourse as input while outputs are either True or False. Consider the two sentences "Socrates is a philosopher" and "Plato is a philosopher". In propositional logic, these sentences are viewed as being unrelated and might be denoted, for example, by variables such as p and q. The predicate "is a philosopher" occurs in both sentences, which have a common structure of "a is a philosopher". The variable a is instantiated as "Socrates" in the first sentence and is instantiated as "Plato" in the second sentence. While first-order logic allows for the use of predicates, such as "is a philosopher" in this example, propositional logic does not.
Based on the rules of first-order logic we can provide our inputs and receive our outputs. In the most basic example above we can see a bit about how logic works. To elaborate further:
Relationships between predicates can be stated using logical connectives. Consider, for example, the first-order formula "if a is a philosopher, then a is a scholar". This formula is a conditional statement with "a is a philosopher" as its hypothesis and "a is a scholar" as its conclusion. The truth of this formula depends on which object is denoted by a, and on the interpretations of the predicates "is a philosopher" and "is a scholar".
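In code, a predicate is simply a function from entities in the domain of discourse to true or false, and a connective combines such results. A schematic sketch (my own toy domain, nothing Tau-specific):

#include <iostream>
#include <set>
#include <string>

// Predicates over a toy domain of discourse, as boolean functions.
const std::set<std::string> philosophers = {"Socrates", "Plato"};
const std::set<std::string> scholars     = {"Socrates", "Plato", "Euclid"};

bool is_philosopher(const std::string& a) { return philosophers.count(a) > 0; }
bool is_scholar(const std::string& a)     { return scholars.count(a) > 0; }

// The connective "if a is a philosopher, then a is a scholar"
// is material implication: !p || q.
bool philosopher_implies_scholar(const std::string& a) {
    return !is_philosopher(a) || is_scholar(a);
}

int main() {
    std::cout << philosopher_implies_scholar("Socrates") << "\n";   // 1: philosopher and scholar
    std::cout << philosopher_implies_scholar("Archimedes") << "\n"; // 1: not a philosopher, so vacuously true
}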
A truth table has one column for each input variable (for example, P and Q), and one final column showing all of the possible results of the logical operation that the table represents (for example, P XOR Q). Each row of the truth table contains one possible configuration of the input variables (for instance, P=true Q=false), and the result of the operation for those values. See the examples below for further clarification. Ludwig Wittgenstein is often credited with inventing the truth table in his Tractatus Logico-Philosophicus, though it appeared at least a year earlier in a paper on propositional logic by Emil Leon Post.
When we are dealing with logic we may find that a truth table helps with visualization.
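Because a truth table is nothing more than an enumeration of every configuration of the inputs, generating one mechanically is trivial. Here is a short sketch for P XOR Q:

#include <iostream>

// Print the truth table for P XOR Q by enumerating all four
// configurations of the two input variables.
int main() {
    std::cout << "P Q | P XOR Q\n";
    for (int p = 0; p <= 1; ++p)
        for (int q = 0; q <= 1; ++q)
            std::cout << p << " " << q << " |    " << (p ^ q) << "\n";
}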
Now with this knowledge we have the most basic Socrates example:
This syllogism can be represented via a truth table. To solve it we simply apply deductive reasoning: if "all men are mortal" is true and "Socrates is a man" is also true, then "Socrates is mortal" must be true. If we were to say all men are mortal but Socrates is immortal, then Socrates cannot be a man. So if Socrates is a man he must be mortal, or else we have what we call a contradiction. Logic is all about avoiding these sorts of contradictions, and binary or boolean logic in particular always reaches a conclusion which must be one of exactly two possible values.
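The deduction itself can be checked mechanically. In this toy sketch (my own, and vastly simpler than any real reasoner) we encode the rule "all men are mortal" and the fact "Socrates is a man", derive the conclusion, and flag the contradiction that asserting "Socrates is immortal" would create:

#include <iostream>
#include <set>
#include <string>

int main() {
    std::set<std::string> men = {"Socrates"};
    std::set<std::string> mortal;   // facts derived by the rule
    std::set<std::string> immortal; // asserted exceptions, if any

    // Rule: for every x, man(x) -> mortal(x). Apply it to all known men.
    for (const auto& x : men) mortal.insert(x);

    if (mortal.count("Socrates"))
        std::cout << "derived: Socrates is mortal\n";

    // Now assert the opposite and detect the clash.
    immortal.insert("Socrates");
    for (const auto& x : immortal)
        if (mortal.count(x))
            std::cout << "contradiction: " << x << " is both mortal and immortal\n";
}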
If I ask you to play a game which we can guarantee will end in exactly one of two possible outcomes, then we have a good example of a boolean function: 1 or 0, true or false, on or off, a or b.
Some of you may be familiar with the data structure we call a DAG (directed acyclic graph). For those of you who understand this concept, you can visualize a BDD as being very similar to a propositional DAG.
[Figure: directed acyclic graph. By David Eppstein, CC0, via Wikimedia Commons]
We know a DAG consists of a finite set of vertices and edges, and that it admits a topological ordering; if you remember my post on transitive closure, you might also remember the visuals on how that can work.
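As a quick refresher on transitive closure (a sketch of my own using the classic Warshall recurrence): starting from the direct edges of a graph, we repeatedly add an edge i -> j whenever some intermediate vertex k already connects them.

#include <array>
#include <iostream>

int main() {
    // Adjacency matrix of a 4-vertex DAG with edges 0->1, 1->2, 2->3.
    const int N = 4;
    std::array<std::array<bool, 4>, 4> r{}; // all false initially
    r[0][1] = r[1][2] = r[2][3] = true;

    // Warshall's algorithm: after the loop, r[i][j] is true exactly
    // when some path leads from i to j.
    for (int k = 0; k < N; ++k)
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                r[i][j] = r[i][j] || (r[i][k] && r[k][j]);

    std::cout << (r[0][3] ? "0 reaches 3\n" : "no path from 0 to 3\n");
}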
A binary decision diagram can represent a truth table:
[Figure: binary decision diagram of a truth table. Original uploader IMeowbot at English Wikipedia, GFDL (http://www.gnu.org/copyleft/fdl.html) or CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0/), via Wikimedia Commons]
From these visuals it should now be abundantly clear how this is critical to the functioning of Tauchain. The BDD data structure also allows for efficient model checking. To understand this we have to consider the boolean satisfiability (SAT) problem.
This highlights the fact that a BDD can be used to create a SAT solver.
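To make the data structure itself concrete, here is a heavily simplified reduced-BDD sketch of my own (it is not Ohad's implementation, and it omits almost everything a production package does): nodes are (variable, low, high) triples, a "unique table" guarantees identical subgraphs are shared, and a formula is satisfiable exactly when its BDD is not the constant-false terminal.

#include <algorithm>
#include <iostream>
#include <map>
#include <tuple>
#include <vector>

// Minimal reduced, ordered BDD. Nodes 0 and 1 are the terminals
// false and true; every other node is (variable, low edge, high edge).
struct BDD {
    std::vector<std::tuple<int, int, int>> nodes{{-1, 0, 0}, {-1, 1, 1}};
    std::map<std::tuple<int, int, int>, int> unique; // hash-consing table

    int mk(int var, int lo, int hi) {
        if (lo == hi) return lo;                   // reduction rule
        auto key = std::make_tuple(var, lo, hi);
        auto it = unique.find(key);
        if (it != unique.end()) return it->second; // sharing rule
        nodes.push_back(key);
        return unique[key] = (int)nodes.size() - 1;
    }

    int var_node(int v) { return mk(v, 0, 1); }    // BDD for the variable v

    // AND of two BDDs via Shannon expansion on the top variable.
    int bdd_and(int a, int b) {
        if (a == 0 || b == 0) return 0;
        if (a == 1) return b;
        if (b == 1) return a;
        auto [va, la, ha] = nodes[a];
        auto [vb, lb, hb] = nodes[b];
        int v = std::min(va, vb);
        int lo = bdd_and(va == v ? la : a, vb == v ? lb : b);
        int hi = bdd_and(va == v ? ha : a, vb == v ? hb : b);
        return mk(v, lo, hi);
    }
};

int main() {
    BDD b;
    int x = b.var_node(0), y = b.var_node(1);
    int f = b.bdd_and(x, y); // f = x AND y
    // SAT check: satisfiable iff the BDD is not the 0 (false) terminal.
    std::cout << (f != 0 ? "satisfiable" : "unsatisfiable") << "\n";
}

Real BDD packages add memoized apply operations, garbage collection, and careful variable ordering; the point here is only the shape of the structure.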
A DPLL SAT solver employs a systematic backtracking search procedure to explore the (exponentially sized) space of variable assignments looking for satisfying assignments. The basic search procedure was proposed in two seminal papers in the early 1960s (see references below) and is now commonly referred to as the Davis–Putnam–Logemann–Loveland algorithm ("DPLL" or "DLL"). Theoretically, exponential lower bounds have been proved for the DPLL family of algorithms.
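And for contrast with the BDD approach, here is a bare-bones backtracking search in the spirit of DPLL (a sketch of my own that omits unit propagation, pure-literal elimination, and every optimization that makes real solvers usable):

#include <cstdlib>
#include <iostream>
#include <vector>

// CNF formula: each clause is a list of literals; +v means variable v
// is true, -v means variable v is false. Variables are numbered 1..n.
using Clause = std::vector<int>;
using CNF = std::vector<Clause>;

// assign[v] is -1 (unassigned), 0 (false) or 1 (true).
bool solve(const CNF& f, std::vector<int>& assign, int v) {
    if (v == (int)assign.size()) {
        // Every variable assigned: check that each clause is satisfied.
        for (const Clause& c : f) {
            bool sat = false;
            for (int lit : c)
                sat = sat || ((lit > 0) == (assign[std::abs(lit)] == 1));
            if (!sat) return false;
        }
        return true;
    }
    for (int val : {0, 1}) {       // branch on both truth values
        assign[v] = val;
        if (solve(f, assign, v + 1)) return true;
    }
    assign[v] = -1;                // backtrack
    return false;
}

int main() {
    // (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
    CNF f = {{1, -2}, {2, 3}, {-1, -3}};
    std::vector<int> assign(4, -1); // index 0 unused
    std::cout << (solve(f, assign, 1) ? "SAT" : "UNSAT") << "\n";
}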
Without getting overwhelmed by technical details, the key points are below:
To read the code for yourself and track the progress of Tauchain development, take a look at GitHub: https://github.com/IDNI/tau
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and to collaborate in the development of the project.