The Paradigm of Social Dispersed Computing and the Utility of Agoras. By Dana Edwards. Posted on Steemit. October 12, 2018.
Social Dispersed Computing
What is social dispersed computing? It is an edge-oriented computing paradigm which goes beyond cloud and fog computing. To understand social dispersed computing we first have to discuss dispersed computing and how it differs from the previous paradigm of cloud and fog computing. The current trend toward decentralized networks, which we first saw with peer-to-peer technologies such as Napster, Limewire, and BitTorrent, and later with Bitcoin, has brought us the opportunity to conceive of entirely new paradigms. The original model most people are familiar with is the client-server model, which was very much limited in that the server was always vulnerable to DDoS attack. The client-server model has never been, and could likely never be, censorship resistant.
In the client-server model the server could simply shut down, as was the case with Bitconnect, or it could be raided. The server could also be taken down by attackers who simply flood the site with requests. The problems the client-server model presented led us to discover the utility of the peer-to-peer model. The peer-to-peer model was all about censorship resistance and promoted a network with no single point of failure (single point of attack) whose loss could result in the shutdown of access to the information. Among the first applications of these peer-to-peer technologies were file sharing networks and networks such as Freenet and Tor. This of course eventually evolved into Bitcoin, which ultimately led to the development of Steem.
In dispersed computing a concept is introduced called "Networked Computation Points". An NCP can execute a function in support of user applications. To elaborate further, I'll offer an example below.
Consider that every component in a network is a node. Now consider that every component node is an NCP in that it can execute some function to support some user application. If we think of a blockchain, for example, then mining fits into this category because a miner is both a node in the network and can execute a function in support of Bitcoin transactions. Why is any of this important? Parallelism is something we can gain from dispersed computing, and please note that it is distinct from concurrent computing. When we rely on parallelism we can reap performance benefits when executing code by breaking it up into many small tasks which can be performed across many CPUs.
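As a sketch of that idea, the snippet below splits one job into many small independent tasks and fans them out across CPU cores. It is illustrative only; the function names are invented here and are not part of any dispersed computing API.

```python
# Illustrative sketch: break one job into small tasks executed across CPUs.
# Names (partial_sum, parallel_sum_of_squares) are invented for this example.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    """One small task: the unit of work an NCP-like node could execute."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Break the job into roughly one chunk per worker...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...then execute the chunks in parallel across processes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))  # 332833500
```

Each chunk is independent of the others, which is what distinguishes this parallel decomposition from mere concurrency (interleaving tasks on a single CPU).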
EOS attempts to leverage parallelism specifically to enable its performance boost. The benefit is speed and flexibility. Think also of the hardware side, with FPGAs, which can perform tasks similar to those of a microprocessor but, unlike ASICs, provide generalized, flexible parallel computing. Consider that, just as with mining, a company could add more and more FPGAs to scale an application as needed.
To understand Social Dispersed Computing we have to note that there are other users on the network at any given time. These other users participate by providing resources to the network for the benefit of other users while using the network themselves. So on Steem, for example, as you add content you are adding value to Steem in a direct way, but also in a dynamic way. The resources on Steem can also adapt dynamically to demand, provided that the incentive mechanism (Resource Credits) works as intended.
EOS as an example of a DOSC (Dispersed Operating System Computer)
Because EOS seems to be the first to approach this holistically, I will give the EOS network credit for pioneering dispersed computing in the crypto space. All resources in a dispersed computing network are representable by tokenization. EOS and even Steem have this. Steem has it in the form of "Resource Credits", which represent the available resources on the Steem network. If more resources are needed, then theoretically the resource credits could act as an incentive to provide those resources to the Steem network. This provides a permanent price floor for Steem, represented as the amount of Steem which would have to be purchased in order to have enough resources to run Steem (if I have the correct theoretical understanding). This would put Steem on a trajectory toward dispersed computing.
Operating systems typically sit between the hardware and software as a sort of abstraction layer. This has traditionally been valuable because programmers don't have to speak directly to the hardware, and hardware designers don't have to communicate their designs directly to the programmer. In essence, the operating system in the traditional model is centralized and made by a company such as Microsoft or Apple. This centralized operating system typically runs on a device or set of devices and provides some standard services such as email, a web browser, and maybe even a Bitcoin wallet.
Typically the most valuable or highest-utility software people consider on a computer is the operating system. On our smartphones this is Android, and on PCs it may be Windows or Linux. This is of course turned on its head under the new paradigm of dispersed computing and the new conceptual model of the "decentralized" operating system. EOS is the first to attempt a decentralized operating system using current blockchain technology, but upcoming technology easily eclipses what EOS can do. Tauchain is a technology which, if successful, will leave EOS in the stone age in terms of what it will be able to do. EOS, while ambitious, has also had its problems with regard to its voting mechanisms and the ease with which collusion can take place.
To better understand how decentralized operating systems might emerge, learn about OSKit, TML, and exokernels:
If we look at OSKit, we see that it provides the tools necessary for operating system development. If we look at Tauchain, we realize that it is strategically providing the most important tool for the development of a decentralized operating system in the form of TML (a partial evaluator). If we think of the primary tool necessary to develop from, we initially have to start with a compiler. What TML's partial evaluator enables is closer to a compiler generator. More specifically, it is the Futamura projections which can provide the ability to generate compilers.
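To make the compiler-generator idea concrete, here is a toy sketch of the first Futamura projection. It is purely illustrative: the mini-language and function names are invented, and TML's partial evaluator operates on logic programs, not on anything like this. The point is only that specializing an interpreter to a fixed program leaves a residual function that behaves like a compiled version of that program.

```python
# Toy first Futamura projection: interpreter + fixed program -> "compiled" residual.
# The mini-language and all names here are invented for illustration.

def interpret(program, data):
    """A tiny interpreter: program is a list of (op, arg) instructions."""
    acc = data
    for op, arg in program:
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
    return acc

def specialize(program):
    """A (very) partial evaluator: fold in the static input (the program)
    now, leaving a residual function of the dynamic input only."""
    ops = list(program)      # static input, fixed at "compile time"
    def residual(data):      # dynamic input, supplied at run time
        acc = data
        for op, arg in ops:
            acc = acc + arg if op == "add" else acc * arg
        return acc
    return residual

double_plus_one = specialize([("mul", 2), ("add", 1)])
print(interpret([("mul", 2), ("add", 1)], 10), double_plus_one(10))  # 21 21
```

A real partial evaluator would emit residual source code rather than a closure; the second and third Futamura projections then yield a compiler and a compiler generator, respectively.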
If we look at the next most important part of an operating system it is typically the kernel. Let's have a look at what an exokernel is:
Operating systems generally present hardware resources to applications through high-level abstractions such as (virtual) file systems. The idea behind exokernels is to force as few abstractions as possible on application developers, enabling them to make as many decisions as possible about hardware abstractions. Exokernels are tiny, since functionality is limited to ensuring protection and multiplexing of resources, which is considerably simpler than conventional microkernels' implementation of message passing and monolithic kernels' implementation of high-level abstractions.
[Exokernel diagram by Thorben Bochenek, CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0), via Wikimedia Commons]
From this, at minimum, we can see that an exokernel is a more efficient and direct way for programmers to communicate with hardware. To be more specific, programs communicate with hardware directly by way of an exokernel. We know the most basic function of a kernel in an operating system is the management of resources. We know that in a decentralized context, tokenization allows for incentivized management of resources. When we combine them, kernel + tokenization produces an elementary foundation of an operating system. In a distributed context we could apply a decentralized operating system in such a way that the network could be treated as a unified computer.
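As a purely speculative sketch of kernel + tokenization (the class and method names are invented; no real network works exactly this way), picture a resource ledger where nodes earn tokens by providing capacity and spend them to consume it:

```python
# Purely illustrative sketch of token-metered resource management.
# ResourceKernel and its methods are invented names, not a real API.

class ResourceKernel:
    def __init__(self):
        self.balances = {}   # node -> token balance
        self.capacity = {}   # node -> compute units offered to the pool

    def provide(self, node, units, reward_per_unit=1):
        """A node contributes resources and is rewarded in tokens."""
        self.capacity[node] = self.capacity.get(node, 0) + units
        self.balances[node] = self.balances.get(node, 0) + units * reward_per_unit

    def allocate(self, node, units, price_per_unit=1):
        """A node spends tokens to consume resources from the pool."""
        cost = units * price_per_unit
        pool = sum(self.capacity.values())
        if self.balances.get(node, 0) < cost or pool < units:
            return False     # not enough tokens, or not enough resources
        self.balances[node] -= cost
        return True

kernel = ResourceKernel()
kernel.provide("alice", 10)
print(kernel.allocate("bob", 5))    # False: bob holds no tokens
print(kernel.allocate("alice", 5))  # True: alice spends earned tokens
```

The design choice worth noticing is that the token balance doubles as both the incentive to provide resources and the meter that rations their consumption, which is the elementary kernel role described above.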
Abstraction is still important, by the way. In an operating system we know the object-oriented approach to abstraction, where the programmer typically works with the concept of objects. In an "Application Operating Environment", an "Application Object" can be another useful abstraction. Abstraction can of course be taken further, but that is for another blog post.
The Utility of Agoras
Agoras + TML is interesting. Agoras is the resource management component of what may evolve into the Tau Operating System. This Tau Operating System, or TOS, would be vastly superior to EOS or anything else out there because of the unique abilities of Agoras. The main abilities have been announced on the website, such as the knowledge exchange (knowledge market), where humans and machines alike can contribute knowledge to the network in exchange for token rewards. We also know that Agoras will have a more direct resource contribution incentive in the form of the AGRS token, so as to facilitate the sale or trade of storage, bandwidth, or computation resources.
The possible (likely?) emergence of the Tau Operating System
In order for Tauchain to evolve into a Dispersed Operating System Computer, it will need an equivalent to a kernel: some means of allowing whoever is responsible for the Tauchain network to control and manage the resources of that network. If, for example, the users decide, then by way of discussion there would be a formal specification or model of a future iteration of the Tauchain network. According to current documents, this is what would produce the requirements for the Beta version of the network to apply program synthesis. Program synthesis in essence could result in a kernel, and from there the components of a Tau Operating System could be synthesized in the same way. Just remember that all I write is purely speculative, as we have no way to predict with certainty the direction the community will take during the alpha.
Using Controlled English as a Knowledge Representation language. By Dana Edwards. Posted on Steemit. April 4, 2017.
Previously I mentioned "controlled English" when discussing the concept of knowledge representation. This post will go into some detail about what controlled English is. Specifically, I will discuss Kuhn's doctoral dissertation and Attempto Controlled English (ACE).
Computational linguistics is an interdisciplinary field concerned with the statistical or rule-based modeling of natural language from a computational perspective.
There are many different controlled natural languages
First I would like to note that controlled English is not the only controlled natural language, and Attempto Controlled English is only one particular controlled English. For example, there is RuleSpeak, a controlled natural language for business rules. Another example is Quelo Controlled English, a controlled English for querying, where you would say statements such as: "I am looking for something, it should be located in a city, the city should produce a new car, the new car should be equipped with a diesel engine". In addition to these examples we also have Google's Voice Actions, where you can speak into your Android phone and say something like: "Create a calendar event: Dinner in San Francisco, Saturday at 7:00PM". All of these are examples of controlled natural languages, and they reveal just how powerful this could be for users and developers.
What is Attempto Controlled English (ACE)?
Attempto Controlled English, also known as ACE, is a specific controlled natural language. It is likely that at some point in the early stages of development this controlled natural language will be implemented on Tauchain. ACE is like English but relies on following certain rules and a restricted vocabulary.
Rule: subject + verb + complements + adjuncts
All simple ACE sentences have the above structure of subject + verb + complements + adjuncts. An example is the sentence below:
A customer waits.
To construct sentences without a verb you can rely on:
there is + noun phrase
There is a customer.
And you can add detail with:
A trusted customer inserts two valid cards.
And you can use variables:
A customer X inserts a card Y.
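Hypothetically, the minimal sentence shapes above could be checked with a toy script like the one below. This is not the real ACE parser (APE); the tiny lexicon and the checking logic are invented purely to illustrate how restricted the language is.

```python
# Toy checker for the minimal ACE sentence shapes shown above.
# Invented for illustration only; the real ACE parser is APE.
DETERMINERS = {"a", "an", "the", "every", "there"}
NOUNS = {"customer", "card", "cards"}
VERBS = {"waits", "inserts"}

def is_simple_ace(sentence):
    words = sentence.rstrip(".").lower().split()
    # "there is + noun phrase" form
    if words[:2] == ["there", "is"] and words[-1] in NOUNS:
        return True
    # "subject + verb (+ complements)" form
    has_subject = (len(words) >= 2
                   and words[0] in DETERMINERS
                   and any(w in NOUNS for w in words))
    has_verb = any(w in VERBS for w in words)
    return has_subject and has_verb

print(is_simple_ace("A customer waits."))                      # True
print(is_simple_ace("There is a customer."))                   # True
print(is_simple_ace("A trusted customer inserts two valid cards."))  # True
```

Because every ACE sentence must fit a shape like this, a machine can parse it deterministically, which is exactly what makes it usable for knowledge representation.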
How does Attempto Controlled English help with Knowledge Representation?
Specifically, because anyone who speaks English can quickly learn Attempto Controlled English, anyone will be able to contribute to the process of knowledge representation. Contributing to a knowledge base becomes very easy when you can simply describe in plain English (with restrictions) exactly the knowledge you want to represent. A semantic wiki can be built out of this process rather easily.
How does Attempto Controlled English relate to Tauchain?
Tauchain requires input from the users to determine a formal specification. Attempto Controlled English is simple enough that anyone can describe a formal specification. For example sentences like:
Every customer inserts a card.
As you see above, we are dealing with types: customer is a type, and card is a type. To give another example, human is a type, divided at a minimum between male and female subtypes.
AceWiki gives an example of what a formal specification could look like in Tauchain. The example is country, where the knowledge in this case is the concept of a country. We describe a country by filling in the wiki collaboratively: we know first of all that every country is an area, and then collaboratively we fill in the list of current persons who govern a country. Through this method we add to the knowledge base using the knowledge representation language ACE, and in the case of Tauchain we would potentially be adding to a formal specification which is eventually synthesized (program synthesis) by the Tauchain automatic programmer.
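To illustrate in an invented, toy form (AceWiki itself uses full ACE with a reasoner behind it), statements like "Every country is an area" could populate a knowledge base roughly like this:

```python
# Toy sketch, invented for illustration: ACE-like sentences become facts.
facts = set()

def tell(statement):
    words = statement.rstrip(".").lower().split()
    # "Every X is a/an Y."  ->  a subclass fact
    if words[0] == "every" and words[2] == "is" and words[3] in ("a", "an"):
        facts.add(("subclass", words[1], words[4]))
    # "X governs Y."  ->  a relation fact
    elif len(words) == 3 and words[1] == "governs":
        facts.add(("governs", words[0], words[2]))

tell("Every country is an area.")
tell("Alice governs Atlantis.")
print(("subclass", "country", "area") in facts)  # True
```

The collaborative wiki process described above amounts to many users calling something like `tell` on a shared fact store, with ACE guaranteeing that every contribution parses unambiguously.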
To learn more about Attempto Controlled English and AceWiki, watch the video lecture.
Kuhn, T. (2009). Controlled English for knowledge representation (Doctoral dissertation, University of Zurich).
Kuhn, T. (2014). A survey and classification of controlled natural languages. Computational Linguistics, 40(1), 121-170.
Kuhn, T. (2009). How controlled English can improve semantic wikis. arXiv preprint arXiv:0907.1245.
Ranta, A., Enache, R., & Détrez, G. (2010, September). Controlled language for everyday use: the molto phrasebook. In International Workshop on Controlled Natural Language (pp. 115-136). Springer Berlin Heidelberg.
Ross, R. G. (2013). Tabulation of lists in RuleSpeak: using "the following" clause. Business Rules Journal, 14(4), 1-16.
White, C., & Schwitter, R. (2009, December). An update on PENG light. In Proceedings of ALTA (Vol. 7, pp. 80-88).
Attempto project resources: http://attempto.ifi.uzh.ch/site/resources/
Suggested readings to better understand the Tau ecosystem, Tau Meta Language, Tau-Chain and Agoras, and collaborate in the development of the project.