Teasing Out An Algorithm

Over time everything changes, and that reality is something Deep Learning needs a way to recognize

The materials and techniques for creating images have long helped human vision turn recursive impressions into a sense of the recognizable. Today, that is the algorithm’s means of communicating the calculus of knowledge. But recognition, in the end, is superficial in a discourse on creation. Only ascertaining the source of chance can provide the real thing.

Consider 55,000 uniquely identifiable patterns of color, digitized from original, non-digital objects with an actual physical presence. In making a Non-Fungible Token (NFT) as a digital presence for each, Smart Contracts direct them all to be gathered together and organized by their production order. That is what is presented here as an exercise in teasing out an algorithm from a concept in which a preordained outcome is the creation.
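
As a minimal sketch of what such a record might contain (every name here is illustrative, not drawn from any actual contract), each token could pair an on-chain identity with its place in the production order:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PatternToken:
    """One of the 55,000 tokenized color patterns (hypothetical model)."""
    token_id: int            # on-chain token identifier
    production_index: int    # position in the preordained production order
    color_signature: str     # digitized fingerprint of the physical pattern
    holder_address: str      # current holder of the physical object

TOTAL_PATTERNS = 55_000  # the full production run described here

def is_complete(tokens: list[PatternToken]) -> bool:
    """True once every production index from 0 to 54,999 is accounted for."""
    return {t.production_index for t in tokens} == set(range(TOTAL_PATTERNS))
```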

Human and AI object-recognition capabilities are matched up for this. Everything that exists to complete it is in the prose of http://www.greatknot.com/5.html. There, it is learned that the majority of these objects are distributed and in the wild, meaning an algorithm must find the actual things and make them virtual, identifiable as digital images on the Internet; that is what creates this outcome.
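
One plausible way an algorithm could make a physical object identifiable as a digital image on the Internet is perceptual hashing, which matches a new photograph against reference fingerprints despite differences in lighting or framing. The sketch below assumes the third-party ImageHash library and a hypothetical registry of reference hashes; the article itself does not specify the technique.

```python
from PIL import Image
import imagehash  # third-party: pip install ImageHash

def match_candidate(photo_path: str,
                    known_hashes: dict[int, imagehash.ImageHash],
                    max_distance: int = 8) -> int | None:
    """Return the token_id of the closest reference pattern,
    or None if nothing falls within the distance threshold."""
    candidate = imagehash.phash(Image.open(photo_path))
    best_id, best_dist = None, max_distance + 1
    for token_id, ref in known_hashes.items():
        dist = candidate - ref  # Hamming distance between perceptual hashes
        if dist < best_dist:
            best_id, best_dist = token_id, dist
    return best_id if best_dist <= max_distance else None
```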

It begins as a system in which Big Data finds a Decentralized Autonomous Organization (DAO) growing, conveying the presence of individuals coming forward with digital images that match metadata properties it recognizes. Images similar in specification are in Smart Contracts for selling an NFT whose features reference a physical object with that same identity, independently held as a personal possession and identifiable with this DAO’s interests. This is taken as an indicator that the owners of similar objects are organized and actively guiding others in the same way of applying their properties to NFTs.

In this system the NFT positions actual objects in a digital marketplace, where tokens of them go through many transfers of ownership in an improvised game of changing value. Every time one is sold, its Smart Contract is recognized as containing an independent identity, and the authentication is then tracked through object-recognition algorithms in Strong AI, where similarities to other objects reinforce the idea of a generative identity. The value of this lies in having a blockchain message these 55,000 relationships each time any token changes hands, like building a historicity out of a provenance.
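
A rough model of that historicity, with hypothetical names, is simply an append-only log of transfer events per token, from which the chain of owners can be read back:

```python
from dataclasses import dataclass, field

@dataclass
class TransferEvent:
    token_id: int
    seller: str
    buyer: str
    price_wei: int
    block_number: int

@dataclass
class Provenance:
    """Accumulated sale history for one token."""
    token_id: int
    events: list[TransferEvent] = field(default_factory=list)

    def record(self, event: TransferEvent) -> None:
        self.events.append(event)

    def owners(self) -> list[str]:
        """Every address that has held the token, in order."""
        if not self.events:
            return []
        return [self.events[0].seller] + [e.buyer for e in self.events]
```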

From such a procedural connection to the DAO, each Smart Contract, in paying a percentage of its token’s sale to the holder of the physical object, represents recognition of preserving its function. A smaller amount taken from each transaction is put into a common pot the DAO sets aside as an incentive, and that pot becomes the value of the whole. Its growth with every transaction recorded in any Smart Contract is connected to the DAO and, by extension, to the game of relative rarity among the tokens.
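
The split itself is simple arithmetic. The rates below are assumptions for illustration, since the article names no figures; each sale pays the physical holder’s royalty, feeds the DAO’s pot, and leaves the remainder to the seller:

```python
# Hypothetical rates in basis points -- the article specifies no actual figures.
HOLDER_ROYALTY_BPS = 500   # 5% of each sale to the physical object's holder
DAO_POT_BPS = 100          # 1% of each sale into the DAO's common pot
BPS_DENOMINATOR = 10_000

def settle_sale(sale_price_wei: int) -> dict[str, int]:
    """Divide one sale among holder royalty, DAO pot, and seller proceeds."""
    holder_cut = sale_price_wei * HOLDER_ROYALTY_BPS // BPS_DENOMINATOR
    pot_cut = sale_price_wei * DAO_POT_BPS // BPS_DENOMINATOR
    return {
        "holder_royalty": holder_cut,
        "dao_pot": pot_cut,
        "seller_proceeds": sale_price_wei - holder_cut - pot_cut,
    }
```

Under these assumed rates, a 1 ETH sale (10**18 wei) would pay 0.05 ETH to the holder, 0.01 ETH into the pot, and 0.94 ETH to the seller.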

This pot is used to get programmers to compete to build the algorithm that will apply the data the Smart Contracts accumulate. The energy this 55,000-strong DAO spends in finding owners of the objects still in the wild, and guiding them through the use of the NFTs, to bring the full distribution into the order of the generative sequence of its production, is all for teasing out this algorithm.

In the Smart Contracts, all the token creators, individually and together as the DAO, continually monitor the place their particular object holds within the sequence of the full production. This is the order an algorithm is assumed to follow. Any stakeholder, or any ad hoc number of them in the DAO, could assemble the whole production in its sequential order without an algorithm to support it and take the total reward in the pool; but with a set of 55,000, that is unlikely to happen.
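
Whether a submission satisfies that condition can be checked mechanically. A minimal verifier, assuming each Smart Contract attests its token’s true production index (names hypothetical), might look like this:

```python
def verify_ordering(proposed: list[int],
                    production_index: dict[int, int],
                    total: int = 55_000) -> bool:
    """Check a submission: `proposed` lists token_ids in claimed production
    order; `production_index` maps token_id -> attested true position."""
    if len(proposed) != total or len(set(proposed)) != total:
        return False  # must cover the full production exactly once
    return all(production_index.get(tid) == pos
               for pos, tid in enumerate(proposed))
```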

So if an algorithm succeeds in correctly satisfying the contest parameters, its developer gets the pot. In either case, the DAO ceases to function once this goal is reached, and the owners thereafter receive their full percentage of each sale of the token of their possession, plus the DAO’s former take.

This outcome is designed to reward dives into data that must be teased out of recursive impressions of what has been structured between the time this exercise began and whenever it may reach an end. It is only the uniqueness of the content mix cited in the link above that drives Deep Learning to discover the holders, and that mix may never again appear in such a transparent set of knowns as those available to this exercise. That is why it is so important that this be used to build algorithmic performances.

There are bigger fish to fry in the outcome of this endgame. What comes from structuring the generative relationship of the tokens to the state of building value, when it is up against finding these knowns within such a distribution of unknowns, is a lesson. It is a near-certain future that this same algorithmic performance will see use as NFTs grow to be applied solely to verified content, a growth that injects a need to work the Internet with the same investigative purpose Deep Learning has experienced here.