Taking center stage in a diverse cast of new potential applications and digital business models, tokens promise to revolutionize the way we do business, ousting intermediaries, bypassing physical borders, and reducing costs in the process. This is achieved primarily through blockchain, a technology that derives much of its value from being open, transparent, and trustless.

Since the ICO boom of 2017, tokens have become ubiquitous in the blockchain industry. Despite this, no token has yet made serious inroads into mainstream markets. Evidently, a few key issues must be solved before wide-scale adoption becomes possible.

The InterWork Alliance

Although the immature nature of blockchain has shown itself to be a limiting factor at times, other problems exist at the token level. For example, it needs to be possible for anyone, be they a businessperson, regulator, or developer, to understand precisely how a given token works, both as an independent object and within the context of a particular platform or project.

This is primarily achieved through the creation of platform-neutral token standards, which can provide interoperability between projects. Put simply, if everyone can agree on the fundamentals of what a token should do, the friction involved in handling tokens disappears, reducing the complexity and cost of building systems that work with them.

Additionally, certification processes are required to enable the efficient testing and independent auditing of tokens. This is an important step in building trust around the adoption of tokens; it helps quickly identify malicious software and protects token users by ensuring that any given implementation does precisely what it claims to do.

Establishing the common ground required to serve these needs was the catalyst behind the InterWork Alliance (IWA), recently formed as a collaborative effort among members from a range of backgrounds, including government bodies, the private sector, and the blockchain industry. Examples of IWA members include Neo, Microsoft, Accenture, Nasdaq, R3, and Hyperledger.

The IWA views digital assets as a force to empower mainstream businesses, helping them digitize their operations, automate business logic through smart contracts, and seamlessly interoperate with other services or industries.

The Need for Standards

Neo, originally launched as AntShares, is no stranger to tokenization. It was created as a platform for the digitization of assets, bridging the gap between digital economies and the physical world. Since the release of its smart contract infrastructure, Neo has seen numerous tokens deployed onto its platform, particularly those adhering to the NEP-5 standard.

The NEP-5 token standard is a specification that provides method definitions, invocation parameters, and expected behavior for a number of functions that must be implemented for a token to comply with the standard.

As a standard, NEP-5 gives developers creating basic tokens in the Neo ecosystem a reference specification, ensuring compatibility with other projects on the blockchain, such as wallets or exchanges, and creating an environment where service providers only need to integrate against a single API. However, the standard is only guaranteed to make sense within the context of the Neo blockchain itself; it may not translate perfectly to other platforms.
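
As a rough sketch, the NEP-5 method surface can be pictured as follows. This is plain Python written for readability, not contract code; a real Neo contract would be compiled (e.g. from Python via neo-boa) and dispatch on an operation name, but the required methods are the same.

```python
# A minimal sketch of the NEP-5 method surface, expressed as a plain Python
# class for illustration only.

class Nep5Token:
    def name(self) -> str:
        return "Example Token"          # human-readable token name

    def symbol(self) -> str:
        return "EXT"                    # short ticker symbol

    def decimals(self) -> int:
        return 8                        # decimal places per whole token

    def totalSupply(self) -> int:
        return 100_000_000 * 10**8      # total supply, in base units

    def balanceOf(self, account: bytes) -> int:
        """Return the token balance of `account`, in base units."""
        raise NotImplementedError

    def transfer(self, from_addr: bytes, to_addr: bytes, amount: int) -> bool:
        """Move `amount` between accounts, firing a `transfer` notification
        on success and returning True/False accordingly."""
        raise NotImplementedError
```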

This is a common issue for blockchains, one that has become more prominent as the industry has begun to recognize interoperability as a fundamental requirement of future digital ecosystems.

It needs to be possible for tokens to work predictably in any environment. This necessitates the creation of standards that are abstracted from the implementation level, focusing purely on the fundamental traits and functionalities of the tokens themselves.

The Token Taxonomy Framework

This pursuit requires a universal classification system for tokens, which is the intention behind the InterWork Alliance’s Token Taxonomy Framework (TTF). The TTF aims to make sense of tokens by breaking them down into their constituent pieces: common token types and compartmentalized pieces of functionality.

If you can read the letters of the alphabet, you can use a dictionary to look up any word and find a definition. You could also combine and rearrange letters to form your own words, imbuing them with your own original definitions.

In much the same way, understanding the basic components of tokens as described in the TTF will enable anyone to discover, understand, or define new tokens for themselves. Every token encountered, regardless of the platform on which it exists, can be described by its atomic parts and understood functionally as the sum of them.

It should also be noted that the TTF is by no means a finished work; it provides initial pieces for working with tokens, but the intent is for it to be extended through collaboration. If an entity cannot fully meet its token behavior requirements using artifacts already in the taxonomy, it can create and contribute its own to the hierarchy.

Token formulas

An important step in making tokens understandable is finding a way to visually represent them. Requiring users to dig into the code for themselves has very limited reach, and any information gained when doing so is only useful within the context of the specific token under investigation.

To solve this problem, the framework’s proponents landed on token formulas, a way to represent all of the characteristics and metadata, or artifacts, of a token in a succinct manner. Though formulas may appear intimidating at face value, their structure is relatively simple; reading one is just a matter of learning the corresponding character for each artifact.

[Image: example token formula tF{~d,g,SC}+phSKU]

Token formulas present the three main components of tokens in the framework: base types, behaviors, and property sets. First are the base types, specifying whether a token is fungible (interchangeable with others, i.e. all tokens are equal and identical), non-fungible (i.e. each token instance is unique), or a hybrid combination of the two.

The next set of components is the behaviors, rules which define how a token should function. A behavior could be an intrinsic restriction on the token’s use, such as being indivisible, or an action available to it, such as the ability to be transferred between accounts at will. For simplicity, some frequently associated behaviors may be grouped together.

Finally come the property sets, intended for bits of data the token holds about itself (like the state of a behavior) or carries for external relevance. For example, a piece of artwork might have a unique identifier attached, or a game item might include a URL path for direct viewing on a website.

Let’s consider the example formula shown above and dissect what each element is telling us (a short sketch after the list models the same pieces in code):

  • The base type is both “Fungible” (tF) and “Whole”, meaning it is “Indivisible” (~d). NEO, the governance token of the Neo blockchain, is a good example of this kind of token.
  • The token is “Delegable” (g), so a party can request from the token owner the right to perform delegable behaviors on their behalf.
  • The token has the “Supply Control” behavior group (SC), meaning the “Mintable” and “Burnable” behaviors are present, alongside the “Role” behavior, which notes that accounts can be granted the right to perform token minting.
  • The token carries the “SKU” property set, meaning each token has an additional field containing its stock keeping unit (SKU).
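
The sketch below models the three formula components as plain Python data types and reconstructs the example token from the list above. The class and field names are hypothetical, chosen for illustration; the TTF itself defines these artifacts declaratively rather than in Python.

```python
# Hypothetical Python modeling of a token formula; illustrative only.
from dataclasses import dataclass, field
from enum import Enum

class BaseType(Enum):
    FUNGIBLE = "tF"        # all tokens equal and interchangeable
    NON_FUNGIBLE = "tN"    # each token instance unique
    # Hybrids combine the two base types.

@dataclass
class TokenFormula:
    base: BaseType
    behaviors: list[str] = field(default_factory=list)        # e.g. "~d", "g"
    behavior_groups: list[str] = field(default_factory=list)  # e.g. "SC"
    property_sets: list[str] = field(default_factory=list)    # e.g. "SKU"

# The example dissected above: a whole (indivisible), delegable fungible
# token with supply control and a SKU property set.
example = TokenFormula(
    base=BaseType.FUNGIBLE,
    behaviors=["~d", "g"],       # ~d = indivisible, g = delegable
    behavior_groups=["SC"],      # SC = mintable + burnable + role
    property_sets=["SKU"],
)
```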

Both behaviors and property sets also include a set of control messages in request/response pairs, with generic descriptions explaining how to invoke a behavior or change a property. These use the Protobuf format and clearly define interactions between artifacts.
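
To illustrate the request/response pattern, a transfer behavior might pair messages like those below. The real control messages are Protobuf definitions in the taxonomy; the Python types and field names here are hypothetical.

```python
# Illustrative request/response pair for a transfer behavior; the TTF
# specifies such control messages in Protobuf, with its own naming.
from dataclasses import dataclass

@dataclass
class TransferRequest:
    from_account: str   # account invoking the behavior
    to_account: str     # recipient of the tokens
    amount: int         # quantity, in base units

@dataclass
class TransferResponse:
    confirmed: bool     # whether the transfer was accepted
    reason: str = ""    # optional explanation on failure
```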

Creating tokens

To turn this formula into a usable token, it has to be combined with a definition containing extra details and instructions. Definitions are used to clearly explain how the different characteristics in the formula fit together, informing how the token should be implemented and giving analogies for its use.

A definition to accompany the token formula shared above (an indivisible token with a flexible supply and a unique SKU) might define the token as a way to track the item inventory of a business. Equally, the same formula could be used to represent the loyalty point balances of its customers.

The definition would be used to clearly explain these differences. For example, the inventory token previously mentioned might exclusively use the Delegable behavior to grant approved inventory managers the ability to raise the token supply (through minting) as new stock is ordered in.

Alternatively, a loyalty point token using this formula might only use the Delegable behavior to request a token burn from a customer, for example when points are spent as an alternate payment method or redeemed for a reward.
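
As a sketch of how two definitions can specialize the same formula (the structure and field names below are hypothetical, for illustration only):

```python
# Two hypothetical definitions specializing one shared formula.
from dataclasses import dataclass

@dataclass
class TokenDefinition:
    name: str
    formula: str        # the shared formula from above
    delegable_use: str  # how this definition applies the Delegable behavior

inventory_item = TokenDefinition(
    name="Inventory Item",
    formula="tF{~d,g,SC}+phSKU",
    delegable_use="approved managers may mint new supply as stock arrives",
)

loyalty_point = TokenDefinition(
    name="Loyalty Point",
    formula="tF{~d,g,SC}+phSKU",
    delegable_use="the business may request a burn when points are redeemed",
)
```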

In combining the token formula with two different definitions, we have created two new token templates. The next step would be to create a specific implementation of the template, usually by deploying a contract to a blockchain platform such as Neo or Ethereum.

The framework calls this implementation a class of the token template. From there, individual tokens, called instances in the framework, are initialized and distributed. These instances are the actual tokens that an end user would hold in their wallet.
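
The template, class, and instance levels can be pictured as plain data, as in the sketch below. Again, the names are hypothetical and only illustrate the relationships described above; the TTF’s own artifact schemas differ.

```python
# The template -> class -> instance hierarchy as plain data; illustrative only.
from dataclasses import dataclass

@dataclass
class TokenTemplate:        # formula + definition
    formula: str
    definition: str

@dataclass
class TokenClass:           # a deployed implementation of a template
    template: TokenTemplate
    platform: str           # e.g. "Neo" or "Ethereum"
    contract_hash: str

@dataclass
class TokenInstance:        # an individual token an end user holds
    token_class: TokenClass
    owner: str

template = TokenTemplate(formula="tF{~d,g,SC}+phSKU", definition="Inventory Item")
deployed = TokenClass(template=template, platform="Neo", contract_hash="0xabc...")
held = TokenInstance(token_class=deployed, owner="NExampleAddress")  # in a wallet
```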

Toward a token-powered future

Regular users usually only need to concern themselves with the safekeeping of their own particular token instances. However, from the perspective of a regulator, business owner, or developer, it is hard to overstate how useful it is for a token to be clearly described in the way the TTF enables.

The use of generalized behaviors and properties means that the way a token interacts can always be predicted. Every artifact in the taxonomy is essentially a standard in its own right, detailing its intended usage. As a result, tokens created using the framework remain consistent with one another at the behavioral level, even if their final implementations differ.

At a high level, this can simply be understood as a common language for tokens and all those who work with them. From this foundation, and the collaborative efforts to extend the framework, the TTF makes it possible to create new standards with interoperability in mind, demystifying tokens across all the applications where they might be used.

As with the IWA’s other initiatives, the TTF aligns closely with Neo’s original vision of the Smart Economy, encouraging far-reaching innovation and fostering the adoption of tokens in the new digital frontier.