Datafloq – Data and Technology Insights
How Purpose-Driven Tokenisation Will Enable Innovative Ecosystems

Dr Mark van Rijmenam / 9 min read.
January 31, 2020

Tokens have been around for thousands of years, but only recently have we seen the rise of digital tokens. Now, cryptographic tokens offer us an opportunity to redesign value streams and, with them, existing ecosystems. A well-designed token ecosystem unlocks value by bringing parties together in new ways and stimulates the target behaviour by using cryptographic tokens as built-in incentives. Tokens matter, and they offer us a chance to redesign existing and new ecosystems.

On January 14, 2020, we had the second round table session organised by the 2Tokens initiative. The 2Tokens project aims to clarify the path to realising value from tokenisation. During the first round table session, we discussed why we need tokenisation, what is required to achieve value from tokenisation, and how we should move ahead with it.

The objective of the second round table discussion was to understand the challenges faced when designing new, token-driven ecosystems and what is needed to enable purpose-driven tokenisation, which comes down to token engineering: the practice of using tokens as the foundation for designing value flows and, ultimately, economic systems.

The event took place at YES!Delft and, with over 70 thought leaders, innovation drivers and representatives from enterprises, law firms, the regulator and the Dutch government, it was a great success. The attendees had interesting discussions on seven different topics related to purpose-driven tokenisation:

1. Shared understanding: how do we align interests and motivations to collaborate and make progress in ecosystems?
2. Innovative funding: how do we fund ecosystems beyond traditional VC financing or (bank) loans?
3. Change management: how do we understand and facilitate the change implied by new ways of interacting?
4. Messaging and engagement: should the narrative around tokenisation change to enable innovative ecosystems?
5. Knowledge and skills: what skills do organisations need to transition to tokenised ecosystems?
6. Problem-solution fit: how do we ensure we address real problems where tokenisation can help realise new solutions?
7. Tokenisation and the law: what are the legal requirements around purpose-driven tokenisation?

The objective of these seven tables was to understand, from each of these perspectives, what it takes to enable new tokenised ecosystems.

Purpose-driven tokenisation

Below are summaries of the discussions that took place at each table:

Shared Understanding

Tokenisation, tokenomics and token engineering are new concepts and, for many, difficult to understand, especially since these terms are often used differently in different contexts. What do these concepts mean, and how can an ecosystem benefit from them? Without a shared understanding among all stakeholders involved in an ecosystem, it becomes difficult for people from multiple disciplines to collaborate effectively and build a tokenised ecosystem.

A shared understanding is not only relevant for building tokenised ecosystems, but it will also enable regulators and policymakers to address regulatory concerns in the right way. As such, it will foster a healthy public debate around tokenisation.

A shared understanding consists of precise terminology and taxonomy, international standards, useful metaphors to share with wider audiences, and clear legal and regulatory frameworks. Since the field of tokenisation is evolving rapidly, ongoing coordination, education and engagement among all stakeholders is essential. This can be managed by a Token Coalition or by an organisation such as the 2Tokens project, to ensure all stakeholders’ requirements are met.

We define the terminology as follows:

  • Tokens: the digital representation of value (e.g. an asset) on a blockchain.
  • Tokenisation: the process of turning value (e.g. an asset) into its digital representation.
  • Tokenomics: the study of the emerging field of designing crypto tokens and related digital assets using economic incentives, game theory, cryptography and computer science.
  • Token engineering: the practice of using tokens as the foundation for designing value flows and, ultimately, economic systems.
  • Purpose-driven tokenisation: leveraging the exchange of value to drive the behaviour of an ecosystem towards a particular goal.
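To make the first two definitions concrete, here is a minimal sketch in Python (with hypothetical names, assuming no particular blockchain) of a token as a digital representation of value that can be minted and transferred between ecosystem participants:

```python
class Token:
    """A minimal digital representation of value (e.g. an asset)."""

    def __init__(self, symbol: str):
        self.symbol = symbol
        self.total_supply = 0
        self.balances: dict[str, int] = {}

    def mint(self, owner: str, amount: int) -> None:
        """Tokenise value: issue new token units to an owner."""
        self.balances[owner] = self.balances.get(owner, 0) + amount
        self.total_supply += amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        """Exchange value between ecosystem participants."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
```

A real token would live on a blockchain rather than in a single object, but the core idea is the same: value is recorded as balances, and transfers move that value between participants under rules the token enforces.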

Innovative Funding

For any ecosystem, funding is important. Tokenisation, however, changes how this funding can be achieved, going beyond traditional financing from venture capitalists or financial institutions. When an ecosystem plans to use tokens for funding, it can benefit from easy access to capital, anywhere in the world. Compared to traditional financing such as an IPO, the costs of financing are lower, and it is easier to scale as more people have access to the investment opportunity. After all, tokens do not know borders.

Tokens offer multiple technical advantages over traditional funding. First of all, they are programmable, which means that governance and rules can be embedded within the token itself. For example, the longer you hold a token, the more dividend you receive. This allows you to steer the behaviour of your investors while raising funds. In addition, tokens are transparent, secure and traceable, giving regulators more control to ensure correct behaviour.
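Such an embedded rule can be sketched as follows (a hypothetical incentive scheme with made-up rates, not any particular platform’s mechanism): a dividend rate that grows the longer the investor holds the token.

```python
def dividend(amount_held: int, months_held: int,
             base_bps: int = 100, bonus_bps_per_month: int = 10) -> float:
    """Pay a dividend whose rate, in basis points, grows with holding
    duration: a 1% base rate plus a 0.1% loyalty bonus per month held."""
    rate_bps = base_bps + bonus_bps_per_month * months_held
    return amount_held * rate_bps / 10_000
```

Under these example rates, a holder of 1,000 tokens receives 10 at the outset but 22 after holding for twelve months, rewarding long-term participation over quick exits.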

With tokenised funding becoming the norm in the coming years, we can expect a shift from ownership to temporary ownership, as exchanging assets will become easy. As a result, previously illiquid assets will become liquid, thereby drastically changing economies. Anything can be tokenised and made liquid, including real estate (through fractional ownership), CO2 rights, mobility, futures, art, or even entire clubs and sports contracts to increase fan engagement.
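Fractional ownership can be illustrated with a simple sketch (hypothetical numbers and function name, assuming the asset is split into equal-value fungible tokens): a small investment buys a precise slice of an otherwise illiquid asset.

```python
def fractional_stake(asset_value: int, n_tokens: int, investment: int):
    """Split an illiquid asset into n equal-value tokens and compute how
    many tokens an investment buys and the ownership it represents."""
    token_price = asset_value / n_tokens            # value backing one token
    tokens_bought = int(investment // token_price)  # whole tokens only
    ownership = tokens_bought / n_tokens            # fraction of the asset owned
    return tokens_bought, ownership
```

For example, a 5,000 investment in a 1,000,000 property split into 10,000 tokens buys 50 tokens, i.e. 0.5% of the asset, and those tokens can later be resold on a secondary market.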

Key to tokenised funding is the right infrastructure. This includes secondary markets to easily exchange security tokens, clear regulations so companies know what they have to comply with, and an intuitive user interface to facilitate ease of investment. When the right tools are available, tokens will revolutionise funding opportunities.



Change Management

Developing an ecosystem is one thing; getting people to use it is a different challenge. Although tokens can drive behaviour, people will need to change their behaviour to participate in tokenised ecosystems. Resistance to change can be expected, because people don’t understand the new ways of interacting (why is the new ecosystem better?), don’t see the urgency (why do we need to change now?) or don’t see what is different compared to traditional ecosystems (is the status quo not good enough, or even better?). In addition, ecosystem participants might fear the change itself, for example that large parties will determine the rules or reap all the rewards.

To tackle this resistance, it is crucial for ecosystem owners to create awareness and understanding of the change implied by tokenisation: what is it, why is it important, how will it change the ecosystem and what are the benefits? In addition, it is important to define and communicate the scale of the change and to eliminate certain (wrong) assumptions. Starting with a minimum viable ecosystem to build engagement, and showing why a tokenised approach matters, can help market adoption.

When talking about tokenisation of an ecosystem, or multiple ecosystems for that matter, we should also take into account the following subjects:

  • Responsibility: define who is responsible for what, and who might be liable for possible negative consequences of tokenising the ecosystem.
  • Weaker parties: weaker parties might need a helping hand to participate in tokenised ecosystems.
  • Ethics: is the tokenised ecosystem designed with ethics in mind?
  • Skin in the game: who are the parties at risk when tokenising an ecosystem? Who can quickly adapt and who might need more help? It is important to try to keep everyone on board. Diversity within and between ecosystems could add a lot of value.

With the above components in place, it becomes easier to design and grow a tokenised ecosystem.

Messaging and Engagement

Apart from designing a tokenised ecosystem, communication around your ecosystem is also vital to drive change. For many, the concept of tokenisation is not clear, let alone the benefits of purpose-driven tokenisation. Therefore, a clear message explaining why a token is used and what the benefits are for the ecosystem as a whole is important to ensure adoption. This includes not only the right marketing material but also seeing the ecosystem in production, so that users can experience the benefits of a tokenised ecosystem for themselves. A demo or proof of concept will not be sufficient; only real applications will get tokens out of the taboo sphere.

However, it is not sufficient for each tokenised ecosystem to explain tokens and the benefits of tokenisation on its own. There must be clear messaging and engagement at a higher level, which would benefit all tokenised ecosystems. This links back to the need for a shared understanding, standards and clear definitions. It would allow us to explain what tokenisation can do for society, which would benefit all tokenised ecosystems and speed up adoption. Leveraging success stories as showcases would certainly help, but showcasing failures can also contribute, as we can learn from our mistakes; in short, we need to educate people on tokenisation.

Tokenisation can have broad benefits for society if done right; the question is how to get the implications and opportunities widely understood.

Knowledge and Skills

Education of consumers, companies, regulators and policymakers is vital for tokenised ecosystems to succeed. Many applications, such as decentralised autonomous organisations (DAOs), are too complicated and technical for most people to understand, which limits effective governance. In addition, crypto has a reputation problem, not least because of the many scams the world has seen in recent years. Therefore, for tokenised ecosystems to succeed, we need to educate the community and increase its knowledge of tokenisation to create trust. This can only be achieved when different industry players, such as regulators, policymakers, startups and investors, actively collaborate when designing the ecosystem and share their insights with the broader community.

Unfortunately, regulatory clarity will take too long, which means organisations may need to take ownership and move ahead regardless. This can only work if those organisations ensure trust in their product by adhering to ethical standards: education through action.

Problem-Solution Fit

When designing and developing tokenised ecosystems, it is important that there is a real problem that can be fixed with tokenisation. As with any startup, validation is, therefore, vital for success. Are you designing and developing the right ecosystem in the right way?

What can tokens bring to the table that cannot be achieved without them? To build trust in the wider community, it needs to be clear to (potential) users, clients and regulators why a token is necessary and how it will be used. What economic value will the token bring to the ecosystem? These are important questions that need to be answered and shared with relevant stakeholders before building a tokenised ecosystem. Since tokenisation is so new, this can only be achieved through active collaboration with all stakeholders before developing the ecosystem.

Tokenisation and the Law

When designing tokenised ecosystems, legal compliance is essential to stand apart from unethical and fraudulent counterparts. Legal design thinking therefore belongs at the core of every project. It is important to have a clear view of the legal aspects of new developments, to seek regulatory cooperation, and to collaborate closely with regulators and policymakers (for example, using sandboxes) when designing the tokenised ecosystem. While the community should welcome regulation, regulators and policymakers need to develop regulations that do not stifle innovation. This requires more cooperation and open dialogue between innovators and regulators.

Conclusion

Standardisation of terminology, education of the wider community and collaboration with relevant stakeholders such as regulators and policymakers are important preconditions for developing successful tokenised ecosystems. It is up to regulators and policymakers to establish clear regulations that enable innovation instead of limiting it because tokens do not adhere to borders.

The opportunity is there if we can get the right environment for tokenisation in place. If this does not happen, then startups and ecosystems looking to leverage the benefits of tokenisation will move elsewhere as the benefits are simply too big to ignore.

This second round table session was again a great success, showing what the preconditions are for developing tokenised ecosystems. Purpose-driven tokenisation can drive and change behaviour, but all stakeholders must be involved from the start, which is precisely what we are doing in developing the 2Tokens ecosystem.

The final round table took place on February 11, with a deep dive into all aspects of token financing. If you would like to contribute to or engage on this topic, you can register your interest here.

Categories: Blockchain
Tags: blockchain, security token, Token economics, Tokenomics, tokens

About Dr Mark van Rijmenam

Dr Mark van Rijmenam, CSP, is a leading strategic futurist and innovation keynote speaker who thinks about how technology changes organisations, society and the metaverse. He is known as The Digital Speaker, and he is a 5x author and entrepreneur.

