Jun 18, 2024

Data Assets: Tokenisation and Valuation

In today’s digital era, data has become immensely valuable, with estimates placing its global worth at $13 trillion. However, unlike traditional assets, data rarely finds its place on balance sheets. Tokenisation, a process already used by banks for payments and settlements, offers a way to manage, value, and monetise data assets. By representing data as tokens on decentralised platforms, it could become a tradeable commodity, fostering liquidity and enabling efficient exchange.

This position paper outlines our vision and a general framework for tokenising data and managing data assets and data liquidity to allow individuals and organisations in the public and private sectors to gain the economic value of data while facilitating its responsible and ethical use.

We will examine the challenges associated with developing and securing a data economy, as well as the potential applications and opportunities of the decentralised data-tokenised economy. We will also discuss the ethical considerations to promote the responsible exchange and use of data to fuel innovation and progress.

Download the paper here.

---

Cover Image by Google DeepMind from Pexels.

Subscribe to our newsletter!

Valyu is a data provenance and licensing platform that connects data providers with ML engineers looking for diverse, high-quality datasets for training models.  

#WeBuild 🛠️