3 Steps to Monetize Your Data

Treating data ‘as an asset’

I remember reading a book on big data a few years back — the name escapes me now — which suggested that organisations should start to look at data in the same way they do tangible assets such as people, buildings and machines.

The book went on to recommend that organisations treat data ‘as an asset’, which, in practical terms, meant taking proper care of it over its valuable lifecycle through the application of data governance, data management processes and effective quality control. If the data was well maintained, the book continued, it could be used for a broader range of use cases and deliver more value to the organisation. As such, data could be said to have tangible value and could, in theory, be added to the balance sheet of the business’s financial statements.

Treating data as an asset is certainly a valuable goal to strive towards, but adding it to the balance sheet? It’s a nice idea, but in my experience the businesses I work with are far from even considering it. You never know, one day it might become commonplace, but I won’t hold my breath!

However, one trend I have seen emerge from this line of thought is monetizing data. The advent of Snowflake’s Data Marketplace has significantly lowered the barrier to entry and made this a reality for even the most modest organisations. Many of the customers I work with today are really excited about the possibilities this now presents for their business operations.

Why monetize data?

There are many reasons why an organisation might want to monetize data; some common examples are:

1. Provides a point of difference in a competitive marketplace: if you’re able to generate new insights and package up a data offering for your partners, this may well give you an edge over the competition in a crowded marketplace, helping you attract new customers and retain existing ones more easily.

2. Reduces operational overhead: managing data feeds can be a time-consuming, costly and complex exercise. Many companies still share data by physically shuffling files across the network. This not only introduces multiple points of failure but also adds latency and delay, so there’s a greater chance your customers suffer a bad experience: the data is out of date by the time they receive it, it’s corrupt, or a failure in the process means they don’t get their data at all! As we’ll discover, Snowflake’s Data Marketplace overcomes much of this by removing data movement, along with all the associated risks I’ve mentioned.

3. Creates new products: your organisation might already be sitting on a gold mine of data but has never had the means to share it. You might have data related to customer behaviour, spending habits or market trends, for example, which could help other companies offer a more personalised service to their customers. The ability to package up your data to create new products is a compelling and exciting proposition for many organisations.

The 3 steps to monetizing your data

I like to keep things simple. In this section, I’m going to break down what can potentially be viewed as a complex problem into just 3 simple steps, to help you understand how to maximise the value of the data you have at your disposal!

Step 1: Discover

Objective: Create an inventory of high value data assets

Identify and catalog your data assets

As counter-intuitive as it sounds, the best place to start is with the data you CANNOT share! There’s going to be data you don’t wish to share because it would disclose trade secrets or harm your competitiveness. Additionally, you’ll more than likely have to deal with regulatory, legal and contractual concerns, which place additional constraints on how you can use the data you have access to.

That still leaves a wealth of possible datasets such as:

• Operational data such as sensor, telemetry, weather or log data

• Commercial data such as share dealing, price movements or market insights

• Marketing data such as customer buying trends or patterns, and segmentation

Once you know what you can share, you need to consider how to add value to your data.

Firstly, let’s assume you have the raw data, so no transformations, cleansing or additional business logic has been applied. Now, ask yourself: what would it take to clean and standardize the data? Would it be a worthwhile exercise, based on the value it would offer to your customers? Remember, you might only need to cleanse the data once to sell it many times over. Economies of scale would be your friend here.
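As a minimal sketch of what that cleansing step might look like in Snowflake SQL (the table and column names here are entirely hypothetical):

```sql
-- Hypothetical example: standardizing raw sensor readings before sharing.
-- Table and column names are illustrative, not from any real system.
CREATE OR REPLACE TABLE sensor_readings_clean AS
SELECT
    sensor_id,
    CONVERT_TIMEZONE('UTC', reading_time) AS reading_time_utc,  -- one consistent timezone
    ROUND(temperature_raw, 2)             AS temperature_c,     -- consistent precision
    NULLIF(TRIM(location_code), '')       AS location_code      -- empty strings become NULL
FROM raw_sensor_readings
WHERE reading_time IS NOT NULL
  AND temperature_raw BETWEEN -90 AND 60;  -- drop implausible readings
```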

How about integrating it with other existing data sets you have available, and curating or tailoring it for a specific purpose? You could also analyse the data to derive new, tangible insights and draw on this analysis as part of the value you add for potential customers.
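Continuing the hypothetical example above, curation could be as simple as joining the cleansed data with a reference table you already hold:

```sql
-- Hypothetical example: enrich cleansed readings with site context,
-- exposed as a view that can later be published to consumers.
CREATE OR REPLACE VIEW curated_sensor_readings AS
SELECT
    r.sensor_id,
    s.site_name,
    s.region,
    r.reading_time_utc,
    r.temperature_c
FROM sensor_readings_clean AS r
JOIN site_reference        AS s  -- assumed internal reference data
  ON r.sensor_id = s.sensor_id;
```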

Look at the chart below and consider what the trade-off might be for you in your environment, with your own data sets. How much value can you add to the data for the least time and effort?

Step 2: Market Strategy

Objective: Define a go-to-market strategy, including pricing

Consider how much the data costs to produce.

This includes:

• Sourcing and onboarding the data

• Cleaning, transforming and augmenting the data

• Handling any operational issues, such as corrupt or missing data, failed processes and customer queries

• Associated infrastructure and licensing costs

One option could be to simply add up all the costs associated with the activities above, add your margin, and that’s your price. Alternatively, I would suggest you also consider value-based pricing, which involves several factors:

• How hard would it be for your customers to source this data? Is it even possible?

• Do you need certain niche skills and domain expertise to make sense of the data to unlock new insights?

• How could this data help your potential customers improve their business? Can they develop better products? Can they make more sales, reach more prospects, target customers more effectively with personalized offers or messaging?

• Are customers already dealing with the pain of collecting and managing this data operationally, and would they rather consume it following a ‘data as a service’ model?

It’s possible to produce an incredibly costly data set of no value to anyone else or, ideally, a lower-cost one with immense potential value to a buyer.

Now that you have thought about the costs of providing the data and its potential value on the market, the final element in step two is what we refer to as ‘packaging’.

The key questions here concern how often you refresh your data and how important this is to your customers. For example, do customers only care about data at the present time, such as stock prices and volatility when they’re making investment decisions? Once this kind of data is outdated, it loses its value quite rapidly.

For other customers, historical insights are more valuable: historical house prices over time, for example, or perhaps an insurance company that offers house insurance and wants to analyze insurance claim events over time. It may want to look at certain streets within geographical areas to get a better sense of the market risk it’s dealing with.

The bottom line here is that different customers will place a different value on how often you refresh those particular data sets.

It’s also well worth considering how comprehensive the data will be in terms of the number of data points or the level of aggregation. This can really help refine your pricing model. You could set up tiered pricing, like gold, silver and bronze.

In this case, bronze could have the narrowest set of data points available at the highest level of aggregation. Some customers will choose to pay more for silver or gold to access more data points at a more granular level, giving them potentially better and more significant insights.
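As a rough sketch of how these tiers might map onto Snowflake objects (the view names and columns are hypothetical, carrying on the earlier example), each tier could be a secure view at a different level of aggregation:

```sql
-- Bronze: fewest data points, highest level of aggregation.
CREATE OR REPLACE SECURE VIEW readings_bronze AS
SELECT
    region,
    DATE_TRUNC('month', reading_time_utc) AS month,
    AVG(temperature_c)                    AS avg_temperature_c
FROM curated_sensor_readings
GROUP BY region, month;

-- Gold: every data point at full granularity.
CREATE OR REPLACE SECURE VIEW readings_gold AS
SELECT sensor_id, site_name, region, reading_time_utc, temperature_c
FROM curated_sensor_readings;
```

Secure views are the natural choice here because Snowflake requires views to be secure before they can be shared, and they prevent consumers from inspecting the underlying definition.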

You should also consider whether offering exclusivity to a single customer, in return for a premium payment, would be a compelling proposition. I’ve seen this setup before and it can work well. And because you’re dealing with just one customer, you reduce operational overhead and potential queries compared to having tens or even hundreds of customers.

Sometimes a single customer wants to corner the market and capture that data for themselves, and is prepared to pay more to secure that data exclusively.

Another area worth considering upfront, when thinking about your pricing model and how to package your data product, is delivering a suite of BI dashboards, built in something like Power BI, Tableau or Looker, with all the filters in place to let your customers slice and dice the data. Would that add further value for your customers, meaning they don’t need to worry about licensing a data visualisation tool and developing the dashboards themselves? Would that complete your data offering and data product to the market?

Step 3: Data Sharing

Objective: Define a mechanism to share data with minimal friction

Finally, you can share your data with the world!

We’ve discovered what data we can share and identified those high-value data sets. We’ve potentially curated and cleansed that data, and even added new insights to it. We’ve defined our target market, pricing and strategy. The final step is deciding how to distribute that data to the market.

This is where you can leverage Snowflake’s Data Marketplace. The data stays within the company or organisation that owns it, and it can be shared, with no data movement, to a potentially unlimited number of consumers. Consumers can access that data directly, as soon as it’s available, using their own Snowflake account. Once you’ve shared your data with them, it appears like any other database within their Snowflake account.
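To make this concrete, here’s a minimal sketch of the provider side using Snowflake’s data sharing commands (the database, view and account names are hypothetical):

```sql
-- Provider side: create a share, grant access to the objects you
-- want to publish, then make it visible to a consumer account.
CREATE SHARE sensor_data_share;

GRANT USAGE ON DATABASE analytics_db TO SHARE sensor_data_share;
GRANT USAGE ON SCHEMA analytics_db.public TO SHARE sensor_data_share;
-- Grant only the tier this consumer has purchased (bronze here).
GRANT SELECT ON VIEW analytics_db.public.readings_bronze TO SHARE sensor_data_share;

ALTER SHARE sensor_data_share ADD ACCOUNTS = xy12345;  -- consumer's account
```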

As soon as you, as the owner of the data, refresh it, it becomes available directly to those data consumers. They can then easily and quickly consume that data using native SQL, as well as join it with their own data. The inherent challenges involved with APIs and traditional file transfer methods disappear because data movement is eradicated from the entire process.
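On the consumer side, the shared data mounts as a read-only database that can be queried and joined like any other (again, all names here are hypothetical):

```sql
-- Consumer side: create a database from the share, then query it
-- alongside local data with plain SQL.
CREATE DATABASE shared_sensor_data FROM SHARE provider_org.sensor_data_share;

SELECT s.region, s.month, s.avg_temperature_c, c.store_sales
FROM shared_sensor_data.public.readings_bronze AS s
JOIN my_db.public.store_performance           AS c
  ON s.region = c.region AND s.month = c.month;
```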

Furthermore, Snowflake can share data across regions and across clouds. So if you’re a Snowflake customer running on Microsoft Azure and your data consumers are on Snowflake running on Google Cloud Platform, it really doesn’t matter: the user experience is the same.

See it in action!

In this week’s video I walk you through my 3-step process and provide a brief orientation of Snowflake’s Data Marketplace, before diving into a real-world case study which looks at how one customer used the Marketplace to develop a new product offering with great success!
