Economic Value of Data (EvD) Challenges | @BigDataExpo #BigData #Analytics

Data has a direct impact on an organization’s financial investments and monetization capabilities

Well, my recent University of San Francisco research paper, “Applying Economic Concepts To Big Data To Determine The Financial Value Of The Organization’s Data And Analytics,” has fueled some very interesting conversations. Most excellent! That was one of its goals.

It is important for organizations to invest the time and effort to understand the economic value of their data because data has a direct impact on an organization’s financial investments and monetization capabilities. However, calculating economic value of data (EvD) is very difficult because:

  • Data does not have an innate fixed value, especially as compared to traditional assets, and
  • Using traditional accounting practices to calculate EvD doesn’t accurately capture the financial and economic potential of the data asset.

And in light of those points, let me share some thoughts that I probably should have made more evident in the research paper.

Factoid #1:  Data is NOT a Commodity (So Data is NOT the New Oil)
Crude oil is a commodity. West Texas Intermediate (WTI), also known as Texas light sweet, is a grade of crude oil used as a benchmark in oil pricing. This grade is described as light because of its relatively low density, and sweet because of its low sulfur content.  WTI is a light crude oil, with an API gravity of around 39.6, specific gravity of about 0.827 and less than 0.5% sulfur[1].

And here’s the important factoid about a commodity: every barrel of Texas light sweet is exactly like any other barrel of Texas light sweet. One barrel of Texas light sweet is indistinguishable from any other barrel of Texas light sweet. Oil is truly a commodity.

However, data is not a commodity. Data does not have a fixed chemical composition, and one piece of data is readily distinguishable from any other piece of data. In fact, data may be more akin to genetic code, inasmuch as the genetic code defines who we are (see Figure 1).

Figure 1: Genetic Code

Every piece of personal data – every sales transaction, consumer comment, social media post, phone call, text message, credit card transaction, fitness band reading, doctor visit, web browse, keyword search, etc. – comprises another “strand” of one’s “behavioral genetic code” that indicates one’s inclinations, tendencies, propensities, interests, passions, associations and affiliations.

It’s not just the raw data that holds valuable strands of our “behavioral genetic code”; the metadata about our transactional and engagement data is also a rich source of insights into that code. For example, look at the metadata associated with a 140-character tweet. 140 characters wouldn’t seem to be much data. However, the richness of that 140-character tweet explodes when you start coupling the tweet with all the metadata necessary to understand the 140 characters in the context of the conversation (see Figure 2).

Figure 2: “Importance of Metadata in a Big Data World”
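To make that concrete, here is a rough sketch (in Python) of how a single tweet might look once its surrounding metadata is attached. The field names and values are hypothetical placeholders rather than Twitter’s actual API schema; the point is simply how much the context grows beyond the 140 characters themselves.

```python
import json

# A "raw" 140-character tweet: not much data on its own.
raw_tweet = "Loving the new fitness band - 10k steps before lunch! #fitness"

# The same tweet wrapped in hypothetical metadata that puts it in context.
# Field names are illustrative, not Twitter's actual API schema.
tweet_with_metadata = {
    "text": raw_tweet,
    "author_id": "u_84721",
    "created_at": "2017-08-14T11:42:07Z",
    "geo": {"lat": 37.77, "lon": -122.42},            # where it was posted
    "device": "iPhone",                               # what it was posted from
    "in_reply_to_status_id": "t_993311",              # conversation thread
    "hashtags": ["fitness"],
    "mentions": [],
    "author_followers": 1250,
    "author_recent_topics": ["running", "wearables"], # behavioral context
    "retweet_count": 3,
    "favorite_count": 11,
}

raw_bytes = len(raw_tweet.encode("utf-8"))
enriched_bytes = len(json.dumps(tweet_with_metadata).encode("utf-8"))
print(f"raw tweet: {raw_bytes} bytes, with metadata: {enriched_bytes} bytes")
```

Even in this toy example, the contextual record is several times larger than the tweet itself, and it is the context – who, when, where, in reply to what – that carries most of the behavioral signal.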

The Bottom-line:
Data is not a commodity, which makes determining the economic value of data very difficult, and maybe even irrelevant, using traditional accounting techniques. Which brings us to the next point…

Factoid #2: Can’t Use Accounting Techniques to Calculate Economic Value of Data
The challenge with using accounting or GAAP (generally accepted accounting principles) techniques for determining the economic value of data is that accounting uses a retrospective view of your business to determine the value of assets. Accounting determines the value of assets based upon what the organization paid to acquire those assets.

Instead of using the retrospective accounting perspective, we want to take a forward-looking, predictive perspective to determine the economic value of data. We want to apply data science concepts and techniques to determine the EvD by looking at how the data will be used to optimize key business processes, uncover new revenue opportunities, reduce compliance and security risks, and create a more compelling customer experience. Think determining the value of data based upon “value in use” (see Table 1).

Accounting Perspective | Data Science Perspective
Historical valuation based upon knowing what has happened | Predictive valuation based upon knowing what is likely to happen and what action one should take
Value determination based upon what the organization paid for the asset in the past | Value determination based upon how the organization will monetize the asset in the future
Valuations are known with 100% confidence based upon what was paid for the asset | Valuations are based on probabilities, with confidence levels dependent upon how the asset will be used and monetized
Value determination based upon acquisition costs (“value in acquisition”) | Value determination based upon how the data will be used (“value in use”)

Table 1:  Accounting versus Data Science Perspectives
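To illustrate the contrast in Table 1, here is a minimal “value in use” sketch in Python. The use cases, dollar impacts, and probabilities are made-up numbers for illustration only; the point is that the data science view values the data by the probability-weighted impact of how it will be used, rather than by what was paid to acquire it.

```python
# Minimal "value in use" sketch: estimate EvD as the probability-weighted sum
# of the financial impact of the use cases a data set is expected to support.
# The use cases, impacts, and probabilities below are illustrative only.
use_cases = [
    # (use case, expected annual impact in $, probability of realizing it)
    ("Reduce customer churn",            1_200_000, 0.60),
    ("Optimize predictive maintenance",    800_000, 0.45),
    ("Cross-sell recommendations",         500_000, 0.70),
]

acquisition_cost = 250_000  # what accounting would book for the data asset

economic_value_in_use = sum(impact * prob for _, impact, prob in use_cases)

print(f"Accounting view (cost to acquire):        ${acquisition_cost:,.0f}")
print(f"Data science view (expected value in use): ${economic_value_in_use:,.0f}")
```

The same data asset produces two very different numbers: a fixed, backward-looking acquisition cost, and a forward-looking expectation that changes as new use cases are added or their likelihoods improve.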

This “value in use” perspective traces its roots to Adam Smith, the pioneer of modern economics. In his book “Wealth of Nations,” Adam Smith[3] defined capital as “that part of a man’s stock which provides him a revenue stream.” Adam Smith’s concept of “revenue streams” is consistent with the data science approach of leveraging data and analytics to create “value in use”.

We have ready examples of how other organizations determine the economic value of assets based upon “value in use,” starting with my favorite data science book – Moneyball. Moneyball describes a strategy of leveraging data and analytics (sabermetrics) to determine how valuable a player might be in the future. One of the biggest challenges for sports teams is determining a player’s future value, since player salaries and salary cap management are among the biggest challenges in running a sports franchise. Consequently, data science provides the necessary forward-looking, predictive perspective to make those “future value” decisions.

Sports organizations cannot accurately determine a player’s economic value based entirely on past stats. To address this challenge, basketball created Real Plus-Minus (RPM)[4], a predictive metric (score) designed to estimate how well a player will perform in the future.
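As a toy illustration of that forward-looking perspective, the sketch below fits a simple least-squares trend to a player’s historical plus-minus and projects the next season. The numbers are fabricated and this is not how ESPN’s Real Plus-Minus is actually computed; it simply shows valuing a player on predicted future performance rather than on past stats alone.

```python
# Toy forward-looking valuation: project next season's plus-minus from past
# seasons with a simple least-squares trend line. The numbers are made up,
# and this is NOT how ESPN's Real Plus-Minus is actually computed.
past_seasons = [1, 2, 3, 4]          # season index
plus_minus   = [1.2, 1.8, 2.1, 2.9]  # player's historical plus-minus per season

n = len(past_seasons)
mean_x = sum(past_seasons) / n
mean_y = sum(plus_minus) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(past_seasons, plus_minus))
         / sum((x - mean_x) ** 2 for x in past_seasons))
intercept = mean_y - slope * mean_x

next_season = 5
projected = intercept + slope * next_season
print(f"Projected plus-minus for season {next_season}: {projected:.2f}")
```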

The Bottom-line:
We need to transition the economic value of data conversation away from the accounting retrospective of what we paid to acquire the data, to a forward-looking data science perspective of how the data is going to be used to deliver “value in use.”

Economic Value of Data Summary
Data is an asset that can’t be treated like a commodity because:

  1. Every piece of data is different and provides unique value based upon the context (metadata) of that data, and
  2. Traditional retrospective (accounting) methods of determining EvD won’t work because the value of the data is not what one paid to acquire it, but rather how that data will be used to create monetization opportunities (“value in use”).

To exploit the economic value of data, organizations need to transition the conversation about their data assets from an accounting perspective (what has happened) to a data science perspective (what is likely to happen). Once you reframe the conversation, the EvD calculation becomes more manageable, more understandable and ultimately more actionable.

[1] https://en.wikipedia.org/wiki/West_Texas_Intermediate

[2] Edited by Seth Miller User:arapacana, Original file designed and produced by: Kosi Gramatikoff User:Kosigrim, courtesy of Abgent, also available in print (commercial offset one-page: original version of the image) by Abgent – Original file: en:File:GeneticCode21.svg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=4574024

[3] “Wealth of Nations”, http://geolib.com/smith.adam/won1-04.html

[4] https://cornerthreehoops.wordpress.com/2014/04/17/explaining-espns-real-plus-minus/


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power an organization’s key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
