Data and Economics 101 | @CloudExpo #IoT #AI #M2M #BigData #Analytics

Economics is the science that deals with the production, distribution, and consumption of commodities

As more organizations try to determine where best to deploy their limited budgets to support data and analytics initiatives, they realize a need to ascertain the financial value of their data and analytics - which means basic economic concepts are coming into play.  While many of you probably took an economics class in college not too long ago, some more "seasoned" readers may be rusty.

This topic began with a blog that I wrote several months ago titled "Determining the Economic Value of Data" and the key observation that started that conversation:

Data is an unusual currency. Most currencies exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar is considered to be finite - it can only be used to buy one item or service at a time, or a person can only do one paid job at a time. But measuring the value of data is not constrained by those transactional limitations. In fact, data currency exhibits a network effect, where data can be used at the same time across multiple use cases thereby increasing its value to the organization. This makes data a powerful currency in which to invest.
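This one-to-many property can be shown with a toy calculation. A minimal sketch, assuming entirely hypothetical use cases and dollar values:

```python
# A dollar is consumed by a single transaction; a dataset is not.
# Hypothetical per-use-case values for the same customer dataset.
use_case_values = {
    "fraud_detection": 120_000,
    "customer_retention": 80_000,
    "cross_sell_targeting": 50_000,
}

# A traditional currency can fund only one use at a time.
single_use_value = max(use_case_values.values())

# Data can serve every use case simultaneously, so its value accumulates.
reusable_data_value = sum(use_case_values.values())

print(single_use_value)     # 120000
print(reusable_data_value)  # 250000
```

The same dataset, deployed against three use cases at once, is worth the sum of its uses rather than the value of its best single use.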

So to better understand how economics can help determine the value of an organization's data and analytics, I sought the help of an old friend who is passionate about applying economics in business. Vince Sumpter (Twitter: @vsumpter) helped deepen my understanding of some core concepts of economics, and consider where and how these economic concepts play in a business world that is looking for ways to determine the financial - or economic - value of their data and analytics.

The economic concepts that seem to have the most bearing on determining the economic value of data (and the resulting analytics), and that this blog will cover, include:

  • Scarcity
  • Postponement Theory
  • Efficiency
  • Multiplier Effect
  • Price Elasticity
  • Capital

It is our hope that this blog fuels some creative thinking and debate as we contemplate how organizations need to apply basic economic concepts to these unusual digital assets - data and analytics.

Defining Economics
I found the following two definitions of "economics" the most useful:

  • Economics is the science that deals with the production, distribution, and consumption of commodities. Economics is generally understood to concern behavior that, given the scarcity of means, arises to achieve certain ends[1].
  • Economics is a broad term referring to the scientific study of human action, particularly as it relates to human choice and the utilization of scarce resources[2].

I pulled together what I felt were some of the key phrases to come up with the following definition of the "economics of data" for purposes of this blog:

Economics of Data:  The science of human choice and behaviors as they relate to the production, distribution and consumption of scarce data and analytic resources.

Economics is governed by the law of supply and demand, which describes the interaction between the supply of a resource and the demand for that resource. The law of supply and demand defines the effect that the availability of a product or service, and the demand for it, have on price. Generally, low supply and high demand increase price; conversely, the greater the supply and the lower the demand, the more the price tends to fall.
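The relationship can be made concrete with simple linear curves. A minimal sketch; the demand and supply functions below are illustrative, not drawn from any real market:

```python
# Linear demand: quantity demanded falls as price rises.
# Linear supply: quantity supplied rises as price rises.
def demand(price):
    return 100 - 2 * price   # illustrative: Qd = 100 - 2p

def supply(price):
    return 10 + 4 * price    # illustrative: Qs = 10 + 4p

# Equilibrium: the price where quantity demanded equals quantity supplied.
# Solve 100 - 2p = 10 + 4p  ->  90 = 6p  ->  p = 15
equilibrium_price = 90 / 6
print(equilibrium_price)          # 15.0
print(demand(equilibrium_price))  # 70.0 (equals supply at equilibrium)
```

Shrinking supply (lowering the intercept of `supply`) pushes the equilibrium price up, which is the "low supply, high demand" case described above.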

Now, we will explore the most relevant economic concepts in the context of the Economics of Data.

Scarcity
Scarcity refers to limitations: insufficient resources, goods, or abilities to achieve the desired ends. Figuring out ways to make the best use of scarce resources, or to find alternatives, is fundamental to economics.

Figure 1: Scarcity Ramifications

Scarcity is probably the heart of the economics discussion and ties directly to the law of supply and demand. Organizations do not have unlimited financial, human, or time resources; consequently, as we discussed previously, they must prioritize their data and analytic resources against the best opportunities. At its heart, scarcity forces organizations to do two things that they typically do not do well: prioritize and focus (see "Big Data Success: Prioritize ‘Important' Over ‘Urgent'").

Scarcity plays out in the inability, or the unwillingness, of the organization to share all of its data across all of its business units. For some business units, scarcity drives their value to the organization; that is, he who owns the data owns the power. This short-sighted mentality manifests itself across organizations in the form of data silos and IT "Shadow Spend." For example, if you are a financial services organization trying to predict your customers' lifetime value, having analytics that optimize individual business units (checking, savings, retirement, credit cards, mortgage, car loans, wealth management) without seeking to optimize the larger business objective (predicting customer lifetime value) could easily lead to suboptimal or even wrong decisions about which customers to prioritize, with what offers, at what times, through what channels.

Scarcity has the biggest impact on the prioritization and optimization of scarce data and analytic resources, including:

  • Are your IT resources focused on capturing or acquiring the most important data in support of the organization's key business initiatives?
  • Are your data science resources focused on the development of the top priority analytics?
  • Does your technical and cultural environment support and even reward the capture, refinement, and re-use of the analytic results across multiple business units?

Consequently, the ability to prioritize (see "Prioritization Matrix: Aligning Business and IT On The Big Data Journey") and to carefully balance the laws of supply and demand is critical to ensuring that your data and analytics resources are prioritized against the "optimal" projects.
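One simple way to act on scarcity is a weighted scoring of candidate projects on business value and implementation feasibility, in the spirit of a prioritization matrix. This is only a sketch; the projects, scores, and weights below are hypothetical:

```python
# Weighted prioritization of data/analytics projects under scarce resources.
# Scores (1-10) and weights are hypothetical illustrations.
projects = {
    "customer_lifetime_value": {"business_value": 9, "feasibility": 6},
    "fraud_detection":         {"business_value": 7, "feasibility": 8},
    "ad_hoc_reporting":        {"business_value": 4, "feasibility": 9},
}
weights = {"business_value": 0.7, "feasibility": 0.3}

def score(project):
    # Weighted sum of the project's scores across all criteria.
    return sum(project[criterion] * w for criterion, w in weights.items())

ranked = sorted(projects, key=lambda name: score(projects[name]), reverse=True)
print(ranked)  # ['customer_lifetime_value', 'fraud_detection', 'ad_hoc_reporting']
```

Weighting business value above feasibility reflects the "important over urgent" point above: the easiest project (ad hoc reporting) ranks last because it moves the business least.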

Postponement Theory
Postponement is a decision to postpone a decision (which is itself a decision). It can occur when one party seeks to gain additional information about the decision and/or to delay for better terms from the other party.

Postponement has the following ramifications from an economics of data perspective:

  • Case #1: Organizations may decide to postpone a decision in order to gather more data and/or build more accurate analytics in order to dramatically improve the probability of making a "better" decision
  • Case #2: People and organizations may postpone a decision in order to get better terms especially given certain time constraints (e.g., car dealers get very aggressive with their terms near the end of the quarter)

While Case #2 may not have an impact on the economics of your organization's data and analytics, Case #1 has direct impact.  In order to make a postponement decision, organizations need to understand:

  • What is the estimated effectiveness of the current decision given Type I/Type II decision risks (where a Type I error is a "False Positive" error and a Type II error is a "False Negative" error)? See "Understanding Type I and Type II Errors" for more details on Type I/Type II errors.
  • What data might be needed to improve the effectiveness of that decision?
  • How much more accurate can the decision be made given these new data sources and additional data science time?
  • What are the risks of Type I/Type II errors (the costs associated with making the wrong decision)?
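Case #1 can be framed as an expected-cost comparison: act now with the current model, or pay to postpone and act with a more accurate one. A minimal sketch; every probability and cost below is invented for illustration:

```python
# Expected cost of acting now vs. postponing to gather more data.
# Type I = false positive, Type II = false negative. All numbers hypothetical.
def expected_error_cost(p_type1, p_type2, cost_type1, cost_type2):
    return p_type1 * cost_type1 + p_type2 * cost_type2

decide_now  = expected_error_cost(0.10, 0.15, 50_000, 200_000)  # current model
after_delay = expected_error_cost(0.04, 0.06, 50_000, 200_000)  # improved model
delay_cost  = 10_000  # hypothetical data acquisition + data science time

# Postpone only if the error-cost reduction exceeds the cost of waiting.
value_of_postponing = decide_now - (after_delay + delay_cost)
print(value_of_postponing > 0)  # True: postponing is worth it in this scenario
```

The four bullet-point questions above supply exactly the inputs this calculation needs: current error probabilities, achievable improvement, and the costs of being wrong.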

Efficiency
Efficiency is a relationship between ends and means. When we call a situation inefficient, we are claiming that we could achieve the desired ends with less means, or that the means employed could produce more of the desired ends.

Data and analytics play a major role in driving efficiency improvements by identifying operational deficiencies and proposing recommendations (prescriptive analytics) on how to improve operational efficiencies.

Aggregating the operational insights gained from efficiency improvements might also lead to new monetization opportunities, by enabling the organization to aggregate usage patterns across all customers and business constituents. For example, organizations could create benchmark, share, and index calculations that customers and partners could use to measure their own efficiency and to set efficiency-optimization goals from the aggregated performance data.
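The benchmarking idea can be sketched as ranking one participant's efficiency ratio against the aggregated population. A toy example with invented output/input ratios:

```python
# Efficiency = output achieved per unit of input consumed.
# Benchmark one customer against aggregated ratios (all values hypothetical).
fleet_efficiency = [0.62, 0.71, 0.55, 0.80, 0.67, 0.74]  # output/input ratios

def percentile_rank(value, population):
    # Percentage of the population strictly below the given value.
    below = sum(1 for v in population if v < value)
    return 100.0 * below / len(population)

customer_ratio = 0.71
print(percentile_rank(customer_ratio, fleet_efficiency))  # 50.0
```

A customer at the 50th percentile knows exactly how much headroom exists, which is the kind of goal-setting insight an aggregated benchmark can monetize.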

Multiplier Effect
The multiplier effect refers to the increase in final income arising from any new injection of spending. The size of the multiplier depends upon households' marginal propensity to consume (MPC) or marginal propensity to save (MPS).

The Multiplier Effect is one of the most important concepts developed by J.M. Keynes to explain the determination of income and employment in an economy. The theory of the multiplier has been used to explain the cumulative upward and downward swings of the trade cycles that occur in a free-enterprise capitalist economy. When investment in an economy rises, it can have a multiple and cumulative effect on national income, output, and employment.

The multiplier effect is, therefore, the ratio of the increment in income to the increment in investment.
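In symbols, the multiplier k = ΔY/ΔI, and in terms of the MPC it is k = 1/(1 - MPC) = 1/MPS. A quick numeric check with an illustrative MPC:

```python
# Keynesian investment multiplier: k = 1 / (1 - MPC) = 1 / MPS.
def multiplier(mpc):
    return 1.0 / (1.0 - mpc)

mpc = 0.75                 # households spend 75 cents of each new dollar of income
investment_increase = 100  # illustrative injection of new spending

income_increase = multiplier(mpc) * investment_increase
print(multiplier(mpc))     # 4.0
print(income_increase)     # 400.0
```

A $100 injection ripples through rounds of re-spending until it has generated $400 of income, which is the cumulative effect described above.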

When applied to our thinking about the Economics of Data, the multiplier effect embodies the fact that our efforts to develop a new data source, or derived analytic measure, could have that same multiplier effect if the new data/analytics were to be leveraged beyond the initial project.

For example, when CPG manufacturers worked with retailers to implement the now-ubiquitous UPC standard in the early 1980s, their primary motivation was a desire to drive more consistent pricing at the cash register. Few imagined the knock-on benefits that would accrue from a much deeper understanding of actual product movement through the supply chain, let alone the shift in the balance of power from CPG manufacturers to today's retailers that subsequently ensued!

Figure 2:  Multiplier Effect

Price Elasticity
Price elasticity of demand is a quantitative measure of consumer behavior that indicates how the quantity demanded of a product or service changes as its price increases or decreases. Price elasticity of demand is calculated by dividing the percent change in the quantity demanded by the percent change in price.

In today's big data environment, the price of data science resources (i.e. their salaries) seems almost price inelastic (inelastic describes the situation in which the quantity demanded or supplied of a good or service is unaffected when the price of that good or service changes).  That means that the demand for data science resources is only slightly affected when the price of data science resources increases.
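The elasticity calculation itself is just a ratio of percentage changes. A minimal sketch with hypothetical numbers mirroring the data-scientist salary example:

```python
# Price elasticity of demand: % change in quantity demanded / % change in price.
def price_elasticity(q0, q1, p0, p1):
    pct_quantity = (q1 - q0) / q0
    pct_price = (p1 - p0) / p0
    return pct_quantity / pct_price

# Hypothetical: salaries rise 20% but demand for data scientists dips only 2%.
e = price_elasticity(100, 98, 100_000, 120_000)
print(abs(e) < 1.0)  # True: |elasticity| < 1 means demand is inelastic
```

An elasticity magnitude well below 1 is exactly the inelasticity described above: a large price increase barely dents demand.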

This price inelasticity of data science resources can only be addressed in a few ways:  train (and really certify) more data scientists or dramatically improve the capabilities and ease-of-use of data science tools.

However, there is another option:  train your business users to "think like a data scientist."  The key to this process is training your business users to embrace the power of "might" in collaborating with the data science team to identify those variables and metrics that might be better predictors of performance.  We have now seen across a number of projects how coupling the creative thinking of the business users with the data scientists can yield dramatically better predictions (see forthcoming blog:  "Data Science: Identifying Variables That Might Be Better Predictors").

The "Thinking Like A Data Scientist" process will uncover a wealth of new data sources that might yield better predictors of performance.  It is then up to the data science team to employ their different data transformation, data enrichment and analytic algorithms to determine which variables and metrics are better predictors of performance.

Capital
Capital is already-produced durable goods and assets, or any non-financial asset that is used in the production of goods or services. Capital is one of the three factors of production, the others being land and labor.

Adam Smith defined capital as "that part of a man's stock which he expects to afford him revenue".  I like Adam Smith's definition because the ultimate economic goal of data and analytics is to "afford organizations revenue."  And while it may be possible to generate that revenue through the sale of data and analytics, for most organizations data and analytics as capital get converted into revenue in four ways:

  • Driving the on-going optimization of key business processes (e.g., reducing fraud by 3% annually, increasing customer retention 2.5% annually)
  • Reducing exposure to risk through the management of security, compliance, regulations, and governance, avoiding security breaches, litigation, fines, and theft, and thereby building customer trust and loyalty while ensuring business continuity and availability
  • Uncovering new revenue opportunities through superior customer, product and operational insights that can identify unmet customer, partner and market needs
  • Delivering a more compelling, more prescriptive customer experience that both increases customer satisfaction and advocacy, but also increases the organization's success in recommending new products and services to the highest qualified, highest potential customers and prospects

Probably the most important economic impact on data and analytics is the role of human capital.  Economists regard expenditures on education, training, and medical care as investments in human capital. They are called human capital because people cannot be separated from their knowledge, skills, health, or values in the way they can be separated from their financial and physical assets.  These human investments can raise earnings, improve health, or add to a person's good habits over one's lifetime.  But maybe more importantly, an organization's human capital can be transformed to "think differently" about the application of data and analytics to power the organization's business models.

Summary
As my friend Jeff Abbott said after reviewing this blog: "What did I do wrong to have to review this blog?"

While the economic concepts discussed in this blog likely do not apply to your day-to-day jobs, more and more I expect that the big data (data and analytics) conversation will center on basic economic concepts as organizations seek to ascertain the economic value of their data and analytics. Data and analytics exhibit unusual behaviors from an asset and currency perspective, and applying economic concepts to these behaviors may help organizations as they seek to prioritize and optimize their data and analytic investments.

So, sorry for bringing back bad college memories about your economics classes, but hey, no one said that big data was going to be only fun!

Sources:

http://www.econlib.org/library/Topics/HighSchool/KeyConcepts.html
http://www.econlib.org/library/Topics/HighSchool/Scarcity.html
http://www.economicsdiscussion.net/keynesian-economics/keynes-theory/keynes-theory-of-investment-multiplier-with-diagram/10363
http://www.tutor2u.net/economics/reference/multiplier-effect
http://www.investopedia.com/university/economics/economics3.asp
http://www.econlib.org/library/Topics/HighSchool/ElasticityofDemand.html
http://www.econlib.org/library/Topics/HighSchool/HumanCapital.html
[1] http://www.dictionary.com/browse/economics
[2] http://www.investopedia.com/terms/e/economics.asp?lgl=no-infinite

The post Data and Economics 101 appeared first on InFocus Blog | Dell EMC Services.

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Dell EMC’s Big Data Practice.

As a CTO within Dell EMC’s 2,000+ person consulting organization, he works with organizations to identify where and how to start their big data journeys. He’s written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow where he teaches the “Big Data MBA” course. Bill also just completed a research paper on “Determining The Economic Value of Data”. Onalytica recently ranked Bill as #4 Big Data Influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Masters Business Administration from University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.


