Natural Language Processing: Bridging the Gap Between Big Data and Big Information

Apophenia is the propensity to see patterns in random data. We encounter it all the time in the real world: gamblers who see patterns in how the cards are being dealt, investors who imagine patterns in the movement of certain stocks, or basketball fans who believe that their favorite player has the “hot hand.” But apophenia has no place in the world of data science, especially when data science is trying to help us make better decisions about critical things such as the quality of healthcare, the allocation of police resources, the safe operation of our airplanes, or the investment decisions that determine our retirement readiness.

Understanding the difference between epiphany (a sudden, intuitive perception of or insight into reality) and apophenia (the perception of or belief in connectedness among unrelated phenomena) is critical as data scientists build analytic models to quantify cause and effect. Regression modeling is a great tool for quantifying cause and effect, but one still needs to apply industry insight, and sometimes just plain common sense, to make sure we are not trying to quantify “spurious relationships.” In the world of data science, common sense cannot be automated away. I think Craig Wilkey nails this distinction in the blog post he wrote:

Apophenia is the propensity to see patterns in random data. The term was coined in 1958 by Klaus Conrad – a German neurologist and psychiatrist who, perhaps a little ironically, was attempting to identify early indicators of psychosis.

An apophany (an instance of apophenia) can best be defined in contrast to an epiphany. An epiphany is a moment of sudden and striking realization that leads a person to a greater degree of clarity in the nature of reality – a discovery of a truism, often hidden in plain sight. An apophany is having the experience of an epiphany, but you’re just plain wrong.

We’ve all heard some version of the old adage that correlation does not imply causation. It can be clearly demonstrated that in neighborhoods where there is an increase in ice cream consumption, there is a roughly equivalent spike in aggravated assault incidents. We’d be foolish to assume that eating ice cream makes people irrationally violent, but that doesn’t mean there’s nothing valuable to learn from this. When we broaden the lens a bit and include other variables, the connections become clearer.

In overpopulated urban environments, where there is a greater concentration of disenfranchised people – people who are statistically more likely to commit poverty crimes, and statistically less likely to have air-conditioned homes – heat waves usher in higher levels of frustration, lower levels of tolerance, and more people eating ice cream. We can also sharpen the focus by throwing in aggravation over public transit failures, brown-outs and black-outs, lower productivity, and countless other factors.

So, while enjoying tasty dairy products does not necessarily incite violence, the correlation between ice cream consumption and violence is not meaningless. Ice cream consumption analysis may indeed provide value as a leading indicator, or bellwether, of the potential for violent acts trending upward in a given community. If not a bellwether, it certainly is a valid correlation – as opposed to a simple coincidence.
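
To make the ice cream example concrete, here is a small simulation sketch (invented numbers, not real crime data): summer heat drives both ice cream sales and assault counts, so the two series correlate strongly even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 365

# Hypothetical confounder: daily temperature drives both series.
temperature = rng.normal(70, 15, n_days)                     # degrees F
ice_cream   = 2.0 * temperature + rng.normal(0, 10, n_days)  # cones sold
assaults    = 0.5 * temperature + rng.normal(0, 5, n_days)   # incidents

# Strong positive correlation, despite no causal link between the two.
r = np.corrcoef(ice_cream, assaults)[0, 1]
print(f"correlation(ice cream, assaults) = {r:.2f}")
```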

The purpose of regression analysis is to identify those variables (referred to as independent variables) that reveal valid correlations with the phenomenon one is attempting to predict (the dependent variable).
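
Continuing the simulation above, a regression that includes both candidate independent variables shows which one actually carries the signal: once temperature is in the model, the ice cream coefficient collapses toward zero. (All data and variable names here are simulated and purely illustrative.)

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_days = 365
temperature = rng.normal(70, 15, n_days)
ice_cream   = 2.0 * temperature + rng.normal(0, 10, n_days)
assaults    = 0.5 * temperature + rng.normal(0, 5, n_days)   # dependent variable

# Ice cream alone looks predictive...
solo = LinearRegression().fit(ice_cream.reshape(-1, 1), assaults)
print("ice cream coefficient (alone):", round(solo.coef_[0], 3))

# ...but adding the real driver (temperature) exposes it as a stand-in.
both = LinearRegression().fit(np.column_stack([ice_cream, temperature]), assaults)
print("ice cream coefficient (with temperature):", round(both.coef_[0], 3))
print("temperature coefficient:", round(both.coef_[1], 3))
```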

Regression analysis is a tricky beast to harness. When the whole point is to find hidden correlations that may even defy intuitive understanding, it can be tempting to throw in the entire kitchen sink and see what comes out. The greatest risk in that approach comes from patterns that happen to align but are nevertheless invalid. These coincidences are referred to as ‘spurious relationships’.

If the patterns of some spurious relationship happen to align with the patterns of other independent variables in a regression analysis model, the accuracy of the model will suffer, sometimes dramatically.
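
A quick way to see the damage (again, just a sketch on random data): pad a regression with columns of pure noise and the in-sample fit keeps “improving” while performance on held-out data falls apart. Held-out validation is the cheapest insurance against an apophenic model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
signal = rng.normal(size=n)
y = 3.0 * signal + rng.normal(size=n)            # one genuine relationship

for n_noise in (0, 50, 140):
    noise = rng.normal(size=(n, n_noise))        # spurious candidate predictors
    X = np.column_stack([signal, noise])         # 1 real predictor + noise
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LinearRegression().fit(X_train, y_train)
    print(f"{n_noise:3d} noise columns | "
          f"train R^2 = {model.score(X_train, y_train):.2f} | "
          f"test R^2 = {model.score(X_test, y_test):.2f}")
```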

It would be foolish to place any faith in all those quirky coincidences we always hear about with sports teams, for example. There is no reasonably conceivable way the first initial of the middle name of the first child born in some small town after the start of a sports season could predict a team’s playoff standing – but I’d be genuinely surprised if there weren’t some spurious relationship to be found there.

On the other hand, we do have a valid argument for getting rid of the dramatic orchestra strike that foreshadows violent crime in movies, and replacing it with the sound of an ice cream truck.

How do we strike the balance between the desire to uncover hidden variables that provide valuable insight into trends, and the fear of creating an apophenic, potentially psychotic, regression analysis model?

My nearly two and a half decades of experience in IT have led me to the conclusion that the field suffers from rampant apopheniphobia: The irrational fear of finding ostensibly meaningful patterns in random data. (Yes, I did just make that word up. © Craig Wilkey, 2016)

Almost invariably, we simply do not push far enough. Should stock market analysis include things like weather patterns, celebrity news stories and grade school holidays? Absolutely! Classical stock market analysis techniques don’t work as well as they used to. Why? Frankly, we have a greater number of ignorant people playing the market. The proliferation of ‘Day Traders’ has crippled the old market truisms, because so many people who are affecting the market dynamics don’t have any classical training. The things that affect the moods and daily lives of ‘normal people’ need to be considered, because ‘normal people’ are far more active in the markets than they used to be. If they don’t play by the rules, then some of those rules simply cease to apply.

Apopheniphobia is fueled by fears of falling prey to spurious relationships. Who wants to be known as the person who unleashed a dangerously psychotic algorithm into the world? People think about the many statistical oddities they’ve come across, and it stunts their creative growth. For example, did you know there is a direct correlation between the per capita consumption of margarine and the divorce rate in Maine? Cheese consumption is far more dangerous than margarine consumption – it correlates with the number of people who die by becoming tangled in their bed sheets. (And you thought lactose intolerance was bad?) The number of people who drowned by falling into a pool also correlates with the number of films Nicolas Cage appeared in from 1999 through 2009. (Source)
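
Those oddities are easy to manufacture, by the way: comb through enough unrelated series and a few will line up by chance. Here is a sketch using nothing but random noise; with roughly twenty thousand pairings of short series, a near-perfect “relationship” is very likely to turn up.

```python
import numpy as np

rng = np.random.default_rng(7)

# 200 completely unrelated "annual" series, each 10 years long (think margarine,
# divorces, drownings, film counts). All of them are pure noise.
series = rng.normal(size=(200, 10))

corr = np.corrcoef(series)                  # 200 x 200 pairwise correlations
np.fill_diagonal(corr, 0.0)                 # ignore each series vs. itself
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"strongest of {200 * 199 // 2} pairings: series {i} vs {j}, r = {corr[i, j]:.2f}")
```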

In IT, we have a tendency to drive toward ‘proving’ clear, unambiguous relationships that quantify efforts, justify means and, more often than not, clearly align to our own preconceived notions. We want to be able to show clear lines of progression and indisputably direct relationships – we tend to believe anything less will not be trusted by those who hold the purse strings.

Our hyper-rational modes of thinking have a tendency to overshadow our creative imaginations – which, almost inevitably, leads to hampered understanding. Perhaps the greatest value of regression analysis is that it allows us to challenge our preconceived notions and learn something new. The greatest challenge with it is rarely throwing too much data at our models – it’s not having enough.

Yes, I know… We’re IT. We’re awash with data. We’re swimming in lakes of data and constantly inhaling the fumes of endless data exhaust. What we’re missing is the meaningful data extracted from unstructured information sources – in other words, the extraordinarily valuable information that’s locked away in language that has historically been inaccessible to machines – human language.

Estimates have been telling us for a decade or more that 80% of all information in a given organization is in the form of unstructured, human-readable text. Nowhere does that ring more true, or matter more, than in trying to understand customer experience. I’d also argue that the majority of the most important service information lives within that 80%.

Customer Experience Personalization absolutely depends on translating that human-readable text into machine-actionable data. When it comes to deriving actionable insights from our customer interactions, we must extract as much understanding from that unstructured text as possible and add it all to the other data in our regression models. Apopheniphobia be damned!

While it’s, admittedly, an oversimplification, it’s convenient to talk about two general approaches to extracting data from text. Text Analytics/Mining breaks the textual input into digestible chunks of string variables and uses statistical modeling techniques to find patterns in those variables. The ideal of Natural Language Processing is to develop a translation engine between human language and machine language. It uses some of the same statistical modeling approaches as Text Analytics, but goes much further, applying semantic and syntactic analysis to extract meaning, intention, sentiment and key concepts (among other things) from the text.
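
For a taste of what “going further than counting strings” looks like, here is a minimal sketch using spaCy (assuming the library and its small English model, en_core_web_sm, are installed; the customer comment is invented). It pulls entities and noun chunks out of free text, pieces that can later be turned into model features.

```python
# Minimal NLP sketch (assumes: pip install spacy, then
# python -m spacy download en_core_web_sm). The comment text is invented.
import spacy

nlp = spacy.load("en_core_web_sm")
comment = ("The replacement router Dell shipped on March 3rd fixed my outage, "
           "but the first support call was a frustrating waste of two hours.")
doc = nlp(comment)

# Named entities: organizations, dates, durations, etc.
for ent in doc.ents:
    print(ent.text, "->", ent.label_)

# Noun chunks hint at the key concepts being discussed.
print([chunk.text for chunk in doc.noun_chunks])
```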

Our best opportunity to achieve our vision of industry-leading Customer Experience Personalization is to take advantage of Natural Language Processing, and even that barely scratches the surface of what’s possible. Natural Language Processing will enable us to step aggressively toward extracting real meaning from the vast amount of otherwise machine-invisible, extraordinarily valuable content we have. Using that extracted meaning, in conjunction with our structured data points, will allow us to build truly valuable regression analysis models and understand our customers like never before. Keep pushing until the model breaks, then dial it back a skosh. That is the path to progress.
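
Here is one hedged sketch of what “extracted meaning plus structured data points in one regression model” can look like with scikit-learn; every column name, row and satisfaction score below is invented for illustration. In practice the simple TF-IDF step would be replaced or augmented with richer NLP-derived features such as sentiment, intent and key concepts.

```python
# Sketch: unstructured comments and structured fields feeding one regression.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import Pipeline

interactions = pd.DataFrame({
    "comment": [
        "agent resolved my billing issue quickly, very happy",
        "third outage this month, still waiting on a callback",
        "upgrade went smoothly but the invoice was confusing",
        "support line dropped my call twice, extremely frustrated",
    ],
    "tenure_months": [26, 7, 41, 13],
    "open_tickets": [0, 3, 1, 2],
    "satisfaction": [9, 2, 6, 1],            # dependent variable (invented)
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "comment"),              # unstructured -> numeric
    ("structured", "passthrough", ["tenure_months", "open_tickets"]),
])

model = Pipeline([("features", features), ("regressor", Ridge())])
model.fit(interactions.drop(columns="satisfaction"), interactions["satisfaction"])
print(model.predict(interactions.drop(columns="satisfaction")).round(1))
```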

Apopheniphobia is the enemy of personalization and Customer Relationship Management. This is why I’ve decided to launch the Apopheniphobia Awareness Campaign. Please spread the word! I need to come up with a design for the lapel pin… Maybe a ribbon with as many digits of pi as I can squeeze onto it – with all the prime digits bolded? Maybe we should schedule a charity walk… We can follow the streets in alphabetical order.

The post Natural Language Processing: Bridging the Gap Between Big Data and Big Information appeared first on InFocus Blog | Dell EMC Services.


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
