Citizen Data Scientist, Jumbo Shrimp, and Other Descriptions That Make No Sense

Okay, let me get this out there: I find the term “Citizen Data Scientist” confusing. Gartner defines a “citizen data scientist” as “a person who creates or generates models that leverage predictive or prescriptive analytics but whose primary job function is outside of the field of statistics and analytics.”

While we teach business users to “think like a data scientist” in their ability to identify those variables and metrics that might be better predictors of performance, I do not expect that the business stakeholders are going to be able to create and generate analytic models. I do not believe, nor do I expect, that the business stakeholders are going to be proficient enough with tools like SAS or R or Python or Mahout or MADlib to 1) create or generate the models, and then 2) interpret the t-tests, F-scores, p-values and residuals necessary to ascertain the analytic model’s goodness of fit.

No one would say “Citizen Lawyer” or “Citizen Nuclear Physicist” or “Citizen Physician.” I guess a “Citizen Physician” would be someone who “practices medicine but whose primary job function is outside of the field of medicine (meaning that they’ve had no training in medicine or medical procedures).” They call those people quacks (not quants…he-he-he).

WebMD doesn’t make someone a doctor any more than analytics makes someone a data scientist. Analysis of the analytic results and insights is an important step in the process, particularly when the results contradict each other. Data scientists provide the necessary experience about the different analytic techniques and algorithms required to decipher the results, validate the results and then turn the results into actions or recommendations.

What’s wrong with the definition is that it doesn’t properly acknowledge the deep training in analytic disciplines such as machine learning, cognitive computing, data mining, computer programming, and applied mathematics. It also dismisses the critical importance of gaining hands-on, data science experience through years of apprenticeships and tutelage under the guidance of master data scientists.

In order to understand the importance of the role of the data scientist, I solicited the help of the best data scientist that I know: Wei Lin. Wei and I have done numerous big data projects together, and every time I engage with Wei, I learn tons. So naturally, I’d call upon a true master data scientist to help me write this blog.

Data Scientist Capabilities Are a Good Starting Point…
The data scientist discussion starts with an understanding of the types of tasks at which a data scientist must become proficient. Below is a summary of these tasks. I think you can quickly see that an effective data scientist requires a wide and deep range of capabilities, including:

  • Data acquisition. The data scientist is going to pull data from a wide variety of sources in a wide variety of formats. Some of the data will be accessible as tables using SQL. However, much of the data will be in log files and will be extracted using tools such as R and Python to grab the raw log files. Some of the data will be pulled from websites, in which case one can either use the provided APIs (if there are APIs) or screen scrape the data. A wide variety of expertise across a wide variety of tools is required to acquire the data from whatever the source may be – structured (tables, csv), semi-structured (log files) and unstructured (text files, documents, images, video files).
  • Data preparation. The data scientist needs to go through a process of cleaning up the data (especially if screen scraping was used), normalizing, aligning, and enriching the data (adding new variables such as frequency, recency, monetary and indices). There is a common sense component required during this process to ensure that one is aligning like levels of granularity and comparing like entities. Tools used here include SQL, R, Python and Java (a minimal enrichment sketch appears after this list).
  • Data exploration/data visualization. The data scientist then starts to explore the data, looking for outliers and visual correlations in the data. This is where an inquisitive mind is useful as the data scientist digs deeper and deeper into the granular data and looks for opportunities to link other data sources. Missing values may be discovered in the exploration phase, in which case the data scientist needs to decide how to handle them (see the second sketch after this list). Tools used here include Tableau, Spotfire and R (ggplot2).
  • Model development. This is where the data scientist starts to quantify cause and effect by actually building predictive models. Quantifying correlation coefficients, statistical errors and residuals using tools such as SAS, R, Python, MADlib, and Mahout is required to ascertain whether the model being built is actually more predictive.
  • Model validation. The data scientist then needs to determine the model’s “goodness of fit” using measures such as F-tests, t-tests and p-values. The tools that were used to build the model (SAS, R, Python, MADlib, Mahout) provide the goodness of fit metrics (see the regression sketch after this list).
  • Results visualization. Once the data scientist has a model that they are confident is “good enough” given the problem they are trying to address (see my blog “Understanding Type I and Type II Errors”), the data scientist needs to use many of the same data visualization tools (Tableau, Spotfire, ggplot2) to determine the optimal way to present the results so that the users can understand and act on them.
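
To make the data preparation and enrichment step concrete, here is a minimal Python/pandas sketch (my illustration, not a prescribed workflow) that derives the frequency, recency and monetary variables mentioned above from a hypothetical transactions table:

```python
# Illustrative sketch only: derive recency/frequency/monetary (RFM) features
# from a hypothetical transactions table during data preparation/enrichment.
import pandas as pd

# Hypothetical raw transaction log pulled during data acquisition.
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2017-01-05", "2017-03-20", "2017-02-11",
         "2017-02-28", "2017-03-30", "2017-01-15"]),
    "amount": [120.0, 80.0, 35.0, 60.0, 45.0, 300.0],
})

snapshot = transactions["order_date"].max()

rfm = (transactions
       .groupby("customer_id")
       .agg(recency_days=("order_date", lambda d: (snapshot - d.max()).days),
            frequency=("order_date", "count"),
            monetary=("amount", "sum"))
       .reset_index())

print(rfm)
```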
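
The exploration step can be sketched the same way. The snippet below (again illustrative, with made-up data) profiles a small table, counts missing values, imputes them with a median as one possible choice, and applies a crude standard-deviation screen for outliers:

```python
# Illustrative exploration sketch: profile the data, handle missing values,
# and flag potential outliers. The data and the choices are hypothetical.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "revenue": [120.0, 80.0, np.nan, 60.0, 4500.0, 45.0],
    "visits": [3, 2, 5, np.nan, 4, 1],
})

print(df.describe())            # quick look at the distributions
print(df.isna().sum())          # count of missing values per column

# One common (not the only) choice: impute missing values with the median.
df_filled = df.fillna(df.median(numeric_only=True))

# Crude outlier screen: values more than 2 standard deviations from the mean.
z = (df_filled - df_filled.mean()) / df_filled.std()
print(df_filled[(z.abs() > 2).any(axis=1)])
```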
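
And for model development and validation, a simple regression fit with statsmodels shows where the t-tests, p-values, F-statistic and residuals referenced above come from (synthetic data, purely for illustration):

```python
# Illustrative model development/validation sketch: fit a regression and read
# off the goodness-of-fit measures the post mentions. Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))               # two candidate predictor variables
y = 3.0 * X[:, 0] + rng.normal(size=200)    # only the first one truly matters

model = sm.OLS(y, sm.add_constant(X)).fit()

print(model.summary())                # coefficient t-tests, p-values, R-squared
print(model.fvalue, model.f_pvalue)   # overall F-test of the model
print(model.resid[:5])                # residuals, useful for diagnostic plots
```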

But The Key Is the Experience
Understanding algorithms is different from deciphering the results and translating that knowledge into business actions or client treatments. Going back to our WebMD example, a person who reads WebMD will have challenges trying to match their symptoms to the wide variety of potential diseases and illnesses (except for the easy, more frequent illnesses), and to properly prescribe the “right” mix of medications, treatments and therapy.

A data scientist often frames a question in terms of its business value and data context, which makes the question more tractable. Those questions can operate at several different levels, so rather than asking everything at once, the question itself can be broken down into smaller business questions. There are also methods to further reduce complexity, such as dimensionality reduction, variable decomposition or principal component analysis.
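
As one illustration of those complexity-reduction techniques, here is a minimal principal component analysis sketch using scikit-learn on synthetic data (my example, not part of Wei’s write-up):

```python
# Illustrative sketch of dimensionality reduction via PCA on synthetic data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))    # 500 observations, 10 raw variables

pca = PCA(n_components=3)         # keep the 3 strongest components
X_reduced = pca.fit_transform(X)

# How much of the original variance each retained component explains.
print(pca.explained_variance_ratio_)
print(X_reduced.shape)            # (500, 3)
```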

There are many analytic algorithms and modeling options, and choosing the proper algorithm can be a challenge. One alternative is to run a large number of algorithms as a search, but then a large number of results will need to be analyzed.
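
A minimal sketch of that “run many algorithms and compare” approach, using scikit-learn cross-validation on a built-in dataset (the specific candidate models are my illustrative choices):

```python
# Illustrative sketch: evaluate several candidate algorithms with
# cross-validation; each one produces results that still need analysis.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

for name, estimator in candidates.items():
    scores = cross_val_score(estimator, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```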

Interpreting results is a complex task. When a large number of algorithms are run, the results tend to partially converge and partially conflict. Resolving the conflicts and weighting the variables requires further modeling or an ensemble.
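
One common way to reconcile partially conflicting models is an ensemble; the sketch below (illustrative, scikit-learn on synthetic data) averages two models’ predicted probabilities rather than picking a single winner by hand:

```python
# Illustrative sketch: a soft-voting ensemble that weights two models'
# predicted probabilities together instead of resolving conflicts manually.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    voting="soft",   # average predicted probabilities instead of hard votes
)

print(cross_val_score(ensemble, X, y, cv=5).mean())
```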

Data Science Requires More Than Smart
But it isn’t just the analytics capabilities, skills, training, apprenticeship and hands-on experience that make an outstanding data scientist. Our best data scientists also exhibit an outstanding “bedside manner,” or humility. They understand the power of humility that immediately puts others at ease, allowing for a more open and more inclusive conversation. To me, this is the real key to being an effective data scientist, where I define “effective” to mean “comes up with reasonable recommendations that the users can understand and take action on.” The best data scientists quickly learn that in order to deliver outstanding outcomes, they need to be able to engage, listen and learn from others of all types.

But I could argue that humility is the key to success no matter your profession. Whether becoming a physician, or a nuclear physicist, or a lawyer or a barista or a teacher/coach, humility is imperative for continued growth and mastery of your craft.

As we like to say during our Big Data Vision Workshop engagements, all ideas are worthy of consideration, because the minute you think you know all the answers is the moment you are no longer relevant to the conversation.

To quote “The Lego Movie”:

“A special ‘Master Builder’ will defeat Lord Business and become the greatest ‘Master Builder’ of all. The key to true master building is to believe in yourself and follow your own set of instructions inside your head.”

Sounds like a Master Data Scientist to me (especially when said in Morgan Freeman’s voice)!

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business,” is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
