Connecting Big Data Project Management with Enterprise Data Strategy By @DDMcD | @BigDataExpo #BigData

Making the data analysis process effective and efficient is where good project planning and management come in

The Tip of the Spear II: Connecting Big Data Project Management with Enterprise Data Strategy
By Dennis D. McDonald

“If data analysis is Big Data’s ‘tip of the spear’ when it comes to delivering data-dependent value to customers or clients, we also must address how that spear is shaped, sharpened, aimed, and thrown – and, of course, whether or not it hits its intended target.”

Introduction
In Meeting the Mission of Transportation Safety, Richard McKinney, the U.S. Department of Transportation's CIO, describes four components of what I call an “enterprise data strategy”:

  1. Data governance
  2. Data sharing
  3. Data standards
  4. Data analysis

He also mentions three additional factors relevant to DOT’s data strategy:

  1. The volume of data is increasing and we need to be ready for it.
  2. Managing data is not the same as analyzing it.
  3. We need to be thinking now about what type of analysis we need to be doing and what resources will be needed to do the analysis.


Based on the 20+ personal, telephone, and email interviews I’ve conducted so far[2] as part of my big data project management research, I would add a fourth item to McKinney's list:

  4. We need to spend at least as much time planning and managing the people and business processes that make data analysis possible as we do on the analysis process itself and the technologies that support it.

Tip of the Spear
If data analysis is Big Data’s “tip of the spear” when it comes to delivering data-dependent value to customers or clients, we also must address how that spear is shaped, sharpened, aimed, and thrown – and, of course, whether or not it hits its intended target.

We also want the processes associated with throwing that spear to be both effective and efficient.

Making the data analysis process – the tip of the Big Data spear – effective and efficient is where good project planning and management come in. Challenges to doing this in connection with data-intensive projects are identifiable and include:

  1. Siloes. Data are often generated and managed in system- or mission-specific siloes. As a result, creating and implementing an effective enterprise-level data strategy that rises above and encompasses multiple programs, systems, and/or missions requires more than good project management; it demands a mix of technical, organizational, and political skills in addition to data analysis skills.
  2. Sharing. Making data accessible and useful often means that data need to be shared with systems and processes outside the control of those who "own" the data to be analyzed. Two key steps in sharing data are that (a) data need to be identified and inventoried, and (b) technical and business ownership of the inventoried data must be determined. In many organizations this inventorying is easier said than done and may require both manual and automated approaches to creating the necessary inventories.
  3. Standards. Efficient and sustainable analysis of data and metadata may require development or implementation of data standards. Existence and use of such standards differs by industry, data type, and system. The costs for developing and adopting standards to facilitate data sharing and analysis will also vary and may have cost and schedule implications at the project, program, enterprise, and industry or community levels.
  4. Delivering value. Modern data analysis tools and techniques provide mechanisms to identify patterns and trends in the increasing volumes of data generated by a steadily widening variety of data capture mechanisms. The difficulty of predicting what will be found when data are analyzed places a premium on making sure we are asking the right questions. This in turn affects our ability to justify project expenditures in advance.
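The inventory-and-ownership step in the "Sharing" challenge above can be made concrete with a small sketch. The following Python fragment is a hypothetical illustration – the record fields, system and owner names, and the MMUCC standard tag are assumptions for the example, not details from the post – of how an automated inventory might flag datasets whose technical or business ownership is still undetermined:

```python
from dataclasses import dataclass

@dataclass
class DatasetRecord:
    """One entry in a hypothetical enterprise data inventory."""
    name: str
    system: str                    # the silo where the data originates
    technical_owner: str
    business_owner: str
    schema_standard: str = "none"  # named data standard, if any
    shareable: bool = False

def unowned(inventory):
    """Return names of datasets whose ownership is undetermined.

    Until both a technical and a business owner are identified,
    the dataset cannot responsibly be shared or analyzed.
    """
    return [d.name for d in inventory
            if not d.technical_owner or not d.business_owner]

# Illustrative inventory: one well-governed dataset, one with a gap.
inventory = [
    DatasetRecord("crash_reports", "safety_db", "j.smith", "ops_division",
                  schema_standard="MMUCC", shareable=True),
    DatasetRecord("fleet_telemetry", "iot_platform", "", "fleet_office"),
]

print(unowned(inventory))  # datasets needing an ownership decision
```

Even a toy model like this makes the governance gap visible: the list it prints is, in effect, a work queue for the organizational (not technical) task of assigning owners before sharing can proceed.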

Portfolio Management
Responding to the above challenges requires not only project management skills but also a project planning process that takes into consideration alignment with an organization’s goals and objectives.

As one of my interviewees suggested, the challenge faced in complex “big data” projects has just as much – if not more – to do with overall strategy and “portfolio management” as with how individual projects are planned and managed. Effectively designing and governing a portfolio of projects and processes requires not only an understanding of how the portfolio supports (relates to, is aligned with, interacts with) the organization’s objectives; it should also incorporate a rational process for defining project requirements and then governing how the organization’s resources are managed and applied.

Given how pervasive and fundamental data are to an organization’s operation, skill in data science and analytics is a necessary element, but in many cases such skill alone will not guarantee success. Technical and analytical skills must be accompanied by effective planning, oversight, and management to ensure that the data analysis “spear” is being thrown in the right direction.

Delivering Value Quickly
Ideally a portfolio of projects will support an organization’s strategic plan and the goals or missions the organization is charged with pursuing. We may also need to “get tactical” by delivering value to the customer or client as quickly as possible, perhaps by focusing on better-controlled and better-understood product-centric data early on via a “data lake” approach.

Doing so will be good for the customer and will help create a relationship of trust moving forward. Such a relationship will be needed when complications or uncertainties arise and need to be dealt with.

In organizations that are not historically “data centric” or in organizations where management and staff have a low level of data literacy, an early demonstration of value from data analysis is especially important. An agile approach to project management, accompanied by openness, transparency, and collaboration, will help to accomplish this.

Unfortunately, challenges such as those identified above in many cases cannot be addressed effectively in tactically focused short-term projects given the usual pressures of time and budget. Such challenges can be complex or rooted in how the organization has been traditionally structured and managed.

Still, it’s not unusual for a tactically focused “sprint” project, even while delivering an effective model or other deliverable, to uncover the need for a more global (or strategic) approach to managing data, metadata, data security, privacy, or data quality.

Balancing Tactics and Strategy
When focusing on delivery of useful data-related deliverables it always pays to keep two questions in mind:

  1. What needs to be done immediately to make data useful?
  2. What does this tell us about what needs to be done more globally in order to maintain and increase data usefulness?

Attention to enterprise-level data strategy while delivering useful results in the short term has implications beyond what is being attempted in an individual project’s scope. Treating data as an enterprise resource may even require changes to how the enterprise itself is managed. As we all know, it’s not unusual for change to be resisted.

An effective approach balances the management of a portfolio of individual data-intensive “agile” projects with parallel development of an upgraded enterprise-level data strategy. Doing one without the other could have negative consequences, for example:

  1. Focusing only on a narrowly defined data-intensive analytics project may generate immediate value through frequent useful deliverables but may not address underlying technical process issues that impact long-term efficiency and sustainability.
  2. Focusing only on an enterprise data strategy without delivering tactical benefits reduces the likelihood that less data-savvy managers will understand the “big picture” down the road.

As experienced project managers know, concentrating on “quick and dirty” or “low hanging fruit” when under the gun to deliver value to a client can generate short-term benefits. This same approach, however, may actually increase costs over time if strategic data management issues related to data standards or quality are repeatedly kicked “down the road.” Also, delivering a “strategy” without also engaging users in the development of real-world analytical deliverables might mean that strategically important recommendations end up gathering dust on a shelf somewhere.

Communication Strategy
As experienced project managers understand all too well, one of the most important elements in effective project management is communication:

  • Communication among project staff
  • Communication with the client
  • Communication with stakeholders

In a big data or data-intensive project, even when focused on delivering incremental value to the customer through specific, narrowly targeted goals, we want communications about project activities, especially among key stakeholders, to address tactical as well as strategic objectives.

This may require accommodating a variety of communication styles as well as different levels of data and analytical literacy, especially when both business-focused and technology- or analytics-focused staff are involved. But if we follow this balanced approach, we will:

  1. Deliver a useful project.
  2. Develop a trusted relationship with the client.
  3. Build the foundation for a realistic, sustainable enterprise data strategy going forward.

Summary
In summary, how a data-intensive project is planned must take into account both short- and long-term goals. This planning process must be a collaborative one and, even if led by the organization’s IT department – not an unusual situation – it must involve business or operating units right from the start in order to ensure success.

I’ll be turning my attention to this planning process in future posts. If you’re interested in learning more about this process please let me know.

Notes:

[1] Copyright (c) 2015 by Dennis D. McDonald, Ph.D. Dennis is an independent Washington DC area management consultant. His services include preproposal research and analysis, proposal development and costing, marketing and sales support, project and program management, project plan development, requirements analysis, and strategic planning. Reach him by phone at 703-402-7382 or by email at [email protected]. An earlier version of this post was published at http://www.ddmcd.com/spear.html and distributed at the Dec. 8, 2015 ATARC Federal Big Data Summit in Washington, DC.

[2] Thanks are due the following for sharing their thoughts with me: Aldo Bello, Kirk Borne, Clive Boulton, Doug Brockway, Ana Ferreras, Keith Gates, Douglas Glenn, Jennifer Goodwin, Jason Hare, Christina Ho, Randy Howard, Catherine Ives, Ian Kalin, Michael Kaplan, Jim Lola, David McClure, Jim McLennan, Trevor Monroe, Brian Pagels, John Parkinson, Dan Ruggles, Nelson Searles, Sankar Subramanian, and Tom Suder.

