Making the data analysis process effective and efficient is where good project planning and management come in

The Tip of the Spear II: Connecting Big Data Project Management with Enterprise Data Strategy
By Dennis D. McDonald

“If data analysis is Big Data’s ‘tip of the spear’ when it comes to delivering data-dependent value to customers or clients, we also must address how that spear is shaped, sharpened, aimed, and thrown – and, of course, whether or not it hits its intended target.”

Introduction
In Meeting the Mission of Transportation Safety, Richard McKinney, the U.S. Department of Transportation's CIO, describes four components of what I call an “enterprise data strategy”:

  1. Data governance
  2. Data sharing
  3. Data standards
  4. Data analysis

He also mentions additional factors relevant to DOT’s data strategy:

  1. The volume of data is increasing and we need to be ready for it.
  2. Managing data is not the same as analyzing it.
  3. We need to be thinking now about what type of analysis we need to be doing and what resources will be needed to do the analysis.


Based on the 20+ personal, telephone, and email interviews I’ve conducted so far[2] as part of my big data project management research, I would add a fourth item to McKinney's list:

  4. We need to spend at least as much time planning and managing the people and business processes that make data analysis possible as we do on the analysis process itself and the technologies that support it.

Tip of the Spear
If data analysis is Big Data’s “tip of the spear” when it comes to delivering data-dependent value to customers or clients, we also must address how that spear is shaped, sharpened, aimed, and thrown – and, of course, whether or not it hits its intended target.

We also want the processes associated with throwing that spear to be both effective and efficient.

Making the data analysis process – the tip of the Big Data spear – effective and efficient is where good project planning and management come in. The challenges of doing this in data-intensive projects are identifiable and include:

  1. Siloes. Data are often generated and managed in system- or mission-specific siloes. As a result, creating and implementing an effective enterprise-level data strategy that rises above and encompasses multiple programs, systems, and/or missions requires more than good “project management” and data analysis skills; it requires a mix of technical, organizational, and political skills.
  2. Sharing. Making data accessible and useful often means that data need to be shared with systems and processes outside the control of those who “own” the data to be analyzed. Key steps in sharing data are that (a) data need to be identified and inventoried, and (b) technical and business ownership of the inventoried data must be determined. In many organizations this inventorying is easier said than done and may require both manual and automated approaches to creating the necessary inventories.
  3. Standards. Efficient and sustainable analysis of data and metadata may require development or implementation of data standards. Existence and use of such standards differs by industry, data type, and system. The costs for developing and adopting standards to facilitate data sharing and analysis will also vary and may have cost and schedule implications at the project, program, enterprise, and industry or community levels.
  4. Delivering value. Modern data analysis tools and techniques provide mechanisms to identify patterns and trends in the increasing volumes of data generated by a steadily widening variety of data capture mechanisms. The difficulty of predicting what will be found when data are analyzed places a premium on making sure we are asking the right questions. This in turn impacts our ability to justify project expenditures in advance.
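To make the inventory step in item 2 above concrete, here is a minimal sketch of what an enterprise data inventory record might look like. All field names, dataset names, and owners here are hypothetical, invented purely for illustration; a real inventory would be tailored to the organization's systems:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One entry in an enterprise data inventory (illustrative fields only)."""
    name: str             # dataset identifier
    silo: str             # originating system or mission silo
    technical_owner: str  # who maintains the system holding the data
    business_owner: str   # who is accountable for the data's content
    standards: list = field(default_factory=list)  # applicable data standards

# A toy inventory with two hypothetical datasets.
inventory = [
    DatasetRecord("crash_reports", "safety_db", "IT Ops", "Safety Office",
                  standards=["ISO 8601 dates"]),
    DatasetRecord("fleet_telemetry", "telematics", "IT Ops", "Fleet Mgmt"),
]

# Flag datasets with no declared standards -- candidates for the
# standards work described in item 3 above.
needs_standards = [d.name for d in inventory if not d.standards]
print(needs_standards)  # ['fleet_telemetry']
```

Even a simple structure like this forces the two questions the inventorying step raises: who owns each dataset technically and who owns it from a business perspective.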

Portfolio Management
Responding to the above challenges requires not only project management skills but also a project planning process that takes into consideration alignment with an organization’s goals and objectives.

As one of my interviewees suggested, the challenge faced in complex “big data” projects has just as much – if not more – to do with overall strategy and “portfolio management” as with how individual projects are planned and managed. Effectively designing and governing a portfolio of projects and processes requires not only an understanding of how the portfolio supports (relates to, is aligned with, interacts with) the organization’s objectives; it should also incorporate a rational process for defining project requirements and then governing how the organization’s resources are managed and applied.
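One simple way to make that alignment explicit is a weighted scorecard that rates each candidate project against the organization's objectives. The objectives, weights, and ratings below are entirely made up for the sake of the sketch; the point is the mechanism, not the numbers:

```python
# Hypothetical portfolio-alignment scorecard: each project is rated 0-5
# against each organizational objective, and each objective carries a weight.
objectives = {"improve_safety": 0.5, "reduce_cost": 0.3, "data_literacy": 0.2}

projects = {
    "crash_data_analytics": {"improve_safety": 5, "reduce_cost": 2, "data_literacy": 3},
    "hr_dashboard":         {"improve_safety": 0, "reduce_cost": 3, "data_literacy": 4},
}

def alignment_score(ratings, weights):
    """Weighted sum of objective ratings; higher means better alignment."""
    return sum(weights[obj] * ratings.get(obj, 0) for obj in weights)

# Rank the portfolio by how well each project supports the objectives.
ranked = sorted(projects,
                key=lambda p: alignment_score(projects[p], objectives),
                reverse=True)
print(ranked)  # projects ordered by weighted alignment score
```

A scorecard like this will not make the decision for anyone, but it forces the portfolio conversation to name the objectives and defend the weights rather than leave alignment implicit.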

Given how pervasive and fundamental data are to an organization’s operation, skill in data science and analytics is a necessary element but such skill will not be, in many cases, a guarantor of success. Technical and analytical skills must be accompanied by effective planning, oversight, and management in order to ensure that the data analysis “spear” is being thrown in the right direction.

Delivering Value Quickly
Ideally a portfolio of projects will support an organization’s strategic plan and the goals or missions the organization is charged with pursuing. We may also need to “get tactical” by delivering value to the customer or client as quickly as possible, perhaps by focusing on better-controlled and better-understood product-centric data early on via a “data lake” approach.
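For readers unfamiliar with the “data lake” approach mentioned above, the core idea is to land raw data as-is, record lightweight metadata about it, and defer schema and standards decisions until analysis time. The sketch below is one minimal illustration of that pattern; the paths, field names, and `ingest` function are invented for this example and are not part of any particular product:

```python
import json
import shutil
from pathlib import Path

# Minimal "data lake" sketch: copy raw files into a landing area untouched
# and append a lightweight catalog entry for each one.
lake = Path("lake/raw")
lake.mkdir(parents=True, exist_ok=True)

def ingest(src: Path, source_system: str) -> dict:
    """Copy a raw file into the lake and return its catalog entry."""
    dest = lake / src.name
    shutil.copy(src, dest)
    entry = {"path": str(dest), "source": source_system,
             "bytes": dest.stat().st_size}
    with (lake / "catalog.jsonl").open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Starting with well-understood product-centric data in a structure this simple lets a team show analytical value early, while the catalog grows into raw material for the inventorying and ownership questions discussed earlier.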

Doing so will be good for the customer and will help create a relationship of trust moving forward. Such a relationship will be needed when complications or uncertainties arise and need to be dealt with.

In organizations that are not historically “data centric” or in organizations where management and staff have a low level of data literacy, an early demonstration of value from data analysis is especially important. An agile approach to project management, accompanied by openness, transparency, and collaboration, will help to accomplish this.

Unfortunately, given the usual pressures of time and budget, challenges such as those identified above often cannot be addressed effectively in tactically focused short-term projects. Such challenges can be complex or rooted in how the organization has been traditionally structured and managed.

Still, it’s not unusual for a tactically-focused “sprint” project, even while delivering an effective model or other deliverable, to uncover the need for a more global (or strategic) approach to managing data, metadata, data security, privacy, or data quality.

Balancing Tactics and Strategy
When focusing on delivery of useful data-related deliverables it always pays to keep two questions in mind:

  1. What needs to be done immediately to make data useful?
  2. What does this tell us about what needs to be done more globally in order to maintain and increase data usefulness?

Attention to enterprise-level data strategy while delivering useful results in the short term has implications beyond what is being attempted in an individual project’s scope. Treating data as an enterprise resource may even require changes to how the enterprise itself is managed. As we all know, it’s not unusual for change to be resisted.

An effective approach balances the management of a portfolio of individual data-intensive “agile” projects with parallel development of an upgraded enterprise-level data strategy. Doing one without the other could have negative consequences, for example:

  1. Focusing only on a narrowly defined data intensive analytics project by itself may generate immediate value through frequent useful deliverables but may not address underlying technical process issues that impact long-term efficiency and sustainability.
  2. Focusing only on an enterprise data strategy without delivering tactical benefits reduces the likelihood that less data-savvy managers will understand the “big picture” down the road.

As experienced project managers know, concentrating on “quick and dirty” or “low hanging fruit” when under the gun to deliver value to a client can generate short-term benefits. This same approach, however, may actually increase costs over time if strategic data management issues related to data standards or quality are repeatedly kicked “down the road.” Also, delivering a “strategy” without also engaging users in the development of real-world analytical deliverables might mean that strategically important recommendations end up gathering dust on a shelf somewhere.

Communication Strategy
As experienced project managers understand all too well, one of the most important elements in effective project management is communication:

  • Communication among project staff
  • Communication with the client
  • Communication with stakeholders

In the case of a big data or data-intensive project, even when focused on delivering incremental value to the customer through specific, narrowly targeted goals, we want communications about project activities, especially among key stakeholders, to address both tactical and strategic objectives.

This may require accommodating a variety of communication styles as well as different levels of data and analytical literacy, especially when both business-focused and technology- or analytics-focused staff are involved. But if we follow this balanced approach we will:

  1. Deliver a useful project.
  2. Develop a trusted relationship with the client.
  3. Build the foundation for a realistic sustainable enterprise data strategy going forward.

Summary
In summary, how a data-intensive project is planned must take into account both short- and long-term goals. This planning process must be a collaborative one and, even if led by the organization’s IT department – not an unusual situation – it must involve business or operating units right from the start in order to ensure success.

I’ll be turning my attention to this planning process in future posts. If you’re interested in learning more about this process please let me know.

[1] Copyright (c) 2015 by Dennis D. McDonald, Ph.D. Dennis is an independent Washington DC area management consultant. His services include preproposal research and analysis, proposal development and costing, marketing and sales support, project and program management, project plan development, requirements analysis, and strategic planning. Reach him by phone at 703-402-7382 or by email at [email protected]. An earlier version of this post was published at http://www.ddmcd.com/spear.html and distributed at the Dec. 8, 2015 ATARC Federal Big Data Summit in Washington, DC.

[2] Thanks are due the following for sharing their thoughts with me: Aldo Bello, Kirk Borne, Clive Boulton, Doug Brockway, Ana Ferreras, Keith Gates, Douglas Glenn, Jennifer Goodwin, Jason Hare, Christina Ho, Randy Howard, Catherine Ives, Ian Kalin, Michael Kaplan, Jim Lola, David McClure, Jim McLennan, Trevor Monroe, Brian Pagels, John Parkinson, Dan Ruggles, Nelson Searles, Sankar Subramanian, and Tom Suder.
