SDLC Tool Integration Requires Domain Understanding | @CloudExpo #Cloud

These impedance mismatch examples require an understanding of the underlying business problem specific to software delivery

Consider the following integration scenarios: Moving medical records between EMR systems; financial information between banking systems; HR information between ERP systems; and software development information between SDLC tools.

At first glance the approaches required for these integrations may seem the same. But look slightly deeper and you will realize that this can't be the case, because of impedance mismatch. I'm defining impedance mismatch as the friction that occurs when you try to align two things or concepts that don't naturally match. Because many of the hardest impedance mismatches are domain specific, overcoming them requires a layer of domain understanding "baked into" your integration software to address the business problems.

That layer is too often missing, because integration has historically been perceived as a matter of overcoming a series of low-level, generic technical problems.

This misconception about addressing integration becomes evident as soon as someone comes into your office to sell you integration software. Invariably, the conversation quickly turns to questions such as: What is your message queue policy? How do you handle conflict management? What value transformations are available?

Those are secondary questions. Of course every application that you use should have those qualities. Shouldn't any integration application have a robust message queue implementation?

If you think of integration as nothing more than a series of low-level technical problems then you're going to default to those questions. No business person knows or cares about message queues, conflict management or value transformations. They care about getting the information they need when they want it. And that requires domain awareness.

Domain Expertise
Begin to move away from that misguided thinking about integration. There are virtually no troublesome technical hurdles anymore. Integration is a business problem that requires higher level technology solutions that are domain specific. And for different categories of business problems you need very different technical solutions. As a result, the next wave of integration will be all about domain expertise and nuance.

This brings us back to impedance mismatch. When you have something that has "friction" because there are fundamental differences between the two sides, what do you do?

Consider this example: If you are integrating requirements between IBM Doors Next Generation (DNG) and Atlassian JIRA, you need to know that JIRA only allows two levels of hierarchy (three if you count sub-tasks, but sub-tasks are used differently in JIRA than other artifact types). DNG, however, supports an unlimited number of hierarchy levels.

This business problem can only be solved by having the domain experts (the business analysts and the developers) decide how they will structure DNG such that the right layer in the tree translates to the epic/story in JIRA. JIRA can't handle more than two levels, and no amount of technical magic will help that. The only solution is to add a layer of domain expertise by practitioners, who must make some decisions. And the underlying integration software has to be "domain aware" to know that this situation can occur and what to do about it.
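One possible practitioner decision described above can be sketched in code. This is a minimal, hypothetical illustration, not any vendor's actual API: it assumes the team has decided that the first level of the DNG tree maps to JIRA epics and that everything deeper collapses into stories under the nearest epic ancestor.

```python
# Hypothetical sketch: flattening an arbitrarily deep DNG-style
# requirement tree into JIRA's two-level epic/story hierarchy.
# The flattening rule (depth 1 -> epic, deeper -> story) is one
# possible decision the domain experts might make.

def flatten_to_epics_and_stories(node, depth=0, epic=None, result=None):
    """Depth-1 nodes become epics; every deeper node is attached
    to its nearest epic ancestor as a story."""
    if result is None:
        result = []
    if depth == 1:
        epic = node["name"]
        result.append({"type": "epic", "name": epic})
    elif depth > 1:
        result.append({"type": "story", "name": node["name"], "epic": epic})
    for child in node.get("children", []):
        flatten_to_epics_and_stories(child, depth + 1, epic, result)
    return result

tree = {
    "name": "Product",  # root (depth 0) is not synced
    "children": [
        {"name": "Billing", "children": [
            {"name": "Invoices", "children": [
                # depth 3 collapses into a story under "Billing"
                {"name": "PDF export", "children": []},
            ]},
        ]},
    ],
}
```

The point is that the collapsing rule itself is a domain decision; the code merely enforces whatever the business analysts and developers agree on.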

Here is another example. Let's say your organization is implementing the Scaled Agile Framework (SAFe). You need to have the high-level status of either "Ready for Release" or "In Development" at the Feature level. That is all the portfolio manager needs to know - and his tool is configured to handle those statuses.

But your developers have gotten deep into kanban, so the statuses for them are not just "In Development," but rather "Ready for SWAG," "Ready for UX Design," "Ready for Scoping," "Ready for Done Review," etc. What do you do?

Any integration software will know to provide a transformation that allows multiple statuses in one tool to map to a single status in another tool. But domain-aware integration software will provide much-needed ease of configuration (i.e., no coding) and should be able to automate the needed transformation. This is only possible if you "layer on" the domain expertise on top of the underlying technical engine.
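The many-to-one transformation described above amounts to a small mapping table. The status names below follow the article; the mapping itself is an illustrative configuration sketch, not a feature of any particular product:

```python
# Hypothetical sketch: mapping fine-grained kanban statuses in the
# developers' tool onto the two high-level SAFe statuses the
# portfolio manager's tool is configured to handle.

DEV_TO_PORTFOLIO = {
    "Ready for SWAG": "In Development",
    "Ready for UX Design": "In Development",
    "Ready for Scoping": "In Development",
    "Ready for Done Review": "In Development",
    "Released": "Ready for Release",  # assumed terminal dev status
}

def portfolio_status(dev_status):
    # Unknown statuses are conservatively reported as still in flight.
    return DEV_TO_PORTFOLIO.get(dev_status, "In Development")
```

A domain-aware tool would ship a mapping like this as configurable defaults rather than asking practitioners to script it.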

General integration solutions that are nonspecific to the problem simply can't do this because they have no semantic understanding of the underlying information. Without really understanding the meaning of the data, which requires domain expertise, it is much harder to craft solutions that actually address the business problem you are trying to solve.

Dumb vs. Smart Integration
Consider this next example of integration using what I call dumb (generic) integration and smart (domain-aware) integration techniques: Your organization needs to flow defects between itself and all of its suppliers. Defects are probably the most well-understood and clearly defined artifacts in software development. A dumb integration approach would simply list all the attributes that defects have and then have you map each attribute to an attribute on the other side. If you had no insight into why these defects need to flow between systems, this approach would be legitimate.

Since you are crossing organizational boundaries, you might want to create a "smart" integration to ensure that no proprietary attributes are being flowed outside your organization. And you may wish to limit attributes, for example, only to primary attributes such as priority and status to ensure efficient flow.

Rather than starting from scratch and working through all of the many impedance mismatches (e.g. one tool uses two attributes to represent priority and the other one uses only one attribute to represent priority), it makes sense for the integration tool to tell you what aspects of a defect should be flowed between tools and offer suggestions regarding when and how to best transform information when necessary. Without a layer of domain expertise, this can't be done.
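The "smart" cross-boundary flow described above boils down to an agreed whitelist of primary attributes. This is a minimal sketch with hypothetical field names, assuming the organizations have agreed to share only summary, priority, and status:

```python
# Hypothetical sketch: filtering a defect down to an agreed set of
# primary attributes before flowing it to a supplier, so proprietary
# fields never cross the organizational boundary.

SHARED_ATTRIBUTES = {"summary", "priority", "status"}

def outbound_defect(defect):
    """Strip everything except the agreed-upon shared attributes."""
    return {k: v for k, v in defect.items() if k in SHARED_ATTRIBUTES}

internal = {
    "summary": "Crash on login",
    "priority": "High",
    "status": "Open",
    "customer_name": "Acme Corp",      # proprietary - must not leave the org
    "internal_cost_estimate": 40,      # proprietary - must not leave the org
}
```

Again, deciding *which* attributes are primary and which are proprietary is a domain decision; a domain-aware tool would propose sensible defaults instead of presenting a blank attribute-mapping grid.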

Is the Tool Domain Aware?
All of these impedance mismatch examples require an understanding of the underlying, software-delivery-specific business problem being solved. As such, they are not low-level technology problems. It is therefore critical that the first question anyone asks when evaluating integration tools is, "Is this tool domain aware?"

When moving medical records between EMR systems, consider software from a vendor that understands the healthcare industry. The same holds true for financial software. Software development is no different. Your approach to integrating SDLC tools should start from a place of domain expertise, because to do SDLC integration well, you can't simply throw technology at it. You have to bake in domain knowledge. It's as simple as that.

More Stories By Nicole Bryan

Nicole Bryan has extensive experience in software and product development, focused primarily on bringing data visualization and human considerations to the forefront of Application Lifecycle Management, and is currently Vice President of Product Management at Tasktop. Prior to this she served as Director of Product Management at Borland Software/Micro Focus, where she was responsible for creating a new Agile development management tool. Prior to Borland, she was a Director at the New York Stock Exchange (NYSE) Regulatory Division, where she managed some of the first Agile project teams at the NYSE, and VP of Engineering at OneHarbor (purchased by National City Investments).

Nicole holds a Master of Science in Computer Science from DePaul University. She is passionate about improving how software is created and delivered – making the experience enjoyable, fun and yes, even delightful.


