Thursday, May 29, 2008

The Five Causes of IT Complexity

Complexity is a big problem for IT. Complex systems cost too much, fail too often, and usually do not meet basic business and architectural requirements.

I believe that all IT complexity can be traced to five causes. Eliminate the five root causes of complexity, and you can eliminate unnecessary complexity.

These root causes are as follows:

  • Partitioning Failures – that is, systems in which data and/or functionality has been divided into subsets that do not represent true partitions (see the sketch after this list).
  • Decompositional Failures – that is, systems that have not been decomposed into small enough subsets.
  • Recompositional Failures – that is, systems that have been decomposed into subsets that are too small and have not been recomposed appropriately.
  • Boundary Failures – that is, systems in which the boundaries between subsets are weak.
  • Synergistic Failures – that is, systems in which functionality and/or data placement does not pass the test for synergy.
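
To make the first of these concrete, here is a minimal sketch (Python, purely illustrative; the function and set names are hypothetical) of what checking for a true partition looks like: the subsets must be non-empty, pairwise disjoint, and together cover everything.

    # Illustrative sketch only: a "true partition" consists of non-empty,
    # pairwise-disjoint subsets whose union is the whole.
    def is_true_partition(whole, subsets):
        whole = set(whole)
        seen = set()
        for s in subsets:
            s = set(s)
            if not s:            # empty subsets are not allowed
                return False
            if s & seen:         # overlap means the subsets are not disjoint
                return False
            if not s <= whole:   # stray elements that are not part of the whole
                return False
            seen |= s
        return seen == whole     # everything must be covered

    # Example: "billing" appears in two subsets, so the second call fails the test.
    functions = {"billing", "invoicing", "shipping", "inventory"}
    print(is_true_partition(functions, [{"billing", "invoicing"},
                                        {"shipping", "inventory"}]))             # True
    print(is_true_partition(functions, [{"billing", "invoicing"},
                                        {"billing", "shipping", "inventory"}]))  # False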

I’m planning on exploring these five causes in my next ObjectWatch Newsletter, so if you are interested, stay tuned.

Can anybody think of a cause of IT complexity that is not covered above?

11 comments:

Anonymous said...

Abstraction-coherency failures (sorry, got to find a better name): similar to partitioning failures, but vertical instead of horizontal. That is, due to inconsistent distribution of functionality across abstraction layers, composing applications breaks the layer containment of constraints and dependencies. The hallmark of good/bad design, prominently encountered in frameworks and general-purpose components and libraries.
Perhaps related to Synergistic failures (which appear to be the catch-all for everything that doesn't belong in the other categories ;-))

Roger Sessions said...

Moritz,
Thanks for your comment. One point I should make about synergistic failures is that these failures are actually the best defined of all failures. It involves a test for synergy between two (or more) functions. In the few cases in which it is not obvious whether synergy is or is not present, one has a simple question that can be escalated to someone with a better business overview of the problem domain.
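
To make the shape of that test concrete, here is a minimal sketch (Python, purely illustrative; it is not the actual synergy test, and the obviously_synergistic callback and the escalation list are hypothetical):

    # Purely illustrative sketch -- not the actual synergy test.
    # For each pair of functions, ask whether synergy is obviously present or
    # obviously absent; when the answer isn't obvious, escalate the question
    # to someone with a better business overview rather than guess.
    from itertools import combinations

    def synergy_decisions(functions, obviously_synergistic):
        decided, escalate = {}, []
        for a, b in combinations(functions, 2):
            answer = obviously_synergistic(a, b)   # True, False, or None ("not obvious")
            if answer is None:
                escalate.append((a, b))
            else:
                decided[(a, b)] = answer
        return decided, escalate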

Now as to your proposal for abstraction/coherency, this is an interesting idea. I agree with you that systems that fail this test have problems. The question I have is, Is "complexity" the best possible description of that problem? We don't want to use the word "complexity" to describe everything that can go wrong with a system. For example, a system could be bad because it doesn't meet business requirements, yet still not be complex in a mathematical sense.

Anonymous said...

I think complexity is something that very large organisations with unclear or varying business goals can't fail to feed.
For example, in these times of uncertain financials, companies are looking to cash in on tactical initiatives rather than taking a measured, prescriptive approach using a framework such as TOGAF to develop an enterprise architecture.
As a result, the IT estate is under constant change, with many conflicting requirements often making a mess even messier.
So I would say a major cause of complexity is business areas not really knowing where they want to be and not knowing how to partition themselves.
In the words of George Harrison, if you don't know where you're going, any road will take you there!

Roger Sessions said...

One reason that companies aren’t using a “measured, prescriptive approach using a framework such as TOGAF” is that the time and money investment to do so is so large. Often it takes so long to create an enterprise architecture that by the time it is created it is out of date.

In a recent talk I gave comparing the various EA methodologies, I categorized methodologies into three generations as follows:

Generation One
Time Period: 1987-1995
Mantra: We need to align IT and business!
Representative Methodologies: Zachman

Generation Two
Time Period: 1995-2003
Mantra: We need a process to follow!
Representative Methodologies: FEA, TOGAF

Generation Three
Time Period: 2005-
Mantra: This is taking way too long!
Representative Methodologies: VPEC-T, AEA, SIP

The Generation Three methodologies are all trying, in one way or another, to address the inefficiencies of methodologies like TOGAF.

Anonymous said...

Thanks for your comments.
Do you think that the concept of aligning IT and the business is no longer relevant or valid?
(Sorry for being anonymous; I need to set an account up.)
Do you agree with my assertion that continual reactive change is a source of (or driver towards) continuing complexity?

Roger Sessions said...

I believe that aligning IT and the business is extremely important. It’s just that I believe that you need to partition the enterprise into simple subsets (autonomous business capabilities, in my terminology) and then achieve the alignment within those subsets.

As far as continual reactive change, I don’t see that so much as a source of complexity as much as a symptom of complexity. When complexity is out of control, there is little you can do except be in continual reactive mode. Once we properly manage complexity, we can take the time to be more thoughtful and deliberate.

Craig Brown said...

Roger,

(new reader)

There is still plenty of complexity in the business, for example:

- Balancing short and long term needs
- Process v functional structures and focus
- Talent and other labour management issues

All of these interfere with business-IT alignment.

And another simpler, but more insidious one is the number of stakeholders to a system (which I think you have addressed in the past).

One stakeholder and one system is easy; many stakeholders and many systems, which is where many of us are today, is virtually impossible.

Thanks for sharing your insights at this blog by the way.

WalterRSmith said...

Roger,

Maybe I'm addressing something that's outside the scope of this conversation, but my initial reaction to your "Five Causes" was that it was limited to the "model" being instantiated in an IT system. I see by the comments that I should perhaps interpret "system" a bit more broadly.

Regardless...after a decade or so of kicking around concepts from complexity, philosophy, social psychology, and large IT system development, I've found Dave Snowden's Cynefin framework helpful in thinking about complexity. Here's why: it looks at the intersection of the subject (the knower) and the object (the system, the system context, and the associated data/info).

For IT systems development, that typically means focusing on creating models of a slice of a "frozen" context, a static knower, and a system that connects the user to the context slice. For Known/Knowable contexts, that works well. However, for Complex contexts (and an increasingly hyperconnected world is making these more the norm than not), it seems that we may need to expand root causes to address such issues as (a) the context of systems usage (e.g., for Known/Knowable work, or for Complex work), (b) the knower's ability to effectively engage the context, and (c) the ability of the knower's organization to do likewise (where multiple individuals are trying to maintain decision making coherence).

Maybe I'm misreading this, but it seems that there's an assumption of a Known/Knowable context in much of this discussion.

BTW, this point applies equally to the construction of large IT systems, an activity where the importance of shared sensemaking and maintenance of coherent understandings is not always appreciated. The agile movement may be, in part, based in a growing awareness of these challenges.

Anonymous said...

Roger,

This is a great discussion.

One viewpoint that I've found helpful is to distinguish 'complexity' and 'complication'. Complexity is an inherent property of a domain. Complication is an emergent property of a system addressing a domain.

Take an overly simple example. We know that comparison-based sorting takes O(N log N) time - that's the inherent complexity of the problem domain. Attempting to engineer the one sorting algorithm to rule them all, the one that addresses all possible additional requirements someone might have, would lead to immense complication.
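
Just to make that bound tangible, here is a tiny numeric sketch (purely illustrative): any comparison sort must distinguish among the N! possible input orderings, so it needs at least log2(N!) comparisons in the worst case, and log2(N!) grows like N log N.

    # Numeric illustration of the comparison-sort lower bound: at least
    # log2(n!) comparisons are needed in the worst case, which grows like n*log2(n).
    import math

    for n in (10, 100, 1000):
        lower_bound = math.log2(math.factorial(n))   # information-theoretic minimum
        n_log_n = n * math.log2(n)
        print(f"n={n:5d}  log2(n!)={lower_bound:10.1f}  n*log2(n)={n_log_n:10.1f}")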

In watchmaking, such artificial degrees of convolution are called 'complications' as well.

Complexity is not to be battled or ignored. It is not even a problem; it is a fact of life. Complication, on the other hand, is a phenomenon of engineering that can be recognized and addressed.

As the saying goes: "There are problems and there are facts of life. Problems we strive to solve. Facts of life we don't solve -- facts of life we have to live with".

As I wrote, this is something that has helped my thinking about such things.

Cheers,

Roger Sessions said...

Craig Brown points out that there is "still plenty of complexity in the business..." and that "all of these interfere with business-IT alignment." I agree with this.

Some amount of complexity is inevitable. But much of the complexity we have is avoidable. We should distinguish between these two. I often talk about "unnecessary" complexity as the type that we are trying to eliminate.

In Clemens's comment, he distinguishes between "complexity" and "complication". Clemens, by the way, has done much to give us useful approaches to managing complexity with his considerable writing on components.

I don't think I agree with using the term "complication" to describe unnecessary complexity. This word seems to me to imply an afterthought, or something injected into a system. It is closely related to its verb form, "complicate", implying an active introduction. "Complexity", on the other hand, is a dyed-in-the-wool noun. Complications, for me, are something I want to avoid introducing. Complexity, for me, is something I want to get rid of. At least, that's how I read these two words.

Back to Walter's comments. I have heard about Cynefin, but haven't studied it yet. I'll take a look at it.

You mention an assumption of a "known/knowable context in much of this discussion". My main assumption is not that a given system is or isn't knowable, but that its "knowability" is enhanced once the (unnecessary) complexity is removed.

Thanks Craig, Walter, and Clemens for your thoughts!

- Roger

WalterRSmith said...

@Roger,

When you review Dave's definition of known/knowable (Cynefin) and complex (Cynefin), my comment may make more sense. Dave discusses strategies to move a context that is largely complex into a more known/knowable domain. However, his primary focus is on contexts that are inherently complex and therefore require a "probe-sense" approach (vs. a "sense-analyze" approach).

As I noted above, Dave (whose background is physics and philosophy) initially framed Cynefin as a matrix with ontology on one axis and epistemology on the other, with the focus on human sensemaking...which is not exactly the same sort of thing you're addressing. However, I think you'd find Dave's perspective thought-provoking.