Thursday, September 3, 2009

On Practices and Pitfalls

An interesting discussion on Twitter yesterday started with an ebizQ story, "Gartner Identifies Ten Enterprise Architecture Pitfalls". Brenda Michelson thought Gartner was pointing out the obvious, and Dana Gardner replied that 90% of what analysts do is point out the obvious.

After a brief (and predictable) flurry of comments deploring analyst superficiality, Brenda called for people to post real EA pitfalls, using the hashtag #eapitfall. There were lots of good contributions, but I couldn't help noticing that many of them weren't exactly pitfalls, but misconceptions (e.g. "thinking one-size-fits-all is the ideal for all scenarios", "thinking that knowledge can reside purely in artifacts", "the framework is the answer"), unwise tactics ("modified waterfall planning", "running behind a project team waving a red flag"), and intriguing disconnects ("embarking on EA without understanding of Why & What", "practice methodology without anthropology").

When I pointed this out, my friend Sally Bean suggested I was exhibiting another well-known EA antipattern: "turning people off, by being overly pedantic and logical".

Here's why I think it's important to separate pitfalls from all the other stuff: a pitfall is a surprise trap for the unwary or unprepared (landing in the doo-doo). The purpose of planning and preparation is to anticipate the pitfalls.

Here are some examples that I think look more like real EA pitfalls (=bad things happening to EA): "corporate acquisition of enterprise application package preempts EA activity", "developers persuade stakeholders that EA guidelines are unnecessary red tape", "8 out of 10 enterprise architects suffer from hypertension".

Experts should identify pitfalls by distilling patterns from their experience of the things that can typically go wrong. These patterns are used to motivate specific acts of planning and preparation ("do this to avoid this particular pitfall"), and also to justify the general activity of planning and preparation ("think ahead to avoid all sorts of pitfalls"). 

However, instead of deriving pitfalls directly from experience, many experts try to derive pitfalls by negation - starting from what they believe EAs ought to think and do, so that any failure to conform to this normative view of EA is defined as a pitfall. Thus a pitfall is simply another way of communicating a best practice in reverse: Gartner's top two ("selecting the wrong person as lead enterprise architect", "not engaging business people") merely reflect received wisdom ("selecting the right person", "engaging business people").

Apart from being obvious, these so-called pitfalls have four problems. Firstly, they are not directly grounded in experience. Secondly, they are not intrinsically motivational, because they are not linked to outcomes. Thirdly, they are too vague to be actionable: if I wanted to follow Gartner guidelines on selecting the right person or engaging business people, I'd need the full handbook. And fourthly, they don't convey any sense of how difficult these things actually are in practice. Anybody can start a project with good intentions of selecting the right people and engaging business people and so on, but the day-to-day pressures of the job make it harder than you might expect.

Talking about pitfalls could be a way of moving "best practice" forward into "next practice". But instead it is often used merely as a way of reinforcing conventional notions of "best practice". Thus I think it matters where you get your pitfalls from.

2 comments:

  1. As one of the triggers for this blog post, I've been mulling this over for a while and would like to offer a couple of reflections.

    Firstly, on the subject of my banter with Richard about his care with language: this is a good example of the importance of context. I may indeed have been accusing Richard of being pedantic and logical (having initially failed to grasp the point he was making), but, despite appearances to the contrary, I wasn't actually 'turned off' and I do believe that precision and logic are important.

    However, my reason for posting the comment was that I do think that, when operating in an organizational context, people have to be aware that if they often find themselves pointing out logical inconsistencies in language or behaviour, the cumulative effect may be that some people (typically the ones who aren't deep thinkers, and are susceptible to pitfalls) come to perceive them as negative, critical, or worse, a 'troublemaker'. So it's important to be selective about when to do this, choose your language carefully, know your audience and their likely response, and balance these types of observations with more positive, optimistic ones.

    Secondly, I think there may be two sorts of pitfalls: those that could be foreseen from experience in the way that Richard suggests above, and those caused by unforeseeable changes or paradigm shifts. Avoiding the latter requires good stakeholder analysis, monitoring what's going on around you, and an open mind.
    Sally Bean

  2. Thanks Sally. I agree that pitfalls cannot always be foreseen from experience, and that some changes and paradigm shifts may be unforeseeable. However, if a major analyst firm (or anyone else) wishes to publish lists of pitfalls, then these lists only have value to the extent that they are derived from experience rather than received wisdom.

    I also take your point about the potential unpopularity of pointing out the logical weaknesses in someone else's position. Enterprise architects are sometimes impelled to point out the logical weaknesses in an AS-IS enterprise, and their consequent unpopularity should be no surprise.

    Finally, I should point out that deep thinkers are also susceptible to pitfalls. As Professor Dumbledore says somewhere "Being - forgive me - rather cleverer than most men, my mistakes tend to be correspondingly huger".
