
If you can't see the solution, you don't understand the problem; but even if you can see the solution, you may still not understand the problem! Moral of the story - don't jump to conclusions, even if those conclusions seem obvious.

The problem is that this tends to happen quite often. Many decisions are taken without the information required to make them genuine decisions rather than essentially arbitrary guesses.

In fact, it happens so often that in most Enterprises there is an implicit acceptance, an almost apathetic resignation, that this is the way things are done - with the resulting abdication of Accountability when it all goes wrong further down the road.

"Analysis Paralysis" is a phrase often used by people who want to jump to conclusions, or who want to repress others who may be more wary. Of course, this does not mean that analysis paralysis cannot happen. It just means that when people use that phrase, it should set off warning bells and prompt the question: are we really over-analysing something that looks so simple, or are there hidden dangers lurking below the waterline?

This missing information, when exposed later, can turn out to be (or to have been) quite obvious - if only the time had been spent to see it - prompting questions like "Surely you knew that at the time you made the decision?". These are "difficult" questions that cause cognitive dissonance (described earlier - or later, depending on how you are reading this!), which people tend to resolve by changing their perceptions to preserve the "correctness" of their decision. In truth, it was not the decision itself that was wrong; it was the decision not to expose more information to inform it that was wrong.


Do people in your Enterprise jump to conclusions?

Can you think of examples where this has happened in the past?

Who were they? What was the impact? Why do you think they acted in this way?

What needs to change to reduce the likelihood of it happening in the future?

Who needs to drive that change?


2008-2016 Pragmatic EA Ltd