Right People, Right Analyses, Right Decisions

Some time ago I found myself in a gathering somewhere in the bowels of a large, very bureaucratic organization for which my company was doing some work. The room was full of forty or so senior managers and analysts, many of whom were not technologically inclined. A pair of spit-shined young employees from one of the Big Consulting companies ran an efficient, well-organized meeting intended to decide which of the big enterprise software packages should be adopted by the organization. There were rounds of questions to assess the most important capabilities the software had to have, there were selections made using multi-voting techniques, and there was a beautifully prepared list of candidate packages, with the strengths and weaknesses of each printed on both sides of B-sized handouts. The group narrowed the options to three, and senior managers chose the winner.

The quality of that decision: garbage.

Seriously, I was embarrassed to be in the room.

I liked that a smaller and hopefully better-informed group of people made the final decision, but everything else about the process was wrong. The meeting was the first time most of the people in the room had seen the options; there had been no time for preparation and research. As capable as many of the participants were in their own subject matter areas, a large fraction of them had little to no experience evaluating enterprise software systems (including me), and many had very little knowledge of software systems of any kind. The elephant in the room was that the organization had already licensed fifteen or twenty thousand seats of one of the candidates and, surprise, surprise, that’s the one that was chosen.

Was the probably outrageous sum charged by the Big Consulting company a good expenditure? I guess that depends on whether the purpose was to make a good decision or to provide cover for one that had already been made. Who knows? Maybe the decision was exactly what those in charge wanted it to be.

Did that selection have any effect on what was actually going to get used within the organization? The individuals we were working with wanted no part of it. The selection of that package was in no way going to make management of any data more accurate or approachable. It was not going to address the fact that the organization did not have a strong, integrated picture of how its own data were acquired and used. The organization was also moving toward web-based delivery and access internally so I would question whether all those seat licenses were ever going to be used. For anything.

The improvements the organization was seeking were only going to come from implementations carried out by skilled practitioners who understood their data and their needs and knew how to create interfaces that gave the customer the control and ease of use they needed. Systems can be made simpler, but they cannot be made simpler than they need to be.

That said, over time that organization will probably stumble through a series of analyses and implementations that will slowly hammer their data and processes into some kind of shape. I see the same process happening at another organization of similar scope and scale. I just can’t help but think it could all be done more rationally if some groundwork were laid first. I propose that the following steps might be useful before any such undertaking:

  • Ensure any project has support from senior management.
  • Ensure that senior management understands the following ground rules so they have an appreciation of what their senior implementers are going to do.
  • Determine what business needs are to be represented and implemented.
  • Determine the business roles that need to be defined. Ensure that training and scope of actions are properly assigned and understood.
  • Determine whether data exist that support the business need. If the need for new data items is identified, plan how to acquire them. (As one of my college professors was fond of pointing out, the problem should be solved before you put the numbers in. If you turn your calculator on before you have the final equation you’re not doing it right.)
  • Determine the needs of the meta-implementation.
  • Determine the meta-implementation roles that need to be defined. Ensure that training and scope of actions are properly assigned and understood. Ideally these will be closely related to the business roles.
  • Ensure the meta-implementation provides the right capabilities and controls to the right classes of users.
  • Determine the source of all data. Ensure that incoming data is validated at the source and is complete.
  • Ensure that all sources of data agree and are consistent, specifically ensure that individual data records and roll-ups and aggregations of them are consistent. For example, reports of the number of items processed per day should be consistent with the number of records created for individual actions. If different systems collect these numbers differently then find out why they are different and rationalize them. Yes, this happens.
  • Ensure that downstream uses of data remain consistent.
  • Select the tools, practitioners, and hosts for the actual implementation.
  • Carry out the implementation, whether a custom build or a ready-made solution.
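The consistency check described above, comparing roll-up reports against counts derived from individual records, can be sketched as a simple reconciliation pass. This is a minimal illustration, not a prescription; the field names and sample data are hypothetical.

```python
from collections import Counter

def reconcile_daily_counts(records, daily_report):
    """Compare per-day counts derived from individual records against a
    separately reported daily roll-up. Returns a dict mapping each date
    where the sources disagree to (derived_count, reported_count)."""
    derived = Counter(r["date"] for r in records)
    mismatches = {}
    for date in set(derived) | set(daily_report):
        if derived.get(date, 0) != daily_report.get(date, 0):
            mismatches[date] = (derived.get(date, 0), daily_report.get(date, 0))
    return mismatches

# Hypothetical data: individual action records and a roll-up report
records = [
    {"date": "2020-03-01", "item": "A"},
    {"date": "2020-03-01", "item": "B"},
    {"date": "2020-03-02", "item": "C"},
]
report = {"2020-03-01": 2, "2020-03-02": 2}  # roll-up claims 2 on 03-02

print(reconcile_daily_counts(records, report))
# Only 2020-03-02 disagrees: one record exists but two were reported
```

Running a check like this across every pair of systems that report the same quantity is exactly how you discover the "yes, this happens" discrepancies before they poison downstream uses of the data.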

Decomposing the problem in this way will help you choose the right tools for any job. Doing this in pure form might be extreme; there is a place for prototyping and internal experimentation to work out some of the ideas. You may also choose to go with a ready-made implementation, like an enterprise system with predefined, business-specific modules. If you’ve done the groundwork ahead of time you’ll have a better target against which to compare the features each solution offers and the costs it imposes. Just as importantly, thinking through the problem as completely as possible ahead of time will keep you from implementing features and buying capabilities you don’t need.
