Information Gathering: Turning mounds of data into neatly stacked knowledge. (Part 1 of 2)

Effective process improvement begins with an intimate understanding of why people do what they do and how the current process operates. As an external consultant, or as a strategy & transformation leader within your organization, you will ultimately find yourself "facing the music" during a pitch-out: will someone in the crowd stand up and say "this is b#**", or will you get enthusiastic nods for your keen insight? We will discuss the most fundamental ways in which data (easy to find) turns into knowledge (hard to get).

Where in the world is my process analyst today?

For the purpose of our discussion I will assume you are familiar with one of the prevailing quality/capacity improvement methodologies such as Lean, Six Sigma, TQM, etc.

If you were convinced that the two consultants depicted in "Office Space" were effective, your approach to "process improvement" begins with firing everyone involved in the process, bulldozing the buildings they occupied and rebuilding your company all over again (a la "Office Space"). If you find that this approach generates some "friction", I would suggest another way:

Make yourself credible enough to convince the people who live the process that you know what it feels like to sit in their chairs; that you are able to demonstrate uncommon wisdom about the root causes; and that when you guide them to a solution, they will follow you with blind obedience, knowing they are doing the right thing.

"Wow!" you say. How do I become the piper?

Your first step is to learn 10+ years of process aches and pains within the prep period prior to an attempt at a process change. Hopefully your project charter allots at least 2-3 months to gather information (notice I did not say "gather data") and to become embedded within the process. The reason I stress gathering information is that data is usually abundant; it is your role to turn that data into information and knowledge. A good (and smart!) friend of mine, Harold Rudolf, is an expert at building mental models and has a very simple view of the genesis of knowledge:

The "X" axis of this chart denotes the level of maturity of the knowledge & understanding you have of the process (where "you" refers to the process analyst or the business leaders managing these processes).

The "Y" axis ("Risk") is interchangeable with "process improvement success" or "tranformation impact".

The idea behind this mental model is that data, the most rudimentary form of knowledge, puts us at the low end of the "X" axis, thus exposing us to greater risk. There are many data systems available in every business, whether the data resides in an accounting system, in people's heads or on customer orders in paper form. The risk stems from the fact that at this level of process maturity you are not truly "managing" the process; your processes are probably ill-defined and your business is not making decisions based on the data being collected but is rather "reactive". This implies a "lagging" effect the data has on your business: when managers look at "data", they are examining this month's sales reports or last quarter's aging report and can only react to what has been set in stone (hence the term "lagging").

The next level of maturity (traversing the "X" axis to the right) is "information". Here you have added your analytical insights to the data and created extracts, dashboards and queries to take mounds of data and turn them into decision-making points. Most organizations claim to have this level of maturity in managing their processes, though I would argue that I have seen some leading Fortune 500 companies whose data is so "immature" that it is still "lagging".

A collection of "data" points is meaningless unless you are able to turn it into "information" by asking the right questions (and I might add - view the data in the context of your process).

Knowledge is a higher level of insight into your processes, forged by applying your experience to the information you gathered.

Transforming Data -> Information -> Knowledge

In a past project for a Fortune 100 bank we examined the granting and revocation of access to the bank's systems. There were SOX compliance issues relating to employee terminations whose access rights were not revoked on a timely basis because the staff at the risk office could not handle the high volume of requests hitting their queue.

Mounds of Data
We readily learned that data was not in short supply. In fact, we were hard pressed to have the data center produce an extract of one year's worth of data without continuously crashing the Oracle server. Finally we had several hundred million records on our hands: data that had been used to create management reports on productivity.

One of the biggest challenges we faced with the data was its "collapsed" form. A single field contained all of the details for each access/revoke request: which system access was to be added or removed, which types of access were being requested, and so on. This data had not previously been used because the designers of the original system recorded it only for the purpose of handling a single request; it was nearly impossible to generate a report that compared bits of this information across multiple records.

So we designed a set of complex queries that decomposed the data into its original elements, and turned that data into information by measuring the volume of requests by system name and access type.
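The decomposition step can be sketched in a few lines of Python. The collapsed-field format below (the `SYS=`/`ACCESS=` names and the `|`/`;` delimiters) is hypothetical and stands in for the bank's actual schema, which the original system defined:

```python
from collections import Counter

# Hypothetical "collapsed" records: one field holds every system/access
# pair for a request. The field names and delimiters are illustrative.
records = [
    "SYS=Mainframe;ACCESS=ReadWrite|SYS=TellerApp;ACCESS=Read",
    "SYS=TellerApp;ACCESS=Read",
    "SYS=Mainframe;ACCESS=Read|SYS=TellerApp;ACCESS=ReadWrite",
]

def decompose(collapsed):
    """Split one collapsed field into (system, access_type) tuples."""
    for item in collapsed.split("|"):
        fields = dict(pair.split("=", 1) for pair in item.split(";"))
        yield fields["SYS"], fields["ACCESS"]

# Data becomes information: request volume by system name and access type.
volume = Counter(pair for rec in records for pair in decompose(rec))
for (system, access), count in sorted(volume.items()):
    print(system, access, count)
```

In practice this was done with SQL against the extract rather than in application code, but the idea is the same: explode each collapsed record into atomic elements, then aggregate across records.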

The last step in our analytical process was to convert the information we marshaled into actionable knowledge. Armed with the experience we gained by walking the process, we started tagging the systems (for which access was requested) by their type, platform and a few other attributes, and we classified the volume of requests by the teams assigned to them.
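The tagging step amounts to enriching each raw system name with attributes learned from walking the process. A minimal sketch, with entirely made-up system names and attribute values:

```python
# Hypothetical attribute lookup built while walking the process:
# raw system names mapped to the type/platform tags used for classification.
SYSTEM_TAGS = {
    "TellerApp": {"type": "branch", "platform": "distributed"},
    "Mainframe": {"type": "core", "platform": "z/OS"},
}

def tag(system_name):
    """Enrich a raw system name with its classification attributes,
    falling back to the bare name when no tags are known."""
    return {"system": system_name, **SYSTEM_TAGS.get(system_name, {})}

print(tag("Mainframe"))
```

The lookup table itself is the knowledge: it cannot be derived from the data alone and had to come from the people living the process.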

Most importantly, we were able to calculate cycle time only after mapping the current-state process. In the previous "compressed" form, each record indicated a single transaction in the process: the user request generated one record, the handling of the request (the point at which an operator pulls a request from the queue) yielded another, each approval of the request generated another, and the abdication of the request generated yet another. By understanding the business process we were able to tie the threads together and calculate cycle times from request to handling, handling to approval, approval to approval, and approval to abdication.
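Tying the threads together can be sketched as follows: group the transaction records by request, order each request's events in time, and measure the elapsed time between consecutive steps. The request IDs, event names and timestamps below are invented for illustration:

```python
from datetime import datetime

# Hypothetical event log: one row per transaction record, tied together
# by a shared request ID. Names and timestamps are illustrative.
events = [
    ("REQ-1", "request",    "2008-03-01 09:00"),
    ("REQ-1", "handling",   "2008-03-01 11:30"),
    ("REQ-1", "approval",   "2008-03-03 10:00"),
    ("REQ-1", "abdication", "2008-03-04 16:00"),
]

def cycle_times(rows):
    """Order each request's events and measure elapsed hours
    between consecutive process steps."""
    by_request = {}
    for req_id, event, ts in rows:
        stamp = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        by_request.setdefault(req_id, []).append((stamp, event))
    result = {}
    for req_id, stamped in by_request.items():
        stamped.sort()  # chronological order within the request
        steps = []
        for (t0, e0), (t1, e1) in zip(stamped, stamped[1:]):
            hours = (t1 - t0).total_seconds() / 3600
            steps.append((f"{e0}->{e1}", hours))
        result[req_id] = steps
    return result

print(cycle_times(events)["REQ-1"])
```

Once the per-step durations exist for every request, finding the slow step is a matter of comparing distributions across steps rather than guessing.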

At this point the data "sang" with knowledge: it became evident that significant delays in the process stemmed from the second of the three approval levels. The company agreed to remove this approval layer, since the final approval was already performed by a senior manager/business group manager.

We also learned that certain types of requests took significantly longer to process, so we created what is known in Lean manufacturing as a "supermarket express checkout lane": the majority of requests, which were easy to fulfill, were handled by the main process, while the exceptions and complex requests were assigned to a new group within the risk office dedicated to them.
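The express-lane split boils down to a triage rule applied as requests arrive. A minimal sketch, assuming a made-up rule (the actual criteria came out of the cycle-time analysis and are not reproduced here):

```python
# Hypothetical triage rule for the "express checkout lane": route simple,
# high-volume requests to the main process and complex ones to the
# dedicated exceptions team. The criteria below are illustrative only.
COMPLEX_SYSTEMS = {"Mainframe"}  # assumed: systems needing extra review

def route(request):
    """Return the queue a request should land in."""
    if request["system"] in COMPLEX_SYSTEMS or request["num_approvals"] > 1:
        return "exceptions-team"
    return "main-process"

print(route({"system": "TellerApp", "num_approvals": 1}))
print(route({"system": "Mainframe", "num_approvals": 1}))
```

The design point is that the rule must be cheap to evaluate at intake; anything requiring human judgment belongs downstream, inside the exceptions team's queue.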

From here the project took a major leap forward: the team was asked to lead the development of a management dashboard, redesign the queueing method for incoming requests, develop interactive drill-down reports and measure processing effectiveness based on the critical factors that surfaced in the data analysis phase of the project. The project lead was able to score the SOX compliance level of the process and draw up a list of controls based on the COBIT framework to significantly improve compliance.

Talk the talk and walk the walk
So now that you are familiar with the data-to-knowledge model, the question you must answer for yourself is: do you possess the experience necessary to turn information into usable knowledge?

I typically spend a great deal of time talking to the people who "live" the process before I get into data collection. For one project this meant "shadowing" news producers, walking with them from their station to the media library, to the viewing station, to the satellite feed room and back to their floor, just to understand the process complexity they were facing. In another case I spent several weeks shadowing marketing managers to understand how their requests end up in a bottleneck downstream from them (at the end of the process). The other day I spent four hours with a member of my team at a large warehouse facility to understand how they receive and handle fulfillment requests for any one of the tens of millions of media assets under their care.

The insights you gain by "walking the process" are priceless when you find yourself in a room facilitating a process improvement event and need to drive the discussion to where the root-cause process breakdowns occur. If you do not understand how much time your employees spend running back and forth across a 300-foot accessway to retrieve physical assets, how can you possibly drive them toward real-world solutions?

In a follow-up article I will share with you ideas for shadowing and collecting process information.
