
Decision Automation in BI: Design Guidelines for Business Analytics and Rules

June 18, 2007

Posted by Cyril Brookes in Decision Automation, General, Issues in building BI reporting systems.

Authors routinely ignore the specifics of designing automated decision components for BI systems. I guess this is often because they believe these details are application dependent. However, I believe there can be, and should be, more rigor in the specification of the business analytics, rules and predictions that underlie these designs. Generalizations can only take us so far; sooner or later we have to get down and dirty.

In this, my third recent post that discusses decision automation, I offer some guidelines that can provide the requisite structure; but you, Dear Reader, can judge for yourself, as always.

To recap, hopefully avoiding the need for you to read my recent stuff, it is my hypothesis that the project selection and specification of BI systems with decision automation incorporated should follow five steps as below. Earlier posts have considered the overall issue (here) and Phases 1 through 3 (here).

  1. Identify the controllable business variables in your business environment
  2. Determine the business processes, and relevant decisions, that impact those controllable variables
  3. Identify the BI systems that support the business processes and decision contexts selected in Phase 2
  4. Design the business analytics that are the basis of the decision automation: business rules, predictive analyses, time series analyses, etc. wherever Phase 3 indicates potential value
  5. Evaluate feasibility and profitability of implementing the analytics created in Phase 4

This post covers Phase 4, arguably the most interesting from a technical viewpoint.

It is axiomatic, I believe, that we should now revisit the steps in the decision making process that are to be automated. Each step requires a different style of automation and offers distinct benefits; some are more complex than others.

Drawing on Management 101, with Herbert Simon as our mentor, we know that the universal key steps in the decision making process, be it manual or automated, high level strategic or low level operational, are:

  • Measuring and assessing the business process status: Where are we? Is it good or bad? What are people saying about us?
  • Finding “problems”: i.e. situations (including opportunities) that need a response, out of specification KPIs, adverse trends or predictions, unusual circumstances, people telling us we have a problem!
  • Diagnosing the problem context: i.e. how bad is it, what happens if nothing is done, has this happened before, what happened then, etc.?
  • Determining the alternatives for problem resolution: i.e. what did we do last time, what are people suggesting we do?
  • Assessing the consequences of the outcomes from each alternative: i.e. predictive modelling, computing merit formulae, what happened after previous decisions in this problem area?
  • Judging the best perceived outcome, and hence making the decision: i.e. comparing the merit indicators, accepting or rejecting people’s opinions.

Most of you, Dear Readers, will know all this, but we need to be sure we are all on the same page. Otherwise, confusion reigns, as indeed it does in several of the recent articles on this subject, especially the marketing hype ones.

Our objective with BI system design is to enable improved business process performance. Our primary channel to do this is to collect and report information that supports management decision making. Apart from passively dumping a heap of facts and figures on a screen, we know we can empower improved decision making in two ways:

We can create action oriented BI systems by presenting the information in a pre-digested way that highlights good and bad performance and spurs the executive to react appropriately. Dashboards and scorecards are obvious examples of how we can do this. I proposed some general design principles for summarization and drill-down specification in earlier posts. OR

We can actually make decisions automatically as part of the BI system, adjusting the controllable parameters of the business process without reference to a manager.

We’re focusing here on the second option. This is not a new idea; we’ve been doing it for decades. But the new business analytics, rule management and prediction software now available makes the whole process much easier than when we had to rely on IF…AND…THEN…ELSE statements to make it all work. For a reference that shows the scope of what’s available, see James Taylor.
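
To make the style contrast concrete, here is a minimal sketch, in Python, of the declarative approach a rule management system encourages. Every rule name, condition and threshold is invented for illustration; this is not any particular vendor’s syntax.

    # A rule list evaluated in priority order, replacing nested IF...AND...THEN...ELSE.
    # All names and thresholds are hypothetical.

    def rule_low_stock(ctx):
        return "reorder" if ctx["stock"] < ctx["reorder_point"] else None

    def rule_stale_price(ctx):
        return "reprice" if ctx["days_since_reprice"] > 30 else None

    RULES = [rule_low_stock, rule_stale_price]

    def decide(ctx):
        for rule in RULES:
            action = rule(ctx)
            if action:
                return action
        return "no_action"

    print(decide({"stock": 40, "reorder_point": 100, "days_since_reprice": 10}))  # reorder

Adding a rule is then a one-line change to RULES, rather than surgery on a nested conditional.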

Let us now consider how we can automate all, some, or one of Simon’s decision process steps. Clearly, even automating one step effectively may be beneficial. It shouldn’t be essential to “go the whole hog” in the name of decision automation; just do what makes business sense and leave the balance to the manager. Further, we can complete the automation project in stages, progressively removing human interaction; this is Phase 5 from the list above.

Assessing status:

Our BI system will tell us the status; if it doesn’t, then fix it. We must have available, in machine readable format, all the business KPIs and metrics that are relevant to the business process, including the state of the controllable variables (Phase 1 of the methodology above). The trick is to manipulate that information so we can compute automatically whether this status is good or bad. Remember, we won’t have a manager to decide this for us if we’re automating. Don’t forget that the status assessing process will often include comments or opinions from real people. Just because we’re automating the decision doesn’t mean that we can’t accept human inputs.

The most common type of human input that impacts the decision automation is a commentary on the validity, accuracy or timing of a KPI or other important metric. And the most common impact such an input has is: Abort Immediately. You don’t want the automated system making decisions with garbage data.

For this reason, the design should include some output of status data for human monitoring purposes.
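
A minimal sketch of this gate, assuming hypothetical KPI and flag structures (the field names and the notification channel are mine, not from any product): a human comment flagging a metric as suspect aborts the run, and the status snapshot is echoed for human monitoring.

    def log_for_monitor(msg):
        print(msg)  # stand-in for a real notification channel

    def assess_status(kpis, human_flags):
        """Return the KPI snapshot for the later decision steps, or None to abort."""
        for name, flag in human_flags.items():
            if flag == "data_suspect":
                log_for_monitor(f"ABORT: KPI '{name}' flagged as suspect by a reviewer")
                return None
        log_for_monitor(f"Status snapshot: {kpis}")  # the human-visible output
        return kpis

    snapshot = assess_status({"gross_margin": 0.31}, {"gross_margin": "data_suspect"})
    assert snapshot is None  # garbage data in, no automated decision out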

When we have the metrics and other input data accessible, we can move to consider automating the Problem Finding step.

Finding problems:

Situations that need a response can often be determined automatically by examining the degree of status goodness or badness. Fortunately, there is a limited number of available techniques for this. They are mostly the same methods we use to alert managers to problems in a non-automated environment. Almost all can be automated by applying business rules, statistical procedures and/or predictive models. The only likely human input is when the CEO, or another potentate, says you have a problem; the problem being self-evident thereafter.

The automatable problem finding techniques I use most often include:

Performance Benchmark Comparison: Compare the important KPIs with benchmarks that make sense from a problem identification viewpoint. Obvious examples include actual versus budget or plan, the previous corresponding period, and best practice. In addition, you can compute all kinds of metrics that relate to performance and compare them across divisions, products, locations, market segments, etc.
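
As a sketch, with invented figures: compute relative variances of each KPI against budget and the previous corresponding period, ready for the alerting step below.

    # Hypothetical KPI values; the variance measure is simple relative deviation.
    actuals = {"sales": 930_000, "margin_pct": 27.0}
    budget  = {"sales": 1_000_000, "margin_pct": 30.0}
    prior   = {"sales": 880_000, "margin_pct": 29.5}

    def variances(actual, benchmark):
        return {k: (actual[k] - benchmark[k]) / benchmark[k] for k in actual}

    print("vs budget:", variances(actuals, budget))  # sales -7%, margin -10%
    print("vs prior: ", variances(actuals, prior))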

Performance Alerting: The next step is to use the above automated comparisons to identify bad, or superior, performance. This normally involves placing relevant metrics on a scale of awful to excellent. It’s a form of sophisticated exception analysis. The need for an action response is usually determined automatically by the assessed position of the metrics on the scale.
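
One way to express that scale in code (the band boundaries are assumptions, and would be tuned per metric):

    # Variance bands from awful to excellent; thresholds are illustrative only.
    BANDS = [(-0.15, "awful"), (-0.05, "poor"), (0.05, "acceptable"), (0.15, "good")]

    def rate(variance):
        for threshold, label in BANDS:
            if variance <= threshold:
                return label
        return "excellent"

    def needs_action(variance):
        return rate(variance) in ("awful", "poor")  # triggers the automated response

    print(rate(-0.07), needs_action(-0.07))  # poor True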

Trend Analysis and Alerting: If no problem is found with the basic performance analysis, it is time to bring in the heavy statistical artillery. Trends of performance metrics, either short or long term, are often good indicators of problems that are not immediately apparent. Alerts based on good or adverse trends that trigger a need for a response are easily automated; current application development software is very sophisticated in this area.
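
A sketch of the artillery at its lightest: fit a least-squares slope to recent observations and alert when the trend is adverse. The series and the threshold are invented.

    import statistics

    def slope(values):
        """Least-squares slope of a series against its index."""
        xs = range(len(values))
        xbar, ybar = statistics.mean(xs), statistics.mean(values)
        num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
        return num / sum((x - xbar) ** 2 for x in xs)

    weekly_margin = [0.31, 0.30, 0.29, 0.27, 0.26, 0.24]
    if slope(weekly_margin) < -0.005:  # adverse trend per period
        print("ALERT: margin trending down")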

Forecasting and Alerting: Even if current metrics are within acceptable bounds, the future may be problematic, and such problems are often better corrected early rather than late. Applying predictive models and then reassessing the adequacy of the forecast critical performance metrics is often valuable, and also relatively easy to automate.
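
A sketch using simple exponential smoothing as a stand-in predictive model; the alpha and the acceptable floor are assumptions, and a real system would use something stronger.

    def ses_forecast(series, alpha=0.4):
        """One-step-ahead forecast by simple exponential smoothing."""
        level = series[0]
        for y in series[1:]:
            level = alpha * y + (1 - alpha) * level
        return level

    demand = [120, 118, 115, 109, 104, 101]
    if ses_forecast(demand) < 110:  # forecast breaches the (assumed) acceptable floor
        print("ALERT: demand forecast below acceptable bound")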

Alerting to Unusual Situations: Time series analysis will often highlight hidden issues, e.g. with changes in customer, supplier, manufacturing or marketing activity. For example, the credit rating of a customer may be altered if the statistical properties of its payment pattern alter significantly.
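
The payment-pattern example as a crude sketch: compare recent days-to-pay against the customer’s history and flag a significant shift. The windows, figures and three-sigma threshold are all invented.

    import statistics

    history = [28, 30, 29, 31, 27, 30, 29, 28]  # days to pay, older invoices
    recent  = [38, 41, 39, 44]                  # days to pay, latest invoices

    mu, sigma = statistics.mean(history), statistics.stdev(history)
    if abs(statistics.mean(recent) - mu) > 3 * sigma:
        print("ALERT: payment pattern has shifted; review credit rating")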

Diagnosing the context:

The scope and necessity for diagnosis in an automated decision environment are limited.

In a non-automated context this is an important part of a BI or decision support system. It involves assisting the human decision maker to understand how bad the problem is, what will happen if no action is taken, and how rapidly disaster will strike.

Normally the automated decision context is operational and relatively simple. I have found that it is often desirable to validate the problem identification procedures specified earlier. Hence, I look for ways to check that the problem is both real and significant enough to warrant automatic rectification action. This could include notifying a human monitor that action is imminent and giving a veto opportunity.
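
A sketch of that validate-then-veto step; the recheck, notify and vetoed hooks are hypothetical stand-ins for real channels, and the timeout is invented.

    import time

    def act_with_veto(problem, recheck, notify, vetoed, veto_window_sec=60):
        if not recheck(problem):  # confirm the problem on fresh data
            return "false_alarm"
        notify(f"Action imminent for {problem}; veto within {veto_window_sec}s")
        time.sleep(veto_window_sec)  # wait for a possible human veto
        return "vetoed" if vetoed() else "action_taken"

    print(act_with_veto("margin_slide", recheck=lambda p: True, notify=print,
                        vetoed=lambda: False, veto_window_sec=1))  # action_taken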

Determining alternatives:

If you’re following the methodology I outlined earlier you will have identified the controllable variables in the target business process. This would have been done in Phase 1. Some suggestions as to potential controllable variables were presented in an earlier post.

Obviously this is a critical step in the design of an automated decision system. However, provided you have done the initial homework on the control levers available for adjusting the performance of the business process, it is easy. It is simply a case of determining which levers to move, whether they move up or down, and by how much.

It may require some modelling work to answer these questions, but most often (I find) a basic table linking variance of performance metric to control adjustment is adequate. Implementing such a specification using modern rule management systems is trivial.
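
That basic table, expressed as code (every band and adjustment figure is invented; a rule management system would hold these declaratively):

    # Variance bands on the performance metric map directly to a control adjustment.
    ADJUSTMENTS = [
        (-0.15, {"lever": "price", "change": -0.05}),  # very poor: cut price 5%
        (-0.05, {"lever": "price", "change": -0.02}),  # poor: cut price 2%
    ]

    def control_action(variance):
        for threshold, action in ADJUSTMENTS:
            if variance <= threshold:
                return action
        return None  # within specification: no adjustment

    print(control_action(-0.08))  # {'lever': 'price', 'change': -0.02}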

Evaluating outcomes:

In the automated decision context this is usually a simple or non-existent step. The rules for determining the alternatives usually imply a certain outcome.

Often only one alternative is available, given the limited number of control variables. If more than one solution option is available, e.g. inadequate sales volume presages either a price decrease or an increased advertising expense, it may require some modelling to determine the best outcome.

Complexity arises when more than one performance metric is out-of-specification. This will usually imply that more than one control variable needs adjustment. There may be interactions between the variables that require arbitration; or we may simply throw in the automation towel and advise a human monitor of the issues.
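
A sketch of that arbitration-or-escalation rule, with invented levers: conflicting directions on the same lever mean a human gets the problem.

    def resolve(proposals):
        """proposals: (lever, change) pairs, one per out-of-spec metric."""
        by_lever = {}
        for lever, change in proposals:
            by_lever.setdefault(lever, []).append(change)
        for lever, changes in by_lever.items():
            if any(c > 0 for c in changes) and any(c < 0 for c in changes):
                return ("escalate", lever)  # conflicting directions: ask a human
        return ("apply", {lev: sum(chs) for lev, chs in by_lever.items()})

    print(resolve([("price", -0.02), ("price", 0.03)]))  # ('escalate', 'price')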

Decision making:

For most decision automation systems the decision is effectively made with the alternative determination, and judgement is not required. If more than one alternative is identified, then an automated assessment of the evaluation determines the decision. Subjective input is usually not relevant or sought. If subjective issues are relevant, then a human assessor is required.
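
Where several alternatives do survive, the automated judgement can be as simple as an argmax over a computed merit figure (the alternatives and merit values below are, of course, invented):

    alternatives = [
        {"name": "cut_price_2pct", "merit": 0.62},
        {"name": "boost_advertising", "merit": 0.71},
    ]

    decision = max(alternatives, key=lambda a: a["merit"])
    print("Decided:", decision["name"])  # boost_advertising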


In a further post I will consider the implementation issues and recap on the overall method, since I’ve been formalizing my thinking as these posts have been created. Please advise, Dear Reader, if you have any comments on the process thus far, especially if you find it helpful or otherwise.


Comments»

1. James Taylor's Decision Management - July 16, 2007

What lies between “gut feel” and magic when it comes to decision-making?

Timo Elliott over at the BI Questions blog had some great cartoons on this topic – this one about “gut-feel” and this one about what executives want from BI. These are both genuinely funny but they also point out a…

