
Memo to Business Analysts: A Compelling Treatise on Why and How Businesses should Automate Decisions December 12, 2007

Posted by Cyril Brookes in Decision Automation, General, Issues in building BI reporting systems.

You may know that fellow blogger James Taylor is the author, with Neil Raden, of a new book on the current hot topic of Automated Decision Making, titled Smart Enough Systems. In it they present a compelling proposition to business intelligence analysts and executives:

Look out for decisions that can be automated in your business;

Automate them and the business will be much better for it.

I suggest that you bring this work to the attention of the more creative managers and professionals in your business.

I have written on decision automation, in particular how to identify candidate decisions, a while back, here and here. You may care to revisit these, Dear Reader.

James and Neil propose that it is practicable to identify decisions that can be automated, and that the subsequent system design path is now both well trodden and amply supplied with technical support. They then give lots of detail on how to do it.

After reading the book, I put a set of questions to James, and here are his responses. I believe you’ll find them interesting.

Q1. On page 1 in the Introduction you say the book comprises two unequal “halves”, the first being general and the second technical. Are you implying that executive readers should read the first “half”, Chapters 1-4, and then move to the implementation proposals in Chapter 9?

A1. I sometimes feel like we had two books, one for executives of any kind and one for technology-focused people, which we had to put inside one set of covers! The introduction did try to guide people to read as you suggest, but it's hard to do that well in a physical book. In general I do think this would be a good approach for a non-technical reader, except that I would also encourage them to read Chapter 8, "Readiness Assessment", and at least skim the stories in the other chapters.

Q2. Much of the book refers to “automating hidden operational decisions” or “micro-decision automation”. Does the EDM approach described in the book only apply to automated decisions, or is it also relevant to partially automated decisions or even to a decision support role for human decision makers?

A2. I recently came across the work of Herbert Simon again and found his classification of problems very helpful:

  • Structured – well understood, repetitive, lend themselves to well-defined rules and steps
  • Unstructured – difficult, ambiguous, no clear process for deciding
  • Semi-structured – somewhere in between

Clearly EDM works particularly well as an approach to handle structured problems, whether you want to automate them completely or automate a large part of the decision while leaving some room for human intervention and participation in the decision making process. I think EDM also has value for the semi-structured problems, especially in areas like checking for completeness or eligibility. At some level EDM solutions blur into decision support solutions but an EDM solution is always going to deliver actions or potential actions not just information.

Q2(a). Does this reply imply that EDM is principally a decision support process or tool, relevant when a problem situation is identified or predictable in detail? If so, alternative BI systems, applications or techniques are required to help executives, professionals and automation systems understand current business status and to find problem situations that require a response (to use a Herbert Simon term); presumably a response determined using EDM principles?

A2(a). Absolutely. I don’t like calling EDM decision support, though, because I find people have a mental model of decision support that can confuse them when thinking about EDM. A problem needs to be relatively well defined and understood to be amenable to automation using EDM. While many of these problems come from process-centric analysis, it is very often the case that more ad-hoc analytics are used to find the problems that offer the best payoff and to see how well an EDM solution is working once it is implemented. In particular the adaptive control aspect of EDM solutions requires good performance monitoring and analytics tools.

Q3. Similarly, the reference in Chapter 5 to Analytics and Predictive Modeling, and in Chapter 7 to Optimization and Operations Research, could imply that EDM has a role in higher level decision support, especially at tactical level. Is this a correct inference?

A3. I don’t think so. I think it is more true that some of the techniques and technologies that work for higher level decision support are also useful in the development of EDM solutions. The mindset though, that of automating decisions not simply helping someone who has to make the decision, is quite different. The different solution type also means the techniques are often applied in quite different ways – producing an equation rather than a picture for example.

Q3(a). Following my earlier theme, and looking ahead to your answer to Question 8: is there a role for EDM in automating the finding and diagnosis of problem situations in the business, perhaps without actually producing the “equation” that will solve it – leaving that part to a human?

A3(a). This is one of the edge conditions for EDM, where the system takes the action of diagnosing something (rather than fixing it), and it is certainly not uncommon. It is often found where the delivery of the action cannot easily be automated. Interestingly, it has proved more effective to let the human provide inputs that rely on human skills, such as entering the mood of a customer, and have the system produce an answer, than to have the system provide options or a diagnosis and leave the human to interpret them.

Q4. IT groups worldwide are committing to SOA as a major part of their strategic plan implementation. You refer to SOA many times in the book, and how EDM and Decision Services are complementary to the concept. To what extent is the implementation of EDM and micro-decision automation generally dependent upon the enterprise having implemented SOA principles?

A4. I don’t think it is dependent, but it is certainly true that companies already adopting SOA will find it much easier to adopt EDM. I also think that companies adopting SOA are more ready for the explicit identification of a new class of solution (a decision service) and more open to adopting new technology to develop such services. I would not be at all surprised if the vast majority of EDM projects were done alongside or following on from SOA initiatives.

Q5. Some years ago there was much discussion about Organizational Learning, the Learning Organization, Designing Learning Systems, etc. Is your approach to Adaptive Control in Chapter 7 related to this, or does it have different underlying purpose and concepts?

A5. To some extent. Part of the value of adaptive control is that it means an organization is committed to constantly challenging its “best” approach to see if it can come up with a better one – either because it knows more or because the environment has changed. In that sense the adaptive control mindset matches that of a learning organization. I also think that the use of business rules as a core technology relates to the learning organization in that it gives those who understand the business a way to actively participate in “coding” how the systems that support that business work.

Q6. The champion/challenger process described in Chapter 7 is depicted as two or three dimensional in the diagrams. Is it more difficult to implement when there are several alternative control variables? Are there project selection and management criteria that will help ensure champion/challenger successful approaches?

A6. It is much easier to implement adaptive control and champion/challenger when there are clear and well defined metrics with which to measure the relative value of different approaches. If the value of an approach is a complex mix of factors then it will be harder to compare the two and pick the “winner”. Without a champion/challenger approach, of course, you are even worse off, as you don’t even know how those alternatives might have performed.
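A minimal sketch may make the champion/challenger idea concrete. Everything here is hypothetical (the strategies, the 10% traffic split, the profit-per-decision metric); the point is only that a small share of decisions is routed to the challenger and the two are then compared on one well-defined metric:

```python
import random

def route(transaction, champion, challenger, challenger_share=0.1):
    """Send most traffic to the champion, a small sample to the challenger."""
    strategy = challenger if random.random() < challenger_share else champion
    return strategy(transaction), strategy.__name__

# Hypothetical strategies: each returns an approve/decline decision.
def champion(t):
    return t["score"] >= 600

def challenger(t):
    return t["score"] >= 580 and t["debt_ratio"] < 0.4

def pick_winner(results):
    """After a trial period, compare on one clear metric (e.g. profit per
    decision) and promote the strategy with the better average."""
    means = {name: sum(v) / len(v) for name, v in results.items()}
    return max(means, key=means.get)
```

The single metric is what makes the comparison decidable; with a complex mix of factors the `pick_winner` step is exactly where the difficulty appears.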

Q7. You have clearly and comprehensively outlined the case for automating micro-decisions in Chapters 1 to 3. This argument ought to be compelling for many executives. However, there are many options for implementing rule based, model based and adaptive control based systems using the technologies in later chapters. Is it practicable to describe other implementation procedures introduced in Chapters 8 and 9, possibly via the book’s Wiki?

A7. One of the big challenges when writing the book was that many of the component technologies and approaches have value in other contexts besides that of decision management. Rules and analytics can both, for instance, improve a business process management project. There are so many of these that I don’t think even the book’s wiki would be able to handle it. We are engaged in some interesting research around decision patterns and a decision pattern language. I think this is an interesting area – identifying and describing the implementation patterns for decisions where decision management makes sense.

Q8. It appears from your discussion in Chapter 9 that the key to successfully implementing EDM in a “Greenfield” site is the selection of the initial projects. You propose selecting two applications; one rule based, and then a second that is predictive model based. This sounds sensible, but are there alternative project selection methods that might be applied? Other examples could include:

  • Partial automation of more complex, but valuable, decisions; e.g. using rules or models to find problem situations without solving them in Phase 1, with later phases implementing the automation fully?
  • Analyzing the informal “know-how” knowledge base of key professionals to determine if an initial project can be built by capturing and encoding their knowledge?
  • Use industry best practice reports to identify enterprise deficiencies that may be rectified using EDM?

A8. Chapter 9 was hard to write because different customers have succeeded in different ways. The way we outline was the one that seemed like the most likely to work overall. Each individual company may find that a different approach works better for them. Companies reluctant to fully automate decisions, for instance, may well find the first of your examples to be very useful in getting them more comfortable with the idea of automation. Identifying problem areas using best practice and driving to fix those would also be very effective, though it might well be implemented using a first rules project and a first analytic project as we suggest.

In general I don’t think that starting with know-how and working out is a good approach, however. Our experience is that you need to have the decision identified and “grab it by the throat” to successfully adopt EDM. Decision first. Always.


Implementing Decision Automation BI Projects July 13, 2007

Posted by Cyril Brookes in Decision Automation, General, Issues in building BI reporting systems.

The feedback on my three earlier posts on specification guidelines for automated decision capability in BI systems has been both positive and heartening. My objective has been to show how these BI projects for operational business processes may be built relatively simply, and to generate enthusiasm for this among the legions of business analysts. You can indeed try this at home!

This post summarizes the major issues that received favorable comment and then deals briefly with profitability, feasibility and implementation techniques for these systems. It concludes the series of (now) four posts on decision automation for BI that commenced here.

I haven’t attempted to place this subject in its context, or to cite various examples of success or otherwise. Thomas Davenport, Jeanne Harris, James Taylor and Neil Raden have done this comprehensively in their recent articles and books.

You may recall, Dear Reader, the underlying principles for this methodology are:

Selecting the right targets is critical to success:

Doing the right thing is much better than doing things right (thanks Peter Drucker!). My prescription is to avoid trying to pick winners from the various business processes that may be candidates for BI automated decisions. Rather, we should look to a different set of candidates as the starting point.

Identify the controllable variables – the levers that are adjustable in the business process that can shift performance towards improvement

These are easy to pick. There are relatively few options available in most businesses, variables like changing prices, adjusting credit ratings, buying or selling stock or materials, approving or rejecting a policy proposal, etc. A more complete discussion is in my second post on this subject.

Only consider automated decision making BI systems where controllable variables exist

This is a no-brainer, I guess. It’s only possible to automate when automation is possible. If we can’t control the process when there’s a problem, because nothing can be done (e.g. we can’t raise prices if all customers are on fixed contracts), then let’s not start automating.

Segment the design processes into logical sub-projects so the project doesn’t run away uncontrolled

I suggest in the third post that the Herbert Simon style decision process elements are an effective segmentation. This allows focus on (say) finding problems and then on deriving the relationship between adjusting a control variable and the resultant outcomes.

Enough of a recap: here are some basic suggestions for project management.

Implementation of a decision automation project is always tricky. In most cases it is not possible to “parallel run” the new with the old, since only one option can be tried at a time in real life, and longitudinal comparison is not reliable.

I suggest that an iterative implementation is therefore appropriate. It should incorporate feasibility and profitability analyses as well.

Referring to the more detailed methodology in the third post:

  • Build the status assessment and problem finding sections first, and leave the action part to the management.
  • Then design the diagnosis and alternative selection modules and instruct a human manager what to do (always leaving the override option, of course). This is simple as long as there is only one controllable variable available for the business process and only one KPI metric, or a set of related KPIs and metrics, that is out of specification, hence signifying the problem. If there is more than one of these, then it can (almost certainly will) become complex. Certainly it’s achievable, but there’s a good deal of modeling and use case analysis required that is beyond the scope of a blog post.
  • Finally, link the alternative action chosen to the automatic adjustment of the control variable(s) and you’re home free.
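The steps above can be sketched as a single loop. All the names here are illustrative (the rule, the action label, the veto hook); the shape is what matters: find the problem, give the human monitor a chance to veto, then adjust the control variable:

```python
def run_cycle(metrics, rules, apply_action, human_veto=None):
    """One iteration of the automated decision loop (illustrative only).

    metrics:      dict of current KPI values
    rules:        list of (predicate, action) pairs, checked in order
    apply_action: callable that adjusts the control variable
    human_veto:   optional callable; returning True aborts the action
    """
    for predicate, action in rules:
        if predicate(metrics):
            if human_veto and human_veto(metrics, action):
                return "vetoed"
            apply_action(action)
            return action
    return None  # no problem found, nothing to do

# Hypothetical single-lever example: cut price if sales volume is low.
rules = [(lambda m: m["volume"] < 900, "reduce_price_2pct")]
```

The `human_veto` hook is the override option from the second bullet; dropping it, once confidence is established, completes the final step.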

I hope, Dear Reader, you’ve been infected in some small way with the enthusiasm I have for automated decisions in BI applications. In many ways they are the most satisfying aspects of Business Analyst work, since you get to design the system, and then get to see it perform. Working on high level strategic projects is often more intellectually challenging, but you rarely get to have full closure, it’s the executive users who have that pleasure, often long after you’ve left the scene.

Decision Automation in BI: Design Guidelines for Business Analytics and Rules June 18, 2007

Posted by Cyril Brookes in Decision Automation, General, Issues in building BI reporting systems.

Authors routinely ignore the specifics of designing automated decision components for BI systems. I guess this is often because they believe these details are application dependent. However, I believe that there can be, and should be, more rigor in the specification of the business analytics, rules and predictions that underlie these designs and specifications. Generalizations can only take us so far; sooner or later we have to get down and dirty.

In this, my third recent post that discusses decision automation, I offer some guidelines that can provide the requisite structure; but you, Dear Reader, can judge for yourself, as always.

To recap, hopefully avoiding the need for you to read my recent stuff, it is my hypothesis that the project selection and specification of BI systems with decision automation incorporated should follow five steps as below. Earlier posts have considered the overall issue (here) and Phases 1 through 3 (here).

  1. Identify the controllable business variables in your business environment
  2. Determine the business processes, and relevant decisions, that impact those controllable variables
  3. Identify the BI systems that support the business processes and decision contexts selected in Phase 2
  4. Design the business analytics that are the basis of the decision automation: business rules, predictive analyses, time series analyses, etc. wherever Phase 3 indicates potential value
  5. Evaluate feasibility and profitability of implementing the analytics created in Phase 4

This post covers Phase 4, arguably the most interesting from a technical viewpoint.

It is axiomatic, I believe, that we should now revisit the steps in the decision making process that are to be automated. Each step requires a different style of automation and offers distinct benefits; some steps are more complex than others.

Drawing on Management 101, with Herbert Simon as our mentor, we know that the universal key steps in the decision making process, be it manual or automated, high level strategic or low level operational, are:

  • Measuring and assessing the business process status: Where are we? Is it good or bad? What are people saying about us?
  • Finding “problems”: i.e. situations (including opportunities) that need a response, out of specification KPIs, adverse trends or predictions, unusual circumstances, people telling us we have a problem!
  • Diagnosing the problem context: i.e. how bad is it, what happens if nothing is done, has this happened before, what happened then, etc.?
  • Determining the alternatives for problem resolution: i.e. what did we do last time, what are people suggesting we do?
  • Assessing the consequences of the outcomes from each alternative: i.e. predictive modelling, computing merit formulae, what happened after previous decisions in this problem area?
  • Judging the best perceived outcome, and hence making the decision: i.e. comparing the merit indicators, accepting or rejecting people’s opinions.

Most of you, Dear Readers, will know all this, but we need to be sure we are all on the same page. Otherwise, confusion reigns, as indeed it does in several of the recent articles on this subject, especially the marketing hype ones.

Our objective with BI system design is to enable improved business process performance. Our primary channel to do this is to collect and report information that supports management decision making. Apart from passively dumping a heap of facts and figures on a screen, we know we can empower improved decision making in two ways:

  • Create action oriented BI systems by presenting the information in a pre-digested way that highlights good and bad performance and spurs the executive to react appropriately. Dashboards and scorecards are obvious examples of how we can do this. I proposed in earlier posts some general design principles for summarization and drill-down specification. OR
  • Make decisions automatically as part of the BI system, adjusting the controllable parameters of the business process without reference to a manager.

We’re focusing here on the second option. It’s not that this is a new idea; we’ve been doing it for decades. But the business analytics, rule management and prediction software now available makes the whole process much easier than when we had to rely on IF…AND…THEN…ELSE statements to make it all work. James Taylor’s work is a reference that shows the scope of what’s available.

Let us now consider how we can automate all, some, or one of Simon’s decision process steps. Clearly, even automating one step effectively may be beneficial. It shouldn’t be essential to “go the whole hog” in the name of decision automation; just do what makes business sense and leave the balance to the manager. Further, we can complete the automation project in stages, progressively removing human interaction; this is Phase 5 from the list above.

Assessing status:

Our BI system will tell us the status; if it doesn’t, then fix it. We must have available, in machine readable format, all the business KPIs and metrics that are relevant to the business process, including the state of the controllable variables (Phase 1 of the methodology above). The trick is to manipulate that information so we can compute automatically whether this status is good or bad. Remember, we won’t have a manager to decide this for us if we’re automating. Don’t forget that the status assessing process will often include comments or opinions from real people. Just because we’re automating the decision doesn’t mean that we can’t accept human inputs.

The most common type of human input that impacts the decision automation is a commentary on the validity, accuracy or timing of a KPI or other important metric. And the most common impact such an input has is: Abort Immediately. You don’t want the automated system making decisions with garbage data.
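A minimal sketch of this abort-on-suspect-data check, with hypothetical flag values, might look like:

```python
def status_ok(kpis, flags):
    """Abort automation if any human-entered validity flag marks a KPI suspect.

    kpis:  dict of metric name -> current value
    flags: dict of metric name -> human commentary flag (e.g. "suspect")
    """
    suspect = [name for name in kpis if flags.get(name) == "suspect"]
    if suspect:
        # Hand control back to a human monitor rather than decide on bad data.
        return False, suspect
    return True, []
```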

For this reason, the design should ideally include some output of status data for human monitoring purposes.

When we have the metrics and other input data accessible, we can move to consider automating the Problem Finding step.

Finding problems:

Situations that need a response can often be determined automatically by examining the degree of status goodness or badness. Fortunately, there are a limited number of available techniques for this. They are mostly the same methods we use to alert managers to problems in a non-automated environment. Almost all can be automated by applying business rules, statistical procedures and/or predictive models. The only human input likely is when the CEO, or another potentate, says you have a problem; this being self-evident thereafter.

The automatable problem finding techniques I use most often include:

Performance Benchmark Comparison: Compare the important KPIs with benchmarks that make sense from a problem identification viewpoint. Obvious examples include: actual versus budget, plan; previous corresponding period, best practice, etc. In addition, you can compute all kinds of metrics that relate to performance and compare them across divisions, products, locations, market segments, etc.

Performance Alerting: The next step is to use the above automated comparisons to identify bad, or superior, performance. This normally involves placing relevant metrics on a scale of awful to excellent. It’s a form of sophisticated exception analysis. The need for action response is usually determined automatically by the assessed position of the metrics on the scale.
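The benchmark comparison and alerting scale might be sketched as follows. The variance bands are purely illustrative; real thresholds come from the business:

```python
def variance_pct(actual, benchmark):
    """Signed percentage variance of actual against a benchmark
    (budget, plan, previous corresponding period, best practice...)."""
    return 100.0 * (actual - benchmark) / benchmark

def alert_band(pct):
    """Place a variance on a coarse awful-to-excellent scale.
    Thresholds here are hypothetical."""
    if pct <= -20: return "awful"
    if pct <= -5:  return "poor"
    if pct < 5:    return "acceptable"
    if pct < 20:   return "good"
    return "excellent"
```

The need for an action response is then just a test on the band, e.g. trigger whenever a KPI lands in "awful" or "poor".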

Trend Analysis and Alerting: If no problem is found with the basic performance analysis, it is time to bring in the heavy statistical artillery. Trends of performance metrics, either short or long term, are often good indicators of problems that are not immediately apparent. Alerts based on good or adverse trends that trigger a need for a response are easily automated. Current application development software is very sophisticated.

Forecasting and Alerting: Even if current metrics are within acceptable bounds, the future may be problematic, and such problems are best corrected early rather than late. Applying predictive models and then reassessing the adequacy of the forecast critical performance metrics is often valuable, and also relatively easy to automate.
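Both trend alerting and forecast alerting can be sketched with a simple least-squares slope. A real system would use a proper forecasting package, so treat this as illustrative only:

```python
def slope(series):
    """Least-squares slope of equally spaced observations."""
    n = len(series)
    mean_x = (n - 1) / 2.0
    mean_y = sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(series))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def forecast_alert(series, periods_ahead, lower_bound):
    """Project the linear trend forward; alert if the forecast breaches the bound."""
    n = len(series)
    m = slope(series)
    intercept = sum(series) / n - m * (n - 1) / 2.0
    forecast = intercept + m * (n - 1 + periods_ahead)
    return forecast < lower_bound, forecast
```

A trend alert is simply a test on `slope(series)` directly (e.g. an adverse slope beyond some tolerance), while the forecast alert tests the projected value.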

Alerting to Unusual Situations: Time series analysis will often highlight hidden issues, e.g. with changes in customer, supplier, manufacturing or marketing activity. For example, the credit rating of a customer may be altered if the statistical properties of its payment pattern alter significantly.
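The payment-pattern example might be sketched as a simple outlier test against the customer's own history; the z-score threshold is an illustrative assumption:

```python
import statistics

def payment_pattern_shift(history, recent, z_threshold=3.0):
    """Flag a customer whose recent payment delay (in days) is a statistical
    outlier against their own history. Threshold is hypothetical."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    z = (recent - mu) / sigma
    return abs(z) > z_threshold, z
```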

Diagnosing the context:

The scope for, and necessity of, diagnosis in an automated decision environment is limited.

In a non-automated context this is an important part of a BI or decision support system. It involves assisting the human decision maker to understand how bad the problem is, what will happen if no action is taken, and how rapidly disaster will strike.

Normally the automated decision context is operational and relatively simple. I have found that it is often desirable to validate the problem identification procedures specified earlier. Hence, I look for ways to check that the problem is both real and significant enough to warrant automatic rectification action. This could include notifying a human monitor that action is imminent and giving a veto opportunity.

Determining alternatives:

If you’re following the methodology I outlined earlier you will have identified the controllable variables in the target business process. This would have been done in Phase 1. Some suggestions as to potential controllable variables were presented in an earlier post.

Obviously this is a critical step in the design of an automated decision system. However, provided you have done the initial homework on the control levers available for adjusting the performance of the business process, it is easy. It is simply a case of determining which levers to move, whether they move up or down, and by how much.

It may require some modelling work to answer these questions, but most often (I find) a basic table linking the variance of a performance metric to a control adjustment is adequate. Implementing such a specification using modern rule management systems is trivial.
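Such a variance-to-adjustment table might be sketched as data plus a lookup. Every band and adjustment below is hypothetical:

```python
# A basic table linking the variance of a performance metric (here, sales
# volume against plan) to a control adjustment (a price change).
ADJUSTMENT_TABLE = [
    # (variance lower bound %, variance upper bound %, price change %)
    (-100.0, -15.0, -5.0),   # volume far below plan: cut price 5%
    (-15.0,  -5.0,  -2.0),   # moderately below: cut price 2%
    (-5.0,    5.0,   0.0),   # within tolerance: no change
    (5.0,   100.0,   2.0),   # above plan: nudge price up 2%
]

def lookup_adjustment(variance_pct, table=ADJUSTMENT_TABLE):
    """Return the control adjustment for a variance, or None if it falls
    outside the table (refer to a human in that case)."""
    for low, high, change in table:
        if low <= variance_pct < high:
            return change
    return None
```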

Evaluating outcomes:

In the automated decision context this is usually a simple or non-existent step. The rules for determining the alternatives usually imply a certain outcome.

Often only one alternative is available, due to the shortage of control variables. If more than one solution option is available, e.g. inadequate sales volume presages either a decreased price or increased advertising expense, it may require some modelling to determine the best outcome.

Complexity arises when more than one performance metric is out-of-specification. This will usually imply that more than one control variable needs adjustment. There may be interactions between the variables that require arbitration; or we may simply throw in the automation towel and advise a human monitor of the issues.

Decision making:

For most decision automation systems the decision is effectively made with the alternative determination, and judgement is not required. If more than one alternative is identified, then an automated assessment of the evaluation determines the decision. Subjective input is usually not relevant or sought. If subjective issues are relevant, then a human assessor is required.


In a further post I will consider the implementation issues and recap on the overall method, since I’ve been formalizing my thinking as these posts have been created. Please advise, Dear Reader, if you have any comments on the process thus far, especially if you find it helpful or otherwise.

Building Automated Decision Making into BI System Design – A Methodology Overview May 18, 2007

Posted by Cyril Brookes in BI Requirements Definition, Decision Automation, General, Issues in building BI reporting systems.

Automated decision making for business is currently flavor of the month. Most emphasis has been on automating business analytics, for example underwriting in the insurance industry and stock market program trading. But there are ample opportunities for incorporating automation in more conventional BI systems, especially corporate performance management, where there has so far been little discussion.

Tom Davenport’s recent work on business analytics has been widely reported and commented upon. The consultants and software marketers are circling the wagons.

To highlight opportunities and stimulate discussion among BI analysts this post explores how relevant BI system targets for automation might be identified.

Most BI analysts see their role as designers of systems to support management decision making through effective presentation of information. That is, of course, commendable and important. But is that all there is? That focus doesn’t preclude building automated decision making systems if the context is suitable. It’s just that it isn’t done often. We seem to be reluctant to try and replace managers, maybe it’s because they are our bread and butter?

There are three generally accepted classes of decisions in business; operational, tactical and strategic. It’s pretty obvious that automatic decision making is almost always associated with operational, and perhaps some tactical, contexts. If it’s strategic, then forget it. Since many BI environments serve a mix of strategic and operational users, the prevailing focus is almost always on information presentation, rather than active replacement of human decision makers.

This discussion reminds me of a 25 year prediction from a long forgotten business journal article of the 1960s: “Boards of Directors will be retained for sentimental reasons; computers will make all the decisions….” It didn’t happen, and won’t. A similar, but contrary, forecast appeared in the HBR of June 1966: “A manager in the year 1985 or so will sit in his paperless, peopleless office with his computer terminal and make decisions based on information and analyses displayed on a screen…” There still seem to be a lot of executive assistants around!

My intention with this post is to suggest a methodology or process which demonstrates how BI analysts can effectively and efficiently identify opportunities beyond the passive aim of information presentation. Even if the resulting design only partially automates decision making, it is likely to be a better, more effective solution than its passive counterpart, simply because it will be the result of a more creative and challenging design process.

In the current spate of articles there are many examples of apparently successful automated business process systems. While these may whet the appetite of a designer, they are not, in my view, useful guides when the task of synthesising a BI system incorporating automated decisions is being undertaken. When your child is given his or her first bicycle, pointing at someone cycling down the street isn’t much help in teaching them to ride. Hands-on synthesis is needed. Big pictures may create envy, but they don’t instruct much.

I suggest that it will be worthwhile for a BI analyst and executive team to review the corporate BI environment, existing and planned, and assess the potential for including automated decision making in the BI systems supporting each business segment.

Further, such a review should use a project planning method which segments activities into several bite sized Phases. Here’s a suggested outline, with more detail on each Phase to follow.

Phase 1: Identify the controllable business variables in the target businesses, ignoring specific business processes

Most articles on automated decision making start with the business process and BPM analyses. I think this is the wrong initial focus. To me, the optimal review starting point is to identify the control parameters of typical business processes that are amenable to automatic adjustment. The number of business process control “levers” available to management is finite, quite small in fact, and the number that might be controlled automatically, with profit, is even smaller. Examples include: Automatic pricing adjustment, dynamic production scheduling, staff re-assignment.

A more complete discussion on identifying control variables follows in a later post. It is, I believe, the most important part of project selection and specification. Get this wrong and you will certainly miss out on the best opportunities.

Phase 2: Identify potential business processes, existing or planned, that utilize one or more of these candidate control parameters and may benefit from automation

The same control variables are likely to appear in multiple business processes. For example, automatic price adjustment could impact BI systems supporting Order Entry, Production Scheduling, CRM, Inventory Management, etc.

Phase 3: Identify components of the candidate BI systems that may profitably incorporate automated decision making

Management 101, since Herbert Simon’s day, tells us that there is a defined decision making process, with several component steps between becoming aware of a problem or opportunity, and deciding what action to take. Automating the decision process clearly requires that one or more of these steps should be performed without reference to a human.

It is relatively easy to consider each of these decision process components in turn, to determine the extent to which it/they can be automated. My later post will give more detail if you are interested, Dear Reader.

Phase 4: Design the business analytics; business rules, predictive analysis, time series analysis wherever Phase 3 indicates potential utility

This is the fun part. The software tools for business rules management are much improved since I first started playing with IF…AND…THEN…ELSE statements as the basis for automation, as are the forecasting and statistical analysis packages.

I leave it to you to work out the details, as they are always application dependent. But always be aware that rules change, sometimes quickly, so dynamic management, or decision making agility if you will, is important. Enjoy.
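One way to keep rules changeable, and so preserve decision making agility, is to hold them as data rather than code. This sketch is illustrative only; the metric names, operators and actions are all hypothetical, and a real rules management product would add authoring, versioning and governance on top:

```python
# Rules kept as data (here a list of dicts; in practice a rules repository)
# so the business can change them without redeploying code.
RULES = [
    {"when": {"metric": "stock_cover_days", "op": "<", "value": 10},
     "then": "raise_reorder_alert"},
    {"when": {"metric": "stock_cover_days", "op": ">", "value": 90},
     "then": "flag_excess_stock"},
]

OPS = {"<": lambda a, b: a < b, ">": lambda a, b: a > b}

def evaluate(metrics, rules=RULES):
    """Return the actions fired by the current metric values."""
    fired = []
    for rule in rules:
        cond = rule["when"]
        if OPS[cond["op"]](metrics[cond["metric"]], cond["value"]):
            fired.append(rule["then"])
    return fired
```

Because `RULES` is plain data, updating a threshold is an edit to the repository, not a code change, which is exactly the dynamic management the text calls for.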

Also, note that Phase 4 will be an iterative process, with frequent Phase 5 reviews to ensure that business sense prevails, limiting the scope for white elephant projects; even though they can be fun.

Phase 5: Evaluation and feasibility reviews of the costs and benefits of automated decision making components within the BI system

Try not to let the excitement of creating rules and embedding predictive analytics in a BI system carry you away; well only a little bit anyway! To me, this is one of the most interesting and absorbing roles of being a BI analyst and designer; certainly it beats specifying reports.

Building automation into BI is highly recommended, especially if you are looking for a challenge!