
Personalization in BI: Selective Dissemination and Targeted Retrieval of Important Information January 30, 2007

Posted by Cyril Brookes in General, Issues in building BI reporting systems, Stafford Beer, Tacit (soft) information for BI.

Personalization in BI grows in significance with the near universal recognition that passive reporting, designed for the masses by supposed experts, is limited in its utility. Action oriented reporting is preferable; it always has been. However, many business analysts do not recognize that selective dissemination of information, aka personalization, is a pre-requisite if reporting is to stimulate action. Only specific people can or need to take action, and common sense tells us that they must be targeted.

See, for example, the article by Neil Raden.

Two sorts of personalization apply in a BI context:

  • Push
  • Pull

Each can be applied to external or internal recipients.

The focus of this blog is on selectively pulling information, predominantly by internal people. This is the principal aim of action oriented BI: directing valuable information (and only valuable information) to the executives and professionals who assess a situation and/or take action as a result. This is not to say that other aspects of BI, such as keeping people informed as to the status of the business, should be ignored. But these objectives are far less vital than supporting executive actions.

I leave discussion on the much more fraught selective information push situation to others. Determining what information will be of interest to a customer or supplier and pushing this category of stuff to them can be a valuable marketing tool, or (more likely IMHO) a PR disaster. We always hear of Amazon.com and its success with cognate book promotions, but books are easily categorized in a universally accepted manner; most other items are not so easily classified, and the implications of inappropriate information push can be dysfunctional.

Any discussion on the effectiveness of BI for improving the quality of executive decisions (and what other purpose might it have?) must have regard to the actual decision making process. The theory of this process is well established, notably by Herbert Simon. Many researchers have also considered the relationship between this process and the information required for its operation. In this context I particularly value the work of Stafford Beer and Henry Mintzberg.

Information that enables effective decision making belongs to one of two categories, and both are essential if decisions are to be optimal:

  • It helps the executive find problems and opportunities – situations that need a response. Stafford Beer calls this Attenuation information, and I have discussed this in detail earlier.
  • It helps the executive solve problems he/she has found (or been told about). Beer calls this Amplification information, also discussed in this blog earlier.

But I diverge.

Returning to the personalization theme: selective dissemination is vital in the problem finding context.

Obvious candidates include:

  • Alerting the executive to important exceptions, out-of-specification performance, unusual situations, adverse forecasts of key indicators, and unacceptable (or advantageous) trends.

  • Equally, if not more, critical is soft information (opinions, comments, assessments, etc.) that portends problems, or throws doubt on the accuracy of factual information.
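To make the first of these bullets concrete: alerting on out-of-specification performance and adverse trends reduces to simple rule checks over KPI readings. A minimal sketch, in which all KPI names, limits, and readings are invented for illustration:

```python
# Minimal sketch of exception alerting over KPI readings.
# All KPI names, specification limits, and readings are hypothetical.

def alerts(readings, specs):
    """Return alert messages for readings outside their spec range,
    plus a crude downward-trend warning."""
    out = []
    for kpi, values in readings.items():
        lo, hi = specs[kpi]
        latest = values[-1]
        if not (lo <= latest <= hi):
            out.append(f"{kpi}: {latest} outside range [{lo}, {hi}]")
        # crude trend check: three consecutive falls
        if len(values) >= 3 and values[-3] > values[-2] > values[-1]:
            out.append(f"{kpi}: downward trend")
    return out

readings = {"on_time_delivery_pct": [96, 93, 91], "scrap_rate_pct": [2.1, 2.3, 2.2]}
specs = {"on_time_delivery_pct": (92, 100), "scrap_rate_pct": (0, 4)}
print(alerts(readings, specs))
```

In practice the rules would be richer (forecasts, seasonality), but the principle is the same: the rule fires for a specific KPI, and hence for the specific person accountable for it.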

Targeted information retrieval is also vital to support problem solving.

  • The solution process that needs supporting includes diagnosis of the severity of the problem (what will happen if nothing is done), identifying possible alternatives and assessing their implications.
  • During a decision making process executives must be able to retrieve important, valuable, information as distinct from the routine stuff. This applies to both factual and soft (tacit) information. In this context, the latter includes ideas about problem implications, suggestions for potential solution alternatives and recollections about what we did last time this happened.

The key word in both these situations is “importance”.

Alerting to, and targeted retrieval of, useful information implies that some assessment must be made of the significance of a data item, either using an automated rule system, or a personal assessment.

Truly this is the stuff of business intelligence. Without importance classification all information is equal, but obviously this is not reality.

Selective dissemination and targeted retrieval, the basis of all personalization, depend therefore on the BI context being able to distinguish information importance as well as its subject, topic, or data class.

Importance, in turn, depends on two characteristics: urgency and value to the business.
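As a hedged sketch of how such an importance assessment might be automated from those two characteristics – the scales, weights, and cut-offs below are my own invented examples, not a prescription:

```python
# Sketch of importance classification combining the two characteristics
# named above: urgency and value to the business.
# Scales (0-10), weights, and thresholds are all hypothetical.

def importance(urgency, value, w_urgency=0.5, w_value=0.5):
    """Both inputs on a 0-10 scale; returns a 0-10 importance score."""
    return w_urgency * urgency + w_value * value

def classify(item):
    """Route an information item by its importance score."""
    score = importance(item["urgency"], item["value"])
    if score >= 7:
        return "alert"      # push to the targeted executive now
    if score >= 4:
        return "review"     # available for targeted retrieval
    return "routine"        # ordinary reporting

item = {"name": "major customer credit downgrade", "urgency": 9, "value": 8}
print(classify(item))
```

A human assessor can override the computed score; the point is only that without some such classification, as the post argues, all information is treated as equal.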

I have experimented over 20 years with different retrieval/alerting procedures for corporate BI systems, using both automated and human importance assessment. I’ll detail this experience in the next post.


BI System Design incorporating Wiki and other Web 2.0 Components January 10, 2007

Posted by Cyril Brookes in General, Issues in building BI reporting systems, Tacit (soft) information for BI.

Suddenly collaboration is the flavor of the month, or year anyway. Customers are re-designing products, buyers are guiding the choice of other buyers, repair and service people are specifying work procedures, those with spare time are collaborating in wikis on everything; and maybe the lunatics are running the asylum? But what of Business Intelligence systems – what is their place in all this?

I’ve been preaching the utility of collaboration as an essential element of BI for 20 years now. Maybe it’s going to happen at last?

But, even with the current enthusiasm, it won’t happen at YouTube speed. Collaborative BI is more complex than just loading some videos or other data into a category for others to retrieve. And, supposedly, corporate people don’t have time to surf the Intranet, let alone the Web, looking for relevant stuff.

Therefore, it will take time; and remember the old timer’s adage: “You can tell the pioneers; they’re the ones with the arrows in their backs!”

Here’s a set of axioms that I believe are relevant if we are to succeed in this collaborative endeavor, all of which raise barriers, some large, some not, depending on the business environment:

  • Corporate people who come across Web 2.0 style intelligence often don’t know its value, and whom to tell
  • They usually only have part of the story anyway
  • They often lack the background to be able to assess implications
  • Supplying intelligence to Web 2.0 style repositories or applications is time consuming, and may not be at all rewarding to the author, only to others
  • Intelligence can’t be searched for, or be subject to push messaging and alerting, if it’s not categorized
  • Categorization must conform to a corporate standard vocabulary, or it will not facilitate sharing and collaboration
  • All BI items indexed by a category are not of equal significance or value; some may be critical to the business, others routine news that’s already well known
  • The high value items ought to be separated from the dross and given a wider audience, or personalization will be ineffective; but how to do this?
  • BI is useless unless the recipient can assess its implications, and often this requires additional input of BI, or experience, from other people or sources – the collaboration imperative
  • Corporate collaboration raises infinitely more cultural and behavioral red-flags than Web 2.0 practitioners could dream of; see earlier post.
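The categorization and vocabulary points in the list above can be sketched in a few lines. This is a hedged illustration only – the vocabulary terms, the item, and the subscriptions are all hypothetical:

```python
# Sketch: categorizing intelligence items against a corporate standard
# vocabulary, so they can be searched and matched to alerting
# subscriptions. All terms, items, and subscribers are invented.

VOCABULARY = {"pricing", "supply-chain", "competitor", "regulation"}

def categorize(item, terms):
    """Attach only terms that exist in the standard vocabulary;
    reject non-standard terms, which would defeat sharing."""
    unknown = set(terms) - VOCABULARY
    if unknown:
        raise ValueError(f"non-standard terms: {sorted(unknown)}")
    item["categories"] = set(terms)
    return item

def subscribers_for(item, subscriptions):
    """Alert everyone whose subscribed terms overlap the item's categories."""
    return [who for who, terms in subscriptions.items()
            if terms & item["categories"]]

note = categorize({"text": "Rival announced a 10% price cut"},
                  ["competitor", "pricing"])
subs = {"sales_vp": {"pricing"}, "plant_mgr": {"supply-chain"}}
print(subscribers_for(note, subs))
```

The hard part, of course, is not the mechanism but agreeing and maintaining the vocabulary itself, and persuading contributors to use it.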

Nonetheless, I’m sure that we’ll see increasingly effective means for accommodating the issues raised by the above points. I have suggested some design principles in another earlier post, but the rapid evolution of wiki style knowledge creation, with the attendant blog explosion, is opening up new opportunities.

I believe that the issue of bringing new knowledge to the attention of the right people, personalized distribution as the knowledge is created, will remain a substantial BI issue. My ideas on this will be the subject of a later post.

Drilldown specification in BI – it’s harder than the demo makes out August 27, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems, Stafford Beer.

We’ve all sat through those glitzy demonstrations of canned detailed reporting – solving hypothetical problems that may never exist in practice, or if they have existed, will never be repeated in the same form. 

Drilldown is, in my opinion, one of the hardest aspects to specify and implement effectively in a BI reporting system.  Unfortunately, it is also probably the most hyped “silver-bullet” keyword of BI software marketing speak. 

You cannot build a detailed report for a problem situation that doesn’t yet exist.  Even if the type of problem, say a debtor’s default, is predictable, the specifics will always vary.  You can only create an environment to make such reporting easier. 

Further the nature of that reporting environment is often complex, involving hypothetical database specification and modelling capability – as I will discuss in a later post. 

Drilldown is, of course, a modern descriptor for part of Stafford Beer’s Amplification concept. 

In my posts of July 7 and July 28, I introduced and discussed Stafford’s distinction between Attenuation and Amplification reporting in the corporate BI context. In summary:  Attenuation reporting is pre-specified and pre-formatted reporting of information that empowers executives:

  • To be aware of, and able to assess the implications of, the current state of the business, and
  • To be alerted to actual or potential unusual or unacceptable situations. 

Or more colloquially:  Where are we?  What is good and bad about where we are?  What is unusual or forecast that I need to know about? 

Therefore, Attenuation style reporting makes executives comfortable they know what is happening in their part of the business, and that they are alerted to any problems that can be discovered from the data available.  Amplification reporting is more difficult to pin down, since it is only required when a problem, or apparent problem, has been identified. 

If Attenuation is about finding problems, Amplification is about solving them.  

By this definition, then, Amplification reporting – or Drilldown if you prefer the current term – is difficult to pre-specify or pre-format because we cannot be sure of the exact nature and context of the problem.  

In the good old/bad old days of this technology, we used the terms Executive Information Systems for the first type of reporting, and Decision Support Systems for the second.  Now it’s all encapsulated in the BI terminology.   The names change, but the issues remain the same. 

But I diverge. 

Drilldown has three basic objectives, as I see it, drawing from classic decision theory – as per, say, Herbert Simon.  It is to empower our client executives to:

Diagnose the problem/opportunity we have helped them find (using Attenuation reporting).  How bad is it?  What will happen if we do nothing?

For Diagnosis support, the minimum requirement is more detailed data.  Exactly how more detailed, and over what time periods, can only be determined through appropriate research and interviews with client executives.  However, models that can answer “what-if” queries, and statistical analysis tools may also be valuable.
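A “what-if” model for diagnosis can be very small. Here is a hedged sketch using the debtor’s-default example mentioned earlier in this post; every figure is invented for illustration:

```python
# Sketch of a "what-if" diagnosis model for a hypothetical debtor
# default: what happens to our cash position if we do nothing?
# All figures are invented.

def cash_position(opening_cash, monthly_net_inflow,
                  defaulted_receivable, months):
    """Project month-end cash, assuming the defaulted receivable
    is never recovered and nothing else changes."""
    cash = opening_cash - defaulted_receivable
    positions = []
    for _ in range(months):
        cash += monthly_net_inflow
        positions.append(cash)
    return positions

# What will happen if we do nothing for 6 months after a 400k default?
print(cash_position(opening_cash=500_000, monthly_net_inflow=-30_000,
                    defaulted_receivable=400_000, months=6))
```

Even a toy projection like this answers the two diagnosis questions directly: how bad is it, and what happens if we do nothing.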

Determine the available options for solving the problem – or capitalizing on the opportunity

Options may be selected from past experience, or suggestions from experts.  Support for this aspect of Drilldown is often based on knowledge sharing and tacit information management tools – see my post of July 27. 

Assessing the implications of each of the apparently viable alternatives

Feasibility validation and outcome assessments of options may require pre-specified models, or at least partly constructed versions that can be adapted to the problem situation when it is known.  Assessments of the anticipated outcomes of the viable alternatives are often the key information elements required to support decision making and the BI system should include these if at all practicable. 

Many people think that Drilldown is only about getting more detailed data, but clearly the client executive is likely to want more – just like Oliver Twist.  A quality BI environment design will go much further as discussed above, but at least will extend to a full review of what that “more detail” should be, and how it is to be collected for inquiry and ad-hoc reporting. 

 After all that, it’s up to the executives to judge which alternative is best, i.e. make a decision.   

I will cover these Drilldown component steps in more detail in a later post.  If you want more detail immediately, you can see how these concepts are implemented in the BI requirements methodology at www.bipathfinder.com and how the metadata that supports the method is documented at www.bidocumenter.com 

Your comments on the accuracy and utility of this material are welcome.

BI implies passivity – we need action oriented reporting, bring back KM? August 12, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems.

BI used to be KM in the management consultant marketing speak. Let’s bring back KM – BI is becoming too passive. Knowledge Management was our buzzword during the late 90s. Trouble was, no one seemed to agree on what it meant. I blame the software vendors for this shambles. KM was the new corporate must have and so every piece of reporting software, from ERM to spreadsheets, became a KM product.

As is usual in this situation, we changed the name, this time to BI. Today, we often see a BI system defined as comprising principally:

  • Corporate Performance Management
  • ERM
  • Customer Relationship Management
  • Competitive Intelligence

This seems to suit the software vendors, as they can fit about anything into one of the above categories.

We are missing something, however, that was a key part of the initial Knowledge Management movement, before it was taken over by the marketers.

Herbert Simon once wrote:

The impact of information is obvious. It consumes the attention of its readers. Therefore, a wealth of information creates a poverty of attention.

So often BI is seen as a reporting tool, and BI designers and software have but one objective – put information in front of executives so they can make better, more informed, decisions. But I believe it doesn’t work like that; executives are time poor and need, in fact deserve, much help to determine what is important, what signifies a potential problem, and what the implications are of the current status. To me, that’s what Knowledge Management was, or should be, all about, and therefore it’s what BI should also be.

As BI professionals, what can we do about it in our projects and designs? Here are some of the most important issues I believe we face to empower our client executives to perform at a higher level – BTW: I hate the word user in this context.

Empowering effective individual knowledge work is the most important objective. So often the corporate culture is focused on finding the right number or document, rather than the right person, with marginal result.

Practice check 1: We need to put each KPI, metric or measure in front of the right person, PLUS we should identify the expert who will explain the significance and implications of good and bad numbers or documents.
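Practice check 1 amounts to maintaining a routing table. A minimal sketch, with hypothetical KPI names and roles (nothing here comes from the post itself):

```python
# Sketch of Practice check 1: route each KPI to the right person, PLUS
# the expert who can explain its significance. Entries are hypothetical.

KPI_REGISTRY = {
    "gross_margin_pct": {"owner": "cfo", "expert": "pricing_analyst"},
    "churn_rate_pct": {"owner": "sales_vp", "expert": "crm_lead"},
}

def route(kpi):
    """Return (who sees the number, who explains it)."""
    entry = KPI_REGISTRY[kpi]
    return entry["owner"], entry["expert"]

print(route("churn_rate_pct"))
```

The registry is trivial code; the real design work is the interviewing needed to fill it in correctly for each metric.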

Executives need to distinguish between numbers and documents that are significant and those that merely detail or cover a nominated KPI or topic, but are not useful.

Practice check 2: We must know what rules, tacit or explicit, our client executives employ to decide substance, significance and need for response. And/Or, make the system adaptable to meet the changes in these.

The more senior people obviously have to monitor and contribute to broader business performance and strategic areas. They will therefore want to focus on the more important issues, and not be burdened with detail that is not about something significant. Yet, when something is significant, they need much more, fast. What is the more and where is it?

Practice check 3: While this is an obvious requirement, it is very difficult to implement, in my experience. For the BI designer, it comes down to two basic issues: (1) establishing, ex ante, the degree of detail required for drill-down when certain criteria for significance are met, and (2) making it easy to identify and reach the subject experts for each area.

People with high levels of expertise can leverage their contribution to the corporation through collaboration on issue assessment and a process of issue escalation. Leveraging expertise involves sharing the results of problem assessment and creative work among those who will benefit.

Practice check 4: It is a part of the BI designer’s role to ensure that the right people can be brought into the problem solving loop.

Knowledge enabling a response to a problem or opportunity is usually not recorded; it is created dynamically as issues are identified, assessed, debated and resolved. Notification about an adverse assessment of data, messages, queries, etc. from an information repository will often trigger creation of more explicit and communicable expressions of knowledge via expert discussion.

Practice check 5: This is the nub of knowledge management, as I understand it. Clearly we cannot achieve KM unless our BI systems recognize both the hard and tacit information resources, allow access to subject experts and promote the recording of the knowledge created as a result.

These five checks on the specification or implementation of a BI environment will materially help you determine how action oriented your systems are.

A Way Forward for Effective BI Requirements Specification June 30, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems.

There is a way out of the mess we’re in because of poor BI Requirements definition.  The “waterfall”, or top-down, approach isn’t compatible with the instant gratification desire promoted by BI software vendors and exponents of the prototyping incremental development methodologies. 

I accept that we are not going back to the top-down development models for BI systems; executives want faster turnaround times. However, speeding up implementation by short-cutting requirements analysis is only satisfactory if the initial results can be fine-tuned in subsequent iterations to produce a good result.  So the starting point, the first iteration, must be well planned.  Often the initial BI design specifications are incomplete, and do not offer any creative or innovative content or presentation.  It’s like going to McDonalds or Burger King when the objective was “fine dining”.

But, being realistic, nothing is going to change the love affair consultants and IT analysts have with prototyping.  And it isn’t necessary.  All we have to do is to inculcate a habit of finding sensible, and creative, requirements at the outset.  Here is my prescription. 

Top-down, or waterfall, methods are seen as being inappropriate  – that leaves two basic options. 

  1. Build what  already exists in a more glamorous package, with some catchy phrases like “drill-down” added
  2. Employ the bottom-up planning approach, with some minimal top-down content [applied in a structured manner to minimize wasted time]

If we can make good use of the second option we should be well on the way to a consistently better BI context. 

Something funny happened to BI Requirements Definition. It went away! May 17, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems.

Information reporting all started with MIS in the 60s; Management Information Systems were going to make all executives’ lives simple, decisions would be based on straightforward assessment of all the facts.  First, though, we had to find out what information the executives wanted – and we did, or tried to do so.

Then there was DSS in the 70s; Decision Support Systems would use those fantastic optimization algorithms to cover the fuzzy areas left because MIS didn’t seem to be cutting it.  Expert Systems based on so-called “Rules” offered the promise of filling more gaps. “Boards of Directors will be retained for sentimental reasons, computers will make all the decisions” expounded one eager High Priest of the Computer.  Again, however, we needed to know what processes and decision types needed support.

Enter EIS in the 80s;  Executive Information Systems was the catch-all term that would finally lead management to the Holy Grail.  It sold lots of consultant-hours, but clearly the name change did little for the quality of the result.  No-one suggested, however, that an EIS should be built without first determining the information to be reported.

KM was next in line in the 90s; Knowledge Management was the umbrella descriptor that recognized the inherent fuzzy, non-stationary, and imprecise nature of executive information support. Trouble was, nobody could stipulate what KM actually is, or was.  Despite all the heavy technology artillery, everyone believed that we needed a road-map for KM –  the requirements definition to show the way.

Now we have a new millennium and BI; and Business Intelligence covers about everything from Corporate Performance Measurement, through Competitive Intelligence to  Customer Relationship Management.  Nothing wrong with that, if we can make it work, that is. See www.bipathfinder.com to find out where I am coming from in all this.

BUT, somewhere along this timeline – starting in the EIS phase, we created prototyping.  Make no mistake; I regard prototyping as an invaluable and effective tool for BI, if used correctly.  Start with a broad set of requirements, build version 1, try it, build version 2, try it…..the last prototype version is the finished product.   

The issue is, by the time we reached the BI phase, we appear to have cancelled out the first part, and the logic runs – “Don’t bother with the broad set of requirements, just build anything, see how it goes and we’ll fix it from there”.

As I’ve opined before in this blog, BI Requirements Definition is a tough task.  It often requires skills and attitudes that the IT analyst doesn’t have, or doesn’t want to use, e.g.: 

    • Understanding, and performing, Business Process Analysis

    • Developing empathy with the executive culture

    • Eschewing the guru culture, at least for the interviews

    • Accepting that ETL, data warehousing, cubes, dimensional modeling, etc. may not be all there is to make a project work.  These are deterministic tasks, but the soft aspects, particularly understanding the business and its people, are also vital, perhaps more so.

It is no wonder that requirements definition for BI is on the nose; it’s too bloody hard.  And prototyping gives us an easy out – our task is to build a BI system, pity about the fact that it falls far short of what could have been.  If we don’t start in the right ballpark, no amount of prototype morphing will get us back into it.  The BI system is stuffed!

Next post:  What we can do about it, given we are in this mess.

BI Prototyping: Developers laughing? SOX auditors weeping? Users experimenting? Documentation lacking? March 21, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems.

Whatever happened to determining system requirements specifications before you build? 

Rigor in requirements determination for BI systems is unfashionable it seems.  The new wisdom says:

  • Why bother to determine requirements for a BI system when you can change it in minutes to fix an error or add new columns? 
  • Anyway, executives don’t know what  information they want till they don’t get it, or they see someone else with it.
  • Interviewing executives and documenting the results is hard work, and too embarrassing if we don’t understand their business well. 
  • We can serve up almost anything based on the old reports and pre-defined KPI metrics, and let them tell us what they don’t want. 
  • Prototyping is god.

The BI development fraternity seems to have psyched itself into thinking it is running a jazz musicians’ jam session.  Play what you like, and we’ll all join in, making it up as we go along.  Jam sessions work because music is constrained by the rules of physics, and the customers can walk away if they don’t like what they hear. 

BI prototyping is different.  It’s more like a graffiti writers’ convention.  Graffiti is all over the place, unstructured and in your face.  There are no rules in logical state space.  Anything can be built, given time and money [and data].  So the permutations are infinite, and the potential for confusion – immense.  Plus, executives can’t walk away; they need the information.

We’re creating an auditing nightmare.  Users are learning to experiment, and are being trained to avoid the hard task of conceptualizing what is really required at the outset.  Change upon change upon change – to formulae, data transforms, aggregations, cube content, dimension hierarchies, and so on.  The tools for prototyping have outstripped our ability to control, monitor and manage the systems life cycle.  It’s a bottom-up style of management, when top-down is what is required if an objective is to be met.

SOX compliance will find it hard to survive in such a context.  Any control that’s mutating faster than an influenza virus will be unreliable.

What can we, should we, be doing?  End to end control from requirements to implementation and associated documentation is the only answer.  But do we have it?  I fear not.  Is this a concern for other BI designers?

Sarbanes-Oxley: BI’s big opportunity goes begging? February 19, 2006

Posted by Cyril Brookes in General, Issues in building BI reporting systems.

With the right specification, generated with a disciplined structured workflow, BI tools can be a material component of any corporate implementation of a Sarbanes-Oxley compliant environment.

However, since the focus of almost all advertorial on the SOX compliance issue from the BI community has been on software tools, the accounting and auditing profession is taking the high ground – even though their tool kit is technically inferior.

BI consultants and analysts need to realize that:

  1. The BI software tool set is tailor made for the task
  2. SOX compliance is best achieved through pro-active measures, not reactive (i.e. audit style) ones
  3. Compliance is best achieved via a high quality specification

The specification template for such a project is universal across all industries and corporate structures.  The main reporting elements are:

  • What are our business risks and their measures?
  • What is good and bad about current risk measures?
  • What are forecast risk measures and what is unusual?
  • What resources are needed for detailed risk management?
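The first three reporting elements above can be sketched as a status check over risk measures. A hedged illustration only – the risk names, measures, limits, and forecasts are all invented:

```python
# Sketch of the universal SOX risk-reporting template as a status check.
# Every risk name, measure, limit, and forecast here is hypothetical.

risks = [
    {"name": "revenue recognition error", "measure": 0.8,
     "limit": 1.0, "forecast": 1.2},
    {"name": "unauthorized journal entries", "measure": 0.2,
     "limit": 0.5, "forecast": 0.3},
]

def risk_report(risks):
    """For each risk: current status against its limit, and whether the
    forecast is unusual (i.e. expected to breach the limit)."""
    lines = []
    for r in risks:
        status = "BAD" if r["measure"] > r["limit"] else "ok"
        unusual = "forecast breach" if r["forecast"] > r["limit"] else "forecast ok"
        lines.append((r["name"], status, unusual))
    return lines

for line in risk_report(risks):
    print(line)
```

A pro-active report of this shape surfaces the forecast breach before any audit would, which is the point of the second principle above.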

I believe that a structured approach to specification will yield a set of requirements that are both efficient and effective in resolving the SOX dilemma for most enterprises. I’ll detail my thoughts in later posts.

IT Analyst and Executive User Disconnect. Red Flag #4 on why BI reports fail February 16, 2006

Posted by Cyril Brookes in General, Issues in building BI reporting systems.

Executive users of a BI reporting system and the IT analyst designers have different objectives and reward mechanisms.  It is common, therefore, for a disconnect, often fatal to user satisfaction, to build between each type of participant’s expectations for the system.

The Executive user is typically expecting:

  • Timely access to information he/she requires to understand the status of the business, or the relevant part of it.
  • Ability to identify problems and opportunities as they arise
  • Alerting signals when parameters go out of normal range
  • BI reports that contain the information nominated as required
  • Compilation of a database and set of analytical tools to facilitate rapid diagnosis and solution of problems – when they are found.

Obviously these expectations must be built into the specification to be achieved; they can’t be retro-fitted later.

The IT analyst is typically expecting:

  • Project completion within time and budget
  • Use of standard software tools for presentation and access where possible
  • Use of standard Data Warehouse elements, preferably reused from earlier projects
  • No special data resource development if possible
  • Executive user sign-off on the specification as soon as practicable

For the IT analyst, a focus on obtaining an optimal specification is not a natural part of the task.  Certainly a specification is required, we can’t build a BI system without one; but ideally it will be that set of KPIs or corporate measures selected as being equivalent to whatever reporting this new system may be replacing. 

Certainly, IT analysts don’t want the system specification to stray outside the available data items in the data warehouse, irrespective of the need for such novel data.

Executives commonly believe that the act of specifying an information element for a BI report means it will be available.  The project documentation process must rapidly link information requirements as specified by the executive users with available data resources, to highlight any shortcomings that can be fed back.  Otherwise the executives will continue to believe that their stated needs will be met, when this may, in fact, be impossible.

Therefore the potential for disconnect is huge.  It is made more difficult because the executive user will typically not mentally engage with the project until it is about to be delivered.  The specification process, if not done well in a structured manner, will be regarded as a nuisance and waste of time.  A sign-off is meaningless if it isn’t based on a real interaction and will be repudiated if necessary.  Obtaining active executive involvement in the specification process is difficult, but clearly essential if a good result is to be had.

Therefore, the symptoms of Red Flag 4: IT and Executive User Disconnect include situations where:

  • The specification process has had minimal Executive – IT analyst interaction
  • User expectations are unrealistic, but not negotiated down, during planning or specification tasks
  • Important newly identified data collection needs are neither acknowledged nor planned for availability
  • There is no ability to link existing data with specified needs of executives, and therefore it is unknowable if the specification can be achieved.

No Surprises in my BI reporting system! Red Flag #3 on why BI reports fail February 2, 2006

Posted by Cyril Brookes in BI Requirements Definition, General, Issues in building BI reporting systems.

Just like corporate customers, executive users of BI systems ought to be delighted, at least occasionally.

This implies that they come across unusual, particularly useful, items, or that they find that it is easy to determine the implications from a set of numbers – for whatever reason.

Is your BI system just a single shade of grey?  How did this happen?  Surely the specification must have been lacking something that might have given it a sparkle?

Therefore, the symptoms of Red Flag 3: No Surprises in this system include a specification where:

  • Reports and displays, dashboards, etc. focus on numeric data, without analysis
  • No attempt is made to interpret, alert, forecast or determine trends in the information and advise the executive user of same.
  • Relative importance of numbers presented is hard to determine – a bad number and a good number live side by side.
  • Positive swings in one area can cancel out negative swings in another, so the significance of each is lost.
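The last symptom – offsetting swings cancelling in the aggregate – is easy to demonstrate with invented figures:

```python
# Sketch: why netted totals hide offsetting swings (all figures invented).

regions_last_month = {"north": 100, "south": 100}
regions_this_month = {"north": 140, "south": 60}

# The aggregate says "nothing happened":
net_change = sum(regions_this_month.values()) - sum(regions_last_month.values())
print(net_change)

# Reporting per-region swings preserves the signal the executive needs:
swings = {r: regions_this_month[r] - regions_last_month[r]
          for r in regions_this_month}
print(swings)
```

The total nets to zero while each region has moved substantially; a report that shows only the total sets Red Flag 3 waving.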

To avoid this Red Flag 3 waving, it is necessary to ensure innovation and variation in the reporting based on the importance, significance, trends, novelty, etc. of the information being presented.  Special emphasis is needed on supporting the vital problem finding activity in which executives are always engaged.  Bald statement of supposed factual numbers will set the Flag waving early on in the project life, dooming it to insignificance and disutility.