Utilize Predictive Modeling


The core benefit of the first principle of BTM, whether it goes by "modeling" or something else like "design" or "blueprinting," is that it helps to visualize the end goal before beginning costly, and often irreversible, implementation. In the broadest sense, a model is a virtual representation of a real thing. By manipulating this representation, modelers can preview a solution and address design flaws before they manifest themselves in the final product.

There's a widespread and unfortunate misconception that modeling is a highly technical exercise that needs to be tackled by a team of trained specialists. At times, of course, modeling can be found in pocket-protector-friendly environs like nuclear engineering, macroeconomics, or genetics. But this reflects modeling's inherent simplicity rather than any tendency towards complexity: by simplifying design and decision-making, modeling clears up complex problems, which is precisely why it shows up in these areas. When observers mistake modeling for a technical, complicated exercise, they're essentially confusing the message (such as modeling a complex chemical reaction) with the messenger (modeling itself).

Some of the most powerful varieties of modeling (such as the spreadsheet example we'll look at in a moment) allow even non-technical users to preview change, or to "predict," before putting new ideas into practice. This, of course, is where the "predictive" in "utilize predictive modeling" comes from. It's also why modeling is such an important part of BTM: It helps to predict the impact of business and IT change by becoming the "aim" in "ready, aim, fire."

BTM puts modeling to work as an innovation infrastructure for IT projects. During the design stage, it functions as a blueprint in which teams can set clear goals and flesh out a solution before actually writing code. In the build, test, and deploy stages, the model acts as a reference point to orient ongoing work and to help guide last-minute modifications in the event that unforeseen challenges and opportunities pop up. By playing these important roles, modeling helps the IT project team pre-empt costly mistakes and improve the quality of the systems that they develop.

BTM's use of modeling isn't just about making incremental improvements to an existing process, however. In addition to relatively modest gains in efficiency, modeling also empowers BTM with other, more dramatic capabilities that can literally reinvent how IT projects approach the "aim" part of "ready, aim, fire." This sounds like a bold claim. However, there is ample precedent from previous modeling revolutions, such as object modeling, computer-aided design/computer-aided manufacturing (CAD/CAM), and especially financial modeling and the spreadsheet, to suggest that modeling can indeed accelerate critical business activities.

Financial Modeling and the Spreadsheet

Before the personal computer revolution, Wall Street analysts performed complex spreadsheet calculations using only a simple calculator. This process was completely inflexible, prone to mistakes, and thoroughly mind-numbing. In order to make changes to a model (whether to vary inputs or correct mistakes), analysts had to rework the entire thing, a process that, needless to say, was inefficient.

In 1978, Harvard Business School student Dan Bricklin recognized an opportunity to automate this tedious process using software and the rapidly maturing PC. He, along with former MIT classmate Bob Frankston, founded Software Arts, Inc., and introduced the VisiCalc spreadsheet to the market. Almost overnight, VisiCalc transformed how financial analysts worked. [1]

The obvious advantage to Bricklin and Frankston's innovation was efficiency. Complex models that once took hours to update could now be modified with a few keystrokes. Not surprisingly, spreadsheets like VisiCalc became the de facto standard for financial modeling, and frustrated business school students and financial analysts clambered over one another to put the new technology to use. The demand for spreadsheets was so overwhelming, in fact, that it is frequently credited with creating the initial boom market for business PCs.

But the real revolution that the spreadsheet kicked off wasn't just about efficiency and automation. By unburdening analysts from the pedantic work of manual calculations, spreadsheets lowered the marginal cost of evaluating new scenarios from thousands of dollars to almost zero. This, in turn, encouraged experimentation and creativity. The same employee who once spent days perfecting a single model could suddenly produce several alternatives in a single afternoon.

Spreadsheets kicked off an industry-wide movement towards experimentation that revolutionized how analysts, and the financial services industry, worked. By allowing workers to easily create and analyze the impact of multiple scenarios, spreadsheets and predictive modeling encouraged a culture of rapid prototyping and innovation, or impact analysis, that is as applicable for aligning business and technology as it is for the financial world.
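The what-if workflow that the spreadsheet made cheap can be sketched in a few lines of code. The loan-payment model below is a hypothetical illustration (the function name and the figures are invented, not drawn from the book): the "rules" live in one function, so re-running a scenario with different inputs costs essentially nothing.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized-loan payment formula: the 'rules' of this tiny model."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

# Varying an input and recomputing, once a day of manual calculator work,
# is now just a loop over scenarios.
for rate in (0.05, 0.06, 0.07):
    print(f"{rate:.0%} interest: {monthly_payment(250_000, rate, 30):,.2f}/month")
```

This is the spreadsheet's contribution in miniature: the analyst's effort goes into getting the rules right once, after which exploring alternatives is free.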

From Modeling to Impact Analysis

Impact analysis lets teams alter input factors, create multiple output scenarios, evaluate the end-to-end impact of each, and eventually select and implement the optimal solution. This stands in direct opposition to conventional, linear problem-solving techniques, where decision-makers analyze sub-problems at each logical step along the way, and then assume that the sum of these individually best choices produces the best overall outcome (see Fig. 3.1).

Figure 3.1. Linear problem solving decomposes sub-problems along the way, while impact analysis examines the end-to-end impact of multiple decisions

Like modeling in general, impact analysis can be used to address a broad range of activities. For example, it is often used in supply chain planning for advanced, data-driven calculations that optimize a particular function (such as inventory costs) given unique inputs and constraints (such as market demand, logistical restrictions, and manufacturing capabilities). At the other end of the spectrum, impact analysis can address much simpler problems. A good example is Dell Computer's build-to-order website, where potential buyers test multiple PC configurations until they find a good match between the features they want and the price they're willing to pay. In both of these cases, individuals vary inputs, rules translate these inputs to outputs, and team members compare the impact of multiple scenarios to choose the solution that fits their needs.

In order for impact analysis to work, the scenario being modeled should conform to three guidelines:

- It should have easily identified inputs, rules, and outputs: Impact analysis requires employees to define a set of inputs and then link these to outputs using predefined rules. These inputs and outputs are often quantitative (as in the supply chain optimization problem), but they can also be qualitative (such as the PC configuration options). To produce good results, these criteria, and the rules that link them, must accurately reflect the real-world problem.

- It should have multiple configuration options and decision factors: Problems that contain only a few inputs and outputs aren't suited to impact analysis because the effect of altering inputs is often obvious. When the outputs are less intuitive, on the other hand, impact analysis can help decision-makers experiment to identify good solutions.

- The relative cost of implementation to design is high: Scenarios that are inexpensive to design but difficult to implement are ideally suited to impact analysis. Our ongoing analogy to an architectural blueprint is a case in point here: It's unrealistic for you to contract a builder to build five houses so that you can choose the one you like the most. It's entirely possible, however, that you may choose to commission an architect to draft five blueprints. You can then compare them, choose your favorite, and give it to the contractor to build. This is where the synergy between modeling and impact analysis really comes into play: Predictive modeling is a powerful tool for lowering design costs, and so a crucial driver for impact analysis.
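Under these three guidelines, impact analysis reduces to a simple loop: enumerate the scenarios, apply the rules, and compare end-to-end impact rather than optimizing each sub-decision in isolation. The configurator below is a hypothetical sketch in the spirit of the Dell example above; the option names, prices, and scoring rule are all invented for illustration.

```python
from itertools import product

# Inputs: configuration options as (name, extra cost, buyer value) tuples.
# All figures are invented for illustration.
options = {
    "cpu":    [("base", 0, 1), ("fast", 200, 3)],
    "memory": [("8GB", 0, 1), ("16GB", 120, 2)],
    "disk":   [("HDD", 0, 1), ("SSD", 150, 3)],
}

BASE_PRICE = 600
BUDGET = 950

def evaluate(config):
    """Rules: translate a configuration (inputs) into cost and value (outputs)."""
    cost = BASE_PRICE + sum(extra for _, extra, _ in config)
    value = sum(v for _, _, v in config)
    return cost, value

# Enumerate every scenario, keep those within budget, and compare their
# end-to-end impact instead of picking each component separately.
scenarios = [cfg for cfg in product(*options.values())
             if evaluate(cfg)[0] <= BUDGET]
best = max(scenarios, key=lambda cfg: evaluate(cfg)[1])
print([name for name, _, _ in best], evaluate(best))
```

Note what the blueprint analogy predicts: evaluating all eight "houses" here is nearly free, which is exactly the condition (cheap design, expensive implementation) under which impact analysis pays off.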

Anticipating Unforeseen Ripple Effects

These three characteristics combine to highlight a point that is crucial to understanding why impact analysis fits well with BTM. Disconnects between business, process, and technology are often introduced when individual decisions have unforeseen effects on the blueprint as a whole. "Projects lack a holistic view," PACCAR CIO Pat Flynn says, "because we tend to look at it as a linear process: decompose the problem, decompose the problem, decompose the problem, make a decision. But it's very hard to go back and say 'that decision has a set of ripple effects'."

Consider an example: A team of process analysts is working on a project for which they need to diagram the approval process for purchasing non-production goods. Using conventional methods, their actions would be informed by an in-depth analysis of the decision. They would start by gathering as much data as possible: the current approval process; the complete list of approved suppliers, products, and contract types; the organizational hierarchy and current purchasing limits for each employee; the existing technology assets that automate this process; and the supporting systems such as hardware and networks. After pulling all this information together, they would weigh the data, diagram a process flow that best fits the given constraints, and sign off on the decision.

This sounds reasonable at first glance, but it fails to take into account any ripple effects that might spread from this individual decision. Let's say, for instance, that one supplier relied on a legacy order-processing system to interface with our example company's procurement system. Let's also say that when our team reengineered their approval process, they did so in a way that made it incompatible with this legacy application. And finally, let's say that this particular supplier accounted for 40% of all purchases of non-production goods last year. Clearly, this should compel the process analysts to revisit their decision. But without impact analysis they wouldn't find out about the ripple effects until it was too late.

The Perceived Value of Models and the Whitespace Problem

Before they can get started with modeling and impact analysis, companies need to overcome a couple of obstacles. The most obvious is the common perception that the time it takes to develop a model during the design stage is better spent on implementation. This is due in part to previous experiences with models that were frighteningly inaccessible to all but the most die-hard experts. Since non-specialists (a group that frequently includes managers and other authority figures) couldn't experience their value firsthand, they assumed that the models were a waste of time. The shorthand solution to this concern is to make the modeling environment friendly enough for a broad range of people to pick it up and experiment according to their own level of comfort. A good example of this is a financial model whose inner workings may be exceedingly complex but whose overall purpose is clearly communicated to a non-technical audience.

In extreme cases, however, modeling can be a waste of time. This happens when people get stuck in an endless design loop: by continuously tweaking the model in the quest for a perfect solution, they never get around to actually implementing what they're working on. The way to counter this impulse is by linking a system of real-time monitoring to metrics, goals, and objectives that are established at the beginning of the project. This implies a link to both project and performance management that is crucial to any type of modeling.

The other obstacle that stands in the way of modeling and impact analysis is the gap that exists between multiple models and between models and the real world. These gaps are referred to as "whitespace," and they're familiar culprits in cases where modeling hasn't been successful. Typically, the tools that are available to IT workers to model business, process, and technology are disjointed, and so they tend to exacerbate rather than overcome the whitespace problem. Most are geared either to a particular task (process modeling, object modeling, or knowledge management) or to broad horizontal activities (word processing, drawing, or spreadsheets). A consequence of these disjointed offerings is that companies tend to use multiple tools and environments to develop their models. When changes are made in one environment (say a process diagram) they aren't automatically reflected in other areas (a requirements document or business strategy memo, for example). Without integrated tools, the project team has to proactively anticipate ripple effects to keep their models aligned.

The Advantages of Predictive Modeling

The advantages that modeling provides for BTM are closely analogous to those that spreadsheets deliver in the financial world. By utilizing predictive modeling to align business and technology, enterprises can:

- Mitigate risk by forcing teams to flesh out details in the design stage

- Enable creative impact analysis by lowering the marginal cost of experimentation

- Democratize design decisions by hiding underlying details from non-technical team members

- Communicate overall design to promote collaboration

Mitigate Risk

The first of these advantages, mitigating risk, is a key advantage of modeling in general. Initiatives can fail because of any number of unforeseen obstacles: poorly defined business objectives; processes that don't map to application packages; system choices that require heavy customization; even plain, old-fashioned installation failures. By itself, modeling can't guarantee a flawless initiative; but by forcing stakeholders to collaborate and produce an end-to-end design before beginning the actual implementation, it helps work out kinks in the model, where they are far easier to tackle than in the real world.

Mitigating risk is an important factor in any enterprise initiative, and it's a compelling counterbalance to our first concern about predictive modeling: that it isn't worth the time and effort. Implementation mistakes can cost many times more than even the most thorough modeling.

Perform Impact Analysis

Second, predictive modeling helps companies to perform impact analysis. Most enterprise initiatives adhere to a linear planning process, where decisions made early on (the business drivers for the initiative, for example) become cast in stone as the project progresses. This is acceptable only when the initial guidelines are completely static and the consequences of decisions affect only future decisions.

In IT projects, however, neither of these conditions applies. Early choices such as business drivers can become out-of-date at a moment's notice in response to things like market changes and recent moves by competitors. At the same time, choices made later in the process (such as which application package to select) can affect decisions thought to have been nailed down earlier (such as the process flow that is to be automated). By locking in determinations up front, teams forfeit flexibility that they may need down the road.

Also, linear planning assumes that what's best for any individual decision must be best for the project as a whole. This attitude ignores hidden ripple effects between seemingly unrelated decisions. For example, a consultant choosing an application package may sensibly select the one that fits the most requirements. But this decision assumes that all the requirements are equally important to the initiative. If the consultant chooses a package that leaves out a few crucial requirements, he or she could introduce an inconsistency between the best individual decision (the package that meets the most requirements), and the best overall solution (the system that best supports the overall business goals of the project).

To compound this situation further, ROIs are frequently laced with intangibles such as "improved customer relationships" and "strategic fit with other systems." Managers who have been tasked with making a particular decision in a linear process often feel compelled to invent decision criteria to justify their choices, even if these criteria fail to take into account the project's overall, intangible returns. Eric K. Clemons, a professor at the Wharton School of Business, describes this phenomenon as "the 'concrete' and 'measurable' driving the significant out of the analysis." [2]

Impact analysis counters these concerns by letting teams compare end-to-end potential outcomes. Even in cases with intangible returns, the impact analysis technique improves decisions by making it easy to compare the relative value of multiple scenarios, rather than forcing teams to assign allegedly absolute criteria that obscure more important, elusive goals. Seeing end-to-end designs also helps to calm the impulse to enter an endless design loop by encouraging teams to select a final design, move from modeling to implementation, and avoid the temptation to get stuck on an individual decision.

Democratize Design Decisions

The third advantage of predictive modeling is that it hides underlying details from the non-technical audience. By simulating the general behavior of real-world subjects while simultaneously hiding complex details, models encourage even non-technical team members to "play around." This broadens the base of users who can make important design decisions from IT professionals to also include business managers, process analysts, and even senior executives. Collaboration among these stakeholders, leveraging both business and technical expertise, leads to new scenarios and innovative solutions to problems. Michael Schrage, the co-director of the MIT Media Lab's eMarkets Initiative and the author of Serious Play, describes how this phenomenon plays out in another modeling discipline, computer-aided design, or CAD:

Engineering organizations have found that nonengineering managers and marketers want to play with CAD software to test their own product ideas and enhancements. Such "amateur CAD" signifies a growing democratization of design promoted by pervasive and accessible modeling technology. The changing nature of the modeling medium is forcing design professionals to manage the prototyping efforts of design amateurs. The declining cost and rising importance of prototyping is broadening the community of designers. [3]

Communicate Design Details

Finally, models can be compelling communication tools. This can happen in the form of a business unit evaluating an existing enterprise system to see if it could be reused in their division; a development team communicating a proposed project to a manager for approval; or an enterprise architect team communicating interface specifications to an external business partner for integration purposes. This communication is also the key to bridging gaps between distinct models and ultimately to overcoming the whitespace problem.



The Alignment Effect: How to Get Real Business Value Out of Technology
ISBN: 0130449393
Year: 2001
Pages: 83
Authors: Faisal Hoque