Solution Overview


We will build an extraction, transformation, and loading (ETL) process to add the Web site's visit-tracking data to the corporate data warehouse. We will use the data mining features of Analysis Services to help discover patterns in this data and provide the information back to the business.

Business Requirements

The high-level requirements to support the business objectives are as follows:

  • Customer segmentation. The data warehouse already has excellent profiling information on customers that is obtained through a popular store loyalty card program. This information includes demographic profiles and detailed purchasing histories, because the customer's unique card number can be used to identify store transactions. However, the business also needs a profile of customers' online activities.

    The main areas of interest are frequency, or how often the customer uses the Web site, and recency, or how much time has elapsed since the customer last visited the site (see the SQL sketch after this list). There is already information in the data warehouse on the third area of interest, intensity, or how much money the customer is spending through the Internet channel.

    When these Internet profiling attributes are available, customers can be segmented into groups with relatively similar behavior. Analysts can use this information for marketing purposes, such as producing lists of customers for direct mail campaigns, and for further analysis using the attributes and groups we identify.

  • Online recommendations. The business would like to add an online recommendations feature to the new DVD area of the Web site to drive additional profit per online transaction. When a customer adds a DVD to her shopping basket, she should be prompted with a short list of other titles that may interest her.

    The recommendation must be returned quickly, because any delay in the responsiveness of the Web site has been shown to lead to more abandoned transactions. Also, the recommendation must take into account items sold through the physical stores as well as the Web site, because the stores currently account for the bulk of sales.
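To make the frequency and recency attributes concrete, the following is a minimal T-SQL sketch of how they might be derived from visit facts. The VisitFact table and its columns are assumptions for illustration, not the actual warehouse schema:

    -- One row per discrete Web site visit (hypothetical schema).
    -- Frequency = number of visits; recency = days elapsed since
    -- the most recent visit.
    SELECT
        CustomerKey,
        COUNT(*) AS VisitFrequency,
        DATEDIFF(day, MAX(VisitDate), GETDATE()) AS DaysSinceLastVisit
    FROM VisitFact
    GROUP BY CustomerKey;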

High-Level Architecture

We will add the Internet visit information to the existing data warehouse and Analysis Services cubes. Because the e-commerce application already extracts data from the Web logs and inserts it into a relational database, we will use this as the source for the ETL process. The data in this source database already has discrete user sessions identified.

Many e-commerce applications (including those based on the Microsoft Commerce Server platform) provide this kind of extraction and log processing functionality, but for custom Web sites, the only available tracking information may be the raw Internet Information Server (IIS) logs. A full treatment of the steps to extract this kind of information from Web log files is beyond the scope of this chapter; see the sidebar "Extracting Information from IIS Logs" for a high-level explanation.

After this information is in the data warehouse, we will use the data mining features of Analysis Services to achieve the business goals for segmentation and recommendations, as shown in Figure 10-1. For each area, we will create a data mining structure that describes the underlying business problem and then run the appropriate data mining algorithm against the data to build a mathematical model. This model can then be used both for predictions, such as recommending a list of products, and for grouping information in cubes in new ways to enable more complex analyses.
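As a preview of how such a model is described, the following DMX sketch defines and trains a clustering model (CREATE MINING MODEL also creates the underlying structure implicitly). The column names and the DataWarehouse/CustomerProfile source are assumptions for illustration only:

    -- Define a segmentation (clustering) model over hypothetical
    -- customer profile attributes.
    CREATE MINING MODEL CustomerSegmentation
    (
        CustomerKey        LONG   KEY,
        VisitFrequency     LONG   CONTINUOUS,
        DaysSinceLastVisit LONG   CONTINUOUS,
        OnlineSpend        DOUBLE CONTINUOUS
    )
    USING Microsoft_Clustering

    -- Train the model from the relational warehouse.
    INSERT INTO CustomerSegmentation
        (CustomerKey, VisitFrequency, DaysSinceLastVisit, OnlineSpend)
    OPENQUERY([DataWarehouse],
        'SELECT CustomerKey, VisitFrequency, DaysSinceLastVisit, OnlineSpend
         FROM CustomerProfile')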

Figure 10-1. High-level architecture


Data mining in Analysis Services provides several types of algorithms for tasks such as classification, regression, and segmentation. We will use the Microsoft Clustering algorithm to create a customer segmentation mining model; the model will then supply these categories of online behavior as a new dimension for cube analysis. We will use the Microsoft Association algorithm to build a data mining model that can make product recommendations, and we will add code to the Web site that queries this model to suggest appropriate DVDs to online shoppers.
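At run time, the Web site can retrieve recommendations by sending a DMX prediction query to the association model. This is a sketch only; the ProductRecommendations model, its nested Products table, and the sample title are placeholders rather than names from the actual solution:

    -- Ask the association model for three titles related to the
    -- DVD already in the shopping basket.
    SELECT FLATTENED
        PredictAssociation([ProductRecommendations].[Products], 3)
    FROM [ProductRecommendations]
    NATURAL PREDICTION JOIN
    (
        SELECT
        (
            SELECT 'Sample DVD Title' AS [ProductName]
        ) AS [Products]
    ) AS Basket

Because a query like this runs against the trained model rather than the underlying transaction tables, it typically returns quickly, which helps meet the responsiveness requirement described earlier.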

Alternative Solution: Build a Feed from the Data Warehouse to the E-Commerce System

Because the e-commerce application already has some built-in BI capabilities, we could use these features for customer segmentation and product recommendations if we built a feed from the corporate data warehouse to supply extra information, such as detailed customer profile information or even sales totals for other channels.

However, this approach is not recommended in this case because it would be impractical to meet the business requirements. Product recommendations need to be based on sales through the physical store channel as well as online transactions, and copying all the sales transactions into the e-commerce application's database is not viable. Also, customer segmentation is a core activity for the marketing department, which needs access to all the rich information in the data warehouse.

In summary, although many e-commerce applications have built-in analytical functionality, most large retailers that also have physical stores will already have an investment in a data warehouse, and the most appropriate approach will often be to find ways to extend this with information from the Internet channel.


Extracting Information from IIS Logs

Although our example solution takes advantage of the log-parsing facilities built into the e-commerce application, many companies have built custom Web sites where the only available tracking information is the raw IIS logs.

The first step in extracting this information is to parse the log files and extract the information into a staging database. You could create an Integration Services package to perform this extraction, possibly with an additional tool to make the data easier to work with. Microsoft has a free Log Parser utility (www.logparser.com), and third-party parsers are also available.
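For example, Log Parser can read IIS logs in W3C format and bulk-load selected fields into a staging table. The command below is a sketch, shown wrapped for readability; the server, database, and table names are placeholders:

    LogParser -i:IISW3C -o:SQL -server:MyServer -database:Staging
        -createTable:ON "SELECT date, time, c-ip, cs-username,
        cs(Cookie), cs-uri-stem, cs-uri-query
        INTO WebLogStaging FROM ex*.log"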

However, after you have extracted the raw information, the real fun begins, and it is not for the faint of heart. Finding discrete "sessions" involves looking for an identifier in the logs, such as a username or cookie, and then grouping each user's requests into time windows that can be treated as a single "visit" fact.
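A common approach is to order each user's requests by time and start a new session whenever the gap since the previous request exceeds a threshold such as 30 minutes. A simplified T-SQL sketch, assuming a hypothetical WebLogStaging table with UserName and RequestTime columns:

    -- Mark the first request of each session: a request starts a new
    -- session when the same user has no earlier request within the
    -- preceding 30 minutes.
    SELECT
        UserName,
        RequestTime,
        CASE WHEN EXISTS
            (SELECT 1
             FROM WebLogStaging AS prev
             WHERE prev.UserName = cur.UserName
               AND prev.RequestTime < cur.RequestTime
               AND prev.RequestTime >= DATEADD(minute, -30, cur.RequestTime))
        THEN 0 ELSE 1 END AS IsSessionStart
    FROM WebLogStaging AS cur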

If you also want to look at what pages the users visited, you need to parse URLs to deal with pages that are parameterized with identifiers such as product IDs (for example, product.aspx?ID=322442). On the whole, it is generally much easier to take advantage of an e-commerce application's existing parsing facilities if they exist, or otherwise find an existing tool that meets your needs.
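If you do have to parse the URLs yourself, ordinary string functions are often enough for simple cases. A minimal T-SQL sketch, again against the hypothetical WebLogStaging table and assuming query strings of the simple form ID=322442:

    -- Extract the product ID from query strings such as 'ID=322442'.
    SELECT
        UriStem,
        CASE WHEN UriQuery LIKE 'ID=%'
             THEN SUBSTRING(UriQuery, 4, LEN(UriQuery) - 3)
             ELSE NULL
        END AS ProductID
    FROM WebLogStaging
    WHERE UriStem = '/product.aspx'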


Business Benefits

The solution will deliver the following benefits:

  • Targeting direct mail and other marketing at identified groups of customers who are likely to use the Internet channel will decrease the overall cost of marketing and increase the company's market share.

  • Profitability will be improved by increasing the average number of items sold per transaction; for example, selling more DVDs in a single transaction incurs only one shipping cost.


