Before you can successfully load the historical sales data for 1996 and 1997 into the SalesFact table in the SBS_OLAP database, you must add the appropriate time records to the time dimension table using the PopulateTimeDimension package.
After the time records are added to the TimeDim table, you can use the LoadHistoricalData package to load dimension data into the ProductDim and CustomerDim tables from the SBS_OLTP database, and fact data into the SalesFact table from delimited text files. The historical data that you will load is located in the C:\Microsoft Press\SQL DTS SBS\Ch10\ChapterFiles\Data folder. The LoadHistoricalData package is currently configured to load fact data from the Sales1996.txt and Sales1997.txt files in the C:\Microsoft Press\SQL DTS SBS\DataMovementApplication folder. You will edit the Config.ini file to point the LoadHistoricalData package to the SalesData1996.txt and SalesData1997.txt files in the C:\Microsoft Press\SQL DTS SBS\Ch10\ChapterFiles\Data folder. After you edit the Config.ini file, you will execute the LoadHistoricalData package to load this historical data into the SBS_OLAP database, and then you will execute the MasterUpdate package to process this data into the Sales cube. Finally, you will verify that the historical data loaded successfully by viewing the AuditEvents table and browsing the Sales cube using Analysis Manager.
Using Windows Explorer, navigate to C:\Microsoft Press\SQL DTS SBS\DataMovementApplication.
Right-click UpdateTimeDim.cmd and then click Edit.
Change the StartDate global variable value to 7/1/1996.
Change the EndDate global variable value to 1/1/1998.
Save the changes to the UpdateTimeDim.cmd batch file. Do not close the UpdateTimeDim.cmd file in Microsoft Notepad.
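After the edit, the batch file passes the new date range to the package through DTSRun /A switches. The sketch below is hypothetical: the actual .dts file path, package name, variable type IDs, and logging mechanism in your copy of UpdateTimeDim.cmd may differ, so treat this only as an illustration of the DTSRun global-variable syntax.

```bat
REM Hypothetical sketch of UpdateTimeDim.cmd after editing.
REM /F names the structured storage file, /N the package, and each
REM /A supplies a global variable as name:typeid=value (7 = VT_DATE).
REM The file path, package name, and redirection target shown here are
REM assumptions; check them against your actual batch file.
DTSRun /F"C:\Microsoft Press\SQL DTS SBS\DataMovementApplication\UpdateTimeDim.dts" ^
  /N"PopulateTimeDimension" ^
  /A"StartDate":"7"="7/1/1996" /A"EndDate":"7"="1/1/1998" ^
  > UpdateTimeDimensionExecutionLog.txt
```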
Switch to the DataMovementApplication folder in Windows Explorer, and then double-click UpdateTimeDim.cmd.
After this batch file completes its execution, double-click the UpdateTimeDimensionExecutionLog.txt file in the DataMovementApplication folder to verify that the PopulateTimeDimension package executed successfully.
After reviewing the execution log, close the UpdateTimeDimensionExecutionLog.txt file in Notepad.
Now that you have added time records to the time dimension table for the period covered by the historical data, you are ready to modify the Config.ini file to point to the historical data located in the C:\Microsoft Press\SQL DTS SBS\Ch10\ChapterFiles\Data folder.
Switch to the DataMovementApplication folder using Windows Explorer, and then double-click Config.ini.
In the [HistoricalData] section, change the 1996Data key to C:\Microsoft Press\SQL DTS SBS\Ch10\ChapterFiles\Data\SalesData1996.txt, and then change the 1997Data key to C:\Microsoft Press\SQL DTS SBS\Ch10\ChapterFiles\Data\SalesData1997.txt.
Save the modified Config.ini file. Do not close the Config.ini file in Notepad.
You have just modified the Config.ini file to point to the new files, so you are ready to execute the LoadHistoricalData package.
Click Start and then click Run.
In the Open box, type DTSRunUI and click OK.
You are using the DTSRunUI program because you have not created a batch file to run the LoadHistoricalData package. You do not need a batch file because you will run this package only once.
Select StructuredStorageFile in the Location list, type C:\Microsoft Press\SQL DTS SBS\DataMovementApplication\LoadHistoricalData.dts in the File Name box, type LoadHistoricalData in the PackageName box, and then click Run.
Type mypassword in the Password box and then click OK.
The ProductDim, CustomerDim, and SalesFact tables were successfully loaded with historical data. The SalesStage table was used to stage the historical fact data before it was joined with dimension data from the three dimension tables and loaded into the SalesFact table.
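If you want a spot check beyond the status that DTSRunUI reports, a quick row count in SQL Query Analyzer confirms that each table received data. The counts you see depend on the contents of the sample files, so no specific numbers are shown here:

```sql
-- Sanity check: confirm the historical load populated each table.
-- Expected row counts depend on the sample data files.
SELECT COUNT(*) AS ProductRows  FROM SBS_OLAP.dbo.ProductDim;
SELECT COUNT(*) AS CustomerRows FROM SBS_OLAP.dbo.CustomerDim;
SELECT COUNT(*) AS FactRows     FROM SBS_OLAP.dbo.SalesFact;
```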
Click Done and then click Cancel to close the DTS Run dialog box.
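Had you preferred the command line to the dialog box, the same one-time run could be launched with DTSRun directly. The /F, /N, and /M switches correspond to the file name, package name, and package password you just entered in DTSRunUI; this is simply an equivalent invocation, not a step you need to perform:

```bat
REM Equivalent one-time run from the command line instead of DTSRunUI.
REM /F = structured storage file, /N = package name, /M = package password.
DTSRun /F"C:\Microsoft Press\SQL DTS SBS\DataMovementApplication\LoadHistoricalData.dts" ^
  /N"LoadHistoricalData" /M"mypassword"
```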
Now that you have loaded the historical data, you are ready to process that data into the Sales cube using the MasterUpdate package.
Switch to the DataMovementApplication folder in Windows Explorer, and then double-click DeleteAllStagingData.cmd.
Executing the MasterUpdate package using the DeleteAllStagingData.cmd batch file executes the packages in the data movement application based on a giConfigID global variable value of 3. The global variables in the PackageGVs table associated with the giConfigID value of 3 specify that each subpackage execute its delete phase and that the MasterUpdate package process the Sales cube. Processing the Sales cube enables its data to be browsed. You must delete the data in each staging table before you load additional sales data to avoid duplicate data in the Sales cube.
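To see which global variable values the batch file applies, you can query the PackageGVs table for configuration 3. This query assumes the PackageGVs table lives in the SBS_OLAP database and filters on a giConfigID column, which is how the text describes it; adjust the names if your schema differs:

```sql
-- List the global variables associated with configuration 3
-- (assumes PackageGVs is in SBS_OLAP and has a giConfigID column).
SELECT *
FROM SBS_OLAP.dbo.PackageGVs
WHERE giConfigID = 3;
```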
After this batch file completes its execution, switch to SQL Query Analyzer and click Clear Window on the toolbar.
In the query pane, type SELECT * FROM SBS_OLAP.dbo.AuditEvents ORDER BY ExecutionDate and then click Execute on the toolbar.
Notice that the delete phases of the UpdateCustomerDim, UpdateProductDim, and UpdateSalesFacts subpackages executed, followed by the execution of the ProcessSalesCube task in the MasterUpdate package.
You have verified that the data in the staging tables was deleted and that the Sales cube was processed. Now you are ready to browse the Sales cube using Analysis Manager.
Click Start, point to All Programs, point to Microsoft SQL Server, point to Analysis Services, and then click Analysis Manager.
In the Analysis Manager console tree, expand Analysis Servers, expand your Analysis Services instance, expand SQL DTS SBS, expand Cubes, and then click Sales.
In the details pane, click the Data tab.
In the details pane, drag the Time dimension and drop it just above the Bill Country field in the Customer dimension.
Analysis Manager displays sales information sliced by the Time dimension. Notice that the Sales cube contains data only for 1996 and 1997.
Do not close Analysis Manager.
Now that you have verified that you can browse the Sales cube and that it contains only the appropriate historical data, you are ready to add new sales data for the month of January 1998.