Package Development Features

Now you have seen all the wonderful tasks and transformations provided by Integration Services. However, just as with all those gadget commercials on TV, you need to wait because there's more. Integration Services also provides a number of additional features to aid in the package development process. In this section, we explore a number of features that make Integration Services packages easy to program and easy to debug.

Programming in Integration Services Packages

In DTS, creating packages was more like creating jobs in SQL Server Agent than it was like creating programs in Visual Studio. We would string together a series of steps that were to execute as a unit. We could do some programming—scripting, actually—inside a DTS package, but it was not elegant and it certainly was not fast.

Integration Services takes us much closer to the program development paradigm. We can now create variables with strong types and scopes. We can use those variables in expressions throughout our Integration Services packages, while still maintaining speed and scalability. In addition, we can define event handlers to react to certain occurrences within a package.


Variables

Variables are used within Integration Services packages to pass information between the various parts of the package. This could be from one data transformation to another, from a control flow task to a connection manager, and so forth. In prior discussions of tasks, data sources, data transformations, and data destinations, we have already seen several examples of variables in use.

Variables are managed using the Variables window shown in Figure 7-93. (The Variables window can be opened by selecting View | Other Windows | Variables from the Main menu or by right-clicking the designer area and selecting Variables from the Context menu.) Variables are added using the Add Variable button in the Variables window toolbar. The variable's name and data type must be specified when the variable is created. In addition, an initial value can be provided, if desired. Unlike variables in many scripting environments, variables in Integration Services are strongly typed. This allows memory allocation planning to be done in advance rather than at run time, leading to more efficient operation. A variable's name, data type, and initial value can be edited right in the grid of the Variables window. Variables are deleted using the Delete Variable button in the Variables window toolbar.

Figure 7-93: The Variables window

Each variable has its own scope, which is displayed in the Scope column in Figure 7-93. The variable's scope is the set of locations where the variable can be accessed and utilized. The Integration Services item that is selected when a variable is created is the item that appears in the Scope column. A variable can be used in expressions and other code that resides in the item specified in the variable's Scope column. The variable can also be used within any items that are contained inside this item. For example, if a For Loop container is selected when a variable called LoopCounter is created, that LoopCounter variable can be used in any task that is placed inside the For Loop container.

The Show System Variables button in the Variables window toolbar toggles the grid between showing and hiding all of the predefined system variables. These system variables let us include information about the package, the user executing the package, and the computer the package is running on in the expressions and code created within the package.

The Choose Variable Columns button in the Variables window toolbar displays the Choose Variables Columns dialog box, which enables us to select the columns displayed in the grid. Two columns are not displayed by default: the Namespace column and the Raise Change Event column. A namespace simply provides a way to create logical groupings of variables for easier debugging and maintenance. All of the system variables are in the System namespace. By default, all of the variables we create are in the User namespace. We can change the namespace of these user-defined variables by using the Choose Variables Columns dialog to display the Namespace column, and then editing the namespace in the grid.

Using the Raise Change Event column, we can choose to trigger an event each time the value of a variable changes. We can then configure an event handler that executes each time the value changes. This can be done for both system- and user-defined variables. In fact, this is the only property of the system variables that we can edit in the Variables window grid. We talk more about events and event handlers in the section "Event Handlers."


Expressions

Expressions let us modify the behavior of a package by evaluating conditions and changing properties at run time. Integration Services uses an expression syntax that is similar to that of the C# programming language. This syntax works with the strongly typed variables discussed earlier. As was mentioned previously, strong typing is necessary to implement the advanced memory management required for top performance.

Literals Literals are used in expressions to represent values that are known at the time the code is written. Literals can be strings, numerics, or Booleans. String literals are enclosed in double quotes ( "" ). The following escape sequences can be used in string literals to represent nonprinting characters:

Escape Sequence   Meaning
\a                Alert
\b                Backspace
\f                Form feed
\n                New line
\r                Carriage return
\t                Horizontal tab
\v                Vertical tab
\"                Double quote
\\                Backslash
\xhhhh            Unicode character in hexadecimal notation

Numeric literals can be expressed as integers, as decimal numbers, or as scientific notation. They can include a negative sign, but not a digit grouping symbol (the comma is the digit grouping symbol when using U.S. regional settings). Numeric literals may include a suffix that specifies the data type to assign to it.

Integer values may use the following suffixes:

Suffix     Integration Services Data Type
U or u     Unsigned integer (DT_UI4)
L or l     Long integer (DT_I8)
UL or ul   Unsigned long integer (DT_UI8)



Real values and scientific notation may use the following suffixes:

Suffix   Integration Services Data Type
F or f   Single-precision float (DT_R4)
L or l   Double-precision float (DT_R8)


The suffixes are not case-sensitive.
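To make the suffix rules concrete, here is a small illustrative parser. It is our own helper, not anything Integration Services provides, and it assumes an unsuffixed literal is a plain signed integer (DT_I4):

```python
def parse_int_literal(text: str):
    """Split an integer literal into (data type, value), treating the
    suffix case-insensitively, as the table above describes."""
    upper = text.upper()
    # Check the two-character suffix first so "UL" is not misread as "L".
    for suffix, dtype in (("UL", "DT_UI8"), ("U", "DT_UI4"), ("L", "DT_I8")):
        if upper.endswith(suffix):
            return dtype, int(text[:-len(suffix)])
    return "DT_I4", int(text)  # no suffix: plain signed integer

print(parse_int_literal("100"))    # ('DT_I4', 100)
print(parse_int_literal("100u"))   # ('DT_UI4', 100)
print(parse_int_literal("-100L"))  # ('DT_I8', -100)
print(parse_int_literal("100ul"))  # ('DT_UI8', 100)
```

Note that the negative sign is part of the literal's value, while the suffix only selects the data type.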

Boolean literals are simply spelled as true and false. They do not include double quotes. Boolean literals are not case-sensitive.

Identifiers Identifiers are used in expressions to represent values that are not known until run time. Identifiers can represent variables. For example:

 @FileName @_LoopCounter @PackageName 

Identifiers that represent variables are always preceded by an at sign ( @ ). Identifiers can also represent data flow columns. For example:

 Customer# AmountWith6PercentTax 

Identifiers that fit the following rules are called regular identifiers:

  • The first character is a letter or an underscore ( _ ).

  • The remaining characters are letters, numbers, an underscore ( _ ), an at sign ( @ ), a dollar sign ( $ ), or a pound sign ( #).

The at sign that precedes each variable name is not considered to be part of the variable name when determining if it is a regular identifier. All of the identifiers in the previous paragraph are regular identifiers.

If an identifier does not fit these criteria, it is a qualified identifier. All this means is it must be enclosed in square brackets. For example:

 @[Continue Processing Flag] [Customer Name] [12MonthTotal] 

The first two are qualified identifiers because they contain spaces. The third is a qualified identifier because it does not start with a letter or an underscore. If a variable name is a qualified identifier, the at sign is placed outside of the square brackets.
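As an aside, the regular-identifier rules can be captured in a short check. This is an illustrative Python sketch, not part of Integration Services; the function name and regular expression are our own:

```python
import re

# A regular identifier starts with a letter or an underscore; the remaining
# characters may be letters, digits, underscore, @, $, or #.
_REGULAR = re.compile(r"^[A-Za-z_][A-Za-z0-9_@$#]*$")

def needs_brackets(identifier: str) -> bool:
    """Return True if the identifier must be written as a qualified
    identifier, that is, enclosed in square brackets."""
    name = identifier.lstrip("@")  # the @ prefix is not part of the name
    return _REGULAR.match(name) is None

print(needs_brackets("@FileName"))      # False - regular identifier
print(needs_brackets("Customer#"))      # False - # is an allowed character
print(needs_brackets("Customer Name"))  # True  - contains a space
print(needs_brackets("12MonthTotal"))   # True  - starts with a digit
```

The leading at sign is stripped before the test, mirroring the rule that it is not considered part of the variable name.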

If two identifiers with the same name are within a given scope, we must provide additional information to allow the package to determine which identifier to use. When two variables have the same name, they must have different namespaces. We can then include the namespace with the variable name to make the reference unique. The name resolution operator, ::, is placed between the namespace and the variable name. Whenever the name resolution operator is used, the identifier is always placed in square brackets, even if it is a regular identifier. For example:

 @[User::UserName] @[System::UserName] 

When two data flow columns have the same name, we can include the name of the data flow source or transformation item where each column was created. A period is placed between the item name and the column name. For example:

 FlatFileSource.CustomerName DataConversion.CustomerName 

If either the item name or the column name is not a regular identifier, they must be enclosed in square brackets. In all cases, the period is not enclosed in the brackets. For example:

 FlatFileSource.[Customer Number] [Character Map].ContactName [Derived Column].[12MonthTotal] 

Operators The following operators are supported by expressions:

Operator      Description                                     Example
(data type)   Data type conversion (cast)                     (DT_I4)"100"
( )           Grouping, to control evaluation order           (4+5) * (4+7)
+             Addition                                        34 + 25
+             String concatenation                            "Sally" + " " + "Jones"
-             Subtraction                                     34 - 25
-             Negation                                        -@Quantity
*             Multiplication                                  34 * 25
/             Division                                        39 / 3
%             Modulo division (remainder of the division)     41 % 4
||            Logical OR                                      @LoopCount < 5 || @AllDone
&&            Logical AND                                     @Continue && @DataIsValid
!             Logical NOT                                     !@AllDone
|             Bitwise inclusive OR                            @Flags | @MaskVar
^             Bitwise exclusive OR                            @Flags ^ @MaskVar
&             Bitwise AND                                     @Flags & @MaskVar
~             Bitwise NOT                                     ~@Flags
==            Equal To                                        @Quantity == @MaxValue
!=            Not Equal To                                    @Quantity != @MaxValue
>             Greater Than                                    @Quantity > @MaxValue
<             Less Than                                       @Quantity < @MaxValue
>=            Greater Than or Equal To                        @Quantity >= @MaxValue
<=            Less Than or Equal To                           @Quantity <= @MaxValue
? :           Conditional                                     @Counter == @MaxCount ? @DoneFlag : @ContinueFlag

Functions The following mathematical functions are supported by expressions:

Function   Description
ABS        Returns the absolute value of a numeric expression
EXP        Returns the exponential of a numeric expression
CEILING    Returns the smallest integer that is greater than or equal to a numeric expression
FLOOR      Returns the largest integer that is less than or equal to a numeric expression
LN         Returns the natural logarithm of a numeric expression
LOG        Returns the base-10 logarithm of a numeric expression
POWER      Returns the result of raising a numeric expression to a power
ROUND      Returns the numeric expression rounded to the specified number of decimal places; ROUND(87.3863, 2) returns 87.39
SIGN       Returns the positive (+1), zero (0), or negative (-1) sign of a numeric expression
SQUARE     Returns the square of a numeric expression
SQRT       Returns the square root of a numeric expression



The following string functions are supported by expressions:

Function    Description
CODEPOINT   Returns the Unicode value of the leftmost character of a string expression; CODEPOINT("Anderson Co.") returns 65
FINDSTRING  Returns the one-based index of the first occurrence of a character string within a string expression
HEX         Returns a string representing the hexadecimal value of an integer expression
LEN         Returns the number of characters in a string expression
LOWER       Returns the lowercase version of a string expression; LOWER("Anderson Co.") returns "anderson co."
LTRIM       Returns a string expression with all of the leading blanks removed
REPLACE     Returns a string with a set of characters replaced by another set of characters or with an empty string
REPLICATE   Returns a string copied a specified number of times
REVERSE     Returns a string expression in reverse order
RIGHT       Returns the specified number of characters from the end of a string expression; RIGHT("Anderson Co.", 7) returns "son Co."
RTRIM       Returns a string expression with all of the trailing blanks removed
SUBSTRING   Returns the specified portion of a string expression; SUBSTRING("Anderson Co.", 3, 6) returns "derson"
TRIM        Returns a string expression with all of the leading and trailing blanks removed
UPPER       Returns the uppercase version of a string expression; UPPER("Anderson Co.") returns "ANDERSON CO."

The following date functions are supported by expressions:

Function     Description
DATEADD      Returns a new date based on a set time period added to a specified date; DATEADD("day", 100, (DT_DBTIMESTAMP)"1/1/2005") returns 2005-04-11 00:00:00
DATEDIFF     Returns the time period between two specified dates; DATEDIFF("day", (DT_DBTIMESTAMP)"1/1/2005", (DT_DBTIMESTAMP)"4/11/2005") returns 100
DATEPART     Returns an integer representing the selected part of the specified date; DATEPART("day", (DT_DBTIMESTAMP)"4/11/2005") returns 11
DAY          Returns an integer that represents the day portion of the specified date; DAY((DT_DBTIMESTAMP)"4/11/2005") returns 11
GETDATE      Returns the current system date and time; for example, 2005-04-11 11:39:43
GETUTCDATE   Returns the current system date and time in Coordinated Universal Time (UTC); for example, 2005-04-11 16:39:43
MONTH        Returns an integer that represents the month portion of the specified date
YEAR         Returns an integer that represents the year portion of the specified date



The following null functions are supported by expressions:

Function   Description
ISNULL     Returns True if the expression is null; otherwise, returns False
NULL       Returns a null value of the requested data type


Event Handlers

The event handlers within a package enable us to create a control flow that executes in reaction to a certain event. The event could be an error, a warning, or a change to the value of a variable. The event could also be the beginning or the completion of a task within the package.


Remember, the Variable Value Changed event is triggered only if you have the Raise Change Event property set to True for one or more variables in the package.

The control flow that we create to respond to an event looks exactly the same as the control flow created for the package as a whole. While the package control flow is executed exactly once each time the package is run, an event handler control flow may be executed many times or not at all. The execution of the event handler control flow depends on what happens during the running of the package. If no error conditions are encountered, the OnError event is never fired. If 20 error conditions are encountered (and this does not terminate the package as a whole), the OnError event handler is executed 20 times.
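The once-per-event firing behavior can be sketched with a toy dispatcher. This is purely illustrative Python, not an Integration Services API; all of the names here are our own:

```python
class PackageEvents:
    """Toy event dispatcher: a registered handler runs once per raised
    event, so it may fire many times or not at all during one run."""
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def raise_event(self, event, payload=None):
        for handler in self.handlers.get(event, []):
            handler(payload)

errors_logged = []
events = PackageEvents()
events.on("OnError", lambda err: errors_logged.append(err))

# A clean run raises no OnError events, so the handler never fires.
# A run that hits 20 recoverable errors fires it 20 times.
for i in range(20):
    events.raise_event("OnError", f"row {i} failed")

print(len(errors_logged))  # 20
```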

Event handlers are useful for such things as:

  • Checking the state of the host server to make sure it is appropriate for the Integration Services package to execute

  • Sending an e-mail to the administrator when an error occurs

  • Doing any necessary cleanup after a process has completed

To create an event handler, click on the Event Handlers tab shown in Figure 7-94. Use the Executable drop-down list to select an item in the Integration Services package. Use the Event Handler drop-down list to select the event. Now drag-and-drop items from the Toolbox to create the functionality for this event handler.

Figure 7-94: The Event Handlers tab

Package Development Tools

The aids to Integration Services package development do not stop with the rich programming environment. Several features help with package creation and testing. These features make Integration Services a truly robust data transformation environment.

Import/Export Wizard

The Import/Export Wizard lets us quickly transfer data from one location to another. After we walk through the pages of the wizard to define the data transfer, the wizard creates an Integration Services package and executes that package. The package created by the wizard can also be saved for future use.

To launch the Import/Export Wizard from the Business Intelligence Development Studio, select Project | SSIS Import and Export Wizard from the Main menu. The Import/Export Wizard can also be launched from the SQL Server Management Studio, as shown in Figure 7-95. To accomplish this, right-click on a database entry in the Object Explorer window and select either Tasks | Import Data or Tasks | Export Data from the Context menu. The only difference between these two menu items is that the Import Data item defaults the destination to the database that was clicked, while the Export Data item defaults the source to the database that was clicked.

Figure 7-95: Launching the Import/Export Wizard from SQL Server Management Studio

When the wizard is launched from the SQL Server Management Studio, we can choose to save the Integration Services package to a SQL Server or to a file for future use. When it is launched from the Business Intelligence Development Studio, the package is automatically saved to the Integration Services project. A saved package can be executed again in the future in its current form or it can be edited in an Integration Services project to serve as the starting point for a more complex data transformation.


Logging

Because Integration Services packages are, for the most part, designed for unattended operation, it can be extremely important to create a log documenting the execution of the package. This type of execution log can also be helpful for testing and debugging during the creation of the package. We control the logging performed by an Integration Services package using the Configure SSIS Logs dialog box shown in Figure 7-96.

Figure 7-96: The Providers and Logs tab of the Configure SSIS Logs dialog box

To display the Configure SSIS Logs dialog box, open an Integration Services package in the Business Intelligence Development Studio and select Logging from the SSIS menu. Use the Providers and Logs tab to determine where the information should be logged. Select the location for the logging from the Provider Type drop-down list, and then click Add to create a new log. We can create the following types of logs:

  • Comma-separated values text file

  • File to be read by the SQL Profiler

  • SQL Server Table named sysdtslog90

  • Windows Event Log

  • XML text file

All of the log types, with the exception of the Windows Event Log, need to be configured to specify exactly where the logged information is to be stored. A connection manager is used to determine where each log type will store its entries. Click in the Configuration column to select an existing connection manager or create a new one.

In the Containers list, select either the entry for the package itself or an entry for one of the items in the package. Check the check box for an item to enable logging for that item. If a check box is gray, it is being enabled or disabled along with its parent item. You can break this association with the parent by clicking on the grayed check box. Once you have enabled logging for an item, click one or more check boxes in the Name column for each of the log types that should be active for that item. Multiple log types can be selected.

Finally, we need to determine which events should be logged for the package or for a package item. On the Details tab, shown in Figure 7-97, check the check boxes for the events that should be logged for this item and any child items that remain associated with it. If no events are checked, only the beginning and ending of an item's execution will be logged.

Figure 7-97: The Details tab of the Configure SSIS Logs dialog box

In addition to the log types discussed, you can also view the log entries for a package by using the Log Events window in the Business Intelligence Development Studio. Logging must be enabled for the package or for one or more items in the package before log events are displayed in the Log Events window. However, you do not need to have any log types enabled for the events to appear in the Log Events window. The Log Events window is cleared each time you run the package.


Transactions

As with operations in Transact-SQL, we can use transactions in Integration Services packages to ensure that our data remains in a consistent state. By using transactions, we can ensure that a series of items in a package all complete successfully or the entire process is rolled back to the state it was in prior to the attempted changes. We can even use distributed transactions to commit or roll back changes spread across multiple servers.

Each of the items in the Control Flow Toolbox and the Integration Services package itself has a TransactionOption property that determines how it deals with transactions. This property has three possible settings:

  • Supported An item with this property setting joins a transaction if one is already active, but it does not initiate a transaction itself. This is the default setting for the property.

  • Required An item with this property setting joins a transaction if one is already active and will initiate a transaction if one is not currently active.

  • NotSupported An item with this property setting does not join a transaction if one is already active and does not initiate a transaction if one is not currently active.

The TransactionOption property is set by selecting an item and then modifying the property in the Properties window. It is not included in the Editor dialog box for each item.

Consider the sample Integration Services package with the TransactionOption property settings shown in Figure 7-98. Because the package has its TransactionOption set to Required, a transaction is created when package execution begins. We call this Transaction 1, as shown in Figure 7-99. The File Transfer Protocol task participates in Transaction 1 because its TransactionOption is set to Supported.

Figure 7-98: Sample Integration Services package

Figure 7-99: Transactions created when executing the Sample Integration Services package

The Foreach Loop Container does not participate in Transaction 1 because its TransactionOption is set to NotSupported. The Data Flow task inside the Foreach Loop Container cannot participate in Transaction 1, even though its TransactionOption is set to Required. The reason for this is its parent object, the Foreach Loop Container, is not participating in Transaction 1. Instead, because its TransactionOption is set to Required, it initiates a second transaction called Transaction 2. All items in the Data Flow task participate in Transaction 2.

If any of the items in the Data Flow task should fail, all of the data changes within Transaction 2 are rolled back. Also, because Transaction 2 is nested within Transaction 1, all of the data changes within Transaction 1 are rolled back as well. If, on the other hand, all of the items in the Data Flow task complete successfully, the data changes in Transaction 2 are committed as soon as the Data Flow task completes. In other words, a rollback of Transaction 2 causes a rollback of Transaction 1, but not the other way around.

The File System task has its TransactionOption set to Supported, but there is no transaction for it to join. It is outside of Transaction 2, and it cannot join Transaction 1, because its parent object is not participating in that transaction. Therefore, the File System task is not a member of any transaction. Its success or failure does not affect either Transaction 1 or Transaction 2.

The Execute T-SQL Statement task has its TransactionOption set to Required. It does not need to create a new transaction, however, because it can join the active transaction, Transaction 1. If the Execute T-SQL Statement task should fail, it rolls back all data changes in Transaction 1. As mentioned earlier, the success or failure of the Execute T-SQL Statement task in Transaction 1 has no effect on Transaction 2.

The Execute Process task has its TransactionOption set to Supported. It also joins the active transaction, Transaction 1. Again, its success or failure only affects Transaction 1.
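The rules illustrated by this example can be summarized in a small decision function. This is an illustrative Python model, not Integration Services code; all names are our own. Given an item's TransactionOption setting and the transaction (if any) its parent is participating in, it returns the transaction the item runs in:

```python
def effective_transaction(option, parent_txn, new_txn_factory):
    """Model the three TransactionOption settings:
    Supported    - join the parent's transaction if one is active
    Required     - join the parent's transaction, or start a new one
    NotSupported - never participate in a transaction
    """
    if option == "NotSupported":
        return None
    if parent_txn is not None:
        return parent_txn          # Supported and Required both join
    if option == "Required":
        return new_txn_factory()   # Required starts its own transaction
    return None                    # Supported, but nothing to join

# Replaying the scenario from Figure 7-98:
counter = iter(range(1, 10))
new_txn = lambda: f"Transaction {next(counter)}"

t1 = effective_transaction("Required", None, new_txn)        # package
ftp = effective_transaction("Supported", t1, new_txn)        # FTP task
loop = effective_transaction("NotSupported", t1, new_txn)    # Foreach Loop
dataflow = effective_transaction("Required", loop, new_txn)  # Data Flow task

print(t1, ftp, loop, dataflow)  # Transaction 1 Transaction 1 None Transaction 2
```

The Data Flow task ends up in a new Transaction 2 even though Transaction 1 is active elsewhere, because participation is inherited through the parent, and its parent, the Foreach Loop Container, is not in any transaction.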


Checkpoints

There are many situations where an Integration Services package could be restarted at some point in the middle of the control flow after a failure, rather than reexecuting the entire package from the beginning. Perhaps the first portion of a package deletes an old import text file and uses FTP to download a new copy of this import file. If the package fails at some point later in the process, it is probably not necessary to perform the file deletion and download again. Checkpoints provide a mechanism for performing this restart in the middle of the package.

The package includes three properties that let us enable checkpoint restarting: the SaveCheckpoints property, the CheckpointFileName property, and the CheckpointUsage property. The SaveCheckpoints property is a Boolean flag that enables or disables the saving of checkpoint information. This property must be set to True for checkpoint information to be saved.

The CheckpointFileName property lets us specify the filename where checkpoint information is to be saved. When a package executes, information about each successfully completed task and each committed transaction is saved to this checkpoint file. In the event of a package failure, this file is used to determine where the package can be restarted.

The CheckpointUsage property determines how the checkpoint file is used when a package is executed. When the CheckpointUsage property is set to Never, the checkpoint file is not used when executing a package. The package is always run from beginning to end. When the CheckpointUsage property is set to Always, the package always reads the checkpoint file and starts from the last point of failure. When the CheckpointUsage property is set to IfExists, the package looks for a checkpoint file and uses it to restart from the last point of failure if it exists. If a checkpoint file does not exist, the package is run from beginning to end.
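These three settings can be sketched as a small decision function. This is an illustrative Python model, not a run-time API; the assumption that the Always setting fails when no checkpoint file exists (since it always reads the file) is ours:

```python
def resume_point(usage, checkpoint_exists):
    """Model where a package starts, given its CheckpointUsage setting.
    Returns 'beginning' or 'last failure'."""
    if usage == "Never":
        return "beginning"                 # checkpoint file is ignored
    if usage == "Always":
        if not checkpoint_exists:
            # Assumption: Always needs the file to be present to run.
            raise FileNotFoundError("checkpoint file required but missing")
        return "last failure"
    if usage == "IfExists":
        return "last failure" if checkpoint_exists else "beginning"
    raise ValueError(f"unknown CheckpointUsage: {usage}")

print(resume_point("Never", True))       # beginning
print(resume_point("IfExists", False))   # beginning
print(resume_point("IfExists", True))    # last failure
```

IfExists is usually the most practical choice, since the package degrades gracefully to a full run when no checkpoint file is present.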

Package Debugging

One of the most helpful tools in program development is the capability to pause program execution at a breakpoint and examine the execution state. This feature makes it much easier to pinpoint a problem and fix it. Fortunately, Integration Services now provides this functionality.

Setting Breakpoints We can set a breakpoint on any of the control flow tasks in a package. To accomplish this, right-click the task and select Edit Breakpoints from the Context menu. The Set Breakpoints dialog box appears as shown in Figure 7-100. A breakpoint can be set on any of the events triggered by that task. We can use the Hit Count Type and Hit Count columns to determine whether execution pauses every time this event is triggered or only after a hit has occurred multiple times.

Figure 7-100: The Set Breakpoints dialog box

In addition to these task event breakpoints, breakpoints can be set on any line of code in the Script task. When execution is paused within a script, we may use the Step Into, Step Over, and Step Out features to move through the script code one line at a time. The step features do not function when paused at a task event breakpoint.

Viewing Package State While the package execution is paused at a breakpoint, there are several places to see the current execution state of the package. The windows discussed here are opened using the Debug | Windows menu, the View menu, or the Windows toolbar button.

  • Progress Color-Coding As the tasks of a package execute, their background color changes to show the current execution status: white (waiting to execute), yellow (currently executing), green (completed execution successfully), and red (completed execution with errors). While execution is paused, we can switch between the Control Flow and Data Flow tabs to see the execution status of both.

  • Row Counts As the data flow tasks of a package execute, the number of rows processed through each data flow is displayed next to the data flow arrows.

  • Progress Tab We can switch to the Progress tab to see a description of the current execution status in outline form.

  • Breakpoints Window The Breakpoints window shows all of the breakpoints set in the package and lets us disable or delete breakpoints.

  • Output Window The Output window displays status messages as the package executes.

  • Script Explorer Window The Script Explorer window enables us to view the script structure.

  • Watch Windows The Watch windows enable us to examine and modify variable values. Variable names are entered into a Watch window without a leading "@" sign.

  • Autos Window The Autos window shows the value of the variables in the current and the previous line of script code.

  • Locals Window The Locals window displays all of the system and user variables that are currently in scope.

  • Immediate Window The Immediate window is used for evaluating expressions and displaying the value of variables.

  • Call Stack Window The Call Stack window shows the hierarchy of containers around the currently executing task.

  • Threads Window The Threads window shows the threads being used to execute the current package.

  • Modules Window The Modules window lets us view script modules.

  • Processes Window The Processes window shows the currently executing processes.

Several of the debug windows are not used for debugging Integration Services packages. These are the Memory window, the Disassembly window, and the Registers window.

Viewing Data Flow In addition to viewing what is going on inside package tasks, it is also helpful (perhaps even more helpful) to see what is happening inside the data flow at various steps along the way. We do this by attaching data viewers to the data flow. A data viewer pauses the package execution and enables us to examine the rows in the data flow as the data moves from one item to another.

To attach a data viewer to a data flow, we first need to view the Data Flow Path Editor dialog box. This is done by double-clicking on a flow path arrow. The General page of the Data Flow Path Editor dialog box provides general information about this flow path arrow. The Metadata page of this dialog box provides information about the columns in the data flow at that point. This in itself can be helpful when developing and debugging packages. Data viewers are added to the data flow on the Data Viewers page.

Click Add on the Data Viewers page to attach a new data viewer to the data flow path. This displays the Configure Data Viewer dialog box shown in Figure 7-101. On the General tab of this dialog box, we have a choice of using a grid, a histogram, a scatter plot, or a column chart to display the data. The name of the second tab of this dialog box changes, depending on the type of data viewer that is selected. Use that tab to select the data flow columns that should be displayed in the data viewer.

Figure 7-101: The Configure Data Viewer dialog box

Multiple data viewers can be added to the same flow path arrow to provide different views of the same data. A data viewer icon is placed next to the flow path arrow to show that one or more data viewers have been attached. When the package executes, the package execution pauses whenever the package reaches a data flow with a data viewer attached. A data viewer window appears for each data viewer, as shown in Figure 7-102.

Figure 7-102: Data Viewer windows

The data viewers show data one buffer at a time. The number of rows in the buffer depends on the number of columns in the data flow at that point and the width of the data in each column. When one buffer of data is loaded into the data viewer, the package execution pauses until we click the Continue button (the green triangle) or we detach all of the data viewers from that data flow. Data viewers are detached by clicking Detach. When data viewers are detached, the data flow resumes without loading data into the data viewer. Data viewers can be reattached by clicking Attach. We can visually inspect the data in the data viewer or click Copy Data to copy the data to the clipboard for saving or analysis in another application.

Migrating from SQL Server 2000 DTS Packages

It should be clear by now that, even though Integration Services provides all of the functionality that was found in Data Transformation Services in SQL Server 2000, it is a completely different animal. This causes problems when migrating current DTS packages from SQL Server 2000 to SQL Server 2005. Fortunately, SQL Server 2005 gives us several options to help ease the pain.

Executing without Migration

When SQL Server 2005 is installed, it checks to see whether SQL Server 2000 is already installed on the same server. If SQL Server 2000 is present, the installation routine automatically installs the SQL Server 2000 DTS run-time engine. Even if SQL Server 2000 is not present on the server, you can manually instruct the SQL Server 2005 installation routine to install the SQL Server 2000 DTS run-time engine. This is done by clicking Advanced on the Components to Install page of the installation wizard, and then selecting Data Transformation Services 2000 Runtime on the Features Selection page.

The SQL Server 2000 DTS run-time engine lets us schedule and execute SQL Server 2000 DTS packages. Note, however, that the DTS development environment is not installed with the run time. Therefore, we need access to the SQL Server 2000 Enterprise Manager if we want to modify these DTS packages.

Executing from Within an Integration Services Package

As noted earlier in this chapter, we can execute DTS packages as part of an Integration Services package using the Execute DTS 2000 Package task. This enables us to use many of the Integration Services features, such as logging and event handlers with the DTS packages. The SQL Server 2000 DTS run-time engine must be installed to use this task.

Using the Migration Wizard

SQL Server 2005 also provides a SQL Server Integration Services Migration Wizard for converting DTS packages to Integration Services packages. To launch the Migration Wizard, use the Project menu, or right-click the SSIS Packages folder in the Business Intelligence Development Studio Solution Explorer window and select Migrate DTS 2000 Package from the Context menu. The SQL Server Integration Services Migration Wizard appears and walks you through the migration process.

Custom DTS tasks, some ActiveX Script tasks, and other items in your DTS packages may not be convertible by the Migration Wizard. The items that can be converted become tasks in a new Integration Services package. Each item that cannot be converted from DTS is placed into a new DTS package, along with any connections it requires. These intermediate packages are then called from the new Integration Services package using the Execute DTS 2000 Package task.

Delivering Business Intelligence with Microsoft SQL Server 2005
ISBN: 0072260904
Year: 2007
Pages: 112
Authors: Brian Larson