OLE DB Command


The OLE DB Command transformation is a component designed to execute a SQL statement for each row in an input stream. This task is analogous to an ADO Command object being created, prepared, and executed for each row of a result set. The input stream provides the data for parameters that can be set into the SQL statement, which is either an in-line statement or a stored procedure call. Now I don't know about you, but just hearing the phrase "for each row" in the context of SQL makes me think of another phrase: "performance degradation." This transformation fires an update, insert, or delete statement, prepared or unprepared, some unknown number of times. This doesn't mean there aren't good reasons to use this transformation; you'll actually be doing a few in this chapter. Just understand the impact and think about your use of this transformation. Pay specific attention to the volume of input rows that will be fed into it. During your design phases, weigh the performance and scalability aspects against a solution that would cache the stream into a temporary table and use set-based logic instead.
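To make the trade-off concrete, here is a minimal sketch of the two patterns. It uses Python with an in-memory SQLite database standing in for the OLE DB destination, and the table and column names are invented for illustration; the point is the shape of the work, not the exact syntax.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Deposit (CustomerID TEXT, DepositAmt REAL)")

rows = [("XY-111-222", 100.00), ("XX-Z11-232", 1000.00), ("XX-Y88-233", 555.00)]

# Per-row pattern: one statement execution per input row, which is
# exactly what the OLE DB Command transformation does.
for row in rows:
    conn.execute("INSERT INTO Deposit VALUES (?, ?)", row)

# Set-based pattern: stage the rows, then apply them in a single statement.
conn.execute("CREATE TEMP TABLE Staging (CustomerID TEXT, DepositAmt REAL)")
conn.executemany("INSERT INTO Staging VALUES (?, ?)", rows)
conn.execute("INSERT INTO Deposit SELECT * FROM Staging")

count = conn.execute("SELECT COUNT(*) FROM Deposit").fetchone()[0]
print(count)  # 3 rows from the per-row pass + 3 from the set-based pass
```

With three rows the difference is negligible; with millions, the per-row pass pays per-statement overhead millions of times while the set-based pass pays it once.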

To use the OLE DB Command transform task, you need to set up the connection where the SQL statement will run, provide the SQL statement to be executed, and configure the mapping of any parameters from the input stream into the SQL statement. Take a look at the settings for the OLE DB Command transformation by opening its editor. The OLE DB Command transform is another component that uses the Advanced Editor. There are four tabs in the editor:

  • Connection Manager: Allows the selection of an OLE DB Connection. This connection is where the SQL statement will be executed. It doesn't have to be the same connection that provides the input stream.

  • Component Properties: Here you can set the SQL statement to be executed in the SQLCommand property and set the timeout, in seconds, in the CommandTimeout property. These properties work the same way as on the ADO Command object; a CommandTimeout value of 0 indicates no timeout. You can also name the task and provide a description on this tab.

  • Column Mappings: This tab displays the columns available in the input stream and the destination columns, which are the parameters available in the SQL command. You map the columns by clicking a column in the input columns and dragging it onto the matching destination parameter. The mapping is one-to-one, so if you need to use one value for two parameters, you'll need to use a Derived Column transform to duplicate the column in the input stream prior to configuring the columns in this transform.

  • Input and Output Properties: Most of the time you'll be able to map your parameters in the Column Mappings tab. However, if the OLE DB provider doesn't provide support for deriving parameter information (parameter refreshing), you'll have to come here to manually set up your output columns using specific parameter names and DBParamInfoFlags.
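The one-to-one mapping constraint from the Column Mappings tab is worth pausing on. The idea behind duplicating a column with a Derived Column transform can be sketched as follows (a Python sketch with a hypothetical second parameter, @AUDITID, invented purely to show one value feeding two parameters):

```python
# Each input row arrives as a set of named column values.
row = {"CustomerID": "XY-111-222", "DepositAmt": 100.00}

# A Derived Column transform would add a copy of CustomerID under a new name...
row["CustomerIDCopy"] = row["CustomerID"]

# ...so each parameter can map to exactly one input column.
# @AUDITID is a hypothetical parameter used only for illustration.
param_mapping = {"@CUSTOMERID": "CustomerID", "@AUDITID": "CustomerIDCopy"}
params = {p: row[c] for p, c in param_mapping.items()}
print(params["@AUDITID"])  # same value as @CUSTOMERID
```

Without the duplicated column, the editor would not let you drag CustomerID onto a second parameter.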

The easiest way to learn this task is by example. Suppose you have a requirement to process a small daily volume of validated, electronically sent deposit entries and to run them through logic to create deposit entries in your accounting database. You also have to build payment transactions that will need to be reviewed by accounting personnel using the accounting software, which applies the money to each customer's account. Fortunately, you don't need to know how to create deposit transactions or payment transactions. You've been given two stored procedures that will do the work of building the transactions, so you'll use them in the example.

  1. Create an SSIS package named "OLE DB Command." Add a Data Flow component to the Control Flow.

  2. Create a text file containing the following entries and save it to "c:\ole db eft data.txt."

     CustomerID,DepositAmt,DepositDate,Invoice
     XY-111-222,$100.00,07/13/2005,222-063105
     XX-Z11-232,$1000.00,07/13/2005,232-063105
     XX-Y88-233,$555.00,07/13/2005,233-053105
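It helps to see how each line of this file breaks down into the four parameter values the stored procedures expect. The following is a rough sketch using Python's csv module; the currency and date conversions shown here are assumptions standing in for the type conversions the Flat File Source would perform:

```python
import csv
import io
from datetime import datetime

data = """CustomerID,DepositAmt,DepositDate,Invoice
XY-111-222,$100.00,07/13/2005,222-063105
XX-Z11-232,$1000.00,07/13/2005,232-063105
XX-Y88-233,$555.00,07/13/2005,233-053105
"""

rows = []
for rec in csv.DictReader(io.StringIO(data)):
    rows.append((
        rec["CustomerID"],                                 # varchar(10)
        float(rec["DepositAmt"].lstrip("$")),              # money -> numeric
        datetime.strptime(rec["DepositDate"], "%m/%d/%Y"), # smalldatetime
        rec["Invoice"],                                    # varchar(15)
    ))

print(len(rows), rows[0][1])  # three parameter tuples; first amount is 100.0
```

Each tuple lines up positionally with the four "?" parameter markers you'll configure in step 6.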

  3. Run the following SQL Scripts to create the simulated stored procedures in your AdventureWorks database.

     USE AdventureWorks
     GO
     CREATE PROC usp_DepositTrans_Add (
                 @CUSTOMERID varchar(10),
                 @DEPOSITAMT money,
                 @DEPOSITDATE smalldatetime,
                 @INVOICE varchar(15))
     AS
          --THIS IS A DUMMY PROCEDURE FOR DEMO PURPOSES
     GO
     CREATE PROC usp_PaymentTrans_Add (
                 @CUSTOMERID varchar(10),
                 @DEPOSITAMT money,
                 @DEPOSITDATE smalldatetime,
                 @INVOICE varchar(15))
     AS
          --THIS IS A DUMMY PROCEDURE FOR DEMO PURPOSES
     GO

  4. Add a Flat File Source to your Data Flow to consume the text file "c:\ole db eft data.txt."

  5. Add an OLE DB Command transform task to the Data Flow design surface. Connect the output of the Flat File Source to the OLE DB command transform.

  6. Configure the OLE DB Command transform task to use a connection to AdventureWorks. Update the SQLCommand property of the transform to add a deposit for each input row by setting the property to EXEC usp_DepositTrans_Add ?, ?, ?, ?. Each "?" stands in for a parameter, in order. Click Refresh to pull the parameter definitions from the stored procedure. The completed tab should look like Figure 6-15.

  7. In the Column Mappings tab, map each column in the input columns collection to a column in the destination columns collection. This should look like Figure 6-16.

  8. Add another OLE DB Command transform task to the Data Flow design surface. Connect the output of the first OLE DB Command transform to the second, and then go through the same configuration as for the deposit command, but this time set the SQLCommand property to EXEC usp_PaymentTrans_Add ?, ?, ?, ?.
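Logically, the finished Data Flow calls both procedures once per input row. The control flow reduces to something like this sketch, where Python stubs stand in for the dummy stored procedures and the counters mirror the row counts you'll see on the design surface:

```python
calls = {"deposit": 0, "payment": 0}

def usp_DepositTrans_Add(customer_id, amt, date, invoice):
    calls["deposit"] += 1  # stand-in for the dummy deposit procedure

def usp_PaymentTrans_Add(customer_id, amt, date, invoice):
    calls["payment"] += 1  # stand-in for the dummy payment procedure

input_rows = [
    ("XY-111-222", 100.00, "07/13/2005", "222-063105"),
    ("XX-Z11-232", 1000.00, "07/13/2005", "232-063105"),
    ("XX-Y88-233", 555.00, "07/13/2005", "233-053105"),
]

# The first OLE DB Command runs the deposit proc for each row; its
# output then feeds the second, which runs the payment proc per row.
for row in input_rows:
    usp_DepositTrans_Add(*row)
for row in input_rows:
    usp_PaymentTrans_Add(*row)

print(calls)  # {'deposit': 3, 'payment': 3}
```

Three input rows yield three executions of each procedure, which matches the row counts reported when the package runs.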

Figure 6-15

Figure 6-16

When you run this package, you'll see that three rows were processed by each OLE DB Command transform. If the procedures were functional, they would have created three deposit and three payment transactions. In this example, you found a good reason to use this task — reusability. It may be more efficient to process these transactions in a SQL batch, but then you'd have to stage the data and code the batch transaction. In this example, you were able to reuse existing logic that was designed for manual or one-at-a-time data entry and bundle that into an automated SSIS package fairly quickly.



Professional SQL Server 2005 Integration Services
