Okay, so you don't want to visit each Recordset row one-by-one to see which rows failed to update. You don't have to; you can use the Recordset object's Filter property to make rows that don't match the Filter criteria invisible. No, rows are not removed from the Recordset when you set a Filter—they are simply hidden. The Filter property can be set on either client-side or server-side cursors. Table 6-9 lists the Filter constants.
CONSTANT                      ALLOWS YOU TO VIEW RECORDS
adFilterAffectedRecords       Affected by the last Delete, Resync, UpdateBatch, or CancelBatch call.
adFilterConflictingRecords    Failed in the last batch update.
adFilterFetchedRecords        In the current cache—that is, the results of the last call to retrieve records from the database.
adFilterPendingRecords        Changed but not yet sent to the server. Batch update mode only.
adFilterNone                  All records in the Recordset—clears the filter.
After you call the UpdateBatch method, you can use the Filter property to show only those rows that had update problems. However, once you set the Filter property to adFilterNone to show all of the records, you can't ask to see the affected records again—that status information is cleared.
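For example, here is a minimal sketch of that sequence (assuming rs is an open, batch-mode client-side Recordset):

```vb
rs.UpdateBatch
rs.Filter = adFilterConflictingRecords   ' Show only the rows that failed
Debug.Print rs.RecordCount & " row(s) failed to update"
rs.Filter = adFilterNone                 ' Show all rows again; this clears the status info
```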
The Filter property can play an important role in the middle tier when it comes time to decide how to manage update collisions. There are a couple of things to remember, though. Suppose you get a disconnected Recordset from a business object, make changes to it, and send it back to the business object. If you also want to see which records were deleted, you need to set the Recordset object's MarshalOptions property to adMarshalAll. If you set MarshalOptions to adMarshalModifiedOnly, you'll see only new and edited rows, not the deleted rows.
The following is a Microsoft Transaction Server business object that fetches the rows for the Recordset, detaches it, and returns it to the client:
Function GetRS(TitleWanted As String) As Recordset
    Dim conn As Connection
    Dim rs As Recordset
    Set conn = New Connection
    Set rs = New Recordset
    conn.Open "provider=sqloledb;data source=(local);initial catalog=pubs;", _
        "Admin", "pw"
    rs.CursorLocation = adUseClient
    rs.Open "select Tc1, Tc2, Tc3 from TestTable", _
        conn, adOpenKeyset, adLockBatchOptimistic
    Set GetRS = rs
    rs.ActiveConnection = Nothing
    Set rs = Nothing
    conn.Close
End Function
The following routine accepts the Recordset as changed by the client and posts the changes to the database:
Function ProcessUpdates(rs As Recordset) As String
    Dim strReturn As String
    rs.Filter = adFilterPendingRecords
    strReturn = "Rows changed = " & rs.RecordCount & vbCrLf
    Do Until rs.EOF
        If rs.Status = adRecDeleted Then
            strReturn = strReturn _
                & "That row deleted, OriginalValues " & vbCrLf _
                & rs(0).OriginalValue & vbTab _
                & rs(1).OriginalValue & vbTab _
                & rs(2).OriginalValue & vbTab _
                & rs(3).OriginalValue & vbCrLf
        Else
            strReturn = strReturn _
                & "That row changed, OriginalValues " & vbCrLf _
                & rs(0).OriginalValue & vbTab _
                & rs(1).OriginalValue & vbTab _
                & rs(2).OriginalValue & vbTab _
                & rs(3).OriginalValue & vbCrLf _
                & "Here are the new values: " & vbCrLf _
                & rs(0).Value & vbTab & rs(1).Value & vbTab _
                & rs(2).Value & vbTab & rs(3).Value & vbCrLf
        End If
        rs.MoveNext
    Loop
    ProcessUpdates = strReturn
End Function
The following code is the client code. We modify two rows, delete one, and add one.
Sub Main()
    Dim rs As Recordset
    Dim obj As Object
    Set obj = CreateObject("RowsChanged.CheckStatus")
    Set rs = obj.GetRS("Data")
    Debug.Print "Rows in recordset before changes = " & rs.RecordCount
    rs.MoveFirst
    rs.Update Array(77, 88, "Fred")
    rs.MoveNext
    rs.Delete
    rs.MoveNext
    rs.Update Array(7, 8, "George")
    rs.AddNew Array(100, 200, "New Row Added")
    rs.MoveNext
    Debug.Print "Rows in recordset after changes = " & rs.RecordCount
    rs.MarshalOptions = adMarshalModifiedOnly
    Debug.Print obj.ProcessUpdates(rs)
End Sub
The following is the return string from ProcessUpdates when MarshalOptions is set to adMarshalModifiedOnly.
Rows in Recordset before changes = 6
Rows in Recordset after changes = 6
Rows changed = 4
That row changed, OriginalValues
1       1       1       One
Here are the new values:
1       77      88      Fred
That row deleted, OriginalValues
2       2       2       Two
That row changed, OriginalValues
3       3       3       Three
Here are the new values:
3       7       8       George
That row changed, OriginalValues
Here are the new values:
100     200     New row added
To help ADO know specifically which table to update (for adds, changes, or deletes), the data provider fills in the source owner, database, and table names after the query is executed. If these can't be determined, it's unlikely that ADO can construct a valid UPDATE statement. Several properties are added to the client-side Recordset's Properties collection to store this query-dependent information. You can also set these properties in code to tell ADO what to update when it can't figure that out on its own:
Unique Catalog specifies the catalog or name of the database containing the table.
Unique Schema specifies the schema or name of the owner of the table.
Unique Table specifies the name of the base table upon which updates, insertions, and deletions are allowed. This value is based on the catalog and schema (user) settings already set.
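These are dynamic properties, addressed as strings through the Recordset's Properties collection. A minimal sketch, assuming a client-side Recordset—the catalog, schema, and table names here are purely illustrative:

```vb
With rs
    .Properties("Unique Catalog") = "pubs"    ' the database containing the table
    .Properties("Unique Schema") = "dbo"      ' the table's owner
    .Properties("Unique Table") = "Sales"     ' the base table to update
End With
```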
When the SELECT statement has data from many tables and ADO decides it can build an updatable Recordset, each Field object in the Recordset carries the source database and table.
Do you want to save yourself the trouble of coding another fetch to determine the current contents of the database tables involved in your successful or failed update operation? If so, you can set the Update Resync property to one of the enumerated values listed in Table 6-10. However, the Update Resync property is only available if the Unique Table dynamic property has already been set. Update Resync instructs ADO to automatically follow the UpdateBatch method with a Resync method call. The settings in Table 6-10 determine the scope of the operation.
UPDATE RESYNC PROPERTY ENUMERATIONS

CONSTANT                 DESCRIPTION
adResyncAll              Invokes Resync for each row with pending changes.
adResyncAutoIncrement    (Default.) Attempts to retrieve the new identity value for columns that are automatically incremented or generated by the data source, such as Microsoft Jet AutoNumber fields or Microsoft SQL Server Identity columns.
adResyncConflicts        Invokes Resync for all rows in which the update or delete operation failed because of a concurrency conflict.
adResyncInserts          Invokes Resync for all successfully inserted rows. However, primary key column values are not resynchronized. Instead, contents of newly inserted rows are resynchronized based on the existing primary key value.[a]
adResyncNone             Does not invoke Resync.
adResyncUpdates          Invokes Resync for all successfully updated rows.
[a]If the primary key value has changed, Resync won't retrieve the contents of the intended row. For automatically incrementing primary key values, first call UpdateBatch with adResyncAutoIncrement to retrieve the data source-generated primary key value.
The following constants can be used in combination:
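For example, to resynchronize both the successfully updated rows and the rows that collided, you can Or (or add) the enumerated values together. A sketch, assuming a client-side batch-mode Recordset; the Sales table name is illustrative:

```vb
With rs
    .Properties("Unique Table") = "Sales"    ' Update Resync requires Unique Table to be set
    .Properties("Update Resync") = adResyncUpdates Or adResyncConflicts
    .UpdateBatch                             ' ADO follows this with an automatic Resync
End With
```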
When working with more sophisticated backend operations, you'll often need to run more than one operation at a time. These operations can be SELECTs, action queries, or combinations of the two in any order. Not all providers support multiple operations. For example, the Jet provider can't accept more than one operation at a time, but SQL Server, Oracle, and others can. While it's not at all unusual to submit several SELECT, INSERT, DELETE, UPDATE, or other TSQL statements together (what we call scripts), these multiple operations are more likely to be the result of executing stored procedures.
There is nothing in the ADO rule book that says you can't submit multiple action or SELECT queries in a single CommandText property or Source argument string, as long as you do it right. The limitation lies with the provider. This means that if your provider supports multiple operations, you can bundle together 2, 12, or 1,200 SQL operations at once. The upper limit is a function of the size of the provider's command buffer.
It's easy to construct these command scripts. All you have to do is concatenate the commands together into a single string and use the string as you would any SQL query or action. Be sure to maintain the "white space" between operators. That is, when you concatenate two strings that should be separated by a space, include the leading (or trailing) space in the concatenation expression. However, if you're doing this for performance reasons, you won't want to use the Visual Basic concatenation operator (&). Using the Mid$ function is faster, as I illustrate in the next code sample.
When constructing the query string, be sure to avoid inserting (or leaving) any binary zeros in the string. If you do, ADO executes up to the zero and stops.
Once you execute the command, and after ADO returns control to your application, you can begin to parse the returning rowsets and result sets that are generated. Generally, all commands sent to the backend for processing return a result set. No, not all result sets contain rowsets, and you don't have to process all of the rowsets, or even all of the result sets generated by ADO; you can simply close the Recordset, set it to Nothing, and forget about it.
The following code illustrates a couple of techniques you might find useful. First, I construct a TSQL CREATE TABLE command to build a new Sales test table. Note the use of fixed-length strings instead of ordinary variable-length strings.
…
Dim strSQL As String * 32767    ' About 32K of RAM
Const strINSERT As String = _
    " INSERT INTO Sales (SalesRegion, SalesAmount, ISBN) VALUES ("
…
Next, I use the Execute method to run two TSQL commands as a batch, or script. There are really two parts to this query: the first tests for the existence of the table about to be created and drops it if it already exists; the second creates the new Sales table.
cn.Execute "if exists(select * from sysobjects where type = 'U' and name = 'Sales')" _
    & " begin drop table Sales end " _
    & " CREATE TABLE Sales (SaleID INT IDENTITY(100,1), " _
    & " SalesRegion CHAR(2), " _
    & " SalesAmount SMALLMONEY, " _
    & " ISBN VARCHAR(20) " _
    & " CONSTRAINT ID_PK PRIMARY KEY (SaleID) )"
The next section of code fills in the fixed-length query string with 200 INSERT statements, each containing a random ISBN and other randomly generated values. Note the use of the MID$ function instead of the Visual Basic concatenation (&) operator—this approach dramatically increases performance and is explained in more detail shortly. The MID$ lines insert (as opposed to concatenate) the insert values into the target query string. I left the concatenate operation (a single line of code) in the loop to illustrate how it's done. Also, note that the fixed-length string is initialized to spaces. This eliminates the chance that misaligning the MID$ insertions will leave an orphaned zero byte.
strSQL = String(intLenStrSQL, " ")
For i = 1 To 200
    intRegion = Int(Rnd(1) * 9)
    strCurAmount = Format((Rnd(1) * 100), "####.##")
    intRnd = Rnd(1) * 99
    strISBN = "'" & vaISBNs(0, intRnd) & strInsertEnd
    '
    ' Simple but slow VB concatenation is commented out here
    ' strW = strW & strINSERT & intRegion & "," & strCurAmount & "," & strISBN
    '
    strWork = String(intLenStrWork, ",")
    Mid$(strWork, 1, Len(intRegion)) = intRegion
    intTLen = Len(intRegion) + 1
    Mid$(strWork, intTLen, Len(strCurAmount)) = strCurAmount
    intTLen = intTLen + Len(strCurAmount) + 1
    Mid$(strWork, intTLen, Len(strISBN)) = strISBN
    intTLen = intTLen + Len(strISBN)
    Mid$(strSQL, intLenSQL, intLenInsert) = strINSERT
    intLenSQL = intLenSQL + intLenInsert
    Mid$(strSQL, intLenSQL, intTLen - 1) = Left$(strWork, intTLen - 1)
    intLenSQL = intLenSQL + intTLen
To execute the INSERT statements one at a time instead of as a single script, uncomment the next three lines:
    ' cn.Execute Left$(strSQL, intLenSQL)
    ' intLenSQL = 1
    ' strSQL = ""
Next i
After the string is populated with our 200 INSERT statements, it's ready for execution. I simply set the string length and pass the script (strSQL) to the Recordset Open method. ADO passes this string straight through to the server for execution. Check out the Profiler to see!
intLenSQL = intLenSQL - 1
rs.Open Left$(strSQL, intLenSQL), cn, Options:=adCmdText
Each of the INSERT statements returns a result set, and we can loop through them all. However, ADO does not expose the Records Affected return value from the INSERT statements, so the following part of the example is really for naught. If you had embedded SELECT statements in stored procedures, though, you could use this routine to step through those rowsets and output parameters. It would probably make sense to add a SET NOCOUNT ON to the query to completely eliminate the overhead generated by the INSERT result sets.
i = 0
Do Until rs Is Nothing
    For Each er In cn.Errors
        Debug.Print "Recordset"; i, er, er.Description
    Next er
    Set rs = rs.NextRecordset
    i = i + 1
Loop
Debug.Print i; " Recordsets found"
This example, along with virtually all other examples in this book, is provided on the book's CD. To find this one, search for Sample Application\Recordset Objects\Multiple Operations.
Because Visual Basic is brutally clumsy about concatenating strings, it makes sense to try to avoid use of the ampersand (&) concatenation operator whenever possible.
However, trying to abstain from the & operator is tough. A number of clinics here in Redmond specialize in treating & addicts, but with limited success. I know I have trouble refraining from & on the podium, but I'm taking it one demo at a time.
I also recommend that you avoid "roll-your-own" SQL queries because of the expense of all that concatenation. If your SQL is constructed simply by substituting parameters into a SQL WHERE clause, you can use the following techniques to cut 30 percent from your query setup time. Because this setup is performed every time you execute your query, that 30 percent improvement can multiply into significant savings.
If you use this technique, you'll have to make sure the parameters don't contain any single quotes—ADO won't do it for you. Consider using the new Visual Basic Replace function to find embedded single quotes. Replace each single quote with two single quotes.
strParm = Replace(strParm, "'", "''")
To get the best performance when you construct your own SQL statements, you'll want to use the Mid function to insert the parameters instead of using & concatenation. This is easy if the length of the parameters is fixed, as when you pass numbers, but still doable if you pass variable-length arguments.
The following code example compares several "traditional" argument-management techniques with a more efficient approach using the Mid function. In this case, we set up the query ahead of time, much in the same way that we construct Command objects. I plan to use this approach in the middle tier where query construction time is critical and where I don't like to use a Command object (unless I expect to manage return parameters from the provider).
The first part of the code uses the & technique that I (and everyone else) have been illustrating for years as the way to insert parameters into SQL WHERE clauses.
SQL = "SELECT Name, Address "
SQL = SQL & "FROM Authors "
SQL = SQL & "WHERE Name like '" & txtParameter & "' ;"
Unfortunately, this technique costs us dearly in lost time, as Visual Basic allocates and reallocates memory (several times) to handle the strings.
The Mid technique eliminates the need for & concatenations by first figuring out ahead of time where to insert the parameter. It also uses a fixed-length string instead of a normal variable-length string declaration to manage the SQL query. After this preliminary step is done, we can easily reference the specific bytes in the query to change on each execution. This technique is about 30 percent faster than the preceding example.
' Set up the operation. This is done once—early in the application.
' Note that flSQL is declared as a fixed-length string…
Dim flSQL As String * 128
flSQL = "SELECT Name, Address FROM Authors WHERE Name like '"
j = InStrRev(flSQL, "'") + 1    ' Find the position just past the last single quote
' Fill in the parameter. This is done just before execution
k = Len(txtParameter)           ' Compute the parameter length
Mid(flSQL, j) = txtParameter    ' Insert the parameter in the query
Mid(flSQL, j + k) = "';"        ' Close the framing quote
Simply reducing the number of & operations can help improve performance. You save about 10 percent by simply using a Visual Basic constant for the preamble SELECT statement.
SQL = strSQL & "WHERE Name like '" & txtParameter & "' ;"
Remember that the line continuation character can help make code more human readable, but it hurts performance by about five percent each time it's used to break up a quoted string that has to be reassembled at runtime. No, the compiler does not preconcatenate these strings to save time later.
SQL = "SELECT Name, Address " _
    & "FROM Authors " _
    & "WHERE Name like '" & txtParameter & "' ;"
Oops, you made a change to your Recordset and you don't want to commit those changes to the database. Relax. ADO has lots of options to choose from. You can undo changes to just the current record with the CancelUpdate method, or undo all of the changes made so far to the Recordset with the CancelBatch method.
After you execute the CancelBatch method, ADO might not be able to determine the current record in the Recordset. That's why it's a good idea to force the current row pointer to a specific location after using CancelBatch. For example, you could use the MoveLast method to move to the last valid row in the Recordset. If you are adding a new row when you use the CancelUpdate method, the current row is repositioned to the row that was current before the AddNew call.
If the attempt to cancel the pending updates fails because of a conflict with the underlying data (for example, a record has been deleted by another user), the provider returns warnings to the Errors collection but does not halt your program. A runtime error occurs only if there are conflicts on all the requested records. Use the Filter property (adFilterAffectedRecords) and the Status property to locate records with conflicts.
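A minimal sketch of that recovery pattern (assuming rs is a batch-mode Recordset and cn is its Connection):

```vb
On Error Resume Next
rs.CancelBatch                            ' Back out all pending changes
If cn.Errors.Count > 0 Then
    rs.Filter = adFilterAffectedRecords   ' Look only at the rows CancelBatch touched
    Do Until rs.EOF
        If (rs.Status And adRecConcurrencyViolation) Then
            Debug.Print "Conflict: the underlying data for this row changed"
        End If
        rs.MoveNext
    Loop
    rs.Filter = adFilterNone
End If
rs.MoveLast                               ' Reposition to a known current row
```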
This code is not exactly coded to "best practice" standards. It overuses the string concatenation operator (&), which can materially affect performance. However, since this is an exception routine, it should not be executed very often.