Working with Field Objects


The definition of Field objects is usually done for you. That is, when you construct a Recordset from a data provider, it's the responsibility of the provider to fill in the Field properties, including the name, datatype, size, and all of the other properties, such as the Status and Attributes. You can also choose to define most of these yourself by constructing an in-memory Recordset, as we discussed in the "Creating and Opening Recordsets in Memory" section in Chapter 6.

In any case, once the Recordset is open, only the Field object's Value property can be changed, and then only under ADO's careful scrutiny. You can change the Value property through the AddNew or Update methods, which accept field names and value settings, or you can address the Value property directly in code. However, if you try to apply an invalid value to the Value property, ADO informs you by tripping a trappable error. Often it's "-2147217887 Multiple-step operation generated errors. Check each status value." In my case, however, the Status value was 0, so the message wasn't much help.


The error that tripped up my program was one generated by ADO when I tried to set an adVarChar field to the string: She said, "Hi there!". Apparently, the embedded exclamation point confused ADO; it didn't seem to mind the embedded quotes or commas. Interesting.

It's not a good idea to change the Field Value property willy-nilly. That is, don't set it if the value hasn't actually changed. Setting the Value property to a value that's the same as the existing value still counts as a change as far as ADO is concerned. This causes extra overhead on ADO's part, as it tries to include the changed fields in the UPDATE statement's WHERE clause.
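A simple guard in code avoids that extra work. This is just a sketch; the Recordset rs, the Quantity field, and lngNewQty are hypothetical names, not from the samples in this chapter:

```vb
' Only assign Value when the data has actually changed.
' (rs, the "Quantity" field, and lngNewQty are hypothetical.)
If rs.Fields("Quantity").Value <> lngNewQty Then
    rs.Fields("Quantity").Value = lngNewQty
End If
```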

The other Field properties were discussed in Chapter 5, when I described the ADO Command object Parameters collection:

  • ActualSize and DefinedSize: A Field object with a declared type of adVarChar and a maximum length of 50 characters returns a DefinedSize property value of 50, but the ActualSize property value it returns is the length of the data stored in the field for the current record.

  • Name: The name used to reference the field in the database. This is the human interface to the Field object, not the computer interface.

  • Type: The datatype of the Value property. The types used here are set when the table is first defined or fetched by the data source provider. See the discussion of the Type property in Chapter 5.

  • DataFormat: This property is an object that specifies how the field is formatted when displayed on a data-bound control. MSDN has orphaned it but it is discussed a little later in this chapter.

  • Precision and NumericScale: These properties are used to describe numbers that have decimal points or varying degrees of accuracy. For numbers capable of storing a fractional value, the NumericScale property reports how many digits to the right of the decimal point are used to represent values. The Precision property indicates the maximum number of digits to be stored for any number—integer or floating point.

  • OriginalValue: This property returns the state of the Field object as it was before you made changes.

  • UnderlyingValue: This property returns the state of the Field object as it is in the database after a Resync. It's the value some other user assigned to the column in the source table.

  • Value: This is the only Field object property that you are permitted to change. It contains the value as initially read from the data source and it can be set to any valid character or binary state, depending on the constraints defined by the other properties.

  • Properties: This addresses a collection of provider-specific properties that further define what the Field object is capable of doing, or other special attributes.

  • Status: This property is not interesting for the Recordset Field object because it returns only one value—adFieldOK. It has more meaning for the ADO Record object.
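One quick way to explore these properties is to walk the Fields collection of an open Recordset and dump a few of them to the Immediate window. This sketch assumes rs is an open ADODB.Recordset:

```vb
' Dump selected Field properties for each column in the Recordset.
Dim fld As ADODB.Field
For Each fld In rs.Fields
    Debug.Print fld.Name, fld.Type, fld.DefinedSize, fld.ActualSize
Next fld
```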

The Mysterious DataFormat Property

The DataFormat property has apparently slipped through the cracks—at least to some extent. While it appears as a property in the ADO type libraries, it's not listed as such in MSDN. Perhaps this will be fixed by the time you read this—or perhaps not.

The DataFormat property is similar to the DataFormat property on simple bound controls, such as textboxes. If you set the DataFormat property on a field to Currency and then check that Field object's Value property, you'll receive a string formatted as currency.

In the following sample code, notice that the UnitPrice field returns its data formatted as currency once the DataFormat property has been set. If you bind controls to the Recordset, the setting of the DataFormat property on the Field objects does not apply. I'm guessing this is because the bound controls don't access the Field's contents via the Value property. To use the StdDataFormat object, you'll need to set a reference to the Microsoft Data Formatting Object Library. (Thanks to David Sceppa for this insight.)

 Dim cn As ADODB.Connection
 Dim rs As ADODB.Recordset
 Dim strConn As String, strSQL As String
 Dim objDataFormat As StdFormat.StdDataFormat

 strConn = "Provider=SQLOLEDB;Data Source=Myserver;Initial Catalog=Northwind;"
 strSQL = "SELECT UnitPrice FROM Products WHERE CategoryID = 8"   ' 8 = Seafood
 Set cn = New ADODB.Connection
 cn.CursorLocation = adUseClient
 cn.Open strConn, "Admin", "pw"
 Set rs = New ADODB.Recordset
 rs.Open strSQL, cn, adOpenStatic, adLockReadOnly, adCmdText
 MsgBox "Unformatted: " & rs(0)     ' UnitPrice
 Set objDataFormat = New StdFormat.StdDataFormat
 objDataFormat.Type = fmtCustom
 objDataFormat.Format = "Currency"
 Set rs.Fields("UnitPrice").DataFormat = objDataFormat
 MsgBox "Formatted as currency: " & rs!UnitPrice
 rs.Close
 cn.Close

Field Methods—The Chunk Brothers

There are only two Field methods: GetChunk and AppendChunk. These two methods are used to manipulate Binary Large Object (BLOB) fields. Before I bore you with the details of how these methods work, I would like to try to talk you out of using them. IMHO, BLOBs don't really belong in the database. Yes, some developers have come to me with successful BLOB column implementations, but more often than not, these were the result of many, many hours, weeks, or months of complex development.

Consider what happens when you define a BLOB column, such as TEXT or IMAGE in SQL Server, or as MEMO or OLE Object in Jet. The database engine sets aside a pointer in the column to point to a data area in which it will store the binary information when it arrives. The database does not grow appreciably when a BLOB column is defined. However, when you pass a BLOB to the database for storage, the DB engine has to allocate memory to store it temporarily (thus shoving out other data rows from the cache), allocate disk space to store it (thus stretching the size of the database), and consume CPU resources to process it (thus slowing down other operations). The ADO and low-level interfaces used to transport the data to the database are also forced to fragment the BLOB into chunks, or smaller pieces, to accommodate the limitations of the transport mechanism. Each of these packets is sent, one by one, to the server for storage.

When the database is backed up, the BLOBs stored therein are included (thus making the backup slower and consuming more media), even if the BLOBs are read-only. And all too often, that's the case. For example, employee pictures don't change very often—if ever. Archived documents, blueprints, and maps also don't change very frequently. Even document templates and forms are static until they are filled in. Working text documents or in-process graphic files are also BLOBs and can change quite frequently, but without a specially adapted retrieval system, they are hard to work with when stored in a database column. Yes, you can write your own retrieval system if you've a mind to. But consider that this might have already been done.

Retrieving a BLOB column reverses the process used to store it. You write a query that includes a BLOB column in the Recordset, the DB engine locates that row (along with others), and the BLOB is temporarily saved to the DB engine's data cache. Other rows are forced out, along with stored procedure pages and other reusable data and query-plan pages. The BLOB is then rechunked and transmitted over the wire to your client, where it gets written to local RAM, overflowing to the \TEMP area on the client's hard disk. Your code then uses the GetChunk method to retrieve the data into yet another local memory area.

Once it's there, you have to decide what to do with it. If you're fetching a picture, you might try displaying it with a Visual Basic Picture or Image control, or with a third-party graphics control. Virtually all of these know how to deal with graphics resolved from files, not memory variables, so in most cases, you'll have to write the BLOB to a temporary file and use the Visual Basic LoadPicture function to display it. If the BLOB is a document, such as a Microsoft Word .DOC file, you'll also have to save the memory variable to a file and tell Word to open the file. However, Edward (my tech editor) tells me that one can also pass the contents of a PropertyBag element to the LoadPicture command. There's an example of doing this below.

Because we end up reading files to store the BLOBs and then writing files to manage them after retrieval, doesn't it make sense to leave the BLOBs in files? You can still store the path to the file in the database, and that makes sense. When you fetch the row, you get back the path. Microsoft Word, your graphics processing program, or the LoadPicture function can all take that path directly and get the document, graphic, or picture—perhaps from a client-side CD or a server-side CD jukebox. This technique makes system maintenance easier (backups are faster and smaller); read/write files must be backed up separately, as they are no longer part of the database.
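Storing just the path might look like the following sketch. The Employees table, the PhotoPath column, and the UNC path shown here are all hypothetical:

```vb
' Save the path to the picture file instead of the picture itself.
rs.AddNew
rs.Fields("LastName").Value = "Davolio"
rs.Fields("PhotoPath").Value = "\\PicServer\Photos\Davolio.bmp"
rs.Update

' Later, fetch the row and hand the path straight to LoadPicture.
imgPhoto.Picture = LoadPicture(rs.Fields("PhotoPath").Value)
```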

SQL Server 7.0 has done quite a bit to make this process of storing BLOBs in the database work more efficiently. However, SQL Server still treats BLOB data pretty much like other data, in that it lets a single 200MB BLOB flush its combined data and procedure cache. IMHO, a better choice for storing documents that need to be searched and retrieved using SQL is Index Server. See the Index Server documentation for more information.

Using ADO to Fetch BLOBs

Actually, you don't always have to use the Chunk methods to fetch or save BLOBs. Some ADO providers are pretty smart about handling BLOBs without any special code. As a matter of fact, I was able to extract pictures out of the database with only a few lines of code when I used the SQL Server OLE DB provider. However, your mileage may vary.

In the following code example, the vbiChunk variable is defined as a Byte array. When you use Byte arrays, Visual Basic doesn't try to convert the ANSI data to Unicode, which can be a problem with binary files. Because the LoadPicture function accepts only a filename, the BLOB data has to be saved to a file first, so we'll write the Byte array to a Binary file and use the LoadPicture function against that file.

 vbiChunk = rs.Fields(enFields.Cover)
 Put #1, , vbiChunk
 imgFromGetChunk.Picture = LoadPicture(strTempName)


Here's the PropertyBag technique Edward mentioned (pb is a PropertyBag variable):

 pb.Contents = rs.Fields(enFields.Cover)
 img.Picture = pb.ReadProperty("img")

ADO informs you when you can use the Chunk methods (not when it makes sense to use them—that's up to you). If the adFldLong bit in the Field object's Attributes property is True, then you can use the GetChunk method for that field. Personally, though, I wouldn't go to the trouble if I can get away with the syntax shown above.
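Testing that bit looks like this sketch (rs and enFields.Cover are taken from the surrounding samples):

```vb
' adFldLong is a bit flag, so test it with And rather than =.
If (rs.Fields(enFields.Cover).Attributes And adFldLong) Then
    Debug.Print "Chunk methods are supported on this column."
End If
```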

The GetChunk alternative looks similar to the code shown next. This shows the query used to fetch the desired row (and just one row is fetched), how the temporary file was set up, and how GetChunk was used to pick off 8192-byte chunks, one at a time. (This sample is on the CD at Samples\Recordset Objects\BLOB Sample.)

 strTempName = "C:\temp\" & fso.GetTempName
 Open strTempName For Binary As #1
 If rs.State = adStateOpen Then rs.Close
 cn.GetCoverByISBN cmbISBNs.Text, rs

Once the Recordset has been opened with the preceding code, we loop through the GetChunk method with the following code, fetching a portion of the binary bits, with each loop fetching the next set of bits. When the returned chunk is smaller than the selected chunk size, you've reached the end.

 Do
     vbiTemp = rs.Fields(enFields.Cover).GetChunk(intChunkSize)
     Put #1, , vbiTemp
 Loop Until UBound(vbiTemp) < intChunkSize - 1
 imgFromGetChunk.Picture = LoadPicture(strTempName)

If the Recordset column does not have a BLOB, the GetChunk method returns a null. The preceding code does not check for this, and it should.
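A null guard might be sketched like this. Note that vbiTemp must be a Variant so that it can hold Null; the other names are from the preceding sample:

```vb
Dim vbiTemp As Variant
vbiTemp = rs.Fields(enFields.Cover).GetChunk(intChunkSize)
If IsNull(vbiTemp) Then
    MsgBox "No cover image is stored for this row."
Else
    Put #1, , vbiTemp
End If
```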


Visual Basic can generate a unique temporary filename for you, so you don't have to worry about colliding with another filename somewhere. The trick is knowing that the GetTempName function lives on the FileSystemObject in the Microsoft Scripting Runtime.

This example brings up another key point. When working with BLOBs, it's very important to fetch just the record containing the BLOB and no more. Don't fetch a hundred rows at a time unless you are ready to wait and bring your system to its knees.


When you save a picture in a Jet database with Access, it wraps the picture in a custom OLE wrapper. You'll have to decode that wrapper to access the picture in Visual Basic.

You say you saw us (maybe me) demonstrating how easy it was to fetch pictures from Jet databases? The demonstrators faked it. They copied the pictures to a Visual Basic-based Jet database.

Each subsequent GetChunk call retrieves data starting from where the previous GetChunk call left off. However, if you are retrieving data from one field, and then you set or read the value of another field in the current record, ADO assumes you have finished retrieving data from the first field. If you call the GetChunk method on the first field again, ADO interprets the call as a new GetChunk operation and starts reading from the beginning of the data. Accessing fields in other Recordset objects that are not clones of the first Recordset object does not disrupt GetChunk operations.

Using AppendChunk to Save BLOBs

Just as with GetChunk, ADO is smart enough to deal with most BLOB issues all on its own. You shouldn't need AppendChunk to save your BLOBs unless there is some other issue you are dealing with such as a tight memory situation.

ADO won't accept AppendChunk operations unless the adFldLong bit in the Attributes property of a Field object is set to True.

The first time you use the AppendChunk method on a Field object, ADO asks the data provider to write data to the field, overwriting any existing data. Subsequent AppendChunk calls add (append) to the existing data. If you are appending data to one field and then you set or read the value of another field in the current record, ADO assumes that you are finished appending data to the first field. If you call the AppendChunk method on the first field again, ADO interprets the call as a new AppendChunk operation and overwrites the existing data. Accessing fields in other Recordset objects that are not clones of the first Recordset object will not disrupt AppendChunk operations.

The following example illustrates how to use AppendChunk to pass chunks of data to the IMAGE field in the Covers table. I was lazy and didn't fetch the specific row in the Covers table to change—which would have been a better idea.

 Private Sub cmdPutChunk_Click()
     Dim strDataIn As String * 8192
     Dim lngFPointer As Long
     Dim strFileName As String
     CommonDialog1.ShowOpen
     strFileName = CommonDialog1.FileName
     Open strFileName For Binary As #2
     If rs.State = adStateOpen Then rs.Close
     rs.Open "Covers", cn, adOpenStatic, adLockOptimistic, adCmdTable
     Get #2, , strDataIn                ' Read the first chunk
     rs.AddNew
     rs.Fields(enCoverFld.FileName) = strFileName
     rs.Fields(enCoverFld.BType) = "XXX"
     rs.Fields(enCoverFld.Version) = "00000"
     rs.Fields(enCoverFld.ISBN) = cmbISBNs.Text
     Do
         lngFPointer = lngFPointer + intChunkSize
         rs.Fields(enCoverFld.Cover).AppendChunk strDataIn
         Get #2, lngFPointer + 1, strDataIn   ' Get positions are 1-based
     Loop Until lngFPointer >= FileLen(strFileName)
     rs.Update
 quit:
     rs.Close
     Close #2
 End Sub


ADO Examples and Best Practices
ISBN: 189311516X
Year: 2000
Pages: 106
