Setting Up Tracing

The first step in setting up tracing within ADO.NET is to register the AdoNetDiag.dll component. You do this by adding a single registry value. Open the registry editor using regedit.exe and locate the following key:

HKLM\Software\Microsoft\BidInterface\Loader 

If this key does not exist, add it. Next add the following string value in the Loader key (where xxxxx is the current build number), if it doesn't already exist.

Name=":Path" Value="C:\WINDOWS\Microsoft.NET\Framework\v2.0.xxxxx\AdoNetDiag.dll" 

If the key and string value did not already exist, you must restart your machine to get this setting to work properly. This component makes any class library that is instrumented for data tracing look like an ETW provider.
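If you are setting up several machines, you can add the same value from code instead of using regedit.exe. The following is a minimal C# sketch; it assumes the process runs with administrative rights (the value lives under HKLM), that you replace xxxxx with the actual build number on the machine, and the class name is arbitrary.

C#

// Sketch only: creates the BID loader registry value from code.
// Assumes administrative rights and that v2.0.xxxxx is replaced with the
// actual .NET Framework 2.0 build number installed on the machine.
using Microsoft.Win32;

static class RegisterBidLoader
{
   static void Main()
   {
      const string keyPath = @"Software\Microsoft\BidInterface\Loader";

      // CreateSubKey opens the key if it already exists, or creates it if not.
      using (RegistryKey loader = Registry.LocalMachine.CreateSubKey(keyPath))
      {
         loader.SetValue(
            ":Path",
            @"C:\WINDOWS\Microsoft.NET\Framework\v2.0.xxxxx\AdoNetDiag.dll",
            RegistryValueKind.String);
      }
   }
}

As with the manual edit, a restart may still be required if the key and value were newly created.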

Next you must configure AdoNetDiag.dll so that it shows up on both the public provider list and the Windows Management Instrumentation (WMI) provider list. You do this by compiling a Managed Object Format (MOF) file using the mofcomp.exe utility. The MOF file contains statements that add objects into the WMI repository. These objects describe the events being exposed by the AdoNetDiag.dll component. The MOF file is located in the same folder as the AdoNetDiag.dll component and can be configured using the following command:

mofcomp adonetdiag.mof 

One way to verify that the MOF file has properly added the object to the WMI repository is to use the WMI CIM Studio (available as a download from Microsoft). You can view the event trace classes in the WMI tree, as shown in Figure 3-1. (Notice that the providers have a Bid2Etw_ prefix.)

Figure 3-1: I checked the items that were added by running the mofcomp.exe utility.
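If you would rather verify the registration from code than install WMI CIM Studio, the following sketch uses System.Management to list the Bid2Etw_ classes in the root\wmi namespace. It assumes a reference to System.Management.dll and that the MOF file has already been compiled; the class name is my own.

C#

// Sketch: lists the Bid2Etw_* event trace classes that mofcomp.exe added
// to the root\wmi namespace. Assumes a reference to System.Management.dll.
using System;
using System.Management;

static class ListBidClasses
{
   static void Main()
   {
      ManagementScope scope = new ManagementScope(@"\\.\root\wmi");
      scope.Connect();

      // A meta_class query returns class definitions rather than instances.
      ObjectQuery query = new ObjectQuery("SELECT * FROM meta_class");
      using (ManagementObjectSearcher searcher =
                new ManagementObjectSearcher(scope, query))
      {
         foreach (ManagementBaseObject cls in searcher.Get())
         {
            string name = cls.ClassPath.ClassName;
            if (name.StartsWith("Bid2Etw_"))
            {
               Console.WriteLine(name);
            }
         }
      }
   }
}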

Note 

Prior to the Visual Studio 2005 release, there were versions of the adonetdiag.mof file that did not include the SQL Native Client (Bid2Etw_SQLNCLI_1) information. If the SQL Native Client does not show in your list, you can use the AdoNetDiag.mof file that is included in the Chapter 3 sample files.

Another way to verify that the mofcomp command worked properly is to run the following Log Manager command:

Logman query providers 

This command lists all of the providers that are registered. Table 3-1 lists the providers that should have been registered as a result.

Table 3-1: ADO.NET Registered WMI Providers

Provider                   | GUID                                   | Description
System.Data.1              | {914ABDE2-171E-C600-3348-C514171DE148} | Provider for events in the System.Data.dll component
System.Data.SNI.1          | {C9996FA5-C06F-F20C-8A20-69B3BA392315} | Provider for SQL Server Network Interface (SNI) events in the System.Data.dll component
SQLNCLI.1                  | {BA798F36-2325-EC5B-ECF8-76958A2AF9B5} | Provider for the SQL Native Client and its SNI events
System.Data.OracleClient.1 | {DCD90923-4953-20C2-8708-01976FB15287} | Provider for events in the System.Data.OracleClient.dll component
ADONETDIAG.ETW             | {7ACDCAC8-8947-F88A-E51A-24018F5129EF} | Provider for events in the ETW Adapter
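If you want to automate this check, a sketch like the following shells out to logman and scans its output for the provider names in Table 3-1. It assumes logman.exe is on the path; the helper class name is my own.

C#

// Sketch: runs "logman query providers" and reports whether the ADO.NET
// providers from Table 3-1 appear in its output.
using System;
using System.Diagnostics;

static class CheckProviders
{
   static void Main()
   {
      ProcessStartInfo psi = new ProcessStartInfo("logman", "query providers");
      psi.UseShellExecute = false;
      psi.RedirectStandardOutput = true;

      string output;
      using (Process p = Process.Start(psi))
      {
         output = p.StandardOutput.ReadToEnd();
         p.WaitForExit();
      }

      string[] expected =
      {
         "System.Data.1", "System.Data.SNI.1", "SQLNCLI.1",
         "System.Data.OracleClient.1", "ADONETDIAG.ETW"
      };
      foreach (string name in expected)
      {
         Console.WriteLine("{0,-28} {1}", name,
            output.IndexOf(name, StringComparison.OrdinalIgnoreCase) >= 0
               ? "registered" : "MISSING");
      }
   }
}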

Using the logman.exe Utility

You can use the Log Manager (logman.exe) tool to perform all logging activities. This tool is a command-line utility that has a -? switch for viewing the command syntax and listing the switches. To turn on tracing, use the following command:

logman start MyTrace -pf ProviderList.txt -ct perf -o C:\Perflogs\Out.etl -ets 

This command begins with start MyTrace, which starts a trace that is arbitrarily called MyTrace. The command specifies -pf ProviderList.txt, which dictates the use of a list of providers included in the file ProviderList.txt. The next part of the command is -ct perf, which means use the high-performance clock type, as opposed to using the low-resolution -ct system setting. The next part of the command is -o C:\Perflogs\Out.etl, which identifies the output location as Out.etl (the .etl extension stands for Event Trace Log). Note that C:\Perflogs is specified as the path because the file needs to be placed in a folder that the providers have access to, and this is the default location for output files when running Performance Monitor. The last part of the command is -ets, which means that this is an ad hoc trace collection and the trace collection's definition will not be persisted.

The contents of the ProviderList.txt file are as follows. Notice that the Oracle provider is not included because none of the examples uses Oracle.

ProviderList.txt File

"ADONETDIAG.ETW"                       0x2  0x0   ADONETDIAG.ETW
{914ABDE2-171E-C600-3348-C514171DE148} 0x2  0x0   System.Data.1
"System.Data.SNI.1"                    0x2  0x0   System.Data.SNI.1
"SQLNCLI.1"                            0x2  0x0   SQLNCLI.1

Table 3-2 describes the elements of the ProviderList.txt file. Notice that the file contains an example using a GUID and an example using a name as the provider identification. Also notice that having a control bit value of 0x2 means that a regular trace is done, without enter/exit or other advanced information.

Table 3-2: ProviderList.txt Contents

Provider
  Can be a GUID that identifies the ETW provider, delimited by curly braces { }, or the provider name delimited by double quotation marks " ". The ProviderList.txt file listing shows examples of both.

Control Bits
  Flag mask that can be used to filter the trace. The bits can be set as follows and can be ORed together. The default value is 0x0006 if 0x0000 is specified.
  • 0x0002 Regular trace
  • 0x0004 Show execution flow with enter/leave
  For System.Data.1:
  • 0x0080 Advanced output; adds TDS packet data read/written
  • 0x1000 Add connection pooling information to the trace log

Logging Level
  Typically used to control the verbosity of the output. With the data providers, the following values can be supplied.
  • 0x0000 Normal
  • 0x0040 Disable tracing on this component
  • 0x0080 Convert Unicode output to ASCII to reduce the output size

Provider Name
  The name of the provider, which is required by ETW but ignored by the logman utility.
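For example, to capture the regular trace plus enter/leave execution flow and connection pooling information from System.Data.1, you would OR the bits together (0x0002 | 0x0004 | 0x1000 = 0x1006). The following ProviderList.txt line is only a sketch built from the bit definitions above; adjust the mask to the level of detail you actually need.

"System.Data.1"                        0x1006  0x0   System.Data.1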

When the trace is running, the Out.etl file is populated with data from the providers. Running a simple application that populates a DataSet object can produce lots of data. Here is a simple program that generates some output data. If you haven't already done so, turn on tracing by using the logman start command shown earlier, and then simply create a console application and insert the code that follows.

Visual Basic

Imports System
Imports System.Data
Imports System.Data.SqlClient

Public Class MyApp
   Public Shared Sub Main()
      Dim cnSettings As SqlConnectionStringBuilder
      Dim cn As SqlConnection
      Dim da As SqlDataAdapter
      Dim cmdBld As SqlCommandBuilder
      Dim ds As DataSet

      cnSettings = New SqlConnectionStringBuilder( _
        "Data Source=.\SQLEXPRESS;" _
        + "AttachDbFilename=|DataDirectory|NORTHWND.MDF;" _
        + "Integrated Security=True;User Instance=True")
      cn = New SqlConnection(cnSettings.ConnectionString)
      da = New SqlDataAdapter("Select * from Products", cn)
      cmdBld = New SqlCommandBuilder(da)
      ds = New DataSet()
      da.Fill(ds, "Products")
      For Each dr As DataRow In ds.Tables("Products").Rows
         dr("UnitPrice") = dr("UnitPrice") * 1.1
      Next
      da.Update(ds, "Products")
   End Sub
End Class

C#

using System;
using System.Data;
using System.Data.SqlClient;

namespace Tracing
{
   static class Program
   {
      static void Main()
      {
         SqlConnectionStringBuilder cnSettings;
         SqlConnection cn;
         SqlDataAdapter da;
         SqlCommandBuilder cmdBld;
         DataSet ds;

         cnSettings = new SqlConnectionStringBuilder(
           @"Data Source=.\SQLEXPRESS;"
           + "AttachDbFilename=|DataDirectory|NORTHWND.MDF;"
           + "Integrated Security=True;User Instance=True");
         cn = new SqlConnection(cnSettings.ConnectionString);
         da = new SqlDataAdapter("Select * from Products", cn);
         cmdBld = new SqlCommandBuilder(da);
         ds = new DataSet();
         da.Fill(ds, "Products");
         foreach (DataRow dr in ds.Tables["Products"].Rows)
         {
            dr["UnitPrice"] = (decimal)dr["UnitPrice"] * (decimal)1.1;
         }
         da.Update(ds, "Products");
      }
   }
}

After you compile this program, run it from the command prompt to keep Microsoft Visual Studio's Server Explorer from adding data into the trace output. You can then run the following command to turn off the trace.

Logman stop MyTrace -ets 

This command simply stops the trace called MyTrace that was started earlier. The output file will contain binary trace information that must be converted to a readable format.
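If you want to capture a trace around a single test run without typing the commands by hand, a small wrapper like the following sketch can issue the same logman start and stop commands around the application. It assumes ProviderList.txt is in the current directory, and the path to the application under test is hypothetical.

C#

// Sketch: brackets a test run with the ad hoc logman commands shown above.
// The path to the application under test is hypothetical; ProviderList.txt
// is assumed to be in the current directory.
using System.Diagnostics;

static class TraceRun
{
   static void Main()
   {
      Run("logman", @"start MyTrace -pf ProviderList.txt -ct perf -o C:\Perflogs\Out.etl -ets");
      Run(@"C:\Projects\Chapter3\CS\Tracing\bin\Release\Tracing.exe", "");
      Run("logman", "stop MyTrace -ets");
   }

   static void Run(string fileName, string arguments)
   {
      using (Process p = Process.Start(fileName, arguments))
      {
         p.WaitForExit();
      }
   }
}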

Performance Logs and Alerts Snap-In

The previous example used the start command to create an ad hoc trace, but you can use the create command to create a trace that is controllable from the Performance Logs And Alerts snap-in, which is also known as Performance Monitor. To get access to the high-performance WMI events from within Performance Monitor, start Performance Monitor by executing the following command from the Run menu.

perfmon.exe  /wmi 

Open the Performance Logs And Alerts tree to see the Trace Logs node. Right-click the Trace Logs node and click New Log Settings. After specifying a name, on the General tab select the four providers from the non-system providers list. On the Log Files tab, uncheck the End File Names With option and select the Overwrite Existing Log File option. Also click the Configure button to change the filename to Out.etl and the location to C:\PerfLogs. On the Schedule tab, select Manually for both the Start Log and Stop Log options. Figure 3-2 shows the screen layout for setting up this trace collection.

Figure 3-2: Creating a new trace collection using Performance Monitor

After you create the trace collection, it is available within Performance Monitor, as shown in Figure 3-3. You can also move the trace settings from one computer to another by saving the settings as an HTML object file on one machine and using the Load Settings From menu option on the target machine to load the HTML settings file from the first machine.

Figure 3-3: Using the Performance Logs And Alerts snap-in to control the trace collections

The summary file provides an at-a-glance display of the number of events captured by each provider. This file is useful when you want to quickly check whether a change to your application has caused additional events to be logged. The summary file should look like the example in the following section.

Working with Event Trace Log Files

The trace output is an Event Trace Log (.etl) file, but this is a binary file that can be difficult to work with, so let's use the following command to convert it into something that's easier to work with: a comma-separated value (.csv) file.

TraceRpt /y C:\Perflogs\Out.etl 

This command produces a summary file called summary.txt and a detailed log file called dumpfile.csv. We use the /y switch to silently answer yes to any prompts that might be displayed.

Summary.txt File

Files Processed:    Out.etl
Total Buffers Processed 7
Total Events  Processed 2624
Total Events  Lost      0
Start Time              Saturday, April 30, 2005
End Time                Saturday, April 30, 2005
Elapsed Time            10 sec
+---------------------------------------------------------------------------+
|Event Count   Event Name     Event Type        Guid                        |
+---------------------------------------------------------------------------+
|       1   EventTrace        Header  {68fdd900-4a3e-11d1-84f4-0000f80464e3}|
|       6   AdoNetDiag        TextW   {7acdcac9-8947-f88a-e51a-24018f5129ef}|
|     685   System.Data       TextW   {914abde3-171e-c600-3348-c514171de148}|
|    1807   System.Data.SNI   TextA   {c9996fa6-c06f-f20c-8a20-69b3ba392315}|
|      85   SQLNCLI           TextA   {ba798f37-2325-ec5b-ecf8-76958a2af9b5}|
|      40   SQLNCLI           TextW   {ba798f37-2325-ec5b-ecf8-76958a2af9b5}|
+---------------------------------------------------------------------------+

Note that the results can vary, depending on the control settings you use and whether other applications (such as Reporting Services) are running.
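If you convert the trace after every run, a quick-and-dirty sketch like the following can pull the per-provider counts out of summary.txt so two runs are easy to compare. It assumes tracerpt was run from C:\Perflogs (so that is where summary.txt lives), and its parsing relies on the bordered table layout shown above, so it is not robust.

C#

// Quick-and-dirty sketch: prints the per-provider event count rows from
// summary.txt. Assumes tracerpt was run from C:\Perflogs; the parsing
// depends on the bordered table layout and is not robust.
using System;
using System.IO;

static class SummaryCounts
{
   static void Main()
   {
      foreach (string line in File.ReadAllLines(@"C:\Perflogs\summary.txt"))
      {
         string trimmed = line.Trim();
         string body = trimmed.Trim('|', ' ');

         // Data rows in the bordered table start with '|' followed by a count.
         if (trimmed.StartsWith("|") && body.Length > 0 && char.IsDigit(body[0]))
         {
            Console.WriteLine(body);
         }
      }
   }
}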

The detailed log file contains granular log information. This information is in column format, as described in Table 3-3.

Table 3-3: DumpFile.csv File Format

Column      | Description
Event Name  | Name of the event provider.
Event Type  | Either TextW (Unicode) or TextA (ASCII) for data providers.
TID         | Thread identifier.
Clock Time  | Event timestamp that specifies when the event occurred. The time is in Integer8 format: a 64-bit value that holds the number of 100-nanosecond intervals since 12:00 A.M., January 1, 1601.
Kernel (ms) | Processor time in milliseconds that the event spent in kernel mode. The value is in clock ticks (10-millisecond intervals), measured from when the current thread started to when the event tracer fired the event.
User (ms)   | Processor time in milliseconds that the event spent in user mode. The value is in clock ticks (10-millisecond intervals), measured from when the current thread started to when the event tracer fired the event.
User Data   | Data that the provider added about the trace point.
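Because Clock Time uses the same layout as a Windows FILETIME (100-nanosecond intervals since January 1, 1601), you can turn it into a readable timestamp with the DateTime.FromFileTime helpers, as in this sketch. The sample value is made up for illustration.

C#

// Sketch: converts a Clock Time value from dumpfile.csv into a readable
// timestamp. Clock Time matches the FILETIME layout (100-ns ticks since
// January 1, 1601). The value below is a made-up example.
using System;

static class ClockTimeDemo
{
   static void Main()
   {
      long clockTime = 127578441193387740;   // hypothetical Clock Time value

      Console.WriteLine("Local: {0}", DateTime.FromFileTime(clockTime));
      Console.WriteLine("UTC:   {0}", DateTime.FromFileTimeUtc(clockTime));
   }
}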

Our example used 0x2 in the control bits to provide basic output information. The intent was to start with a small amount of data. The following excerpt is a small part of the user data where the connection string and command text are being set.

Dumpfile.csv User Data Excerpt

"<prov.DbConnectionHelper.ConnectionString_Set|API> 1# 'Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|NORTHWND.MDF;Integrated Security=True;User Instance=True'"
"<sc.SqlCommand.set_CommandText|API> 1# '" "Select * from Products" "'"
"<sc.SqlCommand.set_Connection|API> 1# 1#"
"<ds.DataSet.DataSet|API> 1#"

If the control bits were set to 0x6 (or 0x0, which uses 0x6 as a default setting), the summary.txt file would show that many more events were logged and the dumpfile.csv would contain enter events and leave events. The enter and leave events also contain a nest level number that is used to match the enter and the leave event. The following is a small part of the dumpfile.csv that shows enter and leave events.

Dumpfile.csv User Data Excerpt with Enter/Leave Events

"<prov.DbConnectionHelper.ConnectionString_Set|API> 1# 'Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|NORTHWND.MDF;Integrated Security=True;User Instance=True'"
"<sc.SqlCommand.set_CommandText|API> 1# '" "Select * from Products" "'"
"<sc.SqlCommand.set_Connection|API> 1# 1#"
"<ds.DataSet.DataSet|API> 1#"
"<ds.DataTableCollection.DataTableCollection|INFO> 1# dataSet=1"
"enter_01 <comm.DbDataAdapter.Fill|API> 1# dataSet srcTable='Products'"
"enter_02 <comm.DbDataAdapter.Fill|API> 1# dataSet startRecord maxRecords srcTable command behavior=16{ds.CommandBehavior}"
"enter_03 <sc.SqlConnection.Open|API> 1#"
"enter_04 <SNIInitialize|API|SNI> pmo: 00000000{void}"
"enter_05 <SNIInitializeEx|API|SNI> pmo: 00000000{void}"
"enter_06 <SNIxInitialize|API|SNI>"
. . . other events . . .
"leave_06"
. . . other events . . .
"leave_05"
. . . other events . . .
"leave_04"
. . . other events . . .
"leave_03"
. . . other events . . .
"leave_02"
. . . other events . . .
"leave_01"
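One way to make this nesting easier to read is to indent each user-data entry by its current depth, pairing each enter_nn marker with the matching leave_nn marker. The following sketch assumes the user-data strings have already been pulled out of dumpfile.csv; the sample array is a shortened version of the excerpt above.

C#

// Sketch: prints user-data entries indented by enter/leave nesting depth.
// Assumes the entries have already been extracted from dumpfile.csv.
using System;

static class NestingDemo
{
   static void Main()
   {
      string[] userData =
      {
         "enter_01 <comm.DbDataAdapter.Fill|API> 1# dataSet srcTable='Products'",
         "enter_02 <comm.DbDataAdapter.Fill|API> 1# dataSet startRecord maxRecords srcTable command behavior=16{ds.CommandBehavior}",
         "enter_03 <sc.SqlConnection.Open|API> 1#",
         "leave_03",
         "leave_02",
         "leave_01"
      };

      int depth = 0;
      foreach (string entry in userData)
      {
         if (entry.StartsWith("leave_"))
         {
            depth--;                       // close the matching enter_nn
         }
         Console.WriteLine(new string(' ', depth * 2) + entry);
         if (entry.StartsWith("enter_"))
         {
            depth++;                       // everything until leave_nn is nested
         }
      }
   }
}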

Notice that the user data contains bracketed information that helps you understand the program flow. As the excerpts above show, the bracketed information is in the following general format:

<namespace.class.member|keyword> parameters

The abbreviated namespace saves space in the output. The abbreviations are defined in Table 3-4.

Table 3-4: Namespace Abbreviations

Namespace                | Abbreviation
System.Data              | ds
System.Data.Common       | comm
System.Data.Odbc         | odbc
System.Data.OleDb        | oledb
System.Data.OracleClient | ora
System.Data.ProviderBase | prov
System.Data.SqlClient    | sc

The keyword is used to define the category of the event that is being logged. Table 3-5 describes the keywords that are generated by the data providers.

Table 3-5: Keywords

Keyword | Description
ADV     | Advanced trace point events.
API     | Public member that is called.
CATCH   | Code is in a caught exception.
CPOOL   | Connection pool events.
ERR     | Error event.
INFO    | Information event.
ODBC    | Code is in the ODBC API.
OLEDB   | Code is in the OLEDB API.
RET     | A return value.
SNI     | Code is in SQL Server Networking Interface (SNI).
THROW   | A new exception is being thrown. This does not include an exception being rethrown.
WARN    | Warning.

The parameter contains the syntax 1# (number and pound symbol) to identify the instance of the class; this information makes it easier to work with traces that involve multiple instances.
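To split a user-data entry into its parts programmatically, a regular expression such as the one in this sketch works against the excerpts shown earlier. The group names are my own, and the sample entry is abbreviated.

C#

// Sketch: pulls the trace point, keyword(s), and remaining parameters out of
// a user-data entry. The group names are arbitrary.
using System;
using System.Text.RegularExpressions;

static class TracePointParser
{
   static void Main()
   {
      string entry = "<comm.ADP.TraceException|ERR|CATCH> 1# 'details here'";

      Match m = Regex.Match(entry, @"<(?<point>[^|>]+)\|(?<keywords>[^>]+)>\s*(?<rest>.*)");
      if (m.Success)
      {
         Console.WriteLine("Trace point: {0}", m.Groups["point"].Value);
         foreach (string keyword in m.Groups["keywords"].Value.Split('|'))
         {
            Console.WriteLine("Keyword:     {0}", keyword);
         }
         Console.WriteLine("Parameters:  {0}", m.Groups["rest"].Value);
      }
   }
}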

Using the LogParser Utility

The LogParser utility is a command-line tool that can convert several different input file types into several output file types. LogParser is part of the IIS 6.0 Resource Kit, which is a free download from Microsoft's Web site. LogParser comes as two files; LogParser.exe is the command-line utility, and LogParser.dll is a set of COM objects that support scripting. These files are independent of each other, so you don't need to register the LogParser.dll file in order to run the LogParser.exe utility.

One use for the LogParser is to browse the dumpfile.csv in a DataGrid. Figure 3-4 shows the output when the following command is executed:

logparser.exe "SELECT * from c:\perflogs\dumpfile.csv" -o:DATAGRID 

Figure 3-4: Using the LogParser.exe command-line utility to display a .csv file in a DataGrid

Notice that this tool lets you supply SQL queries to the source, and the output will contain the result. For example, you can supply a WHERE clause that looks for events that took place on a specific thread by using the following SQL command:

"SELECT * from c:\perflogs\dumpfile.csv WHERE TID='0x00000BC4'" 

This is especially useful when you are trying to troubleshoot a problem in a production environment and you only want to see the events that took place on your thread.
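If LogParser isn't available on the machine, a rough sketch like the following filters the raw dumpfile.csv lines by thread ID. Note that it simply searches each line for the TID text rather than parsing the quoted CSV fields properly.

C#

// Rough sketch: grep-style filter of dumpfile.csv by thread ID. It does not
// parse quoted CSV fields; it only looks for the TID text on each line.
using System;
using System.IO;

static class FilterByThread
{
   static void Main()
   {
      const string tid = "0x00000BC4";       // thread ID taken from the trace

      foreach (string line in File.ReadAllLines(@"C:\Perflogs\dumpfile.csv"))
      {
         if (line.IndexOf(tid, StringComparison.OrdinalIgnoreCase) >= 0)
         {
            Console.WriteLine(line);
         }
      }
   }
}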

Using Tracing as a Diagnostic Tool

Tracing is especially useful as a diagnostic tool when you are trying to troubleshoot a problem and you don't have the source code. For example, let's say you installed your application on a machine that will be used for final testing before the application goes to production. Because the code being tested must be the same release build that will be pushed to production, there is no source code on that machine to debug with.

In this scenario, the problem is that the database was not copied to the testing machine, but this might not be readily apparent from the user-friendly (developer-unfriendly) error message that is displayed. You can simulate the problem with the console application we used earlier in this chapter to create trace output: rename or delete the database file before running the application.

To troubleshoot this problem, start the trace before running the application and stop the trace after the problem occurs. Start the Performance Monitor trace that we defined earlier in this chapter, and then start the problem application. After the exception is thrown, stop the trace and convert the Out.etl file to dumpfile.csv by using the TraceRpt.exe utility. Finally, run the LogParser.exe command-line utility with a filter on events that are exceptions. The LogParser command looks like the following:

LogParser.exe "SELECT * from c:\perflogs\dumpfile.csv WHERE [User Data] like '%|ERR|%'" -o:DATAGRID 

In the LogParser DataGrid, only two events are listed as exceptions. The user data in the following event clearly explains that there is a problem locating the database.

"<comm.ADP.TraceException|ERR|CATCH> 'System.Data.SqlClient.SqlException: An attempt to attach an auto- named database for file C:\Projects\Chapter3\CS\Tracing\bin\Debug\NORTHWND.MDF failed. A  database with the same name exists or specified file cannot be opened  or it is located  on UNC share. ,   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException e xception  Boolean breakConnection) ,   at System.Data.SqlClient.TdsParser.ThrowException AndWarning(TdsParserStateObject stateObj) ,   at System.Data.SqlClient.TdsParser.Run(Run Behavior runBehavior  SqlCommand cmdHandler  SqlDataReader dataStream  BulkCopySimpleRes ultSet bulkCopyHandler  TdsParserStateObject stateObj) ,   at System.Data.SqlClient.SqlI nternalConnectionTds.CompleteLogin(Boolean enlistOK) ,   at System.Data.SqlClient.SqlInt ernalConnectionTds.OpenLoginEnlist(SqlConnection owningObject  SqlConnectionString conne ctionOptions  String newPassword  Boolean redirectedUserInstance) ,   at System.Data.Sql Client.SqlInternalConnectionTds..ctor(SqlConnectionString connectionOptions  Object prov iderInfo  String newPassword  SqlConnection owningObject  Boolean redirectedUserInstance ) ,   at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions  options  Object poolGroupProviderInfo  DbConnectionPool pool  DbConnection owningConnec tion) ,   at System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConne ction owningConnection  DbConnectionPool pool  DbConnectionOptions options) ,   at Syste m.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject)'" 

 

