Managing Changes

DBAs have an awful reputation for being over-watchful of changes, second only to the firewall team. My boss used to tell me that if you haven't upset someone recently for slowing down their project just a little, you're probably not doing your job properly. This really hits home when you hear about database changes going in without any type of control over them and bringing down several customers' databases.

Each company's deployment methodology is different, but in this section I'd like to share a little of what I've adopted over the years. The most critical item is separation of the various database environments. No, I don't mean the separation of Oracle and SQL Server. By separation, I mean the creation of various environments to represent a product's lifecycle from development to production. At a minimum, I would recommend the following database and parallel application environments:

  • Development Every developer needs a sandbox in which to try new stored procedures and code that might endanger other environments. In the next chapter, I'll discuss how to secure this environment while still giving developers the flexibility they need to program efficiently.

  • Integration As developers complete testing their code internally, they'll give the nod to move the code to a more stable environment. Integration is more stable than development and does not change on an hourly basis; it's best to limit builds into this environment to daily at most. Ideally, you would have your business analysts and other testers use this environment to test the daily builds of the database and application.

  • Quality Assurance (QA) When you hand the database over to the QA environment, it should be very stable and almost a complete product, since many of the obvious bugs should have been found in integration. QA is where detailed testing really takes off, and the QA team will test the application in every possible scenario.

  • Benchmark If budget allows, a benchmark environment is perfect for tuning stored procedures or simulating stress against the application and database. This environment is often combined with QA to keep costs down.

  • Client Acceptance Lab In this environment, the client will approve the application or change going into production. This should be a production-ready version of the database and application, and is similar to doing a final walkthrough of a house before the final purchase.

  • Production The final environment is production, of course. This is the most protected of environments, and if all of your migration has been successful, you should have a smoothly running database.

    Tip 

    If this type of setup is too expensive, consider running multiple instances of SQL Server.

Deploying Your Database

Once you have your database created and out of testing, you'll be looking for a way to deploy it. There are several ways to deploy your database to a client or another server:

  • Back up and restore

  • Attach and detach

  • Generate scripts

The restore and attach database options are valid, but leave much room for error. The main problem you'll find is version-controlling databases deployed with these methods. If you have to choose between restoring and attaching a database, I prefer attachment, which is much faster and easier. You can attach a database in Enterprise Manager by right-clicking the Databases group, then selecting Attach Database. You can also use the sp_attach_db stored procedure to perform the same function. For more on attaching databases, see Chapter 8.
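The attach route can also be scripted end to end. Here is a minimal sketch, assuming a database named Inventory with one data file and one log file (the database name and file paths are illustrative, not from the original text):

```sql
-- On the source server: detach the database, releasing its files.
EXEC sp_detach_db @dbname = 'Inventory'

-- Copy Inventory.mdf and Inventory_log.ldf to the target server, then attach:
EXEC sp_attach_db @dbname = 'Inventory',
     @filename1 = 'D:\MSSQL\Data\Inventory.mdf',
     @filename2 = 'D:\MSSQL\Data\Inventory_log.ldf'
```

Note that sp_attach_db accepts up to 16 file names; databases with more files must be attached through other means.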

The preferred method of deployment is to generate scripts, because you can place those scripts in a versioning system such as Visual Source Safe. Once the script is brought into Visual Source Safe, you can use the utility to compare versions of your database and see where the discrepancies lie. Scripts also integrate neatly with installation programs such as InstallShield and are therefore very portable.

You can generate scripts for a database by right-clicking on a database and selecting All Tasks | Generate SQL Script from the shortcut menu. Select Show All to view all the SQL Server objects that can be transferred.

Keep in mind that the scripting engine produces DDL for SQL Server 2000 by default. If you want the script to be SQL Server 7.0 compatible, you must select Only Script 7.0 Compatible Features in the Formatting tab. This excludes such features as column-level collation and extended properties. Also keep in mind that any functions that you have developed in SQL Server 2000 will not work in SQL Server 7.0.

The Options tab holds a number of important items that are unselected by default. Under this tab you can script out logins, key information (primary and foreign keys), and indexes. Make sure that you select object-level permissions, or your users will not be able to execute the stored procedures once they are transferred.

Before migrating a change into any environment, ensure that you have a rollback script or, at the least, a plan to return the database to its prior form. If a rollback script is impossible, consider backing up the transaction log prior to the change using transaction log marks. I'll discuss these types of backups in Chapter 8. Additionally, make sure that your scripts are modular and record their success or failure somewhere visible, either in a table inside SQL Server or in a log file.
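A marked transaction gives you a rollback point even when a reverse script isn't practical. Here is a minimal sketch; the database name, backup path, and mark name are illustrative assumptions, not values from the original text:

```sql
-- Wrap the change in a named, marked transaction so the log records a rollback point.
BEGIN TRANSACTION SchemaChange WITH MARK 'Before v1.0.5 schema change'
    -- ...apply the schema change here...
COMMIT TRANSACTION SchemaChange

-- If the change must be undone, restore the last full backup WITH NORECOVERY,
-- then restore the log, stopping just before the mark:
RESTORE LOG Inventory
    FROM DISK = 'D:\Backups\Inventory_log.bak'
    WITH STOPBEFOREMARK = 'SchemaChange', RECOVERY
```

STOPBEFOREMARK recovers the database to the instant before the marked transaction began, which is exactly the state you want if the change itself is what went wrong.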

Tip 

You can also generate scripts for jobs by selecting All Tasks | Generate SQL Script under the Job menu. All you have to do is assign the script a filename and you're done. Additionally, there are many third-party tools that will script out the data in the tables as Insert statements. For an updated list of these tools, see http://www.sqlservercentral.com/products/.

Version Controlling SQL Server Code

Version controlling your scripts is the most boring part of a DBA's job. Unfortunately, Microsoft does not make it any easier with their tools. The tools that ship with SQL Server do not integrate automatically with the source control software, such as Visual Source Safe, which makes version controlling a manual process. Protecting your code can be as simple as having a file share on a server or as elaborate as a third-party product like Visual Source Safe.

Whatever your solution, the organization of the source code tree will be similar. Here's a sample Visual Source Safe project for supporting your database:

-Application Name
--Server Information
--Application Documentation
--Disaster Recovery Documentation
--Database Objects
----Major Release Number
--------Tables and Views
--------Stored Procedures
--------Triggers
--------Permissions
--------Other (Rules, DTS, Linked Servers, etc.)
--Installs
--------Major Release (V1.0)
------------Build (V1.0.5)
---------------Upgrade
---------------New Install

Attached to each build, I insist on receiving a change document so I can better audit the change six months from now. It also helps a DBA's communication to have everything in writing. This too should go in the source control system. The following is a sample change document that is pretty generic, but fits most environments:

(A sample change document appears here as a figure in the original text.)

Version Control Audit

One of the first things you do when you speak to a support representative is tell them what release of the product you're on. The same thing needs to apply to your database. Just as you can run the query SELECT @@VERSION to determine the software release, you should be able to query a table to determine what version of the actual database you're on.

The method I use is to create a table called DB_VERSION, which holds the version of the database, which generally corresponds with the application version number. This table is installed in each user database. This is especially useful when you're trying to research if the application matches the version of the database that's installed.

IF NOT EXISTS (SELECT * FROM dbo.sysobjects
    WHERE id = OBJECT_ID(N'[dbo].[DB_VERSION]')
    AND OBJECTPROPERTY(id, N'IsUserTable') = 1)
BEGIN
    CREATE TABLE [DB_VERSION] (
        [MajorVersion] [char] (5) NULL,
        [MinorVersion] [char] (5) NULL,
        [Build] [char] (5) NULL,
        [Revision] [char] (5) NULL,
        [OneOff] [char] (5) NULL,
        [DateInstalled] [datetime] NULL
            CONSTRAINT [DF__Version__DateIns__0876219E]
            DEFAULT (getdate()),
        [InstalledBy] [varchar] (50) NULL,
        [Description] [varchar] (255) NULL
    ) ON [PRIMARY]
END
GO

The purpose of each column is as follows:

  • MajorVersion Major release of the application. In the application version, it is the first number: the 1 in 1.0.5.1.

  • MinorVersion Minor release of the application. In the application version, it is the second number: the first 0 in 1.0.5.1.

  • Build Build number of the application. In the application version, it is the third number: the 5 in 1.0.5.1.

  • Revision Also called the minor service pack (or patch) position. This number would also refer to bug fixes that are found in QA. For example, if you have numerous iterations of a Program 2.5.3 as bugs are found and fixed, you could increment this number (Program 2.5.3.2 for the second iteration). In the application version, it is the fourth number: the final 1 in 1.0.5.1.

  • OneOff In some cases, a customization code may be required for a client. In those cases, you could have 2.1.1.0 - 1 to indicate a customization (1 being for Suntrust, for example). This field is only used in specialized situations.

  • DateInstalled Date this version was installed in the environment. This is set to getdate() by default, which records the current date and time.

  • InstalledBy The name of the installer or creator of the service pack.

  • Description Description of the service pack. Can be used in an environment where multiple clients are sharing one database. For example, 'Upgrade Program A to add index in CUSTOMER table on the CUSTOMER_ID column.'
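To make this concrete, an upgrade script might end by recording the build it installs, and support staff can then query the most recent row. A minimal sketch, with example values (the installer name and description are illustrative):

```sql
-- Record the build this upgrade script installs (example values).
INSERT INTO DB_VERSION
    (MajorVersion, MinorVersion, Build, Revision, OneOff, InstalledBy, Description)
VALUES
    ('1', '0', '5', '1', NULL, 'dbateam', 'Upgrade to build 1.0.5.1')

-- Determine the version of the database currently installed.
SELECT TOP 1 MajorVersion, MinorVersion, Build, Revision, DateInstalled
FROM DB_VERSION
ORDER BY DateInstalled DESC
```

Because DateInstalled defaults to getdate(), the most recently inserted row always identifies the current version.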

Service Packs

Never underestimate the influence of service packs on your system. Although it's rare, after installing a SQL Server service pack, SQL behavior may change to the point where stored procedures that worked previously no longer function. This is where multiple instances come in handy with SQL Server. You can test a service pack on one instance and still preserve your core testing instance.
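Before and after applying a service pack, you can confirm what level each instance is running. In SQL Server 2000, the SERVERPROPERTY function reports this directly:

```sql
-- ProductLevel returns 'RTM' or the installed service pack level, such as 'SP3'.
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel
```

Run this against each instance on the machine, since instances can legitimately sit at different service pack levels.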

Note 

When you install a service pack on one instance, the service pack upgrades your shared tools. Because of this, you should test service packs on a separate machine from production.

Rolling back from a bad service pack is not a simple process. You can roll back by following these steps:

  1. Back up all of your databases (just in case).

  2. Detach your databases using the sp_detach_db stored procedure.

  3. Reinstall SQL Server 2000.

  4. Install the previous service pack and confirm functionality.

  5. Attach the databases using sp_attach_db or Enterprise Manager.

Installing hotfixes is a little more challenging. Installing a hotfix typically means stopping the instance and then copying the new SQL Server files over the old ones. Usually the files that need to be overwritten are in the \Program Files\Microsoft SQL Server\MSSQL\Binn directory. Make sure you create a manual backup of the databases and files before overwriting anything. Generally, there are also a few scripts you may have to run against the master database after you overwrite the files.

Tip 

If you're in a cluster, you'll find it's especially useful to stay up to date on SQL Server and Windows service packs, as they usually will fix some of the unexplained troublesome behavior in your cluster.




SQL Server 2000 for Experienced DBAs