Deploying on Linux, UNIX, and Windows


This section covers deployment issues for DB2 on Linux, UNIX, and Windows.

Deploying SQL Procedures Using DDL

The most straightforward way to deploy SQL procedures is to export them in DDL form from the source environment and apply them against the target environment. The Export command in the DB2 Development Center can be used to generate the DDL file. To do this, after launching the DB2 Development Center, right-click on the Stored Procedures folder within a project and select Export (see Figure 11.1) to launch the Export Wizard.

Figure 11.1. Initiating export.


The first screen of the wizard gives you the option of choosing which SQL stored procedures you would like to export (see Figure 11.2).

Figure 11.2. Selecting stored procedures for export.


Tip

Select the procedures you want to export in the order in which you want to create them. For example, if there are interdependencies between the stored procedures, ensure that the called procedures are selected before the calling procedures. If a stored procedure is created before the procedures it depends on exist, its compilation will fail.


Clicking Next takes you to the next screen of the wizard, where you can specify the location and name of an output file. In Figure 11.3, the target filename is sample_proc. The generated file will be a .DB2 file, contained in a zipped file with the same name.

Figure 11.3. Saving the export file.


The next screen (shown in Figure 11.4) offers some options for the output file. To export in DDL form, select the Create an export script radio button. The Export as project radio button bundles all of the procedures into a single zip file. The Include source files checkbox can be checked or unchecked; it makes no difference when exporting SQL procedures and is only used when exporting Java stored procedures.

Figure 11.4. Export options.


The next screen in the wizard summarizes the options that were selected; clicking Finish generates the file. Figure 11.5 shows the contents of the output file. Notice that the first line of the output file is a database CONNECT statement that has been commented out. The file can be modified to remove the comments and specify the values for a different database, user, and password. Notice also that the statement termination character is the @ symbol instead of the semicolon (;). This removes ambiguity, because the SQL procedure body contains embedded semicolons. The statement terminators are also needed for the procedures to compile properly.

Figure 11.5. Contents of the export file.
 -- connect to SAMPLE user DB2ADMIN using Password
 -- DROP PROCEDURE smp_export_proc1 (CHARACTER, CHARACTER(6), CHARACTER(7))
 -- @
 CREATE PROCEDURE smp_export_proc1 ( IN p_midinit CHAR
                                    ,IN p_empno CHAR(6)
                                    ,OUT p_empno_mid CHAR(7))
     LANGUAGE SQL
     SPECIFIC export_proc1
 sep: BEGIN
    -- Declare return codes
    DECLARE SQLSTATE CHAR(5) DEFAULT '00000';
    DECLARE SQLCODE INT DEFAULT 0;
    SET p_empno_mid = p_empno || p_midinit;
 END sep
 @
 -- DROP PROCEDURE simple_export_proc2 (CHARACTER, CHARACTER(6), INTEGER)
 -- @
 CREATE PROCEDURE simple_export_proc2 ( IN p_midinit CHAR
                                       ,IN p_empno CHAR(6)
                                       ,OUT p_mid_empno INT)
     LANGUAGE SQL
     SPECIFIC simple_export_proc2
 sep2: BEGIN
     -- Declare variables
     DECLARE SQLSTATE CHAR(5) DEFAULT '00000';
     DECLARE SQLCODE INT DEFAULT 0;
     SET p_mid_empno = length(rtrim(p_midinit || p_empno));
 END sep2
 @
 -- DROP PROCEDURE empty_proc ()
 -- @
 CREATE PROCEDURE empty_proc ()
     LANGUAGE SQL
     SPECIFIC empty_proc
 ep: BEGIN
 END ep
 @
 --CONNECT RESET
 -- @

Tip

To be able to use the script repeatedly against the same database, you can edit the file to uncomment the DROP statements that have been included before the CREATE statements in the script. The SPECIFIC name of the procedure can be used in the DROP statements to avoid ambiguity for any overloaded procedures. The example in Figures 11.6 and 11.7 shows how this can be done.

Figure 11.6. Procedure text created by the Export command.
 -- DROP PROCEDURE sum (INTEGER, INTEGER, INTEGER)
 -- @
 CREATE PROCEDURE sum(IN p_a INTEGER,IN p_b INTEGER,OUT p_s INTEGER)
 SPECIFIC sum_ab
 LANGUAGE SQL
     BEGIN
         SET p_s = p_a + p_b;
     END
 @

Figure 11.7. Script modifications for multiple script executions.
 DROP PROCEDURE sum (INTEGER, INTEGER, INTEGER)
 @
 CREATE PROCEDURE sum(IN p_a INTEGER,IN p_b INTEGER,OUT p_s INTEGER)
 SPECIFIC sum_ab
 LANGUAGE SQL
     BEGIN
         SET p_s = p_a + p_b;
     END
 @


The modified script in Figure 11.7 will return an error the first time it is run because there is no procedure sum to drop yet. If the script is run multiple times, it will work properly because it first drops the old version of the procedure and then re-creates it. The script in Figure 11.6, by contrast, would never re-create the sum procedure when executed multiple times because the DROP statement is commented out; instead, it would return an error on the CREATE PROCEDURE statement.

Tip

It's a good idea to uncomment the CONNECT statement as well. When doing so, specify a database and user but remove using <Password> altogether. This way, every time the script is invoked, it will prompt you for a password. This helps catch situations in which the script is invoked inadvertently, because it gives you the option of canceling at the password prompt.
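For example, the uncommented CONNECT statement at the top of the script from Figure 11.5 would look like the following (the database and user names are taken from that figure; with no USING clause, the CLP prompts for the password):

 CONNECT TO SAMPLE USER DB2ADMIN
 @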


Other options exist for exporting SQL procedures. For example, you can SELECT the procedure body text directly from the system catalog tables and pipe the output directly to a file. If you want to have a more robust solution, then you can write a little SQL procedure to extract the procedure body, format the output, and generate the corresponding DROP statements. However, because the utility exists in the DB2 Development Center, why not use it?
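As a minimal sketch of the catalog approach (assuming, as on DB2 UDB V8, that the body of an SQL procedure is available in the TEXT column of SYSCAT.PROCEDURES, and that DB2ADMIN is the schema of interest), the following command pipes the procedure bodies to a file:

 db2 -x "SELECT TEXT FROM SYSCAT.PROCEDURES WHERE PROCSCHEMA = 'DB2ADMIN' AND LANGUAGE = 'SQL'" > proc_bodies.db2

You would still need to add the @ terminators and DROP statements yourself (or generate them in the query), which is why the Export wizard is usually the easier route.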

Now that you have the SQL procedure in a DDL file, you can apply the script to a target environment. This can be done through the command line. Of course, you will have to make sure that any database objects which are accessed by any of the SQL procedures have been created prior to running the script on the target server. To import (and create the SQL procedures), you can enter the following from the command line:

 db2 -td@ -f SAMPLE_PROC.db2 

The -td@ option specifies that the @ symbol is to be used as the statement termination character, and the -f option specifies that the input comes from a file. Refer to Appendix C, "Building from the Command Line," for more information.

You may also have noticed that the DB2 Development Center has an Import option. You can use this option to import an SQL procedure (instead of the command line option described earlier); however, it has limited use when importing from a file. Only one procedure can be imported at a time, and the input file can only contain CREATE PROCEDURE statements. The file generated by the export utility is also zipped, so you will have to unzip it before you can import any stored procedures from it.

Deploying Functions

There are multiple methods to deploy functions with DB2. The Development Center's GUI is an ideal method to learn how to deploy functions. The script-based method covered later in the chapter is recommended for more experienced users who have already stabilized their functions and want to deploy them across one or more test or production environments.

Deploying Functions Using the Development Center

In addition to deploying procedures, you can use the Development Center to develop and deploy your functions. The main Project View menu provides a listing of the functions that are part of your project. The menu is shown in the upper-left corner of Figure 11.8.

Figure 11.8. Project function list.


To deploy a function, right-click over the function and select the Deploy option. This will open the Deployment Wizard for functions. The deployment option is shown in Figure 11.9.

Figure 11.9. Deploying functions option.


Figure 11.10 shows the Target Database screen of the wizard, which allows you to specify the connection details for the database to which you will be deploying. The Options screen allows you to control how the function is deployed to the target database. You can control whether the source code is deployed and available on the target by selecting the Deploy Source to Database option. The Error Handling option specifies how an error should be handled during the compilation of a function.

Figure 11.10. Function deployment options.


One of the more important options is how duplicates should be handled. Often when you deploy functions to a target server, functions with identical names and parameters might already exist. If you want to overwrite the existing functions, select Drop Duplicates. If you do not want to compile any functions that already exist, select Ignore Duplicates.

The final Summary screen, shown in Figure 11.11, will summarize all the options that you have selected. This process will have to be repeated for each of the functions that you are working with.

Figure 11.11. Function deployment summary screen.


Deploying Functions Using Scripts

The most common method for deploying functions is still to use scripts. Having the functions in a script allows you to group all of the functions into one unit. You can also ensure that the functions are compiled in the proper order. As stated earlier, this is important because functions often have dependencies, and if they are created out of order the creation may fail, as the sketch below illustrates.
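Here is a minimal, hypothetical function script (the function names and bodies are invented for this sketch); the dependent function appears after the function it calls:

 CREATE FUNCTION full_name (p_first VARCHAR(20), p_last VARCHAR(20))
     RETURNS VARCHAR(41)
     LANGUAGE SQL
     RETURN RTRIM(p_first) || ' ' || RTRIM(p_last)
 @

 -- greeting calls full_name, so it must be created after it
 CREATE FUNCTION greeting (p_first VARCHAR(20), p_last VARCHAR(20))
     RETURNS VARCHAR(48)
     LANGUAGE SQL
     RETURN 'Hello, ' || full_name(p_first, p_last)
 @

Saved as functions.db2, the script would be run with db2 -td@ -f functions.db2.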

Appendix C, "Building from the Command Line," contains a detailed overview of how to deploy scripts from the command line.

Deploying Triggers

The Development Center does not currently support the deployment of triggers. In order to deploy triggers on your systems, you will have to write scripts. These scripts can also contain functions and stored procedures so you can combine your deployment into one large script. The best practice, however, is to break your scripts into separate files for functions, triggers, and stored procedures. This makes debugging easier because you can isolate any script failures to a particular group of functions. The overview of how to deploy your scripts can be found in Appendix C, "Building from the Command Line."
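As a sketch, a trigger script might look like the following (the audit table and trigger are invented for illustration; the EMPLOYEE table is assumed to be the one in the SAMPLE database):

 -- Hypothetical audit table used by the trigger below
 CREATE TABLE emp_audit (empno CHAR(6), changed_on TIMESTAMP)
 @

 CREATE TRIGGER trg_emp_update
     AFTER UPDATE ON employee
     REFERENCING NEW AS n
     FOR EACH ROW MODE DB2SQL
     INSERT INTO emp_audit VALUES (n.empno, CURRENT TIMESTAMP)
 @

Like the other scripts, it would be run with db2 -td@ -f triggers.db2.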

Deploying Using GET ROUTINE and PUT ROUTINE

In previous releases of DB2 UDB, a C compiler was required to create stored procedures. The GET ROUTINE and PUT ROUTINE commands were used to overcome the compiler requirement when deploying stored procedures. The commands can still be useful for deploying procedures if you do not want to have the procedure logic exposed in scripts. You need to have database administrator (DBADM) or higher authority to execute either of the commands.

GET ROUTINE retrieves necessary information about an SQL procedure from the source environment and places that information in an SQL archive (SAR) file. PUT ROUTINE deploys the SQL procedure to the target environment, taking the SAR file generated from the GET ROUTINE command as input.

Figure 11.12 shows the syntax diagram for the GET ROUTINE command.

Figure 11.12. Syntax diagram for GET ROUTINE.
 >>-GET ROUTINE--INTO--file_name--FROM--+----------+------------->
                                        '-SPECIFIC-'
 >--PROCEDURE--routine_name--+-----------+----------------------><
                              '-HIDE BODY-'

As you can see, the command is self-explanatory. If the SPECIFIC keyword is used, then you must use the specific name of the procedure in place of routine_name; otherwise, the procedure name must be used.

The HIDE BODY keyword is used in cases where you do not want to expose the content of the procedure when it is deployed. The text of the SQL procedure will not be inserted into the system catalog tables; only the compiled code for the SQL procedure will be transferred. This helps protect your code because the business logic it contains remains hidden.

Figure 11.13 shows two examples of the command.

Figure 11.13. Example of GET ROUTINE.
 GET ROUTINE INTO procs/prod_ddl.sar FROM SPECIFIC PROCEDURE db2admin.proc_ddl HIDE BODY@
 GET ROUTINE INTO procs/simple_result_set.sar FROM SPECIFIC PROCEDURE db2admin.simple_result_set@

This command will have to be executed once for each procedure that you want to deploy. Because this task can get cumbersome, Figure 11.14 demonstrates an SQL query that generates a command file for all SQL procedures in a given database. The path where the SAR files will reside will need to be updated. You also have the option of changing the WHERE clause if you want to be more selective.

Figure 11.14. Generate GET ROUTINE SQL script.
 SELECT 'GET ROUTINE INTO procs\'
       || SPECIFICNAME || '.sar FROM SPECIFIC PROCEDURE '
       || RTRIM(PROCSCHEMA) || '.' || SPECIFICNAME
       || ' HIDE BODY' || '@'
   FROM SYSCAT.PROCEDURES
  WHERE LANGUAGE = 'SQL';

Assuming that the script is saved in a file called get_routine_script.db2, you can use the following command to direct the output to another file called get_routine_all.db2:

 db2 -txf get_routine_script.db2 -z get_routine_all.db2 

The -x option suppresses column headings and the row count, and the -z option directs output to a file. Following is a sample of the contents of the get_routine_all.db2 file:

 GET ROUTINE INTO procs\PROC_DDL.sar FROM SPECIFIC PROCEDURE DB2ADMIN.PROC_DDL HIDE BODY@
 GET ROUTINE INTO procs\SIMPLE_RESULT_SET.sar FROM SPECIFIC PROCEDURE DB2ADMIN.SIMPLE_RESULT_SET HIDE BODY@

You can then run the get_routine_all.db2 file against the source database. This will extract all the information on SQL procedures into SAR files.
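Because the generated statements end with the @ terminator, the file is run with the same options used earlier; for example:

 db2 -td@ -f get_routine_all.db2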

Tip

Create the directory in which you want to store the SAR files prior to running the script. You may want to give it a descriptive name that indicates the date and time when the SAR files were generated; otherwise, you may later be unable to tell when they were generated.


The SAR files can then be used by the PUT ROUTINE command to deploy the SQL procedures to the target environment. Figure 11.15 shows the syntax diagram for the PUT ROUTINE command.

Figure 11.15. Syntax diagram for PUT ROUTINE.
 >>-PUT ROUTINE----FROM----file-name----------------------------->
 >-----+---------------------------------------+----------------><
       '-OWNER--new-owner--+----------------+--'
                           '-USE REGISTERS--'

The options for the PUT ROUTINE command may not be as intuitive as those for GET ROUTINE. Typically, the owner of the SQL procedure in a development environment is not the same as the owner of the procedure in the production environment, so the PUT ROUTINE command gives you the option of overriding the owner with the OWNER keyword when deploying. Additionally, the default schema name and the path for function resolution will also likely differ between environments. The register values for CURRENT SCHEMA and CURRENT PATH can be set on the target server and picked up by the PUT ROUTINE command through the USE REGISTERS keyword.
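As a sketch (TARGETDB and PRODSCHEMA are placeholder names), the registers would be set in the same CLP connection on the target server before issuing PUT ROUTINE with USE REGISTERS:

 db2 CONNECT TO TARGETDB
 db2 "SET CURRENT SCHEMA PRODSCHEMA"
 db2 "SET CURRENT PATH = SYSTEM PATH, PRODSCHEMA"
 db2 "PUT ROUTINE FROM procs/simple_result_set.sar OWNER db2admin USE REGISTERS"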

Figure 11.16 shows two examples of the PUT ROUTINE command. In both examples, the owner of the SQL procedure on the target system will be the user db2admin. To issue this command, the user db2admin must have sufficient authority to create the procedure (the user must have SYSADM or DBADM authority, or must have IMPLICIT SCHEMA authority on the database, or CREATEIN privilege on the default schema of the procedure). For the second routine, the CURRENT SCHEMA and CURRENT PATH register values will be used for the procedure's schema and path instead of the default values.

Figure 11.16. Example of a PUT ROUTINE.
 PUT ROUTINE FROM procs/proc_ddl.sar OWNER db2admin@
 PUT ROUTINE FROM procs/simple_result_set.sar OWNER db2admin USE REGISTERS@

You can use the sample script shown in Figure 11.17 (against the source database) to generate a script for each of your PUT ROUTINE statements.

Figure 11.17. Generate PUT ROUTINE SQL script.
 SELECT 'PUT ROUTINE FROM procs\'
       || SPECIFICNAME || '.sar OWNER db2admin@'
   FROM SYSCAT.PROCEDURES
  WHERE LANGUAGE = 'SQL';

The following is a sample of the output generated from this command:

 PUT ROUTINE FROM procs\PROC_DDL.sar OWNER db2admin@
 PUT ROUTINE FROM procs\SIMPLE_RESULT_SET.sar OWNER db2admin@

The GET ROUTINE and PUT ROUTINE commands use the pre-installed procedures GET_ROUTINE_SAR() and PUT_ROUTINE_SAR() in the SYSFUN schema. These procedures are externalized, so you can write your own interface or application that performs the GET ROUTINE and PUT ROUTINE operations.

Note

On UNIX systems, special consideration should be given to the users and the owners of files. The instance owner must belong to the primary group of the fenced user. If this condition is not met, the commands will not work.


Using GET ROUTINE and PUT ROUTINE is the primary method for SQL procedure distribution in packaged applications. For example, you could be a vendor developing an application that requires stored procedures. GET ROUTINE would be used to create the SAR files. PUT ROUTINE would then be used by the application's install process to deploy the procedures.

Using the GET ROUTINE and PUT ROUTINE commands can also save time during installation because the code does not have to be recompiled.

Tip

The DB2 Development Center can be used for quickly deploying stored procedures. Similar to the Import and Export options, a Deploy option is also available. In cases where you might have separate databases for development and testing, the Deploy option allows you to quickly deploy procedures directly from the source database to the target database.


Deploying Using Older Releases of DB2

When deploying procedures, it is very important to consider which release of DB2 you will be using as the source and which as the target. Releases of DB2 before V8.2 required the use of a C compiler to create SQL stored procedures. The introduction of native PSM support in V8.2 eliminated the need for a C compiler. It is, however, important to remember that previous releases of DB2 still require it, and you must plan accordingly.

Deploying from Pre-V8.2 to Pre-V8.2

All releases of DB2 before V8.2 required the use of a C compiler. The GET ROUTINE and PUT ROUTINE commands can be used to overcome the need for a C compiler on the target system. There is one limitation of this method of deployment that may not be obvious: when you compile a procedure using a C compiler on the earlier versions of DB2, the compiled code is platform-specific. You can therefore only deploy the procedures onto a system with the same operating system.

The GET ROUTINE and PUT ROUTINE commands can still be used for deploying your procedures, but you will have to issue the GET ROUTINE command on a system running each operating system to which you need to deploy. Make sure you keep track of which files are for which operating system; a simple method is to include the operating system name in the output file name (for example, filename_OS.sar). You can then deploy the procedures to the appropriate operating system.

Deploying from Pre-V8.2 to V8.2

Although V8.2 uses native PSM for its procedures, it still supports procedures that were previously compiled using C. If you are going to be using V8.2 as the database for your application, it is recommended that you compile all your procedures on V8.2 to ensure they are all in native PSM. If, however, there is a business need for you to develop on an earlier release of DB2, it is still possible to deploy your procedures onto V8.2 databases.

When you compile your procedures on pre-V8.2 databases, you will have to have a C compiler installed on the system. When you deploy the procedures to a V8.2 system, the code will still be compiled C code. You will therefore have the same restriction as with pre-V8.2 deployments: you can only deploy to the same operating system on which the procedures were compiled.

Deploying from V8.2 to Pre-V8.2

All procedures that are compiled on V8.2 are compiled in native PSM. This code is platform-independent, but it requires the new features built into the V8.2 database engine to read it. You will therefore not be able to deploy procedures that are created on V8.2 databases to any previous release of DB2. To deploy these procedures to databases on an earlier release, you will have to upgrade the target database to at least V8.2 (which is equivalent to V8.1 FixPak 7).

Code Page Considerations

When building applications in mixed code-page environments, you need to give some thought to deployment. For example, if the database server is created with a UNICODE code page, a client with any code page can connect to it.

As client applications connect to the database to invoke stored procedures, input character string parameters are converted from the client application's code page to the database server's code page. Because the stored procedure is on the database server, it uses the same code page as the database server, thus no more code page conversion will occur until the stored procedure completes. At this point, all output character string parameters are converted to the client application's code page and sent back to the caller.

This implies that when building the stored procedure (which will later be deployed using GET ROUTINE or PUT ROUTINE) on a development or test server, the build steps (PREP, COMPILE, and BIND) must be executed against a database with the same code page as the code page of the database on the target server.
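For example, you can confirm that the build database and the target database use the same code page by checking the database configuration on each server (a sketch; SAMPLE is a placeholder database name, and the grep filter assumes a UNIX shell — on Windows, use findstr instead):

 db2 get db cfg for SAMPLE | grep -i code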

Backup and Recovery Considerations

Although not directly related to deployment, you need to give special consideration to the recovery of a database and its SQL procedures.

Specifically, for databases containing SQL procedures, the RESTORE DATABASE command or the first connection after the RESTORE command may fail. This will occur if the KEEPFENCED database manager configuration parameter is set to YES and the DLL associated with the stored procedures is still resident in memory. To get around this, restart the instance prior to issuing the RESTORE command. Stopping the instance will unload any DLLs associated with the stored procedures from memory.
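A minimal sketch of that sequence (SAMPLE is a placeholder database name; restore options depend on where your backup image resides):

 db2stop force
 db2start
 db2 restore database SAMPLE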

Tip

Always restart the instance prior to issuing a RESTORE if your database contains SQL procedures.



