Using the DB2 CONNECT IMPORT Command to Load Initial-State Data

You can use the DB2 CONNECT IMPORT command to process and load initial-state data into a DB2 database. Only ASCII codepages are supported.

Note:
This support is provided only in Windows environments at this time.


Prerequisites

Before the DB2 CONNECT IMPORT command can be used to load initial-state data into a DB2 database via the Event Replicator Target Adapter, DB2 CONNECT for Windows must be installed. In addition, use the Configuration Assistant supplied with DB2 CONNECT to configure the alias that will be used as the database name when the initial-state data is imported.
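
If you prefer the DB2 command line processor to the Configuration Assistant, the standard DB2 CATALOG commands should produce an equivalent alias entry. The node name, host, port, database name, and alias shown below are placeholders for illustration only; substitute the values for your installation:

    db2 catalog tcpip node etanode remote dbhost.example.com server 50000
    db2 catalog database sample as etadb at node etanode
    db2 terminate

The TERMINATE command ends the CLP back-end process so that the refreshed directory entries are picked up.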

Limitations

The following limitations exist in this version:

  • This support is provided only in Windows environments at this time.

  • Only ASCII codepages are supported.

Enabling DB2 Database Initial-State Data Loading

To enable DB2 database initial-state data loading, complete the following steps:

  1. Once all prerequisites have been met, use the Administration tool to configure a target definition for your DB2 database. For more information, read Configuring Target Definitions for Event Replicator Target Adapter.

  2. Using the Administration tool, create a specific target processing option definition for your DB2 database. For more information about setting up target processing option definitions, read Specifying Target Database Processing Option Definitions.

    In the target processing option definition for your DB2 database:

    Note:
    No value need be specified in the Environment File field.

    • Select (check) the Use Loader check box.

    • Enter the alias name of the DB2 database in the Oracle Service, Teradata Tdpip, Microsoft SQL Schema Name or DB2 Alias field.

    • Provide a value for the Path to Loader Executable field, identifying the full path of the DB2 executable, db2cmd.exe.

    • Specify any of the following options, as required for your installation, in the Options field. An example Options field entry is shown after the option descriptions below.

      When specified, these options must be given in the order shown below and separated by spaces. The Event Replicator Target Adapter does not validate these command options; it simply passes them to the DB2 CONNECT IMPORT command for processing. For complete information about these options, refer to your DB2 Command Reference.

      ALLOW WRITE ACCESS
      This option causes the Event Replicator Target Adapter to run the import process in online mode. Specify this option only when you want to update your Adabas database at the same time as an initial-state run; it avoids the database table locks otherwise required by the IMPORT command. The only other case that requires this option is importing data into a table with a nickname (in which case the COMMITCOUNT option must also be specified with an integer setting, not the literal "AUTOMATIC").

      When this option is specified, an intent exclusive (IX) lock on the target table is acquired when the first row is inserted. This allows concurrent readers and writers to access table data.

      The import operation will periodically commit inserted data to prevent lock escalation to a table lock and to avoid running out of active log space. These commits will be performed even if the COMMITCOUNT option is not specified for the run. During each commit, the import process will lose its IX table lock, and will attempt to reacquire it after the commit.

      COMMITCOUNT { n | AUTOMATIC }
      This option controls when commits of imported records occur. Valid values are an integer representing the number of imported records or the literal "AUTOMATIC".

      To reduce network overhead, Event Replicator Target Adapter import processes use NOT ATOMIC compound SQL to insert the data, with 100 SQL statements attempted with each import. If your SQL transaction log is not sufficiently large, the import operation will fail. It must be large enough to accommodate the number of rows specified by the COMMITCOUNT option or the number of rows in the data file if COMMITCOUNT is not specified. We therefore recommend that the COMMITCOUNT option be specified to avoid transaction log overflows.

      When a number (n) is specified, the import process performs a commit after that many records have been imported. When compound inserts are used, a user-specified commit frequency of n is rounded up to the first integer multiple of the compound count value; for example, with the compound count of 100, COMMITCOUNT 250 takes effect as COMMITCOUNT 300. When the literal "AUTOMATIC" is specified, the import process internally determines when a commit should be performed. The utility will commit the imported data for one of two reasons:

      • To avoid running out of active log space; or

      • To avoid a lock escalation from row level to table level.

      If the ALLOW WRITE ACCESS option is specified, and the COMMITCOUNT option is not specified, the import process will perform commits as if COMMITCOUNT AUTOMATIC had been specified.

      WARNINGCOUNT n
      This option controls when the import operation stops because of warnings. Valid values are integers of zero (0) or greater.

      Set this parameter if no warnings are expected, but you want to verify that the correct file and table are being used. If the import file or the target table is specified incorrectly, the import utility will generate a warning for each row that it attempts to import, which will cause the import to fail.

      If a value of zero is specified for this option or if this option is not specified, the import operation will continue regardless of the number of warnings issued.

      NOTIMEOUT
      This option specifies that the import utility will not time out while waiting for locks.
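
      For illustration only, an Options field entry for an installation that updates its Adabas database while the initial state runs, commits every 1000 imported records (an assumed figure; size it to your transaction log), stops on the first warning, and never times out waiting for locks might look like this:

        ALLOW WRITE ACCESS COMMITCOUNT 1000 WARNINGCOUNT 1 NOTIMEOUT

      Omit any option that does not apply to your installation, but keep the remaining options in the order shown above.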

    Save the target processing option definition with the same name as the target definition you created in Step 1.

  3. When all processing options are specified, start initial-state processing, directing the output to the Event Replicator Target Adapter via a GFFT you created (using the Data Mapping Tool or the Adabas Event Replicator Subsystem) for the DB2 database. For more information on this, read Requesting Initial-State Data. The Event Replicator Target Adapter will collect the initial-state records and load them into your DB2 database.

Processing Output

The output from this processing includes:

  • Various DB2 CONNECT IMPORT files used for its processing;

  • DB2 CONNECT IMPORT log files from the processing; and

  • DAT files containing the initial-state data received by the Event Replicator Target Adapter prior to being imported.

All output files are written to the \logs subdirectory of your Event Replicator Target Adapter installation.

Processing Files

The Event Replicator Target Adapter’s load processing for DB2 creates two processing files per table loaded. The following processing files are stored in the \logs subdirectory of your Event Replicator Target Adapter installation:

  • A single xxxx_yyyymmdd_hhmmss.BAT file for all DB2 CONNECT processing (where xxxx is the base table name, yyyymmdd is the date, and hhmmss is the time). This .BAT file is used to run the load utility.

  • One or more xxxx_yyyymmdd_hhmmss.DEL files (where xxxx is the table name, yyyymmdd and hhmmss are the timestamp identifying when the file was written, and DEL indicates the file is a delimited ASCII file). These files contain the data to be loaded to the database tables. These files can take a good deal of space, so make sure that there is enough space to accommodate them and be sure to manually delete them when initial-state processing has completed successfully.
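
Because the .DEL files are plain delimited ASCII, they can also be useful when diagnosing a failed load. As an illustration only (the alias, table, and file names are hypothetical, and the generated .BAT files build their own commands), a comparable manual import of one retained file from a DB2 command window might look like this:

    db2 connect to etadb
    db2 import from EMPLOYEES_20250115_143022.DEL of del commitcount 1000 messages EMPLOYEES_import.msg insert into EMPLOYEES
    db2 connect reset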

Log Files

Two kinds of log files are produced by Event Replicator Target Adapter import processing for DB2: a primary log file and multiple processing log files. All log files are stored in the \logs subdirectory of your Event Replicator Target Adapter installation.

  1. The DB2 load processing primary log files make references to the processing log files. The primary log files have names in the format xxxx_yymmdd_hhmmsss.out, where xxxx is the table name, yymmdd is the date of the log file, and hhmmsss is the time (to the microsecond) when the log file was last updated.

    We recommend that you review these primary log files carefully to ensure that the import process had no problems restoring primary and foreign keys, indexes, or the unique index of tables. If import processing fails for some reason, we recommend reviewing these primary .out log files first when resolving the problem, proceeding then to the processing log files, as necessary.

  2. Multiple processing log files are created, one for each import process executed by the Event Replicator Target Adapter. These processing log files have names in the format xxxx_yyyymmdd_hhmmss.MSG, where xxxx is the table name, yyyymmdd and hhmmss are the timestamp identifying when the file was written, and MSG indicates the file is a log file.