Why you should always use DBT for transferring IC databases…

Hi,

When talking about migrating IBM Connections from an old release to the current one, you have to think about moving the databases from the old to the new environment (at least if you do a side-by-side migration, which is recommended when you migrate a production environment). You have two options for such a migration:

  • db2 backup on the old server and restore on the new server
  • use the dbt.jar tool to transfer databases

I always perform my migrations using dbt. Gabriella Davis preaches this as well if you ask her, and I can only encourage you to do the same, for the following reasons:

  • DBT is more flexible, and erroneous data in the source database will be detected instead of silently being carried over to the new version
  • The process is really easy and straightforward if you prepare everything properly
  • The new database gets fresh database parameters instead of inheriting them from an old-release backup
  • You can migrate databases from Windows to Linux (backup/restore is not possible here without workarounds)
  • You can migrate databases between DB2, Oracle and SQL Server, which is normally only possible with third-party or other special tools

Many advantages in my opinion!!!

I have an example where erroneous data had to be deleted in order to complete a transfer.

When transferring the HOMEPAGE database from IBM Connections 5.0 to 5.5, the transfer failed with the following error message in the log:

[07/07/16 13:22:08.738 UTC] Transferring table --{ HOMEPAGE.LOGINNAME}-- to table --{HOMEPAGE.LOGINNAME }--
[07/07/16 13:22:11.913 UTC] error.executing.transfer err.dbtransfer.exception.labelclass
com.ibm.db2.jcc.am.BatchUpdateException: [jcc][t4][102][10040][3.65.110] Batch failure.
The batch was submitted, but at least one exception occurred on an individual member of the batch.
Use getNextException() to retrieve the exceptions for specific batched elements.
ERRORCODE=-4229, SQLSTATE=null com.ibm.db2.jcc.am.BatchUpdateException

The reason for this failure was that the transfer tool tried to insert the same value twice into a table that has a unique index. db2 uses unique indexes to ensure that no identical key values are stored in a table, so normally such duplicates should NOT exist in the first place… please do not ask me how this happened… Anyway, I had to remove the duplicate value:

[Screenshot: the duplicate entries in the HOMEPAGE.LOGINNAME table]

The unique index is set on the columns “LOGINNAME” and “ORGANIZATION_ID”.

In the “createDB.sql” script you can see this unique index creation:

CREATE UNIQUE INDEX HOMEPAGE.LOGINNAME_UNIQUE
ON HOMEPAGE.LOGINNAME (LOGINNAME, ORGANIZATION_ID)@
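Just to illustrate what db2 enforces here, a minimal sketch against a throwaway table (the SCRATCH schema plus the column types are only assumptions for the example, not the real HOMEPAGE definition): the second INSERT with the same key pair is rejected with SQL0803N / SQLSTATE 23505, which is the kind of error that hides behind the batch exception shown above.

-- run with the db2 command line processor, e.g. db2 -td@ -vf <file> (same @ terminator as createDB.sql)
CREATE TABLE SCRATCH.LOGINNAME_TEST (
    PERSON_ID       VARCHAR(36)  NOT NULL,
    LOGINNAME       VARCHAR(256) NOT NULL,
    ORGANIZATION_ID VARCHAR(36)  NOT NULL
)@

CREATE UNIQUE INDEX SCRATCH.LOGINNAME_TEST_UNIQUE
    ON SCRATCH.LOGINNAME_TEST (LOGINNAME, ORGANIZATION_ID)@

INSERT INTO SCRATCH.LOGINNAME_TEST VALUES ('person-1', 'jdoe@example.com', 'org-1')@

-- the second row with the same (LOGINNAME, ORGANIZATION_ID) pair fails with
-- SQL0803N, SQLSTATE 23505 (duplicate key value for a unique index)
INSERT INTO SCRATCH.LOGINNAME_TEST VALUES ('person-2', 'jdoe@example.com', 'org-1')@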

To verify the duplicate data, I ran the following queries:

[Screenshot: query result showing two different PERSON_IDs with the same LOGINNAME]

As you can see… two different PERSON_IDs, but the same LOGINNAME twice in the table. This is why the transfer failed: the target database rejected the second row as a constraint violation.
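If you need to hunt down such duplicates yourself, queries along these lines do the job (table and column names are taken from the createDB.sql excerpt above; the value in the second query is just a placeholder):

-- list every LOGINNAME that occurs more than once within one organization
SELECT LOGINNAME, ORGANIZATION_ID, COUNT(*) AS OCCURRENCES
  FROM HOMEPAGE.LOGINNAME
 GROUP BY LOGINNAME, ORGANIZATION_ID
HAVING COUNT(*) > 1@

-- show the affected rows, including their PERSON_IDs
SELECT PERSON_ID, LOGINNAME, ORGANIZATION_ID
  FROM HOMEPAGE.LOGINNAME
 WHERE LOGINNAME = '<duplicate LOGINNAME from the first query>'@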

I deleted one of the two entries for that user in the LOGINNAME table and the transfer finished successfully.
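For reference, the cleanup itself is just a single DELETE; the PERSON_ID below is only a placeholder for the row you decide to drop after checking which of the two entries is the obsolete one:

-- placeholder value: use the PERSON_ID of the obsolete duplicate row
DELETE FROM HOMEPAGE.LOGINNAME
 WHERE PERSON_ID = '<PERSON_ID of the obsolete row>'@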

Update: Thanks to Marius Meyer for his comment… You can also exclude the complete table if you add the following exclude statement to your dbt xml file for the HOMEPAGE transfer:

<table sourceName="LOGINNAME" exclude="true"/> <!-- optional argument used to have dbtransfer skip a table -->

This might work for this particular table because it is rebuilt whenever a user logs into the homepage feature… but you need to use this with caution. Any other table that you skip might cause a lot of trouble, so it is always a better idea to find the root cause of the failing transfer.

I remember an older case where I had exactly the same issue with the IBM Connections METRICS database, where some Vietnamese characters caused the transfer to fail.

In both cases I was only able to identify the problems because the DBT transfer failed and I did a deep analysis of the transfer log and the db2diag.log.

By the way, the dbt tool originally comes from WebSphere Portal, where this approach has been used for years to transfer data from Derby to a full DBMS.

6 thoughts on “Why you should always use DBT for transferring IC databases…”

  1. Great post (like always)! 🙂
    This is what I do in every migration. The DBT tool is the Swiss Army knife for DB content transfer.
    As you mentioned, the unique index on LOGINNAME in the HOMEPAGE DB gets violated sometimes. You can exclude the transfer of this table in the XML definition for the DBT tool with “”. (THX to B. Hering!) LOGINNAME will be rebuilt on the user's next login. I have not noticed any problems in any migration.

      • Hi Marius,

        thanks for your comment. I added the table exclusion to the post. Thanks for the hint – I never used it because I feared data integrity problems… But for LOGINNAME this might be okay and work. For all other tables you might run into serious trouble.
        WordPress swallows tags… yep, because of the editor that is used – and you cannot switch between editor modes if you make a comment. Sorry for that 😉

  2. Nice work. It’s good to highlight the importance of this tool, even if it can cause frustration and delay when migrating. It’s better to get the data in good order, which will only help future migrations.

  3. Another great reason to use dbt.jar is that it does not transfer unnecessary data for some of the Search-related tables in the HOMEPAGE database. In particular, I’m referring to the SR_INDEX_DOCS table, which can grow to a huge number of rows if the SR_INDEX_MANAGEMENT table contains rows for search nodes that no longer exist on the target system.
