
2 Ways of Updating/Deleting Bulk Data from a Table


As a DB2 LUW developer, I have written many migration scripts that perform data manipulation on bulk data. The best-known problem with such operations is that, much of the time, we run into a transaction-log-full error:

“SQL0964C The transaction log for the database is full”

This problem occurs frequently when the amount of data is huge. For example, consider a performance or production database with a table holding millions of customers, where we need to update some value across all of the customer data.

To resolve this problem, DBAs usually tune database parameters such as LOGBUFSZ, LOGFILSIZ, LOGPRIMARY and LOGSECOND. This is a trial-and-error approach: we set particular values and then retry the operation.
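For reference, these parameters can be inspected and changed from the DB2 command line. The values below are purely illustrative; suitable sizes depend entirely on your workload and available disk space:

db2 connect to sample
db2 get db cfg for sample
db2 update db cfg for sample using LOGFILSIZ 10000 LOGPRIMARY 20 LOGSECOND 50

Depending on the parameter, a database deactivation/reactivation may be needed before the new values take effect.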

Along with this, over time I have learned ways to perform such bulk operations using SQL PL.

Note: I use the SAMPLE database in the examples below. The examples are for reference only and can be adapted to different scenarios; for the SAMPLE database itself these methods are not really needed, since its data volume is very small.

1) Update n records at a time using GET DIAGNOSTICS:

We can write a stored procedure where, inside a WHILE loop, we use the UPDATE statement in a slightly different way, as follows:

WHILE nCount <> 0
DO
   UPDATE (SELECT * FROM Trnsct_tab WHERE isUpdated = 0 FETCH FIRST 10000 ROWS ONLY)
      SET Transactky = pmtlineitemky,
          isUpdated  = 1;    -- mark the rows as done so the next iteration fetches fresh rows
   GET DIAGNOSTICS nCount = ROW_COUNT;
   COMMIT;
END WHILE;

In the above example, a COMMIT is performed after every 10,000 records, which frees the transaction log. I used 10,000 records at a time; any batch size can be chosen depending on the environment and parameter values.
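Wrapped in a complete stored procedure, the loop above might look as follows. This is only a sketch: Trnsct_tab and its columns are the hypothetical names from the snippet, and the procedure has not been run against a real schema.

CREATE PROCEDURE BatchUpdateSP ()
SPECIFIC BatchUpdateSP
LANGUAGE SQL
BEGIN
    -- Start non-zero so the loop body runs at least once.
    DECLARE nCount INTEGER DEFAULT 1;

    WHILE nCount <> 0
    DO
        UPDATE (SELECT * FROM Trnsct_tab
                 WHERE isUpdated = 0
                 FETCH FIRST 10000 ROWS ONLY)
           SET Transactky = pmtlineitemky,
               isUpdated  = 1;   -- flag processed rows so they are not re-fetched
        GET DIAGNOSTICS nCount = ROW_COUNT;
        COMMIT;                  -- release the log space used by this batch
    END WHILE;
END

When ROW_COUNT finally returns 0, no unprocessed rows remain and the loop exits.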

2) Update/Delete n records at a time using commit-count method:

We can design a stored procedure with an input parameter that determines after how many records the commit should be performed. This is the approach I follow in almost all of my migration scripts whenever I suspect the transaction log might fill up; it is the most reliable method in my experience.

CREATE PROCEDURE BulkUpdateSP
(
    IN v_CommitInterval INTEGER
)
SPECIFIC BulkUpdateSP
LANGUAGE SQL
BEGIN
    DECLARE v_cntCommit INTEGER DEFAULT 0;
    DECLARE v_empNo CHAR(6);           -- EMPNO is CHAR(6) in the SAMPLE database
    DECLARE SQLCODE INTEGER DEFAULT 0;

    -- WITH HOLD keeps the cursor open across the intermediate COMMITs.
    DECLARE C1 CURSOR WITH HOLD FOR
        SELECT empNo FROM EMPLOYEE;

    OPEN C1;
    FETCH C1 INTO v_empNo;
    WHILE (SQLCODE <> 100)
    DO
        UPDATE EMPLOYEE
           SET workDept = 'F004'
         WHERE empNo = v_empNo;

        SET v_cntCommit = v_cntCommit + 1;
        -- Commit after the specified number of transactions.
        IF v_cntCommit > v_CommitInterval THEN
            COMMIT;
            SET v_cntCommit = 0;
        END IF;
        FETCH C1 INTO v_empNo;
    END WHILE;
    CLOSE C1;
    COMMIT;   -- commit any rows remaining since the last interval
END

While calling this stored procedure, the user can specify the number of records after which the current transaction should be committed.

CALL PPANDE.BulkUpdateSP(500) -- performs a commit after every 500 transactions.
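Since the title also mentions deleting, the same batching idea applies to DELETE. DB2 allows deleting from a fullselect, so the GET DIAGNOSTICS pattern from the first method carries over directly. The sketch below is illustrative only: STALE_ORDERS and order_date are hypothetical names, not objects from the SAMPLE database.

CREATE PROCEDURE BulkDeleteSP ()
SPECIFIC BulkDeleteSP
LANGUAGE SQL
BEGIN
    -- Start non-zero so the loop body runs at least once.
    DECLARE nCount INTEGER DEFAULT 1;

    WHILE nCount <> 0
    DO
        -- Delete at most 10000 matching rows per iteration.
        DELETE FROM (SELECT 1 FROM STALE_ORDERS
                      WHERE order_date < '2010-01-01'
                      FETCH FIRST 10000 ROWS ONLY);
        GET DIAGNOSTICS nCount = ROW_COUNT;
        COMMIT;   -- free the log space consumed by this batch
    END WHILE;
END

Each iteration removes one batch and commits, so the transaction log never has to hold more than one batch's worth of delete records.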

Posted on June 6, 2014 in Databases, DB2
