How to bulk insert with a custom AsyncEventListener

Applies To:

SQLFire 1.0.x to 1.1.2.x

Purpose:

This article provides two workarounds for an issue where a bulk insert does not properly fire a custom AsyncEventListener that is meant to asynchronously write the data into an external database (such as PostgreSQL).

Symptom:

A custom AsyncEventListener may fail to pass data that is loaded by a bulk insert (i.e., a batch insert via PreparedStatement) from a Java client application into the external database. This behavior does not occur with single (non-batch) inserts, nor with the default, built-in DBSynchronizer.
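
For illustration, the failing pattern looks roughly like the following sketch (the CUSTOMER table and its columns are hypothetical, not from the original report):

import java.sql.Connection;
import java.sql.PreparedStatement;

public class BulkInsertExample {
    // Rows queued with addBatch() and sent via executeBatch() reach the
    // server as a single BULK_INSERT event rather than per-row insert
    // events, so a custom AsyncEventListener never sees them.
    static void loadRows(Connection conn) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO CUSTOMER (ID, NAME) VALUES (?, ?)");
        ps.setInt(1, 1);
        ps.setString(2, "Alice");
        ps.addBatch();
        ps.setInt(1, 2);
        ps.setString(2, "Bob");
        ps.addBatch();
        ps.executeBatch();
        ps.close();
    }
}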

Root Cause:

The generic AsyncEventListener in SQLFire uses a slightly different invocation path than DBSynchronizer, and only primary-key-based events are sent to the AsyncEventListener queue. A bulk insert produces an event of type BULK_INSERT, and a custom AsyncEventListener that implements the AsyncEventListener interface directly will not receive BULK_INSERT events.

Solution:

There are two basic workarounds for this behavior (which will be addressed in a future release).

Workaround 1:

Use simple (non-batch) SQL inserts with a PreparedStatement instead of a bulk insert in your Java client application.

For example:

Statement.addBatch();
Statement.executeBatch();
-->
Statement.executeUpdate(insertSQL);
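
A fuller sketch of the single-insert pattern, reusing the hypothetical CUSTOMER table above; the connection URL, host, and port are placeholders for a SQLFire thin-client connection:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SingleRowInserts {
    public static void main(String[] args) throws Exception {
        // Placeholder connection URL for the SQLFire thin-client driver.
        Connection conn = DriverManager.getConnection(
                "jdbc:sqlfire://localhost:1527/");
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO CUSTOMER (ID, NAME) VALUES (?, ?)");
        String[][] rows = { { "1", "Alice" }, { "2", "Bob" } };
        for (String[] row : rows) {
            ps.setInt(1, Integer.parseInt(row[0]));
            ps.setString(2, row[1]);
            // executeUpdate() sends each row as an ordinary insert event,
            // which a custom AsyncEventListener does receive.
            ps.executeUpdate();
        }
        ps.close();
        conn.close();
    }
}

This trades batch throughput for correct listener delivery, since each row becomes a separate statement execution.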

Workaround 2:

Have the custom listener class extend the built-in DBSynchronizer instead of implementing the AsyncEventListener interface directly.

For example:

public class CustomDBSynchronizer implements AsyncEventListener
-->
import com.vmware.sqlfire.callbacks.DBSynchronizer;
public class CustomDBSynchronizer extends DBSynchronizer
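
A minimal sketch of such a subclass, assuming DBSynchronizer's processEvents method can be overridden and delegated to (the Event and listener signatures follow SQLFire's callbacks package; verify against your release's API documentation):

import java.util.List;

import com.vmware.sqlfire.callbacks.DBSynchronizer;
import com.vmware.sqlfire.callbacks.Event;

// Extending DBSynchronizer (rather than implementing AsyncEventListener
// directly) inherits its handling of bulk operations such as BULK_INSERT.
public class CustomDBSynchronizer extends DBSynchronizer {

    @Override
    public boolean processEvents(List<Event> events) {
        // Hypothetical customization point: inspect or log the batch
        // before handing it to the built-in synchronizer logic.
        System.out.println("Synchronizing " + events.size() + " events");
        // Delegate to DBSynchronizer, which writes the rows (including
        // bulk-inserted rows) to the external database.
        return super.processEvents(events);
    }
}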
