Hi,
I am writing a Spring Batch job and everything is going well. However, when I use jdbcTemplate.batchUpdate, one INSERT statement is created per row.

I have multi-threading enabled and am using a paging reader. When I view the Greenplum (Postgres) logs, there is one INSERT statement for each row. I have autocommit turned off both in the code and in the database. I would have thought the whole point of using batchUpdate was that there would be only one INSERT per chunk?

The following is an example of my code:
Code:
int[] updateCounts = jdbcTemplate.batchUpdate(sqlStatement,
        new BatchPreparedStatementSetter() {
            public int getBatchSize() {
                return productAdvisories.size();
            }

            public void setValues(java.sql.PreparedStatement arg0,
                    int arg1) throws SQLException {
                // all the correct ps.setXxx() calls for row arg1 go here
            }
        });
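For context, this call runs once per chunk from my ItemWriter, roughly like the sketch below. It is simplified: the ProductAdvisory type, its getters, and the SQL column list are placeholders rather than the real mappings.

Code:
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

public class ProductAdvisoryWriter implements ItemWriter<ProductAdvisory> {

    // placeholder SQL - the real statement has the full column list
    private static final String sqlStatement =
            "INSERT INTO product_advisory (id, advisory_text) VALUES (?, ?)";

    private final JdbcTemplate jdbcTemplate;

    public ProductAdvisoryWriter(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public void write(final List<? extends ProductAdvisory> productAdvisories) throws Exception {
        // one batchUpdate call per chunk; the chunk's items arrive in this list
        jdbcTemplate.batchUpdate(sqlStatement, new BatchPreparedStatementSetter() {
            public int getBatchSize() {
                return productAdvisories.size();
            }

            public void setValues(PreparedStatement ps, int i) throws SQLException {
                ProductAdvisory advisory = productAdvisories.get(i);
                ps.setLong(1, advisory.getId());       // placeholder getters
                ps.setString(2, advisory.getText());
            }
        });
    }
}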
We are using the following as a datasource:

Code:
<bean id="traceDb" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName" value="${traceDB.driverClassName}" />
<property name="url" value="${traceDB.url}" />
<property name="username" value="${traceDB.username}" />
<property name="password" value="${traceDB.password}" />
<property name="defaultAutoCommit" value="false"></property>
<property name="defaultTransactionIsolation">
<util:constant
static-field="org.springframework.transaction.TransactionDefinition.ISOLATION_READ_UNCOMMITTED" />
</property>
</bean>
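One more detail, in case the driver matters: as far as I know, the stock PostgreSQL JDBC driver only rewrites a batch into multi-row INSERTs when reWriteBatchedInserts is enabled on the URL; without it, each row of the batch is still logged as its own statement on the server. I am not certain that applies to whatever driver ${traceDB.driverClassName} resolves to, so the property below is only a guess at what ${traceDB.url} might look like with the flag added (host, port and database name are placeholders):

Code:
<property name="url"
          value="jdbc:postgresql://dbhost:5432/tracedb?reWriteBatchedInserts=true" />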
If anyone can help I'd appreciate it.
Best Regards,
John