Pivotal Knowledge Base


Hive insert failing with "No record of lock could be found, may have timed out"

Environment

Product     Version
Pivotal HD  3.x
Hive        0.14

Symptom

Intermittently, when inserting into a Hive table, the following exception is raised:

Exception: org.apache.hadoop.hive.ql.lockmgr.LockException: No record of lock could be found, may have timed out
Killing DAG...
java.io.IOException: org.apache.hadoop.hive.ql.lockmgr.LockException: No record of lock could be found, may have timed out
at org.apache.hadoop.hive.ql.exec.Heartbeater.heartbeat(Heartbeater.java:84)
at org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor.monitorExecution(TezJobMonitor.java:293)
at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:167)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1606)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1367)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1179)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1006)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:996)

Cause

This error is caused by a software bug in the BoneCP connection pooling library, which Hive uses by default for its metastore database connections.

Resolution

Configure Hive to use the DBCP connection pooling library instead of BoneCP by following these steps:

1. Open Ambari. 

2. Under "Services / Hive / Configs / Advanced / Custom hive-site", click "Add Property" and enter the following key/value pair: 

datanucleus.connectionPoolingType = dbcp

3. Restart the affected services when Ambari prompts you to. 
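
After the restart, Ambari merges Custom hive-site entries into the hive-site.xml it deploys on the metastore host. The resulting entry should look like the following sketch of the generated configuration (for verification only; do not edit the file by hand on an Ambari-managed cluster):

```xml
<!-- Added via Ambari "Custom hive-site" -->
<!-- Switches the DataNucleus connection pool from BoneCP to DBCP -->
<property>
  <name>datanucleus.connectionPoolingType</name>
  <value>dbcp</value>
</property>
```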
