Pivotal Knowledge Base


Cannot Add HAWQ Segments in a HAWQ Cluster when HDFS is Stored on Isilon

Environment

 Product  Version
 Pivotal HDB  2.x
 Pivotal HD  2.4
 Ambari  2.x

Symptom

When trying to add HAWQ segments into a HAWQ cluster, the following error may be displayed if HDFS is stored on Isilon:

Error Message:

HAWQ segment requires DataNode to be installed along with it on the same host. Please add them first and then try adding the HAWQ segment. 

Cause

In older versions of Ambari and the Ambari HAWQ plugin, a dependency between the HAWQ Segment and the DataNode was enforced. This was corrected in more recent versions via AMBARI-14227.

However, if the cluster was upgraded from an older version, the dependency may still be present on the system, which can cause this issue in any current release of Ambari or HAWQ. For example, in an HDP 2.4 cluster upgraded from HDP 2.2, the dependency was still present in the file /var/lib/ambari-server/resources/stacks/HDP/2.2/services/HAWQ/metainfo.xml.

Resolution

To resolve this issue, remove the dependency from the Ambari server configuration files by following these steps:

1. On the Ambari server, locate the metainfo.xml files associated with HAWQ, which may be in these locations:

/var/lib/ambari-server/resources/common-services/HAWQ/2.0.0/metainfo.xml
/var/lib/ambari-server/resources/stacks/HDP/2.4/services/HAWQ/metainfo.xml
/var/lib/ambari-server/resources/stacks/HDP/2.3/services/HAWQ/metainfo.xml
/var/lib/ambari-server/resources/stacks/HDP/2.2/services/HAWQ/metainfo.xml
/var/lib/ambari-server/resources/stacks/HDP/2.*/services/HAWQ/metainfo.xml
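The affected files can also be identified programmatically. Below is a minimal Python sketch (the helper name `has_datanode_dependency` is illustrative, and the path patterns are the ones listed above; adjust them to your stack layout):

```python
import glob
import xml.etree.ElementTree as ET

def has_datanode_dependency(metainfo_path):
    """Return True if this metainfo.xml declares an HDFS/DATANODE dependency."""
    tree = ET.parse(metainfo_path)
    return any(dep.findtext("name") == "HDFS/DATANODE"
               for dep in tree.iter("dependency"))

# Candidate locations on the Ambari server; the wildcard covers other
# HDP stack versions that may carry the stale definition after an upgrade.
patterns = [
    "/var/lib/ambari-server/resources/common-services/HAWQ/2.0.0/metainfo.xml",
    "/var/lib/ambari-server/resources/stacks/HDP/2.*/services/HAWQ/metainfo.xml",
]
for pattern in patterns:
    for path in glob.glob(pattern):
        if has_datanode_dependency(path):
            print(path)  # file still enforces the HAWQ Segment/DataNode dependency
```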

2. In each of the above files, look for a HAWQSEGMENT component definition containing a <dependencies> block on HDFS/DATANODE, like the one below:

<component>
<name>HAWQSEGMENT</name>
<displayName>HAWQ Segment</displayName>
<category>SLAVE</category>
<cardinality>1+</cardinality>
<timelineAppid>HAWQ</timelineAppid>

<commandScript>
<script>scripts/hawqsegment.py</script>
<scriptType>PYTHON</scriptType>
<timeout>600</timeout>
</commandScript>

<dependencies>
<dependency>
<name>HDFS/DATANODE</name>
<scope>host</scope>
<auto-deploy>
<enabled>false</enabled>
<co-locate>HDFS/DATANODE</co-locate>
</auto-deploy>
</dependency>
</dependencies>

<customCommands>
<customCommand>
<name>IMMEDIATE_STOP_HAWQ_SEGMENT</name>
<commandScript>
<script>scripts/hawqsegment.py</script>
<scriptType>PYTHON</scriptType>
<timeout>1200</timeout>
</commandScript>
</customCommand>
</customCommands>
</component>

3. Back up the metainfo.xml files that contain this dependency block.

4. Modify the files to remove the <dependencies> block (from <dependencies> through </dependencies>).
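Steps 3 and 4 can be scripted. The sketch below (the function name is illustrative, not part of Ambari) backs up the file and drops any <dependencies> block that references HDFS/DATANODE; note that ElementTree discards XML comments and may reformat the file, so diff the result against the .bak copy before restarting:

```python
import shutil
import xml.etree.ElementTree as ET

def remove_datanode_dependency(metainfo_path):
    """Back up metainfo.xml, then drop any <dependencies> block that
    ties a component to HDFS/DATANODE. Returns True if the file changed."""
    shutil.copy2(metainfo_path, str(metainfo_path) + ".bak")  # step 3: backup
    tree = ET.parse(metainfo_path)
    changed = False
    for component in tree.iter("component"):
        for deps in component.findall("dependencies"):
            names = [d.findtext("name") for d in deps.findall("dependency")]
            if "HDFS/DATANODE" in names:
                component.remove(deps)  # step 4: remove the dependency block
                changed = True
    if changed:
        tree.write(metainfo_path)
    return changed
```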

5. Restart the Ambari server: ambari-server restart.

6. Attempt to add the HAWQ Segments back into the cluster. 
