Datanode Exception: Cannot start secure cluster without privileged resources

Problem

In a secured (Kerberized) Hadoop cluster, the datanode will fail to start if it is not configured properly.

A snippet from the datanode log (/var/log/gphd/hadoop-hdfs/hadoop-*datanode*.log) is shown below:

2014-04-17 18:06:40,432 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:726)
	...
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1751)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1904)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1925)
2014-04-17 18:06:40,438 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2014-04-17 18:06:40,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:

Troubleshooting

The pointers below can help you start the investigation; the values from the log snippet above are used as examples. Once each item has been verified, try to start the datanode again.
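
If you manage several datanodes, a quick way to find the nodes hitting this error is to grep the log directory. A minimal sketch, assuming the default Pivotal HD log path shown above:

 $ grep -l "Cannot start secure cluster without privileged resources" /var/log/gphd/hadoop-hdfs/hadoop-*datanode*.log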

Check for the following two properties in hdfs-site.xml:

<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1004</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1006</value>
</property>

These two properties ensure the datanode binds to privileged ports (below 1024). Only the root user can open ports below 1024, which is why a secure datanode must be started as root before dropping privileges.
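
To confirm which addresses the datanode will actually use, you can query the effective configuration on the node. hdfs getconf reads the configuration files visible on that host; the output below assumes the values from the snippet above:

 $ hdfs getconf -confKey dfs.datanode.address
 0.0.0.0:1004
 $ hdfs getconf -confKey dfs.datanode.http.address
 0.0.0.0:1006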

Make sure the following three lines are uncommented in the file /etc/default/hadoop-hdfs-datanode:

 export HADOOP_SECURE_DN_USER=hdfs
 export HADOOP_SECURE_DN_PID_DIR=$HADOOP_PID_DIR
 export HADOOP_SECURE_DN_LOG_DIR=$HADOOP_LOG_DIR/hdfs
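
As a sanity check, confirm the variables are now active in the file, then restart the datanode and verify that the secure starter (jsvc) is running as root. The service name below assumes the standard packaged init script:

 $ grep HADOOP_SECURE /etc/default/hadoop-hdfs-datanode | grep -v '^#'
 $ sudo service hadoop-hdfs-datanode restart
 $ ps -ef | grep [j]svc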

Miscellaneous

Verify that you can kinit using the principal name and keytab for the hdfs user:

 $ kinit -kt <path>/<keytab_name> <user_name>/<FQDN>@REALM.COM
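
For example, on a datanode named dn1.example.com in the realm EXAMPLE.COM (hypothetical values; substitute your own keytab path and principal):

 $ kinit -kt /etc/security/keytab/hdfs.service.keytab hdfs/dn1.example.com@EXAMPLE.COM
 $ klist

If kinit returns without an error, the keytab and principal are valid; klist should then show the granted ticket.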

How to verify the contents of a keytab file:

 $ klist -ket <path>/<keytab_name>
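
The key version number (KVNO) and encryption types printed for each entry should match what the KDC holds for that principal. After a successful kinit you can compare with kvno (same hypothetical values as above):

 $ klist -ket /etc/security/keytab/hdfs.service.keytab
 $ kvno hdfs/dn1.example.com@EXAMPLE.COM

A KVNO mismatch usually means the keytab is stale and needs to be regenerated, as described next.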

How to regenerate a keytab file:

 $ kadmin.local
 kadmin.local:  ktadd -norandkey -k <keytab_name> <user_name>/<FQDN>@REALM.COM
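
A complete hypothetical run, including restoring ownership and permissions so only the hdfs user can read the regenerated keytab (paths and names are examples only):

 $ sudo kadmin.local
 kadmin.local:  ktadd -norandkey -k /etc/security/keytab/hdfs.service.keytab hdfs/dn1.example.com@EXAMPLE.COM
 kadmin.local:  quit
 $ sudo chown hdfs:hdfs /etc/security/keytab/hdfs.service.keytab
 $ sudo chmod 400 /etc/security/keytab/hdfs.service.keytab

The -norandkey flag (supported by kadmin.local) keeps the existing key, so previously issued tickets and other copies of the keytab remain valid.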

Note: We will keep updating this document as we find more causes of this issue.
