ECS 3.6.2 Data Access Guide

Permissions errors

Insufficient permissions errors can occur for a number of reasons. You may receive this type of error when running a hadoop fs command, or you may see it in an application log, such as the log for MapReduce or Hive.

INSUFFICIENT_PERMISSIONS errors

In the following example, the jhs principal tried to create a directory (/tmp) and received an INSUFFICIENT_PERMISSIONS error. In this case, the permissions of the root directory did not allow this user to create a directory.

root@lrmk042:/etc/security/keytabs# hadoop fs -mkdir /tmp
18/02/26 21:03:09 ERROR vipr.ViPRFileSystemClientBase: Permissions failure for request: User: jhs/lrmk042.lss.emc.com@HOP171_HDFS.EMC.COM (auth:KERBEROS), host: hdfsBucket3.s3.site1, namespace: s3, bucket: hdfsBucket3
18/02/26 21:03:09 ERROR vipr.ViPRFileSystemClientBase: Request message sent: MkDirRequestMessage[kind=MKDIR_REQUEST,namespace=s3,bucket=hdfsBucket3,path=/tmp,hdfsTrustedStatus=HDFS_USER_NOT_TRUSTED,permissions=rwxr-xr-x,createParent=true]
mkdir: java.security.AccessControlException: ERROR_INSUFFICIENT_PERMISSIONS

root@lrmk042:/etc/security/keytabs# hadoop fs -ls -d /
drwxr-xr-x - hdfs hdfs 0 2018-02-26 16:58 /
root@lrmk042:/etc/security/keytabs#

When the cause of an insufficient permissions error is not obvious on the client, you may have to look at the server logs. Start with dataheadsvc-error.log to find the error. Open a terminal window to each ECS node and examine the dataheadsvc-error.log file. Find the error that corresponds to the time you saw the error on the client.
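
For example, a quick way to locate the relevant entries is to search the log for the timestamp reported on the client or for the requesting principal. The log directory shown below is an assumption; verify the location of dataheadsvc-error.log for your ECS release.
# On each ECS node; the log directory below is an assumption -- confirm it for your release.
cd /opt/emc/caspian/fabric/agent/services/object/main/log
# Search around the timestamp reported on the client (21:03 in the mkdir example above),
# or search for the requesting principal:
grep "2018-02-26 21:03" dataheadsvc-error.log
grep "jhs@HOP171_HDFS.EMC.COM" dataheadsvc-error.log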

Failed to get credentials

You may see an error like the following in dataheadsvc-error.log:
2018-02-26 22:36:21,985 [pool-68-thread-6] ERROR RequestProcessor.java (line 1482) Unable to get group credentials for principal 'jhs@HOP171_HDFS.EMC.COM'. This principal will default to use local user groups. Error message: java.io.IOException: Failed to get group credentials for 'jhs@HOP171_HDFS.EMC.COM', status=ERROR

This is not an error. The message means that the server tried to look up the principal's name to see if there are any cached Active Directory (AD) groups for the principal user making the request. This message is returned for a Kerberos user.

The message indicates the user name making the request. Make a note of it.
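
Because the principal falls back to local user groups in this case, it can be useful to check which local groups the short user name maps to on the Hadoop node (a quick sketch; the short name jhs is taken from the principal above):
# List the local (operating system) groups for the short user name derived from the principal.
id jhs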

Bucket Access Error

If a user making a request to access a bucket does not have ACL permissions, you may see an error like the following in dataheadsvc-error.log.

2018-02-26 21:35:26,652 [pool-68-thread-1] ERROR BucketAPIImpl.java (line 220) Getting bucket failed with
com.emc.storageos.objcontrol.object.exception.ObjectAccessException: you don't have GET_KEYPOOL_ACL permission to this keypool
at com.emc.storageos.objcontrol.object.exception.ObjectAccessException.createExceptionForAPI(ObjectAccessException.java:286)
at com.emc.storageos.data.object.ipc.protocol.impl.ObjectAccessExceptionParser.parseFrom(ObjectAccessExceptionParser.java:61)

In this case, you should either add an explicit user ACL for the bucket, or add a custom group ACL for one of the groups that the user is a member of.
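
Bucket ACLs can be set in the ECS Portal or with the ECS Management REST API. The following is only a sketch: the management port, the PUT /object/bucket/{bucketName}/acl endpoint, and the XML payload are assumptions to verify against the ECS Management REST API reference for your release.
# Sketch only: endpoint and payload are assumptions -- confirm against the ECS Management REST API reference.
# 1. Authenticate against the management API (port 4443 assumed) and capture the token header.
TOKEN=$(curl -sk -u mgmt_user:mgmt_password -D - -o /dev/null https://ecs-node:4443/login \
  | awk '/X-SDS-AUTH-TOKEN/ {print $2}' | tr -d '\r')
# 2. Grant the requesting user an explicit ACL on the bucket (hypothetical XML body).
curl -sk -X PUT "https://ecs-node:4443/object/bucket/hdfsBucket3/acl?namespace=s3" \
  -H "X-SDS-AUTH-TOKEN: $TOKEN" -H "Content-Type: application/xml" \
  -d '<bucket_acl><bucket>hdfsBucket3</bucket><namespace>s3</namespace><acl><user_acl><user>jhs@HOP171_HDFS.EMC.COM</user><permission>full_control</permission></user_acl></acl></bucket_acl>'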

Object Access Error

Another type of permission error is an object access error. Access to objects (files and directories) should not be confused with access to a bucket. A user may have full control (read/write/delete) to a bucket, but may receive an INSUFFICIENT_PERMISSIONS error because they do not have access to one or more objects in the path they are trying to access. The following provides an example of an object access error.

2018-02-26 22:36:21,995 [pool-68-thread-6] ERROR FileSystemAccessHelper.java (line 1364) nfsProcessOperation failed to process path: mr-history/done
2018-02-26 22:36:21,995 [pool-68-thread-6] ERROR ObjectControllerExceptionHelper.java (line 186) Method nfsGetSMD failed due to exception
com.emc.storageos.data.object.exception.ObjectControllerException: directory server returns error ERROR_ACCESS_DENIED
at com.emc.storageos.data.object.FileSystemAccessLayer.FileSystemAccessHelper.nfsProcessOperation(FileSystemAccessHelper.java:1368)
at com.emc.storageos.data.object.FileSystemAccessLayer.FileSystemAccessHelper.getSystemMetadata(FileSystemAccessHelper.java:466)
at com.emc.storageos.data.object.FileSystemAccessLayer.FileSystemAccessLayer.getSystemMetadata(FileSystemAccessLayer.java:532)
at com.emc.storageos.data.object.blob.client.BlobAPI.getStat(BlobAPI.java:1294)
at com.emc.vipr.engine.real.RealBlobEngine.stat(RealBlobEngine.java:1976)
at com.emc.vipr.engine.real.RealBlobEngine.stat(RealBlobEngine.java:802)
at com.emc.vipr.hdfs.fs.RequestProcessor.accept(RequestProcessor.java:499)
at com.emc.vipr.hdfs.net.ConnectionManager$RequestThread.run(ConnectionManager.java:136)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

The two important items to note here are the requested action (stat) and the path of the object (mr-history/done). Note that the leading slash character is not displayed, so the real path is /mr-history/done. Now you have three pieces of information that are important for debugging:

  • user principal (jhs@HOP171_HDFS.EMC.COM)
  • action (stat is hadoop fs -ls)
  • path (/mr-history/done)

Two approaches for additional debugging are described below:

  • Blobsvc log debugging
  • Hadoop client debugging

Blobsvc log debugging

A failed permission request will have an error like the following in the blobsvc log:
2018-02-26 22:36:21,994
[TaskScheduler-BlobService-COMMUNICATOR-ParallelExecutor-5892]
ERROR ObjectAclChecker.java (line 101) not permit, cred jhs@HOP171_HDFS.EMC.COM[hadoop]false1 with
action GET_OBJECT_ACL on object with acl/owner/group user={hdfs@hop171_hdfs.emc.com=[FULL_CONTROL]},
groups={hdfs=[READ_ACL, EXECUTE, READ]}, other=[], owner=hdfs@hop171_hdfs.emc.com, group=hdfs

Look for not permit. This tells us the user making the request (jhs), the object's owner (hdfs), the object's group (hdfs), and the permissions for owner, group, and others. What it does not tell us is the actual object that failed the permission check. To find it, become the hdfs principal on the Hadoop node, start with the failing path, and work up the tree. This leads to the other method of debugging: looking at the Hadoop file system from the client.
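
For example (a sketch; the keytab path and principal name below are assumptions for this environment):
# Become the hdfs principal (keytab path and principal are assumptions for this environment).
kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@HOP171_HDFS.EMC.COM
# Start with the failing path and work up the tree, as shown in the next section.
hadoop fs -ls -d /mr-history/done
hadoop fs -ls -d /mr-history
hadoop fs -ls -d /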

Hadoop client debugging

When a permission error is received, you should know the user principal making the request, the action being requested, and the item being requested. In the example, the jhs user received an error listing the /mr-history/done directory. You can do some analysis to determine the root cause. If you have access to the superuser account, perform these steps as that account. The following output shows that the jhs principal should have had access to list the /mr-history/done directory:
root@lrmk042:/var/log/hadoop-mapreduce/mapred# hadoop fs -ls -d /mr-history/done
drwxrwxrwt - mapred hadoop 0 2018-02-26 16:58 /mr-history/done

Likewise, the following output shows that the /mr-history directory presents no access issues:

root@lrmk042:/var/log/hadoop-mapreduce/mapred# hadoop fs -ls -d /mr-history
drwxr-xr-x - hdfs hdfs 0 2018-02-26 16:58 /mr-history

The root directory, however, reveals the problem:

root@lrmk042:/var/log/hadoop-mapreduce/mapred# hadoop fs -ls -d /
drwxr-x--- - hdfs hdfs 0 2018-02-26 16:58 /

The problem here is that the root directory is owned by user hdfs and group hdfs, but the permission bits for others are --- (0). The user making the request is jhs@REALM, and this user is a member of the hadoop group, but not of hdfs, so the user has no object ACL permissions to list the /mr-history/done directory. Running the chmod command on the root directory enables this user to perform their task.

root@lrmk042:/var/log/hadoop-mapreduce/mapred# hadoop fs -chmod 755 /

root@lrmk042:/var/log/hadoop-mapreduce/mapred# hadoop fs -ls -d /
drwxr-xr-x - hdfs hdfs 0 2018-02-26 16:58 /
