Ranger-usersync installation failures

Problem 1:

Credential ranger.usersync.policymgr.password has NOT been created. Mkdirs failed to create file:/usr/hdp/current/ranger-usersync/conf (exists=false, cwd=file:/var/lib/ambari-agent)

Detailed error:

Task Log:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_usersync.py", line 120, in <module>
    RangerUsersync().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_usersync.py", line 49, in install
    mode = 0640
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 120, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/ranger-usersync/conf/ugsync.jceks'] failed, parent directory /usr/hdp/current/ranger-usersync/conf doesn't exist
 stdout:
2021-08-03 11:28:02,190 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2021-08-03 11:28:02,196 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2021-08-03 11:28:02,198 - Group['kms'] {}
2021-08-03 11:28:02,199 - Group['livy'] {}
2021-08-03 11:28:02,199 - Group['spark'] {}
2021-08-03 11:28:02,199 - Group['ranger'] {}
2021-08-03 11:28:02,200 - Group['hdfs'] {}
2021-08-03 11:28:02,200 - Group['hadoop'] {}
2021-08-03 11:28:02,200 - Group['users'] {}
2021-08-03 11:28:02,200 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,201 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,202 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,203 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,204 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 11:28:02,205 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,206 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,206 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-08-03 11:28:02,207 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 11:28:02,208 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2021-08-03 11:28:02,209 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-08-03 11:28:02,210 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-08-03 11:28:02,211 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 11:28:02,211 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,212 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-08-03 11:28:02,213 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,214 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,215 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 11:28:02,215 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 11:28:02,217 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-08-03 11:28:02,226 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-08-03 11:28:02,226 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-08-03 11:28:02,227 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 11:28:02,228 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 11:28:02,229 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-08-03 11:28:02,241 - call returned (0, '1009')
2021-08-03 11:28:02,241 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-08-03 11:28:02,249 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2021-08-03 11:28:02,249 - Group['hdfs'] {}
2021-08-03 11:28:02,250 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-08-03 11:28:02,250 - FS Type: HDFS
2021-08-03 11:28:02,250 - Directory['/etc/hadoop'] {'mode': 0755}
2021-08-03 11:28:02,267 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-08-03 11:28:02,268 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-08-03 11:28:02,286 - Repository['HDP-3.1-repo-1'] {'base_url': 'http://10.1.192.57/hdp/HDP/centos7/3.1.0.0-78/', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-08-03 11:28:02,296 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://10.1.192.57/hdp-utils/HDP-UTILS/centos7/1.1.0.22/', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-08-03 11:28:02,300 - Repository['HDP-3.1-GPL-repo-1'] {'base_url': 'http://10.1.192.57/hdp-gpl/HDP-GPL/centos7/3.1.0.0-78/', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2021-08-03 11:28:02,303 - Repository[None] {'action': ['create']}
2021-08-03 11:28:02,304 - File['/tmp/tmpJny92v'] {'content': '[HDP-3.1-repo-1]\nname=HDP-3.1-repo-1\nbaseurl=http://10.1.192.57/hdp/HDP/centos7/3.1.0.0-78/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://10.1.192.57/hdp-utils/HDP-UTILS/centos7/1.1.0.22/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.1-GPL-repo-1]\nname=HDP-3.1-GPL-repo-1\nbaseurl=http://10.1.192.57/hdp-gpl/HDP-GPL/centos7/3.1.0.0-78/\n\npath=/\nenabled=1\ngpgcheck=0'}
2021-08-03 11:28:02,305 - Writing File['/tmp/tmpJny92v'] because contents don't match
2021-08-03 11:28:02,305 - File['/tmp/tmpbpAzva'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2021-08-03 11:28:02,306 - Writing File['/tmp/tmpbpAzva'] because contents don't match
2021-08-03 11:28:02,307 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,414 - Skipping installation of existing package unzip
2021-08-03 11:28:02,414 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,424 - Skipping installation of existing package curl
2021-08-03 11:28:02,424 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,433 - Skipping installation of existing package hdp-select
2021-08-03 11:28:02,438 - The repository with version 3.1.0.0-78 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2021-08-03 11:28:02,833 - Package['ranger_3_1_0_0_78-admin'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,932 - Skipping installation of existing package ranger_3_1_0_0_78-admin
2021-08-03 11:28:02,934 - Package['ranger_3_1_0_0_78-usersync'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,943 - Skipping installation of existing package ranger_3_1_0_0_78-usersync
2021-08-03 11:28:02,945 - Package['ranger_3_1_0_0_78-tagsync'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2021-08-03 11:28:02,955 - Skipping installation of existing package ranger_3_1_0_0_78-tagsync
2021-08-03 11:28:02,956 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2021-08-03 11:28:03,010 - Execute[(u'/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/usr/hdp/current/ranger-usersync/lib/*', 'org.apache.ranger.credentialapi.buildks', 'create', u'ranger.usersync.policymgr.password', '-value', [PROTECTED], '-provider', u'jceks://file/usr/hdp/current/ranger-usersync/conf/ugsync.jceks')] {'logoutput': True, 'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'sudo': True}
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
log4j:WARN No appenders could be found for logger (org.apache.htrace.core.Tracer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
WARNING: You have accepted the use of the default provider password
by not configuring a password in one of the two following locations:
    * In the environment variable HADOOP_CREDSTORE_PASSWORD
    * In a file referred to by the configuration entry
      hadoop.security.credstore.java-keystore-provider.password-file.
Please review the documentation regarding provider passwords in
the keystore passwords section of the Credential Provider API
Continuing with the default provider password.

Credential ranger.usersync.policymgr.password has NOT been created. Mkdirs failed to create file:/usr/hdp/current/ranger-usersync/conf (exists=false, cwd=file:/var/lib/ambari-agent)
create <alias> [-value alias-value] [-provider provider-path] [-strict]:

The create subcommand creates a new credential for the name
specified as the <alias> argument within the provider indicated
through the -provider argument. If -strict is supplied, fail
immediately if the provider requires a password and none is given.
If -value is provided, use that for the value of the credential
instead of prompting the user.
java.io.IOException: Mkdirs failed to create file:/usr/hdp/current/ranger-usersync/conf (exists=false, cwd=file:/var/lib/ambari-agent)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:458)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:443)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1118)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1098)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:987)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:975)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:652)
    at org.apache.hadoop.security.alias.JavaKeyStoreProvider.getOutputStreamForKeystore(JavaKeyStoreProvider.java:59)
    at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.flush(AbstractJavaKeyStoreProvider.java:288)
    at org.apache.hadoop.security.alias.CredentialShell$CreateCommand.execute(CredentialShell.java:355)
    at org.apache.hadoop.tools.CommandShell.run(CommandShell.java:72)
    at org.apache.ranger.credentialapi.buildks.createKeyStore(buildks.java:149)
    at org.apache.ranger.credentialapi.buildks.createCredential(buildks.java:86)
    at org.apache.ranger.credentialapi.buildks.main(buildks.java:40)
2021-08-03 11:28:03,789 - File['/usr/hdp/current/ranger-usersync/conf/ugsync.jceks'] {'owner': 'ranger', 'group': 'ranger', 'mode': 0640}
2021-08-03 11:28:03,796 - The repository with version 3.1.0.0-78 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries

Solution:

The conf path under /usr/hdp/current/ranger-usersync does not resolve to a real directory, so the keystore cannot be written there. Re-point conf at the shipped conf.dist defaults:

# cd /usr/hdp/current/ranger-usersync

# mv conf conf.bak

# ln -s conf.dist conf
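Problems 1, 3 and 4 all reduce to the same condition: the component's conf path under /usr/hdp/current does not resolve to a real directory. A minimal POSIX-shell sketch of the repair above (the relink_conf helper is my own illustration, not part of the Ambari scripts; it assumes the package shipped a conf.dist directory):

```shell
# relink_conf DIR: if DIR/conf does not resolve to a real directory but
# DIR/conf.dist exists, back up whatever is at conf and symlink conf to
# conf.dist. Prints the action taken ("ok", "relinked", or "missing").
relink_conf() {
    dir=$1
    if [ -d "$dir/conf" ]; then
        echo "ok"                      # already resolves; nothing to do
    elif [ -d "$dir/conf.dist" ]; then
        # conf is absent or a dangling link: park it, then link to defaults
        if [ -e "$dir/conf" ] || [ -L "$dir/conf" ]; then
            mv "$dir/conf" "$dir/conf.bak"
        fi
        ln -s "$dir/conf.dist" "$dir/conf"
        echo "relinked"
    else
        echo "missing"                 # no conf.dist either; check the package
    fi
}
```

Run as e.g. `relink_conf /usr/hdp/current/ranger-usersync` before retrying the install from Ambari.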

Problem 2:

/usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh: Permission denied

Detailed error:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 241, in <module>
    RangerAdmin().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 971, in restart
    self.stop(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 61, in stop
    Execute(format('{params.ranger_stop}'), environment={'JAVA_HOME': params.java_home}, user=params.unix_user)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh stop' returned 126. -bash: /usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh: Permission denied
 stdout:
2021-08-03 14:10:17,602 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.0.0-78 -> 3.1.0.0-78
2021-08-03 14:10:17,623 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2021-08-03 14:10:17,833 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.0.0-78 -> 3.1.0.0-78
2021-08-03 14:10:17,839 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2021-08-03 14:10:17,841 - Group['livy'] {}
2021-08-03 14:10:17,842 - Group['spark'] {}
2021-08-03 14:10:17,842 - Group['ranger'] {}
2021-08-03 14:10:17,843 - Group['hdfs'] {}
2021-08-03 14:10:17,843 - Group['hadoop'] {}
2021-08-03 14:10:17,843 - Group['users'] {}
2021-08-03 14:10:17,843 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,844 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,845 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,846 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,847 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 14:10:17,848 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,849 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,849 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-08-03 14:10:17,850 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 14:10:17,851 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-08-03 14:10:17,852 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-08-03 14:10:17,853 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-08-03 14:10:17,854 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,854 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-08-03 14:10:17,855 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,856 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,857 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-08-03 14:10:17,858 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 14:10:17,859 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-08-03 14:10:17,867 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-08-03 14:10:17,868 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-08-03 14:10:17,869 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 14:10:17,871 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-08-03 14:10:17,872 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-08-03 14:10:17,885 - call returned (0, '1009')
2021-08-03 14:10:17,885 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-08-03 14:10:17,893 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2021-08-03 14:10:17,893 - Group['hdfs'] {}
2021-08-03 14:10:17,894 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-08-03 14:10:17,895 - FS Type: HDFS
2021-08-03 14:10:17,895 - Directory['/etc/hadoop'] {'mode': 0755}
2021-08-03 14:10:17,917 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-08-03 14:10:17,918 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-08-03 14:10:17,938 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2021-08-03 14:10:17,952 - Skipping Execute[('setenforce', '0')] due to not_if
2021-08-03 14:10:17,952 - Directory['/data01/apache/logs/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2021-08-03 14:10:17,955 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2021-08-03 14:10:17,955 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2021-08-03 14:10:17,956 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2021-08-03 14:10:17,960 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2021-08-03 14:10:17,962 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2021-08-03 14:10:17,969 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2021-08-03 14:10:17,981 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-08-03 14:10:17,981 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2021-08-03 14:10:17,982 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2021-08-03 14:10:17,986 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2021-08-03 14:10:17,993 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2021-08-03 14:10:17,999 - Skipping unlimited key JCE policy check and setup since it is not required
2021-08-03 14:10:18,573 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.0.0-78 -> 3.1.0.0-78
2021-08-03 14:10:18,625 - Execute['/usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh stop'] {'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'user': 'ranger'}

Command failed after 1 tries

Solution:

Exit code 126 means the shell found the script but was not allowed to execute it. Restore the execute bit:

# cd /usr/hdp/current/ranger-admin/ews/

# chmod a+x /usr/hdp/current/ranger-admin/ews/ranger-admin-services.sh
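If one script lost its execute bit, others under the same tree may have as well. A hedged sketch that sweeps every .sh file rather than the single file above (the fix_exec_bits helper is my own generalisation of the fix, not something Ambari provides):

```shell
# fix_exec_bits DIR: print each *.sh under DIR that lacks the user-execute
# bit, then add execute permission for everyone (chmod a+x), mirroring the
# one-file fix above.
fix_exec_bits() {
    find "$1" -name '*.sh' ! -perm -u+x -print -exec chmod a+x {} +
}
```

Usage: `fix_exec_bits /usr/hdp/current/ranger-admin/ews`, then retry the restart from Ambari.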

Problem 3:

Applying Directory['/usr/hdp/current/ranger-admin/conf'] failed, looped symbolic links found while resolving /usr/hdp/current/ranger-admin/conf

Detailed error:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 241, in <module>
    RangerAdmin().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 1006, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 97, in start
    self.configure(env, upgrade_type=upgrade_type, setup_db=params.stack_supports_ranger_setup_db_on_start)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_admin.py", line 134, in configure
    setup_ranger_xml.ranger('ranger_admin', upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/setup_ranger_xml.py", line 49, in ranger
    setup_ranger_admin(upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/setup_ranger_xml.py", line 69, in setup_ranger_admin
    create_parents = True
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 177, in action_create
    raise Fail("Applying %s failed, looped symbolic links found while resolving %s" % (self.resource, path))
resource_management.core.exceptions.Fail: Applying Directory['/usr/hdp/current/ranger-admin/conf'] failed, looped symbolic links found while resolving /usr/hdp/current/ranger-admin/conf

Solution:

# cd /usr/hdp/current/ranger-admin

Back up the symlink first, so it can be restored if anything else goes wrong:

# mv conf conf.bak

Recreate the conf symlink, pointing it at the versioned config directory under /etc/ranger-admin. Note: the version segment in the command below must match what actually exists on your host (check with `ls /etc/ranger-admin/`); on the HDP 3.1.0.0-78 stack shown in the logs above it will not be 2.6.3.0-235:

# ln -s /etc/ranger-admin/2.6.3.0-235/0 /usr/hdp/current/ranger-admin/conf
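Before replacing the link, the loop can be confirmed: GNU `readlink -f` fails with ELOOP when symlink resolution cycles back on itself. A small sketch (check_conf_link is a hypothetical helper of mine, assuming GNU coreutils); the same check applies to the ranger-tagsync conf in Problem 4:

```shell
# check_conf_link PATH: report whether PATH resolves to an existing target,
# dangles (resolves to a nonexistent path), or cannot be resolved at all
# (e.g. a symlink loop, where readlink -f fails with ELOOP).
check_conf_link() {
    if target=$(readlink -f "$1" 2>/dev/null) && [ -e "$target" ]; then
        echo "ok -> $target"
    elif readlink -f "$1" >/dev/null 2>&1; then
        echo "dangling"
    else
        echo "loop or unresolvable"
    fi
}
```

`check_conf_link /usr/hdp/current/ranger-admin/conf` reporting "loop or unresolvable" confirms the error above before you touch anything.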

Problem 4:

Applying Directory['/usr/hdp/current/ranger-tagsync/conf'] failed, looped symbolic links found while resolving /usr/hdp/current/ranger-tagsync/conf

Detailed error:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_tagsync.py", line 133, in <module>
    RangerTagsync().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 1006, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_tagsync.py", line 69, in start
    self.configure(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/ranger_tagsync.py", line 63, in configure
    setup_ranger_xml.ranger('ranger_tagsync', upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/setup_ranger_xml.py", line 55, in ranger
    setup_tagsync(upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER/package/scripts/setup_ranger_xml.py", line 529, in setup_tagsync
    create_parents = True
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 177, in action_create
    raise Fail("Applying %s failed, looped symbolic links found while resolving %s" % (self.resource, path))
resource_management.core.exceptions.Fail: Applying Directory['/usr/hdp/current/ranger-tagsync/conf'] failed, looped symbolic links found while resolving /usr/hdp/current/ranger-tagsync/conf

Solution:

Same symlink-loop fix as Problem 3 (again, adjust the version segment below to match what actually exists under /etc/ranger-tagsync/):

# cd /usr/hdp/current/ranger-tagsync

# mv conf conf.bak

# ln -s /etc/ranger-tagsync/2.6.3.0-235/0 /usr/hdp/current/ranger-tagsync/conf
