Automatically restarting the HBase RegionServer with a script

  • Environment: HDP 2.5 + HBase 1.2 on Linux, 5 data nodes

Scenario:

  • The platform is exposed to outside users, so there are stretches of heavy data writes and queries during which an HBase RegionServer can go down. To keep writes and queries unaffected, each RegionServer is checked at a fixed interval: if it is down it gets restarted; otherwise nothing is done.

The script:

[root@bigdata41 project]# more autoRestartHbaseRegionserver.sh
#!/bin/bash
# Start the RegionServer as the hbase user; hbase-daemon.sh is a no-op when
# the RegionServer is already running (it checks the pid file first).
su - hbase -c "/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver"

About the script: it takes the blunt approach of simply issuing a start. When the RegionServer is down it gets restarted; when one is already running, the start is a no-op. It must run as the hbase user, or Ambari cannot track the process and sync its state to the web UI.
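The check-then-start logic can be made explicit instead of relying on the daemon script's own guard. A minimal sketch, assuming the HDP 2.5 default pid-file and daemon paths (adjust both for your install):

```shell
#!/bin/bash
# Explicit liveness check before starting the RegionServer.
# PID_FILE and DAEMON are the HDP 2.5 defaults; adjust for your install.
PID_FILE="${PID_FILE:-/var/run/hbase/hbase-hbase-regionserver.pid}"
DAEMON="/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh"

is_running() {
  # running iff the pid file exists AND the recorded pid is a live process
  [ -f "$1" ] && ps -p "$(cat "$1")" >/dev/null 2>&1
}

if is_running "$PID_FILE"; then
  echo "RegionServer already running; nothing to do"
elif [ -x "$DAEMON" ]; then
  su - hbase -c "$DAEMON --config /usr/hdp/current/hbase-regionserver/conf start regionserver"
fi
```

This mirrors the same pid-file test Ambari itself uses, so the cron script and the Ambari UI stay consistent about whether the process is up.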

The cron job is set up as follows:

# Periodically check whether the RegionServer is down; if so, restart it so its being down does not break data writes and reads
0 */1 * * * cd /root/project/&& ./autoRestartHbaseRegionserver.sh
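For later troubleshooting it can help to keep a record of each run. A variant of the same entry that appends the script's output to a log (the log path here is illustrative):

```shell
# Check hourly; append any output to a log for later inspection
0 */1 * * * cd /root/project/ && ./autoRestartHbaseRegionserver.sh >> /var/log/autoRestartHbaseRegionserver.log 2>&1
```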

Approach:

  • 1. Ambari does provide a rolling-restart option for HBase, but it runs only a single pass; the interval between hosts is configurable, capped at a maximum of roughly 10 hours. See the screenshot below:

[screenshot: Ambari HBase rolling-restart dialog]

  • 2. Restarting the HBase RegionServer from the Ambari UI and reading the operation log shows that the start essentially executes the following statement:
2018-10-09 14:39:38,918 - Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] {'not_if': 'ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-regionserver.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-regionserver.pid` >/dev/null 2>&1', 'user': 'hbase'}
2018-10-09 14:39:38,981 - Skipping Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] due to not_if
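The `not_if` guard in that log entry can be expressed as a standalone shell check, decoding why Ambari skipped the start here (the pid path is the HDP default):

```shell
#!/bin/bash
# The not_if guard from the Ambari log: the start command is skipped iff the
# pid file exists and the pid it records is a live process.
alive_per_pidfile() {
  test -f "$1" && ps -p "$(cat "$1")" >/dev/null 2>&1
}

if alive_per_pidfile /var/run/hbase/hbase-hbase-regionserver.pid; then
  echo "RegionServer alive: Ambari would skip the start"
else
  echo "RegionServer down: Ambari would run the start command"
fi
```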


Based on that log, the script simply runs the same statement itself, and testing confirms this works. If the script needs refining, read the log and the corresponding source code in more detail.

  • 3. Be sure to switch to the hbase user before running the script. If it was run as another user, the following files must be deleted by hand:
    [root@bigdata07 ~]# cd /var/run/hbase/
    [root@bigdata07 hbase]# ll
    total 8
    -rw-r--r-- 1 hbase hadoop  6 Oct  9 16:10 hbase-hbase-regionserver.pid
    -rw-r--r-- 1 hbase hadoop 60 Oct  9 16:10 hbase-hbase-regionserver.znode
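That manual cleanup can be scripted too. A sketch that removes the stale pid and znode files only when the recorded process is actually dead (`/var/run/hbase` is the HDP default run directory; the function name is my own):

```shell
#!/bin/bash
# Clear stale RegionServer runtime files left behind when the daemon was
# started as the wrong user. Only deletes when the recorded pid is dead.
clean_stale_runfiles() {
  local pid_file="$1/hbase-hbase-regionserver.pid"
  if [ -f "$pid_file" ] && ! ps -p "$(cat "$pid_file")" >/dev/null 2>&1; then
    rm -f "$pid_file" "$1/hbase-hbase-regionserver.znode"
  fi
}

clean_stale_runfiles /var/run/hbase
```

Guarding on the pid being dead avoids deleting the files out from under a RegionServer that is still running.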
  • For reference, the full log from the Ambari-triggered restart is as follows:
stdout:   /var/lib/ambari-agent/data/output-9452.txt
2018-10-09 14:39:35,977 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:35,977 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:35,978 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,016 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,017 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,060 - checked_call returned (0, '')
2018-10-09 14:39:36,061 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,061 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,174 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:36,174 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:36,175 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,215 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,216 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,254 - checked_call returned (0, '')
2018-10-09 14:39:36,254 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,254 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,256 - Group['livy'] {}
2018-10-09 14:39:36,257 - Group['spark'] {}
2018-10-09 14:39:36,257 - Group['ranger'] {}
2018-10-09 14:39:36,258 - Group['hadoop'] {}
2018-10-09 14:39:36,258 - Group['users'] {}
2018-10-09 14:39:36,258 - Group['knox'] {}
2018-10-09 14:39:36,258 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,259 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,260 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,260 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2018-10-09 14:39:36,261 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2018-10-09 14:39:36,261 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,262 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,263 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2018-10-09 14:39:36,263 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,264 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,264 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,265 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,266 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,266 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,267 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,267 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,268 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,269 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-09 14:39:36,270 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-10-09 14:39:36,292 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2018-10-09 14:39:36,292 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:36,293 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-09 14:39:36,294 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-10-09 14:39:36,317 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2018-10-09 14:39:36,317 - Group['hdfs'] {}
2018-10-09 14:39:36,318 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2018-10-09 14:39:36,318 - FS Type: 
2018-10-09 14:39:36,319 - Directory['/etc/hadoop'] {'mode': 0755}
2018-10-09 14:39:36,334 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2018-10-09 14:39:36,335 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-10-09 14:39:36,348 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-10-09 14:39:36,372 - Skipping Execute[('setenforce', '0')] due to not_if
2018-10-09 14:39:36,372 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:36,375 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-10-09 14:39:36,375 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-10-09 14:39:36,382 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2018-10-09 14:39:36,383 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2018-10-09 14:39:36,384 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:36,394 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2018-10-09 14:39:36,395 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-10-09 14:39:36,396 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-10-09 14:39:36,400 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2018-10-09 14:39:36,421 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-10-09 14:39:36,609 - Stack Feature Version Info: stack_version=2.5, version=2.5.3.0-37, current_cluster_version=2.5.3.0-37 -> 2.5.3.0-37
2018-10-09 14:39:36,610 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:36,610 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:36,610 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,649 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,649 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,688 - checked_call returned (0, '')
2018-10-09 14:39:36,688 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,689 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,691 - checked_call['hostid'] {}
2018-10-09 14:39:36,711 - checked_call returned (0, 'c20a0543')
2018-10-09 14:39:36,715 - Directory['/etc/hbase'] {'mode': 0755}
2018-10-09 14:39:36,719 - Directory['/usr/hdp/current/hbase-regionserver/conf'] {'owner': 'hbase', 'group': 'hadoop', 'create_parents': True}
2018-10-09 14:39:36,720 - Directory['/tmp'] {'create_parents': True, 'mode': 0777}
2018-10-09 14:39:36,720 - Changing permission for /tmp from 1777 to 777
2018-10-09 14:39:36,720 - Directory['/tmp'] {'create_parents': True, 'cd_access': 'a'}
2018-10-09 14:39:36,721 - Execute[('chmod', '1777', '/tmp')] {'sudo': True}
2018-10-09 14:39:36,744 - XmlConfig['hbase-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {}, 'configurations': ...}
2018-10-09 14:39:36,756 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hbase-site.xml
2018-10-09 14:39:36,756 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,797 - XmlConfig['core-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {'final': {'fs.defaultFS': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,805 - Generating config: /usr/hdp/current/hbase-regionserver/conf/core-site.xml
2018-10-09 14:39:36,805 - File['/usr/hdp/current/hbase-regionserver/conf/core-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,831 - XmlConfig['hdfs-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,838 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hdfs-site.xml
2018-10-09 14:39:36,839 - File['/usr/hdp/current/hbase-regionserver/conf/hdfs-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,889 - XmlConfig['hdfs-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,896 - Generating config: /usr/hdp/current/hadoop-client/conf/hdfs-site.xml
2018-10-09 14:39:36,897 - File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,946 - XmlConfig['hbase-policy.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {}, 'configurations': {'security.masterregion.protocol.acl': '*', 'security.admin.protocol.acl': '*', 'security.client.protocol.acl': '*'}}
2018-10-09 14:39:36,954 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hbase-policy.xml
2018-10-09 14:39:36,954 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-policy.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,964 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-env.sh'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop'}
2018-10-09 14:39:36,965 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-10-09 14:39:36,968 - File['/etc/security/limits.d/hbase.conf'] {'content': Template('hbase.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-10-09 14:39:36,968 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/hadoop-metrics2-hbase.properties'] {'owner': 'hbase', 'template_tag': 'GANGLIA-RS'}
2018-10-09 14:39:36,975 - File['/usr/hdp/current/hbase-regionserver/conf/hadoop-metrics2-hbase.properties'] {'content': Template('hadoop-metrics2-hbase.properties-GANGLIA-RS.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,975 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/regionservers'] {'owner': 'hbase', 'template_tag': None}
2018-10-09 14:39:36,977 - File['/usr/hdp/current/hbase-regionserver/conf/regionservers'] {'content': Template('regionservers.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,978 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/hbase_regionserver_jaas.conf'] {'owner': 'hbase', 'template_tag': None}
2018-10-09 14:39:36,979 - File['/usr/hdp/current/hbase-regionserver/conf/hbase_regionserver_jaas.conf'] {'content': Template('hbase_regionserver_jaas.conf.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,980 - Directory['/var/run/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-09 14:39:36,980 - Directory['/var/log/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-09 14:39:36,981 - File['/usr/hdp/current/hbase-regionserver/conf/log4j.properties'] {'content': ..., 'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:36,981 - HBase: Setup ranger: command retry not enabled thus skipping if ranger admin is down !
2018-10-09 14:39:36,982 - call['ambari-python-wrap /usr/bin/hdp-select status hbase-client'] {'timeout': 20}
2018-10-09 14:39:37,019 - call returned (0, 'hbase-client - 2.5.3.0-37')
2018-10-09 14:39:37,020 - RangeradminV2: Skip ranger admin if it's down !
2018-10-09 14:39:37,104 - checked_call['/usr/bin/kinit -c /var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf -kt /etc/security/keytabs/hbase.service.keytab hbase/hdp05.gzbigdata.org.cn@BIGDATA.ORG.CN > /dev/null'] {'user': 'hbase'}
2018-10-09 14:39:37,155 - checked_call returned (0, '')
2018-10-09 14:39:37,156 - call['ambari-sudo.sh su hbase -l -s /bin/bash -c 'curl --location-trusted -k --negotiate -u : -b /var/lib/ambari-agent/tmp/cookies/cbd26bc2-f067-4bda-ab73-242aa6458d8b -c /var/lib/ambari-agent/tmp/cookies/cbd26bc2-f067-4bda-ab73-242aa6458d8b -w '"'"'%{http_code}'"'"' http://hdp06.gzbigdata.org.cn:6080/login.jsp --connect-timeout 10 --max-time 12 -o /dev/null 1>/tmp/tmptoVZk8 2>/tmp/tmpEkgdWB''] {'quiet': False, 'env': {'KRB5CCNAME': '/var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'}}
2018-10-09 14:39:37,222 - call returned (0, '')
2018-10-09 14:39:37,223 - call['/usr/bin/klist -s /var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'] {'user': 'hbase'}
2018-10-09 14:39:37,275 - call returned (0, '')
2018-10-09 14:39:37,276 - call['ambari-sudo.sh su hbase -l -s /bin/bash -c 'curl --location-trusted -k --negotiate -u : -b /var/lib/ambari-agent/tmp/cookies/45c51089-fe63-400b-95c5-11daa12e7484 -c /var/lib/ambari-agent/tmp/cookies/45c51089-fe63-400b-95c5-11daa12e7484 '"'"'http://hdp06.gzbigdata.org.cn:6080/service/public/v2/api/service?serviceName=GZ_BIG_DATA_PLAT_hbase&serviceType=hbase&isEnabled=true'"'"' --connect-timeout 10 --max-time 12 -X GET 1>/tmp/tmpdsDwLv 2>/tmp/tmphXpq2M''] {'quiet': False, 'env': {'KRB5CCNAME': '/var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'}}
2018-10-09 14:39:37,406 - call returned (0, '')
2018-10-09 14:39:37,406 - Hbase Repository GZ_BIG_DATA_PLAT_hbase exist
2018-10-09 14:39:37,408 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-security.xml'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,409 - Writing File['/usr/hdp/current/hbase-regionserver/conf/ranger-security.xml'] because contents don't match
2018-10-09 14:39:37,409 - Directory['/etc/ranger/GZ_BIG_DATA_PLAT_hbase'] {'owner': 'hbase', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:37,410 - Directory['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache'] {'owner': 'hbase', 'group': 'hadoop', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:37,410 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache/hbaseMaster_GZ_BIG_DATA_PLAT_hbase.json'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,411 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache/hbaseRegional_GZ_BIG_DATA_PLAT_hbase.json'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,412 - XmlConfig['ranger-hbase-audit.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,421 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-hbase-audit.xml
2018-10-09 14:39:37,421 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-hbase-audit.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,435 - XmlConfig['ranger-hbase-security.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,443 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-hbase-security.xml
2018-10-09 14:39:37,443 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-hbase-security.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,449 - XmlConfig['ranger-policymgr-ssl.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,458 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-policymgr-ssl.xml
2018-10-09 14:39:37,458 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-policymgr-ssl.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,464 - Execute[('/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/install/lib/*', '-f', '/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks', '-k', 'sslKeyStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/local/java/jdk1.8.0_91'}, 'sudo': True}
Using Java:/usr/local/java/jdk1.8.0_91/bin/java
Alias sslKeyStore created successfully!
2018-10-09 14:39:38,193 - Execute[('/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/install/lib/*', '-f', '/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks', '-k', 'sslTrustStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/local/java/jdk1.8.0_91'}, 'sudo': True}
Using Java:/usr/local/java/jdk1.8.0_91/bin/java
Alias sslTrustStore created successfully!
2018-10-09 14:39:38,917 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0640}
2018-10-09 14:39:38,918 - Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] {'not_if': 'ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-regionserver.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-regionserver.pid` >/dev/null 2>&1', 'user': 'hbase'}
2018-10-09 14:39:38,981 - Skipping Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] due to not_if

Command completed successfully!
