Hive Log Operations

To see where Hive writes its logs, look at /home/hadoop/hive/conf/hive-log4j2.properties:

# list of properties
property.hive.log.level = INFO
property.hive.root.logger = DRFA
property.hive.log.dir = ${sys:java.io.tmpdir}/${sys:user.name}
property.hive.log.file = hive.log
property.hive.perflogger.log.level = INFO

The `property.hive.log.dir` line above sets the log directory, but running `echo ${sys:java.io.tmpdir}` in the shell prints nothing: `${sys:...}` refers to Java system properties resolved inside the JVM, not to shell variables. The log actually lands in /tmp/<user.name>/hive.log, which in this setup is /tmp/hadoop/hive.log:
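This is Log4j2 property substitution at work: the rolling-file appender composes its path from those two properties. A fragment from the stock hive-log4j2.properties, shown as a sketch (your copy may differ slightly by Hive version):

```properties
# Log4j2 resolves ${sys:java.io.tmpdir} -> /tmp and ${sys:user.name} -> hadoop
# inside the JVM, so the default file becomes /tmp/hadoop/hive.log.
appender.DRFA.type = RollingRandomAccessFile
appender.DRFA.name = DRFA
appender.DRFA.fileName = ${sys:hive.log.dir}/${sys:hive.log.file}
```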

[hadoop@master hadoop]$ pwd
/tmp/hadoop
[hadoop@master hadoop]$ ls -rlt
total 320
-rw-rw-r--. 1 hadoop hadoop  18669 Jan 11 11:12 hive.log.2019-01-11
-rw-rw-r--. 1 hadoop hadoop 269939 Mar 31 23:55 hive.log.2019-03-31
-rw-rw-r--. 1 hadoop hadoop   8003 Apr  1 22:32 hive.log.2019-04-01
-rw-rw-r--. 1 hadoop hadoop   8946 Apr  2 22:33 stderr
-rw-rw-r--. 1 hadoop hadoop  10850 Apr  2 22:35 hive.log

Create a logs directory under the Hive installation directory to hold the log files:

[hadoop@master hive]$ pwd
/home/hadoop/hive
[hadoop@master hive]$ mkdir logs

Edit the configuration file hive-log4j2.properties:

property.hive.log.dir = /home/hadoop/hive/logs
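The edit can also be scripted. A self-contained sketch that rewrites the `property.hive.log.dir` line with `sed` on a temporary sample copy; on the real install, back up and edit /home/hadoop/hive/conf/hive-log4j2.properties the same way:

```shell
# Write a sample copy of the default line, then point it at the new logs dir.
f=$(mktemp)
printf 'property.hive.log.dir = ${sys:java.io.tmpdir}/${sys:user.name}\n' > "$f"
sed -i 's|^property.hive.log.dir *=.*|property.hive.log.dir = /home/hadoop/hive/logs|' "$f"
cat "$f"   # property.hive.log.dir = /home/hadoop/hive/logs
```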

Exit Hive if it is running, then start it again:

[hadoop@master sbin]$ hive

Logging initialized using configuration in file:/home/hadoop/hive/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> show databases;
OK
db_hive
default
Time taken: 3.857 seconds, Fetched: 2 row(s)
hive> use db_hive;
OK
Time taken: 0.024 seconds
hive> exit ;

After restarting Hive, check the new log directory:

[hadoop@master logs]$ pwd
/home/hadoop/hive/logs
[hadoop@master logs]$ ls -rlt
total 8
-rw-rw-r--. 1 hadoop hadoop 7946 Apr  2 22:50 hive.log

The Hive log is now written to the new path.

You can also inspect Hive's variables from within the CLI:

hive> set;

...
...
system:sun.java.launcher=SUN_STANDARD
system:sun.jnu.encoding=UTF-8
system:sun.management.compiler=HotSpot 64-Bit Tiered Compilers
system:sun.os.patch.level=unknown
system:user.country=US
system:user.dir=/home/hadoop/hadoop-2.7.3/sbin
system:user.home=/home/hadoop
system:user.language=en
system:user.name=hadoop
system:user.timezone=America/Los_Angeles
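A bare `set` dumps every variable across the hiveconf, system, and env namespaces. You can also read a single variable, or override a property for the current session only. A small sketch (`hive.cli.print.current.db` is a standard Hive property):

```sql
set system:user.name;                -- read one entry from the dump above
set hive.cli.print.current.db;       -- show a hiveconf property's current value
set hive.cli.print.current.db=true;  -- override it for this session only
```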

Properties can also be set at startup:

[hadoop@master sbin]$ hive -help
usage: hive
 -d,--define <key=value>          Variable substitution to apply to Hive
                                  commands. e.g. -d A=B or --define A=B
    --database <databasename>     Specify the database to use
 -e <quoted-query-string>         SQL from command line
 -f <filename>                    SQL from files
 -H,--help                        Print help information
    --hiveconf <property=value>   Use value for given property
    --hivevar <key=value>         Variable substitution to apply to Hive
                                  commands. e.g. --hivevar A=B
 -i <filename>                    Initialization SQL file
 -S,--silent                      Silent mode in interactive shell
 -v,--verbose                     Verbose mode (echo executed SQL to the
                                  console)
[hadoop@master sbin]$ hive --hiveconf hive.root.logger=INFO,console

Logging initialized using configuration in file:/home/hadoop/hive/conf/hive-log4j2.properties Async: true
2019-04-02T22:56:32,735  INFO [main] SessionState: 
Logging initialized using configuration in file:/home/hadoop/hive/conf/hive-log4j2.properties Async: true
2019-04-02T22:56:32,978  WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> use db_hive;
2019-04-02T22:56:52,176  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Compiling command(queryId=hadoop_20190402225652_7939edc0-f75c-4660-8bc3-a2b793b0251f): use db_hive
2019-04-02T22:56:52,563  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-04-02T22:56:52,598  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.ObjectStore: ObjectStore, initialize called
2019-04-02T22:56:53,497  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-04-02T22:56:54,917  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
2019-04-02T22:56:54,918  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.ObjectStore: Initialized ObjectStore
2019-04-02T22:56:55,022  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: Added admin role in metastore
2019-04-02T22:56:55,025  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: Added public role in metastore
2019-04-02T22:56:55,056  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
2019-04-02T22:56:55,165  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_all_functions
2019-04-02T22:56:55,166  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_all_functions
2019-04-02T22:56:55,177  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_database: db_hive
2019-04-02T22:56:55,178  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: db_hive
2019-04-02T22:56:55,192  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Semantic Analysis Completed
2019-04-02T22:56:55,194  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2019-04-02T22:56:55,197  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Completed compiling command(queryId=hadoop_20190402225652_7939edc0-f75c-4660-8bc3-a2b793b0251f); Time taken: 3.036 seconds
2019-04-02T22:56:55,198  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
2019-04-02T22:56:55,198  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Executing command(queryId=hadoop_20190402225652_7939edc0-f75c-4660-8bc3-a2b793b0251f): use db_hive
2019-04-02T22:56:55,210  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=ad332b5e-6ec9-4307-bef1-253a5ab59e59, clientType=HIVECLI]
2019-04-02T22:56:55,219  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2019-04-02T22:56:55,220  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
2019-04-02T22:56:55,221  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2019-04-02T22:56:55,221  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
2019-04-02T22:56:55,221  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2019-04-02T22:56:55,231  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
2019-04-02T22:56:55,232  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_database: db_hive
2019-04-02T22:56:55,232  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: db_hive
2019-04-02T22:56:55,344  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-04-02T22:56:55,347  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.ObjectStore: ObjectStore, initialize called
2019-04-02T22:56:55,361  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
2019-04-02T22:56:55,361  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.ObjectStore: Initialized ObjectStore
2019-04-02T22:56:55,367  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_database: db_hive
2019-04-02T22:56:55,367  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: db_hive
2019-04-02T22:56:55,373  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Completed executing command(queryId=hadoop_20190402225652_7939edc0-f75c-4660-8bc3-a2b793b0251f); Time taken: 0.175 seconds
OK
2019-04-02T22:56:55,376  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: OK
Time taken: 3.222 seconds
hive> show tables;
2019-04-02T22:57:06,557  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Compiling command(queryId=hadoop_20190402225706_0f585a6e-ca17-4926-8d99-3260e7caabf5): show tables
2019-04-02T22:57:06,565  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_database: db_hive
2019-04-02T22:57:06,565  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: db_hive
2019-04-02T22:57:06,571  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Semantic Analysis Completed
2019-04-02T22:57:06,592  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
2019-04-02T22:57:06,629  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] exec.ListSinkOperator: Initializing operator LIST_SINK[0]
2019-04-02T22:57:06,633  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Completed compiling command(queryId=hadoop_20190402225706_0f585a6e-ca17-4926-8d99-3260e7caabf5); Time taken: 0.076 seconds
2019-04-02T22:57:06,633  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
2019-04-02T22:57:06,633  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Executing command(queryId=hadoop_20190402225706_0f585a6e-ca17-4926-8d99-3260e7caabf5): show tables
2019-04-02T22:57:06,634  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
2019-04-02T22:57:06,636  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_database: db_hive
2019-04-02T22:57:06,637  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: db_hive
2019-04-02T22:57:06,639  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] metastore.HiveMetaStore: 0: get_tables: db=db_hive pat=.*
2019-04-02T22:57:06,639  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_tables: db=db_hive pat=.*
OK
2019-04-02T22:57:06,661  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: Completed executing command(queryId=hadoop_20190402225706_0f585a6e-ca17-4926-8d99-3260e7caabf5); Time taken: 0.028 seconds
2019-04-02T22:57:06,661  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] ql.Driver: OK
2019-04-02T22:57:06,684  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] mapred.FileInputFormat: Total input paths to process : 1
2019-04-02T22:57:06,701  INFO [ad332b5e-6ec9-4307-bef1-253a5ab59e59 main] exec.ListSinkOperator: Closing operator LIST_SINK[0]
u2
u4
Time taken: 0.105 seconds, Fetched: 2 row(s)

With hive.root.logger=INFO,console, the console output is considerably more detailed.
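These startup flags compose. For example, `--hivevar` substitution plus `-e` runs a parameterized one-off query; a sketch using this tutorial's db_hive database:

```shell
# -S silences progress chatter; Hive replaces ${db} with the hivevar value
# (the single quotes keep the shell from expanding it first).
hive -S --hivevar db=db_hive -e 'USE ${db}; SHOW TABLES;'
```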

Reposted from: https://www.cnblogs.com/hello-wei/p/10645740.html
