Hive Metastore Encoding Problem

The problem is as follows:

9979 2018-11-28 05:09:48,064 ERROR [main]: exec.Task (SessionState.java:printError(569)) - Failed with exception MetaException(message:javax.jdo.JDODataStoreException: Error(s) were found while auto-creating/validating the datastore for classes. The errors are printed in the log, and are attached to this exception.
 9980         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
 9981         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
 9982         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
 9983         at org.apache.hadoop.hive.metastore.ObjectStore.addPartition(ObjectStore.java:1327)
 9984         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 9985         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 9986         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 9987         at java.lang.reflect.Method.invoke(Method.java:498)
 9988         at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
 9989         at com.sun.proxy.$Proxy6.addPartition(Unknown Source)
 9990         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_common(HiveMetaStore.java:1767)
 9991         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_with_environment_context(HiveMetaStore.java:1822)
 9992         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 9993         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 9994         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 9995         at java.lang.reflect.Method.invoke(Method.java:498)
 9996         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:106)
 9997         at com.sun.proxy.$Proxy8.append_partition_with_environment_context(Unknown Source)
 9998         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:500)
 9999         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:494)
10000         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
10001         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
10002         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
10003         at java.lang.reflect.Method.invoke(Method.java:498)
10004         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
10005         at com.sun.proxy.$Proxy9.appendPartition(Unknown Source)
10006         at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1660)
10007         at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1287)
10008         at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:412)
10009         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:155)
10010         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
10011         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1554)
10012         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1321)
10013         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1139)
10014         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:962)
10015         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:952)

10027 NestedThrowablesStackTrace:
10028 com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
10029         at sun.reflect.GeneratedConstructorAccessor37.newInstance(Unknown Source)
10030         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
10031         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
10032         at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
10033         at com.mysql.jdbc.Util.getInstance(Util.java:387)
10034         at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:941)
10035         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3870)
10036         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3806)
10037         at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2470)
10038         at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2617)
10039         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2546)
10040         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2504)
10041         at com.mysql.jdbc.StatementImpl.executeInternal(StatementImpl.java:840)
10042         at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:740)
10043         at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
10044         at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
10045         at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:648)
10046         at org.datanucleus.store.rdbms.table.TableImpl.validateIndices(TableImpl.java:593)
10047         at org.datanucleus.store.rdbms.table.TableImpl.validateConstraints(TableImpl.java:390)
10048         at org.datanucleus.store.rdbms.table.ClassTable.validateConstraints(ClassTable.java:3463)
10049         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3464)
10050         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
10051         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
10052         at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
10053         at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
10054         at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
10055         at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
10056         at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
10057         at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
10058         at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
10059         at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
10060         at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)

The error occurred while creating a table and loading data into it. The key line is the nested MySQL error, "Specified key was too long; max key length is 767 bytes", so check the character set of Hive's metastore database in MySQL:

mysql> show create database metastore;
+-----------+--------------------------------------------------------------------+
| Database  | Create Database                                                    |
+-----------+--------------------------------------------------------------------+
| metastore | CREATE DATABASE `metastore` /*!40100 DEFAULT CHARACTER SET utf8 */ |
+-----------+--------------------------------------------------------------------+
1 row in set (0.00 sec)
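The utf8 default is the root cause: InnoDB limits an index key to 767 bytes, and utf8 reserves up to 3 bytes per character, so indexes on wide VARCHAR columns in the metastore schema no longer fit, while latin1 (1 byte per character) does. A minimal reproduction with a hypothetical table (charset_demo is my own example, not part of the metastore schema; the exact behaviour also depends on the MySQL version and innodb_large_prefix):

-- 767 characters * 3 bytes under utf8 = 2301 bytes, over the 767-byte key limit
mysql> create table charset_demo (part_name varchar(767)) default character set utf8;
mysql> create index idx_demo on charset_demo (part_name);
-- fails with: ERROR 1071 (42000): Specified key was too long; max key length is 767 bytes

-- under latin1 the same index is exactly 767 bytes and is accepted
mysql> create table charset_demo_latin1 (part_name varchar(767)) default character set latin1;
mysql> create index idx_demo_latin1 on charset_demo_latin1 (part_name);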

Change the database encoding:

mysql> alter database metastore character set latin1;

However, ALTER DATABASE only changes the default character set for tables created afterwards; the existing metastore tables were still utf8. So my fix was a bit crude: I dropped Hive's metastore database and created a new one with the same name (this time defaulting to latin1), because I really did not want to change the Hive configuration file.
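A less drastic alternative is to keep the database and convert the existing tables instead. This is only a sketch: the table name PARTITIONS comes from the standard metastore schema and not from the log above, so verify it against your own installation, and make sure the metadata contains only ASCII before converting.

mysql> alter database metastore default character set latin1;

-- find metastore tables whose collation is still utf8-based
mysql> select table_name, table_collation from information_schema.tables
    -> where table_schema = 'metastore' and table_collation like 'utf8%';

-- convert each such table, for example the partition table hit in the stack trace
mysql> alter table metastore.PARTITIONS convert to character set latin1;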

Now create the table again and load the data.

create database if not exists db_web_data;

create table if not exists db_web_data.track_log(
    id              string,
    url             string,
    referer         string,
    keyword         string,
    type            string,
    guid            string,
    pageId          string,
    moduleId        string,
    linkId          string,
    attachedInfo    string,
    sessionId       string,
    trackerU        string,
    trackerType     string,
    ip              string,
    trackerSrc      string,
    cookie          string,
    orderCode       string,
    trackTime       string,
    endUserId       string,
    firstLink       string,
    sessionViewNo   string,
    productId       string,
    curMerchantId   string,
    provinceId      string,
    cityId          string,
    fee             string,
    edmActivity     string,
    edmEmail        string,
    edmJobId        string,
    ieVersion       string,
    platform        string,
    internalKeyword string,
    resultSum       string,
    currentPage     string,
    linkPosition    string,
    buttonPosition  string
)
partitioned by (date string, hour string)
row format delimited fields terminated by '\t';
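Before loading, an optional sanity check (my own addition, not part of the original steps) is to add the target partition by hand, since the original failure happened exactly while the metastore was persisting partition metadata; if this completes without the MetaException, the charset fix has taken effect:

hive (default)> alter table db_web_data.track_log add if not exists partition (date='20150828', hour='18');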

The load succeeds:

hive (default)> load data local inpath '/mysoft/resources/2015082818' into table db_web_data.track_log partition(date='20150828', hour='18');
Loading data to table db_web_data.track_log partition (date=20150828, hour=18)
Partition db_web_data.track_log{date=20150828, hour=18} stats: [numFiles=5, numRows=0, totalSize=197127590, rawDataSize=0]
OK
Time taken: 8.09 seconds
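As a final check (again my own addition), confirm that the partition is registered and the data is queryable:

hive (default)> show partitions db_web_data.track_log;
hive (default)> select id, url, trackTime from db_web_data.track_log where date='20150828' and hour='18' limit 5;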
