I am trying to import a MySQL database into Hive in order to analyze large amounts of MySQL data. The options for this are:
Non realtime: Sqoop
Realtime: Hadoop Applier for MySQL
So I decided to go with the 'Non realtime' approach. I have set up a 4-node Hadoop cluster with Sqoop and Hive, which are working fine, using the following versions:
Name           Version
Apache Hadoop  2.6.0
Apache Hive    hive-0.14.0
Apache Sqoop   sqoop-1.4.5.bin__hadoop-2.0.4-alpha
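(For reference, these versions can be confirmed from the shell on the cluster node, e.g.:
hadoop version
hive --version
sqoop version
)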
Now, when I try to import the data using the following command,
Import Command
sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt.db --hive-overwrite
I get the following error:
Error
INFO hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist: edgeowt.db
15/04/16 13:32:09 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 88
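For what it's worth, a minimal single-table version of the same import that I may use to narrow the problem down would look like this (a sketch; candidate is one of my tables, as seen in the warehouse listing further below):

# Hypothetical single-table import with the same connection settings,
# to check whether the failure is specific to import-all-tables
sqoop import --verbose \
  --connect jdbc:mysql://X.X.X.X/edgeowt \
  --username root -P \
  --table candidate \
  --hive-import --hive-overwrite \
  --hive-table edgeowt.candidate \
  --warehouse-dir /user/hive/warehouse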
I logged in via HiveServer2 and checked the databases, and I can see the given database:
$HIVE_HOME/bin>beeline
beeline> !connect jdbc:hive2://localhost:10000 scott tiger org.apache.hive.jdbc.HiveDriver
0: jdbc:hive2://localhost:10000> show databases;
+----------------+--+
| database_name |
+----------------+--+
| default |
| edgeowt |
+----------------+--+
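To double-check where Hive thinks this database lives, its recorded HDFS location and owner can also be inspected from the shell (a sketch using the same beeline connection as above):

# Print the location and owner Hive has stored for the database
beeline -u jdbc:hive2://localhost:10000 -n scott -p tiger \
  -e "DESCRIBE DATABASE edgeowt;"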
After looking into the HDFS file system web interface, I realized that the owner of the DB directory is different:
Permission  Owner   Group       Size  Replication  Block Size  Name
drwxr-xr-x  hduser  supergroup  0 B   0            0 B         candidate
drwxr-xr-x  scott   supergroup  0 B   0            0 B         edgeowt.db
drwxr-xr-x  scott   supergroup  0 B   0            0 B         hd_temperature
I am trying to import the data as hduser, while the database was created by the scott user.
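The same ownership can be confirmed from the command line (a quick check, assuming the default warehouse path):

# List the warehouse directories together with their owners
hdfs dfs -ls /user/hive/warehouse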
I tried to grant ALL privileges to hduser on the edgeowt database using the following command:
0: jdbc:hive2://localhost:10000>GRANT ALL ON DATABASE edgeowt TO USER hduser;
and checked it with:
0: jdbc:hive2://localhost:10000> SHOW GRANT ON DATABASE edgeowt;
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| database | table | partition | column | principal_name | principal_type | privilege | grant_option | grant_time | grantor |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| edgeowt | | | | admin | ROLE | ALL | false | 1429170366000 | scott |
| edgeowt | | | | hduser | USER | ALL | false | 1429170906000 | scott |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
but I am still unable to resolve the error. So how can I solve this problem? Any pointers would be helpful.
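One HDFS-level workaround I am considering, since the GRANT did not help, is aligning the directory ownership itself (an untested sketch, assuming the warehouse path shown above; I would rather understand the root cause first):

# Hand the database directory over to hduser at the HDFS level
# (must be run as a user with HDFS superuser rights)
hdfs dfs -chown -R hduser:supergroup /user/hive/warehouse/edgeowt.db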
~/.bashrc
# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
# Set Hadoop-related environment variables
export HADOOP_INSTALL=/opt/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_HOME=$HADOOP_INSTALL
# Set hive home
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin
# Set HCatalog home
export HCAT_HOME=$HIVE_HOME/hcatalog
export PATH=$PATH:$HCAT_HOME/bin
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
# Set Sqoop home
export SQOOP_HOME=/opt/sqoop
export SQOOP_CONF_DIR="$SQOOP_HOME/conf"
export SQOOP_CLASSPATH="$SQOOP_CONF_DIR"
export PATH=$PATH:$SQOOP_HOME/bin
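After editing this file, I reload the environment and sanity-check the variables like this:

# Reload the environment and verify the key paths are set
source ~/.bashrc
echo $HADOOP_HOME $HIVE_HOME $SQOOP_HOME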
EDIT
I tried with the following command (--hive-database without the .db suffix) and still get the same error:
sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt --hive-overwrite
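One thing I still plan to check is whether the Hive CLI that Sqoop spawns actually picks up my metastore configuration; if it does not, it would use a fresh local metastore in which edgeowt indeed does not exist. A sketch of making the config visible to Sqoop, given that my ~/.bashrc above puts $SQOOP_CONF_DIR on SQOOP_CLASSPATH (assuming hive-site.xml lives in $HIVE_HOME/conf):

# Make Hive's metastore configuration visible to Sqoop's embedded Hive CLI
cp $HIVE_HOME/conf/hive-site.xml $SQOOP_HOME/conf/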