65. Connecting to Hive and Impala with Python

65.1 Demo Environment

  • CDH cluster, Anaconda, the pip tool, and the required Python packages are all working normally
  • CM and CDH version: 5.11.2
  • RedHat version: 7.2
  • Python version: 2.6+ or 3.3+
  • Non-secure (unkerberized) cluster
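Since Impyla only supports Python 2.6+ and 3.3+, it can be worth failing fast before installing anything. A minimal sketch (the helper name is ours, not from any library):

```python
import sys

def python_version_supported(version_info=sys.version_info):
    # Impyla requires Python 2.6+ or 3.3+; reject anything older.
    major, minor = version_info[0], version_info[1]
    if major == 2:
        return minor >= 6
    if major == 3:
        return minor >= 3
    return False

print(python_version_supported())
```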

65.2 Walkthrough

  • Install the Python packages Impyla depends on
[root@ip-186-31-22-86 ~]# pip install bit_array
[root@ip-186-31-22-86 ~]# pip install thrift==0.9.3
[root@ip-186-31-22-86 ~]# pip install six
[root@ip-186-31-22-86 ~]# pip install thrift_sasl
[root@ip-186-31-22-86 ~]# pip install sasl
  • thrift must be version 0.9.3; pip installs 0.10.0 by default, so uninstall it first (pip uninstall thrift) and then install 0.9.3
  • Python packages Impyla depends on:
    • six
    • bit_array
    • thrift (on Python 2.x) or thriftpy (on Python 3.x)
    • thrift_sasl
    • sasl
  • Install the Impyla package
    • The default impyla version is 0.14.0; uninstall it and install 0.13.8 instead
[root@ip-186-31-22-86 ec2-user]# pip install impyla==0.13.8
Collecting impyla
  Downloading impyla-0.14.0.tar.gz (151kB)
    100% |████████████████████████████████| 153kB 1.0MB/s 
Requirement already satisfied: six in /opt/cloudera/parcels/Anaconda-4.2.0/lib/python2.7/site-packages (from impyla)
Requirement already satisfied: bitarray in /opt/cloudera/parcels/Anaconda-4.2.0/lib/python2.7/site-packages (from impyla)
Requirement already satisfied: thrift in /opt/cloudera/parcels/Anaconda-4.2.0/lib/python2.7/site-packages (from impyla)
Building wheels for collected packages: impyla
  Running setup.py bdist_wheel for impyla ... done
  Stored in directory: /root/.cache/pip/wheels/96/fa/d8/40e676f3cead7ec45f20ac43eb373edc471348ac5cb485d6f5
Successfully built impyla
Installing collected packages: impyla
Successfully installed impyla-0.14.0
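Because two of the packages above must be pinned (thrift to 0.9.3, impyla to 0.13.8), a small check that a pin actually took effect can save a confusing debugging session later. A sketch with hypothetical helper names, using plain version-string comparison:

```python
def parse_version(v):
    # Turn "X.Y.Z" into a comparable tuple of ints, e.g. "0.9.3" -> (0, 9, 3).
    return tuple(int(part) for part in v.split('.'))

def is_pinned(installed, required):
    # True only if the installed version matches the required pin exactly.
    return parse_version(installed) == parse_version(required)

print(is_pinned('0.9.3', '0.9.3'))    # True  (thrift pin satisfied)
print(is_pinned('0.14.0', '0.13.8'))  # False (an unpinned impyla install)
```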
  • Connect to Hive from Python (HiveTest.py)
from impala.dbapi import connect

conn = connect(host='ip-186-31-21-45.ap-southeast-1.compute.internal',
               port=10000, database='default', auth_mechanism='PLAIN')
print(conn)
cursor = conn.cursor()
cursor.execute('show databases')
print(cursor.description)  # prints the result set's schema
results = cursor.fetchall()
print(results)

cursor.execute('SELECT * FROM test limit 10')
print(cursor.description)  # prints the result set's schema
results = cursor.fetchall()
print(results)
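fetchall() returns plain tuples; pairing them with the column names from cursor.description gives named records that are easier to work with. A minimal sketch (rows_to_dicts is a hypothetical helper, not part of impyla; the sample data mirrors the output shown below):

```python
def rows_to_dicts(description, rows):
    # description is the DB-API 7-item sequence per column;
    # the first item of each entry is the column name.
    columns = [col[0] for col in description]
    return [dict(zip(columns, row)) for row in rows]

# Example using the schema HiveTest.py prints for the test table:
description = [('test.s1', 'STRING', None, None, None, None, None),
               ('test.s2', 'STRING', None, None, None, None, None)]
rows = [('name1', 'age1'), ('name2', 'age2')]
print(rows_to_dicts(description, rows))
```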
  • Connect to Impala from Python (ImpalaTest.py)
from impala.dbapi import connect

conn = connect(host='ip-186-31-26-80.ap-southeast-1.compute.internal', port=21050)
print(conn)
cursor = conn.cursor()
cursor.execute('show databases')
print(cursor.description)  # prints the result set's schema
results = cursor.fetchall()
print(results)

cursor.execute('SELECT * FROM test limit 10')
print(cursor.description)  # prints the result set's schema
results = cursor.fetchall()
print(results)
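Both scripts above leave their cursors and connections open. For anything beyond a throwaway test, wrapping the query in contextlib.closing guarantees cleanup even when the query fails. A sketch written against an injected connect_fn (standing in for impala.dbapi.connect with host/port baked in), so it does not require a live cluster:

```python
from contextlib import closing

def run_query(connect_fn, sql):
    # connect_fn is any zero-argument callable returning a DB-API connection,
    # e.g. lambda: connect(host='...', port=21050).
    # closing() calls .close() on both objects even if execute() raises.
    with closing(connect_fn()) as conn:
        with closing(conn.cursor()) as cursor:
            cursor.execute(sql)
            return cursor.fetchall()
```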
  • Run the scripts from the shell to test
    • Test the Hive connection
[root@ip-186-31-22-86 ec2-user]# python HiveTest.py
<impala.hiveserver2.HiveServer2Connection object at 0x7f66eee00250>
[('database_name', 'STRING', None, None, None, None, None)]
[('default',)]
[('test.s1', 'STRING', None, None, None, None, None), ('test.s2', 'STRING', None, None, None, None, None)]
[('name1', 'age1'), ('name2', 'age2'), ('name3', 'age3'), ('name4', 'age4'), ('name5', 'age5'), ('name6', 'age6'), ('name7', 'age7'), ('name8', 'age8'), ('name9', 'age9'), ('name10', 'age10')]
[root@ip-186-31-22-86 ec2-user]# 
    • Test the Impala connection
[root@ip-186-31-22-86 ec2-user]# python ImpalaTest.py
<impala.hiveserver2.HiveServer2Connection object at 0x7f7e1f2cfad0>
[('name', 'STRING', None, None, None, None, None), ('comment', 'STRING', None, None, None, None, None)]
[('_impala_builtins', 'System database for Impala builtin functions'), ('default', 'Default Hive database')]
[('s1', 'STRING', None, None, None, None, None), ('s2', 'STRING', None, None, None, None, None)]
[('name1', 'age1'), ('name2', 'age2'), ('name3', 'age3'), ('name4', 'age4'), ('name5', 'age5'), ('name6', 'age6'), ('name7', 'age7'), ('name8', 'age8'), ('name9', 'age9'), ('name10', 'age10')]
[root@ip-186-31-22-86 ec2-user]# 

Common Errors and Fixes

Error 1:

building 'sasl.saslwrapper' extension
    creating build/temp.linux-x86_64-2.7
    creating build/temp.linux-x86_64-2.7/sasl
    gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -Isasl -I/opt/cloudera/parcels/Anaconda/include/python2.7 -c sasl/saslwrapper.cpp -o build/temp.linux-x86_64-2.7/sasl/saslwrapper.o
    unable to execute 'gcc': No such file or directory
    error: command 'gcc' failed with exit status 1
    
    ----------------------------------------
Command "/opt/cloudera/parcels/Anaconda/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-kD6tvP/sasl/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-WJFNeG-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-kD6tvP/sasl/

Fix:

[root@ip-186-31-22-86 ec2-user]# yum -y install gcc 
[root@ip-186-31-22-86 ec2-user]# yum install gcc-c++ 

Error 2:

gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -Isasl -I/opt/cloudera/parcels/Anaconda/include/python2.7 -c sasl/saslwrapper.cpp -o build/temp.linux-x86_64-2.7/sasl/saslwrapper.o
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ [enabled by default]
In file included from sasl/saslwrapper.cpp:254:0:
sasl/saslwrapper.h:22:23: fatal error: sasl/sasl.h: No such file or directory
#include <sasl/sasl.h>
                   ^
compilation terminated.
error: command 'gcc' failed with exit status 1

Fix:

[root@ip-186-31-22-86 ec2-user]# yum -y install python-devel.x86_64 cyrus-sasl-devel.x86_64
