apache-phoenix-4.14.0-HBase-1.3-bin Installation and Basic Usage
1. Installation Steps
a. Download the release matching your HBase version (the Phoenix and HBase versions must correspond)
b. Unpack it into your installation directory
c. Copy phoenix-4.14.0-HBase-1.3-server.jar from the Phoenix directory into HBase's lib/ directory
d. Restart HBase
e. Run Phoenix as follows:
cd /home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/bin
sh sqlline.py master    -- fails
./sqlline.py master     -- works (the script must be executed directly, not via sh)
[root@master bin]# ./sqlline.py master
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:master none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:master
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/phoenix-4.14.0-HBase-1.3-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/08/01 19:43:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Connected to: Phoenix (version 4.14)
Driver: PhoenixEmbeddedDriver (version 4.14)
Autocommit status: true
Transaction isolation: TRANSACTION_READ_COMMITTED
Building list of tables and columns for tab-completion (set fastconnect to true to skip)...
133/133 (100%) Done
Done
sqlline version 1.2.0
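The steps above can be sketched as shell commands. This is a sketch from the walkthrough's environment; the HBase installation path (/home/hadoop/hbase-1.3.1) is an assumption and should be adjusted to your own layout:

```shell
# a/b. Download a Phoenix build matching your HBase version and unpack it
tar -zxvf apache-phoenix-4.14.0-HBase-1.3-bin.tar.gz -C /home/hadoop/

# c. Copy the server jar into HBase's lib/ directory
#    (every HBase node that runs a region server needs it)
cp /home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/phoenix-4.14.0-HBase-1.3-server.jar \
   /home/hadoop/hbase-1.3.1/lib/          # hypothetical HBase path

# d. Restart HBase so the Phoenix coprocessors get loaded
stop-hbase.sh && start-hbase.sh

# e. Start sqlline against the ZooKeeper quorum (here the host "master")
cd /home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/bin
./sqlline.py master
```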
2. Basic Operations
Use !help to list all available commands.
Create a table:
0: jdbc:phoenix:master> create table test (id varchar primary key,name varchar,age integer );
No rows affected (1.3 seconds)
0: jdbc:phoenix:master> select * from test;
+-----+-------+------+
| ID | NAME | AGE |
+-----+-------+------+
+-----+-------+------+
0: jdbc:phoenix:master> upsert into test(id,name,age) values('000001','liubei',43);
1 row affected (0.037 seconds)
0: jdbc:phoenix:master> select * from test;
+---------+---------+------+
| ID | NAME | AGE |
+---------+---------+------+
| 000001 | liubei | 43 |
+---------+---------+------+
1 row selected (0.051 seconds)
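A few more everyday operations in sqlline, as a sketch (standard Phoenix SQL against the test table created above; note that UPSERT overwrites an existing row when the primary key matches):

```sql
-- Overwrite the row with key '000001' (UPSERT = insert-or-update)
UPSERT INTO test (id, name, age) VALUES ('000001', 'liubei', 44);

-- Delete by primary key
DELETE FROM test WHERE id = '000001';

-- Drop the table entirely
DROP TABLE test;
```

The sqlline meta-command !tables lists all tables known to Phoenix.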
3. Example
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class phoniex_test {
    public static void main(String[] args) throws SQLException {
        // "master" is the ZooKeeper quorum host; the Phoenix JDBC driver
        // registers itself when the client jar is on the classpath
        Connection con = DriverManager.getConnection("jdbc:phoenix:master");
        Statement stmt = con.createStatement();
        stmt.executeUpdate("create table test2 (mykey integer not null primary key, mycolumn varchar)");
        stmt.executeUpdate("upsert into test2 values (1,'Hello')");
        stmt.executeUpdate("upsert into test2 values (2,'World!')");
        // Phoenix JDBC connections do not auto-commit by default
        con.commit();

        PreparedStatement statement = con.prepareStatement("select * from test2");
        ResultSet rset = statement.executeQuery();
        while (rset.next()) {
            System.out.println(rset.getString("mycolumn"));
        }
        statement.close();
        con.close();
    }
}
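The same upserts can be parameterized with PreparedStatement, which is preferable when values come from user input. This is a sketch only: the class name phoniex_upsert is illustrative, and it assumes the same cluster ("master") and the Phoenix client jar on the classpath, so it cannot run without a live HBase/Phoenix deployment:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class phoniex_upsert {                     // illustrative class name
    public static void main(String[] args) throws SQLException {
        Connection con = DriverManager.getConnection("jdbc:phoenix:master");
        // Placeholders keep values out of the SQL string
        PreparedStatement ps = con.prepareStatement("upsert into test2 values (?, ?)");
        ps.setInt(1, 3);
        ps.setString(2, "Phoenix");
        ps.executeUpdate();
        con.commit();   // raw Phoenix JDBC connections are not auto-commit
        ps.close();
        con.close();
    }
}
```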
[root@master caiyw]# java -cp "/home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/phoenix-4.14.0-HBase-1.3-client.jar:." phoniex_test
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Hello
World!
4. Bulk-loading Data with Phoenix
Phoenix provides two ways to load CSV data into a Phoenix table: the psql command, which loads in a single thread, and a MapReduce-based bulk load.
The psql approach suits data volumes up to a few tens of MB; the MapReduce approach suits larger loads.
The following demonstrates loading CSV-formatted data into a Phoenix table with both methods.
(1) Sample data: data.csv (the input must be in CSV format)
12345,John,Doe
67890,Mary,Poppins
(2) Table-creation SQL
CREATE TABLE example (
my_pk bigint not null,
m.first_name varchar(50),
m.last_name varchar(50)
CONSTRAINT pk PRIMARY KEY(my_pk)
);
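The m. prefix assigns first_name and last_name to an HBase column family named M; non-PK columns declared without a prefix go into Phoenix's default family. A sketch with a second family, using illustrative table and column names:

```sql
-- "m." and "d." map columns to HBase column families M and D respectively
CREATE TABLE example2 (
    my_pk bigint not null,
    m.first_name varchar(50),
    d.city varchar(50)
    CONSTRAINT pk PRIMARY KEY (my_pk)
);
```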
(3) Loading with psql
[root@master apache-phoenix-4.14.0-HBase-1.3-bin]# cd /home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/
[root@master apache-phoenix-4.14.0-HBase-1.3-bin]# bin/psql.py -t EXAMPLE master /root/caiyw/data.csv
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/phoenix-4.14.0-HBase-1.3-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/08/02 01:09:35 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
csv columns from database.
CSV Upsert complete. 16 rows upserted
Time: 0.193 sec(s)
(4) Simple queries
0: jdbc:phoenix:master> select * from example;
+--------+-------------+------------+
| MY_PK | FIRST_NAME | LAST_NAME |
+--------+-------------+------------+
+--------+-------------+------------+
No rows selected (0.043 seconds)
0: jdbc:phoenix:master> select * from example;
+-------------+-------------+------------+
| MY_PK | FIRST_NAME | LAST_NAME |
+-------------+-------------+------------+
| 1 | Mary | Poppins |
| 1535 | John | Doe |
| 1635 | John | Doe |
| 6530 | Mary | Poppins |
| 6560 | Mary | Poppins |
| 12645 | John | Doe |
| 15635 | John | Doe |
| 15645 | John | Doe |
| 16535 | John | Doe |
| 65590 | Mary | Poppins |
| 65630 | Mary | Poppins |
| 65690 | Mary | Poppins |
| 152635 | John | Doe |
| 625630 | Mary | Poppins |
| 652630 | Mary | Poppins |
| 1563512345 | John | Doe |
+-------------+-------------+------------+
16 rows selected (0.067 seconds)
0: jdbc:phoenix:master> select count(0) from example;
+-----------+
| COUNT(0) |
+-----------+
| 16 |
+-----------+
1 row selected (0.133 seconds)
0: jdbc:phoenix:master> select max(FIRST_NAME) from example;
+--------------------+
| MAX(M.FIRST_NAME) |
+--------------------+
| Mary |
+--------------------+
1 row selected (0.054 seconds)
(5) Loading data with MapReduce
Not demonstrated here; see the official Phoenix bulk-loading documentation.
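For reference, the MapReduce bulk load is driven by Phoenix's CsvBulkLoadTool. A sketch under the assumptions that the CSV has already been uploaded to HDFS (the /tmp path below is illustrative) and that ZooKeeper runs on the host "master":

```shell
# Upload the CSV to HDFS first (target path is illustrative)
hdfs dfs -put /root/caiyw/data.csv /tmp/data.csv

# Run the MapReduce bulk load against table EXAMPLE
hadoop jar /home/hadoop/apache-phoenix-4.14.0-HBase-1.3-bin/phoenix-4.14.0-HBase-1.3-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table EXAMPLE \
    --input /tmp/data.csv \
    --zookeeper master
```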