I. Basic Principles
1. In Phoenix, unquoted table and column names are folded to uppercase; double-quoted names are case-sensitive and preserved exactly as written.
2. Like an RDBMS, Phoenix columns have data types.
3. When inserting data, numeric values in the VALUES list take no quotes, and string values must use single quotes.
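The case-folding rule above can be illustrated with a small sketch (the table and column names here are hypothetical, chosen only for illustration):

```sql
-- Unquoted identifiers are folded to uppercase: this creates table MY_TABLE
-- with columns ID and NAME.
create table my_table(id bigint primary key, name varchar);

-- Both of these resolve to the same uppercase table:
select * from my_table;
select * from MY_TABLE;

-- Quoted identifiers preserve case: "my_table" is a DIFFERENT table from
-- MY_TABLE, and must always be referenced with double quotes.
create table "my_table"("id" bigint primary key, "name" varchar);
select * from "my_table";
```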
II. Common Statements
1. Creating a table
create table us_population(
state varchar(2),
city varchar(15),
population bigint,
constraint pk primary key(state,city)
);
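After creating the table, sqlline's built-in commands can confirm it exists and show its column metadata. `!tables` and `!describe` are standard sqlline commands (not SQL); the commands below are a sketch, not captured output:

```sql
-- sqlline built-in commands: list all tables, then show column metadata
!tables
!describe US_POPULATION
```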
2. Inserting/updating data
upsert into us_population(state,city,population) values('NY','New York',8143197);
upsert into us_population(state,city,population) values('CA','Los Angeles',3844829);
upsert into us_population(state,city,population) values('IL','Chicago',2842518);
upsert into us_population(state,city,population) values('TX','Houston',2016582);
upsert into us_population(state,city,population) values('PA','Philadelphia',1463281);
upsert into us_population(state,city,population) values('AZ','Phoenix',1461575);
upsert into us_population(state,city,population) values('TX','San Antonio',1256509);
upsert into us_population(state,city,population) values('CA','San Diego',1255540);
upsert into us_population(state,city,population) values('TX','Dallas',1213825);
upsert into us_population(state,city,population) values('CA','San Jose',912332);
Note: string values use single quotes.
3. Querying data
select * from us_population;
+--------+---------------+-------------+
| STATE  | CITY          | POPULATION  |
+--------+---------------+-------------+
| AZ     | Phoenix       | 1461575     |
| CA     | Los Angeles   | 3844829     |
| CA     | San Diego     | 1255540     |
| CA     | San Jose      | 912332      |
| IL     | Chicago       | 2842518     |
| NY     | New York      | 8143197     |
| PA     | Philadelphia  | 1463281     |
| TX     | Dallas        | 1213825     |
| TX     | Houston       | 2016582     |
| TX     | San Antonio   | 1256509     |
+--------+---------------+-------------+
4. Updating data (UPSERT overwrites the row with the same primary key)
upsert into us_population(state,city,population) values('AZ','Phoenix',1500000);
Result:
0: jdbc:phoenix:hadoop:2181> select * from us_population;
+--------+---------------+-------------+
| STATE | CITY | POPULATION |
+--------+---------------+-------------+
| AZ | Phoenix | 1500000 |
| CA | Los Angeles | 3844829 |
| CA | San Diego | 1255540 |
| CA | San Jose | 912332 |
| IL | Chicago | 2842518 |
| NY | New York | 8143197 |
| PA | Philadelphia | 1463281 |
| TX | Dallas | 1213825 |
| TX | Houston | 2016582 |
| TX | San Antonio | 1256509 |
+--------+---------------+-------------+
5. Group-by aggregation
0: jdbc:phoenix:hadoop:2181> select state as "State",count(city) "City",sum(population) "Population" from us_population group by state order by sum(population) desc;
+--------+-------+-------------+
| State | City | Population |
+--------+-------+-------------+
| NY | 1 | 8143197 |
| CA | 3 | 6012701 |
| TX | 3 | 4486916 |
| IL | 1 | 2842518 |
| AZ | 1 | 1500000 |
| PA | 1 | 1463281 |
+--------+-------+-------------+
6 rows selected (0.034 seconds)
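Building on the aggregate above, groups can also be filtered with a HAVING clause, which Phoenix supports; the 3,000,000 threshold below is an arbitrary value chosen for illustration:

```sql
-- Keep only states whose total population exceeds 3 million
select state as "State", count(city) "City", sum(population) "Population"
from us_population
group by state
having sum(population) > 3000000
order by sum(population) desc;
```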
III. Lowercase Table Names and Column Families
1. Creating a table whose underlying HBase table carries an explicit column family, with both the table name and the column family in lowercase
create table "us_population"(
"state" varchar(2),
"city" varchar(15),
"info"."population" bigint,
constraint pk primary key("state","city")
);
Notes:
(1) state and city together form the composite primary key, which maps to the HBase row key, so they belong to no column family.
(2) The "info" prefix names the column family that holds population.
2. Inserting data
upsert into "us_population"("state","city","population") values('NY','New York',8143197);
upsert into "us_population"("state","city","population") values('CA','Los Angeles',3844829);
upsert into "us_population"("state","city","population") values('IL','Chicago',2842518);
upsert into "us_population"("state","city","population") values('TX','Houston',2016582);
upsert into "us_population"("state","city","info"."population") values('PA','Philadelphia',1463281);
upsert into "us_population"("state","city","info"."population") values('AZ','Phoenix',1461575);
upsert into "us_population"("state","city","info"."population") values('TX','San Antonio',1256509);
upsert into "us_population"("state","city","info"."population") values('CA','San Diego',1255540);
upsert into "us_population"("state","city","info"."population") values('TX','Dallas',1213825);
upsert into "us_population"("state","city","info"."population") values('CA','San Jose',912332);
Note:
(1) When inserting, the table name and column names must carry the same double quotes used at creation.
(2) The column-family prefix in the column list is optional: "population" and "info"."population" both resolve to the same column.
3. Deleting data
delete from "us_population" where "state" = 'IL';
(Since "state" is varchar(2), the predicate value must be a two-character state code.)
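A follow-up query can confirm the rows are gone; this is an illustrative sketch assuming the delete above targeted "state" = 'IL':

```sql
-- After the delete, no IL rows should remain
select * from "us_population" where "state" = 'IL';
```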