Hive computation example
First import the data from MySQL into Hive with Sqoop, then run the same query in Hive and confirm it returns the same result as in MySQL.
Steps: - Prepare the MySQL data
- account: the accounts table
- detail: the income/expense table
CREATE TABLE `account` (
`id` int(11) PRIMARY KEY AUTO_INCREMENT,
`account` varchar(20),
`name` varchar(5),
`age` int(3)
);
-- The original VALUES lists were lost from these notes; the rows below are
-- reconstructed from the query results shown further down (the ages are
-- illustrative and were not recoverable):
insert into account(account, name, age) values ('black@', 'Black', 20);
insert into account(account, name, age) values ('jack@', 'Jack', 20);
insert into account(account, name, age) values ('jimi@', 'Jimi', 20);
insert into account(account, name, age) values ('jone@', 'Jone', 20);
insert into account(account, name, age) values ('tom@', 'Tom', 20);
select * from account;
CREATE TABLE `detail` (
`id` int(11) PRIMARY KEY AUTO_INCREMENT,
`account` varchar(20),
`income` double,
`expenses` double,
`time` varchar(10)
);
-- The original VALUES lists were lost from these notes; the figures and dates
-- below are illustrative, chosen only so each account's income - expenses
-- matches the totals in the query results shown further down:
insert into detail(account, income, expenses, time) values ('black@', 100, 90, '2023-01-01');
insert into detail(account, income, expenses, time) values ('jack@', 50, 70, '2023-01-01');
insert into detail(account, income, expenses, time) values ('jimi@', 120, 100, '2023-01-01');
insert into detail(account, income, expenses, time) values ('jone@', 1, 10, '2023-01-01');
insert into detail(account, income, expenses, time) values ('tom@', 40, 20, '2023-01-01');
select * from detail;
- Create the Hive tables
create table account (
id int,
account string,
name string,
age int
) row format delimited fields terminated by '\t';
create table detail (
id int,
account string,
income double,
expenses double,
time string
) row format delimited fields terminated by '\t';
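Note that the `'\t'` delimiter in these DDLs must match the `--fields-terminated-by '\t'` passed to Sqoop below; otherwise Hive cannot split the imported lines into columns. A minimal Python sketch of that round trip, with a made-up row for illustration:

```python
# One imported row; column names follow the account DDL above,
# the row values are made up for illustration.
row = {"id": 1, "account": "tom@", "name": "Tom", "age": 20}

# Serialize the way Sqoop writes the file with --fields-terminated-by '\t':
# one line per row, fields joined by the tab delimiter.
line = "\t".join(str(row[c]) for c in ["id", "account", "name", "age"])

# Parse the way Hive's delimited storage reads it back,
# splitting on the same delimiter declared in the DDL.
fields = line.split("\t")
parsed = {"id": int(fields[0]), "account": fields[1],
          "name": fields[2], "age": int(fields[3])}

assert parsed == row  # delimiters match, so the row survives the round trip
```

If the two delimiters disagree, the split produces a single field and the remaining Hive columns come back NULL.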
- Import the MySQL data directly into Hive with Sqoop
sqoop import \
  --connect jdbc:mysql://localhost:3306/mydata \
  --username root \
  --password 123456 \
  --table account \
  --hive-import \
  --hive-overwrite \
  --hive-table account \
  --fields-terminated-by '\t'
sqoop import --connect jdbc:mysql://localhost:3306/mydata --username root --password 123456 --table detail --hive-import --hive-overwrite --hive-table detail --fields-terminated-by '\t'
- Run the query; MySQL and Hive produce the same result
select a.account, a.name, d.total
from account as a
join(
select account, sum(income - expenses) as total
from detail group by account
) as d
on a.account=d.account;
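The query's logic, an aggregation in the subquery followed by a join, can be mimicked in plain Python to check the arithmetic. The rows here are made up, chosen only so each account's net matches the results below:

```python
from collections import defaultdict

# Illustrative in-memory rows (accounts and names are taken from the
# results below; the income/expense figures are made up so that the
# per-account nets come out the same).
account = [("black@", "Black"), ("jack@", "Jack"), ("jimi@", "Jimi"),
           ("jone@", "Jone"), ("tom@", "Tom")]
detail = [("black@", 100.0, 90.0), ("jack@", 50.0, 70.0),
          ("jimi@", 120.0, 100.0), ("jone@", 1.0, 10.0),
          ("tom@", 40.0, 20.0)]

# Subquery: group by account, sum(income - expenses)
totals = defaultdict(float)
for acc, income, expenses in detail:
    totals[acc] += income - expenses

# Outer query: join the aggregated totals back to account on the account column
result = [(acc, name, totals[acc]) for acc, name in account if acc in totals]
for row in sorted(result):
    print(row)  # e.g. ('black@', 'Black', 10.0)
```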
MySQL result:
+--------------+-------+-------+
| account | name | total |
+--------------+-------+-------+
| black@ | Black | 10 |
| jack@ | Jack | -20 |
| jimi@ | Jimi | 20 |
| jone@ | Jone | -9 |
| tom@ | Tom | 20 |
+--------------+-------+-------+
Hive result:
black@ Black 10.0
jack@ Jack -20.0
jimi@ Jimi 20.0
jone@ Jone -9.0
tom@ Tom 20.0
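The values match; the only difference is cosmetic. The sum is over `double` columns, and Hive prints doubles with the decimal point while the MySQL client trims the trailing `.0`. Python's default `float` formatting behaves like Hive here:

```python
# sum(income - expenses) over double columns yields a float.
net = sum([100.0 - 90.0])  # e.g. Black's net, using the illustrative figures

print(net)       # prints 10.0 -- like Hive's output
print(int(net))  # prints 10   -- the fraction-free form the MySQL client shows
```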
Errors and fixes
Error:
/tmp/hive on HDFS should be writable.
Fix:
> hadoop fs -chmod -R 777 /tmp
Error:
Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR
Fix:
Append the following line to the end of /etc/profile:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
then reload the configuration with source /etc/profile.