Installing Hive
Upload the installation package to the /opt/software directory and extract it
[bigdata@node101 software]$ tar -zxvf hive-3.1.3-with-spark-3.3.1.tar.gz -C /opt/services
[bigdata@node101 services]$ mv apache-hive-3.1.3-bin apache-hive-3.1.3
Configure environment variables
export JAVA_HOME=/opt/services/jdk1.8.0_161
export ZK_HOME=/opt/services/zookeeper-3.5.7
export HADOOP_HOME=/opt/services/hadoop-3.3.5
export HIVE_HOME=/opt/services/apache-hive-3.1.3
export PATH=$PATH:$JAVA_HOME/bin:$ZK_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin
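Before continuing, it is worth confirming these variables actually took effect in the current shell. A small sanity-check helper (a sketch, not part of the original guide; bash-specific because it uses `${!var}` indirect expansion):

```shell
# Verify that each named environment variable is set and non-empty.
check_env() {
  local var missing=0
  for var in "$@"; do
    if [ -z "${!var}" ]; then       # indirect expansion: value of the variable named by $var
      echo "missing: $var" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Usage (variable names from the exports above):
# check_env JAVA_HOME ZK_HOME HADOOP_HOME HIVE_HOME && echo "environment OK"
```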
Distribute the environment variable file
[bigdata@node101 bin]$ sudo ./bin/xsync /etc/profile.d/bigdata_env.sh
Refresh the environment variables (run on all 5 machines)
[bigdata@node101 ~]$ source /etc/profile
Upload the MySQL driver jar to Hive's lib directory
[bigdata@node101 software]$ cp mysql-connector-java-8.0.18.jar /opt/services/apache-hive-3.1.3/lib/
Resolve the jar conflict
[bigdata@node101 ~]$ mv $HIVE_HOME/lib/log4j-slf4j-impl-2.17.1.jar $HIVE_HOME/lib/log4j-slf4j-impl-2.17.1.jar.bak
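Background for the rename above: Hadoop ships its own SLF4J binding (slf4j-reload4j in recent Hadoop 3.3.x), while Hive ships the Log4j2 binding (log4j-slf4j-impl). With both on the classpath, SLF4J prints "multiple bindings" warnings, hence the .bak rename. A diagnostic sketch to see which slf4j jars each tree ships (the directory arguments are whatever your actual installation paths are):

```shell
# List slf4j-related jars under the given directories. Two *binding* jars
# (e.g. log4j-slf4j-impl-*.jar plus slf4j-reload4j-*.jar) indicate a conflict.
list_slf4j_jars() {
  find "$@" -name '*slf4j*.jar' 2>/dev/null || true
}

# Usage:
# list_slf4j_jars "$HADOOP_HOME/share/hadoop/common/lib" "$HIVE_HOME/lib"
```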
Configure hive-site.xml (note that `&` must be escaped as `&amp;` inside the XML value, and Connector/J 8.x uses the `com.mysql.cj.jdbc.Driver` class)
<!-- JDBC connection URL -->
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://node101:3306/metastore?useSSL=false&amp;useUnicode=true&amp;characterEncoding=UTF-8&amp;allowPublicKeyRetrieval=true</value>
</property>
<!-- JDBC driver class -->
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
</property>
<!-- JDBC username -->
<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
</property>
<!-- JDBC password -->
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
</property>
<!-- Disable metastore schema version verification -->
<property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
</property>
<!-- Disable metastore event notification API auth -->
<property>
    <name>hive.metastore.event.db.notification.api.auth</name>
    <value>false</value>
</property>
<!-- Hive's default warehouse directory on HDFS -->
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
</property>
<!-- Print column headers in the CLI -->
<property>
    <name>hive.cli.print.header</name>
    <value>true</value>
</property>
<!-- Show the current database in the CLI prompt -->
<property>
    <name>hive.cli.print.current.db</name>
    <value>true</value>
</property>
<!-- Remote metastore address -->
<property>
    <name>hive.metastore.uris</name>
    <value>thrift://node101:9083</value>
</property>
<property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
</property>
<property>
    <name>hive.server2.thrift.bind.host</name>
    <value>node101</value>
</property>
<property>
    <name>hive.users.in.admin.role</name>
    <value>bigdata</value>
</property>
<property>
    <name>hive.security.authorization.enabled</name>
    <value>true</value>
</property>
<!-- Location of the Spark dependency jars -->
<property>
    <name>spark.yarn.jars</name>
    <value>hdfs://mycluster:8020/spark-jars/*</value>
</property>
<!-- Hive execution engine -->
<property>
    <name>hive.execution.engine</name>
    <value>spark</value>
</property>
<!-- Task submission timeout, in ms -->
<property>
    <name>hive.spark.client.connect.timeout</name>
    <value>100000ms</value>
</property>
Configure the log files
[bigdata@node101 conf]$ cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties
[bigdata@node101 conf]$ cp hive-log4j2.properties.template hive-log4j2.properties
Edit hive-log4j2.properties and set the log directory
property.hive.log.dir = /opt/services/apache-hive-3.1.3/logs
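Both hive-log4j2.properties above and the hive.sh script later write into $HIVE_HOME/logs, but the tarball does not ship that directory, so it is worth creating up front. A minimal sketch (the helper name is ours; the path is the one configured above):

```shell
# Create the Hive log directory if it does not already exist (idempotent).
ensure_log_dir() {
  mkdir -p "$1" && chmod 755 "$1"
}

# Usage:
# ensure_log_dir /opt/services/apache-hive-3.1.3/logs
```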
Edit hive-env.sh
[bigdata@node101 conf]$ cp hive-env.sh.template hive-env.sh
[bigdata@node101 conf]$ vim hive-env.sh
export HADOOP_HEAPSIZE=1024
Create the metastore database
[bigdata@node101 conf]$ mysql -uroot -p'123456'
mysql> create database if not exists metastore DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci;
Initialize the metastore database
[bigdata@node101 bin]$ schematool -initSchema -dbType mysql -verbose
Change the character set of the comment columns to fix garbled Chinese text (run against the metastore database)
mysql> use metastore;
mysql> alter table DBS convert to character set utf8;
mysql> alter table COLUMNS_V2 character set utf8;
mysql> alter table COLUMNS_V2 change COMMENT COMMENT varchar(256) character set utf8;
mysql> alter table TABLE_PARAMS change PARAM_VALUE PARAM_VALUE mediumtext character set utf8;
mysql> alter table PARTITION_KEYS change PKEY_COMMENT PKEY_COMMENT varchar(4000) character set utf8;
mysql> alter table PARTITION_KEYS character set utf8;
Write the hive.sh startup script
[bigdata@node101 bin]$ vim hive.sh
#!/bin/bash
echo "==================== Starting Hive services ===================="
echo "==================== Starting metastore ===================="
ssh node101 "nohup $HIVE_HOME/bin/hive --service metastore > $HIVE_HOME/logs/metastore.log 2>&1 &"
echo "==================== Starting hiveserver2 ===================="
ssh node101 "nohup $HIVE_HOME/bin/hive --service hiveserver2 > $HIVE_HOME/logs/hiveserver2.log 2>&1 &"
Make hive.sh executable
[bigdata@node101 bin]$ chmod +x hive.sh
Distribute hive.sh
[bigdata@node101 bin]$ xsync hive.sh
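hive.sh only starts the two services; the guide provides no way to stop them. A possible stop counterpart (hypothetical, not in the original guide) that finds the two JVMs by their command lines and terminates them:

```shell
#!/bin/bash
# hive-stop.sh -- hypothetical counterpart to hive.sh above.
# The bracketed first letter in each pattern keeps pgrep -f from
# matching this script's own command line.
stop_service() {
  local pattern=$1 pids
  pids=$(pgrep -f "$pattern" || true)
  if [ -n "$pids" ]; then
    echo "stopping $pattern (pids: $pids)"
    # shellcheck disable=SC2086  # intentional word splitting on the pid list
    kill $pids
  else
    echo "$pattern is not running"
  fi
}

stop_service "[H]iveMetaStore"   # metastore JVM (RunJar ... HiveMetaStore)
stop_service "[H]iveServer2"     # hiveserver2 JVM (RunJar ... HiveServer2)
```

The class names HiveMetaStore/HiveServer2 appear on the JVM command lines, which is more reliable than matching the `hive --service ...` wrapper, since killing the wrapper does not necessarily kill the child JVM.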
Copy the installation to the other machines
[bigdata@node102 bin]$ scp -r bigdata@node101:/opt/services/apache-hive-3.1.3/ /opt/services/apache-hive-3.1.3/
[bigdata@node103 bin]$ scp -r bigdata@node101:/opt/services/apache-hive-3.1.3/ /opt/services/apache-hive-3.1.3/
Start Hive
[bigdata@node101 bin]$ hive.sh start
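The metastore (port 9083) and HiveServer2 (port 10000) can take a while to come up after the script returns. A quick way to confirm they are listening before connecting (a sketch using bash's built-in /dev/tcp redirection; hostnames and ports are the ones configured above):

```shell
# Poll until a TCP port accepts connections, or give up after N retries.
wait_for_port() {
  local host=$1 port=$2 retries=${3:-30}
  local i
  for ((i = 0; i < retries; i++)); do
    if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
      echo "port ${port} on ${host} is up"
      return 0
    fi
    sleep 1
  done
  echo "port ${port} on ${host} did not open" >&2
  return 1
}

# Usage (ports from hive-site.xml above):
# wait_for_port node101 9083 && wait_for_port node101 10000
```

Once port 10000 is open, a connection can be tested with beeline, e.g. `beeline -u jdbc:hive2://node101:10000 -n bigdata`.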