Note: all of the following operations are performed as the hive user.
1. Download the Hive release package apache-hive-3.1.2-bin.tar.gz and extract it
[hive@xgit3 ~]$ tar zxvf apache-hive-3.1.2-bin.tar.gz
2. Modify the configuration files
- Configure hive-env.sh
[hive@xgit3 apache-hive-3.1.2-bin]$ cd conf/
[hive@xgit3 conf]$ cp hive-env.sh.template hive-env.sh
[hive@xgit3 conf]$ vim hive-env.sh
The configuration content is as follows:
export JAVA_HOME=/usr/java/jdk1.8.0_212-amd64
# Set HADOOP_HOME to point to a specific hadoop install directory
export HADOOP_HOME=/home/hadoop/hadoop-3.2.2
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/home/hive/apache-hive-3.1.2-bin/conf
# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/home/hive/apache-hive-3.1.2-bin/lib
- Configure hive-site.xml
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:postgresql://10.90.15.223:5432/hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>org.postgresql.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>Username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>password to use against metastore database</description>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>false</value>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/home/hive/apache-hive-3.1.2-bin/tmp</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.exec.scratchdir</name>
<value>/user/hive/tmp</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>hive.querylog.location</name>
<value>/user/hive/log</value>
<description>Location of Hive run time structured log file</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/home/hive/apache-hive-3.1.2-bin/tmp/${hive.session.id}_resources</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
<name>datanucleus.schema.autoCreateAll</name>
<value>true</value>
</property>
<!-- HiveServer2 configuration -->
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
</property>
<property>
<name>hive.server2.thrift.bind.host</name>
<value>10.90.15.223</value>
</property>
<!--<property>
<name>hive.server2.transport.mode</name>
<value>binary</value>
<description>
Expects one of [binary, http].
Transport mode of HiveServer2.
</description>
</property>-->
<property>
<name>hive.server2.active.passive.ha.enable</name>
<value>true</value>
</property>
</configuration>
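hive.exec.local.scratchdir and hive.downloaded.resources.dir above point at a tmp directory under the Hive installation; if it does not exist yet it can be created now (the path is taken from the configuration above):
[hive@xgit3 apache-hive-3.1.2-bin]$ mkdir -p /home/hive/apache-hive-3.1.2-bin/tmp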
- Configure the log file
[hive@xgit3 conf]$ cp hive-log4j2.properties.template hive-log4j2.properties
[hive@xgit3 conf]$ vim hive-log4j2.properties
The configuration content is as follows:
status = INFO
name = HiveLog4j2
packages = org.apache.hadoop.hive.ql.log
# list of properties
property.hive.log.level = INFO
property.hive.root.logger = DRFA
property.hive.log.dir = /home/hive/apache-hive-3.1.2-bin/logs
property.hive.log.file = hive.log
property.hive.perflogger.log.level = INFO
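property.hive.log.dir points at a logs directory under the Hive installation; creating it up front is optional (Log4j2 normally creates it on demand) but avoids surprises on first start:
[hive@xgit3 conf]$ mkdir -p /home/hive/apache-hive-3.1.2-bin/logs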
- Configure Hadoop's core-site.xml by adding the following content:
<property>
<name>hadoop.proxyuser.hive.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.hive.groups</name>
<value>*</value>
</property>
After modifying core-site.xml, the HDFS service needs to be restarted.
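A minimal sketch of the restart, assuming HDFS is managed with the sbin scripts shipped with Hadoop and was started by the hadoop user:
[hadoop@xgit3 ~]$ /home/hadoop/hadoop-3.2.2/sbin/stop-dfs.sh
[hadoop@xgit3 ~]$ /home/hadoop/hadoop-3.2.2/sbin/start-dfs.sh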
3. Upload the PostgreSQL driver package to Hive's lib directory
Download postgresql-42.2.5.jar and upload it to the server.
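Then copy it into Hive's lib directory. A minimal sketch, assuming the jar was uploaded to the hive user's home directory (adjust the source path to wherever the file actually landed):
[hive@xgit3 ~]$ cp postgresql-42.2.5.jar /home/hive/apache-hive-3.1.2-bin/lib/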
4. Create HDFS directories and grant permissions
[hadoop@xgit3 ~]$ hdfs dfs -mkdir /user
[hadoop@xgit3 ~]$ hdfs dfs -mkdir /tmp
[hadoop@xgit3 ~]$ hdfs dfs -chmod 777 /user
[hadoop@xgit3 ~]$ hdfs dfs -chmod 777 /tmp
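Optionally, the Hive paths configured in hive-site.xml (hive.metastore.warehouse.dir, hive.exec.scratchdir, hive.querylog.location) can also be created and handed over to the hive user in advance; with /user already world-writable, Hive will otherwise create them on first use:
[hadoop@xgit3 ~]$ hdfs dfs -mkdir -p /user/hive/warehouse /user/hive/tmp /user/hive/log
[hadoop@xgit3 ~]$ hdfs dfs -chown -R hive /user/hive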
5. Create the hive user and hive database in PostgreSQL (the password chosen here must match javax.jdo.option.ConnectionPassword in hive-site.xml)
[postgres@xgit3 pgsql]$ psql
psql.bin (10.18)
Type "help" for help.
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
-----------+----------+----------+-------------+-------------+-----------------------
postgres | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
template0 | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =c/postgres +
| | | | | postgres=CTc/postgres
template1 | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =c/postgres +
| | | | | postgres=CTc/postgres
(3 rows)
postgres=# create user hive with password 'hive123';
CREATE ROLE
postgres=# create database hive owner hive; -- create the database and set its owner
CREATE DATABASE
postgres=# grant all on database hive to hive; -- grant all privileges on the hive database to the hive user
GRANT
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
-----------+----------+----------+-------------+-------------+-----------------------
hive | hive | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =Tc/hive +
| | | | | hive=CTc/hive
postgres | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
template0 | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =c/postgres +
| | | | | postgres=CTc/postgres
template1 | postgres | UTF8 | en_US.UTF-8 | en_US.UTF-8 | =c/postgres +
| | | | | postgres=CTc/postgres
(4 rows)
postgres=#
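Before initializing the schema, it is worth verifying that the hive user can reach the database from the Hive host with the credentials configured in hive-site.xml (this assumes the psql client is available there and that listen_addresses and pg_hba.conf already permit connections from 10.90.15.223):
[hive@xgit3 ~]$ psql -h 10.90.15.223 -U hive -d hive -c '\conninfo'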
6. Initialize the Hive metastore database
[hive@xgit3 apache-hive-3.1.2-bin]$ ./bin/schematool -dbType postgres -initSchema
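If initialization succeeds, the installed schema version can be double-checked with schematool:
[hive@xgit3 apache-hive-3.1.2-bin]$ ./bin/schematool -dbType postgres -info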
7. Start the metastore service and the HiveServer2 service
[hive@xgit3 apache-hive-3.1.2-bin]$ nohup ./bin/hive --service metastore >> metastore.log &
[hive@xgit3 apache-hive-3.1.2-bin]$ nohup ./bin/hive --service hiveserver2 >> hiveserver2.log &
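To confirm that both services came up, check the listening ports (9083 is the default metastore port, 10000 is the HiveServer2 port configured above) and try a beeline connection; the JDBC URL below is derived from hive.server2.thrift.bind.host and hive.server2.thrift.port in hive-site.xml:
[hive@xgit3 apache-hive-3.1.2-bin]$ ss -lntp | grep -E '9083|10000'
[hive@xgit3 apache-hive-3.1.2-bin]$ ./bin/beeline -u jdbc:hive2://10.90.15.223:10000 -n hive -e 'show databases;'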
Troubleshooting:
- Error 1: starting the metastore service fails with the following exception
MetaException(message:com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:8661)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:8656)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:8926)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:8843)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.hive.metastore.ObjectStore.correctAutoStartMechanism(ObjectStore.java:641)
at org.apache.hadoop.hive.metastore.ObjectStore.getDataSourceProps(ObjectStore.java:572)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:349)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:157)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:59)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:718)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:696)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:690)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:767)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:538)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
... 11 more
Exception in thread "main" MetaException(message:com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V)
        ... (same stack trace as above)
Investigation showed that the guava jar bundled with Hive (guava-19.0) does not match the version shipped with Hadoop, which causes the NoSuchMethodError above; replacing Hive's copy with the same newer version taken from Hadoop fixes it:
[root@xgit3 lib]# scp /home/hadoop/hadoop-3.2.2/share/hadoop/common/lib/guava-27.0-jre.jar hive@xgit3:/home/hive/apache-hive-3.1.2-bin/lib
[root@xgit3 lib]# rm -rf /home/hive/apache-hive-3.1.2-bin/lib/guava-19.0.jar
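After the swap, a quick listing confirms the two lib directories now carry the same guava version:
[root@xgit3 lib]# ls /home/hive/apache-hive-3.1.2-bin/lib/guava-*.jar /home/hadoop/hadoop-3.2.2/share/hadoop/common/lib/guava-*.jar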