Overview
- Flink 1.13.0
- JDK 1.8 (Java 8)
Downloading Flink
Official download page: https://flink.apache.org/zh/downloads.html#section-8
Version: Flink 1.13.0
Pick a suitable version. The latest release at the time of writing is Flink 1.15.0, but this environment uses Flink 1.13.0. I first tried deploying the latest version with JDK 18, but hit all kinds of errors during development (batch jobs ran fine while streaming jobs failed, reflection conflicts between modules, and so on), so as a beginner I gave up on that combination for now and may revisit it later.
Downloading the JDK
Official download page: https://www.oracle.com/java/technologies/downloads/
Version: JDK 1.8
Do not pick a version that is too new here; newer JDKs can trigger many unexpected errors with Flink 1.13.
pom.xml dependency configuration
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>flinktest</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<!-- <plugin>-->
<!-- <groupId>org.apache.maven.plugins</groupId>-->
<!-- <artifactId>maven-compiler-plugin</artifactId>-->
<!-- <configuration>-->
<!-- <source>1.8</source>-->
<!-- <target>1.8</target>-->
<!-- </configuration>-->
<!-- </plugin>-->
<!-- Packaging plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<archive>
<manifest>
<!-- <mainClass>com.xxg.Main</mainClass>-->
</manifest>
</archive>
<!-- Bundle all of Flink's runtime dependencies into the jar; in a production environment the Flink dependencies can be omitted, since the cluster provides them -->
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
<flink.version>1.13.0</flink.version>
<java.version>1.8.0</java.version>
<scala.binary.version>2.12</scala.binary.version>
<slf4j.version>1.7.30</slf4j.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<!-- Include flink-clients if you need to submit and manage jobs from code; production builds usually do not need it -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<!-- For Flink 1.13.0 a log4j configuration is required -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-to-slf4j</artifactId>
<version>2.14.0</version>
<scope>provided</scope>
</dependency>
<!-- Dependencies for the Table API; the bridge connects it to the underlying DataStream API -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- To run Table API and SQL programs locally, the following dependencies are also required -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- To implement a custom data format for serialization, the following dependency is required -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-common</artifactId>
<version>${flink.version}</version>
</dependency>
</dependencies>
</project>
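As a quick sanity check that the Table API dependencies above resolve correctly, here is a minimal sketch against the Flink 1.13 API. The class name, sample data, and column name are illustrative only, not part of the original setup:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableApiSmokeTest {
    public static void main(String[] args) throws Exception {
        // Local environment; the planner comes from flink-table-planner-blink
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Bridge a DataStream into a Table and query it with SQL
        DataStream<String> words = env.fromElements("flink", "table", "api");
        Table table = tableEnv.fromDataStream(words).as("word");
        tableEnv.createTemporaryView("words", table);
        tableEnv.sqlQuery("SELECT word FROM words").execute().print();
    }
}
```

If this class compiles and prints the three rows when run inside the IDE, the Table API and planner dependencies are wired up correctly.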
log4j configuration
log4j.properties
Location: under the resources directory
#log4j.rootLogger=error,stdout
#log4j.appender.stdout=org.apache.log4j.ConsoleAppender
#log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
#log4j.appender.stdout.layout.conversionPattern=%-4r [%t] %-5p %c %x -%m%n
##------------------------------------------
# priority: debug < info < warn < error
# log4j 1.x cannot route every priority to its own file by itself,
# so each file appender below uses a Threshold filter instead
log4j.rootLogger=debug,stdout,info,debug,warn,error
#console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern= [%-d{yyyy-MM-dd HH\:mm\:ss}]-[%p]-[%t] %l\: %m%n
#info log
log4j.appender.info=org.apache.log4j.DailyRollingFileAppender
log4j.appender.info.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.info.File=logs/info.log
log4j.appender.info.Append=true
log4j.appender.info.Threshold=INFO
log4j.appender.info.layout=org.apache.log4j.PatternLayout
log4j.appender.info.layout.ConversionPattern=[%-d{yyyy-MM-dd HH\:mm\:ss}]-[%p]-[%t] %l\: %m%n
#debug log
log4j.appender.debug=org.apache.log4j.DailyRollingFileAppender
log4j.appender.debug.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.debug.File=logs/debug.log
log4j.appender.debug.Append=true
log4j.appender.debug.Threshold=DEBUG
log4j.appender.debug.layout=org.apache.log4j.PatternLayout
log4j.appender.debug.layout.ConversionPattern=[%-d{yyyy-MM-dd HH\:mm\:ss}]-[%p]-[%t] %l\: %m%n
#warn log
log4j.appender.warn=org.apache.log4j.DailyRollingFileAppender
log4j.appender.warn.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.warn.File=logs/warn.log
log4j.appender.warn.Append=true
log4j.appender.warn.Threshold=WARN
log4j.appender.warn.layout=org.apache.log4j.PatternLayout
log4j.appender.warn.layout.ConversionPattern=[%-d{yyyy-MM-dd HH\:mm\:ss}]-[%p]-[%t] %l\: %m%n
#error
log4j.appender.error = org.apache.log4j.DailyRollingFileAppender
log4j.appender.error.DatePattern='_'yyyy-MM-dd'.log'
log4j.appender.error.File = logs/error.log
log4j.appender.error.Append = true
log4j.appender.error.Threshold = ERROR
log4j.appender.error.layout = org.apache.log4j.PatternLayout
log4j.appender.error.layout.ConversionPattern = [%-d{yyyy-MM-dd HH\:mm\:ss}]-[%p]-[%t] %l\: %m%n
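With the configuration above on the classpath (together with the slf4j-log4j12 binding from the pom), the logging setup can be verified with a few lines of SLF4J code. The class name here is just an example:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingCheck {
    private static final Logger LOG = LoggerFactory.getLogger(LoggingCheck.class);

    public static void main(String[] args) {
        // Every appender on the rootLogger receives events at or above its
        // Threshold, so an ERROR message lands in the console as well as in
        // logs/debug.log, logs/info.log, logs/warn.log, and logs/error.log.
        LOG.debug("debug message");
        LOG.info("info message");
        LOG.warn("warn message");
        LOG.error("error message");
    }
}
```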
log4j2.xml configuration (use this instead if you log through Log4j 2)
Location: under the resources directory
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
<Properties>
<Property name="log_level" value="error" />
<Property name="log_dir" value="log" />
<Property name="log_pattern"
value="[%d{yyyy-MM-dd HH:mm:ss.SSS}] [%p] - [%t] %logger - %m%n" />
<Property name="file_name" value="test" />
<Property name="every_file_size" value="100 MB" />
</Properties>
<Appenders>
<Console name="Console" target="SYSTEM_OUT">
<PatternLayout pattern="${log_pattern}" />
</Console>
<RollingFile name="RollingFile"
fileName="${log_dir}/${file_name}.log"
filePattern="${log_dir}/$${date:yyyy-MM}/${file_name}-%d{yyyy-MM-dd}-%i.log">
<ThresholdFilter level="DEBUG" onMatch="ACCEPT"
onMismatch="DENY" />
<PatternLayout pattern="${log_pattern}" />
<Policies>
<SizeBasedTriggeringPolicy
size="${every_file_size}" />
<TimeBasedTriggeringPolicy modulate="true"
interval="1" />
</Policies>
<DefaultRolloverStrategy max="20" />
</RollingFile>
<RollingFile name="RollingFileErr"
fileName="${log_dir}/${file_name}-warnerr.log"
filePattern="${log_dir}/$${date:yyyy-MM}/${file_name}-%d{yyyy-MM-dd}-warnerr-%i.log">
<ThresholdFilter level="WARN" onMatch="ACCEPT"
onMismatch="DENY" />
<PatternLayout pattern="${log_pattern}" />
<Policies>
<SizeBasedTriggeringPolicy
size="${every_file_size}" />
<TimeBasedTriggeringPolicy modulate="true"
interval="1" />
</Policies>
</RollingFile>
</Appenders>
<Loggers>
<Root level="${log_level}">
<AppenderRef ref="Console" />
<AppenderRef ref="RollingFile" />
<AppenderRef ref="RollingFileErr" />
</Root>
</Loggers>
</Configuration>
Importing the Flink dependencies
Press Ctrl + Shift + Alt + S to open the Project Structure dialog
After importing the Flink dependencies, you can compile and run jobs locally in a simulated environment.
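To confirm that local compilation and execution work, a minimal streaming WordCount sketch against the Flink 1.13 DataStream API can be run directly from the IDE. The class name and sample input are made up for illustration:

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class WordCount {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("hello flink", "hello world")
           // Split each line into (word, 1) pairs
           .flatMap((String line, Collector<Tuple2<String, Long>> out) -> {
               for (String word : line.split(" ")) {
                   out.collect(Tuple2.of(word, 1L));
               }
           })
           // Lambdas lose generic type info, so declare it explicitly
           .returns(Types.TUPLE(Types.STRING, Types.LONG))
           .keyBy(t -> t.f0)
           .sum(1)
           .print();

        env.execute("WordCount");
    }
}
```

If the word counts appear in the console, the local mini-cluster started correctly and the `provided` dependencies are visible on the IDE classpath.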
Creating a new project
Build the project with Maven and select JDK 8.