1. Runtime environment

Component   Version
Ambari      2.7.6
Spark       2.3.2
HDFS        3.1.1
2. Error on job submission

Submitting the job with spark-submit fails with the following error:

java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
3. Solution

Before exporting the jar from IDEA, go to
File -> Project Structure -> Artifacts -> xxx.jar -> Output Layout
and remove hadoop-hdfs-x.x.x.jar from the output layout.

Alternatively, configure pom.xml to exclude the conflicting dependency when packaging:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>myTest</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <hadoop.version>3.2.1</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.0.0-preview2</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-hdfs</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
    </dependencies>
</project>
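A further option, not from the original post but a common Maven pattern for Spark jobs, is to mark the Hadoop artifacts with `provided` scope: they stay on the compile classpath but are left out of the packaged jar, so the cluster's own Hadoop jars (here HDFS 3.1.1) are the only ones on the runtime classpath. The artifact IDs and the `${hadoop.version}` property below mirror the pom above; this is a sketch, not the author's tested configuration:

```xml
<!-- Sketch: declare cluster-supplied Hadoop jars as "provided" so they are
     available at compile time but excluded from the application jar. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
```

With `provided` scope there is no need to prune hadoop-hdfs-x.x.x.jar from the IDEA artifact layout by hand, since Maven never places it in the output in the first place.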