spark-submit: "SQL standards based authorization should not be enabled from hive cli"
Author: 鲁尼的小宝贝 · Updated: 2022-03-15

Recently, while submitting and running jobs with spark-submit, I ran into all sorts of problems and was stuck at the customer site for several days. They are finally solved.
This post only documents my own troubleshooting. Everyone's setup is different, so following these steps verbatim may not fix your case, but having one known-good path is still worth something.
The environment is Huawei's big-data platform FusionInsight HD V100R002C70SPC200, with Kerberos as the authentication mechanism for Hive access, and we hit a bizarre permission error.
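For reference, the job was submitted roughly along these lines (a sketch: the main class matches the stack trace below, but the jar name, principal, and keytab path are placeholders, not taken from the original setup):

spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.poem.SparkApp \
  --principal spark_user@HADOOP.COM \
  --keytab /opt/keytabs/spark_user.keytab \
  urule-azkaban-executor.jar

The run then fails with the output below: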
.....................................................................................................
. uRule, is a Chinese style rule engine licensed under the Apache License 2.0, .
. which is opensource, easy to use,high-performance, with browser-based-designer. .
.....................................................................................................
resporityServerUrl:http://50.88.1.167:10009/urule/loadknowledge
2019-05-29 16:44:59,269 | WARN | main | In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN). | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,279 | WARN | main | Detected deprecated memory fraction settings: [spark.shuffle.memoryFraction, spark.storage.memoryFraction, spark.storage.unrollFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended). | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,302 | WARN | main | Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 50.88.1.166 instead (on interface eth0) | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,302 | WARN | main | Set SPARK_LOCAL_IP if you need to bind to another address | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
executor sql :select * from default.person
2019-05-29 16:45:07,642 | ERROR | main | Error setting up authorization: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli | org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:743)
org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.assertHiveCliAuthDisabled(SQLStdHiveAuthorizationValidator.java:69)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.<init>(SQLStdHiveAuthorizationValidator.java:63)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory.createHiveAuthorizer(SQLStdHiveAuthorizerFactory.java:37)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:734)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizerV2(SessionState.java:1386)
at org.apache.spark.sql.hive.client.HiveClientImpl.org$apache$spark$sql$hive$client$HiveClientImpl$$checkMetastorePrivilege(HiveClientImpl.scala:567)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply$mcZ$sp(HiveClientImpl.scala:606)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:307)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:246)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:245)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:292)
at org.apache.spark.sql.hive.client.HiveClientImpl.checkPrivilege(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.acl.HiveACLInterface.checkPrivilege(HiveACLInterface.scala:28)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:471)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:62)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:188)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:305)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:287)
at org.apache.spark.sql.hive.acl.PrivCheck.checkPlan(PrivCheck.scala:62)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:41)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:35)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.execution.QueryExecution.prepareForExecution(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:125)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:125)
at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:2570)
at org.apache.spark.sql.Dataset.rdd(Dataset.scala:2567)
at org.poem.exectors.UruleOutlayExecutors$.run(UruleOutlayExecutors.scala:43)
at org.poem.SparkApp$.main(SparkApp.scala:19)
at org.poem.SparkApp.main(SparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:761)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:215)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:744)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizerV2(SessionState.java:1386)
at org.apache.spark.sql.hive.client.HiveClientImpl.org$apache$spark$sql$hive$client$HiveClientImpl$$checkMetastorePrivilege(HiveClientImpl.scala:567)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply$mcZ$sp(HiveClientImpl.scala:606)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:307)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:246)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:245)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:292)
at org.apache.spark.sql.hive.client.HiveClientImpl.checkPrivilege(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.acl.HiveACLInterface.checkPrivilege(HiveACLInterface.scala:28)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:471)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:62)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:188)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:305)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:287)
at org.apache.spark.sql.hive.acl.PrivCheck.checkPlan(PrivCheck.scala:62)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:41)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:35)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.execution.QueryExecution.prepareForExecution(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:125)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:125)
at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:2570)
at org.apache.spark.sql.Dataset.rdd(Dataset.scala:2567)
at org.poem.exectors.UruleOutlayExecutors$.run(UruleOutlayExecutors.scala:43)
at org.poem.SparkApp$.main(SparkApp.scala:19)
at org.poem.SparkApp.main(SparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:761)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:215)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.assertHiveCliAuthDisabled(SQLStdHiveAuthorizationValidator.java:69)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.<init>(SQLStdHiveAuthorizationValidator.java:63)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory.createHiveAuthorizer(SQLStdHiveAuthorizerFactory.java:37)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:734)
... 57 more
[root@localhost urule-azkaban-executor]#
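Before touching the build, note the knob the exception itself points at: hive.security.authorization.enabled. Disabling it on the client side would mean a hive-site.xml entry like the sketch below. We did not take this route, since the platform is supposed to enforce SQL-standard authorization, but it is the workaround the message suggests:

<property>
    <!-- Setting named in the exception message; not the fix adopted here -->
    <name>hive.security.authorization.enabled</name>
    <value>false</value>
</property>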
It took several days to sort out. Start with the project's original pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.poem</groupId>
<version>1.0.0-SNAPSHOT</version>
<artifactId>urule-aspark-executor</artifactId>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.4.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<properties>
<spark-version>2.3.1</spark-version>
<scala.version>2.11.12</scala.version>
<commons-collection.version>3.2.2</commons-collection.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<lombok.version>1.18.4</lombok.version>
<swagger2.version>2.9.2</swagger2.version>
<swaggerbootstrapui.version>1.8.9</swaggerbootstrapui.version>
</properties>
<dependencies>
<!--HDFS-->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.5</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!--SPARK-->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark-version}</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>${spark-version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>${spark-version}</version>
</dependency>
<!--log4j-->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.6.6</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.6.6</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.55</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-hbase-handler</artifactId>
<version>1.2.1</version>
</dependency>
<!-- command-line argument parsing -->
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.72</version>
</dependency>
<!--hive jdbc-->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.2.1</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.15</version>
</dependency>
<!--urule-->
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-core</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-console</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-common</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<!--ANTLR 4-->
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr4-runtime</artifactId>
<version>4.7</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.0.2</version>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.1.4.RELEASE</version>
</plugin>
<plugin>
<!-- compiles the Java sources -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<scalaCompatVersion>${scala.version}</scalaCompatVersion>
</configuration>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>add-source</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>org.poem.SparkApp</mainClass>
<!-- main entry class -->
<addClasspath>true</addClasspath>
<!-- add dependency jars to the manifest classpath -->
<classpathPrefix>lib/</classpathPrefix>
<useUniqueVersions>false</useUniqueVersions>
</manifest>
</archive>
</configuration>
</plugin>
<!-- skip unit tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
</plugins>
</build>
</project>
The vendor-supplied sample ran fine, but my own program did not. Going through the build piece by piece, I found the problem:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
This is exactly where the problem lies. The project uses Spring beans:
import java.io.IOException
import org.springframework.context.ApplicationContext
import org.springframework.context.annotation.AnnotationConfigApplicationContext
// KnowledgeService and KnowledgePackage come from the project's urule fork

/**
 * Initialize the Spring context and load a knowledge package.
 *
 * @param knowPackage the knowledge-package identifier to load
 * @return the loaded KnowledgePackage, or null if loading fails
 */
def initBean(knowPackage: String): KnowledgePackage = {
  // Scan the org.poem package for annotated beans
  val ctx: ApplicationContext = new AnnotationConfigApplicationContext("org.poem")
  val knowledgeService = ctx.getBean(KnowledgeService.BEAN_ID).asInstanceOf[KnowledgeService]
  var knowledge: KnowledgePackage = null
  try {
    knowledge = knowledgeService.getKnowledge(knowPackage)
  } catch {
    case e: IOException =>
      e.printStackTrace()
  }
  knowledge
}
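A note on why repackaging plausibly interferes here: Spring Boot's repackage goal rewrites the jar so that application classes live under BOOT-INF/classes and dependencies under BOOT-INF/lib, loadable only through Boot's own launcher, and the org.springframework.boot.loader.JarLauncher frames in the stack trace above show that launcher sitting between spark-submit and our main class. If you ever need to keep the Boot parent but stop the repackaging, the plugin has a skip switch; a minimal sketch of that alternative (we did not use it, since we removed the plugin entirely):

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <!-- Hypothetical alternative to deleting the plugin:
             keep it declared but skip the repackage goal. -->
        <skip>true</skip>
    </configuration>
</plugin>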
The job of spring-boot-maven-plugin is to produce Spring Boot's executable fat jar: at package time it bundles all of that machinery and every dependency into the artifact, and our final jar weighed in at 200+ MB. With the plugin removed, the jar comes out at roughly 70 KB. With the plugin in place, the authorization failure above appears; I have not dug into the exact mechanism. The project also embeds the urule rule engine with our own modifications, which I won't go into here. The final pom.xml is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.poem</groupId>
<version>1.0.0-SNAPSHOT</version>
<artifactId>urule-aspark-executor</artifactId>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.4.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<spark-version>2.3.1</spark-version>
<scala.version>2.11.12</scala.version>
<commons-collection.version>3.2.2</commons-collection.version>
<lombok.version>1.18.4</lombok.version>
<spring.version>5.1.4.RELEASE</spring.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.esotericsoftware.kryo</groupId>
<artifactId>kryo</artifactId>
<version>2.21</version>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>2.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<!--urule-->
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-core</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-console</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-common</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.15</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<!-- spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${spring.version}</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.0.2</version>
</plugin>
<plugin>
<!-- compiles the Java sources -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<scalaCompatVersion>${scala.version}</scalaCompatVersion>
</configuration>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>add-source</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<!-- without an explicit goal this execution would be a no-op -->
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- skip unit tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
</plugins>
</build>
</project>
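After rebuilding, a quick way to confirm the artifact is no longer a Boot fat jar (the jar name is assumed from the module's artifactId and version):

jar tf target/urule-aspark-executor-1.0.0-SNAPSHOT.jar | grep -E 'springframework/boot/loader|BOOT-INF'
# no output means the Boot launcher and nested BOOT-INF layout are gone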
Original post: https://blog.csdn.net/poem_2010/article/details/90720916