spark-submit: Hive "SQL standards based authorization should not be enabled from hive cli" error
Author: 魯尼的小寶貝 · Updated: 2022-03-15

Recently I was submitting and running jobs with spark-submit and ran into all kinds of problems; I was stuck at the customer site for several days. It is finally solved.
This post only records my own troubleshooting process. Everyone's situation is different, so following these exact steps may not fix yours, but having one known-good path is still worth sharing.
The environment is Huawei's big data platform FusionInsight HD V100R002C70SPC200, with Kerberos as the authentication mechanism for Hive access. I hit this bizarre permission error:
.....................................................................................................
. uRule, is a Chinese style rule engine licensed under the Apache License 2.0, .
. which is opensource, easy to use,high-performance, with browser-based-designer. .
.....................................................................................................
resporityServerUrl:http://50.88.1.167:10009/urule/loadknowledge
2019-05-29 16:44:59,269 | WARN | main | In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN). | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,279 | WARN | main | Detected deprecated memory fraction settings: [spark.shuffle.memoryFraction, spark.storage.memoryFraction, spark.storage.unrollFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended). | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,302 | WARN | main | Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 50.88.1.166 instead (on interface eth0) | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
2019-05-29 16:44:59,302 | WARN | main | Set SPARK_LOCAL_IP if you need to bind to another address | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
executor sql :select * from default.person
2019-05-29 16:45:07,642 | ERROR | main | Error setting up authorization: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli | org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:743)
org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.assertHiveCliAuthDisabled(SQLStdHiveAuthorizationValidator.java:69)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.<init>(SQLStdHiveAuthorizationValidator.java:63)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory.createHiveAuthorizer(SQLStdHiveAuthorizerFactory.java:37)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:734)
at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizerV2(SessionState.java:1386)
at org.apache.spark.sql.hive.client.HiveClientImpl.org$apache$spark$sql$hive$client$HiveClientImpl$$checkMetastorePrivilege(HiveClientImpl.scala:567)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply$mcZ$sp(HiveClientImpl.scala:606)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$checkPrivilege$1.apply(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:307)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:246)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:245)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:292)
at org.apache.spark.sql.hive.client.HiveClientImpl.checkPrivilege(HiveClientImpl.scala:603)
at org.apache.spark.sql.hive.acl.HiveACLInterface.checkPrivilege(HiveACLInterface.scala:28)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:471)
at org.apache.spark.sql.hive.acl.PrivCheck$$anonfun$checkPlan$1.applyOrElse(PrivCheck.scala:62)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:290)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:188)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:305)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:287)
at org.apache.spark.sql.hive.acl.PrivCheck.checkPlan(PrivCheck.scala:62)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:41)
at org.apache.spark.sql.hive.acl.PrivCheck.apply(PrivCheck.scala:35)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$prepareForExecution$1.apply(QueryExecution.scala:132)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.execution.QueryExecution.prepareForExecution(QueryExecution.scala:132)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:122)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:125)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:125)
at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:2570)
at org.apache.spark.sql.Dataset.rdd(Dataset.scala:2567)
at org.poem.exectors.UruleOutlayExecutors$.run(UruleOutlayExecutors.scala:43)
at org.poem.SparkApp$.main(SparkApp.scala:19)
at org.poem.SparkApp.main(SparkApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:761)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:215)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:744)
	... (remaining 57 frames identical to the ERROR stack trace above) ...
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthzPluginException: SQL standards based authorization should not be enabled from hive cliInstead the use of storage based authorization in hive metastore is reccomended. Set hive.security.authorization.enabled=false to disable authz within cli
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.assertHiveCliAuthDisabled(SQLStdHiveAuthorizationValidator.java:69)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizationValidator.<init>(SQLStdHiveAuthorizationValidator.java:63)
at org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory.createHiveAuthorizer(SQLStdHiveAuthorizerFactory.java:37)
at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:734)
... 57 more
[root@localhost urule-azkaban-executor]#
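The key frame is SessionState.setupAuth: when Spark SQL's embedded Hive client initializes, Hive treats it as a "Hive CLI" session and refuses to instantiate the SQL-standards-based authorizer (SQLStdHiveAuthorizerFactory) on the client side. The message itself suggests disabling authorization in the client (from spark-submit that would look like --conf spark.hadoop.hive.security.authorization.enabled=false, since spark.hadoop.* properties are copied into the Hadoop configuration), though a Kerberos-secured FusionInsight cluster may ignore or forbid such an override. Note from the trace that the check only fires when Dataset.rdd forces the physical plan. A minimal sketch of the trigger path, assuming a SparkSession with Hive support (the body below is my reconstruction for illustration, not the project's actual code):

import org.apache.spark.sql.SparkSession

object ReproSketch {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport makes Spark create the embedded Hive client,
    // whose SessionState.setupAuth performs the authorization check above
    val spark = SparkSession.builder()
      .appName("hive-auth-repro")
      .enableHiveSupport()
      .getOrCreate()
    val df = spark.sql("select * from default.person")
    // .rdd forces plan preparation; on this platform the PrivCheck rule
    // runs here and calls HiveClientImpl.checkPrivilege, raising the error
    println(df.rdd.count())
    spark.stop()
  }
}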
I spent several days on this. Let's start with the project's original pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.poem</groupId>
<version>1.0.0-SNAPSHOT</version>
<artifactId>urule-aspark-executor</artifactId>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.4.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<properties>
<spark-version>2.3.1</spark-version>
<scala.version>2.11.12</scala.version>
<commons-collction.version>3.2.2</commons-collction.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<lombok.version>1.18.4</lombok.version>
<swagger2.version>2.9.2</swagger2.version>
<swaggerbootstrapui.version>1.8.9</swaggerbootstrapui.version>
</properties>
<dependencies>
<!--HDFS-->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.5</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!--SPARK-->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark-version}</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>${spark-version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>${spark-version}</version>
</dependency>
<!--log4j-->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.6.6</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.6.6</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.55</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-hbase-handler</artifactId>
<version>1.2.1</version>
</dependency>
<!-- argument parsing -->
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.72</version>
</dependency>
<!--hive jdbc-->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.2.1</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.15</version>
</dependency>
<!--urule-->
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-core</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-console</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-common</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<!--ANTLR 4-->
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr4-runtime</artifactId>
<version>4.7</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.0.2</version>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.1.4.RELEASE</version>
</plugin>
<plugin>
<!-- compiles the Java sources -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<scalaCompatVersion>${scala.version}</scalaCompatVersion>
</configuration>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>add-source</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>org.poem.SparkApp</mainClass>
<!-- main entry class -->
<addClasspath>true</addClasspath>
<!-- add dependency jars to the classpath -->
<classpathPrefix>lib/</classpathPrefix>
<useUniqueVersions>false</useUniqueVersions>
</manifest>
</archive>
</configuration>
</plugin>
<!-- skip unit tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
</plugins>
</build>
</project>
The vendor's sample job ran fine, but my own did not. Checking the differences one by one, I found the problem:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
This is where the problem was. The project uses Spring beans:
import java.io.IOException

import org.springframework.context.ApplicationContext
import org.springframework.context.annotation.AnnotationConfigApplicationContext
// KnowledgePackage and KnowledgeService come from the project's urule fork

/**
 * Initialize the Spring context and load a knowledge package.
 *
 * @param knowPackage the id of the knowledge package to load
 * @return the loaded KnowledgePackage, or null if loading fails
 */
def initBean(knowPackage: String): KnowledgePackage = {
  // scan the org.poem package for annotated beans
  val ctx: ApplicationContext = new AnnotationConfigApplicationContext("org.poem")
  val knowledgeService = ctx.getBean(KnowledgeService.BEAN_ID).asInstanceOf[KnowledgeService]
  var knowledge: KnowledgePackage = null
  try {
    knowledge = knowledgeService.getKnowledge(knowPackage)
  } catch {
    case e: IOException =>
      e.printStackTrace()
  }
  knowledge
}
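The Spring context above is why the build originally pulled in spring-boot-maven-plugin. Its repackage goal rewrites the jar into Spring Boot's executable layout: application classes move under BOOT-INF/classes, every dependency is embedded under BOOT-INF/lib, and everything is loaded through org.springframework.boot.loader.JarLauncher — the very launcher visible in the stack trace above. An illustrative listing (entry names below are examples of that layout, not copied from the real jar):

$ jar tf target/urule-aspark-executor-1.0.0-SNAPSHOT.jar | head
META-INF/MANIFEST.MF
org/springframework/boot/loader/JarLauncher.class
BOOT-INF/classes/org/poem/SparkApp.class
BOOT-INF/lib/spark-hive_2.11-2.3.1.jar
BOOT-INF/lib/hive-jdbc-1.2.1.jar
...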
spring-boot-maven-plugin's repackage goal bundles the whole Spring Boot runtime into the artifact: with it, the final jar was 200+ MB; after removing it, the jar is only about 70 KB. With the repackaged jar, the authorization check fails as shown above; I have not dug into exactly why. The project uses the uRule rule engine with our own modifications, which I won't go into here. The final pom.xml is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.poem</groupId>
<version>1.0.0-SNAPSHOT</version>
<artifactId>urule-aspark-executor</artifactId>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.4.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<spark-version>2.3.1</spark-version>
<scala.version>2.11.12</scala.version>
<commons-collction.version>3.2.2</commons-collction.version>
<lombok.version>1.18.4</lombok.version>
<spring.version>5.1.4.RELEASE</spring.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.esotericsoftware.kryo</groupId>
<artifactId>kryo</artifactId>
<version>2.21</version>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>2.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<!--urule-->
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-core</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-console</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.poem</groupId>
<artifactId>urule-common</artifactId>
<version>1.0.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.15</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.1.0</version>
<scope>provided</scope>
</dependency>
<!-- spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${spring.version}</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.0.2</version>
</plugin>
<plugin>
<!-- compiles the Java sources -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<scalaCompatVersion>${scala.version}</scalaCompatVersion>
</configuration>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>add-source</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<!-- bind the shade goal explicitly; without it this execution is a no-op -->
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- skip unit tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
</plugins>
</build>
</project>
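With Spark and Hadoop now marked provided and no Spring Boot repackaging, the jar is launched through the cluster's own Spark client instead of JarLauncher, so spark-submit supplies the Spark, Hadoop and Hive classes at runtime; assuming the shade goal is bound as above, the urule and Spring dependencies travel inside the jar. A sketch of the submit command on a Kerberos-enabled YARN cluster (the class name comes from the pom above; principal, keytab and jar path are placeholders):

spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.poem.SparkApp \
  --principal spark_user@HADOOP.COM \
  --keytab /opt/client/user.keytab \
  urule-aspark-executor-1.0.0-SNAPSHOT.jar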
Original post: https://blog.csdn.net/poem_2010/article/details/90720916