[Question] dss_linkis_one-click_install_20230809 all-in-one package fails to run Spark SQL #5125

Closed
lichao0128 opened this issue May 29, 2024 · 2 comments
Labels
Question Further information is requested

Comments

@lichao0128

Before asking

Your environment

  • Linkis version used: 1.1.2
  • Environment name and version:
    • cdh-5.14.2
    • hdp-3.1.5
    • hive-2.1.1
    • spark-3.2.1
    • scala-2.12.2
    • jdk 1.8.0_121
    • Local environment Spark: 1.6.3

Describe your questions

Q1. The local environment's Spark is 1.6.3, and running Spark SQL fails.
Is it possible to specify in the Linkis configuration which Spark version Linkis should run against?
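
For reference, the Spark that Linkis launches its engine connectors against is normally decided at deploy time. A minimal sketch, assuming the standard one-click package exposes these settings in deploy-config/linkis-env.sh; the variable values and the /opt path are assumptions to adjust to your hosts:

```bash
# deploy-config/linkis-env.sh  (sketch; assumed one-click package layout)

# Point Linkis at the Spark installation the engine should actually use,
# rather than whatever spark-submit the host resolves to (1.6.3 here).
SPARK_HOME=/opt/spark-3.2.1-bin-hadoop3.2   # assumed path of the Spark 3.2.1 install
SPARK_CONF_DIR=${SPARK_HOME}/conf

# On HDP hosts the /usr/bin/spark-submit selector wrapper additionally reads
# this; leaving it unset triggers the "Spark1 will be picked by default"
# message shown in the log below. Only relevant if the Spark you want lives
# under the HDP selector rather than in a standalone directory.
export SPARK_MAJOR_VERSION=2
```

Whichever mechanism applies, the Spark that ends up on the engine host has to be built on the same Scala line as the Linkis engineconn jars; that mismatch is what the stack trace below and the reply at the end of this thread describe.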

Eureka service list

[screenshot of the Eureka service list]

Relevant logs or attached files

2024-05-29 10:45:27.045 ERROR jobRequest(IDE_linkis_spark_0) execute failed,21304, Task is Failed,errorMsg: errCode: 12003 ,desc: hdp03:9101_0 Failed to async get EngineNode AMErrorException: errCode: 30002 ,desc: ServiceInstance(linkis-cg-engineconn, hdpxx:46539) ticketID: 0bf51275-9a46-4d3b-be2a-0b19d620a670 Failed to initialize engine, reason: Failed to start EngineConn, reason: Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/appcom/tmp/engineConnPublickDir/aa5beb7a-33e1-4113-a59c-667a91afd881/v000001/lib/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.0-292/spark/lib/spark-assembly-1.6.3.2.6.5.0-292-hadoop2.7.3.2.6.5.0-292.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at org.apache.linkis.common.conf.CommonVars.<init>(CommonVars.scala:24)
at org.apache.linkis.common.conf.CommonVars$.apply(CommonVars.scala:58)
at org.apache.linkis.common.conf.CommonVars.apply(CommonVars.scala)
at org.apache.linkis.manager.label.conf.LabelCommonConfig.<clinit>(LabelCommonConfig.java:25)
at org.apache.linkis.manager.label.builder.factory.LabelBuilderFactoryContext.getLabelBuilderFactory(LabelBuilderFactoryContext.java:48)
at org.apache.linkis.engineconn.launch.EngineConnServer$.<init>(EngineConnServer.scala:63)
at org.apache.linkis.engineconn.launch.EngineConnServer$.<clinit>(EngineConnServer.scala)
at org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
You can go to this path(/appcom/tmp/linkis/20240529/spark/0bf51275-9a46-4d3b-be2a-0b19d620a670/logs) to find the reason or ask the administrator for help ,ip: hdp03 ,port: 9101 ,serviceKind: linkis-cg-linkismanager ,ip: hdp03 ,port: 9104 ,serviceKind: linkis-cg-entrance
2024-05-29 10:45:27.045 INFO Task creation time(任务创建时间): 2024-05-29 10:45:21, Task scheduling time(任务调度时间): 2024-05-29 10:45:22, Task start time(任务开始时间): 2024-05-29 10:45:22, Mission end time(任务结束时间): 2024-05-29 10:45:27
2024-05-29 10:45:27.045 INFO Task submit to Orchestrator time:2024-05-29 10:45:22, Task request EngineConn time:not request ec, Task submit to EngineConn time:not submit to ec
2024-05-29 10:45:27.045 INFO Your mission(您的任务) 8 The total time spent is(总耗时时间为): 5.9 s
2024-05-29 10:45:27.045 INFO Sorry. Your job completed with a status Failed. You can view logs for the reason.
2024-05-29 10:45:27.045 INFO job is completed.
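
For reference, the wrapper warning and the NoSuchMethodError above are consistent with each other: the HDP spark-submit wrapper fell back to the Spark 1.6.3 assembly (which bundles Scala 2.10), while the Linkis engineconn classes call scala.Product.$init$, a method that only exists from Scala 2.12 onwards. A quick way to check which Spark the engineconn host actually resolves, as a sketch (standard commands; the HDP path is an assumption):

```bash
# Run on the engineconn host (hdp03 in the log) as the user that starts the engine.

which spark-submit           # is this the HDP selector wrapper under /usr/bin?
spark-submit --version       # also prints "Using Scala version ...", i.e. the bundled Scala

# The wrapper falls back to Spark1 when this is unset, as the error above states.
echo "SPARK_MAJOR_VERSION=${SPARK_MAJOR_VERSION:-<unset>}"

# Which Spark installs exist on this host (assumed HDP layout).
ls -d /usr/hdp/*/spark* 2>/dev/null
```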

lichao0128 added the Question (Further information is requested) label on May 29, 2024

😊 Welcome to the Apache Linkis community!!

We are glad that you are contributing by opening this issue.

Please make sure to include all the relevant context.
We will be here shortly.

If you are interested in contributing to our website project, please let us know!
You can check out our contributing guide on
👉 How to Participate in Project Contribution.


@peacewong
Contributor

Hello, this looks like the default Scala version of Linkis not matching the Scala version bundled inside your Spark, which is what causes this error.
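
A sketch for confirming the mismatch on the engine host; the Spark path is derived from the assembly location in the log above, and the engineConnPublickDir path is likewise taken from the log (whether a scala-library jar sits in that lib directory depends on the engine plugin packaging):

```bash
# Scala bundled with the Spark that was actually picked up:
# a Spark 1.6.x install reports "Using Scala version 2.10.x" here.
/usr/hdp/2.6.5.0-292/spark/bin/spark-submit --version

# Scala the Linkis Spark engineconn ships with (publish dir from the SLF4J lines above);
# a scala-library-2.12.x.jar here would confirm the Scala 2.12 build.
ls /appcom/tmp/engineConnPublickDir/aa5beb7a-33e1-4113-a59c-667a91afd881/v000001/lib | grep -i scala
```

If the two sides disagree, the fix is to make the engine launch against a Spark whose Scala line matches the Linkis build (the default Spark 3.2.1 distribution ships Scala 2.12), or to rebuild Linkis for the Scala that the target Spark bundles.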
