java.lang.ClassNotFoundException: org.apache.spark.sql.ForeachWriter

When running a Scala file that uses Spark's ForeachWriter type, I get the following stack trace:

  

    (run-main-0) java.lang.NoClassDefFoundError: org/apache/spark/sql/ForeachWriter
    java.lang.NoClassDefFoundError: org/apache/spark/sql/ForeachWriter
            at main.scala.Collect.main (Collect.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke (Method.java:498)
    Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.ForeachWriter
            at java.net.URLClassLoader.findClass (URLClassLoader.java:381)
            at java.lang.ClassLoader.loadClass (ClassLoader.java:424)
            at java.lang.ClassLoader.loadClass (ClassLoader.java:357)
            at main.scala.Collect.main (Collect.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke (Method.java:498)
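For reference, this is roughly how the type is used (a minimal sketch, not my actual code; it assumes spark-sql 2.1.0 is on the compile classpath, which is why the code compiles but fails only at runtime):

```scala
import org.apache.spark.sql.ForeachWriter

// Minimal ForeachWriter sketch (Spark 2.1 API): the class whose absence at
// runtime triggers the NoClassDefFoundError above. It compiles against the
// "provided" spark-sql jar, but the class must also exist at run time.
class PrintSink extends ForeachWriter[String] {
  override def open(partitionId: Long, version: Long): Boolean = true
  override def process(value: String): Unit = println(value)
  override def close(errorOrNull: Throwable): Unit = ()
}
```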

In my build.sbt I have the following added to libraryDependencies:

    name := "SentimentAnalysis"

version := "2.0.0"

scalaVersion := "2.11.4"//"2.10.4"//

libraryDependencies ++= {
  val sparkVer = "2.1.0"//"1.6.1"//
  Seq(
    "org.apache.spark"     %% "spark-core"              % sparkVer % "provided" withSo
urces(),
    "org.apache.spark"     %% "spark-mllib"             % sparkVer % "provided" withSo
urces(),
    "org.apache.spark"     %% "spark-sql"               % sparkVer % "provided" withSo
urces(),
    "org.apache.spark"     %% "spark-streaming"         % sparkVer % "provided" withSo
urces(),
    "org.apache.spark"     %% "spark-streaming-kafka-0-10" % sparkVer withSources(),
    "org.apache.spark"     %% "spark-sql-kafka-0-10" % sparkVer withSources(),
    "org.apache.kafka"     %% "kafka" % "0.10.0" withSources(),
    "com.typesafe" % "config" % "1.3.1",
    "com.google.code.gson" % "gson" % "2.8.0"
  )
}


assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", xs @ _*)      => MergeStrategy.first
  case PathList("javax", "xml", xs @ _*)      => MergeStrategy.first
  case PathList("com", "esotericsoftware", xs @ _*)      => MergeStrategy.first
  case PathList("com", "google", xs @ _*)      => MergeStrategy.first
  case x =>
    // fall back to the previously defined (default) merge strategy
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
    
asked by David Sandoval 27.04.2018 at 00:29

0 answers