eclipse - Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$


header 1 # Imported Spark code gets build errors when run in Eclipse, but works fine from the terminal

header 2

/* SampleApp.scala: application counts the number of lines that contain "bash" */

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val txtFile = "file:///home/edureka/desktop/readme.txt"
    val conf = new SparkConf().setMaster("local[2]").setAppName("Sample Application")
    val sc = new SparkContext(conf)
    // Read the file with 2 partitions and cache it, since it is used again below
    val txtFileLines = sc.textFile(txtFile, 2).cache()
    val numAs = txtFileLines.filter(line => line.contains("bash")).count()
    println("Lines with bash: %s".format(numAs))
  }
}

header 3

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/edureka/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/edureka/spark-1.1.1/assembly/target/scala-2.10/spark-assembly-1.1.1-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/08/16 17:00:16 WARN util.Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.211.130 instead (on interface eth2)
15/08/16 17:00:16 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/08/16 17:00:16 INFO spark.SecurityManager: Changing view acls to: edureka
15/08/16 17:00:16 INFO spark.SecurityManager: Changing modify acls to: edureka
15/08/16 17:00:16 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(edureka); users with modify permissions: Set(edureka)
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:152)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
    at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:191)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:230)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1504)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1495)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
    at SimpleApp$.main(SampleApp.scala:14)
    at SimpleApp.main(SampleApp.scala)
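A NoSuchMethodError on scala.collection.immutable.HashSet$.empty() is the classic symptom of a Scala binary-version mismatch: the log shows a Spark 1.1.1 assembly built for Scala 2.10, so the Eclipse project must compile against Scala 2.10 as well, not a different Scala version the IDE may default to. As a minimal sketch, assuming an sbt build (the project name and exact 2.10.x patch release are illustrative), pinning matching versions looks like this:

// build.sbt: versions matched to the assembly named in the log;
// the project name and the 2.10 patch level are assumptions
name := "sample-app"

scalaVersion := "2.10.4"

// %% appends the Scala binary version, resolving spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"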

Be careful, this kind of problem happens quite often with Spark. If you don't want other surprises, you can build Spark yourself against the exact versions of the dependencies you may be using (Guava, log4j, Scala, Jackson). Also, consider using the spark.driver.userClassPathFirst and spark.executor.userClassPathFirst properties in order to make your own classpath take priority over the Spark bundled dependencies. This worked for me when passing them as parameters to spark-submit, as sketched below; it did not work when setting them in SparkConf (which makes sense).
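A minimal sketch of that spark-submit invocation (the jar path and main class are assumptions based on the code above, and the two userClassPathFirst properties are only honored by Spark releases that ship them):

spark-submit \
  --class SimpleApp \
  --master local[2] \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  target/scala-2.10/sample-app_2.10-1.0.jar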

Even with these properties set to true, you may still have problems, because Spark uses a separate classloader, which can lead to issues even when your dependencies carry the same version number. In that case, manually building Spark allows you to fix it (to my knowledge); see the sketch below.
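For reference, a manual build against the Hadoop version visible in the log (the hadoop2.2.0 assembly) would look roughly like this for a Spark 1.1.x source tree; the profile and version flags are assumptions and depend on your environment:

# run from the Spark 1.1.1 source directory
mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package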

