1. How to fix the Spark Shell failing to start because of a Scala compiler error
After building Spark with SBT, the bundled examples run fine, but trying to launch spark-shell fails with the error below:
D:\Scala\spark\bin\spark-shell.cmd
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/Scala/spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/Scala/spark/tools/target/scala-2.10/spark-tools-assembly-0.9.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/04/03 20:40:43 INFO HttpServer: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/04/03 20:40:43 INFO HttpServer: Starting HTTP Server
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
14/04/03 20:40:44 WARN SparkILoop$SparkILoopInterpreter: Warning: compiler accessed before init set up. Assuming no postInit code.
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:197)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:919)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:876)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:876)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:968)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
Still no solution. The only lead was a question in the SBT FAQ: http://www.scala-sbt.org/release/docs/faq#how-do-i-use-the-scala-interpreter-in-my-code. It explains how to change the setting when embedding the interpreter in your own code, which obviously does not fit my case.
Searching on, I noticed that this error message only exists since Scala 2.8. The reason is that a proposal about the compiler/interpreter classpath had been accepted: "Default compiler/interpreter classpath in a managed environment".
Continuing the Google search, one article caught my attention: "Object Scala Found". It finally gave a workaround:
"However, a working command can be recovered, like so:
$ jrunscript -Djava.class.path=scala-library.jar -Dscala.usejavacp=true -classpath scala-compiler.jar -l scala"
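The gist of the recovered command can be sketched as follows. This is a hypothetical illustration, not the exact command the article uses: when the REPL is launched on a plain JVM, the Scala runtime is only on the *java* classpath, and the -Dscala.usejavacp=true system property tells the compiler mirror to look there (scala.tools.nsc.MainGenericRunner is the main class behind the ordinary scala launcher; the jar names here are placeholders).

```shell
# Assemble the kind of java invocation the workaround relies on (paths assumed).
CLASSPATH="scala-library.jar:scala-compiler.jar"
JAVA_OPTS="-Dscala.usejavacp=true"
CMD="java $JAVA_OPTS -cp $CLASSPATH scala.tools.nsc.MainGenericRunner"

# Print the assembled command instead of running it, since the jars are placeholders.
echo "$CMD"
```

The key point is only the property: without -Dscala.usejavacp=true, the interpreter refuses to fall back to the java classpath, which is exactly the "object scala.runtime in compiler mirror not found" failure above.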
So I edited \bin\spark-class2.cmd accordingly:
rem Set JAVA_OPTS to be able to load native libraries and to set heap size
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Dscala.usejavacp=true -Xms%SPARK_MEM% -Xmx%SPARK_MEM%
rem Attention: when changing the way the JAVA_OPTS are assembled, the change must be reflected in ExecutorRunner.scala!
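As an alternative to editing the launch script, Spark of this era also reads the SPARK_JAVA_OPTS environment variable and folds it into the JVM options for launched classes. Whether spark-class2.cmd on Windows picks it up for the REPL depends on your exact script version, so this is an assumption to verify against your copy; the sketch below only shows setting the variable.

```shell
# Untested alternative (assumes the launch script derives OUR_JAVA_OPTS from
# SPARK_JAVA_OPTS, as the Unix spark-class does): pass the property through
# the environment instead of patching spark-class2.cmd.
export SPARK_JAVA_OPTS="-Dscala.usejavacp=true"
echo "$SPARK_JAVA_OPTS"
```

If it works, this keeps the fix out of version-controlled scripts; on Windows cmd the equivalent would be `set SPARK_JAVA_OPTS=-Dscala.usejavacp=true` before running spark-shell.cmd.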
The newly added parameter is -Dscala.usejavacp=true (highlighted in red in the original post). Run \bin\spark-shell.cmd again:
D:\>D:\Scala\spark\bin\spark-shell.cmd
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/Scala/spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/Scala/spark/tools/target/scala-2.10/spark-tools-assembly-0.9.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/04/03 22:18:41 INFO HttpServer: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/04/03 22:18:41 INFO HttpServer: Starting HTTP Server
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.9.0
      /_/

Using Scala version 2.10.3 (Java HotSpot(TM) Client VM, Java 1.6.0_10)
Type in expressions to have them evaluated.
Type :help for more information.
14/04/03 22:19:12 INFO Slf4jLogger: Slf4jLogger started
14/04/03 22:19:13 INFO Remoting: Starting remoting
14/04/03 22:19:16 INFO Remoting: Remoting started; listening on addresses: [akka.tcp://spark@Choco-PC:5960]
14/04/03 22:19:16 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@Choco-PC:5960]
14/04/03 22:19:16 INFO SparkEnv: Registering BlockManagerMaster
14/04/03 22:19:17 INFO DiskBlockManager: Created local directory at C:\Users\Choco\AppData\Local\Temp\spark-local-20140403221917-7172
14/04/03 22:19:17 INFO MemoryStore: MemoryStore started with capacity 304.8 MB.
14/04/03 22:19:18 INFO ConnectionManager: Bound socket to port 5963 with id = ConnectionManagerId(Choco-PC,5963)
14/04/03 22:19:18 INFO BlockManagerMaster: Trying to register BlockManager
14/04/03 22:19:18 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager Choco-PC:5963 with 304.8 MB RAM
14/04/03 22:19:18 INFO BlockManagerMaster: Registered BlockManager
14/04/03 22:19:18 INFO HttpServer: Starting HTTP Server
14/04/03 22:19:18 INFO HttpBroadcast: Broadcast server started at http://192.168.1.100:5964
14/04/03 22:19:18 INFO SparkEnv: Registering MapOutputTracker
14/04/03 22:19:18 INFO HttpFileServer: HTTP File server directory is C:\Users\Choco\AppData\Local\Temp\spark-e122cfe9-2d62-4a47-920c-96b54e4658f6
14/04/03 22:19:18 INFO HttpServer: Starting HTTP Server
14/04/03 22:19:22 INFO SparkUI: Started Spark Web UI at http://Choco-PC:4040
14/04/03 22:19:22 INFO Executor: Using REPL class URI: http://192.168.1.100:5947
Created spark context..
Spark context available as sc.

scala> :quit
Stopping spark context.
14/04/03 23:05:21 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
14/04/03 23:05:21 INFO ConnectionManager: Selector thread was interrupted!
14/04/03 23:05:21 INFO ConnectionManager: ConnectionManager stopped
14/04/03 23:05:21 INFO MemoryStore: MemoryStore cleared
14/04/03 23:05:21 INFO BlockManager: BlockManager stopped
14/04/03 23:05:21 INFO BlockManagerMasterActor: Stopping BlockManagerMaster
14/04/03 23:05:21 INFO BlockManagerMaster: BlockManagerMaster stopped
14/04/03 23:05:21 INFO SparkContext: Successfully stopped SparkContext
14/04/03 23:05:21 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
14/04/03 23:05:21 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
Good. Opening http://Choco-PC:4040 in a browser now shows Spark's status, environment, executors, and other information.
This fix may only apply to my particular setup. If the problem persists for you, look for further related material.