
Spark-shell Failing But Pyspark Works

Hi,
I'm having trouble creating a StreamingContext with Scala using spark-shell. It tries to use the localhost interface, and the Application Master cannot reach the driver there:
ERROR ApplicationMaster: Failed to connect to driver at localhost:47257, retrying ...
I don't have this issue with Python: pyspark works fine (you can see it uses the IP address):
ApplicationMaster: Driver now available: 192.168.10.100:43290
I use similar code in both cases, though:
test.scala:
import org.apache.spark._
import org.apache.spark.streaming._

val app = "test-scala"
// run against YARN in client mode
val conf = new SparkConf().setAppName(app).setMaster("yarn-client")
// streaming context with a 3-second batch interval
val ssc = new StreamingContext(conf, Seconds(3))

command used: spark-shell -i test.scala
test.py:
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

app = "test-python"
# run against YARN in client mode
conf = SparkConf().setAppName(app).setMaster("yarn-client")
sc = SparkContext(conf=conf)
# streaming context with a 3-second batch interval
ssc = StreamingContext(sc, 3)

command used: pyspark test.py
Any idea why Scala can't instantiate it? I thought Python was mostly just calling the Scala code under the hood, but it seems there are differences. Are there any parameters set by the Scala path but not by the Python one?
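One thing I wonder about is the driver host. As a rough, untested sketch (the address 192.168.10.100 is just the one shown in the pyspark log above), would explicitly pinning spark.driver.host in the Scala SparkConf change anything?

import org.apache.spark._
import org.apache.spark.streaming._

val app = "test-scala"
val conf = new SparkConf()
  .setAppName(app)
  .setMaster("yarn-client")
  .set("spark.driver.host", "192.168.10.100")  // untested guess: force the driver address instead of localhost
val ssc = new StreamingContext(conf, Seconds(3))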
Thanks
Cyril SCETBON

asked Mar 31 2016 at 20:22
