
I'm trying to execute my first PySpark program in the PyCharm IDE and am facing the following exception.

from pyspark import SparkContext


def example():
    sc = SparkContext('local')
    words = sc.parallelize(["scala", "java", "hadoop", "spark", "akka"])
    print(sc.getConf().getAll())
    return words.count()


print(example())

It printed the following configuration:

[('spark.master', 'local'), ('spark.rdd.compress', 'True'), ('spark.serializer.objectStreamReset', '100'), ('spark.driver.port', '59627'), ('spark.executor.id', 'driver'), ('spark.submit.deployMode', 'client'), ('spark.app.id', 'local-1526547201037'), ('spark.driver.host', 'LAPTOP-DDRRK6SB'), ('spark.ui.showConsoleProgress', 'true'), ('spark.app.name', 'pyspark-shell')]

and then raised the following exception:

py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException

Sorry for my English. Can anyone explain what's wrong with the code?

  • which line fails? Commented May 17, 2018 at 9:08
  • Same works for me Commented May 17, 2018 at 9:08
  • words.count() — executing this line raises the exception Commented May 17, 2018 at 9:10
  • Works for me too. Commented May 17, 2018 at 9:15
  • Which Java / JDK version do you use? Commented May 17, 2018 at 9:36

1 Answer


I don't know the exact cause, but after rolling back Java to version 1.8.0_171, it works fine. Thanks Rumoku for your suggestion.
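To check which Java your PATH provides before creating a SparkContext, here is a minimal sketch (the helper names and the regex are my own, not part of Spark). Spark 2.x of that era supported Java 8, and running under a newer JDK could produce exactly this kind of IllegalArgumentException from Py4J:

```python
import re
import subprocess

def parse_java_major(version_line):
    """Extract the major Java version from the first line of `java -version`.

    Handles both the legacy scheme ('java version "1.8.0_171"' -> 8)
    and the newer scheme ('openjdk version "10.0.1"' -> 10).
    """
    m = re.search(r'"(\d+)(?:\.(\d+))?', version_line)
    if not m:
        raise ValueError("unrecognized version line: %r" % version_line)
    major = int(m.group(1))
    if major == 1 and m.group(2):  # legacy 1.x numbering
        major = int(m.group(2))
    return major

def current_java_major():
    """`java -version` prints to stderr; parse its first line."""
    out = subprocess.run(["java", "-version"],
                         capture_output=True, text=True).stderr
    return parse_java_major(out.splitlines()[0])

# parse_java_major('java version "1.8.0_171"')  -> 8
# parse_java_major('openjdk version "10.0.1"')  -> 10
```

If `current_java_major()` reports something newer than 8, pointing `JAVA_HOME` at a Java 8 installation (as the accepted fix above does) is the usual remedy for this Spark version.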
