I use PyCharm. Mind if I ask you to elaborate on what you did, step by step?

On Sat, Jun 16, 2018 at 12:11 AM, Marcelo Vanzin <van...@cloudera.com.invalid>
wrote:

> I'm not familiar with PyCharm. But if you can run "pyspark" from the
> command line and not hit this, then this might be an issue with
> PyCharm or your environment - e.g. having an old version of the
> pyspark code around, or maybe PyCharm itself might need to be updated.
>
> On Thu, Jun 14, 2018 at 10:01 PM, Aakash Basu
> <aakash.spark....@gmail.com> wrote:
> > Hi,
> >
> > I downloaded the latest Spark version because of the fix for "ERROR
> > AsyncEventQueue:70 - Dropping event from queue appStatus."
> >
> > After setting the environment variables and running the same code in
> > PyCharm, I'm getting this error, for which I can't find a solution:
> >
> > Exception in thread "main" java.util.NoSuchElementException: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH
> >     at scala.collection.MapLike$class.default(MapLike.scala:228)
> >     at scala.collection.AbstractMap.default(Map.scala:59)
> >     at scala.collection.MapLike$class.apply(MapLike.scala:141)
> >     at scala.collection.AbstractMap.apply(Map.scala:59)
> >     at org.apache.spark.api.python.PythonGatewayServer$.main(PythonGatewayServer.scala:64)
> >     at org.apache.spark.api.python.PythonGatewayServer.main(PythonGatewayServer.scala)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:498)
> >     at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
> >     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
> >     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
> >     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
> >     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
> >     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >
> >
> > Any help?
> >
> > Thanks,
> > Aakash.
>
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
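Regarding Marcelo's suggestion that an old copy of the pyspark code may be lying around: one way to check this from PyCharm's Python console is to ask the interpreter where it is actually importing pyspark from, and compare that path against the new Spark install. This is only a diagnostic sketch, not part of the thread's original advice; the `locate` helper below is a hypothetical name.

```python
import importlib.util

def locate(module_name):
    """Return the file path of a module as this interpreter sees it, or None."""
    spec = importlib.util.find_spec(module_name)
    return spec.origin if spec else None

# Run this inside PyCharm's interpreter: if the printed path points at an
# older Spark install (or a stale pip-installed pyspark) rather than the
# python/ directory of the freshly downloaded Spark, that mismatch can
# produce errors like the missing _PYSPARK_DRIVER_CONN_INFO_PATH key.
print(locate("pyspark") or "pyspark not importable from this interpreter")
```

If the path is wrong, pointing the PyCharm project interpreter (or PYTHONPATH) at the new install's pyspark is the likely fix.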