Re: "Issues" mailing is not available?

2020-10-01 Thread Felix Cheung
Maybe it is moderation? Though I don't think I've seen any mail pending
moderation.

But generally we can have discussion in dev@

issues@ in a lot of projects is dedicated only to JIRA updates (which can
be a lot of mail)


On Thu, Oct 1, 2020 at 7:38 PM Jia Yu  wrote:

> Hello Felix,
>
> Several users reported that issues@sedona.apache.org is not a valid
> email address, or that their emails didn't appear on the mailing list. But it
> looks like I can still post emails to this mailing list:
> https://lists.apache.org/list.html?issues@sedona.apache.org
>
> Do you know whether there is anything wrong with the mailing list?
>
> Thanks,
> Jia
>
>


Re: Using GeoSpark with pip installed pyspark

2020-10-01 Thread Yutian Pang
Hello,

I'm forwarding my email to this address since issues@sedona.apache.org
is an invalid address.

Best,
Yutian

On Thu, Oct 1, 2020 at 4:35 PM Yutian Pang  wrote:

> Hello,
>
> I encountered a problem while trying to install GeoSpark and upload the
> Java packages from GeoSpark to Spark via upload_jars(). The error
> message is:
>
> ---------------------------------------------------------------------------
> ValueError                                Traceback (most recent call last)
> <ipython-input-...> in <module>
> ----> 1 upload_jars()
>       2 GeoSparkRegistrator.registerAll(spark)
>
> ~/anaconda3/lib/python3.7/site-packages/geospark/register/uploading.py in upload_jars()
>      37     module_path = get_module_path(get_abs_path())
>      38     upload_jars_based_on_spark_version(module_path)
> ---> 39     findspark.init()
>      40     return True
>
> ~/anaconda3/lib/python3.7/site-packages/findspark.py in init(spark_home, python_path, edit_rc, edit_profile)
>     127
>     128     if not spark_home:
> --> 129         spark_home = find()
>     130
>     131     if not python_path:
>
> ~/anaconda3/lib/python3.7/site-packages/findspark.py in find()
>      34     if not spark_home:
>      35         raise ValueError(
> ---> 36             "Couldn't find Spark, make sure SPARK_HOME env is set"
>      37             " or Spark is in an expected location (e.g. from homebrew installation)."
>      38         )
>
> ValueError: Couldn't find Spark, make sure SPARK_HOME env is set or Spark is
> in an expected location (e.g. from homebrew installation).
>
>
> The problem is that the findspark function cannot find SPARK_HOME from
> ~/.bashrc.
>
> I installed pyspark with pip install, which didn't require me to set any
> global environment variables during installation.
>
> And this problem still exists even after I manually set
> SPARK_HOME=/home/ypang6/anaconda3/lib/python3.7/site-packages/pyspark in
> ~/.bashrc on my machine.
>
> I'm confused about how I should set this up so that the Java packages
> load. Could you please give me a hint?
>
> Best regards,
> Yutian
>
>
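
For anyone hitting the same error: pip-installed pyspark bundles a full
Spark distribution inside the package itself, so one workaround is to set
SPARK_HOME programmatically to the package directory before upload_jars()
runs. That way findspark does not depend on ~/.bashrc, which Jupyter
kernels and non-login shells may never source. A minimal sketch, assuming
the geospark 1.x API (upload_jars and GeoSparkRegistrator from
geospark.register); adjust names to your setup:

import os
import pyspark  # pip-installed pyspark ships its own Spark distribution

# Point SPARK_HOME at the installed package directory instead of relying
# on ~/.bashrc, which the launching process may never source.
os.environ["SPARK_HOME"] = os.path.dirname(pyspark.__file__)

import findspark
findspark.init()  # now resolves Spark via the SPARK_HOME set above

from pyspark.sql import SparkSession
from geospark.register import upload_jars, GeoSparkRegistrator

upload_jars()  # should no longer raise ValueError from findspark
spark = SparkSession.builder.appName("geospark-test").getOrCreate()
GeoSparkRegistrator.registerAll(spark)

Exporting SPARK_HOME in the shell works too, but only if the process that
launches Python actually sources ~/.bashrc; setting it in-process as above
avoids that dependency.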