Re: [DISCUSS] Put all GeoTools jars into a package on Maven Central

2021-02-12 Thread Felix Cheung
LGPL is Category X - it wouldn’t be something Apache Sedona should distribute
or depend on:
https://www.apache.org/legal/resolved.html#optional


On Thu, Feb 11, 2021 at 11:59 PM Jia Yu  wrote:

> OSGeo's LocationTech owns GeoTools. I am thinking whether I should have my
> wrapper on Maven Central to bring those Sedona-required GeoTools jars to
> Maven Central. Since it is LGPL, it might be OK to do so.
>
> On Thu, Feb 11, 2021 at 5:18 PM Felix Cheung 
> wrote:
>
> > Who owns or manages GeoTools if it is LGPL?
> >
> > On Thu, Feb 11, 2021 at 12:01 PM Jia Yu  wrote:
> >
> >> Pawel,
> >>
> >> The Python-adapter module is always used by users, but it does not come
> >> with GeoTools. To use it, users have to (1) compile the source code of
> >> Python-adapter, or (2) add the GeoTools coordinates from the OSGEO repo via
> >> config(""), or (3) download and copy the GeoTools jars to SPARK_HOME/jars/
> >>
> >> The easiest is (2), but it looks like it may not work in all environments
> >> since it needs to search the OSGEO repo.
> >>
> >> What I am saying is that if we "move" the GeoTools jars to Maven Central,
> >> Method 2 will work 100% of the time; users just need to add the
> >> "sedona-python-adapter-1.0.0-incubating" and "geotools-24-wrapper-1.0.0"
> >> coordinates in code.
> >>
> >> Do you think this is necessary?
> >>
> >> On Thu, Feb 11, 2021 at 11:40 AM Paweł Kociński <
> >> pawel93kocin...@gmail.com> wrote:
> >>
> >>> Both options seem good to me, but we have to remember that not all
> >>> Sedona users are using cloud solutions; some of them are using Spark with
> >>> Hadoop. What about the python-adapter module within the Sedona project, am I
> >>> missing something?
> >>> Regards,
> >>> Paweł
> >>>
> >>> On Thu, 11 Feb 2021 at 14:40, Netanel Malka  wrote:
> >>>
>  I think that we can make it work on Databricks without any changes.
>  After creating a cluster on Databricks, the user can install the
>  GeoTools packages and provide the OSGeo repo (or any other repo)
>  explicitly.
> 
>  As you can see in the picture:
> 
>  [image: image.png]
>  I can provide the details on how to install it.
> 
>  I think it will solve the problem.
>  What do you think?
> 
> 
>  On Thu, 11 Feb 2021 at 12:24, Jia Yu  wrote:
> 
> > Hi folks,
> >
> > As you can see from the recent discussion in the mailing list
> > <[Bug][Python] Missing Java class>, in Sedona 1.0.0, because those LGPL
> > GeoTools jars are not on Maven Central (only in the OSGEO repo), Databricks
> > cannot get the GeoTools jars.
> >
> > I believe this will cause lots of trouble for our future Python users.
> > Reading Shapefiles and doing CRS transformations are big selling points for
> > Sedona.
> >
> > The easiest way to fix this, without violating ASF policy, is for me to
> > publish a GeoTools wrapper on Maven Central using the old GeoSpark
> > group ID: https://mvnrepository.com/artifact/org.datasyslab
> >
> > For example, org.datasyslab:geotools-24-wrapper:1.0.0
> >
> > 1. This GeoTools wrapper does nothing but bring the GeoTools jars
> > needed by Sedona to Maven Central.
> > 2. When the Python user calls Sedona, they can add one more
> > package: org.datasyslab:geotools-24-wrapper:1.0.0
> >
> > Another good thing is that this does not require a new source code
> > release from Sedona. We only need to update the website and let the users
> > know how to call it.
> >
> > Any better ideas?
> >
> > Thanks,
> > Jia
> >
> >
> >
> 
>  --
>  Best regards,
>  Netanel Malka.
> 
> >>>
>


Re: [DISCUSS] Put all GeoTools jars into a package on Maven Central

2021-02-11 Thread Jia Yu
OSGeo's LocationTech owns GeoTools. I am thinking whether I should have my
wrapper on Maven Central to bring those Sedona-required GeoTools jars to
Maven Central. Since it is LGPL, it might be OK to do so.
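
For concreteness, here is a rough sketch of what such a wrapper POM could look
like if the wrapper shades the GeoTools classes into the published jar, so that
resolving it from Maven Central alone is enough. The module list, the GeoTools
version, and the use of the shade plugin are my assumptions, not a finalized
design:

    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>org.datasyslab</groupId>
      <artifactId>geotools-24-wrapper</artifactId>
      <version>1.0.0</version>

      <!-- GeoTools modules Sedona needs; the exact list is an assumption. -->
      <dependencies>
        <dependency>
          <groupId>org.geotools</groupId>
          <artifactId>gt-main</artifactId>
          <version>24.0</version>
        </dependency>
        <dependency>
          <groupId>org.geotools</groupId>
          <artifactId>gt-shapefile</artifactId>
          <version>24.0</version>
        </dependency>
        <dependency>
          <groupId>org.geotools</groupId>
          <artifactId>gt-epsg-hsql</artifactId>
          <version>24.0</version>
        </dependency>
      </dependencies>

      <!-- GeoTools itself is only published to the OSGeo repository, so the
           wrapper build needs it; consumers of the shaded jar do not. -->
      <repositories>
        <repository>
          <id>osgeo</id>
          <url>https://repo.osgeo.org/repository/release/</url>
        </repository>
      </repositories>

      <!-- Shade the GeoTools classes into this artifact so that a single
           coordinate on Maven Central pulls in everything Sedona needs. -->
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>shade</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </project>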

On Thu, Feb 11, 2021 at 5:18 PM Felix Cheung  wrote:

> Who owns or manages GeoTools if it is LGPL?
>
> On Thu, Feb 11, 2021 at 12:01 PM Jia Yu  wrote:
>
>> Pawel,
>>
>> The Python-adapter module is always used by users, but it does not come
>> with GeoTools. To use it, users have to (1) compile the source code of
>> Python-adapter, or (2) add the GeoTools coordinates from the OSGEO repo via
>> config(""), or (3) download and copy the GeoTools jars to SPARK_HOME/jars/
>>
>> The easiest is (2), but it looks like it may not work in all environments
>> since it needs to search the OSGEO repo.
>>
>> What I am saying is that if we "move" the GeoTools jars to Maven Central,
>> Method 2 will work 100% of the time; users just need to add the
>> "sedona-python-adapter-1.0.0-incubating" and "geotools-24-wrapper-1.0.0"
>> coordinates in code.
>>
>> Do you think this is necessary?
>>
>> On Thu, Feb 11, 2021 at 11:40 AM Paweł Kociński <
>> pawel93kocin...@gmail.com> wrote:
>>
>>> Both options seem good to me, but we have to remember that not all
>>> Sedona users are using cloud solutions; some of them are using Spark with
>>> Hadoop. What about the python-adapter module within the Sedona project, am I
>>> missing something?
>>> Regards,
>>> Paweł
>>>
>>> On Thu, 11 Feb 2021 at 14:40, Netanel Malka  wrote:
>>>
 I think that we can make it work on Databricks without any changes.
 After creating a cluster on Databricks, the user can install the
 GeoTools packages and provide the OSGeo repo (or any other repo)
 explicitly.

 As you can see in the picture:

 [image: image.png]
 I can provide the details on how to install it.

 I think it will solve the problem.
 What do you think?


 On Thu, 11 Feb 2021 at 12:24, Jia Yu  wrote:

> Hi folks,
>
> As you can see from the recent discussion in the mailing list
> <[Bug][Python] Missing Java class>, in Sedona 1.0.0, because those LGPL
> GeoTools jars are not on Maven Central (only in the OSGEO repo), Databricks
> cannot get the GeoTools jars.
>
> I believe this will cause lots of trouble for our future Python users.
> Reading Shapefiles and doing CRS transformations are big selling points for
> Sedona.
>
> The easiest way to fix this, without violating ASF policy, is for me to
> publish a GeoTools wrapper on Maven Central using the old GeoSpark
> group ID: https://mvnrepository.com/artifact/org.datasyslab
>
> For example, org.datasyslab:geotools-24-wrapper:1.0.0
>
> 1. This GeoTools wrapper does nothing but bring the GeoTools jars
> needed by Sedona to Maven Central.
> 2. When the Python user calls Sedona, they can add one more
> package: org.datasyslab:geotools-24-wrapper:1.0.0
>
> Another good thing is that this does not require a new source code
> release from Sedona. We only need to update the website and let the users
> know how to call it.
>
> Any better ideas?
>
> Thanks,
> Jia
>
>
>

 --
 Best regards,
 Netanel Malka.

>>>


Re: [DISCUSS] Put all GeoTools jars into a package on Maven Central

2021-02-11 Thread Felix Cheung
Who owns or manages GeoTools if it is LGPL?

On Thu, Feb 11, 2021 at 12:01 PM Jia Yu  wrote:

> Pawel,
>
> The Python-adapter module is always used by users, but it does not come
> with GeoTools. To use it, users have to (1) compile the source code of
> Python-adapter, or (2) add the GeoTools coordinates from the OSGEO repo via
> config(""), or (3) download and copy the GeoTools jars to SPARK_HOME/jars/
>
> The easiest is (2), but it looks like it may not work in all environments
> since it needs to search the OSGEO repo.
>
> What I am saying is that if we "move" the GeoTools jars to Maven Central,
> Method 2 will work 100% of the time; users just need to add the
> "sedona-python-adapter-1.0.0-incubating" and "geotools-24-wrapper-1.0.0"
> coordinates in code.
>
> Do you think this is necessary?
>
> On Thu, Feb 11, 2021 at 11:40 AM Paweł Kociński 
> wrote:
>
>> Both options seem good to me, but we have to remember that not all
>> Sedona users are using cloud solutions; some of them are using Spark with
>> Hadoop. What about the python-adapter module within the Sedona project, am I
>> missing something?
>> Regards,
>> Paweł
>>
>> On Thu, 11 Feb 2021 at 14:40, Netanel Malka  wrote:
>>
>>> I think that we can make it work on Databricks without any changes.
>>> After creating a cluster on Databricks, the user can install the
>>> GeoTools packages and provide the OSGeo repo (or any other repo)
>>> explicitly.
>>>
>>> As you can see in the picture:
>>>
>>> [image: image.png]
>>> I can provide the details on how to install it.
>>>
>>> I think it will solve the problem.
>>> What do you think?
>>>
>>>
>>> On Thu, 11 Feb 2021 at 12:24, Jia Yu  wrote:
>>>
 Hi folks,

 As you can see from the recent discussion in the mailing list
 <[Bug][Python] Missing Java class>, in Sedona 1.0.0, because those LGPL
 GeoTools jars are not on Maven Central (only in the OSGEO repo), Databricks
 cannot get the GeoTools jars.

 I believe this will cause lots of trouble for our future Python users.
 Reading Shapefiles and doing CRS transformations are big selling points for
 Sedona.

 The easiest way to fix this, without violating ASF policy, is for me to
 publish a GeoTools wrapper on Maven Central using the old GeoSpark
 group ID: https://mvnrepository.com/artifact/org.datasyslab

 For example, org.datasyslab:geotools-24-wrapper:1.0.0

 1. This GeoTools wrapper does nothing but bring the GeoTools jars
 needed by Sedona to Maven Central.
 2. When the Python user calls Sedona, they can add one more
 package: org.datasyslab:geotools-24-wrapper:1.0.0

 Another good thing is that this does not require a new source code
 release from Sedona. We only need to update the website and let the users
 know how to call it.

 Any better ideas?

 Thanks,
 Jia



>>>
>>> --
>>> Best regards,
>>> Netanel Malka.
>>>
>>


Re: [DISCUSS] Put all GeoTools jars into a package on Maven Central

2021-02-11 Thread Jia Yu
Pawel,

The Python-adapter module is always used by users, but it does not come
with GeoTools. To use it, users have to (1) compile the source code of
Python-adapter, or (2) add the GeoTools coordinates from the OSGEO repo via
config(""), or (3) download and copy the GeoTools jars to SPARK_HOME/jars/

The easiest is (2), but it looks like it may not work in all environments
since it needs to search the OSGEO repo.
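
For reference, a minimal PySpark sketch of what Method 2 looks like today; the
OSGEO repository URL and the exact python-adapter artifact name (including its
Spark/Scala suffix) are my assumptions:

    from pyspark.sql import SparkSession

    # Method 2 today: Sedona resolves from Maven Central, but the GeoTools
    # jars have to come from the OSGEO repository, which not every
    # environment (e.g. Databricks) can reach by default.
    spark = (
        SparkSession.builder
        .appName("sedona-geotools-example")
        .config("spark.jars.packages",
                "org.apache.sedona:sedona-python-adapter-3.0_2.12:1.0.0-incubating")
        .config("spark.jars.repositories",
                "https://repo.osgeo.org/repository/release/")
        .getOrCreate()
    )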

What I am saying is that if we "move" the GeoTools jars to Maven Central,
Method 2 will work 100% of the time; users just need to add the
"sedona-python-adapter-1.0.0-incubating" and "geotools-24-wrapper-1.0.0"
coordinates in code.
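
Concretely, the user-side sketch would then need only two coordinates and no
extra repository (again, the exact python-adapter artifact name with its
Spark/Scala suffix is my assumption; the wrapper coordinate is the one proposed
in this thread):

    from pyspark.sql import SparkSession

    # Proposed setup: everything resolves from Maven Central, no extra repo.
    spark = (
        SparkSession.builder
        .appName("sedona-geotools-example")
        .config("spark.jars.packages",
                "org.apache.sedona:sedona-python-adapter-3.0_2.12:1.0.0-incubating,"
                "org.datasyslab:geotools-24-wrapper:1.0.0")
        .getOrCreate()
    )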

Do you think this is necessary?

On Thu, Feb 11, 2021 at 11:40 AM Paweł Kociński 
wrote:

> Both options seem good to me, but we have to remember that not all
> Sedona users are using cloud solutions; some of them are using Spark with
> Hadoop. What about the python-adapter module within the Sedona project, am I
> missing something?
> Regards,
> Paweł
>
> On Thu, 11 Feb 2021 at 14:40, Netanel Malka  wrote:
>
>> I think that we can make it work on Databricks without any changes.
>> After creating a cluster on Databricks, the user can install the GeoTools
>> packages and provide the OSGeo repo (or any other repo) explicitly.
>>
>> As you can see in the picture:
>>
>> [image: image.png]
>> I can provide the details on how to install it.
>>
>> I think it will solve the problem.
>> What do you think?
>>
>>
>> On Thu, 11 Feb 2021 at 12:24, Jia Yu  wrote:
>>
>>> Hi folks,
>>>
>>> As you can see from the recent discussion in the mailing list
>>> <[Bug][Python] Missing Java class>, in Sedona 1.0.0, because those LGPL
>>> GeoTools jars are not on Maven Central (only in the OSGEO repo), Databricks
>>> cannot get the GeoTools jars.
>>>
>>> I believe this will cause lots of trouble for our future Python users.
>>> Reading Shapefiles and doing CRS transformations are big selling points for
>>> Sedona.
>>>
>>> The easiest way to fix this, without violating ASF policy, is for me to
>>> publish a GeoTools wrapper on Maven Central using the old GeoSpark
>>> group ID: https://mvnrepository.com/artifact/org.datasyslab
>>>
>>> For example, org.datasyslab:geotools-24-wrapper:1.0.0
>>>
>>> 1. This GeoTools wrapper does nothing but bring the GeoTools jars
>>> needed by Sedona to Maven Central.
>>> 2. When the Python user calls Sedona, they can add one more
>>> package: org.datasyslab:geotools-24-wrapper:1.0.0
>>>
>>> Another good thing is that this does not require a new source code
>>> release from Sedona. We only need to update the website and let the users
>>> know how to call it.
>>>
>>> Any better ideas?
>>>
>>> Thanks,
>>> Jia
>>>
>>>
>>>
>>
>> --
>> Best regards,
>> Netanel Malka.
>>
>


Re: [DISCUSS] Put all GeoTools jars into a package on Maven Central

2021-02-11 Thread Netanel Malka
I think that we can make it work on Databricks without any changes.
After creating a cluster on Databricks, the user can install the GeoTools
packages and provide the OSGeo repo (or any other repo) explicitly.

As you can see in the picture:

[image: image.png]
I can provide the details on how to install it.
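
For completeness, here is a rough sketch of the same install done through the
Databricks Libraries REST API, as far as I remember it; the workspace URL,
token, cluster id, and the particular GeoTools module are placeholders:

    import requests

    # Placeholders -- replace with your workspace URL, token and cluster id.
    HOST = "https://<your-workspace>.cloud.databricks.com"
    TOKEN = "<personal-access-token>"
    CLUSTER_ID = "<cluster-id>"

    payload = {
        "cluster_id": CLUSTER_ID,
        "libraries": [
            {
                "maven": {
                    # Maven coordinate to install on the cluster.
                    "coordinates": "org.geotools:gt-shapefile:24.0",
                    # Point the resolver at the OSGeo repository explicitly.
                    "repo": "https://repo.osgeo.org/repository/release/",
                }
            }
        ],
    }

    # Ask Databricks to install the library on the running cluster.
    response = requests.post(
        f"{HOST}/api/2.0/libraries/install",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
    )
    response.raise_for_status()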

I think it will solve the problem.
What do you think?


On Thu, 11 Feb 2021 at 12:24, Jia Yu  wrote:

> Hi folks,
>
> As you can see from the recent discussion in the mailing list
> <[Bug][Python] Missing Java class>, in Sedona 1.0.0, because those LGPL
> GeoTools jars are not on Maven Central (only in the OSGEO repo), Databricks
> cannot get the GeoTools jars.
>
> I believe this will cause lots of trouble for our future Python users.
> Reading Shapefiles and doing CRS transformations are big selling points for
> Sedona.
>
> The easiest way to fix this, without violating ASF policy, is for me to
> publish a GeoTools wrapper on Maven Central using the old GeoSpark group
> ID: https://mvnrepository.com/artifact/org.datasyslab
>
> For example, org.datasyslab:geotools-24-wrapper:1.0.0
>
> 1. This GeoTools wrapper does nothing but bring the GeoTools jars needed
> by Sedona to Maven Central.
> 2. When the Python user calls Sedona, they can add one more
> package: org.datasyslab:geotools-24-wrapper:1.0.0
>
> Another good thing is that this does not require a new source code
> release from Sedona. We only need to update the website and let the users
> know how to call it.
>
> Any better ideas?
>
> Thanks,
> Jia
>
>
>

-- 
Best regards,
Netanel Malka.