I think you could also fork leveldbjni and try to compile it for s390x - I
believe that's what happened for ARM.
Performance probably isn't a big deal in the places Spark uses it, but
forking might be easier than rewriting or replacing the library.
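
For what it's worth, if replacing it did turn out to be the way to go, the
pure Java port at https://github.com/dain/leveldb implements the same
org.iq80.leveldb interfaces that leveldbjni exposes, so - assuming the Spark
code only goes through those interfaces - the change might be little more
than swapping which factory gets used. A rough sketch (paths and keys below
are just illustrative, not anything from the Spark code base):

import java.io.File;
import java.nio.charset.StandardCharsets;

import org.iq80.leveldb.DB;
import org.iq80.leveldb.Options;
// The JNI-backed factory from leveldbjni would be:
//   import org.fusesource.leveldbjni.JniDBFactory;
// The pure Java factory from https://github.com/dain/leveldb is:
import org.iq80.leveldb.impl.Iq80DBFactory;

public class KvStoreSketch {
  public static void main(String[] args) throws Exception {
    Options options = new Options().createIfMissing(true);

    // With leveldbjni this would be JniDBFactory.factory.open(...);
    // with the pure Java port it is Iq80DBFactory.factory.open(...).
    try (DB db = Iq80DBFactory.factory.open(new File("/tmp/kvstore-sketch"),
                                            options)) {
      db.put("app-1".getBytes(StandardCharsets.UTF_8),
             "some serialized state".getBytes(StandardCharsets.UTF_8));
      byte[] value = db.get("app-1".getBytes(StandardCharsets.UTF_8));
      System.out.println(new String(value, StandardCharsets.UTF_8));
    }
  }
}

Whether a pure Java implementation is fast enough for the ways Spark uses it
is exactly the open question below, of course.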

On Thu, Sep 10, 2020 at 1:29 AM mundaym <mike.mun...@ibm.com> wrote:
>
> Hi all,
>
> I am currently building Spark from source and also have to build leveldbjni
> from source because the binary release (which is platform dependent) in
> mvnrepository does not support my target platform (s390x). People have run
> into similar problems when building for other platforms too (notably Spark
> builds on arm64 pull in an alternative binary release).
>
> The last binary release of leveldbjni was in 2013 and it does not appear to
> be actively maintained. I suspect the lack of new binary releases will cause
> more issues as time goes on. I am therefore curious whether anyone has any
> thoughts on substituting it with an alternative library. In particular I
> am interested in whether there are any constraints an alternative would need
> to adhere to:
>
> 1. Would an alternative need to be leveldb compatible?
> 2. How performance sensitive are the uses of leveldbjni in Spark? Could a
> pure Java library be used instead?
>
> Specifically, I am thinking that actively maintained alternatives might be
> something like https://github.com/dain/leveldb or even
> RocksDB. Or perhaps there is another key-value store used in the Apache
> ecosystem that could be adopted.
>
> Thanks,
> Michael
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
