Further clarification: I checked that protobuf-3.10.0-cp37-cp37m-macosx_10_9_intel.whl <https://files.pythonhosted.org/packages/a5/c6/a8b6a74ab1e165f0aaa673a46f5c895af8780976880c98934ae82060356d/protobuf-3.10.0-cp37-cp37m-macosx_10_9_intel.whl> comes packaged with the shared object file, while protobuf-3.10.0-py2.py3-none-any.whl <https://files.pythonhosted.org/packages/ad/c2/86c65136e280607ddb2e5dda19e2953c1174f9919b557d1d154574481de4/protobuf-3.10.0-py2.py3-none-any.whl> doesn't.
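For anyone who wants to double-check that for themselves, here is a small sketch (the helper name is mine, not part of protobuf) that lists any compiled extension modules bundled inside a downloaded wheel, since a wheel is just a zip archive. Run it against the two wheels linked above and only the macosx one should show the pyext/_message shared object:

import sys
import zipfile

def list_compiled_extensions(wheel_path):
    # A wheel is a zip archive, so we can inspect its contents directly.
    with zipfile.ZipFile(wheel_path) as wheel:
        return [name for name in wheel.namelist()
                if name.endswith((".so", ".pyd"))]

if __name__ == "__main__":
    # Usage: python check_wheel.py <downloaded .whl files>
    for path in sys.argv[1:]:
        extensions = list_compiled_extensions(path)
        print(path, "->", extensions or "no compiled extensions")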
On Wednesday, November 20, 2019 at 6:36:19 PM UTC+8, Teddy Hartanto wrote:
>
> Hello guys,
>
> We're running Spark applications on GKE using a Docker image hosted at https://gcr.io/spark-operator/spark-py. Our Spark application streams data from a Kafka topic containing messages in Protobuf format, so we need to deserialize those messages from their binary Protobuf encoding.
>
> Locally, everything works fine. But when we deploy this to our GKE environment using the said Docker image, it breaks. Specifically, I'm encountering this issue:
>
> Traceback (most recent call last):
>   --- truncated ---
>   File "/usr/lib/python3.6/site-packages/google/protobuf/descriptor.py", line 47, in <module>
>     from google.protobuf.pyext import _message
> ImportError: cannot import name '_message'
>
> Upon further research, it seems that a shared object file is required. On my local machine, I can find that file:
>
> $ pwd
> /usr/local/lib/python3.7/site-packages/google/protobuf/pyext
> $ ls
> __init__.py  _message.cpython-37m-darwin.so  python_pb2.py  __pycache__  cpp_message.py
>
> As you can see, there's a _message.cpython-37m-darwin.so file. No such file is present in the Docker image, even though in both environments I installed protobuf through pip:
>
> pip install protobuf
>
> Upon further investigation, I found that it works locally because protobuf was installed from the protobuf-3.10.0-cp37-cp37m-macosx_10_9_intel.whl <https://files.pythonhosted.org/packages/a5/c6/a8b6a74ab1e165f0aaa673a46f5c895af8780976880c98934ae82060356d/protobuf-3.10.0-cp37-cp37m-macosx_10_9_intel.whl> wheel file, whereas in the Docker image the protobuf-3.10.0-py2.py3-none-any.whl <https://files.pythonhosted.org/packages/ad/c2/86c65136e280607ddb2e5dda19e2953c1174f9919b557d1d154574481de4/protobuf-3.10.0-py2.py3-none-any.whl> one was used. It turns out there is no wheel file that supports the platform the Docker image runs on: the image runs on linux_x86_64, and none of the wheel files support that platform, so the fallback is protobuf-3.10.0-py2.py3-none-any.whl <https://files.pythonhosted.org/packages/ad/c2/86c65136e280607ddb2e5dda19e2953c1174f9919b557d1d154574481de4/protobuf-3.10.0-py2.py3-none-any.whl>.
>
> My question to you guys is: is it possible to support "linux_x86_64" and the like? What solution do you propose?
>
> Additional details:
>
> Protobuf
> ----------------
> Version: 3.5.1
> Language: Python
>
> OS
> ----------------
> Linux 771fb7930de6 4.9.184-linuxkit #1 SMP Tue Jul 2 22:58:16 UTC 2019 x86_64 Linux
>
> Python
> ----------------
> Python 3.6.8 (default, Apr 8 2019, 18:17:52)
> [GCC 8.3.0] on linux
>
> Cheers,
> Teddy
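For anyone debugging the same ImportError, a quick way to confirm which protobuf backend a given environment actually loaded is the snippet below. Note that api_implementation lives in protobuf's internal package, so treat this as a debugging aid rather than a stable API:

# Which protobuf backend did this environment actually load?
# 'cpp' means the compiled _message extension is present; 'python' means the
# pure-Python fallback (what the py2.py3-none-any wheel provides) is in use.
from google.protobuf.internal import api_implementation

print(api_implementation.Type())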
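And to see why pip falls back to the pure-Python wheel inside the container, one option (assuming the third-party packaging library is available, e.g. via pip install packaging) is to print the wheel tags the running interpreter accepts:

# Print the wheel tags this interpreter accepts, most specific first.
# If no linux_x86_64 / manylinux tag for the container's CPython appears here,
# the py2.py3-none-any wheel is the only protobuf wheel pip can install.
from packaging.tags import sys_tags

for tag in list(sys_tags())[:15]:
    print(tag)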
