kbendick commented on issue #5453:
URL: https://github.com/apache/iceberg/issues/5453#issuecomment-1208487791
Based on the expanded output of the Python stack trace (I hadn't scrolled
over at first), it seems like _maybe_ one of your tasks is `yield`-ing empty
data in your Dagster pipeline. That could lead to the unexpected EOF error,
either while writing data or while reading it back into the JVM from Python
- I suspect the latter.
Maybe check that the `yield` statement isn't returning empty data when the
result gets coerced to an iterator, or something funky like that?
```
  File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/plan/utils.py", line 47, in solid_execution_error_boundary
    yield
  File "/usr/local/lib/python3.9/site-packages/dagster/utils/__init__.py", line 406, in iterate_with_context
    next_output = next(iterator)
  File "/usr/local/lib/python3.9/site-packages/dagster/core/execution/plan/compute_generator.py", line 66, in _coerce_solid_compute_fn_to_iterator
    result = fn(context, **kwargs) if context_arg_provided else fn(**kwargs)
  File "/var/lib/ngods/dagster/spark.py", line 20, in normalize_food_arrays_op
    data = spark.sql(
  File "/usr/local/lib/python3.9/site-packages/pyspark/sql/session.py", line 1034, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self)
  File "/usr/local/lib/python3.9/site-packages/py4j/java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
  File "/usr/local/lib/python3.9/site-packages/pyspark/sql/utils.py", line 190, in deco
    return f(*a, **kw)
  File "/usr/local/lib/python3.9/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
```
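One quick way to rule this out on the Python side: wrap the op's iterator in a small guard that fails fast if anything empty comes through, instead of letting an empty payload propagate to the JVM boundary. This is just a minimal sketch - `iterate_non_empty` is a hypothetical helper, not a Dagster or PySpark API - but dropping something like it around the suspect `yield` would tell you immediately whether empty data is the culprit:

```python
def iterate_non_empty(iterator):
    """Yield items from `iterator`, raising early if any item is None or empty.

    Hypothetical debugging helper: wrap the generator your op returns with
    this before Dagster coerces it, so an empty result fails loudly in
    Python rather than surfacing later as an opaque EOF on the JVM side.
    """
    for index, item in enumerate(iterator):
        # Catch both None and zero-length results (empty lists, DataFrames
        # with len() == 0, empty strings, etc.).
        if item is None or (hasattr(item, "__len__") and len(item) == 0):
            raise ValueError(f"iterator yielded empty data at position {index}")
        yield item


# Example: a healthy iterator passes through untouched...
print(list(iterate_non_empty(iter([[1, 2], [3]]))))  # [[1, 2], [3]]

# ...while an empty yield is caught at the exact position it occurs.
try:
    list(iterate_non_empty(iter([[1, 2], []])))
except ValueError as exc:
    print(exc)  # iterator yielded empty data at position 1
```

If the `ValueError` fires, the fix belongs in the op itself (e.g. guarding the `spark.sql(...)` call in `normalize_food_arrays_op`); if it never fires, the EOF is more likely happening inside the Py4J/JVM round trip.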
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]