There is no such method in vanilla Spark. I believe both list_packages and
install_pypi_package are EMR-specific additions.
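For what it's worth, a rough stand-in on vanilla Spark (this is just a sketch using the Python standard library, not a replacement for the EMR API) is to enumerate the distributions installed on the driver:

```python
# Vanilla Spark has no SparkContext.list_packages(); a minimal sketch that
# lists the packages visible to the driver's Python environment instead.
from importlib.metadata import distributions

def list_driver_packages():
    """Return {name: version} for distributions installed on the driver."""
    return {d.metadata["Name"]: d.version for d in distributions()}

packages = list_driver_packages()
```

Note this only inspects the driver; executors may have a different environment, which is usually managed via --py-files, a packed conda/venv archive, or the cluster image rather than at runtime.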

On Wed, Jul 26, 2023 at 11:06 PM second_co...@yahoo.com.INVALID
<second_co...@yahoo.com.invalid> wrote:

> I ran the following code
>
> spark.sparkContext.list_packages()
>
> on Spark 3.4.1 and I get the error below:
>
> An error was encountered:
> AttributeError
> Traceback (most recent call last):
>   File "/tmp/spark-3d66c08a-08a3-4d4e-9fdf-45853f65e03d/shell_wrapper.py",
> line 113, in exec
>     self._exec_then_eval(code)
>   File "/tmp/spark-3d66c08a-08a3-4d4e-9fdf-45853f65e03d/shell_wrapper.py",
> line 106, in _exec_then_eval
>     exec(compile(last, '<string>', 'single'), self.globals)
>   File "<string>", line 1, in <module>
> AttributeError: 'SparkContext' object has no attribute 'list_packages'
>
>
> Are list_packages and install_pypi_package available in vanilla Spark, or
> only in AWS services?
>
>
> Thank you
>
