Thanks for your reply. The DataSet I am receiving comes from a MainFrames
system that I don't have control over.
I tried the following to move the data to other executors, but did not succeed:
1. Called the repartition method; the data was re-partitioned, but onto the
same executor, so only one core is processing all of it.
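One thing worth checking is what the data is being partitioned on. The toy sketch below (plain Python, not Spark; the helper `partition_by_key` and the sample rows are hypothetical stand-ins) shows why keyed re-partitioning can leave everything in one partition: a row goes to partition hash(key) % numPartitions, so a constant key sends every row to the same place, while a distinct key spreads them out.

```python
def partition_by_key(rows, key_fn, num_partitions):
    """Group rows into num_partitions buckets by hash of key_fn(row),
    mimicking the shape of hash partitioning."""
    buckets = [[] for _ in range(num_partitions)]
    for row in rows:
        buckets[hash(key_fn(row)) % num_partitions].append(row)
    return buckets

# Hypothetical sample: every row carries the same first field.
rows = [("MAINFRAME", i) for i in range(100)]

# Partitioning on the constant first field: one bucket gets all 100 rows.
skewed = partition_by_key(rows, key_fn=lambda r: r[0], num_partitions=4)

# Partitioning on the distinct second field spreads the rows evenly.
spread = partition_by_key(rows, key_fn=lambda r: r[1], num_partitions=4)
```

If the real repartition key is similarly constant (or the data arrives as a single unsplittable partition), a round-robin repartition on a synthetic or distinct column may distribute the work.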
Severity: Medium
Vendor: The Apache Software Foundation
Versions Affected:
Spark versions from 1.3.0, running standalone master with REST API enabled,
or running Mesos master with cluster mode enabled
Description:
From version 1.3.0 onward, Spark's standalone master exposes a REST API for
job
Hi,
I want to read a dataframe in one singleton class and use it in another
singleton class.
Below is my code -
Class Singleton -
class Singleton(object):
    _instances = {}

    def __new__(class_, *args, **kwargs):
        if class_ not in class_._instances:
            class_._instances[class_] = super(Singleton, class_).__new__(class_)
        return class_._instances[class_]
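For reference, a complete runnable sketch of the pattern being attempted: a base Singleton whose subclasses each get exactly one instance, so data loaded once in one place is seen everywhere. `DataHolder` and its stand-in data are hypothetical; in the real code the load would be a Spark read.

```python
class Singleton(object):
    """Base class: each subclass gets exactly one shared instance."""
    _instances = {}

    def __new__(class_, *args, **kwargs):
        if class_ not in class_._instances:
            class_._instances[class_] = super(Singleton, class_).__new__(class_)
        return class_._instances[class_]


class DataHolder(Singleton):
    """Hypothetical holder that loads its data once and caches it."""

    def get_data(self):
        if not hasattr(self, "_data"):
            # In the real code this would be a spark.read... call;
            # a plain list stands in here so the sketch is self-contained.
            self._data = ["row1", "row2"]
        return self._data


# Any module that instantiates DataHolder sees the same object and data.
a = DataHolder()
b = DataHolder()
```

Because `__new__` returns the cached instance, both "readers" share one object, which is the behavior the two singleton classes need.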
Hi,
You can call the Java program directly through PySpark. The following code
will help:
sc._jvm..
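For intuition, here is a toy stand-in (plain Python, not py4j or Spark; `JvmStub` and the class name are hypothetical) showing how chained attribute access of the `sc._jvm.com.example.MyJob` form can build up a fully-qualified Java class path. Real PySpark resolves the chain through py4j's gateway on the JVM side; this sketch only mimics the shape.

```python
class JvmStub(object):
    """Builds a dotted path from chained attribute access,
    mimicking the shape of a JVM gateway view."""

    def __init__(self, path=""):
        self._path = path

    def __getattr__(self, name):
        # Each attribute access extends the dotted path by one segment.
        new_path = name if not self._path else self._path + "." + name
        return JvmStub(new_path)


jvm = JvmStub()
# Hypothetical class name; in real code this would be your Java class.
target = jvm.com.example.MyStoredProcJob
```

In real PySpark the resulting object is callable, so the Java entry point that runs the stored procedure can be invoked directly from Python.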
Harsh Takkar
On Sun, Aug 12, 2018 at 9:27 PM amit kumar singh
wrote:
> Hi team,
>
> This is the way we call a Java program to execute a stored procedure.
> Is there any way we can achieve the