Hi All,
Can someone please help with this?
Thanks
On Tuesday, December 8, 2020, Amit Joshi wrote:
Hi Gabor,
Please find the logs attached. These are truncated logs.
Command used:
spark-submit --verbose --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1,com.typesafe:config:1.4.0 --master yarn --deploy-mode cluster --class com.stream.Main --num-executors 2 --driver-memory 2g --executor-c
Well, I can't do miracles without cluster and log access.
What I don't understand is why you need a fat jar. Spark libraries normally
need provided scope because they must exist on all machines...
I would take a look at the driver and executor logs, which contain the
consumer configs + I would take a loo
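As a sketch of what Gabor describes: when the Spark/Kafka artifacts are shipped to the cluster via --packages (as in the spark-submit command above), they can be marked provided in the pom so the uber jar does not bundle its own copies. The version here matches the one mentioned in the thread.

```xml
<!-- Spark connector is supplied at runtime via --packages, so it should not
     be shaded into the uber jar; "provided" keeps it compile-time only. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql-kafka-0-10_2.12</artifactId>
  <version>3.0.1</version>
  <scope>provided</scope>
</dependency>
```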
Hi Gabor,
The code is very simple Kafka consumption of data.
I guess it may be the cluster.
Can you please point out the possible problems to look for in the cluster?
Regards
Amit
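Amit's actual code is not shown in the thread; a minimal Structured Streaming Kafka read of the kind he describes would look roughly like the sketch below. The object name matches the --class from the spark-submit command; the topic and bootstrap servers are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("kafka-consumer")
      .getOrCreate()

    // Placeholder broker/topic; the "kafka" format is resolved from the
    // spark-sql-kafka-0-10 package passed via --packages.
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "my-topic")
      .load()

    // Kafka records arrive as binary key/value columns.
    val query = df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
      .writeStream
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

This needs a running Spark cluster (or local master) and a reachable Kafka broker, so it is only a structural sketch of the consumption path being debugged.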
On Monday, December 7, 2020, Gabor Somogyi wrote:
+ Adding back user list.
I've had a look at the Spark code and it's not
modifying "partition.assignment.strategy" so the problem
must be either in your application or in your cluster setup.
G
On Mon, Dec 7, 2020 at 12:31 PM Gabor Somogyi wrote:
> It's super interesting because that field has
Hi All,
Thanks for the reply.
I did try removing the client version, but got the same exception.
One point, though: some dependent artifacts that I am using contain a
reference to the same Kafka client version.
I am trying to make an uber jar, which will choose the closest version.
Thn
+1 on the mentioned change, Spark uses the following kafka-clients library:
2.4.1
G
On Mon, Dec 7, 2020 at 9:30 AM German Schiavon wrote:
Hi,
I think the issue is that you are overriding the kafka-clients that comes
with spark-sql-kafka-0-10_2.12.
I'd try removing the kafka-clients dependency and see if it works.
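Concretely, German's suggestion amounts to deleting the explicit kafka-clients pin from the pom, so that the version pulled in transitively by spark-sql-kafka-0-10_2.12 (2.4.1 for Spark 3.0.1, as Gabor notes later in the thread) wins. A sketch of the block to delete:

```xml
<!-- Delete this block; spark-sql-kafka-0-10_2.12 already brings in a
     compatible kafka-clients transitively, and this pin forces 2.1.0. -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>2.1.0</version>
</dependency>
```

Afterwards, `mvn dependency:tree -Dincludes=org.apache.kafka` shows which kafka-clients version actually ends up on the classpath.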
On Sun, 6 Dec 2020 at 08:01, Amit Joshi wrote:
Hi All,
I am running the Spark Structured Streaming along with Kafka.
Below is the pom.xml
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <scala.version>2.12.10</scala.version>
  <sparkVersion>3.0.1</sparkVersion>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>${sparkVersion}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>