Re: How to populate all possible combination values in columns using Spark SQL

2020-05-06 Thread Sonal Goyal
As mentioned in the comments on SO, can you provide a (masked) sample of the data? It will be easier to see what you are trying to do if you add the year column. Thanks, Sonal Nube Technologies On Thu, May 7, 2020 at 10:26 AM

How to populate all possible combination values in columns using Spark SQL

2020-05-06 Thread Aakash Basu
Hi, I've described the problem on Stack Overflow in a lot of detail; can you kindly check and help if possible? https://stackoverflow.com/q/61643910/5536733 I'd be absolutely fine if someone solves it using the Spark SQL APIs rather than a plain Spark SQL query. Thanks, Aakash.

Cyber bullying for reporting bugs

2020-05-06 Thread JeffEvans1112
@Jeff Evans @Sean Owen Both of these postings are examples of the same object-oriented concept. They are examples of extraction of a child Object from a Parent Object. The difference is that when a Muslim asked, he was told by Jeff Evans "we are not here handhold you." “do a simple Google search”

cyber bullying by so...@apache.org

2020-05-06 Thread JeffEvans1112
@Jeff Evans @Sean Owen Both of these postings are examples of the same object-oriented concept. They are examples of extraction of a child Object from a Parent Object. The difference is that when a Muslim asked, he was told by Jeff Evans "we are not here handhold you." “do a simple Google search”

Re: How to unsubscribe

2020-05-06 Thread Denny Lee
Hi Fred, To unsubscribe, could you please email: user-unsubscr...@spark.apache.org (for more information, please refer to https://spark.apache.org/community.html). Thanks! Denny On Wed, May 6, 2020 at 10:12 AM Fred Liu wrote: > Hi guys > > > >

Abstract of child object from Parent Object

2020-05-06 Thread JeffEvans
@Jeff Evans @Sean Owen Both of these postings are examples of the same object-oriented concept. They are examples of extraction of a child Object from a Parent Object. The difference is that when a Muslim asked, he was told by Jeff Evans "we are not here handhold you." “do a simple Google search”

Re: Which SQL flavor does Spark SQL follow?

2020-05-06 Thread Mich Talebzadeh
It closely follows Hive SQL. For the analytical functions it is similar to Oracle. Anyway, if you know good SQL, as opposed to being a Java programmer turned SQL writer, you should be OK. HTH Dr Mich Talebzadeh LinkedIn *
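To make that concrete, here is a minimal Scala sketch (not taken from the thread; the "sales" view and its columns are invented) showing that Hive-style SQL with an Oracle-like analytic function runs unchanged through spark.sql:

// Minimal sketch, assuming an invented "sales" temp view; demonstrates the
// Hive-style window/analytic syntax that spark.sql accepts.
import org.apache.spark.sql.SparkSession

object SqlFlavorSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sql-flavor").master("local[*]").getOrCreate()
    import spark.implicits._

    Seq(("a", 1), ("a", 3), ("b", 2)).toDF("grp", "amount").createOrReplaceTempView("sales")

    spark.sql(
      """SELECT grp,
        |       amount,
        |       rank() OVER (PARTITION BY grp ORDER BY amount DESC) AS rnk
        |FROM sales""".stripMargin).show()

    spark.stop()
  }
}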

Re: Which SQL flavor does Spark SQL follow?

2020-05-06 Thread Jeff Evans
https://docs.databricks.com/spark/latest/spark-sql/language-manual/index.html https://spark.apache.org/docs/latest/api/sql/index.html On Wed, May 6, 2020 at 3:35 PM Aakash Basu wrote: > Hi, > > Wish to know, which type of SQL syntax is followed when we write a plain > SQL query inside

Which SQL flavor does Spark SQL follow?

2020-05-06 Thread Aakash Basu
Hi, I wish to know which type of SQL syntax is followed when we write a plain SQL query inside spark.sql. Is it MySQL or PGSQL? I know it isn't SQL Server or Oracle, as I had to convert a lot of SQL functions while migrating. Also, if you can provide documentation which clearly says the above

How to unsubscribe

2020-05-06 Thread Fred Liu
Hi guys - To unsubscribe e-mail: user-unsubscr...@spark.apache.org From: Fred Liu Sent: Wednesday, May 6, 2020 10:10 AM To: user@spark.apache.org Subject: Unsubscribe [External

Unsubscribe

2020-05-06 Thread Fred Liu

Re: Pyspark Kafka Structured Stream not working.

2020-05-06 Thread Jungtaek Lim
Hi, 1. You seem to be using DStream (Spark Streaming), not Structured Streaming. 2. I'm not familiar with pyspark, but it looks like the error message is very clear - Kafka doesn't allow such a name for "client.id". The error message states the naming rule, so you may need to adapt to such
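Purely as an illustration (not code from the thread): the old Scala consumer, which the receiver-based DStream uses, validates client.id against ASCII alphanumerics plus '.', '_' and '-', so a value containing spaces is rejected. Below is a minimal Scala sketch assuming the Structured Streaming Kafka source instead, with an invented broker address and topic name, that passes a legal client.id through the kafka.-prefixed option:

// Minimal sketch, not from the thread. "localhost:9092" and "events" are
// placeholders. The key point is the client.id value: keep it to ASCII
// alphanumerics, '.', '_' and '-' (no spaces).
import org.apache.spark.sql.SparkSession

object KafkaClientIdSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-client-id").master("local[*]").getOrCreate()

    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .option("kafka.client.id", "python-kafka-streamer") // legal characters only
      .load()

    val query = stream.selectExpr("CAST(value AS STRING) AS value")
      .writeStream
      .format("console")
      .start()

    query.awaitTermination()
  }
}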

Pyspark Kafka Structured Stream not working.

2020-05-06 Thread Vijayant Kumar
Hi All, I am getting the below error while using Pyspark Structured Streaming from a Kafka producer. 20/05/06 11:51:16 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - kafka.common.InvalidConfigException: client.id Python Kafka streamer is illegal, contains

Re: Spark hangs while reading from jdbc - does nothing Removing Guess work from trouble shooting

2020-05-06 Thread Ruijing Li
Wanted to update everyone on this, thanks for all the responses. I was able to solve this issue after doing a jstack dump - I found out this was the cause https://github.com/scala/bug/issues/10436 Lesson learned - I’ll use a safer json parser like json4s, seems like that one should be able to be
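For readers landing here later, a minimal json4s parsing sketch in Scala, as one illustration of the kind of replacement parser mentioned above (not code from the thread; the Record fields are invented):

// Minimal sketch, not from the thread; the Record fields are invented.
// parse() plus extract[T] with DefaultFormats is the usual json4s pattern.
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

case class Record(id: Int, name: String)

object Json4sSketch {
  implicit val formats: Formats = DefaultFormats

  def main(args: Array[String]): Unit = {
    val raw = """{"id": 1, "name": "example"}"""
    val record = parse(raw).extract[Record]
    println(record) // Record(1,example)
  }
}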

Re: Good idea to do multi-threading in spark job?

2020-05-06 Thread Ruijing Li
Thanks for the answer Sean! On Sun, May 3, 2020 at 10:35 AM Sean Owen wrote: > Spark will by default assume each task needs 1 CPU. On an executor > with 16 cores and 16 slots, you'd schedule 16 tasks. If each is using > 4 cores, then 64 threads are trying to run. If you're CPU-bound, that >
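To attach that arithmetic to a configuration knob, here is a minimal Scala sketch (an assumption about one way to handle it, not from the thread; the app name and core counts are illustrative) where each task is known to spin up about 4 threads of its own, so the scheduler is told to reserve 4 cores per task:

// Minimal sketch, not from the thread. With 16 executor cores and
// spark.task.cpus=4, the scheduler runs 4 tasks per executor, so roughly
// 4 tasks x 4 threads = 16 busy threads instead of 16 x 4 = 64.
import org.apache.spark.sql.SparkSession

object TaskCpusSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-threaded-tasks")
      .config("spark.executor.cores", "16")
      .config("spark.task.cpus", "4")
      .getOrCreate()

    // ... job code that spawns ~4 threads per task would go here ...

    spark.stop()
  }
}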