Hello there,
I am running a Java Spark application. Most of the modules write to a log
file (not the Spark log file). I can run the application either with
"java -jar" or with "spark-submit".
If I use "java -jar myApp.jar", the log file is generated in the
directory $LOG_DIR, or in a default directory
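One reason the two launch modes behave differently is that a relative log path resolves against the process's working directory, and under spark-submit the executors run in their own working directories on the worker nodes, so a relative default ends up scattered across the cluster. A minimal sketch of the resolution logic described above — the `LOG_DIR` variable name comes from the email, but the `./logs` fallback and the class/method names are hypothetical:

```java
public class LogDirResolver {
    // Hypothetical sketch: prefer the $LOG_DIR environment variable,
    // fall back to a relative default when it is unset or empty.
    // A relative default resolves against the *current* working directory,
    // which differs between "java -jar" (your shell) and spark-submit
    // executors (per-node work dirs).
    static String resolve(String logDirEnv) {
        return (logDirEnv == null || logDirEnv.isEmpty()) ? "./logs" : logDirEnv;
    }

    public static void main(String[] args) {
        // In the real application this would read System.getenv("LOG_DIR").
        System.out.println(resolve(System.getenv("LOG_DIR")));
    }
}
```

Using an absolute path for `LOG_DIR` sidesteps the working-directory difference entirely.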
Doesn't this suggestion work for you? -- "To add yourself to the list,
please email d...@spark.apache.org with your organization name, URL, a list
of which Spark components you are using, and a short description of your
use case."
On Sat, Dec 22, 2018 at 12:13 AM Ascot Moss wrote:
> Hi,
>
> We u
Hello there,
I am trying to calculate a simple difference between adjacent rows (ts = ts - 10)
of a column in a Dataset, using a join of the Dataset with itself. The SQL
expression was working for static Datasets (trackT):
Dataset trackDiff = spark.sql(" select a.*, "
+ "a.posX - coalesce(b.posX, 0) as delX,
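The self-join in the fragment above pairs each row `a` with the row `b` whose timestamp is 10 less, and `coalesce(b.posX, 0)` handles the first row, which has no predecessor. A plain-Java sketch of that same left-join semantics over an in-memory list (the `Row` record and field names here are assumptions modeled on the email's `ts`/`posX` columns, not the poster's actual schema):

```java
import java.util.*;

public class AdjacentDiff {
    // Hypothetical row shape: a timestamp and an x-position.
    record Row(int ts, double posX) {}

    // For each row a, look up row b with b.ts == a.ts - 10 (the self-join
    // condition); delX = a.posX - coalesce(b.posX, 0).
    static Map<Integer, Double> deltas(List<Row> rows) {
        Map<Integer, Double> byTs = new HashMap<>();
        for (Row r : rows) byTs.put(r.ts(), r.posX());
        Map<Integer, Double> out = new LinkedHashMap<>();
        for (Row r : rows) {
            double prev = byTs.getOrDefault(r.ts() - 10, 0.0); // coalesce(b.posX, 0)
            out.put(r.ts(), r.posX() - prev);
        }
        return out;
    }
}
```

Note that if this is a streaming Dataset rather than a static one, a plain self-join like this is more constrained; Spark's window function `lag` is the usual way to express "previous row" on a static Dataset.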
Hello there,
I am trying to pass parameters into a spark.sql query from Java code, the
same way as in this link:
https://forums.databricks.com/questions/115/how-do-i-pass-parameters-to-my-sql-statements.html
The link suggests putting 's' before the query string (Scala string
interpolation):
val param = 100
spark.sql(s""" select * from ta
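The `s"""..."""` form in that link is Scala string interpolation, which does not exist in Java. From Java, one plain equivalent is to build the query text with `String.format` (or concatenation) before passing it to `spark.sql`. A minimal sketch — the table and column names here are made up for illustration, not taken from the email:

```java
public class SqlParams {
    // Java has no s-interpolation; format the parameter into the query
    // text instead. (For string-typed parameters coming from untrusted
    // input, prefer DataFrame-API filters over string splicing.)
    static String buildQuery(int param) {
        return String.format("select * from table1 where amount > %d", param);
    }
}
```

The resulting string is then passed as usual, e.g. `spark.sql(buildQuery(100))`.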