Hi Rishi,
Spark and Flint are useful during the data engineering phase, but you'd
need to look elsewhere after that. I'm not aware of any active
Spark-native project for ML/forecasting on time series data.
If the data that you want to train the model on can fit in one node's
memory, you can use libraries and models like ARIMA, Prophet, or
LSTM-based neural networks to train a model and use it for forecasting.
You can then use Spark to parallelize the grid search over the
hyperparameter space to find the optimal model faster, as the grid
search is a perfectly parallel (a.k.a. embarrassingly parallel) job. I
gave a talk on this which you may find useful:
https://www.analytical.works/Talk-spark-ml.html
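
For illustration, here is a minimal PySpark sketch of that pattern (not
from the talk; the use of Prophet, the ds/y column names, the
"prices.csv" file and the parameter grid are placeholder assumptions):
broadcast the single-node training data, then map the independent grid
points across the cluster and collect the scores.

# A minimal sketch, assuming PySpark and the `prophet` package (formerly
# `fbprophet`) are installed on the driver and all workers. Column names
# ("ds", "y"), the input file, and the grid values are illustrative.
import itertools

import pandas as pd
from prophet import Prophet
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("prophet-grid-search").getOrCreate()

# Data is small enough for one node: broadcast it to every worker once.
train_pdf = pd.read_csv("prices.csv")  # expects columns: ds (date), y (value)
bc_train = spark.sparkContext.broadcast(train_pdf)

# Hyperparameter grid (illustrative values).
param_grid = [
    {"changepoint_prior_scale": cps, "seasonality_prior_scale": sps}
    for cps, sps in itertools.product([0.001, 0.01, 0.1, 0.5], [0.1, 1.0, 10.0])
]

def fit_and_score(params):
    """Fit one Prophet model on the broadcast data and return its holdout MAE."""
    df = bc_train.value
    cutoff = int(len(df) * 0.8)
    train, valid = df.iloc[:cutoff], df.iloc[cutoff:]
    m = Prophet(**params).fit(train)
    forecast = m.predict(valid[["ds"]])
    mae = abs(forecast["yhat"].values - valid["y"].values).mean()
    return {**params, "mae": float(mae)}

# Each grid point is independent, so the search is embarrassingly parallel.
results = (
    spark.sparkContext
    .parallelize(param_grid, numSlices=len(param_grid))
    .map(fit_and_score)
    .collect()
)

best = min(results, key=lambda r: r["mae"])
print("Best params:", best)

The same pattern works with ARIMA (e.g. via statsmodels) or an
LSTM-based model; only fit_and_score changes.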
Masood
__________________
Masood Krohy, Ph.D.
Data Science Advisor|Platform Architect
https://www.analytical.works
On 12/29/19 11:30 AM, Rishi Shah wrote:
Hi All,
Checking in to see if anyone has input on time series libraries
using Spark. I am mainly interested in financial forecasting models &
regression at this point. The input is a bunch of pricing data points.
I have read a lot about the spark-timeseries and flint libraries, but I
am not sure of the best ways/use cases to apply them, or whether
there's any other preferred way of tackling time series problems at scale.
Thanks,
-Shraddha
On Sun, Jun 16, 2019 at 9:17 AM Rishi Shah <rishishah.s...@gmail.com> wrote:
Thanks Jörn. I am interested in time series forecasting for now, but
in general I was unable to find a good way to work with different
time series methods using Spark.
On Fri, Jun 14, 2019 at 1:55 AM Jörn Franke <jornfra...@gmail.com> wrote:
Time series can mean a lot of different things and algorithms.
Can you describe in more detail what you mean by a time series use
case, i.e. what is the input, what would you like to do with the
input, and what is the output?
> On 14.06.2019 at 06:01, Rishi Shah <rishishah.s...@gmail.com> wrote:
>
> Hi All,
>
> I have a time series use case which I would like to
> implement in Spark... What would be the best way to do so? Any
> built-in libraries?
>
> --
> Regards,
>
> Rishi Shah
--
Regards,
Rishi Shah