Re: Pyspark ML model Save Error

2022-11-16 Thread Raja bhupati
Please share more details on the error to help us suggest solutions.

On Wed, Nov 16, 2022, 22:13 Artemis User  wrote:

> What problems did you encounter?  Most likely your problem is
> related to the model object being saved across different partitions.  If that's
> the case, just apply the dataframe's coalesce(1) method before saving the
> model to a shared disk drive...
>
> On 11/16/22 1:51 AM, Vajiha Begum S A wrote:
> > Hi,
> > This is Vajiha, Senior Research Analyst. I'm working on predictive
> > analysis with PySpark ML models. Working with the features of Spark
> > in Python has been quite good. However, I'm having issues saving the
> > PySpark trained ML models. I have read many articles, Stack Overflow
> > and Spark forum comments and applied all of those suggestions, but I'm
> > still facing issues saving the ML model.
> > Kindly help us solve this issue so we can continue working with Spark. I
> > hope I will get support from the Spark team to resolve my issue.
> > Kindly look into this at the earliest. Thanks in advance.
> >
> > Spark version- 3.3.0 (we are using this version)
> >
> > Regards,
> > Vajiha Begum
> > Sr.Research Analyst
> > Maestrowiz Solutions Pvt. Ltd
> > India
> >
>
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: Prometheus with spark

2022-10-25 Thread Raja bhupati
We have a use case where we would like to process Prometheus metrics data with
Spark.

On Tue, Oct 25, 2022, 19:49 Jacek Laskowski  wrote:

> Hi Raj,
>
> Do you want to do the following?
>
> spark.read.format("prometheus").load...
>
> I haven't heard of such a data source / format before.
>
> What would you like it for?
>
> Regards,
> Jacek Laskowski
> 
> https://about.me/JacekLaskowski
> "The Internals Of" Online Books 
> Follow me on https://twitter.com/jaceklaskowski
>
> 
>
>
> On Fri, Oct 21, 2022 at 6:12 PM Raj ks  wrote:
>
>> Hi Team,
>>
>>
>> We want to query Prometheus data with Spark. Any suggestions would
>> be appreciated.
>>
>> We searched for documentation but did not find anything definitive.
>>
>
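Since Spark has no built-in "prometheus" data source, one workable approach is to query Prometheus's HTTP API (GET /api/v1/query?query=...) and flatten the JSON result into rows for `spark.createDataFrame()`. A minimal sketch of the flattening step, using an illustrative sample payload in place of a live HTTP call (hostnames and metric names are made up):

```python
import json

# Illustrative response in the shape returned by Prometheus's
# /api/v1/query endpoint for an instant vector query.
sample = json.loads("""
{"status": "success",
 "data": {"resultType": "vector",
          "result": [
            {"metric": {"__name__": "up", "instance": "host1:9100"},
             "value": [1666700000.0, "1"]},
            {"metric": {"__name__": "up", "instance": "host2:9100"},
             "value": [1666700000.0, "0"]}]}}
""")

def to_rows(payload):
    """Flatten an instant-query vector into (instance, timestamp, value) tuples."""
    return [
        (r["metric"].get("instance", ""), r["value"][0], float(r["value"][1]))
        for r in payload["data"]["result"]
    ]

rows = to_rows(sample)
print(rows)
# These tuples can then be loaded with
# spark.createDataFrame(rows, ["instance", "ts", "value"]).
```

In a real pipeline the payload would come from an HTTP request to the Prometheus server; for large historical ranges, the /api/v1/query_range endpoint or Prometheus remote-read/remote-write integrations may be a better fit.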