Re: Why Spark having OutOfMemory Exception?

2016-04-21 Thread Zhan Zhang
> -Original Message-
> From: kramer2...@126.com
> Sent: Monday, April 11, 2016 16.18
> To: user@spark.apache.org
> Subject: Why Spark having OutOfMemory Exception?
>
> I use spark to do some very simple ...

Re:Re: Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread 李明伟
> I use spark to do some very simple calculation. The description is like below
> (pseudo code): ...

Re: Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread Jeff Zhang
> my_dict[timestamp] = df          # Put the data frame into a dict
> delete_old_dataframe(my_dict)    # Delete old dataframes (timestamp is over
>                                  # 24 hours before)
> big_df = merge(my_dict)          # Merge the recent 24 hours ...
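The quoted pseudocode can be sketched as a runnable toy (plain Python, with lists standing in for Spark DataFrames; `delete_old_dataframes` and `merge` are hypothetical helpers named after the pseudocode):

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)

def delete_old_dataframes(frames, now):
    """Drop entries whose timestamp fell out of the 24-hour window."""
    for ts in [t for t in frames if now - t > WINDOW]:
        del frames[ts]

def merge(frames):
    """Concatenate all retained batches (stand-in for a DataFrame union)."""
    merged = []
    for ts in sorted(frames):
        merged.extend(frames[ts])
    return merged

# Simulate the 5-minute loop; plain lists stand in for DataFrames.
frames = {}
now = datetime(2016, 4, 11, 0, 0)
for step in range(12 * 25):          # 25 hours of 5-minute batches
    df = [step]                      # read_hdf() stand-in
    frames[now] = df
    delete_old_dataframes(frames, now)
    big = merge(frames)
    now += timedelta(minutes=5)
```

With a 24-hour window and 5-minute batches the dict stabilizes at 289 entries (288 past batches plus the current one), so the dict itself is bounded; the memory question the thread circles around is what Spark keeps per retained frame.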

Re:Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread 李明伟
> M. Lohith Samaga
>
> I use spark to do some very simple calculation. The description ...

Re: Re: Why Spark having OutOfMemory Exception?

2016-04-19 Thread Jeff Zhang
> Regards
> Mingwei
>
> At 2016-04-11 19:09:48, "Lohith Samaga M" wrote:
> > Hi Kramer,
> > Some options:
> > 1. Store in Cassandra with TTL = 24 hours. When you read the full
> >    table, you get the latest 24 hours' data. ...
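The Cassandra TTL option works because expired rows simply stop appearing on reads, so the 24-hour window maintains itself with no explicit delete step. A minimal in-memory illustration of that semantics (plain Python; `TTLStore` is a toy stand-in, not the Cassandra driver):

```python
import time

class TTLStore:
    """Toy key-value store where reads skip expired entries (TTL-like)."""
    def __init__(self):
        self._data = {}

    def put(self, key, value, ttl_seconds, now=None):
        now = time.time() if now is None else now
        self._data[key] = (value, now + ttl_seconds)

    def read_all(self, now=None):
        """Return only rows whose TTL has not yet elapsed."""
        now = time.time() if now is None else now
        return {k: v for k, (v, exp) in self._data.items() if exp > now}

store = TTLStore()
store.put("batch-1", [1, 2], ttl_seconds=86400, now=0)      # 24 h TTL
store.put("batch-2", [3, 4], ttl_seconds=86400, now=90000)  # 25 h later
live = store.read_all(now=90000)  # batch-1 has expired by now
```

In real Cassandra the same effect comes from `INSERT ... USING TTL 86400`; the application then reads the full table and gets only the last 24 hours.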

Re:Re: Why Spark having OutOfMemory Exception?

2016-04-19 Thread 李明伟
> I use spark to do some very simple calculation. The description is like below
> (pseudo code): ...

Re: Why Spark having OutOfMemory Exception?

2016-04-18 Thread Zhan Zhang
> ... to know if anything is wrong with this model? It becomes very slow after
> running for a while and hits OutOfMemory. I know my memory is enough, and the
> test file is very small, so there should not be a memory problem.
>
> I am wondering if there is a lineage issue, ...

Re:RE: Why Spark having OutOfMemory Exception?

2016-04-18 Thread 李明伟
> While timestamp == 5 minutes
>     df = read_hdf()    # Read hdf ...

RE: Why Spark having OutOfMemory Exception?

2016-04-11 Thread Lohith Samaga M
... either).

Best regards / Mit freundlichen Grüßen / Sincères salutations
M. Lohith Samaga

> I use spark to do ...

Why Spark having OutOfMemory Exception?

2016-04-11 Thread kramer2...@126.com
... should not have a memory problem. I am wondering if there is a lineage issue, but I am not sure.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Why-Spark-having-OutOfMemory-Exception-tp26743.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
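The lineage suspicion is worth illustrating: if the retained batches stay lazy, rebuilding `big_df` every cycle re-executes every batch still in the window, so per-cycle work grows with window size even though each input file is small. A toy model of that effect (plain Python thunks standing in for lazy lineage; in Spark the analogous fix would be `cache()`/`persist()` or `checkpoint()` on the per-batch frames):

```python
# Model a lazily-read batch as a thunk; evaluating the merged result
# re-runs every retained thunk unless its value was materialized.
reads = {"count": 0}

def read_batch(i):
    reads["count"] += 1          # count how often the "source" is hit
    return [i]

# Lazy window: merging re-reads every batch, every cycle.
window = []
for i in range(3):
    window.append(lambda i=i: read_batch(i))
    merged = [x for thunk in window for x in thunk()]
lazy_reads = reads["count"]      # 1 + 2 + 3 reads over 3 cycles

# Materialized window (cache/checkpoint-like): each batch is read once.
reads["count"] = 0
window = []
for i in range(3):
    window.append(read_batch(i))
    merged = [x for batch in window for x in batch]
cached_reads = reads["count"]    # 3 reads over 3 cycles
```

The lazy variant does quadratic total work over the run; materializing each batch once keeps it linear, which matches the "very slow after started for a while" symptom better than a plain data-size problem.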