developing. How did you
solve this problem?
shicheng31...@gmail.com
v.sh. And
the executor's memory does increase, but the problem still exists.
What puzzles me is that the JDBC Server application serves as the driver and
only handles some code distribution and RPC connection work. Does it need so
much memory? If so, how do I increase its memory?
shicheng31...@gmail.com
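[Editor's note, not part of the original thread: the Thrift JDBC server runs inside the driver JVM, so its heap is governed by the usual driver-memory settings. A minimal sketch, assuming the stock sbin/start-thriftserver.sh launcher and a 4g heap chosen purely for illustration:]

```shell
# Option 1: pass spark-submit options directly to the launcher;
# start-thriftserver.sh accepts the same flags as spark-submit.
./sbin/start-thriftserver.sh \
  --driver-memory 4g

# Option 2: set it persistently in conf/spark-defaults.conf:
#   spark.driver.memory   4g
```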
state'? Or other
solutions?
Thanks!
shicheng31...@gmail.com
= keyValueDStream.mapWithState[StateType,
MappedType](spec)
Can anyone help me with this problem? Thanks!
shicheng31...@gmail.com
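[Editor's note, not part of the original thread: for context, a minimal, self-contained sketch of how mapWithState is typically wired up in Spark Streaming (1.6+). The socket source, word-count state, and names such as updateFn are illustrative assumptions, not taken from the thread:]

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

object MapWithStateSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MapWithStateSketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))
    ssc.checkpoint("/tmp/checkpoint") // mapWithState requires a checkpoint directory

    val lines = ssc.socketTextStream("localhost", 9999)
    val pairs = lines.flatMap(_.split(" ")).map(w => (w, 1))

    // The update function merges each incoming value into the per-key state
    // and emits the mapped result for this batch.
    val updateFn = (word: String, one: Option[Int], state: State[Int]) => {
      val sum = one.getOrElse(0) + state.getOption.getOrElse(0)
      state.update(sum)
      (word, sum)
    }

    val spec = StateSpec.function(updateFn)
    val stateStream = pairs.mapWithState(spec) // DStream of (word, runningCount)

    stateStream.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```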
direct or indirect API for this
in spark? Or is there any better solution for this?
Thanks!
shicheng31...@gmail.com