[
https://issues.apache.org/jira/browse/SPARK-13488?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
NITESH VERMA updated SPARK-13488:
---------------------------------
Comment: was deleted
(was: Hi Sean, let me try to explain my problem.
I am trying to use the new mapWithState API, following the sample code at
https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaStatefulNetworkWordCount.java
Code snippet:
JavaMapWithStateDStream<String, Integer, Integer, Tuple2<String, Integer>> stateDstream =
    wordsDstream.mapWithState(StateSpec.function(mappingFunc).initialState(initialRDD));
Here I added a timeout call at the end:
wordsDstream.mapWithState(StateSpec.function(mappingFunc).initialState(initialRDD).timeout(new Duration(1000)));
With that change I hit the attached exceptions. One thing I changed in the
dependencies is to use the public Guava Optional instead of Spark's, because
(as reported in SPARK-4819) org.apache.spark.api.java.Optional does not seem
to be available in 1.6. Could you please point out what I am missing? I am
new to Spark.
My Maven dependencies for Spark:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>1.6.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.6.0</version>
</dependency>
Thanks.
)
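For context, one way this class of error can arise: when a timeout is set on the StateSpec, mapWithState invokes the mapping function one final time for each expiring key with an empty value Optional (and State.isTimingOut() returning true), so an unguarded value.get() fails. The sketch below illustrates that pattern using java.util.Optional as a stand-in for Spark's Optional; the class and method names here are hypothetical, not part of the Spark API.

```java
import java.util.NoSuchElementException;
import java.util.Optional;

public class TimeoutOptionalDemo {
    // Hypothetical stand-in for the mapping function's "value" argument.
    // On a timeout invocation the value Optional is empty, so the empty
    // case must be guarded before calling get().
    static String describe(Optional<Integer> value) {
        return value.isPresent() ? "value=" + value.get() : "timed out";
    }

    public static void main(String[] args) {
        System.out.println(describe(Optional.of(3)));    // ordinary batch update
        System.out.println(describe(Optional.empty()));  // timeout invocation
        try {
            Optional.<Integer>empty().get();             // the unguarded pattern
        } catch (NoSuchElementException e) {
            System.out.println("unguarded get() threw NoSuchElementException");
        }
    }
}
```

Whether or not this is the root cause of the reported stack trace, a mapping function used with .timeout(...) should branch on isPresent() (or on state.isTimingOut()) rather than calling get() unconditionally.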
> PairDStreamFunctions.mapWithState fails in case timeout is set
> java.util.NoSuchElementException: None.get
> ---------------------------------------------------------------------------------------------------------
>
> Key: SPARK-13488
> URL: https://issues.apache.org/jira/browse/SPARK-13488
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.6.0
> Reporter: NITESH VERMA
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]