spark streaming socket read issue

2017-06-30 Thread pradeepbill
hi there, I have a Spark Streaming issue that I am not able to figure out.
The code below reads from a socket, but I don't see any input going into the
job. I have nc -l  running and dumping data, though, so I am not sure why my
Spark job is not able to read data from 10.176.110.112:. Please advise.

Dataset<Row> d = sparkSession.readStream().format("socket")
        .option("host", "10.176.110.112")
        .option("port", ).load();
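One thing worth checking: readStream().load() only defines the source — no data moves until a query is actually started with writeStream(...).start(). Beyond that, it can help to rule out the network side by connecting to the socket with plain Java and confirming that lines are really flowing before involving Spark at all. The sketch below is only a diagnostic idea, not part of the original post: it stands up a throwaway local server (playing the role of `nc -l`) on a free port, then reads one line back the same way Spark's socket source would. The class and method names are made up for illustration.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class SocketCheck {

    // Connect to host:port and read a single newline-terminated line,
    // which is essentially what Spark's socket source does per record.
    static String readOneLine(String host, int port) throws Exception {
        try (Socket s = new Socket(host, port);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(s.getInputStream()))) {
            return in.readLine();
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for `nc -l <port>`: a local server that writes one line
        // to each client that connects, then closes the connection.
        try (ServerSocket server = new ServerSocket(0)) { // port 0 = any free port
            int port = server.getLocalPort();
            Thread producer = new Thread(() -> {
                try (Socket client = server.accept();
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    out.println("hello from nc");
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            producer.start();

            String line = readOneLine("127.0.0.1", port);
            System.out.println(line); // prints "hello from nc"
            producer.join();
        }
    }
}
```

If this kind of check succeeds against the real host and port but the Spark job still sees nothing, the usual suspects are a firewall between the driver/executors and the nc host, or nc listening on a different interface than the one Spark connects to.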


thanks
Pradeep




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-socket-read-issue-tp28813.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



backward compatibility

2017-01-10 Thread pradeepbill
hi there, I am using Spark 1.4 code and now we plan to move to Spark 2.0.
When I check the documentation below, only a few features are backward
compatible. Does that mean I have to change most of my code? Please advise.

One of the largest changes in Spark 2.0 is the new updated APIs:

- Unifying DataFrame and Dataset: In Scala and Java, DataFrame and Dataset
  have been unified, i.e. DataFrame is just a type alias for Dataset of Row.
  In Python and R, given the lack of type safety, DataFrame is the main
  programming interface.
- *SparkSession: new entry point that replaces the old SQLContext and
  HiveContext for DataFrame and Dataset APIs. SQLContext and HiveContext are
  kept for backward compatibility.*
- A new, streamlined configuration API for SparkSession
- Simpler, more performant accumulator API
- A new, improved Aggregator API for typed aggregation in Datasets
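For what it's worth, the entry-point change in that list is usually a mechanical edit rather than a rewrite, since SQLContext/HiveContext are kept for compatibility. A rough before/after sketch in Java (the app name, master, and file path are placeholders, not from this thread):

```java
// Spark 1.x style (still works in 2.0 for backward compatibility):
//   SQLContext sqlContext = new SQLContext(sparkContext);
//   DataFrame df = sqlContext.read().json("people.json");

// Spark 2.0 style: SparkSession subsumes SQLContext and HiveContext,
// and the Java DataFrame type becomes Dataset<Row>.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MigrationSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("migration-sketch")  // placeholder name
                .master("local[*]")           // placeholder master
                .getOrCreate();

        // DataFrame in 1.x -> Dataset<Row> in 2.0; the methods largely carry over.
        Dataset<Row> df = spark.read().json("people.json");
        df.show();

        spark.stop();
    }
}
```

Code that used DataFrame heavily mostly needs the type renamed to Dataset<Row> and the entry point swapped; APIs that were removed or changed behavior are listed in the Spark 2.0 migration notes.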


thanks
Pradeep



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/backward-compatibility-tp28296.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org