These topics have been included in the documentation for recent builds of Spark 
2.0.
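
For anyone finding this thread later, the change in practice looks roughly like the sketch below. This is a minimal illustration based on the linked SparkSession JavaDoc; the app name, master setting, and file path are placeholders, not from the guide itself.

```scala
// Spark 2.0: SparkSession is the new unified entry point,
// subsuming SQLContext (and HiveContext).
import org.apache.spark.sql.SparkSession

object SparkSessionExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("MigrationExample")   // illustrative name
      .master("local[*]")            // assumption: local run for the example
      .getOrCreate()

    // Reading data and running SQL now hang off the session
    // instead of a separately constructed SQLContext:
    val df = spark.read.json("examples/src/main/resources/people.json")
    df.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people").show()

    // Legacy code that still needs a SQLContext can reach one
    // through the session:
    val sqlContext = spark.sqlContext

    spark.stop()
  }
}
```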

Michael

> On Jul 5, 2016, at 3:49 AM, Romi Kuntsman <r...@totango.com> wrote:
> 
> You can also claim that there's a whole section of "Migrating from 1.6 to 
> 2.0" missing there:
> https://spark.apache.org/docs/2.0.0-preview/sql-programming-guide.html#migration-guide
> 
> Romi Kuntsman, Big Data Engineer
> http://www.totango.com
> 
> On Tue, Jul 5, 2016 at 12:24 PM, nihed mbarek <nihe...@gmail.com> wrote:
> Hi,
> 
> I just discovered that SparkSession will replace SQLContext in Spark 2.0.
> The JavaDoc is clear:
> https://spark.apache.org/docs/2.0.0-preview/api/java/org/apache/spark/sql/SparkSession.html
> but there is no mention of it in the SQL programming guide:
> https://spark.apache.org/docs/2.0.0-preview/sql-programming-guide.html#starting-point-sqlcontext
> 
> Is it possible to update the documentation before the release?
> 
> 
> Thank you
> 
> -- 
> 
> MBAREK Med Nihed,
> Fedora Ambassador, TUNISIA, Northern Africa
> http://www.nihed.com
> 
> http://tn.linkedin.com/in/nihed
> 
> 
