HeartSaVioR commented on code in PR #389:
URL: https://github.com/apache/spark-website/pull/389#discussion_r875607350


##########
site/streaming/index.html:
##########
@@ -125,135 +125,98 @@
   <div class="row mt-4">
     <div class="col-12 col-md-9">
       <div class="jumbotron">
-  <b>Spark streaming</b> makes it easy to build scalable fault-tolerant streaming
-  applications.
+  <a href="/docs/latest/structured-streaming-programming-guide.html">Spark Structured Streaming</a> makes it easy to build streaming applications and pipelines with the same and familiar Spark APIs.
 </div>
 
 <div class="row row-padded">
   <div class="col-md-7 col-sm-7">
-    <h2>Ease of use</h2>
-    <p class="lead">
-      Build applications through high-level operators.
-    </p>
+    <h2>Ease to use</h2>
     <p>
-      Spark Streaming brings Apache Spark's
-      <a href="/docs/latest/streaming-programming-guide.html">language-integrated API</a>
-      to stream processing, letting you write streaming jobs the same way you write batch jobs.
-      It supports Java, Scala and Python.
+      Spark Structured Streaming abstracts away complex streaming concepts such as incremental processing, checkpointing, and watermarks
+      so that you can build streaming applications and pipelines without learning any new concepts or tools.
     </p>
   </div>
   <div class="col-md-5 col-sm-5 col-padded-top col-center">
 
     <div style="margin-top: 15px; text-align: left; display: inline-block;">
       <div class="code">
-        TwitterUtils.createStream(...)<br />
-        &nbsp;&nbsp;&nbsp;&nbsp;.<span class="sparkop">filter</span>(<span class="closure">_.getText.contains("Spark")</span>)<br />
-        &nbsp;&nbsp;&nbsp;&nbsp;.<span class="sparkop">countByWindow</span>(Seconds(5))
+        spark<br />
+        &nbsp;&nbsp;.<span class="sparkop">readStream</span><br />
+        &nbsp;&nbsp;.<span class="sparkop">select</span>(<span class="closure">cast("string").alias("jsonData")</span>)<br />

Review Comment:
   Just fixed it; I used the simple batch query below for verification:
   
   ```
   import org.apache.spark.sql.functions._
   import org.apache.spark.sql.types._
   
   val jsonSchema = new StructType().add("hello", StringType).add("year", IntegerType)
   
   val df = spark.range(1)
     .select(lit("{'hello': 'world', 'year': 2022}").alias("value"))
     .select($"value".cast("string").alias("jsonData"))
     .select(from_json($"jsonData", jsonSchema).alias("payload"))
   
   df.show()
   ```
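   
   If it helps as a cross-check, the same transformation can also be exercised end-to-end as an actual streaming query. This is only a rough sketch, not part of the site snippet: it assumes Spark's `MemoryStream` test source (an internal API under `org.apache.spark.sql.execution.streaming`) is acceptable for a quick local run, and the `sqlCtx` / `query` names are just illustrative.
   
   ```
   import org.apache.spark.sql.execution.streaming.MemoryStream
   import org.apache.spark.sql.functions._
   import org.apache.spark.sql.types._
   import spark.implicits._
   
   // MemoryStream needs an implicit SQLContext plus an Encoder for the element type.
   implicit val sqlCtx = spark.sqlContext
   
   val jsonSchema = new StructType().add("hello", StringType).add("year", IntegerType)
   
   // Feed the same JSON string through an in-memory streaming source (test-only API).
   val input = MemoryStream[String]
   input.addData("{'hello': 'world', 'year': 2022}")
   
   val streamingDf = input.toDF()  // a single column named "value"
     .select($"value".cast("string").alias("jsonData"))
     .select(from_json($"jsonData", jsonSchema).alias("payload"))
   
   val query = streamingDf.writeStream
     .format("console")
     .outputMode("append")
     .start()
   
   query.processAllAvailable()  // block until the buffered data has been processed
   query.stop()
   ```
   
   `processAllAvailable()` just waits for the one in-memory batch to be written, so the console sink should print the same `payload` column as the batch query above.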


