reswqa commented on code in PR #25945:
URL: https://github.com/apache/flink/pull/25945#discussion_r1914028429


##########
docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md:
##########
@@ -75,31 +75,12 @@ wordCounts.keyBy(value -> value.f0);
 
 
 ```
-
-{{< /tab >}}
-{{< tab "Scala" >}}
-
-Scala case classes (and Scala tuples which are a special case of case classes), are composite types that contain a fixed number of fields with various types. Tuple fields are addressed by their 1-offset names such as `_1` for the first field. Case class fields are accessed by their name.
-
-```scala
-case class WordCount(word: String, count: Int)
-val input = env.fromElements(
-    WordCount("hello", 1),
-    WordCount("world", 2)) // Case Class Data Set
-
-input.keyBy(_.word)
-
-val input2 = env.fromElements(("hello", 1), ("world", 2)) // Tuple2 Data Set
-
-input2.keyBy(value => (value._1, value._2))
-```
-
 {{< /tab >}}

Review Comment:
   ditto.



##########
docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md:
##########
@@ -143,33 +124,17 @@ DataStream<WordWithCount> wordCounts = env.fromElements(
 
 wordCounts.keyBy(value -> value.word);
 
-```
-{{< /tab >}}
-{{< tab "Scala" >}}
-```scala
-class WordWithCount(var word: String, var count: Int) {
-    def this() {
-      this(null, -1)
-    }
-}
-
-val input = env.fromElements(
-    new WordWithCount("hello", 1),
-    new WordWithCount("world", 2)) // Case Class Data Set
-
-input.keyBy(_.word)
-
 ```
 {{< /tab >}}

Review Comment:
   ditto.



##########
docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md:
##########
@@ -366,20 +328,6 @@ TypeInformation<Tuple2<String, Double>> info = TypeInformation.of(new TypeHint<T
 Internally, this creates an anonymous subclass of the TypeHint that captures the generic information to preserve it
 until runtime.
 {{< /tab >}}
 {{< /tab >}}
-{{< tab "Scala" >}}
-In Scala, Flink uses *macros* that runs at compile time and captures all generic type information while it is
-still available.
-```scala
-// important: this import is needed to access the 'createTypeInformation' macro function
-import org.apache.flink.streaming.api.scala._
-
-val stringInfo: TypeInformation[String] = createTypeInformation[String]
-
-val tupleInfo: TypeInformation[(String, Double)] = createTypeInformation[(String, Double)]
-```
-
-You can still use the same method as in Java as a fallback.
-{{< /tab >}}

Review Comment:
   ditto.



##########
docs/content/docs/dev/datastream/fault-tolerance/serialization/custom_serialization.md:
##########
@@ -56,18 +56,6 @@ ListStateDescriptor<Tuple2<String, Integer>> descriptor =
 checkpointedState = getRuntimeContext().getListState(descriptor);
 ```
 {{< /tab >}}
-{{< tab "Scala" >}}
-```scala
-class CustomTypeSerializer extends TypeSerializer[(String, Integer)] {...}
-
-val descriptor = new ListStateDescriptor[(String, Integer)](
-    "state-name",
-    new CustomTypeSerializer)
-)
-
-checkpointedState = getRuntimeContext.getListState(descriptor)
-```
-{{< /tab >}}

Review Comment:
   After removing the Scala part, only one tab remains. I suggest also removing the top-level `tab` shortcode, perhaps replacing it with a plain code fence (e.g. ```java).
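
   For illustration, a minimal sketch of the suggested restructuring (the `tabs` id and the snippet content here are placeholders, not taken from the actual file):

   Before, with the single remaining tab:

   ````
   {{< tabs "custom-serializer" >}}
   {{< tab "Java" >}}
   ```java
   ListStateDescriptor<Tuple2<String, Integer>> descriptor = ...;
   ```
   {{< /tab >}}
   {{< /tabs >}}
   ````

   After, with the shortcodes dropped in favor of a plain fence:

   ````
   ```java
   ListStateDescriptor<Tuple2<String, Integer>> descriptor = ...;
   ```
   ````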


