cloud-fan commented on a change in pull request #26001: [SPARK-29331][SQL] create DS v2 Write at physical plan
URL: https://github.com/apache/spark/pull/26001#discussion_r331843957
##########
File path: external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaWriter.scala
##########
@@ -50,7 +50,7 @@ private[kafka010] object KafkaWriter extends Logging {
topic: Option[String] = None): Unit = {
schema.find(_.name == TOPIC_ATTRIBUTE_NAME).getOrElse(
if (topic.isEmpty) {
- throw new AnalysisException(s"topic option required when no " +
+ throw new IllegalArgumentException(s"topic option required when no " +
Review comment:
> I think we typically want to always raise SparkException because all exception types inherit from it.
In Spark SQL, no exceptions inherit from it. In fact, `SparkException` was rarely used in Spark SQL before we added the v2 commands. `SparkException` is defined in Spark Core and is usually thrown when Spark fails to run a task. In Spark SQL, `AnalysisException` and standard Java exceptions are more widely used.
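
To illustrate the convention, here is a minimal, self-contained sketch mirroring the check in the diff: a missing user-supplied option is a caller error, so a plain `IllegalArgumentException` fits, while `SparkException` stays reserved for task execution failures. The object `TopicValidation`, the method `resolveTopic`, and the literal column name "topic" are hypothetical stand-ins, not actual Spark code:

```scala
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Hypothetical illustration of the convention discussed above:
// option-validation errors in Spark SQL connectors throw standard Java
// exceptions (here IllegalArgumentException), rather than SparkException,
// which Spark Core uses when a task fails at runtime.
object TopicValidation {
  def resolveTopic(schema: StructType, topic: Option[String]): String = {
    // Prefer a per-row "topic" column if the schema provides one;
    // otherwise fall back to the "topic" option, failing fast if absent.
    schema.find(_.name == "topic") match {
      case Some(field) => field.name
      case None => topic.getOrElse {
        throw new IllegalArgumentException(
          "topic option required when no 'topic' attribute is present")
      }
    }
  }

  def main(args: Array[String]): Unit = {
    val withTopicCol = StructType(Seq(StructField("topic", StringType)))
    val withoutTopicCol = StructType(Seq(StructField("value", StringType)))

    println(resolveTopic(withTopicCol, None))              // prints "topic"
    println(resolveTopic(withoutTopicCol, Some("events"))) // prints "events"
    // resolveTopic(withoutTopicCol, None) would throw IllegalArgumentException
  }
}
```

Failing fast with `IllegalArgumentException` here also avoids deferring the error to task execution, where it would surface wrapped in a `SparkException` with a less direct message.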