nchammas commented on a change in pull request #27459: [SPARK-30510][SQL][DOCS] Publicly document Spark SQL configuration options
URL: https://github.com/apache/spark/pull/27459#discussion_r375032396
 
 

 ##########
 File path: sql/gen-sql-markdown.py
 ##########
 @@ -218,9 +236,73 @@ def generate_sql_markdown(jvm, path):
             mdfile.write("<br/>\n\n")
 
 
+def generate_sql_configs_table(jvm, path):
+    """
+    Generates an HTML table at `path` that lists all public SQL
+    configuration options.
+    """
+    sql_configs = _list_sql_configs(jvm)
+    value_reference_pattern = re.compile(r"^<value of (\S*)>$")
+    # ConfigEntry(key=spark.buffer.size, defaultValue=65536, doc=, public=true)
+    config_entry_pattern = re.compile(
+        r"ConfigEntry\(key=(\S*), defaultValue=\S*, doc=\S*, public=\S*\)")
 
 Review comment:
  @gatorsmile @HyukjinKwon - This part here just removes the `ConfigEntry(...)` references to other configs that get embedded _within_ a docstring, which are added in places like this one:
https://github.com/apache/spark/blob/898716980dce44a4cc09411e72d64c848698cad5/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L1605-L1609
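   For illustration, here is a rough sketch (not code from this PR) of how that `config_entry_pattern` could be applied to such a docstring, replacing the interpolated `ConfigEntry(...)` text with just the config's key. The docstring below is made up:

```python
import re

# Same pattern as in the hunk above: it matches the toString of a ConfigEntry
# that was interpolated into another config's doc string.
config_entry_pattern = re.compile(
    r"ConfigEntry\(key=(\S*), defaultValue=\S*, doc=\S*, public=\S*\)")

# Made-up docstring, roughly what ends up in a config's doc when another
# ConfigEntry object (rather than its key) is interpolated into the string.
docstring = (
    "Multiplier to scale the buffer relative to "
    "ConfigEntry(key=spark.buffer.size, defaultValue=65536, doc=, public=true)."
)

# Replace the interpolated ConfigEntry with just its key (capture group 1).
cleaned = config_entry_pattern.sub(lambda m: m.group(1), docstring)
print(cleaned)
# Multiplier to scale the buffer relative to spark.buffer.size.
```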
   
   But you know what, maybe this is a bug in the docstring. Is that supposed to 
be `${BUFFER_SIZE.key}` instead?
   
   In any case, `getAllDefinedConfs()` only returns public configs. Perhaps we 
just need to add a test there? But I wonder how we would do that, since it just 
returns an array of tuples of `(name, default, docstring)`.
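   As a loose, hypothetical sketch (not part of this PR) of what a check could look like on the Python side, the generator could fail the doc build whenever a docstring returned by `getAllDefinedConfs()` still contains a raw `ConfigEntry(...)` string:

```python
import re

# Same pattern as in the hunk above.
config_entry_pattern = re.compile(
    r"ConfigEntry\(key=(\S*), defaultValue=\S*, doc=\S*, public=\S*\)")


def assert_no_config_entry_refs(sql_configs):
    """Hypothetical guard, not code from this PR.

    `sql_configs` is assumed to be the list of (name, default, docstring)
    tuples that `getAllDefinedConfs()` hands back via `_list_sql_configs`.
    """
    offenders = [
        name for name, default, docstring in sql_configs
        if config_entry_pattern.search(docstring)
    ]
    if offenders:
        raise ValueError(
            "These configs interpolate a ConfigEntry instead of its key "
            "in their docs: %s" % ", ".join(offenders))
```

   That wouldn't verify the public-only filtering, but it would at least surface any missing-`.key` interpolations automatically.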
