Hey Mark,
I believe `_spark_metadata` is the name of the subdirectory that Spark's FileStreamSink uses to store metadata about which output files are valid; see the comment in the code:
https://github.com/apache/spark/blob/v2.3.0/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSink.scala#L33
Do you see the exception?
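For context, Spark treats a directory as a streaming file sink based on whether this subdirectory exists under the output path. The sketch below is a minimal Python illustration of that kind of existence probe, not Spark's actual API; the function name is made up for this example. One plausible reason the Alluxio master logs a FileDoesNotExistException is that such a probe targets a path that simply is not there, which may be benign rather than a real failure.

```python
from pathlib import Path

# Name of the metadata subdirectory that FileStreamSink writes
# inside its output path (this string is what Spark uses).
SPARK_METADATA_DIR = "_spark_metadata"

def looks_like_file_stream_sink(output_path: str) -> bool:
    """Illustrative helper: does `output_path` contain a _spark_metadata dir?

    If the subdirectory is absent, the probe just comes back negative;
    a filesystem layer may still log the missing path as an exception.
    """
    return (Path(output_path) / SPARK_METADATA_DIR).is_dir()
```

If the check is negative, Spark reads the directory as a plain (non-streaming) data source, so a logged "does not exist" for `_spark_metadata` does not by itself mean the job failed.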
Hey,
When running Spark on Alluxio 1.8.2, I encounter the following exception in the Alluxio master.log: "alluxio.exception.FileDoesNotExistException: Path '/test-data/_spark_metadata' does not exist". What exactly is the "_spark_metadata" directory used for, and how can I fix this problem?