mlegore opened a new issue, #24467:
URL: https://github.com/apache/beam/issues/24467

   ### What happened?
   
   I am seeing an error in my dataflow job:
   
   ```
   org.apache.beam.sdk.util.UserCodeException: 
java.lang.IllegalArgumentException: No enum constant 
com.sift.api.constants.proto.entitytype.EntityType.2022-10-17 01:18:05.235
   ```
   
   The fields being returned to my SnowflakeIO.CsvMapper are:
   ```
   fields[0] = customerid, 
   fields[1] = ....',USER,....', 
   fields[2] = 2022-10-01 01:10:01.000
   ```
   but they should be
   ```
   fields[0] = customerid, 
   fields[1] = ....'
   fields[2] = USER
   fields[3] = ....'
   fields[4] = 2022-10-01 01:10:01.000
   ```
   
   Switching to the `"` quote character is not a sufficient workaround for me because the second field in my query can contain both `'` and `"` characters, so the issue persists even if I switch.
   
   Repro: Set up a read with a Snowflake query that produces two columns: a string column containing the quote character (in my case `'`, but it could also be `"`), and a second column containing anything. The SnowflakeIO connector will incorrectly decide that the first column continues past the `,` delimiter, so the second column won't exist or will be malformed.
   
   e.g.
   
   ```
   NAME, DATE
   ....',2022-01-01
   ```
   
   will be parsed as
   
   NAME = `....',2022-01-01`
   DATE = ???
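   To make the failure mode concrete, here is a minimal sketch of quote-aware CSV field splitting. This is a hypothetical helper, not the actual SnowflakeIO parser: it shows how an unload of the value `....'` without an ESCAPE character (the embedded quote comes out doubled) merges the two columns into one, while the same value escaped with `\` splits correctly.

   ```java
   import java.util.ArrayList;
   import java.util.List;

   public class CsvEscapeDemo {
       // Split one CSV line honoring a quote char and an optional escape char.
       static List<String> split(String line, char quote, Character escape) {
           List<String> fields = new ArrayList<>();
           StringBuilder cur = new StringBuilder();
           boolean inQuotes = false;
           for (int i = 0; i < line.length(); i++) {
               char c = line.charAt(i);
               if (escape != null && c == escape && i + 1 < line.length()) {
                   cur.append(line.charAt(++i)); // escaped char taken literally
               } else if (c == quote) {
                   inQuotes = !inQuotes;         // toggle quoted section
               } else if (c == ',' && !inQuotes) {
                   fields.add(cur.toString());   // field boundary
                   cur.setLength(0);
               } else {
                   cur.append(c);
               }
           }
           fields.add(cur.toString());
           return fields;
       }

       public static void main(String[] args) {
           // With ESCAPE = '\': the embedded quote no longer ends the field.
           System.out.println(split("'....\\'',2022-01-01", '\'', '\\'));
           // -> [....', 2022-01-01]

           // Without an escape char, the stray quote re-opens a quoted section
           // and the delimiter is swallowed into the first field.
           System.out.println(split("'....'',2022-01-01", '\'', null));
           // -> [....,2022-01-01]   (one merged field, the reported symptom)
       }
   }
   ```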
   
   The connector should make use of the Snowflake file format ESCAPE field to properly escape the quote character on unload, and the SnowflakeIO connector should honor that escape when parsing the fields.
   
   Expected behavior:
   SnowflakeIO provides an escape sequence and properly parses the CSV fields 
from my query, even when the fields contain the quote char.
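   The writer side of the expected behavior can be sketched too. Assuming the unload uses an ESCAPE character of `\` alongside the `'` quote (this is a hypothetical helper, not the actual connector code), the embedded quote would be serialized so it survives a round trip:

   ```java
   public class CsvQuoteDemo {
       // Enclose a value in the quote char, escaping embedded quote/escape chars.
       static String enclose(String value, char quote, char escape) {
           StringBuilder sb = new StringBuilder().append(quote);
           for (char c : value.toCharArray()) {
               if (c == quote || c == escape) sb.append(escape); // escape specials
               sb.append(c);
           }
           return sb.append(quote).toString();
       }

       public static void main(String[] args) {
           System.out.println(enclose("....'", '\'', '\\')); // -> '....\''
       }
   }
   ```

   A reader configured with the same quote and escape characters can then unambiguously recover `....'` as a single field.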
   
   ### Issue Priority
   
   Priority: 3
   
   ### Issue Component
   
   Component: io-java-snowflake

