Re: Custom line/record delimiter
Thanks for the update, Kwon.

Regards,

On Mon, Jan 1, 2018 at 7:54 PM Hyukjin Kwon wrote:
> Hi,
>
> There's a PR - https://github.com/apache/spark/pull/18581 and JIRA -
> SPARK-21289
>
> Alternatively, you could check out the multiLine option for CSV and see if
> it is applicable.
>
> Thanks.
>
> 2017-12-30 2:19 GMT+09:00 sk skk:
> [quoted original message trimmed]
Re: Custom line/record delimiter
Hi,

There's a PR - https://github.com/apache/spark/pull/18581 and JIRA -
SPARK-21289

Alternatively, you could check out the multiLine option for CSV and see if
it is applicable.

Thanks.

2017-12-30 2:19 GMT+09:00 sk skk:
> Hi,
>
> Do we have an option to write a CSV or text file with a custom record/line
> separator through Spark?
>
> I could not find any reference in the API. I have an issue while loading
> data into a warehouse: one of the columns in the CSV has a newline
> character, and the warehouse does not allow escaping that newline
> character.
>
> Thank you,
> Sk
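For anyone following along, here is a small sketch of the behavior the multiLine option is about. It uses Python's standard csv module rather than Spark itself, purely to illustrate the concept: when a field containing a newline is properly quoted, a quote-aware reader still recovers the logical records even though one of them spans two physical lines.

```python
import csv
import io

# One logical record whose second field contains an embedded newline;
# because the field is quoted, the record spans two physical lines.
data = 'id1,"line one\nline two"\r\nid2,plain\r\n'

# A quote-aware CSV reader (the kind of parsing Spark's multiLine option
# enables for CSV sources) still yields exactly two logical records.
records = list(csv.reader(io.StringIO(data)))

assert len(records) == 2
assert records[0][1] == "line one\nline two"
```

If the downstream warehouse loader is not quote-aware in this way, it will see three records instead of two, which matches the failure described in the original question.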
Custom line/record delimiter
Hi,

Do we have an option to write a CSV or text file with a custom record/line
separator through Spark?

I could not find any reference in the API. I have an issue while loading
data into a warehouse: one of the columns in the CSV has a newline
character, and the warehouse does not allow escaping that newline
character.

Thank you,
Sk
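A common workaround for the problem described above is to replace the embedded newlines in the offending column before writing the CSV, so that every record occupies exactly one physical line. The sketch below shows the idea with Python's standard csv module (not Spark itself); in Spark, a per-column substitution like pyspark.sql.functions.regexp_replace could play the same role, but that mapping is an assumption here, not something stated in the thread.

```python
import csv
import io

# A value with an embedded newline, like the warehouse-breaking column.
rows = [["id1", "line one\nline two"], ["id2", "plain"]]

buf = io.StringIO()
csv.writer(buf).writerows(rows)
# The first record now spans two physical lines, so there are more line
# breaks in the output than there are records.
assert buf.getvalue().count("\n") > len(rows)

# Workaround: replace embedded newlines in every column before writing.
cleaned = [[col.replace("\n", " ") for col in row] for row in rows]

buf2 = io.StringIO()
csv.writer(buf2).writerows(cleaned)
# Now exactly one line terminator per record.
assert buf2.getvalue().count("\n") == len(rows)
```

This loses the literal newline inside the value, of course, so it is only appropriate when the warehouse side genuinely cannot accept quoted multi-line fields.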