On Tue, Jun 27, 2017 at 4:18 AM, Richard Hipp <[email protected]> wrote:
> The CSV import feature of the SQLite command-line shell expects to
> find UTF-8.  It does not understand other encodings, and I have no
> plans to add converters for alternative encodings any time soon.
>
> The latest version of trunk skips over a UTF-8 BOM at the beginning of
> the input file.

A little late, but it occurred to me how to make this "work" with
older versions of sqlite3 that support readfile / writefile. Say I
have a UTF-8 file that starts with a BOM. I can trim the BOM from
within SQLite, then import the trimmed version:

sqlite> select writefile('temp.csv', substr(readfile('utf8.csv'), 4));
<size of trimmed blob>
sqlite> .import temp.csv temp
sqlite> .import utf8.csv utf8
sqlite> .schema
CREATE TABLE temp(
  "a" TEXT,
  "b" TEXT,
  "c" TEXT,
  "d" TEXT
);
CREATE TABLE utf8(
  "?a" TEXT,
  "b" TEXT,
  "c" TEXT,
  "d" TEXT
);
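
The same trick, sketched in Python for anyone following along outside
the shell (file names and the trim_bom helper are illustrative, not
part of the original post): substr(readfile(...), 4) drops the first
three bytes, which is exactly the UTF-8 encoding of the BOM.

```python
# Hypothetical stand-in for the readfile()/writefile() trick above:
# strip a leading 3-byte UTF-8 BOM before handing the CSV to .import.

BOM = b"\xef\xbb\xbf"  # UTF-8 encoding of U+FEFF

def trim_bom(src: str, dst: str) -> int:
    """Copy src to dst, dropping a leading UTF-8 BOM if present.

    Returns the number of bytes written, matching what
    writefile() prints in the transcript above.
    """
    with open(src, "rb") as f:
        data = f.read()
    if data.startswith(BOM):
        data = data[len(BOM):]  # same effect as substr(..., 4) in SQLite
    with open(dst, "wb") as f:
        f.write(data)
    return len(data)
```

Note that substr() here is byte-oriented only because readfile()
returns a BLOB; on TEXT values substr() counts characters instead.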

Alternatively, without readfile / writefile support, the BOM can be
stripped out of the column name after the import:

sqlite> pragma writable_schema = 1;
sqlite> update sqlite_master set sql = replace(sql, char(0xFEFF), '')
   ...> where name = 'utf8';
sqlite> pragma writable_schema = 0;
sqlite> vacuum;
sqlite> .schema
CREATE TABLE temp(
  "a" TEXT,
  "b" TEXT,
  "c" TEXT,
  "d" TEXT
);
CREATE TABLE utf8(
  "a" TEXT,
  "b" TEXT,
  "c" TEXT,
  "d" TEXT
);
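
For the curious, here is the core of that schema fix demonstrated at
the string level in Python (the CREATE TABLE text is taken from the
example schema above; the in-memory connection is just for the demo).
The "?" shown in "?a" by .schema is really the BOM character U+FEFF,
which became part of the first column name on import, and SQLite's
replace() with char(0xFEFF) removes it:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# What sqlite_master holds after importing the un-trimmed file:
# the first column name begins with U+FEFF (rendered as "?" above).
sql = 'CREATE TABLE utf8(\n  "\ufeffa" TEXT,\n  "b" TEXT\n)'

# The expression used in the UPDATE against sqlite_master:
(cleaned,) = con.execute(
    "SELECT replace(?, char(0xFEFF), '')", (sql,)
).fetchone()
```

After the UPDATE, the writable_schema pragma is turned back off and
VACUUM rebuilds the database so the corrected schema takes effect.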

Still, it's not nearly as friendly as having the sqlite shell do it for you.

-- 
Scott Robison
_______________________________________________
sqlite-users mailing list
[email protected]
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users