at work, we have a browser app that loads and persists ~100MB (500k-row)
csv files into wasm-sqlite3 [1] (ingestion time is ~15s for a 100MB csv).
we would like to go bigger, but chrome's indexeddb has a hard limit of
125MB per key-value object (on windows).
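
for context, the persistence step is roughly this shape -- a minimal
sketch, not our actual code; the store/key/function names below are made
up for illustration:

  // persist() is a made-up name; the point is that the whole database
  // file ends up as a single value in one object-store entry.
  function persist(db) {                          // db: a sql.js Database
    var bytes = db.export();                      // Uint8Array of the serialized db file
    var open = indexedDB.open('sqlite-cache', 1);
    open.onupgradeneeded = function () {
      open.result.createObjectStore('dbs');
    };
    open.onsuccess = function () {
      var tx = open.result.transaction('dbs', 'readwrite');
      // this single put() is where the per-value limit bites once the
      // exported file grows past chrome's cap
      tx.objectStore('dbs').put(bytes, 'main');
      tx.oncomplete = function () { open.result.close(); };
    };
  }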

i don't have anything actionable.  i just want people to be aware of this
datapoint, and to think about javascript language design that could
improve the UX of handling sqlite3 persistence.

fyi, ignoring persistence, chrome can handle wasm-sqlite3 datasets as large
as 300MB (1.5 million rows) in memory before crashing (on an 8GB windows 10
machine).
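
for anyone curious, the in-memory ingestion path looks roughly like this
with sql.js (a sketch only; the table/column names are placeholders, not
our real schema):

  // SQL is the global exported by the sql.js script; the database lives
  // entirely in wasm memory until it is exported/persisted.
  var db = new SQL.Database();
  db.run('CREATE TABLE rows (a TEXT, b TEXT, c REAL)');

  // parsedRows: array of arrays from whatever csv parser is in use
  function ingest(parsedRows) {
    var stmt = db.prepare('INSERT INTO rows VALUES (?, ?, ?)');
    db.run('BEGIN');
    for (var i = 0; i < parsedRows.length; i++) {
      stmt.run(parsedRows[i]);                    // bind + execute one row
    }
    db.run('COMMIT');
    stmt.free();
  }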

[1] sql.js wasm-sqlite3
https://github.com/kripken/sql.js