Is SQLite a constraint?

From a big-data perspective it is better to find a more natural format like
Parquet or even JSON. For example, what if you have different SQLite versions?

On Thu, Dec 23, 2021 at 17:02, Дмитрий Иванов <
firstdis...@gmail.com> wrote:

> Or, if you want to extend this theme, you can use a PostgreSQL-based
> "SQLite file player":
> PostgreSQL plus a Python [sqlite3] extension. This way you can provide
> direct access to SQLite files without duplicating data in PostgreSQL
> cluster tables.
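
A rough sketch of what such a "file player" function could look like (this assumes the plpython3u extension is available; `read_sqlite` is an invented name for illustration, not part of any existing extension):

```sql
-- Sketch only: expose rows from an SQLite file directly in a query,
-- without copying the data into PostgreSQL cluster tables.
CREATE EXTENSION IF NOT EXISTS plpython3u;

CREATE FUNCTION read_sqlite(db_path text, query text)
RETURNS SETOF record
AS $$
    import sqlite3
    # Open the file read-only so concurrent writers are not blocked.
    con = sqlite3.connect("file:%s?mode=ro" % db_path, uri=True)
    try:
        return con.execute(query).fetchall()
    finally:
        con.close()
$$ LANGUAGE plpython3u;

-- A column definition list is required when calling a SETOF record function:
-- SELECT * FROM read_sqlite('/data/run.db', 'SELECT id, val FROM t')
--          AS t(id int, val text);
```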
> PS: It may seem that this will reduce performance. When I started my
> project, I had some preconceptions about Python. But analyzing projects
> like Patroni changed my mind.
> --
> Regards, Dmitry!
>
>
> On Wed, Dec 22, 2021 at 10:24, David G. Johnston <david.g.johns...@gmail.com
> > wrote:
>
>> On Tue, Dec 21, 2021 at 10:06 PM David Gauthier <davegauthie...@gmail.com>
>> wrote:
>>
>>> I'll have to read more about sqlite_fdw. Thanks for that, Steve!
>>>
>>> Each SQLite DB isn't that big (not billions of records), more like 30K
>>> records or so.  But there are lots and lots of these SQLite DBs, which add
>>> up over time to perhaps billions of records.
>>>
>>> This is for a big corp with an IT dept.  Maybe I can get them to upgrade
>>> the DB itself.
>>> Thank you too, David!
>>>
>> So, more similar to the image-storage question than I first thought, but
>> still large enough that the specific usage patterns and needs end up being
>> the deciding factor (keeping in mind you can pick multiple solutions, so
>> that really old data, ideally on a partition, can be removed from the DB
>> while still remaining accessible, if more slowly or laboriously).
>>
>> One possibility to consider: ditch the SQLite dependency and just store
>> CSV (but maybe with a funky delimiter sequence).  You can then use
>> "string_to_table(...)" on that delimiter to materialize a table out of the
>> data right in a query.
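
One way that suggestion might look in practice (string_to_table exists in PostgreSQL 14 and later; the table name and delimiter here are invented for illustration):

```sql
-- Sketch only: store each record as one packed text line, using
-- chr(31) (the ASCII unit separator) as the "funky delimiter",
-- which is unlikely to appear in real field values.
CREATE TABLE raw_lines (id bigint, line text);

INSERT INTO raw_lines
VALUES (1, 'alpha' || chr(31) || 'beta' || chr(31) || 'gamma');

-- string_to_table splits one packed line back into rows in the query:
SELECT string_to_table(line, chr(31)) AS field
FROM raw_lines
WHERE id = 1;
```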
>>
>> David J.
>>
>>
