Thanks, everyone, for your responses on Tuesday. I wasn't on the mailing list, so I didn't receive them by email, but I'm subscribed now, and I can see the responses at https://www.mail-archive.com/sqlite-users@mailinglists.sqlite.org/msg118979.html. Much appreciated!
> Can you show us some place where using 999 parameters is a reasonable way to use SQLite?

The use case I have is that I'm a maintainer of Drupal <https://www.drupal.org/>, a CMS written in PHP. In Drupal, we support multiple database backends. Typically, a production site will run on MySQL or PostgreSQL, but we also support SQLite, which is convenient for local testing. We have a database abstraction API that allows module and site developers to write code like:

  $all_results = $db->select('table_foo')
    ->fields('table_foo', ['field_1', 'field_2'])
    ->condition('field_3', $large_array, 'IN')
    ->execute()
    ->fetchAll();

The implementation of execute() above ends up calling a PHP PDO <https://www.php.net/manual/en/book.pdo.php> statement that's more or less as follows:

  $statement = $pdo->prepare("SELECT field_1, field_2 FROM table_foo WHERE field_3 IN (?, ?, ... ?)");
  return $statement->execute($large_array);

A similar use case is insertion. For example, a module or site developer can call:

  $upsert = $connection->upsert('kv')->key('k')->fields(['k', 'v']);
  foreach ($large_associative_array as $key => $value) {
    $upsert->values(['k' => $key, 'v' => $value]);
  }
  $upsert->execute();

The execute() above ends up calling a PHP PDO statement that's appropriate for the database backend. On SQLite prior to 3.24 it's more or less:

  $statement = $pdo->prepare("INSERT OR REPLACE INTO kv (k, v) VALUES (?, ?), (?, ?), ... (?, ?)");
  return $statement->execute($values);

In the case of the multi-row insert/upsert, we can change our implementation to execute multiple PDO statements in succession in order to keep each one below the 999 limit.

In the case of the IN (...), thank you for your suggestions to use a temp table. Because there can be multiple calls to ->condition(), including nested ones, it might be a bit tricky for us to get the temp tables implemented correctly.
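For anyone curious what the chunked-statement approach looks like concretely, here's a minimal sketch. It's in Python with the stdlib sqlite3 module rather than PHP/PDO, purely for illustration, and the kv table and upsert_chunked helper are hypothetical names of mine, not Drupal API:

```python
import sqlite3

MAX_VARS = 999  # SQLite's historical default SQLITE_MAX_VARIABLE_NUMBER


def upsert_chunked(conn, rows):
    """Insert-or-replace (k, v) pairs in batches that stay under the limit."""
    vars_per_row = 2  # each row binds two parameters: k and v
    rows_per_stmt = MAX_VARS // vars_per_row  # 499 rows -> 998 parameters
    for i in range(0, len(rows), rows_per_stmt):
        chunk = rows[i:i + rows_per_stmt]
        placeholders = ", ".join("(?, ?)" for _ in chunk)
        flat = [value for row in chunk for value in row]
        conn.execute(
            f"INSERT OR REPLACE INTO kv (k, v) VALUES {placeholders}", flat)


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT)")
# 2000 rows would need 4000 parameters in one statement; chunking keeps
# every individual statement at or below 998.
upsert_chunked(conn, [(f"key{i}", f"val{i}") for i in range(2000)])
```

The trade-off is more round trips through the prepare/execute machinery, but each statement stays within any stock build's limit.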
We'd need to track multiple temp tables and then drop them at the correct time, which would break the current encapsulation of our API, but it's potentially doable with some refactoring. However, your temp table idea inspired me to come up with an alternate solution using json_each(). That first example could then be implemented as:

  $statement = $pdo->prepare("SELECT field_1, field_2 FROM table_foo WHERE field_3 IN (SELECT value FROM json_each(?))");
  return $statement->execute([json_encode($large_array)]);

If you're curious, I posted a proof-of-concept patch for both the upsert and the large IN condition to https://www.drupal.org/project/drupal/issues/2031261#comment-13454464.

In summary, yes, I think we can change things in Drupal to work within the 999 limit. I think there might be other PHP projects that could benefit from a higher limit (for example, ones that don't have Drupal's abstraction API around PDO), but I don't know enough specifics about those to argue on their behalf.

Thanks again for your consideration and insights.

Alex.

On Tue, Feb 4, 2020 at 10:27 AM Alex Bronstein <effulgents...@gmail.com> wrote:

> Prepared statements are good practice and recommended in places such as
> https://www.php.net/manual/en/pdo.prepared-statements.php. There are use
> cases for using them with many items in an IN() clause, or when inserting
> many rows in a single INSERT statement. In such cases, you can easily end
> up with more than 999 parameters.
>
> While SQLite can be compiled with a larger SQLITE_MAX_VARIABLE_NUMBER
> flag, there are situations where the application developer doesn't have
> control over how the system libraries are compiled. For example, a given
> PHP application could run on either a stock Debian/Ubuntu installation, a
> stock Fedora/RHEL/CentOS installation, or other systems.
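To make the json_each() trick concrete for anyone following along: the whole list travels as a single bound parameter holding a JSON array, so the statement's parameter count is constant no matter how long the list is. A minimal sketch in Python's stdlib sqlite3 (table and column names are hypothetical; this requires a build with JSON support, which is compiled in by default in SQLite 3.38+ and commonly enabled in earlier distro builds):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_foo (field_1, field_2, field_3)")
conn.executemany("INSERT INTO table_foo VALUES (?, ?, ?)",
                 [(i, i * 10, i * 100) for i in range(5)])

# One parameter carries the whole list, regardless of its length.
large_array = [100, 300, 400]
rows = conn.execute(
    "SELECT field_1, field_2 FROM table_foo "
    "WHERE field_3 IN (SELECT value FROM json_each(?)) "
    "ORDER BY field_1",
    (json.dumps(large_array),)).fetchall()
```

Since json_each() is a table-valued function, the IN (...) becomes an ordinary subquery, so nesting multiple conditions composes naturally, unlike the temp-table approach.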
> Debian compiles SQLite with SQLITE_MAX_VARIABLE_NUMBER=250000
> <https://sources.debian.org/src/sqlite3/3.27.2-3/debian/rules/#L50>
> (issue <https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=717900>),
> whereas Fedora doesn't set the flag
> <https://git.centos.org/rpms/sqlite/blob/c8/f/SPECS/sqlite.spec#_152>
> (issue <https://bugzilla.redhat.com/show_bug.cgi?id=1798134>), so it gets
> SQLite's default.
>
> According to some of the answers on
> https://stackoverflow.com/questions/6581573/what-are-the-max-number-of-allowable-parameters-per-database-provider-type,
> PostgreSQL and some other databases support a 16-bit parameter count (64K
> parameters). Given current memory availability, can SQLite's default be
> raised to something similar to that?

_______________________________________________
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
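P.S. For application code that can't control how the system library was compiled, the effective compile-time ceiling can be read at run time via SQLite's sqlite3_limit() interface, so code can pick a chunk size instead of hard-coding 999. A minimal Python sketch (the getlimit() wrapper only exists in Python 3.11+, hence the guard):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

limit = None
if hasattr(conn, "getlimit"):
    # Connection.getlimit() (Python 3.11+) wraps sqlite3_limit() and returns
    # the current SQLITE_LIMIT_VARIABLE_NUMBER for this connection.
    limit = conn.getlimit(sqlite3.SQLITE_LIMIT_VARIABLE_NUMBER)
```

Every standard build reports at least 999 here; builds like Debian's will report their larger compiled-in value.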