Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Solr Wiki" for change 
notification.

The following page has been changed by NoblePaul:
http://wiki.apache.org/solr/DataImportHandler

------------------------------------------------------------------------------
  
  We hope to expand this documentation even more by adding more and more 
examples showing off the power of this tool. Keep checking back.
  
- [[Anchor(faq)]]
- = FAQ =
-  * I'm using DataImportHandler with a MySQL database. My table is huge and 
DataImportHandler is going out of memory. Why does DataImportHandler bring 
everything into memory?
-   * DataImportHandler is designed to stream rows one by one. It passes a fetch 
size value (default: 500) to Statement#setFetchSize, which some drivers do not 
honor. For MySQL, add a batchSize property to the dataSource configuration with 
the value -1. This passes Integer.MIN_VALUE to the driver as the fetch size and 
keeps it from running out of memory on large tables.
  
-  * I'm using DataImportHandler with an MS SQL Server database and the sqljdbc 
driver. DataImportHandler is going out of memory. I tried adjusting the 
batchSize values but they don't seem to make any difference. How do I fix this?
-   * The sqljdbc driver has a connection property called responseBuffering 
whose default value is "full", which causes the entire result set to be 
fetched. See http://msdn.microsoft.com/en-us/library/ms378988.aspx for more 
details. You can set this property to "adaptive" to keep the driver from 
reading the whole result set into memory. Connection properties like this can 
be set as an attribute (responseBuffering="adaptive") in the dataSource 
configuration or directly in the JDBC URL specified in DataImportHandler's 
dataSource configuration.
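  As a sketch, either of the two ways of setting the property described above 
might look like this (server name, database name, and credentials are 
placeholders):

```xml
<!-- data-config.xml: responseBuffering="adaptive" is passed to the
     sqljdbc driver as a connection property -->
<dataSource type="JdbcDataSource"
            driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
            url="jdbc:sqlserver://localhost;databaseName=mydb"
            user="db_user"
            password="db_pass"
            responseBuffering="adaptive"/>

<!-- Alternatively, set it directly in the JDBC URL -->
<dataSource type="JdbcDataSource"
            driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
            url="jdbc:sqlserver://localhost;databaseName=mydb;responseBuffering=adaptive"
            user="db_user"
            password="db_pass"/>
```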
  
  ----
  CategorySolrRequestHandler
