I index in 10K batches and commit after 5 index cycles (i.e. after 50K). Is
there any limitation that prevents searching during a commit or
auto-warming? I have 8 CPU cores and only 2 were showing busy (using
top), so it's unlikely that the CPU was pegged.
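The batching/commit cadence described above can be sketched as follows (a minimal illustration with hypothetical send_batch/commit callbacks, not the actual indexing code from this setup):

```python
def index_with_commits(docs, batch_size=10_000, commit_every=5,
                       send_batch=None, commit=None):
    """Send docs in fixed-size batches, committing after every
    `commit_every` batches (e.g. 10K docs x 5 batches = commit per 50K)."""
    batch_count = 0
    for start in range(0, len(docs), batch_size):
        send_batch(docs[start:start + batch_size])
        batch_count += 1
        if batch_count % commit_every == 0:
            commit()
    if batch_count % commit_every != 0:
        commit()  # flush any remainder at the end
```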
2009/4/12 Noble Paul നോബിള് नोब्ळ्
On Mon, Apr 13, 2009 at 12:36 PM, vivek sar vivex...@gmail.com wrote:
I index in 10K batches and commit after 5 index cycles (i.e. after 50K). Is
there any limitation that prevents searching during a commit or
auto-warming? I have 8 CPU cores and only 2 were showing busy (using
top), so it's unlikely
Hello,
I'm trying to import a simple book table with the full-import command. The
data is stored in MySQL.
It worked well when I tried to import a few fields from the 'book' table:
title, author, publisher, etc.
Now I would like to create a facet (multi-valued field) with the categories
which
2009/4/13 Vincent Pérès vincent.pe...@gmail.com
<dataConfig>
  <dataSource
    type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
    url="jdbc:mysql://localhost:33061/completelynovel" user="root" password=""
  />
  <document name="books">
    <entity name="book" pk="ID" query="select isbn, listing_id AS id, title,
It is likely that your query did not return any data. Just run the
query separately and see if it really works.
Or try it out in debug mode; it will tell you which query was run and
what got returned.
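Debug mode can be invoked over HTTP; a sketch of building the request URL (host and port are assumptions, and the exact debug parameter names are as I recall them from the DataImportHandler wiki, so verify against your version):

```python
from urllib.parse import urlencode

# Hypothetical host/port; "debug=on" / "verbose=true" per the DIH wiki.
params = {"command": "full-import", "debug": "on", "verbose": "true"}
url = "http://localhost:8983/solr/dataimport?" + urlencode(params)
print(url)
```

The response should then include the actual SQL that was run and the rows returned for each entity.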
--Noble
2009/4/13 Vincent Pérès vincent.pe...@gmail.com:
Hello,
I'm trying to import a
I changed the ISBN to lowercase (and the other fields as well) and it works!
Thanks very much!
--
View this message in context:
http://www.nabble.com/DataImportHandler-with-multiple-values-tp23022195p23023374.html
Sent from the Solr - User mailing list archive at Nabble.com.
Also, in reference to the other question, I'm currently trying to edit the
main search page to search multiple fields.
Essentially, I detect if each field has been posted or not using:
if ($_POST['FIELD'] != '') {
    $query = $query . '+FIELDNAME:' . $_POST['FIELD'];
}
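The per-field check above can be generalized into a small helper; a sketch (Python used for illustration, field names hypothetical, and note that real user input would still need escaping of Lucene special characters):

```python
def build_query(posted, fields=("Date", "Author", "Content")):
    """Join a required (+) clause for each non-empty posted field.

    Caveat: values are inserted verbatim; real input would still need
    escaping of Lucene special characters (+ - && || ! ( ) etc.).
    """
    return " ".join("+%s:%s" % (f, posted[f]) for f in fields if posted.get(f))
```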
Once it's processed all the
On Apr 13, 2009, at 11:20 AM, Johnny X wrote:
Also, in reference to the other question, I'm currently trying to
edit the
main search page to search multiple fields.
Essentially, I detect if each field has been posted or not using:
if ($_POST['FIELD'] != '') {
$query = $query .
Hi there
I installed Solr on Tomcat 6, and whenever I click search it displays the XML
as if I were editing it.
Is that normal?
I added a connector line in my server.xml below.
--
<Connector port="8080" protocol="HTTP/1.1"
    connectionTimeout="2
Do you know the specific syntax when querying different fields?
http://localhost:8080/solr/select/?q=Date:%222000%22&version=2.2&start=0&rows=10&indent=on
doesn't appear to return anything when I post it in my browser, when it
should, but (as before) if you change 'Date' to 'Content' it works!
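A reliable way to avoid separator and escaping mistakes in such URLs is to build them programmatically; a sketch using the parameters from the URL above:

```python
from urllib.parse import urlencode

# urlencode inserts the '&' separators and percent-escapes the quotes in q.
params = {"q": 'Date:"2000"', "version": "2.2",
          "start": 0, "rows": 10, "indent": "on"}
url = "http://localhost:8080/solr/select/?" + urlencode(params)
print(url)
```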
A further update on this is that (when 'Date' is searched using the same URL
as posted in the previous message), whether Date is of type string or text,
the full (exact) content of a field has to be searched to return a result.
Why is this not the case with Content? I tried changing the default
The query method seems to only support solr/select requests. I subclassed
SolrRequest and created a request class that supports solr/autoSuggest -
following the pattern in LukeRequest. It seems to work fine for me.
Clay
-Original Message-
From: Grant Ingersoll
Here is some more information about my setup,
Solr - v1.4 (nightly build 03/29/09)
Servlet Container - Tomcat 6.0.18
JVM - 1.6.0 (64 bit)
OS - Mac OS X Server 10.5.6
Hardware Overview:
Processor Name: Quad-Core Intel Xeon
Processor Speed: 3 GHz
Number Of Processors: 2
Total Number Of Cores: 8
Interesting. Do you know if it's possible to get the HTTP headers with
Solrj?
Yonik Seeley wrote:
On Fri, Apr 10, 2009 at 11:58 AM, Richard Wiseman
rwise...@infosciences.com wrote:
Is it possible for a Solr client to determine if the index has changed since
the last time it performed a
Hi All,
I am trying to set up a Solr instance on my MacBook.
I get the following errors when I'm trying to do a full DB import ... please help
me with this:
Apr 13, 2009 11:53:28 PM org.apache.solr.handler.dataimport.JdbcDataSource$1
call
INFO: Creating a connection for entity slideshow with URL:
I am using Tomcat ...
On Mon, Apr 13, 2009 at 11:57 PM, Mani Kumar manikumarchau...@gmail.comwrote:
Hi All,
I am trying to set up a Solr instance on my MacBook.
I get the following errors when I'm trying to do a full DB import ... please help
me with this:
Apr 13, 2009 11:53:28 PM
On Mon, Apr 13, 2009 at 11:57 PM, Mani Kumar manikumarchau...@gmail.comwrote:
Hi All,
I am trying to set up a Solr instance on my MacBook.
I get the following errors when I'm trying to do a full DB import ... please help
me with this:
java.lang.OutOfMemoryError: Java heap space
at
Hi Shalin:
Thanks for quick response!
By default it was set to 1.93 MB.
But i also tried it with following command:
$ ./apache-tomcat-6.0.18/bin/startup.sh -Xmn50M -Xms300M -Xmx400M
I also tried tricks given on
http://wiki.apache.org/solr/DataImportHandlerFaq page.
What should I try next?
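One possibility worth checking (an assumption about the setup, not something confirmed in the thread): Tomcat's startup.sh does not forward arbitrary command-line arguments to the JVM, so heap flags passed that way are silently ignored. They are normally supplied through JAVA_OPTS, e.g. in bin/setenv.sh, which catalina.sh sources on startup:

```shell
# bin/setenv.sh -- sourced by catalina.sh on startup (create the file if absent).
# Heap sizes below are the values tried in this thread, not recommendations.
export JAVA_OPTS="$JAVA_OPTS -Xms300m -Xmx400m"
```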
Sorry, I should have added that you should set the qt param:
http://wiki.apache.org/solr/CoreQueryParameters#head-2c940d42ec4f2a74c5d251f12f4077e53f2f00f4
-Grant
On Apr 13, 2009, at 1:35 PM, Fink, Clayton R. wrote:
The query method seems to only support solr/select requests. I
subclassed
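Selecting a custom handler via the qt param can be sketched as follows (handler name taken from this thread; host, port, and the exact registered name in solrconfig.xml are assumptions):

```python
from urllib.parse import urlencode

# qt routes the request to a named requestHandler from solrconfig.xml
# through the standard select endpoint, instead of a custom URL path.
params = {"q": "foo", "qt": "autoSuggest"}
url = "http://localhost:8080/solr/select?" + urlencode(params)
print(url)
```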
Depending on your dataset and how your queries look, you may very likely
need a larger heap size. How many queries and rows are
required for each of your documents to be generated?
Ilan
On 4/13/09 12:21 PM, Mani Kumar wrote:
Hi Shalin:
Thanks for quick response!
By defaults
Hi all,
Currently I have written an XML file and a schema.xml file. What is the next
step to index a txt file? Where should I put the txt file I want to index?
Thank you,
Alex V.
Some more updates. As I mentioned earlier, we are using multi-core Solr
(up to 65 cores in one Solr instance, with each core 10G). This was
opening around 3000 file descriptors (lsof). I removed some cores and,
after some trial and error, I found that at 25 cores the system seems to work
fine (around 1400 file
I'll start a new thread to make things easier, because I've only really got
one problem now.
I've configured my Solr to search on all fields, so a query on a specific
field (e.g. q=Date:October) will only search the 'Date' field, rather than
all the others.
The
what about:
fieldA:value1 AND fieldB:value2
this can also be written as:
+fieldA:value1 +fieldB:value2
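Both forms above should behave the same once they reach Solr; a sketch of URL-encoding the '+' form correctly (a raw '+' in a URL decodes to a space, so it must be percent-encoded):

```python
from urllib.parse import quote_plus

q_boolean = "fieldA:value1 AND fieldB:value2"   # boolean-operator form
q_required = "+fieldA:value1 +fieldB:value2"    # required-clause form
# quote_plus turns the literal '+' into %2B and the spaces into '+':
url = "http://localhost:8080/solr/select/?q=" + quote_plus(q_required)
print(url)
```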
On Apr 13, 2009, at 9:53 PM, Johnny X wrote:
I'll start a new thread to make things easier, because I've only
really got
one problem now.
I've configured my Solr to search on all
On Tue, Apr 14, 2009 at 7:14 AM, vivek sar vivex...@gmail.com wrote:
Some more updates. As I mentioned earlier, we are using multi-core Solr
(up to 65 cores in one Solr instance, with each core 10G). This was
opening around 3000 file descriptors (lsof). I removed some cores and,
after some trial
Hi ILAN:
Only one query is required to generate a document ...
Here is my data-config.xml
<dataConfig>
  <dataSource type="JdbcDataSource" name="sp"
    driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost/mydb_development"
    user="root" password="**" />
  <document name="items">
    <entity name="item"
DIH itself may not be consuming that much memory; the figure also includes
the memory used by Solr itself.
Do you have a hard limit of 400MB? Is it not possible to increase it?
On Tue, Apr 14, 2009 at 11:09 AM, Mani Kumar manikumarchau...@gmail.com wrote:
Hi ILAN:
Only one query is required to generate a
Here is the stack trace:
Notice this frame in the stack trace:
at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1749)
It looks like it's trying to read the whole table into memory at once, and
that's why it's getting the OOM.
Apr 14, 2009 11:15:01 AM org.apache.solr.handler.dataimport.DataImporter
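A common fix for this particular OOM (per the DataImportHandlerFaq page referenced earlier in the thread): with the MySQL driver, setting batchSize="-1" on the dataSource makes DIH call setFetchSize(Integer.MIN_VALUE), so rows are streamed instead of the whole result set being buffered in memory. A sketch using the dataSource from this thread:

```xml
<dataSource type="JdbcDataSource" name="sp"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb_development"
            user="root" password="**"
            batchSize="-1" />
```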
Hi Noble:
But the question is: how much memory? Are there any rules of thumb so that
I can estimate how much memory it requires?
Yeah, I can increase it up to 800MB max; I will try it and let you know.
Thanks!
Mani
2009/4/14 Noble Paul നോബിള് नोब्ळ् noble.p...@gmail.com
DIH
On Tue, Apr 14, 2009 at 11:18 AM, Mani Kumar manikumarchau...@gmail.comwrote:
Here is the stack trace:
Notice this frame in the stack trace:
at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1749)
It looks like it's trying to read the whole table into memory at once, and
that's why it's getting the OOM.