: If I can find the bandwidth, I'd like to make something which allows
: file uploads via the XMLUpdateHandler as well... Do you have any ideas
the XmlUpdateRequestHandler already supports file uploads ... all request
handlers do using the ContentStream abstraction...
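For reference, a hedged sketch of what the ContentStream abstraction allows (host, core path, and file path here are hypothetical, and remote streaming must be enabled via enableRemoteStreaming="true" in solrconfig.xml):

```
http://localhost:8983/solr/update?stream.file=/path/to/docs.xml&commit=true
```

With stream.file, Solr reads the file from its own filesystem instead of receiving the bits over HTTP.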
: ok.. that means I can't ever use a colon in the field name in such a scenario
: ?
In most internals, the Lucene/Solr code base allows *any* character in the
field name, so you *can* use colons in field names, but many of the
surface features (like the query parser) treat colons as special
Just my thoughts on the matter:
the designer of the runner-up logo and the 3rd place logo is also
responsible for 5 other logos that made it into the list. They are
basically different versions of the same concept. If you add up the
scores for logos 2, 3, 6, 8, 11, 20 and 23 you will see a
On Dec 15, 2008, at 3:13 AM, Chris Hostetter wrote:
: If I can find the bandwidth, I'd like to make something which allows
: file uploads via the XMLUpdateHandler as well... Do you have any
ideas
the XmlUpdateRequestHandler already supports file uploads ... all
request
handlers do
Hi Erik,
This is indeed what I was talking about... It could even be handled
via some type of transient file storage system. This might even be
better, to avoid the risks associated with uploading a huge file across
a network, and might (I have no idea) be easier to implement.
So I could send the
Have you tried using the
<dynamicField name="*" type="string" indexed="true"/>
option in schema.xml? After the indexing, take a look at the
fields DIH has generated.
Bye,
L.M.
2008/12/15 jokkmokk jokkm...@gmx.at:
Hi,
I'm desperately trying to get the dataimport handler to work, however it
sorry, I'm using the 1.3.0 release. I've now worked around that issue by
using aliases in the sql statement so that no mapping is needed. This way it
works perfectly.
best regards
Stefan
Shalin Shekhar Mangar wrote:
Which solr version are you using?
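A minimal sketch of the SQL-alias workaround Stefan describes above (entity, table, and column names are made up); with the aliases matching the schema field names, no explicit field mapping is needed:

```xml
<entity name="doc"
        query="SELECT id, body AS content, subject AS title FROM messages"/>
```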
I'm no QueryParser expert, but I would probably start w/ the default
query parser in Solr (LuceneQParser), and then progress a bit to the
DisMax one. I'd ask specific questions based on what you see there.
If you get far enough along, you may consider asking for help on the
java-user
Hi all,
Whilst Solr is a great resource (a big thank you to the developers) it
presents me with a couple of issues.
The need for hierarchical facets is, I would say, a fairly crucial missing
piece, but it has already been pointed out
(http://issues.apache.org/jira/browse/SOLR-64).
The other
Jacob,
Hmmm... seems the wires are still crossed and confusing.
On Dec 15, 2008, at 6:34 AM, Jacob Singh wrote:
This is indeed what I was talking about... It could even be handled
via some type of transient file storage system. this might even be
better to avoid the risks associated with
Hi Erik,
Sorry I wasn't totally clear. Some responses inline:
If the file is visible from the Solr server, there is no need to actually
send the bits through HTTP. Solr's content stream capabilities allow a file
to be retrieved by Solr itself.
Yeah, I know. But in my case not possible.
See also http://wiki.apache.org/solr/SolrResources
On Dec 15, 2008, at 2:57 AM, Andre Hagenbruch wrote:
Sajith Vimukthi wrote:
Hi Sajith,
I need some sample code for some examples done using Solr. I need to
get an idea of how I can use Solr
Sorry,
Forgot the most important detail.
The document I am adding contains multiple names fields:
sInputDocument.addField("names", value);
sInputDocument.addField("names", value);
sInputDocument.addField("names", value);
There is no problem when a document only contains one value in the names field.
Which solr version are you using?
On Mon, Dec 15, 2008 at 6:04 PM, jokkmokk jokkm...@gmx.at wrote:
Hi,
I'm desperately trying to get the dataimport handler to work, however it
seems that it just ignores the field name mapping.
I have the fields body and subject in the database and those
In the solrconfig.xml (scroll all the way to the bottom, and I believe
the example has some commented out)
On Dec 15, 2008, at 5:45 AM, ayyanar wrote:
I'm no QueryParser expert, but I would probably start w/ the default
query parser in Solr (LuceneQParser), and then progress a bit to the
Hi -
I am looking at the article here with a brief introduction to SolrJ .
http://www.ibm.com/developerworks/library/j-solr-update/index.html?ca=dgr-jw17SolrS_Tact=105AGX59S_CMP=GRsitejw17#solrj
In case we have multiple SolrCores in the server application - (since
1.3) - how do I specify
A solr core is like a separate solr server... so create a new
CommonsHttpSolrServer that points at the core.
You probably want to create and reuse a single HttpClient instance for
the best efficiency.
-Yonik
On Mon, Dec 15, 2008 at 11:06 AM, Kay Kay kaykay.uni...@gmail.com wrote:
Hi -
I am
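A minimal SolrJ 1.3 sketch of the per-core client Yonik describes above (host and core names are hypothetical):

```java
import org.apache.commons.httpclient.HttpClient;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;

public class MultiCoreClients {
    public static void main(String[] args) throws Exception {
        // One HttpClient shared by all core clients, for connection reuse.
        HttpClient httpClient = new HttpClient();
        // Each core looks like a separate Solr server with its own URL,
        // so each gets its own CommonsHttpSolrServer.
        CommonsHttpSolrServer core0 =
            new CommonsHttpSolrServer("http://localhost:8983/solr/core0", httpClient);
        CommonsHttpSolrServer core1 =
            new CommonsHttpSolrServer("http://localhost:8983/solr/core1", httpClient);
        // Queries and updates then go through whichever core instance you need.
    }
}
```

This requires the SolrJ and commons-httpclient jars on the classpath.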
What do you see in the admin schema browser?
/admin/schema.jsp
When you select the field names, do you see the property
Multivalued?
ryan
On Dec 15, 2008, at 10:55 AM, Schilperoort, René wrote:
Sorry,
Forgot the most important detail.
The document I am adding contains multiple names
I found the following solution in the forum to use BoostingTermQuery in solr:
I ended up subclassing QueryParser and overriding newTermQuery() to create
a BoostingTermQuery instead of a plain ol' TermQuery. Seems to work.
http://www.nabble.com/RE:-using-BoostingTermQuery-p19651792.html
I
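A sketch of the subclassing approach described in that post (class name is made up; BoostingTermQuery lives in Lucene's payloads package in the 2.4-era API):

```java
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.index.Term;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.payloads.BoostingTermQuery;

// Every term query produced by the parser becomes a BoostingTermQuery,
// so payloads stored at index time can influence scoring.
public class BoostingQueryParser extends QueryParser {
    public BoostingQueryParser(String field, Analyzer analyzer) {
        super(field, analyzer);
    }

    @Override
    protected Query newTermQuery(Term term) {
        return new BoostingTermQuery(term);
    }
}
```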
http://lucene.apache.org/solr/tutorial.html
On Dec 15, 2008, at 12:56 AM, Sajith Vimukthi wrote:
Hi all,
Can someone of you give me a sample code on a search function done
with solr
so that I can get an idea on how I can use it.
Regards,
Sajith Vimukthi Weerakoon
Associate Software
Hi,
I'm desperately trying to get the dataimport handler to work, however it
seems that it just ignores the field name mapping.
I have the fields body and subject in the database and those are called
title and content in the solr schema, so I use the following import
config:
dataConfig
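The config is cut off above, but a data-config sketch along those lines (entity, table, and dataSource details are guesses) would express that mapping explicitly:

```xml
<dataConfig>
  <dataSource driver="..." url="..."/>
  <document>
    <entity name="doc" query="SELECT body, subject FROM messages">
      <field column="body" name="content"/>
      <field column="subject" name="title"/>
    </entity>
  </document>
</dataConfig>
```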
On Dec 15, 2008, at 8:20 AM, Jacob Singh wrote:
Hi Erik,
Sorry I wasn't totally clear. Some responses inline:
If the file is visible from the Solr server, there is no need to
actually
send the bits through HTTP. Solr's content stream capabilities
allow a file
to be retrieved from Solr
Thanks Yonik for the clarification.
Yonik Seeley wrote:
A solr core is like a separate solr server... so create a new
CommonsHttpSolrServer that points at the core.
You probably want to create and reuse a single HttpClient instance for
the best efficiency.
-Yonik
On Mon, Dec 15, 2008 at 11:06
Hey all,
I'm having trouble articulating a query and I'm hopeful someone out there
can help me out :)
My situation is this: I am indexing a series of questions that can either be
asked from a main question entry page, or a specific subject page. I have a
field called referring which indexes the
I think in this case you would want to index each question with the
possible referrers ( by title might be too imprecise, I'd go with
filename or ID) and then do a search like this (assuming in this case
it's by filename)
q=(referring:TomCruise.html) OR (question:(Tom AND Cruise))
Which
Thanks for the tip, I appreciate it!
However, does anyone know how to articulate the syntax of (This AND That)
OR (Something AND Else) into a query string?
i.e. q=referring:### AND question:###
On Mon, Dec 15, 2008 at 12:32 PM, Stephen Weiss swe...@stylesight.comwrote:
I think in this case
: I need to tokenize my field on whitespaces, html, punctuation, apostrophe
: but if I use HTMLStripStandardTokenizerFactory it strips only html
: but no apostrophes
you might consider using one of the HTML Tokenizers, and then use a
PatternReplaceFilterFactory ... or if you know java
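A hedged schema.xml sketch of that combination (the field type name is made up); the HTML-stripping tokenizer runs first, then the filter removes apostrophes from each token:

```xml
<fieldType name="text_html" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.HTMLStripStandardTokenizerFactory"/>
    <!-- strip apostrophes after tokenizing -->
    <filter class="solr.PatternReplaceFilterFactory"
            pattern="'" replacement="" replace="all"/>
  </analyzer>
</fieldType>
```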
Hi all,
I have a TextField containing over 400k of text.
When I try to search for a word, Solr doesn't return any result, but if I
search for that single document, I can see that the word exists there.
So I suppose that Solr has a TextField size limit (the field is indexed
using a
Check your solrconfig.xml:
<maxFieldLength>10000</maxFieldLength>
That's probably the truncating factor. That's the maximum number of
terms, not bytes or characters.
Erik
On Dec 15, 2008, at 5:00 PM, Antonio Zippo wrote:
Hi all,
i have a TextField containing over 400k of
Hi guys,
I have a typical master/slave setup running with Solr 1.3.0. I did
some basic scalability test with JMeter and tweaked our environment
and determined that we can handle approximately 26 simultaneous
threads and get end-to-end response times of under 200ms even with
typically every 5
Hey guys,
Thanks for the response, but how would make recency a factor on
scoring documents with the standard request handler.
The query (title:iphone OR bodytext:iphone OR title:firmware OR
bodytext:firmware) AND _val_:ord(dateCreated)^0.1
seems to do something very similar to just sorting by
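One commonly cited alternative from the Solr wiki's FunctionQuery examples (the constants are tunable; the field names are taken from the message above) decays smoothly with document age instead of effectively sorting by it, so keyword relevancy still matters:

```
q=(title:iphone OR bodytext:iphone OR title:firmware OR bodytext:firmware)
  AND _val_:"recip(rord(dateCreated),1,1000,1000)"^0.5
```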
Check your solrconfig.xml:
<maxFieldLength>10000</maxFieldLength>
That's probably the truncating factor. That's the maximum number of terms,
not bytes or characters.
Erik
Thanks... I think it could be the problem.
I tried to count whitespace in a single text and it's
On Mon, Dec 15, 2008 at 5:28 PM, Antonio Zippo reven...@yahoo.it wrote:
Check your solrconfig.xml:
<maxFieldLength>10000</maxFieldLength>
That's probably the truncating factor. That's the maximum number of terms,
not bytes or characters.
Thanks... I think it could be the problem.
No need to re-index with this change.
But you will have to re-index any documents that got cut off of course.
-Yonik
Ok, thanks...
I hoped to re-index the documents over the existing index (with incremental
updates... while Solr is running) ... and without deleting the index folder.
But
Hello,
In my solrconfig.xml file I am setting the attribute hl.snippets to 3. When
I perform a search, it returns only a single snippet for each highlighted
field. However, when I set the hl.snippets field manually as a search
parameter, I get up to 3 highlighted snippets. This is the
You actually don't need to escape most characters inside a character class,
the escaping of the period was unnecessary.
I've tried using the example regex ([-\w ,/\n\']{20,200}), and I'm _still_
getting lots of highlighted snippets that don't match the regex (starting
with a period, etc.) Has
Try adding echoParams=all to your query to verify the params that the
solr request handler is getting.
-Yonik
On Mon, Dec 15, 2008 at 6:10 PM, Mark Ferguson
mark.a.fergu...@gmail.com wrote:
Hello,
In my solrconfig.xml file I am setting the attribute hl.snippets to 3. When
I perform a search,
Thanks for this tip, it's very helpful. Indeed, it looks like none of the
highlighting parameters are being included. It's using the correct request
handler and hl is set to true, but none of the highlighting parameters from
solrconfig.xml are in the parameter list.
Here is my query:
It does appear to be working for us now. The files replicated out
appropriately which is a huge help. Thanks to all!
-Jeff
On 12/13/08 9:42 AM, Shalin Shekhar Mangar shalinman...@gmail.com wrote:
Jeff, SOLR-821 has a patch now. It'd be nice to get some feedback if
you
manage to try it
Would this mean that, for example, if we wanted to search productId
(long) we'd need to make a field type that had stopwords in it rather
than simply using (long)?
Thanks for your time!
Matthew Runo
Software Engineer, Zappos.com
mr...@zappos.com - 702-943-7833
On Dec 12, 2008, at 11:56 PM,
I have a parent entity that grabs a list of records of a certain type from 1
table... and a sub-entity that queries another table to retrieve the actual
data... for various reasons I cannot join the tables... the 2nd sql query
converts the rows into an xml to be processed by a custom transformer
It seems like maybe the fragmenter parameters just don't get displayed with
echoParams=all set. It may only display as far as the request handler's
parameters. The reason I think this is because I tried increasing
hl.fragsize to 1000 and the results were returned correctly (much larger
snippets),
Hi everybody,
So I have applied Ivan's latest patch to a clean 1.3.
I built it using 'ant compile' and 'ant dist', got the solr build.war
file.
Moved that into the Tomcat directory.
Modified my solrconfig.xml to include the following:
searchComponent name=collapse
I do not observe anything wrong.
you can also mention the 'deltaImportQuery' and try it
something like
<entity name="table1" pk="id"
        query="SELECT ID, MY_GUID
               FROM activityLog
               WHERE type in (11, 15)"
Jeff,
Thanks.
It would be nice if you could review the config syntax and see if all
possible use cases are covered. Is there any scope for improvement?
On Tue, Dec 16, 2008 at 5:45 AM, Jeff Newburn jnewb...@zappos.com wrote:
It does appear to be working for us now. The files replicated out
I've had a chance to play with this more and noticed the query does run fine,
but it only updates the records that are already indexed; it doesn't add new
ones.
The only option that I've found so far is to do a full-import with the
clean=false attribute and created_date > last_indexed_date...
Is
Are the queries being fired wrong/different when you tried full-import?
On Tue, Dec 16, 2008 at 9:57 AM, sbutalia sbuta...@gmail.com wrote:
I've had a chance to play with this more and noticed the query does run fine
but it only updates the records that are already indexed it doesn't add new
Derek,
q=+referring:XXX +question:YYY
(of course, you'll have to URL-encode that query string)
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Derek Springer de...@mahalo.com
To: solr-user@lucene.apache.org
Sent: Monday, December 15,
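A minimal Java sketch of that URL-encoding step, using only the standard library:

```java
import java.net.URLEncoder;

public class EncodeSolrQuery {
    public static void main(String[] args) throws Exception {
        String q = "+referring:XXX +question:YYY";
        // URLEncoder turns '+' into %2B, ':' into %3A, and spaces into '+'.
        String encoded = URLEncoder.encode(q, "UTF-8");
        System.out.println("q=" + encoded); // q=%2Breferring%3AXXX+%2Bquestion%3AYYY
    }
}
```

The encoded string can then be appended to the select URL as the q parameter.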
Hi,
I was trying to do a performance test on the Solr web application.
If I run the performance tests, a lot of logging happens, due to which
I am getting log files in GBs.
Is there any clean way of deactivating logging, or changing the log level
to, say, error?
Is there any property
Solr 1.3 uses Java logging (java.util.logging). Most app containers (Tomcat,
Resin, etc.) give you a way to configure that. Also check:
http://java.sun.com/j2se/1.4.2/docs/guide/util/logging/overview.html#1.8
You can make runtime changes from the /admin/ logging tab. However,
these changes are not persisted
Hi Ryan,
Thanks for the inputs. These are the set of steps followed to solve this
issue.
1. Make a logging property file, say solrLogging.properties. We can copy the
default logging property file available in the JAVA_HOME/jre/lib folder.
The default Java logging file will look like the following.
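The default file itself is cut off above; instead, here is a hedged sketch of a minimal solrLogging.properties that turns logging down to SEVERE (java.util.logging's closest equivalent of ERROR):

```
.level = SEVERE
handlers = java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level = SEVERE
```

It can be passed to the container's JVM with -Djava.util.logging.config.file=/path/to/solrLogging.properties.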
Ryan,
It turned out that another multiValued field was causing my problem. This field
was no longer configured in my schema.
My dynamic catch-all field of type ignored was not multiValued; adding
multiValued to this field solved my problem.
Regards, Rene
-Original Message-
From: Ryan