Hi,
You could (theoretically) reduce the downtime to zero using a 'swap'
command:
http://wiki.apache.org/solr/CoreAdmin?highlight=%28swap%29#SWAP
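As a minimal sketch of what that admin call looks like: the helper below just builds the CoreAdmin SWAP URL (parameter names as on the wiki page above; the host, port, and core names "live"/"rebuild" are illustrative assumptions, not from this thread).

```java
// Hypothetical helper: build the CoreAdmin SWAP URL for two cores.
public class SwapUrlExample {
    static String buildSwapUrl(String baseUrl, String core, String other) {
        // action=SWAP exchanges the two cores' names atomically, so a
        // freshly rebuilt core can replace the live one with no downtime.
        return baseUrl + "/admin/cores?action=SWAP&core=" + core + "&other=" + other;
    }

    public static void main(String[] args) {
        System.out.println(buildSwapUrl("http://localhost:8983/solr", "live", "rebuild"));
    }
}
```

Issuing an HTTP GET on that URL (with a browser, wget, or similar) performs the swap.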
Cheers
Henrib
muneeb wrote:
Hi,
I have indexed almost 7 million articles on two separate cores, each with
its own conf/ and data/ folders
ryantxu wrote:
...
Yes, I would like to see a way to specify all the fieldtypes /
handlers in one location and then only specify what fields are
available for each core.
So yes -- I agree. In 2.0, I hope to flesh out the configs so they are
not monstrous.
...
What about
ryantxu wrote:
Yes, include would get us some of the way there, but not far enough
(IMHO). The problem is that (as written) you still need to have all
the configs spattered about various directories.
It does not allow us to go *all* the way, but it does allow us to put
And *not* the default:
<dataDir>${solr.data.dir:./solr/data}</dataDir>
Which will make both cores use the same index.
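For what it's worth, the ${name:default} substitution above can be sketched in a few lines of plain Java (a hypothetical stand-in for illustration, not Solr's actual implementation):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch of ${name:default}-style property expansion.
public class PropExpander {
    // Group 1: property name; optional group 2: default after ':'.
    private static final Pattern VAR =
            Pattern.compile("\\$\\{([^}:]+)(?::([^}]*))?\\}");

    static String expand(String text, Map<String, String> props) {
        Matcher m = VAR.matcher(text);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String val = props.get(m.group(1));
            if (val == null) {
                // Fall back to the inline default (or empty) when unset.
                val = m.group(2) == null ? "" : m.group(2);
            }
            m.appendReplacement(sb, Matcher.quoteReplacement(val));
        }
        m.appendTail(sb);
        return sb.toString();
    }
}
```

So with no property set, ${solr.data.dir:./solr/data} expands to ./solr/data; setting solr.data.dir overrides it per core.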
Hope this helps,
Henrib
rogerio.araujo wrote:
Hi!
I have a multicore installation with the following configuration:
<solr persistent="false">
  <cores adminPath="/admin/cores">
    <core
Hi,
It is likely related to how you initialize Solr and create your
SolrCore; there have been a few changes to ensure there is always a
CoreContainer created which (as its name indicates) holds a reference to all
created SolrCores.
There is a CoreContainer.Initializer class that allows one to easily
Seems you want something like:
public SolrCore nikhilInit(final IndexSchema indexSchema) {
    final String solrConfigFilename = "solrconfig.xml"; // or another config file
    CoreContainer.Initializer init = new CoreContainer.Initializer() {
        @Override
        public CoreContainer initialize() {
Nikhil Chhaochharia wrote:
I am assuming that these are part of some patch which will get applied
before 1.3 releases, is that correct ?
Nikhil
Yes, this is part of a patch and no, it most likely will not make it into
1.3.
However, I guess the following will bring you even closer:
Since I authored the patch, I'm guilty on all counts. :-)
Amit Nithian wrote:
I am not sure why they chose that direction over a built-in entity include.
Entities are not the most used or best-known feature, and I just did not think
of this as a way to do it.
I also wanted variable expansion in
This should be one use-case for
SOLR-646 (https://issues.apache.org/jira/browse/SOLR-646).
If you can try it, don't hesitate to report/comment on the issue.
Henri
zayhen wrote:
Hello guys,
I have to load solr/home from a .properties file, because of some
environment standards I have to
ballistic when they see logs in their console...)
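Reading solr/home from a .properties file is itself straightforward with the stdlib; a minimal sketch, assuming a file and a key named solr.home (both names are my assumption, substitute whatever your environment standard dictates):

```java
import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

// Sketch: resolve solr/home from a .properties file before Solr starts.
public class SolrHomeLoader {
    static String loadSolrHome(String propsFile) throws IOException {
        Properties p = new Properties();
        try (FileReader r = new FileReader(propsFile)) {
            p.load(r);
        }
        // Fall back to the conventional ./solr default when the key is absent.
        return p.getProperty("solr.home", "./solr");
    }
}
```

The resolved value would then have to be handed to Solr (e.g. via the solr.solr.home system property) before the webapp initializes.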
Cheers
Henrib
zayhen wrote:
Hello Henrib,
I have read the issue and it seems an interesting feature for Solr, but I
don't see how it addresses my needs, as I need to point Solr to the
multicore.properties file.
Actually, I have already
I'm re-adapting some pretty old/hacked (1.2dev) code that performs a query
filtered by a list of document unique keys and returns results based on
the list order. Anyone having the same requirement/feature/code?
I've been looking in QueryComponent where there is code to handle shards
that
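The "results in list order" half of the requirement can at least be sketched independently of Solr: rank each unique key by its position in the input list and sort the fetched documents by that rank (class and method names below are hypothetical; a String stands in for whatever result object the handler returns).

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: re-order fetched documents to match the order of the input key list.
public class KeyOrderSort {
    static List<String> sortByKeyOrder(List<String> docs, List<String> keyOrder) {
        // Map each key to its position in the requested order.
        Map<String, Integer> rank = new HashMap<>();
        for (int i = 0; i < keyOrder.size(); i++) {
            rank.put(keyOrder.get(i), i);
        }
        List<String> sorted = new ArrayList<>(docs);
        // Keys absent from the list sort after every known key.
        sorted.sort(Comparator.comparingInt(
                (String d) -> rank.getOrDefault(d, Integer.MAX_VALUE)));
        return sorted;
    }
}
```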
We could harness SOLR-646, reusing the <property name='...'>...</property>
syntax by creating a scope for field elements when reading the schema. Since
properties (the PropertyMap) are stored in the ResourceLoader, it seems we
should be able to access them in the useful places through the usual
Ryan McKinley wrote:
[ ] Keep solr logging as it is. (JDK Logging)
[X] Use SLF4J.
Can't keep it as is, since JDK logging strictly precludes configuring logging
in a container-agnostic way.
Will,
I'd be definitely interested in your code but mostly in the config
deployment options if you can share.
You did not happen to deploy on WebSphere 6, by any chance? I can't find a
way to configure jul to only log into our application logs (even less so into
our log4j logs); I'm not even sure
Hi,
I'm (still) seeking more advice on this deployment issue, which is to use
org.apache.log4j instead of java.util.logging. I'm not seeking to restart
any discussion on slf4j/commons/log4j/jul respective benefits; I'm seeking
a way to bridge jul to log4j with the minimum of per-container-specific
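The core of any such bridge is a java.util.logging Handler that forwards every record somewhere else. A minimal, container-agnostic sketch (the BiConsumer sink stands in for the call to the matching org.apache.log4j.Logger, which I've left out to keep this stdlib-only):

```java
import java.util.function.BiConsumer;
import java.util.logging.Handler;
import java.util.logging.LogRecord;

// Sketch: forward every jul record to a pluggable sink.
// A real bridge would look up org.apache.log4j.Logger by record.getLoggerName()
// and map jul Levels onto log4j Levels inside the sink.
public class JulForwardingHandler extends Handler {
    private final BiConsumer<String, String> sink; // (logger name, message)

    public JulForwardingHandler(BiConsumer<String, String> sink) {
        this.sink = sink;
    }

    @Override
    public void publish(LogRecord record) {
        if (isLoggable(record)) {
            sink.accept(record.getLoggerName(), record.getMessage());
        }
    }

    @Override public void flush() { /* nothing buffered */ }
    @Override public void close() { /* nothing to release */ }
}
```

Installing it on the root jul logger (and removing the default ConsoleHandler) is what keeps the container's console clean.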
I'm trying to filter my document collection by an external means that
produces a set of document unique keys.
Assuming this goes into a custom request handler (solr-281 making that
easy), any pitfall using a ConstantScoreQuery (or an equivalent filtering
functionality) as a Solr filter query ?
I believe that keeping your code as is, but initializing the query parameters,
should do the trick:
Map<String, String> params = new HashMap<String, String>();
params.put("fl", "id score"); // field list is "id score"
...
Regards
John Reuning-2 wrote:
My first pass was to implement the embedded solr example:
We have an application where we index documents that can exist in many (at
least 2) languages.
We have 1 SolrCore per language using the same field names in their schemas
(different stopwords, synonyms, stemmers), the benefits for content
maintenance outweighing (at least) the complexity.
Using EN
Another possible (and convoluted) way is to use the SOLR-215 patch, which
allows multiple indexes within one Solr instance (also, at this stage, you'd
lose replication and would probably have to adapt the servlet filter).
Regards
Henri
Yu-Hui Jin wrote:
Hi, there,
I have a few basic questions
    = s.getDocList(query, rdocs,
                   SolrPluginUtils.getSort(req),
                   params.getInt(START, 0),
                   params.getInt(ROWS, 10),
                   flags);
}
Yonik Seeley wrote:
On 6/18/07, Henrib [EMAIL PROTECTED] wrote
while (termDocs.next()) {
    bits.fastSet(termDocs.doc());
}
termDocs.close();
}
return new org.apache.solr.search.BitDocSet(bits);
}
Thanks again
Yonik Seeley wrote:
On 6/17/07, Henrib [EMAIL PROTECTED] wrote:
Merely an efficiency-related question: is there any other way to filter on a
uniqueKey set than using the 'fq' parameter, building a list of the
uniqueKeys?
In 'raw' Lucene, you could use filters directly in a search; is this (close
to) equivalent, efficiency-wise?
Thanks
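For reference, the 'fq' variant of the question amounts to building a boolean clause over the uniqueKey field; a minimal sketch (the field name "id" is an assumption, substitute your schema's uniqueKey, and keys containing special characters would need escaping):

```java
import java.util.List;

// Sketch: build an 'fq' value from a list of unique keys, e.g. id:(k1 OR k2).
public class UniqueKeyFilter {
    static String buildFq(String field, List<String> keys) {
        return field + ":(" + String.join(" OR ", keys) + ")";
    }
}
```

The resulting string is passed as the fq request parameter, so Solr can cache the filter independently of the main query.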
Daniel
On 8/6/07 15:15, Henrib [EMAIL PROTECTED] wrote:
Hi Daniel,
If it is functionally 'ok' to search in only one lang at a time, you could
try having one index per lang. Each per-lang index would have one schema
where you would describe field types (the lang part coming through
stemming/snowball analyzers, per-lang stopwords et al.) and the same field
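A minimal sketch of what one such per-language field type could look like in that schema (factory class names as shipped with Solr; the type name and stopwords file name are illustrative):

```xml
<fieldType name="text_en" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.StopFilterFactory" words="stopwords_en.txt"
            ignoreCase="true"/>
    <filter class="solr.SnowballPorterFilterFactory" language="English"/>
  </analyzer>
</fieldType>
```

Each per-lang schema would declare the same field names against its own such type (text_fr with French stopwords/stemmer, and so on).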
Updated (forgot the patch for Servlet).
http://www.nabble.com/file/7996/solr-trunk-src.patch solr-trunk-src.patch
The change should still be compatible with the trunk it is based upon.
Henrib wrote:
Following up on a previous thread in the Solr-User list, here is a patch
that allows
You cannot have more than one Solr core per application (to be precise, per
class-loader, since there are a few statics).
One way is thus to have two webapps, e.g. when the indexes do not have the
same lifetime, have radically different schemas, etc.
However, the common wisdom is that you usually don't really
probably also need to have a 'core name' passed down...
I'm still building my knowledge on the subject so my simplistic view might
not be accurate.
Let me know if this helps.
Cheers
Henrib
mpelzsherman wrote:
This sounds like a great idea, and potentially very useful for my company.
Can you
I suppose I'm not the only one having to cope with the kind of policies I
was describing (and their idiosyncrasies); in some organizations, trying to
get IT to modify anything related to 'deployment policy' is just (very close
to) impossible... Within those, having a dedicated Tomcat to run the