Sent: Friday, April 29, 2016 4:58 AM
To: solr-user@lucene.apache.org
Subject: dataimport db-data-config.xml
I want to import data from a MySQL table and a CSV file at the same time, because some data are in MySQL tables and some are in the CSV file. I want to match a specific id from the MySQL table in the CSV file, then add the data to Solr.
What I think or want to do
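One possible shape for this (a sketch, not tested; the connection details, table names, and CSV path are all hypothetical) is two data sources in one config: a JdbcDataSource for MySQL and a FileDataSource whose CSV is read by a nested LineEntityProcessor entity. Matching a specific id inside the CSV from DIH is awkward, so loading the CSV into a staging table and joining in SQL is often the easier route:

```xml
<dataConfig>
  <!-- hypothetical connection details -->
  <dataSource name="db" type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb" user="user" password="pass"/>
  <dataSource name="csv" type="FileDataSource"/>
  <document>
    <entity name="product" dataSource="db" query="SELECT id, name FROM products">
      <!-- the CSV is re-read once per DB row; each line arrives in 'rawLine' -->
      <entity name="extra" dataSource="csv" processor="LineEntityProcessor"
              url="c:/data/extra.csv" transformer="RegexTransformer">
        <field column="rawLine" name="rawLine"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```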
> I am trying to run two pgsql queries on the same data source. Is this
> possible in db-data-config.xml?
>
> <dataSource
>   url="jdbc:postgresql://0.0.0.0:5432/iboats"
>   user="iboats"
>   password="root" />
>
I am trying to run two pgsql queries on the same data source. Is this possible in db-data-config.xml? This code is not working; please suggest a fuller example.
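Running two queries against the same data source is normally done with two entity elements (or a nested child entity) under one document, both referencing the same dataSource name. A sketch reusing the connection details from the message above (the table and field names are made up):

```xml
<dataConfig>
  <dataSource name="pg" driver="org.postgresql.Driver"
              url="jdbc:postgresql://0.0.0.0:5432/iboats"
              user="iboats" password="root"/>
  <document>
    <!-- two independent root entities, both using the same data source -->
    <entity name="boats"  dataSource="pg" query="SELECT id, title FROM boats"/>
    <entity name="motors" dataSource="pg" query="SELECT id, title FROM motors"/>
  </document>
</dataConfig>
```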
Hi -
I'm new to Solr and am trying to combine a script transformer and the RegexTransformer in a db-data-config.xml that is used to ingest data into Solr. Can anyone be of any help?
There is definitely a comma between my script:add and RegexTransformer entries.
Any help would be appreciated.
My db-data-config.xml looks like
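The poster's config is cut off, but for reference, here is the usual shape of a combined script + regex transformer declaration (the script function, field names, and query are hypothetical): the script function is defined in a top-level script element, and the transformers are listed comma-separated and run left to right:

```xml
<dataConfig>
  <script><![CDATA[
    function addField(row) {
      row.put('ingested', 'true');  // hypothetical extra field
      return row;
    }
  ]]></script>
  <dataSource driver="org.postgresql.Driver"
              url="jdbc:postgresql://localhost:5432/db"
              user="user" password="pass"/>
  <document>
    <!-- the script runs first, then RegexTransformer -->
    <entity name="doc" transformer="script:addField,RegexTransformer"
            query="SELECT id, body FROM docs">
      <!-- collapse runs of whitespace in 'body' -->
      <field column="body" sourceColName="body" regex="\s+" replaceWith=" "/>
    </entity>
  </document>
</dataConfig>
```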
<cores defaultCoreName="core1" adminPath="/admin/cores"
       zkClientTimeout="${zkClientTimeout:15000}" host="${host:}" hostPort="9985"
       hostContext="${hostContext:}">
  <core loadOnStartup="true" instanceDir="core1" transient="false"
        name="core1">
    <property name="dbconfig" value="shard1/db-data-config.xml"/>
  </core>
</cores>
Hi,
We have a setup with 3 shards in a collection, and each shard in the collection needs to load a different set of data.
That is:
Shard1 - will contain data only for Entity1
Shard2 - will contain data for Entity2
Shard3 - will contain data for Entity3
So in this case, the db-data-config.xml
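One way to point each core at its own config (a sketch, building on the per-core dbconfig property shown earlier; the handler path is the conventional one, and whether this fits a SolrCloud shard layout depends on the setup) is to let solrconfig.xml resolve the property per core:

```xml
<!-- solrconfig.xml, shared by all cores -->
<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <!-- resolves per core: shard1/db-data-config.xml, shard2/..., etc. -->
    <str name="config">${dbconfig}</str>
  </lst>
</requestHandler>
```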
Might not be a solution, but I had asked a similar question before. Check out this thread:
http://lucene.472066.n3.nabble.com/Is-there-a-way-to-load-multiple-schema-when-using-zookeeper-td4058358.html
You can create multiple collections, and each collection can use completely different sets of
Two answers:
1) Do you have maybe user names or timestamps for the comments?
Usually people want those also.
2) You can store the comments as one long string, or as multiple
entries in a field. Your database should have a concatenate function
that will take field X from multiple documents in a
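For instance, MySQL's GROUP_CONCAT does this (the poster later mentions an MSSQL datasource, where STRING_AGG or FOR XML PATH would be the equivalent; the table and column names here are guesses):

```xml
<entity name="blog" dataSource="db" pk="id"
        query="SELECT b.id, b.title,
                      GROUP_CONCAT(CONCAT_WS(' | ', c.author, c.created_at, c.body)
                                   SEPARATOR '\n') AS comments
               FROM blog b LEFT JOIN comment c ON c.blog_id = b.id
               GROUP BY b.id, b.title">
  <!-- all comments for the blog arrive as one concatenated string -->
  <field column="comments" name="comments"/>
</entity>
```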
Thanks for the response!
It's bad news that it isn't as simple as I hoped.
Certainly I need names and a timestamp for the comments. Are there any problems if I want to add a timestamp in the one long string? Apart from this, can I add this one long string to the index?
Example:
table blog: id,
Hi there!
I have 2 tables, 'blog' and 'comment'. A blog can contain n comments (blog --1:n-- comment). Up to date I use the following select to insert the data into the Solr index:
<entity name="blog" dataSource="mssqlDatasource" pk="id"
        transformer="ClobTransformer"
        query="SELECT
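An alternative to packing everything into one query is a nested child entity per comment row, which produces a multivalued field. A sketch (the comment table and column names are guesses; only the dataSource name comes from the message above):

```xml
<entity name="blog" dataSource="mssqlDatasource" pk="id"
        transformer="ClobTransformer"
        query="SELECT id, title, body FROM blog">
  <!-- runs once per blog row; each match adds a value to 'comment_text' -->
  <entity name="comment" dataSource="mssqlDatasource"
          query="SELECT text AS comment_text FROM comment
                 WHERE blog_id = '${blog.id}'">
    <field column="comment_text" name="comment_text"/>
  </entity>
</entity>
```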
Have to use the exact JNDI name in db-data-config.xml, as unmanaged threads in WebSphere do not have access to the java:comp/env namespace.
The resource name cannot be mapped to a WebSphere JDBC datasource name via a resource-reference definition in web.xml.
Now using jndiName=jdbc/testdb instead of
jndiName=java:comp
I am trying to use the jndiName attribute in db-data-config.xml. This works great in Tomcat; however, I am having issues in WebSphere.
The following exception is thrown:
Make sure that a J2EE application does not execute JNDI operations on
java: names within static code blocks or in threads created
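For reference, the jndiName attribute on JdbcDataSource looks like this (the datasource name is the one from the messages above; whether WebSphere resolves it depends on how the resource is bound):

```xml
<!-- Tomcat resolves java:comp/env/jdbc/testdb; on WebSphere the global
     name jdbc/testdb may be needed, as described above -->
<dataSource type="JdbcDataSource" jndiName="jdbc/testdb"/>
```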
Hi
I am using Oracle Exadata as my DB. I want to index nearly 4 crore (40 million) rows. I have tried with specifying a batchSize of 1 and without specifying batchSize, but both tests take nearly the same time.
Could anyone suggest the best way to index huge data quickly?
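One knob worth checking (a sketch, not a guaranteed speed-up; the connection details are hypothetical): JdbcDataSource's batchSize, which is passed to the JDBC driver as the fetch size. For Oracle, values in the hundreds or thousands are more typical than 1; for MySQL, -1 enables row streaming:

```xml
<dataSource type="JdbcDataSource" driver="oracle.jdbc.OracleDriver"
            url="jdbc:oracle:thin:@dbhost:1521/service"
            user="user" password="pass"
            batchSize="5000"/>
```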
On Thu, Jun 11, 2009 at 2:41 AM, jayakeerthi s <mail2keer...@gmail.com> wrote:
As displayed above,
  <str name="Total Requests made to DataSource">3739</str>
  <str name="Total Rows Fetched">4135</str>
  <str name="Total Documents Processed">1402</str>
are differing. The request to the
  <str name="config">C:\apache-solr-nightly\example\example-DIH\solr\db\conf\db-data-config.xml</str>
  </lst>
  </lst>
  <str name="command">abort</str>
  <str name="status">busy</str>
  <str name="importResponse"/>
- http://localhost:8983/solr/dataimport?command=abort# lst name
Hi All,
I am facing an issue while fetching records from the database when providing the value as '${prod.prod_cd}' in the db-data-config.xml.
It works fine if I provide the exact value of the product code, i.e. '302437-413'.
Here is the db-data-config.xml I am using:
dataConfig
dataSource
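For reference, ${prod.prod_cd} only resolves inside a child entity of an entity named prod, and string values usually need to be quoted in the SQL. A sketch (table and column names are guesses):

```xml
<entity name="prod" query="SELECT prod_cd FROM products">
  <!-- '${prod.prod_cd}' is substituted per parent row; note the quotes -->
  <entity name="detail"
          query="SELECT descr FROM product_detail
                 WHERE prod_cd = '${prod.prod_cd}'"/>
</entity>
```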
Hi Noble,
Thanks for the reply.
As advised I have changed the db-data-config.xml as below. But still the
<str name="">Indexing completed. Added/Updated: 0 documents. Deleted 0 documents.</str>
<dataConfig>
  <dataSource type="FileDataSource" name="xmlindex"/>
  <document name="products">
    <entity name
Hi All,
I am trying to index the fields from the XML files; here is the configuration that I am using.
db-data-config.xml:
<dataConfig>
  <dataSource type="FileDataSource" name="xmlindex"/>
  <document name="products">
    <entity name="xmlfile" processor="FileListEntityProcessor"
            fileName="c:\test\ipod_other.xml" recursive="true" rootEntity
to include the add?
-Jay
On Fri, May 15, 2009 at 12:53 PM, jayakeerthi s mail2keer...@gmail.com
wrote:
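A fuller shape of that config (a sketch; the baseDir, file pattern, forEach path, and XPath field mappings are assumptions based on the example ipod_other.xml): FileListEntityProcessor lists the matching files, and a nested XPathEntityProcessor entity parses each one via ${f.fileAbsolutePath}:

```xml
<dataConfig>
  <dataSource type="FileDataSource" name="xmlindex"/>
  <document name="products">
    <!-- rootEntity="false" so documents come from the inner entity, not the file list -->
    <entity name="f" processor="FileListEntityProcessor" rootEntity="false"
            baseDir="c:\test" fileName=".*\.xml" recursive="true">
      <entity name="xmlfile" dataSource="xmlindex" processor="XPathEntityProcessor"
              url="${f.fileAbsolutePath}" forEach="/add/doc">
        <field column="id"   xpath="/add/doc/field[@name='id']"/>
        <field column="name" xpath="/add/doc/field[@name='name']"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```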
Institutions by 'institution_name' in the INSTITUTION table.
2. Display institution_type for institution_type_id.
3. The user should be able to search for an institution by 'source_id' and 'source_entity_name'.
My db-data-config.xml is the following -
===
<dataConfig>
  <dataSource driver="net.sourceforge.jtds.jdbc.Driver"
              url="jdbc:jtds:sqlserver://localhost:1433/dummy-master" user="dummy-master"