Synchronizing a Context

2004-01-15 Thread Niall Gallagher
Hi,

I am relatively new to Velocity, and I have been trying to write a shared
Context for a group of Servlets. However, I see in the Javadoc
documentation that the VelocityContext object is not thread safe. I have
also examined the hierarchy of super classes and found that it pretty
much comes down to the InternalContextBase.icachePut and
InternalContextBase.icacheGet methods.

What I was wondering is: if I synchronized these methods in a subclass
and extended AbstractContext, would I have a thread-safe Context
implementation? If not, is there any support within the core Velocity
API for thread-safe (preferably chained) Context objects? Any help on
this would be appreciated!

Niall
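
For reference, a minimal sketch of the kind of subclass being asked about,
assuming the Velocity 1.x AbstractContext contract (internalGet, internalPut,
internalContainsKey, internalGetKeys, internalRemove) and its chaining
constructor; the class name and backing map are illustrative, not an existing
Velocity class:

import java.util.HashMap;
import java.util.Map;
import org.apache.velocity.context.AbstractContext;
import org.apache.velocity.context.Context;

public class SynchronizedContext extends AbstractContext {
    private final Map storage = new HashMap();

    public SynchronizedContext() {
        super();
    }

    // Chaining constructor: lookups that miss here fall through to the inner context.
    public SynchronizedContext(Context inner) {
        super(inner);
    }

    public synchronized Object internalGet(String key) {
        return storage.get(key);
    }

    public synchronized Object internalPut(String key, Object value) {
        return storage.put(key, value);
    }

    public synchronized boolean internalContainsKey(Object key) {
        return storage.containsKey(key);
    }

    public synchronized Object[] internalGetKeys() {
        return storage.keySet().toArray();
    }

    public synchronized Object internalRemove(Object key) {
        return storage.remove(key);
    }
}

Note that this only serializes individual get/put calls; whether that is
sufficient depends on how the servlets actually share the context during
template rendering.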



Copy Directory to Directory function ( backup)

2004-01-15 Thread Nicolas Maisonneuve
Hi,
I would like to back up an index.

1) My first idea was to make a system copy of all the files, but the
FSDirectory class has no public method that tells you where the
directory is located. A simple method like the following would be great:

public File getDirectoryFile() {
    return directory;
}

2) So I decided to create a copy(Directory source, Directory target) method.
I have seen the openFile() and createFile() methods, but I don't know how
to use them (see my function below; it throws an exception):

private void copy(Directory source, Directory target) throws IOException {
    String[] files = source.list();
    for (int i = 0; i < files.length; i++) {
        InputStream in = source.openFile(files[i]);
        OutputStream out = target.createFile(files[i]);
        // Lucene's store.InputStream.readByte() has no -1 end-of-stream marker,
        // so copy exactly in.length() bytes instead of testing for -1.
        long remaining = in.length();
        while (remaining-- > 0) {
            out.writeByte(in.readByte());
        }
        in.close();
        out.close();
    }
}

Could someone help me, please?
nico


AW: Copy Directory to Directory function ( backup)

2004-01-15 Thread Karsten Konrad

Hi,

an elegant method is to create an empty directory and merge
the index to be copied into it, using IndexWriter.addIndexes().
This way, you do not have to deal with files at all.

Regards,

Karsten
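
A minimal sketch of this approach, assuming Lucene 1.3-era APIs
(FSDirectory.getDirectory, IndexWriter with create=true, and
IndexWriter.addIndexes(Directory[])); the analyzer choice and paths are
illustrative only:

import java.io.IOException;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class IndexBackup {
    // Merge the source index into a freshly created (empty) backup directory.
    public static void backupByMerge(Directory source, Directory backup) throws IOException {
        // create=true: the backup directory is wiped and rebuilt from scratch.
        // The analyzer is required by the constructor but unused by addIndexes().
        IndexWriter writer = new IndexWriter(backup, new StandardAnalyzer(), true);
        writer.addIndexes(new Directory[] { source });
        writer.close();
    }

    public static void main(String[] args) throws IOException {
        Directory source = FSDirectory.getDirectory("/path/to/index", false);   // hypothetical paths
        Directory backup = FSDirectory.getDirectory("/path/to/backup", true);
        backupByMerge(source, backup);
    }
}

Note that addIndexes() optimizes the index as it merges, so for a very large
index a raw file copy may still be faster, which is the performance concern
raised in the reply below.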




Re: Copy Directory to Directory function ( backup)

2004-01-15 Thread Nicolas Maisonneuve
Hmm, yes,
but I don't want to open an IndexWriter just for this,
and there is also the performance question when the index is big.




Fw: Betreff: Copy Directory to Directory function ( backup)

2004-01-15 Thread Nicolas Maisonneuve

- Original Message - 
From: Nick Smith [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Thursday, January 15, 2004 2:58 PM
Subject: Betreff: Copy Directory to Directory function ( backup)


 Hi Nico,
This is the method that I use for backing up my indices...

 Good Luck!

 Nick

   /**
    * Copy contents of <code>dir</code>, erasing current contents.
    *
    * This can be used to write a memory-based index to disk.
    *
    * @param dir a <code>Directory</code> value
    * @exception IOException if an error occurs
    */
   public void copyDir(Directory dir) throws IOException {
     // remove current contents of directory
     create();

     final String[] ar = dir.list();
     for (int i = 0; i < ar.length; i++)
     {
       // make place on disk
       OutputStream os = createFile(ar[i]);
       // read current file
       InputStream is = dir.openFile(ar[i]);

       final int MAX_CHUNK_SIZE = 131072;
       byte[] buf = new byte[MAX_CHUNK_SIZE];
       int remainder = (int) is.length();
       while (remainder > 0) {
         int chunklen = (remainder > MAX_CHUNK_SIZE ? MAX_CHUNK_SIZE : remainder);
         is.readBytes(buf, 0, chunklen);
         os.writeBytes(buf, chunklen);
         remainder -= chunklen;
       }

       // graceful cleanup
       is.close();
       os.close();
     }
   }










lucene not indexing under apache 2.0/windows?

2004-01-15 Thread Pierce, Tania
Let me preface this by saying I am a total beginner to
apache/java/tomcat/cocoon etc.  I'm thankfully fluent in xml/xslt or
this would be a nightmare.

Anyway, I have been given the task to figure out why one of our sites
continues to chew up memory and never releases it to the point where I
have to stop and start the tomcat service on a daily basis.  We're using
tomcat 4.1.24, j2se 1.4.1.04 on a win2k server (isapi redirect through
iis).  Our site is made up of a repository of xml docs (2,000 or so)
which get chewed up and spit out as html thanks to transforms set forth
in our cocoon pipeline.   We have lucene in place to create large xml
files (in memory) so that certain web pages don't have to loop through
hundreds of smaller xml files; instead, the xslt loops through the nodes
contained in the in-memory xml doc that's created for us by lucene.

So my manager had me set up a mirror site on a different machine running
all of the above EXCEPT no IIS, our web server is Apache 2.0 (to rule
out IIS, which I don't think is the issue anyway).   Everything on this
mirror site works except lucene.  I can rebuild the lucene index by
running a .bat file our vendor wrote for us and it runs w/o error.
However, when I take a look at the resulting aggregate xml docs
(cached), they're empty.  To top it off, the cocoon pipeline seems to be
trying to apply our xsl templates to the cached xml docs... There are no
errors in any of the log files.

Any ideas?  What do I need to do (as clearly as you can please, I have
just enough knowledge on all this java/apache/tomcat/cocoon to be
dangerous) to get lucene to write out the index to memory?  It's running
through the docs it should be indexing (I can watch the output to the
cmd screen).   This all works fine on our live site, I literally copied
over the webapps directory and a few tomcat/cocoon files (web.xml,
cocoon.xconf, etc).  I can say that w/ the exception of IIS/isapi
redirect, the set up and files are all identical... 

Hope that makes sense.

Huge thanks,
T.



Re: Betreff: Copy Directory to Directory function ( backup)

2004-01-15 Thread Nicolas Maisonneuve
Thanks! The copy function works, but I have run into trouble.
I use a scheduled task to back up the index; for the test, a backup is
made every 15 seconds. Sometimes during the backup process, when I clean
a directory with:

Directory target = FSDirectory.getDirectory(target_backup_dir, true);

I get an exception:

java.io.IOException: couldn't delete segments
 at org.apache.lucene.store.FSDirectory.create(FSDirectory.java:166)
 at org.apache.lucene.store.FSDirectory.<init>(FSDirectory.java:151)
 at org.apache.lucene.store.FSDirectory.getDirectory(FSDirectory.java:132)
 at lab.crip5.ECR.cocoon.components.IndexBackupJob.backup(IndexBackupJob.java:135)

The exception only happens sometimes.

My backup function is simple:

  private void backup(String index_to_backup) throws Exception {
    getLogger().info("begin backup index " + index_to_backup + " at " + new Date() + "...");

    // get the directory of the index
    Directory source = index_manager.getIndex(index_to_backup).getDirectory();

    // select target backup directory
    File target_backup_dir = select_backup(index_to_backup);

    // clean the old index (erase the previous backup in that directory)
    Directory target = FSDirectory.getDirectory(target_backup_dir, true);

    // backup
    copy(source, target);

    target.close();

    getLogger().info("end backup index " + index_to_backup + " at " + new Date() + "...ok");
  }






Re: lucene not indexing under apache 2.0/windows?

2004-01-15 Thread Erik Hatcher
You're missing something in your explanation.  Lucene does not create 
XML files.



Help on IOException and FileNotFoundException (synchronization issue)

2004-01-15 Thread Ardor Wei
Hi experts,

I am new to Lucene and I am trying to fix bugs in existing code. I read
the Lucene 1.3 final docs (some of the API) and searched the related
threads on the mailing list archive, but I still couldn't solve the
problem, even though I know it is probably related to synchronization
issues. Typically I encounter three types of problem: couldn't delete,
file not found, and lock obtain timeout. Here are some exception stacks
(sorry for the long post):
java.io.IOException: couldn't delete _17.fdt
        at org.apache.lucene.store.FSDirectory.create(FSDirectory.java:166)
        at org.apache.lucene.store.FSDirectory.<init>(FSDirectory.java:151)
        at org.apache.lucene.store.FSDirectory.getDirectory(FSDirectory.java:132)
        at org.apache.lucene.store.FSDirectory.getDirectory(FSDirectory.java:113)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:151)
        at com.panva.lucene.ProfileIndexer.<init>(ProfileIndexer.java:47)
        at com.panva.lucene.ProfileDBIndexer.createIndex(ProfileDBIndexer.java:67)
        at com.panva.lucene.MainIndexScheduler.createMainSearchIndex(MainIndexScheduler.java:99)
        at com.panva.lucene.MainIndexScheduler.run(MainIndexScheduler.java:60)

java.io.FileNotFoundException: C:\lucenesource\index\_17.f1 (The system cannot find the file specified)
        at java.io.RandomAccessFile.open(Native Method)
        at java.io.RandomAccessFile.<init>(RandomAccessFile.java:200)
        at org.apache.lucene.store.FSInputStream$Descriptor.<init>(FSDirectory.java:389)
        at org.apache.lucene.store.FSInputStream.<init>(FSDirectory.java:418)
        at org.apache.lucene.store.FSDirectory.openFile(FSDirectory.java:291)
        at org.apache.lucene.index.SegmentReader.openNorms(SegmentReader.java:388)
        at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:151)
        at org.apache.lucene.index.IndexWriter.mergeSegments(IndexWriter.java:423)
        at org.apache.lucene.index.IndexWriter.maybeMergeSegments(IndexWriter.java:401)
        at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:260)
        at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:244)
        at com.panva.lucene.ProfileIndexer.addProfile(ProfileIndexer.java:89)
        at com.panva.lucene.ProfileDBIndexer.createIndex(ProfileDBIndexer.java:72)
        at com.panva.lucene.MainIndexScheduler.createMainSearchIndex(MainIndexScheduler.java:99)
        at com.panva.lucene.MainIndexScheduler.run(MainIndexScheduler.java:60)

java.io.IOException: Lock obtain timed out
        at org.apache.lucene.store.Lock.obtain(Lock.java:97)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:173)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:151)
        at com.panva.lucene.ProfileIndexer.<init>(ProfileIndexer.java:47)
        at com.panva.lucene.ProfileDBIndexer.createIndex(ProfileDBIndexer.java:67)
        at com.panva.lucene.IndexScheduler.createRealTimeSearchIndex(IndexScheduler.java:103)
        at com.panva.lucene.IndexScheduler.run(IndexScheduler.java:63)


In my application, mutilple threads are writing and
searching. Here is the code snippet (not complete, but
should be enough):

// ProfileDBIndexer.java
public class ProfileDBIndexer
{
  public static void createIndex(String path, String sqlStmt) throws Throwable
  {  // blah, blah
    try
    {
      // DB code
      rs = stmt.executeQuery(sqlStmt);

      indexer = new ProfileIndexer(path, true);

      while (rs.next())
      {
        Profile profile = getProfileFromResultSet(rs);
        // If I do synchronized (indexer) here and use writer.close() in the
        // following addProfile() method, a NullPointerException is thrown.
        // Looks like writeLock.release() in close() of IndexWriter throws it.
        indexer.addProfile(profile);
        noOfRecordsProcessed++;
      }
    }
    catch (Exception e)
    {
      e.printStackTrace();
    }
    finally
    {
      // close DB connection
    }
  }
}

// ProfileIndexer.java
public class ProfileIndexer {
  IndexWriter writer;

  public ProfileIndexer(String path, boolean create) throws IOException {
    Analyzer analyzer = new AlphanumStopAnalyzer();
    writer = new IndexWriter(path, analyzer, create);
  }

  public void addProfile(Profile profile) throws IOException {
    Document document = new Document();

    document.add(Field.Keyword("Username", profile.getUsername()));
    ... // many document.add() calls here

    writer.addDocument(document);
    // writer.optimize();
    // writer.close();
  }
}


In the thread class, the following method is called frequently:

ProfileDBIndexer.createIndex(indexPath, indexQuery);

In my application, index searching is driven by client requests, it is
not multi-threaded, it doesn't delete index files, and no
synchronization is used.

I tried to use synchronization for some methods, but it didn't work out.
I know I haven't identified the real problem. I am lost.

Could you help me or give any suggestion? Thanks a lot.
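
One common way to avoid both the couldn't-delete and lock-obtain-timeout
symptoms is to make sure that only one IndexWriter is ever open on a given
index at a time, and that it is always closed when indexing finishes. A
minimal sketch of that idea, reusing the ProfileDBIndexer/ProfileIndexer
names from the snippet above (the INDEX_LOCK field and the indexer.close()
call are assumptions added for illustration, not part of the original code):

// Sketch only: serialize all index builds through one shared lock so that at
// most one IndexWriter (and therefore one Lucene write lock) exists at a time.
public class ProfileDBIndexer
{
  private static final Object INDEX_LOCK = new Object();  // assumed shared lock

  public static void createIndex(String path, String sqlStmt) throws Throwable
  {
    synchronized (INDEX_LOCK) {
      ProfileIndexer indexer = new ProfileIndexer(path, true);
      try {
        // ... fetch rows and call indexer.addProfile(profile) as before ...
      } finally {
        indexer.close();  // hypothetical close() that simply calls writer.close()
      }
    }
  }
}

Closing the writer in a finally block matters on its own: a writer that is
never closed keeps holding Lucene's write lock, which is another way to end
up with "Lock obtain timed out".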

Re: Lucene based projects...?

2004-01-15 Thread Hamish Carpenter
Hi All,

My company has been working on a project involving Lucene and intends to release the source for it.

It is a socket-based wrapper for Lucene. It listens on a socket for index and search requests, performs them, and sends the results back down the socket. The aim was to integrate Lucene searching into our company's projects that do not use Java. The 'luceneserver' accepts a text-based, HTTP-header style of request and, more recently, an XML message type. To use it, your application will need to be able to create, receive, and interpret these messages.

If anyone is interested, I can send them the source for it: 2.4MB including libraries, 1.2MB including only essential libs (log4j, dom4j), or 250KB with no libs.

Any questions, I'd be happy to help; it needs a FAQ anyway!
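
(The luceneserver protocol itself isn't shown in this message; purely as an
illustration of the general idea, here is a minimal sketch of a listener that
reads one query line per connection, searches a Lucene index, and writes the
hits back. The index path, port, and field names are hypothetical, and Lucene
1.3-era APIs are assumed.)

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;

public class TinySearchServer {
    public static void main(String[] args) throws Exception {
        IndexSearcher searcher = new IndexSearcher("/path/to/index");   // hypothetical path
        ServerSocket server = new ServerSocket(9099);                   // hypothetical port
        while (true) {
            Socket client = server.accept();
            BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
            PrintWriter out = new PrintWriter(client.getOutputStream(), true);
            String queryString = in.readLine();                         // one query per connection
            Query query = QueryParser.parse(queryString, "contents", new StandardAnalyzer());
            Hits hits = searcher.search(query);
            out.println(hits.length() + " hits");
            for (int i = 0; i < hits.length(); i++) {
                out.println(hits.doc(i).get("path"));                   // "path" field is an assumption
            }
            client.close();
        }
    }
}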

Hamish Carpenter
Catalyst IT
Erik Hatcher wrote:

On Jan 12, 2004, at 6:24 AM, [EMAIL PROTECTED] wrote:

who knows other software projects (like Nutch) which are based on and
built around Lucene? I think it can be quite interesting and helpful
for new people to see and learn from examples...
This is the purpose of the Powered by section on Lucene's website.

More contributions are welcome!

Erik

