Basically, my UNIX/Linux knowledge is non-existent. I've FTP'd the tar file
to our Linux box, and have extracted it... but where to go from there??? I
haven't got a clue.
If that is so, you are bound to run into problems all the time.
Perhaps installing a search engine is not the first
Alexander Barkov wrote:
We finally found a bug in cache.c. The new version is in the attachment.
Everybody who has problems with splitter's crashes is welcome to test.
Please, give feedback!
You guys are great! I'll re-compile and get back to you with
reports.
BTW, can I remove
Alexander Barkov wrote:
We finally found a bug in cache.c. The new version is in the attachment.
Everybody who has problems with splitter's crashes is welcome to test.
Please, give feedback!
Oops. Something else is not OK:
cache.c:687:87: warning: #ifdef with no argument
cache.c:692:87:
Zenon Panoussis wrote:
Oops. Something else is not OK:
cache.c:687:87: warning: #ifdef with no argument
[etc]
I think the mailer is responsible for this. There are
lots of broken lines in the code that shouldn't be broken.
Perhaps it's better to attach the file in .gz format
Caffeinate The World wrote:
i've been going through this and back again time and time again. what
would really be nice is if indexer saved the logs in a format that's easy
to use again. for instance, you could use the format to re-index, to export to sql, etc.
or if you want to reindex again, you don't
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Depends on what you mean. Follow the links at
http://search.mnogo.ru/users.html and see what it can do.
Z
Reply: http://search.mnogo.ru/board/message.php?id=1392
__
If you want to unsubscribe send "unsubscribe udms
Zenon Panoussis wrote:
By now, I have almost 1 GB of indexed files, 4 indexer
crashes and one splitter crash. I'll do the debugging and
post its output tomorrow.
===
# gdb indexer core.indexer.01
GNU gdb 5.0
Copyright 2000 Free Software Foundation, Inc.
GDB is free
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Grrr. OK, try this then: Make a static HTML page with this form in it:
<html><body>
<form method=GET ACTION="/cgi-bin/search.cgi">
Search for: <input type="text" name="q" SIZE=30 value="anything">
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
If I move search.htm the script complains that it can't find
search.htm via the ssh.
Well that's good. It shows that the cgi is looking in the right
place for the right file.
In case it is of any help, I have called the script
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
I have another stupid question. Is it possible that I have to call
the configure script with an option for the host type?
I have installed it via ssh on the host but have not used any
stuff there... and the database
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Question: are you running your own web server? Is the cgi-bin of
your particular domain and user account *really* set to the cgi-bin
directory that you are using? Are you sure?
If you don't know how the web server is configured
Zenon Panoussis wrote:
Now for 31 MB adventures :)
# ./run-splitter -k
Sending -HUP signal to cachelogd...
Done
# ./run-splitter -p
Preparing logs...
Open dir '/var/mnogo3110/raw'
Preparing word log 982024900 [ 42176 bytes]
Preparing word log 982027284 [31465324 bytes]
Alexander Barkov wrote:
Can you guys give us a log file produced by splitter -p which caused
the crash? We can't reproduce it :-(
Huh? splitter doesn't accept the -v5 argument, so it won't give
more detailed logs than the normal ones. The only log I had, that
to stdout, is the one I
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Does it matter?
/bin/sh ../libtool --mode=compile gcc -DHAVE_CONFIG_H -I. -I. -I../include
-I../include -I/usr/include/m
ysql -g -O2 -DUDM_CONF_DIR=\"/usr/local/mnogo3110/etc\"
-DUDM_VAR_DIR=\"/var
Zenon Panoussis wrote:
And a really HARD hang at the same place as before. So hard
that I can't even kill splitter.
BTW, although I couldn't kill splitter, I did find a core dump
in sbin. Here's the backtrace:
# gdb splitter core
GNU gdb 5.0
snip copyright
This GDB was configured
Fredy Kuenzler wrote:
It seems to me that cache mode in 3.1.9 and 3.1.10 does not work
well. Indexer (according to the /doc) works in cache mode and
single mode, however search.cgi does not find anything in cache
mode. In single mode everything works as expected.
It's a whole series of
Zenon Panoussis wrote:
I'll delete the entire tree directory and start re-indexing from
scratch. I'll make and split a small file first, ca 5 MB, then a
31 MB file, if that works yet another 31 MB file, and so on until
I run into problems again. Will report back later this evening.
First
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
I have built my index using this:
HTDBDoc \
SELECT concat( \
[etc]
FROM jobsadvertised \
WHERE job_id='$1' and to_days(now()) - to_days(job_inp_dte) <= '$2' and site_type
= '$4' and job_location = '$3' and job_type = '$5
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
After spending nearly 3 Days trying to get this thing to work, I
have come to the conclusion that it is a waste of time and a
JOKE:-(..)
In that case you are entitled to your money back. Every penny
of it.
The documentation
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
[3.1.10, RH 7.0 on PII, mysql-3.23.29-1, cache mode]
While trying to reproduce the splitter segfault, I got a segfault
from indexer. I don't remember this ever happening before and I've
been using mnogosearch since the early days
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
RH Linux 7.0, search 3.1.9, MySQL 3.23.29, cache mode, with the
new patches for cache.c and sql.c.
It happens all the time. It started happening when "maximum size"
31 MB log files were indexed, but by now it happens on an
Caffeinate The World wrote:
I run splitter -p and it finishes fine. I then run splitter and,
halfway through the splitting, it crashes: a segmentation fault, or
just a hang, with a core dump. So I restart splitter and the next
time it finishes fine.
what machine are you on? Alpha? OS?
Intel PII, RH Linux
Caffeinate The World wrote:
i'll wait. for now, i'm indexing but running splitter when the files
are around 2MB.
I've been running indexer -c 3600 since last night, producing
log files of 5-10 MB and running splitter every time afterwards,
with cleaning of var/splitter and all. So far
Alexander Barkov wrote:
Now the tags and categories work fine, but not the site search.
The ul= directive is completely ignored by search.cgi.
What was the value of the ul= variable you tried?
I tried all of the following:
- http://www.domain.dom
- www.domain.dom
- domain
Alexander Barkov wrote:
We found a bug. Please find patches against sql.c and cache.c
in the attachment.
The patch didn't work by itself, so I did the replacements manually.
The patched source compiled without complaints. I replaced the old
search.cgi with the new one but site search still
Alexander Barkov wrote:
Now the tags and categories work fine, but not the site search.
The ul= directive is completely ignored by search.cgi.
What was the value of the ul= variable you tried?
I tried all of the following:
- http://www.domain.dom
- www.domain.dom
- domain
- /path/
-
Luis Bravo wrote:
My files are in Spanish. We have words like oración, apéndice,
estómago, etc. When they are indexed, indexer splits those words.
In the database they end up as two words: oraci n, ap ndice, est mago.
What can I do?
In later versions you need to set
LocalCharset
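The reply above is cut off at the directive name. A fragment of the kind presumably being suggested would look like the following; the iso-8859-1 value is an assumption for Spanish text, not taken from the thread:

```
# indexer.conf: declare the charset of the local documents so that
# accented characters are treated as letters, not as word separators.
# iso-8859-1 is assumed here; use whatever charset your pages are in.
LocalCharset iso-8859-1
```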
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
I have the following problem. If I try search.cgi I
get an error message, an empty page! If I run search.cgi
from the telnet session I get a valid HTML output, but
all the vars from search.htm are empty (I mean you cannot
see
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
More problems: neither tags nor categories seem to work.
I'm using v 3.1.9 with MySQL in cache mode, compiled with
--enable-fast-tag/cat/site ...
Found the problem:
./configure --enable-fast-tag --enable-fast-cat --enable-fast-site
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
The search works very nicely, but it returns a tremendous
amount of quoted document data...
Can I take a look at your search page?
Yes. Go to http://search.freewinds.cx and use "New search".
Search for the word
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
The search works very nicely, but it returns a tremendous
amount of quoted document data...
This is because of --enable-news-extensions
Is there *any* way to limit the quotes to just a few lines?
If not, is there any chance
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
More problems: neither tags nor categories seem to work.
I'm using v 3.1.9 with MySQL in cache mode, compiled with
--enable-fast-tag/cat/site . I've read the part on fast
search with tag etc limits in cachemode.txt, but I doubt
I
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
As more things work, more questions arise. v 3.1.9 in DBMode cache,
compiled with news-extension and using MySQL with create.txt from
the news-3.1.tar.gz module.
1. cachemode.txt says that after running splitter, "it is b
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Shouldn't the files in /var/raw also be deleted? Or are they
needed in any way?
/Me stupid. The answer is in cachemode.txt: "All processed logs
in /var/raw directory are renamed to *.done ... you can remove
them or keep
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Minor, trivial stuff: indexer -S returns the caption "UdmSearch statistics". Since
that can end up public, as for instance in http://search.freewinds.cx/cgi-bin/stats ,
you might want to change it.
Z
Reply: http://searc
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
I can't find a way to make the Restricted Search work! I can't find in
the DB any data that says that a specific URL is related to a
restricted criterion (like Sports or Shopping, which are given as an
example in the search.php
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
There was a discussion about word separators back in January; see
http://www.mail-archive.com/udmsearch%40web.izhcom.ru/msg00200.html .
Since I just realised that I am facing the same problem, I wonder
if Charlie's idea was implemented
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Beginning today or yesterday, the search at http://search.mnogo.ru/search/search.php3
returns "Fatal error: Cannot redeclare crc32() in
/usr/apache/search.mnogo.ru/share/htdocs/search/crc32.inc on line 11". Looks bad fo
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
search.cgi has been working fine here, but experimenting with the PHP front end I ran
into problems:
Query error: SELECT path,link,name FROM categories WHERE path LIKE
'__' ORDER BY NAME ASC
Table 'db.categories' doesn't exist
I
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Can't open template file '/usr/local/udmsearch/etc/search.htm'!
There is a search.htm-dist in that directory which I tried to rename, but because of
permissions I could not. Any help would be appreciated.
:) I did the same thing
Are there any inherent limitations on how long the Server path
list can get? Would the indexer work with, say, a 2 MB list of
URLs to index, or would it choke?
Z
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
This is a stupid question. Please bear with a total newbie to mysql.
mysql> SELECT url FROM url WHERE status="404";
works fine and returns all the 404s. However,
mysql> SELECT url FROM url WHERE status="404" AND url
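The second query above is cut off, but it is evidently trying to combine the status filter with a second condition on the url column. As a generic illustration of combining WHERE clauses (not taken from the thread; sqlite3 stands in for MySQL and the table contents are made up):

```python
import sqlite3

# Hypothetical stand-in for the mnogosearch 'url' table; the real one
# lives in MySQL, but the WHERE logic is identical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE url (url TEXT, status INTEGER)")
conn.executemany("INSERT INTO url VALUES (?, ?)", [
    ("http://site/a.html", 404),
    ("http://site/b.html", 200),
    ("http://other/c.html", 404),
])

# Combine the status filter with a pattern match on the url column.
rows = conn.execute(
    "SELECT url FROM url WHERE status = 404 AND url LIKE 'http://site/%'"
).fetchall()
print(rows)  # only a.html has status 404 AND a matching url
```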
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
THANK YOU!
Z
Reply: http://search.mnogo.ru/board/message.php?id=770
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
If MirrorRoot is specified in indexer.conf,
mnogosearch copies the files it indexes to
directories such as mirror_root/http/domain/dir .
I can see three possible improvements in the
mirroring behaviour. The first two should be easy
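The URL-to-directory mapping described above (mirror_root/http/domain/dir) can be sketched as a small helper. This is only an illustration of the layout as stated in the message, not mnogosearch's actual code, and `mirror_path` is a hypothetical name:

```python
import os
from urllib.parse import urlparse

def mirror_path(mirror_root, url):
    # Map a URL onto mirror_root/scheme/host/path, following the
    # directory layout described above (mirror_root/http/domain/dir).
    p = urlparse(url)
    return os.path.join(mirror_root, p.scheme, p.hostname, p.path.lstrip("/"))

print(mirror_path("/var/mirror", "http://domain.dom/dir/page.html"))
# /var/mirror/http/domain.dom/dir/page.html
```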
Problem solved. There was a pointer to mysql in /etc/ld.so.conf
that pointed to the wrong place. Correcting the pointer and
recompiling mnogosearch didn't help. I ended up removing the
pointer, uninstalling mysql completely, reinstalling it again,
and then recompiling and reinstalling
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Try "indexer -C -s 403" or whatever status URLs you want to get rid of.
Z
Allo,
I forgot to switch on the DeleteBad to YES...
now we have about 26K bad URLs... can I delete them manually via MyAdmin... I do
not w
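Deleting the bad URLs directly in SQL, rather than via `indexer -C -s`, amounts to a DELETE keyed on status. A minimal sketch with sqlite3 standing in for MySQL; the single-table layout here is an assumption, and the real mnogosearch schema has related tables that `indexer -C` also cleans up, which is why the flag is the safer route:

```python
import sqlite3

# Toy stand-in for the 'url' table. The real schema also has word/cache
# tables referencing these rows, which indexer -C takes care of too.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE url (url TEXT, status INTEGER)")
conn.executemany("INSERT INTO url VALUES (?, ?)", [
    ("http://site/ok.html", 200),
    ("http://site/gone.html", 403),
    ("http://site/missing.html", 404),
])

# Remove every row whose HTTP status marks it as a bad URL.
conn.execute("DELETE FROM url WHERE status IN (403, 404)")
remaining = [r[0] for r in conn.execute("SELECT url FROM url").fetchall()]
print(remaining)  # only the status-200 row survives
```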
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
Using 3.1.8 with MySQL 3.23.24 on RH7.
I am indexing part of a site with
Server Path http://site/dir/dir/dir/ . Everything
in the directories to be indexed is normal HTML
with no funny stuff. Most directory indices are
auto-generated
mnogosearch 3.1.8, mysql 3.23.22
This happened:
The search worked fine. Then I re-installed MySQL (3.23 instead
of 3.22) and Apache, and the directory structure of both changed.
I moved the old search.cgi to the new cgi-bin. I exported the old
database with mysqldump and re-imported it in
UdmSearch version: 3.1.7
Platform: i586
OS:RH Linux 6.2 / 2.2.16
Database: MySQL 9.38 / 3.22.32
Statistics:
Perl
Severity: cosmetic.
The search page reports results +1. E.g. if 20 results per page are requested, the
caption on the results page will
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
The instructions in the indexer.conf-dist file say that M is minute and m is month.
However, the examples given right after the instructions indicate the opposite. Which
is correct?
Reply: http://search.mnogo.ru/board/message.php?id=611
Author: Zenon Panoussis
Email: [EMAIL PROTECTED]
Message:
I just installed 3.1.7 on Linux 2.2.16 with:
./configure --with-mysql (3.22.32)
make
make install
and no changes in the configuration file.
I proceeded to create one database and tables with:
mysqladmin create udmsearch
mysql udmsearch