[OPEN-ILS-DEV] problem during Evergreen installation

2008-07-22 Thread vijay kumar
Dear List,

Thank you for your help.
Now I have downloaded the Evergreen ILS packages and am trying to proceed with the next steps.

I have a lot of questions regarding the installation of
Evergreen, because I am new to Ubuntu.
1) Where should I put all the extracted files related to Evergreen?

2) Where should I put my extracted Evergreen folder?

3) In the instructions at
http://open-ils.org/dokuwiki/doku.php?id=installing_evergreen_1.2_on_ubuntu_7.10
(step 22),
should I write "Allow from 127.0.0.0/16"
   or "127.0.0.1/06"?

Should I change "ServerName eg-server:80" to "ServerName <ip no.>:80" or
"ServerName 127.0.0.1:80"?

I am not able to understand this line:
"Change the two ServerAlias directives to something appropriate."
Should I also change "ServerAlias 127.0.1.1:80" to
   "ServerAlias <ip no.>:80"?
Before this step Apache was working properly, but after this stage Apache
no longer responds.

4) Please tell me, what is the difference between 127.0.0.1, 127.0.0.0,
and 127.0.1.1?

5) Later, some commands are not working properly. I have pasted the
relevant output below:
Starting OpenSRF Router
Starting OpenSRF Perl
Can't locate Unix/Syslog.pm in @INC (@INC contains: /openils/lib/perl5
/etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8
/usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8
/usr/local/lib/site_perl .) at
/openils/lib/perl5/OpenSRF/Utils/Logger.pm line 5.
BEGIN failed--compilation aborted at
/openils/lib/perl5/OpenSRF/Utils/Logger.pm line 5.
Compilation failed in require at /openils/lib/perl5/OpenSRF/System.pm line 5.
BEGIN failed--compilation aborted at
/openils/lib/perl5/OpenSRF/System.pm line 5.
Compilation failed in require.
BEGIN failed--compilation aborted.
Starting OpenSRF C (host=vijay-desktop)

[EMAIL PROTECTED]:~# sudo -u opensrf ./autogen.sh
/openils/conf/opensrf_core.xml
sudo: ./autogen.sh: command not found



I know it is difficult to answer all of this at once,
but please guide me.

Thanks a lot,
Vijay Kumar
NCSI,IISc
Bangalore
INDIA


Re: [OPEN-ILS-DEV] problem during Evergreen installation

2008-07-22 Thread Dan Wells
Hello Vijay,

1 & 2) You can put the downloaded/extracted files anywhere your user has
write permissions and which is convenient for you.  The wiki page suggests
an eg-srcs directory within your home directory.  It appears as though you have done this,
so maybe I am misunderstanding the questions?

3) You should use: 127.0.0.0/16
The number after the slash is a bit mask, so this essentially means to allow 
from 127.0.*.*, which should let you access the config scripts only from a 
browser running on the server itself.  This is a very basic security measure.
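
For example, a minimal sketch of that kind of access rule (the <Location>
path below is only an illustration, not necessarily the exact block the wiki
shows):

  # hypothetical excerpt from the Evergreen Apache config:
  # deny everyone, then allow only 127.0.*.* (the local machine) back in
  <Location /cgi-bin/offline>
      Order deny,allow
      Deny from all
      Allow from 127.0.0.0/16
  </Location>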

Apache allows for resolving and forwarding of requests based on server name, so 
the 'ServerName' directive should be a name, not an address, e.g.:
ServerName eg-server:80

or

ServerName eg-server.yourdomainhere.com:80 (if you have a registered name)

If using the first, unregistered version, you will probably only be able to access 
http://eg-server/ from a browser running on the server itself.  It should be 
noted that the instructions are generally written to allow this sort of 
self-testing, not to allow access from other machines on your network.  You 
should generally still be able to access the web server using the machine's 
external IP address, though.  

ServerAlias does nothing more than specify additional names or addresses to
which a given Apache virtual host should respond.  You can set as many as you want, e.g.:

ServerAlias server:80 server2.domain.com:80 server2:80 127.0.1.1:80

and so on.

All that being said, Apache is extremely flexible, and the best Apache settings 
are those that work :)
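
Since you mentioned Apache stops responding after this stage, it is also worth
checking the configuration syntax and restarting after each edit.  On Ubuntu
7.10 with the stock apache2 package (a sketch; adjust paths if yours differ):

  # check the configuration for syntax errors, then restart Apache
  sudo apache2ctl configtest
  sudo /etc/init.d/apache2 restart
  # if it still refuses to start, the reason is usually in the error log
  tail -n 20 /var/log/apache2/error.log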

4) 127.0.0.1 is the address for 'localhost,' that is, the current computer you 
are using.  This allows a machine to use IPv4 protocols to communicate with 
itself, which is very convenient when accessing a 'network' service locally.  
(127.0.0.0 itself is not a usable host address; it is the network address 
that, together with the /16 mask, denotes the whole 127.0.*.* range.)  
On Ubuntu, any unqualified/unregistered machine name is mapped to 127.0.1.1, 
which is mostly just another, redundant way for the machine to talk to itself, 
and a stand-in for the real IP address that would go with a real (DNS 
registered) name.  For example:

ping localhost = contact ping service on 127.0.0.1
ping eg-server = contact ping service on 127.0.1.1
(if I register my machine with my DNS server)
ping eg-server.yourdomainhere.com = contact ping service on 153.106.123.1 (some 
Internet legal IP address)
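
On a stock Ubuntu desktop those first two mappings live in /etc/hosts; an
illustrative file (your hostnames will differ) looks like:

  127.0.0.1   localhost
  127.0.1.1   vijay-desktop eg-server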

5) For the Syslog error, it seems that at least one prerequisite was missed in your 
install.  Try:

sudo apt-get install libunix-syslog-perl

If you have other missing Perl modules, review steps 8 and 9.
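
If you want to confirm whether a particular Perl module is present (taking
Unix::Syslog from your error output as the example), a quick check is:

  # prints the module's version if it loads, or a "Can't locate" error if not
  perl -MUnix::Syslog -e 'print "$Unix::Syslog::VERSION\n"'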

Finally, this command:

sudo -u opensrf ./autogen.sh /openils/conf/opensrf_core.xml

must be run from your /openils/ directory.
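
The 'command not found' just means the shell cannot see autogen.sh in the
directory you are currently in.  A minimal sketch (the /openils/bin location
is an assumption based on the 1.2 install docs; cd to wherever autogen.sh
actually lives on your system):

  cd /openils/bin
  sudo -u opensrf ./autogen.sh /openils/conf/opensrf_core.xml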


Good luck,
DW


Re: [OPEN-ILS-DEV] 1.4 release: switching locales in the OPAC and staff client

2008-07-22 Thread Dan Scott
Hi Mike:

The database location is good. Is there any reason you chose to go
with the ll_ll form for language/region in the code field, rather
than the ll-LL form currently used for translations? The latter form
is also used by Dojo, which would be handy for localizing dates, times,
and currencies.

On 19/07/2008, Mike Rylander [EMAIL PROTECTED] wrote:
 On Tue, Jul 8, 2008 at 11:44 AM, Dan Scott [EMAIL PROTECTED] wrote:
 Thoughts on the following proposal for the (rapidly approaching) 1.4
 release?

 I'm particularly interested in the plumbing for supported & default
 locales. We could conceivably have one set of locales supported for
 the OPAC, and a different (probably smaller) set of locales supported
 for the staff client. And corresponding to that, a staff user might
 prefer to use the OPAC in one locale, but use the staff client in a
 different locale (probably because the corresponding translation isn't
 available in the staff client). This is trickiest to manage if we do
 opt to support a locale preference at the user level; but one way
 might be to implement locale preference as a fall-through list akin to
 how browsers do it, so if a given locale isn't available the next one
 is automatically tried.

 Related issue: I don't think there's a way of expressing a supported
 set of locales in the system. And the default locale is currently
 hard-coded as en-US.  Would it make sense to beef up opensrf.xml to
 include a <locales> element within <default> (possibly with a list of
 <supported_locale> child elements and a single <default_locale> child
 element) and teach the various libraries to rely on that? Or would it
 make more sense to push those settings into the database where we can
 provide a user-friendly admin interface?

 I'm unsure, at this time, of the best way to provide a precedence list
 of locales in any given situation, but I think it's important that
 this all be stored in the database.  To that end, I've created a new
 table and fkeys among existing tables:

 -- new table
 CREATE TABLE config.i18n_locale (
 code        TEXT    PRIMARY KEY,
 marc_code   TEXT    NOT NULL REFERENCES config.language_map (code),
 name        TEXT    UNIQUE NOT NULL,
 description TEXT
 );

 -- available locales
 INSERT INTO config.i18n_locale (code,marc_code,name) VALUES
 ('en_us','eng',oils_i18n_gettext('American English'));
 INSERT INTO config.i18n_locale (code,marc_code,name) VALUES
 ('en_ca','eng',oils_i18n_gettext('Canadian English'));
 INSERT INTO config.i18n_locale (code,marc_code,name) VALUES
 ('fr_ca','fre',oils_i18n_gettext('Canadian French'));
 INSERT INTO config.i18n_locale (code,marc_code,name) VALUES
 ('es_us','spa',oils_i18n_gettext('American Spanish'));
 INSERT INTO config.i18n_locale (code,marc_code,name) VALUES
 ('es_mx','spa',oils_i18n_gettext('Mexican Spanish'));

 -- added fkey constraint
 CREATE TABLE config.i18n_core (
 id              BIGSERIAL   PRIMARY KEY,
 fq_field        TEXT        NOT NULL,
 identity_value  TEXT        NOT NULL,
 translation     TEXT        NOT NULL REFERENCES config.i18n_locale (code),
 string          TEXT        NOT NULL
 );


 Note that this makes config.language_map table the center of the
 natural language universe, with multiple locales pointing at the
 language codes held there.  The requirement is, then, any language for
 which we provide an interface translation must be a valid language in
 whatever metadata standard used by the system ... today, of course,
 that means MARC21.  Doesn't seem too restrictive, given nearly
 distinct language codes currently available. :)

 The locale names and descriptions are i18n ready and the dev database
 has been updated with these changes.


 Hmm. Part of me likes the database approach, as it means that we could
 have an actor.org_unit_setting override the system-wide default locale
 (in our consortium, some libraries are French-only, others are
 English-only). But perhaps that particular problem would be best
 handled via Apache configuration anyways (as the library would
 probably use a different URL entry point to get to the OPAC).

 Sorry, I started rambling there. Hopefully this is more helpful
 rambling than hurtful.

 

 In 1.4, the OPAC interface will be fully supported in multiple locales.

 Currently, the locale is determined by the URL, with supported locales
 and the default locale set in eg_vhost.conf. For example:
  * en-US
 (http://biblio-dev.laurentian.ca/opac/en-US/skin/lul/xml/index.xml)
  * fr-CA
 (http://biblio-dev.laurentian.ca/opac/fr-CA/skin/lul/xml/index.xml)

 For the production release of the i18n support for the OPAC, we need
 to add a user-friendly locale switcher mechanism in the OPAC.

 The switcher should expose:
  * the list of supported locales (defined in opensrf.xml?)
  * the associated locale name displayed in the language of the
 respective locale

 It would be nice if the preference were sticky across sessions (likely
 via a cookie).

 We may 

Re: [OPEN-ILS-DEV] 1.4 release: switching locales in the OPAC and staff client

2008-07-22 Thread Mike Rylander
On Tue, Jul 22, 2008 at 2:40 PM, Dan Scott [EMAIL PROTECTED] wrote:
 Hi Mike:

 The database location is good. Is there any reason you chose to go
 with the ll_ll form for language/region in the code field, rather
 than the ll-LL form currently used for translations? The latter form
 is also used by Dojo, which would be handy for locazing dates, times,
 and currencies.

The ll_ll form is simply normalized to avoid any case-based confusion.
 I'll have to look through the code to make sure there are no
assumptions of _ instead of -, but we can change to '-' notation.  If
we provide (which we will) an interface for creating supported
locales, then I suppose I could drop the case folding as well.

I normalize to lower case and '_' in the core i18n stored proc and then
split on '_' to find generalizations, but I can remove the normalization
if we can accept the (human-imposed) constraint of "don't shoot
yourself in the foot -- use exact matches for locale strings" ... which
I suppose we can. :)
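
A rough sketch of the normalization and fall-through described above (plain
SQL for illustration, not the actual stored procedure; 'fr-CA' is just an
example input):

  -- fold an incoming locale tag to lower-case/underscore form ...
  SELECT LOWER(REPLACE('fr-CA', '-', '_'));                      -- fr_ca
  -- ... then split on '_' to fall back to the bare language code
  SELECT SPLIT_PART(LOWER(REPLACE('fr-CA', '-', '_')), '_', 1);  -- fr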

--miker


Re: [OPEN-ILS-DEV] purging objson (legacy json) for opensrf 1.0 / Evergreen 1.4

2008-07-22 Thread Bill Erickson
On Monday 14 July 2008 2:38 Dan Scott wrote:
 2008/7/14 Bill Erickson [EMAIL PROTECTED]:
  The topic of purging objson, which implements the old-style,
  comment-embedded class hints for OpenSRF objects came up recently during
  a discussion of the new autotools infrastructure.  OK, fine, I brought it
  up.
 
  The current objson setup provides support for parsing old-style objects
  via a separate API call (used in the opensrf gateway) and an
  implementation of the old jsonObjectIterator API, which changed with the
  latest JSON code.
 
  The original idea for the legacy json layer was that the system may need
  to support old and new-style JSON objects for Evergreen 1.4.  However, if
  we are in agreement that there is no need to support old-style JSON
  objects in Evergreen 1.4, and I'm pretty sure we've passed that bridge
  already, then the legacy JSON layer seems like an unnecessary layer of
  complexity that we should just drop.
 
  What would it take?
 
  1. The cstore application makes heavy use of the jsonObjectIterator API,
  which would need to be manually updated to use the new jsonIterator API. 
  The difference there is that the call to next() now returns a jsonObject
  instead of a the intermediary jsonObjectNode.  Also, instead of accessing
  the current key through the node, you access it directly on the iterator
  object.
 
  2. Remove all references to objson on the source/makefiles for Evergreen
  (only a few remain)
 
  3. Purge objson from OpenSRF autotools and remove osrf_legacy_json* files
 
  Sound sane for Evergreen 1.4 and OpenSRF 1.0?

 That sounds quite sane to me. The earlier, the better, as far as
 testing before 1.4 goes :)

Attached is a patch to purge objson from the ILS tree.  It ports all of the 
cstore jsonObjectIterator calls to the newer jsonIterator and replaces all 
objson/object.h references with opensrf/osrf_json.h.
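
For anyone who hasn't worked with the two APIs, here is a rough before/after
sketch of the iteration pattern being ported (illustrative only -- the exact
signatures are the ones declared in opensrf/osrf_json.h, and cleanup calls are
omitted):

  /* old style: next() hands back an intermediary node; the value is node->item */
  jsonObjectIterator* old_itr = jsonNewObjectIterator( obj );
  jsonObjectNode* node;
  while ( (node = jsonObjectIteratorNext( old_itr )) )
      osrfAppRespond( ctx, node->item );

  /* new style: next() returns the jsonObject directly; when iterating a hash,
     the current key is read from the iterator itself (itr->key) */
  jsonIterator* itr = jsonNewIterator( obj );
  jsonObject* cur;
  while ( (cur = jsonIteratorNext( itr )) )
      osrfAppRespond( ctx, cur );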

I wanted to push it out to the list since it touches a lot of cstore code and 
that's not my usual stomping ground.  This patch is running on 
acq.open-ils.org.  So far so good. 

If there are no objections, I'll get this committed and start clearing out the 
legacy JSON from OpenSRF.

-b

-- 
Bill Erickson
| VP, Software Development & Integration
| Equinox Software, Inc. / The Evergreen Experts
| phone: 877-OPEN-ILS (673-6457)
| email: [EMAIL PROTECTED]
| web: http://esilibrary.com
Index: Open-ILS/include/openils/oils_utils.h
===
--- Open-ILS/include/openils/oils_utils.h	(revision 10093)
+++ Open-ILS/include/openils/oils_utils.h	(working copy)
@@ -1,4 +1,4 @@
-#include <objson/object.h>
+#include <opensrf/osrf_json.h>
 #include <opensrf/log.h>
 
 // XXX replacing this with liboils_idl implementation
Index: Open-ILS/include/openils/oils_event.h
===
--- Open-ILS/include/openils/oils_event.h	(revision 10093)
+++ Open-ILS/include/openils/oils_event.h	(working copy)
@@ -1,6 +1,6 @@
 #ifndef OILS_EVENT_HEADER
 #define OILS_EVENT_HEADER
-#include <objson/object.h>
+#include <opensrf/osrf_json.h>
 #include <opensrf/utils.h>
 #include <opensrf/log.h>
 #include <opensrf/osrf_hash.h>
Index: Open-ILS/src/c-apps/oils_cstore.c
===
--- Open-ILS/src/c-apps/oils_cstore.c	(revision 10093)
+++ Open-ILS/src/c-apps/oils_cstore.c	(working copy)
@@ -6,7 +6,6 @@
 #include <opensrf/log.h>
 #include <openils/oils_idl.h>
 #include <dbi/dbi.h>
-#include <objson/object.h>
 
 #include <time.h>
 #include <stdlib.h>
@@ -53,9 +52,9 @@
 static char* searchWriteSimplePredicate ( const char*, osrfHash*,
 	const char*, const char*, const char* );
 static char* searchSimplePredicate ( const char*, const char*, osrfHash*, const jsonObject* );
-static char* searchFunctionPredicate ( const char*, osrfHash*, const jsonObjectNode* );
+static char* searchFunctionPredicate ( const char*, osrfHash*, const jsonObject*, const char* );
 static char* searchFieldTransform ( const char*, osrfHash*, const jsonObject*);
-static char* searchFieldTransformPredicate ( const char*, osrfHash*, jsonObjectNode* );
+static char* searchFieldTransformPredicate ( const char*, osrfHash*, jsonObject*, const char* );
 static char* searchBETWEENPredicate ( const char*, osrfHash*, jsonObject* );
 static char* searchINPredicate ( const char*, osrfHash*, const jsonObject*, const char* );
 static char* searchPredicate ( const char*, osrfHash*, jsonObject* );
@@ -181,7 +180,7 @@
 
 		int i = 0; 
 		char* method_type;
-		char* st_tmp;
+		char* st_tmp = NULL;
 		char* _fm;
 		char* part;
 		osrfHash* method_meta;
@@ -649,12 +648,12 @@
 		obj = doFieldmapperSearch(ctx, class_obj, ctx->params, &err);
 		if(err) return err;
 
-		jsonObjectNode* cur;
-		jsonObjectIterator* itr = jsonNewObjectIterator( obj );
-		while ((cur = jsonObjectIteratorNext( itr ))) {
-			osrfAppRespond( ctx, cur->item );
+		jsonObject* cur;
+		jsonIterator* itr = jsonNewIterator( obj );
+