[Wikitech-l] I wish I would understand the mediawiki architecture and components

2020-01-29 Thread Jefsey

Gentlemen,
I run around 290 small thematic citizen-research wikis (the number is 
still growing) under an old MediaWiki version (I fear the upgrade 
hassle). In order to simplify their set-up I systematized them, using 
a script to build the symlinked directories from a unique central one, 
so I only have to create the LocalSettings.php, the images directory 
(mainly for the wiki-logo GIF particular to each site), and enter the 
templates manually. To be sure I can move them around without too 
much pain and keep each under its own password, I use SQLite. 
Around 20 minutes of set-up each.
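The symlink-based set-up described above can be sketched roughly as follows. This is a hypothetical illustration, not the author's actual script; the directory names, the list of shared entries, and the function name are all made up for the example:

```python
import os

def create_wiki(core_dir, site_dir,
                shared_entries=("includes", "skins", "extensions",
                                "index.php", "api.php")):
    """Create a per-site wiki directory that symlinks the shared MediaWiki
    core, leaving LocalSettings.php and images/ local to the site."""
    # per-site images directory (wiki logo, uploads)
    os.makedirs(os.path.join(site_dir, "images"), exist_ok=True)
    # symlink the shared core entries into the site directory
    for entry in shared_entries:
        target = os.path.join(core_dir, entry)
        link = os.path.join(site_dir, entry)
        if not os.path.lexists(link):
            os.symlink(target, link)
    # LocalSettings.php stays a real, per-site file (database name, logo, etc.)
    settings = os.path.join(site_dir, "LocalSettings.php")
    if not os.path.exists(settings):
        open(settings, "w").close()
```

With a layout like this, upgrading the central core updates every site at once, while each site keeps its own settings, logo and SQLite file.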


With a friend, we would like to transfer all this to MySQL (or 
MariaDB?) in order to share templates and the wiki database, possibly 
across several machines, possibly developing some extensions in the 
medium term, and possibly transferring further on to another database 
system (to mix different kinds of entries and mail entries). I feel we 
would first need to study a conceptual block map of the MediaWiki 
architecture, its internal exchanges and database requests. Does that 
exist?


Also, in order to manage the whole thing advisably, I would need two tips:
1. Is there a secure/reliable method/extension to protect pages on a 
per-page basis?

2. How can I get a daily access count of the wiki pages?
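On question 2: in MediaWiki releases before 1.25 (which would match an old installation), the page table carried a page_counter column with cumulative view counts, so a daily figure can be approximated by sampling it once a day and diffing against the previous sample. A sketch against one of the SQLite files; the database path and the daily-diff idea are assumptions for illustration, and this only applies while page_counter still exists:

```python
import sqlite3

def top_viewed(db_path, limit=10):
    """Read cumulative view counts from the pre-1.25 page.page_counter
    column. Sampling this daily and subtracting yesterday's numbers
    gives an approximate per-day access count."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT page_title, page_counter FROM page "
        "ORDER BY page_counter DESC LIMIT ?", (limit,)).fetchall()
    con.close()
    return rows
```

On newer versions the counter is gone and the counting has to happen at the web-server log level instead.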

Thank you !
jfc


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SQLite Mediawiki support

2016-09-27 Thread Jefsey

At 18:42 26/09/2016, Aran wrote:
I developed a MediaWikiLite system many years ago which worked 
reasonably well. It was for having a wiki on a memory stick that 
included the content and the ability to edit it in the field without net 
access.


Yes. Also easy back-up and replication, with extension bots around, and 
then the various wikilites being networked.



It ran on SQLite and used Nanoweb, a web server written in PHP, to
reduce dependencies further.

It's very out of date now, but may be helpful:

https://www.organicdesign.co.nz/MediaWikiLite


Thanks. I had a read and will come back to it, as I will probably set up 
a working group.

Best
jfc




On 26/09/16 11:00, Jefsey wrote:
> The personal way I am using wikimedia as an SQLite textbase I can
> easily copy/backup from machine to machine and modify through
> external bundled applications leads me to consider there is a need for
> a wikimedia user 100% compatible "WIKILite" integrated/maintained
> solution set.
>
> 1. has something like that been investigated in the past?
> 2. I would be interested in comments on the idea?
> 3. also about the approach that can best help users and possibly
> wikimedia development?
>
> I note that as a networked individual/professional I am interested in
> multi-agent oriented interwares and would like to investigate
> "wikilite" networking capabilities (both about what networked
> architectures could bring, and about capacity-based content
> security/protection).
>
> Thank you for your attention.
> Best
> jfc



Re: [Wikitech-l] SQLite Mediawiki support

2016-09-27 Thread Jefsey

At 00:36 27/09/2016, Brian Wolff wrote:
MediaWiki supports SQLite. And you can pretty much use it with any web 
server -

so are you basically asking for an installer? Some sort of XULRunner-esque
thing so the interface doesn't look like a web browser?


I did not know about XULRunner etc., so some parts could be related. 
Actually, the need I have is quite the opposite of MediaWiki's, but with 
the same tools: not to support a community with common knowledge, but to 
support a single user or a small group of users/authors, with sustainable 
knowledge, i.e. texts I can constantly augment and update, so I can 
structure my "mneme" and build upon it, for me and for others. The key 
issue seems to be the individual mneme (wikilite) versus the community 
mneme (Wikipedia).


I need the Wikipedia-proven tools and practices in the form of a 
compact and robust system I can rely upon and run on my different 
machines through my Dropbox directory (the way I currently use the 
same wiki on three machines). However, the difficulties are:


1. MediaWiki is not documented from that perspective for technically 
agnostic end-users.

2. Some of the PHP tools are not complete for SQLite.
3. Installation of a new wikilite is still complex and long; I would 
like to be able to install 30 different ones in one minute for a 
group of students, with pre-entered pages, forms, docs, mailing lists, etc.
4. A review of extensions from that perspective, introduced as a 
perpetual off-the-shelf part of the experience.
5. Farming management and updates (I run around 200 wikilites using 
symlinks to the current release, each in its own directory).
6. I would like to develop specialized bots for content management 
and interfacing, etc., that do not necessarily make sense in broad uses.

etc.
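The specialized bots mentioned in point 6 would typically talk to MediaWiki's action API (api.php). A minimal stdlib-only sketch of building and issuing such a request; the wiki URL is a placeholder, the function names are invented for the example, and no authentication is shown:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def build_query_url(api_url, titles):
    """Build an action=query URL asking for basic info on the given pages."""
    params = {
        "action": "query",
        "titles": "|".join(titles),  # the API takes pipe-separated titles
        "prop": "info",
        "format": "json",
    }
    return api_url + "?" + urlencode(params)

def fetch_page_info(api_url, titles):
    """Fetch and decode the API response (network access required)."""
    with urlopen(build_query_url(api_url, titles)) as resp:
        return json.load(resp)
```

A content-management bot would loop over such queries and then use action=edit (with a login token) to write pages back.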

What I would like to get is end-user oriented documentation (extensions 
and bots included) that anyone can read, understand and use. Yes, an 
installation and maintenance tool, with various types of use 
configurations. Then clear documentation of the maintenance and 
extension tools, with quality control and support. To become an 
end-user textbase++ system, consultants should be available and 
turn-key deliveries available (at home and in the cloud).


Thx for suggestions.
jefsey



--
Brian

On Monday, September 26, 2016, Jefsey <jef...@jefsey.com> wrote:
> The personal way I am using wikimedia as an SQLite textbase I can easily
> copy/backup from machine to machine and modify through external bundled
> applications leads me to consider there is a need for a wikimedia user 100%
> compatible "WIKILite" integrated/maintained solution set.
>
> 1. has something like that been investigated in the past?
> 2. I would be interested in comments on the idea?
> 3. also about the approach that can best help users and possibly
> wikimedia development?
> I note that as a networked individual/professional I am interested in
> multi-agent oriented interwares and would like to investigate "wikilite"
> networking capabilities (both about what networked architectures could
> bring, and about capacity-based content security/protection).
>
> Thank you for your attention.
> Best
> jfc

[Wikitech-l] SQLite Mediawiki support

2016-09-26 Thread Jefsey
The personal way I am using wikimedia as an SQLite textbase I can 
easily copy/backup from machine to machine and modify through 
external bundled applications leads me to consider there is a need 
for a 100% wikimedia-compatible "WIKILite" integrated/maintained 
solution set for users.


1. Has something like that been investigated in the past?
2. I would be interested in comments on the idea.
3. Also, what approach can best help users and possibly 
wikimedia development?


I note that as a networked individual/professional I am interested 
in multi-agent oriented interwares and would like to investigate 
"wikilite" networking capabilities (both about what networked 
architectures could bring, and about capacity-based content 
security/protection).


Thank you for your attention.
Best
jfc
 




Re: [Wikitech-l] Reload SQLite from MySQL dump?

2016-08-02 Thread Jefsey

At 02:44 02/08/2016, John wrote:
For mass imports, use importDump.php - see 
http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps 
for details.


Dear John,

Thank you for your response. But I am a bit dumb myself, and I am still lost.

I am not sure I made my need clear enough. I had wikis under MySQL 
that I dumped to XML. So I have n XML files from which to create n 
separate SQLite wikis: one file, one SQLite wiki (n is around 35).


When I try to read these files with an XML viewer, it tells me that 
some character at line 295 or so is wrong and it cannot read them, so 
I do not really know how they are structured.
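One way to locate the offending character the XML viewer complains about is to scan the dump for characters that XML 1.0 forbids (control characters below 0x20 other than tab, newline and carriage return). A small stdlib sketch; the function name is invented for the example:

```python
def find_invalid_xml_chars(path, encoding="utf-8"):
    """Return (line_number, column, char) tuples for characters that are
    not allowed in XML 1.0 (control chars other than tab, LF, CR)."""
    bad = []
    with open(path, encoding=encoding, errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            for col, ch in enumerate(line, start=1):
                if ord(ch) < 0x20 and ch not in "\t\n\r":
                    bad.append((lineno, col, ch))
    return bad
```

Once the bad bytes are identified (often stray control characters from old database content), they can be stripped and the dump re-validated before import.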


Therefore, my question is how to create the n SQLite wikis I need 
and import into each one the pages that were dumped from MySQL as XML.
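For the n-wikis import itself, the usual route is one fresh SQLite wiki per dump, then maintenance/importDump.php with --conf pointing at each wiki's LocalSettings.php. A hypothetical helper that only builds the commands to run (the directory layout and pairing-by-filename convention are made up for the example; it does not execute anything):

```python
import os

def build_import_commands(dumps_dir, wikis_dir):
    """Pair each XML dump with the wiki directory of the same base name
    and emit the corresponding importDump.php invocation."""
    commands = []
    for name in sorted(os.listdir(dumps_dir)):
        if not name.endswith(".xml"):
            continue
        wiki = os.path.join(wikis_dir, name[:-4])  # foo.xml -> wikis/foo
        commands.append([
            "php", os.path.join(wiki, "maintenance", "importDump.php"),
            "--conf", os.path.join(wiki, "LocalSettings.php"),
            os.path.join(dumps_dir, name),
        ])
    return commands
```

Each command list could then be passed to subprocess.run, one wiki at a time.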


Also, further on, I understand that the manual covers MySQL 
dumps, not SQLite dumps? Does that mean that I should handle 
their backup with SQLite tools rather than MediaWiki tools? 
Hence my question about a MediaWiki SQLite section/mailing list.


Sorry, I am pretty new to this and need to take care of a lot of small 
Libre/Citizen/locally oriented wikis, with more to come. So I am trying 
to figure out the best way to manage all this.


Thank you !

jfc


On Mon, Aug 1, 2016 at 8:37 PM, Jefsey <jef...@jefsey.com> wrote:
I am not familiar with databases. I have old MySQL-based wiki sites 
I cannot access anymore due to a change in PHP and MySQL versions. I 
have old XML dumps. Is it possible to reload them into SQLite 
wikis? These were working-group wikis: we are only interested in 
restoring the texts. We have the images. We are not interested in the 
access rights: we will have to rebuild them anyway.


Thank you for the help !
jefsey

PS. We are dedicated to light wikis, which are fine under SQLite. Would 
there be a list dedicated to SQLite management (and, further on, development)?



[Wikitech-l] Reload SQLite from MySQL dump?

2016-08-01 Thread Jefsey
I am not familiar with databases. I have old MySQL-based wiki sites 
I cannot access anymore due to a change in PHP and MySQL versions. I 
have old XML dumps. Is it possible to reload them into SQLite 
wikis? These were working-group wikis: we are only interested in 
restoring the texts. We have the images. We are not interested in the 
access rights: we will have to rebuild them anyway.


Thank you for the help !
jefsey

PS. We are dedicated to light wikis, which are fine under SQLite. Would 
there be a list dedicated to SQLite management (and, further on, development)?




[Wikitech-l] maintenance.php files in symlinks context problem

2015-01-08 Thread Jefsey

Hi!
I have access as the system root.

I have installed 1.23 in /wiki.
I have symlinked it into my /home/wiki-site/www directories. It works 
without problem. However, when I try


 php maintenance/sqlite.php (or other .php files) --conf 
/home/wiki-site/www/LocalSettings.php (or without this --conf addition)


I get the same response:

Warning: require_once(__DIR__/Maintenance.php): failed to open 
stream: No such file or directory in /wiki/maintenance/sqlite.php on line 24
Fatal error: require_once(): Failed opening required 
'__DIR__/Maintenance.php' (include_path='.:/usr/local/php5/lib/php') 
in /wiki/maintenance/sqlite.php on line 24
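A possible diagnosis, which is an assumption rather than something confirmed in the thread: the error shows the literal string '__DIR__', meaning PHP did not expand the magic constant, which suggests the command-line PHP predates 5.3 (where __DIR__ was introduced); MediaWiki 1.23 requires PHP 5.3.2 or later. A small sketch for checking the CLI PHP version (the function names are invented for the example):

```python
import re
import subprocess

def parse_php_version(version_output):
    """Extract (major, minor, patch) from `php -v` output."""
    match = re.search(r"PHP (\d+)\.(\d+)\.(\d+)", version_output)
    if not match:
        raise RuntimeError("could not parse `php -v` output")
    return tuple(map(int, match.groups()))

def cli_php_is_at_least(minimum=(5, 3, 2)):
    """True if the command-line `php` is new enough for __DIR__
    (and for MediaWiki 1.23's stated requirement)."""
    out = subprocess.run(["php", "-v"], capture_output=True, text=True).stdout
    return parse_php_version(out) >= minimum
```

Note that the web server and the CLI can run different PHP binaries, which would explain a wiki that works in the browser while the maintenance scripts fail.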


-

I also installed some wikis without symlinks.
In such a case, if I enter

 php maintenance/dumpBackup.php --full  .xml

I get:
Access denied for user 'root'@'localhost' (using password: NO)

The same if I enter
 php maintenance/dumpBackup.php --dbuser user --dbpass 
pass  --full  .xml


Thank you for the help.
jefsey

