Oops, the mail client bungled my code. You can download it as an
application here:
http://code.xwiki.org/xwiki/bin/download/Applications/LargeExportBySpaceApplicationDownloads/Main.largeExportBySpace.xar

Caleb James DeLisle wrote:
> I modified the large export script to support individual spaces. With my
> modded version you can also export to a directory rather than zip
> compressing the export (you can then zip the contents of the dir
> manually); hopefully skipping the zip compression will save enough RAM
> to avoid running out of heap space.
>
> You say you have a 1 GB database: are there lots of big attachments, or
> just a ridiculous number of pages? It might be that one attachment is so
> big that it breaks the export engine.
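>
> If you want to rule out the one-huge-attachment case first, something like
> this pasted into a wiki page should list the biggest attachments (an
> untested sketch, the HQL mapping is from memory, and it needs programming
> rights):
>
> <%
> // List the ten largest attachments (size in bytes) so anything big enough
> // to break the export stands out.
> def rows = xwiki.search(
>     "select doc.fullName, attach.filename, attach.filesize " +
>     "from XWikiAttachment attach, XWikiDocument doc " +
>     "where attach.docId = doc.id order by attach.filesize desc", 10, 0);
> for (row in rows) {
>     println(row[0] + " / " + row[1] + " : " + row[2] + " bytes<br/>");
> }
> %>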
>
> How long does it take to run out of memory? If it takes a long time, then
> it is probably just too much data, so my script might fix it.
> If it's immediate, you are probably almost out of RAM already: the
> HSQLDB database (which comes with the package) shares RAM with XWiki,
> and HSQL also seems to load the entire database into RAM, so you will
> need more than -Xmx1024m.
> You can specify more memory with -Xmx than you actually have as physical
> RAM, because the JVM can use swap space.
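>
> To see how much heap the wiki actually gets, you can paste a quick check
> like this into a page (just a sketch, values rounded down to MB):
>
> <%
> // Print the JVM heap limits as the wiki sees them.
> def rt = Runtime.getRuntime();
> def mb = 1024 * 1024;
> println("max heap: " + rt.maxMemory().intdiv(mb) + " MB<br/>");
> println("allocated: " + rt.totalMemory().intdiv(mb) + " MB<br/>");
> println("free: " + rt.freeMemory().intdiv(mb) + " MB<br/>");
> %>
>
> If the max is much lower than what you passed with -Xmx, the setting is
> probably not reaching the JVM that actually runs XWiki.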
>
> Here is my modified export script; I'll post it to the code zone after I
> decide whether it needs any more features.
>
>
> 1.1 Large Export
>
> <%
>
> import com.xpn.xwiki.*;
> import com.xpn.xwiki.doc.*;
> import com.xpn.xwiki.plugin.packaging.*;
> import java.util.zip.*;
> import com.xpn.xwiki.util.Util;
>
> def getXAR(String filename, XWikiContext context) {
>     def request = context.getRequest();
>     // The package plugin builds the XAR; configure it as a backup package.
>     def export = context.getWiki().getPluginApi("package", context);
>     List<String> spaces = request.getParameterMap().get("spaces");
>     export.setWithVersions(true);
>     export.setAuthorName("XWiki.Admin");
>     export.setDescription("");
>     export.setLicence("");
>     export.setVersion("");
>     export.setBackupPack(true);
>     export.setName("backup");
>     def pack = export.getPackage();
>     // pack.addAllWikiDocuments(context); // previous behaviour: export the whole wiki
>     // Add every document of each selected space to the package.
>     for (String space : spaces) {
>         for (String docName : context.getWiki().getSpaceDocsName(space, context)) {
>             pack.add(docName, com.xpn.xwiki.plugin.packaging.DocumentInfo.ACTION_OVERWRITE, context);
>         }
>     }
>     if (request.dir) {
>         // Write plain files into a directory, skipping zip compression.
>         pack.exportToDir(new File(filename), context);
>     } else {
>         // Write a single zip-compressed XAR file.
>         pack.export(new FileOutputStream(new File(filename)), context);
>     }
> }
>
> if (request.filename) {
>     getXAR(request.filename, context.getContext())
> } else {
> %>
> <form action="" method="post">
> <table border="0">
> <tr>
> <td>File/directory to write to:</td><td><input type="text" name="filename" size="60" /></td>
> </tr>
> </table>
> <%
> for (String space : xwiki.getSpaces()) {
>     println(space + "<input type=\"checkbox\" name=\"spaces\" value=\"" + space + "\"/>\n");
> }
> %>
> Don't zip files, output to directory <input type="checkbox" name="dir"/>
> <br/>
> <input type="submit" name="Export" />
> </form>
> <%
> }
> %>
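>
> To use it, import the packaged xar linked above (which should install the
> page as Main.largeExportBySpace) or paste the code into a page of your own,
> then open the page, tick the spaces you want, enter a path the servlet
> container can write to, and hit Export. You can also pass the parameters
> straight in the URL, something like
> http://yourserver:8080/xwiki/bin/view/Main/largeExportBySpace?filename=/tmp/backup.xar&spaces=Main&spaces=Sandbox
> (add &dir=1 for the directory export; the host, filename and space names
> here are only examples).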
>
> [Ricardo Rodriguez] Your EPEC Network ICT Team wrote:
>   
>> Hi Francisco!
>>
>> Hernández Cuchí wrote:
>>   
>>     
>>> Hi, 
>>>
>>> That is not really what I want. My problem is that I have an XWiki 1.7.14
>>> and I want to move to a 2.0 milestone 2 installation on another machine. I
>>> get a lot of corrupted XAR files (I cannot open them with 7-Zip) because of
>>> Java heap memory exceptions (even for spaces of only one page!). So I need
>>> a different way of exporting the data, because out of my 40 spaces, 10 XAR
>>> files come out corrupted. I even wrote my own snippet to export the spaces
>>> selectively, but I still get corrupted XAR files. So, is it possible to
>>> back up the database and import it into the new installation?
>>>   
>>>     
>>>       
>> Hopefully I will face a similar workflow in the coming weeks, so let's
>> see if I am able to help here :-)
>>
>> As far as I understand, you have two XWiki installations, each on a
>> different box. What you are looking for is to move the whole 1.7.14
>> database to a brand new 2.0m2 instance on a different box.
>>
>> If this is what you want, and although it does not solve the question of
>> why the exports get corrupted, you can export the whole XWiki database
>> schema and reimport it into the new database installation (if there is one
>> on the same box where you have installed the new XWiki instance); provided
>> you have correctly configured the username and password in the
>> hibernate.cfg.xml file, it should work.
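>>
>> With MySQL, for instance, that would be something like running mysqldump
>> on the XWiki schema on the old box and feeding the resulting dump into
>> mysql on the new one (assuming the schema is called "xwiki" and the
>> credentials match what you have in hibernate.cfg.xml).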
>>
>> The first time the new XWiki instance starts, the required database
>> schema updates will be performed by the system. I've done that several
>> times in the past without any problem working with MySQL databases.
>> There have been issues related to schema updates in early releases,
>> but I've not read anything about them lately. In fact, it is a similar
>> process to installing a new XWiki from scratch, just against an existing
>> database: if XWiki finds a database, it will use it instead of creating
>> a new one.
>>
>> Of course, modifications to Velocity templates, skins, etc. that are not
>> stored in the database must be migrated separately. Also, as with a
>> regular upgrade, you must be extremely cautious when importing the default
>> XAR file for the new release.
>>
>> HTH,
>>
>> Ricardo
>>
>>   
>>     
>

_______________________________________________
users mailing list
users@xwiki.org
http://lists.xwiki.org/mailman/listinfo/users
