Re: [xwiki-users] Ajax.Query and cross-site AJAX requests?

2013-01-26 Thread xwiki.mexon
Yeah, I guess that makes sense.  Thanks for the suggestions.  In fact I 
should be able to bring all my services under one domain so I'll just do 
that.  It was a misguided attempt to improve security by keeping my 
servers separate that brought me here in the first place!


On 2013-01-26 23:51, Jerome Velociter - jer...@velociter.fr wrote:

Hi,

Indeed you are hitting the standard "same origin policy".

You have three possibilities to circumvent it:

* With JSONP requests - if the server supports them, and only for GET 
requests [1]

* With CORS/pre-flight requests - if the server supports them [2]
* With a proxy (for example a page on your wiki) that performs the GET 
or POST, and you hit the proxy with your Ajax requests.


Hope this helps,
Jerome

[1] http://en.wikipedia.org/wiki/JSONP
[2] http://en.wikipedia.org/wiki/Cross-origin_resource_sharing
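The first option above can be sketched as follows. This is a minimal illustration, assuming the remote endpoint actually supports JSONP via a `callback` query parameter (the endpoint URL and parameter names here are made up for the example):

```javascript
// Minimal JSONP sketch. The 'callback' parameter name is an assumption:
// the remote server must support JSONP and wrap its JSON response in a
// call to the named function. JSONP only works for GET requests.
function buildJsonpUrl(endpoint, callbackName, params) {
  var pairs = ['callback=' + encodeURIComponent(callbackName)];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
    }
  }
  return endpoint + '?' + pairs.join('&');
}

// In the browser, JSONP works by injecting a <script> tag, which is
// exempt from the same-origin policy.
function jsonpRequest(endpoint, params, callback) {
  var cbName = 'jsonp_cb_' + Math.floor(Math.random() * 1e9);
  window[cbName] = function (data) {
    delete window[cbName];
    callback(data);
  };
  var script = document.createElement('script');
  script.src = buildJsonpUrl(endpoint, cbName, params);
  document.head.appendChild(script);
}
```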


On 26/01/13 05:03, xwiki.me...@spamgourmet.com wrote:

Hi,

I want one of my pages to make a post to another site and insert the 
results into its page.  Right now I've got a JavaScriptExtension that 
looks like:


function doquery() {
    // Cross-site POST with Prototype's Ajax.Request; because the target
    // is on another origin, the browser pre-flights it with OPTIONS.
    new Ajax.Request('http://mat.exon.name/test.php', {
        method: 'post',
        parameters: {
            'arg': document.getElementById('thearg').value
        }
    });
    return false;
}

I find that this does an OPTIONS request, but not the intended POST.  
If I change the URL to a local page, the POST goes through as 
intended.  Am I tripping up over some kind of XSS defense, and is 
there some way to turn it off?
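The OPTIONS request described above is the CORS pre-flight: the browser sends it before a cross-origin POST and only proceeds if the remote server answers with the right headers. A hypothetical sketch, written here as a Node.js handler purely for illustration (the allowed origin is an example and must match the wiki page's origin):

```javascript
// Hypothetical sketch of the CORS headers the remote server would need
// to send for the cross-site POST to be allowed. The origin value is an
// example, not a real deployment.
var ALLOWED_ORIGIN = 'http://your-wiki.example.com';

function corsHandler(req, res) {
  res.setHeader('Access-Control-Allow-Origin', ALLOWED_ORIGIN);
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
  // Prototype's Ajax.Request adds custom headers such as X-Requested-With,
  // which is one of the things that triggers the pre-flight.
  res.setHeader('Access-Control-Allow-Headers',
                'Content-Type, X-Requested-With, X-Prototype-Version');
  if (req.method === 'OPTIONS') {
    // Answer the pre-flight with no body; the browser then sends the POST.
    res.writeHead(204);
    res.end();
    return;
  }
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('ok');
}

// With Node's http module this would be wired up as:
//   require('http').createServer(corsHandler).listen(8080);
```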


___
users mailing list
users@xwiki.org
http://lists.xwiki.org/mailman/listinfo/users




Re: [xwiki-users] Images not resized server side with query string

2013-01-26 Thread Jan-Philip Loos
Hi there,

I just updated XWiki from 4.4 to 4.4.1 and it's now perfect.

For others running into this problem: if you upgrade, double-check the
browser cache; in Chrome, Shift+F5 didn't clear it for me.

Thanks a lot for the answer

Jan



--
View this message in context: 
http://xwiki.475771.n2.nabble.com/Images-not-resized-server-side-with-query-string-tp7583567p7583573.html
Sent from the XWiki-Users mailing list archive at Nabble.com.




Re: [xwiki-users] off-line-Version of XWiki

2013-01-26 Thread Ashtar Communications
Thanks for the lead - After a quick initial test, it looks like TPP will do
exactly what I need. Much easier to use than Wget, and the pages are
actually working after download.

Best,

aaron


On Sat, Jan 26, 2013 at 2:17 AM, Roman Muntyanu wrote:

> You could try something like
> http://www.tenmax.com/teleport/pro/home.htm
> Just make sure the user has read-only rights, because TPP clicks all the
> links :)
>
> -Original Message-
> From: users-boun...@xwiki.org [mailto:users-boun...@xwiki.org] On Behalf
> Of Ashtar Communications
> Sent: Saturday, January 26, 2013 05:11 AM
> To: XWiki Users
> Subject: Re: [xwiki-users] off-line-Version of XWiki
>
> Stephanie,
>
> I'm also very interested in anything you come up with, or whether anyone
> has additional thoughts on making an "offline" copy. I'm not sure whether
> my situation is similar to yours or not, but I'll explain a few things I've
> tried so far in case it helps either of us.
>
> Short version - has anyone successfully used Wget to mirror an XWiki
> instance?
>
> What I ultimately would like is an offline archive of an XWiki instance
> that is totally independent of needing a servlet container or database
> (even standalone).
>
> I understand Arnaud's suggestion to use a standalone XWiki instance to
> create an offline backup, but unfortunately that is too complicated for
> most of my users to access.
>
> In my circumstance, I have an XWiki instance that needs to "reset" at the
> beginning of each year, and then I need to keep an archive of each full
> year's worth of contributions to the wiki. So I end up with a separate
> XWiki database for each year. This is quickly becoming cumbersome and
> eating up a lot of server resources to keep all of them online.
>
> I would like for an average user to be able to download an "archive" of a
> particular year's wiki instance so I no longer need to host it "live."
>
> One other consideration - almost all of the page content in each wiki
> instance is stored in objects attached to each page that are then retrieved
> with velocity and formatted with javascript. And most pages have a large
> number of attachments.
>
> Things I have tried:
>
> 1) HTML/PDF export - Like Stephanie, this doesn't work for me since it
> doesn't maintain navigation or scripting.
>
> 2) Standalone XWiki instance - this has proved just too complicated for my
> users. I'd prefer some type of archive in a flat file/HTML format if at all
> possible.
>
> 3) Wget - This seems to be the most promising option so far, since it's
> supposed to make a totally offline recursive mirror of the site. My
> attempts so far have been mixed - I can get some of the page content to
> download, but struggle with getting a completely working copy. I've also
> tried a few other "offline archiver" type programs, but none have worked
> better than Wget. If someone has successfully used Wget to mirror an XWiki
> site, I'd love to hear about it.
>
> Any other ideas?
>
> Thanks,
>
> aaron
>
>
>
> On Fri, Jan 25, 2013 at 8:53 AM, Arnaud Bourree wrote:
>
> > Hello,
> >
> > Off-line: XWiki already runs off-line: you don't need internet access
> > to run it, except for some connected extensions.
> > Beyond off-line, you may want a portable XWiki instance you can put on
> > a USB pen-drive.
> > IMO, the standalone XWiki is ready for that.
> > OK, so you want to put a copy of your XWiki server on your pen-drive:
> >  - make your standalone XWiki read-only so it does not resynchronize
> > back to your server
> >  - after dumping your server database, either convert it to HSQLDB, or
> > write an event-listener extension to propagate page updates from the
> > server to the pen-drive. The database conversion looks easier to do.
> >
> > Regards,
> >
> > Arnaud.
> >
> > 2013/1/23  :
> > > Hello again!
> > >
> > > I would like to make an XWiki instance accessible off-line. I tried
> > > to export everything as HTML, with rather bad results, as the
> > > navigation and the scripting aren't exported. So the idea was to
> > > somehow export XWiki to some standalone version. Has anyone ever done
> > > something like this and wouldn't mind sharing his/her insights?
> > >
> > > Thanks for your help,
> > >
> > > Stephanie
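For the record, a wget invocation along the following lines is the usual starting point for mirroring a wiki. The URL is a placeholder, and content generated dynamically by Velocity or JavaScript will only be captured as the rendered HTML of each page, which may explain the mixed results reported above:

```shell
# Recursive offline mirror of a site with wget; the URL is a placeholder.
# --mirror          : recursion + timestamping, suitable for mirroring
# --convert-links   : rewrite links in saved pages so they work locally
# --page-requisites : also fetch the CSS, JS, and images each page needs
# --adjust-extension: save pages with .html extensions so browsers open them
# --no-parent       : don't climb above the starting directory
# --wait=1          : be polite to the server
wget --mirror --convert-links --page-requisites --adjust-extension \
     --no-parent --wait=1 http://your-wiki.example.com/xwiki/bin/view/Main/
```

Restricting the crawl with --no-parent (or --reject patterns for edit/history URLs) also avoids the "clicks all the links" problem mentioned for Teleport Pro.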


[xwiki-users] Getting the last blog via REST

2013-01-26 Thread Grüner Heinrich

Hi,

I want to display the latest blog post on another website.
I suspect I can use the REST interface, but I have no idea how to query it.

Any help appreciated.
Thanks,
Stefan.
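A sketch of one way to query it, assuming a standard XWiki setup where blog posts live in the `Blog` space (the host, wiki name, and space here are assumptions; the `/rest/wikis/{wiki}/spaces/{space}/pages` resource and the `media=json` parameter come from the XWiki REST API):

```javascript
// Build the XWiki REST URL listing the pages of a space; blog posts
// typically live in the "Blog" space. Host and wiki name are examples.
function buildPagesUrl(base, wiki, space) {
  return base + '/rest/wikis/' + encodeURIComponent(wiki) +
         '/spaces/' + encodeURIComponent(space) +
         '/pages?media=json';
}

// In a browser (or Node with a fetch implementation) one could then do:
//   fetch(buildPagesUrl('http://your-server/xwiki', 'xwiki', 'Blog'))
//     .then(function (r) { return r.json(); })
//     .then(function (data) {
//       // Page summaries carry name/title/modification date; sort by the
//       // date field to find the most recent post (exact field names per
//       // the XWiki REST schema for your version).
//       console.log(data.pageSummaries);
//     });
```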


Re: [xwiki-users] Images not resized server side with query string

2013-01-26 Thread Vincent Massol
Hi,

There was a bug on 4.4, which is fixed in 4.4.1:
http://jira.xwiki.org/browse/XWIKI-8663

Thanks
-Vincent

On Jan 26, 2013, at 8:02 AM, Jan-Philip Loos  wrote:

> Hi,
> 
> currently I'm a bit clueless. When I use, for example,
> [[image:XWikiLogo.png||width="50" height="50"]], the image is only resized
> via the style attribute, but the transferred image is not resized. When I
> inspect the image in Chrome, the "natural size" is still the original size.
> 
> The HTML result:
> 
>  
> 
> We are using XWiki 4.4 (XEM)
> The document is saved with XWiki Syntax 2.1 (I tried 2.0 also)
> in the xwiki.properties I set explicitly
> rendering.imageDimensionsIncludedInImageURL = true (but this should be the
> default)
> 
> Incidentally, is there a wiki-wide way to override the behaviour?