I finally had the time to walk through a complete test using the production
code I will use to actually consume the real-time pubsub updates, and it
works flawlessly. It will enable me to get content updates in real time, and
only the updates, hence no need to repeatedly download whole feeds in
On Sat, Dec 19, 2009 at 05:31:24PM +0100, Henrik Sarvell wrote:
wonder about is the part where a normal request would be split and
its key = value pairs would be parsed. Is that part enough, or is there
something else happening in the big/original flow?
I would check for the initial POST, so that
In this special case I don't need to extract URL parameters, just the
POST contents in the case of verification.
So I'll incorporate the POST match and give it a try.
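For reference, the hub's verification request carries its parameters in the query string: per the PubSubHubbub spec these are hub.mode, hub.topic and hub.challenge, and the subscriber confirms by echoing hub.challenge back with a 200. A small Python sketch of just that check (Python only for illustration; the function name and example topic are made up):

```python
from urllib.parse import parse_qs, urlsplit

def verification_challenge(request_url, expected_topic):
    """Return the hub.challenge string to echo back, or None to refuse."""
    params = parse_qs(urlsplit(request_url).query)
    if params.get("hub.mode", [None])[0] not in ("subscribe", "unsubscribe"):
        return None                      # not a verification request
    if params.get("hub.topic", [None])[0] != expected_topic:
        return None                      # verification for a feed we don't know
    return params.get("hub.challenge", [None])[0]

url = "/pubsub?hub.mode=subscribe&hub.topic=http://example.org/feed&hub.challenge=abc123"
print(verification_challenge(url, "http://example.org/feed"))  # abc123
```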
On Sun, Dec 20, 2009 at 8:52 AM, Alexander Burger a...@software-lab.de wrote:
On Sat, Dec 19, 2009 at 05:31:24PM +0100, Henrik Sarvell wrote:
Apparently it was a GET request but I managed to parse it.
The problem now is that the prints seem to be on the wrong
channel; before, when I was using the normal http etc. in the @pubsub
function, I didn't get headers printed to my terminal, for instance,
which I'm getting now. If I only can
On Sun, Dec 20, 2009 at 01:59:59PM +0100, Henrik Sarvell wrote:
Apparently it was a GET request but I managed to parse it.
...
(while (setq L (line))
   (cond
      ((match '("G" "E" "T" " " @U " " "H" "T" "T" "P" "/" "1" "." @Y) L)
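The match pattern above takes the request line apart character by character. For comparison, the same split written in Python (the HTTP/1.x sanity check is my addition):

```python
def parse_request_line(line):
    """Split a request line like 'GET /path HTTP/1.1' into its three parts."""
    method, path, version = line.strip().split(" ", 2)
    if not version.startswith("HTTP/1."):
        raise ValueError("unexpected version: " + version)
    return method, path, version

print(parse_request_line("GET /pubsub?hub.mode=subscribe HTTP/1.1"))
# ('GET', '/pubsub?hub.mode=subscribe', 'HTTP/1.1')
```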
Hi Henrik,
So I made a change that would prevent (throw http) from being called, and
everything worked.
Great!
...
(if *Xml
   (setq *Xml (ht:Read *ContLen))
   (cond
On Sat, Dec 19, 2009 at 10:05:24AM +0100, Alexander Burger wrote:
1. Fork for each transaction:

   (task (port 4000)                    # Background task listening on port 4000
      (when (setq Sock (accept @))      # Accept a connection
         (unless (fork)                 # Child process
            (task P)
So I've come up with the below:
(de go ()
   (rollback)
   (task (port 4000)                    # Background task listening on port 4000
      (let? Sock (accept @)             # Accept a connection
         (unless (fork)                 # Child process
            (in Sock
               (off *Xml *ContLen)
               (use (L @X @Y)
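The same fork-per-connection shape in Python, to make the structure explicit: the parent loops on accept, forks a child per connection, and each child handles one request and exits. The port choice and the echo handler are placeholders, not from the thread:

```python
import os
import socket

def serve_once(srv):
    """Accept one connection and fork a child to handle it."""
    sock, _ = srv.accept()
    if os.fork() == 0:            # child: handle this request alone
        data = sock.recv(1024)    # read the request
        sock.sendall(data)        # placeholder handler: echo it back
        sock.close()
        os._exit(0)
    sock.close()                  # parent: drop its copy, go back to accept

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))        # 0 = any free port; the thread used 4000
srv.listen(5)
port = srv.getsockname()[1]
```

In a real server serve_once would run in a loop (or, as in the PicoLisp version, from a background task), and the child would parse the request instead of echoing it.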
After I sent the prior mail to the list, Alex and I had a discussion on
IRC and we realized that there were two problems:
1.) In the interaction in this thread Alex assumed I was running a
simple dedicated process with its own simple http function. I wasn't,
though; I was testing with the full
I've changed the whole http function to:
(de http (S)
   (setq *Xml
      (make
         (in S
            (until (eof)
               (link (line T)) ) ) ) )
   (and S (close S) (task S)) )
I'm not even trying to do anything with the headers. However, with an
http function looking like this I'm not even able to
Good idea, the explicit linking; it shows me I'm actually getting the
body. However, somewhere later on I'm stalling. I keep forgetting that
traceAll doesn't in fact trace everything ;-)
My @pubsub function is not executing (and the hub reports failure),
though it does execute when I access it through the
I think I was a bit quick in stating earlier that I got the whole
body; here is the PHP dump again:
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>henrik's
stream</title><id>http://localhost:8081//atom/stream/henrik</id><author><name>henrik</name></author>
I should really be more patient with myself before posting, I promise
this will be the last one for today, anyway I tried the following:
(when *Xml
   (setq *Xml
      (make
         (until (eof) (line T) (link @)
And I now get this at
Thanks Alex, I've actually tried advertising as both 1.0 and 1.1; it
doesn't seem to make any difference. I will try ht:Ln tonight.
On Tue, Dec 15, 2009 at 7:39 AM, Alexander Burger a...@software-lab.de wrote:
Hi Henrik,
Alex, you mentioned on IRC how to handle/read chunked content and of
Still no luck. At this point I have no idea if I'm doing something
illegal from a PicoLisp point of view or an HTTP point of view; maybe
someone can shed some light on this.
During testing I have modified (http) and (_htHead) directly; when
I've discovered something that works, I expect to be able
Hi Henrik,
Content-Length: 477
Accept-Encoding: gzip
X-Hub-Signature: sha1=c22189670d1aeb04d0333d737b3e97f74ccb380e
User-Agent: AppEngine-Google; (+http://code.google.com/appengine)
Host: localhost:3000
Referer: http://localhost/
Content-Type: application/atom+xml
And the feed contents
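About the X-Hub-Signature header above: per the PubSubHubbub spec it is "sha1=" followed by the hex HMAC-SHA1 of the raw POST body, keyed with the secret given at subscription time. A Python sketch of the check (the body and secret values here are invented):

```python
import hashlib
import hmac

def valid_signature(body, secret, header):
    """Compare an X-Hub-Signature header against the raw request body."""
    expected = "sha1=" + hmac.new(secret, body, hashlib.sha1).hexdigest()
    return hmac.compare_digest(expected, header)

body = b"<feed>...</feed>"           # the raw bytes of the POST body
secret = b"my-subscriber-secret"     # hypothetical shared secret
sig = "sha1=" + hmac.new(secret, body, hashlib.sha1).hexdigest()
print(valid_signature(body, secret, sig))  # True
```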
I've done some more examining and testing and I think the content
might be chunked even though there's nothing advertised in the
headers.
Alex, you mentioned on IRC how to handle/read chunked content and of
course I forgot to save the buffer. Would you care to go through it
again please?
I also
Hi Henrik,
Alex, you mentioned on IRC how to handle/read chunked content and of
course I forgot to save the buffer. Would you care to go through it
again please?
This is fairly easy when you use the functions 'ht:Out' and 'ht:In':
(ht:Out T .. do printing ..)
does chunked output, and
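For completeness, the wire format that ht:Out and ht:In handle: each chunk is a hexadecimal size line, CRLF, that many bytes of data, CRLF, and a zero-size chunk ends the body (RFC 2616). A Python sketch of a decoder, independent of the PicoLisp machinery:

```python
import io

def decode_chunked(stream):
    """Decode an HTTP/1.1 chunked body from a binary stream."""
    body = b""
    while True:
        size_line = stream.readline().strip()
        size = int(size_line.split(b";")[0], 16)  # ignore chunk extensions
        if size == 0:                             # zero-size chunk: end of body
            break
        body += stream.read(size)
        stream.readline()                         # consume the trailing CRLF
    return body

raw = io.BytesIO(b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n")
print(decode_chunked(raw))  # b'Wikipedia'
```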