Following is a message that was forwarded to me by someone who was
having difficulties posting to the list. Please Cc the author in your
replies.

I asked augustin to talk to [EMAIL PROTECTED] about the difficulties in
posting to the list (I think a move to is really going to need
to happen RSN); I also recommended removing the surrounding spaces
around the "=" in options such as "--referer=...", and using the
--debug flag to compare wget's request headers against the target set.
Hopefully this will solve the problem, but if anyone has additional
advice, feel free to chime in.
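
The spacing issue is easy to demonstrate: in a shell, "--referer = URL"
is split into three separate words, so wget never sees the referer
value, whereas "--referer=URL" is a single argument. A small sketch
(the example.com URL is just a placeholder):

```shell
# Demo of the "=" spacing problem: count how many arguments actually
# reach a program. With spaces around "=", the option, the "=", and the
# URL arrive as three separate (and to wget, bogus) arguments.
count_args() { echo $#; }

count_args --referer=http://example.com/videos    # prints 1
count_args --referer = http://example.com/videos  # prints 3
```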


----------  Forwarded Message  ----------

Subject: Downloading video list from youtube profile.
Date: Friday 31 October 2008
From: augustin <[EMAIL PROTECTED]>


I am trying to use wget to download the video list from a youtube
profile, but youtube uses some AJAX and there is no direct download
link to use, which makes the task a bit complicated.

I tried to subscribe to this list but my subscription was refused with the
following message:
<[EMAIL PROTECTED]>: host[] said: 550
    5.7.1 Blocked by SpamAssassin (in reply to end of DATA command)
Therefore, I am NOT subscribed to this list and would appreciate it if
you CC me in your reply.

If you point your browser to:
you will invariably be pointed to the first page of videos.
You can see at the bottom that there are more pages. Clicking on any
subsequent page will call some AJAX script which will refresh the inside of
the page.

There is no direct way to get a link to download, say, the list of
videos on page 38. Even manually, that would be tedious, because you
can only click on the largest page number available and hop page after
page to the end of the list.

I am using Firefox and the very good Firebug extension to get a clue
of what is happening behind the scenes. Thus, I can get the full
headers of the AJAX request, and the reply. I use this to try to
replicate the same request with wget.
Here is a sample HEADER for a request:

User-Agent      Mozilla/5.0 (X11; U; Linux i686; en-US; rv:
Ubuntu/8.04 (hardy) Firefox/
Accept-Language en-us,en;q=0.5
Accept-Encoding gzip,deflate
Accept-Charset  ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive      300
Connection      keep-alive
Content-Type    application/x-www-form-urlencoded
Content-Length  422
Cookie  use_hitbox=72c46ff6cbcdb7c5585c36411b6b334edAEAAAAw;
VISITOR_INFO1_LIVE=fvxHpXl_mLY; PREF=f1=11000000&gl=TW&hl=zh-TW;
__utmc=207772311; __utmz=207772311.1225423366.1.1.utmcsr=(direct)|

Pragma  no-cache
Cache-Control   no-cache

With the POST information:

messages        [{"type":"box_method","request":

With some PARAMS which I don't know how to use:

action_ajax     1
box_method      draw_page_internal
box_name        user_videos
user    BarackObamadotcom
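
Assuming these PARAMS belong in the POST body alongside session_token
and messages (which the application/x-www-form-urlencoded Content-Type
in the header dump suggests), they would be joined with "&" into a
single body string. A sketch using only the four fields shown above
(the messages value is omitted; its JSON punctuation would also need
percent-encoding before sending):

```shell
# Sketch: join the Firebug form fields into one
# application/x-www-form-urlencoded POST body. These are the literal
# field names and values from the dump above; the session_token and
# messages fields are left out of this illustration.
body="action_ajax=1"
body="$body&box_method=draw_page_internal"
body="$body&box_name=user_videos"
body="$body&user=BarackObamadotcom"
echo "$body"
```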

Finally, here is my wget call, attempting to replicate the above request:

wget \
--keep-session-cookies \
--post-data = 'session_token=&messages=[{"type":"box_method","request":
{"start":40,"num":20,"view_all_mode":"True","sort":"p"}}}]' \
--save-cookies cookie.txt \
--load-cookies cookie.txt \
--referer =
--user-agent="Mozilla/5.0 (X11; U; Linux i686; en-US; rv:
Gecko/20080924 Ubuntu/8.04 (hardy) Firefox/" \


wget \
--keep-session-cookies \


--save-cookies cookie.txt \
--load-cookies cookie.txt \
--referer =
--user-agent="Mozilla/5.0 (X11; U; Linux i686; en-US; rv:
Gecko/20080924 Ubuntu/8.04 (hardy) Firefox/" \
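
For reference, a cleaned-up shape of the call above might look like the
following sketch: no spaces around "=", the POST body quoted as one
argument, and the target URL last. The URL and the body are
placeholders (the real endpoint was stripped from this message), so the
command is only assembled and printed here, not actually run:

```shell
# Hypothetical corrected invocation. URL and --post-data values are
# placeholders, not the real youtube endpoint; --debug makes wget print
# the request headers it actually sends, for comparison with Firebug.
URL='http://www.youtube.com/PLACEHOLDER'
set -- wget --debug \
    --keep-session-cookies \
    --save-cookies cookie.txt \
    --load-cookies cookie.txt \
    --referer="$URL" \
    --post-data='action_ajax=1&box_method=draw_page_internal&box_name=user_videos&user=BarackObamadotcom' \
    "$URL"
printf '%s\n' "$@"
```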

I've tried various other combinations, but they all failed.

You can try yourself. I can't manage to download the second page or any
subsequent page. I only ever get the content of the first page in return.

I don't know what I am missing or what I am doing wrong.

Thanks for any help,
