Re: Change Request: Cookies 4 WGet

2001-05-15 Thread Hack Kampbjørn


Michael Klaus wrote:
 
 Dear WGet team,
 
 first of all, I want to say that WGet really is a _great_ program. My
 company mostly uses it for regression tests of different web servers
 and servlet engines. And there's the problem: servlet engines maintain
 their sessions - which are critical for regression tests - via
 cookies. A facility to hold cookies (one cookie would be sufficient
 for this task) and send them back with each request would really be
 helpful.
 
 Would someone on your team be able to help us get this working? We
 have a bit of C knowledge here and could perhaps even write it
 ourselves...if we only had a clue where to change what :-/

Cookie support has been added in the current development code. You can
get it from CVS; see the Development section on the web site
(http://sunsite.dk/wget/). Of course, all the usual warnings about
using development code apply.
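
Until then, a stop-gap in the spirit of what you describe - hold one
session cookie and send it back on every request - can be scripted
around released wget. This is a rough sketch, not wget's cookie code:
extract_cookie is a made-up helper and the URLs are placeholders; only
the -S, -O and --header options are real wget options here.

```shell
# Rough sketch of holding one session cookie between wget requests.
# extract_cookie is a hypothetical helper, not part of wget.
extract_cookie() {
  # Pull "NAME=VALUE" out of a "Set-Cookie: NAME=VALUE; ..." header line.
  printf '%s\n' "$1" | sed -n 's/^Set-Cookie: *\([^;]*\).*/\1/p'
}

# Illustrative use (URLs are placeholders): -S prints the server's
# response headers on stderr, so the Set-Cookie line can be captured
# and replayed with --header on later requests:
#   hdr=$(wget -S -O login.html http://server.example/login 2>&1 \
#         | grep 'Set-Cookie:' | head -n 1 | sed 's/^ *//')
#   cookie=$(extract_cookie "$hdr")
#   wget --header="Cookie: $cookie" http://server.example/next-page
```

For a servlet engine that only needs its session id echoed back, one
cookie like this is usually enough.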

 
 Many thanks in advance,
 
 Michael Klaus
 
 --
 Michael Klaus
 Entwickler / IT-Consultant
 
 orgafactory gmbh
 Hügelstraße 8
 60435 Frankfurt am Main
 
 Telefon (0 69) 90 54 66 35
 Telefax (0 69) 90 54 66 13
 mailto:[EMAIL PROTECTED]

-- 
Med venlig hilsen / Kind regards

Hack Kampbjørn   [EMAIL PROTECTED]
HackLine +45 2031 7799



¾È³çÇϼ¼¿ä. ȸ½ÅÀÌ ¾øÀ¸¼Å¼­ ´Ù½Ã¿¬¶ô µå¸³´Ï´Ù.

2001-05-15 Thread ³ë¼Ò¿¬ ¾¸
Title: 




¾È³çÇϼ¼¿ä. ÀÏÀü¿¡ ¸ÞÀÏ·Î ´º½º¹ßÇà¿¡ ´ëÇØ Á¦¾ÈÀ» µå·È´ø ³ë¼Ò¿¬ÀÔ´Ï´Ù.ȸ½ÅÀÌ ¾øÀ¸¼Å¼­ Ȥ½Ã 
Á¦°¡ ±×¶§ µå¸° ¸ÞÀÏÀ» ¸ø¹ÞÀ¸½Å°Í °°¾Æ ´Ù½Ã ¿¬¶ôµå¸³´Ï´Ù.ÀúÈñ°¡ ¹ßÇàÇص帮´Â ¹«·á ´º½º¿¡ ´ëÇØ ¾î¶»°Ô 
»ý°¢ÇϽôÂÁö¿ä..ÀÚ²Ù ±ÍÂú°Ô Çصå·Á¼­ Á˼ÛÇÕ´Ï´Ù¸¸ ¹Ù»Ú½Ã´õ¶óµµÈ¸½Å 
ºÎŹµå¸³´Ï´Ù. 
¾Æ·¡´Â ¿¹Àü¿¡ Á¦°¡ µå·È´ø ¸ÞÀϳ»¿ëÀÔ´Ï´Ù. ´Ù½ÃÇѹø º¸³»µå¸±²²¿©..Àо½Ã°í °£´ÜÇÏ°Ô ´äÀå 
ºÎŹµå¸³´Ï´Ù.^.^ °¨»çÇÕ´Ï´Ù.
---
3D°ü·ÃºÐ¾ß¿¡ °ü½ÉÀÌ ¸¹À¸½Å 
´Ô¿¡°ÔÁ¦¾ÈÀ» µå¸®±â À§ÇØ ÆíÁö¸¦ ¶ç¿ó´Ï´Ù.ÀúÈñ´Â Áö±Ý±îÁö ¸¹Àº 3D/±×·¡ÇÈ/Web°ü·ÃÀÚ 
ºÐµé¿¡°Ô3Â÷¿øÀÎÅÍ³Ý Àû¿ë»ç·Ê¸¦ ¸ÅÁÖ1ȸ¾¿ ´º½º·Î ¹ßÇàÇÏ°í ÀÖ½À´Ï´Ù.Web3D/Stereo3D 
±â¼ú/VR°ü·Ã ±â¼ú/3D±¸Çö ±â¼úµî, ÀÎÅͳݰú 3Â÷¿ø¿µ»ó¿¡ °ü·ÃµÈ ÇÏÀÌÅ×Å© ±â¼ú¸¸À» ´Ù·ç°í ÀÖ½À´Ï´Ù.ÀÎÅÍ³Ý Á¤º¸¸¦ ½Ç½Ã°£ 
3Â÷¿øÀ» ±¸ÇöÇÏ´Â3Â÷¿ø ÀÎÅͳݱâ¼ú°ú ¿Ü±¹¿¡¼­¹ßÇ¥µÇ´Â 3Â÷¿ø ÀÎÅÍ³Ý »ç·Ê ¹× ±â¼úµîÀ» ¸ÅÁÖ¿ù-¼ö¿äÀÏ¿¡ ´º½º·Î 
¹ßÇà ÇØ µå¸³´Ï´Ù. º» ¹«·á ¼­ºñ½º¸¦ ÅëÇØ ´Ô²²¼­µµ Web¿¡¼­ ±¸ÇöµÇ´Â 3Â÷¿ø ±â¼úÀ» Á¢ÇÏ°í »õ·Î¿î ºÐ¾ß¸¦ 
Ž±¸ÇÒ±âȸ°¡ µÇ±æ ¹Ù¶ø´Ï´Ù.´Ô¿¡°Ô ÃÖ¼ÒÇÑ ÀÛÀºÁ¤º¸´Â µÉ°ÍÀ» ¾à¼Óµå¸³´Ï´Ù. 

Á¦°¡ ¸ÕÀú¾à 5ȸ 
Á¤µµ´º½º¸¦Àü´ÞÇØ µå¸®°í ½Í½À´Ï´Ù. (¾ðÁ¦µç Ãë¼Ò°¡´É)ºÐ¸í µµ¿òÀÌ 
µÉ¼öÀÖ´Â Á¤º¸ÀÓÀ» ¾à¼Óµå¸³´Ï´Ù.±×¸®°í º» Á¦¾ÈÀº Á¦Ç°À» ÆǸÅÇϰųªÈ¸¿øÀ» À¯Ä¡Çϱâ À§ÇÑ ¸ñÀûÀÌ ¾Æ´Õ´Ï´Ù. »õ·Î¿î 
±â¼ú°ú ¿Ü±¹ÀÇ ½ÃÀå»óȲÀ» Çѱ¹½ÃÀå¿¡³Î¸®¾Ë·Á ±¹³»¿¡¼­µµ°¡»óÇö½Ç ½ÃÀåÀ» È°¼ºÈ­ ÇϱâÀ§ÇØ ¿î¿µµÇ´Â ¹«·á ¼­ºñ½ºÀÔ´Ï´Ù. 
¸¸¾à ½Ã¹üÀûÀ¸·Î ´º½º¸¦ 
¹Þ¾Æº¸½Ç·Á¸é "½Åû"À̶ó´Â Á¦¸ñÀ¸·Î ȸ½Å¸¸ Áֽʽÿä. À̸§,¿¬¶ôó,ÁֹιøÈ£µîÀ» ÀûÁö ¸¶½Ã±â ¹Ù¶ø´Ï´Ù. (°³ÀÎÁ¤º¸´Â ÇÊ¿äÇÏÁö 
¾Ê½À´Ï´Ù.)¹Ù»Ú½Ã°ÚÁö¸¸ ȸ½Å 
ºÎŹµå¸³´Ï´Ù. ±â´Ù¸®°í ÀÖ°Ú½À´Ï´Ù. °¨»çÇÕ´Ï´Ù.
-õÀÚÀÇ TEXTº¸´Ù ÇÑÀåÀÇ IMAGE°¡ 
È¿°úÀûÀÌ°íõÀåÀÇ IMAGEº¸´Ù ÇϳªÀÇ Web3D°¡ È¿°úÀûÀÌ´Ù.




Re: regetting 'file not found' (404)

2001-05-15 Thread J Scott Jaderholm

toad [EMAIL PROTECTED] writes:

 On Tue, May 15, 2001 at 06:21:06PM -0700, J Scott Jaderholm wrote:
  Hi,
  
  I'd like to recursively get a directory being served by an apache
  server.  The problem is that about 1 out of 4 requests for a file or
  directory are answered with a file not found (404) error.
  
  Is there a way to have wget reget the files until it succeeds?  I'd
  just fix the problem on the server but 1) I don't run it 2) no one
  that I've talked to (which includes a lot of apache users) has seen
  this problem before.
  
  Any suggestions for using wget as a solution to this problem would be
  _greatly_ appreciated.

 Something like the attached?

Almost, it seems.  I run mwgeta -c -r -np http://server.com/whatever/
but it doesn't quite work.  Many of the files/directories still don't
get downloaded, presumably because the server said 404.

I've looked at it and I assume I'm supposed to change the line that
says

if grep "unsupported protocol" ~/1.txt.mwgeta; then echo "UNSUPPORTED
PROTOCOL"; END=1; fi

But I'm not sure what to do with it...
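
As far as I can tell, the key point is that wget treats a 404 as a
final answer - its own --tries retrying only covers transient failures -
so the retrying has to happen in the wrapper. A minimal sketch of that
idea, assuming wget exits nonzero when the fetch failed
(retry_until_ok is a made-up name, not part of mwgeta):

```shell
# Retry a command until it exits 0, up to a fixed number of attempts.
# A wrapper has to do this itself, since wget will not re-request a
# URL that came back 404.
retry_until_ok() {
  tries=0
  until "$@"; do
    tries=$((tries + 1))
    if [ "$tries" -ge 10 ]; then
      echo "giving up after $tries attempts" >&2
      return 1
    fi
    sleep 1
  done
}

# e.g. retry_until_ok wget -c -r -np http://server.com/whatever/
```

With -c and -r, each repeat pass only re-fetches what is missing or
incomplete, so looping over the whole recursive fetch is cheap.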

Sincerely,
jsj

-- 
the sky is tired of being blue




recursive web-suck ignores directory contents

2001-05-15 Thread J Scott Jaderholm

Hi,

Say I

wget -c -r -np http://server.com/foo/

It downloads the files in foo for me, and the directories too, but
often not the contents of those directories.

Here's the contents of foo

10.txt
2.txt
bar/

and contents of bar
3.txt

It downloads foo, and makes a directory bar in the spot where it
downloads foo, but often wget will not download 3.txt.

Is there a fix for this?
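
One thing worth checking - a guess, given the intermittent 404s in your
other thread: wget can only follow links that actually appear in the
index page it retrieved, so if the listing for bar/ itself came back as
a 404, 3.txt is never seen at all. Something like this (the helper name
and saved path are made up; -r saves the index wherever it mirrored
bar/) shows what wget had to work with:

```shell
# List the href targets in a saved index page, i.e. the only links
# wget could have followed from it.  (links_in is a made-up helper.)
links_in() {
  grep -o 'href="[^"]*"' "$1"
}

# e.g. links_in server.com/foo/bar/index.html
```

If 3.txt does appear there, recursion depth shouldn't be the problem at
one level down (the default is -l 5), but I believe -l inf would rule
it out.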

Sincerely,
jsj
-- 
the sky is tired of being blue