Title: This is "ilovestar", the lively, distinctive domain forwarding service.
Title: The sweetest movie theater in the world
Clicking the button below will process your unsubscribe request.
I imagine Hanmail's e-mail stamp fee has been making things hard for you.
I'd like to offer a little help, with a truly exceptional deal:
2 bulk-mailing programs, 3 address-extraction programs, 1 board-posting program, and
a list of 10,000,000 mail addresses, all for 90,000 won. The list includes Hanmail addresses too.
You can do the sorting yourself.
-
This mail is an [advertisement] sent in accordance with Article 50 of the Act on
Promotion of Information and Communications Network Utilization and Information
Protection. The e-mail address was obtained from internet bulletin boards; we hold
no personal information other than the address.
-
On Mon, 8 Apr 2002, Hrvoje Niksic wrote:
> Don't get me wrong, a message detected as "spam" wouldn't get discarded; it
> would simply need to be approved by an editor. So even the false
> positives would make it to the list, only a tad later.
OK, then.
--
+ Maciej W. Rozycki, Technical University of Gda
"Maciej W. Rozycki" <[EMAIL PROTECTED]> writes:
> On Mon, 8 Apr 2002, Hrvoje Niksic wrote:
>
>> I was also thinking about checking for `Wget' in the body, and things
>> like that.
>
> That might be annoying (although it is certainly an option to consider
> anyway) as someone sending a mail legit
On Mon, 8 Apr 2002, Hrvoje Niksic wrote:
> I was also thinking about checking for `Wget' in the body, and things
> like that.
That might be annoying (although it is certainly an option to consider
anyway) as someone sending a mail legitimately may assume the matter being
obvious from the list's
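As a rough sketch of the body-keyword check being discussed (the function name and keyword list are made up for illustration, not the list software's actual API): a flagged message is held for an editor, not dropped.

```python
def needs_moderation(body, keywords=("wget",)):
    """Hold a message for editor approval when none of the expected
    keywords appear in its body (case-insensitive).  A held message
    is not discarded -- it reaches the list after approval."""
    text = body.lower()
    return not any(kw in text for kw in keywords)
```

The annoyance Maciej describes is exactly the first case below: a legitimate poster who assumes the topic is obvious from the list itself gets held.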
Emil Obermayr <[EMAIL PROTECTED]> writes:
> recursive ftp with 1.7 worked fine and still it does with 1.8 and
> http. But 1.8 stops recursive ftp after the index file:
>
> these work fine:
>
> wget-1.7 -r http://
> wget-1.7 -r ftp://
> wget-1.8 -r http://
>
>
> but this one stops after first file
Martin Tsachev <[EMAIL PROTECTED]> writes:
> it compiles on i386-unknown-netbsdelf1.5.2 without any modifications
>
> I think that wget isn't parsing the @import CSS declaration, it should
> save those files when run with -p and convert the links if set so
That is true. Parsing @import would re
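A minimal sketch of what parsing @import would involve (regexp and function name are illustrative, not wget's implementation): extract the referenced stylesheet URLs so a -p run could fetch and convert them.

```python
import re

# Matches @import "foo.css", @import 'foo.css', and @import url(foo.css),
# the forms a page-requisites run would need to follow.
IMPORT_RE = re.compile(
    r"""@import\s+(?:url\(\s*)?["']?([^"'()\s;]+)["']?\s*\)?""",
    re.IGNORECASE,
)

def css_imports(css_text):
    """Return the URLs referenced by @import rules in a stylesheet."""
    return IMPORT_RE.findall(css_text)
```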
Guillaume Morin <[EMAIL PROTECTED]> writes:
> For example if a link to the URL "/foo?bar" is seen then the correct
> file is downloaded and saved with the name "foo?bar". When viewing
> the pages with Netscape the '?' character is seen to separate the
> URL and the arguments. This makes the lin
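One way around the '?' problem, sketched here (function name and the set of escaped characters are assumptions, not what wget does): percent-escape the characters a browser treats specially when mapping a URL to a local file name.

```python
def local_name(url_path_and_query):
    """Map the path+query part of a URL to a safer local file name by
    percent-escaping characters that confuse browsers viewing the
    mirror (a '?' in a file name reads as a query separator)."""
    unsafe = "?&=*"
    return "".join(
        "%%%02X" % ord(c) if c in unsafe else c
        for c in url_path_and_query
    )
```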
Ivan Buttinoni <[EMAIL PROTECTED]> writes:
> Again I send a suggestion, this time quite easy. I hope it's not
> already implemented, else I'm sorry in advance. It will be nice if
> wget can use the regexp to evaluate what accept/refuse to download.
> The regexp have to work on whole URL and/or
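The suggested behavior could look like this sketch (names and precedence are my assumptions; wget's -A/-R at the time matched suffixes and globs, not regexps): accept/reject regexps applied to the whole URL, with reject taking priority.

```python
import re

def should_download(url, accept=None, reject=None):
    """Decide whether to fetch `url` given optional accept/reject
    regexps applied to the whole URL.  Reject wins over accept;
    with no accept pattern, everything not rejected is taken."""
    if reject and re.search(reject, url):
        return False
    if accept:
        return bool(re.search(accept, url))
    return True
```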
Guillaume Morin <[EMAIL PROTECTED]> writes:
> If wget fetches a url which redirects to another host, wget
> retrieves the file, and there's nothing that can be done to turn
> that off.
>
> So, if you do wget -r on a machine that happens to have a redirect to
> www.yahoo.com you'll wind up trying
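The check an option for this would need is simple to sketch (the function is hypothetical, not an existing wget flag of that era): compare the redirect target's host with the host the recursion started from.

```python
from urllib.parse import urlsplit

def same_host(original_url, redirect_url):
    """True when a redirect stays on the starting host -- the test a
    'don't follow cross-host redirects' option would apply before
    retrieving the redirected document."""
    return urlsplit(original_url).hostname == urlsplit(redirect_url).hostname
```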
Good point there. I wonder... is there a legitimate reason to require
atime to be set to the mtime time? If not, we could just make the
change without the new option. In general I'm careful not to add new
options unless they're really necessary.
Guillaume Morin <[EMAIL PROTECTED]> writes:
> if I use 'wget ftp://site.com/file1.txt ftp://site.com/file2.txt',
> wget will not reuse the ftp connection, but will open one for each
> document downloaded from the same site...
Yes, that's how Wget currently behaves. But that's not a bug, or at
le
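Connection reuse across command-line URLs would amount to something like this sketch (a cache keyed by host and port; entirely hypothetical, not wget's internal structure):

```python
class ConnectionCache:
    """Reuse one control connection per (host, port) instead of
    opening a fresh one for every URL on the command line."""

    def __init__(self, connect):
        self._connect = connect   # factory: (host, port) -> connection
        self._open = {}

    def get(self, host, port=21):
        key = (host, port)
        if key not in self._open:
            self._open[key] = self._connect(host, port)
        return self._open[key]
```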
Samuel Hargis <[EMAIL PROTECTED]> writes:
> Is there a command to use that would only download HTML that
> contains a certain text string or key word in it? I would like to
> reject HTML that does not contain my key word or characters in the
> HTML.
I'm afraid there's no such option. One probl
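Since no such option exists, the filtering has to happen after the fact. A sketch of that post-processing step (function name is made up; it assumes the pages are already saved locally):

```python
import os

def prune_without_keyword(paths, keyword):
    """Post-processing pass: delete saved HTML files that do not
    contain `keyword` (case-insensitive), since wget itself cannot
    filter on page content.  Returns the paths that were kept."""
    kept = []
    for path in paths:
        with open(path, encoding="utf-8", errors="replace") as fh:
            if keyword.lower() in fh.read().lower():
                kept.append(path)
            else:
                os.remove(path)
    return kept
```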
Guillaume Morin <[EMAIL PROTECTED]> writes:
> I am forwarding you this bug. I can reproduce this on 1.8.1
Thanks for the report. I believe this patch should fix it:
Index: src/ChangeLog
===
RCS file: /pack/anoncvs/wget/src/ChangeL
David McCabe <[EMAIL PROTECTED]> writes:
> I am having a hell of a time to get the reg-ex stuff to work with the -A or -R
> options. If I supply this option to my wget command:
>
> -R 1*
>
> Everything works as expected. Same with this:
>
> -R 2*
>
> Now, if I do this:
>
> -R 1*,2*
>
> I get all
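The expected semantics of a comma-separated -R list can be sketched like this (shell-style glob matching per pattern; this models the documented behavior, not wget's C code):

```python
from fnmatch import fnmatch

def rejected(filename, reject_list):
    """Apply a comma-separated -R style pattern list: the file is
    rejected when ANY of the shell-style patterns matches it, so
    '-R 1*,2*' should reject both groups, not all files."""
    return any(fnmatch(filename, pat) for pat in reject_list.split(","))
```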
"Tomislav Goles" <[EMAIL PROTECTED]> writes:
> Now I need to add the twist where username account info
> resides on another machine (i.e. machine2 which by the way
> is on the same network as machine1) So I need to do something
> like the following:
>
> $ wget ftp://username:[EMAIL PROTECTED]@mac
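The usual fix for an '@' inside the userinfo part is percent-escaping it, so the final '@' unambiguously separates userinfo from the host. A sketch (the helper is hypothetical; the escaping itself is standard URI syntax):

```python
from urllib.parse import quote

def ftp_url(user, password, host, path):
    """Build an FTP URL, percent-escaping reserved characters such as
    '@' in the user name and password ('@' becomes '%40'), so the
    final '@' cleanly separates userinfo from the host."""
    return "ftp://%s:%s@%s/%s" % (
        quote(user, safe=""), quote(password, safe=""),
        host, path.lstrip("/"))
```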
Andre Majorel <[EMAIL PROTECTED]> writes:
> On 2002-03-22 04:08 +0100, Hrvoje Niksic wrote:
>
>> > May I suggest that you set a filter that prevents postings to the
>> > list unless the poster is a subscriber. That filter should forward
>> > the mail to the admins to allow them the pass the mail
If anyone creates a patch for rollback, I'll be your first tester ;)
Justin Piszcz wrote:
> I was curious if lftp or wget will ever support a rollback feature and
> somehow verify the bytes are correct somehow where the file has been
> resumed.
>
> Why? This is stated below:
>
> LFTP VS WGET E
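A rollback check could work roughly like this sketch (names and the chunk size are assumptions, not an lftp or wget feature): before resuming, re-fetch the tail of the remote file and compare it with the end of the local copy.

```python
def resume_offset(local_bytes, remote_tail, chunk=1024):
    """Decide where to resume: compare the last `chunk` bytes of the
    local copy against the same byte range re-fetched from the
    server (`remote_tail`).  On a match, resume at the local length;
    on a mismatch, signal a rollback by returning 0 (re-download)."""
    if local_bytes[-chunk:] == remote_tail:
        return len(local_bytes)
    return 0
```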
They say that if a 100-inch wall-mounted TV came out, it would be priced
at 100 million won.

Enjoy 100 million won's worth of picture, sharper and more vivid than a TV,
with a beam projector!

100-inch / 200-inch screens (beam projectors, urgent clearance sale)

A notebook and a laser pointer come free.

Further details
Please CC: any answers to my email address, since I'm not on this
list.
I'd like wget to get the time stamp of a file that is downloaded via
FTP and to set the mtime after writing the file to the local disk.
When using HTTP, this already happens, i.e. when doing a
wget http://host/file
the
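The requested behavior boils down to one system call after the file is written, sketched here (the wrapper name is made up; os.utime is the standard interface):

```python
import os

def apply_server_mtime(path, server_mtime):
    """After writing the downloaded file, stamp it with the server's
    modification time, as HTTP timestamping already does.  os.utime
    takes an (atime, mtime) pair, so atime is set to the same value."""
    os.utime(path, (server_mtime, server_mtime))
```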