Mail Archives: cygwin/2004/01/01/14:39:01
doblejota wrote:
> Hi,
>
> I'm trying to get a section of a web page using wget. I edited a previously
> saved copy of the page source, keeping only an initial section. Then I execute
wget is a good tool for downloading full websites, and good enough for
downloading a single URL (though there are better tools for that, e.g. curl).
But it has no support for downloading only part of a URL. It is possible if you
know wget and HTTP.
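For example, curl's -r option sends an HTTP Range header asking the server for only the requested bytes. A sketch (the URL and byte offset are invented, and an HTTP server is free to ignore the Range header); the same option works on file:// URLs, which makes the semantics easy to see locally:

```shell
# For a real page (hypothetical URL) one would run:
#   curl -r 500- -o section.html http://example.com/page.html
# where -r 500- sends the header "Range: bytes=500-".

# Demonstration on a local file (curl supports ranges for file:// too):
printf 'HEADER----BODY' > page.txt
curl -s -r 10-13 "file://$PWD/page.txt"   # bytes 10..13 of the file: "BODY"
rm page.txt
```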
>
> wget -c url
>
> I obtain
>
> Continued download failed on this file, which conflicts with `-c'.
> Refusing to truncate existing file 'name.ext'. (Name of file I've edited).
>
> The program obtains the page size before stopping, which is bigger than the
> modified file, so why does it try to truncate the file?
When reporting problems with wget, please use the --debug option and include the
full output. If the web server does not support resuming downloads, wget starts
over from the beginning. wget did not crash; it stopped because the options you
had given it conflicted: -c says don't truncate the existing file, but starting
over requires truncating it.
>
> I've tried the -N option, with no change. I've also used the -O option; the
> whole file is downloaded, but I just want the last section.
Then throw the first part away. If that's too wasteful you'll need to find a
better tool.
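Throwing the first part away can be done in the pipeline itself. A sketch (the URL and the 500-byte offset are made up): stream the page to stdout and let tail discard the leading bytes:

```shell
# For a real page (hypothetical URL and offset) one would run:
#   wget -q -O - http://example.com/page.html | tail -c +501 > section.html
# tail -c +N starts printing at byte N, discarding bytes 1..N-1:
printf '0123456789' | tail -c +6   # prints "56789"
```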
>
> Do you know a command-line application which uses an HTTP Range request in
> that way?
>
> By the way, if I want to stop the download after the application receives a
> specific word from the server, how can I do it? I expect it is possible using
> a pipe that analyzes wget's output, but how?
No, wget will try to download the full HTTP response unless you kill it.
I would use perl and Net::HTTP for this. But then I'm familiar with both.
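A shell-only alternative (a sketch; the URL and marker word are invented): pipe wget's output through sed, which quits as soon as the marker appears. That closes the pipe, and wget is killed by SIGPIPE the next time it tries to write:

```shell
# For a real page (hypothetical URL and marker) one would run:
#   wget -q -O - http://example.com/big.html | sed '/END-MARKER/q' > partial.html
# sed's q command prints the matching line, then exits, closing the pipe:
printf 'one\nEND-MARKER\nthree\n' | sed '/END-MARKER/q'
```

Note that wget may still read somewhat past the marker, since it only dies on its next write into the broken pipe.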
> Note Cygwin pipes (DOS) features.
DOS features? What DOS features?
>
> Thanks to everybody.
--
Med venlig hilsen / Kind regards
Hack Kampbjørn
--
Unsubscribe info: http://cygwin.com/ml/#unsubscribe-simple
Problem reports: http://cygwin.com/problems.html
Documentation: http://cygwin.com/docs.html
FAQ: http://cygwin.com/faq/