08.12.2005, 15:01
Quote:Hello GMT,
I use the program too. It's good. A question for the expert: is there such an HTTP grabber for forums as well, i.e. for the content of a forum? Or can I use HTTrack for that, too?
Mylow
You can use curl (curl.haxx.se) to download a series of web pages with sequentially numbered filenames. For example:
Quote:curl http://www.foo.com/page[1-10].htm -O
downloads all the pages from page1.htm to page10.htm and saves each one under its remote filename.
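To come back to the forum question: forum threads are usually paginated through a URL parameter, so the same range trick works there. A minimal sketch, assuming a phpBB-style board with a made-up topic id and 25 posts per page (the host, script name, and parameters are placeholders, and the [0-100:25] step syntax needs a reasonably recent curl):
Quote:curl "http://www.foo.com/viewtopic.php?t=42&start=[0-100:25]" -o "thread_#1.html"
The quotes keep the shell away from the brackets and the ampersand, and #1 is replaced by the current value of the start parameter, so you get thread_0.html, thread_25.html, and so on.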
You can also do the following trick if pages are implemented as directories:
Quote:curl "http://www.foo.com/[1-10]/page" -o "#1.htm"
This saves the pages as files named 1.htm through 10.htm, if you're not particular about keeping the original names. (Quoting the output name also keeps the shell from treating #1 as a comment.)
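You can also use several ranges in one URL; curl fills #1, #2, and so on with the current value of each one. Another sketch with a made-up archive layout:
Quote:curl "http://www.foo.com/[2004-2005]/page[1-5].htm" -o "#1_page#2.htm"
This fetches all ten combinations and names them 2004_page1.htm, 2004_page2.htm, and so on.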
Read the curl manual to learn more. Combine curl with wget and you can download almost any page, including dynamically generated ones and pages protected by passwords or cookies.
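For the password/cookie part, a rough sketch (the login URL, form field names, and credentials below are placeholders, not anything standard):
Quote:curl -c cookies.txt -d "username=me&password=secret" http://www.foo.com/login.php
curl -b cookies.txt "http://www.foo.com/private/page[1-10].htm" -O
-c writes the session cookie the server sets to cookies.txt, and -b sends it back on the later requests. If you would rather crawl a whole forum in one go, wget can mirror it recursively, for example wget --mirror --convert-links --page-requisites --no-parent http://www.foo.com/forum/ (again, the URL is just an example).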