Author Topic: Use wget to archive websites and online docs on your Mac  (Read 411 times)
HCK
« on: September 11, 2018, 04:05:20 pm »

Use wget to archive websites and online docs on your Mac

Need to bulk save online resources? You can with wget!

Do you need to download a large quantity of online documentation for your work or university studies but have limited internet access? Or perhaps you simply want to store web documents locally so you can parse them with desktop tools? On macOS, you can easily archive any freely accessible online URL (or an entire subdomain, if you have the disk capacity!) with free and open source software (FOSS) in one simple terminal command. Here's how!

The wget command
Options galore
Using wget
Getting wget
Final comments
The wget command

The wget command is a network downloader that can follow links and archive content over the HTTP, HTTPS, and FTP protocols. It's designated as a "non-interactive" command because you can initiate the program and leave it to do its work without any further user interaction. The wget manual explains it this way:
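Because wget runs non-interactively, you can compose a command, log its progress to a file, and walk away. A minimal sketch, using a placeholder URL and log filename:

```shell
# Placeholder target; substitute the document you actually want.
url="https://example.com/manual.pdf"

# -c resumes a partially downloaded file instead of starting over;
# -o writes wget's progress messages to a log file rather than the terminal.
cmd="wget -c -o wget.log $url"
echo "$cmd"

# In practice, you could background it and keep working:
#   nohup wget -c -o wget.log "$url" &
```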


  Wget can follow links in HTML, XHTML, and CSS pages, to create local versions of remote web sites, fully r...
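The recursive behavior the manual describes maps onto a handful of flags. Here's a sketch of a typical site-archiving invocation, again with a placeholder URL; you'd tune depth and politeness settings for the real site:

```shell
# Placeholder starting point; substitute the docs site you want to mirror.
site="https://example.com/docs/"

# -r        follow links recursively
# -np       never ascend above the starting directory ("no parent")
# -k        rewrite links in saved pages so they work when browsed locally
# -p        also fetch the images/CSS each page needs to display properly
# --wait=1  pause one second between requests to be polite to the server
archive_cmd="wget -r -np -k -p --wait=1 $site"
echo "$archive_cmd"
```

With those options, wget rebuilds the site's directory structure on disk, so the local copy can be browsed offline.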

Source: Use wget to archive websites and online docs on your Mac