How to download full website using Wget
Published on Thursday, Dec 30, 2021
Are you struggling to download a website to your local computer? Try this simple command and you will be surprised how easy it is to download a full website using Wget.
wget --recursive --convert-links --backup-converted --page-requisites --level=inf --adjust-extension -U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" -p --mirror --html-extension --convert-links https://somedomain.org/
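Several options in the command above overlap: `--mirror` already turns on `--recursive` and `--level=inf`, `-p` is shorthand for `--page-requisites`, and `--html-extension` is an older spelling of `--adjust-extension`, so each pair only needs to appear once. A trimmed equivalent is sketched below as a dry run: the `echo` just prints the command for review, and `somedomain.org` is a placeholder for the site you actually want.

```shell
#!/bin/sh
# Assemble the trimmed mirror command; somedomain.org is a placeholder URL.
# The echo makes this a dry run -- remove it to start the real download.
CMD='wget --mirror --convert-links --backup-converted --page-requisites --adjust-extension -U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" https://somedomain.org/'
echo "$CMD"
```

The `-U` header is kept from the original command; some servers refuse requests whose User-Agent does not look like a browser.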
GNU Wget is a free software package for retrieving files using HTTP, HTTPS, FTP and FTPS, the most widely used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window System support, etc.
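Because Wget is non-interactive, a mirror job can run unattended from cron. Below is a sketch of a crontab entry, assuming a nightly refresh at 02:00; the log path, destination directory, and domain are all placeholders.

```shell
# m h dom mon dow  command  (refresh the offline mirror nightly at 02:00)
0 2 * * * wget --mirror --convert-links --page-requisites --adjust-extension -o /var/log/mirror.log -P /srv/mirror https://somedomain.org/
```

Here `-o` writes Wget's progress log to a file and `-P` sets the directory the mirror is saved under, which keeps unattended runs both quiet and tidy.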
GNU Wget has many features to make retrieving large files or mirroring entire web or FTP sites easy, including:
- Can resume aborted downloads, using REST and RANGE
- Can use filename wild cards and recursively mirror directories
- NLS-based message files for many different languages
- Optionally converts absolute links in downloaded documents to relative, so that downloaded documents may link to each other locally
- Runs on most UNIX-like operating systems as well as Microsoft Windows
- Supports HTTP proxies
- Supports HTTP cookies
- Supports persistent HTTP connections
- Unattended / background operation
- Uses local file timestamps to determine whether documents need to be re-downloaded when mirroring
GNU Wget is distributed under the GNU General Public License.
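Two of the features above map directly to single flags: `-c` (`--continue`) resumes a partially downloaded file, and `-N` (`--timestamping`) re-fetches a file only when the server's copy is newer than the local one. A dry-run sketch follows; the archive filename and domain are placeholders.

```shell
#!/bin/sh
# -c: resume an interrupted download from the existing partial file.
# -N: transfer only if the remote file is newer than the local copy.
# The echo makes these dry runs -- remove it to execute.
RESUME='wget -c https://somedomain.org/big-archive.tar.gz'
FRESHEN='wget -N https://somedomain.org/index.html'
echo "$RESUME"
echo "$FRESHEN"
```

Note that `--mirror` already implies `-N`, which is how repeated mirror runs skip files that have not changed.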
Read more about Wget's commands in the official manual: https://www.gnu.org/software/wget/manual/wget.html