wget is typically used to download files or entire websites from a server to the user’s local system. With the right options it can also work recursively, navigating through a website and following links to download it in its entirety, including text and media files.
Here are some different ways to use the wget command.

The most straightforward way to use wget is to type wget followed by the URL of the file you want to download.
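For example, to fetch a single file (the URL here is a placeholder):

```shell
# Download a single file into the current working directory
wget http://example.com/file.zip
```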
1. Specify Download Directory
If you want to download the file to a specific directory, you can use the -P option followed by the path to the directory.
wget -P /path/to/directory http://example.com/file.zip
2. Download in the Background
If you want to download a large file, you might want to move the download to the background. You can do this with the -b option:
wget -b http://example.com/large-file.zip
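With -b, wget prints the process ID and appends its progress output to a file named wget-log in the current directory, so you can check on the download later:

```shell
# Follow the progress of a backgrounded wget download
# (-b writes its output to wget-log by default)
tail -f wget-log
```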
3. Resume an Interrupted Download
If a download gets interrupted, you can resume it with the -c option:
wget -c http://example.com/large-file.zip
4. Limit Download Speed
If you don’t want wget to use all available network bandwidth, you can limit the download speed with the --limit-rate option. The value is in bytes per second and accepts k and m suffixes:
wget --limit-rate=200k http://example.com/file.zip
5. Download Multiple Files
If you want to download multiple files, you can specify them all at once.
wget http://example.com/file1.zip http://example.com/file2.zip
Alternatively, you can put all URLs in a file (one URL per line) and pass that file to wget with the -i option.
wget -i urls.txt
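For example, urls.txt might look like this (the example.com URLs are placeholders):

```shell
# Create a URL list, one URL per line
cat > urls.txt <<'EOF'
http://example.com/file1.zip
http://example.com/file2.zip
EOF

# Then hand the list to wget:
# wget -i urls.txt
```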
6. Download a Full Website
If you want to download a full website for offline viewing, you can use the following command:
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains example.com --no-parent http://www.example.com
This command will download the entire www.example.com website. The options used in this command have the following roles:
--recursive: download the entire Web site.
--domains example.com: don’t follow links outside example.com.
--no-parent: don’t ascend to the parent directory when retrieving recursively, so only pages below the starting URL are downloaded.
--page-requisites: get all the elements that compose the page (images, CSS and so on).
--html-extension: save files with the .html extension.
--convert-links: convert links so that they work locally, off-line.
--restrict-file-names=windows: modify filenames so that they will work in Windows as well.
--no-clobber: don’t overwrite any existing files (used in case the download is interrupted and resumed).
7. Download Files from an FTP Server
You can use wget to download files from an FTP server. If a username and password are required, use the format:
wget ftp://username:password@example.com/file.zip
Alternatively, you can pass the credentials with the --ftp-user and --ftp-password options.