Linux Tutorials - Herong's Tutorial Examples - v5.45, by Herong Yang
"wget" - Get Files from the Web
This section provides a tutorial example on how to use the 'wget' command to download files from the Web. 'wget' can resume failed downloads.
What Is "wget"? - "wget" is a non-interactive network downloader. It allows you to get files from the Web using the HTTP, HTTPS and FTP protocols.
Main features of "wget":
- Non-interactive operation, so downloads can run unattended or in the background.
- Support for the HTTP, HTTPS and FTP protocols.
- Ability to retry and resume interrupted downloads.
- Recursive retrieval for mirroring entire websites.
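For example, the same command syntax works for all three protocols. The URLs below are hypothetical placeholders, not real download locations:

herong$ wget http://somesite.com/file.txt     (plain HTTP)
herong$ wget https://somesite.com/file.txt    (HTTP over TLS)
herong$ wget ftp://somesite.com/pub/file.txt  (anonymous FTP)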
Here are some examples of installing and using the "wget" command.
1. Install "wget" on a CentOS computer.
herong$ sudo dnf install wget
...
Installed:
  wget-1.19.5-10.el8.x86_64
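If you are on a Debian or Ubuntu system instead of CentOS, the equivalent "apt" command below should work; this is an assumption about your distribution, not part of the example above:

herong$ sudo apt install wget   (Debian/Ubuntu equivalent of "dnf install")
herong$ wget --version          (verify that the installation worked)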
2. Download the home page from the local Web server. Processing messages are printed on the console. The downloaded file is stored as "index.html".
herong$ wget http://localhost
03:33:33-- http://localhost/
Resolving localhost (localhost)... ::1, 127.0.0.1
Connecting to localhost (localhost)|::1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 115 [text/html]
Saving to: 'index.html'

index.html   100%[=========================>]     115  --.-KB/s    in 0s

03:33:33 (29.5 MB/s) - 'index.html' saved [115/115]
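If you want the downloaded file saved under a different name, the "--output-document" (or "-O") option redirects the output, and "--quiet" (or "-q") suppresses the processing messages. A quick sketch, assuming the same local Web server:

herong$ wget -O home.html http://localhost   (save as 'home.html' instead)
herong$ wget -q http://localhost             (quiet mode, no messages printed)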
3. Download a Web file in the background. Processing messages are saved in "wget-log".
herong$ wget --background http://localhost
Continuing in background, pid 156813.
Output will be written to 'wget-log'.

herong$ (logout and login later)

herong$ cat wget-log
03:49:00-- http://localhost/
Resolving localhost (localhost)... ::1, 127.0.0.1
Connecting to localhost (localhost)|::1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 115 [text/html]
Saving to: 'index.html'

     0K                                        100% 31.1M=0s

03:49:00 (31.1 MB/s) - 'index.html' saved [115/115]
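While a background download is still running, you can watch its progress without waiting for it to finish, or stop it using the pid printed by "wget":

herong$ tail -f wget-log   (follow the log as 'wget' writes to it)
herong$ kill 156813        (stop the download, using the pid shown above)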
4. Download a large file with 30 retries instead of the default 20. "wget" resumes the download each time the server closes the connection, up to 30 times. After the last retry, the saved file is still incomplete.
herong$ wget --tries=30 http://somesite.com/large_file.zip
--05:02:54--
Connecting to somesite.com:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 10352938638 (9.6G) [application/octet-stream]
Saving to: 'large_file.zip'

large_file.zip   10%[===>      ]   1.01G  6.48MB/s    in 2m 35s

05:05:30 (6.67 MB/s) - Connection closed at byte 1082498714. Retrying.

--05:05:31--  (try: 2)
Connecting to somesite.com:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 10352938638 (9.6G), 9270439924 (8.6G) remaining
[application/octet-stream]
Saving to: 'large_file.zip'

large_file.zip   12%[===>      ]   1.16G  5.95MB/s    in 0m 18s

05:23:06 (5.95 MB/s) - Connection closed at byte 1174397295. Retrying.

...
--06:29:38--  (try: 30)
...
05:23:06 (1.70 MB/s) - Connection closed
...
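If a server drops connections frequently, you can also pace the retries or remove the retry limit entirely. A sketch using standard "wget" options, with the same placeholder URL:

herong$ wget --tries=0 http://somesite.com/large_file.zip
   (0 or 'inf' means retry indefinitely)

herong$ wget --tries=30 --waitretry=10 http://somesite.com/large_file.zip
   (wait between retries, backing off linearly up to 10 seconds)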
5. Continue downloading an incomplete large file.
herong$ wget --continue http://somesite.com/large_file.zip
--05:28:58--
Connecting to somesite.com:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 10352938638 (9.6G), 1314882873 (1.2G) remaining
[application/octet-stream]
Saving to: 'large_file.zip'
...
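You can combine the two options, so that a single command keeps retrying and always resumes from the last byte received. A sketch with the same placeholder URL:

herong$ wget --continue --tries=0 http://somesite.com/large_file.zip

herong$ ls -l large_file.zip
   (compare the size against the expected 10352938638 bytes)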
Note that "--tries" and "--continue" only work because the server supports HTTP range requests and can respond with the "206 Partial Content" status. If the server only returns "200 OK", "wget" has to restart the download from the beginning.
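To check in advance whether a server supports range requests, you can ask "wget" to print the response headers without downloading the file. An "Accept-Ranges: bytes" header indicates that resuming should work:

herong$ wget --spider --server-response http://somesite.com/large_file.zip
   (look for 'Accept-Ranges: bytes' in the printed headers)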
6. Download all files from a Website. The entire Website is mirrored in a sub-directory named after the host.
herong$ wget --mirror http://somesite.com

herong$ tree somesite.com
(list of downloaded files)

herong$ firefox somesite.com/index.html
(browse the mirrored site)
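For a mirror that is fully browsable offline, "wget" offers a few more well-known options. The sketch below converts links to relative paths and fetches the images and stylesheets that each page needs, using the same placeholder host:

herong$ wget --mirror --convert-links --page-requisites --no-parent \
   http://somesite.com
   (--convert-links rewrites links for local viewing;
    --page-requisites downloads images, CSS and scripts used by each page;
    --no-parent keeps the crawl within the starting directory)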