- How to download a file with curl on Linux/Unix command line
- How to download a file with curl command
- Installing curl on Linux or Unix
- Verify installation by displaying curl version
- Downloading files with curl
- Resuming interrupted downloads with curl
- How to get a single file without giving output name
- Dealing with HTTP 301 redirected file
- Downloading multiple files or URLs using curl
- Grab a password protected file with curl
- Downloading file using a proxy server
- Examples
- Getting HTTP headers information without downloading files
- How do I skip SSL certificate checks when using curl?
- Rate limiting download/upload speed
- Setting up user agent
- Upload files with CURL
- Make curl silent
- Conclusion
- Use wget Command To Download Files From HTTPS Domains
- Using wget command to download files from HTTPS websites
- Examples
- ftp_proxy shell environment variable
- Conclusion
How to download a file with curl on Linux/Unix command line
How to download a file with curl command
The basic syntax:
- Grab files with curl run: curl https://your-domain/file.pdf
- Get files using ftp or sftp protocol: curl ftp://ftp-your-domain-name/file.tar.gz
- You can set the output file name while downloading file with the curl, execute: curl -o file.pdf https://your-domain-name/long-file-name.pdf
- Follow a 301-redirected file while downloading files with curl, run: curl -L -o file.tgz http://www.cyberciti.biz/long.file.name.tgz
Let us see some examples of using curl to download and upload files on Linux or Unix-like systems.
Installing curl on Linux or Unix
By default, curl is installed on many Linux distros and Unix-like systems. But we can install it as follows:
## Debian/Ubuntu Linux use the apt command/apt-get command ##
$ sudo apt install curl
## Fedora/CentOS/RHEL users try dnf command/yum command ##
$ sudo dnf install curl
## OpenSUSE Linux users try zypper command ##
$ sudo zypper install curl
Verify installation by displaying curl version
Type:
$ curl --version
We see:
Downloading files with curl
The command syntax is:
curl url --output filename
curl https://url -o output.file.name
Let us try to download a file from https://www.cyberciti.biz/files/sticker/sticker_book.pdf and save it as output.pdf
curl https://www.cyberciti.biz/files/sticker/sticker_book.pdf -o output.pdf
OR
curl https://www.cyberciti.biz/files/sticker/sticker_book.pdf --output output.pdf
The -o or --output option allows you to give the downloaded file a different name. If you do not provide an output file name, curl will write the file to the screen. Let us say you type:
curl --output file.html https://www.cyberciti.biz
We will see a progress meter as follows:
The output indicates useful information such as:
- % Total : Total size of the whole expected transfer (if known)
- % Received : Currently downloaded number of bytes
- % Xferd : Currently uploaded number of bytes
- Average Dload : Average transfer speed of the entire download so far, in number of bytes per second
- Speed Upload : Average transfer speed of the entire upload so far, in number of bytes per second
- Time Total : Expected time to complete the operation, in HH:MM:SS notation for hours, minutes and seconds
- Time Spent : Time passed since the start of the transfer, in HH:MM:SS notation for hours, minutes and seconds
- Time Left : Expected time left to completion, in HH:MM:SS notation for hours, minutes and seconds
- Current Speed : Average transfer speed over the last 5 seconds (the first 5 seconds of a transfer is based on less time, of course) in number of bytes per second
Resuming interrupted downloads with curl
Pass the -C - option to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out:
## Restarting an interrupted download is an important task too ##
curl -C - --output bigfilename https://url/file
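For example (a sketch with a hypothetical URL and file name): start a large download, interrupt it, then resume it from where it stopped:

```shell
# Start downloading a big file (hypothetical URL), then press Ctrl-C to interrupt it
curl -O https://example.com/big-file.iso

# Resume the transfer; '-C -' tells curl to inspect the partially
# downloaded local file and continue from the byte where it stopped
curl -C - -O https://example.com/big-file.iso
```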
How to get a single file without giving output name
You can save the output file as-is, i.e. write the output to a local file named like the remote file. For example, sticker_book.pdf is the file name in the remote URL https://www.cyberciti.biz/files/sticker/sticker_book.pdf. One can save it as sticker_book.pdf directly, without specifying the -o or --output option, by passing the -O (capital letter O) option:
curl -O https://www.cyberciti.biz/files/sticker/sticker_book.pdf
Downloading files with curl in a single shot
Dealing with HTTP 301 redirected file
The remote HTTP server might send a redirect status code when downloading files. For example, HTTP URLs are often redirected to HTTPS URLs with an HTTP/301 status code. Just pass the -L option to follow the 301 (3xx) redirects and get the final file on your system:
curl -L -O http://www.cyberciti.biz/files/sticker/sticker_book.pdf
Downloading multiple files or URLs using curl
Try:
curl -O url1 -O url2
curl -O https://www.cyberciti.biz/files/adduser.txt \
-O https://www.cyberciti.biz/files/test-lwp.pl.txt
One can use the bash for loop too:
How to download a file using curl and bash for loop
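A minimal sketch of such a loop, reusing the two file URLs from the example above: iterate over a list of URLs and fetch each one with curl:

```shell
#!/bin/bash
# List of URLs to fetch, one per line
urls="https://www.cyberciti.biz/files/adduser.txt
https://www.cyberciti.biz/files/test-lwp.pl.txt"

# Download each URL, keeping the remote file name (-O)
# and following any redirects (-L)
for u in $urls
do
    curl -L -O "$u"
done
```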
Grab a password protected file with curl
Try any one of the following syntaxes:
curl ftp://username:passwd@ftp1.cyberciti.biz:21/path/to/backup.tar.gz
curl --ftp-ssl -u UserName:PassWord ftp://ftp1.cyberciti.biz:21/backups/07/07/2012/mysql.blog.sql.tar.gz
curl https://username:passwd@server1.cyberciti.biz/file/path/data.tar.gz
curl -u Username:Password https://server1.cyberciti.biz/file/path/data.tar.gz
Downloading file using a proxy server
Again syntax is as follows:
curl -x proxy-server-ip:PORT -O url
curl -x 'http://vivek:YourPasswordHere@10.12.249.194:3128' -v -O https://dl.cyberciti.biz/pdfdownloads/b8bf71be9da19d3feeee27a0a6960cb3/569b7f08/cms/631.pdf
How to use curl command with proxy username/password
Examples
The curl command can provide useful information, especially HTTP headers. Hence, one can use such information for debugging server issues. Let us see some examples of curl commands. Pass the -v option to view the complete request sent to, and response received from, the web server:
curl -v url
curl -o output.pdf -v https://www.cyberciti.biz/files/sticker/sticker_book.pdf
Getting HTTP headers information without downloading files
Another useful option is to fetch HTTP headers. All HTTP servers support the HEAD method, which curl uses to get nothing but the header of a document. For instance, when you want to view the HTTP response headers only, without downloading the data or actual files:
curl -I url
curl -I https://www.cyberciti.biz/files/sticker/sticker_book.pdf -o output.pdf
Getting header information for given URL
How do I skip SSL certificate checks when using curl?
If the remote server has a self-signed certificate, you may want to skip the SSL checks. Therefore, pass the -k option as follows:
curl -k url
curl -k https://www.cyberciti.biz/
Rate limiting download/upload speed
You can specify the maximum transfer rate you want curl to use for both downloads and uploads. This feature is handy if you have limited Internet bandwidth and you would like your transfer not to use it all. The given speed is measured in bytes/second, unless a suffix is appended. Appending 'k' or 'K' will count the number as kilobytes, 'm' or 'M' makes it megabytes, while 'g' or 'G' makes it gigabytes. For example: 200K, 3m and 1G:
curl --limit-rate {speed} url
curl --limit-rate 200 https://www.cyberciti.biz/
curl --limit-rate 3m https://www.cyberciti.biz/
Setting up user agent
Some web application firewalls will block the default curl user agent while downloading files. To avoid such problems, pass the -A option, which allows you to set the user agent:
curl -A 'user agent name' url
curl -A 'Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0' https://google.com/
Upload files with CURL
The syntax is as follows to upload files:
curl -F "var=@path/to/local/file.pdf" https://url/upload.php
For example, you can upload a file at ~/Pictures/test.png to the server https://127.0.0.1/app/upload.php, which processes file input with a form parameter named img_file. Run:
curl -F "img_file=@~/Pictures/test.png" https://127.0.0.1/app/upload.php
One can upload multiple files as follows:
curl -F "img_file1=@~/Pictures/test-1.png" \
-F "img_file2=@~/Pictures/test-2.png" https://127.0.0.1/app/upload.php
Make curl silent
Want to hide the progress meter or error messages? Try passing the -s or --silent option to turn on curl's quiet mode:
curl -s url
curl --silent --output filename https://url/foo.tar.gz
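Note that -s also suppresses error messages. A common pattern is to combine -s with -S (--show-error) so that the progress meter stays hidden but errors are still printed if the transfer fails:

```shell
# Quiet mode, but still report errors on failure (hypothetical URL)
curl -sS --output foo.tar.gz https://some-url/foo.tar.gz
```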
Conclusion
Like most Linux or Unix CLI utilities, you can learn much more about the curl command by reading its man page (type: man curl).
Use wget Command To Download Files From HTTPS Domains
How do I download a file such as https://example.com/dl/foo.tar.gz using the wget command line utility?
GNU Wget is a free utility for the non-interactive download of files from the Web. It supports various protocols such as HTTP, HTTPS, and FTP, and retrieval through HTTP proxies. Wget is non-interactive, meaning that it can work in the background while the user is not logged on to the system. It is a perfect tool for your shell scripts to grab files from HTTPS-enabled websites, too.
Tutorial details | |
---|---|
Difficulty level | Easy |
Root privileges | No |
Requirements | wget on Linux/Unix |
Est. reading time | 3 minutes |
Using wget command to download files from HTTPS websites
The syntax is:
wget https://cyberciti.biz/foo/bar.tar.gz
wget [options] https://url/file
If you do not want to check the validity of the certificate, just pass the --no-check-certificate option to the wget command line:
wget --no-check-certificate https://cyberciti.biz/foo/bar.tar.gz
You can pass the --no-proxy option to the wget command. This option tells wget not to use proxies, even if the appropriate '*_proxy' environment variable is defined:
wget --no-proxy https://cyberciti.biz/foo/bar.tar.gz
Set the https proxy using https_proxy environment variable:
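For example (a sketch with a hypothetical proxy host and port):

```shell
# Point wget at an HTTPS proxy via the environment (hypothetical values)
export https_proxy="http://proxy.example.com:3128"
wget https://cyberciti.biz/foo/bar.tar.gz
```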
Examples
Here is what we see:
ftp_proxy shell environment variable
The ftp_proxy variable should contain the URL of the proxy for FTP connections. It is quite common that http_proxy and ftp_proxy are set to the same URL. Hence:
export http_proxy="10.8.0.1:3128"
export https_proxy="$http_proxy"
export ftp_proxy="$http_proxy"
wget https://url
wget ftp://url
We can set the username and password for an FTP server when using wget with the --ftp-user=user and --ftp-password=password options:
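For instance (hypothetical credentials and server):

```shell
# Authenticate against the FTP server with an explicit user and password
wget --ftp-user=vivek --ftp-password=MyPassword ftp://ftp.example.com/backup.tar.gz
```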
Without an FTP username and password, wget will use the anonymous FTP option. We can force TLS/SSL FTP (FTPS) as follows for security and privacy reasons:
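A sketch using implicit FTPS (the --ftps-implicit option is available in GNU wget 1.18 and later; the server and credentials below are hypothetical):

```shell
# Start SSL/TLS from the very beginning of the control connection (implicit FTPS)
wget --ftps-implicit --ftp-user=vivek --ftp-password=MyPassword ftps://ftp.example.com/backup.tar.gz
```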
Please note that FTPS consists of initializing SSL/TLS from the very beginning of the control connection. This option does not send an “AUTH TLS” command. It assumes the server speaks FTPS and directly starts an SSL/TLS connection.
Conclusion
GNU wget has many more options. I suggest you read the wget man page by typing the following Linux or Unix/macOS command:
man wget
wget —help
Comments on this entry are closed.
Thanks! That really helps.
I want to know how I can set --no-check-certificate by default on wget for https. Is this possible?
Not sure if you found the answer in the meantime?
You need to use: check-certificate = off (or check_certificate = off) in your .wgetrc startup file.
I have used wget to login to the application with save cookies parameter in the first step. I have used .pem files for the SSL handshake.
I pass userid and password to login to the app.
In the 2nd step I use the same certificate .pem files , also the same cookies file generated in 1st step. This step actually calls the main function of the application to perform certain operation/job.
Now the client doesn’t want to give userid and pass in the 1st step, I am not able to find a way how to directly invoke the function of 2nd step to call the application to do the job.
Can you tell me how wget can be used in windows to call the main function by directly logging in to application.