- 5 Linux Command Line Based Tools for Downloading Files and Browsing Websites
- 1. rTorrent
- Installation of rTorrent in Linux
- 2. Wget
- Installation of Wget in Linux
- Basic Usage of Wget Command
- 3. cURL
- Installation of cURL in Linux
- Basic Usage of cURL Command
- 4. w3m
- Installation of w3m in Linux
- Basic Usage of w3m Command
- 5. Elinks
- Installation of Elinks in Linux
- Basic Usage of elinks Command
- 2 Ways to Download Files From Linux Terminal
- Download files from Linux terminal using wget command
- Installing wget
- Download a file or webpage using wget
- Download files with a different name using wget
- Download a folder using wget
- Download an entire website using wget
- Bonus Tip: Resume incomplete downloads
- Download files from Linux command line using curl
- Installing curl
- Download files or webpage using curl
- Download files with a different name
- Pause and resume download with curl
- About Abhishek Prakash
- How to Download a File on Ubuntu Linux using the Command Line
- Download files using Curl
- Install curl
- Download and save the file using the source file name
- Download and save the file with a different name
- Download multiple files
- Download files from an FTP Server
- Pause and resume download
- Download files using Wget
- Install wget
- Download file or webpage using wget
- Download files with a different name
- Download files through FTP
- Recursively download files
- Download multiple files
- Pause and Resume download
- Karim Buzdar
5 Linux Command Line Based Tools for Downloading Files and Browsing Websites
The Linux command line, the most adventurous and fascinating part of GNU/Linux, is a very powerful tool. The command line itself is highly productive, and the availability of various built-in and third-party command-line applications makes Linux robust and versatile. The Linux shell supports a variety of web tools, be it a torrent client, a dedicated downloader, or a text-based browser.
5 Command-Line Internet Tools
Here we present 5 great command-line Internet tools that are very useful and prove very handy for downloading files in Linux.
1. rTorrent
rTorrent is a text-based BitTorrent client written in C++ and aimed at high performance. It is available for most standard Linux distributions, as well as FreeBSD and Mac OS X.
Installation of rTorrent in Linux
Check if rtorrent is installed correctly by running the following command in the terminal.
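As a hedged sketch, on a Debian- or Ubuntu-based system the installation and check might look like this (package names and package managers vary by distribution):

sudo apt install rtorrent    # use yum or dnf on RPM-based distributions
rtorrent                     # launches the text-based client; press Ctrl+q to quit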
rTorrent Command Line Tool
Functioning of rTorrent
Some of the useful Key-bindings and their use.
- CTRL+q – Quit the rTorrent application.
- CTRL+s – Start a download.
- CTRL+d – Stop an active download or remove an already stopped download.
- CTRL+k – Stop and close an active download.
- CTRL+r – Hash-check a torrent before the upload/download begins.
- CTRL+q – When this key combination is pressed twice, rTorrent shuts down without sending a stop signal.
- Left Arrow Key – Go back to the previous screen.
- Right Arrow Key – Go to the next screen.
2. Wget
Wget is part of the GNU Project; the name is derived from World Wide Web (WWW). Wget is a brilliant tool that is useful for recursive downloads and offline viewing of HTML from a local server, and it is available for most platforms, be it Windows, Mac, or Linux.
Wget makes it possible to download files over HTTP, HTTPS, and FTP. Moreover, it can mirror a whole website and supports proxy browsing and pausing/resuming downloads.
Installation of Wget in Linux
Wget, being a GNU project, comes bundled with most standard Linux distributions, and there is no need to download and install it separately. If it is not installed by default, you can still install it using apt, yum, or dnf.
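For example (a sketch; the package is simply named wget on most distributions):

sudo apt install wget    # Debian/Ubuntu
sudo yum install wget    # older RHEL/CentOS
sudo dnf install wget    # Fedora and newer RHEL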
Basic Usage of Wget Command
Download a single file using wget.
Download a whole website, recursively.
Download specific types of files (say pdf and png) from a website.
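Hedged sketches of the three commands above, using example.com as a placeholder URL:

wget https://example.com/file.zip          # download a single file
wget -r https://example.com/               # download a whole website, recursively
wget -r -A pdf,png https://example.com/    # recursively fetch only .pdf and .png files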
Wget is a wonderful tool that enables custom and filtered downloads even on a machine with limited resources. Below is a screenshot of a wget download, where we are mirroring a website (yahoo.com).
Wget Command Line File Download
For more such wget download examples, read our article that shows 10 Wget Download Command Examples.
3. cURL
cURL is a command-line tool for transferring data over a number of protocols. It is a client-side application that supports protocols such as FTP, HTTP, FTPS, TFTP, TELNET, IMAP, and POP3.
cURL is a simple downloader that differs from wget in supporting additional protocols such as LDAP and POP3. Moreover, proxy downloads and pausing/resuming downloads are well supported in cURL.
Installation of cURL in Linux
By default, cURL is available in most distributions, either preinstalled or in the repository. If it is not installed, just use apt or yum to get the required package from the repository.
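For example (a sketch; the package is named curl on most distributions):

sudo apt install curl    # Debian/Ubuntu
sudo yum install curl    # RHEL/CentOS
sudo dnf install curl    # Fedora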
Basic Usage of cURL Command
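A couple of hedged examples, using example.com as a placeholder URL:

curl -O https://example.com/file.zip      # save the file under its remote name
curl -o page.html https://example.com/    # save the output under a custom name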
For more such curl command examples, read our article that shows 15 Tips On How to Use ‘Curl’ Command in Linux.
4. w3m
w3m is a text-based web browser released under the GPL. w3m supports tables, frames, colors, SSL connections, and inline images, and it is known for fast browsing.
Installation of w3m in Linux
Again, w3m is available by default in most Linux distributions. If it is not available, you can always install the required package with apt or yum.
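For example (a sketch; the package is typically named w3m):

sudo apt install w3m    # Debian/Ubuntu
sudo dnf install w3m    # Fedora/RHEL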
Basic Usage of w3m Command
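A minimal hedged example, opening a site in the terminal and dumping a page to plain text (tecmint.com used as an example URL):

w3m www.tecmint.com                              # browse interactively; press q to quit
w3m -dump https://www.tecmint.com > page.txt     # dump the rendered page to a text file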
5. Elinks
ELinks is a free text-based web browser for Unix and Unix-like systems. ELinks supports HTTP and HTTP cookies, and also supports browser scripting in Perl and Ruby.
Tab-based browsing is well supported. Best of all, it supports mouse input and display colors, and handles a number of protocols such as HTTP, FTP, SMB, IPv4, and IPv6.
Installation of Elinks in Linux
By default, elinks is also available in most Linux distributions. If not, install it via apt or yum.
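For example (a sketch; the package is typically named elinks):

sudo apt install elinks    # Debian/Ubuntu
sudo dnf install elinks    # Fedora/RHEL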
Basic Usage of elinks Command
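A minimal hedged example, opening a website from the terminal (tecmint.com used as an example URL):

elinks www.tecmint.com    # browse interactively; press q to quit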
That’s all for now. I’ll be back again with another interesting article that you will love to read. Till then, stay tuned and connected to TecMint, and don’t forget to give your valuable feedback in the comment section.
2 Ways to Download Files From Linux Terminal
Last updated May 22, 2021 by Abhishek Prakash
If you are stuck to the Linux terminal, say on a server, how do you download a file from the terminal?
There is no dedicated download command in Linux, but there are a couple of Linux commands for downloading files.
In this terminal trick, you’ll learn two ways to download files using the command line in Linux.
I am using Ubuntu here, but apart from the installation, the rest of the commands are equally valid for all other Linux distributions.
Download files from Linux terminal using wget command
wget is perhaps the most used command line download manager for Linux and UNIX-like systems. You can download a single file, multiple files, entire directory or even an entire website using wget.
wget is non-interactive and can easily work in the background. This means you can easily use it in scripts or even build tools like uGet download manager.
Let’s see how to use wget to download file from terminal.
Installing wget
Most Linux distributions come with wget preinstalled. It is also available in the repository of most distributions and you can easily install it using your distribution’s package manager.
On Ubuntu and Debian-based distributions, you can use the apt package manager command:
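That is:

sudo apt install wget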
Download a file or webpage using wget
You just need to provide the URL of the file or webpage. It will download the file with its original name in the directory you are in.
To download multiple files, you’ll have to save their URLs in a text file and provide that text file as input to wget like this:
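Hedged sketches of both forms, with a placeholder URL and a hypothetical download_list.txt:

wget https://example.com/file.zip    # single file, saved under its original name
wget -i download_list.txt            # download every URL listed in the text file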
Download files with a different name using wget
You’ll notice that a webpage is almost always saved as index.html with wget. It is a good idea to provide a custom name for the downloaded file.
You can use the -O (uppercase O) option to provide the output filename while downloading.
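A sketch with a placeholder file name and URL:

wget -O custom_name.html https://example.com/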
Download a folder using wget
Suppose you are browsing an FTP server and you need to download an entire directory. In that case, you can use the recursive option:
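A sketch, assuming a hypothetical FTP server and path:

wget -r ftp://ftp.example.com/pub/somedir/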
Download an entire website using wget
Yes, you can totally do that. You can mirror an entire website with wget. By downloading an entire website I mean the entire public facing website structure.
While you can use the mirror option -m directly, it is a good idea to also add the following options (see the combined command after this list):
- --convert-links : links are converted so that internal links point to the downloaded resources instead of the web
- --page-requisites : downloads additional things like style sheets so that the pages look better offline
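Putting it together, a hedged sketch with a placeholder URL:

wget -m --convert-links --page-requisites https://example.com/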
Bonus Tip: Resume incomplete downloads
If you aborted the download by pressing Ctrl+C for some reason, you can resume the previous download with the -c option.
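A sketch, rerun from the same directory as the partial download (placeholder URL):

wget -c https://example.com/large-file.iso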
Download files from Linux command line using curl
Like wget, curl is also one of the most popular commands to download files in Linux terminal. There are so many ways to use curl extensively but I’ll focus on only the simple downloading here.
Installing curl
Though curl doesn’t come preinstalled, it is available in the official repositories of most distributions. You can use your distribution’s package manager to install it.
To install curl on Ubuntu and other Debian based distributions, use the following command:
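That is:

sudo apt install curl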
Download files or webpage using curl
If you use curl without any option with a URL, it will read the file and print it on the terminal screen.
To download a file using curl command in Linux terminal, you’ll have to use the -O (uppercase O) option:
It is simpler to download multiple files in Linux with curl. You just have to specify multiple URLs:
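Hedged sketches of both cases, with placeholder URLs:

curl -O https://example.com/file.zip                                      # save with the remote name
curl -O https://example.com/file1.zip -O https://example.com/file2.zip    # several files at once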
Keep in mind that curl is not as simple as wget. While wget saves webpages as index.html, curl will complain that the remote file does not have a name when you point it at a webpage. You’ll have to save it with a custom name, as described in the next section.
Download files with a different name
It could be confusing, but to provide a custom name for the downloaded file (instead of the original source name), you’ll have to use the -o (lowercase o) option:
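A sketch with a placeholder name and URL:

curl -o my_file.zip https://example.com/download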
Sometimes curl won’t download the file as you expect it to. You’ll have to use the -L option (for location) to download it correctly. This is because some links redirect to another link, and with the -L option, curl follows the final link.
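A sketch combining -L with a custom output name (placeholder URL):

curl -L -o my_file.zip https://example.com/redirecting-link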
Pause and resume download with curl
Like wget, you can also resume a paused download using curl, with the -C - option:
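A sketch, run from the directory holding the partially downloaded file (placeholder URL):

curl -C - -O https://example.com/large-file.iso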
Conclusion
As always, there are multiple ways to do the same thing in Linux. Downloading files from the terminal is no different.
wget and curl are just two of the most popular commands for downloading files in Linux. There are more such command-line tools. Terminal-based web browsers like elinks and w3m can also be used for downloading files on the command line.
Personally, for a simple download, I prefer using wget over curl. It is simpler and less confusing because you may have a difficult time figuring out why curl could not download a file in the expected format.
Your feedback and suggestions are welcome.
About Abhishek Prakash
Creator of It’s FOSS. An ardent Linux user & open source promoter. Huge fan of classic detective mysteries ranging from Agatha Christie and Sherlock Holmes to Detective Columbo & Ellery Queen. Also a movie buff with a soft corner for film noir.
I’ve found that Aria2 is a much better way to download files, especially large ones like Linux LiveUSB ISOs. There’s so much Aria2 can do, which is why it is always on my list of the packages I always install first onto newly installed distros or even ones I’m trying out as LiveUSBs.
With Aria2 you can set download speed, so you don’t eat up bandwidth while doing something else that requires it. It has many other features like resuming unfinished DLs among many others.
One of my absolute favorite features is that Aria2 can also be used to both download and upload Torrents as a peer and seeder! It can do this by first downloading the .torrent file, which when finished will automatically begin downloading the contents of that file, or can be used by copying over the magnet-link. I think there’s even a way to do this with the torrent hash as well.
Or install it with:
sudo apt install aria2
pacman -S aria2
It’s also available for Solus and Sabayon with these commands (I think):
sudo eopkg install aria2
sudo equo install aria2
I primarily use Debian and Arch, so those are the only two I have memorized. I’ve tested Solus and Sabayon quite a bit too, that’s why I’m pretty sure those are the proper commands for those distros.
There’s obviously ways to install it with other distros like Fedora using dnf, but I’m not familiar with that package manager and don’t want to give incorrect info. I’d also be surprised if Aria2 isn’t available as a snap package.
Aria2’s homepage will give details about installing it with a myriad of package managers. Its help page will also show all the other details, like setting the aforementioned download speed, setting the parameters for downloading and seeding torrents, and many, many others!
You should definitely consider switching from Wget and Curl over to Aria2!
How to Download a File on Ubuntu Linux using the Command Line
The Linux command line offers more flexibility and control than the GUI. Many people prefer the command line because it is easier and quicker to use, it makes it simpler to automate tasks with a single line, and it uses fewer resources than a GUI.
Downloading files is a routine, everyday task that can involve file types like ZIP, TAR, ISO, PNG, etc. You can perform this task simply and quickly from the command-line terminal; it requires only your keyboard. So today, I will show you how you can download a file using the command line in Linux. There are normally two well-known ways to do this: using the wget and curl utilities. For this article, I am using Ubuntu 20.04 LTS to describe the procedure, but the same commands will work on other Linux distributions like Debian, Gentoo, and CentOS too.
Download files using Curl
Curl can be used to transfer data over a number of protocols, including HTTP, HTTPS, FTP, TFTP, TELNET, SCP, and more. Using curl, you can download any remote file. It supports pause and resume functions as well.
To get started, you first need to install curl.
Install curl
Launch the command-line application in Ubuntu, that is, the Terminal, by pressing the Ctrl+Alt+T key combination. Then enter the command below to install curl with sudo.
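That is:

sudo apt install curl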
When prompted for a password, enter your sudo password.
Once the installation is complete, you can use the commands described in the following sections to download files.
Download and save the file using the source file name
To save the file with the same name as the original source file on the remote server, use -O (uppercase O) with curl, as below:
Instead of -O, you can also specify "--remote-name", as shown below. Both work the same.
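Hedged sketches of both forms, with a placeholder URL:

curl -O https://example.com/file.zip
curl --remote-name https://example.com/file.zip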
Download and save the file with a different name
If you want to download the file and save it under a different name than the file on the remote server, use -o (lowercase o) as shown below. This is especially helpful when the remote URL doesn’t contain the file name, as in the example below.
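A sketch of the syntax, using a hypothetical URL that lacks a file name:

curl -o [filename] https://example.com/download?id=1234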
[filename] is the new name of the output file.
Download multiple files
To download multiple files, enter the command in the following syntax:
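A sketch of the syntax with placeholder URLs:

curl -O https://example.com/file1.zip -O https://example.com/file2.zip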
Download files from an FTP Server
To download a file from an FTP server, enter the command in the following syntax:
To download files from user authenticated FTP servers, use the following syntax:
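Hedged sketches of both the anonymous and the authenticated forms (hypothetical server, user, and path):

curl -O ftp://ftp.example.com/pub/file.tar.gz
curl -u ftpuser:ftppassword -O ftp://ftp.example.com/private/file.tar.gz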
Pause and resume download
While downloading a file, you can manually pause it using Ctrl+C, or it may get interrupted and stopped for some other reason. Either way, you can resume it: navigate to the same directory where you previously downloaded the file, then enter the command in the following syntax:
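A sketch, assuming the same placeholder URL as the interrupted download:

curl -C - -O https://example.com/large-file.iso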
Download files using Wget
Using wget, you can download files and content from web and FTP servers. The name wget comes from combining "www" and "get". It supports protocols like FTP, HTTP, and HTTPS, and it also supports recursive download. This feature is very useful if you want to download an entire website for offline viewing or generate a backup of a static website. In addition, you can use it to retrieve content and files from various web servers.
Install wget
Launch the command-line application in Ubuntu, that is, the Terminal, by pressing the Ctrl+Alt+T key combination. Then enter the command below to install wget with sudo.
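That is:

sudo apt install wget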
When prompted for a password, enter the sudo password.
Download file or webpage using wget
To download a file or a webpage, open the Terminal and enter the command in the following syntax:
To save a single webpage, enter the command in the following syntax:
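Hedged sketches with placeholder URLs:

wget https://example.com/file.zip          # download a file
wget https://example.com/some-page.html    # save a single webpage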
Download files with a different name
If you want to download and save the file with a different name than the name of the original remote file, use -O (uppercase O) as shown below. This is especially helpful when you are downloading a webpage that would automatically get saved with the name “index.html”.
To download a file with a different name, enter the command in the following syntax:
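A sketch of the syntax with a placeholder name and URL:

wget -O custom_name.html https://example.com/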
Download files through FTP
To download a file from an FTP server, type the command in the following syntax:
To download files from user authenticated FTP servers, use the below syntax:
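Hedged sketches of both the anonymous and the authenticated forms (hypothetical server, user, and path):

wget ftp://ftp.example.com/pub/file.tar.gz
wget --ftp-user=ftpuser --ftp-password=ftppassword ftp://ftp.example.com/private/file.tar.gz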
Recursively download files
You can use the recursive download feature to download everything under the specified directory whether a website or an FTP site. To use the recursive download feature, enter the command in the below syntax:
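A sketch with a placeholder URL:

wget -r https://example.com/some-directory/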
Download multiple files
You can use wget to download multiple files. Make a text file with a list of file URLs, then use the wget command in the following syntax to download that list.
For instance, I have a text file named “downloads.txt” containing a list of two URLs that I want to download using wget.
I will use the below command to download the file links contained in the text file:
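That is:

wget -i downloads.txt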
You can see that it is downloading both links one by one.
Pause and Resume download
You can press Ctrl+C to pause a download. To resume a paused download, go to the same directory where you were downloading the file previously and use the -c option after wget, as in the below syntax:
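A sketch with a placeholder URL:

wget -c https://example.com/large-file.iso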
Using the above command, you will notice that your download has resumed from where it was paused.
So in this article, we have discussed the basic usage of two command-line methods with which you can download a file. One thing to note: if you do not specify a directory while downloading a file, the files will be downloaded into the current directory in which you are working.
Karim Buzdar
About the Author: Karim Buzdar holds a degree in telecommunication engineering and holds several sysadmin certifications. As an IT engineer and technical author, he writes for various web sites. You can reach Karim on LinkedIn