What Is Wget and How to Use It

Wget is a free command-line utility that downloads files from the web using HTTP, HTTPS, FTP, and FTPS protocols. It's a go-to tool for Linux users who need reliable, scriptable, and resumable downloads.

Publish date: 10/28/2025

If you've ever needed to download a file from a server, mirror an entire website, or automate file retrieval in a script, you've probably heard of Wget. It's one of those quiet workhorses of the command line that doesn't get much fanfare but does its job incredibly well.

Whether you're a sysadmin pulling down software packages, a developer automating deployments, or just someone who wants a better way to grab files without a browser, Wget has you covered.

What is Wget?

Wget is a free, open-source command-line utility designed for retrieving files from the web. It supports HTTP, HTTPS, FTP, and FTPS protocols, making it flexible enough to handle most download scenarios you'll encounter. The name itself is a portmanteau of "World Wide Web" and "get," which pretty much sums up what it does.

What makes Wget special isn't just that it downloads files. It's non-interactive, meaning it can work in the background without requiring user input. This makes it perfect for scripts, automated tasks, and situations where you need to download something remotely over SSH. It's also incredibly reliable when network connections are spotty; Wget can resume interrupted downloads and retry failed connections automatically.

Originally written by Hrvoje Nikšić in 1996, Wget has become a standard tool in most Linux distributions and is available for Unix-like systems, Windows, and macOS. If you're running a modern Linux system, there's a good chance Wget is already installed.

How Wget works

At its core, Wget functions as an HTTP/FTP client. When you run a Wget command, it sends an HTTP request to the specified server, receives the response, and writes the data to a file on your local system. But unlike a web browser, Wget doesn't render web pages or execute JavaScript; it simply retrieves the raw content.

The basic syntax is straightforward:

wget [options] [URL]

For example, downloading a single file looks like this:

wget https://example.com/file.zip

Wget will connect to the server, download the file, and save it to your current directory with the same filename. Simple as that.

But Wget gets interesting when you start using its options. You can limit download speed, set the number of retry attempts, download recursively to mirror entire websites, authenticate with usernames and passwords, and much more. The tool reads URLs from the command line, but it can also pull them from a text file if you're batch downloading.
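As a sketch of what combining a few of those options looks like (the URL and filename here are placeholders):

```shell
# Hypothetical example: retry a flaky download up to 5 times, back off
# up to 30 seconds between attempts, and save it under a chosen name
wget --tries=5 --waitretry=30 -O dataset.tar.gz https://example.com/dataset.tar.gz
```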

One of Wget's most useful features is its ability to resume downloads. If your connection drops midway through downloading a large file, you can restart Wget with the -c flag, and it'll pick up right where it left off. This alone has saved countless hours of bandwidth and frustration over the years.

Wget also respects robots.txt files by default when mirroring websites, which means it won't accidentally hammer a server or download content that site owners have marked as off-limits to crawlers.
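If you're mirroring a site you own (or have explicit permission to crawl), you can override that default. The sketch below disables the robots.txt check while adding a one-second delay between requests to stay polite:

```shell
# Ignore robots.txt and wait 1 second between requests
# (only do this on sites you own or have permission to crawl)
wget -e robots=off --wait=1 --recursive https://example.com
```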

What Wget is used for

The use cases for Wget are surprisingly diverse. Here are some of the most common scenarios:

Downloading files from the command line

This is the bread and butter. If you're working on a server without a GUI or need to grab a file quickly over SSH, Wget is your friend. Downloading straight to the remote machine is far quicker than pulling the file down to your own computer and copying it back over.


Automating downloads

Because Wget works non-interactively, it's perfect for cron jobs and scripts. You can schedule regular downloads of backups, log files, software updates, or any other content that needs to be retrieved on a schedule.
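For instance, a crontab entry along these lines (the URL and paths are hypothetical) would quietly fetch a backup every night at 2:30 AM:

```shell
# Hypothetical crontab entry: fetch the latest backup nightly at 2:30 AM;
# -nv keeps output to one line per file, and everything goes to a log
30 2 * * * wget -nv -O /var/backups/nightly.tar.gz https://example.com/backups/latest.tar.gz >> /var/log/backup-fetch.log 2>&1
```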

Mirroring websites

Need a local copy of a website for offline browsing, archival, or testing? Wget can recursively download entire sites, following links and preserving directory structure. This is handy for creating static backups or analyzing site structure.

Testing and debugging

Developers often use Wget to test HTTP endpoints, check response headers, or verify that files are accessible from the command line. It's a lightweight alternative to tools like curl when you just need to see if something downloads correctly.

Batch downloading

If you have a list of URLs in a text file, Wget can process them all sequentially. This is useful for downloading datasets, media files, or any collection of resources that would be tedious to grab one by one.

How to use Wget

Let's walk through some practical examples that cover the most common use cases.

Installing Wget

Most Linux distributions include Wget by default. To check if it's installed, run:

wget --version

If it's not installed, you can grab it through your package manager:

# Debian/Ubuntu
sudo apt install wget

# RHEL, Fedora, and derivatives
sudo dnf install wget

# Arch Linux
sudo pacman -S wget

On macOS, you can install it via Homebrew:

brew install wget

Basic download

The simplest use case is downloading a single file:

wget https://example.com/file.tar.gz

The file will be saved in your current directory with its original filename.

Saving with a different filename

If you want to specify a custom filename, use the -O flag:

wget -O custom-name.tar.gz https://example.com/file.tar.gz

Resuming interrupted downloads

If a download gets interrupted, resume it with:

wget -c https://example.com/largefile.iso

The -c flag tells Wget to continue from where it left off.

Downloading in the background

For large files that might take a while, you can run wget in the background:

wget -b https://example.com/bigfile.zip

Wget will log output to wget-log in the current directory.
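To check on a backgrounded download, you can follow that log as it's written:

```shell
# Watch the background download's progress in real time
tail -f wget-log
```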

Limiting download speed

To avoid saturating your bandwidth, you can throttle the download speed:

wget --limit-rate=1m https://example.com/file.zip

This limits the download to 1 megabyte per second. You can use k for kilobytes or m for megabytes.

Downloading multiple files

If you have a list of URLs in a text file (one URL per line), you can download them all at once:

wget -i urls.txt

Mirroring a website

To create a local copy of a website, use the mirror option:

wget --mirror --convert-links --page-requisites https://example.com

This will recursively download the site, convert links for offline browsing, and grab all necessary assets like CSS and images. Under the hood, --mirror is shorthand for -r -N -l inf --no-remove-listing, which turns on recursion, timestamp checking, and unlimited depth.

Downloading with authentication

If a resource requires HTTP authentication, provide credentials with:

wget --user=username --password=password https://example.com/protected-file.zip
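One caveat: passing --password on the command line exposes the secret in your shell history and the process list. A safer sketch is to let Wget prompt for it instead:

```shell
# Prompt for the password interactively rather than putting it on the
# command line (username and URL are placeholders)
wget --user=username --ask-password https://example.com/protected-file.zip
```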

For FTP, Wget handles authentication similarly:

wget ftp://username:password@ftp.example.com/file.zip

Checking headers without downloading

Sometimes you just want to see response headers without downloading the entire file:

wget --spider --server-response https://example.com/file.zip

The --spider flag tells Wget not to download anything.

Wget vs curl

If you've been around the command line for a while, you might be wondering how Wget compares to curl, another popular download tool. Both are excellent, but they have different strengths.

Wget is better suited for recursive downloads and mirroring websites. It's designed specifically for downloading files and handles this task with minimal configuration. Wget also makes resuming downloads straightforward and includes built-in support for retries.

curl, on the other hand, is more flexible when it comes to protocols and supports a wider range of them, including SMTP, IMAP, and more. It's often preferred for API testing and debugging because it makes it easy to customize requests with headers, POST data, and authentication methods.

In practice, many people use both tools depending on the task. If you're downloading files or mirroring content, reach for Wget. If you're working with APIs or need more granular control over HTTP requests, curl is probably the better choice.

Frequently asked questions about Wget

What does Wget stand for?

Wget stands for "World Wide Web get." The name reflects its purpose as a tool for retrieving content from the web via command line.

Is Wget available on Windows?

Yes, Wget is available for Windows. You can download pre-compiled binaries from the GNU Wget website or install it through package managers like Chocolatey or via Windows Subsystem for Linux.

Can Wget download from password-protected sites?

Absolutely. Wget supports HTTP authentication using the --user and --password flags, and it can handle FTP authentication as well. For more complex authentication schemes like OAuth, you might need to use curl or other specialized tools.
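That said, if a service accepts a simple bearer token, you can often get by with Wget's --header option. This sketch assumes a hypothetical API endpoint and a token stored in the TOKEN environment variable:

```shell
# Send a bearer token in the Authorization header
# ($TOKEN and the URL are placeholders for your own values)
wget --header="Authorization: Bearer $TOKEN" https://api.example.com/export.csv
```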

How do I make Wget ignore SSL certificate errors?

If you're downloading from a site with a self-signed or expired SSL certificate, you can bypass verification with the --no-check-certificate flag. Keep in mind this reduces security, so only use it when you trust the source.

Can Wget follow redirects?

Yes, Wget follows HTTP redirects by default. If you want to limit the number of redirects it follows, you can use the --max-redirect option.

How do I download only specific file types?

You can use the -A flag to accept only certain file types during recursive downloads. For example, to download only PDF files:

wget -r -A pdf https://example.com

Does Wget work with proxies?

Yes, Wget can work through HTTP and HTTPS proxies. You can point it at a proxy using the standard http_proxy and https_proxy environment variables, pass settings on the command line with -e (for example, -e use_proxy=on), or configure them permanently in your .wgetrc file.
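For example, a minimal ~/.wgetrc pointing at a hypothetical proxy might look like this:

```shell
# ~/.wgetrc — hypothetical proxy settings (host and port are placeholders)
use_proxy = on
http_proxy = http://proxy.example.com:3128/
https_proxy = http://proxy.example.com:3128/
```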

Conclusion

Wget is one of those tools that once you start using it, you wonder how you ever managed without it. It's reliable, scriptable, and handles everything from single file downloads to full website mirrors with equal ease. For anyone working in a server environment or just looking for more control over their downloads, it's an essential part of the toolkit.

Thanks for reading! If you're looking for reliable infrastructure to host your projects, xTom provides enterprise-grade dedicated servers and colocation services, while V.PS offers scalable, production-ready NVMe-powered VPS hosting perfect for any workload. We also offer IP transit, shared hosting, and a range of IT services to meet your needs.

Ready to discuss your infrastructure needs? Contact our team to explore the right solution for your projects.