In this guide, you will see:
- The basic cURL download file syntax
- How to handle more complex scenarios when downloading files with cURL
- How to download multiple files at once
- Some best practices for using cURL effectively
- A quick comparison between cURL and Wget
Let’s dive in!
Basic cURL Download File Syntax
This is the most basic cURL download file syntax:
curl -O <file_url>
Note: On Windows, replace `curl` with `curl.exe`. This is required because `curl` is an alias for `Invoke-WebRequest` in Windows PowerShell, while `curl.exe` explicitly runs the cURL command-line tool.
The `-O` flag tells cURL to save the downloaded file with its original name from the URL specified in `<file_url>`. Equivalently, you can use `--remote-name`:
curl --remote-name <file_url>
For example, consider the following download file cURL command:
curl -O "https://i.imgur.com/CSRiAeN.jpg"
This will produce an output with a download progress bar as below:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 35354  100 35354    0     0   155k      0 --:--:-- --:--:-- --:--:--  158k
When the progress reaches 100%, a file named `CSRiAeN.jpg` will appear in the folder where you ran the cURL command.
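If you want to try `-O` without touching the network, note that cURL also speaks the `file://` protocol. The sketch below uses a made-up local file purely for illustration; `-O` keeps the last URL path segment (`sample.txt`) as the local file name:

```shell
# Create a local "remote" file, then let cURL download it.
# -O names the result after the last URL path segment:
printf 'hello' > /tmp/sample.txt
curl -s -O "file:///tmp/sample.txt"
cat sample.txt   # prints "hello"
```

The same naming rule applies to HTTP(S) URLs, which is why the imgur example above produces `CSRiAeN.jpg`.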
For more information on what cURL is and what options it offers, read our cURL guide. Time to explore more complex scenarios!
Using cURL to Download a File: Advanced Options
Now that you know the basic cURL download file syntax, you are ready to learn how to customize the command with additional options.
Change Downloaded File Name
By default, the `-O` option downloads the file specified in the target URL using its original name. If the remote file specified in the URL does not include a name, cURL creates a file with no extension called `curl_response` and prints a warning to inform you of that behavior:
Warning: No remote file name, uses "curl_response"
To specify a custom name for the downloaded file, use the `-o` (or `--output`) flag as shown here:
curl "https://i.imgur.com/CSRiAeN.jpg" -o "logo.jpg"
This command instructs cURL to perform a GET request to the specified file URL. Then, it saves the downloaded content under the name specified after `-o` instead of printing it to stdout.
This time, the output will be a `logo.jpg` file.
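On top of renaming, `-o` also accepts a path, and `--create-dirs` tells cURL to create any missing parent folders instead of failing. A minimal sketch, using a made-up local `file://` URL so it runs without network access:

```shell
# -o with a path saves the file under that location;
# --create-dirs creates the missing "downloads/" folder:
printf 'image bytes' > /tmp/source.jpg
curl -s "file:///tmp/source.jpg" -o "downloads/logo.jpg" --create-dirs
```

The same flags work unchanged with HTTP(S) URLs.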
Follow Redirects
Some URLs do not directly point to the desired file and require automatic redirects to reach the final destination.
To instruct cURL to follow redirects, you need to use the `-L` option:
curl -O -L "<file_url>"
Without the `-L` flag, cURL would only output the redirection response headers (such as `301 Moved Permanently` or `302 Found`). Specifically, it would not automatically follow the new location provided in the `Location` header.
Authenticate to the Server
Some servers restrict access to their resources and require user authentication. To perform basic HTTP or FTP authentication, you can use the `-u` (or `--user`) option. This enables you to specify a username and password in the following format:
<username>:<password>
Since the username and password are separated by a colon (`:`), the username cannot contain a colon. The password, however, can.
The `<password>` string is optional. If you only specify the username, cURL will prompt you to enter the password.
Here is the syntax for downloading a file with cURL using server authentication:
curl -O -u <username>:<password> <file_url>
For example, you can download a text file from a URL with authentication using this command:
curl -O -u "myUser:myPassword" "https://example.com/secret.txt"
cURL will authenticate to the server using `myUser` and `myPassword` as credentials. Next, it will download the `secret.txt` file.
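Passing credentials on the command line leaves them in your shell history. As an alternative sketch (the host and credentials below are hypothetical), you can store them in a netrc file and point cURL at it with `--netrc-file`, or use `-n` for the default `~/.netrc`:

```shell
# Create a credentials file for example.com (hypothetical values):
cat > /tmp/my_netrc <<'EOF'
machine example.com
login myUser
password myPassword
EOF

# Restrict access, since the file holds a plain-text password:
chmod 600 /tmp/my_netrc
```

You would then run `curl --netrc-file /tmp/my_netrc -O "https://example.com/secret.txt"`, and cURL looks up the `machine` entry matching the host to obtain the credentials.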
Impose Bandwidth Restrictions
By default, cURL downloads a file using the full available bandwidth, which may not always be desirable. To control the download speed, you can use the `--limit-rate` option followed by the maximum download speed you want to set:
curl -O --limit-rate 5k "https://i.imgur.com/CSRiAeN.jpg"
The output will be something like:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 35354  100 35354    0     0   5166      0  0:00:06  0:00:06 --:--:--  5198
Note that the average download speed (5198 bytes per second, roughly 5 KB per second) matches the limit specified in the option. This occurs even when your machine's normal download speed is higher than the value set with `--limit-rate`.
`--limit-rate` is useful for controlling bandwidth usage to avoid overloading the network, comply with bandwidth restrictions, or simulate slower network conditions for testing purposes.
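`--limit-rate` accepts a plain byte count or a `k`, `m`, or `g` suffix. A quick sketch with a made-up local `file://` URL so it runs offline:

```shell
# Cap the transfer at 1 MB/s; "500k" or a bare "1048576" work too:
printf '0123456789' > /tmp/big.bin
curl -s --limit-rate 1m -o capped.bin "file:///tmp/big.bin"
```

The throttle only matters for files large enough to exceed the cap, but the syntax is identical for real HTTP(S) downloads.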
Download Through a Proxy Server
When you perform a download request using cURL, your IP address is exposed to the target server. That is a problem if you want to maintain privacy or avoid anti-bot measures like rate limiting.
To mask your IP and route your request through a proxy, use the `-x` (or `--proxy`) option in your cURL command:
curl -x <proxy_url> -O <file_url>
`<proxy_url>` must be specified in the following format:
[protocol://]host[:port]
Note that the proxy URL will vary depending on whether you are using an HTTP, HTTPS, or SOCKS proxy. For more detailed instructions, refer to our cURL proxy integration guide.
For example, if you are using an HTTP proxy, the command becomes:
curl -x "http://proxy.example.com:8080" -O "https://i.imgur.com/CSRiAeN.jpg"
Perform Background Downloads
By default, the cURL download file command displays a progress bar, or an error message in case of failure. To disable these outputs, you can enable "silent" or "quiet" mode using the `-s` (or `--silent`) option:
curl -O -s "https://i.imgur.com/CSRiAeN.jpg"
This will make cURL operate silently. If the download is successful, the file will appear in the current directory, but there will be no feedback in the terminal.
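Silent mode also swallows error messages, which can hide failures in scripts. A common combination, sketched here against a made-up local `file://` URL so it runs offline: `-S` re-enables error reporting, and `-f` makes cURL treat HTTP errors such as 404 as failures with a non-zero exit code instead of saving the error page:

```shell
# -s hides the progress bar, -S keeps error messages visible,
# -f turns HTTP errors into non-zero exit codes:
printf 'content' > /tmp/ok.txt
curl -fsS -o fetched.txt "file:///tmp/ok.txt" && echo "saved"

# A missing resource makes cURL exit non-zero, so scripts can react:
curl -fsS -o /dev/null "file:///tmp/does_not_exist.txt" || echo "failed as expected"
```

`curl -fsS` is a widely used idiom for unattended downloads in scripts.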
Print Verbose Information
In case of an error, or to better understand what cURL is doing behind the scenes, enabling verbose mode with the `-v` (or `--verbose`) option is recommended:
curl -O -v "https://i.imgur.com/CSRiAeN.jpg"
When you execute the command, you will see additional output that provides detailed information about the request and response process:
* IPv6: (none)
* IPv4: 146.75.52.193
* Trying 146.75.52.193:443...
* schannel: disabled automatic use of client certificate
* ALPN: curl offers http/1.1
* ALPN: server accepted http/1.1
* Connected to i.imgur.com (146.75.52.193) port 443
* using HTTP/1.x
> GET /CSRiAeN.jpg HTTP/1.1
> Host: i.imgur.com
> User-Agent: curl/8.10.1
> Accept: */*
>
* Request completely sent off
* schannel: failed to decrypt data, need more data
< HTTP/1.1 200 OK
< Connection: keep-alive
< Content-Length: 35354
< Content-Type: image/jpeg
< Last-Modified: Wed, 08 Jan 2025 08:02:49 GMT
< ETag: "117b93e0521ba1313429bad28b3befc8"
< x-amz-server-side-encryption: AES256
< X-Amz-Cf-Pop: IAD89-P1
< X-Amz-Cf-Id: wTQ20stgw0Ffl1BRmhRhFqpCXY_2hnBLbPXn9D8LgPwdjL96xarRVQ==
< cache-control: public, max-age=31536000
< Accept-Ranges: bytes
< Age: 2903
< Date: Wed, 08 Jan 2025 08:51:12 GMT
< X-Served-By: cache-iad-kiad7000028-IAD, cache-lin1730072-LIN
< X-Cache: Miss from cloudfront, HIT, HIT
< X-Cache-Hits: 1, 0
< X-Timer: S1736326272.410959,VS0,VE1
< Strict-Transport-Security: max-age=300
< Access-Control-Allow-Methods: GET, OPTIONS
< Access-Control-Allow-Origin: *
< Server: cat factory 1.0
< X-Content-Type-Options: nosniff
<
{ [1371 bytes data]
100 35354 100 35354 0 0 212k 0 --:--:-- --:--:-- --:--:-- 214k
* Connection #0 to host i.imgur.com left intact
This includes connection details, request headers, response headers, and extra download progress information.
Set a Simplified Progress Bar
The standard cURL download file progress bar might not suit your needs. You can enable a simpler progress bar with the `-#` (or `--progress-bar`) option:
curl -O -# "https://i.imgur.com/CSRiAeN.jpg"
This will display a progress bar made of `#` characters, which incrementally fills as the file downloads:
########################################################### 100.0%
The `#` bar provides a more minimalistic view of the download progress compared to the default cURL progress output.
How to Download Multiple Files with cURL
You just saw how to download a file with cURL, but what about downloading multiple files with a single command? Time to learn that!
Range File Download
cURL supports downloading multiple files at once using URL expansion. In detail, you can download multiple files from the same base URL by specifying them inside braces `{}`:
curl -O "https://example.com/images/{1.jpg,2.jpg,3.jpg}"
This will download the three specified files:
1.jpg
2.jpg
3.jpg
Note that the file names specified inside `{}` do not have to share the same extension.
Equivalently, you can use the square bracket `[]` syntax:
curl -O "https://example.com/files/file[1-3].jpg"
This will achieve the same result as the first example. In this case, however, all files generated by the `[]` range share the same name pattern and extension.
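When globbing URLs, cURL can also template the output name: `#1` in the `-o` value expands to the current value of the first `{}` or `[]` expression, so each file gets a distinct name. A sketch of the mechanism with made-up local `file://` URLs so it runs offline:

```shell
# "#1" expands to 1, then 2, producing saved_1.jpg and saved_2.jpg:
printf 'one' > /tmp/file1.jpg
printf 'two' > /tmp/file2.jpg
curl -s "file:///tmp/file[1-2].jpg" -o "saved_#1.jpg"
```

Without an output template, combining globbing with a single fixed `-o` name would make each download overwrite the previous one.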
Note: If you include custom options (such as `-s` for silent mode or `--limit-rate` for bandwidth restrictions), these will be applied to all the files being downloaded.
Multiple File Download
To download files from different URLs, you need to specify the `-O` option multiple times:
curl -O "https://i.imgur.com/CSRiAeN.jpg" -O "https://brightdata.com/wp-content/uploads/2020/12/upload_blog_20201220_153903.svg"
This command will download `CSRiAeN.jpg` from `i.imgur.com` and `upload_blog_20201220_153903.svg` from `brightdata.com`.
The output will contain one download progress bar per URL:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 35354  100 35354    0     0   271k      0 --:--:-- --:--:-- --:--:--  276k
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22467    0 22467    0     0  34657      0 --:--:-- --:--:-- --:--:-- 34724
Similarly, you can use multiple `-o` options:
curl "https://i.imgur.com/CSRiAeN.jpg" -o "logo.jpg" "https://brightdata.com/wp-content/uploads/2020/12/upload_blog_20201220_153903.svg" -o "blog_post.svg"
The above command will download `CSRiAeN.jpg` and save it as `logo.jpg`, then download `upload_blog_20201220_153903.svg` and save it as `blog_post.svg`.
Keep in mind that you can also mix `-O` and `-o` options:
curl "https://i.imgur.com/CSRiAeN.jpg" -o "logo.jpg" -O "https://brightdata.com/wp-content/uploads/2020/12/upload_blog_20201220_153903.svg"
This downloads `logo.jpg` as before and `upload_blog_20201220_153903.svg` with its original file name.
Note that other options like `-v`, `-s`, or `--limit-rate` apply to all URLs, so you only need to specify them once.
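By default, cURL fetches the listed URLs one after another. If your cURL version is 7.66 or later, the `-Z` (or `--parallel`) option downloads them concurrently instead. A sketch with made-up local `file://` URLs so it runs offline:

```shell
# -Z fetches both URLs concurrently instead of sequentially:
printf 'first' > /tmp/a.txt
printf 'second' > /tmp/b.txt
curl -sZ -O "file:///tmp/a.txt" -O "file:///tmp/b.txt"
```

Parallel transfers mainly pay off when downloading many files from servers where each connection is individually slow.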
Best Practices When Downloading Files with cURL
Below is a list of some of the most important cURL file download best practices:
- Use `curl.exe` instead of `curl` on Windows: On Windows, use `curl.exe` rather than `curl` to avoid the conflict with the `Invoke-WebRequest` cmdlet.
- Ignore HTTPS and SSL/TLS errors (with caution): Use the `-k` (or `--insecure`) option to ignore SSL/TLS certificate validation errors. At the same time, be aware that this compromises security and should only be used in trusted environments.
- Specify the right HTTP method: When making requests, use the appropriate HTTP method, such as GET, POST, or PUT. The method affects how the server responds to your request. Use the `-X` option to specify the method.
- Enclose URLs in quotes and escape special characters: Always wrap URLs in single or double quotes to handle special characters properly. Use the escape character (`\`) to avoid issues with spaces, ampersands, and other special characters in URLs.
- Specify a proxy to protect your identity: Use the `-x` (or `--proxy`) option to route your cURL requests through a proxy. This helps protect your IP address and maintain privacy when scraping or downloading files.
- Save and reuse cookies across different requests: Use the `-c` and `-b` options to save and reuse cookies in subsequent requests. This helps maintain session persistence and can be useful for authentication or tracking.
- Limit download speed for better control: Use the `--limit-rate` option to control the download speed and avoid overwhelming your network connection or triggering rate limits on the server.
- Add verbose output for debugging: Enable verbose mode with the `-v` option to get detailed information about the request and response. That can be helpful for debugging and troubleshooting.
- Check for error responses: Always check the HTTP response code, for example with the `-w` option and `%{http_code}`, to verify whether the file download was successful (e.g., `200 OK`) or resulted in an error (e.g., `404 Not Found`).
cURL vs Wget for Downloading Files
cURL and Wget are both command-line tools for retrieving files from remote servers. The main difference between the two is as follows:
- Wget is designed for downloading files from the Web. It supports HTTP, HTTPS, FTP, and many other protocols. Wget is known for its ability to recursively download files, resume interrupted downloads, and work well in background processes. See how to use it to download web pages.
- cURL is a versatile command-line tool used for transferring data to and from a server using various Internet protocols. It is commonly used for testing endpoints, performing simple HTTP requests, and downloading single files. cURL can also be used for web scraping.
The main difference between cURL and Wget is that cURL provides more granular control over data transfer. In detail, it supports custom headers, authentication, and more protocols. In contrast, Wget is simpler and better suited for bulk downloads, recursion, and handling interrupted transfers.
Conclusion
In this guide, you learned how to download files with cURL. You started from the basic cURL download file syntax and explored more complex scenarios and use cases. You now know how to download single or multiple files using cURL.
Keep in mind that whenever you make an HTTP request, you leave traces on the internet. To protect your identity, privacy, and enhance your security, you should consider integrating a proxy with cURL. Fortunately, Bright Data has you covered!
Bright Data controls the best proxy servers in the world, serving Fortune 500 companies and over 20,000 customers. Its worldwide proxy network includes:
- Datacenter proxies – Over 770,000 datacenter IPs.
- Residential proxies – Over 72M residential IPs in more than 195 countries.
- ISP proxies – Over 700,000 ISP IPs.
- Mobile proxies – Over 7M mobile IPs.
Overall, that is one of the largest and most reliable scraping-oriented proxy networks on the market.
Sign up now and test our proxies and scraping solutions for free!
No credit card required