wget is a command-line Linux utility for downloading files from the web. You typically use wget to retrieve a file or web page at a particular URL (e.g., "wget http://www.cnn.com/index.html"). But what if you want to download multiple files? While you could invoke wget multiple times manually, there are several ways to download multiple files with wget in one shot.
If you have a list of URLs to fetch, you can simply supply wget with an input file that contains the URLs, one per line. The "-i" option is for that purpose.
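For example, the list might look like the following (the file name and URLs here are illustrative, not from any specific setup):

```shell
# Create a list of URLs to fetch, one per line
printf '%s\n' \
    'http://www.cnn.com/index.html' \
    'http://www.bbc.com/index.html' \
    > download-list.txt

# Fetch every URL in the list
wget -i download-list.txt
```

wget downloads each URL in the file in order, so the file can hold as many URLs as you like.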
If the URLs follow a specific numbering pattern, you can use the shell's curly-brace expansion to generate all the URLs that match the pattern. For example, if you want to download Linux kernel sources from version 3.2.1 to 3.2.15, you can do the following.
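A sketch of such a command is shown below. The kernel.org path is an assumption for illustration; the point is that bash expands {1..15} into fifteen separate URLs before wget ever runs:

```shell
# Bash expands {1..15} into linux-3.2.1 ... linux-3.2.15,
# so wget receives fifteen URLs as arguments (path is illustrative)
wget http://www.kernel.org/pub/linux/kernel/v3.x/linux-3.2.{1..15}.tar.bz2
```

Note that the expansion is done by the shell, not by wget itself, so this works with any command, and the {start..end} syntax requires bash (or another shell that supports brace expansion).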
So far you have specified every URL explicitly when running wget, either by supplying an input file or by using a numeric pattern.
If the target web server has directory indexing enabled, and all the files to download are located in the same directory, you can download all of them by using wget's recursive retrieval option.
What do I mean by directory indexing being enabled? If directory indexing is enabled on aaa.com, visiting http://aaa.com/directory will show a listing of the files in that directory (assuming there is no separate index.html there). In this case, you can use wget's "-r" option to download multiple files from the directory. For example:
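A sketch of such a command, using the aaa.com placeholder host from above and assuming the directory holds .bz2 archives:

```shell
# Recursively fetch (one level deep) every .bz2 file
# listed in the directory index of http://aaa.com/directory/
wget -r -l1 -A ".bz2" http://aaa.com/directory/
```

Because "-A" rejects everything that does not match the listed suffixes, the index pages themselves are discarded after being scanned, leaving only the .bz2 files on disk.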
In the above example, the "-r" and "-l1" options together enable one-level-deep recursive retrieval, and the "-A" option specifies a list of file name suffixes to accept during the recursive download (".bz2" in this case).