wget

Chris F.A. Johnson cfaj-uVmiyxGBW52XDw4h08c5KA at public.gmane.org
Fri May 26 22:49:46 UTC 2006


On Fri, 26 May 2006, Daniel Armstrong wrote:

> On 5/26/06, Daniel Armstrong <dwarmstrong-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org> wrote:
>>  But I would like to know how to use wget to download *all* the video
>>  files of a certain compression size with a single command. I checked
>>  the manpage and used the "-A" option to specify a filetype, using this
>>  command:
>>
>>  wget -A "*220k.rm" http://ocw.mit.edu/ans7870/7/7.012/f04/video/
>
> Gathering further info online, I think in this particular case I am
> limited to using wget to download the files via a direct link
> one-at-a-time. No big problem... I wonder why this video/ directory
> doesn't have an index.html file, though...

    Either through an oversight, or to prevent people from doing what
    you are trying to do. ;)
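    For what it's worth, the reason the -A attempt fetched nothing is
    that -A is an *accept filter* applied during a recursive crawl
    (-r): wget first fetches an index page, follows its links, and
    keeps only the files matching the pattern. With no index.html
    there is nothing to follow. If an index did exist, the recursive
    form would look like the commented command below (hypothetical,
    -np stops wget from ascending to the parent directory); the small
    sketch after it just demonstrates that the -A pattern is an
    ordinary shell-style glob:

```shell
# Hypothetical recursive form, assuming a crawlable index page exists:
#   wget -r -np -A '*220k.rm' http://ocw.mit.edu/ans7870/7/7.012/f04/video/

# The accept pattern is a shell glob; a case statement shows what it
# would and would not match:
match() { case $1 in *220k.rm) return 0;; *) return 1;; esac; }
match axxxx220k.rm && echo accepted
match notes.pdf    || echo rejected
```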

    If you have a list of the files, why not put them in a script?

list=(
    axxxx220k.rm
    bxxxx220k.rm
    cxxxx220k.rm
    dxxxx220k.rm
    # ...
)

for f in "${list[@]}"
do
    wget "http://ocw.mit.edu/ans7870/7/7.012/f04/video/$f"
done
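
    An equivalent sketch that avoids the loop entirely: keep the
    filenames one per line in a plain file (hypothetically named
    list.txt here), prepend the base URL with sed, and feed the
    result to wget's -i option, which reads URLs from a file ("-"
    meaning standard input):

```shell
base=http://ocw.mit.edu/ans7870/7/7.012/f04/video

# list.txt is a hypothetical file with one filename per line:
printf '%s\n' axxxx220k.rm bxxxx220k.rm > list.txt

# Prepend the base URL to every line to form full URLs:
sed "s|^|$base/|" list.txt

# The same pipeline can drive wget directly, fetching everything in
# a single invocation:
#   sed "s|^|$base/|" list.txt | wget -i -
```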

-- 
    Chris F.A. Johnson                      <http://cfaj.freeshell.org>
    ===================================================================
    Author:
    Shell Scripting Recipes: A Problem-Solution Approach (2005, Apress)
--
The Toronto Linux Users Group.      Meetings: http://tlug.ss.org
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://tlug.ss.org/subscribe.shtml




