[ALUG] Globbed search for filenames in website

mick mbm at rlogin.net
Sun Jun 7 11:36:02 BST 2009

On Sun, 07 Jun 2009 08:41:36 +0100 (BST)
(Ted Harding) <Ted.Harding at manchester.ac.uk> allegedly wrote:

> Greetings!
> I'm looking for a way to find out what files for a particular
> wild-card form exist at a certain directory depth on a website.
> For example:
>   www.some.web.page/*/*.png
> for all PNG files 1 below the top level. The results to be listed
> (stored in a file) along the lines of what you would get from
>   ls */*.png
> if you were at the top level on the server.
> A browser won't do it (won't accept wild-cards). I've looked at wget,
> but this doesn't seem to have a simple listing option (except under
> ftp mode, which the remote site won't respond to).
> Any suggestions?


If I've understood you correctly then find should do it, with one caveat:
find only searches a local filesystem, so you would need shell access on
the server (or a local mirror of the site). From the top-level directory,
try

find . -maxdepth 2 -name "*.png" -print

(You can redirect the output to a file, of course.)
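If you can't log in to the server, one possible workaround is to pull the
files down with wget's recursive mode and run find over the mirror. This is
only a sketch, not tested against your site, and example.org stands in for
the real host:

```shell
#!/bin/sh
# Mirror only PNG files, two levels deep, into the current directory.
# --no-host-directories drops the "example.org/" prefix so the local
# layout matches what you would see at the server's top level.
wget --quiet --recursive --level=2 --accept '*.png' \
     --no-parent --no-host-directories 'http://example.org/'

# Now list them, equivalent to "ls */*.png" at the top level.
find . -maxdepth 2 -name '*.png' -print > png-list.txt
```

Whether this finds anything depends on the pages actually linking to the
PNGs; wget can only follow links, it can't expand wildcards server-side.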



The text file for RFC 854 contains exactly 854 lines. 
Do you think there is any cosmic significance in this?

Douglas E Comer - Internetworking with TCP/IP Volume 1


