Hi
Do you know of any software that can do the following:
Open an html/txt file (from a server or a local file), read all the URLs one by one, open each of them, and download one (or more) jpgs from each?
After downloading, it also has to be easy to match the thumbnails back to the URLs. For example: I have a txt file with one URL per line; the software downloads the thumbs from all the links and saves them as 1.jpg (thumb from link 1), 2.jpg (thumb from link 2), etc. Or, if it downloads all the jpgs from each URL, it should create directories like 1 (all jpgs from url 1), 2 (all jpgs from url 2), and so on...
I need this to download thumbs from FHGs.
I don't want to manually download over 1k files (and probably more in the future) :?
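If there is no ready-made tool, even a small script along these lines would do. Below is only a rough Python sketch of the idea, assuming the list is a plain-text file called urls.txt with one URL per line and that the thumbnails appear as ordinary <img> tags on each page; a real FHG page may need a smarter selector, and the file/folder names are just the numbering scheme described above.

[code]
# Rough sketch: read urls.txt (one URL per line), grab jpgs linked from
# each page, and save them as 1.jpg, 2.jpg, ... or into folders 1/, 2/, ...
# Assumes thumbnails are plain <img src="...jpg"> tags (hypothetical layout).
import os
import re
import urllib.request
from urllib.parse import urljoin

def download_thumbs(url_list_path="urls.txt", out_dir="thumbs"):
    os.makedirs(out_dir, exist_ok=True)
    with open(url_list_path, encoding="utf-8") as f:
        urls = [line.strip() for line in f if line.strip()]

    for index, page_url in enumerate(urls, start=1):
        # Fetch the page and collect every jpg referenced by an <img> tag.
        html = urllib.request.urlopen(page_url).read().decode("utf-8", "ignore")
        img_srcs = re.findall(r'<img[^>]+src=["\']([^"\']+\.jpe?g)', html, re.I)

        if len(img_srcs) == 1:
            # One jpg per page -> save as 1.jpg, 2.jpg, ...
            target = os.path.join(out_dir, f"{index}.jpg")
            urllib.request.urlretrieve(urljoin(page_url, img_srcs[0]), target)
        else:
            # Several jpgs per page -> save into directories 1/, 2/, ...
            page_dir = os.path.join(out_dir, str(index))
            os.makedirs(page_dir, exist_ok=True)
            for n, src in enumerate(img_srcs, start=1):
                target = os.path.join(page_dir, f"{n}.jpg")
                urllib.request.urlretrieve(urljoin(page_url, src), target)

if __name__ == "__main__":
    download_thumbs()
[/code]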
Thank you in advance.
PS: sorry for grammar/spelling mistakes
