Sometimes you land on a Web page full of graphics and have to laboriously go through with the right mouse button, saving each picture in turn. This is fine if there are five or ten pictures, but what if there are twenty-five, fifty, or a hundred? If only there were some way to easily download all the images from a Web page.
Of course there is. In this article we show you a simple three-part Automator workflow on the Mac that will detect all the images on the current Web page (in the Safari browser) and save them to a directory on your desktop.
Note: for Firefox, you can follow the instructions here.
Download All Images from Web Pages with Automator
Automator is a really useful tool on the Mac that few users ever try, but they really should, because it’s easy and powerful. It can automate a large number of repetitive tasks and has access to all parts of the system. You can batch convert graphics files to JPG, convert sound files from one format to another, or toggle hidden files on and off.
Something which you may not know is that Automator has extensive hooks into Safari, allowing you to do otherwise impossible things with Web pages.
To automate grabbing the images on the current Web page, all it takes is a three-action workflow, and of course you can save it as an app you run from your desktop or Dock, as a service that runs from the menu bar, or even as a folder action. In this instance we are going to make it an app, but feel free to experiment with the other target types.
Scripting the Workflow
Open a new Automator workflow and select “Application” as the workflow type.
From the actions list on the left, choose the Internet category, then choose the “Get Current Webpage from Safari” action.
Next, from the same Internet category, choose “Get Contents of Web Pages.” This loads a Web Archive of the current page’s contents into memory, which subsequent actions can then act on.
Finally, choose the “Save Images From Web Content” action. This processes the Web Archive from the previous step, fillets it for image files, then saves those files to disk in the manner specified: either into a directory you name or into one named after the page URL.
With this in mind, check the boxes “Use URLs as folder names” and “Replacing Existing Folders.” The first names the desktop folder after the website URL so each page gets its own folder; the second means that if the folder already exists, it is replaced rather than raising an error and leaving the task incomplete. This is a little crude, but you could refine it with a little work.
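For comparison, the same three steps can be sketched outside Automator. The following Python sketch is our own illustration, not part of the Automator workflow (the function names and the desktop destination are our assumptions): it fetches a page, collects every image source from the HTML, and saves the files into a folder named after the site.

```python
import os
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen, urlretrieve

class ImageCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on a page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def image_urls(page_url, html):
    """Return absolute URLs for all images referenced in the HTML."""
    parser = ImageCollector()
    parser.feed(html)
    return [urljoin(page_url, src) for src in parser.sources]

def save_images(page_url, dest_root="~/Desktop"):
    """Download every image on page_url into a folder named
    after the site, roughly mirroring the Automator workflow."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    folder = os.path.join(os.path.expanduser(dest_root),
                          urlparse(page_url).netloc)
    os.makedirs(folder, exist_ok=True)  # replace/reuse existing folder
    for url in image_urls(page_url, html):
        name = os.path.basename(urlparse(url).path) or "image"
        urlretrieve(url, os.path.join(folder, name))
```

This is only a sketch: real pages may load images via CSS or JavaScript, which the Automator action and this parser alike will miss.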
Save the app to disk in your preferred location.
Running the App
Run the app as you would any other, remembering of course to make sure the tab containing the page you wish to capture is the frontmost one. Incidentally, this works even if Safari is minimised.
You can run the app from the desktop or hide it in the Applications directory and run it from a shortcut on the dock.
Run the app, and a folder with the URL name is saved on your desktop containing all the images from the page.
There are certain limitations to the script which you could probably iron out if you fiddle with it a bit. For example, you get an error if there are no pictures to grab. That aside, it works well, if a little basic.
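If you script the process yourself, the no-images case is easy to guard against. A minimal sketch (the `safe_save` helper and its arguments are hypothetical, not part of the Automator workflow):

```python
def safe_save(image_list, save_one):
    """Save each image URL with save_one() and return the count saved.
    An empty page is simply a no-op, instead of raising an error the
    way the Automator action does."""
    if not image_list:
        return 0
    for url in image_list:
        save_one(url)
    return len(image_list)
```

The same idea applies inside Automator itself: you could, for instance, precede the save step with a Run Shell Script or Run AppleScript action that checks whether anything was found.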
Are there any tasks you would like to see automated? Have you extended this basic script? Let us know in the comments below.