A scheduling feature is on our development list for the Screaming Frog SEO Spider, alongside a fully functional command line option (for crawling, saving, exporting etc.).

While the above is in development, we have added a quick option to use the command line to auto-start crawls, which allows you to schedule them for now. Please note, this is for advanced users only and we can’t offer support for Windows Task Scheduler, launchd or cron jobs. However, we can provide some guidance on how to schedule a crawl via the command line using each of these!

Windows Scheduling

On Windows 7 this can be done using the Task Scheduler.

Start -> Control Panel -> System and Security. Then under the ‘Administrative Tools’ section choose ‘Schedule Tasks’.


Click ‘Create Basic Task’ in the ‘Actions’ section on the right.


Enter an appropriate ‘Name’ and ‘Description’ and click ‘Next’.


For this example, we want the crawl to run every day at 2am, so choose ‘Daily’.


Set the time to 2am.


Choose ‘Start a program’ as the action and click ‘Next’.


Now set the ‘Program/script’ field to ‘ScreamingFrogSEOSpider.exe’ by navigating to it using the ‘Browse’ button. By default it’s installed in C:\Program Files\Screaming Frog SEO Spider.

Set the ‘Add arguments’ field to:

 --crawl http://www.example.com/ 

(adjusting this to the site you’d like to crawl!).
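
To sanity check the program and argument before scheduling, you can run the combined command manually from a Command Prompt (adjust the path if you installed to a different location):

"C:\Program Files\Screaming Frog SEO Spider\ScreamingFrogSEOSpider.exe" --crawl http://www.example.com/

If the SEO Spider opens and starts crawling, the scheduled task will do the same.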

Click ‘Next’.


Click ‘Finish’ on the final screen. That’s it! :-)
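
As an alternative to the wizard, Windows also includes the schtasks command line utility, which can create the same daily 2am task in one line. This is a sketch assuming the default install path and an arbitrary task name of ‘SF Crawl’:

schtasks /create /tn "SF Crawl" /sc daily /st 02:00 /tr "\"C:\Program Files\Screaming Frog SEO Spider\ScreamingFrogSEOSpider.exe\" --crawl http://www.example.com/"

Running schtasks /delete /tn "SF Crawl" will remove it again.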

Mac OS X Scheduling

You can use Apple’s ‘launchd’ to schedule crawls.

Note: We did experiment with using cron, but it launches in a headless environment which prevents the SEO Spider from running. Any solutions to this are welcome; however, launchd is Apple’s preferred way of scheduling tasks.

Create a file called uk.co.screamingfrog.seo.spider.scheduler.plist in ~/Library/LaunchAgents/ with the following contents:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>uk.co.screamingfrog.seo.spider.scheduler</string>
    <key>ProgramArguments</key>
    <array>
        <string>open</string>
        <string>/Applications/Screaming Frog SEO Spider.app</string>
        <string>--args</string>
        <string>--crawl</string>
        <string>http://www.example.com/</string>
    </array>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>2</integer>
    </dict>
</dict>
</plist>

Alternatively, you can simply save and edit this file we have already created.

This is the configuration you need to crawl http://www.example.com/ at 2am every day. Edit the file to set the site and time appropriately.
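
To give a flavour of editing the schedule, StartCalendarInterval also accepts Minute, Day, Weekday and Month keys. For example, this hypothetical variation would fire at 2:30am on Mondays only (Weekday 1):

<key>StartCalendarInterval</key>
<dict>
    <key>Minute</key>
    <integer>30</integer>
    <key>Hour</key>
    <integer>2</integer>
    <key>Weekday</key>
    <integer>1</integer>
</dict>

Any key you leave out is treated as a wildcard, which is why the original file runs every day.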

Now make launchd aware of this file:

launchctl load ~/Library/LaunchAgents/uk.co.screamingfrog.seo.spider.scheduler.plist

If you change the config file, you must unload and re-load:

launchctl unload ~/Library/LaunchAgents/uk.co.screamingfrog.seo.spider.scheduler.plist
launchctl load ~/Library/LaunchAgents/uk.co.screamingfrog.seo.spider.scheduler.plist
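
You can verify the job has been registered with the following (the exact output format varies between OS X versions):

launchctl list | grep uk.co.screamingfrog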

For more information on launchd options, read Apple’s guide.

Linux Scheduling

This can be done using a cron job. The user running the job will need to be logged in for it to run successfully, as the Screaming Frog SEO Spider needs an X session to run; it can’t be run headless.
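
Before editing your crontab, it’s worth confirming the screamingfrogseospider launcher used below is actually on your PATH:

which screamingfrogseospider

If nothing is printed, use the full path to the launcher in the cron line instead.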

Open up a terminal and type:

crontab -e

Then enter the following line to start a crawl every day at 2am.

0 2 * * * export DISPLAY=:0 && screamingfrogseospider --crawl http://www.example.com/
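
The five leading fields are minute, hour, day of month, month and day of week. As a further sketch, this hypothetical entry would crawl at 2am on Sundays only and append any output to a log file for debugging:

# m h dom mon dow command
0 2 * * 0 export DISPLAY=:0 && screamingfrogseospider --crawl http://www.example.com/ >> "$HOME/sf-crawl.log" 2>&1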

Final Comments

Hopefully the above options and guide will help with scheduling crawls for now. As discussed above, we plan on developing this functionality much further!
