Posted 29 July, 2014 in Screaming Frog SEO Spider

How To Schedule A Crawl By Command Line In The SEO Spider

A scheduling feature is on our development list for the Screaming Frog SEO Spider, alongside a fully functional command-line option (for crawling, saving, exporting etc.).

While the above is in development, we have added a quick option to use the command line to auto-start crawls, which allows you to schedule them for now. Please note, this is for advanced users only and we can’t offer support for Windows Task Scheduler, launchd or cron jobs. But, we can provide some guidance on how to schedule a crawl by command line using some of these!

Windows Scheduling

On Windows 7 this can be done using the Task Scheduler.

Start -> Control Panel -> System and Security. Then under the ‘Administrative Tools’ section choose ‘Schedule Tasks’.

[Screenshot: Windows Task Scheduler]

Click ‘Create Basic Task’ in the ‘Actions’ section on the right.

[Screenshot: Windows Task Scheduler, step 2]

Enter an appropriate ‘Name’ and ‘Description’ and click ‘Next’.

[Screenshot: Windows Task Scheduler, step 3]

For this example, the crawl is going to run every day at 2am, so let’s choose ‘Daily’.

[Screenshot: Windows Task Scheduler, step 4]

Set the time to 2am.

[Screenshot: Windows Task Scheduler, step 5]

Choose to start a program and click ‘Next’.

[Screenshot: Windows Task Scheduler, step 6]

Now select ‘ScreamingFrogSEOSpider.exe’ by navigating to it using the ‘Browse’ button for the Program/script to run. By default it’s in C:\Program Files\Screaming Frog SEO Spider.

Set the argument to be:

--crawl https://www.example.com/

(adjusting the URL to be the site you’d like to crawl!).

Click ‘Next’.

[Screenshot: Windows Task Scheduler, final step]

Click ‘Finish’ on the final screen. That’s it! :-)
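If you’d rather not click through the wizard, the same daily 2am task can be created in one go from a Command Prompt using Windows’ built-in schtasks tool. The task name, install path and URL below are examples; adjust them to match your own setup:

schtasks /create /tn "Screaming Frog Crawl" /sc daily /st 02:00 /tr "\"C:\Program Files\Screaming Frog SEO Spider\ScreamingFrogSEOSpider.exe\" --crawl https://www.example.com/"

The escaped inner quotes around the .exe path are needed because the path contains spaces.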

Mac OS X Scheduling

You can use Apple’s ‘launchd’ to schedule crawls.

Note: We did experiment with using cron, but it launches in a headless environment which prevents the SEO Spider from running. Any solutions to this are welcome; however, launchd is Apple’s preferred way of task scheduling.

Create a plist file in ~/Library/LaunchAgents/. The file name is up to you; in the examples below we use ‘com.example.screamingfrog.plist’.

With the following contents (the Label, app path and URL here are examples; check the path matches your install):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>Label</key>
	<string>com.example.screamingfrog</string>
	<key>ProgramArguments</key>
	<array>
		<string>open</string>
		<string>/Applications/Screaming Frog SEO Spider.app</string>
		<string>--args</string>
		<string>--crawl</string>
		<string>https://www.example.com/</string>
	</array>
	<key>StartCalendarInterval</key>
	<dict>
		<key>Hour</key>
		<integer>2</integer>
		<key>Minute</key>
		<integer>0</integer>
	</dict>
</dict>
</plist>


This is the configuration you need to crawl at 2am every day. Edit the file to select the site and times appropriately.

Now make launchd aware of this file (using whatever file name you chose):

launchctl load ~/Library/LaunchAgents/com.example.screamingfrog.plist

If you change the config file, you must unload and re-load it:

launchctl unload ~/Library/LaunchAgents/com.example.screamingfrog.plist
launchctl load ~/Library/LaunchAgents/com.example.screamingfrog.plist

For more information on launchd options, see Apple’s launchd documentation.
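To confirm launchd has registered the job, you can list the loaded agents and look for the Label you set in the plist (‘com.example.screamingfrog’ is just the example label used above):

launchctl list | grep com.example.screamingfrog

If nothing is printed, the plist wasn’t loaded; check the file for XML errors and load it again.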

Linux Scheduling

This can be done using a cron job. The user running the job will need to be logged in for it to run successfully, as the Screaming Frog SEO Spider needs an X session to run; it can’t be run headless.

Open up a terminal and type:

crontab -e

Then enter the following line to start a crawl every day at 2am (replacing the URL with the site you’d like to crawl):

0 2 * * * export DISPLAY=:0 && screamingfrogseospider --crawl https://www.example.com/
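Cron’s five leading fields are minute, hour, day of month, month and day of week, in that order. A quick way to sanity-check an entry before saving it is to echo the line and inspect just the schedule portion (the URL here is an example):

```shell
# Cron entry: minute=0, hour=2, remaining fields "*" = every day at 02:00.
# Note the minute field should be 0, not "*", otherwise cron would relaunch
# the crawl every minute between 02:00 and 02:59.
ENTRY='0 2 * * * export DISPLAY=:0 && screamingfrogseospider --crawl https://www.example.com/'

# Print only the five schedule fields:
echo "$ENTRY" | cut -d' ' -f1-5
# → 0 2 * * *
```

Everything after the fifth field is the command cron runs, so the DISPLAY export and the crawl command travel together on the same line.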

Final Comments

Hopefully the above feature and guide will help for now when scheduling a crawl. As discussed above, we plan on developing this much further!