Automate taking screenshots of webpages

I wanted to set up cron to take screenshots of various websites on a daily basis. There are several approaches available: one uses wkhtmltopdf (which still requires additional packages), another requires a VNC server. The approach I chose was to set up a VNC server and Firefox (Iceweasel on Debian) manually and script it myself.
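Since the goal is a daily run, the script below can be driven by cron. A sketch of a crontab entry, assuming the script is saved as /usr/local/bin/screenshot.sh (a hypothetical path):

```shell
# hypothetical path; run the screenshot script every day at 06:00
0 6 * * * /usr/local/bin/screenshot.sh
```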

Main Script

The following script starts a VNC session, launches Firefox (Iceweasel on Debian) for each URL in a list, and takes a screenshot, saving each image with a timestamped filename:


#!/bin/sh
# start a VNC server on a specific DISPLAY
vncserver :11 -geometry 1024x768

# read URLs from a data file in a loop
for url in $(cat list.txt)
do
       firefox --display=:11 "$url" &
       sleep 5
       # screenshot the whole display; %H avoids the space-padded hour %k produces
       import -display :11 -window root "$(date +"%d-%m-%Y_%H:%M")_$(echo "$url" | awk '{print substr($0,8)}').jpg"
done

# clean up when done
vncserver -kill :11

list.txt contains one URL per line, each starting with http://, for example:

http://example.com
http://www.debian.org

#awk is then used to strip the leading http:// so the rest of the URL can be used as part of the filename
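The timestamp-plus-URL filename logic can be tested on its own. A minimal sketch, assuming a URL without a path component (a path would put / characters into the filename):

```shell
#!/bin/sh
url="http://example.com"
# substr($0,8) drops the first 7 characters, i.e. the leading "http://"
base=$(echo "$url" | awk '{print substr($0,8)}')
stamp=$(date +"%d-%m-%Y_%H:%M")
echo "${stamp}_${base}.jpg"
```

A pure-shell alternative is `${url#http://}`, which produces the same result without spawning awk.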


The above was done on Debian Squeeze.

apt-get install imagemagick
#This package provides the import command.

#You also need an X server and firefox/iceweasel installed

