I need a high-performance system for taking screenshots of rendered web pages. The screenshots must then be turned into JPEG or GIF thumbnail images.
Operating System: Red Hat Linux ES.
The system must do the following:
* use a browser component to render the page (e.g., Gecko, Firefox, etc.)
* capture the body onLoad event and take a copy of the fully rendered page
* reduce the size of the image to a 'thumbnail' while maintaining reasonable quality
* enable screenshots to be taken in parallel/asynchronously
* the minimum throughput required is 10 screenshots per second
* run on Linux with a small memory footprint - less than 30 MB of RAM at maximum throughput
* must be driven from the command line in batch mode (e.g., perl [url removed, login to view] -size 400x140 [url removed, login to view])
* Ideally, a Perl script that, given a list of URLs and IDs, produces a directory full of thumbnail images
* Benchmarks showing the screenshotter working at over 10 screenshots per second
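To illustrate the batch-driver shape described above, here is a minimal sketch (in Python rather than the requested Perl, purely for illustration). It processes (URL, ID) pairs in parallel and writes one thumbnail path per ID into an output directory. The capture step itself is a stub: a real implementation would invoke a Gecko-based renderer, wait for the page's onLoad event, snapshot the page, and scale it down. All function names here are assumptions, not part of any existing tool.

```python
# Sketch of a parallel batch thumbnailer. The capture step is a stub;
# a real system would render the URL in a browser component, wait for
# onLoad, screenshot the page, and resize the result to SIZE.
import concurrent.futures
import pathlib

SIZE = (400, 140)  # thumbnail size taken from the brief's example


def capture_thumbnail(url, page_id, out_dir):
    """Hypothetical capture step: render `url`, snapshot after onLoad,
    scale down to SIZE, and write <out_dir>/<page_id>.jpg. Stubbed."""
    out_path = pathlib.Path(out_dir) / f"{page_id}.jpg"
    # real implementation: render, screenshot, resize, write out_path
    return str(out_path)


def run_batch(pairs, out_dir, workers=8):
    """Process (url, id) pairs in parallel, as the brief requires."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(capture_thumbnail, url, pid, out_dir)
                   for url, pid in pairs]
        return [f.result() for f in futures]


if __name__ == "__main__":
    pairs = [("http://example.com", "001"), ("http://example.org", "002")]
    print(run_batch(pairs, "thumbs"))
```

Hitting the 10-screenshots-per-second target would come from tuning the worker count and keeping browser instances warm rather than launching one per URL; the sketch only shows the dispatch structure.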