I want somebody to program a proxy crawler.
The crawler should run on Linux machines. Currently I run Fedora Core 4 (64-bit).
The crawler should search for different search terms that come from a database (for example "socks proxy") on Google, Yahoo, etc.
It should then take the results of the search query and save them to a database.
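A minimal sketch of the database side, using SQLite. The table names (`search_terms`, `search_results`) and schema are assumptions for illustration, not part of the posting:

```python
import sqlite3

def init_db(path=":memory:"):
    # Hypothetical schema: one table for the search terms the crawler
    # reads, one for the result URLs it writes back.
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS search_terms (term TEXT UNIQUE)")
    conn.execute("CREATE TABLE IF NOT EXISTS search_results "
                 "(term TEXT, url TEXT, UNIQUE(term, url))")
    return conn

def save_results(conn, term, urls):
    # INSERT OR IGNORE keeps repeated crawls from creating duplicate rows.
    conn.executemany(
        "INSERT OR IGNORE INTO search_results (term, url) VALUES (?, ?)",
        [(term, u) for u in urls])
    conn.commit()
```

The UNIQUE constraint plus INSERT OR IGNORE means the crawler can be re-run on the same terms without cleanup.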
The program should then search each result website for proxies, for example [url removed, login to view], [url removed, login to view] 1234, and some other formats.
These proxies should be saved to the database.
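The exact proxy formats were stripped out of the posting along with the URLs, but the "1234" suffix suggests IP:port and IP-space-port notations. A sketch of extraction under that guess:

```python
import re

# Assumed formats: "1.2.3.4:8080" and "1.2.3.4 8080". Adjust the pattern
# once the real formats from the posting are known.
PROXY_RE = re.compile(r"\b(\d{1,3}(?:\.\d{1,3}){3})[:\s](\d{2,5})\b")

def extract_proxies(html):
    # Normalize every hit to "ip:port", deduplicated, order preserved.
    seen = []
    for ip, port in PROXY_RE.findall(html):
        proxy = f"{ip}:{port}"
        if proxy not in seen:
            seen.append(proxy)
    return seen
```

Normalizing both notations to a single "ip:port" form keeps the database column uniform regardless of how a site lists its proxies.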
The crawler should be able to follow links on the website.
You should be able to specify how deep the crawler goes and whether it follows links to other websites.
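The depth limit and the follow-external-links switch could look like the breadth-first sketch below. `fetch_links` is a hypothetical stand-in for the real page download and link extraction:

```python
from urllib.parse import urljoin, urlparse

def crawl(start_url, fetch_links, max_depth=2, same_site_only=True):
    # Breadth-first walk: (url, depth) pairs, stop expanding at max_depth.
    start_host = urlparse(start_url).netloc
    seen = {start_url}
    frontier = [(start_url, 0)]
    visited = []
    while frontier:
        url, depth = frontier.pop(0)
        visited.append(url)
        if depth >= max_depth:
            continue
        for link in fetch_links(url):
            link = urljoin(url, link)  # resolve relative links
            if same_site_only and urlparse(link).netloc != start_host:
                continue  # skip other websites when the option is off
            if link not in seen:
                seen.add(link)
                frontier.append((link, depth + 1))
    return visited
```

The `seen` set is what keeps the crawler from looping forever on sites that link back to themselves.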
The crawler should support threads, so that everything runs fast.
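For the threading requirement, a worker pool over the page downloads is the usual shape. `fetch` here is any assumed function that takes a URL and returns the page text:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(urls, fetch, workers=8):
    # Download pages concurrently; map() returns results in input order
    # even though the fetches overlap in time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, urls))
```

Threads suit this workload because crawling is I/O-bound: most of the time is spent waiting on network responses, not computing.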
There are still some details, but this is the basic task.