Hello,
I am a software (C#/.NET, ASP.NET) and database (MySQL/MSSQL) developer specializing in web scraping and automation.
Over the years I have built my own tools and programming resources, which I keep refining so that I can build highly customized web scrapers quickly and accurately.
Here are some of the relevant features of a typical scraper/bot:
- job scheduler (runs can be set on a regular basis, on certain days, or at specific times);
- proxy rotation, to avoid IP blacklisting;
- random delays (within a specified range of seconds) and pauses, to mimic a human operator browsing the site;
- data can be saved as CSV (Excel), MySQL, or MSSQL;
- if needed, image downloading plus image processing (watermarking, resizing) and FTP uploading.
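To give an idea of the proxy-rotation and random-delay features above, here is a minimal C# sketch; the proxy addresses are placeholders and the class and method names are illustrative, not taken from any specific library.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch: cycle through a proxy list and wait a random
// interval between requests. Proxy addresses below are placeholders.
class ProxyRotator
{
    private readonly List<string> _proxies;
    private int _index = -1;

    public ProxyRotator(List<string> proxies) => _proxies = proxies;

    // Return the next proxy in round-robin order, so consecutive
    // requests leave from different IP addresses.
    public string Next()
    {
        _index = (_index + 1) % _proxies.Count;
        return _proxies[_index];
    }
}

class Program
{
    static void Main()
    {
        var rotator = new ProxyRotator(new List<string> {
            "10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080" });
        var rng = new Random();

        for (int i = 0; i < 5; i++)
        {
            string proxy = rotator.Next();
            // Random delay between 2 and 8 seconds, to look like a
            // human operator rather than an automated client.
            int delayMs = rng.Next(2000, 8000);
            Console.WriteLine($"Request {i + 1} via {proxy}, waiting {delayMs} ms");
            // System.Threading.Thread.Sleep(delayMs); // enabled in a real run
        }
    }
}
```

In a production scraper this would sit behind the job scheduler, with the delay range and proxy list supplied per job.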
Cheers.