Need a simple Windows XP application that extracts a table of information from a website and writes the resultant data to a .CSV file. Details are:
- takes a text name as input (up to 50 characters)
- specific webpage URL to be defined in a configuration file
- enters the text name to execute a search function on the webpage
- selects several "general" fields from the results page and writes them to the output file
- then selects a specific view of a result table
- then extracts a table of results for that search from the webpage
- the results table is of indeterminate length; multiple "next" page commands may need to be issued to extract the full set of result data
- the table has 12 columns of data, several of which are graphic icons that will need to be converted to a number based on the icon file name
- code must be commented so that it can be updated when the base webpage changes
- the output .CSV file name must also be an input parameter to the program
- the program must be runnable as a Windows script for automated processing
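To make the requirements concrete, here is a rough sketch in Python (one possible language choice; it is easy to install and run on Windows XP) of the pagination loop, the icon-to-number conversion, and the CSV output. Everything named here is an assumption to be replaced once the target site is known: the [site] section and search_url key in the configuration file, and the icon file names. The live HTML fetching and parsing is deliberately left out, since it depends entirely on the base webpage; the page-fetching step is injected as a function so the surrounding logic stands on its own.

```python
import configparser
import csv

# ASSUMED icon-name -> number mapping; the real file names must be taken
# from the target page's HTML once the site is known.
ICON_CODES = {
    "icon_green.gif": 1,
    "icon_amber.gif": 2,
    "icon_red.gif": 3,
}

def read_search_url(config_path):
    """Read the search page URL from an INI-style configuration file.

    Assumed layout:
        [site]
        search_url = http://example.com/search
    """
    cfg = configparser.ConfigParser()
    cfg.read(config_path)
    return cfg["site"]["search_url"]

def icon_to_number(cell):
    """Translate an icon file name into its numeric code; leave other cells alone."""
    return ICON_CODES.get(cell, cell)

def extract_all_rows(fetch_page):
    """Collect every row of the result table, following "next" pages.

    fetch_page(page_number) is a caller-supplied function returning
    (rows, has_next): the list of 12-column rows on that page and a flag
    saying whether another "next" page exists. In the real program it
    would drive the website; injecting it keeps this loop testable
    without network access.
    """
    rows = []
    page = 1
    has_next = True
    while has_next:
        page_rows, has_next = fetch_page(page)
        rows.extend(page_rows)
        page += 1
    return rows

def write_csv(out_path, rows):
    """Write rows to the output .CSV, converting icon cells to numbers."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        for row in rows:
            writer.writerow([icon_to_number(cell) for cell in row])
```

Because the output file path and search name arrive as parameters, the same script can be launched from a batch file or the Windows Task Scheduler for automated processing, and the commented configuration file isolates the pieces that must change when the base webpage changes.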
Open on language, but it should be easy for me to set up and run on a Windows XP machine.
Initially the output file will be read into Excel for processing; follow-on projects may later write the data directly to a SQL database.