Develop an application that searches the Web and captures data from selected websites (product code, brand, EAN code, product name, description, characteristics, technical data, price, promotional price, photo name, and other fields), exports the data to Excel, and downloads the corresponding photos.
Given a web page, the application should recognize the names of the fields from which the information will be captured, and let the user validate and match these fields before capturing the data and exporting it to Excel. In other words, it is an application that exports data from different sites, with prior field recognition and user validation, and exports to Excel. It should also have a user-friendly interface.
When the user enters the URL of the site from which data will be extracted, the page loads into the work area. The user then selects, with the mouse, the area containing the data to extract; the application automatically recognizes the name of the source field, and the user matches it to the corresponding export field. This is repeated for all available fields.
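As an illustration only, the capture step after the user has validated the field matches could be sketched as below. The sample HTML fragment and the class names (`brand`, `price`, `sku`) are hypothetical; real pages differ per site, which is exactly why user validation of the mapping is required:

```python
from html.parser import HTMLParser

# Hypothetical product-page fragment; real markup varies per website.
SAMPLE_HTML = """
<div class="product">
  <span class="brand">Acme</span>
  <span class="price">19.99</span>
  <span class="sku">AC-123</span>
</div>
"""

class FieldCapture(HTMLParser):
    """Collects the text of elements whose class appears in a
    user-validated mapping of source fields to export columns."""
    def __init__(self, mapping):
        super().__init__()
        self.mapping = mapping   # source class -> export column name
        self.current = None
        self.record = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in self.mapping:
            self.current = self.mapping[cls]

    def handle_data(self, data):
        if self.current and data.strip():
            self.record[self.current] = data.strip()
            self.current = None

# Mapping the user confirmed in the validation step (assumed names).
mapping = {"brand": "Brand", "price": "Price", "sku": "Product Code"}
parser = FieldCapture(mapping)
parser.feed(SAMPLE_HTML)
print(parser.record)
```

Each captured record would then become one row in the Excel export (e.g. via a library such as openpyxl).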
However, not all fields can be captured this way, for example when the product code or EAN code does not appear on the web page even though the source field name is known. In that case the field must either be entered manually or detected automatically: since the "EAN Code" and "Product Code" fields are usually not visible on the pages themselves, the program should, by default, detect them in the site's underlying data whenever the user cannot identify and match them. These fields are generally named "EAN" and "SKU", respectively.
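One plausible way to do this automatic detection (a sketch, not a guaranteed method): many shops embed product data in JSON-LD blocks in the page source, where keys such as `sku` and a GTIN/EAN variant are common but not guaranteed. The sample page source below is hypothetical:

```python
import json
import re

# Hypothetical page source with an embedded JSON-LD product block.
PAGE_SOURCE = """
<script type="application/ld+json">
{"@type": "Product", "sku": "AC-123", "gtin13": "5601234567890"}
</script>
"""

def find_hidden_codes(html):
    """Look for SKU/EAN-style keys inside embedded JSON-LD blocks."""
    codes = {}
    for block in re.findall(
            r'<script type="application/ld\+json">(.*?)</script>',
            html, re.S):
        try:
            data = json.loads(block)
        except ValueError:
            continue  # not valid JSON, skip this block
        if data.get("sku"):
            codes.setdefault("Product Code", data["sku"])
        for key in ("gtin13", "gtin", "ean"):
            if data.get(key):
                codes.setdefault("EAN Code", data[key])
                break
    return codes

print(find_hidden_codes(PAGE_SOURCE))
```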
For the photo name, I suggest that the user select the main photo with the mouse and, from that selection, indicate the corresponding field on the left.
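A possible naming rule for the downloaded photo file, assuming it is derived from the product code and the source image's extension (this rule is an assumption; the actual rule follows from the user's field match):

```python
import os
from urllib.parse import urlparse

def photo_filename(photo_url, product_code):
    """Name the downloaded photo after the product code,
    keeping the extension of the source image (default .jpg)."""
    ext = os.path.splitext(urlparse(photo_url).path)[1] or ".jpg"
    return f"{product_code}{ext}"

# Hypothetical photo URL for illustration.
print(photo_filename("https://shop.example/img/main_01.png?v=2", "AC-123"))
```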
To take advantage of the space available in the application, a short demonstration video of the main features should be added at the end of the project: how to select the fields to extract (that is, how to match them), how to export to Excel, and how to download photos.
1. The program must be downloadable from the web and must work in offline mode.
2. Please configure four operating options for the application:
a) LITE: limited to 1 site; extracts up to 50 products and photos; time limit: 15 non-renewable days;
b) SILVER: limited to 1 site; unlimited products and photos; time limit: 30 renewable days;
c) GOLD: limited to 3 sites; unlimited products and photos; time limit: 30 renewable days;
d) PROFESSIONAL: unlimited sites; unlimited products and photos; time limit: 30 renewable days;
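The four tiers above could be represented in the application roughly as follows (the type and field names are assumptions, not a fixed design; `None` means "unlimited"):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Tier:
    sites: Optional[int]     # None = unlimited sites
    products: Optional[int]  # None = unlimited products/photos
    days: int                # validity period in days
    renewable: bool

# The four operating options from item 2.
TIERS = {
    "LITE":         Tier(sites=1,    products=50,   days=15, renewable=False),
    "SILVER":       Tier(sites=1,    products=None, days=30, renewable=True),
    "GOLD":         Tier(sites=3,    products=None, days=30, renewable=True),
    "PROFESSIONAL": Tier(sites=None, products=None, days=30, renewable=True),
}
```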
3. To access the application, the user must log in with a username and password.
4. An individual key must be created for each downloaded copy, with a validity period (per item 2 above) and controlled in the backoffice (item 5 below).
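Key issuance could be sketched as below. The key format (a random 16-character token) and the record fields are assumptions; they only illustrate that each key carries its own expiry date for the backoffice to control:

```python
import secrets
from datetime import date, timedelta

def issue_key(version, days):
    """Create one individual product key with its validity period,
    ready to be stored and managed in the backoffice."""
    key = secrets.token_hex(8).upper()          # e.g. '9F2A...' (16 chars)
    expires = date.today() + timedelta(days=days)
    return {"key": key, "version": version, "expires": expires, "enabled": True}

rec = issue_key("SILVER", 30)
print(rec)
```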
5. A very simple backoffice to manage the application, containing the following fields:
- Product Key
- Product version (Lite, Silver, Gold or Professional)
- Possibility of changing the client version
- Enable / disable
6. The downloaded product will be paid for via PayPal or VISA. After the client makes the payment, the application is automatically activated for another 30-day period.
7. The application should automatically notify the user of the expiration date, starting 5 days before expiry.
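The notification rule in item 7 amounts to a simple date check, for example (a minimal sketch; function names are illustrative):

```python
from datetime import date

def days_until_expiry(expires, today):
    """Days remaining before the key expires (negative if lapsed)."""
    return (expires - today).days

def should_notify(expires, today):
    """Notify from 5 days before expiry until the key lapses (item 7)."""
    return 0 <= days_until_expiry(expires, today) <= 5

today = date(2024, 1, 10)
print(should_notify(date(2024, 1, 15), today))  # 5 days left
print(should_notify(date(2024, 1, 20), today))  # 10 days left
```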
8. The application is to be used on a single machine only (locked to the MAC address).
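Single-machine binding could be done roughly as below: record the machine's hardware address at activation and refuse to run when it changes. Note that `uuid.getnode()` reads a hardware address where one is available, but may fall back to a random value on some systems, so a production implementation would need a more robust machine fingerprint:

```python
import uuid

def machine_id():
    """48-bit hardware (MAC) address as 12 lowercase hex digits."""
    return format(uuid.getnode(), "012x")

def is_authorized(stored_mac):
    """True only when the stored MAC matches this machine (item 8)."""
    return stored_mac == machine_id()
```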
9. The functionality for detecting and capturing data from websites should work roughly as shown in the video at https://youtu.be/jsr_PiMzELI.