We run a portal that hosts thousands of mentor listings created by different people; these listing pages are called ‘Online Portals’. One example is the following URL:
[login to view URL]
These online portals grow daily as new people sign up. They should be crawled, indexed and ranked by Google and other search engines so that these mentors can be discovered through searches. To facilitate this, we need a program that programmatically generates a (dynamic) sitemap at regular intervals and refreshes it automatically, so that Google and other search engine crawlers are always presented with the most recent version.
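As a rough illustration of the sitemap-generation part, the sketch below builds a standard sitemap.xml document using only the Python standard library. The page URLs, lastmod dates and priority values are hypothetical placeholders; in the real system they would come from the internal index described in the steps that follow.

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a sitemap.xml document from (loc, lastmod, priority) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "priority").text = f"{priority:.1f}"
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical mentor-portal pages; real values would come from the database.
pages = [
    ("https://example.com/mentors/jane-doe", "2024-05-01", 0.8),
    ("https://example.com/mentors/john-smith", "2024-05-02", 0.6),
]
xml = build_sitemap(pages)
```

Regenerating the file on a schedule (for example via a cron job) and serving it at a stable URL would give crawlers an always-fresh sitemap.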
Given the brief above, we expect the project to involve the following steps:
1. Build a program that crawls our own site ([login to view URL]), creates an internal index and ranking, and maintains it in a database of your choice.
2. Auto-generate a sitemap from the database created in step 1 and make it available to search engines.
3. Page ranking, as mentioned in step 1, will be based on our own algorithm, which passes a ranking parameter to the database. This algorithm will change over time, so the program in step 1 must be flexible enough to accept updated ranking values.
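The steps above could be wired together roughly as follows: the crawler writes each page into a database table together with a rank, and the ranking logic is kept as a pluggable function so it can be swapped out when the algorithm changes (step 3). This is a minimal sketch using SQLite; the table layout, the `rank_page` scoring rule and the example URL are assumptions for illustration only.

```python
import sqlite3

def rank_page(url, content):
    """Hypothetical ranking algorithm (here: content length).
    Kept as a separate function so it can be replaced when the
    real algorithm is updated, per step 3."""
    return len(content)

def index_page(conn, url, content, rank_fn=rank_page):
    """Upsert a crawled page and its current rank into the index (step 1)."""
    conn.execute(
        "INSERT INTO pages(url, rank) VALUES(?, ?) "
        "ON CONFLICT(url) DO UPDATE SET rank = excluded.rank",
        (url, rank_fn(url, content)),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages(url TEXT PRIMARY KEY, rank REAL)")
index_page(conn, "https://example.com/mentors/jane-doe", "profile text")

# Sitemap generation (step 2) would read the indexed pages, e.g. by rank:
rows = conn.execute("SELECT url FROM pages ORDER BY rank DESC").fetchall()
```

Because the upsert recomputes the rank on every crawl pass, re-running the crawler after a ranking-algorithm change refreshes the whole index without schema changes.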
These are the broad steps that we believe will achieve the goal. However, anyone interested in the project is free to suggest improvements.