In Progress

124590 WikiMedia project robot

I would like a way to automatically create pages on my WikiMedia site. I want to be able to paste a list of URLs into a field, press Go, and have the program or script visit each URL, read its meta tags, and create a page based on the meta tag data. It needs to read all possible meta tags and metadata, then create the page using each meta tag name as the heading for that category of information on the created page:

Title

Description

Abstract

Copyright

Distribution

Expires

Language

Author

Revised

Also read any other meta tags that may be used in a page and place that data in the Additional information area.

The Keywords meta tag will be used to decide the Category at the bottom of the page.
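As a rough sketch of the extraction step described above, the snippet below parses a page's title and meta tags with Python's standard-library HTML parser and assembles wikitext: the known tags become labelled entries, unrecognized tags go into an "Additional information" section, and the Keywords tag becomes categories at the bottom. The function name `build_wiki_page` and the output layout are illustrative assumptions, not part of any existing tool; a real bidder would adapt this to PHP or Perl and to the MediaWiki API.

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collects the <title> text and all <meta name="..." content="..."> pairs."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# The tags the posting lists explicitly (Title is handled separately).
KNOWN = ["description", "abstract", "copyright", "distribution",
         "expires", "language", "author", "revised"]

def build_wiki_page(html):
    p = MetaTagParser()
    p.feed(html)
    lines = [f"== {p.title or 'Untitled'} =="]
    for name in KNOWN:
        if name in p.meta:
            lines.append(f"; {name.capitalize()}: {p.meta[name]}")
    # Any other meta tags go into the Additional information area.
    extra = {k: v for k, v in p.meta.items()
             if k not in KNOWN and k != "keywords"}
    if extra:
        lines.append("== Additional information ==")
        lines += [f"; {k}: {v}" for k, v in extra.items()]
    # The Keywords meta tag decides the categories at the bottom.
    for kw in p.meta.get("keywords", "").split(","):
        if kw.strip():
            lines.append(f"[[Category:{kw.strip()}]]")
    return "\n".join(lines)
```

Fetching each URL from the submitted list and pushing the generated wikitext through the MediaWiki edit API would wrap around this core.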

An example of the pages is [url removed, login to view]; however, you will notice that not all the information is there. I would like to be able to automatically populate the pages that are created based on the list of URLs I submit. The script also needs to comply with [url removed, login to view] files and meta robots tags.
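The compliance requirement in the last sentence could be sketched as below, assuming the usual robots.txt and meta robots conventions. The rules are fed in as a string so the example runs offline; a real script would first fetch the site's /robots.txt. The user-agent name `WikiPageBot` and the crude meta-tag heuristic are assumptions for illustration.

```python
from urllib.robotparser import RobotFileParser

def allowed_by_robots_txt(rules_text, url, agent="WikiPageBot"):
    """Check a URL against robots.txt rules supplied as text."""
    rp = RobotFileParser()
    rp.parse(rules_text.splitlines())
    return rp.can_fetch(agent, url)

def allowed_by_meta_robots(html):
    """Crude heuristic: skip pages whose meta robots tag says noindex."""
    lowered = html.lower()
    return not ('name="robots"' in lowered and "noindex" in lowered)
```

A page from the submitted URL list would only be processed when both checks pass.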

Skills: Anything Goes, MySQL, Perl, PHP, Visual Basic, XML


About the Employer:
( 54 reviews )

Project ID: #1870756