An ontology-based webpage classification approach for the knowledge grid environment
Copyright © 2009 IEEE. This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
With the rapid growth of the amount of information available on the Web, webpage classification technologies are widely employed by many search engines in order to formulate user queries and make users' search tasks easier. The Knowledge Grid is a new form of Web environment, in which a Resource Space Model is employed to classify available semantic documents within the Web environment. However, it is well known that semantic documents constitute only a small proportion of all Web documents, and the Resource Space Model cannot process these Web documents without semantic support. To address this issue, in this paper we present a novel ontology-based webpage classification method for the Knowledge Grid environment, which uses metadata generated from webpages as the intermedium to classify the webpages by ontology concepts. We design a conceptual model of a Webpage Classification Agent and build a prototype in a chosen domain. A series of experiments has been conducted using the prototype in order to evaluate the conceptual model. Conclusions about the evaluation are drawn in the final section.
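The core idea described above, classifying a webpage by matching metadata generated from its content against ontology concepts, can be illustrated with a minimal sketch. The ontology structure, concept names, and term lists below are illustrative assumptions for demonstration, not the paper's actual model:

```python
# Minimal sketch of ontology-based webpage classification:
# generate term-frequency metadata from a page, then score each
# ontology concept by its term overlap with that metadata.
# The ontology contents here are hypothetical.

import re
from collections import Counter

# Hypothetical domain ontology: concept -> associated terms.
ONTOLOGY = {
    "transport_service": {"freight", "shipping", "cargo", "logistics"},
    "health_service": {"clinic", "hospital", "doctor", "patient"},
}

def extract_metadata(page_text):
    """Generate simple term-frequency metadata from webpage text."""
    tokens = re.findall(r"[a-z]+", page_text.lower())
    return Counter(tokens)

def classify(page_text):
    """Assign the ontology concept with the highest term overlap."""
    metadata = extract_metadata(page_text)
    scores = {
        concept: sum(metadata[t] for t in terms)
        for concept, terms in ONTOLOGY.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(classify("Cargo and freight shipping quotes for logistics firms."))
# -> transport_service
```

A full agent would replace the flat term sets with a real ontology (concept hierarchies, synonyms) and a richer metadata generator, but the intermediary role of metadata between page text and concepts is the same.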
Showing items related by title, author, creator and subject.
Dong, Hai (2010) With the emergence of the Web and its pervasive intrusion on individuals, organizations, businesses etc., people now realize that they are living in a digital environment analogous to the ecological ecosystem. Consequently, ...
Dong, Hai; Hussain, Farookh Khadeer; Chang, Elizabeth (2008) Crawlers are software programs that can traverse the Internet and retrieve webpages via hyperlinks. In the face of the flood of spam websites, traditional web crawlers cannot function well to solve this problem. Semantic focused ...
Beacon Virtua: A Virtual Reality Simulation Detailing the Recent and Shipwreck History of Beacon Island, Western Australia
Woods, Andrew; Oliver, Nick; Bourke, Paul; Green, Jeremy; Paterson, Alistair (2019) Beacon Virtua is a project to document and virtually preserve a historically significant offshore island as a virtual reality experience. In 1629, survivors of the wreck of VOC ship Batavia took refuge on Beacon Island, ...