M.Tech / M.E / PhD Thesis | Computer Science & Engineering | India | Volume 4 Issue 2, February 2015
Document Annotation Based on Query Workload, Content-Value and User Expectation Tracking Form
Abstract: Document annotations are comments attached to a document; as metadata, they provide additional information about the underlying data. Traditionally, form-based query interfaces were used to access databases, but designing such interfaces is difficult and they can express only a limited number of queries. To overcome this limitation, many organizations instead publish textual descriptions of their products and services. These textual descriptions contain a significant amount of structured information, which motivates the use of annotations: if documents are properly annotated, the quality of search can be improved considerably. Here we introduce the Collaborative Adaptive Data Sharing platform (CADS), an annotate-as-you-create infrastructure that facilitates fielded data annotation. Its main goals are to reduce the cost of creating annotations and to make annotated documents retrievable by a larger number of queries. We also introduce a user expectation tracking form to handle annotations that carry different meanings in different contexts. Although this tracking is time-consuming, the cost is compensated when a query such as "sun" in Google returns the result the user actually expects (the microsystem, the solar system, or the newspaper) within a few minutes.
Keywords: Document Annotations, Query Workload, Content Value, User Expectation Tracking, Collaborative Adaptive Data Sharing (CADS)
Edition: Volume 4 Issue 2, February 2015
Pages: 1282 - 1284