Stratified sampling for data mining on the deep web

Tantan Liu, Fan Wang, Gagan Agrawal

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

In recent years, the deep web has become extremely popular. Like any other data source, data mining on the deep web can produce important insights or summaries of results. However, data mining on the deep web is challenging because the databases cannot be accessed directly; data mining must therefore be performed by sampling the datasets. The samples, in turn, can only be obtained by querying deep web databases with specific inputs. In this paper, we target two related data mining problems, association mining and differential rule mining. These are proposed to extract high-level summaries of the differences in data provided by different deep web data sources in the same domain. We develop stratified sampling methods to perform these mining tasks on a deep web source. Our contributions include a novel greedy stratification approach, which recursively processes the query space of a deep web data source and considers both the estimation error and the sampling costs. We have also developed an optimized sample allocation method that integrates estimation error and sampling costs. Our experimental results show that our algorithms effectively and consistently reduce sampling costs, compared with a stratified sampling method that considers only estimation error. In addition, compared with simple random sampling, our algorithm achieves higher sampling accuracy at lower sampling cost.
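The paper's own allocation method is not detailed in this abstract. As a hypothetical illustration only, classical stratified sampling theory offers a cost-aware allocation rule in the same spirit: sample sizes are set proportional to W_h · S_h / sqrt(c_h), where W_h is the stratum weight, S_h the within-stratum standard deviation, and c_h the per-sample query cost. A minimal sketch under that standard textbook rule (not the authors' algorithm):

```python
import math

def allocate_samples(weights, stddevs, costs, total_budget):
    """Cost-aware stratified allocation (classical rule, assumed here):
    n_h is proportional to W_h * S_h / sqrt(c_h), scaled so that the
    total query cost sum(n_h * c_h) matches the available budget."""
    # Proportionality factor for each stratum.
    props = [w * s / math.sqrt(c) for w, s, c in zip(weights, stddevs, costs)]
    # Scale the factors so the allocation spends exactly the cost budget.
    denom = sum(p * c for p, c in zip(props, costs))
    scale = total_budget / denom
    # Round to whole samples, querying each stratum at least once.
    return [max(1, round(p * scale)) for p in props]

# Example: three strata with differing variability and query costs.
n = allocate_samples(
    weights=[0.5, 0.3, 0.2],
    stddevs=[1.0, 2.0, 0.5],
    costs=[1.0, 4.0, 1.0],
    total_budget=100,
)
```

Under this rule, strata that are cheap to query or highly variable receive more samples, which mirrors the abstract's goal of integrating estimation error with sampling cost.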

Original language: English (US)
Pages (from-to): 179-196
Number of pages: 18
Journal: Frontiers of Computer Science in China
Volume: 6
Issue number: 2
DOIs
State: Published - Apr 2012
Externally published: Yes

Keywords

  • association rule mining
  • deep web
  • stratified sampling

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
