
Composite Hashing for Data Stream Sketches (1808.06800v2)

Published 21 Aug 2018 in cs.DB

Abstract: In rapid and massive data streams, it is often not possible to estimate the frequency of items with complete accuracy. To perform the operation in a reasonable amount of space and with sufficiently low latency, approximate methods are used. The most common ones are variations of the Count-Min sketch. By using multiple hash functions, they summarize massive streams in sub-linear space. In reality, data item ids or keys can be modular, e.g., a graph edge is represented by source and target node ids, a 32-bit IP address is composed of four 8-bit words, a web address consists of domain name, domain extension, path, and filename, among many others. In this paper, we investigate the modularity property of item keys, and systematically develop more accurate, composite hashing strategies, such as employing multiple independent hash functions that hash different modules of a key and their combinations separately, instead of hashing the entire key directly into the sketch. However, the problem of finding the best hashing strategy is non-trivial, since there is an exponential number of ways to combine the modules of a key before they are hashed into the sketch. Moreover, given a fixed size allocated for the entire sketch, it is hard to find the optimal range of all hash functions that correspond to different modules and their combinations. We solve both of these problems with extensive theoretical analysis, and perform thorough experiments with real-world datasets to demonstrate the accuracy and efficiency of our proposed method, MOD-Sketch.
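To make the two ideas in the abstract concrete, here is a minimal, illustrative Python sketch: a standard Count-Min sketch, plus a composite variant for a modular key (a graph edge given by source and target node ids) that hashes each module and the full key into separate sub-sketches and returns the tightest of the resulting upper bounds. The class names, the equal split of space among sub-sketches, and the choice of salted BLAKE2 hashing are illustrative assumptions; this is not the paper's MOD-Sketch construction or its optimized range allocation.

```python
import hashlib


class CountMinSketch:
    """Standard Count-Min sketch: depth rows, width counters per row.
    Each row uses an independent hash; the point query returns the
    minimum counter, which upper-bounds the true frequency."""

    def __init__(self, width, depth):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row, key):
        # Derive an independent hash per row by salting with the row id
        # (an illustrative choice; any pairwise-independent family works).
        h = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8)
        return int.from_bytes(h.digest(), "big") % self.width

    def update(self, key, count=1):
        for row in range(self.depth):
            self.table[row][self._index(row, key)] += count

    def query(self, key):
        return min(self.table[row][self._index(row, key)]
                   for row in range(self.depth))


class CompositeEdgeSketch:
    """Illustrative composite scheme for a modular key (src, dst):
    hash the full key and each module into separate sub-sketches.
    Since count(src) >= count(src, dst) and count(dst) >= count(src, dst),
    every sub-sketch yields an overestimate, so their minimum is a
    (potentially tighter) upper bound on the true edge frequency."""

    def __init__(self, width, depth):
        self.full = CountMinSketch(width, depth)
        self.src = CountMinSketch(width, depth)
        self.dst = CountMinSketch(width, depth)

    def update(self, src, dst, count=1):
        self.full.update((src, dst), count)
        self.src.update(src, count)
        self.dst.update(dst, count)

    def query(self, src, dst):
        return min(self.full.query((src, dst)),
                   self.src.query(src),
                   self.dst.query(dst))
```

In this toy version the space is split evenly across the three sub-sketches; a key point of the paper is that choosing which module combinations to sketch and how to size each hash range is itself a non-trivial optimization.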

Citations (1)
