
Optimization of stochastic database cracking (1305.1713v1)

Published 8 May 2013 in cs.DB

Abstract: Stochastic cracking is a significantly more resilient approach to adaptive indexing. It has been shown [1] that stochastic cracking uses each query as a hint on how to reorganize data, but not blindly so; it gains resilience and avoids performance bottlenecks by deliberately introducing random choices into its decision making, bringing adaptive indexing to a mature formulation with the workload robustness that previous approaches lacked. Original cracking relies on the randomness of the workload to converge well [2][3]; where the workload is non-random, cracking must introduce randomness on its own. Stochastic cracking clearly improves on original cracking by remaining robust under workload changes while retaining all of original cracking's adaptive features. Yet both forms of cracking still present an incomplete picture, since at some point one must know whether the workload is random or sequential. In this paper we focus on optimizing stochastic cracking, which can be achieved in two ways: by reducing the initialization cost, making stochastic cracking even more transparent to the user, especially for queries that initiate a workload change and hence incur a higher cost; or by combining the strengths of the various stochastic cracking algorithms through a dynamic component that decides on the fly which algorithm to apply to each query. We develop an algorithm that reduces the initialization cost by using the core notion of both forms of cracking while meeting the requirements of adaptive indexing [2].
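To make the ideas in the abstract concrete, here is a toy sketch of database cracking with a stochastic twist. It is not the paper's actual algorithms (such as the DDC/DDR variants of stochastic cracking); it is a simplified illustration under assumed names: each range query partitions ("cracks") the piece of the column it touches, and, as the stochastic step, the piece is first split at a randomly chosen element so that a sequential workload cannot keep forcing scans of one huge unsorted piece.

```python
import bisect
import random

def crack_in_two(arr, lo, hi, pivot):
    """Partition arr[lo:hi) in place: values < pivot on the left,
    values >= pivot on the right. Returns the split position."""
    i, j = lo, hi - 1
    while i <= j:
        while i <= j and arr[i] < pivot:
            i += 1
        while i <= j and arr[j] >= pivot:
            j -= 1
        if i < j:
            arr[i], arr[j] = arr[j], arr[i]
    return i

class StochasticCracker:
    """Toy adaptive index: queries incrementally sort the column."""

    def __init__(self, values):
        self.arr = list(values)
        self.pivots = []     # sorted pivot values seen so far
        self.positions = []  # positions[k] = first index with value >= pivots[k]

    def _register(self, pivot, pos):
        k = bisect.bisect_left(self.pivots, pivot)
        if k < len(self.pivots) and self.pivots[k] == pivot:
            return
        self.pivots.insert(k, pivot)
        self.positions.insert(k, pos)

    def _piece(self, v):
        # Boundaries of the unsorted piece that must contain value v.
        k = bisect.bisect_right(self.pivots, v)
        lo = self.positions[k - 1] if k > 0 else 0
        hi = self.positions[k] if k < len(self.positions) else len(self.arr)
        return lo, hi

    def crack(self, v):
        lo, hi = self._piece(v)
        if hi - lo > 1:
            # Stochastic step: split the piece at a random element first,
            # so non-random (e.g. sequential) workloads still shrink pieces.
            r = self.arr[random.randrange(lo, hi)]
            mid = crack_in_two(self.arr, lo, hi, r)
            self._register(r, mid)
            lo, hi = self._piece(v)
        pos = crack_in_two(self.arr, lo, hi, v)
        self._register(v, pos)
        return pos  # first index with value >= v

    def range_count(self, a, b):
        # After cracking at both bounds, the answer for [a, b) is just
        # the distance between the two split positions.
        return self.crack(b) - self.crack(a)

c = StochasticCracker([5, 2, 9, 1, 7, 3, 8])
print(c.range_count(3, 8))  # 3 values fall in [3, 8): 3, 5, 7
```

The count is exact regardless of which random pivots are drawn; randomness only affects how quickly the column converges toward sorted order, which is the robustness property the abstract attributes to stochastic cracking.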
