Downsizing Diffusion Models for Cardinality Estimation
Authors
Xinhe Mu
Zhaoqi Zhou
Zaijiu Shang
Chuan Zhou
Gang Fu
Guiying Yan
Guoliang Li
Zhiming Ma
Abstract
Learned cardinality estimation requires accurate model designs to capture the local characteristics of probability distributions. However, existing models may fail to capture complex, multilateral dependencies between attributes. Diffusion models, meanwhile, succeed at estimating image distributions with thousands of dimensions, making them promising candidates, but their heavy weights and high inference latency prohibit effective deployment. We make diffusion models more lightweight by introducing Accelerated Diffusion Cardest (ADC), the first "downsized" diffusion model framework for efficient, high-precision cardinality estimation. ADC uses a hybrid architecture that integrates a Gaussian mixture–Bayesian network selectivity estimator with a score-based density estimator to perform precise Monte Carlo integration. To address the prohibitive inference latencies common in large generative models, we provide theoretical advancements concerning the asymptotic behavior of score functions as time $t$ approaches zero, together with convergence-rate estimates as $t$ increases, enabling score-based diffusion models to be adapted to the moderate dimensionalities and stringent latency requirements of database systems.
Through experiments against five learned estimators, including the state-of-the-art Naru, we demonstrate that ADC offers superior robustness on datasets with multilateral dependencies, which cannot be effectively summarized using pairwise or triple-wise correlations; on such datasets, ADC is 10 times more accurate than Naru. Additionally, ADC achieves accuracy comparable to Naru's across all tested datasets while running at half of Naru's latency and requiring minimal storage (<350 KB) on most datasets.