
EnclaveTree: Privacy-preserving Data Stream Training and Inference Using TEE (2203.01438v1)

Published 2 Mar 2022 in cs.CR

Abstract: The classification service over a stream of data is becoming an important offering for cloud providers, but users may hesitate to provide sensitive data due to privacy concerns. While Trusted Execution Environments (TEEs) are promising solutions for protecting private data, they remain vulnerable to side-channel attacks induced by data-dependent access patterns. We propose a Privacy-preserving Data Stream Training and Inference scheme, called EnclaveTree, that provides confidentiality for users' data and the target models against a compromised cloud service provider. We design a matrix-based training and inference procedure that trains the Hoeffding Tree (HT) model and performs inference with the trained model inside the trusted area of TEEs, which provably prevents the exploitation of access-pattern-based attacks. The performance evaluation shows that EnclaveTree is practical for processing data streams with a small or medium number of features. When there are fewer than 63 binary features, EnclaveTree is up to ∼10× and ∼9× faster than the naïve oblivious solution on training and inference, respectively.
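The abstract's key idea is replacing data-dependent indexed updates with matrix arithmetic, so the enclave's memory access pattern reveals nothing about the stream. The sketch below (an illustrative assumption, not the paper's actual implementation; `oblivious_update` and its shapes are hypothetical) shows the principle for the per-leaf counters a Hoeffding Tree maintains: instead of incrementing one cell at a data-dependent address, every counter is read and written on every update via one-hot outer products.

```python
import numpy as np

def oblivious_update(counts, sample, label, n_values, n_classes):
    """Update HT leaf statistics without data-dependent memory accesses.

    counts : (n_features, n_values, n_classes) integer array of
             feature-value/class counters.
    sample : integer-encoded feature values, shape (n_features,).
    label  : integer class of the sample.

    A naive update would do counts[f, sample[f], label] += 1, whose
    address leaks the sample. Here, one-hot encodings are built by
    scanning fixed-size ranges, and the outer product touches every
    cell of `counts`, adding 1 only at (f, sample[f], label).
    """
    # Boolean masks over fixed ranges; access pattern is input-independent.
    value_onehot = (np.arange(n_values)[None, :] == sample[:, None]).astype(np.int64)
    class_onehot = (np.arange(n_classes) == label).astype(np.int64)
    # Dense write over the whole counter tensor.
    counts += value_onehot[:, :, None] * class_onehot[None, None, :]
    return counts
```

At this level of abstraction the access pattern over the arrays is fixed; a real enclave implementation would additionally need constant-time comparisons to avoid branch-based leakage, which is outside this sketch.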

Citations (3)
