Data collaboration analysis for distributed datasets (1902.07535v1)

Published 20 Feb 2019 in cs.LG and stat.ML

Abstract: In this paper, we propose a data collaboration analysis method for distributed datasets. The proposed method performs centralized machine learning while the training datasets and models remain distributed across several institutions. Recently, data have become large and distributed as the cost of data collection has decreased. If we could centralize these distributed datasets and analyse them as one dataset, we would expect to obtain novel insights and achieve higher prediction performance compared with individual analyses of each distributed dataset. However, it is generally difficult to centralize the original datasets because of their huge size or because of privacy concerns. To avoid these difficulties, we propose a data collaboration analysis method for distributed datasets that does not share the original datasets. The proposed method centralizes only intermediate representations constructed individually by each institution instead of the original datasets.
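The core idea, centralizing only locally constructed intermediate representations rather than raw data, can be illustrated with a short sketch. The code below is a minimal illustration under assumptions made here for concreteness, not the paper's exact algorithm: PCA as each institution's private dimensionality reduction, a shared random "anchor" dataset used to align the per-institution representations, an SVD-based common space, and logistic regression as the centralized learner.

# Minimal sketch of the data collaboration idea (illustrative assumptions, see above).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_features, dim = 20, 5

# Private datasets held by two institutions (synthetic stand-ins).
X1, y1 = rng.normal(size=(100, n_features)), rng.integers(0, 2, 100)
X2, y2 = rng.normal(size=(120, n_features)), rng.integers(0, 2, 120)

# Shareable anchor data: random, so it carries no private information.
X_anchor = rng.normal(size=(50, n_features))

# Step 1 (local): each institution builds its own intermediate representation and
# applies the same map to the anchor data. Only these reduced matrices are shared.
f1 = PCA(n_components=dim).fit(X1)
f2 = PCA(n_components=dim).fit(X2)
Z1, A1 = f1.transform(X1), f1.transform(X_anchor)
Z2, A2 = f2.transform(X2), f2.transform(X_anchor)

# Step 2 (central): align the representations through the anchor data. A common
# target space is taken from the SVD of the concatenated anchor images, and each
# institution gets a linear map into that space via least squares.
U, _, _ = np.linalg.svd(np.hstack([A1, A2]), full_matrices=False)
target = U[:, :dim]
G1, _, _, _ = np.linalg.lstsq(A1, target, rcond=None)
G2, _, _, _ = np.linalg.lstsq(A2, target, rcond=None)

# Step 3 (central): train one model on the aligned collaboration representations.
Z = np.vstack([Z1 @ G1, Z2 @ G2])
y = np.concatenate([y1, y2])
model = LogisticRegression().fit(Z, y)
print("training accuracy:", model.score(Z, y))

In this sketch the raw matrices X1 and X2 never leave their institutions; only the low-dimensional representations and the anchor images are shared, which is the size- and privacy-motivated trade-off the abstract describes.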

Citations (3)
