
DLBC: A Deep Learning-Based Consensus in Blockchains for Deep Learning Services (1904.07349v2)

Published 15 Apr 2019 in cs.DC and cs.CV

Abstract: With the growing application of artificial intelligence, deep neural network (DNN) training has become a common workload. However, training a good deep learning model incurs enormous computation cost and energy consumption. Meanwhile, blockchains have seen wide adoption, and a huge amount of computational resources is wasted on their Proof of Work (PoW) consensus. In this paper, we propose DLBC, which exploits the computation power of miners for deep learning training as proof of useful work instead of hash calculation. It distinguishes itself from recent proof-of-useful-work mechanisms by addressing several of their limitations. Specifically, DLBC handles multiple tasks and larger models and training datasets, and introduces a comprehensive ranking mechanism that considers task difficulty (e.g., model complexity, network burden, data size, queue length). We also apply DNN watermarking [1] to improve robustness. In Section V, the average digital-signature overheads are 1.25, 0.001, 0.002, and 0.98 seconds, and the average network overheads are 3.77, 3.01, 0.37, and 0.41 seconds, respectively. Embedding a watermark takes 3 epochs, while removing one takes 30 epochs. This removal penalty deters attackers from stealing, improving, and resubmitting DL models produced by honest miners.
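
The abstract names the factors that feed DLBC's task-ranking mechanism (model complexity, network burden, data size, queue length) but not the exact formula, which is given in the paper itself. The sketch below is only an illustration of how such a weighted difficulty score might be combined and used to order pending tasks; the field names, weights, and normalization are assumptions, not the paper's definition.

```python
# Illustrative sketch only: the ranking formula, weights, and normalization
# below are assumptions for intuition, not DLBC's published mechanism.
from dataclasses import dataclass

@dataclass
class Task:
    model_params: int       # proxy for model complexity
    data_size_mb: float     # training dataset size
    network_cost_mb: float  # data/model transfer burden
    queue_length: int       # pending tasks ahead of this one

def difficulty_score(t: Task,
                     w_model: float = 0.4,
                     w_data: float = 0.3,
                     w_net: float = 0.2,
                     w_queue: float = 0.1) -> float:
    """Combine the difficulty factors named in the abstract into one score
    (higher = harder). The weights here are placeholders."""
    return (w_model * t.model_params / 1e6
            + w_data * t.data_size_mb
            + w_net * t.network_cost_mb
            + w_queue * t.queue_length)

# Rank pending training tasks so miners can pick work according to difficulty.
tasks = [Task(5_000_000, 120.0, 40.0, 3), Task(1_200_000, 800.0, 10.0, 1)]
ranked = sorted(tasks, key=difficulty_score, reverse=True)
```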

Citations (12)
