Clones in Deep Learning Code: What, Where, and Why? (2107.13614v1)

Published 28 Jul 2021 in cs.SE

Abstract: Deep learning applications are becoming increasingly popular, and developers of deep learning systems strive to write more efficient code. Deep learning systems are constantly evolving, imposing tighter development timelines and increasing complexity, which may lead to bad design decisions. A copy-paste approach is widely used among deep learning developers because they rely on common frameworks and duplicate similar tasks. Developers often fail to properly propagate changes to all clone fragments during a maintenance activity. To our knowledge, no study has examined code cloning practices in deep learning development. Given the negative impacts of clones on software quality reported in studies of traditional systems, it is important to understand the characteristics and potential impacts of code clones on deep learning systems. To this end, we use the NiCad tool to detect clones in 59 Python, 14 C#, and 6 Java deep learning systems and an equal number of traditional software systems. We then analyze the frequency and distribution of code clones in deep learning and traditional systems, and further analyze the distribution of code clones using a location-based taxonomy. We also study the correlation between bugs and code clones to assess the impact of clones on the quality of the studied systems. Finally, we introduce a code clone taxonomy related to deep learning programs and identify the deep learning development phases in which cloning has the highest risk of faults. Our results show that code cloning is a frequent practice in deep learning systems and that deep learning developers often clone code from files in distant repositories within the system. In addition, we found that code cloning occurs more frequently during DL model construction, and that hyperparameter setting is the phase during which cloning is the riskiest, since it often leads to faults.
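To make the notion of a near-miss clone concrete in a DL setting, the sketch below is an illustration only, not the paper's NiCad pipeline (which relies on TXL-based normalization and line-level comparison). It blanks out literals before comparing two hypothetical model-construction fragments; the helper names build_mlp, make_optimizer, and train are assumptions for the example, not code from the studied systems.

```python
# Illustrative sketch of near-miss clone detection, not NiCad itself.
# Literals (hyperparameters) are normalized away so that copy-pasted,
# lightly tweaked fragments compare as identical.
import ast
import difflib


def normalize(source: str) -> str:
    """Parse a fragment and erase literal values (learning rates, sizes, ...)."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Constant):
            node.value = 0  # blank out the literal
    return ast.unparse(tree)


def similarity(frag_a: str, frag_b: str) -> float:
    """Line-based similarity ratio between two normalized fragments."""
    a_lines = normalize(frag_a).splitlines()
    b_lines = normalize(frag_b).splitlines()
    return difflib.SequenceMatcher(None, a_lines, b_lines).ratio()


# Two hypothetical model-construction fragments that differ only in
# hyperparameter values -- the copy-paste pattern described in the abstract.
frag_a = """
model = build_mlp(hidden=256, dropout=0.5)
optimizer = make_optimizer(model, lr=0.001)
train(model, optimizer, epochs=50)
"""

frag_b = """
model = build_mlp(hidden=128, dropout=0.3)
optimizer = make_optimizer(model, lr=0.01)
train(model, optimizer, epochs=100)
"""

print(f"similarity: {similarity(frag_a, frag_b):.2f}")  # 1.00 after normalization
```

A similarity of 1.0 here corresponds to a Type-2-style clone: structurally identical code that differs only in literal values such as the learning rate, dropout, and epoch count, which is the hyperparameter-setting cloning the abstract identifies as the riskiest.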

Authors (4)
  1. Hadhemi Jebnoun (1 paper)
  2. Md Saidur Rahman (11 papers)
  3. Foutse Khomh (140 papers)
  4. Biruk Asmare Muse (5 papers)
Citations (9)
