
CloudMe Forensics: A Case of Big-Data Investigation (1807.10218v1)

Published 26 Jul 2018 in cs.CR

Abstract: The increasing volume, variety, and velocity of data has been an area of concern in cloud forensics. At some point, the high volume of data becomes computationally exhaustive to extract and analyse fully in a timely manner. To narrow the scope of an investigation, a digital forensic practitioner must possess well-rounded knowledge of the most relevant data artefacts from the cloud product under investigation. In this paper, we examine the residual artefacts left by the use of the CloudMe cloud storage service. We demonstrate the types and locations of artefacts relating to installation, uninstallation, log-in, log-off, and file synchronisation activities on desktop and mobile clients. Findings from this research pave the way towards the development of data mining methods for cloud-enabled big-data endpoint forensic investigation.
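The artefact-triage workflow the abstract describes, checking known client locations for residual files, can be sketched minimally in Python. The paths below are illustrative placeholders, not artefact locations confirmed by the paper; a practitioner would substitute the locations the study actually reports.

```python
import os

# Hypothetical candidate locations for CloudMe desktop-client artefacts.
# These paths are assumptions for illustration only, not findings from the paper.
CANDIDATE_ARTEFACTS = [
    r"%APPDATA%\CloudMe",       # assumed client data directory (Windows)
    r"%LOCALAPPDATA%\CloudMe",  # assumed cache/log directory (Windows)
]

def triage(paths):
    """Expand environment variables and keep only paths that exist on disk."""
    expanded = (os.path.expandvars(p) for p in paths)
    return [p for p in expanded if os.path.exists(p)]

if __name__ == "__main__":
    for hit in triage(CANDIDATE_ARTEFACTS):
        print(hit)
```

In practice such a check is only the first step; the study's contribution is identifying which of these residual files (from installation, log-in/log-off, and sync activity) carry evidential value.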

Citations (35)
