Error Propagation Analysis for Multithreaded Programs: An Empirical Approach (2312.16791v1)

Published 28 Dec 2023 in cs.SE and cs.DC

Abstract: Fault injection is a technique to measure the robustness of a program to errors by introducing faults into the program under test. Following a fault injection experiment, Error Propagation Analysis (EPA) is applied to understand how errors affect a program's execution. EPA typically compares the traces of a fault-free (golden) run with those from a faulty run of the program. While this suffices for deterministic programs, such EPA approaches are unsound for multithreaded programs, whose golden runs are non-deterministic. In this paper, we propose Invariant Propagation Analysis (IPA): the use of automatically inferred likely invariants ("invariants" in the following) in lieu of golden traces for conducting EPA in multithreaded programs. We evaluate the stability and fault coverage of invariants derived by IPA through fault injection experiments across six different fault types and six representative programs that can be executed with varying numbers of threads. We find that stable invariants can be inferred in all cases, but their fault coverage depends on the application and the fault type. We also find that fault coverage for multithreaded executions with IPA can be even higher than for traditional single-threaded EPA, which emphasizes that IPA results cannot be trivially extrapolated from traditional EPA results.
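To make the contrast concrete: trace-based EPA diffs a faulty run against one deterministic golden trace, which breaks down when thread scheduling makes every golden run differ. The sketch below illustrates the invariant-based alternative in spirit only; it is not the paper's implementation (which builds on likely-invariant inference over instrumented programs). The variable names, the toy traces, and the min/max range-invariant template are all hypothetical, chosen to keep the example minimal.

```python
# Minimal sketch of IPA's core idea: infer likely invariants from several
# fault-free runs (whose raw traces differ due to thread scheduling), then
# flag a faulty run that violates those invariants. Hypothetical example,
# not the paper's tooling.

from collections import defaultdict

def infer_range_invariants(golden_runs):
    """Infer per-variable [lo, hi] likely invariants from fault-free runs."""
    lo = defaultdict(lambda: float("inf"))
    hi = defaultdict(lambda: float("-inf"))
    for run in golden_runs:                     # each run: {var: [observed values]}
        for var, values in run.items():
            lo[var] = min(lo[var], *values)
            hi[var] = max(hi[var], *values)
    return {var: (lo[var], hi[var]) for var in lo}

def violations(invariants, faulty_run):
    """Report variables whose observed values fall outside the inferred ranges."""
    report = []
    for var, values in faulty_run.items():
        if var not in invariants:
            continue
        low, high = invariants[var]
        bad = [v for v in values if not (low <= v <= high)]
        if bad:
            report.append((var, bad))
    return report

if __name__ == "__main__":
    # Two golden runs under different thread interleavings: the traces
    # differ, yet both satisfy the same range invariants.
    golden = [
        {"queue_len": [0, 1, 2, 1], "balance": [100, 90, 95]},
        {"queue_len": [0, 2, 1, 0], "balance": [100, 95, 90]},
    ]
    inv = infer_range_invariants(golden)

    # An injected fault corrupts `balance`; the invariant check catches it
    # without needing any single deterministic golden trace to diff against.
    faulty = {"queue_len": [0, 1, 2], "balance": [100, -32668]}
    print(violations(inv, faulty))   # -> [('balance', [-32668])]
```

A violated invariant signals error propagation without a deterministic reference trace, which is why the approach stays sound under non-deterministic interleavings; as the abstract notes, however, fault coverage then hinges on how well the inferred invariants capture a given fault type's effects.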
