
Impact of Extensions on Browser Performance: An Empirical Study on Google Chrome (2404.06827v1)

Published 10 Apr 2024 in cs.PF, cs.HC, and cs.SE

Abstract: Web browsers have been used widely by users to conduct various online activities, such as information seeking or online shopping. To improve user experience and extend the functionality of browsers, practitioners provide mechanisms to allow users to install third-party-provided plugins (i.e., extensions) on their browsers. However, little is known about the performance implications caused by such extensions. In this paper, we conduct an empirical study to understand the impact of extensions on the user-perceived performance (i.e., energy consumption and page load time) of Google Chrome, the most popular browser. We study a total of 72 representative extensions from 11 categories (e.g., Developer Tools and Sports). We observe that browser performance can be negatively impacted by the use of extensions, even when the extensions are used in unintended circumstances (e.g., when logging into an extension is not granted but required, or when an extension is not used for designated websites). We also identify a set of factors that significantly influence the performance impact of extensions, such as code complexity and privacy practices (i.e., collection of user data) adopted by the extensions. Based on our empirical observations, we provide recommendations for developers and users to mitigate the performance impact of browser extensions, such as conducting performance testing and optimization for unintended usage scenarios of extensions, or adhering to proper usage practices of extensions (e.g., logging into an extension when required).

Authors

  1. Bihui Jin
  2. Heng Li
  3. Ying Zou

Summary

Introduction to the Study

The research conducted by Jin, Li, and Zou offers an in-depth examination of how browser extensions influence two user-perceived performance metrics of Google Chrome: page load time and energy consumption. The paper is distinguished by its breadth: it explores not only the immediate impact of extension usage but also the effects that different activation modes and extension-specific factors have on energy efficiency and speed.

Study Design and Methodology

The experimental design selected 72 representative extensions from a pool of 110,240, spanning 11 categories, to cover a wide spectrum of extension functionality. The research measured energy use across two phases, page load and post-load (stabilized consumption), and assessed the impact on performance under seven distinct user scenario settings.

Collecting and Clustering Extensions

The initial phase collected extensions from the Chrome Web Store and clustered them by their declared privacy practices (i.e., what user data they collect), information crucial for understanding each extension's potential performance impact. This process ensured a diverse and representative sample for the experimental analysis.
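The clustering step described above can be sketched in miniature: encode each extension's declared practices as a binary vector, compare extensions by how many declarations differ, and pick the most central member of a group as its representative (the choice a k-medoids-style algorithm would make). The practice labels and extension names below are hypothetical examples, not the paper's actual data.

```python
# Hypothetical practice labels; the real study uses Chrome Web Store declarations.
PRACTICES = ["location", "user_activity", "web_history", "personal_info"]

def practice_vector(declared):
    """Encode an extension's declared practices as a 0/1 vector."""
    return [1 if p in declared else 0 for p in PRACTICES]

def hamming(a, b):
    """Number of practice declarations on which two extensions differ."""
    return sum(x != y for x, y in zip(a, b))

def medoid(vectors):
    """The vector with the smallest total distance to all others --
    the cluster representative a k-medoids-style clustering would pick."""
    return min(vectors, key=lambda v: sum(hamming(v, w) for w in vectors))

# Toy sample of three extensions.
extensions = {
    "ext_a": practice_vector({"location", "user_activity"}),
    "ext_b": practice_vector({"location"}),
    "ext_c": practice_vector({"personal_info"}),
}
rep = medoid(list(extensions.values()))
```

Sampling one representative per cluster is what keeps the experiment tractable (72 extensions) while still covering the diversity of the full pool.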

Designing Testing Scenarios

Seven testing scenarios were designed to mirror real-world interactions with the extensions, including unintended usage such as not logging into an extension that requires a login, or using an extension on websites it was not designed for.

Performance Measurement Strategy

The study used the Running Average Power Limit (RAPL) interface for energy measurements and Selenium to automate the user-interaction scenarios, maintaining a high degree of accuracy and reproducibility. The measurements covered page load time and energy consumption, giving a comprehensive view of the performance impact across usage contexts.
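A RAPL-based measurement of the kind described above typically samples a monotonically increasing energy counter before and after a workload. On Linux the powercap interface exposes this counter in microjoules, and it wraps around at a documented maximum, so a reading that spans a wrap must be corrected. The sketch below assumes the standard powercap sysfs paths; the `measure` helper and its use with a Selenium page load are illustrative, not the paper's exact tooling.

```python
# Standard Linux powercap paths for the package-level RAPL domain.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
RAPL_RANGE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_uj(path):
    """Read a RAPL counter value (microjoules) from sysfs."""
    with open(path) as f:
        return int(f.read())

def energy_delta_uj(start, end, max_range):
    """Energy consumed between two counter samples, corrected for wraparound."""
    if end >= start:
        return end - start
    return (max_range - start) + end  # counter wrapped once

def measure(workload):
    """Run a workload (e.g. a Selenium-driven page load) and return joules used."""
    max_range = read_uj(RAPL_RANGE)
    start = read_uj(RAPL_ENERGY)
    workload()
    end = read_uj(RAPL_ENERGY)
    return energy_delta_uj(start, end, max_range) / 1e6
```

Repeating such a measurement many times and reporting the distribution, rather than a single run, is what makes energy comparisons between extension configurations statistically meaningful.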

Key Findings and Insights

The Dual Impact of Extensions

Extensions can both enhance and degrade browser performance, with a pronounced tendency toward higher energy consumption during page load. This nuanced picture underscores the complexity of extensions' influence and challenges the notion of a one-size-fits-all impact.
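Comparisons of this kind, between measurements taken with and without an extension, are commonly quantified with a nonparametric effect size such as Cliff's delta, which suits the skewed distributions performance experiments tend to produce. A minimal sketch, with made-up page-load-time samples:

```python
def cliffs_delta(xs, ys):
    """Fraction of (x, y) pairs with x > y minus fraction with x < y.
    Ranges from -1 (ys dominate) to +1 (xs dominate); 0 means no difference."""
    gt = sum(1 for x in xs for y in ys if x > y)
    lt = sum(1 for x in xs for y in ys if x < y)
    return (gt - lt) / (len(xs) * len(ys))

# Hypothetical page load times in seconds, not the paper's data.
with_ext = [1.42, 1.55, 1.61, 1.49, 1.58]
without_ext = [1.31, 1.36, 1.44, 1.29, 1.40]

d = cliffs_delta(with_ext, without_ext)  # near +1: loads are slower with the extension
```

Unlike a raw mean difference, the statistic only asks how often one condition beats the other, so a few outlier runs cannot dominate the conclusion.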

Activation Modes Matter

A notable insight of this paper is the measurable performance differential across activation modes. Even inactive extensions, or extensions used outside their designated circumstances (e.g., not logged in when a login is required, or applied to non-designated websites), had a significant performance impact, underlining the importance of context and usage patterns in assessing extension performance.

Influential Factors on Performance

Using elastic net regression models, the analysis identified key factors that influence the performance impact of extensions, notably the number of functions, the privacy practices adopted (i.e., collection of user data), and the specific file types within extensions. These factors offer actionable guidance for developers optimizing extension performance.
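The elastic net combines a least-squares fit with a mix of L1 (sparsity, which drops uninformative factors) and L2 (which keeps groups of correlated factors together) penalties. A minimal sketch of the objective being minimized; in practice one would fit it with a library such as scikit-learn or glmnet rather than by hand:

```python
def elastic_net_objective(X, y, beta, lam, alpha):
    """(1/2n) * RSS + lam * (alpha * ||beta||_1 + (1 - alpha)/2 * ||beta||_2^2).

    X: rows of feature values, y: targets, beta: coefficients,
    lam: overall penalty strength, alpha: L1/L2 mixing in [0, 1].
    """
    n = len(y)
    rss = sum((yi - sum(b * xi for b, xi in zip(beta, row))) ** 2
              for row, yi in zip(X, y))
    l1 = sum(abs(b) for b in beta)
    l2 = sum(b * b for b in beta)
    return rss / (2 * n) + lam * (alpha * l1 + (1 - alpha) / 2 * l2)
```

With `alpha=1` this reduces to the lasso and with `alpha=0` to ridge regression; the mixing is what lets the model simultaneously select influential factors (such as code complexity or privacy practices) and handle correlation among them.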

Theoretical and Practical Implications

This empirical investigation enriches the academic discourse on browser extensions, offering a granular understanding of how different elements and usage scenarios affect performance. Practically, it equips developers with data-driven insights to refine extensions, enhancing efficiency and usability. Furthermore, it empowers users to make informed decisions regarding extension usage, balancing functionality with performance considerations.

Future Directions in Browser Extension Research

Building on this foundation, future research could explore the longitudinal impact of extensions on browser performance, considering evolving web standards and user expectations. Additionally, investigations into the interplay between extension-induced performance variations and user behavior patterns could yield further refinements in extension development and optimization practices.

Conclusion

Jin, Li, and Zou's work establishes a nuanced understanding of browser extensions' impact on performance within Google Chrome, marked by methodological rigor and a comprehensive analytical lens. Their findings point the way toward better user experiences and extension development practices, highlighting the balance between added functionality and performance overhead.
