Algorithmic Transparency and Participation through the Handoff Lens: Lessons Learned from the U.S. Census Bureau's Adoption of Differential Privacy (2405.19187v1)

Published 29 May 2024 in cs.CY

Abstract: Emerging discussions on the responsible government use of algorithmic technologies propose transparency and public participation as key mechanisms for preserving accountability and trust. But in practice, the adoption and use of any technology shifts the social, organizational, and political context in which it is embedded. Therefore translating transparency and participation efforts into meaningful, effective accountability must take into account these shifts. We adopt two theoretical frames, Mulligan and Nissenbaum's handoff model and Star and Griesemer's boundary objects, to reveal such shifts during the U.S. Census Bureau's adoption of differential privacy (DP) in its updated disclosure avoidance system (DAS) for the 2020 census. This update preserved (and arguably strengthened) the confidentiality protections that the Bureau is mandated to uphold, and the Bureau engaged in a range of activities to facilitate public understanding of and participation in the system design process. Using publicly available documents concerning the Census' implementation of DP, this case study seeks to expand our understanding of how technical shifts implicate values, how such shifts can afford (or fail to afford) greater transparency and participation in system design, and the importance of localized expertise throughout. We present three lessons from this case study toward grounding understandings of algorithmic transparency and participation: (1) efforts towards transparency and participation in algorithmic governance must center values and policy decisions, not just technical design decisions; (2) the handoff model is a useful tool for revealing how such values may be cloaked beneath technical decisions; and (3) boundary objects alone cannot bridge distant communities without trusted experts traveling alongside to broker their adoption.
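
For readers coming from outside the differential privacy literature, the sketch below illustrates the basic primitive the abstract refers to: calibrated random noise is added to a published count so that the output reveals little about any single respondent, with the privacy-loss parameter epsilon governing the trade-off between privacy and accuracy (see Dwork et al., reference 43). This is only a minimal, hypothetical Python sketch; it is not the Bureau's TopDown Algorithm (reference 3), and the function and variable names are invented for illustration.

    # Illustrative sketch only: the Laplace mechanism, a basic differential
    # privacy primitive. NOT the Census Bureau's TopDown Algorithm; it only
    # shows how calibrated noise mediates the privacy/accuracy trade-off
    # for a single count query.
    import numpy as np

    def noisy_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
        """Return an epsilon-differentially-private version of a count.

        Adding or removing one record changes a count by at most `sensitivity`,
        so Laplace noise with scale sensitivity / epsilon suffices.
        """
        return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

    # Smaller epsilon -> stronger privacy guarantee, but a noisier published count.
    block_population = 37  # hypothetical census-block count
    for eps in (0.1, 1.0, 10.0):
        print(f"epsilon={eps}: published count ~ {noisy_count(block_population, eps):.1f}")

The values and policy questions the paper foregrounds, such as how to set epsilon, which statistics to hold invariant, and whether to release the noisy measurement files, sit above a primitive like this rather than inside it.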

References (138)
  1. 2021. Alabama v. U.S. Dep’t of Commerce. 546 F. Supp. 3d 1057 (M.D. Ala.).
  2. Roles for computing in social change. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (FAT* ’20). Association for Computing Machinery, New York, NY, USA, 252–260. https://doi.org/10.1145/3351095.3372871
  3. The 2020 Census Disclosure Avoidance System TopDown Algorithm. Harvard Data Science Review Special Issue 2 (June 2022). https://hdsr.mitpress.mit.edu/pub/7evz361i.
  4. John M. Abowd. 2018a. Disclosure Avoidance for Block Level Data and Protection of Confidentiality in Public Tabulations. https://www2.census.gov/cac/sac/meetings/2018-12/abowd-disclosure-avoidance.pdf
  5. John M. Abowd. 2018b. Protecting the Confidentiality of America’s Statistics: Adopting Modern Disclosure Avoidance Methods at the Census Bureau. https://www.census.gov/newsroom/blogs/research-matters/2018/08/protecting_the_confi.html Section: Government.
  6. John M. Abowd. 2021. Declaration of John M. Abowd. In State of Alabama v. U.S. Department of Commerce. https://censusproject.files.wordpress.com/2021/04/2021.04.13-abowd-declaration-alabama-v.-commerce-ii-final-signed.pdf
  7. John M. Abowd and Michael B. Hawes. 2023. Confidentiality Protection in the 2020 US Census of Population and Housing. Annual Review of Statistics and Its Application 10, 1 (March 2023), 119–144. https://doi.org/10.1146/annurev-statistics-010422-034226
  8. John M Abowd and Ian M Schmutte. 2019. An economic analysis of privacy protection and statistical accuracy as social choices. American Economic Review 109, 1 (2019), 171–202.
  9. John M. Abowd and Victoria A. Velkoff. 2019. Balancing privacy and accuracy: New opportunity for disclosure avoidance analysis. Census Blogs (2019).
  10. John M. Abowd and Victoria A. Velkoff. 2020. Modernizing disclosure avoidance: What we’ve learned, where we are now. Census Blogs (2020).
  11. Madeleine Akrich. 1992. The De-Scription of Technical Objects. In Shaping Technology / Building Society: Studies in Sociotechnical Change, Wiebe E. Bijker, John Law, Trevor Pinch, and Rebecca Slayton (Eds.). MIT Press, Cambridge, MA, USA, 208.
  12. Kevin Allis. 2020. [Letter from Kevin Allis to Steven D. Dillingham]. https://archive.ncai.org/policy-research-center/research-data/recommendations/NCAI_Letter_to_US_Census_Bureau_on_DAS_6_25_2020_FINAL_signed.pdf
  13. Mike Ananny and Kate Crawford. 2016. Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society 20, 3 (2016), 973–989.
  14. Solon Barocas and Moritz Hardt. 2014. Scope. https://www.fatml.org/schedule/2014/page/scope-2014
  15. It’s Just Not That Simple: An Empirical Study of the Accuracy-Explainability Trade-off in Machine Learning for Public Policy. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’22). Association for Computing Machinery, New York, NY, USA, 248–266. https://doi.org/10.1145/3531146.3533090
  16. Joseph R. Biden. 2023. Executive order on the safe, secure, and trustworthy development and use of artificial intelligence. (2023).
  17. Power to the people? opportunities and challenges for participatory AI. Equity and Access in Algorithms, Mechanisms, and Optimization (2022), 1–8.
  18. Dan Bouk and danah boyd. 2021. Democracy’s Data Infrastructure. http://knightcolumbia.org/content/democracys-data-infrastructure
  19. danah boyd and Jayshree Sarathy. 2022. Differential Perspectives: Epistemic Disconnects Surrounding the U.S. Census Bureau’s Use of Differential Privacy. Harvard Data Science Review Special Issue 2 (June 2022). https://doi.org/10.1162/99608f92.66882f0e
  20. Differential Privacy Working Group Deliverables: Report of the CSAC Differential Privacy Working Group. https://www2.census.gov/cac/sac/differential-privacy-wg-deliverables.pdf
  21. Jenna Burrell. 2016. How the machine “thinks”: Understanding opacity in machine learning algorithms. Big Data & Society 3, 1 (June 2016), 205395171562251. https://doi.org/10.1177/2053951715622512
  22. Pat Cantwell. 2021. How We Complete the Census When Households or Group Quarters Don’t Respond. https://www.census.gov/newsroom/blogs/random-samplings/2021/04/imputation-when-households-or-group-quarters-dont-respond.html Section: Government.
  23. Paul R. Carlile. 2002. A pragmatic view of knowledge and boundaries: Boundary objects in new product development. Organization Science 13, 4 (2002), 442–455.
  24. The CARE principles for indigenous data governance. Data Science Journal 19 (2020), 43–43.
  25. Danielle Keats Citron. 2007. Technological due process. Wash. UL Rev. 85 (2007), 1249.
  26. Cary Coglianese and David Lehr. 2019. Transparency and algorithmic governance. Administrative law review 71, 1 (2019), 1–56.
  27. European Commission. 2021. Laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain union legislative acts. Eur Comm 106 (2021), 1–108.
  28. Accountability in an algorithmic society: relationality, responsibility, and robustness in machine learning. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. 864–876.
  29. A systematic review and thematic analysis of community-collaborative approaches to computing research. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–18.
  30. Eric Corbett and Emily Denton. 2023. Interrogating the T in FAccT. In Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. 1624–1634.
  31. Power and Public Participation in AI. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization. 1–13.
  32. Sasha Costanza-Chock. 2020. Design justice: Community-led practices to build the worlds we need. The MIT Press.
  33. Stakeholder Participation in AI: Beyond "Add Diverse Stakeholders and Stir". arXiv preprint arXiv:2111.01122 (2021).
  34. Deloitte. 2020. Trustworthy AI: Bridging the ethics gap surrounding AI. https://www2.deloitte.com/us/en/pages/deloitte-analytics/solutions/ethics-of-ai-framework.html
  35. William Deringer. 2018. Calculated values: Finance, politics, and the quantitative age. Harvard University Press.
  36. Uma Desai. 2019. uscensusbureau/census-dp. https://github.com/uscensusbureau/census-dp
  37. Alain Desrosières. 1998. The politics of large numbers: A history of statistical reasoning. Harvard University Press.
  38. Nicholas Diakopoulos. 2016. Accountability in algorithmic decision making. Commun. ACM 59, 2 (2016), 56–62.
  39. Nicholas Diakopoulos and Michael Koliska. 2017. Algorithmic transparency in the news media. Digital journalism 5, 7 (2017), 809–828.
  40. Irit Dinur and Kobbi Nissim. 2003. Revealing information while preserving privacy. In Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems. ACM, San Diego California, 202–210. https://doi.org/10.1145/773153.773173
  41. Re: Request for release of “noisy measurements file” by September 30 along with redistricting data products. https://gking.harvard.edu/files/gking/files/2021.08.12_group_letter_to_abowd_re_noisy_measurements.pdf
  42. Differential Privacy in Practice: Expose your Epsilons! Journal of Privacy and Confidentiality 9, 2 (Oct. 2019). https://doi.org/10.29012/jpc.689
  43. Calibrating Noise to Sensitivity in Private Data Analysis. In Theory of Cryptography (Lecture Notes in Computer Science), Shai Halevi and Tal Rabin (Eds.). Springer, Berlin, Heidelberg, 265–284. https://doi.org/10.1007/11681878_14
  44. Expanding explainability: Towards social transparency in ai systems. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–19.
  45. Differential Privacy for 2020 US Census. https://assets.pubpub.org/j2yr11kl/11587735061843.pdf
  46. Wendy Nelson Espeland and Mitchell L. Stevens. 1998. Commensuration as a social process. Annual Review of Sociology 24, 1 (1998), 313–343.
  47. Wendy Nelson Espeland and Berit Irene Vannebo. 2007. Accountability, quantification, and law. Annu. Rev. Law Soc. Sci. 3 (2007), 21–43.
  48. International Organization for Standardization. 2020. ISO/IEC TR 24028:2020 Overview of trustworthiness in artificial intelligence. https://www.iso.org/standard/77608.html
  49. Jody Freeman. 1997. Collaborative governance in the administrative state. UCLA L. Rev. 45 (1997), 1.
  50. Issues Encountered Deploying Differential Privacy. In Proceedings of the 2018 Workshop on Privacy in the Electronic Society. ACM, Toronto Canada, 133–137. https://doi.org/10.1145/3267323.3268949
  51. Datasheets for datasets. Commun. ACM 64, 12 (Nov. 2021), 86–92. https://doi.org/10.1145/3458723
  52. Ruobin Gong. 2022. Transparent privacy is principled privacy. Harvard Data Science Review Special Issue 2 (June 2022). https://doi.org/10.1162/99608f92.b5d3faaa
  53. Ben Green. 2019. ”Good” isn’t good enough. In Conference and Workshop on Neural Information Processing Systems (AI for Social Good Workshop). Vancouver. https://aiforsocialgood.github.io/neurips2019/accepted/track3/pdfs/67_aisg_neurips2019.pdf
  54. Kenneth Haase. 2021. uscensusbureau/DAS_2020_Redistricting_Production_Code. https://github.com/uscensusbureau/DAS_2020_Redistricting_Production_Code
  55. Differentially Private Algorithms for 2020 Census Detailed DHC Race & Ethnicity. https://doi.org/10.48550/arXiv.2107.10659 arXiv:2107.10659 [cs, stat].
  56. Michael Hawes. 2021. The Census Bureau’s Simulated Reconstruction-Abetted Re-identification Attack on the 2010 Census. https://www.census.gov/data/academy/webinars/2021/disclosure-avoidance-series/simulated-reconstruction-abetted-re-identification-attack-on-the-2010-census.html Section: Government.
  57. The Dataset Nutrition Label: A Framework to Drive Higher Data Quality Standards. In Data Protection and Democracy, Dara Hallinan, Ronald Leenes, Serge Gutwirth, and Paul De Hert (Eds.). Data Protection and Privacy, Vol. 12. Bloomsbury Publishing, 1–26. Google-Books-ID: F2HRDwAAQBAJ.
  58. Balancing data privacy and usability in the federal statistical system. Proceedings of the National Academy of Sciences 119, 31 (2022), e2104906119.
  59. V. Joseph Hotz and Joseph Salvo. 2022. A Chronicle of the Application of Differential Privacy to the 2020 Census. Harvard Data Science Review Special Issue 2 (June 2022). https://hdsr.mitpress.mit.edu/pub/ql9z7ehf.
  60. Jessica Hullman. 2022. Show me the noisy numbers! (or not). https://statmodeling.stat.columbia.edu/2022/12/28/show-me-the-noisy-numbers-or-not/
  61. Towards accountability for machine learning datasets: Practices from software engineering and infrastructure. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. 560–575.
  62. Abigail Z Jacobs. 2021. Measurement as governance in and for responsible AI. arXiv preprint arXiv:2109.05658 (2021).
  63. Abigail Z Jacobs and Deirdre K Mulligan. 2022. The Hidden Governance in AI. The Regulatory Review (July 2022).
  64. Julia Jahansoozi. 2006. Organization-stakeholder relationships: exploring trust and transparency. Journal of management development 25, 10 (2006), 942–955.
  65. An in-depth examination of requirements for disclosure risk assessment. Proceedings of the National Academy of Sciences 120, 43 (2023), e2220558120.
  66. Sheila Jasanoff. 2016. Reclaiming the Future. In The Ethics of Invention: Technology and the Human Future. W. W. Norton & Company, New York, 211–245.
  67. Margot E. Kaminski. 2020. Understanding transparency in algorithmic accountability. In Cambridge Handbook of the Law of Algorithms, Woodrow Barfield (Ed.). Cambridge University Press, 20–34.
  68. Trustworthy artificial intelligence: a review. ACM Computing Surveys (CSUR) 55, 2 (2022), 1–38.
  69. Sallie Ann Keller and John M. Abowd. 2023. Database reconstruction does compromise confidentiality. Proceedings of the National Academy of Sciences 120, 12 (March 2023), e2300976120. https://doi.org/10.1073/pnas.2300976120 Publisher: Proceedings of the National Academy of Sciences.
  70. Comment: The Essential Role of Policy Evaluation for the 2020 Census Disclosure Avoidance System. arXiv preprint arXiv:2210.08383 (2022).
  71. The use of differential privacy for census data and its impact on redistricting: The case of the 2020 U.S. Census. Science Advances 7, 41 (Oct. 2021). https://doi.org/10.1126/sciadv.abk3283
  72. Innovation and knowledge sharing across professional boundaries: Political interplay between boundary objects and brokers. International Journal of Information Management 30, 5 (2010), 437–444.
  73. Shaping our tools: Contestability as a means to promote responsible algorithmic decision making in the professions. Ethics of Data and Analytics. Auerbach Publications (2022), 420–428.
  74. Joshua A Kroll. 2018. The fallacy of inscrutability. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 376, 2133 (2018), 20180084.
  75. Joshua A Kroll. 2021. Outlining traceability: A principle for operationalizing accountability in computing systems. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. 758–771.
  76. Accountable Algorithms. University of Pennsylvania Law Review 165, 3 (2017), 633.
  77. Boundary Work among Groups, Occupations, and Organizations: From Cartography to Process. Academy of Management Annals 13, 2 (July 2019), 704–736. https://doi.org/10.5465/annals.2017.0089 Publisher: Academy of Management.
  78. Fair, transparent, and accountable algorithmic decision-making processes: The premise, the proposed solutions, and the open challenges. Philosophy & Technology 31 (2018), 611–627.
  79. Nancy G Leveson. 2016. Engineering a safer world: Systems thinking applied to safety. The MIT Press.
  80. Algorithms and decision-making in the public sector. Annual Review of Law and Social Science 17 (2021), 309–334.
  81. The conflict between explainable and accountable decision-making algorithms. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. 2103–2113.
  82. Transparency as design publicity: explaining and justifying inscrutable algorithms. Ethics and Information Technology 23, 3 (2021), 253–263.
  83. Assessing the Fairness of AI Systems: AI Practitioners’ Processes, Challenges, and Needs for Support. Proceedings of the ACM on Human-Computer Interaction 6, CSCW1 (2022), 1–26.
  84. Kirsten Martin. 2019. Ethical implications and accountability of algorithms. Journal of Business Ethics 160 (2019), 835–850.
  85. Laura McKenna. 2018. Disclosure Avoidance Techniques Used for the 1970 through 2010 Decennial Censuses of Population and Housing. Technical Report. U.S. Census Bureau Research & Methodology Directorate. https://www2.census.gov/ces/wp/2018/CES-WP-18-47.pdf
  86. Owning ethics: Corporate logics, silicon valley, and the institutionalization of ethics. Social Research: An International Quarterly 86, 2 (2019), 449–476.
  87. minutephysics. 2019. Protecting Privacy with MATH (Collab with the Census). https://www.youtube.com/watch?v=pT19VwBAqKA
  88. Model Cards for Model Reporting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* ’19). Association for Computing Machinery, New York, NY, USA, 220–229. https://doi.org/10.1145/3287560.3287596
  89. Deirdre K Mulligan and Kenneth A Bamberger. 2018. Saving governance-by-design. California Law Review 106, 3 (2018), 697–784.
  90. Deirdre K. Mulligan and Kenneth A. Bamberger. 2019. Procurement as policy: Administrative process for machine learning. Berkeley Tech. LJ 34 (2019), 773.
  91. Deirdre K. Mulligan and Helen Nissenbaum. 2020. The concept of handoff as a model for ethical analysis and design. The Oxford handbook of ethics of AI 1, 1 (2020), 233.
  92. Priyanka Nanayakkara and Jessica Hullman. 2022. What’s driving conflicts around differential privacy for the U.S. Census. IEEE Security & Privacy 01 (2022), 2–11.
  93. 2020 Census Data Products: Data Needs and Privacy Considerations: Proceedings of a Workshop. National Academies Press.
  94. National Conference of State Legislatures. 2021. Differential Privacy for Census Data Explained. https://www.ncsl.org/technology-and-communication/differential-privacy-for-census-data-explained
  95. NCAI Policy Research Center. 2021. Differential Privacy and the 2020 Census: A Guide to the Data Analyses and Impacts on AI/AN Data. Research Policy Update. National Congress of American Indians, Washington, D.C. https://archive.ncai.org/policy-research-center/research-data/prc-publications/NCAI_PRC_2020_Census_Guide_to_Data_and_Impacts_5_17_2021_FINAL.pdf
  96. Helen Nissenbaum. 2004. Privacy as contextual integrity. Wash. L. Rev. 79 (2004), 119.
  97. Steven A. Ochoa and Terry Ao Minnis. 2021. Impact of Differential Privacy & the 2020 Census on Latinos, Asian Americans and Redistricting. https://www.maldef.org/wp-content/uploads/2021/04/FINAL-MALDEF-AAJC-Differential-Privacy-Preliminary-Report-4.5.2021-1.pdf
  98. OECD. 2021. Tools for trustworthy AI: A framework to compare implementation tools for trustworthy AI systems. OECD Digital Economy Papers 312 (Jun 2021). https://doi.org/10.1787/008232ec-en
  99. National Institute of Standards and Technology. 2023. Artificial Intelligence Risk Management Framework (AI RMF 1.0). (Jan 2023). https://doi.org/10.6028/nist.ai.100-1
  100. Getting Ourselves Together: Data-centered participatory design research & epistemic burden. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–11.
  101. Theodore M. Porter. 1995. Trust in numbers: The pursuit of objectivity in science and public life. Princeton University Press.
  102. Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In Proceedings of the 2020 conference on fairness, accountability, and transparency. 33–44.
  103. Brad Rawlins. 2008. Give the emperor a mirror: Toward developing a stakeholder measurement of organizational transparency. Journal of public relations research 21, 1 (2008), 71–99.
  104. David Gerald Robinson. 2022. Voices in the code: a story about people, their values, and the algorithm they made. Russell Sage Foundation, New York.
  105. How boundary objects help to perform roles of science arbiter, honest broker, and issue advocate. Science and Public Policy 47, 2 (2020), 161–171.
  106. Andrew K. Schnackenberg and Edward C. Tomlinson. 2016. Organizational transparency: A new perspective on managing trust in organization-stakeholder relationships. Journal of management 42, 7 (2016), 1784–1810.
  107. Mike Schneider. 2021. Census releases guidelines for controversial privacy tool. https://apnews.com/article/business-census-2020-55519b7534bd8d61028020d79854e909 Section: Voting rights.
  108. Jeremy Seeman. 2023. Framing Effects in the Operationalization of Differential Privacy Systems as Code-Driven Law. In International Conference on Computer Ethics, Vol. 1.
  109. Jeremy Seeman and Daniel Susser. 2023. Between Privacy and Utility: On Differential Privacy in Theory and Practice. ACM J. Responsib. Comput. (October 2023). https://doi.org/10.1145/3626494 Just Accepted.
  110. Fairness and abstraction in sociotechnical systems. In Proceedings of the conference on fairness, accountability, and transparency. 59–68.
  111. Mona Sloane and Emanuel Moss. 2022. Introducing a Practice-Based Compliance Framework for Addressing New Regulatory Challenges in the AI Field. TechReg Chronicle (2022).
  112. Participation is not a design fix for machine learning. In Equity and Access in Algorithms, Mechanisms, and Optimization. 1–6.
  113. Introducing contextual transparency for automated decision systems. Nature Machine Intelligence 5, 3 (2023), 187–195.
  114. Mona Sloane and Janina Zakrzewski. 2022. German AI Start-Ups and “AI Ethics”: Using A Social Practice Lens for Assessing and Implementing Socio-Technical Innovation. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency. 935–947.
  115. Susan Leigh Star. 2010. This is not a boundary object: Reflections on the origin of a concept. Science, Technology, & Human Values 35, 5 (2010), 601–617.
  116. Susan Leigh Star and James R. Griesemer. 1989. Institutional ecology, ‘translations’ and boundary objects: Amateurs and professionals in Berkeley’s Museum of Vertebrate Zoology, 1907-39. Social Studies of Science 19, 3 (1989), 387–420.
  117. Policy impacts of statistical uncertainty and privacy. Science 377, 6609 (2022), 928–931. https://doi.org/10.1126/science.abq4481
  118. Cass R Sunstein. 2002. The cost-benefit state: the future of regulatory protection. American Bar Association.
  119. Latanya Sweeney. 2000. Simple Demographics Often Identify People Uniquely. Working Paper. Carnegie Mellon University Data Privacy Lab, Pittsburgh. https://dataprivacylab.org/projects/identifiability/paper1.pdf
  120. U.S. Census Bureau. 2018. Soliciting Feedback From Users on 2020 Census Data Products. https://www.federalregister.gov/documents/2018/07/19/2018-15458/soliciting-feedback-from-users-on-2020-census-data-products
  121. U.S. Census Bureau. 2020a. 2020 Census Tribal Consultations with Federally Recognized Tribes. Report. U.S. Census Bureau. https://www.census.gov/content/dam/Census/library/publications/2020/dec/census-federal-tc-final-report-2020-508.pdf
  122. U.S. Census Bureau. 2020b. Invariants Set for 2020 Census Data Products. https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/process/disclosure-avoidance/2020-das-updates/2020-11-25.html
  123. U.S. Census Bureau. 2021a. Census Bureau Sets Key Parameters to Protect Privacy in 2020 Census Results. https://www.census.gov/newsroom/press-releases/2021/2020-census-key-parameters.html Section: Government.
  124. U.S. Census Bureau. 2021b. Disclosure Avoidance for the 2020 Census: An Introduction. Handbook. US Government Publishing Office, Washington, D.C. https://www2.census.gov/library/publications/decennial/2020/2020-census-disclosure-avoidance-handbook.pdf
  125. U.S. Census Bureau. 2023a. 2020 Decennial Census: Processing the Count: Disclosure Avoidance Modernization. https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/process/disclosure-avoidance.html
  126. U.S. Census Bureau. 2023b. Coming This Spring: New 2010 Redistricting and DHC ”Production Settings” Demonstration Microdata with Noisy Measurement Files. https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/process/disclosure-avoidance/newsletters/new-2010-redistricting-dhc-demo-microdata.html Section: Government.
  127. U.S. Census Bureau. 2023c. Disclosure Avoidance Webinar Series. https://www.census.gov/data/academy/webinars/series/disclosure-avoidance.html Section: Government.
  128. U.S. Census Bureau. 2023d. Why the Census Bureau Chose Differential Privacy. Brief C2020BR-03. U.S. Census Bureau. https://www2.census.gov/library/publications/decennial/2020/census-briefs/c2020br-03.pdf
  129. Contestability in algorithmic systems. In Conference companion publication of the 2019 on computer supported cooperative work and social computing. 523–527.
  130. Feedback on the April 2021 Census Demonstration Files. https://users.pop.umn.edu/~ruggl001/Articles/IPUMS_response_to_Census.pdf
  131. Fairness and accountability design needs for algorithmic support in high-stakes public sector decision-making. In Proceedings of the 2018 chi conference on human factors in computing systems. 1–14.
  132. Salome Viljoen. 2021. A relational theory of data governance. Yale Law Journal 131 (2021), 573.
  133. Maranke Wieringa. 2020. What to account for when accounting for algorithms: a systematic literature review on algorithmic accountability. In Proceedings of the 2020 conference on fairness, accountability, and transparency. 1–18.
  134. Aaron R. Williams and Claire McKay Bowen. 2023. The promise and limitations of formal privacy. Wiley Interdisciplinary Reviews: Computational Statistics (2023), e1615.
  135. Richmond Y. Wong. 2020. Values by Design Imaginaries: Exploring Values Work in UX Practice. PhD Dissertation. University of California, Berkeley, Berkeley, California.
  136. Seeing like a toolkit: How toolkits envision the work of AI ethics. Proceedings of the ACM on Human-Computer Interaction 7, CSCW1 (2023), 1–27.
  137. Larry Wright, Jr. 2022. Letter from National Congress of American Indians CEO to US Census Director. https://www.ncai.org/policy-research-center/research-data/prc-publications/20220728_NCAI_Letter_to_US_Census_Bureau_FINAL.pdf
  138. Felix T Wu. 2013. Defining Privacy and Utility in Data Sets. University of Colorado Law Review 84 (2013), 1117–1177.
Authors (4)
  1. Amina A. Abdu (2 papers)
  2. Lauren M. Chambers (2 papers)
  3. Deirdre K. Mulligan (5 papers)
  4. Abigail Z. Jacobs (21 papers)
Citations (2)
