CIPL Files Comments on Civil Rights Implications of Commercial Data Practices

On March 6, 2023, the Centre for Information Policy Leadership (CIPL) at Hunton Andrews Kurth filed a response to the National Telecommunications and Information Administration’s request for comment on issues at the intersection of privacy, equity and civil rights.

CIPL noted that the civil rights implications of commercial data practices raise questions addressing responsible uses of data, and CIPL has a long history of promoting responsible data practices through its efforts regarding organizational accountability. By encouraging organizations to implement and demonstrate accountability, CIPL has sought to ensure not only that organizations comply with applicable legal requirements and best practices, but also that organizations improve societal trust in their legitimate and beneficial uses of data.

While CIPL’s Accountability Framework was initially developed to help mitigate risks related to privacy harms, CIPL noted that its framework and the risk assessments it entails can have broader application and can help address risks associated with any data use, including harms impacting marginalized or underserved communities. Indeed, a contextual risk assessment would help identify not only potential harms to members of a particular group, but also appropriate measures to mitigate those harms.

Importantly, CIPL stressed that a risk assessment does not address whether certain types of data should be used generally or at all, but rather whether the data can be used responsibly and with appropriately tailored protections in a specific context and for a specific purpose. In support of this point, CIPL cited Professor Daniel Solove’s recent article “Data Is What Data Does,” which emphasizes that it is the use of data that matters, not whether it is sensitive or non-sensitive.

CIPL specifically noted that not all collection and use of data related to race, religion and other categories commonly regarded as sensitive is bad or harmful. Indeed, AI systems in particular need diverse data sets, including data commonly regarded as sensitive, to understand and subsequently limit biased and discriminatory outputs. While CIPL agreed that certain uses of data may have an adverse impact on marginalized or underserved communities, an accountability-based risk assessment can identify such impacts, distinguish appropriate uses from inappropriate ones and enable suitable safeguards.

CIPL’s full comments are accessible here.
