‘We Still Need a Fundamental Shift’; The Road to Data Privacy Legislation and AI Infrastructure Must Reflect a More Inclusive Landscape

In a wide-ranging discussion last week addressing data privacy, digital inequities, artificial intelligence and the importance of diversity at every step—titled Emerging Tech and the Privacy-Equity Dilemma—a distinguished panel shared workable, forward-thinking and equitable solutions. The panel, presented by SIIA, was moderated by Rebecca Kern, Politico technology policy reporter, with panelists Charina Chou, Global Policy Lead for Emerging Technologies, Google; Cameron Kerry, Distinguished Fellow, Brookings; Matt McMurray, Legislative Director for Rep. Robin Kelly (D-IL); and Lo Smith, Senior Programs Manager for the National Digital Inclusion Alliance in Baltimore.

“As technology continues to [advance] at a rapid pace, policymakers must be vigilant to ensure that the inequities of today are not embedded in the technology of the future,” said SIIA President Jeff Joseph, introducing the discussion. “How we close the gaps and [maintain] respect for privacy and equity and technology, while also being mindful of security [in] the real and digital worlds, is an ongoing debate—and one that is taking center stage.

“While the past few years have brought an increased focus on AI in the tech sector, we still need a fundamental shift. We need to ensure that DEI principles and values are fully ingrained, so we think of diversity and inclusion as an absolute and non-negotiable rather than a hopeful element on a wish list or a best-practice, check-the-box [item] of business compliance. This becomes particularly important as we begin to set the rules of the road for emerging technology.”

There was agreement on the importance of enacting comprehensive privacy legislation, but the panelists weren’t optimistic it will get done. “Google has definitely called for [a] federal privacy law for over a decade,” said Chou, stressing that privacy regulation and federal legislation in this area “is really an opportunity for innovation.”

“We’ve got a very narrow window at this point in this Congress, [so] it’s not entirely shot,” said Kerry. “The question really gets at some of the specifics of what needs to go into a civil rights privacy provision – there needs to be an exception related to compliance [with] affirmative action. Some important substantive work still needs to be done on boundaries for the collection, use and sharing of information – that’s really the heart of the privacy issues that we’re dealing with – and the specifics of a civil rights provision.”

“I’m not particularly hopeful that something might get done,” said McMurray. “We have, unfortunately, 50 different rules and, in a lot of cases, different reporting requirements, which is exactly the fear I hear from a lot of companies. That could happen more and more as you see California and other states pass their own privacy laws. From the recent hearings, you can see that on both the Republican and Democratic sides there seems to be an overlap of specific consideration on maybe a narrower bill around just children. The EU moved ahead of us on GDPR, and I don’t think they’re going to wait for us on AI, and the longer it takes us to do privacy, the less goodwill we’re building across the Atlantic with our allies to show that we really take privacy seriously and issues around bias seriously as well.”

In addressing data and privacy concerns, Chou said “there’s a lot of discussion around learning from data—we’ve been able to make lots of new inferences, new insights. Are there ways to actually learn from data even when you don’t see the individual, private information? Those types of constraints, those types of respect for privacy have led to technologies like federated learning [and synthetic data], where a lot of individual user data stays on device, but you still get the benefit of the aggregated insights.”
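The idea Chou describes—individual user data stays on the device while the server still benefits from aggregated insights—can be illustrated with a toy sketch in the spirit of federated averaging (FedAvg). Everything below (the device data, function names and parameters) is hypothetical, for illustration only, not any panelist's or company's actual implementation.

```python
# Toy federated-learning sketch: each simulated device fits a simple
# model y = w * x on its OWN data, and only the resulting model update
# (a single number here) ever leaves the device -- never the raw data.

def local_update(weights, data, lr=0.1, epochs=20):
    """One client's training pass, run entirely on its private data."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w  # only this update is shared with the server

def federated_average(updates):
    """Server step: aggregate client updates without seeing raw data."""
    return sum(updates) / len(updates)

# Three simulated devices, each holding private (x, y) pairs drawn
# from roughly the same underlying relationship y ~ 3x.
devices = [
    [(1.0, 3.1), (2.0, 6.0)],
    [(1.5, 4.4), (3.0, 9.2)],
    [(0.5, 1.4), (2.5, 7.6)],
]

w_global = 0.0
for _round in range(10):  # a few communication rounds
    updates = [local_update(w_global, d) for d in devices]
    w_global = federated_average(updates)

print(round(w_global, 1))  # prints 3.0 -- the shared slope, learned
                           # without any device revealing its data
```

Production systems (and the synthetic-data techniques Chou also mentions) layer secure aggregation and differential privacy on top of this basic pattern, so that even the model updates reveal little about any individual user.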

Smith brought the discussion to a grassroots, community-oriented nexus, grappling with how to ensure digital inclusion is part of the privacy conversation. They emphasized the importance of self-reporting—“allowing the community to say this is who we are and this is what the impact is”—as opposed to digital redlining. “Specific community members say, ‘These are my identifying features, these are the experiences that I am having. Here you go. Here’s a data set that we are comfortable sharing, and here is how we are being impacted by digital discrimination.’”

“When you begin to build conversations within physical and online communities on guarding data, dignity and your ability to own your own personal data, that’s when you can really begin to have people coming together to compare experiences and create their own personalized community data sets to begin to have bigger conversations on a neighborhood, city, county, state or national level,” Smith added.

McMurray stressed the importance of transparency in gathering data: “Being up front, so that the customer knows how their information is being shared, is really important.”

Other topics addressed included: the importance of personal data vs. privacy issues, digital literacy and digital inclusion, an AI bill of rights, and creating a specific office of civil rights within the FTC.

The entire panel discussion can be viewed here.




