How computer software can make policy, explained by family separation at the border

I was listening to the New York Times daily podcast a few weeks ago when a segment caught my attention. It was about the current administration’s failure to reunite the families separated at the border, despite court orders to do so. And what struck me is how this failure — so sinister in its impacts on these families — ultimately came down to the software that border agents were using.

As someone who has spent my career working at the intersection of technology and politics — as US deputy chief technology officer under Obama and in my role with Code for America, the nonprofit I founded and run — I was particularly struck by this story.

The podcast detailed how border agents process people coming across the border. They use a computer program that allows them to categorize people in one of three ways: as an “unaccompanied minor,” an “individual adult,” or an “adult with children,” which refers to the whole family unit. Each case gets assigned an identification number, and families (“adults with children”) share one ID number.

This seemed to work fine, until the Trump administration ordered these agents to separate these same families. In order to do that, border agents reprocessed members of families as either individual adults or unaccompanied minors, and gave everyone new identification numbers, thus losing the one piece of data that connected the members of the family in the system. So, when the court ordered that agents reunite families, those same processing center records no longer reflected which children belonged to which parents.
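To make the data problem concrete, here is a minimal, purely illustrative sketch in Python. The record fields, names, and ID scheme are my own assumptions, not the actual border-processing system; the point is only to show how reprocessing people under new identification numbers discards the one field that linked a family.

```python
# Hypothetical sketch of the data model the reporting describes (not the
# real CBP system): each intake record carries one ID, and a family unit
# ("adult with children") shares that ID.
from dataclasses import dataclass
from itertools import count

_next_id = count(1000)

@dataclass
class IntakeRecord:
    record_id: int
    name: str
    category: str  # "unaccompanied minor", "individual adult", "adult with children"

# Original intake: parent and child processed as one family unit, one shared ID.
family_id = next(_next_id)
records = [
    IntakeRecord(family_id, "parent", "adult with children"),
    IntakeRecord(family_id, "child", "adult with children"),
]

# Separation: each person is reprocessed under a new category and a NEW ID.
# Nothing in this schema preserves the old shared ID.
records = [
    IntakeRecord(next(_next_id), "parent", "individual adult"),
    IntakeRecord(next(_next_id), "child", "unaccompanied minor"),
]

def find_family(records, record_id):
    """Reunification query: return every record sharing this ID."""
    return [r for r in records if r.record_id == record_id]

# After reprocessing, the query for the child's ID returns only the child.
print(find_family(records, records[1].record_id))
```

Once the new records exist, no query over them can recover the parent-child link; it has to be rebuilt from information outside the system.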

As Caitlin Dickerson and Annie Correal, who reported this story, put it, “When people hear this, they immediately picture something sinister. They think border agents carrying out this policy were essentially trying to cover their tracks, to intentionally make it impossible to link parents and kids after they were separated.” Instead, as Dickerson and Correal stated, “They can’t change their computer systems in a way that would separate families but still hold on to that identification number, for example. They just don’t have that much power on an individual level.”

At a time when Silicon Valley and the larger public are waking up to the government’s reliance on software to carry out its agenda, it’s more important than ever for tech workers to be thoughtful about how they can be a force for good.

In many situations, software is policy

The fact that front-line workers couldn’t change the software they use to enable reunification of these families is both depressing and entirely unsurprising to those of us who work in government technology. And for most of us, the characterization of these front-line workers as more disempowered than sinister likely resonates. Knowing that this chaos could have been remedied — if not by software changes, then at least by some sort of hack, had someone with the right skills been allowed to help — is truly maddening.

But there are two larger lessons here that echo what we’ve learned at Code for America in eight years of working with government technology. The first is that implementation is policy. Whatever gets decided at various times by leadership (in this case, first to separate families, then to reunite them), what happens in real life is often determined less by policy than by software. And until the government starts to think of technology as a dynamic service, imperfect but ever-evolving, not just a static tool you buy from a vendor, that won’t change.

The second lesson has to do with how Silicon Valley — made up of people who very much think of technology as something you do — should think about its role in fighting injustice.

Over the past year, employees of several major software companies have petitioned their executives to disengage from government agencies involved in enforcing the immigration policies adopted since the 2016 election. Microsoft employees urged their leadership to revoke the licenses for Outlook and other Microsoft products from Immigration and Customs Enforcement, or ICE. I applaud those who take action and use whatever leverage they have to fight injustice. But in a world where most people in the tech industry, and the general public, have very little insight into how government works and doesn’t work, the conversation can miss critical context.

Imagine if Microsoft had revoked the Outlook licenses of Border Patrol agents right in the middle of the attempt to reunite families, when the pressure to act was at its highest. Now the agents, who (if you believe the reporting of the New York Times) were actually trying to comply with the court order to reunify families, wouldn’t even be able to send and receive email. Email would most certainly be one of the ways they were receiving updates on those very court orders, and it had probably become the key method of communicating with others in the system about the whereabouts of the family members they were trying to locate. If they lost their ability to communicate, who would get hurt the most in this scenario? The families.

This is one of the lessons you can’t escape if you work on government tech. When government is impaired, who gets hurt? More often than not, the most vulnerable people. Healthcare.gov may have dominated the headlines, but technology failures frequently spell disaster for families in need when state benefit systems stop delivering food or Medicaid benefits. In the past year and a half, for example, Illinois, Rhode Island, and the District of Columbia have been among the jurisdictions where residents, tens of thousands of them in some cases, were dropped from the Medicaid and/or SNAP benefits they’d been receiving because new computer systems failed to enroll them in those programs.

Tech employees should use their power. But they also must be smart about their activism.

I support employees at tech companies who use their power to withhold tools they believe a government will use unjustly. There are companies that are actively helping ICE track down undocumented immigrants using sophisticated data collection and analysis techniques. If you believe that a policy like this is wrong and you have the ability to at least not contribute to its enforcement, that’s a legitimate and principled stance to take. We need more of that.

But I would ask that those same people recognize that wholly separating the tech industry’s expertise and competencies from government work could lead to unintended consequences. The benefits of technology have accrued unevenly between the rich and the poor, the powerful and the less powerful, and, yes, between the private sector and the public sector. The technology community should pair its attempts to take capabilities away from government with an effort to strengthen government’s capabilities elsewhere, in thoughtful ways that align with the values of the creators of that technology.

In order to properly administer a social safety net, a just criminal justice system, and hundreds of other functions that constitute a functioning democracy, we must build government’s technology capabilities. In doing that, we run the risk of also increasing government’s effectiveness to do harm.

Which is why Silicon Valley can’t limit its leverage over government to software. Software doesn’t have values. People do. If the people who build and finance software (in Silicon Valley and elsewhere) really want government that aligns with their values, they must offer government not just their software but also their time, their skills, and part of their careers. The best way to reclaim government is to become part of it.

Jennifer Pahlka is the founder and executive director of Code for America. In 2013–2014, she served as the US deputy chief technology officer, where she helped found the US Digital Service.

First Person is Vox’s home for compelling, provocative narrative essays. Do you have a story to share? Read our submission guidelines, and pitch us at [email protected].
