Getting tech right in Iowa and elsewhere requires insight into data, human behavior


What happened in Iowa’s Democratic caucus last week is a textbook example of how applying technological approaches to public sector work can go badly wrong just when we need it to go right.

While it’s possible to conclude that Iowa teaches us we shouldn’t let technology anywhere near a governmental process, that is the wrong lesson to draw, and it conflates the complexity of what happened and what didn’t. Technology won’t fix a broken policy; the key is understanding what it is good for.

What does it look like to get technology right in solving public problems? Three core principles can help build public-interest technology more effectively: solve an actual problem; design with and for users, with their lives in mind; and start small (test, improve, test again).

Before developing an app or throwing a new technology into a political process, it is worth asking: what is the goal of this app, and what will it do that improves on the existing process?

Getting it right starts with understanding the humans who will use what you build to solve an actual problem. What do they actually need? In the case of Iowa, this would have meant asking seasoned local organizers what would help them during the vote count. It also means talking directly to precinct captains and caucus-goers and observing the unique process in which neighbors convince neighbors to move to a different corner of a school gymnasium when their candidate hasn’t been successful. Beyond asking about the idea of a web application, it is critical to test the application with real users under real conditions to see how it works and make improvements.

In building such a critical game-day app, you need to test it under real-world conditions, which means adoption and ease of use matter. While Shadow (the company charged with the build) did a lightweight test with some users, there wasn’t enough runway to adapt to or learn from those for whom the app was designed. The app may have worked fine, but that doesn’t matter if people couldn’t download it or didn’t use it.

One model of how this works can be found in the Nurse Family Partnership, a high-impact nonprofit that helps first-time, low-income moms.

This nonprofit has built feedback loops with its moms and nurses via email and text messages. It even has a full-time role “responsible for supporting the organization’s vision to scale plan by listening and learning from primary, secondary and internal customers to assess what can be done to offer an exceptional Nurse-Family Partnership experience.”

Building on its program of in-person assistance, the Nurse Family Partnership co-designed an app with Hopelab, a social innovation lab, in collaboration with Ayogo, a behavioral-science-based software company. The Goal Mama app builds on the relationship between nurses and moms. It was developed with these clients in mind after research showed that the majority of moms in the program were already using their smartphones extensively, so the app would meet moms where they were. Through this approach of using technology and data to address the needs of its workforce and clients, the organization has served 309,787 moms across 633 counties in 41 states.

Another example is the work of Built for Zero, a national effort focused on the ambitious goal of ending homelessness across 80 cities and counties. Its community organizers start with the personal challenges of the unhoused; they know that without understanding each person and their needs, they won’t be able to build successful interventions that get them housed. Their work combines human-centered organizing with smart data science to deliver constant assessment and improvement, and a collaboration with the Tableau Foundation helps build and train communities to collect data to new standards and monitor progress toward a goal of zero homelessness.

Good tech always starts small, tests, learns and improves with real users. Parties, governments and nonprofits should expand on the learning methods that are common to tech startups and espoused by Eric Ries in The Lean Startup. By starting with small tests and learning quickly, public-interest technology acknowledges the high stakes of building technology to improve democracy: real people’s lives are at stake. With questions about equity, justice, legitimacy and integrity on the line, starting small helps ensure enough runway to make important changes and work out the kinks.

Take for example the work of Alia. Launched by the National Domestic Workers Alliance (NDWA), it’s the first benefits portal for house cleaners. Domestic workers do not typically receive employee benefits, making things like taking a sick day or visiting a doctor impossible without losing pay.

Its easy-to-use interface enables people who hire house cleaners to contribute directly to their benefits, allowing workers to receive paid time off, accident insurance and life insurance. Alia’s engineers benefited from deep user insights gained by connecting to a network of house cleaners. In the growing gig economy, the Alia model may be instructive for a range of workers at the local, state and federal levels. This kind of testing has a track record: Obama organizers in 2008 dramatically increased volunteer sign-ups (by up to 18%) just by A/B testing the words and colors used for the call-to-action on their website.
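The A/B test described above boils down to comparing conversion rates between two page variants and checking whether the difference is real or noise. A minimal sketch of that check, using a standard two-proportion z-test with hypothetical visitor and sign-up counts (none of these numbers come from the Obama campaign):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how many standard errors apart are
    the conversion rates of variant A and variant B?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: 5,000 visitors per arm; variant B's new
# call-to-action wording gets 472 sign-ups versus 400 for variant A.
z = two_proportion_z(400, 5000, 472, 5000)
# |z| > 1.96 means the difference is significant at the 95% level,
# so it would be reasonable to ship variant B.
```

The point of starting small is exactly this: a cheap, quantified comparison on a fraction of users before committing everyone to an untested design.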

There are many instructive public-interest technologies that focus on designing not just for users but with them. In civil society, this includes the Center for Civic Design, which works to ensure people can have easy and seamless interactions with government, and The Principles for Digital Development, the first of which is “design with the user.” There is also work being done inside governments, from the Government Digital Service in the U.K. to the United States Digital Service, which was launched in the Obama administration.

Finally, it also helps to deeply understand the conditions in which technology will be used. What are the lived experiences of the people who will be using the tool? Did the designers dig in and attend a caucus to see how paper has captured the moving of bodies and changing of minds in gyms, cafes and VFW halls?

In the case of Iowa, this requires understanding the caucuses’ norms, rules and culture. A political caucus is a unique situation.

Moreover, this year the Iowa caucuses deployed several process changes intended to increase transparency, which also complicated the process; these changes needed to be taken into account when deploying a tech solution. Understanding the conditions in which technology is deployed requires a nuanced understanding of policies and behavior, and of how policy changes can affect design choices.

Building a technical solution without doing the user research to see what people really need risks reducing credibility and further eroding trust. Building the technology itself is often the simple part; the complex part is relational. It requires investing in the capacity to engage, train, test and iterate.

We are accustomed to same-day delivery and instantaneous streaming in our private and social lives, which raises our expectations for what we want from the public sector. That push to modernize and streamline is what leads to the belief that an app is the solution. But building the next killer app for our democracy requires more than just prototyping a splashy tool.

Public-interest technology means working toward the broader, difficult challenge of rebuilding trust in our democracy. Every time we deploy tech for the means of modernizing a process, we need to remember this end goal and make sure we’re getting it right.

Written by Walter Thompson
This article first appeared on https://techcrunch.com/2020/02/13/getting-tech-right-in-iowa-and-elsewhere-requires-insight-into-data-human-behavior/ under the title “Getting tech right in Iowa and elsewhere requires insight into data, human behavior”.