Technology for Whom?
by Jackie Swift
Every few weeks, it seems, we hear about another computer virus. We are warned over and over about phishing schemes and malware attacks by faceless hackers. We are urged to change our usernames and passwords often. But all that emphasis on security against unknown tech criminals has left us vulnerable to people who might know us well, says Nicola Dell, Information Science at Cornell Tech.
Dell focuses on human-computer interaction and strives to carry out research that not only has academic significance but also has practical impact on people’s lives. In 2017, she joined with Thomas Ristenpart, Computer Science, on a study showing that intimate-partner abusers make extensive use of technology to harm their victims. “We found a startling landscape of technology-enabled abuse,” she says. “The attacks are really persistent and targeted and often incredibly harmful, but they often aren’t taken as seriously as other forms of violence by law enforcement and the courts.”
The Problem of Technology-Enabled Abuse
After learning of the extent of technology-enabled abuse, Dell and Ristenpart founded the Clinic to End Tech Abuse (CETA), which provides free tech support for survivors of intimate partner violence. The clinic is embedded within the Family Justice Center social support system run by the New York City Mayor’s Office to End Domestic and Gender-Based Violence. Anyone who comes to a Family Justice Center can request a referral to CETA.
CETA is modeled partly on the idea of a legal clinic offering pro bono legal assistance, Dell explains. “We train volunteer technologists so they can meet with survivors to go through their devices and accounts and check their security and privacy settings,” she says. “They look for ways that the survivors may be compromised or vulnerable and help address that. Our research showed this kind of help was missing from existing survivor-support ecosystems.”
Many of the technology-enabled attacks that researchers have seen are related to technology ownership, such as internet or cell phone family plans. “Often the family plan is owned by the abuser, and the survivor will be using the device on that plan,” Dell says. “This gives the abuser access to account activity. They can see who’s been called and text messaged and so on. It also gives them the ability to cut off the survivor’s access or shut down the line.”
In addition, abusers often install or turn on tracking software or parental controls that enable location tracking or monitoring. They also might log into the survivor’s iCloud, mail server, or social media, Dell explains. “They might post harassing messages on social media,” she says, “or they might lock the survivor out of the account by changing the password.”
Dell and Ristenpart run a research group linked to CETA that explores the issues identified through the clinic, then puts the research findings back into clinical practice. “I really like the flavor of that, of first learning about problems and current situations, then doing something — whether it’s building technology or creating a clinic,” Dell says.
Supporting Home Health Aides
In another project, Dell joined with Madeline R. Sterling, Medicine, at Weill Cornell Medicine — in partnership with the health-care union 1199SEIU (United Healthcare Workers East) — to look into technology’s potential to help home health aides deliver care to chronically ill patients. “Home health aides are the backbone of providing care, and that has been exacerbated and highlighted by the COVID-19 pandemic,” Dell says. “But they are not treated as important within the health-care system. They don’t have the status of nurses or doctors, and their contributions and insights to patient treatment and care are often overlooked.”
“For some populations…a voice assistant provides things like touch-free, eyes-free interaction that could be transformative.”
When COVID first struck in the spring of 2020, Dell and Sterling looked at the experiences of aides during that stressful time and concluded that aides were struggling terribly with little support. The nature of their job required them to physically visit their patients, which meant they could not work from home, Dell explains. And since most aides are women, immigrants, and low-wage earners, they didn’t have the option of quitting or refusing to work. At the same time, they often were not given sufficient personal protective equipment and had to buy it themselves, or they were expected to come to their agencies to get it, which was difficult and a health risk for them.
“We saw that they were really scared,” Dell says. “They worried that they might be carrying the virus and give it to their patients who are, by definition, the most vulnerable community, and also vice versa, that they might take it from the workplace home to their own families. Many of them also pointed out that doctors and nurses were getting applauded and celebrated while they weren’t, even though they were the ones still going out to visit patients who were vulnerable to the virus. It was definitely a bleak situation for them.”
Dell and Sterling’s study, eventually published in JAMA: The Journal of the American Medical Association, has been highly cited. It also led to the researchers setting up a virtual, peer-support pilot program for aides to meet online weekly in small groups to discuss their job issues and to provide support to one another. In light of the positive feedback the researchers received, they are now writing a grant proposal to scale up to a larger test program.
In addition, the Dell lab is pursuing a new line of research: designing a voice assistant specifically for aides. Voice assistants like Siri and Alexa have been developed primarily as luxury goods, Dell points out; they’re mainly used by consumers to check things like the weather or the words to a song. “But for some populations, like home health aides or older adults, a voice assistant provides things like touch-free, eyes-free interaction that could be transformative,” she says. “You don’t have to be able to cross the room to turn it on. A voice assistant could give instructions or guidance to an aide, for instance, when she’s helping someone and has her hands full and her eyes on the patient and really needs some assistance.”
Technology for the Marginalized
Dell grew up in Zimbabwe and was drawn to the field of human-computer interaction as a way to ensure that technology can have impact beyond the richest countries. Later, she broadened her interests to include marginalized communities that tend to be ignored by the companies that produce high-tech products. “In grad school, the human aspects of computer systems were the ones I was most interested in, how people were using those systems and repurposing them,” she says.
“People often think technology will solve problems like poverty or gender-based violence,” she continues. “Yes, we can build technology so it’s a tool that’s available to people dealing with these sorts of issues, but it’s always going to be people and communities who solve the problems.”