One morning last June, Alderman Michael Murphy walked into a public safety committee meeting in Milwaukee with a plan to challenge the city’s funding of community violence intervention.
The city’s Office of Violence Prevention, with a budget of about $5.6 million, mostly from grants and philanthropy, was due to receive another $11 million in city and state funding. The office would likely pass much of it along to a constellation of about 20 community-based organizations. As in many cities, these grassroots groups — sometimes consisting only of a handful of people — send teams into neighborhoods to try to interrupt violent incidents, to offer resources like summer programming for young people at high risk of being involved in conflict, and to help children heal from trauma.
But Murphy was skeptical. Why wasn’t there a plan to do a randomized study of the programs, he asked, as is sometimes done with public health programs? And why were the programs being evaluated based on the services they provided, rather than whether violence fell in the neighborhood where those services were offered?
Fellow Alderman Mark Borkowski joined in. “We set a homicide record two years ago. We broke the record last year. And we are 30 homicides ahead of the record breaking this year,” he said. “And so I’m trying to find that sliver of hope that says we’re making an impact. I’m not seeing anything, in essence, positive, coming out of any of these programs.”
Alderman Scott Spiker argued that the office should recruit an academic to study the programs. Without independent evaluation, the money might be better spent, he said, on a county program to decrease reckless driving.
The government — at the state and federal levels — is about to invest billions of dollars in community-based violence intervention programs, which focus on strategies like mediation of potentially violent disputes and social support for likely perpetrators of violence. Critics like the Milwaukee aldermen, however, are pushing back, arguing that there is not enough rigorous scholarship to support the investment.
In fact, there is evidence from across the country for the efficacy of such interventions. But large-scale traditional academic study of this type of work is rare. The complicated nature of violence makes it uniquely challenging to pull apart, and the expense of formal public health and sociological studies is immense. For smaller groups, which now must compete for the millions available, the burden is particularly high.
The back and forth raises an important question: If gun violence is a key social crisis of our time, why don’t we have more science about how to stop it?
Erricka Bridgeford became interested in what she calls “peace work” after she lost friends to violence as a teenager. In 2001, she became a mediator with the Baltimore Community Mediation Center, got promoted to executive director, and soon found herself training mediators in Maryland and around the country.
“I know how to not just say values, but put them into action,” she said.
Five years ago, Bridgeford had a vision for a movement that challenged residents of Baltimore to practice and promote peace on weekends that were identified as some of the most violent of the year.
The movement’s motto is “nobody kill anybody,” and organizers ask all residents and groups in the city to do anything they can to reduce violence.
“We are inclusive,” Bridgeford said. “You can determine for yourself what [a] life-affirming event means for you.” Groups throw block parties or host football clinics or participate in rituals intended to reclaim spaces where violence has occurred.
Some people make a pledge to refrain from violence for the weekend, or to change their profile picture on social media. The organizers do not pass judgment on any ideas, and make sure they are all posted on the website’s calendar, Bridgeford said.
CeaseFire 365 “belongs to everybody who loves Baltimore,” she said.
But the group, which operated on a tiny budget, did not have the expertise or resources to produce an academic study of its own work. It had no connection to academia that might draw a social scientist to study its efforts from the outside. And the group was already stretched thin trying to do its primary work of helping people abstain from violence.
Meanwhile, Peter Phalen, a graduate student at Johns Hopkins, was despondent after the 2016 election and looking for ways to get more involved in his community. He found himself at a CeaseFire 365 meeting and realized he could increase understanding of the organization’s work by assessing the weekends’ effect on violence in the city.
Fortuitously, the structure of CeaseFire itself essentially set up a natural experiment. To assess whether a program is working, it is useful to compare it to a control group — a similar neighborhood, or similar individuals, not part of the program — much as a drug trial often requires that a set of patients be given a placebo. In the real world, of course, these controls are highly imperfect. But because CeaseFire Weekends essentially turn on and off like a light switch, it is easy to compare weekends when the group is in action to other weekends when all other variables are virtually identical.
“It’s this perfect thing where I could really isolate the effect,” Phalen said.
It also helped that Baltimore releases uniquely robust and specific crime data. Phalen knew about the data already because he had used it as part of a student movement pushing back against Johns Hopkins’ efforts to institute its own police force.
Phalen — a psychologist who usually studies psychosis — had a set of advanced programming and analytical skills that are rare even among social scientists, and he was willing to work for free.
For efforts like CeaseFire 365 “there are no research dollars,” Phalen said. His compensation was that, “It made me feel good.”
Some members of the group had reservations about being studied, but in the end the “do what you can” ethos of CeaseFire 365 won the day.
When Phalen completed his analysis, he was stunned by the findings. In a world of Impressionist paintings, he had a finding that was crystal clear: On CeaseFire Weekends, gun violence decreased 50 percent, even when he controlled for other factors like the season, day of week, and long-term trends.
“It was like ‘Oh, shit, look at that’,” Bridgeford recalled of seeing the results for the first time.
The ensuing paper was published in the prestigious American Journal of Public Health, contributing important scholarship to the question of preventing gun violence. Phalen, who is now on the faculty of the University of Maryland School of Medicine, pointed out in a recent interview that the study and paper nearly didn’t happen. “It was very much happenstance,” he said.
His research is often cited in articles about CeaseFire 365, and showed grant funders and philanthropists that community-based violence intervention can work in Baltimore, which had something of a reputation as a violence prevention graveyard. Many programs that have achieved good results in other cities have failed there.
Bridgeford credits the study with bringing positive attention to the program.
Three years after the study was published, the group was invited to apply for — and was awarded — a $75,000 grant. Now attendance at weekend events is a must for anyone seeking office, Bridgeford said. Some weekends have as many as 50 events.
But to Bridgeford, statistical outputs on a page mean nothing next to the lived experience of violence prevention. “We know how this work impacts people,” she said. “No matter what the result said, we were gonna keep doing it.”
For Phalen, his study raises another question. Policing gets much more funding than community-based violence prevention, and it is also extremely difficult to study. “Does the amount of research supporting policing track with the funding that goes to it?” he asked.
On its face, Alderman Borkowski’s logic seems so straightforward. If we spend money on community violence intervention programs and then violence in the city goes up, the programs must not be working. But the question is infinitely more complex.
What began as Operation Ceasefire (a program unrelated to CeaseFire 365) in Boston in the late 1990s has since been adopted by multiple cities. Resources are targeted toward the neighborhoods where violence is most likely to occur, and an effort is made to communicate with the small percentage of people who are most likely to be involved in it.
The program uses a carrot-and-stick approach to communicate with those likely to be involved in violence. Make good choices, and resources like job assistance will come their way. Make bad choices, and the full fury of law enforcement will be brought to bear.
This message is not just from police or government officials, but also from community members and groups who are a critical component of spreading the message.
Varying versions of the program — with different levels of police involvement — have shown results in most of the cities where they’ve been implemented and studied. While in effect in Boston, the original program reduced youth homicides by 63 percent. The Chicago version reduced homicides by 37 percent.
In Baltimore, the picture has been much more mixed, which highlights the difficulty of getting the sweeping answers politicians and the public crave.
Results looked promising a decade ago, said Daniel Webster, a professor at Johns Hopkins who evaluates the program’s results in Baltimore for the city. Since then, they have been somewhat baffling. In some neighborhoods the program seems to be helping, but in others, it appears to be making the problem worse.
This doesn’t mean the program should necessarily be abandoned, Webster said, but it does provide information that could be used to make the program better.
It also highlights why it’s so hard to do this type of assessment at all. To measure the effectiveness of a community-based intervention, it is ideal to eliminate all other possible explanations for a change in violence. When built into a study model, these variables are known as controls. Controlling for things like household income and the median age of a population is relatively easy. But how can a researcher control for a long-dormant rivalry between neighborhood gangs that is reactivated? Or a violent person who moves onto a block? Or a huge apartment building where the air conditioning breaks in the middle of a brutal heat wave?
The original Boston program pioneered an approach, unusual in social science research, in which community members were invited to help design the model by suggesting both controls and the appropriate outcomes to measure.
But even with community input, it isn’t possible to include every control, and that can muddy the picture an analysis paints. The impossibility of accounting for all variables, combined with the general randomness of any real-world neighborhood or group of people, creates statistical noise that can drown out the signal researchers are looking for.
As Phalen put it: “Effects will tend towards zero if there’s a lot of noise. Don’t confuse a lack of finding with a lack of effect.”
In other words, just because we can’t discern the effect a program is having at 30,000 feet using the statistical techniques and controls available doesn’t mean the effect isn’t there.
Webster also emphasized that decisions about where resources will go are often made in a political context, and not necessarily in a way that is the best for experimental design.
Violence varies by day of week and season of the year, but also seems to have an intrinsic wave pattern, surging and then retreating on its own, Webster said. In Baltimore, though, the neighborhoods targeted by the program are those where violence is surging. It takes complex math to figure out whether a subsequent decrease is because of the program or just the natural ebb and flow of neighborhood violence. And it may not be possible to build a model that can.
Studying changes at the neighborhood level is also problematic, Webster said. Because beneficial effects outside of studied neighborhoods are not captured, he said, “there is a built-in conservative bias.”
“Even using the most rigorous methods we can come up with, we may still be underestimating program effects,” he said.
Another challenge to producing research that allows cities to learn from each other, Webster said, is that programs that are similar conceptually might have dramatically different results because of a city’s level of commitment and investment. He said his research shouldn’t be used as evidence that violence interruption couldn’t work anywhere, because the implementation of it in Baltimore is so poor.
“Baltimore’s version of community violence intervention is about as stripped down as it comes,” he said.
Other programs have struggled in Baltimore for similar reasons, as previously reported by The Trace.
Baltimore is also notorious for an extreme lack of trust between police and community members.
Webster described the approach as, “Go perform miracles, guys — and if it doesn’t work, we’ll blame you.”
“Let’s actually build something strong and test that,” he said.
If groups like CeaseFire 365 are have-nots, READI Chicago is in the have category. READI’s model involves identifying young men who are likely to be involved in violence and then giving them job training, paid jobs, and therapy. The program is well funded by philanthropy, and researchers from the University of Chicago have been involved in tracking its achievements since its launch in 2017.
READI’s approach has been praised by the Biden administration, and in February 2022 its director, Eddie Bocanegra, was tapped to serve as senior advisor for community violence intervention at the U.S. Department of Justice.
READI is often cited as an evidence-based program, but a recently released report from the University of Chicago’s Crime Lab takes a different approach to evidence. Instead of looking at violence across the population of an entire city, as critics often do, it looks at measurable changes in the population directly engaged by the program.
According to the report, which the researchers are careful to call preliminary, men who showed up for at least one day of READI programming had 79 percent fewer arrests for shootings and homicides, though effects on other types of violent crime have not yet been detected.
Researchers also simply asked participants themselves whether they felt the program was working.
Anibal DeJesus, 35, grew up in Humboldt Park, a largely Puerto Rican neighborhood on the West Side of Chicago. He was the oldest of seven kids, and had to quit school to care for his siblings. At age 13, he joined a gang and started selling drugs. Eventually, he was arrested for possessing a firearm, which he says he acquired to protect himself, and served nine years in prison.
After he got out, his parole officer told him about READI. “Well, sure,” he said. “I’m ready to try anything.”
A typical day for DeJesus started with group therapy, where participants learn to open up and explore their trauma. The code of the groups is called “Control, Alt + Delete,” a way of letting group members know that they can trust their peers. What is said in the room stays in the room.
READI espouses cognitive behavioral therapy, a therapeutic technique that encourages patients to challenge unhelpful thoughts and to employ conscious strategies for coping with them.
“I’ve learned how to control myself,” DeJesus said. “Before, I would blow up and wouldn’t know why. Before, I didn’t do no thinking. Now, I take a deep breath and find the words.”
After therapy, DeJesus had a professional development class, then he was off to a worksite where he spent the bulk of the day. Some days he worked at a furniture bank that distributes donated furniture, but his favorite job was stocking the shelves at a food bank. The older woman who runs it was kind, and he liked the feeling of “giving away to people that need it.”
When the researchers arrived, they mainly wanted to know how participants had transformed.
“They want to know how we see things now. They want to know: are we a better version of ourselves?” he said. “And we are.”
DeJesus also worked with a career coach to get a commercial driver’s license.
“They ask me on a daily basis if I’m OK,” he said. “That’s the most important thing, that’s the support.”
Gun violence travels through vulnerable populations like a disease. If you draw a picture of its path, it looks little different from that of an epidemic.
For this reason, much of the research about gun violence, and about community violence intervention programs in particular, comes from the field of public health. But professors of public health do not look like those most affected by gun violence.
Only 5.7 percent of U.S. public health professors are African-American, and only 5.9 percent are Hispanic, according to a 2017 analysis. Just seven public health professors in all of America are Native American, though that group has an extremely high rate of gun suicide.
In the 1990s, Robert Fullilove, a public health professor at Columbia University, conducted an experiment. He sent his students to a distressed neighborhood at the upper end of Manhattan and instructed them to interview 100 residents about violence. When participants were asked to define violence, they gave 60 different answers, he said. Some pictured men shooting each other down in the street, but others thought about domestic violence, or child abuse, or police harassment. Grinding poverty itself inflicts violence on the body in the form of stress, poor nutrition, and exhaustion. It ends lives early.
“Violence is often filtered through a very personal lens,” Fullilove said.
That means, too, that the life experiences of researchers begin affecting research even at the early stages, as these differing experiences are reflected in the questions asked and the basic assumptions built into experimental design.
For example, when all public health researchers were white men, Fullilove said, studies of policing typically were informed by the idea that police are good guys, fighting back against the criminal elements in society. It was only later, when researchers of color began studying policing, that the picture changed to a more nuanced one, with a more complete view of how policing and vulnerable communities intersect.
Quality public health research should make every attempt to work with the communities being studied, but there can be a lack of trust when researchers from outside parachute in, Fullilove said. The type of strict experimental design that critics of community violence intervention are clamoring for is “expensive, artificial, and creates a lab out of human interactions.”
In 2018 the Association of Schools and Programs of Public Health launched a task force to identify the points of friction preventing diverse public health students from continuing along the path to becoming a professor.
“We are working to dismantle structural racism in public health,” said Linda Alexander, the chief academic officer of the association.
This is critical, she said, because gun violence — and indeed many public health disparities — has “a well-documented history in racism and discrimination,” she said. “Very little has changed in terms of these structures.”
A researcher needs to come at the question with an appropriate lens that acknowledges the historical roots of an issue, she said.
There’s evidence that the pipeline is becoming more diverse, but it could take decades before that translates into more diverse researchers.
In Milwaukee, the criticism held up city funding, but the Office of Violence Prevention received a federal grant that allowed it to offer $500,000 worth of children’s programming through the summer. The office has a new director now, Ashanti Hamilton, whom the mayor brought in, he said, to impose more accountability. But after a few months on the job, Hamilton, too, is emphasizing that overall crime statistics should not be used to evaluate the office’s work.
“Our measurement shouldn’t be the rise and fall of crime statistics,” he told WTMJ. “Our measurement should be whether or not we are serving the people who live in their different communities and whether or not they see their community as safer.”
The difficulty of studying the efficacy of violence interruption persists.
As a professor of criminal justice at Rutgers in New Jersey, Liza Chowdhury understands the need to collect data. Chowdhury also runs a group, Reimagining Justice, that provides services to victims of violence in Paterson, New Jersey, who come to the local hospital for treatment. It offers emergency funding to make sure that victims and their families have housing and food. It pays for the funerals of children who were killed. Mediators try to prevent victims of gun violence from retaliating and creating more victims. One victim was permanently disabled by a shooting, and the group rescued him from homelessness.
But in August, after months of being blown off by the state Attorney General’s Office, the group received an email saying its grant funding would not be renewed past September.
“Every six months, I have to figure out where the money’s going to come from for the next six months,” Chowdhury said.
The grants that fund the program do not pay for a grant writer, a research assistant, or even an administrative assistant to run the office, so Chowdhury has taken on all these tasks.
“We’re capturing data the best we can,” she said, but it is a struggle that shouldn’t distract from the group’s primary mission.
The hardest thing, she said, is when workers are emotionally exhausted from working with victims all day, and then she has to ask them to fill out a report.
“People are always going to be complaining about data,” she said. “Our goal right now is focusing on violence.”
This story text was updated to clarify that Operation CeaseFire is distinct from SafeStreets, another violence intervention program in Baltimore, and that Daniel Webster’s comments were specific to Operation CeaseFire.