August 2020

Studies Find Domestic Violence Increases During Lockdowns

Recent research on government-mandated lockdowns, or stay-at-home orders, shows that complaints of domestic violence increase during such orders. For example, a study of lockdowns in India found that domestic violence complaints increased 131% in May 2020 in the most heavily locked-down areas. In four major U.S. cities, complaints increased between 10% and 27%. A study of the lockdown in Mexico City, published in May 2020, similarly found an increase in complaints; the study also examined whether prohibitions on the sale of alcohol in certain areas had any effect, and the researchers found no significant effect on the number of complaints.

The study of the lockdowns in India – which the researchers noted “has been ranked the world’s most dangerous country for women” – focused on 577 of India’s 639 districts, those for which data were available. The government classified the districts into 120 “red zones,” 257 “orange zones,” and 262 “green zones.” The red zones, which had the greatest number of new COVID-19 cases and the fastest rate of infection, were subject to the most restrictive lockdown measures; the green zones were subject to the least restrictive measures. The researchers examined data from the National Commission for Women (NCW), a governmental entity tasked with “protecting and promoting the interests of women.” Most of the complaints received by the NCW involve cybercrime, harassment, rape and sexual assault, domestic violence, or combinations of those offenses. The lockdowns, officially in effect in May 2020, reduced the mobility of the populace by about 30% in the green zones and by more than 70% in the red zones.

The researchers found significant increases in domestic violence complaints (131%) and cybercrime complaints (184%) in the red zones, relative to the other zones, in May 2020. Sexual harassment complaints fell in April, as did rape and sexual assault complaints across all three zones; the researchers found this consistent with fewer people using public transportation and being in the workplace under the lockdown measures. The researchers noted that marital rape is not a crime in India and that instances during lockdowns may be underreported; when it is reported, it is usually classified as domestic violence. Of the women filing complaints of domestic violence, only about 20% complain of rape, while 41% complain of psychological abuse and 89% complain of physical violence.

The authors explained that attitudes towards domestic violence play a large role in both its incidence and its reporting: in the years preceding the COVID-19 outbreak, surveys showed that 49% of the women surveyed, and 42% of the men, believed that a man was justified in beating his wife. During the lockdown, the largest increases in domestic violence complaints occurred in the districts where acceptance of such violence was greatest.

The researchers concluded that “[s]ocial norms and attitudes around violence are important drivers of both violent behavior and reporting,” and “[w]hile lockdowns may be an effective way of controlling disease spread, they also come with costs.”

In the Mexican study, the researchers found that in March 2020 – when the lockdown was imposed – there were 812 reports of domestic violence, and in April 2020 there were 746; although the April number was down from March, it was three times as high as in April 2019. The researchers found an overall pattern to domestic violence calls: the number of calls decreases during the first week of a lockdown, increases in the second and third weeks, trends down between the third and sixth weeks, and increases again in the seventh week.

The researchers found that, overall, the lockdown did not affect the number of domestic violence calls; however, the type of complaint, and the help sought in the call, was affected. For example, calls seeking psychological services increased by 100% during the lockdown, while calls seeking legal assistance decreased by 100% over the same period. The authors noted that many courts were closed during that time frame.

The study found that alcohol consumption and “negative-income shocks” are risk factors for domestic violence. Yet the areas of Mexico City that prohibited the sale of alcohol showed no significant difference in complaints of violence. The researchers did, however, find a correlation between loss of income, or unemployment, and the probability of domestic violence. The Mexican government did not provide financial support to families during the lockdown.

The authors concluded that the government should “analyze the possibility of channeling monetary transfers to the families in order to smooth their consumption and possibly reducing domestic violence when facing a risky situation like COVID-19.”

Sources:  John Miltimore, “Domestic Violence More Than Doubled Under Lockdowns, New Study Finds,” July 20, 2020:
National Bureau of Economic Research access page:
Research paper: 
Saravana Ravindran and Manisha Shah, “Unintended Consequences of Lockdowns: COVID-19 and the Shadow Pandemic,” July 2020:
Adan Silverio-Murillo and Jose Roberto Balmori de la Miyar, “Families under Confinement: COVID-19, Domestic Violence, and Alcohol Consumption,” May 30, 2020:


Deepfakes Pose Growing Dangers to the Criminal Justice System

Deepfake technology continues to become more common and widespread, and the potential dangers it poses to the criminal justice system are becoming better known. Deepfake technology can be used for low-level audio alteration as well as for the creation of sophisticated fake videos. One report, from Deeptrace Labs, estimated that as of June 2020 there were almost 49,000 deepfake videos and 20 content-creation hubs available online.

In February 2020, the ABA hosted a technology show, and one of the participants, Sharon Nelson of Sensei Enterprises, cited two cases from the U.K. in which deepfake evidence was used for criminal ends. In one, a company executive was persuaded by a faked audio recording, seemingly from his boss, to transfer 220,000 euros. In the other, a child custody matter, one party submitted altered audio suggesting the other party had threatened a child. Byron James, of Expatriate Law in Dubai, the attorney for the “threatening” parent, retained experts who analyzed the metadata of the recording and disproved its authenticity. Mr. James was quoted as saying that if they had not been able to challenge the evidence, his client would have been “portrayed … as a violent and aggressive man.”

A potential danger of deepfake technology is the erosion of trust in the judicial system. Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Center for Internet and Society at Stanford Law School, cautioned that “juries may be primed to come into the courtroom and be less ready to believe what they see and believe what they hear and will be more susceptible to claims that something is fake or not.” Robert Chesney and Danielle Citron, authors of a paper on deepfakes, “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security,” warned about the danger of “fake-but-viral” videos, especially in the current social environment, noting that “where strong narratives of distrust already exist, provocative deep fakes will find a primed audience.”

MIT’s Center for Advanced Virtuality has a goal of informing the public about deepfakes. The Center created an interactive website featuring a film, “In Event of Moon Disaster,” which includes video of President Richard Nixon seemingly delivering a speech about the loss of the first two astronauts to land on the moon in 1969. The speech itself was real, having been written for President Nixon as a contingency, but of course it was never given, as the two astronauts survived the mission. The Center’s video used AI, an actor, a synthesized voice, and other technology (explained on the interactive website) to create a convincingly realistic fake. The Center’s XR Creative Director, Francesca Panetta, was quoted as saying, “This alternative history shows how new technologies can obfuscate the truth around us, encouraging our audience to think carefully about the media they encounter daily.”

The MIT Center identified four factors contributing to the rise of deepfakes: 1) “advances in deep learning technologies and computational hardware”; 2) “the development of social media platforms that make it easier for people to share and access information”; 3) “a deregulated climate of media production and distribution”; and 4) “the embattled state of contemporary journalism.”

One digital forensics expert, Hany Farid, a professor at the University of California at Berkeley, suggested that stamping a digital signature on a recording at the time it is made, while not foolproof and still vulnerable to hackers, would help in authenticating content. He suggested the digital stamping could be built into every bodycam and police surveillance camera. John Simek, vice president of digital forensics and cybersecurity at Sensei Enterprises, thinks that jurors could, similar to the “CSI effect,” come to expect that lawyers at trial will prove or disprove the authenticity of evidence through experts. However, according to Henry Ajder, an author with Deeptrace Labs, there are not enough digital forensic experts. He was quoted as saying, “As the technology for generating synthetic media and deep fakes increases and becomes more accessible, the number of human experts who could rule with authority on whether a piece of media is real or not, has not.”

Ms. Pfefferkorn, of Stanford Law School, suggested in an article that lawyers budget for experts, learn enough about deepfake technology to recognize outward signs that something might not be real, and be aware of the ethical considerations raised when a client presents the lawyer with potential evidence that might not be genuine. She said, “If it’s a smoking gun that seems too good to be true, maybe it is.”

Sources:  MIT Open Learning, “Tackling the misinformation epidemic with ‘In Event of Moon Disaster,’” July 20, 2020:
MIT’s interactive resource website and “In Event of Moon Disaster”:
James Rogers, “Creepy Apollo 11 Nixon deepfake video created by MIT to show dangers of high-tech misinformation,” July 21, 2020:
Matt Reynolds, “The judicial system needs to learn how to combat the threat of ‘deepfake’ evidence,” February 28, 2020:
Matt Reynolds, “Courts and lawyers struggle with growing prevalence of deepfakes,” June 9, 2020:
Related:  Criminal Defense Newsletter, Volume 42, Issue 11, August 2019, p. 8:
Criminal Defense Newsletter, Volume 43, Issue 9, June 2020, p. 9:

by Neil Leithauser
Associate Editor