The unintended consequences of self-harm monitoring software

Experts are calling for policies to govern the self-harm monitoring technology used by schools.

Suicide is the second leading cause of death among 10- to 24-year-olds in the United States. During the COVID-19 pandemic, educators worried that, without in-person interaction at school, they could not identify students at risk of self-harm. To address this concern, thousands of school districts across the United States have purchased monitoring software to flag digital activity containing self-harm content.

But this software is far from perfect. It may not be able to differentiate a student’s academic research into the late poet Sylvia Plath, who died by suicide, from the personal research activity of a student experiencing suicidal thoughts.

Self-harm monitoring software has not been proven to protect students, explain Sara Collins of Public Knowledge and her co-authors in a recent report published by the Future of Privacy Forum. Collins and her co-authors argue that, in the absence of strong regulation, this software can have unintended negative consequences for student privacy, equity, and mental health.

“In the absence of other support, merely identifying students at risk of self-harm – if the system even does so correctly – will at best lead to no outcome,” and at worst could trigger a harmful response, warn Collins and her co-authors.

Collins and her co-authors liken self-harm monitoring software to the reviewing, reporting, and alerting that credit card companies perform for suspicious transactions. Self-harm monitoring software used by schools works by analyzing student activity on school-provided devices, including email, social media, documents, and other online communications. The software then either retains a record of all activity for school administrators to search or collects only the activity it flags. When the software identifies a risk of self-harm, it reports the activity to school employees and, in some cases, alerts third parties such as law enforcement.
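To see why this kind of flagging produces the false positives described above, consider a minimal sketch of keyword-based content matching. The term list and matching logic here are illustrative assumptions, not any vendor’s actual implementation:

```python
# Hypothetical sketch of keyword-based self-harm monitoring.
# The term list and matching rule are illustrative assumptions only;
# real products use proprietary (and more complex) detection logic.

FLAG_TERMS = {"suicide", "self-harm", "kill myself"}

def flag_activity(text: str) -> bool:
    """Return True if the text contains any monitored term."""
    lowered = text.lower()
    return any(term in lowered for term in FLAG_TERMS)

# A bare keyword match cannot tell academic research apart from
# personal risk: both of these messages are flagged.
essay = "My essay discusses Sylvia Plath, a poet who died by suicide."
message = "I have been thinking about suicide."
print(flag_activity(essay))    # True - a likely false positive
print(flag_activity(message))  # True
```

Both strings trigger the same alert, which is precisely the Sylvia Plath problem: context-free matching treats a literature assignment and a cry for help identically.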

This practice of monitoring and disclosing student information has various legal implications, Collins and her co-authors explain. At the federal level, the Children’s Internet Protection Act (CIPA) requires schools that receive certain federal funding to monitor minors’ activity on all school-provided devices to protect them from accessing obscene material.

Collins and her co-authors explain that school districts have unevenly interpreted the scope of monitoring CIPA requires and how its monitoring requirements might interact with other federal laws, such as the Family Educational Rights and Privacy Act (FERPA), which requires schools to prevent disclosure of student records without parental consent. The Federal Communications Commission has not issued any guidance on how CIPA or FERPA applies to the use of monitoring software.

This gap in guidance has serious privacy implications, according to Collins and her co-authors. “Without clarity on CIPA requirements, schools may unintentionally over-monitor and over-collect sensitive personal information about students or their families in an effort to comply with the law,” Collins and her co-authors argue. Some state cyberbullying laws may also require schools to monitor student activity, further complicating the regulatory landscape.

This type of monitoring and disclosure also presents equity issues related to disability discrimination, Collins and her co-authors explain. All schools must comply with the Americans with Disabilities Act (ADA). In addition, schools that receive federal funding – virtually all public schools – must also comply with Section 504 of the Rehabilitation Act. Both laws protect people with disabilities and perceived disabilities. The ADA defines a disability as “a physical or mental impairment that substantially limits one or more major life activities of such individual.”

Collins and her co-authors argue that flagging a student for self-harm would meet the criteria for a perceived disability. Therefore, when a school flags a student as being at risk of a mental health condition that implicates their safety, the school may be legally required to provide confidentiality and non-discrimination protections to that student.

Collins and her co-authors are also concerned that this monitoring violates Title IX protections against discrimination based on gender identity and sexual orientation. Title IX prohibits discrimination on the basis of sex or gender. Some monitoring software flags content containing terms such as “gay,” “lesbian,” and “queer” as a risk factor. LGBTQ+ students may be harmed by this data collection, warn Collins and her co-authors.

In one study, researchers found that fewer than half of LGBTQ+ youth shared their sexual orientation with school staff. Collins and her co-authors also explain that LGBTQ+ youth are more likely than their non-LGBTQ+ peers to seek information and resources about their identity on the Internet. Collins and her co-authors recommend that school districts carefully craft policies related to LGBTQ+ identities. Specifically, they recommend that these policies limit how long schools retain records and restrict disclosures, to prevent LGBTQ+ students from being marginalized and their private information from being shared.

Overall, Collins and her co-authors draw attention to the potential dangers of software intended to save lives. They recommend a balanced approach to self-harm monitoring, advocating that stakeholders from across school communities contribute to guidance that interprets the various legal requirements implicated by this new software.

Without strong mental health resources, careful implementation, and enforcement of anti-discrimination laws, self-harm monitoring software can do more harm than good, Collins and her co-authors conclude.

If you or a loved one is having suicidal thoughts, dial or text 988 in the United States to reach the 988 Suicide & Crisis Lifeline, or visit the Lifeline’s website for additional resources.
