October 3, 2022
The Trevor Project Partners with Student Surveillance Company Accused of LGBTQ+ Bias
Emell Adolphus READ TIME: 2 MIN.
Despite warnings from lawmakers and civil rights groups that digital surveillance tools could be used to discriminate against at-risk students, the Trevor Project has formed a financial partnership with a company that specializes in them.
As reported by The Guardian, the Trevor Project now lists as a "corporate partner" Gaggle, a controversial surveillance company that uses artificial intelligence and human content moderators to sift through students' chats and homework assignments in search of students who may harm themselves or others. The LGBTQ+ mental health-focused nonprofit said the partnership aims to "improve mental health outcomes for LGBTQ young people."
Gaggle has reportedly given the Trevor Project between $25,000 and $50,000 in support.
Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis, has had firsthand experience with Gaggle's surveillance dragnet.
"It really does feel like a 'We paid you, now say we're fine' kind of thing," said Logsdon-Wallace, who is transgender and worries that the partnership amounts to a "seal of approval" from the Trevor Project.
They added, "People who want to defend Gaggle can just point to their little Trevor Project thing and say, 'See, they have the support of 'the Gays' so it's fine, actually,' and all it does is make it easier to deflect and defend actual issues with Gaggle."
In a statement, a Trevor Project spokesperson said Gaggle's digital monitoring tools can keep students safe without invading their privacy, and that the collaboration began with Gaggle's "desire to identify and address privacy and safety concerns that their product could cause for LGBTQ+ students."
"It's true that LGBTQ+ youth are among the most vulnerable to the misuse of this kind of safety monitoring – many worry that these tools could out them to teachers or parents against their will," read the statement from the Trevor Project. "It is because of that very real concern that we have worked in a limited capacity with digital safety companies – to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies."