News: 0175496399

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Explicit Deepfake Scandal Shuts Down Pennsylvania School (arstechnica.com)

(Monday November 18, 2024 @05:40PM (BeauHD) from the class-canceled dept.)


An anonymous reader quotes a report from Ars Technica:

> An AI-generated nude photo scandal has [1]shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images. The outcry erupted after a single student created sexually explicit AI images of nearly 50 female classmates at Lancaster Country Day School, Lancaster Online [2]reported. Head of School Matt Micciche seemingly first learned of the problem in November 2023, when a student anonymously reported the explicit deepfakes through "Safe2Say Something," a school portal run by the state attorney general's office. But Micciche allegedly did nothing, allowing more students to be targeted for months until police were tipped off in mid-2024.

>

> Cops arrested the student accused of creating the harmful content in August. The student's phone was seized as cops investigated the origins of the AI-generated images. But that arrest was not enough justice for parents who were shocked by the school's failure to uphold mandatory reporting responsibilities following any suspicion of child abuse. They filed a court summons threatening to sue last week unless the school leaders responsible for the mishandled response resigned within 48 hours. This tactic successfully pushed Micciche and the school board's president, Angela Ang-Alhadeff, to "part ways" with the school, both resigning effective late Friday, Lancaster Online [3]reported.

>

> In a statement announcing that classes were canceled Monday, Lancaster Country Day School -- which, according to Wikipedia, serves about 600 students in pre-kindergarten through high school -- offered support during this "difficult time" for the community. Parents do not seem ready to drop the suit, as the school leaders seemingly dragged their feet and resigned two days after their deadline. The parents' lawyer, Matthew Faranda-Diedrich, [4]told Lancaster Online Monday that "the lawsuit would still be pursued despite executive changes." Classes are planned to resume on Tuesday, Lancaster Online reported. But students seem unlikely to let the incident go without further action to help girls feel safe at school. Last week, more than half the school walked out, MSN [5]reported, forcing classes to be canceled as students and some faculty members called for resignations and additional changes from remaining leadership.



[1] https://arstechnica.com/tech-policy/2024/11/school-failed-to-report-ai-nudes-of-kids-for-months-now-parents-are-suing/

[2] https://lancasteronline.com/news/local/lancaster-country-day-parents-take-first-step-in-suing-school-leaders-over-deepfakes-of-students/article_bfb6066e-a2c7-11ef-af86-df31e3bc5b86.html

[3] https://lancasteronline.com/news/local/lancaster-country-day-head-of-school-board-president-leaving-amid-legal-pressure-from-parents/article_cd0039c8-a3d2-11ef-90aa-1fe2bfb2b21d.html

[4] https://lancasteronline.com/news/local/lancaster-country-day-school-cancels-classes-monday-as-challenges-over-deepfakes-continue/article_9ee180e4-a5a6-11ef-ae00-d3ea73e7727f.html

[5] https://www.msn.com/en-us/news/us/lancaster-country-day-students-stage-walkout-over-ai-nude-photo-scandal/ar-AA1tLO6w?apiversion=v2&noservercache=1&domshim=1&renderwebcomponents=1&wcseo=1&batchservertelemetry=1&noservertelemetry=1



\o/ (Score:1, Offtopic)

by easyTree ( 1042254 )

When did 'Explicit' start to mean only 'sexually explicit'? Are modern people so vague that explicitness has lost all other meaning?

Re:\o/ (Score:5, Funny)

by Anonymous Coward

It doesn't mean only that, but in this context it's implicit.

Re: (Score:2)

by sg_oneill ( 159032 )

It's images of kids. Do we REALLY need to know the details of the photos? Whether the kid responsible was using AI to "undress" them or do much worse, it's going to be massively harmful shit to the mental wellbeing of the girls involved.

I worked a number of years at the DOJ; the effect this sort of stuff has on children is horrifying. In one case we had a 12-year-old girl who was abused and had photos taken of her. She committed suicide mid-trial, causing the judge to have to call a mistrial due to technicalities regarding

Re: (Score:2)

by ISoldat53 ( 977164 )

I thought it was required for people in authority to report these incidents and it was a crime not to.

Re: (Score:1)

by Narcocide ( 102829 )

You might be surprised about what percentage of school administration believes that they either are the law, or that they are above it. The math teacher/football coach at the high school I attended sexually harassed his entire football team regularly for decades before they eventually fired him for losing too many football games.

Re: (Score:2)

by taustin ( 171655 )

Varies by state, but yeah, pretty much.

Re: (Score:3)

by sjames ( 1099 )

Or a "mean girl"

Privacy laws Re:Don't want to be sexist (Score:1)

by davidwr ( 791652 )

The gender of the alleged criminal probably hasn't been released.

Re: (Score:3)

by jythie ( 914043 )

Unless the boy was white, in which case they would probably do exactly what they did.. nothing. School administrators who report white boys tend to find themselves on the wrong end of parental outrage for ruining his life for 'just being a boy!'.. Lancaster is a deep red county, meaning 'tough on crime' and 'reporting' laws are NOT designed for little white boys, only getting those scary dark ones into prison where they belong.

Also depends on how much money (Score:3)

by rsilvergun ( 571051 )

The kids' parents have. Do yourself a favor: if you're not of the right social caste, don't send your kid to a private school. They're just going to eventually drop out from the bullying.

My next door neighbor sent their kids to a private school back in the day and it was absolutely brutal for them.

Except for the really, really expensive private schools, you also have to keep your grades way, way up. The way those charter schools end up looking better than public schools is that any kid who falls below a hi

Yeah, but you seem to be. (Score:2)

by demon driver ( 1046738 )

How likely is it that a girl would make or fake pornographic images of other girls in her neighborhood and show them around? Among whom? Other girls? Some boys?

(Is there even one precedent that would have become known?)

Anyway, an [1]article on Lancaster Online [lancasteronline.com] from August clearly refers to “the boy” as the perpetrator. He was also in 9th grade, so you were right or close with the "14 y.o. boy". You should have stopped there, before starting to ideologize it...

[1] https://lancasteronline.com/news/local/authorities-investigate-reports-of-ai-generated-nude-images-featuring-lancaster-country-day-students/article_49cf24d0-658c-11ef-b8de-0f35cffb17dc.html

Never too young to be a creep (Score:3)

by Malay2bowman ( 10422660 )

And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.

Re: (Score:2)

by jythie ( 914043 )

Which is probably why administrators were reluctant to report anything. Given the area, they tend to see reporting laws as something for getting brown kids into the prison system, NOT something to 'ruin the lives' of little white boys. If it had been a dark skinned kid making deepfakes of white classmates, he would probably be in juvenile detention.

Maybe not Re:Never too young to be a creep (Score:1)

by davidwr ( 791652 )

> And now this person is going to end up on the registry.

I don't know Pennsylvania's laws, but many US states have special laws for young (typically under 18 or under 21) first-time sex offenders so they aren't automatically put on the sex-offender registry and which allow them to get removed much earlier if they are forced to register.

This could just be dumb kid shit (Score:2)

by rsilvergun ( 571051 )

Kids do not always understand the consequences of their actions or the gravity of them relative to our society. It sounds like a big deal that they did 50 of them, but they could literally take their yearbook, take a picture of it with their iPhone, and then run the picture through any one of several programs to generate the nudes en masse.

It's one of those things where at a certain point when we as a civilization have made it so easy to commit a crime I have a hard time wanting to bring the full force of

Re: (Score:1)

by cascadingstylesheet ( 140919 )

> Also if our country and our society didn't have such a fucked up opinion of nudity this wouldn't be an issue. The reason this is so damaging is because the assumption is that if you can find nude pictures of a girl then that girl is of low moral character and should be ostracized.

Civilized countries have always had restrictions on nudity.

Re: (Score:2)

by ihavesaxwithcollies ( 10441708 )

> And now this person is going to end up on the registry. Goodbye future and hello ankle monitor whenever he or she gets out of lockup.

Don't forget to say hello to Matt Gaetz. This person will be able to be Attorney General someday.

A GOP DEI hire includes child molesters.

You get what I pay for (Score:2)

by dpille ( 547949 )

Welcome to the new world of vouchers.

Re: (Score:1)

by vivian ( 156520 )

> Is pasting a kids face on a drawing a nude adult enough to call it child porn?

yes.

Some states also categorize sexual cartoons or paintings of completely made-up children as child pornography, and there have been prosecutions for possession of Japanese cartoons depicting "under-age" characters engaged in sex. Since until quite recently (2023) the age of consent in Japan was 13, there was quite a lot of "Japanese school girl fantasy" type artwork out there. It was pretty gross to see some old guy reading that stuff on the train when I lived there.

Where's the abuse? (Score:5, Insightful)

by Murdoch5 ( 1563847 )

If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected under freedom of expression / speech. This idea has been fought before; it's why people can publish child sexual novels, and why organizations like NAMBLA can exist.

Regardless of how anyone feels about the content (I know, as a father of two teenage daughters, I would want the death penalty), the right to free speech and free expression must be upheld. What is the difference between a generated picture of a naked teenager and Romeo and Juliet? Would those same parents demand Romeo and Juliet be removed, and people resign?

Just so there's absolutely no confusion, I'm only coming at this from a free speech / freedom of expression context. I do not defend generating sexual fakes of others.

Re: (Score:1)

by abEeyore ( 8687599 )

There are, indeed, laws against generating content depicting the sexual exploitation of minors, even if it is entirely fictitious - and this is not entirely fictitious, because the images are based on real people. I understand that you feel this skirts close to the edge of so-called "thought crime", but in this case, the violation is not simply in creating the pictures, but in then distributing them.

The creator's rights - whatever you may imagine them to be - end where the rights of the girls in question begin.

Re: (Score:2)

by Murdoch5 ( 1563847 )

They weren't exploited, because the images were generated. Imagine generating a James Bond novel that had themes and elements from old novels, but was generated differently enough as to not breach copyrights or trademarks. If you distribute the novel, all you've really done is distribute a ripoff. Contrast the Hardy Boys with Nancy Drew: they're effectively the same concept, but different enough as to be unique.

I'm not going to defend distributing the images because I think it's gross AF, but fundamentally, if th

Re: (Score:1)

by cascadingstylesheet ( 140919 )

> the right to free speech and free expression must be upheld

Unless, of course, you say something truly horrific, like "this is actually Hunter Biden's laptop" ...

Re: (Score:2)

by Murdoch5 ( 1563847 )

That's hilarious!

Re: (Score:2)

by hey! ( 33014 )

> if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake

They are invasions of privacy. While we usually think of privacy as protecting disclosures of sensitive information, that's just one kind of privacy intrusion. The legal and ethical issues of privacy are actually considerably broader. They have to do with a whole host of issues relating to your right to personal autonomy.

For example, if you are in a public place, people are free to observe you, or even in most jurisdictions to film you. But if someone follows you around observing you to the point that

Re: (Score:2)

by Murdoch5 ( 1563847 )

This is absolutely a privacy violation, and a terrible privacy violation, without argument. Would I be okay with that website existing? No. But would I allow it? Depending on the content, provided it wasn't making factual statements about me, I probably would.

I say this as a person whose nickname for ~4 years, during high school, was “child porn” because several jocks spread a rumour that I was into it. I just ignored it, and after ~4 years, it died because everyone realized it was clear

Re: (Score:2)

by tiananmen tank man ( 979067 )

Aren't most nude deepfakes just photoshopping someone's head onto someone else's nude body? So isn't the original nude photo considered child porn? The "AI" part is just automating it.

Re: (Score:2)

by Murdoch5 ( 1563847 )

That's a good question. I honestly don't know, but if the images are all real and just recombined, that's a different issue!

Re: (Score:2)

by ihavesaxwithcollies ( 10441708 )

> If someone generated AI “deep fakes”, then what abuse took place? While those images are generally distasteful, rude, offensive, and invasive, if they're generated images, then they can't rise to the level of abuse, since the majority of the content is fake, and therefore protected until freedom of expression / speech.

Today I learned Michelangelo & Leonardo da Vinci are criminals because they created images of people without clothes on.

Is anyone surprised? (Score:2)

by Don'tJoin ( 6185656 )

Not I.

Everything that can be used to create porn will be used to create porn.

Even if something is only hinting at nudity there are those that will find it offensive and call it porn. *

Hormones do lower intelligence in all teenagers.

And these large artificial neural models that are open for all to use lower the bar substantially for a horny and/or bullied teenager to do this.

This might not have been done out of spite or with ill intent. "Don't attribute to..."

*generally speaking of what "AI" models can prod

Great way to frame someone else... (Score:1)

by Anonymous Coward

I can see little Billy making a bunch of deepfakes and being smart enough to use an encrypted virtual machine, a VPN, or even a bogus credit card with a VM service to make the deepfakes. From there, find a way to plant them on the computer of someone he is bullying, either by copying pictures over while the computer is unattended, or by making a USB drive with the target's name and address on it, throwing it in their backpack, then telling the principal they were looking at pr0n on their computer.

Bam

Squirming:
Discomfort inflicted on young people by old people who see no
irony in their gestures. "Karen died a thousand deaths as her father
made a big show of tasting a recently manufactured bottle of wine
before allowing it to be poured as the family sat in Steak Hut.
-- Douglas Coupland, "Generation X: Tales for an Accelerated
Culture"