A child sexual abuse material case involving artificial intelligence is unfolding across several states, including Montana, with victims asking how their photographs were used without their knowledge or consent.
Rachel Burk, a former Cascade resident now residing in Oregon, said she realized something was wrong after receiving a message on social media.
"I got a message through social media, basically telling me that they think they possibly found some pictures of me," Burk recalled. "I didn't believe it at all because I thought, who is going to reach out over social media?"
Burk initially thought the message was a scam. However, after contacting Montana law enforcement, she learned that the situation was real.
Investigators told her that photographs taken from social media had reportedly been fed into generative AI programs to create child sexual abuse material, commonly known as CSAM.
Burk said identifying the victims was difficult and emotional: "I had to look at sanitized photos to identify anybody that I knew, and I knew most of them."
According to investigators, many of the victims were former Cascade High School classmates.
Authorities believe roughly 450 images were created or identified, some depicting full nudity and others altered with ill-fitting clothing, all showing minors between the ages of six and seventeen and produced through artificial generation.
As the inquiry progressed, authorities worked to determine who was responsible.
"I didn't recognize the name of the person that he gave me," Burk told me. "I asked to view a picture, and I did know that person. I went to school with him from fifth grade until my sophomore year of high school."
The suspect was later identified as Dalten Bryne Montana Johnson. Court documents state:
Upon his arrest, the Defendant was advised of his Miranda rights and agreed to speak with law enforcement. Post waiver, the Defendant admitted he owned the account where the CSAM was discovered. Defendant admitted that he has had a pornography addiction since 2020 and admitted that he has previously saved sexually explicit images depicting people who may have been children. Also in the account were a significant number of images which appear to be CSAM, but where the age of the victim is currently unknown, commonly referred to as “age difficult” CSAM.
In addition, law enforcement discovered that the Defendant possessed approximately 450 images of female children that are not nude, but which depict the children wearing swimsuits, leotards, and other tight, form-fitting, or revealing clothing. The children depicted in these photographs are between the ages of 6 and 17. The Defendant admitted that he had access to these children as they are either relatives or friends of his own children. Furthermore, the Defendant’s wife has operated an in-home daycare at the residence for nearly 10 years.
Investigators believe the name initially caused confusion because Johnson was previously known as Dalten Knapstad.
Sarah Burk, Rachel's sister and another victim in the case, said her image had been taken from a photo shared online.
“I wasn't even friends with him on Facebook,” she said. “He got my photo through a picture that Rachel and I were in together. So that was actually a picture that he edited of us sisters together.”
Johnson is facing felony charges in Utah. The Utah Attorney General's Office has charged him with six counts of sexual exploitation of a minor.
Investigators also believe he had access to other children through his wife's in-home daycare, and some of those kids may have been victims.
For the victims, the case calls into question long-held beliefs about who is targeted in such crimes.
"When you think CSAM, you're like, oh, vulnerable children, right?" Rachel Burk stated. "We are in our thirties, and this piece of our history was taken and exploited.”
To help reduce the risk of exploitation, the Internet Crimes Against Children Task Force recommends limiting the personal information and photographs you share online and reporting suspicious behavior to law enforcement.
We will update you if we get more information.