Adelphi accused a student of using AI to plagiarize. He fought back — and won.
Orion Newby at his home in Lido Beach. A judge ruled that Adelphi University must expunge its plagiarism accusations from his record. Credit: Newsday/Alejandra Villa Loarca
An Adelphi University student who sued the school over what he called a “completely false” allegation that he used artificial intelligence to write an essay has won his case, with a judge ordering the school to reverse the disciplinary measures against him.
The Garden City university’s finding that the student, Orion Newby, used AI to commit plagiarism, and its denial of his subsequent appeal, were “without valid basis and devoid of reason,” state Supreme Court Judge Randy Sue Marber of Nassau County wrote on Jan. 28. The school must expunge his record, the judge ruled.
According to the lawsuit, Adelphi relied in part on AI detection software to accuse Newby of improperly using AI. University officials ordered Newby to attend a plagiarism workshop — which, while considered a “nondisciplinary” punishment, could have resulted in suspension or expulsion if he were found to have committed a second offense.
“It feels incredible to finally have my name cleared,” said Newby, 20, who lives in Lido Beach with his parents, Candace and Hunter Newby, and is now in his second year at Adelphi, where he majors in history. “Winning this case is a huge weight off of my shoulders.”
WHAT NEWSDAY FOUND
- An Adelphi University student accused of using artificial intelligence to plagiarize has won his case, with a judge ordering the school to expunge his record.
- The Garden City university’s finding that the student, Orion Newby, used AI to commit plagiarism, and its denial of his subsequent appeal, were “without valid basis and devoid of reason,” the judge wrote.
- An Adelphi spokeswoman said the school was “evaluating the court's decision and will proceed accordingly.”
The university’s denial of Newby’s appeal left the family “no choice” but to sue, given that the same thing could have happened again and resulted in expulsion, Candace Newby said. The family has spent six figures on legal costs, she said.
An Adelphi spokeswoman said the school “does not comment on litigation or on individual or personal cases involving students or faculty. We are evaluating the court's decision and will proceed accordingly.”
Newby’s attorney, Mark Lesko, a former acting U.S. attorney for the Eastern District of New York and former vice president at Hofstra University, said the ruling should prompt Adelphi and other colleges to overhaul their AI policies and make sure students get due process.
The decision is “a bellwether example of why universities need to be very careful and protective of their students when they address issues regarding the use of AI in the classroom,” Lesko said.
“I can't tell you how many parents have reached out to us with similar issues,” Lesko said. “It's clearly a growing problem in higher education.”
AI allegation
Adelphi’s case against Newby stemmed from work he submitted in a fall 2024 World Civilizations 1 class taught by Assistant Professor Micah Oelze, according to court filings.
It was Newby’s first semester in college.
A third-degree black belt in tae kwon do and an ocean lifeguard, Newby has been treated since he was about 2 years old for learning and neurological disabilities that include language and auditory processing disorders and attention deficit hyperactivity disorder, his mother said. Newby works hard to overcome his disabilities, spending hours refining his writing with assistance from tutors, she said.
Orion Newby and his parents, Candace and Hunter Newby. Credit: Newsday/Alejandra Villa Loarca
In November 2024, Newby submitted a paper on Christianity and Islam. He had worked on it with a tutor from Bridges to Adelphi, the university’s $5,000-a-semester program that advertises “individualized academic, social and vocational support services” for students with disabilities, according to legal filings.
Oelze gave the paper a grade of zero, according to court filings, in part because he thought it was AI-generated.
In a message to Oelze, Newby said he was working with tutors at home and through Bridges, and that he would go to the writing lab for help. “I work many hours on these assignments,” Newby wrote.
Oelze responded that he aimed “to set you up for success in future history classes, so that you can be the best writer possible.”
In a one-on-one meeting, Oelze asked whether Newby had used the generative AI program Grammarly. In legal filings, Oelze said Newby acknowledged using the program. Newby, however, maintained that he told Oelze he had received grammatical help from a tutor, not from Grammarly, court papers show.
Soon after, Oelze filed a violation report with Adelphi alleging that Newby had violated the school’s academic integrity code. The AI detection system Turnitin rated the essay 100% AI-generated, Oelze wrote, and the work “does not carry the voice that I associate with Orion (or any college student),” among other indications of AI use.
Newby said he was stunned.
“I thought at first I was going to get arrested,” he recalled. He submitted his paper to two other AI detection tools, both of which rated it human-written, court filings show.
But Adelphi’s academic integrity officer, Associate Professor Michael LaCombe, ruled against Newby after reviewing submissions by the student and professor.
Newby appealed, stating the finding would “punish me for something I did not do.” Adelphi’s Student Bill of Rights promises a fair, impartial hearing and an “adviser of choice” in such cases, and “this does not seem to be happening,” Newby wrote.
LaCombe denied the appeal, writing that other faculty had also reviewed the matter and the violation report “will remain.”
LaCombe declined to comment. Oelze did not respond to a request for comment.
In court papers, Adelphi said Oelze filed the report “based on his experience and judgment” grading thousands of papers in his 10 years as a full-time professor. Turnitin is “reliable, accurate and an important tool” in detecting students’ prohibited use of AI, and Oelze also considered the content of the essay and other factors, the university said in legal filings.
However, the judge wrote in her decision that LaCombe “failed to even consider” the two AI detection programs that found Newby’s essay was human-written.
In addition, Marber wrote, allowing LaCombe, who made the initial decision, to also rule on the appeal thwarted “a student’s right to an avenue of meaningful ‘appeal’ ” as promised in school policies.
Marber’s decision comes as the use of AI is exploding on campuses, sparking concern that some students are using it to cheat — as well as worries that colleges are falsely accusing them. Almost nine out of 10 college and graduate students acknowledged using AI in academic work, according to the 2025 AI in Education Trends Report by Copyleaks, an AI text analysis platform.
At many colleges, professors use AI detection tools such as Turnitin, which advertises accuracy rates of 96% or higher, depending on the amount and type of text submitted. A Turnitin spokeswoman said in a statement that it is “designed to be one, but not the only, tool in the educator and administrator toolkit.”
Some educators believe AI detection tools are not reliable enough, especially if students could face serious consequences based on their findings. A growing number of colleges have banned the tools.
It’s essential to avoid false accusations, said Jim Samuel, executive director of the Informatics Program at Rutgers University, where he does research on AI. “In most cases, it's difficult to say with a sufficiently high degree of certainty that a person has used AI,” he said.
Even so, he said, “If we don't do anything about it, and students get the idea that they can just ... use AI and get a 4.0 GPA and get away with it, that I think would be very counterproductive.”
Critical thinking concerns
A survey of college professors found that 90% believe AI will harm students’ critical thinking skills and that 73% have dealt with academic integrity issues involving students’ use of AI, according to a January report by the American Association of Colleges and Universities and Elon University’s Imagining the Digital Future Center.
Certain schools are inviting students to play a larger role in creating AI policies.
The University of Virginia is forming an elected council of students who will advise professors and administrators about technology systems, “including questions and issues around plagiarism and cheating,” said Mona Sloane, who teaches data science and media studies at the school. Students “must be at the table,” she said.
Professors should also have a say, said Britt Paris, chair of the American Association of University Professors’ AI committee. Many adjuncts earn roughly $2,000 per class and have seen their workloads increase dramatically as they redesign courses and spend more time grading to account for potential use of AI, she said.
Professors can help students understand that developing critical thinking skills instead of relying on AI will benefit them, Paris said. “We don't need to think about our students as adversaries,” she said.
Indeed, professors could encourage students to use AI to review their drafts and suggest ways to improve them, said Kirsten Peterson, a senior project director at the nonprofit Education Development Center who is working on a National Science Foundation-funded project to provide AI instruction at community colleges.
Educators, she said, need to “slow down and teach our students the power of human intelligence and why they want to retain their critical thinking skills and to ensure that their voice matters.”
Schools should strike a balance, teaching students to use AI but also making sure they develop the communication skills that will help them stand out, said Chris Cheetham-West, who educates colleges and other organizations about effective AI use.
“Right now, everybody's on their phone, everybody's texting ... people are scared to pick up the phone,” he said. “Being able to communicate your ideas, being able to talk to outside vendors or outside customers, or just even colleagues, and explain things in a simple way, that's going to be one of the strongest skills to have.”
To head off improper AI use, some professors require students to complete assignments in class using pencil and paper, said James Brusseau, a philosophy professor at Pace University who researches AI use in higher education.
It is also now possible to administer oral exams in which an AI tool assesses students’ spoken responses in real time and prompts them to give more information if needed, a system that makes it all but impossible to use AI to generate answers, he said.
“We can go back to the way things used to be, or we can go forward to the way things could be,” Brusseau said. “But the way things are right now obviously doesn't work.”


