An Adelphi University student was accused of using AI to plagiarize an essay. Now he's suing the school
Orion Newby at his home in Lido Beach. Credit: Newsday/Alejandra Villa Loarca
An Adelphi University student is suing the school after it accused him of using artificial intelligence to write an essay deemed “too advanced” for a first-year student — a charge he calls “completely false,” saying the essay’s writing style was influenced by his learning and neurological disabilities and the tutoring he receives for them.
The lawsuit filed in state court in Nassau County by Orion Newby, 19, argues the Garden City school used an AI detection program called Turnitin as a factor in its ruling that Newby used AI to write a history paper. The university ordered Newby to attend a plagiarism workshop and did not allow him to appeal the verdict, according to the lawsuit. A second offense could have resulted in suspension or expulsion.
The disciplinary process failed to follow Adelphi’s own policies, and Turnitin is “not sufficiently accurate” to support the ruling, the lawsuit argued.
“I was just devastated” by the accusation, Newby said in a recent interview at the Lido Beach home where he lives with his parents, Candace and Hunter. “I was very overwhelmed and stressed out, you know, and I thought my career at Adelphi was going to be over.”
WHAT NEWSDAY FOUND
- An Adelphi University student is suing the college after he was accused of using artificial intelligence to write an essay.
- The student says the essay’s writing style was influenced by his learning and neurological disabilities and the tutoring he receives for them. He argues the AI detection system used by the school is “not sufficiently accurate.”
- Adelphi officials said in court papers that the student violated the university’s code of academic integrity and the AI detection program is “reliable, accurate and an important tool” in addressing improper use of AI.
An attorney for Adelphi and a university spokeswoman declined to comment on the pending litigation. But in court papers, the university stood by its ruling. Adelphi asked Judge Randy Sue Marber on Sept. 29 to dismiss the lawsuit, saying Newby violated the university’s code of academic integrity and the school followed proper procedures in handling the situation. The AI detection program is “reliable, accurate and an important tool” in addressing improper use of AI, Adelphi said in court papers.
Newby and Adelphi are scheduled to appear in court in November for a hearing on the lawsuit and the dismissal request.
The lawsuit sheds light on how colleges are struggling to cope with the explosion in use of AI among their students. Nearly 9 out of 10 American students in a recent survey reported using AI in their schoolwork, and 73% say they use it more now than they did last year, according to the 2025 AI in Education Trends Report released last month by the AI text analysis platform Copyleaks. A survey of Harvard University undergraduates last year found that almost 90% used AI, with about one quarter using the technology as a substitute for doing required reading or attending office hours.
The legal dispute at Adelphi also raises questions about the best ways to prevent and detect the improper use of AI — questions that experts on technology and higher education say have no easy answers.
A growing number of universities recommend that professors avoid Turnitin and other AI detection tools, saying they result in too many false accusations.
“They are correct most of the time, but you can't count on them,” said Emily Isaacs, interim associate provost at Montclair State University in New Jersey, which advises faculty not to use the tools. Plus, she said, since AI can generate entirely new text, it’s impossible to know for sure if a particular work was created by a human being or by AI.
“There's no smoking gun,” she said.
Some research indicates the tools might unfairly target those with neurological conditions such as autism, though more study is needed, said Summer Chambers, a doctoral student in linguistics at George Mason University who has examined the topic.
Turnitin says it has a false positive rate of about 1%, and it advises educators not to use the tool as the sole factor in determining whether a student improperly used AI. The company also says it takes care to avoid unfairly flagging the work of people with divergent writing styles, including non-native English speakers. A spokeswoman for Turnitin declined to comment on the lawsuit.
Essay dispute
The youngest of four children, Newby has received treatment since about age 2 for learning and neurological disabilities including language and auditory processing disorders and attention deficit hyperactivity disorder, Candace Newby said. He has a third-degree black belt in tae kwon do, and is an ocean lifeguard and skilled ballroom dancer.
“He’s a rock star,” Candace Newby said. “He has accomplished what his older siblings have done, but he has to work … double as hard as the others to accomplish that.”

Orion Newby and his parents, Candace and Hunter, at their home in Lido Beach. Credit: Newsday/Alejandra Villa Loarca
Newby and Adelphi described in court papers how the conflict unfolded.
Newby started as a freshman at Adelphi last year. He chose the school in part because of the university’s Bridges to Adelphi program, which costs more than $5,000 a year, according to the lawsuit. The program’s website states that it offers “the highest levels of individualized academic, social and vocational support services” to students “who self-disclose as being on the autism spectrum or who experience other nonverbal or neurological-social disorders.”
An aspiring history major, Newby enrolled in assistant professor Micah Oelze’s World Civilizations 1 class last fall. In the history class syllabus, Oelze told students not to use AI tools such as ChatGPT “at all in this classroom, not even for brainstorming,” he stated in court papers.
At first, Newby was a “great student,” Oelze wrote. But then, Newby submitted a three-paragraph essay that “sounded like AI-generated text,” Oelze wrote. The professor sent a message to Newby and urged him to “give me your own take” and use “your own voice,” court papers show.
In response to a different assignment, Newby submitted an essay that Turnitin flagged as “100% AI,” Oelze wrote. It was the highest rating Oelze had ever seen on Turnitin, but he still did not treat it as “conclusive proof” on its own, he wrote. The essay was “too advanced” for a first-year student, it strayed from the course’s focus, it lacked required details and it referenced works that were not assigned, he wrote.
Newby wrote to Oelze that he worked with tutors at Adelphi and at home, and he planned to go to the writing center. “I work many hours on these assignments,” he wrote, in an exchange of messages submitted to the court. “I am having a hard time seeing my grades go lower and lower but I am putting in longer hours.”
Oelze wrote back that his goal “is to set you up for success in future history classes, so that you can be the best writer possible.”
The professor said in court papers that when the two spoke, Newby acknowledged using the web-based tool Grammarly. Newby insisted in court papers he told Oelze he received grammatical help from his tutors at Adelphi and at home, but did not say he used Grammarly.
Oelze reported him for an academic integrity violation.
The paper Newby submitted in Oelze’s class “does not carry the voice that I associate with Orion (or with any college student)” and appeared to be AI-generated, Oelze wrote in the violation report, court papers show.
The essay included phrases such as “apostolic journeys,” and others that do not “reflect the vocabulary of a first-year student,” he wrote.
In his response, Newby wrote that the accusations were “completely false.” He offered to rewrite the essay and insisted he did not use AI, writing, “I did not participate in academic dishonesty.”
Newby submitted the essay to two other AI detection programs, which deemed the essay not AI-generated, the lawsuit said.
In an interview, his mother recalled Newby filling multiple notebooks with handwritten notes and spending hours working with tutors on his essays for Oelze's course. “We go through it over and over, sentence by sentence, ‘how we can write this better?’ And that's how he learns,” she said.
Adelphi did not take into consideration the effect his disabilities may have had on the “voice” reflected in the essay, the lawsuit stated.
In early December, the university’s academic integrity officer, associate history professor Michael LaCombe, ruled against Newby, requiring him to take an online anti-plagiarism workshop and noting that though this was a “nondisciplinary” consequence for a first offense, a second offense could result in suspension or even expulsion, court papers show.
Newby requested an appeal, writing that the penalty would “harm me and punish me for something I did not do.” Adelphi’s Student Bill of Rights guarantees a fair, impartial opportunity to be heard and an “adviser of choice” in disciplinary matters, but “this does not seem to be happening,” Newby wrote.
LaCombe rejected the request, writing that he sent the request to a group of faculty members who “reviewed it carefully” and determined that the violation report “will remain.”
Later, Oelze wrote to Newby that professors “are honestly all struggling to learn how to discern AI use” from instances when students have “internalized the generic writing style” common on the internet.
LaCombe declined to comment. Oelze did not respond to requests for comment.
In the lawsuit, Newby seeks to overturn Adelphi’s finding of wrongdoing, a refund of the tuition and fees he paid, and compensation for the harm he said the ruling caused. The family paid Adelphi's annual tuition of about $49,000 last year, as well as the approximately $5,000 fee for the Bridges program, Candace Newby said.
Universities typically take accusations of plagiarism and similar violations “very, very seriously, because they know how damaging something like this can be,” said Newby’s attorney, Mark Lesko, a former acting U.S. attorney for the Eastern District of New York and vice president at Hofstra University.
But, he said, the situation required “tender loving care and…an awareness of Orion's needs. What happened here struck us as pretty heavy-handed.”
'Where do you draw the line?'
In an effort to avoid such disputes, some professors at colleges across the country have begun requiring students to make in-class presentations or use pencils and paper to write in-class assignments.
But for at-home assignments, it can be difficult to ferret out improper AI use in thousands of student submissions without automated tools, educators say. Some believe it’s worth risking some false accusations in order to prevent and detect students’ improper use of AI, said James Brusseau, a philosophy professor at Pace University who has studied the use of AI in higher education.
Speaking generally, not about the Adelphi lawsuit, he said, “The question is, how many? Where do you draw the line? Is it one in a million flagged incorrectly? One in 1,000? One in 100?”
He added, “The answer to that question might also have to do with what the penalty is for cheating.” If a student simply has to redo an assignment under a professor’s supervision, a falsely accused student will perform well “and things will be fine,” he said.
But if the consequences are more severe, or if there is no opportunity to redo an allegedly AI-generated assignment, he said, “The cost of using these tools might be too high.”