
Trusted, responsible AI tools for students and educators, from personalized learning to research assistance.



Over the past two years, Jeff Rubin, Senior Vice President and Chief Digital Officer at Syracuse University, has led a rollout of Claude to every student, faculty member, and staff member at the university. Syracuse is now launching a new way for students to search for classes. Instead of keyword searches, students can ask questions in natural language and get answers that support and align with the pursuit of their career goals. The system runs on Claude Opus and queries millions of rows of enterprise data in real time.
We talked to Rubin about the class search launch, what he's learned from redesigning how professors teach, and why he's optimistic about AI in education.
Jeff Rubin: About five years ago, I started working with the chancellor on a digital transformation strategy. Then, two years ago, I was appointed the university’s inaugural Chief Digital Officer. In this role, I oversee enterprise security, applications, academic computing, research computing, and the newest area: data and AI. As we started building out that strategy, it was clear that AI was central to every one of our endeavors. I'm not sure there's anything we plan to do where AI doesn't play a role in some way.
I said early on: we're not just going to adopt AI. We're going to equip our people to use AI effectively. That meant giving everyone access to this tool, just like we do with Microsoft Office or an operating system.
Rubin: What stood out about Anthropic was, first, the quality of the models. But perhaps more importantly, it was mission alignment with higher education. When you look at who Anthropic is, the high ethical standard, the safety, the values—those align with what we want to do. We understand you're a corporation, but the way you've gone about it to date is impressive and distinguishes you in the marketplace.
From the beginning, Anthropic looked at us as a partner, not a customer. That was a big deal, because we believe Syracuse is on the leading edge of implementing AI, and we wanted somebody who would listen to us and align their offerings with our evolving needs.
And they have. My team told Anthropic we needed help with Claude Code quotas at an enterprise scale. There wasn't a way to set individual spending limits. Within a week, we were in a beta program, and we implemented it immediately. That's the difference between being a partner and being a customer.
Rubin: Fundamentally change pedagogy. The way teachers teach and the way students learn haven't changed in over a hundred years. Teachers go into the classroom and lecture. Students take notes, maybe on a computer instead of by hand, but it's still largely the same classroom experience as when I was a student. Then there's an assessment: projects, reports, exams. That really hasn't been challenged.
I teach a class with a couple hundred students. Whether there are 200 or 15, the challenge is always reaching every student where they are. Some students believe the content is moving too fast, some too slow. This project was too easy; this homework was too hard. You can't individualize it. But AI is going to let you do that.
So what I'm trying to do is have professors lecture, then put out a case study challenge based on the material. As students respond, Claude assesses where each one is and gives tailored feedback. All of that feeds back into learning analytics, so the professor has a real picture of where students stand. And it changes every class. A student might struggle with one topic and excel at the next. That's the beauty of it.
Rubin: Let me tell you about a challenge I had and how Claude helped me identify a solution. I created a Claude-based practice exam for my class: 20 multiple-choice questions. I was excited about it. Then I looked at the data and the results were actually bad. Using the AI-generated practice exam was barely moving the needle on scores. Maybe a point above average.
So I went back to some experts at our School of Education, and they basically told me I wasn't doing it right. Multiple choice wasn't testing recall. Students were just clicking through answers without really engaging with the material.
So I changed it. I used Claude to redesign the practice exam so students get a term and have to type in a definition. Based on actual words that I said during class, Claude grades their response, tells them what they got right, and where they could be stronger. I just gave the real exam and scored it. Exam scores were 12 points higher than they've ever been. And I've been teaching this class for a long, long time.
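The redesigned practice exam Rubin describes, where a student types a definition and Claude grades it against what the instructor actually said in class, can be sketched as a prompt-assembly step. Everything below is illustrative: the function name, the prompt wording, and the example term are assumptions, not Syracuse's implementation.

```python
# Hypothetical sketch of a free-response practice-exam item. A grader model
# (e.g. Claude) would receive this prompt and return feedback on what the
# student got right and where the answer could be stronger.

def build_grading_prompt(term: str, lecture_excerpt: str, student_answer: str) -> str:
    """Assemble the grading prompt from the term, the instructor's own words,
    and the student's typed definition."""
    return (
        "You are grading a practice-exam response.\n"
        f"Term: {term}\n"
        f"What the instructor said in lecture: {lecture_excerpt}\n"
        f"Student's definition: {student_answer}\n"
        "Explain what the student got right and where the answer could be stronger."
    )

prompt = build_grading_prompt(
    term="opportunity cost",
    lecture_excerpt="the value of the best alternative you give up",
    student_answer="what you lose by not picking the other option",
)
print(prompt)
```

The key design point, per the School of Education feedback Rubin cites, is that grading is anchored to the instructor's actual lecture language rather than a generic definition, which is what forces recall instead of recognition.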
The problem wasn't the tool. It was how I was using it. And that's what I mean by changing pedagogy: we have to change how we teach, not just hand our community a new tool.
Rubin: The way students search for classes today is broken. It's keyword-based. If you type "AI," you get painting classes because the letters A-I are in the word. There's no way for a student to say, "I've taken these classes, and I have this internship, what's next?" That just wasn't possible in our current environment.
This week, we launched a system called Clementine that uses Claude Opus 4.6 to query our enterprise data in real time through custom MCPs. Students ask a question in natural language and get an answer that's specific to their situation. It's sitting on top of millions of rows of data.
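One way a system like Clementine can work is by exposing search tools to Claude, which the model calls when answering a student's natural-language question. The sketch below is an assumption-laden illustration: the tool name, schema, course codes, and in-memory data are invented for the example, while the real system queries millions of rows of enterprise data through custom MCP servers.

```python
# Illustrative only: a tool a course-search assistant might expose to Claude.
# The schema follows the general shape of a JSON-Schema tool definition.

COURSES = [
    {"code": "IST 256", "title": "Application Programming",
     "topics": ["python", "programming"]},
    {"code": "IST 387", "title": "Introduction to Applied Data Science",
     "topics": ["data science", "machine learning"]},
]

SEARCH_TOOL = {
    "name": "search_courses",
    "description": "Find courses relevant to a student's stated goal or topic.",
    "input_schema": {
        "type": "object",
        "properties": {"topic": {"type": "string"}},
        "required": ["topic"],
    },
}

def search_courses(topic: str) -> list[dict]:
    """Stub handler the server would run when Claude calls the tool.
    The real system would hit enterprise data, not an in-memory list."""
    t = topic.lower()
    return [c for c in COURSES if any(t in x for x in c["topics"])]

print([c["code"] for c in search_courses("machine learning")])  # → ['IST 387']
```

The point of the tool-based design is that the model never does keyword matching itself: it interprets the student's goal ("I've taken these classes and have this internship, what's next?"), then issues structured queries the enterprise systems can answer.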
At the end of the day, we're concerned about career outcomes. This helps ensure students are taking the right classes, aligned to the careers they want to pursue. It gives them information tailored to the individual, not just a list of course numbers with no context and no "why."
Rubin: I just got off the phone with somebody at a pretty big institution and they are trying to determine whether to adopt AI at all. And I talked to a senior graduating from another university, a computer science student, who told me his school won't embrace it. And I think: what are they doing to him? What's he going to do when he graduates?
We can all look at the clickbait of people who are doom and gloom. Or we can talk to psychologists, education PhDs, and people who've studied cognitive behavior. When we started talking to that community, we realized there is a real opportunity here. AI is part of our ecosystem now. We have to learn to work with it.
That doesn't mean there aren't risks. There are real concerns, and we need guidelines and guardrails. But as an educational institution, we can either ignore it, which is what a lot of institutions are doing, or we can prepare our students to be successful within this new environment and teach our faculty to use it effectively. Because AI is going to be a life skill. Not just a career skill. A life skill.
Rubin: If you just give a written assignment, "write me a paper on this aspect of World War II," you are begging for people to use AI to answer it. I don't mean that as a criticism. We have to show faculty how to change assessment in the age of AI. That's where our educational series comes in, and I've done it in my own classes. It works.
We also created Communities of Practice: open forums where any faculty or staff can come talk about cheating, mental health, sustainability, or whatever concerns them about AI. We have psychologists, education researchers, and technologists. Their charge is to help us create guidance and inform the community.
One of our biggest detractors, someone who was genuinely worried that this would ruin education, recently sent us a note saying she's excited about the future. I think it's because she was heard. She realized this is as much about educating her as it is about educating the students. People are so focused on the students, they're forgetting: we've got to educate the educators.
Cheating is not new. It's been around as long as education. I don't think we should teach to the group that wants to cheat. We should teach to the students who want to learn. And for those students, AI means they can learn even more, faster. One of our budget planning officers told my team that what used to take him hours, he now does in minutes with Claude. My colleague asked him if he had gone home early. He said no. He got more work done.
Rubin: Claude Code is the number one request we get from our community every single day. We just turned it on for our IT staff as a pilot, and the uptake has been immediate. We're working with Anthropic on a consumption-based licensing model so the entire university can use whatever Anthropic releases, without having to renegotiate each time something new comes out.
And it's not just developers using it. One of our classroom development people, who doesn't know Python, shut his conference room door and started a Claude Code session to prototype control system software. It had been one week since we turned it on. That's the kind of thing that makes you think: ‘Okay, this is real.’
When I grew up in the late '70s and '80s, I had an encyclopedia. That was it. That was the boundary of what you could know about an aardvark or anything else. Then you got Google, Wikipedia, and now AI. In my household, my boys are 20 and 18, and I don't allow the words "I don't know" anymore. There's no reason not to know. You have access to so much more than any previous generation ever had, and I think that's an incredible thing.

Anthropic's agentic coding tool. Claude Code understands your codebase, edits files, runs commands, and helps you ship faster.