The Daily Gamecock

Column: USC paid $1.5 million for a tool that can spy on students

The university paid $1.5 million for a "first in the state" campus-wide ChatGPT license in June. The catch? The version delivered to students is intentionally feature-crippled and, by almost every objective measure, less useful than the free public edition. In effect, the university paid a premium to downgrade our educational technology.

University spokesman Collyn Taylor confirmed in a statement that students will have access to ChatGPT 5, currently the default model for the education version. Students will also have access to the GPT-4o model, according to Taylor.

When students log on to USC's ChatGPT, they will find that virtually all of its useful features have seemingly been removed: no reasoning models, no image generation, no custom or tailored GPTs, no advanced data analysis. To be truly fair to USC, it did so kindly offer us two of the dumber models, web search and Canvas, a Grammarly-style editing suite. We really ought to thank the university for this heaven-sent gift to the students, shouldn't we?

By contrast, the free public version of ChatGPT offers Deep Research, Code Interpreter, image generation and a multitude of other features that have been disabled in USC's paid version. By stripping these basic capabilities, the university has reduced a powerful educational asset to a tool with little more utility than a sophisticated grammar and style editor.

More concerning still, according to OpenAI itself, all education- and enterprise-grade ChatGPT accounts allow system administrators to view users' interactions with the chatbot.

According to a statement from Taylor, “USC won’t monitor specific queries people put into ChatGPT.” Still, the technical capability for such monitoring exists; the open question is in what scenarios the university would choose to use it.

That means the university could, in certain cases, view sensitive data about students. This is all the more concerning given that the university has not developed privacy guidelines for AI usage.

While USC's academic integrity page offers current AI guidelines, it says nothing about privacy. Of course, the university offers a litany of options that all students will surely find endearing: maybe a sample syllabus? If that isn't quite your cup of tea, then perhaps a citation guide for AI, details on how AI is integrated into the honor code or even instructions on document version control will be right up your alley. They seem to have something for everyone there!

In an era of heightened awareness around digital privacy, deploying a platform that can be monitored without a clear, publicly accessible AI policy is a huge overstep. This arrangement would have students use a system whose academic and personal inquiries can be surveilled, without transparent guidelines defining the scope and purpose of such monitoring.

The decision is further complicated by the university's existing technological ecosystem. USC operates primarily on Microsoft platforms, yet a more logical and powerful solution, Microsoft Copilot, was seemingly overlooked.

According to a statement from Taylor, "USC is still partnering with Microsoft Copilot to give students, faculty and staff the ability to purchase those licenses. ChatGPT Edu, however, is provided at no cost to students."

The Division of Information Technology's steering committee ran a comprehensive pilot program on Microsoft Copilot, with positive results. The committee's findings, developed in early 2025, documented significant productivity gains, with over 75% of participants reporting time savings of one to five hours per week.

The report concluded that Copilot was a "secure solution for improving productivity" and explicitly recommended that university units evaluate investing in licenses. 


Meeting minutes from May 6, 2025, explain why the university disregarded that recommendation. In the meeting, Vice President for Information Technology and CIO Brice Bible revealed that a Microsoft Copilot pilot program tripled storage use. Upgrading the university's on-premises storage, which is already "at max capacity and out of date," would cost an additional $300,000 per year.

This shows that the university is willing to spend $1.5 million per year on a less functional, non-integrated and privacy-compromised alternative rather than address its bigger underlying issue: seemingly underfunded digital technology.

In a statement from June, Bible said the "campus-wide adoption of secure enterprise AI technology puts USC on the leading edge," and will "make our students more employable."

The decision to adopt AI, though well-intentioned, was poorly executed. The final implementation not only fails to solve the underlying issues, but it also gives students a worse version of AI than what's freely available and offers no real solution to the privacy problem.

This is a slap in the face. The university looked at a superior, well-studied solution and chose instead to squander a fortune on this Trojan horse — a system seemingly designed to compromise student privacy while delivering inferior results.

Why? Seemingly not for any discernible educational benefit, but to win a meaningless PR race. Our money was wasted on a tool we can barely use. To me, someone at the top has made it clear: their ego is worth more than our education.

If you are interested in commenting on this article, please send a letter to the editor at sagcked@mailbox.sc.edu. 

