The email is ominously titled “Final Reminder : The Importance of AI” and flagged “High Importance” when it’s just an ad.
The rest of it goes on about how they got a Wall Street bro to come give a speech about “AI replacing 40% of the jobs.”
Idk why the university likes AI this much either. They even overlook cheating as long as it involves AI.
Students and graders openly tell each other that they use it.
At first it felt weird that I was complaining about free accounts and free speeches from investors, but then it kinda clicked: these are NOT free either. My tuition is paying for all of this.
Even the bus pass has a 5-day sign-up process to “save money from students not using that service.”
But somehow they just arbitrarily gave everyone multiple premium chatbot accounts anyways.
Am I just paying a 50% ransom to Microsoft for my degree at this point?
Also, the email itself is AI-generated. IT’S FROM THE FUCKING DEAN.


What I meant by my comment is that AI is a far broader field than just LLMs. But I see so many proposals that are just a horrible waste of resources.
For example, image analysis. A friend of mine helped develop specialized tools for glacier analysis via satellite images. They trained a model specifically to analyze satellite images and track the “health” of a glacier in near real time.
Or take mathematical analysis. Some people suggest just throwing a pile of data into an LLM and letting ChatGPT make sense of it. But a far more reasonable approach would be to learn about different statistical models, learn how to use the tools (e.g. Python), and build a verifiable, explainable solution.
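Just to sketch what I mean by verifiable and explainable (the dataset and the choice of a plain least-squares fit are made up by me, purely for illustration):

```python
import numpy as np
from scipy import stats

# Toy question: does ad spend predict monthly revenue?
# (made-up numbers, only here to show the workflow)
ad_spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])     # k$
revenue  = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3, 13.9, 16.2])  # k$

# Pick an explicit statistical model (here: simple linear regression)
fit = stats.linregress(ad_spend, revenue)

# Every number below is inspectable and reproducible
print(f"slope     = {fit.slope:.3f}")      # effect size
print(f"intercept = {fit.intercept:.3f}")
print(f"r^2       = {fit.rvalue**2:.3f}")  # goodness of fit
print(f"p-value   = {fit.pvalue:.3g}")     # significance of the slope
```

A dozen lines, no API bill, and you can explain every number to the person who has to sign off on the analysis. Try getting that out of “just ask ChatGPT.”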
I work in networking and InfoSec, and all the vendors are trying to bolt AI chatbots onto their firewalls and wifi access points. But many of the actual challenges are just anomaly detection, or matching sequences of events against known bad patterns. And guess what none of those tools are: LLMs. (Except maybe for spam filtering, that’s where an LLM might be a good fit. But we don’t need a huge, expensive model for that.)
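To be concrete about the anomaly detection part, this is roughly the level of tooling I mean, a plain z-score check over a traffic counter (the numbers and the 4-sigma threshold are invented for illustration, not from any real product):

```python
import numpy as np

# Made-up example: connections per minute from a firewall log.
# First array is a window of known-good traffic, second is live data.
baseline = np.array([52, 48, 55, 49, 51, 47, 53, 50, 49, 51])
live     = np.array([50, 48, 310, 52, 49])

mean = baseline.mean()
std = baseline.std()

# Flag anything more than 4 standard deviations away from the baseline
z_scores = (live - mean) / std
for minute, (count, z) in enumerate(zip(live, z_scores)):
    if abs(z) > 4:
        print(f"minute {minute}: {count} connections/min looks anomalous (z={z:.1f})")
```

No GPU, no tokens, and when it fires you can say exactly why. Real products use fancier statistics than this, but the point stands: none of it needs an LLM.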