That Pesky Little Detail: Data in an AI World

Bill Balint • February 27, 2025

A little cottage industry seems to arise at the conclusion of each decade, joyously cataloging the long-since-forgotten tech items of the previous 10 years that were supposed to change the world but turned out to be miserable failures instead.


While we are only at the midpoint of the 2020s, it is safe to say AI will not join Google Glass, 3D television and the many other mainstays of the 2010s lists of IT infamy.


Higher education quickly recognized both the potential positives and negatives of AI as applied to the teaching, learning and academic research space (think plagiarism on one hand against the prospect of personalized learning on the other). Underscoring this is the groundbreaking recent announcement that the California State University System intends to become the nation’s “first and largest AI-empowered university system” (https://www.calstate.edu/csu-system/news/Pages/CSU-AI-Powered-Initiative.aspx).


However, AI adoption for administrative tasks – providing desperately needed help as struggling institutions look to lower costs, attract and retain more students, and obtain external support via fundraising, grants, etc. – has been a little more deliberate.


But this is changing fast, as it seems every higher education information system vendor is now flexing its AI muscles – or at least the sales and marketing teams are doing so. Phrases like ‘Throw your CRMs into the trash bin because mine innovates using AI’ or ‘I’ll see your legacy registration system and raise you a machine learning course schedule wizard’ are lurking in that sea of PR if you read between the lines hard enough.


The fear of missing the AI train must be balanced against higher education’s cybersecurity and data privacy risks, because AI requires data, and that is where things get complicated.


Higher education is always among the most vulnerable industries because its data is so valuable to cyber attackers and because it is considered an easy target. No other industry combines such constant user churn, so many inexperienced and casual users, such a plethora of personal devices, and an overriding culture of openness. Couple that with IT budgets and staffing often facing unprecedented challenges, and the mix attracts bad actors from across the globe. Increasing AI usage will likely bring even more frequent and more sophisticated attacks.


Adding to the complexity are the shadow systems housing sensitive or confidential data that have lurked in higher education for some 40 years. Relevant examples include a power user downloading student fiscal data onto a personal hard drive, a researcher storing sensitive data locally, and an office deploying an information system the IT department does not even know exists.
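
To make the shadow-data problem more concrete, here is a minimal, hypothetical sketch of the kind of discovery scan an IT team might run against a shared drive to flag files that merely look like they contain Social Security or payment card numbers. The scan root, file types, and patterns are illustrative assumptions, not a prescribed approach; a production effort would rely on a vetted data loss prevention (DLP) tool rather than an ad hoc script.

```python
import re
from pathlib import Path

# Hypothetical scan root and file types; adjust for the environment in question.
SCAN_ROOT = Path("/shared/departments")
TEXT_SUFFIXES = {".csv", ".txt", ".log"}

# Rough patterns for SSN-like and 16-digit card-like strings (illustrative only;
# real DLP tools add validation such as Luhn checks to cut false positives).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan_for_sensitive_data(root: Path) -> list[tuple[Path, str]]:
    """Return (file, pattern_name) pairs for files that look like shadow data."""
    findings = []
    for path in root.rglob("*"):
        if not path.is_file() or path.suffix.lower() not in TEXT_SUFFIXES:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than fail the whole scan
        for name, pattern in PATTERNS.items():
            if pattern.search(text):
                findings.append((path, name))
    return findings

if __name__ == "__main__":
    for path, kind in scan_for_sensitive_data(SCAN_ROOT):
        print(f"Possible {kind} data: {path}")
```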


Consider the dark possibilities if a user innocently exposes such data to a GenAI model.



This all means answers to traditional questions like ‘Where is the data actually stored and what security measures exist for that data both at rest and in transit?’ and ‘How robust are the tools restricting data access?’ deserve more scrutiny than ever.
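
As one illustration of what that scrutiny can look like in practice, the sketch below assumes an institution with data in AWS S3, accessed via the boto3 library with credentials already configured, and checks two of the basics: default encryption at rest and public-access blocking. It is a partial, assumed example only; encryption in transit, identity and access management reviews, and vendor-hosted systems all require their own checks.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def audit_bucket(bucket: str) -> dict:
    """Check default encryption at rest and public-access blocking for one S3 bucket."""
    report = {"bucket": bucket, "encrypted_at_rest": False, "public_access_blocked": False}

    # Encryption at rest: this call raises ClientError if no default encryption is configured.
    try:
        s3.get_bucket_encryption(Bucket=bucket)
        report["encrypted_at_rest"] = True
    except ClientError:
        pass

    # Access restriction: verify all four public-access-block settings are enabled.
    try:
        config = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
        report["public_access_blocked"] = all(config.values())
    except ClientError:
        pass

    return report

if __name__ == "__main__":
    for b in s3.list_buckets()["Buckets"]:
        print(audit_bucket(b["Name"]))
```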


Perhaps more important is the question, ‘Does my executive who heard the AI hype at a conference last week and is now eager to buy an AI-infused product fully grasp the potential risk?’ At one time, it may have taken a concerning cybersecurity audit finding to catch the attention of the institution’s board or cabinet. Those times are gone, and executive recognition of AI risk up front is critical.


Executive leadership should prioritize the creation of practical, common-sense policies governing AI usage. Tactical and operational leadership needs to be empowered to keep those policies up to date and to make key decisions on tools and techniques to help keep data safe. They can then build appropriate procedures, guidelines, standards, FAQs, and best practices so users can effectively work in an emerging AI world.
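
As a rough, hypothetical illustration of how such policies can be turned into operational guardrails, the sketch below encodes a placeholder allowlist mapping AI tools to the most sensitive data classification each is approved to handle, then checks a proposed use against it. The tool names, classifications, and rules are invented for the example; real entries would come from the institution’s own governance and contracts.

```python
from dataclasses import dataclass

# Hypothetical data classifications, ordered from least to most sensitive.
CLASSIFICATIONS = ["public", "internal", "confidential", "restricted"]

# Placeholder policy: the most sensitive classification each tool is approved to handle.
APPROVED_AI_TOOLS = {
    "campus-hosted-llm": "confidential",
    "vendor-chatbot": "internal",
    "public-genai-service": "public",
}

@dataclass
class AIUseRequest:
    tool: str
    data_classification: str

def is_use_permitted(request: AIUseRequest) -> bool:
    """Allow the request only if the tool is approved for data at least that sensitive."""
    ceiling = APPROVED_AI_TOOLS.get(request.tool)
    if ceiling is None:
        return False  # unapproved tool: deny by default
    return CLASSIFICATIONS.index(request.data_classification) <= CLASSIFICATIONS.index(ceiling)

# Example: sending confidential advising notes to a public GenAI service is denied.
print(is_use_permitted(AIUseRequest("public-genai-service", "confidential")))  # False
print(is_use_permitted(AIUseRequest("campus-hosted-llm", "internal")))         # True
```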


Bill Balint is the owner of Haven Hill Services LLC, contracted as TriVigil’s Advisory CIO for Education.

By Mark McGinnis, Chief Evangelist, TriVigil • October 7, 2025
For the last two years, as Cybersecurity Awareness Month returns, I find myself thinking less about firewalls and frameworks and more about people. Technology evolves. Threats evolve faster. But the heart of cybersecurity has always been human. The quiet decisions made every day by educators, administrators, and students determine whether our institutions remain safe or become headlines. And in education, where purpose runs deeper than profit, the stakes feel different.

The New Reality of Risk in Education

Over the past decade, education has transformed. Hybrid learning, connected devices, digital testing, and research collaboration have all expanded what it means to “protect the classroom.” But with that progress has come complexity, and complexity invites risk.

Many schools and universities are now operating with sprawling technology ecosystems managed by small, overstretched teams. These professionals are trying to keep up with relentless change while defending systems that were never designed for today’s threat landscape. I’ve seen firsthand how easily a single vulnerability can cascade into real-world consequences: lost data, canceled classes, disrupted operations, and shaken trust. It’s never just a technical problem, it’s a human one.

Awareness Is Not a Checkbox

Every October, inboxes fill with reminders about cybersecurity awareness training. But genuine awareness does not come from compliance modules or quiz completions. It comes from culture. It begins when people feel ownership. When they understand why it matters, not just what to do.

A district I worked with recently lost its long-time IT director unexpectedly. When the dust settled, leadership realized how much institutional knowledge had lived in one person’s head. It was not about negligence; it was about unseen vulnerability. That moment reminded me that awareness is not about assigning blame. It is about creating clarity. It is the point when leaders say, "We do not have to know everything, but we need to know where we stand."

The Leadership Moment

Cybersecurity has become a leadership issue, not just an IT issue. It is about creating space for uncomfortable conversations about risk, capacity, funding, and accountability. It is about understanding that every decision, from procurement to password policy, reflects values as much as priorities.

The most secure campuses I have seen are not those with the most tools. They are the ones where people talk to each other. Where technology teams, faculty, and administrators work from a place of shared responsibility instead of silos and assumptions. That is not a technical investment. It is a leadership commitment.

Awareness That Lasts Beyond October

Cybersecurity Awareness Month is a good reminder to pay attention, but awareness can’t be seasonal. The real challenge is how we sustain it through the rest of the year: how we build systems and cultures that make security second nature, not second thought.

For leaders in education, that means showing vulnerability. Admitting what we don’t know. Asking for help when we need it. Encouraging the same openness in our teams. It also means balancing mission and protection, ensuring that the drive to connect, innovate, and share knowledge never compromises the safety of those we serve.

Closing Thought

Cybersecurity is not about locking down learning. It is about preserving it. In every district, college, and university I have worked with, I see the same quiet determination: to keep moving forward despite the noise, the fatigue, and the fear.
And that gives me hope. Because awareness is not built by rules or reminders. It is built by leaders who care enough to keep asking hard questions. As we navigate another Cybersecurity Awareness Month, that is where I choose to focus. Not on the threats that surround us, but on the responsibility that unites us.
By Bill Balint • June 10, 2025
Higher Education IT professionals must be committed to taking care of others. After all, great IT organizations were never in the business of looking after computing but were always in the business of customer service. It is not about bits, bytes, clouds, anti-virus, border firewalls or even processing credit card payments online. The best IT organizations make it all about people.

But we higher ed IT people find ourselves in the middle of a disrupted industry, and this disruption is not going away. In this case, it is not the disruption of GenAI or data breaches run wild. Instead, it is about survival.

The tragic Spring 2025 story of Limestone University in Gaffney, S.C. is yet another in a growing list of institutions no longer able to weather the ominous reality. Founded in 1845, 16 years before the Civil War erupted in Limestone’s home state, Limestone overcame every challenge of a small private institution for some 180 years. That is, until April 29, when Limestone’s governing board officially announced its immediate closure. The announcement came after Limestone lost some 50 percent of its enrollment in the past decade, from about 3,200 students to 1,600. A large percentage of these were student athletes, as the institution fielded 23 teams at the NCAA Division II level.

The closure story is repeated often enough nationally that it sadly runs the risk of no longer being newsworthy. According to federal data provided to The Hechinger Report (https://hechingerreport.org/tracking-college-closures/), 28 higher education institutions closed in the first nine months of 2024 alone.

What does this have to do with IT departments? Everything.

From an IT perspective, many institutions rely on online learning, video conferencing, worker collaboration suites, CRMs, SaaS ERPs and SISs, and comprehensive cybersecurity tools at levels that could not have even been dreamed about in the pre-COVID world. That is not even addressing the emerging AI world, coupled with unfunded mandates from increasingly complex IT compliance requirements.

More and more money is needed to attract and retain fewer and fewer potential students at many institutions, and that IT budget may look like fertile ground. Not surprisingly, some view IT as a liability – like a very expensive utility bill – as higher education muddles through this dark time. Perhaps a necessary evil, but one that needs to operate as cheaply as possible. True enough, IT brings significant expense, and it generates very little direct revenue in most cases. The Good Ole’ Days of IT being directed to “do more with less” are being replaced with “we can do IT without you.”

All of which leads back to the higher education IT professional and the mental health impact of this disruption, which really dates to the 2008 recession, when budgets and staffing levels took a negative turn from which some departments never recovered. Cybersecurity and data privacy professionals are arguably facing the highest stress levels in the organization. The Information Systems Audit and Control Association’s (ISACA) 2024 State of Cybersecurity survey report notes that 66 percent of cybersecurity staff believe their role is more stressful than it was five years ago (https://www.isaca.org/about-us/newsroom/press-releases/2024/nearly-two-thirds-of-cybersecurity-pros-say-job-stress-is-growing-according-to-new-isaca-research).
Though its focus is on the higher education ecosystem in general, the 2025 EDUCAUSE Horizon Action Plan: Mental Health Supports (https://library.educause.edu/-/media/files/library/2025/1/2025horizonactionplanmentalhealth.pdf) offers practical, common-sense and sustainable tips to help the IT professional, their team, the IT organization, and beyond.

Like most things in an IT organization, leadership – or the lack thereof – is a key difference maker. A subtle move by a leader to prioritize staff mental health alongside the department’s larger goals of professional development, productivity gains or continuous improvement will make all of those goals easier to achieve. It is well established that mental health wellness leads to less workplace tension, better employee retention, and less time missed due to illness. But it is also simply the right thing to do, because the disruption is affecting IT employees like never before, and it seems the disruption is here to stay.

Bill Balint is the owner of Haven Hill Services LLC, contracted as TriVigil’s Advisory CIO for Education.
By Bill Balint • April 30, 2025
In a higher education world where cybersecurity, data protection and data privacy activities are bathed in multiple regulations, policies, procedures, standards and all the rest, what happens when victims claim “compliance is just not good enough”? The answer can be quite costly.

The March 2025 data breach incident at the Yale New Haven Health System (https://www.ynhhs.org/legal-notices) could potentially be such a case. Yale New Haven Health reported the breach to the public on March 11, 2025, and a pair of 52-page federal lawsuits were filed on behalf of victims just over a month later. There are reports that as many as six additional suits were filed in the following days. A variety of law firms have created web pages where victims can seek legal engagement, so the number of suits could potentially increase.

It does not appear Yale New Haven Health is being accused of specifically failing to meet a given governmental regulation – such as HIPAA, PCI, GLBA, or a state breach notification law. The fact that a generic notification letter about the incident can be found at the Massachusetts Office of the Attorney General website implies at least that state’s requirements have been met. But according to the Hartford Business Journal (https://www.hartfordbusiness.com/article/yale-new-haven-health-faces-lawsuits-over-data-breach-health-system-discloses-more-details), the suit claims Yale New Haven Health did not “…properly secure and safeguard Plaintiff’s and Class Members’ sensitive personally identifiable information (PII) and personal health information (PHI), which, as a result, is now in criminal cyberthieves’ possession.”

These lawsuits understandably imply that a provider storing sensitive or confidential customer data needs to use a portion of its revenue to fund customer data protection measures. The goal should be protecting data even beyond what regulatory compliance demands.

A Big Year For Settlements

Beyond the question of governmental regulations and their relationship to lawsuits, there is no doubt higher education is suffering increased direct financial penalties resulting from data breaches. Just one example from 2025 is the mid-April $2 million settlement of the class action data breach lawsuit against St. Louis University and SSM Health Saint Louis University Hospital, stemming from a breach affecting up to 93,000 individuals (https://www.hipaajournal.com/saint-louis-university-data-breach-lawsuit-settlement/). Besides the common practice of receiving identity theft protection benefits, claimants can receive up to $2,500 in unreimbursed expenses resulting from the breach. St. Louis University and SSM Health Saint Louis University Hospital are not alone, as various similar suits are scheduled to be settled later in 2025. Large or small, public or private, no institution appears immune.

Too Early? Too Late?

Another new lawsuit is among those that confront the long-debated “time to notify the victims” issue. Michael Harris, a potential incoming student at Lee University, filed the suit against Lee in the U.S. District Court for the Eastern District of Tennessee (https://www.local3news.com/local-news/lee-university-sued-for-negligence-after-data-breach-impacts-thousands/article_ca5ecb44-8872-4692-9dd8-4ce35defe574.html). The lawsuit includes multiple complaints, among them the claim that Lee waited more than one year to notify the impacted individuals.
One could argue that notifying potential victims before all facts are known runs the risk of providing incomplete information. But waiting for an investigation to complete runs the risk of victims suffering the consequences of the breach without even knowing a breach of their information occurred.

Damage Over Dollars?

Of course, data breaches are often about a lot more than money. They hold the potential to devastate victims by inflicting non-economic temporary and sometimes even permanent damage. The recent takeover of the New York University (NYU) website by a hacker who briefly exposed NYU applicant information datasets going back to 1989 (https://nyunews.com/news/2025/04/01/nyu-data-breach-lawsuits/) serves as a reminder.

Public policy – often via regulation – tries to limit the damage by requiring those who house sensitive and confidential data to adhere to strict standards. But higher education institutions need to know that compliance with all regulations and data breach laws might not be enough. These large settlements should serve as a constant reminder.

Bill Balint is the owner of Haven Hill Services LLC, contracted as TriVigil’s Advisory CIO for Education.