Almost one year after an alarming cyberattack on Stanford's information systems, the university is trying to balance stronger security against potential intrusions on student, faculty and staff privacy.
Among the challenges is the concern that enhanced security technology could be misused—by accident or against university policy—to compromise confidential information of either a personal or professional nature. A faculty committee, led by genetics professor Andrew Fire, will pursue solutions in partnership with the administration.
Underlying all the university's actions, says Randy Livingston, vice president for business affairs, is an increased investment in Stanford's information security team and improved technology. The July 2013 intrusion had a lasting effect.
"What I can say is that this was a sophisticated attack," Livingston notes. "We believe it originated from a quasi-government entity in Asia, or at least it had all the appearances of that. We don't have any definitive proof."
As skillful as the incursion was, says Livingston, Stanford apparently escaped drastic damage. "The attack was very serious," he explains, "in that it got really to the very core of our authentication systems, the place where we store user names and passwords for all the SUNet users. We have done a fair amount of forensics, and we have not found evidence that they accessed any of what we call prohibited information—protected health information, Social Security numbers, credit card numbers and things of that sort. But the attack shook us up, because certainly, given the hackers' ability to penetrate as deeply as they did into our systems, it gave them the ability to get to a lot of other places."
There's no evidence of any breach in the systems holding research data, despite the incessant attacks directed at Stanford. Last summer's infiltration, Livingston says, stands out as "the only attack we're aware of that has successfully penetrated through the core center of our administrative systems."
To understand the scope of Stanford's vulnerabilities, remember, too, that campus operations include more than 100 merchants who take credit cards. That's another area in which Stanford is beefing up protections.
There are multiple flashpoints in the tension between increased security and privacy, including one that's as much cultural as logistical—the prospect of requiring encryption software on personal devices, including phones. That provides a failsafe should a device with sensitive information be lost or stolen (another realm of security headaches). But that software is also a tool that could be misused by someone at Stanford to pilfer or alter private information.
What should Stanford's policies be in regard to personal devices used for university activities? "Well, by law," says Livingston, "the regulations follow the data. Whose data is it? If it's university data that is compromised, it doesn't matter whether it's compromised on a personally owned device or a third-party vendor's device or on a university-owned device.
"So we have a responsibility for that data. Having said that, it's harder to convince individuals who bring their own device that we should be allowed to put monitoring tools on them."
Amid headline-making revelations about National Security Agency spying, Livingston observes, people are warier of all data gathering. "I think the biggest difference in the university is just the degree of decentralization and the resistance to mandates," compared to businesses that require all employees to follow the same rules.
"In a university environment, you have every flavor of device and situation. It's just much harder to control."