“We’re trying to study where there’s a trade-off between what you’re willing to share and what you get as a utility,” says computer science professor Sharad Mehrotra. “We want to know if there’s a way to build privacy protections on a layer in between the sensors and the end user.” Jocelyn Lee / UCI

Private practices

UCI seeks to balance electronic surveillance with privacy of people being monitored

Earlier this month, technicians began installing sensors, scanners and surveillance cameras throughout Donald Bren Hall on the University of California, Irvine campus. The technology upgrades may look like management overreach to some, but in reality, they’re part of an effort to test the limits of individual privacy in our data-rich, networked world.

Donald Bren Hall

Outfitted with sensors, scanners and surveillance cameras, Donald Bren Hall is serving as a “test bed” for the project.
Chris Nugent / UCI

Researchers in UCI’s Bren School of Information & Computer Sciences plan to convert the six-story structure into a “test bed” for systems to better safeguard people’s identities and confidential data.

“It’s not clear that privacy technologies are out there at all, or to what degree they actually help us in ensuring privacy, so we’re significantly behind,” says computer science professor Sharad Mehrotra, who is spearheading the unique project.

The effort stems from a $5 million grant from the Defense Advanced Research Projects Agency (a unit of the U.S. Department of Defense) for the Brandeis program, named in honor of Louis Brandeis, an associate justice on the Supreme Court from 1916 to 1939 and a right-to-privacy pioneer.

This four-year, multi-institutional initiative will attempt to answer concerns about the status of personal information in the cloud environment, mobile computing and the rapidly expanding “Internet of Things,” a global conglomeration of connected devices that can continuously scan, sense and share data.

Mehrotra, a database and privacy expert, is being joined by computer science professor Nalini Venkatasubramanian, a middleware and “Internet of Things” system design guru; and informatics professor Alfred Kobsa, who studies user preference modeling and privacy.

Mehrotra outlines a hypothetical scenario in which people’s anonymity might be compromised through their exposure to the new digital reality: When a student enters a campus building, her image is captured by a surveillance camera. The facility’s wireless network “sniffs” the mobile phone in her backpack, and the building management system knows which office she enters, what computer she elects to use and whether that room’s lights are on.

With this level of electronic monitoring potentially happening in our workplaces, Mehrotra wants to develop novel technologies that enable building occupants to state their privacy preferences.

“You can say, ‘I don’t want myself tracked at all,’” he explains. “You and the building will negotiate with one another about which sensors will capture data about you, consistent with your wishes but also with the policy of the building.”

But facilities managers may limit access to certain spaces or other services if occupants opt out of monitoring. “We’re trying to study where there’s a trade-off between what you’re willing to share and what you get as a utility,” Mehrotra says. “We want to know if there’s a way to build privacy protections on a layer in between the sensors and the end user.”

Methods by which humans can interact with smart building systems in a way that protects their privacy have been largely overlooked until now, the researchers say.

“Privacy has always been retrofit rather than a primary design consideration,” Mehrotra says. “It’s not just with the ‘Internet of Things.’ We’re behind the curve in mobile computing and almost everything. In security, usually you build a system and then wait for someone to break it; then you build a patch.”

After the initial phase of installing sensors throughout Bren Hall – with help from Raj Rajagopalan, a building management systems professional at Honeywell – the group will develop a host of applications for monitoring energy usage and human presence in workplaces. Automated power-saving programs depend on such data.

The challenge, according to Mehrotra, is to determine how and where to embed privacy technologies in the system up front so that identifying details are secure. One of the keys is a concept called differential privacy.

“For a lot of the services – controlling the temperature in a certain zone, for example – we don’t necessarily need to know individual information. Aggregate information is probably good enough,” he says. “You’re hiding in a crowd, just one person in a set of people.”
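The "hiding in a crowd" idea Mehrotra describes is the intuition behind differential privacy: a building system can publish an occupancy count with a small amount of random noise added, so that no single person's presence or absence can be inferred from the answer. As a minimal sketch (not part of the UCI project's actual codebase), the standard Laplace mechanism for a counting query looks like this:

```python
import math
import random

def noisy_count(true_count: int, epsilon: float) -> float:
    """Laplace mechanism for a counting query.

    A count has sensitivity 1 (one person entering or leaving a zone
    changes it by at most 1), so adding Laplace(0, 1/epsilon) noise
    makes the released value epsilon-differentially private.
    """
    scale = 1.0 / epsilon
    # Inverse-CDF sampling of the Laplace distribution:
    # u is uniform on (-0.5, 0.5); X = -scale * sgn(u) * ln(1 - 2|u|).
    u = random.random() - 0.5
    noise = -scale * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_count + noise
```

A smaller `epsilon` means more noise and stronger privacy; a temperature-control service that only needs a rough zone occupancy can tolerate a fairly noisy answer, which is exactly the trade-off between utility and disclosure the project aims to study.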

Other smart building utilities require more identifying information to function. Mehrotra points to a sophisticated people-finder tool that allows someone you’ve made an appointment with to find you if you’re not at your desk. Another would be context-sensitive messaging, letting you send a note to a colleague that only gets delivered when the recipient is at his or her computer.

Both of these applications depend on the building management system knowing individuals’ precise whereabouts at a given time. Mehrotra says his team’s objective is to deliver these services while asking users for the least amount of information necessary.
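One way to apply the least-information principle to such services (a hypothetical sketch, not the team's actual design) is to answer location queries only at the coarsest granularity the requesting service needs, assuming locations form a hierarchy of building, floor, zone, and room. All names below are invented for illustration:

```python
# Hypothetical in-memory location store: person -> (building, floor, zone, room).
LOCATION = {"alice": ("bren-hall", 4, "zone-b", "room-4062")}

# How many levels of the hierarchy each granularity reveals.
GRANULARITY = {"building": 1, "floor": 2, "zone": 3, "room": 4}

def locate(person: str, needed: str) -> tuple:
    """Return only the prefix of the location path that the caller needs.

    A people-finder might request "room", while a zone-level HVAC
    service would only ever be granted "zone" or coarser.
    """
    return LOCATION[person][: GRANULARITY[needed]]
```

Here `locate("alice", "floor")` reveals the building and floor but withholds the zone and room, so each application learns no more than its function requires.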

Why would a federal agency known for developing advanced national security technologies be concerned about personal privacy on the Internet? Mehrotra says his group’s research ultimately will have a direct bearing on how information sharing happens in the armed forces.

“They’re interested in privacy technologies because they know they cannot fight a war with a firewall around their data,” he says. “There has to be information sharing as a way to foster trust between partners.”
