Privacy practices: The challenge of safeguarding digital data
Privacy once meant drawing the drapes.
Now that we depend on technology to do the world's business, privacy means securing data, protecting personal information and keeping hackers at bay. Drawing the drapes in an electronic sense will call for a complex system of safeguards and require policymakers to create guidelines.
Before leaving for her new post as Secretary of Homeland Security, Arizona Governor Janet Napolitano signed a proclamation declaring January 28 Data Privacy Day, observed across the U.S., Canada and 27 European nations. The W. P. Carey School of Business celebrated the day with a symposium for privacy leaders in the public, private and academic arenas. The event was hosted by the Center for Advancing Business Through Information Technology (CABIT) and Intel.
Moderator and center Director Julie Smith David challenged speakers and audience members to identify the hurdles impeding privacy assurance and to suggest solutions for security breaches that plague information systems from computers to smart phones.
Trust is job one
Don Whiteside, vice president of legal and corporate affairs at Intel, delivered a clear message: Manufacturers of computers and computer components need to deliver more than faster machines. Privacy protection needs to be built into every product.
"We look at privacy at Intel as part of what we call a 'continuum of trust,'" Whiteside said. "If we're going to be successful as a company, we have to build trust in technology. We need users, whether they're consumers or business users, that trust in the technology that they purchase."
Every microprocessor comprises "an amazing amount of microcode," Whiteside said. "How do we start thinking about protecting data sources? At the top is integrity. The systems we build can't be misappropriated and used for malicious things."
Innovation occurs either in small increments or in landmark inventions that appear once or twice a century. Railroads and electricity were transformative, and so is digital technology. Just as electricity's effects were far-reaching, nearly every aspect of life today is digitized in some way and tied to the Internet.
"When you think about it," Whiteside said, "how you educate, how you engage with the government, how you learn, how you play, how you socialize -- everything is being touched by technology." That includes not just sitting down at a computer but also using a smart phone, conducting online banking, participating in social networks. "There is a tremendous empowerment through that, but there is a tremendous risk to privacy," he said.
Ensuring data privacy is a responsibility shared by governments, industry and non-governmental organizations, or NGOs.
"We need to know what the rules are, how the rules are going to be enforced and what is acceptable practice in the marketplace relative to privacy, data sharing, data use and data storage," Whiteside said. "We need governments who are willing to articulate those rules and enforce them."
One of the biggest challenges that companies such as Intel face is the market: global but segmented, because the rules and regulations regarding privacy and conformity assessments vary country by country.
Too many players
Intel and other manufacturers are only a few of the players in the technology ecosystem they are trying to make worthy of trust. In this ecosystem, other companies write the operating systems; third parties can steal data or share information; and computer users are exposed to Trojan horses, phishing and other attacks.
"When you look at this ecosystem, it only takes one rogue actor to affect data privacy," Whiteside said.
Programs exist to battle such forces, but Whiteside called these "Band-Aids." Trust and integrity have to be built into computer systems, he said. "We have to look at it architecturally."
Whiteside said that the challenges are numerous. Hackers stay one step ahead of innovation. The cost of building trust into technology is high. Consumers haven't been trained to demand trust over performance. How much are they willing to pay?
"It will take years for us to collectively introduce platforms and operating systems and services and applications that integrally elevate overall data privacy," Whiteside warned. "It is not a problem that's going to get solved overnight."
Arizona attacks the problem
At the state government level, privacy concerns are the responsibility of the Government Information Technology Agency.
GITA provides information technology standards for about 95 executive agencies, said Mary Beth Joublanc, Arizona's Chief Privacy Officer. GITA is a cabinet-level agency that reports to the governor's office and works with the legislature. It was created by statute in 1996. GITA later spawned the Statewide Information Security & Privacy Office (SISPO), which is responsible for strategic planning, facilitation and coordination of information technology security in Arizona.
One daunting task is transitioning from paper to digital documents, establishing guidelines for the protection of each, and dealing with breaches of security, Joublanc said.
"How do we shift the culture to thinking about identifying risks, identifying incidents, and reporting them -- not being afraid to report them because it's a very political environment?" Joublanc said.
She looks at the life cycle of a document. Perhaps it arrives in paper form. The questions are: Should it be digitized? Should it stay in hard copy? How long should it be retained?
Paper or plastic?
Systems are not yet consistent. Archiving standards for paper are different from standards applied to digital material. Breach notification laws address only electronic information, while data destruction laws address only paper.
"We are on the threshold of starting to develop policy, look at the big picture and not be so technology-centric … Our data breach policy probably needs to include losses, unauthorized access of media like hard copy, photographs, X-rays, in addition to electronic information," Joublanc said.
Another challenge is reconciling privacy with public access. Government has a mandate to be transparent, but does transparency apply to individual citizens? How can public access be guaranteed without compromising an individual's right to have personal information protected?
The government of Arizona, she noted, probably has the largest database in the state. To manage that wealth of information in a secure and responsible way, GITA will reach out to Arizona State University and other universities, as well as Native American communities and such companies as Intel for consensus on how to bring privacy to the forefront.
"It's something that needs to be built into the business process cycle," Joublanc said.
How research can help
Information systems Professor Paul Steinbart believes that research can advance the discussion and facilitate solutions.
"From an academic's point of view, looking at the issue of privacy, there's a whole host of different perspectives we could be taking," Steinbart said. "Obviously the first thing that probably comes to a lot of people's minds is the customer. What are their attitudes about privacy? What are their behaviors?"
But a significant number of data breaches compromise the personal information of a company or organization's own employees. What effect does that have on morale and on employees' willingness to keep complying with privacy policies, Steinbart asked. What is the impact on the stock market?
Third parties involved
Sometimes control over privacy is not autonomous. Companies have business partners, for instance.
"You can't have tunnel vision and say. 'I'm protecting your information, whether it be hard copy or whether it be electronic, but I've got all these arrangements with other partners with whom I'm doing outsourcing,'" Steinbart said. "Therefore, it's important to ask 'What are they doing to protect the privacy of the data that I'm sharing with them?'"
Organizations should ask why information is being collected in the first place, what is being done with it and what benefits are being gained, he said.
"How can we mine this personal information in a way that gives, say, legitimate marketing insights, and at the same time protects privacy and confidentiality?" Steinbart asked.
Analytical modeling
Academics are using analytical modeling to make reasonable assumptions about individual and group behavior and then develop predictions about how people would or should behave.
For instance, what are the repercussions for business and society if people don't want to be profiled and avoid being identified by marketers? Some consumers take steps to hide themselves. The Do Not Call list and Caller ID are examples of ways consumers deflect unwanted contact. How does that impact marketing campaigns?
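As a minimal sketch of what such analytical modeling can look like at its simplest, the toy calculation below assumes an opt-out rate and a response rate and estimates how much a campaign's expected yield shrinks when consumers shield themselves. The function and every number in it are hypothetical, not findings reported at the symposium.

    def expected_responses(population: int, opt_out_rate: float, response_rate: float) -> float:
        """Toy model: expected campaign responses when some fraction of
        consumers opts out of being contacted or profiled."""
        reachable = population * (1.0 - opt_out_rate)
        return reachable * response_rate

    # Made-up figures: a 30 percent opt-out rate cuts expected responses by 30 percent.
    print(expected_responses(100_000, opt_out_rate=0.00, response_rate=0.02))  # 2000.0
    print(expected_responses(100_000, opt_out_rate=0.30, response_rate=0.02))  # 1400.0

Richer versions of such models add assumptions about which consumers opt out and how firms respond, which is where the predictions about behavior come from.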
Some researchers are conducting field surveys to gauge privacy attitudes through consumer self-reporting. That can be tricky, Steinbart said, because actual behavior can belie assertions. People will claim that they value their privacy but will often reveal personal information on Web sites and engage in other unsafe practices on the Internet.
"The thing that's interesting is that privacy is in the eye of the beholder," Steinbart said. He recalled hearing a Canadian privacy commissioner discussing a couple who installed a webcam in their apartment allowing the world to electronically peek through their curtains. The same couple recoiled at sharing personal financial information.
Such contradictory attitudes make it difficult to reach broad generalizations on privacy that everyone can agree on, he said.
How companies establish trust
Other research has examined whether disclosure of a company's privacy policies encourages consumers to share more information. Some results show that to be true.
"If people understand what you're doing, what you're collecting and how you're using the information, they're more willing to share that information than if you don't," Steinbart said.
In the future, researchers are keen to dig deeper and mine richer data, he said, but that requires access to information and partnerships with government, for-profit organizations and non-profit organizations.
"There's a lot to do and a lot of places to go," Steinbart said.
Input from the community
In a lively discussion, symposium participants raised their concerns about privacy as it affects their own lives.
James M. Dzierzanowski, manager of awareness and training for SISPO, called for more training in the workplace.
Such training could include cautions about securing data when employees take company documents home on laptops or thumb drives, or have work e-mail sent to personal e-mail accounts, said Steven C. Szczepanski, IT manager for Electro-Optical Systems of Tempe.
Several people expressed concern that computer safeguards and government protection policies could foster a false sense of security.
As a parent, Whiteside expressed concern about new technology that prompts young people to share personal information. Steinbart said privacy should be protected automatically and anonymity should be the default mode.
Another parent, Stacy Norton, vice president and quality assurance manager for Wells Fargo, wondered whether social networking is desensitizing a generation of young people to privacy concerns. With so much intimate information being shared, will they define privacy in a different way when they grow up and assume leadership roles?
Bottom Line:
- Privacy needs to be defined and regulations standardized.
- Protecting privacy is a shared responsibility of government, industry and users.
- Trust and anonymity should be built into computers and other communications devices. But safety can never be guaranteed.
- Consumers might have to pay more for privacy and trade some performance for security.
- Government has a duty to protect citizens' information whether in paper or digital form.
- Privacy needs to be reconciled with the public's right to know.
- Researchers can partner with business and policymakers to determine how the public will best respond to calls for privacy, understanding that different communities define privacy in a different way.
- More training and education could be done in the workplace and for young people engaging in social networking.