Privacy, cloud and big data
I agree that "It gets really tricky because DNA data is so sensitive" and that the hard part is to "alleviate customers' privacy concerns".
Many organizations are looking to the cloud and to outsourcing for massive data processing, but international privacy laws are becoming stricter, and organizations are searching for effective ways to comply with these new regulations. Europe and the US lead with some of the most stringent privacy laws.
I studied one interesting project that addressed the challenge of protecting sensitive information about individuals in a way that could satisfy European Cross Border Data Security requirements. The project covered incoming source data from various European banking entities, as well as existing data within those systems, which would be consolidated in one European country. It achieved targeted compliance with EU Cross Border Data Security laws, including Austria's Datenschutzgesetz 2000 (DSG 2000) and Germany's Bundesdatenschutzgesetz, by using a data tokenization approach.
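Tokenization, in this context, replaces a sensitive field value with a non-sensitive surrogate token, while the real value stays in a protected vault that can remain inside the country of origin; only the tokens cross borders. As a loose conceptual sketch of the idea (this is not Protegrity's implementation, and the class and field names are illustrative only):

```python
import secrets

class TokenVault:
    """Minimal illustrative token vault (a conceptual sketch, not a
    production design). Sensitive values are swapped for random tokens;
    the real data stays in the vault."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token for a repeated value so that joins
        # and lookups on the tokenized data still work.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random surrogate, no math relation to the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems authorized to reach the vault can reverse a token.
        return self._token_to_value[token]

vault = TokenVault()
iban = "AT61 1904 3002 3457 3201"  # a made-up IBAN for illustration
t = vault.tokenize(iban)
assert t != iban                      # the token reveals nothing about the value
assert vault.detokenize(t) == iban    # the vault can restore the original
```

Because the token has no mathematical relationship to the original value, a consolidated data center outside the source country holds nothing sensitive, which is what makes the approach attractive for cross-border compliance.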
I recently read an interesting report from the Aberdeen Group that revealed that "Over the last 12 months, tokenization users had 50% fewer security-related incidents (e.g., unauthorized access, data loss or data exposure) than tokenization non-users". Nearly half of the respondents (47%) are currently using tokenization for something other than cardholder data. The study, released a few months ago, is titled "Tokenization Gets Traction".
Aberdeen has also seen "a steady increase in enterprise use of tokenization as an alternative to encryption for protecting sensitive data".
Ulf Mattsson, CTO Protegrity