That's because organizations are trying to sell to consumers, not harm them -- and because many existing laws already protect individuals, their freedoms, and their privacy, he adds.
"It's feasible -- since wearable technology using sensors is in people's phones, cars, or homes -- you'll be able to figure out something about somebody's health condition," says Castro. "It's more likely your neighbor knows already. Even if [app developers] could make an educated guess they're not going to sell that information to employers because that information is prohibited by law. The concern is there: It's a legitimate concern, that somebody will make an adverse decision because of healthcare, but more realistically it'll be the result of interactions. That's more likely to be the cause of one of these adverse actions than wearing a Fitbit."
Indeed, the advent of wearables and mHealth apps does not require a separate flurry of legislation, he says. Existing laws, perhaps with some amendments, adequately protect patients and their information, the Center wrote in a letter to the Federal Trade Commission this month. In addition to HIPAA, laws such as the Americans with Disabilities Act (ADA), Genetic Information Nondiscrimination Act (GINA), Fair Credit Reporting Act (FCRA), and Employee Retirement Income Security Act (ERISA) protect consumers and patients, the organization said.
With more participants from across the wellness spectrum, mHealth data could yield vital insights, treatments, and perhaps cures. But until all app developers speak the same language and operate under the same rules, some patients could find the hurdles of de-identification and anonymity too high to clear.