As we move closer to the day when individual items, by and large, will be tagged, companies had better have clear policies for how they'll handle the data they may collect from consumers.
The good thing about being a privacy lawyer is that you can spot the risks, problems, and issues, then let the clients and businesspeople make the final decisions. When it comes to radio-frequency identification, I'm relieved not to have to make these decisions. The high cost of adoption, and the challenges of overcoming problems such as read failures where metals and liquids are involved, raise the stakes. The potential to reduce shrinkage and the proliferation of counterfeit goods, as well as to make it easier to track recalls of products that may present health hazards, provides significant value.
RFID is going to happen, given the mandates of key retailers and the Department of Defense to their suppliers. For now, suppliers and retailers are dealing with the early-stage problems and promise of the technology, but already watchdog groups and politicians are questioning how the technology will evolve, whether it will threaten individuals' privacy, and how they or the industry can minimize any such threats.
At this point, RFID tags are being affixed mostly to cases and pallets, but the day will come when they are affixed to individual products, too. Maybe they'll have a kill switch so that consumers can disable the tags when they leave the store, or maybe they won't. Whatever direction the industry at large takes, your business must operate on the Spider-Man principle: With great power comes great responsibility. Where privacy and data collection are concerned, privacy professionals understand that with more data comes greater responsibility and legal risks. If the industry reaches a point where it can somehow use RFID tags to track a product all the way into consumers' homes and beyond, the industry also must ensure that it's protecting the privacy rights of the individuals who buy that product.
For instance, imagine a shopper buying an RFID-tagged sweater using a credit card. At the point of sale, the shopkeeper scans the sweater's RFID tag, and the sweater's unique information is combined with the customer's name and other identifying information. Now fast-forward a few months, and imagine this same shopper returns to the store wearing that same RFID-tagged sweater. Unbeknownst to the shopper, unmanned readers throughout the store can scan and collect the data on the sweater's RFID tag--pinging the tag multiple times as the shopper walks around the store and then leaves empty-handed--and then send that information to a back-end database holding the original record of that sweater and the shopper's name, address, etc. Reporting and analysis tools can be applied to the record to glean that the customer returned to the store, spent 30 minutes there, and didn't buy anything. Armed with that information, the store could opt to mail a coupon to the shopper to encourage her to buy the next time she visits.
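The back-end join described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the record fields, names, and matching logic are assumptions for clarity, not a description of any real retail system.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical records: field names are illustrative, not from a real system.

@dataclass
class PurchaseRecord:
    tag_id: str          # unique ID read from the sweater's tag at point of sale
    customer_name: str   # linked via the credit card used at checkout
    address: str

@dataclass
class TagRead:
    tag_id: str          # same ID, pinged later by an in-store reader
    reader_id: str
    seen_at: datetime

def summarize_visit(purchases: dict[str, PurchaseRecord],
                    reads: list[TagRead]) -> dict[str, float]:
    """Join in-store tag reads back to the original sale, yielding a
    per-customer dwell time in minutes (who returned, and for how long)."""
    visits: dict[str, tuple[datetime, datetime]] = {}
    for read in reads:
        record = purchases.get(read.tag_id)
        if record is None:
            continue  # tag never linked to a customer; the read stays anonymous
        first, last = visits.get(record.customer_name,
                                 (read.seen_at, read.seen_at))
        visits[record.customer_name] = (min(first, read.seen_at),
                                        max(last, read.seen_at))
    return {name: (last - first).total_seconds() / 60
            for name, (first, last) in visits.items()}
```

Two reads of the same tag 30 minutes apart, matched against the original purchase record, are enough to report that a named customer returned and spent half an hour in the store--which is exactly why the data-handling obligations discussed below attach.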
But that data, like any data that is combined with personally identifiable information, had better be stored securely, and access to it had better be supervised and monitored. Companies had better ensure they're collecting only the data they need, ask whether that data really must be personally identifiable, and check that they're following any regulations of their industry, as well as any special privacy rules or practices they or their industries have committed to.
Whenever privacy technology, laws, or best practices are implicated, there are four issues that always should be considered: notice, consent, access, and security. If the data is personally identifiable or capable of becoming personally identifiable when combined with other data you have, have you given notice of what you're doing to those whose data is being collected (the "notice" requirement)? Have you received the requisite consent for what you're doing (the "consent" requirement)? How can people review what you've collected for accuracy or stop you from using it later on (the "access" requirement)? And how well are you protecting the security of the data (the "security" requirement)?
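The four requirements can be thought of as a gate applied to each collection event. The sketch below is a simplified illustration of that idea, assuming hypothetical field names; it is not drawn from any statute or real compliance system.

```python
from dataclasses import dataclass

# Illustrative only: the fields and the gating logic are assumptions,
# modeling the four-requirement check described in the text.

@dataclass
class CollectionEvent:
    personally_identifiable: bool  # PII now, or linkable to PII you already hold
    notice_given: bool             # did you tell the data subject what you're doing?
    consent_obtained: bool         # did you get the requisite consent?
    access_mechanism: bool         # can the subject review, correct, or opt out later?
    secured: bool                  # is the stored data appropriately protected?

def unmet_requirements(event: CollectionEvent) -> list[str]:
    """Return which of the four requirements are unmet. For data that is not
    (and cannot become) personally identifiable, none of them may apply."""
    if not event.personally_identifiable:
        return []
    checks = {
        "notice": event.notice_given,
        "consent": event.consent_obtained,
        "access": event.access_mechanism,
        "security": event.secured,
    }
    return [name for name, ok in checks.items() if not ok]
```

A real regime is messier--notice can be required where consent is not, for example--but starting every review from these four questions mirrors how most privacy professionals work.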
When non-personally identifiable information is collected, either in generic or aggregate forms, these four factors may not be required. And even when notice is required, consent may not be. But most privacy regulatory schemes involving the collection of personally identifiable information start with these four requirements. And most privacy professionals start here as well.
But how well do these apply to new technologies with shifting points of collection and no way of giving notice or consent? And how well do notice and implied consent work when the use of the data may shift over time, or when the data is later combined with personally identifiable information from other sources? Any consents obtained now are unlikely to extend to future uses, and that could taint an entire database, in the same way that mixing opt-in, double-opt-in, and opt-out databases renders them worthless in many instances. So the answer may be: not very well at all.