The Internet of Things (IoT) is quickly becoming a crowded digital space. Two popular IoT devices are the Amazon Echo personal assistant, which responds to “Alexa” (or “Echo” or “Amazon”), and the Google Home personal assistant, which responds to “OK Google” (or “Hey Google”). These widely promoted smart devices are always listening, as are most similar types of smart gadgets and toys.
Listening can quickly change to recording and storing the associated files in the vendors’ clouds because of how these devices are engineered. Let’s consider the privacy implications of how those recordings are made, where they are stored, how the recordings are used, and who has access to the recordings.
Amazon and Google both claim that their smart personal assistants do not retain any of the audio they hear before the wake words that trigger recording. However, here are just a few important privacy-impacting facts:
Amazon keeps approximately 60 seconds of audio from before the wake word in the device’s local buffer, and a “fraction” of that is sent to the cloud.
All the sounds in the vicinity are also captured in the recordings, along with a large amount of metadata, such as location, time, and so on.
The recordings are kept indefinitely unless consumers take the initiative to request that they be deleted.
Data, possibly including recordings (this topic is not directly addressed by Amazon or Google), may be shared with a wide range of third parties, and both vendors state they have “no responsibility or liability” for how that data is used by the third parties.
There are other privacy issues, of course. But, for now, let’s focus on these, which are significant on their own.
Privacy protections currently require manual intervention
While the Amazon and Google privacy policies each boast of privacy protections, neither fully explains the protections that apply specifically to Alexa and Home. For the most part, consumers must act to protect their own privacy, particularly for the issues listed previously. At a minimum, users must take the following five actions to establish a baseline level of privacy protection:
- Physically turn off the devices to keep them from recording everything in the vicinity; they do not turn themselves off. These devices have been known to respond to words other than their wake words, and even to order items as a result. By leaving the devices on all the time, you risk having private conversations recorded and accessed by whoever has access to the vendors’ clouds. Keep smart devices turned off when you have guests over and whenever you simply do not plan to use them.
- Set a password, and change default passwords and wake words. Choose ones that are different from your other passwords, that are long and complex, and that are not composed of words found in any dictionary or commonly spoken.
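As a rough illustration of the kind of password this advice calls for, here is a short Python sketch that uses the standard library’s `secrets` module; the function name and length are illustrative choices, not a vendor requirement:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation.

    Uses the cryptographically secure `secrets` module (rather than
    `random`), so the result is not composed of dictionary words and
    is not predictable.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # prints a random 20-character password
```

A password produced this way is long, complex, and unrelated to anything in a dictionary, which is exactly what the bullet above recommends.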
- Opt out of data-sharing. Generally, for most businesses in the U.S., if you don’t opt out of data-sharing, you implicitly allow the manufacturer to give, or even sell, your data to an unlimited number of third parties, e.g., marketers, researchers, and other businesses. You will then have no control over, or insight into, how the data about YOU is used and shared by THEM.
- Use encryption. Turn on encryption for data transmissions and stored data wherever settings allow; such options are often off by default. Amazon and Google generally state that they encrypt all data in transit and in the cloud for their services and products. Disappointingly, however, neither gives an option to encrypt the data stored on the in-home device itself.
- Delete your data from the cloud. Don’t forget that all the recorded audio, and the associated metadata, will be kept in the Amazon and Google cloud systems indefinitely unless you take the initiative to delete it. And since that data may be accessed by a wide range of unknown third parties, you don’t want it used to violate your privacy or result in privacy harms.
Effective privacy protections must be built in and automatic
These manual actions are needed to protect privacy with current versions of smart personal assistants in the short term. However, the time is long overdue for privacy protections and security controls to be engineered into every type of smart device sold to consumers. The amount of data collected, and the potential privacy harms that data could enable, are too great to allow IoT vendors to take only a few incomplete steps toward the full set of privacy protections needed to protect the privacy and security of those using the devices.
For example, to address the issues discussed here, Google and Amazon could have engineered the devices so that:
- Device settings could be set by consumers to automatically turn the devices off without physically doing so.
- Strong authentication was required.
- Data would not be shared with third parties without explicit permission as a device setting from the associated consumers.
- Data in storage on the device was automatically and strongly encrypted.
- Privacy notices could be accessed (possibly via audio) through the device.
- Consumers could have settings for automatic deletion from the cloud.
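To make the proposal above concrete, here is a minimal sketch of what such privacy-by-default device settings could look like. Every name and default value below is hypothetical; no vendor currently exposes settings like these:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical privacy-by-default settings for a smart speaker.

    All field names and defaults are illustrative assumptions, not a
    real Amazon or Google API.
    """
    auto_mute_schedule: tuple = ("22:00", "07:00")  # mic off overnight, no physical switch needed
    require_strong_auth: bool = True                # strong authentication on by default
    share_with_third_parties: bool = False          # sharing only with explicit opt-in
    encrypt_local_storage: bool = True              # on-device data encrypted automatically
    cloud_retention_days: int = 30                  # recordings auto-deleted from the cloud

settings = PrivacySettings()
print(settings.share_with_third_parties)  # False unless the consumer opts in
```

The key design choice is that the protective value is the default for every field, so a consumer who never opens a settings screen still gets the protections listed above.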
Over the past couple of years, I’ve chatted with my friends at CW Iowa Live about the privacy issues involved with these IoT devices. For more information on this topic beyond this blog post, you can listen to them here and here.
Utilize ISACA Privacy Principles to build privacy into processes
So how should engineers approach building privacy controls into IoT devices? Use the new ISACA privacy resources! I am grateful and proud to have been part of the two ISACA International Privacy Task Force groups, both led by Yves Le Roux, since 2013. I was also the lead developer of the newly released ISACA Privacy Principles and Program Management Guide (PP&PMG), which incorporates the recommendations and input of the International Task Force members, as well as of a complementary privacy guide targeted for publication in mid-2017.
The ISACA PP&PMG outlines the core privacy principles that organizations, as well as individuals, can use to help ensure privacy protections. Engineers can use these principles to build the important privacy and security controls into IoT devices from the very beginning of the initial design phase, and apply them all the way through the entire product development and release lifecycle. Aligned and compatible with international privacy models and regulatory frameworks, the ISACA Privacy Principles can be used on their own or in tandem with the COBIT 5 framework.
The second ISACA privacy guide that will be released this year will include many examples throughout the entire data lifecycle and a detailed mapping of where to incorporate privacy controls within the COBIT 5 control framework component.
(About the author: Rebecca Herold is president of SIMBUS LLC and CEO of The Privacy Professor, and a member of the ISACA. This post originally appeared on her ISACA blog, which can be viewed here).