Organizers: Susan Landau (Tufts University), David Choffnes (Northeastern University)

Organized by: SPLICE and ProperData

Date: Wednesday, April 13, 2022

Our two SaTC Frontier projects (SPLICE and ProperData) hosted a joint virtual workshop. The workshop focused on smart home IoT devices and the new privacy and security threats, including national-security ones, arising from their rapidly increasing integration into consumers' lives. Our panelists discussed the law, policy, and national-security concerns raised by these devices, as well as the intersecting technical privacy and security aspects of IoT devices. This event was open only to ProperData and SPLICE members and guests; no press was allowed.


Below is a recap:

The workshop began with a discussion of privacy principles, beginning with the Federal Trade Commission's Fair Information Practice Principles (FIPPs), which undergird both US privacy regulation (especially at the state level) and Europe's General Data Protection Regulation (GDPR). Panelists discussed central topics related to home IoT, including transparency (what data will be collected?), minimization (which data elements will be collected, and how long will they be retained?), security (how will data be secured?), and accountability (how will compliance be ensured?). They also discussed the impact that regulation like GDPR has in enforcing these principles, by imposing higher fines for noncompliance and promoting "Privacy by Design," a term that simply means designing devices with privacy principles in mind from the start. Furthermore, panelists and participants engaged in a conversation about the impact home IoT devices have on individuals (who controls the devices?) and the meaning of privacy in homes, a seemingly unexplored area.

We then turned to national-security implications of smart devices. Panelists discussed (i) the lack of vendor diversity (especially important when factoring in who controls networks; there is danger of lock-in on home IoT by Tuya and on 5G by Huawei) and (ii) the cybersecurity risks when everyone has personal/home IoT devices. Of special concern was the prospect of home IoT increasing a "splinternet effect" (regionally connected internets instead of one globally connected Internet) in our cyberspace, driven by the fact that in the US, companies effectively own personal data; in the EU, users own personal data; and in China, the state owns personal data. This could play out in very complex ways geopolitically: at what point do countries refuse to connect to countries with different principles? This is already occurring to some extent, but will home/personal IoT push us further in this direction?

Likewise, interoperability was a national-security topic discussed by our panelists. Do we need to enforce a minimal mandate for interoperability of personal/home IoT devices?

Our panelists also shared their thoughts on the design and development of ubiquitous IoT. As IoT becomes more prevalent in daily life, there is a responsibility to think about the social, economic, and political effects of the use, design, and prevalence of these devices. In which instances will users be required to live with IoT devices (e.g., in a rental unit, a dorm, or a shared housing situation such as a retirement community)? If living with IoT devices is no longer a choice for users, and especially no longer a choice for the least powerful in society (e.g., low-income people), what design principles are critical for such devices? There are many threat models (the landlord, the abusive partner, the nation-state adversary), and all must be taken into account. Who are these systems designed for? Who are we protecting? How do we take into account differential impacts on users (e.g., the elderly, or a partner in an unsafe situation)? The core issue is: who is in control of IoT devices? One panelist remarked that if you've ever seen children fighting over the TV remote or coworkers disagreeing about the office thermostat, you know that government regulation will be really hard.

Our panelists ended the workshop by considering key questions: What should we do as researchers? Can we force transparency into the system? How do we do so, given that IoT devices have limited interfaces? Can we ensure minimum standards for safety, security, and privacy, akin to the Underwriters Laboratories (UL) model? Can we develop compliance tools to aid in this work?

What threat models should we consider in our research? What parts of the ecosystem are being targeted? The ad system? The machine-learning models? The advertisers?

Likewise, the use of data collected from IoT devices is important to consider. IoT devices create additional potential for the targeting of individuals, with attendant risks of discrimination, bias, and other unfair practices. When, if ever, should we allow protected attributes (e.g., gender, race) to be targeted directly? Can we trust an algorithm to target in ways that avoid bias?


We look forward to continuing our efforts to protect personal data using a combination of technical and policy solutions, and to more collaborative activities between SPLICE and ProperData and their members in the future.