
When “Things” Go Wrong: Redefining Liability for the Internet of Medical Things

Bethany A. Corbin[1]*


The technological landscape of the healthcare industry is evolving at a rapid and unprecedented pace.[2] Increasingly dominated by connected medical devices, the healthcare sector is in the midst of a massive transformation in its approach to patient outcomes and value-based care.[3] Medical technology (MedTech), situated at the heart of this revolution, has the potential to significantly improve healthcare efficiency, convenience, and patient comfort, while also positively impacting quality of life.[4] However, MedTech is not without its vulnerabilities, and as connected medical devices become the norm in patient care settings, the risks of hacking and malicious attacks on these devices increase exponentially.[5]

As the new frontiers of MedTech continue to expand, one particular sector of the digital health industry has garnered significant attention: the Internet of Medical Things (IoMT). At its most basic, IoMT refers to the ability of healthcare devices to communicate, gather, and exchange data across WiFi and Internet platforms.[6] These devices can provide up-to-date patient information, enhance patient self-sufficiency, and decrease the cost of care.[7] WiFi-connected pacemakers, insulin pumps, and pill-shaped cameras are only the most recent examples of what this technology is projected to accomplish.[8] By 2025, experts estimate the impact of IoMT on the healthcare industry will range from $1.1 trillion to $2.5 trillion per year, mostly stemming from improved efficiency in treating chronically ill patients.[9]

As a subset of the Internet of Things (IoT), the IoMT ecosystem differs from prior technology in that it “hinges on the interconnectivity of countless devices and participants,” which requires the legal framework governing IoMT to account for the rights, responsibilities, and obligations of numerous stakeholders.[10] Given the nascent stage of IoMT, however, a comprehensive legal and liability structure does not yet exist for hacks, breaches, and hijacks of IoMT devices that cause harm to patients.[11] Privacy and security regulations in the United States are sectoral and patchwork in nature, and those applicable to the healthcare sector have not been regularly updated to reflect the technological innovation associated with digital health.[12] As a result, significant gaps exist in healthcare regulations for IoMT devices, with some aspects of the industry completely unregulated.[13]

Compounding this issue of non-regulation is the lack of a comprehensive liability framework for patients to follow if their IoMT device is hacked or malfunctions.[14] With numerous developers, suppliers, and manufacturers involved in the IoMT supply chain, it can be difficult for patients to identify the culpable party and apply existing liability standards to innovative technology.[15] While products liability currently serves as the primary vehicle for restitution if a device malfunctions, its application to IoMT is imperfect at best.[16] Apportioning liability between software and device manufacturers in an IoMT product can be difficult, and there are no clear boundaries to establish which party is at fault for a hack or breach.[17] Moreover, defect-free software does not exist, which complicates the application of strict products liability to software companies (assuming embedded software can even be considered its own separate product to trigger application of products liability standards).[18]

Further, the prevalence of end-user licensing agreements operates as a contractual tool to limit manufacturer liability for unsecure devices.[19] These agreements, which appear in many IoMT products and disclaim all liability for software failures, shift the risk of harm almost exclusively to consumers, and eliminate the burden for manufacturers to comply with industry best practices for cybersecurity and privacy.[20] The presence of these agreements hinders consumers’ ability to bring product liability or breach of warranty actions, making restitution and recovery all but moot points.[21] Combined with the restrictions of the economic loss doctrine, which precludes tort recovery for purely financial harm, products liability (in its current form) is an almost unworkable liability structure for IoMT devices.[22] In addition, the products liability model risks exposing healthcare providers and IoMT device manufacturers to unbounded liability, despite the absence of mandatory federal cybersecurity guidance and despite adherence to industry cybersecurity frameworks.[23]

Given the projected growth in the IoMT market over the coming years, new or revised liability models will undoubtedly develop as cases make their way through the court system.[24] As liability standards evolve, there must be increased recognition that healthcare organizations and IoMT manufacturers are themselves victims of cyberattacks, and heightened emphasis should be placed on the proactive adoption of cybersecurity best practices.[25] That said, there is also a need to counterbalance these considerations against the requirements of patient safety and secure medical devices.[26] The existing regulatory gaps and liability frameworks have resulted in insufficient incentives to ensure adequate security measures are implemented into IoMT software to protect patients from hacks, hijacks, and breaches.[27] In a climate where a breach or hack can produce life or death consequences, it is imperative to develop well-defined security standards and liability expectations.[28]

Although it is impossible to predict at this stage the form of any new IoMT liability structure, two proposals merit consideration as incremental steps towards a new liability framework. First, end-user agreements that limit a software manufacturer’s liability for vulnerable code should be prohibited in the IoMT context.[29] The unique risk of bodily harm posed by certain IoMT devices requires a corresponding liability system that will hold software manufacturers accountable for their failure to implement security best practices.[30] End-user agreements operate as an impediment to this goal and shield software developers from accountability.[31] Software companies would almost certainly resist this course of action with a stringent warning that such liability would open the floodgates of litigation and stifle innovation in a developing industry.[32] Thus, a second proposal should be simultaneously implemented that guards against unfettered liability for IoMT device manufacturers that adopt cybersecurity best practices.[33] Specifically, a “safe harbor” statute should be adopted that limits civil liability if IoMT manufacturers and software companies comply with voluntary, industry-approved cybersecurity frameworks.[34] These proposals help balance incentives with punishment and can result in safer IoMT products for patients.[35]

Further, by implementing small changes to the IoMT liability structure at this stage—without waiting for a liability scheme to be developed exclusively by the federal or state legislatures or the courts—IoMT companies and healthcare organizations can contribute to the dialogue on what an end-stage liability framework should entail.[36] Removing the protection afforded through end-user agreements can incentivize IoMT manufacturers to help form liability standards and best practice expectations that will continue to govern this evolving industry through public-private stakeholder participation.[37] These companies may additionally be motivated to adopt and adhere to existing cybersecurity frameworks, which may reduce the companies’ compliance burden when new IoMT-specific standards are eventually promulgated.

To support this two-pronged proposal, this Article proceeds in four parts. Part II offers a succinct introduction to IoMT, including the mechanics of how IoMT works and the benefits and vulnerabilities associated with these devices. Part III then explains the need to incentivize safer coding in these devices, focusing particularly on regulatory gaps for IoMT accountability and liability. This part highlights the strengths and weaknesses of regulations promulgated by the Department of Health and Human Services (HHS) and the Food and Drug Administration (FDA), along with the development and role of voluntary, industry-developed cybersecurity frameworks. Part IV discusses the need for a comprehensive liability framework to incentivize safer coding, describing why current liability structures are insufficient to govern and mitigate the risks posed by these digital health devices. This part presents the benefits and drawbacks of implementing the two-pronged liability model and articulates why such revisions are urgently needed in the IoMT industry. Finally, Part V concludes the Article.

Understanding IoMT: Background, Benefits, and Vulnerabilities

The technological revolution has taken the healthcare sector by storm.[38] Once perceived as mere science fiction, digital health has transformed imagination into reality with the advent of implantable, Internet-connected medical devices that not only monitor patient health, but also gather and exchange data across wireless networks with little human involvement.[39] Revolutionizing both patient behavior and the practice of medicine, the “smart” medical device industry is projected to be worth over $66 billion by 2024.[40] Indeed, experts anticipate that by 2026, approximately one-third of Americans will have either temporary or permanent healthcare devices in their bodies.[41] As the human body becomes increasingly connected to the Internet—a phenomenon that some deem the “next logical frontier” of digital health—it is crucial to understand the benefits and vulnerabilities of this technology.[42] This part discusses the advancement of the MedTech industry, including the evolution of IoMT, and explores the drivers and risks associated with using connected medical devices.

What’s in a Name?: Understanding and Defining the Internet of Medical Things

In the past five years, the MedTech industry has experienced exponential growth, fueled primarily by a corresponding advancement in the Internet of Things.[43] As the name suggests, IoT represents a network of smart devices that collect and exchange personal data over the Internet.[44] While no universally accepted definition for IoT exists, the term refers to the general interaction between computers, sensors, and objects to collect and transfer information through a wireless data infrastructure.[45] These devices “operate on embedded sensors that automatically measure and transfer data (i.e., environmental and activity information) over a network to data stores without human interaction.”[46] Breaking this down, IoT devices function in three stages.[47]

First, IoT devices are embedded with radio-frequency identification (RFID) sensors, which use radio waves to identify people and objects.[48] These embedded sensors enable IoT devices to detect and gather data from their hosts and surrounding environment, including the individuals who operate the devices.[49] Next, the IoT device transmits this data through WiFi, Bluetooth, mobile phone networks, or the Internet, where the data is stored using cloud-based applications.[50] Finally, end-users sift through the “massive troves of data” collected from these devices.[51] This data is analyzed for insights, trends, and intelligence that can guide future decision-making and increase productivity, safety, and efficiency.[52]
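The three-stage flow described above can be sketched in miniature. The following Python sketch is purely illustrative: the class and function names (SensorReading, transmit, analyze) are hypothetical stand-ins for the sensing, transmission, and analysis layers of a real IoMT deployment, not any actual device API.

```python
# Illustrative model of the three IoT stages: sense, transmit, analyze.
# All names here are hypothetical and exist only for exposition.

from dataclasses import dataclass
from statistics import mean


@dataclass
class SensorReading:
    """Stage 1: an embedded sensor measures host or environmental data."""
    device_id: str
    metric: str
    value: float


def transmit(reading: SensorReading, cloud_store: list) -> None:
    """Stage 2: the device sends the reading over a network
    (WiFi, Bluetooth, cellular) to cloud-based storage."""
    cloud_store.append(reading)


def analyze(cloud_store: list, metric: str) -> float:
    """Stage 3: end-users mine the stored data for trends and insights."""
    values = [r.value for r in cloud_store if r.metric == metric]
    return mean(values)


# Simulated pipeline: a glucose monitor reports three readings.
store = []
for v in (98.0, 104.0, 110.0):
    transmit(SensorReading("pump-01", "glucose_mg_dl", v), store)

print(analyze(store, "glucose_mg_dl"))  # prints 104.0
```

In a real deployment, stage two would of course involve a network protocol rather than an in-memory list, which is precisely where the security vulnerabilities discussed later in this Article arise.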

In the healthcare industry, IoT operates by creating a network of medical devices that connect to healthcare information technology (IT) systems.[53] Known as IoMT, this network uses technology to enhance information and data flow between patients and physicians.[54] The most obvious examples of IoMT involve connected medical devices that are used to track patient progress and manage chronic illness.[55] These well-known devices include pacemakers, blood pressure monitors, intravenous fluid pumps, defibrillators, ingestible pill cameras, blood glucose monitors, imaging and scanning equipment, and electrocardiogram devices.[56] While IoT is relatively new to the healthcare context, it has been described recently as “permeat[ing] nearly every sector of the healthcare industry.”[57]

Encouraging Innovation: IoMT Benefits

The benefits of IoMT in the healthcare sector are promising and are a major factor driving increased adoption of connected medical technology.[58] These benefits fall into three broad categories: (1) remote monitoring and telehealth; (2) behavioral modification and patient outcomes; and (3) administrative efficiency.[59] First, IoMT benefits patients and providers by transforming the landscape of telemedicine and enabling remote monitoring.[60] Remote monitoring allows providers to establish a constant connection with patients anywhere in the world, which assists with monitoring acute and chronic conditions.[61] Platforms such as e-mail, video conferencing, texting, and patient portals enable health care to transcend the physical bounds of the provider’s office and provide care to patients when it is most convenient and necessary.[62]

Second, IoMT encourages behavioral modifications for patients, particularly those with chronic illnesses, and has the potential to improve patient outcomes.[63] With connected devices, patients can manage their medical conditions at home, and have data transmitted to their providers automatically.[64] This offers patients a sense of responsibility and accountability for their health, and provides an incentive to take medication and perform necessary testing.[65] For example, ingestible pill sensors are being developed that notify healthcare providers when medication is taken.[66] Given that more than 20% of prescriptions are never filled, a doctor may refuse to order medication refills or increase medication dosages if she learns that a patient is not taking her medication consistently.[67] Patients, therefore, have more motivation to follow their medical plans when their activities will be reported to their healthcare providers.[68]

Finally, IoMT enhances administrative efficiency and operations.[69] Numerous medical tasks may be automated, and patient data can be gathered from various sources, even when the patient is not present in the doctor’s office.[70] Connected medical devices may run more efficiently and offer increased reliability with the potential to identify errors and mistakes more quickly than human providers.[71] In some instances, connected medical devices may even be able to warn providers of potential failure indicators before they occur.[72] Moreover, IoMT can encourage the development of innovative services and more efficient use of organizational infrastructure.[73] For example, a hospital in Orlando, Florida used IoMT to develop a real-time location system in which family members can track the progress of a loved one undergoing surgery.[74] Similarly, a separate hospital in Waterbury, Connecticut used IoMT to analyze workflow trends with the goal of identifying staffing needs for each shift.[75] Use of IoT data saved the hospital $650,000 in just six months by reducing unnecessary overtime.[76] Thus, IoMT may be used by patients and healthcare providers to streamline and enhance healthcare delivery.

The Flip Side of Progress: IoMT Vulnerabilities

While IoMT has the potential to revolutionize patient care, the heightened connectivity of medical devices raises questions regarding patient security, network and data privacy, long-term maintenance, and device resilience.[77] In its 2014 NSTAC Report to the President on the Internet of Things, the President’s National Security Telecommunications Advisory Committee explained that the risks accompanying IoMT devices include “new attack vectors, new vulnerabilities, and perhaps most concerning of all, a vastly increased ability to use remote access to cause physical destruction.”[78] Although this list is not exhaustive, it highlights three areas of vulnerability that warrant discussion: (1) personal privacy and security; (2) network privacy and security; and (3) safety risks.[79]

First, IoMT presents an inherent risk of data breach that can expose sensitive user information.[80] Of particular concern is that many IoMT devices connect over unsecure networks, or networks with weak password protections.[81] Wireless transmission of data through these channels creates access points for hackers to compromise device security and privacy.[82] When such a device is compromised, sensitive patient health data may be shared publicly, resulting in a violation of individual privacy.[83] Given that healthcare data is highly coveted on the black market—medical records alone are 20 to 50 times more valuable than financial data—there is no shortage of bad actors attempting to hack unsecure medical devices.[84] Further, as discussed in Part III, numerous IoMT device manufacturers may fall outside the confines of federal regulation, which can disincentivize adoption of secure technology.[85] Consumers may not recognize the inherent risks associated with IoMT device use, believing their health data to be secure in the cloud.[86]

Second, IoMT exponentially expands the attack surface from which unauthorized users can gain entry into broader medical networks.[87] Each IoMT device that a healthcare operator places on its IT network has the potential to serve as a backdoor entry point into the entire healthcare system.[88] The mere presence of IoMT devices in a healthcare setting weakens the overall security of the network and creates access points that must be monitored by IT professionals.[89] Any unlawful hack or breach of an IoMT device has the potential to not only expose the sensitive health data of its user but also the data of other patients stored on the broader healthcare network.[90] Hackers can even negatively impact organizational operations by encrypting patient and administrative data and demanding a ransom for the encryption key.[91] Without access to its data, a healthcare organization cannot function efficiently, and cannot confirm patient treatment plans.[92]

Finally, because IoMT operates as a portal between cyberspace and humans, it has the potential to inflict bodily harm or death that may not be present with other IoT applications.[93] As showcased in a 2012 episode of Homeland, implantable IoMT devices may be hacked to cause the device to purposefully malfunction.[94] Former Vice President Dick Cheney had the remote capabilities for his pacemaker disabled after research identified vulnerabilities that could enable hackers to cause heart attacks remotely.[95] Similarly, researchers and white hat hackers have showcased their ability to hack insulin pumps from a remote location and alter the device’s settings to either deny delivery of medicine completely, or provide excessive insulin.[96] In 2014, the Federal Bureau of Investigation even warned hospitals to discontinue using certain infusion pumps designed with a security flaw that could allow an unauthorized user to alter medication dosages remotely.[97] Therefore, unlike other technologies, IoMT creates the possibility for significant human harm if a device is hacked or malfunctions.[98] These risks cannot be ignored when evaluating device security.

The Need to Incentivize Safer Coding for IoMT Devices

Despite the serious risks accompanying IoMT, the industry has experienced impressive and sustained growth that far outpaces the federal legislature’s adoption of safety and security regulations.[99] Existing laws and regulations do not sufficiently capture and mitigate the risks associated with digital technology, and require substantial updates to eliminate gaps in their applicability and coverage.[100] Indeed, due to the healthcare industry’s initially slow adoption of MedTech,[101] cybersecurity infrastructure and corresponding regulatory frameworks for IoMT are only in the nascent stages.[102] Further, IoMT manufacturers are not emphasizing safe coding practices, focusing instead on a “race to market” strategy that may result in unsafe consumer products.[103] This part explains the need to incentivize safer coding in IoMT devices, with particular emphasis on: (1) the regulatory gaps that enable IoMT device manufacturers to operate outside the bounds of regulatory authority; and (2) the economic realities of device creation that prioritize the “race to market.” Specifically, Part III explains why existing privacy and security standards are insufficient to comprehensively regulate the IoMT sector and clarify liability structures. Although state regulations also exist on this topic, they are beyond the scope of this Article.

Regulatory Gaps: Evaluating the Roles of HHS and FDA

MedTech in the United States does not operate in a wholly unregulated environment.[104] Rather, the United States has implemented a patchwork and sector-based framework to govern privacy and security throughout the nation.[105] In the healthcare industry, this regulatory authority is vested primarily with two government agencies: the Department of Health and Human Services and the Food and Drug Administration.[106] HHS’s Office for Civil Rights (OCR) is the primary regulator of privacy and security in the healthcare sector, while FDA’s jurisdiction extends to the safety and efficacy of medical devices.[107] Although the reach of both government agencies may encompass certain IoMT devices and their developers, significant gaps exist in the established regulatory frameworks such that portions of the IoMT industry remain unregulated, with no mandatory privacy and security standards.[108]

The Health Insurance Portability and Accountability Act: Applicability, Scope, and Gaps

Congress passed the Health Insurance Portability and Accountability Act (HIPAA) in 1996 to ensure portability of health insurance coverage and reduce costs associated with healthcare delivery.[109] Although HIPAA’s goals did not originally encompass privacy and security, such protections were later mandated as healthcare organizations transitioned to electronic health records and digital systems to reduce costs of care and administrative burdens.[110] As a result, the HIPAA Privacy and Security Rules govern the healthcare landscape for privacy and security issues.[111]

The scope of HIPAA, however, is intentionally limited. HIPAA applies only to “covered entities” and only protects a subset of health information known as “protected health information.”[112] To qualify as a covered entity,[113] an organization must fall into one of three categories that are statutorily defined: (1) healthcare provider;[114] (2) health plan;[115] or (3) healthcare clearinghouse.[116] In 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act expanded HIPAA’s provisions to encompass “business associates,” which include any person or organization that performs certain specified functions on behalf of a covered entity.[117] Regardless of the type of covered entity involved, HIPAA’s coverage only extends to protected health information (PHI), which is individually identifiable health information transmitted in any form or medium.[118]

The HIPAA Privacy Rule works by setting limitations on a covered entity’s or business associate’s use or disclosure of PHI.[119] The basic principle, subject to certain exceptions, is that a covered entity may not use or disclose PHI except as the Privacy Rule permits or requires, or as the individual (whose PHI is at issue) authorizes in writing.[120] The HIPAA Security Rule, in turn, complements the HIPAA Privacy Rule by operationalizing the Privacy Rule’s protections through implementation of administrative, technical, and physical safeguards for a subset of PHI—electronic PHI (ePHI).[121] The Security Rule focuses on guarding against unauthorized access to a patient’s ePHI and represents the first set of widely accepted security standards for healthcare practitioners.[122]

While HIPAA appears to offer comprehensive privacy and security frameworks for the healthcare industry, significant gaps are revealed by applying these regulations to digital health technology.[123] First, as explained above, HIPAA’s protections and requirements extend only to digital health actors that are covered entities or business associates.[124] This means that if an organization does not qualify as a covered entity or business associate, it has no obligation to comply with HIPAA’s privacy and security requirements.[125] For instance, companies that manufacture fitness trackers that collect basic health data, such as weight, heart rate, and height, are not subject to HIPAA’s regulations because they do not qualify as a healthcare provider, health plan, or healthcare clearinghouse.[126] Rather, the company provides this product directly to consumers without involving providers or insurers.[127] Numerous MedTech companies, therefore, exist outside the bounds of the HIPAA Privacy and Security Rules because they are not covered entities or business associates.[128]

Second, HIPAA’s applicability is limited by the type of information it protects.[129] Extending only to PHI, HIPAA excludes categories of health information that may be sensitive but not individually identifiable or directly related to a person’s physical or mental health.[130] Healthcare data that does not satisfy the definition of PHI may be collected, used, and disclosed by any company without violating federal healthcare regulations.[131] For example, the Ohio Supreme Court held that lead-contamination notices issued by the Cincinnati Health Department could be disclosed even though they contained blood test results because the child’s name was not included in the document.[132] MedTech companies that gather or aggregate data that is not personally identifiable are within their rights to sell or disclose such data under HIPAA.[133] Similarly, if the data MedTech companies collect does not directly relate to a person’s physical or mental health, and does not concern the provision of healthcare services or payment for such services, then individually identifiable health data may be disclosed, used, and sold.[134]

As a result, HIPAA is limited in its applicability and contains regulatory gaps that cause MedTech actors to fall outside its scope.[135] With minor exceptions, most digital health companies today will not qualify as covered entities or will collect data outside the scope of PHI, allowing them to remain unregulated by federal privacy and security frameworks.[136] When this occurs, MedTech companies may operate with little to no federal oversight, and can lack incentives to ensure adequate privacy and security standards are upheld.[137] Moreover, there is nothing in HIPAA that addresses liability for the malfunctioning, hijacking, or hacking of a healthcare device. HIPAA even precludes a private right of action for violations of its own provisions.[138] HIPAA’s focus is thus purely on establishing federal standards of care that covered entities must satisfy, not remedying harm to consumers from device vulnerabilities.[139] Accordingly, HIPAA does not sufficiently regulate the MedTech industry.

FDA: Device and Cybersecurity Guidance

In contrast to the limited oversight of MedTech companies by HHS, FDA plays a central role in the regulation of medical devices generally. FDA is responsible for ensuring the safety and efficacy of certain classifications of devices, though not all MedTech products will trigger FDA scrutiny.[140] The type of oversight and pre-market approval that medical devices must undertake depends principally on the device’s classification, which is determined by the level of risk posed by the device.[141] Class I devices pose the least risk and are subject only to general controls.[142] Class II devices, which are slightly riskier, must satisfy general controls and special controls.[143] Finally, Class III devices, which are used to support or sustain human life or pose an unreasonable risk of illness or injury, are the most heavily regulated.[144]

Recognizing the developing intersection of medical devices and technology, FDA issued industry guidance titled Postmarket Management of Cybersecurity in Medical Devices on December 28, 2016.[145] This voluntary guidance sets forth FDA’s recommendations for effectively managing post-market cybersecurity vulnerabilities.[146] FDA expressly recognizes that medical devices may now be “networked” and connected with other medical applications that comprise the IoMT.[147] The interconnected structure enables the exploitation of vulnerabilities and “may represent a risk to health” such that “continual maintenance throughout the product life cycle” is necessary to protect “against such exploits.”[148] FDA thus recommends that medical device manufacturers proactively address cybersecurity vulnerabilities to reduce health and safety risks.[149]

Importantly, FDA acknowledges that risk management for cybersecurity vulnerabilities in medical devices “is a shared responsibility among stakeholders including the medical device manufacturer, the user, the Information Technology (IT) system integrator, Health IT developers, and an array of IT vendors that provide products that are not regulated by the FDA.”[150] While FDA encourages collaboration among these actors to enhance post-market cybersecurity, it cannot mandate cybersecurity protections in devices that are already approved and marketed.[151] Thus, although FDA’s guidance is a crucial step towards securing medical devices, including IoMT products, its voluntary nature does not adequately incentivize compliance.

Nearly two years later, FDA continues to recognize the overwhelming importance of cybersecurity in medical devices. Recently, FDA issued draft guidance regarding the Content of Premarket Submissions for Management of Cybersecurity in Medical Devices.[152] FDA again reiterated “[t]he need for effective cybersecurity to ensure medical device functionality and safety,” and noted that this objective has become increasingly important with the continued use of wireless and network-connected devices, and the frequent electronic exchange of patient data.[153] As manufacturers design their medical devices and apply for pre-market approval, FDA hopes that they will mitigate cybersecurity risks.[154] However, as with the post-market cybersecurity guidance, the pre-market guidance is voluntary.[155]

While FDA has taken a proactive approach to encouraging medical device cybersecurity, additional measures—particularly those that involve compliance incentives—must be adopted to protect patient safety.[156] Moreover, although FDA regulates the approval of medical devices, it does not provide relief for patients who are harmed by device malfunctions or hacking.[157] In other words, while FDA’s role is to regulate medical device safety and security, it lacks the authority and frameworks to create a comprehensive mandatory cybersecurity system and to apportion liability and remedies accordingly. Combined with the gaps in HIPAA, this creates a regulatory void that the federal government has not yet resolved.

Industry Cybersecurity Frameworks

To help address federal regulatory gaps for the security of IoMT devices, numerous industry organizations have published their own voluntary cybersecurity frameworks that seek to illuminate best practice standards.[158] These frameworks are intended to enable digital health companies to adopt a cybersecurity structure that best meets their organizational needs.[159] A 2018 survey conducted by the Healthcare Information and Management Systems Society[160] reported that there are five primary security frameworks in use by healthcare organizations today: (1) National Institute of Standards and Technology (NIST);[161] (2) Health Information Trust Alliance (HITRUST);[162] (3) Center for Internet Security (CIS) Critical Security Controls;[163] (4) International Organization for Standardization (ISO);[164] and (5) Control Objectives for Information and Related Technologies (COBIT).[165] The framework established by NIST is the most well-recognized voluntary cybersecurity structure today,[166] and a cross-walk document exists highlighting the interaction between the NIST cybersecurity standards and the HIPAA Security Rule.[167] NIST further proposed guidance for IoT devices in an October 17, 2018 white paper,[168] and is in the process of creating a privacy framework for 2019.[169]

The current industry cybersecurity frameworks represent a collective effort to define best-practice standards in a constantly evolving technological environment without stifling innovation.[170] The frameworks incorporate flexibility to match organizational structure, yet they provide the benefit of shared expert experience among myriad organizational groups and industry actors.[171] This cooperative approach to cybersecurity enhances overall device safety and organizational response to security incidents by aggregating experience and ideas.[172] Thus, these cybersecurity models possess considerable merit, particularly in the face of legislative gaps.

While industry cybersecurity frameworks are crucial to bridging the gaps for IoMT device security, they suffer from the same drawback as FDA’s medical device cybersecurity guidance: the standards are purely voluntary.[173] Presently, there is no way to enforce these standards to hold manufacturers of IoMT products accountable for unsafe coding or lax device security. If an IoMT device manufacturer elects not to follow a voluntary cybersecurity framework, it faces few consequences and, depending on the company, may remain unregulated by federal and industry actors.[174] Such a result does not encourage heightened device safety standards, but instead reinforces the status quo. Thus, existing federal and industry cybersecurity standards—while a promising step in the right direction—do not create a comprehensive security structure for IoMT devices, and fail to address liability and relief for patients who suffer harm from breached or hijacked IoMT devices.

Economic Realities and the “Race to Market”

In addition to voluntary cybersecurity frameworks and regulatory gaps that permit IoMT developers to evade government oversight, economic realities for device creation unintentionally foster a “race to the market” mindset that prioritizes speed over safety.[175] Scholars note that the absence of effective cybersecurity measures is attributable, in part, to “weak economic incentives,” and that these weak incentives and market failures “have led to an accumulation of insecure hardware and software.”[176] While creating secure code can potentially result in a marketing advantage or impact brand reputation,[177] it is extremely difficult for consumers to compare product security.[178] Such benefits, therefore, may go unnoticed, causing manufacturers to lose money without seeing a sufficient return on their investment.[179]

Indeed, a 2017 study by the Ponemon Institute underscores the need for manufacturers to consider consumer safety and cybersecurity when developing IoMT devices.[180] Sixty-seven percent of medical device manufacturers surveyed in the Ponemon study believed that a cyberattack on one or more medical devices built by their organization is likely, yet only 17% of these manufacturers have taken any substantial steps to prevent such an attack.[181] Only one-third of medical device manufacturers encrypt traffic among IoT devices, and 53% of these manufacturers acknowledge that “there is a lack of quality assurance and testing procedures that lead to vulnerabilities in medical devices.”[182] Further, despite the fact that 31% of medical device developers are aware of actual attacks involving connected medical devices, only 25% of these developers have added security protocols or architecture inside the devices to protect patients and clinicians.[183]

Device manufacturers additionally note that few connected medical devices are actually tested in the design phase.[184] Only 28% of medical device respondents affirmed that testing is done prior to development and post-release, and only 9% of manufacturers test their deployed medical devices annually.[185] This lack of testing is explained, in part, by the pressure device manufacturers face to market connected devices quickly.[186] Where rushing devices to the market is a priority, software security becomes an unfortunate afterthought.[187] Moreover, just 51% of device manufacturers follow an existing voluntary cybersecurity framework, a best-practice security framework, or both.[188] Thus, current “[m]edical device security practices in place are not the most effective,”[189] and “[a]ccountability for the security of medical devices manufactured or used is lacking.”[190] It is time to incentivize safer IoMT development, particularly given the life-or-death risks these devices can pose.

Developing a Comprehensive Liability Structure

Given the voluntary cybersecurity frameworks in existence today, the “race to market” reality, and the regulatory gaps that permit IoMT developers to evade government oversight, it is necessary that legislatures provide sufficient incentives for manufacturers to create safe and secure code for IoMT products.[191] While numerous bills have been proposed at the federal and state levels regarding regulation of IoMT products, these bills have a low probability of passage and have been met with fierce opposition by IoMT manufacturers.[192] These manufacturers claim that such legislation will not only drive developers out of the field, but will also hinder progress in the IoMT industry because these legislative proposals cannot be quickly and efficiently amended to keep pace with technological progress.[193] In fact, by the time many IoMT bills are passed, rapid technological advances will have rendered them outdated.[194] Thus, IoMT has proven difficult for state and federal legislatures to regulate, and this lack of mandatory regulation has created a dearth of incentives for IoMT manufacturers to develop secure IoMT products.[195]

Given this state of affairs, a comprehensive IoMT liability model would offer critical incentives to manufacturers to increase the security of their products and comply with voluntary cybersecurity best practice standards. As it stands, there are limited incentives to encourage manufacturers to expend time, resources, and money on developing safer products when they are not subject to regulatory oversight and do not have a clear grasp on the potential liability they could face for unsecure code.[196] Part IV explains why current liability frameworks are ill suited to the IoMT context and advocates for the development of a comprehensive liability structure as the “stick” that incentivizes safer code.

Existing Liability Standards: Evaluating the Application of Products Liability to IoMT

Pursuant to traditional tort doctrines, device malfunctions are typically addressed through products liability laws at the state level (when preemption is not implicated).[197] Products liability refers to the liability of a manufacturer, processor, or seller whose goods injure consumers.[198] Three legal paths may be pursued under the products liability framework: (1) strict liability; (2) negligence; and (3) breach of warranty.[199] The application of these doctrines to IoMT, however, is akin to fitting a square peg in a round hole.[200] As IoMT progresses, “it could reshape the law of products liability by redefining who can be held at fault and who will bear the financial consequences if something were to go wrong with a product.”[201] This section explains the fundamentals of products liability and details why this tort doctrine, as currently structured, is ill-fitted to remedy harm from IoMT devices.

Strict Products Liability

First, strict products liability is used to combat harm caused by unreasonably dangerous products.[202] This doctrine is applicable to products that cause substantial harm, death, or property damage due to defects.[203] The purpose of strict products liability is to ensure that the manufacturers, developers, and sellers of defective devices bear the costs of any harm a consumer experiences due to that product.[204] In contrast to other tort doctrines, such as negligence, strict products liability does not require prior knowledge of a risk as a prerequisite to liability.[205] Rather, liability is automatic when it is proven that a device is defective, regardless of whether the manufacturer exercised all possible care when developing the product.[206] In this manner, strict liability is intended to incentivize manufacturers to “weigh the potentially small cost of mitigating the defective design or manufacturing element in their product against releasing the product with defects and having to cover potentially large damages that these defects may cause.”[207]

While defective digital products are not new, strict products liability has only been applied in rare instances.[208] Its limited application is due to three primary factors. First, the economic loss doctrine limits the type of damages that can be remedied through strict products liability.[209] As previously noted, strict products liability requires a demonstration of physical harm, death, or property damage that is directly attributable to the defective device.[210] The economic loss doctrine precludes claims based solely on financial losses, which are often the kind of impacts that insecure digital products have produced in the past.[211] IoMT, however, has the potential to skirt this economic loss limitation because its interconnectivity may result in physical harm or death, depending on the nature of the hack, breach, or hijack.[212] As digital technologies become increasingly integrated with devices, “the potential for physical harm may grow.”[213] Thus, while the economic loss doctrine has traditionally barred strict products liability for digital devices, it is less of a concern for IoMT.[214]

Second, and more importantly, consumers face an uphill battle trying to prove that “missing security features or digital defects alone led to harm or damage,” and most consumers do not have an “empirically-based cost-benefit calculation with supporting probabilities for claims.”[215] Similarly, third-party interference with the device by hackers may constitute an intervening event that absolves the manufacturer of liability (though it may have been the manufacturer’s insecure code that enabled the hacker to access the device in the first place).[216] Finally, ambiguity exists regarding whether software is a product or a service.[217] Products liability applies only to products, and in some U.S. states, software or code may be viewed as an intangible item.[218] Given the variability in products liability standards throughout the United States, it is possible that some jurisdictions may find strict products liability inapplicable to insecure code.[219] Thus, strict products liability may not provide a sufficient remedy for consumers despite the risk of harm presented by insecure IoMT code.

The converse, however, also has the potential to be true. As IoMT develops, there may be greater application of strict products liability to IoMT devices in ways that were not originally intended.[220] For example, given that most IoMT products present a risk of death or bodily injury from device hacks or hijacks, courts could conceivably apply the strict products liability doctrine to all IoMT devices regardless of whether harm actually occurs.[221] The problem with this approach is threefold. First, because there is no universally secure code, each IoMT device—regardless of whether it satisfies the strictest security requirements to date—will still have the potential to be hijacked and create life-or-death scenarios.[222] This places IoMT device manufacturers at continuous risk of unfettered liability for digital products, even if the manufacturer took all reasonable steps and adhered to voluntary industry frameworks. Second, such a risk of liability may cause manufacturers to abandon the IoT market, which will derail and stifle innovation in an industry that promises to revolutionize health care.[223]

Third, it is still unclear where along the supply chain liability will fall for a malfunctioning device.[224] IoMT differs from past technological developments in that it has an extensive supply chain that involves numerous manufacturers, developers, suppliers, coders, and sellers.[225] At this time, there is no clear demarcation of liability along this chain.[226] “While contractual arrangements might allow for the allocation of liability between parties,” strict liability cannot be transferred by contract.[227] Companies would therefore need to show which manufacturer or party was responsible for the defect, which can be difficult to determine.[228] This will require the “development of digital technology failure standards and thorough incident investigation,” which is costly and may drive developers out of the market.[229]

In short, the application of strict products liability in this manner will not result in a proper balancing of consumer harm and manufacturer responsibility. No clear guidelines exist for apportioning liability among an IoMT supply chain, and the mere risk of unfettered strict products liability for device manufacturers can inhibit fundamental innovation.[230] Thus, in its current form, strict products liability cannot be easily applied to IoMT.


The second theory of liability potentially applicable to IoMT devices is negligence. Proof of negligence requires demonstration of five factors: (1) a duty or standard of care; (2) breach of that duty or standard of care; (3) cause in fact; (4) proximate cause; and (5) damages.[231] Negligence in the context of products liability can occur if a supplier, retailer, or manufacturer places an IoMT product into the stream of commerce with inadequate labeling, or if there are manufacturing or design defects.[232] The manufacturer or supplier will be liable if it failed to exhibit ordinary care to a party who suffers injury proximately caused by the manufacturer’s negligent conduct.[233] For products liability, negligence can arise in numerous ways, including: design of the product, selection of materials, production process, product assembly and testing, and placement of inadequate warnings or directions.[234]

One of the most common applications of negligence to products liability occurs in the context of design defects.[235] A design defect claim alleges that the manufacturer’s product design was not reasonable in light of the product’s risk of harm and availability of safer alternative designs.[236] Accordingly, a design defect claim requires proof of at least three elements: (1) the product posed a substantial likelihood of harm; (2) a safer and more feasible alternative product or design existed; and (3) the product, as designed, caused the plaintiff’s injury.[237] In these cases, the factfinder must evaluate the manufacturer’s intent and judgment in selecting the particular product design.[238] Some courts view this analysis in terms of risk versus utility.[239]

With respect to IoMT, it is conceivable that plaintiffs could bring design defect claims premised on insecure code. Specifically, a plaintiff may argue that the manufacturer’s design of an IoMT product is inherently risky due to the manufacturer’s selection of certain code, failure to use cybersecurity best practice standards in testing the code prior to launch, or both. Plaintiffs, however, will face numerous problems with such allegations.[240] First, there is no universally secure code, and plaintiffs will have difficulty establishing that the code and accompanying security processes selected by a manufacturer are inherently less safe than other alternatives.[241] In fact, it is estimated that “programmers make between 10 and 50 errors for every 1,000 lines of code.”[242] Second, negligent design defect claims are premised on the existence and availability of a safer alternative product design.[243] Absent a safer alternative, negligence design claims can fail as a matter of law.[244] Given the rapidly evolving state of technology, it is possible that there may not be alternative products on the market to which a plaintiff could compare the manufacturer’s product. Further, given the inherent flaws in software, it is possible that any similar products that do exist on the market would not be safer.

Third, plaintiffs may have difficulty establishing that the product posed a substantial risk of harm.[245] All implantable devices embedded with connectivity mechanisms are likely to pose similar risks of harm,[246] and it will be challenging to establish that one product is more or less risky than another device that is similarly implanted into a patient’s body. Further, the courts would risk opening the litigation floodgates and driving manufacturers out of the IoMT field if they were to find that any IoMT device implanted into a patient’s body poses a substantial likelihood of harm, given that there is no defect-free code.[247] Risk will exist with any IoMT device, and it is unclear at this stage what levels of risk are and are not acceptable.[248]

Moreover, to the extent a plaintiff attempted to apply general negligence principles outside the design defect context, she would face substantial difficulty establishing the existence of a duty of care.[249] As noted, negligence is premised upon the violation of an established standard of care.[250] There are no mandatory federal cybersecurity standards, however, for IoMT products.[251] As evidenced in Part III, IoMT products regularly fall within the cracks of federal legislation and sometimes are not subject to government oversight.[252] Additionally, federal agencies do not actively regulate cybersecurity of IoMT devices at this stage—as evidenced by the voluntary nature of the FDA’s post-market cybersecurity guidance.[253] While industry cybersecurity frameworks exist, they are also voluntary, and there is no consensus on which cybersecurity framework should or must be adopted by healthcare organizations.[254] Without readily discernable and established standards in place, it is difficult to argue that these standards have been breached.[255] Thus, the negligence model may fail to provide sufficient relief to injured consumers.

Breach of Warranty

The final liability model that is routinely applied to device defects is breach of warranty, including common law warranties and warranties under Article 2 of the Uniform Commercial Code (UCC). Unfortunately, the law surrounding whether Article 2 applies to IoMT devices—which can incorporate software, software-related services, and tangible goods—is unclear.[256] IoMT has transformed interactions between buyers and sellers, and created more elaborate hybrid transactions with increased levels of complexity.[257] This complexity has resulted in a lack of clarity regarding whether Article 2, which covers consumer goods, applies to hybrid transactions.[258] This uncertainty “belies the UCC’s stated goals of uniformity and simplicity and can lead to unwarranted disputes between parties about the laws applicable to a transaction.”[259] Thus, whether Article 2 and its warranty provisions apply to IoMT devices is in a state of flux, with such discussion extending beyond the scope of this Article.

Difficulties also exist with applying common law warranties to IoMT devices. Two types of common law warranties exist: (1) express warranties; and (2) implied warranties.[260] With respect to express warranties—which are explicit promises that devices will perform in a particular manner—IoMT device manufacturers may expressly guarantee their products in limited contexts, but such a warranty is likely to extend only to the device itself, and not to any software, product monitoring, or guarantees against breaches, hacks, or hijacks.[261] Moreover, it is doubtful that any IoMT manufacturer will warrant its product for secure software code, given the intrinsic “bugginess” that exists in code today.[262] Indeed, software manufacturers routinely evade liability for software vulnerabilities through end-user agreements, which disclaim all responsibility and liability for breaches, hacks, hijacks, and other harm resulting from insecure code.[263] By using products associated with end-user agreements, consumers waive any rights they have regarding the safety and security of the software.[264] More concerning, only 8% of consumers even read these dense, legalistic disclaimers.[265] Such end-user agreements make it difficult—if not impossible—to bring product liability actions, particularly for breach of warranty.[266]

Implied warranties, on the other hand, are not expressly provided by manufacturers, but are instead inferred when a manufacturer sells a product to a consumer.[267] An implied warranty may arise from the circumstances surrounding the transaction or from the product itself.[268] Typical implied warranties include the implied warranty of fitness for a particular purpose, the implied warranty of merchantability for goods, and the implied warranty of workmanlike quality for services.[269] Implied warranties, however, may be disclaimed, and sellers often do this in either the contract or the end-user licensing agreement.[270] Indeed, the implied warranty of merchantability—which is intended to assure consumers that the goods will meet baseline standards of quality—is so often disclaimed that scholars have questioned its usefulness.[271] Thus, given the absence of defect-free code and the prevalence of end-user licensing agreements, products liability, in its various forms, is not a viable cause of action for injured consumers.

The Carrot and the Stick: Incentivizing Safer Code Through a New Liability Framework

As IoMT continues to evolve and define consumer experiences and expectations, it is crucial that safer code be prioritized in IoMT devices. As noted above, IoMT device manufacturers are currently well insulated from liability and are able to externalize the costs of insecure software.[272] The threat of liability, however, is a proven deterrent that can reduce the probability of consumer harm or damage.[273] Because IoMT developers are underinvesting in software security, it is necessary to create incentives that will be economically and legally attractive to manufacturers.[274] This requires combining “ex ante incentives to invest in security with ex post liability that, while sufficient to discipline developers, does not stifle innovation.”[275] The goal is to balance consumer safety with technological advancement.[276] Thus, a reasonable and workable liability framework should be developed to provide consumers with relief for injuries and clarify manufacturer responsibilities and obligations.

The form that this new liability structure should take for IoMT devices, however, is less clear. Traditional products liability principles cannot be seamlessly applied to IoMT devices, given their unique design and extensive supply chains that make it difficult not only to apportion liability but also to determine relevant standards of care.[277] A rigid liability structure risks stifling innovation, but the laissez-faire attitude towards IoMT risks must be combatted with effective incentives to develop secure code and reduce consumer risk.[278] Further, it is necessary that this liability structure be created with manufacturer input, and not on an ad hoc or case-by-case basis, which risks inconsistent judicially created standards.[279]

While it is uncertain what a finalized liability framework for IoMT devices may consist of,[280] there are two important steps that should be implemented now to begin building this framework. First, IoMT developers should be prohibited from disclaiming liability for insecure code in end-user agreements. Second, a safe harbor provision should be simultaneously implemented that provides IoMT manufacturers with a defense to liability if they have satisfactorily complied with cybersecurity best practices in developing and marketing their products. These recommendations can help form the basis of a final liability framework while demonstrating an early commitment to holding IoMT device manufacturers accountable for insecure products.

Eliminating Liability Disclaimers in End-User Agreements

The first prong of this interim liability proposal requires the elimination of liability disclaimers in end-user agreements for IoMT devices. Software vulnerabilities cost consumers and businesses tens of billions of dollars annually, yet software developers have refused to take responsibility for the security of their products, and have instead shifted the risk of insecure software to consumers.[281] It is unfair for consumers to shoulder the burden of insecure devices—particularly when such devices can be implanted into consumers’ bodies and have life-or-death consequences—simply because software manufacturers have traditionally been permitted to disclaim liability through end-user agreements.[282] Permitting IoMT manufacturers to evade liability contributes to the weak economic climate that has permitted vulnerable code to develop in the first place.[283]

IoMT manufacturers should therefore not be free of all liability, but instead should be held to reasonable standards of care for their products.[284] IoMT developers are best positioned to identify risks with their software code and to mitigate those risks during the development process. Yet, the presence of an end-user licensing agreement eliminates any incentive that an IoMT developer has to consider consumer safety and security.[285] Without the risk of liability, and without the presence of mandatory federal standards for IoMT devices, manufacturers can place insecure products on the market without adequate testing and potentially compromise consumer well-being.[286] For any final IoMT liability model to be successful, there must be a foundational understanding among all parties that the failure to implement reasonable security measures into IoMT devices will be grounds for punishment.[287] Elimination of liability disclaimers in end-user agreements for IoMT products is a crucial step in setting the foundation for a future liability framework.[288] Such action will garner substantial attention among IoMT device manufacturers, as it represents a significant shift away from the laissez-faire attitude surrounding software products to date.[289] However, this shift is necessary to establish standards for connected devices that have the potential to cause serious bodily harm or death.

Further, attaching liability to IoMT developers can incentivize businesses to increase their security budgets.[290] Respondents in the Ponemon study indicated that their organizations would increase the security budget for connected medical devices only if a potentially life-threatening attack occurred.[291] It is irresponsible to withhold adequate funding for security until tragedy takes place, particularly given that the majority of IoMT manufacturers are already aware of the real-life potential for such attacks.[292] By signaling that IoMT device manufacturers may be held liable for insecure code, the hope is that MedTech organizations will increase funding to strengthen device security now, as a proactive measure, before harmful attacks occur. The reactive model of security in place today fails to adequately protect consumers, and it is time for the incentive of liability to enhance the security environment.[293]

Indeed, the idea of eliminating disclaimers of liability in end-user agreements has also been recently proposed by Senator Mark Warner of Virginia, a member of the Senate Intelligence Committee.[294] Senator Warner explained that “eliminating software makers’ long-held exemption from liability lawsuits could be a key part of a cybersecurity plan,” and that a “fulsome debate” is needed regarding “whether the software sector’s legal immunity has outlived its usefulness, especially in an age of relentless cyberattacks that frequently exploit software vulnerabilities.”[295] Former White House Cybersecurity Coordinator Michael Daniel agreed with Senator Warner that it is time to debate this proposal in Congress, but cautioned that this dialogue should not be generalized to all software.[296] Instead, this “requires a more sectoral approach, such as medical devices or autonomous vehicles.”[297] Such arguments and suggestions are in line with the approach proposed in this Article, which advocates for elimination of liability disclaimers in end-user agreements for the IoMT sector only, given the unique risks posed by connected medical devices.

While prohibiting liability disclaimers in IoMT end-user agreements will increase incentives for manufacturers to develop secure code, this action will likely be met with substantial resistance from the software and connected device industries.[298] The software industry has enjoyed protection from liability for decades, and will oppose any change to this status quo.[299] Software manufacturers may argue that the imposition of liability will stifle innovation in a developing field, and that manufacturers will flee the industry.[300] The likelihood of this occurring, however, is slim.[301] Almost all other industries hold manufacturers and developers liable for flaws in their products, and IoMT is projected to revolutionize health care.[302] The elimination of liability protection is unlikely to hinder a rapidly evolving industry, given the numerous players in the market. Instead, the elimination of liability provisions in end-user agreements will likely increase competition among manufacturers to develop more secure products to avoid hefty fines or damages.[303] With manufacturers appropriately incentivized to prioritize security, standards and duties of care can also begin to develop for this industry.[304]

Moreover, as technology changes, liability structures must adapt. Just because policy makers determined that “business productivity software manufacturers should not be held liable for security flaws in their products during the growth period of this industry in the 1990s does not mean all software manufacturers for all applications in all industries should get the same exemption forever.”[305] Instead, there must be a balancing of consumer safety with manufacturer liability.[306] As consumer risks increase, technology manufacturers must bear some of the burden for device safety.[307] Because society has entered a new technological age marked by increased cyber risk, it is crucial that liability models progress accordingly.[308]

Cybersecurity Safe Harbor

Given that the elimination of liability waivers in end-user agreements will represent a marked change for software and IoMT manufacturers, it is important that these manufacturers not be exposed to unbounded liability. Depending on the final liability model that emerges, it is necessary to guard against the imposition of liability on IoMT developers merely because their products use software code or connect over Internet networks. As noted, there is no defect-free code in existence today, and manufacturers should not be liable at this stage for device malfunctions, hacks, or hijacks that occur despite the manufacturer’s use of cybersecurity best practices and secure product development lifecycles.[309] The purpose of a robust liability framework is, in part, to incentivize the development of secure code.[310] It is not intended to saddle manufacturers with liability risks that no reasonable individual could guard against, even if all proper steps were taken.[311] Thus, a second proposal should be simultaneously adopted with the elimination of liability waivers: cybersecurity safe harbors.[312]

A cybersecurity safe harbor operates by preventing the imposition of liability on device manufacturers that adopt and adhere to recognized industry cybersecurity standards, frameworks, or both.[313] The safe harbor prioritizes proactive cybersecurity measures that protect consumer well-being instead of focusing on the reactive regulatory structure that exists today.[314] These industry frameworks can help fill the gaps that presently exist in cybersecurity and IoMT oversight by HHS and the FDA, and can be more readily and easily updated and amended than statutes or regulations.[315]

Moreover, safe harbors that encourage adoption of industry-developed cybersecurity frameworks enhance the public-private partnership model that has become a cornerstone of cybersecurity policy.[316] By design, cyberspace operates as “a network of both private-sector and public-sector infrastructure” and “requires a continuation of the partnership between the government and companies” to thrive.[317] The government’s technical expertise is limited, and the development of workable cybersecurity practices is often best left to industry experts. A partnership approach, such as the one envisioned by safe harbors, can enhance development of the IoMT and cybersecurity industries without stifling innovation.[318] It is therefore strongly recommended that cybersecurity safe harbors be adopted simultaneously with the prohibition on liability waivers in IoMT end-user agreements.[319]

State legislatures have already begun to recognize the merit of adopting cybersecurity safe harbors. On November 1, 2018, Ohio became the first state to supplement its Data Protection Act with an “incentive-based mechanism to strengthen cybersecurity business practices”—namely, a safe harbor against data breach lawsuits for businesses that “implement, maintain and comply with an industry-recognized cybersecurity program.”[320] This law, enacted as S.B. 220, offers protection to any business that accesses, maintains, or processes personal information, so long as the business implements recognized cybersecurity measures that are designed to: (1) protect the security and confidentiality of data; (2) protect against reasonably anticipated threats to the security or integrity of data; and (3) guard against unauthorized access to personal information that is likely to result in a material risk of identity theft or fraud.[321] In exchange for implementing an appropriate cybersecurity framework, businesses receive an affirmative defense to tort actions that arise from alleged “failure[s] to implement reasonable information security controls, resulting in a data breach.”[322]

Similarly, New York recently passed the Stop Hacks and Improve Electronic Data Security (SHIELD) Act.[323] The original version of this bill intended to grant a safe harbor to a “certified compliant entity.”[324] A “certified compliant entity” is one that obtains independent certification of its compliance with government data security regulations (such as HIPAA and Gramm-Leach-Bliley) or with recognized industry-approved cybersecurity frameworks, such as the ISO/NIST standards.[325] Pursuant to the original legislation, an organization could take advantage of this safe harbor by providing copies of its certification(s) to the Attorney General.[326] The final legislation, however, omitted this broad safe harbor language limiting liability for certified compliant entities, but it still allows certain companies to be deemed compliant with New York’s “reasonable safeguards” requirement if they are covered by—and comply with—certain regulations, such as HIPAA.[327] Thus, while there is flexibility in how cybersecurity safe harbors are structured, it is necessary to provide a level of protection to manufacturers as liability increases for insecure IoMT devices. A safe harbor can incentivize adoption of safer code and more stringent cybersecurity programs by providing a more limited and tailored exception to liability than end-user agreements.[328]

In fact, numerous organizations expressed their support for the development of cybersecurity safe harbors in response to a request for comments issued by the Department of Homeland Security.[329] Tasked with evaluating and recommending incentives to encourage private sector participation in voluntary cybersecurity programs, the Secretary considered liability limitations as part of her review.[330] Organizations of all sizes—including large companies like Microsoft and small start-ups—indicated that liability safe harbors offset the costs of participation in voluntary cybersecurity frameworks and can serve as an effective cost reduction mechanism.[331] These companies further explained that liability protection creates tangible benefits and adds predictability to an otherwise unclear and unsettled area of law.[332] Moreover, these safe harbors recognize the inherent flaws present in all connected devices and signal that targeted IoMT organizations are also victims of cybercrime.[333]

It is important, however, that cybersecurity safe harbors be implemented in conjunction with the end-user agreement prohibition, and not as the sole method of enhancing device security. Standing alone, a cybersecurity safe harbor fails to incentivize manufacturers to increase the security of their IoMT products, because manufacturers would still have a shield against liability through end-user agreements. With the ability to contractually limit their liability for insecure code, manufacturers will have little incentive to protect consumer welfare and can continue placing insecure devices on the market.[334] Additionally, given the limited IoT lawsuits to date, as well as courts’ varying interpretations of standing requirements, IoMT manufacturers may question whether such lawsuits can be successfully maintained.[335] An IoMT manufacturer may believe it is cheaper to fight a future lawsuit—with a potential for success at the motion to dismiss stage depending on the jurisdiction and harm suffered by the plaintiff—than to implement a comprehensive cybersecurity program. A recent survey of over 800 companies noted that only 35% of organizations currently view regulatory or liability risk as one of the largest concerns associated with poor cybersecurity practices.[336] In the IoMT context, this may be because of insufficient regulatory structures and the ability to contractually avoid liability.[337] Without fear of liability, organizations are not properly incentivized to adopt a voluntary cybersecurity framework.[338] Thus, safe harbors are a crucial component of the new liability framework, but they will fail to achieve their purpose if enacted as the sole remedy.

Combining the prohibition on liability waivers and cybersecurity safe harbors into a joint proposal, therefore, offers sufficient incentives for manufacturers not only to continue investing in the IoMT industry, but also to adopt appropriate cybersecurity frameworks and strengthen the code used in connected medical devices.[339] This model appropriately balances consumers’ need for safer products with manufacturers’ need to avoid unlimited liability in a nascent industry with ever-evolving standards.[340] By taking these first two steps towards creating a comprehensive IoMT liability structure, the legislature can demonstrate its commitment to medical device security while helping the industry grow in a safe and secure manner.

Conclusion

It is undeniable that IoMT is set to revolutionize the healthcare industry and redefine standards for patient care. Utilizing its connectivity to monitor chronic patient conditions and increase care convenience, IoMT offers fascinating new opportunities, with manufacturers only scratching the surface to date. As IoMT becomes more widely adopted, however, it presents challenges with respect to device security and patient safety that can result in consumer harm or death.[341] With risks of data breaches, hacks, and hijacking increasing in the medical industry, it is essential that IoMT manufacturers create secure products that do not expose consumers to unnecessary vulnerabilities and risks.[342] Unfortunately, given the legal and regulatory frameworks in place, appropriate incentives do not exist for IoMT manufacturers to prioritize device security.[343] Manufacturers may avoid liability through end-user agreements and can fall outside the bounds of regulatory oversight by HHS and the FDA.[344] As new devices proliferate on the market, however, it is essential that a comprehensive liability structure be created to incentivize adoption of cybersecurity best practices and provide relief to injured consumers.

The two-prong approach to liability proposed in this Article operates as a foundation for the broader IoMT liability discussion and the ultimate liability framework. This interim proposal creates incentives to secure IoMT products by eliminating manufacturers’ ability to disclaim liability and by adopting safe harbors that can restrict liability to reasonable levels if manufacturers comply with voluntary cybersecurity frameworks.[345] The goal is to signal a strong legislative interest in holding IoMT manufacturers accountable for the security of their products while recognizing the reality that no IoMT device will ever be 100% secure. Further, by implementing these two steps now, the legislature can help foster a dialogue on what the ultimate IoMT liability framework should contain and can encourage IoMT manufacturers to participate in this discussion at an early stage. This prevents the imposition of an ad hoc liability framework by the judiciary.[346] The adoption of a comprehensive IoMT liability structure will result in consistency and predictability for manufacturers while benefiting consumers through safer code and remedies for unreasonably insecure devices.

  1. * Certified Information Privacy Professional (CIPP/US), Certified in Healthcare Compliance (CHC), and Certified in Healthcare Privacy Compliance (CHPC); Director, Wake Forest University School of Law, Master of Studies in Law Program; Health Care LL.M., 2018, Loyola University Chicago School of Law; J.D., 2013, Wake Forest University School of Law.

  2. . How Technology is Changing Healthcare, Tex. Healthcare (Aug. 1, 2016), [].

  3. . See Karen Taylor, Deloitte Centre for Health Sols., Connected Health: How Digital Technology Is Transforming Health and Social Care 4–11 (2015).

  4. . See Marie-Valentine Florin, Governing Cyber Security Risks and Benefits of the Internet of Things: Application to Connected Vehicles and Medical Devices 5, 7 (Maya Bundt et al. eds., 2016).

  5. . Alaap Shah, Death by a Thousand Cuts: Cybersecurity Risk in the Health Care Internet of Things, Am. Health Laws. Ass’n Wkly. (May 18, 2018), [https://perma.cc/UGN5-W9TQ]; Untangling the Web of Liability in the Internet of Things, Mason Hayes & Curran: Tech L. Blog (May 19, 2016) [hereinafter Untangling the Web], [].

  6. . Mauricio Paez & Mike La Marca, The Internet of Things: Emerging Legal Issues for Businesses, 43 N. Ky. L. Rev. 29, 33 (2016).

  7. . Id. at 32–33; see also Bernard Marr, Why The Internet of Medical Things (IoMT) Will Start to Transform Healthcare in 2018, Forbes (Jan. 25, 2018),

  8. . Paez & La Marca, supra note 5, at 32.

  9. . Id. at 33.

  10. . Id. at 30.

  11. . Id. at 29; see also H. Michael O’Brien, The Internet of Things and the Inevitable Collision with Product Liability PART 4: Government Oversight, Prod. Liab. Advoc. (Oct. 16, 2015), vitable-collision-with-product-liability-part-4-government-oversight/ [].

  12. . Paez & La Marca, supra note 5, at 40.

  13. . President’s Nat’l Sec. Telecomm. Advisory Comm., NSTAC Report to the President on the Internet of Things 6 (Nov. 19, 2014); Nikole Davenport, Smart Washers May Clean Your Clothes, But Hacks Can Clean Out Your Privacy, and Underdeveloped Regulations Could Leave You Hanging on a Line, 32 J. Marshall J. Info. Tech. & Privacy L. 259, 260 (2016).

  14. . Salen Churi et al., Univ. Chi. L. Sch., Internet of Things (IoT) Risk Manager Checklist, U.S. 4 (2017).

  15. . See Benjamin C. Dean, An Exploration of Strict Products Liability and the Internet of Things 12–13, 21 (2018). See generally Untangling the Web, supra note 4 (“Lawmakers and regulators will need to consider either new forms of liability, or new ways to manage and apply existing laws to different entities in the IoT supply chain.”).

  16. . See Dean, supra note 14, at 16; Paez & La Marca, supra note 5, at 58.

  17. . See Untangling the Web, supra note 4.

  18. . See Dean, supra note 14, at 17, 19; Paez & La Marca, supra note 5, at 59; Jon Evans, Should Software Companies Be Legally Liable for Security Breaches?, Tech Crunch (Aug. 6, 2015), [].

  19. . See Robert Lemos, Security Liability is Coming for Software: Is Your Engineering Team Ready?, Tech Beacon, [].

  20. . Michael D. Scott, Tort Liability for Vendors of Insecure Software: Has the Time Finally Come?, 67 Md. L. Rev. 425, 427 (2008); Paul Rosenzweig, The Evolving Landscape of Cybersecurity Liability, Chertoff Group (June 29, 2017),

  21. . Dawn Beery & Kevin Burns, The Application of Traditional Product Liability Law to Emerging Technologies, Defense, Apr. 2018, at 58.

  22. . Id. at 55; see also Alan Butler, Products Liability and the Internet of (Insecure) Things: Should Manufacturers Be Liable for Damage Caused by Hacked Devices?, 50 U. Mich. J.L. Reform 913, 915, 926–27 (2017).

  23. . See generally Jack Detsch, Should Companies Be Held Liable for Software Flaws?, Christian Sci. Monitor (Dec. 2, 2016),
    1202/Should-companies-be-held-liable-for-software-flaws [] (discussing the benefits and drawbacks of holding companies liable for software flaws).

  24. . See generally Mildred Segura et al., The Internet of Medical Things Raises Novel Compliance Challenges, Med. Device Online (Jan. 3, 2018),
    [] (acknowledging with the growth of IoMT manufacturers should “keep abreast of current minimum-security standards” to avoid lawsuits); Untangling the Web, supra note 4.

  25. . See Megan Brown et al., Cyber Imperative: Preserve and Strengthen Public-Private Partnerships 12 (2018).

  26. . See Detsch, supra note 22.

  27. . See generally Charlie Mitchell, Mark Warner Eyes Liability for Software Developers as Key Way to Shore up Cybersecurity, Wash. Examiner (Apr. 10, 2018), [].

  28. . See Untangling the Web, supra note 4.

  29. . See Lemos, supra note 18; see also Matthew Ashton, Note, Debugging the Real World: Robust Criminal Prosecution in the Internet of Things, 59 Ariz. L. Rev. 805, 834 (2017); Mitchell, supra note 26.

  30. . See generally Mitchell, supra note 26 (suggesting a cybersecurity doctrine should be implemented to include software liability); Detsch, supra note 22 (“[L]eading digital security experts are calling on US policymakers to hold manufacturers liable for software vulnerabilities in their products in an effort to prevent the bugs commonly found in smartphones and desktops from pervading the emerging IoT space.”).

  31. . See Mitchell, supra note 26.

  32. . See id.; John Daley, Note, Insecure Software is Eating the World: Promoting Cybersecurity in an Age of Ubiquitous Software-Embedded Systems, 19 Stan. Tech. L. Rev. 533, 542 (2016) (“Critics will contend that any liability borne by software vendors will extinguish the vibrant startup ecosystem.”).

  33. . See Daley, supra note 31, at 541.

  34. . See generally id. (describing an alternate safe harbor model); Mauricio Paez & Kerianne Tobitsch, The Industrial Internet of Things: Risks, Liabilities, and Emerging Legal Issues, 62 N.Y.L. Sch. L. Rev. 217, 228 (2018); Scott Wenzel, Not Even Remotely Liable: Smart Car Hacking Liability, 2017 U. Ill. J.L. Tech. & Pol’y 49, 69.

  35. . See generally Daley, supra note 31, at 541 (discussing the need to incentivize cybersecurity practices and proposing a separate safe-harbor liability structure).

  36. . See generally Mitchell, supra note 26 (explaining that a dialogue must be started on incentivizing software companies to develop secure code, and liability may contribute to this incentive model); Paul Merrion, Litigation Key to Securing Internet of Things, Capitol Hill Staffers Told, CQ Roll Call, June 8, 2017, 2017 WL 2470487.

  37. . See generally Mitchell, supra note 26 (relying on software maker’s user license agreements, courts have found in favor of software developers in civil suits); Merrion, supra note 35.

  38. . Untangling the Web, supra note 4; see Paez & Tobitsch, supra note 33, at 238; see also Davenport, supra note 12, at 260; Shah, supra note 4 (explaining that “[h]ealth care continues to undergo lightning-fast transformation,” particularly as it “enter[s] the brave new world of the Internet of Things”).

  39. . Scott R. Peppet, Regulating the Internet of Things: First Steps Toward Managing Discrimination, Privacy, Security, and Consent, 93 Tex. L. Rev. 85, 92 (2014).

  40. . Tarifa B. Laddon & Blake A. Angelino, Medical Device Litigation: The “Internet of Things” is Coming, In-House Def. Q., Summer 2017, at 26, 26.

  41. . Amelia R. Montgomery, Note, Just What the Doctor Ordered: Protecting Privacy Without Impeding Development of Digital Pills, 19 Vand. J. Ent. & Tech. L. 147, 148 (2016).

  42. . Id.

  43. . Charlotte A. Tschider, Enhancing Cybersecurity for the Digital Health Marketplace, 26 Annals Health L., Winter 2017, at 1, 1.

  44. . Leta E. Gorman, The Era of the Internet of Things: Can Product Liability Laws Keep Up?, 84 Def. Couns. J. 1, 1 (2017).

  45. . See id. at 1–2; Davenport, supra note 12, at 261; see also Alan M. Winchester & Jaime L. Regan, Attacking Justiciability of Cybersecurity Claims in the Product Liability Context, Defense, Nov. 2015, at 84, 87 (2015) (discussing the definition of IoT).

  46. . Gorman, supra note 43, at 1–2; see also Dalmacio V. Posadas, Jr., After the Gold Rush: The Boom of the Internet of Things, and the Busts of Data-Security and Privacy, 28 Fordham Intell. Prop. Media & Ent. L.J. 69, 75 (2017) (explaining the collection of data).

  47. . See Paez & La Marca, supra note 5, at 31; Posadas, supra note 45, at 76–77.

  48. . See Posadas, supra note 45, at 76–77; Frequently Asked Questions, RFID J., [].

  49. . Paez & La Marca, supra note 5, at 31; Posadas, supra note 45, at 76–77.

  50. . Paez & La Marca, supra note 5, at 31; Posadas, supra note 45, at 76.

  51. . Paez & La Marca, supra note 5, at 31.

  52. . Id.

  53. . See Florin, supra note 3, at 8; Segura et al., supra note 23; Shah, supra note 4.

  54. . See Segura et al., supra note 23.

  55. . See id.

  56. . Id.; see Paez & La Marca, supra note 5, at 31–33; Gorman, supra note 43, at 7; Sarah Knapton, Terrorists Could Hack Pacemakers Like In Homeland, Say Security Experts, Telegraph (Nov. 6, 2014), [].

  57. . Segura et al., supra note 23; see also Paez & La Marca, supra note 5, at 29 (supporting the emergence of IoT).

  58. . See Segura et al., supra note 23.

  59. . See id.

  60. . See id.; see also Fed. Trade Comm’n, Internet of Things: Privacy & Security in a Connected World 7–8 (2015).

  61. . See Segura et al., supra note 23; see Fed. Trade Comm’n, supra note 59, at 7.

  62. . See Gorman, supra note 43, at 2 (noting the decreased need of human interaction); Segura et al., supra note 23 (“As the IoMT streamlines telemedicine, the physical office is becoming less critical for routine appointments, because patients can now communicate with their doctors via phone and video conference, as well as get prescription orders re-filled—all without leaving their homes, and at reduced cost.”).

  63. . See Fed. Trade Comm’n, supra note 59, at 2, 7–8.

  64. . Id. at 7 (“For example, insulin pumps and blood-pressure cuffs that connect to a mobile app can enable people to record, track, and monitor their own vital signs, without having to go to a doctor’s office.”).

  65. . See, e.g., Hendrik Sybrandy, Doctors Hope New Digital Pill Will Encourage Medication Adherence, CGTN: America (Aug. 5, 2018), 08/05/doctors-hope-new-digital-pill-will-encourage-medication-adherence [ TT3N-3CYR].

  66. . See id.

  67. . See id.; Andrea B. Neiman et al., CDC Grand Rounds: Improving Medication Adherence for Chronic Disease Management–Innovations and Opportunities, 66 CDC Morbidity & Mortality Wkly. Rep. 1241, 1248 (2017).

  68. . See Sybrandy, supra note 64.

  69. . See Florin, supra note 3, at 5 (explaining that IoT can “improve performance and reduce inefficiencies in numerous sectors”).

  70. . See Paez & La Marca, supra note 5, at 34; Fed. Trade Comm’n, supra note 59, at 7–8.

  71. . President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 5.

  72. . Id.

  73. . Id. at ES-1; see Shah, supra note 4 (“Health care organizations often pursue IoT efforts to find novel ways to engage patients, monitor health status, derive insights from clinical data, and advance care management and population health.”).

  74. . Segura et al., supra note 23.

  75. . Id.

  76. . Id.

  77. . See Sandra Burmeier et al., Swiss Re SONAR–New Emerging Risk Insights 11 (Urs Leimbacher et al. eds., 2015); see also President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at ES-1 (noting additional possible risks).

  78. . President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at ES-1.

  79. . See Fed. Trade Comm’n, supra note 59, at 10.

  80. . Gorman, supra note 43, at 3; Paez & La Marca, supra note 5, at 37–40.

  81. . See O’Brien, supra note 10; Kathryn R. Coburn, The Internet of Medical Things: Scientific and Technical Innovations Predict, Preempt, and Treat Disease, SciTech L., Spring 2016, at 18, 19 (2016) (“Data in the IoMT is not secure.”); see also Fed. Trade Comm’n, supra note 59, at 12 (discussing the exploitation of vulnerabilities in devices).

  82. . Coburn, supra note 80, at 19; Paez & La Marca, supra note 5, at 46.

  83. . Gorman, supra note 43, at 3; Paez & La Marca, supra note 5, at 39.

  84. . Clemens Scott Kruse et al., Cybersecurity in Healthcare: A Systematic Review of Modern Threats and Trends, Tech. & Health Care, Aug. 19, 2016, at 1, 6; Tschider, supra note 42, at 8.

  85. . See Ashton, supra note 28, at 834; Paez & Tobitsch, supra note 33, at 240.

  86. . See Ashton, supra note 28, at 834 (“[T]he average consumer tends to undervalue the security of Internet-based products.”).

  87. . Florin, supra note 3, at 18; President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 6.

  88. . Cf. President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 1 (discussing the growing threat caused by the expansion of interconnected IoT devices).

  89. . See id. at 6, 12.

  90. . See Florin, supra note 3, at 18.

  91. . Charlie Osborne, U.S. Hospital Pays $55,000 to Hackers after Ransomware Attack, ZDNet (Jan. 17, 2018), [].

  92. . See id.

  93. . Florin, supra note 3, at 5; see Gorman, supra note 43, at 3; Paez & La Marca, supra note 5, at 48 (“[A] breach of an IoT object can also result in significant bodily harm.”); Peppet, supra note 38, at 134 (“[I]nsulin pumps have been shown to be vulnerable to hacking.”). See generally Detsch, supra note 22 (referencing the risks of injury from IoT).

  94. . Andrea Peterson, Yes, Terrorists Could Have Hacked Dick Cheney’s Heart, Wash. Post: The Switch (Oct. 21, 2013), 2013/10/21/yes-terrorists-could-have-hacked-dick-cheneys-heart/?utm_term=.bf1506f843d8 [].

  95. . Id.; see Trevor Weyland, Medical Device Cybersecurity, Gallagher Healthcare: Industry Insights Blog (May 19, 2016), blog/post/medical-device-cybersecurity [] (“In 2015, students at the University of Alabama hacked the pacemaker implanted in an iStan (a robotic dummy patient used to train medical students) and were able to speed up its heart rate.”).

  96. . Paez & La Marca, supra note 5, at 48; Fed. Trade Comm’n, supra note 59, at 12; Weyland, supra note 94.

  97. . Weyland, supra note 94.

  98. . See Paez & La Marca, supra note 5, at 48.

  99. . See Brown et al., supra note 24, at 10.

  100. . Churi et al., supra note 13, at 4.

  101. . See Health Care Indus. Cybersecurity Task Force, Report on Improving Cybersecurity in the Health Care Industry 9 (2017),

  102. . See Dean, supra note 14, at 2–4; Dave Fornell, Raising the Bar for Medical Device Cyber Security, DAIC: Cybersecurity (Aug. 16, 2017), [].

  103. . See Dean, supra note 14, at 2–4.

  104. . See Paez & Tobitsch, supra note 33, at 240.

  105. . Paez & La Marca, supra note 5, at 40.

  106. . See, e.g., Office for Civil Rights, HIPAA for Professionals, U.S. Dep’t Health & Hum. Servs. (June 16, 2017), []; Medical Devices, U.S. Food & Drug Admin., https://www.
    []. The Federal Trade Commission (“FTC”) is another agency responsible for consumer protection and the elimination of anti-competitive behaviors. See Fed. Trade Comm’n, Privacy & Data Security Update: 2017 1 (2017). The FTC is not specific to health care, and its regulatory authority extends primarily to unfair and deceptive acts or practices. See id. FTC’s authority is not directed to preventing or regulating privacy and security standards in the healthcare industry, and the FTC does not create cybersecurity standards. See Kirk J. Nahra & Bethany A. Corbin, Digital Health Regulatory Gaps in the United States, 4 Compliance Elliance J. 21, 30 (2018). As a result, the FTC “does not address legislative gaps that may leave digital health technology unregulated.” Id.

  107. . See Office for Civil Rights, About Us (OCR), U.S. Dep’t Health & Hum. Servs., []; Consumers (Medical Devices), U.S. Food & Drug Admin., [].

  108. . See Paez & Tobitsch, supra note 33, at 240.

  109. . HIPAA Privacy and Security for Beginners, Wiley Rein: Newsls. (July 2014), [] [hereinafter HIPAA for Beginners].

  110. . Id.

  111. . See id.

  112. . Paez & Tobitsch, supra note 33, at 240.

  113. . See id.

  114. . A healthcare provider is any individual or organization that gets paid to provide health care and transmits health information in electronic form. 45 C.F.R. § 160.102 (2018).

  115. . A health plan is an individual or group that pays the cost of medical care. Id.

  116. . A healthcare clearinghouse consists of entities that process information so it can be transmitted in standard format between covered entities. Id.

  117. . See id. § 160.103; HIPAA for Beginners, supra note 108.

  118. . § 160.103; HIPAA for Beginners, supra note 108.

  119. . HIPAA for Beginners, supra note 108.

  120. . 45 C.F.R. § 164.502(a).

  121. . Id. § 164.302.

  122. . Id.

  123. . See DHHS, Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA 20 (2016); Scott J. Shackelford et al., When Toasters Attack: A Polycentric Approach to Enhancing the “Security of Things,” 2017 U. Ill. L. Rev. 415, 448–49; Kirk Nahra, What Closing the HIPAA Gaps Means for the Future of Healthcare Privacy, HITECH Answers (Nov. 9, 2015),
    re-privacy-2/ [].

  124. . Paez & Tobitsch, supra note 33, at 240; Montgomery, supra note 40, at 170 (examining the application of HIPAA to the Proteus digital pill and noting relevant statutory gaps).

  125. . See Elizabeth Snell, How Do HIPAA Regulations Apply to Wearable Devices?, HealthIT Security (Mar. 23, 2017), [].

  126. . See Nicolas P. Terry, Will the Internet of Things Transform Healthcare?, 19 Vand. J. Ent. & Tech. L. 327, 342 (2016) (“HIPAA data protection seldom will apply to data generated or stored on a mobile device, wearable, or IoT node.”).

  127. . See Snell, supra note 124.

  128. . Paez & Tobitsch, supra note 33, at 240.

  129. . See 45 C.F.R. § 164.502(a).

  130. . See id. § 164.502(a), (d).

  131. . See generally Terry, supra note 125, at 338, 342 (discussing coverage of electronic medical apps under HIPAA).

  132. . State ex rel. Cincinnati Enquirer v. Daniels, 844 N.E.2d 1181, 1185 ¶ 11 (Ohio 2006).

  133. . See 45 C.F.R. § 164.502(a), (d).

  134. . See id.

  135. . President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 6; Davenport, supra note 12, at 260.

  136. . See Terry, supra note 125, at 338–39, 342.

  137. . See id. at 343; Dean, supra note 14, at 3–4; President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 6.

  138. . See 42 U.S.C. § 1320d-6 (2009); Standards for Privacy of Individually Identifiable Health Information, 65 Fed. Reg. 82462-01, 82601 (Dec. 28, 2000); Acara v. Banks, 470 F.3d 569, 571–72 (5th Cir. 2006); Byrne v. Avery Ctr. for Obstetrics & Gynecology, P.C., 102 A.3d 32, 45 (Conn. 2014).

  139. . See generally HIPAA for Beginners, supra note 108.

  140. . See New York Attorney General Addresses Key Health Care Privacy Gaps, Wiley Rein: Newsls. (Apr. 2017), [].

  141. . See 21 U.S.C. § 360c (2012).

  142. . Steve Kanovsky et al., Chapter 8: The Medical Device Approval Process, in A Practical Guide to FDA’s Food and Drug Law and Regulation 211, 213 (Kenneth R. Piña & Wayne L. Pines eds., 6th ed. 2017) [hereinafter Food & Drug Law Guide].

  143. . See 21 U.S.C. § 360c(a)(1)(B) (2016); Kanovsky et al., supra note 141, at 213–14.

  144. . See 21 U.S.C. § 360c(a)(1)(C) (2016); Kanovsky et al., supra note 141, at 213–14.

  145. . FDA, Postmarket Management of Cybersecurity in Medical Devices: Guidance for Industry and Food and Drug Administration Staff 1, 4 (2016) [hereinafter FDA, Postmarket Guidance].

  146. . Id. at 4.

  147. . Id.

  148. . Id.

  149. . See id.

  150. . Id. at 6.

  151. . See id.

  152. . FDA, Content of Premarket Submissions for Management of Cybersecurity in Medical Devices: Draft Guidance for Industry and Food and Drug Administration Staff (2018).

  153. . Id. at 4.

  154. . See generally id. (presenting draft guidelines to strengthen medical devices against cybersecurity threats).

  155. . See id. at 1, 5; Louiza Dudin, Note, Networked Medical Devices: Finding a Legislative Solution to Guide Healthcare into the Future, 40 Seattle U. L. Rev. 1085, 1093, 1098 (2017).

  156. . See generally Dudin, supra note 154, at 1093 (explaining that voluntary FDA guidance does “not appear to provide a strong incentive for manufacturers to meet their duty of care in ensuring the cybersecurity of their devices”). Dudin advised that the FDA should “leverage its ability to increase oversight under its regulatory authority in order to ensure that manufacturers comply with safety and security standards and address threats proactively rather than reporting adverse events after the fact.” Id. at 1098.

  157. . See id. at 1093 (“[D]evices approved for market by the FDA are shielded from manufacturer liability claims.”).

  158. . See infra notes 153–56.

  159. . See Tara Swaminatha, The Rise of the NIST Cybersecurity Framework, CSO (May 11, 2018), [].

  160. . HIMSS N. Am., 2018 HIMSS Cybersecurity Survey 18 (2018).

  161. . Nat’l Inst. of Standards & Tech., Framework for Improving Critical Infrastructure Cybersecurity Version 1.1 (2018).

  162. . HITRUST, Introduction to the HITRUST CSF, Version 9.1 (2018).

  163. . Ctr. for Internet Sec., CIS Controls, Version 7 (2018).

  164. . Joint Tech. Comm. ISO/IEC JTC 1, Int’l Org. for Standardization, ISO/IEC 27001 (2013).

  165. . IT Governance Inst., Control Objectives for Information and Related Technologies 4.1 (2007).

  166. . See Scott Schlimmer, Implementing the NIST Cybersecurity Framework Could Be Worth at Least $1.4m to Your Business, CSO (Apr. 19, 2018), [].

  167. . DHHS, Office for Civil Rights, HIPAA Security Rule Crosswalk to NIST Cybersecurity Framework (2016).

  168. . See Jeffrey Voas et al., Nat’l Inst. of Standards & Tech., NIST Cybersecurity White Paper: Internet of Things (IoT) Trust Concerns, at i–ii (draft 2018); see also Paez & La Marca, supra note 5, at 51 (explaining that in September 2015, NIST published a draft IoT framework called the Framework for Cyber-Physical Systems, which sought to create a shared understanding of Cyber-Physical Systems).

  169. . See Privacy Framework, Nat’l Inst. of Standards & Tech., [].

  170. . See generally NIST Releases Second Draft to Cybersecurity Framework, ANSI Encourages Stakeholders to Comment, Am. Nat’l Standards Inst. (Dec. 8, 2017),
    6-9753-9e6e0ddaff9f [] (explaining that the NIST framework was created through collaboration between industry and government).

  171. . Swaminatha, supra note 158.

  172. . See id.

  173. . See, e.g., Nat’l Inst. of Standards & Tech., supra note 160, at v.

  174. . See President’s Nat’l Sec. Telecomm. Advisory Comm., supra note 12, at 6.

  175. . See, e.g., Dean, supra note 14, at 3–4.

  176. . Id.; see Paez & La Marca, supra note 5, at 52–53 (“IoT manufacturers often lack an economic incentive to provide software updates and support. . . .”); Daley, supra note 31, at 535 (“[T]he economic and legal structure of the software development industry leaves no single entity with strong enough incentives to secure software before it is shipped.”).

  177. . See Dean, supra note 14, at 3–4.

  178. . Daley, supra note 31, at 537–38. “[T]he vast majority of consumers lack the expertise to effectively evaluate security features. Consumers therefore lack the ability to effectively compare security across competitors.” Id.

  179. . See id.; Dean, supra note 14, at 3–4.

  180. . See Ponemon Inst., Medical Device Security: An Industry Under Attack and Unprepared to Defend 4 (2017).

  181. . Id. at 1.

  182. . Id. at 2.

  183. . Id. at 1–2.

  184. . Id. at 14.

  185. . Id. at 2, 14.

  186. . Id. at 2.

  187. . Paez & La Marca, supra note 5, at 53.

  188. . See Ponemon Inst., supra note 180, at 3. Sixty-two percent of device manufacturers also do not follow a published Secure Development Life Cycle process for medical devices. Id. at 14.

  189. . Id. at 8.

  190. . Id. at 2.

  191. . See Dean, supra note 14, at 8–11; Daley, supra note 31, at 538; Lemos, supra note 18.

  192. . See Daley, supra note 31, at 542; Evans, supra note 17; Rosenzweig, supra note 19; see, e.g., Bethany Corbin & Megan Brown, Partnerships Can Enhance Security in Connected Health and Beyond, CircleID (Dec. 14, 2017),

  193. . See Brown et al., supra note 24; Ashton, supra note 28, at 834; Evans, supra note 17.

  194. . Jonathan D. Klein, 2017: The Year of Big Shifts in Cybersecurity, Legal Intelligencer (May 30, 2017), [].

  195. . See Dean, supra note 14, at 4; Paez & La Marca, supra note 5, at 52–53; Klein, supra note 194.

  196. . See generally Dean, supra note 14, at 4 (explaining the reasons for medical devices lacking security); Gorman, supra note 43.

  197. . See Gorman, supra note 43, at 4. While products liability claims for medical devices may be preempted by the Food, Drug, and Cosmetic Act and subsequent amendments, preemption is beyond the scope of this Article. For purposes of this Article, it is assumed that preemption does not bar state products liability claims.

  198. . Dean, supra note 14, at 9; Gorman, supra note 43; Paez & La Marca, supra note 5, at 57.

  199. . Dean, supra note 14, at 9; Paez & La Marca, supra note 5, at 57.

  200. . See, e.g., Liability and IoT Devices—A Legal Can of Worms, Data Foundry Blog (May 15, 2018), []. Determining the liability for IoT devices “will be more difficult than ever” because “the diversity of the IoT field has turned the typical regulatory landscape on its head.” Id.

  201. . Paez & La Marca, supra note 5, at 57.

  202. . Dean, supra note 14, at 10.

  203. . Id.

  204. . Id.

  205. . Id.

  206. . Paez & La Marca, supra note 5, at 57.

  207. . Dean, supra note 14, at 10.

  208. . Id. at 16.

  209. . See Beery & Burns, supra note 20, at 55; see also Butler, supra note 21, at 919–21 (discussing the impact of the economic loss doctrine on IoT devices).

  210. . Dean, supra note 14, at 16.

  211. . Id.

  212. . See Butler, supra note 21, at 919–21; Paez & La Marca, supra note 5, at 48.

  213. . Dean, supra note 14, at 16.

  214. . See Butler, supra note 21, at 919–21.

  215. . Dean, supra note 14, at 16.

  216. . See id. at 18.

  217. . Id. at 17; see also Paez & La Marca, supra note 5, at 58 (discussing whether software should be considered a product or a service).

  218. . Dean, supra note 14, at 17.

  219. . See id. at 16–18; see also Untangling the Web, supra note 4.

  220. . Dean, supra note 14, at 19.

  221. . See generally Paez & La Marca, supra note 5, at 46, 48, 59.

  222. . See Dean, supra note 14, at 7 (expounding on the vast number of errors that always exist in code); Paez & La Marca, supra note 5, at 59; Wenzel, supra note 33, at 59 (stating that there is no such thing as a computer that cannot be hacked); Beery & Burns, supra note 20, at 55 (noting that all complex software is understood to have bugs).

  223. . See Paez & La Marca, supra note 5, at 59–60; Evans, supra note 17; see also Daley, supra note 31, at 542 (discussing that any liability borne by software vendors will extinguish the current startup ecosystem).

  224. . See Dean, supra note 14, at 12–13, 21; Untangling the Web, supra note 4.

  225. . See Paez & La Marca, supra note 5, at 30 (“[T]he IoT ecosystem hinges on the interconnectivity of countless devices and participants, companies will need to account for the legal rights and obligations of multiple stakeholders involved throughout a product’s entire lifecycle, from design and manufacturing to installation, operation, maintenance and decommissioning.”); see also Dean, supra note 14, at 12–13, 21; Untangling the Web, supra note 4.

  226. . See, e.g., Paez & La Marca, supra note 5, at 60.

  227. . Dean, supra note 14, at 21.

  228. . Id.

  229. . Id. at 3, 21.

  230. . See id. at 12–13, 21; Paez & La Marca, supra note 5, at 60.

  231. . Lewison v. Renner, 905 N.W.2d 540, 548 (Neb. 2018) (citing Latzel v. Bartek, 846 N.W.2d 153 (Neb. 2014)).

  232. . See Scott, supra note 19, at 459.

  233. . See Lewison, 905 N.W.2d at 548 (explaining the general standards for prevailing in a negligence action).

  234. What Is Product Liability Negligence?, [].

  235. . See id.

  236. . Scott, supra note 19, at 459.

  237. . See Michael Weinberger, New York Products Liability § 18:3 (2d ed. 2018).

  238. . Id.

  239. . See id.; Butler, supra note 21, at 927.

  240. . See Butler, supra note 21, at 915.

  241. . See Dean, supra note 14, at 7; Vincent J. Vitkowsky, The Internet of Things: A New Era of Cyber Liability and Insurance 15, 16 (2015); Paez & La Marca, supra note 5, at 59; Wenzel, supra note 33, at 59; Beery & Burns, supra note 20, at 58; Evans, supra note 17; Untangling the Web, supra note 4.

  242. . Dean, supra note 14, at 7.

  243. . See, e.g., Karen Schultz & Theodore Z. Wyman, Texas Jurisprudence § 34 (3d ed. 2018).

  244. . See Connally v. Sears, Roebuck & Co., 86 F. Supp. 2d 1133, 1137 (S.D. Ala. 1999) (quoting Beech v. Outboard Marine Corp., 584 So. 2d 447, 450 (Ala. 1991)).

  245. . See, e.g., Butler, supra note 21, at 915.

  246. . See Burmeier et al., supra note 76, at 11; Carmen Camara et al., Security and Privacy Issues in Implantable Medical Devices: A Comprehensive Survey, 55 J. Biomedical Informatics 272, 272 (2015).

  247. . See Paez & La Marca, supra note 5, at 59; see also Detsch, supra note 22 (discussing faulty codes in IoT devices that cause serious bodily harm or death).

  248. . See Dean, supra note 14, at 7; Paez & La Marca, supra note 5, at 52–53.

  249. . See Vitkowsky, supra note 241, at 16; see also Beery & Burns, supra note 20, at 57–58 (explaining that it will be difficult to establish an accepted duty of care).

  250. . Lewison v. Renner, 905 N.W.2d 540, 548 (Neb. 2018) (citing Latzel v. Bartek, 846 N.W.2d 153 (Neb. 2014)).

  251. . See, e.g., Gorman, supra note 43, at 4–5; Merritt Baer & Chinmayi Sharma, What Cybersecurity Standard Will a Judge Use in Equifax Breach Suits?, Lawfare (Oct. 20, 2017, 7:30 AM), [].

  252. . See discussion supra Part III.

  253. . See, e.g., FDA, Postmarket Guidance, supra note 144.

  254. . See, e.g., Nat’l Inst. of Standards & Tech., supra note 161, at v.

  255. . See Vitkowsky, supra note 241, at 16; see also Beery & Burns, supra note 20, at 57–58 (quoting Vitkowsky, supra note 241, at 16).

  256. . Stacy-Ann Elvy, Hybrid Transactions and the Internet of Things: Goods, Services, or Software?, 74 Wash. & Lee L. Rev. 77, 79–80, 87–88, 104 (2017).

  257. . Id. at 103.

  258. . See id. at 88–89, 103–04.

  259. . Id. at 89.

  260. What are Express and Implied Warranties?, FindLaw, [].

  261. . Elvy, supra note 256, at 115; see also What are Express and Implied Warranties?, supra note 260 (explaining in more detail how express warranties are different from implied warranties).

  262. . Vitkowsky, supra note 241, at 16; see Paez & La Marca, supra note 5, at 59; Wenzel, supra note 33, at 59; Beery & Burns, supra note 20, at 58; see Untangling the Web, supra note 4. See generally Dean, supra note 14, at 7 (noting that “buggy software is not exceptional” in that programmers make an estimated ten to fifty errors for every one-thousand lines of code that they write).

  263. . Dean, supra note 14, at 10; Evans, supra note 17.

  264. . See Gorman, supra note 43, at 4.

  265. . Lemos, supra note 18, at 2.

  266. . Beery & Burns, supra note 20.

  267. . See What are Express and Implied Warranties?, supra note 260.

  268. . See generally id. (explaining the circumstances in which an implied warranty may arise).

  269. Can Implied Warranty Protection Be Disclaimed?, [].

  270. . See, e.g., Robert W. Gomulkiewicz, The Implied Warranty of Merchantability in Software Contracts: A Warranty No One Dares to Give and How to Change That, 16 J. Marshall J. Computer & Info. L. 393, 398 (1998); Charles H. Moellenberg, Jr. & Robert W. Kanter, Be Wary of Warranties for Software Design, Jones Day Insights (Aug. 2018), [].

  271. . See Gomulkiewicz, supra note 270, at 394.

  272. . Daley, supra note 31, at 538.

  273. . Dean, supra note 14, at 9; see Merrion, supra note 35 (“[E]xperts on the Internet of Things said class-action product liability lawsuits could help pressure manufacturers to build more security into web-connected devices. . . . [L]itigation was mentioned repeatedly as a way to get the attention of web device manufacturers in the near term.”).

  274. . Daley, supra note 31, at 538.

  275. . Id. at 541.

  276. . Detsch, supra note 22.

  277. . See Beery & Burns, supra note 20; Untangling the Web, supra note 4.

  278. . See Daley, supra note 31, at 537–38; see, e.g., Lemos, supra note 18.

  279. . See Butler, supra note 21, at 927.

  280. . See Dean, supra note 14, at 12 (noting that liability structures take time to develop).

  281. . Scott, supra note 19, at 426–27.

  282. . See Butler, supra note 21, at 926; Daley, supra note 31, at 538.

  283. . See Dean, supra note 14, at 4; Paez & La Marca, supra note 5, at 52–53.

  284. . Butler, supra note 21, at 916 (“[H]olding manufacturers liable for downstream harms caused by their insecure devices is well aligned with the purposes of products liability law—to minimize harm by encouraging manufacturers (as a least-cost-avoider) to invest in security measures.”); see Detsch, supra note 22; see also Wenzel, supra note 33, at 67 (presenting a similar argument for smart car technology).

  285. . Daley, supra note 31, at 538.

  286. . Rosenzweig, supra note 19; see Daley, supra note 31, at 538 (describing the current environment of under-investment in software security); see also Untangling the Web, supra note 4 (discussing the risk of IoT devices caused by manufacturers that fail to provide security measures).

  287. . See Daley, supra note 31, at 538.

  288. . See, e.g., Mitchell, supra note 26.

  289. . Daley, supra note 31, at 538.

  290. . See Ponemon Inst., supra note 180, at 2, 10.

  291. . Id.

  292. . Id. at 1–2; see Rosenzweig, supra note 19.

  293. . See Dean, supra note 14, at 12.

  294. . See Mitchell, supra note 26.

  295. . Id.

  296. . Id.

  297. . Id.

  298. . See id.

  299. . Daley, supra note 31, at 538, 542 (explaining that protection from liability keeps costs down).

  300. . Daley, supra note 31, at 537, 542; Evans, supra note 17; see also Vitkowsky, supra note 241, at 16 (explaining that some will argue holding software companies liable for defects will “discourage innovation and growth”); Paez & La Marca, supra note 5, at 60 (“Extending strict products liability to software defects could also dramatically obstruct technological progress for the IoT.”).

  301. . See Daley, supra note 31, at 542.

  302. . See Paez & La Marca, supra note 5, at 59; see also Why You Probably Don’t Have Product Liability for the Software You Develop…Yet, Insureon: You’re IT (June 21, 2016), []; Evans, supra note 17 (discussing industries that are subject to liability).

  303. . See generally Abbott & the Chertoff Grp., Why Medical Device Manufacturers Must Lead on Cybersecurity in an Increasingly Connected Healthcare System 6 (2018) (“[C]ybersecurity should not function as a competitive differentiator, but as a uniform device enabler.”); Rosenzweig, supra note 19 (discussing some federal organizations’ consideration of fines for violating cybersecurity “best practices”).

  304. . See Wenzel, supra note 33, at 69.

  305. . Mitchell, supra note 26.

  306. . See Detsch, supra note 22.

  307. . See id. (“[L]eading digital security experts are calling on US policymakers to hold manufacturers liable for software vulnerabilities in their products in an effort to prevent the bugs commonly found in smartphones and desktops from pervading the emerging IoT space.”).

  308. . See Dean, supra note 14, at 12 (noting that liability structures take time to develop); Evans, supra note 17.

  309. . Paez & La Marca, supra note 5, at 59; Beery & Burns, supra note 20, at 58.

  310. . See Dean, supra note 14, at 8–10; Scott, supra note 19, at 469; Ashton, supra note 28, at 834–35; Merrion, supra note 35.

  311. . See Scott, supra note 19, at 469–70.

  312. . See, e.g., Daley, supra note 31, at 541; Wenzel, supra note 33, at 69.

  313. . See, e.g., Ohio Rev. Code Ann. §§ 1354.01–.05 (West, Westlaw through Files 1 to 9, immediately effective RC sections of File 10, and Files 11 to 14 of the 133rd General Assembly (2019–2020)).

  314. . Jeff Kosseff, Positive Cybersecurity Law: Creating a Consistent and Incentive-Based System, 19 Chap. L. Rev. 401, 403 (2016) (“[P]ositive cybersecurity law,” such as incentives, “requires a shift in thinking from our nation’s longstanding mindset in which nearly all cybersecurity laws are punitive.”).

  315. . See Davenport, supra note 12, at 260; Klein, supra note 194.

  316. . See Corbin & Brown, supra note 192.

  317. . Kosseff, supra note 314, at 411; see U.S. Dep’t of Homeland Sec., Executive Order 13636: Improving Critical Infrastructure Cybersecurity 4 (2013).

  318. . See Kosseff, supra note 314, at 403.

  319. . See, e.g., Daley, supra note 31, at 541.

  320. . Alysa Austin et al., Ohio Enacts First Cybersecurity Safe Harbor, JD Supra (Nov. 7, 2018), []; see Data Protection Act, S.B. 220, 132nd Gen. Assemb. (Ohio 2017); Ohio Rev. Code Ann. §§ 1354.01–1354.05 (West, Westlaw through Files 1 to 9, immediately effective RC sections of File 10, and Files 11 to 14 of the 133rd General Assembly (2019–2020)).

  321. . Austin et al., supra note 320.

  322. . Id.

  323. . N.Y. Gen. Bus. Law § 899-bb (McKinney 2019) (effective Mar. 21, 2020).

  324. . S. 6933, 2017–2018 Leg., Reg. Sess. (N.Y. 2017); cf. Romaine Marshall & Craig Stewart, Safe Harbor for Data Security: New York’s Proposed Changes Could Be Followed by Other States, Legal Insights, Holland & Hart (Nov. 11, 2017), safe-harbors-for-cybersecurity-new-yorks-proposed-changes-could-be-followed-by-other-states [] (describing another New York act, the Stop Hacks and Improve Electronic Data Security Act, as an amendment that includes a safe harbor provision for companies that obtain certification).

  325. . See Marshall & Stewart, supra note 324.

  326. . See S. 6933, 2017–2018 Leg.

  327. . Alejandro Cruz & W. Scott Kim, New York’s SHIELD Act Heads to the Governor’s Desk, JD Supra (July 9, 2019), [].

  328. . See generally Scott, supra note 19, at 469 (arguing that imposing liability on software companies will encourage more security measures); Ashton, supra note 28, at 834–35 (arguing that incentives will help to produce safer and more consistent security measures for software products); Merrion, supra note 35; Rosenzweig, supra note 19 (discussing the current use of end-user agreements in software contracts which disclaim liability).

  329. . See generally Scott J. Shackelford et al., Toward a Global Cybersecurity Standard of Care?: Exploring the Implications of the 2014 NIST Cybersecurity Framework on Shaping Reasonable National and International Cybersecurity Practices, 50 Tex. Int’l L.J. 305, 343, 345 (2015) (“The incentive reports issued by the DHS . . . included discussion on some form of limited liability for companies who voluntarily adopt the Framework.”); U.S. Dep’t of Homeland Sec., supra note 317, at 62 (“A common suggestion among respondents was the need for indemnity, at some level, from liability for security breaches [for] organizations adopting cybersecurity measures.”); Nat’l Telecomm. & Info. Admin., Discussion of Recommendations to the President on Incentives for Critical Infrastructure Owners and Operators to Join a Voluntary Cybersecurity Program 11 (2013).

  330. . U.S. Dep’t of Treasury, Treasury Dep’t Report to the President on Cybersecurity Incentives Pursuant to Executive Order 13636, at 2–3, 10–12 (2013).

  331. . See Letter from Robert W. Holleyman, II, President & CEO, BSA, to Alfred Lee, NTIA (Apr. 29, 2013).

  332. . See HITRUST, Framework for Reducing Cyber Risks to Critical Infrastructure 3, 10 (2017); Letter from Jim Wunderman, President & CEO, Bay Area Council, to The Honorable Dennis Hightower, Deputy Sec’y, Nat’l Inst. of Standards & Tech. (July 29, 2011).

  333. . Craig Spiezle, Uber, Equifax Hacks Signal Need for Accountability and Breach Regulation, Int’l Bus. Times (Dec. 5, 2017), [].

  334. . See Daley, supra note 31, at 538; Lemos, supra note 18 (“What we have is an incentive gap, and we are not going to see something different unless we incentivize something different.”).

  335. . See generally Doug Olenick, IoT Liability: Legal Issues Abound, SC Media (Mar. 30, 2017), [] (discussing that “few cases have been filed” regarding IoT liability, so case law is sparse).

  336. . Ctr. for Strategic & Int’l Studies, Tilting the Playing Field: How Misaligned Incentives Work Against Cybersecurity 3, 7 (2017).

  337. . See Lemos, supra note 18.

  338. . See Carrots for Cybersecurity, Blade (Dec. 4, 2017), Editorials/2017/12/04/Carrots-for-cybersecurity.html []; see also U.S. Dep’t of Homeland Sec., supra note 317, at 62.

  339. . See, e.g., Dean, supra note 14, at 8–10 (explaining that liability incentivizes developers to weigh the cost of mitigating known defects with the potential for large damages); Ashton, supra note 28, at 834–35; Merrion, supra note 35.

  340. . See generally Detsch, supra note 22 (explaining the need to balance the security of consumers with the technology development of software companies).

  341. . See supra Part I.

  342. . See Lemos, supra note 18.

  343. . See id.; Dean, supra note 14, at 4; Paez & La Marca, supra note 5, at 52–53.

  344. . See supra Part III.

  345. . See Dean, supra note 14, at 8–9; Scott, supra note 19, at 469; Daley, supra note 31, at 538.

  346. . See Butler, supra note 21, at 927 (noting that without clear guidance, IoT tort outcomes by courts can become random).