This post is a continuation of a series of articles written by the Citrix Labs R&D staff on the topic of IoT.
In the previous posts, we’ve defined the role of IoT in the Citrix software defined workplace, identified many security challenges unique to the IoT, reviewed the information security “CIA” triad fundamentals and described a simple IoT framework with a device layer, gateway layer, and service layer.
In this article, we examine two concrete security models which can be used to build security directly into IoT devices. The memorably named “Resurrecting Duckling” and the classic Biba security models provide a useful lens through which we can start to see the form of secure IoT taking shape.
For a snapshot of the current state of IoT device security, consider the HP research study of 10 popular (but unnamed) IoT devices, ranging from door locks to hubs that control multiple devices. It found that 90% of the devices collected personal information, 80% did not require strong passwords, and 70% sent unencrypted data. Similarly, a researcher at Kaspersky Lab hacked his own IoT-enabled home and found 14 vulnerabilities in 20 minutes, some as serious as an administrative root password of "1" and readable configuration files containing user credentials.
The obvious conclusion reached in these reports is that, for IoT device security to be fully realized, device manufacturers must build security into devices from their inception. Security cannot be an afterthought that is bolted on later. It is not acceptable to omit security just because a device is an "entertainment" gadget that will be obsolete and out of production in 12 months; we all know that devices remain in use for years in the hands of consumers even after the manufacturer has moved on.
The remainder of this post reviews potential security models to help protect IoT devices and describes Citrix’s own Octoblu IoT platform’s implementation of the models.
The Resurrecting Duckling Security Model
The Resurrecting Duckling security model, first proposed by Frank Stajano and Ross Anderson in 1999, takes its name from the following metaphor. A duckling emerging from its egg will recognize as its mother the first moving object it sees that makes a sound, regardless of its appearance. This phenomenon is called imprinting. After imprinting, the duckling will follow the orders of its mother, and no one else, until its death. The metaphor describes how IoT devices could implement secure, temporary connections over ad-hoc networks.
When applied to the IoT, the "egg" is the factory-sealed box that encloses the device. When the device is removed and powered on, it will recognize as its owner the first entity to send it a secret key. This key could be a password, a UUID, a cryptographic key, or even a biometric signature. As soon as the key is received, the device is "claimed" (no longer a newborn) and will stay faithful to its owner until death. "Death" is an important concept in this model because it is how a device changes ownership: death is the only way the device can revert to the pre-birth state so that it can be imprinted by a new master.
Device death can be designed to occur in specific scenarios, when a medical instrument is dropped into the disinfection bin, for example. Another scenario is a simple timeout so that the device dies of “old age,” for rental equipment perhaps. Yet other devices will only die when instructed to do so by their owner (such as when the device is lost, stolen, or sold), thus only the current authorized user can transfer control of the device.
Below is a simple state diagram that illustrates the Resurrecting Duckling security model and summarizes its four key principles.
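The same lifecycle can be captured as a tiny two-state machine. This is an illustrative sketch (the class and method names are made up, not from any real IoT stack): the first key presented imprints the device, and only the key holder can trigger "death" to return it to the imprintable state.

```python
class Duckling:
    """Toy model of the Resurrecting Duckling lifecycle (illustrative only)."""

    def __init__(self):
        self.imprinting_key = None  # None means "imprintable" (pre-birth state)

    @property
    def imprinted(self):
        return self.imprinting_key is not None

    def imprint(self, key):
        # Only an unclaimed device accepts the first key it is given.
        if self.imprinted:
            raise PermissionError("already imprinted; the device must die first")
        self.imprinting_key = key

    def die(self, key):
        # Only the current owner (holder of the key) may kill the device,
        # reverting it to the imprintable state so a new owner can claim it.
        if key != self.imprinting_key:
            raise PermissionError("only the owner can trigger death")
        self.imprinting_key = None


d = Duckling()
d.imprint("owner-secret")      # first key wins: the device is now claimed
assert d.imprinted
d.die("owner-secret")          # owner-triggered death: imprintable again
assert not d.imprinted
d.imprint("new-owner-secret")  # a new master can now imprint the device
```

A timeout-based "death of old age" for rental equipment would simply call the same reset path when the lease expires.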
Following Multiple Masters
The mother/duckling relationship works well to secure personal devices with only one owner, but in the real IoT world we expect many people to interact with the same devices as well as many devices interacting with each other. The imprinted duckling is faithful to its mother for its entire life, but it should also be happy to talk to others. It will even follow the directions of others, as long as the mother duck says it’s OK to do so.
To accommodate this, the model is expanded so that there are two different ways to be master. The first is the long-term mother/duckling relationship that lasts for the life of the duckling. The second is a master/slave or even peer-to-peer relationship that is transient in nature, lasting only as long as required to complete a brief transaction. The first type of relationship requires the secret imprinting key; the second does not.
Imagine the duckling as an IoT device with a number of properties that can be read and actions it can perform. The security model calls for the IoT device to have policy rules stating, for each of the device’s functions, which credentials a person (or another device) must provide to access the device’s specific properties or methods. These rules can grant or deny privileges to any of the possible device functions. A requirement in this model is that if a person/device presents the imprinting key to the device, it can upload new policy rules into it. Given these policies are critical to the security of the device, they will most likely be created by the device manufacturer.
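A minimal sketch of such a device follows, assuming a simple mapping from action names to the credentials allowed to invoke them (the class and policy shape are hypothetical, not from any particular product). The key rule from the model is preserved: only the holder of the imprinting key may replace the policy.

```python
class PolicyDevice:
    """Sketch: per-action access rules, updatable only with the imprinting key."""

    def __init__(self, imprinting_key, policy):
        self._key = imprinting_key
        # policy maps an action name -> set of credentials allowed to invoke it
        self._policy = policy

    def upload_policy(self, key, new_policy):
        # Only the holder of the imprinting key (the "mother") may change policy.
        if key != self._key:
            raise PermissionError("imprinting key required to change policy")
        self._policy = new_policy

    def invoke(self, credential, action):
        allowed = self._policy.get(action, set())
        if credential not in allowed:
            raise PermissionError(f"{action!r} denied for this credential")
        return f"{action} executed"


lock = PolicyDevice("mother-key", {"read_status": {"guest", "admin"},
                                   "unlock": {"admin"}})
assert lock.invoke("guest", "read_status") == "read_status executed"
# lock.invoke("guest", "unlock") would raise PermissionError
```

Note that peers never need the imprinting key to use the device; they only need a credential the current policy accepts.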
The various policies for the device could be ranked at different integrity levels, such as public and private, or could even be as granular as per-user. This creates the need for a multi-layer integrity model, i.e. the Biba security model as illustrated below. This security model can be summarized by three key properties:
- The Simple Integrity Property – A subject may read data from a higher integrity level, but not from a lower one
- The Star Integrity Property – A subject may write data to a lower integrity level, but not to a higher one
- The Invocation Property – A subject may not request service from (invoke) a higher integrity level
In this example, someone using the public interface does not have to provide any credentials to read data from the device. However, the public interface cannot write data to a higher layer of security nor can it invoke a function in a higher layer of security. Someone using the private interface can write data and invoke methods from a lower layer of security, but this higher layer of security does not trust data read from lower layers of security.
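The three rules reduce to simple level comparisons. Here is a toy model using only the two levels from the example above (public and private, with a higher number meaning higher integrity):

```python
# Integrity levels: a higher number means higher integrity (private > public).
PUBLIC, PRIVATE = 0, 1

def can_read(subject_level, object_level):
    # Simple Integrity Property: reading *up* is fine; data read from a
    # lower level is untrusted, so reading down is disallowed.
    return object_level >= subject_level

def can_write(subject_level, object_level):
    # Star Integrity Property: writing *down* is fine; writing up is blocked
    # so low-integrity subjects cannot contaminate high-integrity data.
    return object_level <= subject_level

def can_invoke(subject_level, object_level):
    # Invocation Property: a subject may not invoke services above its level.
    return object_level <= subject_level

assert can_read(PUBLIC, PRIVATE)        # public may read from a higher level
assert not can_write(PUBLIC, PRIVATE)   # public may not write upward
assert not can_invoke(PUBLIC, PRIVATE)  # public may not invoke private methods
assert can_write(PRIVATE, PUBLIC)       # private may write down
assert not can_read(PRIVATE, PUBLIC)    # private does not trust public data
```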
Being the managing cloud service (mother duck) allows one to perform the special action of uploading a new policy to a duckling. Apart from that, any person or thing that presents the required credentials can invoke any action permitted by the duck’s policy. This enables peer-to-peer interaction between things without having to be the mother duck.
Threat Model for Resurrecting Duckling
The secret key given to the device when imprinted is an obvious high value target in this model. To protect the key during the on-boarding process, it must be delivered over a channel that maintains its confidentiality and integrity. In addition, some level of tamper resistance is also necessary to make it suitably difficult and expensive to “assassinate” (maliciously revert to imprintable) the device without damage. In the same vein, it is necessary to make the secret key equally difficult to recover from the device so that it cannot be used for impersonation. Finally, the secret key should be unique for each device so that if a single device is compromised, only the data on that device is at risk, and not that of the entire network.
A perfect example of the need for tamper-resistant physical security is outlined in this article, where a security researcher hacks a connected light bulb to gain access to the Wi-Fi connection credentials. While tamper resistance is required to mitigate this risk at the device layer, it alludes to security functions needed at the higher layers as well. For example, the gateway or cloud service responsible for issuing the secret key must do so in a secure way. It's a good idea to take a security posture that assumes you are already hacked. With this in mind, these layers should also employ real-time analytics to identify anomalous behaviors of potentially rogue devices.
Octoblu Implements Resurrecting Duckling Imprinting
In Citrix’s Octoblu platform, the imprinting process is implemented by assigning a UUID and token to an IoT device running the Microblu OS. When connecting to the Octoblu cloud service, devices are authenticated with their UUID and token. When a device doesn’t have an owner, it is in an unclaimed (imprintable) state. The device and its properties are searchable by authenticated resources on the same network. The device can then be claimed (see the claim API). Once a device is claimed (imprinted), it will not be publicly visible, except to the owner of the device.
Octoblu also gives you the ability to further secure access to registered devices by configuring permission white-lists and black-lists. There can be a white-list or a black-list for each of the permissions stored in the device properties. The lists contain the UUIDs of devices that are granted access or banned from communicating with the secured device. Though this approach does not implement all the properties of the Biba security model, it does require a device or person to provide a secure token before they can access a particular function of the device.
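The list semantics can be sketched as a small predicate. To be clear, this is an illustrative stand-in, not Octoblu's actual data model or API: the assumed rule is that a blacklist always bans, and a whitelist, when present, admits only the UUIDs it lists.

```python
def is_permitted(requester_uuid, whitelist=None, blacklist=None):
    """Sketch of per-permission white-list/black-list checks (illustrative,
    not Octoblu's actual implementation).

    - A UUID on the blacklist is always denied.
    - If a whitelist exists, only listed UUIDs are allowed.
    - With no lists configured, access falls through to "allowed".
    """
    if blacklist and requester_uuid in blacklist:
        return False
    if whitelist is not None:
        return requester_uuid in whitelist
    return True

# A whitelisted device may connect; everyone else is denied.
assert is_permitted("uuid-1", whitelist={"uuid-1"})
assert not is_permitted("uuid-2", whitelist={"uuid-1"})
# With only a blacklist, anyone not banned may connect.
assert not is_permitted("uuid-3", blacklist={"uuid-3"})
assert is_permitted("uuid-4", blacklist={"uuid-3"})
```

In a real deployment a check like this would run per permission (e.g. discover, message, configure), using the lists stored in the secured device's properties.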
An IoT system that is "secure by design" depends on devices that have security features built in by the manufacturer from the start. Using security models like Resurrecting Duckling and Biba as a guide, we can derive the following (non-exhaustive) set of requirements for securing IoT devices:
- Device identity and enrollment – Use secret keys during enrollment or onboarding to establish identity and some level of trust between a given device and the rest of the IoT system. A device that uses cryptography will be more trustworthy than one that doesn’t.
- Imprinting – After a device's identity is established with the IoT management system, it should enter the claimed (imprinted) state to restrict usage of the device to a single management scope. For example, when a homeowner enrolls a connected door lock in their IoT management service, the lock should be flagged as claimed. After this occurs, the lock denies enrollment in any other IoT management service until the first service resets the claim switch. If devices were not claimed, what would stop someone with malicious intent from finding a way to claim your lock and gaining entry to your house?
- Tamper evident/resistant – It must be easy to tell if a thing has been physically compromised, and even if physically compromised, it must be impractical to extract valuable information.
- Isolation – If a single device in a network of things is compromised, only the data on that device should be at risk, and not that of the entire network. This usually means avoiding using symmetric group keys on the device for encryption.
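One common way to get per-device isolation is to derive a unique key for each device from a manufacturer master secret, so no shared group key ever sits on a device. The sketch below uses HMAC-SHA256 as a minimal key-derivation step; a production system would use a vetted KDF such as HKDF (RFC 5869), and the names here are illustrative.

```python
import hashlib
import hmac

def derive_device_key(master_secret: bytes, device_id: str) -> bytes:
    """Derive a unique per-device key from a master secret and device ID.

    Compromising one device exposes only its own derived key, never the
    master secret or the keys of sibling devices (unlike a shared group
    key, where one compromised device endangers the whole network).
    """
    return hmac.new(master_secret, device_id.encode(), hashlib.sha256).digest()

k1 = derive_device_key(b"factory-master-secret", "device-001")
k2 = derive_device_key(b"factory-master-secret", "device-002")
assert k1 != k2       # every device gets a distinct key
assert len(k1) == 32  # SHA-256 output length
```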
- Multi-Layer integrity – To support multiple masters and peer-to-peer interactions, the device must have multiple layers of security such as public interfaces open to anyone and private interfaces where authentication is required before the device will interact with the user or another device. The data that is exchanged between the different layers of security is carefully controlled to prevent contamination.
- Software updates – This requirement is not explicitly spelled out in the security models discussed in this post, but we all realize that bugs in software are found and exploited by those with malicious intent. To keep IoT devices (like the hacked light bulb example above) secure, there must be a framework to advertise, distribute, and install software updates to close security holes after they are found. This is largely a function of the higher layers of the IoT system, but devices must be updateable in the field.
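At minimum, a device should verify an update image against trusted metadata before installing it. The sketch below checks a firmware blob against a SHA-256 digest taken from a trusted manifest; real devices would verify an asymmetric vendor signature instead (a bare digest stands in here because the Python standard library has no public-key primitives), and all names are illustrative.

```python
import hashlib
import hmac

def verify_update(firmware: bytes, expected_digest: bytes) -> bool:
    """Verify a firmware image against a digest from a trusted manifest
    before installing it. A mismatch means the image was corrupted or
    tampered with in transit and must be rejected.
    """
    actual = hashlib.sha256(firmware).digest()
    # Constant-time comparison avoids leaking digest bytes via timing.
    return hmac.compare_digest(actual, expected_digest)

image = b"new-firmware-blob"
good_digest = hashlib.sha256(image).digest()
assert verify_update(image, good_digest)
assert not verify_update(b"tampered-blob", good_digest)
```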
Even after meeting all of the requirements listed here, device security can easily be compromised if the gateway or cloud service layers are not protected as well.
Check back soon because next we’ll cover security models for the gateway and cloud service layers of the IoT framework.