Trusted Computing Workshop
University of Surrey, 25 January 2018
The first talk was on Presence Attestation and was given by Professor Gene Tsudik from the University of California, Irvine.
- Key question: How can a non-tech-savvy user establish trust with a smart device?
- Key insight: use ambient properties, such as location, to prove to a non-tech-savvy person that a TEE is active on a smart device.
- Throughout his talk, Gene assumed there is a trusted module inside the smart device. But how can a user who is not tech-savvy know whether they are being protected by this trusted module in the presence of an untrustworthy operating system?
- Solutions:
- Scenario 1: there is a dedicated channel between the trusted module and the user (such as an LED light on a smartphone). He said this was too expensive to work in practice.
- Scenario 2: There is an untrusted analog channel (like a screen) between the user and the trusted module
- He proposed attestation through a physical/ambient property like location or a scene in a room.
- The trusted module must have a trusted channel to the sensor (e.g. GPS/camera)
- The trusted module then signs this data and sends it to an external verifier
- The external verifier could be a laptop that is inherently trusted and can tell the user if the trusted module is enabled or not on their smart device
- Side note: he mentioned cuckoo attacks (Bryan Parno), which are a hard challenge to solve in these scenarios. His solution defeats these attacks by timing the response; the sketch below shows the whole flow.
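To make this concrete, here is a minimal sketch of such a presence-attestation exchange. It assumes, for simplicity, a symmetric attestation key shared between the trusted module and the verifier (a real design would use an asymmetric attestation key with a certificate chain), and every name in it is hypothetical:

```python
import hashlib, hmac, os, time

ATTESTATION_KEY = b"demo key provisioned into the TEE"  # placeholder
MAX_ROUNDTRIP = 0.05  # seconds; a tight deadline defeats relayed responses

def read_trusted_sensor() -> bytes:
    # Stub standing in for a sensor read over the TEE's trusted channel.
    return b"51.2432,-0.5895"  # a rough GPS fix for the University of Surrey

def tee_respond(nonce: bytes):
    """Inside the trusted module: read the sensor and sign nonce + reading."""
    reading = read_trusted_sensor()
    tag = hmac.new(ATTESTATION_KEY, nonce + reading, hashlib.sha256).digest()
    return reading, tag

def verifier_check(expected_reading: bytes) -> bool:
    nonce = os.urandom(16)
    start = time.monotonic()
    reading, tag = tee_respond(nonce)          # challenge the smart device
    elapsed = time.monotonic() - start
    good_sig = hmac.compare_digest(
        tag, hmac.new(ATTESTATION_KEY, nonce + reading, hashlib.sha256).digest())
    # A cuckoo attack relays the challenge to a remote genuine TEE; the
    # extra network round trip makes it miss the response deadline.
    return good_sig and reading == expected_reading and elapsed < MAX_ROUNDTRIP

print(verifier_check(b"51.2432,-0.5895"))      # True on an honest device
```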
The second talk was on Applications for Hardware TEEs and was given by Professor N Asokan from Aalto University, Finland.
- Key question: What do we need to do to increase the use of Trusted Execution Environments (TEEs) by applications?
- Key contribution: a side-channel-resistant way to query a database from a TEE
- Why haven’t TEEs taken off yet?
- TEEs are hard for developers to use
- Attacks against TEEs discourage their use
- People are suspicious of TEEs: maybe they are used to force vendor lock-in or DRM...
- His aim for the rest of the talk was to show applications where TEEs can be used in ways that alleviate these concerns.
- Common sense applications for hardware TEEs
- Private membership test
- You want to check whether a value is in a database without disclosing the query to the server that holds the database
- His specific use-case was an antivirus provider that has a malware database and a user wants to check whether an application is in the database without disclosing which app they installed.
- His technique is to tunnel the query into a TEE at the server side
- The next problem is that it is usually not feasible to bring the database into the TEE, so how can we query a database without disclosing the query?
- One solution is ORAM, but this was too slow for his application, since the database can be large
- He made a system called "cuckoo on carousel": if I understood correctly, "cuckoo" refers to the cuckoo-filter data structure that holds the dictionary, and "carousel" means that a query is always compared against all values in the database before an answer is returned, so as to mitigate side-channel attacks (a toy version is sketched below)
- Protection of password-based web applications
- When a client sends a password to a server, the untrusted server software should never see the password.
- They want to place the password-handling functionality in a TEE at the server
- There then also needs to be a TEE at the client side to tunnel the password directly into the server's TEE (see the second sketch below).
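As I understood the carousel idea, the point is that the lookup's memory access pattern and running time are independent of both the query and the answer. A toy version, which ignores the cuckoo-filter storage and the query batching of the real system (and Python itself gives no constant-time guarantees), might look like this:

```python
def carousel_lookup(query_hash: int, dictionary: list) -> bool:
    """Compare the query against EVERY entry; never exit early on a match."""
    found = 0
    for entry in dictionary:      # always one full pass over the dictionary
        # Accumulate with OR instead of branching, so neither the access
        # pattern nor the iteration count depends on the query or the answer.
        found |= int(entry == query_hash)
    return bool(found)

malware_db = [0x1F2E, 0x9C01, 0x77AB]           # hypothetical app-hash list
print(carousel_lookup(0x9C01, malware_db))      # True
print(carousel_lookup(0xDEAD, malware_db))      # False, same time and accesses
```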
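For the password case, a rough sketch of the tunnel could look as follows. It assumes the two TEEs have already established a session key via remote attestation, and the toy seal/unseal below (a SHA-256 keystream plus an HMAC tag, capped at 32-byte messages) merely stands in for a real AEAD such as AES-GCM:

```python
import hashlib, hmac, os

def seal(key: bytes, plaintext: bytes) -> bytes:
    assert len(plaintext) <= 32, "toy cipher: a single keystream block only"
    nonce = os.urandom(16)
    stream = hashlib.sha256(key + nonce).digest()
    ct = bytes(p ^ s for p, s in zip(plaintext, stream))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unseal(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expect = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("blob was tampered with in transit")
    stream = hashlib.sha256(key + nonce).digest()
    return bytes(c ^ s for c, s in zip(ct, stream))

session_key = os.urandom(32)   # assumed to come from remote attestation

# Client TEE: seal the password before it leaves the trusted environment.
blob = seal(session_key, b"hunter2")

# Server TEE: unseal and compare against the stored salted hash; the
# plaintext password never appears in the untrusted server application.
salt = b"per-user-salt"
stored = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100_000)
candidate = hashlib.pbkdf2_hmac("sha256", unseal(session_key, blob), salt, 100_000)
print(hmac.compare_digest(stored, candidate))   # True
```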
The third talk was on Root of Trust on Commodity Computer Systems and was given by Professor Virgil Gligor from Carnegie Mellon University.
- Key goal: how to establish and maintain a root of trust on commodity computer systems that use untrusted software.
- Key contribution: using untrusted device drivers to communicate with I/O while small trusted code in a TEE verifies the drivers' operations, for both USB and GPU.
- This root of trust should remove persistent malware from all devices in a computer: not only the main CPU but also device controllers such as hard disks, GPUs, etc.
- His problems with creating an initial known trusted state were:
- Protocol atomicity: all devices must be snapshotted at the same time (otherwise malware could re-infect already-checked devices), and the control flow of the checking procedure must be verifiable (malware should not be able to interrupt it)
- Concrete optimality: the function used to prove the contents of memory must not be computable any faster by an adversary, since any saved time could be used to hide malware (see the sketch below)
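My reading of the concrete-optimality requirement, as a rough illustration rather than Gligor's actual construction: the verifier times a nonce-keyed checksum over the device's entire memory and accepts only if both the value and the elapsed time are right, leaving malware no slack in which to fake a checksum over a clean image it does not hold:

```python
import hashlib, os, time

def device_checksum(memory: bytes, nonce: bytes) -> bytes:
    # Stand-in for the checksum function; the real one must provably admit
    # no faster implementation on the target hardware (concrete optimality).
    return hashlib.sha256(nonce + memory).digest()

def attest(clean_image: bytes, device, time_budget: float) -> bool:
    nonce = os.urandom(16)
    start = time.monotonic()
    answer = device(nonce)                  # runs on the untrusted device
    elapsed = time.monotonic() - start
    reference = device_checksum(clean_image, nonce)
    # Accept only a correct answer delivered within the honest time budget.
    return answer == reference and elapsed <= time_budget

clean = bytes(4096)                         # verifier's copy of a clean image
honest_device = lambda n: device_checksum(clean, n)
print(attest(clean, honest_device, time_budget=0.01))  # True when honest
```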
- The second part of his talk was about how, once you have a root of trust, you can use small pieces of trusted code alongside large pieces of untrusted code:
- He had some funny terminology for this, namely that there are big software “giants” with many features and a couple of software “wimps” that are trusted and verified
- How to do IO with a TEE that does not rely on untrusted drivers?
- Is it possible to create simple trusted software to verify the operation of the untrusted counterpart?
- How to securely separate IO devices? (Current IOMMUs are broken)
- His research group created formally verifiable verifiers of device-driver operations for both USB and GPU (a toy illustration of the idea follows).
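To give a flavour of what verifying a driver's operations could mean (my own toy example, not the group's actual verifier): a small trusted "wimp" vets every DMA transfer programmed by the untrusted driver "giant" before it is allowed to proceed:

```python
# Buffer ranges the trusted code handed to the untrusted driver (start, end).
ALLOWED_DMA_REGIONS = [(0x1000_0000, 0x1000_4000)]

def dma_allowed(addr: int, length: int) -> bool:
    """The wimp's check: the whole transfer must fit an allowed region."""
    end = addr + length
    return any(start <= addr and end <= stop
               for start, stop in ALLOWED_DMA_REGIONS)

# The untrusted driver requests transfers; the trusted wimp vets them first.
assert dma_allowed(0x1000_0100, 512)        # inside the granted buffer: OK
assert not dma_allowed(0x0000_2000, 64)     # arbitrary kernel memory: rejected
```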
Additionally, PhD students were given a slot for a 5-minute pitch of their research:
- My presentation was on private execution using CHERI's hardware-enforced capabilities. Private execution would enable applications to keep data private from the operating system they run on top of.
- A repairable (T, D, N) key-partitioning scheme, where T out of N devices are necessary to authenticate, but D devices can help repair a lost partition (see the Shamir-style sketch at the end of these notes).
- How to create a decentralised authorisation process in cloud federations (Shorouq Alansari from the University of Southampton)
- Post-quantum hash- and lattice-based cryptography; for example, how to do direct anonymous attestation using lattice-based crypto (Matthias Kannwischer and Nada Elkassen from the University of Surrey)
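The (T, D, N) pitch maps naturally onto threshold secret sharing. Here is a sketch of just the T-out-of-N part using plain Shamir sharing over GF(p); the repair property, where D helper devices rebuild a lost share without ever reconstructing the key, is the pitch's actual contribution and is not shown:

```python
import random

P = 2**127 - 1   # a Mersenne prime, so field arithmetic is just "mod P"

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789     # any 3 of the 5 shares work
assert reconstruct(shares[2:]) == 123456789
```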