Posted by Rob Slade on September 23, 2016.
Computer and system architectures have security implications. Any new technology needs to be assessed in terms of the risk it may present. A completely new architecture means that there will be new vulnerabilities, and quantum computer architectures will be novel indeed. Many fundamental concepts of computing will have to be rethought for quantum information processing. We would be remiss in implementing any such novel technology without understanding the potential security issues.
On the other hand, we have great difficulty in analyzing our current architectures, and security architectures, to determine whether they are effective or not. The standard practice tends to be to create and implement an architecture, using experience and shared wisdom (such as security guidelines and frameworks), and then see how effective it is (or whether it is effective at all). It would be very helpful to have a simulation of vulnerabilities and protections driven by a given architecture, and to be able to evaluate different architectures in terms of which one gives the best result. Simulation is, however, very difficult and time-consuming with traditional systems. If quantum computing allows us more effective simulation, we may be able to do better than trial and error.
Security architecture makes great use of models, but even these are limited. The information flow model is one of the more basic and useful (in terms of finding, for example, covert channels), but even it is time-consuming to use effectively. By building modelling tools on quantum computers, the use of least path and simulation analysis could be made faster and more effective.
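To make the least-path idea concrete, here is a minimal classical sketch (not a quantum implementation) of searching an information-flow model for the shortest chain of permitted flows between two components. The graph, component names, and flows are entirely hypothetical; the point is that any path found is a candidate channel to examine, and an unexpected path may indicate a covert channel.

```python
from collections import deque

# Hypothetical information-flow graph: edges are permitted flows
# from one system component to another.
flows = {
    "user_input": ["app_server"],
    "app_server": ["database", "log_store"],
    "log_store": ["backup"],
    "database": [],
    "backup": [],
}

def shortest_flow_path(graph, source, sink):
    """Breadth-first search for the shortest chain of flows
    from source to sink; returns None if no path exists."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == sink:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# shortest_flow_path(flows, "user_input", "backup")
# → ["user_input", "app_server", "log_store", "backup"]
```

Doing this exhaustively over a realistic model is exactly the kind of combinatorial search that is expensive on classical hardware, which is why speedups here would matter.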
One problematic aspect of a quantum architecture is in regard to integrity. Quantum devices are highly susceptible to noise of all types: thermal, electromagnetic, and radio frequency. Some have to operate at temperatures close to absolute zero, others need to be in a vacuum, and most need to be shielded from radio transmitters (including wireless LANs and cell phones) and various electrical devices. All quantum equipment needs careful handling of input and output, and analogue apparatus in particular requires I/O filtering. Even with all this care there still seems to be a bit of indeterminacy in results.
At the moment, and with the fairly rudimentary computing mechanisms developed, “voting” (comparison of multiple devices or multiple runs) and checking of errors against other standards is sufficient. However, as applications become more complex, these measures may no longer be sufficient.
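The "voting" idea can be sketched in a few lines of classical code. This is a hypothetical illustration, assuming each run of a quantum routine returns a result we can compare as a bitstring: the modal answer is taken as the consensus, and the agreement ratio gives a rough sense of how much noise affected the computation.

```python
from collections import Counter

def majority_vote(results):
    """Return the most common result across repeated runs (or
    multiple devices), plus the fraction of runs that agreed."""
    counts = Counter(results)
    winner, hits = counts.most_common(1)[0]
    return winner, hits / len(results)

# Three runs of the same (hypothetical) quantum routine;
# one run has been corrupted by noise.
runs = ["1011", "1011", "1001"]
answer, agreement = majority_vote(runs)
# answer == "1011", agreement == 2/3
```

The weakness is evident even in the sketch: as problems grow, runs multiply and agreement ratios fall, which is why voting alone stops being sufficient.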
Fortunately, quantum error correction is a recently developed general framework which indicates that, using the concept of entanglement transfer, quantum information processing can be used to correct a wide range of noise in a properly designed quantum system. It has been demonstrated that rectification can be achieved even when the remedial operations are themselves faulty. This may have ramifications for fault-tolerant computing.
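The principle behind such codes can be illustrated with the classical ancestor of the quantum bit-flip code: the three-bit repetition code. (Real quantum codes must protect superpositions without measuring them, which this sketch does not attempt; the encode/flip/majority-decode cycle below only conveys the basic idea that redundancy plus decoding drives the error rate down.)

```python
import random

def encode(bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in codeword]

def decode(codeword):
    """Majority-vote decoding: recovers the logical bit
    whenever at most one physical bit flipped."""
    return 1 if sum(codeword) >= 2 else 0

random.seed(0)
trials, p = 10_000, 0.05
raw_errors = sum(1 for _ in range(trials) if random.random() < p)
coded_errors = sum(
    1 for _ in range(trials)
    if decode(noisy_channel(encode(0), p)) != 0
)
# The coded error rate (roughly 3*p**2 in theory) comes out
# well below the raw per-bit error rate p.
```

A single flipped bit is always repaired; two or more flips in one codeword defeat the decoder, which is why the residual error scales with p squared rather than p.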
However, another factor may arise with regard to integrity. Quantum computing will likely provide us with operations that cannot be performed by traditional computers: or, at least, not within a useful time frame. We may, therefore, be presented with answers or results that we simply cannot verify in any other way. Providing assurance requirements, and even the basic concept of separation of duties, will be very difficult in these types of situations. The “trusted computing base” may be very large indeed. (The idea of implementing a reference monitor is enough to make your head spin.)
Quantum computing systems will likely be available soon. However, these will, initially, be proprietary and closed, as various vendors release various designs, and try to protect their intellectual property. This will create problems for integration in enterprises where the computer architecture is based on open systems and standards.