Saturday, December 31, 2005

Mobile code and Security

I was searching around for a quote about things... and found this gem from Arnold J. Toynbee:
Apathy can be overcome by enthusiasm, and enthusiasm can only be aroused by two things: first, an ideal, which takes the imagination by storm, and second, a definite intelligible plan for carrying that ideal into practice.
Now, it's clear to me that we're dealing with apathy when it comes to security. Nobody really wants to have to do the massive amount of work it takes to replace a security model, and we're all willing to put up with "good enough", because we're overwhelmed with stuff to do.

I will keep trying to push both parts of the above equation... talking about the ideal, and helping to author the plan to carry it into practice. This blog is a tool in that process. I thank Doc for the linkage, it's got me charged up... and I thank my lovely bride for her support.

How to recognize a secure OS

Definition

A secure OS is one which is immune to the threat of mobile code. - me, just now

The need for mobile code

Exposition
We live in a networked world, one that we hope will remain open and unrestricted. We have to secure the ends of the Internet if we are to have any hope of discouraging filtering of the middle in the name of security. Thus, in order to secure a future without censorship and without severe limits on innovation and freedom, it is crucial to build a foundation we can actually trust.

According to the DoD memo:
Mobile code is a powerful software tool that enhances cross-platform capabilities, sharing of resources, and web-based solutions. Its use is widespread and increasing in both commercial and government applications. In DoD, mobile code is employed in systems supporting functional areas ranging from acquisition to intelligence to transportation. Mobile code, unfortunately, has the potential to severely degrade DoD operations if improperly used or controlled.
Software is the distillation of Knowledge into Executable form. Thus, the sharing of software is the sharing of knowledge. We MUST be able to run other people's programs without fear of ill consequences. This requires a Secure OS.

It's said that the only secure computer is one that is entirely unusable. I've repeated the story that the only way to secure a computer is:
Disconnect it, crate it up, bury it 6 feet down in a concrete vault, and post armed guards... even then it might be secure.
Now, I know that physical access to a machine trumps any software measures, OS, etc... and thus I'm leaving this out of scope. A typical user doesn't want to destroy his machine; he just wants to use it as a tool, run whatever programs, view whatever documents, and get on with things. It's our job to make that happen.

Capabilities

A capability is explicit permission to do a specific task. Jonathan Shapiro points the way:
The term capability was introduced by Dennis and Van Horn in 1966 in a paper entitled Programming Semantics for Multiprogrammed Computations. The basic idea is this: suppose we design a computer system so that in order to access an object, a program must have a special token. This token designates an object and gives the program the authority to perform a specific set of actions (such as reading or writing) on that object. Such a token is known as a capability.
The concept of capabilities has been around for a very long time, but it hasn't been chosen as the basis of the security model in any modern mainstream OS. The tradeoffs made during the design of our current crop of OSs didn't need to take mobile code into account; obviously, things need to be reconsidered.
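To make the idea concrete, here is a toy sketch in Python. The names (File, Capability, attenuate, the "read"/"write" rights) are mine, purely illustrative; a real capability system enforces this in the OS kernel, where the token can't be forged or bypassed by the program being run.

# Illustrative sketch of the capability idea -- not any real OS's API.

class File:
    """Some resource we want to protect."""
    def __init__(self, contents=""):
        self._contents = contents

class Capability:
    """A token that designates one object and carries a fixed set of rights."""
    def __init__(self, obj, rights):
        self._obj = obj
        self._rights = frozenset(rights)

    def read(self):
        if "read" not in self._rights:
            raise PermissionError("capability does not carry the 'read' right")
        return self._obj._contents

    def write(self, data):
        if "write" not in self._rights:
            raise PermissionError("capability does not carry the 'write' right")
        self._obj._contents = data

    def attenuate(self, rights):
        """Derive a weaker capability; you can only give away rights you hold."""
        return Capability(self._obj, self._rights & frozenset(rights))

# A piece of mobile code receives only the capabilities we hand it.
def untrusted_plugin(cap):
    print(cap.read())         # allowed: the token carries 'read'
    cap.write("overwritten")  # blocked: 'write' was withheld

secret = File("hello, world")
full = Capability(secret, {"read", "write"})
try:
    untrusted_plugin(full.attenuate({"read"}))  # hand the plugin a read-only view
except PermissionError as e:
    print("blocked:", e)

The point is that the untrusted code has no ambient authority: it can only touch the objects it was explicitly handed, and only in the ways its tokens allow.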

There's a whole lot more writing and editing to do... I'll cut it off here, and thank you for your time... here are some of the links I used to write this post:
