Wednesday, August 17, 2005

Secure Computing done right

I worry about security; it's just good sense to be aware of the true nature of the systems we all rely on so heavily to get our jobs done, to play, and so on. The current crop of operating systems all share the same flaws:
  • Poor security model
  • Poor choice of implementation languages
  • Bugs
Windows, OS X, and Linux all use essentially the same security model, inherited from Multics and early Unix, which dates back to the late 1960s and 1970s. It's fine for a cooperative environment, but falls flat in today's. The model assumes that the user knows what they are doing and can trust all the code they need to run... both assumptions are obviously false in the present.

The battle between C and Pascal was won by the C++ camp, and as a result we're trying to build operating systems in languages that can't pass strings around, let alone complex objects, as native types. This leads to buffer overflows and a whole host of issues that would not be present in a Pascal-based environment. (Note that I'm a Pascal fan, so I might be biased and/or wrong here!)

Bugs are a given in software development. The more a piece of software is used and tested, the more likely its bugs are to be found. It makes sense to re-use working code (which contains the distilled experience of its authors and users) instead of re-inventing it. The price is that a flaw anywhere is a flaw everywhere.

These three factors combine into a "perfect storm" in which insecurity is the inevitable result, as follows:
  • somewhere, hand-written code dealing with a pointer and a buffer fails to check the buffer's size
  • data copied from the program's input overruns the buffer into the variable after it (or into the stack space)
  • the program flow can now be captured through careful study of the result of the bug and the engineering of an exploit
  • the exploit can then inject code to run as the user (or system process) on a targeted machine
  • because the code IS the user as far as the OS is concerned, the bug now becomes a trojan horse, and can open the gates to any desired action
It's the combination of all of these factors that results in the current vulnerable state of computing as we know it. I started talking about this with Windows, because I'm most familiar with it, but make no mistake: Linux and OS X machines also contain bugs and use the same security model, so in the long run they are just as vulnerable.

The situation will remain bleak until this trifecta of doom is defeated by concerted action on everyone's part:
  • Add Strings and other complex types NATIVELY to C/C++
  • OR switch to Pascal ;-) {no I don't expect this to actually happen -- Mike}
  • Make bounds checking the default
  • OR find a code model that makes bounds overflows IMPOSSIBLE (Compile time type checking?)
  • Ditch the old ACL model in favor of something modern such as Capability based security
Note that this does require work. I believe it's a fairly realistic approach, but it requires leadership from a few places. The GNU Compiler crew should work with everyone to make the required changes to C/C++. If Microsoft and the other compiler vendors went along, new builds would have fewer holes to exploit, which helps us all.

The other big hurdle (which is NOT optional) is to move from ACL (Access Control List) based security to capabilities. This is going to force a lot of assumptions built into everything to be re-evaluated, and it won't be trivial. It'll force the drivers out of the kernel and into user space, which will be a performance hit, but a necessary one. It'll mean a lot of work for Linus and crew (as well as for the folks in the closed-source world).

Education about the details of capabilities, and their adaptation into scripts and service code, will take time, but the benefits will be enormous. Once all of this work is done, we'll have nothing less than honest-to-goodness Secure Computers!

  • bugs still exist in programs, but are now limited to implementation errors, and variables are no longer subject to random scope violations, resulting in much more deterministic behavior (fewer crashes)
  • strings and objects can be passed via a native set of operations, reducing the impedance mismatch and the chance of errors when handling such items (fewer crashes)
  • in the event that some code does turn evil, the extent of the damage is strictly limited to its present runtime capabilities list.
I believe this is the direction we need to take to make things better for us all.

Thanks for your time and attention.
Comments and questions welcome.

--Mike--

2 comments:

Anonymous said...

No need for drastic changes. The problem is that people have not bothered to implement even the bare minimum of best practices.

You forget that there are much simpler solutions such as:

Personal Firewalls
Authenticode Code execution policy
Reduced user privilege mode
Free patch management tools like WSUS.

First of all, I'm not prepared to lose any performance on my operating systems. Second, have you ever tried to write an OS yourself?

Anonymous said...

Microsoft Visual Studio already audits code for buffer overflows at compile time. XP SP2 was recompiled with the compiler that does this checking.

There are also plenty of 3rd-party solutions that offer the same type of sanity check. These don't eliminate the possibility of error, but they vastly reduce it.