- Poor security model
- Poor choice of implementation languages
- Bugs
The battle between C and Pascal was won by the C++ camp, and as a result we're trying to build operating systems in languages that can't pass strings around, let alone complex objects, as native types. This leads to buffer overflows and a whole host of issues that would not be present in a Pascal-based environment. (Note that I'm a Pascal fan, so I might be biased and/or wrong here!)
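To illustrate, here's a minimal sketch (the function name and buffer size are made up): with no native, length-aware string type in play, a "string" is just a bare pointer, and nothing in the copy below knows or checks how big the destination really is.

```cpp
#include <cstring>

/* Hypothetical handler: with strings as bare pointers, the copy
 * has no idea how large the destination is. */
void greet(const char *name)
{
    char buffer[16];             /* fixed-size destination buffer   */
    std::strcpy(buffer, name);   /* copies until the NUL terminator,*/
                                 /* however far past buffer's end   */
                                 /* that happens to land            */
}
```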
Bugs are a given in software development. The more a piece of software is used and tested, the more likely its bugs are to be found. It makes sense to re-use working code (which contains the distilled experience of its authors and users) instead of re-inventing it. The price is that a flaw anywhere is a flaw everywhere.
These three factors combine to present a "perfect storm" in which insecurity is the inevitable result, as follows:
- an error somewhere in hand-written pointer-and-buffer code fails to check the size of a buffer
- input to the program gets copied past the end of that buffer, into the variables after it (or into the stack space)
- program flow can now be captured through careful study of the bug's effects and the engineering of an exploit
- the exploit can then inject code that runs as the user (or as a system process) on a targeted machine
- because the code IS the user as far as the OS is concerned, the bug now becomes a trojan horse, and can open the gates to any desired action (a sketch of the kind of function that starts this chain follows below)
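To make the chain concrete, here is a minimal sketch of the classic "stack smash" the steps above describe; the function name and buffer size are hypothetical.

```cpp
#include <cstdio>

/* Illustrative only: %s places no bound on the copy, so input
 * longer than the buffer spills into the saved frame pointer and
 * return address -- and whoever controls the input can then
 * control where this function "returns" to. */
void read_request(void)
{
    char line[64];            /* stack buffer, adjacent to the   */
                              /* saved return address            */
    std::scanf("%s", line);   /* unbounded copy: the overflow    */
                              /* from the first two bullets      */
    /* ... parse the request ... */
}
```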
The situation will remain bleak until this trifecta of doom is defeated by concerted action on everyone's part:
- Add Strings and other complex types NATIVELY to C/C++
- OR switch to Pascal ;-) {no I don't expect this to actually happen -- Mike}
- Make bounds checking the default (see the sketch after this list)
- OR find a code model that makes bounds overflows IMPOSSIBLE (compile-time type checking?)
- Ditch the old ACL model in favor of something modern, such as capability-based security
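On the bounds-checking point: C++ already offers checked accessors, they just aren't the default. A minimal sketch of what checked access looks like today, using std::array::at:

```cpp
#include <array>
#include <iostream>
#include <stdexcept>

int main()
{
    std::array<int, 4> table{1, 2, 3, 4};

    try {
        std::cout << table.at(7) << '\n';   /* .at() checks the index */
    } catch (const std::out_of_range &e) {
        /* a deterministic exception instead of silent corruption */
        std::cerr << "bounds violation caught: " << e.what() << '\n';
    }
}
```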
The other big hurdle (which is NOT optional) is to move from ACL (Access Control List) based security to capabilities. This is going to force a lot of assumptions built into everything to be re-evaluated, and it won't be trivial. It'll force drivers out of the kernel and into user space, which will be a performance hit, but a necessary one. It'll mean a lot of work for Linus and crew (as well as the folks in the closed-source world).
Educating people on the details of capabilities, and adapting scripts and service code to use them, will take time, but the benefits will be enormous. Once all of this work is done, we'll have no less than honest-to-goodness Secure Computers!
- bugs will still exist in programs, but will be limited to implementation errors; variables will no longer be subject to random scope violations, resulting in much more deterministic behavior (fewer crashes)
- strings and objects will be passed via a native set of operations, reducing the impedance mismatch and the chance of error in handling such items (fewer crashes)
- in the event that some code does turn evil, the extent of the damage will be strictly limited to its present runtime capability list (illustrated in the sketch below)
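To make that last point concrete, here is a toy sketch of the capability idea; all names here (FileCapability, kRead, and so on) are hypothetical, not any real OS API. The key property: rights travel with an unforgeable handle, so a compromised process can only touch what it already holds.

```cpp
#include <stdexcept>
#include <string>
#include <utility>

/* Toy capability: instead of asking "is this *user* on the file's
 * access control list?", a program can only act through handles
 * that bundle a resource with the rights granted on it. */
class FileCapability {
public:
    static constexpr unsigned kRead  = 1u << 0;
    static constexpr unsigned kWrite = 1u << 1;

    FileCapability(std::string path, unsigned rights)
        : path_(std::move(path)), rights_(rights) {}

    std::string read() const
    {
        if (!(rights_ & kRead))
            throw std::runtime_error("no read capability for " + path_);
        return "...contents of " + path_ + "...";   /* stub for the demo */
    }

private:
    std::string path_;
    unsigned rights_;   /* the granted rights travel with the handle */
};

int main()
{
    FileCapability log("/var/log/app.log", FileCapability::kRead);
    log.read();   /* allowed: this handle carries the read right      */
    /* a process never handed a capability for /etc/shadow has no     */
    /* object through which to name that file, let alone open it      */
}
```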
Thanks for your time and attention.
Comments and questions welcome.
--Mike--
2 comments:
No need for drastic changes. The problem is that people have not bothered to implement even the bare minimum of best practices.
You forget that there are much simpler solutions such as:
- Personal firewalls
- Authenticode code execution policy
- Reduced user privilege mode
- Free patch management tools like WSUS
First of all, I'm not prepared to lose any performance on my operating systems. Second, have you ever tried to write an OS yourself?
Microsoft Visual Studio already audits code for buffer overflows at compile time. XP SP2 was recompiled with this checking enabled.
There are also plenty of 3rd-party solutions that offer the same type of sanity check. These don't eliminate the possibility of error, but they vastly reduce it.