Tuesday, May 29, 2012

A future letter to Eben Moglen - draft 001


I watched a talk by Eben Moglen, and was quite moved by it. I felt the need to warn him about the next curveball about to be thrown his way (at least as I see it).

This is draft 001... and needs a lot more work before I send it to him directly.



Hello Mr Moglen,

Thank you for your work on behalf of all of us, re: PGP, and your well-thought-out first draft about innovation under austerity.

In your talk about innovation, you describe a conversation in 1995 with Jamie Gorelick and Stewart Baker, where he throws out this spoiler at you after your victory on PGP:

"buy nobody here cares about anonymity, do they?"

That led to 20 years of fighting about anonymity, which, as you say, isn't going so well.

I hope to convince you that there is another spoiler waiting out there, and to give you some advance warning about it, so you can help head it off proactively.

I believe the next excuse used to curtail freedom will be security. Specifically, the inability of the user-centric, default-permissive environments of Linux, Windows, and the rest to be secure. It is this weakness that will be used to assert the need to manage all hardware that can be made to do general-purpose computation.

There is a big cultural assumption among the Slashdot crowd that Linux is somehow much more secure than Windows. Nothing could be further from the truth. They both share the same flawed assumption, albeit with significant differences in implementation: that the user is the correct boundary at which to make security decisions. It is not.

In the past, that was a quite sane demarcation line, because students generally ran the code they themselves wrote, and it was their behavior you worried about. In an age where nobody writes their own compilers and toolchains, we all have to trust code we didn't write.

Because we don't write the code, and because it can't be perfect, you can't predict its results. You should have a way to run it without having to trust it. There is no (easy) way to do this under Windows, or Linux, or anything else out there besides a few research OSes like EROS.

Virus scanners try to maintain a list of known bad programs. This doesn't work.

Linux fanboys would have you believe that users are stupid, and that if you just lock things down, they won't be able to screw up the system. Here the user gets the blame... and that is a false conclusion as well.


It is my belief that we need to push, as hard as possible, for the adoption of a security design which allows NOTHING by default, and limits running code to a list of positively stated capabilities, maintained per process. This framework can still support the access control lists, user names, and so on that we all need to feel comfortable, and to manage users when appropriate. But it also makes possible a new form of expression, one that users simply cannot attempt without being given the tools.

If a user can run a program without having to trust it, they are free to experiment with it, much the same way we were free to try things out when DOS fit on a floppy disk and you could write-protect it. You can limit the side effects of any given instance of running code to the list of things you explicitly give it, and rest easy.
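To make that concrete, here is a minimal sketch of my own (not anything you have proposed, and only a crude approximation of a real capability system) of what "allow nothing by default" can look like on a stock Linux kernel today, using seccomp strict mode. The file name is just a placeholder for the example:

/*
 * A toy demonstration, not the full framework: the parent opens the one
 * resource the untrusted computation is allowed to touch, then enters
 * Linux seccomp "strict" mode, after which the kernel permits only
 * read(2), write(2), exit(2) and sigreturn(2). Any other system call
 * kills the process, so the code's side effects are bounded by the
 * descriptors it was handed in advance.
 */
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/prctl.h>
#include <sys/syscall.h>
#include <linux/seccomp.h>

int main(void)
{
    /* The "capability": a descriptor we chose to give the code.
       ("input.txt" is just a placeholder file for this example.) */
    int fd = open("input.txt", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Close the door: from here on, nothing is allowed except
       read/write on already-open descriptors, and exiting. */
    if (prctl(PR_SET_SECCOMP, SECCOMP_MODE_STRICT) != 0) {
        perror("prctl");
        return 1;
    }

    /* The "untrusted" work can use what it was given... */
    char buf[256];
    ssize_t n = read(fd, buf, sizeof buf);
    if (n > 0)
        (void)write(STDOUT_FILENO, buf, (size_t)n);

    /* ...but opening /etc/passwd, connecting a socket, forking, etc.
       would all be answered with SIGKILL. Note the raw exit syscall:
       glibc's _exit() uses exit_group(2), which strict mode does not
       allow. */
    syscall(SYS_exit, 0);
    return 0; /* not reached */
}

This is crude next to a research OS like EROS: it can only refuse, not delegate, and it confines a single process. But it has the right shape: what the code may touch is stated up front, and the kernel, not the user's good judgment, does the enforcing.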


We need to make this world possible... sooner rather than later.

Thank you for your time and attention.
