I've been a programmer since the 1980s. I feel that the peak of the field was somewhere between 1995 and 2000. We had Windows 95/98, the Internet, and programs were local desktop applications that people had become very productive with.
The existence of Visual Basic, and VBA support in the Microsoft Office Suite, made it possible, and even practical, for most domain experts to build usable applications that let everyone get their jobs done. If there were performance problems, or an application needed to be made more reliable, professional programmers would be brought in to rebuild it properly.... it was at this point that we almost shifted to being actual Software Engineers, and professionalized.
Since then, VB was cast onto the pyre as a sacrifice to the entirely unnecessary migration to .NET, and in the bloat that ensued, as desktop programming lost a decade of productivity, people decided to just shove everything onto the web.
It was only as this was starting to happen that Steve Jobs further crippled programming by introducing the iPhone; suddenly GUI applications were expected to work on tiny screens (in either orientation), without proper input hardware like 3-button mice and keyboards, across slow and unreliable network connections.
Needless to say, the last two decades have been a total loss as far as programmer productivity goes, with one shining exception.... Git. Git has its flaws, mostly arising when people don't realize that it stores snapshots and merely fakes the deltas, not the other way around.
Git/GitHub, et al. are fantastic. The ability to keep multiple machines in sync without hassle, in seconds, is sooooo good. I used to keep stacks of floppy disks holding ZIP files of source code, all managed by hand.
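To make that snapshot point concrete, here's a minimal sketch in Python that shells out to Git's own plumbing commands and prints what a commit actually contains: a pointer to a complete tree of the project, not a patch. It assumes you run it from inside a Git repository with at least one commit.

```python
import subprocess

def git(*args):
    # Run a git plumbing command and return its output as text.
    return subprocess.run(["git", *args], capture_output=True,
                          text=True, check=True).stdout

# A commit object is just a pointer to a tree (a full snapshot of the
# project), plus parent pointers and metadata. No diff is stored here.
commit = git("cat-file", "-p", "HEAD")
print(commit)

# Follow the "tree" line to see the snapshot itself: every file and
# subdirectory in the project at that commit, listed in full.
tree_sha = commit.splitlines()[0].split()[1]
print(git("cat-file", "-p", tree_sha))

# Deltas are something Git computes on demand (git diff, git log -p) and
# uses internally only as a compression trick inside packfiles.
```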
In the future, we need to get back to the point where you can drag and drop GUI elements and have them work anywhere, the way we could with VB/Delphi/HyperCard.
When we get there, we'll let users build basic applications, and we can finally professionalize and apply actual engineering practices to the art of programming.
Until then, please stop calling it engineering. We don't put in anywhere near the effort that Margaret Hamilton (the first actual Software Engineer) and her team did to safely get men to the Moon. We're programmers, not Engineers. As Uncle Bob said, we don't profess anything. We certainly don't use engineering practices as described by The Engineer Guy.
---
Example: You can plug a lamp into any outlet in the US and draw up to 15 amperes, and under almost all circumstances a fault in the load can't damage the wiring in the house.
We have no equivalent in software. Chroot, sandboxes, etc. are far too unsafe. We have no standard way of letting the user choose what resources to grant an application at run time.
The worst part is, most people don't even see the deficiency. Imagine the current power grid with no fuses or circuit breakers.... the first wiring mistake would crash civilization.
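For contrast, here is roughly the closest thing a Unix system offers today: per-process rlimits, set by whoever launches the program. This is a sketch, not a real fuse box; `untrusted_tool` is a placeholder name, the limits are opt-in, POSIX-only, and nothing about them is a standard, user-facing contract the way a 15-amp breaker is.

```python
import resource
import subprocess

def limit_child():
    # Cap the child's address space at 256 MiB; allocations beyond that fail.
    resource.setrlimit(resource.RLIMIT_AS, (256 * 1024**2, 256 * 1024**2))
    # Cap CPU time at 5 seconds; the kernel sends SIGXCPU, then kills it.
    resource.setrlimit(resource.RLIMIT_CPU, (5, 5))

# "untrusted_tool" is a stand-in for whatever program the user wants to run.
# preexec_fn applies the limits in the child just before exec (POSIX only).
subprocess.run(["untrusted_tool"], preexec_fn=limit_child)
```

Note how far this is from the outlet: the limits live in the launcher's code, not in anything the user sees or chooses, and most programs are never run this way at all.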
We can do better. We must do better.