Thursday, December 31, 2009

Brain storm rising....

I've been typing up quite a bit, trying to transform a ton of stream-of-consciousness writing into something a bit more coherent, and I'm not there yet. One of the first posts to come out of it, I suspect, will be about measuring the "length" of a hyperlink...

I'm still not sure what I mean, precisely, but I'm convinced that some have negative lengths, which is intriguing.

Parenting in 2010

I found this via a post on Crime & Federalism, a new daily blog I intend to read.

I worry about Virginia... this addresses it, and gives me some ideas about how to help her stay safe.

Tuesday, December 29, 2009

Beware DimDim - Fake open source

I've been trying in the background of my day job to get a workable installation of DimDim going, and have discovered an ugly truth... they don't make a usable version of their software available... they are open-washing their service.

The current version of their service as deployed is 5.0

The last open source release is 4.5, dating from the beginning of 2008... it is only available in a semi-working state as a complete VMware image, and I can't get any of their installs to work after spending a few man-weeks on it. The VMware image is based on CentOS, which I'm totally unfamiliar with, and customizing it was a nightmare, at best. Because it's an OS I'm unfamiliar with, I have no way of knowing what else it is doing in the background.

I GIVE UP.

Learn from my errors: save your time and effort, and find some alternative for your video conferencing.

Oh... and I'm not alone...

Read the comment by David Strickland at http://ostatic.com/dimdim

Thursday, December 24, 2009

FollowTheMoney

It's an idea whose time has come.... the #FollowTheMoney hashtag for helping the rest of us see how our future is being sold out.

Thursday, December 17, 2009

The sad state of US weapons technology

I saw this while I was waiting for the train today.... it was sufficiently surprising to warrant a $2.00 purchase of a newspaper... my first impulse purchase of a newspaper in more than a decade.

I can't believe how stupid the situation is.... go read this and weep....

  • The Wall Street Journal


Insurgents Hack U.S. Drones

$26 Software Is Used to Breach Key Weapons in Iraq; Iranian Backing Suspected



What this means is that our $10,000,000 drone can be subverted with a laptop for less than $1000.... a 10,000:1 advantage. It doesn't end there of course.

If the signal can be detected, it can be tracked, and jammed... possibly with something like a $50 microwave oven.

Spread spectrum technology was invented in the 1940s to allow us to securely control torpedoes... I can't believe that in the last 60 years, with military budgets, they haven't come up with a communications channel for this stuff that is at least a bit harder to detect.

UGH!

--Mike--

Tuesday, December 15, 2009

Indeterminate Intermediaries Imminent

Doc Searls recently wrote a blog post - The Revolution Will Not Be Intermediated in which he states some hopeful things:

We still seem to think that progress on the Net is the work of “brands” creating and disrupting and doing other cool stuff. Those may help, but what matters most is what each of us does better than anybody or anything else. The term “content” insults the nature of that work. And of its sources.

The revolution that matters — the one that will not be intermediated — is the one that puts each of us in the driver’s seat, rather than in the back of the bus. Or on a bus at all.
As much as I like Doc, and wish he were right, I can't help but be cynical for a number of reasons. I'm hoping to be proven wrong, and/or convinced that there is room for hope. Please find some flaws with my logic...

The future of the live web is in doubt, for good reasons.

The long tail - theory vs practice

The Long Tail theory holds that everyone can have a blog, and with it, a voice. This is widely interpreted to mean that the tools are sufficient to get your views out into public. The theory is valid, but as with any description of reality, it fails at the edges, and a better model of understanding should eventually replace it.

The blogging tools have now succeeded in making it possible for anyone to write an opinion and have it accessible in an instant from any Internet-connected PC in the world. There are limitations when governments or ISPs get in the way, but in most cases those limits are imposed post facto and do not impose prior constraints.

However, the freedom of speech enshrined in our US Constitution is worthless without the freedom of assembly, which is the freedom to hear someone else's speech, and to have a conversation with them. It is here that the intermediation sets in.

Historically, the cost and logistics of publishing provided a natural damper on the quantity of material, and people generally focused on getting the quality of content up to the point where it seemed worth the effort to publish and distribute it. The nature of the Internet and the web has done away with this limitation, and made apparent to all a more basic limitation which has always been there... the limits of human attention.

Now that we can all say what we want quickly and at effectively zero incremental cost, the problem becomes connecting with an audience. Out here in the long tail, where this blog will reside for the foreseeable future, the loyal audience is very small, mostly family and a few friends. In this blog's history, the rare moments in the spotlight are for things unrelated to the subject matter I normally tend to be interested in... for example, the post "Songs about teamwork" is the source of more than 50% of all hits here... ever.

This means that while the tools make it possible to speak, there is very low probability of being heard when you take the time and effort to set up a blog, especially if your areas of interest are many and varied. An intelligent response is to consider alternate ways to route around the attention problem, and to write where the readers are.... thus putting content on social networking sites, into emails and other non-live-web related channels, and into comments, twitter, and other aggregated sources of attention.

The long tail in practice works out far from the theory: we now all have a voice, but we have to find an audience... everyone has a soapbox, but the public square is full of people with iPods, tuned into their own narrow circles of interest. The key is to find a topic already in progress, and to attempt to join the conversation. We all have our own bus (or car); now we need passengers who are going to our destination.

I'm a commuter; I regularly share my daily trip to Chicago with a circle of friends on the South Shore railroad. We all give up a bit of our privacy to have a shared experience that is more cost effective and efficient for each of us... this is the same logic behind using social media sites, Twitter, Facebook, etc.

If the train fails to arrive, gets delayed, etc... we route around it, share rides, etc... just like when twitter fails, etc. The metaphor can be extended quite a way if you like... it almost writes itself.

It's a silo, sure.. but that is easy to ignore if it stays out of view, and your friends are there. It's only when you have a dispute or disagreement with management that the situation becomes unacceptable.

With this in mind, it's hard to disagree when Charles Arthur writes in the Guardian that The long tail of blogging is dying. I'm spending upwards of 4 hours working on this post, if not more... and I might have 30 people read it, or maybe a few thousand if I'm VERY lucky and it goes viral. The residual effects will be very, VERY limited, in my estimation. It's only faith, and the need to speak my piece, which keep me going.

With all of this rambling in place, let's go back to the quotes from Doc again...

We still seem to think that progress on the Net is the work of “brands” creating and disrupting and doing other cool stuff. Those may help, but what matters most is what each of us does better than anybody or anything else. The term “content” insults the nature of that work. And of its sources.
I think it's better to say that we're used to having reliable labels on our content, enforced by law (branding, trademark, copyright). These labels made it possible to aggregate many different pieces of stuff under one reputation umbrella.

The main challenge in the live web is to figure out how to aggregate enough reputation to be easily quantified by anonymous third parties for purposes of deciding where to spend their limited resources of time, attention, energy, and possibly even money.

Google tries to use PageRank as a proxy for quality, on a per site basis... which isn't a fine enough grain to measure quality on each piece that someone like Demand Media might produce, so the quality is going to be spotty. The idea of tying PageRank to a person is interesting, but that really won't work out in the long run, because we all have different areas of expertise. For example, you might decide to trust my opinions on computers and security, but you probably disagree with my opinions on political matters. These areas are orthogonal to each other, so there need to be many degrees of freedom in ranking and labeling content.

Google is slipping, but all engines slip from time to time. They can only keep up a certain pace, as their infrastructure gets more complex and entrenched in a certain paradigm. They have made excellent use of the resources they have available, especially the 20% time they give to feed (and harvest) what would otherwise be the seeds of their destruction.


I'm a geek, and I tend to think in terms of technical solutions to problems... I see this all as a matter of a lack of metadata, and of the infrastructure to support it. I think over time it'll all pull together; it's just a question of timeframes. It took 500 years for print to get to where it was when the Internet took off... it may take a few lifetimes for the Internet to get figured out to the same extent.

I'd of course like to skip some of the learning curve and get more benefits now, while I can appreciate them, and pass them along to my daughter.

Thursday, December 03, 2009

More code

I've now got a source code management system (Mercurial / TortoiseHg) running on my desktop and home machines, and have successfully pushed code around. I've learned a bit more about Google App Engine, and put together the first of many demos to help illustrate capabilities with some real world examples, which might even become the canonical ones if I'm very, very fortunate.

Here is a simple guestbook which allows you to overwrite a previous entry if you happen to have the token (a random number) associated with it. The jargon for such a key is a "capability"; to make things easy, I put them out there for all to see and abuse.

If you wish to replace an entry, copy the token from it and enter it below your new text.
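The demo itself isn't reproduced here, but the capability idea behind it can be sketched in a few lines of Python (a hypothetical illustration, not the actual App Engine code):

```python
import secrets

class Guestbook:
    """Toy illustration of capability tokens: knowing an entry's
    random token IS the permission to overwrite that entry."""

    def __init__(self):
        self.entries = {}  # token -> entry text

    def post(self, text):
        token = secrets.token_hex(8)  # the "capability" for this entry
        self.entries[token] = text
        return token

    def replace(self, token, new_text):
        # No accounts, no passwords: possession of the token is the
        # only credential checked.
        if token not in self.entries:
            raise KeyError("unknown token; no capability, no overwrite")
        self.entries[token] = new_text

book = Guestbook()
t = book.post("hello")
book.replace(t, "hello again")  # succeeds: we hold the capability
```

Publishing the tokens, as the demo does, deliberately hands everyone the capability, which is what makes the "see and abuse" part instructive.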


Wednesday, December 02, 2009

Learning more about tools

With the latest addition to the personal computing empire, and the consequent loss of context that an empty slate entails, I've decided it's time to learn to use Delicious and/or sync all of my bookmarks.

I've added a tag cloud off to the right of this blog, which should get more useful as I plow through my vast hoard of bookmarks with an eye towards present and future value, and sharing.

Sunday, November 29, 2009

Mission Accomplished - Tamiya Forklift


Virginia and I just finished assembling a remote control forklift kit.

It looks much like the advertised picture. I only had a few tweaks to make because of some slightly unsprung switch contacts.

It is amazing how many small parts with pretty tight tolerances can be made from a few ounces of plastic. I'm in awe of modern injection molding technology.

Monday, November 02, 2009

Confessions of a Linux newbie.... part 001

Today I learned the power of the showmount -e command... which helped me figure out that I had given the permissions to a visually similar, but wrong, IP address range.

I have now managed to share a folder, and access it, using NFS across two Ubuntu boxen. 8)
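For the record, here's the kind of check that would have caught my mistake sooner: a small Python sketch that parses `showmount -e` output so the exported ranges can be eyeballed (the host and addresses are made up):

```python
def parse_exports(text):
    """Parse `showmount -e` output into {export_path: [allowed_clients]}."""
    exports = {}
    for line in text.splitlines():
        if not line.strip() or line.startswith("Export list"):
            continue  # skip blanks and the header line
        parts = line.split()
        if len(parts) >= 2:
            exports[parts[0]] = parts[1].split(",")
    return exports

# Canned output; a live query would be something like
#   subprocess.run(["showmount", "-e", "192.168.1.10"], ...)
sample = """Export list for 192.168.1.10:
/srv/share 192.168.1.0/24
"""
print(parse_exports(sample))
```

Seeing "/srv/share" mapped to the wrong /24 in a dict makes a visually similar typo much harder to miss than scanning /etc/exports by eye.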

Sunday, November 01, 2009

How cold warriors screwed up science for the rest of us.

I watched this story while I was doing some server maintenance today at work. (Yes... working on a Sunday... ugh).. and it brings a few points to mind:
  • Giving both sides to a story isn't fair and balanced if one side is a tiny vocal minority
  • Tiny vocal minorities can be created by corporations with a relatively modest investment of resources, and can net a large disruption of Democracy in the process. I'd recommend reading the works of John Robb to see what that implies for the future.
  • The science behind global warming has been known for quite a long time... longer than even I imagined it to be.
  • There are parts of government which actually work quite well. Let's not throw it out, but encourage the good parts, and weed out the bad.
  • Slippery slope arguments are bad for one's logical thought processes, because they replace the possibility of progress through dialog with fear.
Global warming is real, and we need to address it.

Thursday, October 29, 2009

VRMish download manager... 2nd mention

I'm thinking that there needs to be some sort of agent that can sit in the cloud, or on your system, to handle the things you download, and allow you to socially network info about them... like del.icio.us for files instead of bookmarks.

So now I'm not the only one with an idea.... Dave Winer has it too... some way to manage all of the downloads, via a 3rd party.

He's only worried about local stuff.... I want to distribute the list, etc... but I think there is some common ground.

Saturday, October 24, 2009

The Terminatrix

At last, an explanation of why Obama got the Nobel that makes sense..



How do I extract Picasa's "Name Tags" for use on Flickr, etc?

I'm quite happy with Picasa's facial recognition feature as a concept, but not in practice. I can't get the data to couple to any other place, and for now it remains in the walled garden that is Picasa. I was hoping that I could get it out via EXIF or IPTC tags, but that doesn't seem to be the case.

Does anyone know how I can get this data into a more useful (portable) format?
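One possible avenue, offered as a guess rather than an answer: Picasa reportedly keeps per-folder .picasa.ini files, and face data may live there. The faces= format below is an ASSUMED layout (I haven't verified any spec), but if it holds, a parser might look like this:

```python
import configparser
import re

# Hypothetical .picasa.ini content; the faces= format here is an
# assumption about Picasa's on-disk layout, not a documented one.
SAMPLE = """
[party.jpg]
faces=rect64(c03a9a08e4b4b2a1),abcdef0123456789
"""

FACE_RE = re.compile(r"rect64\(([0-9a-f]+)\),([0-9a-f]+)")

def extract_faces(ini_text):
    """Return {photo_filename: [(face_rect_hex, contact_id), ...]}."""
    cp = configparser.ConfigParser()
    cp.read_string(ini_text)
    return {photo: FACE_RE.findall(cp.get(photo, "faces", fallback=""))
            for photo in cp.sections()}

print(extract_faces(SAMPLE))
```

From a structure like that, the tags could at least be rewritten into IPTC keywords or Flickr tags by another tool, breaking them out of the walled garden.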

Revolution, fear, and what to do about it.

I read a lot of things I classify as "tin foil hat"... but that label is increasingly inapplicable, because a lot of things that were previously fringe, seem to be working their way towards the mainstream... one of the sites is What Really Happened, which I started reading soon after 9/11/2001. It's a collection of story summaries, with links, to all sorts of tin foil hat material.

One of the stories What is the threshold for revolution? asks about something I do NOT want to happen... revolution, and how to discern the cause or trigger point. Today's experience waiting with a lot of my fellow Hoosiers has me convinced that local government still works, is fairly competent, and has the trust of all of those in line. This is a good thing.

As long as we all believe that we have some measure of control over things, that we have some transparency, and that we're all in this together, things still have a reasonable chance of not spiraling dangerously out of control. The key here is to work on the local level, coordinate on the larger level, and change the rules to keep the conversations open and honest.

The Federalists have amassed a great deal of central power, but the corruption costs due to the lack of transparency, and capture of the government by lobbyists and other non-citizen groups has to be addressed. I would like to see the transparency promised by the current administration materialize, and I fear the consequences if this part of the promise is not kept.

We, the people, will pull together, and make it through the consequences of the Federalist agenda that have savaged our nation, but we have to sense that there is at least hope of turning things around.

The airwaves are filled with a lot of Left vs Right noise... it's all a distraction, to keep you from talking with your neighbors who you might not agree with on specifics, but definitely agree with on the general issues: Life, Liberty, and the pursuit of Happiness. If you avoid the hot-button issues, I bet you'll find you really have a lot to agree with, and some common cause.

Revolution is messy, wasteful, gets a lot of innocent people killed, and destroys much... let's all pull together and prevent it, while making things better, and getting our government back under our control.

Thanks again for your time and attention.

--Mike--

Novel H1N1, a nice dream, and a world of hurt.

Dear diary,
It's been an interesting day. Today I write about novel H1N1 influenza, a dream about a parts exchange program (PEP?), and a world of pain.


The day started off innocently enough. I awoke before 8:00 AM to get Virginia ready to go, and we went to Crown Point, Indiana, to get her immunized against the novel H1N1 strain of influenza that will be making the rounds soon. We arrived at the county courthouse at about 9:00 AM, the stated opening time, to find the line wrapping around from the side of the building to across the front. It took approximately 3 hours, 40 minutes, to reach the front of that line. During this time, Virginia got tired a few times, and wanted to be picked up. (She's currently 3, and weighs about 45 pounds with all of the layers we had on today.) I knew I would pay the price later, but that's what fathers do to help their little girls.

Here's a story from the local paper that covers the experience from a few days ago.

Virginia, like always, made friends in the line. Hi Lauren and Mario!

She ended up getting the mist variety of immunization. This apparently contains a live, but "attenuated" virus to give her immune system something to recognize later, with a life-saving head start. Not what we expected, but good enough. The actual immunization only took about a minute, with about 2 minutes of paperwork leading up to it.

The car made the day more interesting by having a loose bit under the front, which I had to re-secure after returning home. There is an air dam, to help with aerodynamics, great when it works, but not fun when it flaps around on the highway. It's fastened down, and should be good enough for a while.

We all took a nap. I had a nice dream about writing an open source program that sounded really nice during the dream. It was called PEP - Parts Exchange Program, and was a way to deal with the situation I found myself in during the 1980s... lots of parts, a surplus in fact, but always missing something I needed. PEP would allow you to list all of the parts you had for sale, trade, etc... and automatically find things you needed from nearby sources (Radio Shack, EDI, friends, ACRO, Digikey, etc.). In the dream... it was really sweet, and everybody was excited by the idea.

Upon waking... the world of pain made itself apparent. My back hurts... a lot! I'm fortunate that I now have time to rest and let the Advil kick in and do its thing. I'm also fortunate to have a blog to write in to distract me from the pain for a while.

The PEP program is a good idea, but actually cataloging all my parts would take more time than I can imagine spending right now. I think the concept could be applied in a number of places, especially in a world of decreasingly available cheap new parts. It can also be applied across a number of disciplines... pretty much anywhere people have things to share.
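The matching half of PEP is actually small enough to sketch; the cataloging is the hard part. A minimal, hypothetical version (part numbers and names invented for illustration):

```python
def match_needs(needs, inventories):
    """Given a list of needed parts and {holder: set_of_parts},
    return {part: [holders who have it]} for every match found."""
    matches = {}
    for part in needs:
        holders = [who for who, parts in inventories.items()
                   if part in parts]
        if holders:
            matches[part] = holders
    return matches

# Everyone lists their surplus; PEP matches your needs against it.
inventories = {
    "mike":   {"2N2222", "LM741", "555"},
    "friend": {"2N3904", "LM741"},
}
print(match_needs(["LM741", "2N7000"], inventories))
```

A real PEP would add locations, prices, and trade terms on top, but the core is just this lookup run against nearby inventories.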

I also have the idea of a cheap curve tracer / parts measurement box for discrete 3- or 4-pin semiconductors, which would interface with PEP.

Well... that's it... novel H1N1 gave me an interesting day, and gives you a hopefully interesting blog post.

Thanks for your time and attention.

--Mike--

Monday, October 19, 2009

VMware ESXi 4.0 first boot

I managed to get my first VMware ESXi 4.0 server booted today (an HP ML-310 with direct attached disk). I'm using the Free version.

This now allows me to allocate CPU and RAM, setting limits on a rogue machine... and getting performance data to help figure out what's doing what.

I look forward to the greater control and transparency this will provide me.

Sunday, October 11, 2009

Refinement #1 - Download Agent

My earlier post gave my first iteration of an idea... here is a refinement of it

The user should be able to install and/or choose a service to be their download agent. This agent would, regardless of location, have certain information the user might be willing to trade for downloads, and would handle the actual process of handing off data and doing the downloads.

The use of OpenID to authenticate things, and a VRMish way of treating the user's data as the property of the user, would be among the key functionalities making this better than the current system of download managers installed by the vendor.


Saturday, October 10, 2009

A user-provided download service standard?

I have an idea, which I'll attempt to express in enough words to get the point across, hopefully to someone who can actually implement it.

The idea is simple... a shopping cart/download list service, with some optional social networking tools.

We often need to download some items from the internet, and sometimes later refer to them. When you deal with multiple computers, you often find yourself having to move files around, and/or re-download them multiple times for various installs, patches, etc.

What if we could have the equivalent of shopping cart functionality, provided/provisioned by the USER? This would have some advantages to trade for the added complexity:
  • All downloads would get tracked in one place, per user, instead of per PC
  • All downloads would happen across very high speed links (assuming a hosted service)
  • Virus scanning could be built in
  • Tagging and other forms of metadata could be added
  • Source metadata could be automatically saved
  • Registration data could be supplied via OpenID, or some other means, instead of filling out the same lead data over and over.
  • Lower friction in providing lead data might increase the quantity and quality of data provided to the site allowing downloads
  • Rating and other social networking features could be added as well.
I'm sure that it's non-trivial to set this type of thing up... and a standard needs to be in place, with a canonical example. Who is interested in taking this ball forward?
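To make the idea concrete, here is a rough sketch of what one cart entry might carry, following the bullet list above (every name and URL here is hypothetical; no such standard exists yet):

```python
import hashlib
from datetime import datetime, timezone

class DownloadCart:
    """Sketch of a USER-provisioned download cart: items live with
    the user (keyed by their OpenID), not with any single PC."""

    def __init__(self, owner_openid):
        self.owner = owner_openid
        self.items = []

    def add(self, url, content, tags=()):
        self.items.append({
            "url": url,                                     # source metadata
            "sha256": hashlib.sha256(content).hexdigest(),  # virus-scan hook
            "tags": list(tags),                             # user metadata
            "fetched": datetime.now(timezone.utc).isoformat(),
        })

    def find(self, tag):
        return [item for item in self.items if tag in item["tags"]]

cart = DownloadCart("https://example.com/openid/mike")
cart.add("https://example.com/tool.zip", b"fake bytes", tags=["utility"])
print(len(cart.find("utility")))  # 1
```

The record format is the easy part; the standard would mostly be about how vendors hand downloads off to the user's chosen agent and receive lead data back.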

Friday, October 09, 2009

Wednesday, October 07, 2009

My Heart is in the right place, with you Noran!

I had this idea on the way in to work today. I never imagined life could be so much more interesting and joyous... thank you for sharing the journey with me, Noran!




My Heart is in the right place, with you Noran!
Originally uploaded by --Mike--

Monday, August 31, 2009

New fast way to help skim profits from the rest of us!

Ladies and gentlemen, bankers of all ages... let me introduce the newest invention to help you skim profits from those of us silly enough to play the market:


This is a brilliant piece of analysis, useful for tracking down a slew of hard to find networking issues at relatively low cost. Of course, the biggest use will be for helping to reduce the latency in the routers at Wall Street.

Why is this important? Here is Karl Denninger's explanation of "High Frequency Trading"... which should make it crystal clear.

Technology has a bias towards specialization... in this case let's try not to let this "invention" (actually a clever idea and simple algorithm) be used only for evil.

--Mike--


Tuesday, July 07, 2009

Arch_Draft1024


Arch_Draft1024
Originally uploaded by --Mike--
I think this is a nice composition; I need to reshoot it with a bit more work.

Sunday, June 28, 2009

The Big Problems

This is a list I've been working on... of problems which are big, which most people consider unsolvable... and which I believe CAN be solved:

  • Secure OS
  • Sync
  • Mesh Internet (related: Throttling, Price Tiers, Metering)
  • DNS Security
  • SMTP
  • Spoofing/Authentication (possibly related: Context)
  • Freedom/Censorship
  • HTML/Markup
Now... this list is pretty unintelligible as it sits, without the context and description of each of the problems, the attempted solutions to date, and a proposed new way of solving them... all of which will follow in a series of blog posts... at least one per topic.


Saturday, June 27, 2009

notes from APCU

http://store.sansdigital-shop.com/sadihd.html

The list of big problems
  • Secure OS
  • Sync
  • Mesh Internet
  • DNS / DNSSEC
  • SMTP
  • Spoofing/Authentication/Context
  • Censorship

Wednesday, June 24, 2009

3 Tipping Points

Here are three big tipping points which determine much about the way we use the internet.
  1. Secure operating systems
  2. Mesh networking
  3. Distributed content systems / sync gets solved
Fixing these may take 20 years... but I believe they can all be solved.

Secure Operating Systems

There is a big hole in the way we currently approach computer security. The user has no way to limit the actions of a program. They are forced to trust completely that each and every byte of code does no harm. To get around this hole, layers of firewalls, virus scanners, and support personnel are thrown on top of the big hole in the side of the Titanic, with similar results.

You'll know you've got a secure OS when you can run ANY program, without fear. You'll be able to throw away your virus scanner... and the technicians will stop blaming the user for clicking on the wrong button.

Until this gets fixed, the insecurity of the ends, and the need to use maintenance staff as a bandaid will be used to justify filtering, censoring, and increasingly intrusive regulation of the internet.
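As a tiny taste of what "limiting the actions of a program" could mean today, here's a sketch (Unix-only, and covering only CPU and memory, not files or network) that runs a command under hard OS-enforced resource limits:

```python
import resource
import subprocess
import sys

def run_confined(cmd, cpu_seconds=2, mem_bytes=512 * 1024 * 1024):
    """Run cmd in a child process with hard rlimits applied before exec.
    A real secure OS would confine far more than CPU and memory."""
    def apply_limits():
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
    return subprocess.run(cmd, preexec_fn=apply_limits,
                          capture_output=True, text=True)

result = run_confined([sys.executable, "-c", "print('confined hello')"])
print(result.stdout.strip())
```

Limits like these let you run code you don't fully trust without betting the whole machine on it, which is the direction a secure OS would take much, much further.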

Mesh networking

The internet is a very brittle tree, with few main branches, which is only reliable because of the heroic efforts of the staff of the various entities who work around problems. It's a baling wire and duct tape affair, on a massive scale. It doesn't have to be this way.

It's possible to expand the address space, auto-assign addresses to everyone, and just get on with it... doing away with fixed IP addresses, etc. The woman who invented the protocol that makes all of our switches just work together has said so.

If we change the nature of the internet so that it actually can route around problems by itself, without the need for obscure and hard-to-configure protocols, it can get an order of magnitude faster, and reach into smaller crevices.

Adding wireless nodes to the mesh would make the final transition to a truly shared resource possible, with everyone chipping in to make things faster, every time they turn on their gear.

Distributed Content Systems / Sync gets solved

I'm a commuter, and have extensive experience with the woes of having multiple computers. You're always being forced to sync things, and resolve conflicts. You never seem to have the right stuff on the machine in front of you.

The promise of always-on connectivity seems appealing, but doesn't actually solve the real problem, synchronization. In a single-person, multiple-machine environment, it's possible to use manual sync processes, with sufficient discipline... but any deviation will result in lost work.

When you scale sync problems up to groups, even a perfect file sync system (everyone sharing the same files on a server) has problems. The next problem is one of granularity of changes.

Google Wave solves this problem by breaking up any set of changes into discrete chunks which can be broadcast and synchronized for any given number of users, across organizational bounds. There is a lot of code to be written to build upon this solution, but it will be worth it.
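The "discrete chunks" idea can be sketched as a stream of small operations applied to a document. This toy leaves out the hard part (operational transformation of concurrent edits, which is what Wave actually does), but shows why broadcasting ops beats shipping whole files:

```python
def apply_ops(doc, ops):
    """Apply discrete edit operations to a document string.
    Each op is ("insert", position, text) or ("delete", position, length).
    Syncing tiny ops like these, instead of whole documents, is what
    makes fine-grained collaboration across many users possible."""
    for kind, pos, arg in ops:
        if kind == "insert":
            doc = doc[:pos] + arg + doc[pos:]
        elif kind == "delete":
            doc = doc[:pos] + doc[pos + arg:]
    return doc

print(apply_ops("hello world", [("insert", 5, ","), ("delete", 0, 1)]))
```

Each op is small enough to broadcast instantly, and a server can replay the same stream to every participant, inside or outside the organization.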

Summary

I've presented what I think are the 3 tipping points for the future. All of them require major changes to the code we use in order to be implemented, most of them bordering on "boil the ocean" level... but the costs will be worth it, in each case.

Thank you for your time and attention.

Friday, May 29, 2009

more thoughts about Google Wave

I recently wrote:
 
It would be nice to be able to actually markup hypertext... but that still appears to be outside the range of feasibility.

Maybe it could be called TLM - Text Language Markup?

It turns out the given name is Wave, and is a dialect of XML still being tweaked by the wizards of Google and a few thousand developers who just saw it for the first time this week.  I spent the time to see the demo, and have been following up to see what others think. The gut level reaction seems to be one of  hope that this is something good, because it does seem to be a game changer.

The void being filled here is hard to describe; I've been trying for years... it all boils down to context. When you send an email and then reply, you're forced to use all sorts of mechanisms and tools to attempt to keep your train of thought, your conversational cache, your context. Every step away from being able to just add a note, circle something, or highlight or annotate text to draw attention makes conversation less efficient. Wave is going to provide a mechanism that does a much better job of preserving context.

I expect Wave to succeed because of the good karma that Google has built up, along with their pledge to open-source most of it... which will greatly help adoption as a de facto standard.

The money quote for me was Tim O'Reilly's mention of the need for granularity when editing book manuscripts, which I feel vindicates some of my howling in the wilderness these past years.



Google wave - Web 2.0 at last

Google has accidentally created the first good realization of Web 2.0. By providing a way to mark up hypertext, they are on the path to resolving one of my long-term frustrations with HTML and Web 1.0. It will now be possible to collaborate at a fine grain, with minimal loss of context from tools that lack the ability to point at a part of a document.

As Tim O'Reilly says:
Our experience with collaborative editing of book manuscripts at O'Reilly suggests that the amount and quality of participation goes up radically when comments can be interleaved at a paragraph level.
Collaboration and sharing of data is about to take off in new and very powerful directions. I highly recommend you take a few hours, watch the demo video, and dive in to see what the future is going to bring. It's very exciting.


Tuesday, May 19, 2009

More on why Google sucks

Google forces you to try to figure out keywords unique to the subjects you are looking for. This doesn't always work, and is especially hard for conversations that are non-technical and thus have the least topic-specific jargon to latch on to.

The next great leap will be to figure out the subjects of text, so that you can match on multiple areas of interest, and explore the intersections.

Why google sucks

Ideas are like Reese's peanut butter cups... they are all a synthesis of other concepts in new combinations. Creative energy is the exploration of new combinations to yield new and interesting results, some of which turn out to have value. Ideas are all a synthesis of *more than one* pre-existing concept. Thus any catalog of ideas necessarily must have all of the component ideas cataloged so that you can search for combinations of ideas... a single hierarchy will fail miserably at this task. Tags with a folksonomy have a much, MUCH better chance of yielding positive results, and being a better tool.

Why virus scanners are doomed to fail

Intent is very important, for example... spam is email with an intention to push a product. Malware and trojan horses are similar in intent, but even more malicious. 

The user of a computer cannot tell what the intent of the author is... thus it's necessary to provide a mechanism for limiting the scope and actions of a program, in an effort to help with that judgement.

The current group of operating systems does not give the ordinary user a way to limit the scope and environment of a program prior to its execution.

Virus scanners all assume that programs can be examined and found not to be of ill intent merely by checking them against some arbitrary list. Viewed from this perspective, it is clearly a losing battle.

Intent

Intent is very important. For example, if there is an auto accident with a tragic outcome, the driver may get the sympathy of the community if they judge the intent wasn't there... on the other hand, if malice aforethought is judged to be the intent, the driver may be executed by the state.

The intent of an action has very little to do with the reactions of others... it is important to divorce your feelings from the situation and to be objective if you are attempting to judge the intent of others. A 2 year old girl has little intent beyond the next few minutes' amusement or contentment, for example. It's very hard sometimes to keep this in mind, but a good parent will do so, even if it's after the fact.

Empowering IT

It is the IT department's mission to provide a stable and consistent set of tools for our users... empowering them should also be the focus here... providing the best tools and skill sets possible to maximize the reach of the users is our goal.

Empowering Virginia

We are limited by our language, vocabulary, and skill set in using them. I believe the "classical" education was designed to give the largest possible set of tools to the pupil, which is in stark contrast to the currently accepted system which provides a lowest common denominator.

It is a parent's responsibility to provide their children with the best possible set of tools, to empower them as much as possible. It is THIS that should be the focus, not the idea of giving them a head start in a race. The head start can rapidly be overcome by someone who has better skills negotiating with others, and who can communicate more efficiently.

It is with this realization that I'm going to refocus my energies as a Father to Virginia.

Monday, May 18, 2009

Expressing limits - can't be done yet


If someone came to you with a program which you knew might be evil... how would you express to the operating system that you wish to run it, with only the following access:

  C:\danger
  No internet or other access?

You can't... your power to express this is non-existent. This forces you to trust each and every piece of code you run not to make system-wide changes... which is just plain stupid.
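To make the missing capability concrete, here is a purely hypothetical sketch: a launcher that accepts a declared policy and refuses anything outside it. Every name here (Policy, PolicyViolation, the check methods) is invented for illustration; no operating system offers this today. The point is having a way to *express* the limits, not this toy enforcement:

```python
# Hypothetical sketch only: what "expressing limits" to the OS might look
# like if a launcher accepted a declared policy before running untrusted
# code. All names here are invented for illustration.
from pathlib import PureWindowsPath


class PolicyViolation(Exception):
    pass


class Policy:
    def __init__(self, allowed_paths, allow_network=False):
        self.allowed_paths = [PureWindowsPath(p) for p in allowed_paths]
        self.allow_network = allow_network

    def check_file(self, path):
        # Allow access only to the declared directories and their contents.
        p = PureWindowsPath(path)
        if not any(p == root or root in p.parents for root in self.allowed_paths):
            raise PolicyViolation("file access denied: %s" % path)

    def check_network(self, host):
        if not self.allow_network:
            raise PolicyViolation("network access denied: %s" % host)


# The policy from the post: only C:\danger, no internet or other access.
policy = Policy(allowed_paths=[r"C:\danger"], allow_network=False)
policy.check_file(r"C:\danger\output.txt")          # allowed, no exception
try:
    policy.check_file(r"C:\Windows\system32\kernel32.dll")
except PolicyViolation as e:
    print(e)
```

A real version would have to be enforced by the kernel, not checked voluntarily, but even this toy shows how little vocabulary is actually needed to state the limits.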

Virus scanners check the code against a list before running it... this list is never perfect, and there is a delay in adding new entries...

Thus security on a pc is not perfect. This will not change until there is a way to express the limits on a program prior to (and during) its execution.

Operating systems came about as a means to share resources safely... they aren't done yet doing that job.

Tools - it's all about tools

Better tools... it's all about better tools. A good tool reliably allows you to extend your reach, and do things more efficiently, with more control.

The internet is still relatively young... the web is now 20 years old, and we're still figuring out the tools that can be built. Mashups are cool, but being able to program enough to put together something that can be optimized by others is a very good thing.

The complexity of putting a simple "Hello World" example in place using the LAMP stack (Linux, Apache, MySQL, PHP/Python) precludes a lot of us from using it as a tool. Perhaps we need something a bit easier?
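For contrast, a hint of what "a bit easier" might look like: modern Python's standard library can serve a dynamic "Hello World" with no Apache, MySQL, or PHP configuration at all. This is a sketch, not an argument that it replaces LAMP:

```python
# A minimal dynamic "Hello World" web server using only the Python
# standard library: no Apache, database, or PHP setup required.
from http.server import BaseHTTPRequestHandler, HTTPServer


class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello World"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet


def serve(port=8000):
    # Blocks forever, serving http://localhost:8000/
    HTTPServer(("localhost", port), HelloHandler).serve_forever()

# serve()  # uncomment to run it
```

One file, one command to run; that's roughly the barrier to entry a beginner-friendly web toolset ought to have.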

Tags - an oops

I started to get rid of tags from entries on my blog because I thought a shorter tag list would be more useful, but I've figured out that is not the case. There is an expressive force in tags that is lost when you cut out the free association aspects of it. You need to be able to make those serendipitous connections; this is where ideas come from.

I'm going to review all of my previous posts, and put tags as appropriate on them, as time permits. This will allow easier discovery of the good stuff, and lower the barriers for someone who happens in.

The limits of power

What you can express limits your actions. If you can't communicate an idea clearly to others, who wish to help you actualize it, you are stuck with the resources you personally have control over.  Thus it's best to have a toolset that expands your capabilities.

Ideas are worthless, it's been said... and there are good arguments to support that. However, not all ideas are worthless; there is some small fraction that could do great things, but are trapped in isolation because the person who conceived of them can't express them.

The flaming of newbie questions in the Usenet/Linux culture limits the number of people who will adopt it as a tool, as well as limiting those who might otherwise pick up a new language, try out something new. Clearly this is broken, or at least immature.

Teaching tools to empower others gives them the option to exert their own work, and their own time, to try to give their ideas life... this parallelizes the problem of weeding out the bad ones, and increases the chance of a good idea coming out of a population of X people, because they have a better set of tools.

Thanks to Daryll for putting up with my first run through of these ideas.


Thursday, May 14, 2009

Small improvements in the right places

I'm cleaning up the tags on this blog, getting rid of most of the single-instance items, which are noise. I'm also clearly tagging all of the rants, just because I think I need to. These small changes should add a bit of value to this blog.

Wednesday, May 06, 2009


Blogging forces some unique constraints upon the author; these are mostly societal expectations, not technical limits. The blogger is expected to write a short essay (from the French essai, "to try") on a regular basis. This stream of text is also expected to stay on a single topic, the better to feed the point-and-click sensibilities of the largest audience possible.

No more!

Blogging is broken. Here's why.

The short form doesn't allow the amount of exposition necessary to lay the groundwork of common understanding needed to overcome the limitations of text and the vagaries of human language. There are always differences between the meanings the author intended, and those inferred upon reading. It's necessary to supply more than one explanation to help the reader decide which of their multiple guesses at the intent of the author is closest.

Conversation involves listening to the other partner, and giving feedback on the concepts discussed, to make sure your minds are as close to synchronized as possible. This then helps ensure clear communication of the intended subjects. The better the communication, the further from the everyday the conversation can reach.

In order to be able to discuss deep subjects, specific vocabularies must be developed... specific metaphors, and examples. This is the reason doctors have their own medical speak, and programmers know what loops are.

The short form doesn't allow this exposition at length. It forces an artificial division of text... which reduces the amount of signal, and increases the proportion of text used to frame each piece... thus it's not best for very deep subjects.

The limits of what you can express and explain are the very limits of your ability to change the world. If nobody had been able to explain the chain reaction, the atom bomb would never have made it out of the theory of one physicist and into reality.

Language is power.

Blogging can be tweaked, ever so slightly, to be more powerful. The limitations are not technical, they are societal, so we don't even need new tools... just new expectations.  Fortunately, expectations can be adjusted as a matter of intent.

Blogging tools allow keywords on pieces of exposition, to link things together. They also allow a longer explanation to be linked seamlessly into your text, shortening it for the reader already in sync, while providing additional support for the reader who is unfamiliar with the topic, or who has doubts about the intent of the author.

The need to stay on a single topic is an artifact of the pre-tagging days of blogging, and is hereby declared obsolete!  (HA)

Actually, if you give up the idea of having a stream of regular readers in favor of chance meetings of other minds, your point of view shifts... and your emphasis should then be on allowing the best (easiest) discovery of the content you already have. Linking back to related previous work is one of the most powerful tools to accomplish this. The calendar-based archive is appropriate for a single-topic blog, but the entire creative output of a person will necessitate new and better approaches.

----

Suggestions for improving blogging in general.

Don't waste time and emotional energy limiting yourself to a single topic, issue, etc...  go ahead and enjoy the freedom of free association, but please be sure to tag things, and nurture your own links, and links to others.

Revisit your older work, update, revise, correct it... much as if you were to produce a new edition of a book... if you expect people to read it, you should be willing to take the effort to fix mistakes that you've found, and to clarify it.   Mark your changes to help avoid confusion if they are significant.

Give up the idea of having to do something every day... just spend time on it when you like.  If you need to peek at your Google analytics, go ahead... but it's like dieting, there are ups and downs, and you have to keep focused on the big picture and not the numbers immediately in front of you.

Tag everything, prune the tags (get rid of tags only used once that you doubt you'll use again) to make your index smaller.

Ask others for feedback... give others feedback.  It's lonely here out on the small end of the long tail. Your time and attention is the most valuable thing you have to give. A little bit goes a long way.

----

In summary... blogging daily on a single topic is broken... blogging at length with good links back to relevant topics and a good index is better.

It's the same tool set, you just use it a bit better. You extend the reach of topics you can discuss, and thus extend the extent to which you can change the world.

Which is what we all want in the end... to leave it better than we found it.

Right?

--Mike--

Wednesday, April 29, 2009

Somewhere between twitter, rss, and tag clouds lies a better path forward

Dave Winer correctly points out some serious flaws in twitter, and while he might not get brownie points or whuffie for it, he's right. He also shows where he thinks things could go. It's all about the metadata, and this subtle point seems to fly right past most people who read him.

I think it's time to explode twitter into its components making each of them public, implementable as a service, or as a hosted app, or something run on the end user's hardware.

The success of Twitter is because it allows for the rapid spread of messages from a controlled user base... there is a central control that can (but doesn't always) ban a user, etc. This means that every message is authenticated a bit... and all messages are tied to an identity. This makes filtering possible.

There is NOT any really good rating metadata. The messages are too short. This sets expectations, but really does cripple it for important stuff.

Blogging is seen as too slow, but RSS is a slow version of twitter. I think that metadata richness is the fix to this whole thing. Trade a bit of speed (twitter is too fast anyway) for expressive power.

We need to be able to aggregate our own stuff, which is one of the strengths of RSS. The ability to follow (and unfollow) an authenticated message channel is a great plus.

Collective ratings are ok, but the scales of services such as digg (up or down), and slashdot (funny is the same as insightful), are very limiting. It would be more useful for longer term (slower) conversations to be able to add more expressive metadata. Funny, Insightful, Biased, SelfPromoting, etc... could all be but a few of the plethora of possible bits of critical review that could be added.
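As a purely illustrative sketch (every field name here is invented), richer rating metadata might look like typed judgements attached to an authenticated identity, and it would aggregate just as easily as up/down votes do:

```python
# Hypothetical shape for richer rating metadata: instead of a single
# up/down bit (digg) or one label (slashdot), each authenticated reviewer
# attaches several typed judgements. All names are illustrative only.
rating_a = {
    "item": "http://example.com/post/123",
    "reviewer": "alice",                 # authenticated identity
    "judgements": {
        "insightful": 0.9,               # 0.0 .. 1.0, not just up/down
        "funny": 0.2,
        "biased": 0.4,
        "self_promoting": 0.0,
    },
}
rating_b = {
    "item": "http://example.com/post/123",
    "reviewer": "bob",
    "judgements": {
        "insightful": 0.5,
        "funny": 0.8,
        "biased": 0.4,
        "self_promoting": 0.0,
    },
}

# Aggregation stays as simple as vote counting: average per judgement type.
ratings = [rating_a, rating_b]
avg = {k: sum(r["judgements"][k] for r in ratings) / len(ratings)
       for k in rating_a["judgements"]}
print(avg["insightful"])  # 0.7
```

The design choice is that the vocabulary of judgements is open-ended, so communities can grow the kinds of critical review they actually need.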

It would be nice to be able to actually markup hypertext... but that still appears to be outside the range of feasibility.

Maybe it could be called TLM - Text Language Markup?

Tuesday, April 07, 2009

Monday, March 30, 2009

Follow the money

The post-9/11 spying infrastructure should be completely installed by now, and should be fully capable of watching our bailout money, and tracking it down to the leeches at the ends of the money trail. We need to get our money back, and help recoup some of the investment on this upgrade.

A few trials and people would see it's a new world order... one they didn't see coming out of left field. The days of wild west kleptocracy are over.

----

The economy is a national security issue... a perfect reason to use our national security apparatus to get our money back. 8)

Saturday, March 28, 2009

Truth

Since 9/11/2001 we've ramped up our snooping capability to read everyone's email, listen to their phone calls, and trace pretty much ALL financial transactions worldwide.

Surely we can trace the billions of bailout dollars that were used to pay off naked credit default swaps and repatriate that money to the treasury, where it belongs. We shouldn't reward betting behaviour when we weren't involved in making the odds, nor the rules.

We could also trace back all of the Madoff money, and get some people paid back.

We could use the post 9/11 snooping apparatus for a truly patriotic purpose... protecting our national security by protecting us from greedy abusers of the system.

Which would suit me just fine.

Wednesday, March 25, 2009

Burning Fuffle

We've got way too much Fuffle in this country, we need to get rid of it, quickly and efficiently, if we are to survive as a nation.

Friday, February 27, 2009

Princeton Premier - Spam De Jure

In today's email...

Dear Mike,

It is my pleasure to inform you that you are being considered for inclusion into the 2009-2010 Princeton Premier Business Leaders and Professionals Honors Edition section of the registry.

The 2009-2010 edition of the registry will include biographies of the world's most accomplished individuals. Recognition of this kind is an honor shared by thousands of executives and professionals throughout the world each year. Inclusion is considered by many as the single highest mark of achievement.

You may access our application form using the following link:

xxxxx (link removed)

Upon final confirmation, you will be listed among other accomplished individuals in the Princeton Premier Registry.

For accuracy and publication deadlines, please complete your application form and return it to us within five business days.

There is no cost to be included in the registry.

If you've already received this email from us, there is no need to respond again.

This email serves as our final invitation to potential members who have not yet responded.

On behalf of the Executive Publisher, we wish you continued success.

Sincerely,
Jason Harris

Managing Director
Princeton Premier

Wednesday, February 25, 2009

Ballet with Strangers

The Dan Ryan expressway is where you get to dance with strangers wearing 2000 pound ballet slippers.

It always amazes me that things flow as smoothly as they do.

Friday, February 06, 2009

What they put in the water

So, some workers in Bellaire, Ohio accidentally put a less-toxic-than-normal chemical in the water supply, but they had to purge it anyway. They put in 40 pounds of hydrochloric acid (found in your stomach) instead of the usual quantity of fluoride.

What harm could a lower pH really do?  Clean out some pipes, or are they worried about dissolved lead? All of the calcium buildup would buffer it out quickly anyway.

People just don't understand Chemistry these days, do they?

Tuesday, February 03, 2009

Tired of the monopoly game.

Doc Searls is tired of a specific part of the monopoly game, sucky AT&T coverage.

I like to think bigger, I'm tired of the whole monopoly game.

I think we should take a few big chunks of spectrum and dedicate them to a new mesh transport network. There would be standards for equipment, with the good old FCC doing type approvals. We could then all buy our own off-the-shelf part of the internet. Everyone could own it, or improve it. I expect that groups would quickly form to meet common needs, and commercial interests would leverage their existing sites to move into this new opportunity.

Instead of government-enforced monopoly use of the spectrum, everyone would have to peer fairly with everyone. For the commercial end of things, there would be minimum requirements that would allow you to make money, but keep the incentive there for others to contribute to the spectral commons. The main billing event would be transit off the wireless grid and back into the phone company or internet. Those would be the toll booths. This means that if you maintained enough equipment to have good wireless connectivity, you wouldn't have to pay any tolls.

The other option would be to bill it out like railroads, so that everyone gets fairly paid for how much traffic they help move. It's my understanding that the owner of a section of rail gets paid by whoever moves cars over it, and the owners of the cars are paid rent for their use. It's complicated, but it works.


Either way, let's give strong incentives for people to put up mesh network nodes, and keep them powered and maintained. This can help route around the huge cost of laying fiber, and get us acceptable speeds at acceptable costs, right now, in spite of the economy, etc. Pay back the unused portion of the rents we've charged the cellular networks if they agree to the plan, and are willing to provide access in a neutral manner.

Tuesday, January 27, 2009

Adding value - looking for balance in photography

Here is an example of added value... done after giving the family all of the raw photos from the Baptism. It took a few train trips worth of effort to get these three panoramas stitched, and did require some manual editing.

Addison Baptism - Panorama 01

Addison Baptism - Panorama 02

Addison Baptism - Panorama 03

The reason I gave them the raw photos was one of simple expediency, and to hedge against the possibility of the task of delivering them falling off my to-do list. I'm not as happy with this as I thought I would be.

The first reason is that I take a LOT of photos... 1202 at a baby shower on Saturday, and 900 at the Baptism and other associated events on Sunday. This means that I've given the task of reviewing that many essentially random photos to families that have other things to do.

The second reason I've come to realize lately is that I don't get feedback... I really need to find out what people like, so I can give them more of that, and less stuff they don't care about. It's impossible to learn without feedback.

The last reason is one of adding value which is kind of a merge of the others... I want to make the photographs a gift of value... not just a pile of snaps. I want to help people make memories they will cherish through their years, and be able to share with others. I can't do that alone.

Sunday, January 18, 2009

Computational Imaging - experiments to date

I've been experimenting with doing things with a stack of images, some of it is virtual focus, some of it synthetic aperture, and some of it involving multiple image layering (aka "multisync"). This seems to have generated an audience who wishes to know more, so I'm writing this in response. Please feel free to add questions in the comments, and I'll answer as precisely as I can.

Multisync

The first thing to discuss is what I call "multisync" for lack of a better term. Suggestions for a better term for this would be greatly appreciated. A "multisync" photo is simply a set of photos layered to show (or hide) a subject across multiple moments in time. A tripod makes it relatively trivial to do so. Here's one of my personal favorites:


Virginia walks with ball and flower

Here you see Virginia at 2 years old, clearly walking towards a goal. At the time I had something like this in mind, but when the opportunity presented itself, it was a quick rush to get into position and hold down the shutter to get a continuous stream of photos. The results were amazing to me. It's not often that something truly original results from an experiment, but this time I feel like it did.

I didn't have a tripod at the time, so the images were all taken with slightly different angles. I used Hugin to align them, because that is the tool I feel most comfortable with. It's a matter of trading time post-exposure for pre-exposure setup. I'm very happy with this particular trade.

Synthetic aperture

Once I learned about synthetic aperture, I was hooked. This is the thing that got me started on this thread of experimentation. I still tip my hat to Marc Levoy for his work and demos that got me interested.

The basic idea is to trade your time and effort to replace a very large lens to create your own short depth of field. Here's an example:

Marshall Fields Clock

The basic process can be reduced to these basic steps:
  • take a lot of almost identical photos from slightly different positions
  • spend minutes or hours manually adding control points using Hugin
  • output the remapped images to a series of TIFF formatted images
  • average the results using a python script written for the purpose
I took this set of photos while standing near the Marshall Fields clock at State and Randolph in Chicago, Illinois. I took the pictures while moving left and right and forwards and backwards covering an area of a circle approximately 1 meter in diameter if I recall correctly.

Again, here I use Hugin to trade off pre-exposure alignment for time post exposure. However, in the case of virtual focus, it's pretty much impossible to align things pre-exposure. I imported the photos into Hugin, and chose 4 points on the face as alignment points and allowed Hugin to optimize for Roll, Pitch, Yaw, Zoom and X/Y offset. I'm pleased with the results, though I do wonder how many photos it would take to get a creamy bokeh.

Virtual Focus

Another way to combine photos is to take them from widely varying locations, creating images that would otherwise be impossible to capture on film, because it combines photos taken from non-contiguous locations.

Here is a good example of an otherwise impossible shot, using multiple exposures taken while receding from the subject at 30+ miles per hour:

Railroad Signals

The compression of distance, as the magnification increases to keep the relative size constant, shows some very interesting artifacts in the photo; in fact, I use this shot to help explain the process to others who commute with me.

Hugin - The Process

The process is pretty simple, if tedious. Import all of your exposures into Hugin, using the Images tab to avoid the auto-alignment process. Then manually enter control points between image pairs that are in your desired plane of focus.

Once you've gotten enough points entered and you've managed to optimize the error to an acceptable level (I try to get below 1 pixel of error), you then use the following option on the Stitcher tab:

Output : Remapped images

In the remapper options I turn off "saved cropped images", because my script can't handle cropped images. You then tell it to stitch now, and it will ask for a prefix; I always use the underscore character _ because it's easy to remember.

I usually then run the resulting image through one of the scripts I've written to average the frames. Lately, I use trail.py a lot, because it shows all of the intermediate steps. Here is a listing of the script:

import os, sys
import fnmatch
import Image  # Python Imaging Library (PIL); newer installs use "from PIL import Image"

# Program to average a set of photos, producing merge.jpg as a result
#
# version 0.01
# Michael Warot
#

mask = sys.argv[1]  # filename pattern, e.g. "_*.tif" for Hugin's remapped output
count = 0

print mask
for file in os.listdir('.'):
    if fnmatch.fnmatch(file, mask):
        print file
        count = count + 1
        in2 = Image.open(file)
        if count == 1:
            in1 = in2
        # Blending with weight 1/count keeps in1 equal to the running
        # average of the first `count` images.
        in1 = Image.blend(in1, in2, (1.0 / count))
        in1.save("trail" + str(count) + ".jpg", "JPEG")

in1.save("merge.jpg", "JPEG")



You need to be warned that this script overwrites its output files without asking, so be careful. It also requires the Python Imaging Library (PIL), which might not be installed by default.

Once done, you'll have a set of jpeg images which show the intermediate results, along with merge.jpg which is the average of all frames.
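A quick sanity check on the averaging trick trail.py relies on: blending each new frame in with weight 1/count keeps the running image equal to the plain arithmetic mean of all frames seen so far. Here it is demonstrated on single pixel values rather than whole images:

```python
# Demonstration (numbers stand in for pixel values): the incremental
# blend used in trail.py reproduces the plain arithmetic mean.
frames = [10.0, 20.0, 60.0, 30.0]

running = frames[0]
for count, value in enumerate(frames[1:], start=2):
    # Equivalent of Image.blend(running, value, 1.0 / count)
    running = running + (value - running) / count

print(running)                       # 30.0
print(sum(frames) / len(frames))     # 30.0, the same
```

This is why the intermediate trailNN.jpg files are each a valid average of the frames up to that point, not just a work in progress.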

Summary
So, I hope this has been of help. Please feel free to ask additional questions, or point out errors or omissions. Thanks for your time and attention.

Saturday, January 10, 2009

Signal and Noise - Cleaning up my act

It turns out that Blogger, the service I use to host this blog, supports tagging (though they call it labels). My label usage to date hasn't been very good, so I'm going to take time to re-tag everything and make the archives more useable.

Friday, January 09, 2009

Signal and noise - Separation of concerns

I keep seeing the same meta-problems over and over again. Some new tool comes along to make it easier to communicate, and it takes off. Then people complain that it makes it too easy to communicate... and it loses steam.

We've been through this cycle so many times, and history always rhymes. This particular portion of the rhyme of time involves Doc Searls, Twitter and Facebook, because he mentioned all of them in one place.

History - Email

Email made asynchronous messaging possible for most of us. It became a de-facto component of any "internet access" package. There were some improvements along the way, but the basic protocol hasn't really changed since the switch from ! to @ email addressing.

Because the environment has changed, the lack of authentication of the sender of a message has turned it into the spam pit we all learn to accept as the status quo. If you take spam out of the picture, email is a great way to get data from sender to receiver. It's a deliberate 1:1 transmission medium, so the sender implicitly signals the intent for someone to read the message.

Mailing lists preserve that intent because they are opt-in and dedicated to a certain topic. Social conventions arose to keep the clutter down, and keep things on-message.

History - Usenet


Usenet is a distributed protocol for hosting discussions on an internet-wide basis. Before the spam got out of control, it was a useful way to get information on a wide variety of topics. Once again, the nature of the environment changed, and the lack of authentication made it into an even bigger spam pit than email is today.

History - IRC / Instant Messaging


IRC (Internet Relay Chat) made real-time group discussions possible across the internet. The spam factor drove some innovation among closed systems, which required some form of authentication, spawning a wide range of so-called Instant Message (IM) offerings among the various walled-garden services such as AOL and Prodigy.

The VHS/Betamax style battles to capture customers led to a lack of a single service arising to unify things, though with clients like Trillian it was possible. The need to install a specialized client was a big hindrance to adoption as well.

History - Twitter


While there have been other web-based messaging platforms in the past, the rise of twitter as a way to broadcast messages has been quite solid, in spite of some growing pains. The ability to see all public "tweets" and the additional innovation of third party tools such as search has kept it growing.

The fact that there is a central authority to register users and enforce some rules does help keep the signal to noise ratio in check. Like instant messaging, you choose whom you listen to, but in addition your audience chooses you as well.

The problem faced by twitter is the loss of intent. When you send a message to twitter, it goes out to the entire audience; there is no way to segment your audience by intent, which forces you to artificially limit yourself to the one common interest of your audience if you don't want to lose them. Otherwise you have to play a very delicate (and, from my point of view, unnecessary) balancing act to try to keep everyone happy.

History - Facebook

Facebook gives us an easy-to-use place for sharing our stuff with others. Because of the lack of intent, friction is developing between work and home; people really like the tool because of its ease of use, so they don't want to lose it to a lesser choice. The tension of multiple audiences is becoming easier to see with the rise in user population, and the increasing adoption of social networking platforms by businesses that have gotten a clue.

History - Internet Access and the soda straw

I have at least 320 gigabytes of stuff I'd like to share. I'm sure that anyone who has had a digital means of capturing images or sound could easily generate a gigabyte or two per month without breathing hard. The entrenched providers have built their networks on an asymmetric model that prohibits running servers, which is the only reasonable strategy for making this much content available on a discretionary basis.

Prohibition of servers at home then forces us to choose carefully the content we wish to share, and to send it to a silo where we begin to lose control and rights over our own stuff. We're forced to live our online life breathing digitally through a soda straw.

The Present

There are many tools that let you share content, but something is lacking in each of them. The ones that allow specifying an audience don't seem to have the necessary authentication or regulation of senders; the ones that work well for social networking don't allow for the separation of concerns. There is the additional difficulty of moving content to, from, and between these platforms and silos. The fact that this all has to be done through a limited-bandwidth connection certainly doesn't help.

Defining The Future - Social Aspects

If we are to have the future we desire, we must articulate the vision clearly and build consensus to push towards it as a goal. There are no obvious technical limitations that prohibit a future where we can share all the content we care to create. We do need to take care to make sure the signal to noise ratio remains as high as possible for those we share with. I believe that the simple act of being polite and mindful of our audience is going to carry the highest social value in the future, as it currently does with our limited tools.

Defining The Future - Technology


The future is all about metadata. We don't have enough of it right now. Flickr helps by allowing tags, and this is a very powerful tool. We need to build tool sets that allow tagging to become a socially shared tool as well. It would be valuable to allow someone else to review my photos, and decide which are keepers, or to tag them for specific keywords, possible venues, etc.

HTML needs to be expanded to allow for the markup and tagging of existing content by other parties. Of course, there needs to be some authentication built into it, to prevent graffiti and other forms of spamming from taking over. This would allow someone to highlight the "geek speak" parts of this very blog post, for example. It would also allow someone to highlight the part they found insightful, or insulting, or whatever.

Offline forms of sharing should be sought out to allow families and others to route around the poorly designed "broadband" we've all been brought up to think was fast.

Twitter, Facebook, and others should allow for some form of tagging, like flickr, to make it possible to subscribe to only certain content from other users.

In Summary

The limitations of Twitter and Facebook that Doc Searls complains about are limitations of metadata and intent. They are not unsolvable; it remains to be seen whether they get addressed through new features on those networks, or through migration to newer networks that offer what's required.

Thursday, January 01, 2009

Predictions for 2009

Here are my predictions for the new year:
  • Computer security remains unaddressed, as nobody cares enough to fix it.
  • The insecurity of the nodes on the ends becomes an even more valid reason to offer "internet access" instead of true connectivity.
  • IPv6 still sits on hold and fails to get traction.
  • The value of the US dollar rises for a while, then drops to a new low of 1 cent relative to its 1913 value.
  • The cutover to digital TV goes badly, and gets postponed at least 6 months.
  • Netbooks continue to grow in popularity.
  • Fixing things emerges as the hot new skill set as consumerism ceases to be a viable lifestyle choice for many.
  • Just In Time as a management strategy is shown the door.
  • Sneakernet comes back into style as a way to transfer data outside of the net.
  • Wifi networks for small groups that aren't connected to the internet begin to become popular among neighbors in rural areas.
  • Software to distribute Twitter-like feeds among ham radio operators takes off, leading a huge number of no-code amateurs to take up packet 2.0 to get around the phone companies.
  • A new standard for tagging links with additional information takes off, allowing us all to vote on anything with a link.
  • HyperText Markup Language continues to prohibit the markup of existing hypertext. (A personal pet peeve.)
  • The US dollar is devalued to 1/2000 ounce of gold, or 1/200 ounce of silver. Gold bugs rejoice; inflation is killed dead in its tracks.
  • Growing one's own food becomes a popular hobby.
OK... it's 1 AM as I write this, and these are wild stabs in the dark. I look forward to the year ahead. I live in interesting times, and I hope to do so for a long time.

--Mike--