Thursday, December 31, 2009
Tuesday, December 29, 2009
Thursday, December 24, 2009
Thursday, December 17, 2009
$26 Software Is Used to Breach Key Weapons in Iraq; Iranian Backing Suspected
Tuesday, December 15, 2009
We still seem to think that progress on the Net is the work of “brands” creating and disrupting and doing other cool stuff. Those may help, but what matters most is what each of us does better than anybody or anything else. The term “content” insults the nature of that work. And of its sources.
The revolution that matters — the one that will not be intermediated — is the one that puts each of us in the driver’s seat, rather than in the back of the bus. Or on a bus at all.
I think it's better to say that we're used to having reliable labels on our content, enforced by law (branding, trademark, copyright). These labels made it possible to aggregate many different pieces of stuff under one reputation umbrella.
Thursday, December 03, 2009
Wednesday, December 02, 2009
Sunday, November 29, 2009
Virginia and I just finished assembling a remote control forklift kit.
Monday, November 02, 2009
Sunday, November 01, 2009
- Giving both sides to a story isn't fair and balanced if one side is a tiny vocal minority
- Tiny vocal minorities can be created by corporations with a relatively modest investment of resources, and can net a large disruption of Democracy in the process. I'd recommend reading the works of John Robb to see what that implies for the future.
- The science behind global warming has been known for quite a long time... longer than even I imagined it to be.
- There are parts of government which actually work quite well. Let's not throw it out, but encourage the good parts, and weed out the bad.
- Slippery slope arguments are bad for one's logical thought processes, because they replace the possibility of progress through dialogue with fear.
Thursday, October 29, 2009
So now I'm not the only one with an idea.... Dave Winer has it too... some way to manage all of the downloads, via a 3rd party.
He's only worried about local stuff.... I want to distribute the list, etc... but I think there is some common ground.
Saturday, October 24, 2009
Monday, October 19, 2009
Sunday, October 11, 2009
Saturday, October 10, 2009
- All downloads would get tracked in one place, per user, instead of per PC
- All downloads would happen across very high speed links (assuming a hosted service)
- Virus scanning could be built in
- Tagging and other forms of metadata could be added
- Source metadata could be automatically saved
- Registration data could be supplied via OpenID, or some other means, instead of filling out the same lead data over and over.
- Lower-friction lead capture might increase the quantity and quality of the data provided to the site offering downloads
- Rating and other social networking features could be added as well.
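The features listed above could all hang off a single per-user download record. A minimal sketch in Python — every field name here is a hypothetical illustration of the imagined hosted service, not an existing API:

```python
from dataclasses import dataclass, field

# Hypothetical per-user download record for the imagined hosted service.
@dataclass
class DownloadRecord:
    user_openid: str   # registration via OpenID instead of repeated lead forms
    source_url: str    # source metadata saved automatically
    tags: list = field(default_factory=list)  # user-supplied tags and metadata
    virus_scanned: bool = False               # result of built-in virus scanning
    rating: int = 0                           # social rating features

# One record per download, tracked per user rather than per PC.
rec = DownloadRecord("https://example.openid/alice", "http://example.com/app.zip")
rec.tags.append("utility")
rec.virus_scanned = True
```

Kept in one place per user, a list of these records is all the service would need to support tagging, scanning status, and ratings across every machine the user owns.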
Friday, October 09, 2009
Wednesday, October 07, 2009
Monday, August 31, 2009
Tuesday, July 07, 2009
Sunday, June 28, 2009
- Secure OS
- Mesh Internet (related: Throttling, Price Tiers, Metering)
- DNS Security
- Spoofing/Authentication (possibly related: Context)
Saturday, June 27, 2009
Wednesday, June 24, 2009
- Secure operating systems
- Mesh networking
- Distributed content systems / sync gets solved
Friday, May 29, 2009
It would be nice to be able to actually mark up hypertext... but that still appears to be outside the range of feasibility.
Maybe it could be called TLM - Text Language Markup?
Our experience with collaborative editing of book manuscripts at O'Reilly suggests that the amount and quality of participation goes up radically when comments can be interleaved at a paragraph level.
Tuesday, May 19, 2009
Monday, May 18, 2009
Thursday, May 14, 2009
Wednesday, May 06, 2009
Wednesday, April 29, 2009
I think it's time to explode Twitter into its components, making each of them public and implementable as a service, a hosted app, or something run on the end user's hardware.
Twitter succeeds because it allows for the rapid spread of messages from a controlled user base... there is a central authority that can (but doesn't always) ban a user, etc. This means that every message is authenticated a bit... and all messages are tied to an identity. This makes filtering possible.
There is NOT any really good rating metadata. The messages are too short. This sets expectations, but really does cripple it for important stuff.
Blogging is seen as too slow, but RSS is a slow version of twitter. I think that metadata richness is the fix to this whole thing. Trade a bit of speed (twitter is too fast anyway) for expressive power.
We need to be able to aggregate our own stuff, which is one of the strengths of RSS. The ability to follow (and unfollow) an authenticated message channel is a great plus.
Collective ratings are ok, but the scales of services such as digg (up or down), and slashdot (funny is the same as insightful), are very limiting. It would be more useful for longer term (slower) conversations to be able to add more expressive metadata. Funny, Insightful, Biased, SelfPromoting, etc... could all be but a few of the plethora of possible bits of critical review that could be added.
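The multi-dimensional ratings described above could be as simple as a per-message tally of review labels instead of a single up/down score. A minimal sketch in Python — the label names come from this post; everything else here is a hypothetical illustration, not an existing service's API:

```python
from collections import Counter

# Tally expressive review metadata per message instead of a single
# up/down score; the vocabulary is open-ended, these are just examples.
votes = ["Insightful", "Funny", "Insightful", "Biased", "SelfPromoting"]

ratings = Counter()
for vote in votes:
    ratings[vote] += 1

# A reader could then filter or sort by whichever dimensions they care
# about, rather than a single collapsed score.
print(ratings["Insightful"])   # 2
```

The point of the sketch is only that nothing technical stands in the way: richer metadata is a counting problem, not a hard one.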
It would be nice to be able to actually mark up hypertext... but that still appears to be outside the range of feasibility.
Maybe it could be called TLM - Text Language Markup?
Tuesday, April 07, 2009
Thursday, April 02, 2009
Monday, March 30, 2009
Saturday, March 28, 2009
Surely we can trace the billions of bailout dollars that were used to pay off naked credit default swaps and repatriate that money to the treasury, where it belongs. We shouldn't reward betting behavior when we weren't involved in making the odds or the rules.
We could also trace back all of the Madoff money, and get some people paid back.
We could use the post-9/11 snooping apparatus for a truly patriotic purpose... protecting our national security by protecting us from greedy abusers of the system.
Which would suit me just fine.
Wednesday, March 25, 2009
Monday, March 02, 2009
Friday, February 27, 2009
It is my pleasure to inform you that you are being considered for inclusion into the 2009-2010 Princeton Premier Business Leaders and Professionals Honors Edition section of the registry.
The 2009-2010 edition of the registry will include biographies of the world's most accomplished individuals. Recognition of this kind is an honor shared by thousands of executives and professionals throughout the world each year. Inclusion is considered by many as the single highest mark of achievement.
You may access our application form using the following link:
xxxxx (link removed)
Upon final confirmation, you will be listed among other accomplished individuals in the Princeton Premier Registry.
For accuracy and publication deadlines, please complete your application form and return it to us within five business days.
There is no cost to be included in the registry.
If you've already received this email from us, there is no need to respond again.
This email serves as our final invitation to potential members who have not yet responded.
On behalf of the Executive Publisher, we wish you continued success.
Wednesday, February 25, 2009
Friday, February 06, 2009
Tuesday, February 03, 2009
I like to think bigger, I'm tired of the whole monopoly game.
I think we should take a few big chunks of spectrum and dedicate them to a new mesh transport network. There would be standards for equipment, with the good old FCC doing type approvals. We could then all buy our own off-the-shelf part of the internet. Everyone could own it, or improve it. I expect that groups would quickly form to meet common needs, and commercial interests would leverage their existing sites to move into this new opportunity.
Instead of government enforced monopoly use of the spectrum, everyone would have to fairly peer with everyone. For the commercial end of things, there would be minimum requirements that would allow you to make money, but keep the incentive there for others to contribute to the spectral commons. The main billing event would be transit off the wireless grid and back into the phone company or internet. Those would be the toll booths. This means that if you maintained enough equipment to have good wireless connectivity, you wouldn't have to pay any tolls.
The other option would be to bill it out like railroads, if necessary, so that everyone gets fairly paid for how much traffic they help move. It's my understanding that the owner of a section of rail gets paid by whoever moves cars over it, and the owners of the cars collect rent from whoever uses them. It's complicated, but it works.
Either way, let's give strong incentives for people to put up mesh network nodes, and keep them powered and maintained. This can help route around the huge cost of laying fiber, and get us acceptable speeds at acceptable costs, right now, in spite of the economy, etc. Pay back the unused portion of the rents we've charged the cellular networks if they agree to the plan, and are willing to provide access in a neutral manner.
Tuesday, January 27, 2009
The reason I gave them the raw photos was one of simple expediency, and to hedge against the possibility of the task of delivering them falling off my to-do list. I'm not as happy with this as I thought I would be.
The first reason is that I take a LOT of photos... 1202 at a baby shower on Saturday, and 900 at the Baptism and other associated events on Sunday. This means that I've given the task of reviewing that many essentially random photos to families that have other things to do.
The second reason I've come to realize lately is that I don't get feedback... I really need to find out what people like, so I can give them more of that, and less stuff they don't care about. It's impossible to learn without feedback.
The last reason is one of adding value which is kind of a merge of the others... I want to make the photographs a gift of value... not just a pile of snaps. I want to help people make memories they will cherish through their years, and be able to share with others. I can't do that alone.
Sunday, January 18, 2009
The first thing to discuss is what I call "multisync" for lack of a better term. Suggestions for a better term for this would be greatly appreciated. A "multisync" photo is simply a set of photos layered to show (or hide) a subject across multiple moments in time. A tripod makes it relatively trivial to do so. Here's one of my personal favorites:
Here you see Virginia at 2 years old, clearly walking towards a goal. At the time I had something like this in mind, but when the opportunity presented itself, it was a quick rush to get into position and hold down the shutter to get a continuous stream of photos. The results were amazing to me. It's not often that something truly original results from an experiment, but this time I feel like it did.
I didn't have a tripod at the time, so the images were all taken with slightly different angles. I used Hugin to align them, because that is the tool I feel most comfortable with. It's a matter of trading time post-exposure for pre-exposure setup. I'm very happy with this particular trade.
Once I learned about synthetic aperture, I was hooked. This is the thing that got me started on this thread of experimentation. I still tip my hat to Marc Levoy for his work and demos that got me interested.
The basic idea is to trade your time and effort in place of a very large lens, creating your own shallow depth of field. Here's an example:
The basic process can be reduced to these steps:
- take a lot of almost identical photos from slightly different positions
- spend minutes or hours manually adding control points using Hugin
- output the remapped images to a series of TIFF formatted images
- average the results using a python script written for the purpose
Again, here I use Hugin to trade off pre-exposure alignment for time post exposure. However, in the case of virtual focus, it's pretty much impossible to align things pre-exposure. I imported the photos into Hugin, and chose 4 points on the face as alignment points and allowed Hugin to optimize for Roll, Pitch, Yaw, Zoom and X/Y offset. I'm pleased with the results, though I do wonder how many photos it would take to get a creamy bokeh.
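The averaging step above is just a running mean: folding each new frame in with weight 1/count leaves the result equal to the average of all frames seen so far, which is exactly what a single long exposure through a giant lens would record. A quick sketch of the arithmetic in Python, using made-up pixel values:

```python
# Folding frame i into the running result with weight 1/i keeps the
# result equal to the mean of all frames seen so far.
frames = [10.0, 20.0, 60.0]   # stand-in pixel values for three exposures

avg = 0.0
count = 0
for pixel in frames:
    count += 1
    # same formula PIL's Image.blend applies per pixel: a*(1-w) + b*w
    avg = avg * (1 - 1.0 / count) + pixel * (1.0 / count)

print(avg)   # 30.0, the mean of the three values
```

This is why the frames can be processed one at a time without ever holding the whole stack in memory.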
Another way to combine photos is to take them from widely varying locations, creating images that would otherwise be impossible to capture on film, because it combines photos taken from non-contiguous locations.
Here is a good example of an otherwise impossible shot using multiple exposures taken while receding from the subject at 30+ miles per hour:
The compression of distance, as the magnification increases to keep the subject's relative size constant, produces some very interesting artifacts in the photo; in fact, I use this shot to help explain the process to others who commute with me.
Hugin - The Process
The process is pretty simple, if tedious. Import all of your exposures into Hugin, using the Images tab to avoid the auto-alignment process. Then manually enter control points between image pairs that are in your desired plane of focus.
Once you've gotten enough points entered and you've managed to optimize the error to an acceptable level (I try to get below 1 pixel of error), you then use the following option on the Stitcher tab:
Output : Remapped images
In the remapper options I turn off "save cropped images" because my script can't handle cropped output. You then tell it to stitch now, and it will ask for a prefix; I always use the underscore character _ because it's easy to remember.
I usually then run the resulting image through one of the scripts I've written to average the frames. Lately, I use trail.py a lot, because it shows all of the intermediate steps. Here is a listing of the script:
# trail.py - average a set of photos, producing merge.jpg as a result
# version 0.01
# Michael Warot
import fnmatch
import os
import sys

from PIL import Image

mask = sys.argv[1]          # filename pattern to match, e.g. "_*.tif"
count = 0
for file in sorted(os.listdir('.')):
    if fnmatch.fnmatch(file, mask):
        count = count + 1
        in2 = Image.open(file)
        if count == 1:
            in1 = in2
        # running average: fold the new frame in with weight 1/count
        in1 = Image.blend(in1, in2, 1.0 / count)
        # save each intermediate step as a numbered jpeg
        in1.convert('RGB').save('trail%04d.jpg' % count)
in1.convert('RGB').save('merge.jpg')
You need to be warned that this script overwrites its output files without asking, so be careful. It also requires the Python Imaging Library (PIL), which might not be installed by default.
Once done, you'll have a set of jpeg images which show the intermediate results, along with merge.jpg which is the average of all frames.
So, I hope this has been of help. Please feel free to ask additional questions, or point out errors or omissions. Thanks for your time and attention.
Saturday, January 10, 2009
Friday, January 09, 2009
We've been through this cycle so many times, and history always rhymes. This particular portion of the rhyme of time involves Doc Searls, Twitter and Facebook, because he mentioned all of them in one place.
History - Email
Email made asynchronous messaging possible for most of us. It became a de-facto component of any "internet access" package. There were some improvements along the way, but the basic protocol hasn't really changed since the switch from ! to @ email addressing.
Because the environment has changed, the lack of authentication of the sender of a message has turned it into the spam pit we've all learned to accept as the status quo. If you take spam out of the picture, email is a great way to get data from sender to receiver. It's a deliberate 1:1 transmission means, which means the sender implicitly signals the intent for someone to read the message.
Mailing lists preserve the intent because they are opt-in and dedicated to a certain topic. Social conventions arose to keep the clutter down, and keep things on-message.
History - Usenet
Usenet is a distributed protocol for hosting discussions on an internet-wide basis. Before the spam got out of control, it was a useful way to get information on a wide variety of topics. Once again, the nature of the environment changed, and the lack of authentication made it into an even bigger spam pit than email is today.
History - IRC / Instant Messaging
IRC - Internet Relay Chat made real-time group discussions possible across the internet. The spam factor drove some innovation among closed systems, which required some form of authentication, spawning a wide range of so-called Instant Messaging (IM) offerings among the various walled garden services such as AOL and Prodigy.
The VHS/Betamax style battles to capture customers led to no single service arising to unify things, though with clients like Trillian it was possible. The need to install a specialized client was a big hindrance to adoption as well.
History - Twitter
While there have been other web-based messaging platforms in the past, the rise of twitter as a way to broadcast messages has been quite solid, in spite of some growing pains. The ability to see all public "tweets" and the additional innovation of third party tools such as search has kept it growing.
The fact that there is a central authority to register users and enforce some rules does help keep the signal to noise ratio in check. Like instant messaging, you choose whom you listen to, but in addition your audience chooses you as well.
The problem faced by Twitter is the loss of intent. When you send a message to Twitter, it goes out to your entire audience; there is no way to segment that audience by intent, which forces you to artificially limit yourself to the one common interest of your audience if you don't want to lose them. Otherwise you have to play a very delicate (and, from my point of view, unnecessary) balancing act to try to keep everyone happy.
History - Facebook
Facebook gives us an easy-to-use place for sharing our stuff with others. Because of the lack of intent, friction is growing between work and home, and people really like the tool because of its ease of use, so they don't want to lose it to a lesser choice. The tension of multiple audiences is becoming easier to see with the rise in user population, and the increasing adoption of social networking platforms by businesses who have gotten a clue.
History - Internet Access and the soda straw
I have at least 320 gigabytes of stuff I'd like to share. I'm sure that anyone with a digital means of capturing images or sound could easily generate a gigabyte or two per month without breathing hard. The entrenched providers have built their networks on an asymmetric model that prohibits running servers, which is the only reasonable strategy for making this much content available on a discretionary basis.
Prohibition of servers at home then forces us to choose carefully the content we wish to share, and to send it to a silo where we begin to lose control and rights over our own stuff. We're forced to live our online life breathing digitally through a soda straw.
There are many tools to allow you to share content, but something is lacking in each of them. The ones that allow specifying an audience don't seem to have the necessary authentication or regulation of senders. The ones that work well for social networking don't allow for the separation of concerns. There is the additional difficulty of moving content to, from, and between these platforms and silos. The fact that this all has to be done through a limited bandwidth connection certainly doesn't help much.
Defining The Future - Social Aspects
If we are to have the future we desire, we must articulate the vision clearly and build consensus to push towards it as a goal. There are no obvious technical limitations that prohibit a future where we can share all the content we care to create. We do need to take care to make sure the signal to noise ratio remains as high as possible for those we share with. I believe that the simple act of being polite and mindful of our audience is going to carry the highest social value in the future, as it currently does with our limited tools.
Defining The Future - Technology
The future is all about metadata. We don't have enough of it right now. Flickr helps by allowing tags, and this is a very powerful tool. We need to build tool sets that allow tagging to become a socially shared tool as well. It would be valuable to allow someone else to review my photos, and decide which are keepers, or to tag them for specific keywords, possible venues, etc.
HTML needs to be expanded to allow for the markup and tagging of existing content by other parties. Of course, there needs to be some authentication built in, to prevent graffiti and other forms of spamming from taking over. This would allow someone to highlight the "geek speak" parts of this very blog post, for example. It would also allow someone to highlight the part they found insightful, or insulting, or whatever.
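Third-party markup of existing hypertext could be as simple as a detached, authenticated annotation record that points at a span of someone else's page without touching the page itself. A minimal sketch in Python — the field names and the record shape are my assumptions for illustration, not any existing standard:

```python
from dataclasses import dataclass

# A detached annotation: it tags a span of someone else's page without
# modifying the page itself. All field names here are hypothetical.
@dataclass
class Annotation:
    author: str        # authenticated identity, to keep graffiti out
    target_url: str    # the page being marked up
    start: int         # character offset where the highlight begins
    end: int           # character offset where the highlight ends
    tag: str           # e.g. "geek speak", "insightful", "insulting"

note = Annotation("alice", "http://example.com/post", 120, 180, "insightful")
```

Because the annotation lives outside the page, the original author's HTML never changes; a browser or aggregator that trusts the annotator's identity could overlay the highlight on demand.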
Offline forms of sharing should be sought out to allow families and others to route around the poorly designed "broadband" we've all been brought up to think was fast.
Twitter, Facebook, and others should allow for some form of tagging, like flickr, to make it possible to subscribe to only certain content from other users.
The limitations of Twitter and Facebook that Doc Searls complains about are those of metadata and intent. They are not unsolvable, and they can be addressed. It remains to be seen if that happens via inclusion of new features, or migration to even newer networks that offer the required features.
Thursday, January 01, 2009
- Computer security still remains unaddressed, as nobody cares enough to fix it.
- The insecurity of the nodes on the ends becomes an even more valid reason to offer internet access instead of true connectivity
- IPv6 still sits on hold and fails to get traction.
- The value of the US dollar increases for a while, then drops to a new low of 1 cent relative to the 1913 value.
- The cutover to digital TV goes bad, and gets postponed at least 6 months.
- Netbooks continue to become more popular
- Fixing things will emerge as the hot new skill set as consumerism ceases to be a viable lifestyle choice for many.
- Just In Time as a management strategy will be shown the door.
- Sneakernet comes back into style as a way to transfer data outside of the net.
- Wifi networks for small groups that aren't network connected will begin to become popular for neighbors in rural areas.
- Software to distribute twitter-like feeds among ham radio operators takes off, and causes a huge number of no-code amateurs to take up packet 2.0 to get around the phone companies.
- A new standard for tagging links with additional information takes off, allowing us all to vote on anything with a link.
- HyperText Markup Language continues to prohibit the markup of existing hypertext. (A personal pet peeve)
- The US dollar is devalued to 1/2000 ounce of gold, or 1/200 ounce of silver. Gold bugs rejoice. Inflation killed dead in its tracks.
- Growing one's own food becomes a popular hobby.