Why I am switching to using Windows 7 as my boot OS

This is a little embarrassing.  My most recent source of unhappiness was my purchase of a Dell Vostro 3700 laptop that turned out to have one of those "Hybrid" graphics chipsets.  In exchange for a smashing purchase price I am stuck with an Nvidia FX graphics card that will not work in Linux.  Instead I get the equivalent of an Intel HD graphics card.  The awful truth is that I could have saved even more money by buying a laptop with just Intel HD graphics.  Pulling at this small thread has led me to even more sources of discontent.

 

In reality, my problems are much bigger than just my graphics card.  I have been a Linux user for quite some time (my first store-bought copy was Slackware 96).  I have used a variety of flavors with all types of user experiences: source-compile-only Gentoo, Debian package-management-based distros, even some live CD versions.  My younger, more patient self was willing to put up with the little foibles: bad device driver support, half-baked UI rewrites, API wars about issues that maybe 5 people in the world care about.  I was willing to put up with it because, until about 2 or 3 Windows revisions back, Windows was not the most competent O.S.  Furthermore, Linux was usually blazingly fast on even limited hardware.  One of these things has changed, though: since about Windows Vista, Windows has actually become a very usable and elegant O.S.  In fact, it's fair to say that Windows 7 is easily the best desktop O.S. that has ever existed.  I say this without the least irony and with a little eating of crow.

 

I have to admit, I wasn’t exactly rooting for Microsoft in the desktop O.S. wars.  I frankly didn’t care about commercial operating systems as I was perfectly happy using freely available ones.  A few things have changed for me in the last few years:

  1. I don't have time to tinker anymore.  I have a full-time 9-to-5, and then I come home to code my startup application.  I don't have time to spend days or weeks without my computer just working.
  2. Linux driver support has become progressively worse.  Back in the day, Linux was usually made to work on a few manufacturers' PCs and maybe one or two high-performance computing platforms.  Today, Linux is running on everything from cell phones to toasters.  The resulting dilution of resources means that Linux just doesn't have the same driver support it used to have.  This is particularly bad for desktop users, as the paid driver developers are working for enterprise vendors like Red Hat, and they no longer care about desktop Linux.
  3. I finally have a big problem with the perpetual-change model of Linux.  Here is the thing: do we really need 25 window managers?  Do we really need 10 application servers?  I'm all for diversity, but diversity should be moving us forward, not just change for the sake of change.  The Linux community seems to accept that the platform is perpetually changing and that we have to keep updating on a regular basis, like a waiter serving drinks on the roof of a moving vehicle.  That analogy captures what it's like being a Linux desktop user today.  Every day you log into your machine and you are greeted by 5-10 updates.  You can read the developer descriptions of those updates, but they don't get into the epic flame wars that may be associated with them.  You also won't hear about why your machine may be broken after installing them.  (I could write a book on why this is not always the developers' fault; there are so many reasons why.)  The problem is that, other than the kernel, Linux frankly has too many updates that can cause end-user problems.  The traffic cops here are the distro developers, and this problem lies in their hands; it is for them to solve.  Quite frankly, distros need integration labs like Microsoft has, where they thoroughly test new versions before foisting them on an unsuspecting public.
  4. I am more concerned about being able to work and be productive than I am about having new features.  Maybe I'm getting old, or maybe I am very busy.  Either way, I wouldn't mind using FVWM and an old libc if I didn't have to worry about my laptop breaking every few weeks.

 

So how does Windows 7 fit into all this?  First off, it's a damn fine OS.  I've never had it crash on me, and I've used it on several different laptops and desktops.  It's not a memory hog.  The UI is very nice and simple.  The driver support is amazing.  Overall it is a very unfussy product.  This leaves me in somewhat of a bind, though, since most of my development work at home is on the LAMP stack.  However, there are several virtualization platforms that allow you to run Linux inside Windows 7.  Now this may seem quite strange, but I need to run Linux for all the great developer tools and wonderfully free APIs.  So I am likely going to give up on dual booting my laptop.  Long rant for little payoff, but this is what you get from staying up late over something stupid.

Sobering details of the startup

For those who know me personally, you know that I haven't been able to shut up about this platform I have been working on for the past few months.  It's been a lot of hard work, especially because it was happening after my 9-to-5.

Unfortunately, a competitor has gotten to market with a product very similar to ours.  But if you know anything about people who try things like this, that's just water off a duck's back.  We are aware that the market has changed, but we are still convinced that we can develop a viable platform and a good company.  You see, that's what we are really after: not so much to be the most well known or the most profitable, but just to be sustainable and reputable.

At this point I only have the following advice:

  1. Do something you believe in.  Don't be caught doing something that you wouldn't want to be doing when you die.
  2. Remember that the race belongs to the dedicated and wise, not the quick.
  3. You have to be doing this for a reason other than making money, because frankly you can make money selling internet porn or as a mail carrier.
  4. God rewards those who pursue their dreams with singular conviction.  You may not get exactly what you thought you would get, but trust me, it will be something so much more worthwhile.
  5. Don't limit yourself to just what you can imagine.  There is a whole universe out there, and you can't imagine its boundaries or where you might fit within them.
  6. Have a fine German ale every once in a while; it will reboot you very quickly.

Best of luck on your adventures my friends, I am actually enjoying my own.

Google Interests launches

The first version of one of my side projects is now up on appspot.  I created this app because I wanted a way to do data mining of particular subjects that interest me.  In essence, the app does diffs of Google search requests over time.  It gives you a way to store and visualize the Google search history of terms.  These terms become an "interest" with the inclusion of an NLP element.  Terms can be whatever you choose, but the idea is to have an NLP engine take your search terms and create a better query using the engine's notion of taxonomy.  The NLP component has not been implemented yet, but the search history has.  I'm still working on the guts of this app, so please be gentle.
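To make the "diff" idea concrete, here is a minimal sketch of the core comparison the app performs.  This is not the app's actual code; the function and field names are hypothetical, and the snapshots are made-up stand-ins for stored search results for one term on two different dates.

```python
def diff_snapshots(old_results, new_results):
    """Compare two snapshots (lists of result URLs) for the same search term."""
    old_set, new_set = set(old_results), set(new_results)
    return {
        "added": sorted(new_set - old_set),      # URLs that entered the results
        "removed": sorted(old_set - new_set),    # URLs that fell out
        "unchanged": sorted(old_set & new_set),  # URLs present in both snapshots
    }

# Two hypothetical snapshots of the same term, taken a week apart.
monday = ["http://a.example/1", "http://b.example/2", "http://c.example/3"]
sunday = ["http://b.example/2", "http://c.example/3", "http://d.example/4"]

delta = diff_snapshots(monday, sunday)
print(delta["added"])    # what is new this week
print(delta["removed"])  # what dropped out
```

Stored over many dates, these deltas are what make the history visualizable; the NLP layer would only change how the query itself is built, not this comparison.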

It's my first GWT app as well, so it has given me an opportunity to learn a lot.  This app is my focus right now, since I am using such good tools to develop it and learning so much in the process.  I am looking for UI designers and HCI experts to help me design the interface, so if you are interested please drop me a note.

Apple Adjusts

Apple has loosened restrictions on developers writing code for iOS.  This is entirely to be expected.  Apple is a smart company.  They can feel themselves losing their grip on the mobile marketplace.  They need all hands on deck to fend off this winter's Android tablet invasion.  If this year is any indication, Android will dominate the mobile phone market even more quickly than I could have expected.  Earlier this year I predicted 3 years, but honestly, with 800,000 of one brand of Android phone selling out in a week and no vendor able to keep any high-end Android phone in stock, the tea leaves are clear.  Apple knows that their last and best hope has to be the iPad.  If the iPad doesn't remain the dominant computing platform for media delivery, then all those media houses, and Apple, are in trouble.

But how could the iPad be that device?  First off, it's not the best tablet.  It's just Apple's first successful tablet.  That is saying something, but Apple will not be able to hold off the Android tablet phenomenon.  There are going to be at least 10 to 12 tablets with more features and lower prices than the iPad this Christmas.  Once the media houses realize that they have to do Android apps as well, Apple is sunk in the tablet market.  I have to admit that Apple is going to lose this battle, and not because their product is second best.  In fact, if the iPad were the most amazing tablet PC ever, it would still be a disaster.  No one can compete with multiple competitors offering even the same product at lower prices.  This fantasy that Apple somehow has an intuitive understanding of consumers will be proven for the bunk it is.  The most brilliant product Apple has ever created was OS X.  Unfortunately, there are literally dozens of OSes available (most for free) that offer the same or better features than OS X.  That Microsoft Windows is dominant is a reflection of Microsoft's market saturation rather than any inferiority of Apple's products.

Apple is smart enough to accept that they will lose the mobile phone market.  They can accept that as long as they can still charge a premium for their phone (which is why they will never leave AT&T).  What they cannot accept is that ubiquitous computing will leave them behind.  Apple is not going to be defeated by an Android sword.  Instead, it is going to be defeated by a million Android paper cuts.  It's a familiar feeling for Apple, and it can't feel good.  Even I wish it would happen to a worse company.

Speaking of a worse company, what the hell is Nokia thinking?  Symbian 3's opportunity was back in 2008, before Android hit the street.  The moment the G1 came out, the opportunity for a new mobile phone stack to make it pretty much disappeared.  Android is literally the best we can do right now for a phone stack.  I know it sucks to accept this, but honestly, we should have been able to have something like Android with J2ME, and I don't even know how long ago that disaster was.  Android's greatest innovation is that it is an open source mobile device platform.  Sure, you can make a better platform.  But if you aren't willing to give it away, you can't beat Android.  I suspect that Nokia is going to undergo a painful market readjustment once the failure of Symbian 3 is accepted.  Once they and BlackBerry start selling Android devices, I will invest in them.  Otherwise, they are on the wrong side of history.
        

Problems of being a large company, or why GWT and AppEngine don’t work well together

I wasted about a week on a known issue with GWT and App Engine (link).  The basic problem is that GWT requires the source of a Java class in order to provide access to that class on the client side.  The issue is that age-old problem of front-end and back-end versions of objects that you encounter a lot.  The fix would have been for the App Engine team to consult with the GWT team, so that this restriction of GWT wouldn't mean trouble for developers using both.  As it stands, I'm going to use the lightweight-object solution: a plain Java copy of each persistent class that is safe to send to the client.

Sucks that this problem exists, but it's what happens at a large company.  Two teams go off and do something, and the results are a little incompatible.  It happens.

Google Web Toolkit

The big boys have another hit on their hands. If you haven't heard, the Google Web Toolkit is out in the wild and even comes with a nice Eclipse plugin. The link is here (http://code.google.com/webtoolkit/). Very nice stuff, and it even offers the ability to use Google App Engine as a host for your apps. The benefits of using GWT are plentiful, but for my purposes:
1. It allows me to write a snappy-looking Ajax-enabled app without becoming a jQuery god.
2. I can develop an app in a productive language like Java and then publish it as raw JavaScript.
3. The Eclipse plugin means I don't have to think about it.
4. Built-in support for Google App Engine (one less thing for me to think about).
5. Did I mention the whole Java thing?  That is huge, because you are getting the power of Java in your apps.

Steve Jobs and the rest of the world

Before I say anything, let me say that I have a tremendous amount of respect and regard for Apple and their marketing expertise. This is not a slam, but they sell 5-year-old technology and make more off it than even its creators. That's not their shortcoming; it's a recognition of how talented they are at the sale. There is a genius in that. Complain all you want about being outdated and boxed in, but you are getting a product that reflects the ideals of the packagers better than almost anything out there.

With that being said, I completely disagree with Steve Jobs's vision of the world. I prefer freedom and independence, even if it means I have to be more diligent and put some effort into making things my own. As a software developer, I find the latest iPhone SDK odious. I can understand that Apple has an interest in controlling the "user" experience for its customers. However, frankly, it's none of their damn business how I use their devices.

To me, this started with the way they responded to "jailbreaking." For those who don't know, that was the term for the way an iPhone can be rooted so that the user is able to configure the phone as they like and install whatever software they like. As a customer, I think that if you ask me to pay full price for this phone, I have every right to install whatever I want on it. While on the Verizon network, I was running a Motorola Q with a nonstandard ROM. Though Verizon had a problem with that, they didn't block me from the network or prevent me from getting updates. I had a nonstandard ROM because I wanted to use extra features in my phone that the Verizon ROM didn't allow. My ROM did not have any impact on the Verizon network (though some customizations may have an impact).

Now, I can understand if a company feels that it should not suffer a negative impact because of some change to a standard release. However, it's usually impossible or difficult to tell ahead of time that a particular customization will have a negative impact. Until someone does it in the wild, there is no way for Apple to know what impact a customization will have. What I find disturbing about the SDK is that it presumes developers will do things that have a negative impact. As a developer I find that insulting, but understandable. It is, however, illogical. Without any testing there is no way to know these things, and since Apple isn't testing nonstandard customizations, there is no way for them to know ahead of time. Therefore they cannot jump to the conclusion that customizations will be bad.

This latest SDK is really about Apple wanting to further control customization of the iPhone. It is all a part of how Steve Jobs sees the world. I'm not going to comment on how valid or invalid his worldview is. Frankly, people can believe what they want about the world. However, that cannot justify him trying to change the world to fit his particular worldview. I want the freedom to use what I paid for however I want, as long as it doesn't negatively impact others (such as Apple). For him to presuppose that something I do will have a negative impact is quite insulting and infringes on my rights as a customer. It will only be a matter of time before Apple's philosophy of closing off the user experience becomes a legal issue. It has already become a market issue. Everyone knows that wherever you find lots of programmers, you are now going to find lots of Android phones. That is no accident. Developers all like to flirt with the vision of perfection that is Apple. However, when we roll up our sleeves, we want systems that allow us the freedom to customize. For cellphones, Android is that platform. As I have said before, over time this will result in a shrinking market share for the iPhone. As I have said, this has happened before.

Since I know the market will correct the influence that Apple has on mobile application development, I am not really concerned about what Apple is doing. Apple is doing the worst damage to its own interests. The current spat with Adobe strikes me as further desperation. If Apple really believes that they are going to have the market share to direct the future of web application development, then they are under some grand illusion. Not even Google has that much influence, and Google is probably the closest to having it.

What I get from all this is more of the old Apple conflict between Jobs and Wozniak (link). This is actually an old conflict between closed and open systems. Jobs is as influential on consumer electronics as Wozniak is on software engineering. In the end, I believe the bazaar will always be larger than the cathedral. However, I believe there will always be a cathedral. The thing is that there will be fewer and fewer people flocking to it.

XRX real world example

I know I have been mentioning my excitement about XRX and what it means for web development today. However, I have not gotten into any specifics as to why I think it is so powerful. Recently I solved a problem using XRX, and I think it's a good demonstration of that power.

I was building a little helper application, and for a part of its processing I needed to store some web-based data. Since it is pulling and storing data from the web, XRX technologies are a natural fit. I was working with Python, since I was doing the initial design for a quick solution. I could have used a regular SQL database to store things. However, a few things bothered me about that:

  • Schema definition for something that I was actually changing a lot. I wasn't big on firing up some SQL management
    tool when what I really wanted was a web CRUD form that was flexible enough to change when my needs changed.
  • I needed to store web-based data, so it was easier to store it as XML, in a way that had the columns of a typical
    web app (creation date, URI, etc.).
  • I didn't know where all the components were going to be, and it made sense to have something with a REST interface, since I could easily access it from all sorts of connection points.
  • I had Orbeon running on my box, so it made sense to use that to manage my schema and my data.

For all these reasons, I chose to create an XForms model and form in Orbeon, and then use the eXist REST interface
to update the model via HTTP PUTs from Python. Now, I know that sounds like a mouthful, but it's actually much simpler than it sounds.

  • I created a form using the Orbeon Form Builder app. This is a web-based app for creating XForms web apps and models. You can literally create an app in 5 minutes using it, and that app will automatically have a data store with a REST interface. The power of this is that I can change my model and web app in the same place. As I change it in the Form
    Builder, it's immediately accessible via REST.
  • Once I defined my CRUD app, I was able to insert records based on the model from my Python code using HTTP PUTs. It took fewer than 10 lines of code to store data for my app with a flexible data model. I have worked on apps
    where it would take us 3 months to write code to allow such schema flexibility and such loose coupling. Using a REST interface is by-the-book loose coupling. If you have a unified XSD, you can modify just one file to control the data schema
    in all places.
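The Python side of the steps above can be sketched in a few lines: build a small XML record with the usual web-app columns, then PUT it to the REST endpoint.  The element names and the collection URL below are placeholders, not my actual model; whatever your Form Builder app defines is what you would use.

```python
# Rough sketch: wrap crawled data in an XML record and PUT it to an
# eXist-style REST collection (as exposed through Orbeon).
import datetime
import urllib.request
import xml.etree.ElementTree as ET

def build_record(page_uri, body_text):
    """Wrap the data in a record with typical web-app columns."""
    record = ET.Element("record")
    ET.SubElement(record, "created").text = datetime.datetime.utcnow().isoformat()
    ET.SubElement(record, "uri").text = page_uri
    ET.SubElement(record, "body").text = body_text
    return ET.tostring(record)

def put_record(collection_url, doc_id, xml_bytes):
    """HTTP PUT the record into the REST collection as <doc_id>.xml."""
    req = urllib.request.Request(
        "%s/%s.xml" % (collection_url, doc_id),
        data=xml_bytes,
        headers={"Content-Type": "application/xml"},
        method="PUT",
    )
    return urllib.request.urlopen(req)

xml_bytes = build_record("http://example.com/page", "<p>hello</p>")
# With a local Orbeon/eXist running, something like:
# put_record("http://localhost:8080/exist/rest/db/myapp", "record-1", xml_bytes)
```

The point is that the schema lives entirely in the XForms model; when the model changes, this code does not, because it is just shipping XML over HTTP.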

Some of this may sound like overkill, but, keep in mind:

  • I'm not sure what my schema will look like yet.
  • Storing to a regular SQL data store would take 5 minutes, but I would have to rewrite the code whenever the schema changed.
  • I didn't need the CRUD web form, but it's an easy way to validate all my code: fire it up and see what my
    crawler (what I'm working on next) is inserting into the database.

A couple of kudos are in order:

  • Orbeon is brilliant.
  • Python is the quickest prototyping code that can be put into production.
  • REST interfaces allow infinite customizability.
  • Sun VirtualBox is nice stuff.

Motoblur Android 2.0 Update: please Googlefy it, Jha

Unfortunately, Motorola has decided to leave a gaping hole in their Android device offensive.  Motoblur is a great technology.  I have a Cliq/Dext, and I love how when I press menu, something happens in less than a minute.  However, I don't like the fact that even G1s have a later version of Android than I do.  I have also heard that Android 2.0 solves a lot of the problems I had with Android 1.x.  However, I have no way of knowing that, because Motorola has chosen to make the Droid the only "Google experience" Android device.  What this leaves for users like me is frankly unclear.  As a developer, I know it will take at least 3 months of integration work on a new Android platform before they can do an update for my phone.  However, it has been more than 3 months since Donut, and no OTA from Motorola.  This could mean one of several things:

   1. Motorola has slow developers or they had a problem updating Motoblur to 1.6.
   2. Motorola has no intention of doing incremental updates and is working on an Eclair 2.0/2.1 update around February timeframe.
   3. Motorola is still using the old phone development model and is going to wait a year before they come out with a version .1 of the Cliq and do an update for all devices then.

Of all these possibilities, I hope it isn't 3.  That would indicate that Motorola's Jha may be missing the market.  The pace at which things change is quickening for everyone.  I can see how it is hard to appreciate that users would be impatient with a device that isn't updated every 6 months, when people used to keep phones for 3-5 years with no update.  However, this is a critical failing.  Bluntly, customers want the latest and greatest, and they want it for free.  (This is actually why the online pornography business is going to implode.)  Any sector in software that had its hopes set on a stagnant market with the same customer demand for years is going to die out.  Google is demonstrating this to the world every day.  If you don't accept the Google model of continuous, free updates, then you are dead in the water, because someone else will, and that someone else will have your customers the next time you are late with an update.  Every day there are thousands of competitors out there looking at your products and business model and trying to see how to "Googlefy" them.  That is:

   1. Create a better user experience that is as easy to use as Google search.
   2. Make use of more data, so the product is "smarter" about how it does things than your software is.
   3. Come out with releases as frequently as possible.
   4. Provide it at no cost, or at most on a low-priced subscription model (and even subscription models are disappearing).

What this means for companies like Motorola is actually pretty devastating in the long run.  The only way to beat Google is to do like Google and hope that you can still bring in revenue in a fashion as similar as possible to your current revenue schemes.  Not everyone can offer their product for free, but if you provide 3 out of the 4 things I called "Googlefying," then you stand some chance of a long-term strategy.  To be fair, Google hasn't quite figured out a revenue scheme for all the products it creates, so you may actually beat them to the game there.

Hopefully, Motorola is working on a 2.0/2.1 release, which will probably include a minimal version of Motoblur, since Eclair includes so many of Motoblur's features.  If not, I will likely keep my phone for at least another year, but to be honest, I won't be able to afford to keep an outdated device much longer than that.

Computing is getting smaller

Computing is getting smaller.  Ten years ago, when we would plan an enterprise system, we would plan in terms of how many physical boxes we would need to serve x users.  For example, we would say that to serve 1,000 simultaneous users we would need one application server box, one database server box, and two load-balanced presentation servers.  So if our client needed 5,000 simultaneous users, then we knew we were talking five times as many boxes (e.g., 20 boxes).  This was a significant aspect of solution design, and it was very much a limiting factor.  It meant that time and resources for integration had to be accounted for in estimates (installing the OS, application libraries, and applications on one box is time consuming; now imagine 20 boxes).  You had to factor the cost of the hardware into your solution.  That's why all shops back then usually had large IT infrastructure resources on hand.  To create an enterprise solution, you had to maintain a shop that you would only use for a small portion of your development, all year round (there was little outsourcing of IT back then).  I am not even talking about something at the scale of a Google or an Amazon, either.  Medium-size applications (5,000 to 100,000 simultaneous users) would need these resources.  Otherwise you would not be able to create a solution quickly.

The result of all this was that the typical development effort had a full complement of developers as well as a sizeable on-site hardware and hardware-staff complement.  It also meant that any discussion of a new solution required hardware folks in the room.  Capacity planning was very much the domain of the hardware folks.  Typically you would spec out a list of boxes needed and an expected load per box; the hardware folks would take that and return with a hardware requirements list.  Computing back then was a big effort, because even before you started, there would be several servers sitting around with staff supporting them.  During development, more hardware and staff would be added, and once you were done, there would typically be more hardware and staff still.

Today, we still need infrastructure to support our solutions.  However, the effort is much smaller.  I can seriously say that I can plan and implement a solution on the scale of a medium-size solution from ten years ago without consulting any hardware folks and without adding a single physical box to my infrastructure.  In fact, depending on my market goals, I may be able to do this without spending a single dime initially.

What has caused this sea change?  A few things.  The first is undoubtedly virtualization.  I can remember the first time a sysadmin and I worked on deploying virtualization software in an infrastructure where I was a team lead.  It was at my request, because I was frankly getting pissed about how long it would take for me to get hardware just to test out new software solutions.  Eventually, at that site, we got a huge Dell server with 16 gigs of goodness that ran at least 4 virtual servers.  That sysadmin may not have realized it, but the moment we got that box, my reliance on him to deploy new solutions immediately became zero.  With a few mouse clicks I could configure and deploy a new box and automatically size it to whatever my application needed.  I would tell any sysadmin worried about job security: don't give a person like me a virtual server to play with, as I will likely never need your help with anything ever again.

The second thing is cloud computing.  Actually, you can think of cloud computing as an obvious extension of virtualization.  However, there is a way in which cloud computing has achieved a sort of critical mass.  With virtualization, you basically end up with a bland server that still has to be configured for your application's needs.  You still have to make that server provide all the services your application requires.  If you want a cron-type process running in the background on a Windows box, you will have to create a scheduled task or a Windows service.  If you want access to a SQL-like data access layer, you will have to install some DB server and then set up authentication and whatever database objects you need.  The point is that a virtualized server still needs to be configured for your application, and that requires additional software and configuration before you get to the point of focusing on your domain problem.  Even with brilliant development tools like Ruby, you still have to deploy Apache.  The cloud changes that.  All the cloud implementations start out by providing you all the services and abstraction layers that you could ever imagine for creating an application.  Out of the box, you begin by working only on your domain problem.  No need for an authentication layer, a job monitoring layer, or a message queue layer: all of these facilities are included.  In fact, the only piece of software you will spend much time with is a development tool for whatever language the cloud supports.  For example, on Google's cloud you will be using Eclipse, running either Python or Java.  Clouds also abstract away hardware.  In fact, hardware becomes an abstract concept that is only referred to in terms of CPU cycles or concurrent users.  As a developer you focus on this only when it is absolutely necessary: when your application's users need more performance for an already deployed application.
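To make the cron example concrete: on Google App Engine, a background job is nothing more than an entry in a cron.yaml configuration file plus an ordinary HTTP handler at the declared URL.  No scheduled task, no Windows service, no daemon to install.  The URL and schedule below are placeholders, not from any real app:

```yaml
# cron.yaml - sits next to app.yaml; App Engine's scheduler fetches the
# URL on the given schedule, so the "cron job" is just a request handler.
cron:
- description: hypothetical nightly cleanup job
  url: /tasks/cleanup
  schedule: every 24 hours
```

That one fragment replaces everything you would otherwise configure on a bland virtualized server to get a recurring background process.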

The third thing that has changed can best be described as the growth of XRX.  The web is a platform, and that is also a good way to look at the components that make up a domain-model-based solution.  In fact, it is a natural way to look at a problem in a domain-based way.  I won't get into a discussion of why I think a domain-model-specific view is a natural way to look at problems.  However, I will say that it is exactly the basic premise of all the cloud infrastructures I have experience with.  The focus is on getting developers to think about their problem domain and then construct data models and processes on those data models.  Ten years ago, if I was planning an application, so much time was spent on specifying hardware, bandwidth, application libraries, etc., that by the time we got to actually writing anything domain-specific, it was usually toward the end of the effort.  By then, any notion of a problem domain was very much influenced by all those hardware and application library constraints.  Today, the problem domain can pretty much be your only consideration.  Hardware specifications and application library constraints are not a limiting factor anymore.  For a developer it is very freeing, as we spend our time doing what we should be doing: problem solving.

Why do I say computing is smaller?  Well, it's because the effort to create something like a Facebook can literally happen at my dinner table, with no more than myself and some other developers involved.  We don't need a dedicated server team.  We don't need to spend our capacity planning effort on speccing servers.  Yes, there are still giant servers in some data center somewhere.  However, I don't have to think about that until my app has been up and running in the cloud long enough to need additional capacity, which I will then hopefully have enough revenue to pay for.