Sunday, December 23, 2007

60 Years Since it all Started

As you can tell from the title of this post, today, December 23, 2007, marks the 60th anniversary of the most important invention of the 20th century (it was actually invented on December 16th, 1947, but not shown to anyone until December 23, 1947, because they wanted time to run tests on it). While there is sure to be much debate over that "most important" claim, this device is critical to nearly every product, of any kind, manufactured today. From its humble beginnings at Bell Labs in 1947, the transistor has led the march of electronics for the last 60 years. To really understand the transistor's impact, we need to go back a few decades and take a trip to science town. (I've tried to make this explanation as plain and simple as possible, but this stuff is pretty complicated when you get right down to it.)

The year is 1915. Researchers, scientists, and inventors worldwide are taking their first steps in the world of electronics since the initial discoveries surrounding DC (Direct Current, Thomas Edison) and AC (Alternating Current, Nikola Tesla) in the late 1800's. Around 1880, while working on the incandescent light bulb, Edison discovered a phenomenon called thermionic emission, now commonly known as the 'Edison Effect'. Thermionic emission is the flow of electrons, or charge carriers, off of a heated surface (like a piece of hot metal).

In the case of the light bulb, an electric current passes through the filament (the little wire inside the bulb), which resists the flow of electrons moving across it and heats up. The hot filament then emits (or releases) electromagnetic radiation in the form of light (radiation that falls in the visible spectrum), and it also boils off electrons, which is the thermionic emission part.

Based on some of Edison's research on thermionic emission, Irving Langmuir built the first modern vacuum tube at the GE (General Electric) research lab in 1915. A vacuum tube also uses thermionic emission, but in a completely different manner. A vacuum tube is composed of three main parts: a cathode (a negatively charged metal rod that gets heated), an anode (a positively charged metal plate), and a control grid (a wire wrapped around the cathode) that sits between them. When the cathode is heated, it releases electrons (thermionic emission again), and the grid controls how many of them flow across to the anode. All this takes place in a glass tube containing a vacuum (I know, how in the world did they come up with the name vacuum tube for this device?).

Now that we know what a vacuum tube is, what does it do? Basically it is used to amplify (as in a guitar amplifier), switch (as in computers from the 50's and 60's), or otherwise modify electric currents. In terms of computers, a switch is a device that can flip between two states (a 1 or a 0, known as binary code). The highly charged (high voltage/current) state of the switch is a 1, while the low state is a 0. Vacuum tubes were originally designed to replace mechanical switches (yes, that's right, computers actually used to be built from completely mechanical parts, like the Difference Engine designed by Charles Babbage in 1822), which were incredibly slow (at least by today's standards anyway) and prone to breaking down.
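To see how a row of two-state switches turns into a number, here is a tiny sketch (my own toy illustration in Python, obviously nothing you could have run on tube hardware) that reads a list of on/off switches as binary:

# A row of switches, each either off (0) or on (1), read as a binary number.
# Think of each entry as one tube or relay: high voltage = 1, low = 0.
switches = [1, 0, 1, 1]

value = 0
for state in switches:
    value = value * 2 + state  # shift everything left one place, then add the new bit

print(value)  # the pattern 1011 in binary is 11 in decimal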

One of the first computers to use vacuum tubes was the ENIAC (short for Electronic Numerical Integrator And Computer), whose construction began in 1943. The ENIAC was designed at the University of Pennsylvania by John Mauchly and J. Presper Eckert to calculate artillery firing tables (charts used to work out the angle and powder charge of artillery) for the US Army. Calculating a single trajectory by hand was an incredibly complicated task that could take upwards of 40 hours (because of the hundreds of variables involved), while the ENIAC could complete the same calculation in about 30 minutes. Without these charts it was very difficult to hit enemy targets with anything nearing precision, so needless to say, the US government was throwing a lot of weight behind the project, which was going on in the middle of World War II.

The ENIAC was able to do these fantastic (by 40's standards) calculations by using over 18,000 vacuum tubes to perform 385 calculations per second. By controlling which tubes were 1's and which were 0's, mathematical calculations could be performed. Because vacuum tube manufacturing was still so new, tube burnout was a significant issue that had to be factored into operating procedures. In fact, the tubes burnt out so often (several a day; remember that they are similar to light bulbs) that research was done to determine the main cause. They found that most of the failures happened during heat up (start up) or cool down (shut down), so they decided to just leave the machine on continuously to keep burnouts to a minimum (it never had to be rebooted because of operating system crashes either; anybody miss those days?). The ENIAC ran nonstop from 1946 until it was retired in 1955.

Vacuum tubes were used in pretty much every computer manufactured from the early 1940's until transistorized machines took over in the late 1950's and 1960's. Nowadays over 10 million "true" vacuum tubes are still in use (mostly in radio stations, telecommunications, and amplifiers), and millions more in the form of "tube TVs" (ever wonder where the nickname "tube" came from? Well, now you know), CRT (cathode ray tube) computer monitors, and microwaves (which use a tube called a magnetron). So while it's on its way out, the vacuum tube still plays an important part in electronics. Coincidentally, the device that replaced the vacuum tube was invented less than two years after the ENIAC went into operation.

The basic idea behind the transistor was to perform the same electronic switching as the vacuum tube, but in a far more energy efficient manner. Not only that, but transistors were considerably smaller as well. In fact, the transistor was considered so revolutionary back then that in 1956 it netted its inventors, William Bradford Shockley, John Bardeen, and Walter Houser Brattain, the Nobel Prize in Physics.

In 1955, a tiny Japanese electronics company called Tokyo Tsushin Kogyo (fortunately they later changed their name to Sony, or Ridiculously Proprietary Incorporated as I like to call them) built some of the first commercial transistor radios. Since they didn't need big, power hungry, unreliable vacuum tubes anymore, these radios could be made very small and run off of batteries. Originally sold for around $50 ($364 in 2006 dollars), they quickly dropped to $10 ($73) in the early 60's as more manufacturers piled into the market. This device did two very important things; first, it brought the idea of affordable electronics to the average working man, and second, it made transistors very cheap due to the sheer number that were being manufactured. (The more you make, the cheaper they get, since most of the money it takes to build them goes into the factory, not the actual components themselves.)

However useful transistors are on their own, they matter most because of one device that uses them: the integrated circuit (IC). The integrated circuit, invented in 1958, uses transistors in the same fashion as the ENIAC used vacuum tubes. Basically, an IC chip is a collection of many smaller components (think of a computer room full of vacuum tubes shrunk onto a single sliver of silicon) that are controlled by binary machine code (the 1's and 0's that languages like assembly get translated into). IC chips are used in nearly every single electronic device manufactured today and range from simple chips used in a calculator to extremely complex microprocessors (or CPUs, Central Processing Units).

The first commercial microprocessor, the Intel 4004 (a 740 kHz processor), released in 1971, used 2,300 transistors on a single chip to perform about 92,000 instructions (basic commands like 'add these two numbers together') per second. So we went from a computer (the ENIAC) that took up 680 square feet, used 107,000 individual parts (including 18,000 vacuum tubes), and could only perform 385 calculations per second in 1946, to a single chip about an inch long that, just 25 years later, could perform roughly 239 times as many calculations per second. It's easy to see why transistors are so important and why they were instrumental in pushing computing forward.
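If you want to check that 239 figure, it is just the two throughput numbers from this post divided against each other; here is the back-of-the-envelope math in Python:

# Quick sanity check using the figures quoted above.
eniac_calcs_per_sec = 385        # ENIAC, 1946
intel_4004_ops_per_sec = 92_000  # Intel 4004, 1971

print(intel_4004_ops_per_sec / eniac_calcs_per_sec)  # about 239 times the throughput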

In 1975, the first commercially successful "home" computer, the Altair 8800, went on sale. It used the Intel 8080 microprocessor (hence its name) and sold for the rock bottom price (at the time, a computer was something only major companies could afford, since they cost anywhere from tens of thousands to millions of dollars) of $439 as a kit (well over $1,500 in 2006 dollars). The 8080 (a 2MHz processor released in 1974, just three years after the 4004) could perform around 500,000 instructions per second, over 5 times the computing power.

On a side note, the Altair also was the first client of a soon-to-be-famous two-man company from Albuquerque, New Mexico, founded that year (1975). Microsoft wrote a version of BASIC for the Altair that allowed people to actually "do" something with their new computers.

The Altair made computers cheap enough that ordinary people could afford to buy one, and within a couple of years there were dozens of companies making similar machines. The rest, as they say, is history. And what happened to our little friend, the transistor? Why, he is now so small that a modern microprocessor the size of your thumbnail, and just as thin, can hold over 1.7 billion transistors. Contrast that with the first transistor 60 years ago, which was roughly the size of a baseball. What a long way we have come.

Of course, the transistor isn't going to last forever, but unlike nearly everything else in electronics, it is getting replaced for a very peculiar reason. Moore's Law, named after Gordon Moore, one of the co-founders of Intel, observed back in 1965 that the number of transistors that can be squeezed onto a chip doubles (and the cost and size of each transistor shrinks by half) roughly every 18 to 24 months. In effect, what this means is that a processor will cost the same, but be twice as fast, about a year and a half to two years later.

While this has held true (for more info, see the blog post "The 64-bit Question") for the last 42 years, within 20 years it will no longer be possible. How can I be so sure? Simple. Within 20 years, in order to keep cutting the size of a transistor in half, it would have to be smaller than a molecule. Now, while I think that humans are smart, they aren't God. So knowing that we're going to hit this limit soon, we have to look in new directions, like molecular electronics and quantum computers. Researchers hope that one or both of these will provide us with the next "big" step in computers. By that point, our dear and faithful friend will be approaching his 80th birthday and can finally start easing into retirement. It just goes to show that sometimes the smallest things turn out to be the most important...
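Here is roughly where that 20-year figure comes from: a quick Python sketch assuming the halving-every-couple-of-years rule of thumb from above, a 45 nanometer starting point (the leading manufacturing process as I write this), and treating anything under about 1 nanometer as "smaller than a molecule". All three numbers are simplifications on my part, so treat this as a back-of-the-envelope estimate, not a prediction.

# Rough sketch: if transistor feature size halves every 2 years, when does it
# drop below roughly molecular scale? Starting size (45 nm) and the ~1 nm
# cutoff are assumptions for illustration only.
feature_size_nm = 45.0
year = 2007

while feature_size_nm >= 1.0:
    feature_size_nm /= 2
    year += 2

print(year, feature_size_nm)  # prints 2019 and about 0.7 -- well inside that 20-year window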

Think I'm right? Think I'm wrong? Click here to join the discussion about this post.

Thursday, December 20, 2007

CompUSA Who?

I'm going to start this post a little differently, with a disclosure. Since I own no stock in any of the companies I talk about, nor do I work for any of them, I feel I can make any observations I want. While that is true about every other article I've written, I have worked at this company. I was hired by CompUSA in October of 2003. I worked first on the sales floor, and later as an Assistant Retail Sales Manager. I left the company in May of 2007, when the branch near me closed (more on that later). This both gives credence to my observations and has the risk of distorting my views since I actually worked there. You can be the judge.

I should start with a history lesson, since many of you probably haven't ever heard of CompUSA, even though they are one of the largest electronics retailers in the US, and have been for many years. I used to describe the marketing quality of CompUSA like this:

If you were to fly to Afghanistan and arrive in a random city, you would rent a Jeep and drive straight out into the desert. After you were at least 50 miles away from any sign of civilization, you would start looking for the first band of nomads you could find and ask them a simple question: "Have you ever heard of Wendy's?" The answer will probably be, "Of course I've heard of Wendy's! I love their Big Bacon Classic." Do you know why this statement, while ridiculous, is probably true? Because Wendy's actually knows how to market to consumers. But back to CompUSA. I was actually told on several occasions by customers, "You know, I actually live just up the street from your store and I've never heard of it."

Did you know that CompUSA had higher yearly sales (in 2005, before they started closing stores) than Pep Boys, Abercrombie & Fitch, Men's Wearhouse, Dollar Tree, PetSmart, Advance Auto Parts, and WENDY'S? Not only is CompUSA bigger than Wendy's (despite the fact that CompUSA hasn't grown at all in the last 7 years, while Best Buy has tripled and Circuit City has doubled), but CompUSA has almost twice their annual sales (4.7 billion versus 2.4 billion).

Listed below are the 7 largest "electronics" retailers in the US. How many have you heard of? My guess is 6.

(I'm excluding Walmart, Kmart, Costco, and other major retailers who happen to sell electronics.)

(data from the Fortune 500 list; annual revenue in millions of dollars)
Best Buy - 30,848
Staples - 18,161
Office Depot - 15,011
Circuit City - 11,598
GameStop - 5,319
RadioShack - 4,778
CompUSA - 4,700 (estimate from 2006; since it is a privately held company, more accurate numbers are not available)

I'm not sure what the executives at CompUSA have been thinking, but at one time they were bigger than both Best Buy and Circuit City. So how did they fall so far behind? I think it has a lot to do with the fact that they are absolute morons when it comes to marketing.

Last week I was in Cleveland and I was looking through the ads in the newspaper. Best Buy had a great 50 page ad (it is almost Christmas time, after all), Circuit City had a 30 page ad, and CompUSA didn't have one at all. I actually went to the CompUSA in Fairlawn, OH (not more than 15 miles from where I was), and they did have an ad sitting in the store when I first walked in, though at about 8 pages it was really a pittance of an ad. So this is their wonderful strategy: don't advertise externally to drive traffic into the store, but wait till people come in on their own and then mark the product down (in the in-store ad) so that you lose more money on it. Absolutely brilliant, CompUSA. You get an A++ for your marketing strategy.

Let me ask another question. Have you ever seen a Best Buy commercial (or more like a 1,000)? Does "Turn on the Fun" sound familiar? Perhaps a Circuit City commercial? How about a CompUSA commercial? Yeah, I didn't think so, for that last one. Despite being one of the largest retailers in the nation, CompUSA thought it would be prudent to never advertise on TV at all, because after all, television is just a passing fad that no one uses (only 112 million households). I worked at CompUSA for nearly four years and I can tell you with absolute certainty that CompUSA only ran TV ads for one month of that entire period. And when they finally did try running ads (on major stations like TBS, nationwide) for that single month, they didn't see their sales jump 150% instantaneously and decided to stop running them.

Many advertising experts believe a consumer has to see a company's ad at least 5 to 7 times before they start to recognize the brand. I've seen some studies that say it takes as many as 25 exposures to the brand name before you'll recognize it and consider doing business with the company. CompUSA's main problem is that they either do nothing and hope things get better, or try for a little while and give up long before they would actually start seeing the fruits of their labor.

As if the lack of advertising wasn't bad enough, their logistics were even worse. Say whatever you want about how Walmart got as big as they are ($352 billion in sales in 2006), but their logistics team is incredible. If you go into a store and buy something, the second the transaction is completed a factory in China starts making another one, and within a couple of days it's at the store. I realize that this is a gross oversimplification, but they have done something that no one ever thought was possible. They are not only by far the largest company in the world, but they are still growing at over 10% a year. Think of it this way: in order for Walmart to gain another 10% in sales next year, they'll have to grow their sales by roughly 106 million dollars per day. They are going to add (in new business alone) a whole CompUSA every 44 days.

Let's contrast that with CompUSA's logistics strategy. To put it simply, they didn't have one at all. Basically what they did was allocate a percentage of new product to each store, based not on what that store actually sold, but on its gross yearly sales. So if store A sells a lot of low end computers (<$1000) and does 1 million in yearly sales, and store B sells a lot of high end computers (>$1000) and does 2 million in yearly sales, this is how things would get allocated: if CompUSA bought 300 $2,000 computers, they would send 100 to store A and 200 to store B. Now, this doesn't seem like that bad of a strategy, but let me add one little tidbit of information. What if I told you that store A hasn't sold a single computer over $1,500 in the last six months, while store B has sold every single one it got and often has none in stock to sell? It's starting to sound a little stupid, isn't it?
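Here is a quick sketch of those two allocation rules side by side, using the hypothetical store A and store B from the paragraph above (the 150 high-end units for store B is just a made-up number for illustration):

# Toy comparison of the two allocation rules described above.
stores = {
    "A": {"yearly_sales": 1_000_000, "high_end_units_sold_last_6mo": 0},
    "B": {"yearly_sales": 2_000_000, "high_end_units_sold_last_6mo": 150},
}
shipment = 300  # $2,000 computers to hand out

# CompUSA-style: split by gross yearly sales, ignoring what each store actually sells.
total_sales = sum(s["yearly_sales"] for s in stores.values())
by_revenue = {name: round(shipment * s["yearly_sales"] / total_sales)
              for name, s in stores.items()}

# Saner rule: split by how many of this kind of product each store recently sold.
total_units = sum(s["high_end_units_sold_last_6mo"] for s in stores.values()) or 1
by_sell_through = {name: round(shipment * s["high_end_units_sold_last_6mo"] / total_units)
                   for name, s in stores.items()}

print(by_revenue)       # {'A': 100, 'B': 200} -- store A gets 100 machines it can't sell
print(by_sell_through)  # {'A': 0, 'B': 300}   -- the product goes where it actually moves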

Well, this is exactly what I witnessed while working there. In Columbus, there used to be two CompUSAs: one that sold a lot of lower end computers (my store) and another that tended to sell a lot more high end electronics (plasma TVs, $2,000 computers, etc.). So every time my store got a $2,000 Sony laptop, we would try to sell it, but it would end up sitting there for 6 months until the corporate office decided to clearance it, at which point we would sell it for several hundred dollars under cost (cost being what CompUSA paid for it wholesale).

I watched this happen over and over and over again. We would never get the things we actually sold in stock, but we'd get hundreds of units (one time literally 700 6ft USB cables in one day - we usually sold about 100 a month) of product that we either couldn't sell or that would take us a year to sell. Pray tell, what would happen if we didn't sell any of that particular product? Well, the next month we'd get more of it. Not being at the corporate office in Texas, I can't be 100% sure, but from talking to the managers in other stores and looking at the sales and inventory numbers for the stores, it appears that corporate never looked at any of it.

I can be pretty sure about the above statement because in the spring of 2007 CompUSA closed approximately 1/3 of its least profitable stores, including both my store and its sister store in Columbus, Ohio. Not that I was surprised. After all, I had been saying for three years that things weren't being done right. Some of the blame has to rest on store management (including myself), but by far the vast majority needs to rest on the corporate office. At the store level you can only really do so much with what you have. I mean, their corporate buyers must have really sucked at math, because our pricing was so bad that sometimes even I didn't shop there. Last year, I bought a new motherboard on newegg.com for $175 even though I could buy things at cost at the store. Why? Because CompUSA's cost on it was $250. That's right, CompUSA's cost was $75 higher than Newegg's retail price. Sadly, this happened far too often.

Not that everything was bad. According to Consumer Reports, CompUSA was always ranked very highly in customer service and product knowledge. You could be sure that if you talked to someone at CompUSA, they'd know everything about your computer. When I used to "mystery shop" our competitors, it would appall me how stupid the guys at Best Buy were. You could ask them the simplest questions and they would scratch their heads. If you asked any guy at CompUSA what the L2 cache of a processor was, or any number of other random complicated questions, you could bet they knew the answer immediately. (I sure do.) After comparing the two stores, I couldn't believe that people would buy computers (CDs and DVDs, sure, but computers?) at Best Buy instead of CompUSA, but that's what happens when people don't know you are there.

Which brings me back to the beginning. When I started working at CompUSA in 2003, it was because I went there to buy an MP3 player and was so impressed by the knowledge of the salesman and the selection they had that I filled out an application on the spot. Not two months later, I started making comments about how poorly the store was run, and I could often be heard saying, "CompUSA is either going to get bought out by another retailer or go bankrupt within 5 years." I often make blanket statements like that based on my observations, and in this case I was nearly dead on. It was announced earlier this week that in January of 2008 (4 years and 3 months after I started working there) CompUSA will be closing all of its remaining stores and going out of business forever.

As is often the case, the most discerning eye is at the bottom of the company and not at the top. I used to say that if I took a crap on the CEO's desk and the board of directors voted it in as the new CEO, the company would be better off. While this is a humorous notion, I nonetheless believe it wholeheartedly. The company's executive leadership was so moronic that the company would have been better served by a pile of crap that just sat there, did nothing, and stayed out of the way.

I can't tell you how many times CompUSA would start a great program, like the 'CompUSA Network' (a rewards program), only to cancel it shortly afterward (a little over a year later). When that program launched, it was literally the most important thing in the entire world that each store sign up every customer who walked into the building. It got to the point that some stores were actually giving it away (it cost $15-20) on slow days, because if they didn't sell enough they got completely reamed by upper management. Now, if you were the General Manager of a store that made a net profit (the profit left over after all expenses have been paid) of over a million dollars a month, that meant absolutely nothing if your store didn't sign 50 people up for the CompUSA Network. If management bonuses are tied to stupid things like 'CompUSA Network' sales rather than actual net profit dollars, is it any surprise that they are going out of business?

Since I got to see the store's P&L (profit and loss) statements, I can tell you that in some months the managers got bonuses even though the store itself actually lost money after expenses, as long as they hit their numbers on other metrics. I can't say how other retailers do it, but if you don't give people an incentive to make margin, your company isn't going to make any either.

As I reflect on the events that will transpire over the next few weeks, I am saddened by the thought that soon the best computer retailer in the US (CompUSA usually had over 100 different laptop and desktop models in stock, while Best Buy and Circuit City carry 20-30) will be gone forever. Not because I worked there, but because I'm losing my favorite place to shop. Here's to you, CompUSA: a retailer who was great at the store level (in customer service), but completely lousy at the top. It just goes to show that good advertising, expert logistics, and plain old common sense are still the best ways to run a business.

Think I'm right? Think I'm wrong? Click here to join the discussion about this post.

Monday, December 17, 2007

The 64-bit Question

Windows Vista is available in both 32-bit and 64-bit flavors, so what is the difference? While there are much more technical definitions of the differences between a 32-bit and a 64-bit OS (Operating System), the short and sweet of it is that a 32-bit OS/CPU (Central Processing Unit) can move or manipulate a 32 bit number (a bit is a 'Binary Digit', a single 0 or 1), while a 64-bit OS/CPU can move or manipulate a 64 bit number. The reason this is important is that the processor can, in principle, handle twice as much data in the same amount of time. While this is a gross oversimplification, and it isn't really twice as fast in a real application, this explanation will suffice for the purposes of this discussion.

So why on earth would you want the 64-bit version? Among other things, a 32-bit OS (Operating System) can only address 4GB (a gigabyte is roughly 1 billion bytes) of memory, because it only has about 4 billion distinct memory addresses, which the CPU (Central Processing Unit) uses to store and retrieve data from specific locations. Imagine that your system memory is a giant piece of graph paper with over 4 billion squares on it. In simple terms, each individual square can store a small piece of data, and the address determines the 'row' and 'column' (or exact location) of that particular square.
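If it helps to see the graph-paper analogy in code, here is a toy version in Python; the split into a 'row' and 'column' is purely my illustration, since real memory is really just one long line of numbered locations:

# Toy version of the graph-paper analogy: a flat 32-bit address picks out
# exactly one "square". The row/column split is only for illustration.
ADDRESS_BITS = 32
COLUMNS = 65_536          # pretend each "row" of the graph paper has 64K squares

total_squares = 2 ** ADDRESS_BITS    # 4,294,967,296 addressable locations
address = 3_000_000_000              # some spot in the upper half of the 4GB range

row, column = divmod(address, COLUMNS)
print(total_squares, row, column)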

So next year, once 4GB of RAM becomes common in most computers sold, lots of people will need to move to a 64-bit OS in order to actually be able to use all that memory. Here's a note to all you l337 (elite) computer users out there who think you have the greatest computer system of all time: if you have 4GB of system RAM and a 512MB video card with a 32-bit OS, your system is going to ignore at least 512MB of your system RAM (Random Access Memory), because there won't be enough addresses left over for it. Obviously this is going to be an issue for a lot of computer users next year, since many of this year's computers already come with 2GB of RAM, so many new computers will need to be loaded with the 64-bit version of Windows, officially ending the 32-bit OS's 13 year reign (1994 to 2007).

This leads me to the next logical question: if 4GB is the limit of a 32-bit OS, what is the limit of a 64-bit OS? The answer isn't simply double, although that would be the obvious guess. The reason 4GB is the limit of a 32-bit OS is that 2^32 equals 4GB (2^32 = 4,294,967,296 bytes). Since that 4GB comes from a power of 2, raising the exponent from 32 to 64 doesn't double the total, it squares it (2^64 = (2^32)^2), so the new limit is exponentially higher.

In fact, the answer might astound you: 2^64 bytes is 16 exabytes, or roughly 16 billion gigabytes of RAM. That's not a typo. A 64-bit OS can handle 16 BILLION GIGABYTES of RAM. In fact, the word exabyte is still so uncommon that the spellchecker for this blog insists exabyte is not a real word and keeps trying to convince me to change it.

To define exactly what an exabyte is in relation to a gigabyte, as well as what a gigabyte even is, I draw your attention to the list below:

1 byte = 8 bits
1 kilobyte (kB) = 1024 bytes = 2^10 bytes
1 megabyte (MB) = 1024 kilobytes = 2^20 bytes
1 gigabyte (GB) = 1024 megabytes = 2^30 bytes
1 terabyte (TB) = 1024 gigabytes = 2^40 bytes
1 petabyte (PB) = 1024 terabytes = 2^50 bytes
1 exabyte (EB) = 1024 petabytes = 2^60 bytes

So, give or take some rounding, 16 exabytes works out to roughly 16,000,000,000 gigabytes (a little over 17 billion if you do the 1024-based math exactly).
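If you'd like to check those numbers yourself, here is a quick Python sketch using the 1024-based units from the table above:

# Quick check of the math, using the 1024-based units from the table.
KB = 1024
GB = KB ** 3     # 1 gigabyte = 2**30 bytes
EB = KB ** 6     # 1 exabyte  = 2**60 bytes

print(2 ** 32)           # 4,294,967,296 bytes -- the 4GB ceiling of a 32-bit OS
print(2 ** 64 // EB)     # 16 -- a 64-bit OS tops out at 16 exabytes
print(2 ** 64 // GB)     # 17,179,869,184 -- the "roughly 16-17 billion gigabytes" figure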

Why does this matter? When will we actually need this ridiculous amount of RAM? Well, that answer might astonish you as well. To explain it I'll have to bust out some more math (sorry, non-math geeks). Moore's Law (1965), named after Gordon Moore, one of the co-founders of Intel, states that the number of transistors on a chip, and loosely speaking the processing power of a microprocessor (or CPU), will double approximately every 18 months to two years. This rule of thumb gets used to predict how quickly processor speeds, hard drive capacities, and the size of RAM modules will increase.

If we get in the Way Back Machine and go to 1977, we'll be witnessing the birth of one of the first 'production' computers (that is, full scale assembly line production, not hand built), the Apple II. The Apple II, which cost $1,298 in 1977 dollars ($4,440.81 in 2006 dollars), shipped with 4kB of system RAM. If we extrapolate from there by doubling the amount of RAM every 1.6 years (a slight deviation from the 1.5 years stated in Moore's Law), we land right at the 2GB the average computer ships with in 2007. (Not only that, but the computer costs less than 1/4 as much.) And if we keep following the same rate as the last 30 years, then by 2060 an average home computer will come with 16 exabytes (16 billion gigabytes) of RAM in it.
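Here is that extrapolation spelled out in Python. The 4kB starting point and the 1.6-year doubling period come straight from the paragraph above, so the printed years are only as good as that rule of thumb:

# Doubling the Apple II's 4kB of RAM every 1.6 years, per the rule of thumb above.
ram_bytes = 4 * 1024        # 4kB in 1977
year = 1977.0

while ram_bytes < 16 * 2 ** 60:      # stop once we reach 16 exabytes
    ram_bytes *= 2
    year += 1.6
    if ram_bytes == 2 ** 31:         # note the year we cross 2GB
        print("2GB around", round(year))

print("16EB around", round(year))    # prints roughly 2007 and 2060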


Imagine that... every home computer will have 16 exabytes of RAM. So now that we can be fairly certain we'll eventually reach that amount of RAM, I know there are those of you out there asking yourselves, "Why in the world would I ever need that much RAM in my home computer?" Honestly, even being the computer geek I am, I have a hard time believing I'll ever need that much RAM myself. Since I can't predict the future (or can I...), all I can do is look to the past for some historical perspective.

The video clip below is from the first episode of "The Screen Savers" (which I can't believe ever got canceled, because it was great), which ran from 1998 to 2005 and was for many years the highest rated technology show on TV. Hosts Leo Laporte (who I'm going to plug since his podcast is awesome) and Kate Botello show viewers the "ultimate gaming machine", which has a Pentium II 300MHz processor and 128MB of system RAM. Watch for the part 4:30 into the clip where Kate says that "...64MB of RAM is as high as you'll ever need to go."

[Video: The Screen Savers, episode one]
So if some of the smartest people in technology said 9 years ago that 64MB was all you'd ever need, and I'm sitting here today looking at the minimum amount of RAM needed to play Crysis (which is 1GB), I know that someday I'll want that much RAM too. Every time I've thought games and software couldn't get any better, they always have. So while part of me may not believe it, I know I'll want it someday. Needless to say, we do need 64-bit operating systems, and I'm glad we have them (despite how people thought AMD was crazy to bring out a 64-bit processor a couple of years ago).

The really crazy thing is that somewhere around 2050 people will start thinking about a 128-bit OS and there will be a definite need for it.....

Think I'm right? Think I'm wrong? Click here to join the discussion about this post.

Monday, December 10, 2007

Human Nature Strikes Again! Mac VS PC.

Vista sucks! Or at least that's what pretty much every journalist in the world seems to believe. Since Mac OS (Operating System) 10.5 (or "OS X Leopard," as it is more commonly known) was just released, some interesting things have come to light that made me rethink my own stance on Vista.

Vista has been out for nearly a year now, and with SP1 (Service Pack 1) just out of beta testing, there are finally some good numbers to crunch. Recently both the upcoming Windows XP SP3 and Vista SP1 were benchmarked against their respective previous versions. Interestingly enough, XP SP3 delivered about a 10% performance boost, while Vista SP1 barely managed a 2% increase. So not only is Windows XP still a more stable and efficient OS than Vista; with both of the new service packs out, XP pulls even further ahead.

Now, to play the devil's advocate, Vista was a complete rewrite of the kernel (core code at the center of the OS), so there are bound to be issues. The rewrite added more support for multi-core applications (yay for even more awesome video games!), made it easier to write applications for, and offers hundreds of other new features. For these reasons, I'm not surprised that it is laggy out of the gate, but I'm sure the next version will be much better.

That said, I do agree that Vista is not the best product Microsoft could have come up with, but let's not heap all the blame on them. Several recent studies have made some important points about operating systems that apply to both Microsoft's and Apple's offerings. More often than not, it is poorly written third party apps (applications), rather than the OS itself, that cause system failures. In the case of both Windows XP and Windows Vista, boot up times were significantly affected by apps like Norton Antivirus, which added over 60 seconds to the boot time in many cases.

With a couple more of those 'worst offending' apps, your boot up time could be delayed by five minutes or more beyond what the OS itself needs. Blaming Microsoft for a slow start up that is really caused by apps like Norton Antivirus is like blaming Ford because the Firestone tires on your Ford are crappy. It's even less fair in Microsoft's case, though, since they can't control who writes apps for Windows. So I'll be the first to admit that XP is a way better OS than Vista, and despite having a free copy (of Vista), I'm not planning to use it anytime soon. However, I think that Vista, and Windows in general, is a lot better than people give it credit for.

It has been said that Microsoft is forever copying Apple's software design, but it looks like Apple has taken a page out of Microsoft's book with their new OS 10.5 release. Apple took a very stable and efficient 10.4, much like Windows XP, and turned it into the unstable, inefficient, visual razzmatazz that is OS 10.5, much like Windows Vista. Many of you out there might be clutching your chests at the shock of learning for the first time that Apple is, in fact, a fallible company, but it's true. While I am a fan of OS X, it's safe to say that Apple dropped the ball on this one.

Those issues aside, I don't believe that weaker OS releases are the true cause of the resentment against either OS 10.5 or Vista. I think the real reason is that both companies are victims of their own success. Let's use Windows as the example here:

Windows 3.1 was released in 1992. Three years later Windows 95 arrived (1995). Another three years went by and we got Windows 98 (1998). In 2000 Microsoft released both Windows 2000 Professional, based on the Windows NT platform, and Windows ME (Millennium Edition), based on the 9x platform (Windows 95, 98, and ME). Windows ME is regarded by many as the worst OS Microsoft ever released, mainly, I believe, because Microsoft was trying to ship two versions simultaneously in 2000. In any case, they made up for it with the release of Windows XP in 2001. So we can see that Microsoft was releasing a new OS roughly every two years for nearly a decade up to this point. For many people, getting a new OS with every new computer became a fact of life, and having used every one of these OS versions myself as soon as they came out, I can tell you that I had no qualms about upgrading each time, with the exception of ME, which prompted me to go to Windows 2000 instead.

Moving to the present, Windows Vista was released in 2007, more than five years after Windows XP. By this time XP has completely dominated the OS landscape, with over 400 million copies sold worldwide. I think it's safe to say that I, like many of you, love Windows XP and have spent a long time using it. Unlike every other OS release in Microsoft's history, this time around we've had an OS (XP) long enough to grow attached to it. Compare Windows ME to Windows XP: if you bought ME in 2000 and you hated it, you could just buy a copy of Windows XP the next year when it was released. Not that this arrangement was good for the consumer, mind you, but it's the true state of affairs. In XP's case we've been waiting nearly six years for a new OS, and honestly, I think that even if Vista had been the be-all, end-all greatest OS of all time, people would still be complaining about upgrading to it. Not because it's a bad OS, but because we've become too comfortable with XP.

To cite another example, I recently upgraded my copy of Microsoft Office from 2003 to 2007. Now, I've been using Office since 1995, and every Office release up to that point, including Office 95 (1995), Office 97 (1997), Office 2000 (1999), Office XP (2001), and Office 2003 (2003), used the same toolbar interface: at the top of the screen you have a toolbar with a series of words that, when clicked, expand into menus of options. Being the geek that I am, I prefer to have the menu options streamlined like this, but in Office 2007 the interface has been completely changed. The menus have been replaced by a ribbon of large buttons, almost like a fast food register where the buttons are pictures of hamburgers, and I absolutely hated it at first. Maybe it's because I've grown used to the old way over the last dozen years; maybe I'll never like it. A couple of days ago, though, I was building a PowerPoint presentation and I used some of the new options on the ribbon, and honestly, the new interface is growing on me. I still think it will take a year before I really like it, but the potential is definitely there.

So in the case of Vista, while I'm not going to upgrade until the OS is more stable, I'm sure I'll go through the same stages with Vista that I did with Office 2007. At first I won't like how things have changed, but at some point a year from now I'll be using it, I'll think back on these words, and I'll laugh because I've come to love the way it's changed. What really drives this point home is the fact that Mac users are resisting OS 10.5 the same way Windows users are resisting Vista. Apple released a new version of OS X nearly every year after its debut in 2001: 10.1 (2001), 10.2 (2002), 10.3 (2003), 10.4 (2005). Then it was over two years before OS 10.5 Leopard was released. Now again, I agree that both OS 10.5 and Vista definitely have their shortcomings, and that is part of the equation: if a new release doesn't top the previous one in every way, people are going to find things to hate.

Let's all just face facts: software will never be perfect. Vista has over 50 million lines of code for the OS alone. You try writing a one page document (which works out to about 46 lines at 12 point font) without making any typos, let alone the million-plus pages it would take to print 50 million lines. So in the future let's give Microsoft and Apple a break; they've got an impossible job building an OS that will stand up to the abusive code changes of third party apps like Norton Antivirus, BonziBuddy, and WeatherBug, as well as all the other stupid stuff we (computer users) do to our machines. It's just human nature to expect our computers to do the impossible without flinching and to blame the OS design for every shortcoming. I can think of no better quote to sum this up than this one, by Charles Babbage (the inventor of one of the first mechanical computers, the Difference Engine):

"On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question."
-- Charles Babbage (1791-1871)

It seems we haven't learned....

Think I'm right? Think I'm wrong? Click here to join the discussion about this post.

Thursday, November 29, 2007

Time to Just Give Up...

I'll be the first to admit that after the last generation of consoles (Xbox, PS2, GameCube, Dreamcast), I considered Sega and Nintendo both completely out of the game. As history now tells us, Sega has indeed given up on making consoles, but Nintendo really surprised me. After playing the Wii for the first time last week, I see why it has become the highest selling seventh gen console. That's right: if you haven't already heard, despite being released a year later than the Xbox 360, the Wii has pretty much passed the 360 in total sales. Due to its low price and great party games, the Wii is just what Nintendo needed to bring themselves back from the brink.

After the success of the Xbox and Xbox Live!, I knew that Microsoft was going to take a much larger share of the market with the 360, but I can't believe how they are crushing the PS3. With nearly every product Microsoft has launched, they were far from the market leader in the first generation. By the second they were a big player, and by the third they had dominated the market. Think about it for a second. Have you ever heard of Windows 1.0 or Windows 2.0? How about 3.0 or 3.11? Well, Windows 1.0 and 2.0 did exist, they just weren't that great compared to the market-leading Mac OS. I think we all know who won that battle. By the time Windows 3.11 was released, the market had shifted, and since then Microsoft has completely dominated the personal OS market. Remember Netscape Navigator? Yeah, I barely do either, but it once dominated the web browser market. After its release in 1994, it was the de facto standard, holding 80% of the market for nearly 4 years, until Internet Explorer 5.0 captured the lead. Again, while Microsoft came late to the game, by version 3.0 they were a strong competitor and by version 5.0 it was all over.

Needless to say, I knew that Microsoft was going to come out strong with the 360, but I figured it would be a 60/40% split against Sony. After all, the PS2 has sold nearly 120 million units compared to the Xbox's 24 million. I mean, for a brand new console going up against two companies with years of console experience (Sony and Nintendo), 24 million is impressive, but they still got clobbered nearly 6-to-1 by the PS2. Back to the present, it is Sony that is getting clobbered this time. The 360 has sold almost 14 million units compared to the PS3's 6 million, better than a 2-to-1 lead, and the 360 is gaining ground.

Based on the title of this post, you might assume I'm saying that Sony should give up. Well, I am, but not completely. I'm merely saying that they should give up on the PS3 -- just try to milk enough money out of it to break even and then focus on the PS4. And let me put it this way for those of you who are going to reply that the PS3 is 'awesome': if that many people really thought the PS3 was that great, they would be buying it.

Not only are both the 360 and the Wii vastly outselling the PS3, but so is the PS2. That's right: the PS2, originally released in 2000, is still outselling both the 360 and the PS3. Now part of that is due to its low price and great game selection, but you would think that with all those people who loved the PS2 (myself included), the PS3 would sell better. Not only that, but of all the consoles ranked on Wikipedia, the PS3 sits dead last, behind even the Sega CD and the Sega Saturn.

Now, after the history lesson, we can get to the meat of this story. Sony needs to stop thinking about their own profits and focus on making good electronics again. Let's compare the three strategies of Nintendo, Microsoft, and Sony. Nintendo made a very inexpensive console with a totally new game play experience, which I still honestly can't believe they pulled off. It's a totally new way of playing games that makes it fun for the entire family. Not only that, but Nintendo is actually making money on every console it sells. So: good for the consumer and good for Nintendo.

On to Microsoft. They aren't trying to capture the family market, but rather gamers, like myself. Their strategy is to make games really easy to develop (which also makes them cheaper to create) so lots of cool games will get made. They also created an awesome online multiplayer environment, which will even let PC gamers play against console gamers (kudos on that sweet piece of technology), and they were willing to sell the console for less than it costs to make. Why? To capture market share. Whether you love, hate, or just tolerate Microsoft, by selling their console under cost they are forcing all the other players to do the same in order to remain competitive. Microsoft is building a long-term strategy for dominating the living room, while at the same time giving the consumer great games at a low price.

On to Sony and the title of my blog post. If Sony wanted the consumer to have a better idea of what their long term strategy is (which is to make every consumer device something you can only buy from them), they would just rename themselves "Ridiculously Proprietary Incorporated." Their strategy with every product they make is to lock the consumer into their version so the consumer never buys a non-Sony product again. Now, I'm not against companies trying to retain customers (which Microsoft does as well), but come on! Sony has not one proprietary digital camera memory format, but four: Memory Stick, Memory Stick PRO, Memory Stick Duo, and Memory Stick PRO Duo. I realize that Sony makes adapters that let you use the different formats in cameras that weren't explicitly designed for them, but seriously, isn't one proprietary format enough? The PS3, meanwhile, is basically designed to make money and sell Blu-ray discs. While it has the potential to have great games, it is way too expensive, and the games are so hard to develop that many game designers won't bother. Although things do seem to be getting better, since Sony just cut the cost of their SDK (Software Development Kit) in half.

Put yourself in EA's (Electronic Arts) shoes. If you make a game for the Wii, you've got 14 million potential customers. With the 360 you've also got 14 million, and it's easy to develop for. Or you can build a special Cell processor version for the 6 million PS3 owners. Which would you choose? Add to that the fact that Microsoft makes more profit in a single year (around 15 billion dollars) than Sony (around 2 billion a year) makes in seven. When it comes right down to it, if Microsoft drops the price of the 360 again, which they inevitably will, Sony (which already loses at least $250 per unit) is going to be completely screwed.

This is for you, Sony, so listen up. Stop focusing on trying to make Blu-ray a standard, rip it out of the PS3, and sell it as an add-on player, the way the 360 handles HD DVD. Make the console cheaper and easier to develop games for, and for the love of God, lower the price, or you're going to be facing two far superior market leaders next time around. In the end, having the better technology (anybody remember Betamax vs VHS?) doesn't mean diddly squat when 8 of the top 25 rated games are for the Wii, 7 are for the 360, and only 2 are for the PS3. Listen to the consumers, for once, before you go the way of Sega....

Think I'm right? Think I'm wrong? Click here to join the discussion about this post.