Make way for the 64 bit revolution

Let the memory live again

Cast your mind back 15 years, if you can. Believe it or not, that was when the IT industry suddenly started to get excited about 64 bit computing.

IBM (in conjunction with Apple and Motorola) had already embarked on the design of the 64 bit PowerPC chip and Digital had announced the 64 bit Alpha chip (you remember Digital, surely: it was eaten by Compaq, which was in turn eaten by HP). With the advent of the (now defunct) Alpha chip, commentators everywhere were trumpeting the dawn of 64 bit computing.

Well, it has been one of the longest dawns ever. At the time Intel didn't care about 64 bit computing and the PC industry didn't either. It took Intel about another 10 years to care about 64 bits and even then it seemed to be pushed into it by competition from AMD. But, nevertheless, it happened in the end and now we're all living in a 64 bit world...or are we?

Where 64 bits mattered

We entered a 64 bit world 15 years ago, but the simple fact was that it made no difference to most applications. The 64 bits, if you didn't know, refers to the width of the memory addresses and general purpose registers that the processor works with. A 64 bit chip has 64 bit registers and a 64 bit data path, and because its addresses are 64 bits wide it can address a vast amount of memory.
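
If you want to see what that means in practice, here is a minimal C sketch; what it reports depends entirely on how it is compiled, showing 8 byte pointers on a 64 bit build and 4 byte pointers on a 32 bit one.

/* Minimal sketch: the pointer width is what "64 bit" buys you in
 * practice -- a vastly larger address space. */
#include <stdio.h>

int main(void)
{
    printf("pointer size : %zu bytes\n", sizeof(void *));
    printf("address bits : %zu\n", sizeof(void *) * 8);
    /* The theoretical limit of the address space is 2^(address bits) bytes. */
    return 0;
}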

How much memory?

In the region of 18 Exabytes: 2^64 bytes comes to roughly 18.4 billion billion bytes. An Exabyte is a billion gigabytes, and 18 Exabytes is greater than the sum total of all the world's data (which is a mere eight Exabytes or so).

But so what? 32 bits was enough to address four gigabytes of data (in practice about two gigabytes per process under Windows) and very few applications needed to address that much. The other advantage of a 64 bit processor is that it does 64 bit arithmetic a great deal faster, which is useful in heavyweight scientific applications but of little use on the PC.

And if you don't actually need 64 bits, there's a penalty for using them. The vast majority of applications perform better compiled as 32 bit because pointers are half the size and the executable files are smaller, so more code and data fit in the chip's cache. So 64 bit performance can actually be poorer.
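
To make the penalty concrete, here is a small, hedged illustration: a structure made mostly of pointers roughly doubles in size when compiled as 64 bit, so fewer of them fit in each cache line. The sizes in the comments are typical, not guaranteed; they depend on the compiler's padding rules.

/* Illustration of the 64 bit size penalty for pointer-heavy data. */
#include <stdio.h>

struct list_node {
    struct list_node *next;
    struct list_node *prev;
    int value;
};

int main(void)
{
    /* Typically 12 bytes on a 32 bit build, 24 bytes on a 64 bit build
     * (two 8 byte pointers plus padding). */
    printf("sizeof(struct list_node) = %zu bytes\n", sizeof(struct list_node));
    return 0;
}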

But 64 bits makes a big performance difference if you need to directly address more than two gigabytes of information, and needing to do so is very common in data warehouse applications where terabytes of data are under management. Indeed, IBM's iSeries and pSeries had a big advantage in this kind of application for a while because of their 64 bit capabilities. The major database products moved to 64 bit implementations very quickly once 64 bit servers were available. Scientific computing moved quickly in that direction too.
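
A hedged sketch of why the database vendors moved so fast: memory-mapping a file that runs to many gigabytes needs a contiguous virtual address range that a 32 bit process simply cannot offer. This assumes a POSIX system, and the file name warehouse.dat is purely illustrative.

/* Map a large data file straight into the address space and treat it as
 * one big in-memory array -- practical only with 64 bit addressing. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include <sys/stat.h>

int main(void)
{
    int fd = open("warehouse.dat", O_RDONLY);   /* illustrative file name */
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); close(fd); return 1; }

    /* On a 32 bit system this fails for files much over two or three
     * gigabytes; with 64 bit addressing the whole file can be mapped. */
    void *data = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    printf("mapped %lld bytes at %p\n", (long long)st.st_size, data);

    munmap(data, (size_t)st.st_size);
    close(fd);
    return 0;
}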

Today's 64 bit applications

The reason for writing this article is to point out that the IT industry is rapidly arriving at the point where 64 bits will make a difference to many things...enough to drag us all into a 64 bit world. First consider video. An hour of video at a reasonable resolution (for a PC or Mac) will usually occupy more than a gigabyte. If it is HD (high definition) then it will occupy about four times as much space. So, if you want to manipulate video in any way or just address part of a video file, 64 bits suddenly makes a big difference.
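
As a back-of-the-envelope check on those figures, here is a quick calculation; the bitrates are assumptions chosen for illustration, not properties of any particular codec.

/* Rough estimate of the size of one hour of video at assumed bitrates. */
#include <stdio.h>

int main(void)
{
    const double seconds_per_hour = 3600.0;
    const double sd_bits_per_sec  = 3e6;    /* assumed ~3 Mbit/s  */
    const double hd_bits_per_sec  = 12e6;   /* assumed ~12 Mbit/s */

    double sd_gb = sd_bits_per_sec * seconds_per_hour / 8.0 / 1e9;
    double hd_gb = hd_bits_per_sec * seconds_per_hour / 8.0 / 1e9;

    printf("one hour, standard definition: ~%.1f GB\n", sd_gb);  /* ~1.4 GB */
    printf("one hour, high definition    : ~%.1f GB\n", hd_gb);  /* ~5.4 GB */
    return 0;
}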

Now think of the PC or Mac you might buy soon. If it runs Windows Vista then think in terms of two gigabytes of memory, and if you're doing anything graphical at all on the Mac then two gigabytes is the base requirement (in my opinion). The personal computer has crossed the 64 bit line.

Now think of computer grids, especially loosely coupled ones which might be assembled on the fly. Such a grid will be managed much more effectively with 64 bit addressing. Now think of managing a large corporate network as though it were a single computer. The simple fact is that it will be far more effective with a single addressing scheme that can apply to the whole network.

A 64 bit revolution

There is quite possibly a genuine 64 bit revolution on the way. Just combine 64 bit addressing with the fact that memory is gradually replacing disk as the natural place to store online information and you have an architectural revolution in the offing.

The truth is that such a revolution began quite a while ago with the idea of virtualised operating environments and the separation of disk resources as NAS or SANs. But it hasn't yet got to the point where the industry is thinking in terms of memory based architectures. This will happen, and it is likely to happen soon.

And it will be a good thing too, making streaming applications and database applications far more efficient than they currently are. It's odd to think about it this way, but nearly all the applications we run are built on the assumption that the primary copy of the data is held on a spinning disk. Pretty soon all such applications will be legacy applications.

Copyright © 2007, IT-Analysis.com
