8.10.2005

 

The Dream Machine, 2005

Every year, Maximum PC magazine puts together its "Dream Machine". It's the most powerful PC that you can build with off-the-shelf components. This year's machine has quite impressive specs, totaling over a billion transistors in all. The machine costs almost $13,000 when you include the case, power supply, dual monitors and speakers.

You read the article and you think, "My God, this is an insane amount of computing power and disk space! Who could possibly need such a machine?!" But then you look back at the first Dream Machine that they built in 1996. That machine had a fraction of these specs. They didn't even have a "3D graphics card" in it, because 3D graphics cards didn't exist yet.

Just 9 years ago that was an insanely expensive, ass-kicking machine. Today that 9-year-old Dream Machine is so pathetic that it would be unusable. 32 MB of RAM??? You could not even launch a modern OS in that.

Even coming up to the year 2000 Dream Machine, you find a machine that cost $12,000. $12,000! Today a $500 desktop PC at Best Buy beats it.

So... Between 1996 and 2005 -- just 9 years -- disk space increased by a factor of 1,000. RAM increased by a factor of 250. CPU clock speed increased by a factor of 11, there are 4 cores instead of 1 and the number of transistors went up by a factor of 150. And now we have incredibly powerful graphics cards holding 300 million transistors -- a technology that did not even exist 9 years ago in the normal PC marketplace.
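As a quick back-of-the-envelope check, the overall factors quoted above can be converted into annual growth multipliers. This sketch uses only the factors from this paragraph; nothing else is assumed:

```python
# Annualized growth implied by the 1996 -> 2005 Dream Machine figures
# quoted in the text (overall factors over the 9-year span).
factors = {
    "disk space": 1000,
    "RAM": 250,
    "transistor count": 150,
    "CPU clock speed": 11,
}

years = 2005 - 1996  # 9 years

for name, factor in factors.items():
    annual = factor ** (1 / years)  # compound annual growth multiplier
    print(f"{name}: x{factor} overall, ~{annual:.2f}x per year")
```

So a 1,000x gain over 9 years works out to a bit more than doubling every single year, while the 11x clock-speed gain is roughly 1.3x per year.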

Project out 10 years from now, to 2015. It is quite likely that the $13,000 "Dream Machine" of 2005 will seem pathetic and unusable. You won't even be able to buy a machine like this because it is so pathetic. Who knows what the graphics cards in the 2015 Dream Machine will be doing.

Will the machine in 2015 contain a vision processing card??? That is the huge question I have. 3D graphics accelerator cards like we see today did not even exist in 1996 as far as the Dream Machine was concerned. Will we see vision processing cards arise from nothing and explode in power like that? Or will it take ten years more?

What will the robots in 2015 be able to do?

And what will the Dream Machines in 2025 look like? I don't think we can imagine it.

See Robotic Nation and Robots in 2015 for a discussion.

Comments:
"They didn't even have a "graphics card" in it, because graphics cards didn't exist yet."
Erm, yes they did. Graphics cards have been used ever since the original IBM PC. It is only later that video functions started getting integrated into motherboards. 1996 was also the year the Voodoo 1 3D-accelerated graphics card was released, and nVidia had a card out in 1995.
It would be more accurate to say that in 1996, 3D-accelerated graphics cards were not a mainstream piece of hardware.

"Project out 10 years from now, to 2015."
The rapid development of computer power will eventually have to hit diminishing returns. IMHO transistor density will have a really hard time continuing as it has; it's hard to see how many fabs a market can sustain when they cost billions of dollars each.
 
The limits known today aren't set to be hit before 2015.

Also, those limits are in 2D density. 3D storage and chips should become more common.

As for the vision card, let me just say that "people" are working on it. It will have a suite of hardware-implemented algorithms, some simpler than others. For instance, SVD (singular value decomposition) is a common linear algebra technique used in machine learning and computer vision. It is sort of a building block, but isn't that fast for large matrices. A hardware implementation should be thousands of times faster.
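To make the SVD point concrete, here is a small software sketch (using NumPy purely for illustration; the matrix and sizes are made up) of SVD as a building block: a low-rank approximation of a toy image matrix, which is exactly the kind of kernel a hardware implementation would accelerate:

```python
# SVD as a vision building block: low-rank approximation of an "image".
# This runs the decomposition in software; the comment above imagines
# the same operation baked into dedicated hardware.
import numpy as np

rng = np.random.default_rng(0)

# Toy grayscale "image": a rank-2 pattern plus a little noise.
base = np.outer(rng.standard_normal(64), rng.standard_normal(64)) \
     + np.outer(rng.standard_normal(64), rng.standard_normal(64))
image = base + 0.01 * rng.standard_normal((64, 64))

U, s, Vt = np.linalg.svd(image, full_matrices=False)

k = 2  # keep only the top-k singular values
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

err = np.linalg.norm(image - approx) / np.linalg.norm(image)
print(f"relative error of rank-{k} approximation: {err:.4f}")
```

The rank-2 reconstruction captures almost all of the signal; for large matrices this decomposition is the slow step, which is the motivation for doing it in silicon.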

This doesn't just mean live computation will be faster. It also means training a vision algorithm can be done, perhaps, in hours or days instead of weeks or months. That's huge for something equally important as Moore's Law: the advance of algorithms. Shorter test time means more iterations and improvements in the same span of time.
 
The present future
http://www.kottke.org/05/08/the-present-future

Predictions for the future
December 25, 2001 2 GHz chips from Intel and/or AMD
October 16, 2003 4 GHz
August 6, 2005 8 GHz
May 27, 2007 16 GHz
March 17, 2009 32 GHz
January 1, 2011 64 GHz
October 26, 2012 128 GHz <-- end of photolithography
August 17, 2014 256 GHz equivalent
June 7, 2016 512 GHz
March 28, 2018 1024 GHz = 1 THz
January 17, 2020 2048 GHz = 2 THz
November 7, 2021 4096 GHz = 4 THz
August 28, 2023 8192 GHz = 8 THz
June 18, 2025 16384 GHz = 16 THz
April 9, 2027 32768 GHz = 32 THz
January 28, 2029 65536 GHz = 64 THz
November 18, 2030 131072 GHz = 128 THz
September 8, 2032 262144 GHz = 256 THz
June 30, 2034 524288 GHz = 512 THz
April 19, 2036 1048576 GHz = 1 PHz
February 8, 2038 2097152 GHz = 2 PHz
November 30, 2039 4194304 GHz = 4 PHz
September 19, 2041 8388608 GHz = 8 PHz
July 11, 2043 16777216 GHz = 16 PHz
May 1, 2045 33554432 GHz = 32 PHz
February 19, 2047 67108864 GHz = 64 PHz
December 10, 2048 134217728 GHz = 128 PHz
August 8, 2050 268435456 GHz = 256 PHz
http://singularityinvestor.com/singularity.php?r=1123689079&a=777474658
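For what it's worth, the table above doubles the clock speed on a fixed cadence of roughly 660 days starting from December 25, 2001 (the start date and interval are read off the table itself; later entries drift from it by a day or two due to rounding). A short sketch reproduces the first several dates:

```python
# Reconstruct the doubling schedule implied by the prediction table:
# clock speed doubles every ~660 days, starting at 2 GHz on 2001-12-25.
from datetime import date, timedelta

start = date(2001, 12, 25)
step = timedelta(days=660)  # ~21.7 months per doubling

ghz = 2
for i in range(6):
    print((start + i * step).isoformat(), f"{ghz} GHz")
    ghz *= 2
```

The first three predicted dates (2 GHz, 4 GHz, 8 GHz) match the table exactly, which suggests the list was generated mechanically from exactly this kind of doubling rule rather than from any roadmap.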
 
Moore's Law, as described by Intel.

We can just ask Intel what they are working for in 2015.

In the Intel Developer Forum 2005 keynote speech, which you can hear online, Justin Rattner said that they work in 4-year release cycles and look ahead 2 cycles. He said that there will be either tens or hundreds of cores within the chip by 2015.

My read on this is that 16 cores is very conservative. If I remember right, that's in the books for the next 4 years.

-- lion
 
"'They didn't even have a "graphics card" in it, because graphics cards didn't exist yet.' Erm, yes they did. Graphics cards have been used ever since the original IBM PC. "

Did you intentionally misquote him to prove a point or was it something else? The original article said "They didn't even have a '3D graphics card' in it, because 3D graphics cards didn't exist yet."

For the average person, 3D graphics cards were about as common in 1996 as add-on physics cards are today.

"...it's hard to see how many fabs a market can sustain when they cost billions of dollars each."

They cost that much already, but AMD, Intel, and IBM are still building them. Why? Because of economies of scale. It's extremely economical to spend that much on a facility when each year they make more and more money off of processors that become increasingly cheaper to manufacture. As long as someone can make money off of it, it'll be done. And I don't see any point in the future where the need for processors tapers off.
 
 
 
© Copyright 2005 by Marshall Brain