EMC launches first petabyte array

From the article: What's funny is that in 2025 or 2030, your iPod will have a petabyte of disk space...
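For what it's worth, the arithmetic behind that prediction is easy to check. Here is a quick sketch in Python, assuming a 60 GB iPod today and capacity doubling every 18 months or so (both assumptions are mine, not figures from the article):

```python
import math

# Assumptions (not from the article): a 60 GB iPod in 2005,
# with capacity doubling roughly every 1.5 years.
capacity_bytes = 60e9
petabyte = 1e15
doubling_years = 1.5

# How many doublings to get from 60 GB to a petabyte?
doublings = math.ceil(math.log2(petabyte / capacity_bytes))
year_reached = 2005 + doublings * doubling_years

print(f"{doublings} doublings -> a petabyte around {year_reached:.0f}")
```

Fifteen doublings at 18 months apiece lands in the late 2020s, which is right in the article's "2025 or 2030" window.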



Will Google create the god of knowledge?

St Lawrence of Google

The article is a fairly typical review of a Google speech until we get to the last paragraph: At that point, what happens? If you have the money and know-how to create "an omniscient and omnipotent algorithm", what happens next?



British and U.S. police states

Something to start off the new year:


From the article: See Robotic Nation for details.



65nm chip manufacturing

The next step in the development of microprocessors is the roll-out of the 65nm manufacturing process, which appears to be successful:

Intel Pentium Extreme Edition 955 & 975X Express Chipset: 65nm is Here

There is a fascinating graph in the article showing the trendline from 0.8um in 1990 to 65nm today. Obviously we will reach a "minimum feature size" for silicon at some point, so it will be fascinating to see what the next step is for Intel when we reach that point. But we still have a ways to go.
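The trendline amounts to a remarkably steady exponential shrink. A quick back-of-the-envelope calculation from the two endpoints (0.8um in 1990, 65nm in 2005):

```python
import math

# Endpoints of the trendline: 0.8 um (800 nm) in 1990, 65 nm in 2005.
start_nm, end_nm = 800.0, 65.0
years = 2005 - 1990

# Overall linear shrink factor, and the implied halving period.
shrink = start_nm / end_nm                 # ~12x smaller features
halving_years = years / math.log2(shrink)  # years per halving of feature size

print(f"linear shrink: {shrink:.1f}x over {years} years")
print(f"feature size halves roughly every {halving_years:.1f} years")
```

Feature sizes halving about every four years means transistor density quadrupling on the same schedule, which is Moore's law in action.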

It is also interesting that power consumption is down rather than up in these new chips.

When I do public speaking on these topics, a comment I hear almost every time is, "Well, your robot predictions will not come true because computers are not going to get much faster than they are today. Moore's law is dead." It is not clear to me where this thinking comes from. Moore's law, or rather the broader exponential trend behind it, has held for perhaps 60 years: we have moved from relays to vacuum tubes to transistors to integrated circuits in that time. Something else will come along when silicon reaches its natural limits (carbon nanotubes, quantum computing or something else entirely). Or we will figure out a way to do 3-D silicon. Or we will come up with a new computing paradigm that does not depend on every single transistor and wire working. The human brain has billions of cells and runs on only 20 watts or so, so we have a working example right there of how far things can go.

Even more interesting is the algorithm and programming side. The new Xbox 360's CPU has three cores (six hardware threads), but most games use only one of them because we haven't yet figured out how to write highly multi-threaded game code. We still don't have a good general approach to computer vision. As soon as we do, we will see vision cards proliferate just as graphics cards have. Once we figure these things out, it may change the course of hardware design.
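To illustrate why multi-core game code is hard: the easy case is independent data fanned out across cores, but real game loops share state heavily between physics, AI and rendering. A minimal sketch of the easy case in Python (the per-entity update function is hypothetical, just a stand-in for independent work):

```python
from concurrent.futures import ProcessPoolExecutor

def update_entity(state):
    # Hypothetical per-entity work: depends only on its own input,
    # so it parallelizes trivially.
    return state * 2 + 1

def update_world(entities, workers=2):
    # The easy case: no shared mutable state, so the work fans out
    # cleanly across cores. Real game loops share state heavily,
    # which is why multi-threading them is the hard part.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))

if __name__ == "__main__":
    print(update_world([1, 2, 3, 4]))  # [3, 5, 7, 9]
```

The moment two entities need to read and write each other's state (collisions, targeting, pathfinding), this clean fan-out breaks down, and that is exactly the situation game engines are in.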

© Copyright 2005 by Marshall Brain