fishsupreme: (carp)
[personal profile] fishsupreme
Bill Gates says:

"Ten years out, in terms of actual hardware costs you can almost think of hardware as being free -- I'm not saying it will be absolutely free -- but in terms of the power of the servers, the power of the network will not be a limiting factor," Gates said, referring to networked computers and advances in the speed of the Internet.

The world's largest software maker is betting that advances in hardware and computing will make it possible for computers to interact with people via speech and that computers which can recognize handwriting will become as ubiquitous as Microsoft's Windows operating system, which runs on more than 90 percent of the world's personal computers.

"Many of the holy grails of computing that have been worked on over the last 30 years will be solved within this 10-year period, with speech being in every device and having a device that's like a tablet that you just carry around," Gates said at the Gartner Symposium ITxpo, held by information technology researcher Gartner Group.


VIA understands it. Bill Gates actually gets it. Intel, with their new move to Pentium M and relabeling processors with model numbers instead of clock speeds, is starting to get it.

We've reached the point where "More power!" is not what most computing users want. Sure, we (the kind of geeks who read the AnandTech forums, where I found the link to this article) still want it... but your average computer buyer does not. We've reached the end of more, faster, better -- and are now at the beginning of smaller, cheaper, and good enough.

In 10 years, super-powered enthusiast systems will be around... but their price will have gone *up* from where they are now, not down. The vast majority of people will be using systems not that much more powerful than what we use now... that they bought for under $100 and that fit in a desk drawer.

Intel's latest CPU (Pentium IV Prescott) is a disaster. It puts out enough heat to fry an egg, and isn't much faster than what they already had. The higher they clock their CPUs, the harder it gets to ramp up further -- they're seeing diminishing returns. They can either keep throwing themselves into that wall, or recognize that most people don't want to go faster anyway; they want a computer that costs less and takes up less space. They'd like it to be easier to use, too, but that's more a software issue than a hardware one.

I think that future high-end CPUs will not be faster so much as more parallel -- one physical CPU will be multicored, and contain 4 or 8 or 16 logical CPUs inside it, all capable of processing simultaneously. This will be even better for software written to make use of it, but will not benefit traditional software at all... but current CPUs are already plenty fast for traditional software.
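To make that last point concrete: software only benefits from multiple cores if it is explicitly split into independent pieces of work. A minimal sketch in Python (the function names and chunking scheme here are my own illustration, not anything from the post):

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum_of_squares(bounds):
    # Each worker handles one independent slice of the range.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def sum_of_squares(n, workers=4):
    # Split the range into one chunk per logical CPU. The chunks run
    # simultaneously on a multicore processor -- but a traditional
    # single-threaded program would see no benefit from the extra cores.
    step = max(1, n // workers)
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))

if __name__ == "__main__":
    print(sum_of_squares(1_000_000))
```

The extra work lives entirely in the software: deciding how to partition the problem and recombine the results. Code that was never written this way gets exactly one core's worth of speed, no matter how many cores the chip has.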


Date: 2004-03-31 11:19 am (UTC)
From: [identity profile] eagle243.livejournal.com
Interesting theory. I'll check back in 10 years. ;)

Date: 2004-03-31 11:26 am (UTC)
From: [identity profile] prester-scott.livejournal.com
they're seeing diminishing returns

The end of Moore's Law, perhaps?

Date: 2004-03-31 11:56 am (UTC)
From: [identity profile] phanatic.livejournal.com
Naw, just reaching the limits of currently-implemented technology. Still plenty of tricks available.

Date: 2004-03-31 12:35 pm (UTC)
From: [identity profile] pyran.livejournal.com
Really, they could probably use some of those tricks already, but said tricks are pretty expensive. Cooling technologies exist to keep the chips from overheating, but they're a complicated pain in the ass.

Date: 2004-03-31 01:02 pm (UTC)
From: [identity profile] phanatic.livejournal.com
All fabs are "pretty expensive."

Date: 2004-03-31 01:00 pm (UTC)
From: [identity profile] phanatic.livejournal.com
It's not that I think we can't make it go faster -- but it gets more and more expensive every time.

If that's the case, then you're saying that we can't keep up with Moore's Law. Moore's paper didn't just deal with increasing transistor densities, but with decreasing transistor costs. In that paper, he writes, "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year." His curves weren't just saying that you get more transistors, but that you get more transistors for your buck. If the only way to get more performance is to throw exponentially more money at the problem, then you're no longer following Moore's Law.
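The cost-curve version of the argument is easy to make concrete. A quick sketch (my own illustration; the one-year doubling period is the figure from Moore's 1965 paper, which was later commonly revised to about two years):

```python
def projected_transistors(start, years, doubling_period=1.0):
    # Moore's 1965 observation: the complexity for minimum component
    # cost doubles roughly every year (later revised to ~2 years).
    return start * 2 ** (years / doubling_period)

# At a one-year doubling period, ten years buys a 1024x increase in
# transistors at the same cost -- but only if the cost curve holds.
```

If the per-transistor cost stops falling, the curve flattens even though raw densities could, in principle, keep climbing on an ever-bigger fab budget.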

enough power

Date: 2004-03-31 03:09 pm (UTC)
From: [identity profile] kmo.livejournal.com
I think a lot of those people are going to stop doing this when they can get a system that's powerful enough for all their needs for $300... then $200... then $100. And since those people vastly outnumber those who actually need the more powerful systems, the money for developing those more powerful systems for an ever-decreasing user pool dries up.

I was under the impression that, for a while at least, a lot of PC users were motivated to upgrade their hardware when their systems could no longer run the latest crop of PC games. Is that still the case? Nowadays $300 will buy you an XBox, which seems to pack more than enough computing power under the hood to fuel just about any gamer's jones. Does that mean that the latest, greatest PC games no longer drive gamers to shell out the big bucks for new hardware?

Re: enough power

Date: 2004-03-31 03:44 pm (UTC)
From: [identity profile] phanatic.livejournal.com
console games vastly outsell PC games.


But PC games are what drive the technology that finds its way into console systems like the XBox.

Look at what's in the XBox: A Coppermine P3 with a piddling 128k L2 cache running at under 800 MHz; 64 megs of RAM with a 133Mhz FSB; an 8 gig, very slow drive; and a GPU that was hot beans relative to other console systems when it came out, but is now vastly antiquated.

Sure, that's cheap; you couldn't even buy a PC with such feeble specs today. Systems like that aren't what drives development.

Date: 2004-03-31 11:55 am (UTC)
From: [identity profile] phanatic.livejournal.com
Intel's latest CPU (Pentium IV Prescott) is a disaster.

Ah, but AMD's latest CPU is a rock-solid, balls-fast success.

Date: 2004-03-31 12:53 pm (UTC)
From: [identity profile] phanatic.livejournal.com
And expensive as all hell.

If you're the type who's going to spend $269 on a 2.8GHz P4 Prescott, I don't think $270 for an Athlon 64 3200+ is "expensive as all hell," considering price/performance.

Date: 2004-03-31 12:25 pm (UTC)
From: [identity profile] flummox.livejournal.com
Intel's latest CPU (Pentium IV Prescott) is a disaster. It puts out enough heat to fry an egg on, and isn't much faster than what they already had.

I find it interesting that this same sentiment keeps getting expressed for initial product releases. The first Pentium couldn't beat a 486 (and could also fry eggs). The Pentium Pro sucked eggs on 16-bit applications. Willamette could only just beat a Coppermine chip and couldn't compete with the Athlon.

I think the jury is still out on whether or not Prescott will be a disaster, per se. All I know is that we're selling all the ones that we're making today.

As for multi-core, Intel would be mighty foolish not to develop in this direction. It's all a question of when.
