
Computer Upgrade Time

Thanee said:
@drothgery: Yeah, I know those aren't really useful right now... but who knows what comes in 2-3 years. And I do think Vista will stay around for a while.

Look up Amdahl's law. It's why without some massive, radical changes in how PCs are used and programmed (which -- even if possible -- will take more like decades to trickle down than years), more than 4 cores are going to be pretty much useless on the desktop (and 4 won't be all that much better than 2).
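To make the argument concrete, here's a quick sketch of the arithmetic behind Amdahl's law. The 20% serial fraction is just an illustrative guess for a typical desktop workload, not a measured figure:

```python
# Amdahl's law: if a fraction s of a program is inherently serial,
# the best possible speedup on n cores is 1 / (s + (1 - s) / n).

def amdahl_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.2, n):.2f}x")
# With 20% serial work: 2 cores give 1.67x, 4 give 2.50x,
# but 16 cores only reach 4.00x -- and no number of cores
# can ever beat 1/0.2 = 5x.
```

Note how quickly the returns diminish: doubling from 2 to 4 cores buys 50% more speed, but doubling from 8 to 16 buys barely 20%.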
 


Yeah, it's not very likely. Also the support for 128+ GB of RAM (instead of 16 GB for VHP) doesn't seem like it will be useful anytime soon. :)

You could get two dual-cores instead of one quad-core, though. ;)

I'm almost sure to get the 64bit version, though, since I can still use XP for 32bit if there are any compatibility issues.

Too bad Ultimate seems to be the only edition that comes with both versions, and only at the high retail price... otherwise that would make the choice really easy.

Bye
Thanee
 

Thanee said:
I thought the 550W is already more than is really required, but I will look into this. Thanks!

Bye
Thanee
It is... but not by a very comfortable margin. And you're talking an extra $20 to pump it up more.

And yes, I meant 2x1GB :)
 

drothgery said:
Look up Amdahl's law. It's why without some massive, radical changes in how PCs are used and programmed (which -- even if possible -- will take more like decades to trickle down than years), more than 4 cores are going to be pretty much useless on the desktop (and 4 won't be all that much better than 2).
Well, Thanee is about to buy a GTS which has 6 multiprocessors on it, and each of those has 16 streaming processors, making up a 96-'core' card in total. I admit it takes some specialist programming to make use of them. My current stuff will make use of a few CPU cores, and I am working on something new, to be announced later this year I hope, that will make use of as many cores as you can give it. It's being tested on quad, but it would help if I had more.

I know I keep saying that I am unusual because I develop high-end stuff, but to say that it's not possible or not going to happen on the desktop for decades is wildly pessimistic. It will be a few years, sure, and games (for once!) will lag behind for a few more, unless you're talking consoles, which for sure will make use of it. Actually, isn't the Xbox 360 a 3-core CPU and the PS3 a 7-core Cell? Something like that anyway, but the point is that it's changing now. Intel have said that they are not going to make more than 4 cores on a chip for a while, though.

Back on topic: whatever the hardware industry cooks up, I'm sure MS will cater for. It's difficult to hedge against what might happen -- especially now -- but buying what will just about cope with today's hardware will not allow it to last long. Because you might have to upgrade a lot if you change to multi-CPU, I think that you can buy that bit with some confidence, but number of cores / RAM and perhaps power requirements etc., that's going to change.
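For what it's worth, here is a minimal sketch of the kind of "embarrassingly parallel" work that does scale with core count: each chunk is fully independent, so splitting across processes adds no coordination beyond the final merge. The worker and chunk counts are made up for illustration:

```python
# Data-parallel work with no shared state: each chunk can run on its
# own core, and only the final sum ties the results back together.
from multiprocessing import Pool

def sum_of_squares(chunk):
    # Pure, independent work per chunk -- no locks, no communication.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4, chunks=8):
    size = (len(data) + chunks - 1) // chunks
    pieces = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        return sum(pool.map(sum_of_squares, pieces))

if __name__ == "__main__":
    data = list(range(1000))
    # Same answer as the serial sum(x * x for x in data).
    print(parallel_sum_of_squares(data))
```

This is essentially what GPU stream processors exploit too; the hard part, as the thread notes, is that most desktop applications don't decompose this cleanly.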
 

Redrobes said:
Well, Thanee is about to buy a GTS which has 6 multiprocessors on it, and each of those has 16 streaming processors, making up a 96-'core' card in total. I admit it takes some specialist programming to make use of them. My current stuff will make use of a few CPU cores, and I am working on something new, to be announced later this year I hope, that will make use of as many cores as you can give it. It's being tested on quad, but it would help if I had more. I know I keep saying that I am unusual because I develop high-end stuff, but to say that it's not possible or not going to happen on the desktop for decades is wildly pessimistic.

Err... I'm talking about fundamental principles of computer science here. An application has to scale almost perfectly with extra cores to make more than 4 useful. It happens that the kind of work that's normally offloaded onto graphics cards often does scale almost perfectly, but that's a long way from being a general case. IPC overhead adds up. And so traditional multithreaded programming cannot make use of large numbers of CPUs effectively in most cases.
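A small extension of the same arithmetic makes the IPC-overhead point concrete: if each extra core adds even a tiny fixed coordination cost, the speedup curve doesn't just flatten, it peaks and then falls. Both constants below (5% serial fraction, 1% per-core overhead) are invented for illustration, not measurements:

```python
# Amdahl's law plus a per-core coordination cost k, a common textbook
# extension of the basic model:
#   speedup(n) = 1 / (s + (1 - s) / n + k * n)

def speedup_with_overhead(s, k, n):
    return 1.0 / (s + (1.0 - s) / n + k * n)

for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} cores -> {speedup_with_overhead(0.05, 0.01, n):.2f}x")
# With these numbers the curve peaks around 9-10 cores and then
# declines: past that point, adding cores makes the program *slower*.
```

The peak sits at n = sqrt((1 - s) / k), so even generous assumptions about overhead put the sweet spot well below "as many cores as you can give it" for ordinary multithreaded code.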

It boggles me that so many people seem to have complete faith that programmers will 'figure out multithreading' or something to make large numbers of CPU cores useful to general-purpose applications, when there's absolutely no evidence that this will happen. It's a problem very smart people have been working on for over thirty years, and they haven't come up with a solution yet. If there is one -- and I'm not at all sure there is; I expect large numbers of CPU cores to remain concentrated in servers and specialized applications -- it's going to require radical changes in programming techniques. Let's not forget that it took decades for procedural programming to replace assembler and GOTO statements, and as long for object-oriented programming to become mainstream. Any future radical shifts will also take a long time to trickle down to Joe packaged-software developer with a multi-million-dollar budget, and longer yet to Dave line-of-business programmer who's lucky if he has one or two other programmers on his team.

Incidentally, this is why the Xbox 360's Xenon CPU is bad, and the PS3's Cell is downright awful. But the big guys in the console industry got caught up with the maximum theoretical performance at the lowest price (MS) and with demonstrating something cool (Sony) and that's a big part of why the radically underpowered Wii is doing well. A dual-core traditional CPU would have been a much better choice for any of them.
 
Last edited:

Bront said:
It is... but not by a very comfortable margin. And you're talking an extra $20 to pump it up more.

You are right. The GFX card manufacturer also lists a higher wattage as the optimum. And it's really not such a big difference in cost for a little extra safety margin. That damn thing will be expensive enough as is. ;)

Bye
Thanee
 

drothgery said:
Solid CS stuff cut...
QFT. Our research lab has a big group in High Performance Computing. Getting big returns on processing is their modus operandi. But these are special-purpose algorithms developed for requirements much different than desktop OS usage. And the performance tradeoff is generally not worth it yet for even the hard-core gamer user base. Will this change? Somewhat. Better multicore programming processes will help to a point. But that's going to be a while and not going to pay huge dividends in the short term.
 

What I cannot really figure out yet is... will I likely run into (driver, compatibility) issues with a 64bit Vista, which I would not have with a 32bit version?

I think that's about the most important factor, which would speak against the 64bit version, right?

Bye
Thanee
 

The 64bit driver issue is pretty easy to suss out ahead of time.

It sounds like you have a pretty fair list of components already picked out. Check the manufacturers' web pages. If they don't have a driver specifically labeled as being intended for 64bit Windows, there probably isn't one. 32bit drivers do *not* play well with 64bit OSes.

If they have a driver for 64 bit XP, but not 64 bit Vista then there is a good chance one may be forthcoming, otherwise be very wary.
 

It might make a good topic for people with 64bit setups to list exactly which 64bit OS they use and which of their peripherals work well, work poorly, or have no 64bit driver at all. Besides just knowing what to expect, you could then follow a web link or email that person to track down a driver.
 
