Playstation 3 pricing announced

trancejeremy said:
Looking on Circuit City, the cheapest HDTV seems to be about $900 (for a 30"), and you don't hit the really high resolution (1080p, which the PS3 and the HD DVD formats are aiming for) until the $3500 mark or so.

If you got one for $700 2 years ago, you got a pretty good deal.

Anyway, on even an HDTV, is the picture from a regular DVD (480p) that much worse than the one from a Blu-ray (or HDTV) downsampled to 720p?
Huh? Don't look at 1080p - that's tech that will be the average 3-5 years from now, and it isn't needed to provide an HD picture. 720p or 1080i is the current sweet spot.

Any TV that has component inputs and can produce at least a 720p picture is considered HD.

As for your question about 480p - have you seen a picture at 480p vs. 1080i/720p? There is a definite difference. It's also about the output. Downsampling is exactly what it sounds like: going downhill.

The best example I can think of is music. Compare a current stereo system with a tape deck and CD player. The same song on a cassette vs a CD.
 


The fact that modern games aren't multithreaded much is an issue for both the 360 and the PS3. More for the PS3, in fact. However, it is an issue for dual-core PCs as well.

Still, this is a paradigm that has to change eventually. Making faster CPUs while keeping heat under control is becoming harder and harder, with manufacturers pushing for multi-core architectures instead. Even PCs are going to be dual-core as a standard.

Bottom line, game developers must learn how to multithread. Microsoft and Sony aren't being stupid for putting heavily multi-core CPUs in their machines - they simply know this simple fact.

As more of them start writing efficient multithreaded code, multi-core CPUs will become a real advantage. This is especially true for Cell, and for this reason I expect second- and third-generation PS3 titles to be much better than early ones. Most current PS3 titles only use one or two SPEs plus the main PPE core, which means that more than half of the CPU's power is not being used. In a few years, this will probably give the PS3 a noticeable edge over the 360.
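To make "learning to multithread" concrete, here's a rough sketch in plain C++ with std::thread (generic code, not actual console SDK code - the function names are mine) of the kind of data-parallel, per-frame work that does split cleanly across cores:

```cpp
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical per-frame task: advance particle positions by one timestep.
// Each particle is independent, which is what makes it safe to split
// the loop across cores without locks.
void integrate(std::vector<float>& pos, const std::vector<float>& vel,
               float dt, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        pos[i] += vel[i] * dt;
}

// Carve the particle array into one contiguous chunk per worker thread.
void integrate_parallel(std::vector<float>& pos, const std::vector<float>& vel,
                        float dt, unsigned workers) {
    std::vector<std::thread> pool;
    const std::size_t chunk = pos.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = (w + 1 == workers) ? pos.size() : begin + chunk;
        pool.emplace_back(integrate, std::ref(pos), std::cref(vel),
                          dt, begin, end);
    }
    for (auto& t : pool) t.join();  // wait for this frame's work to finish
}
```

The catch, of course, is that physics-style loops like this are the easy part; AI and game logic that depend on what the player just did are much harder to carve up this way.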
 

Zappo said:
The fact that modern games aren't multithreaded much is an issue for both the 360 and the PS3. More for the PS3, in fact. However, it is an issue for dual-core PCs as well.

Still, this is a paradigm that has to change eventually. Making faster CPUs while keeping heat under control is becoming harder and harder, with manufacturers pushing for multi-core architectures instead. Even PCs are going to be dual-core as a standard.

Bottom line, game developers must learn how to multithread. Microsoft and Sony aren't being stupid for putting heavily multi-core CPUs in their machines - they simply know this simple fact.

As more of them start writing efficient multithreaded code, multi-core CPUs will become a real advantage. This is especially true for Cell, and for this reason I expect second- and third-generation PS3 titles to be much better than early ones. Most current PS3 titles only use one or two SPEs plus the main PPE core, which means that more than half of the CPU's power is not being used. In a few years, this will probably give the PS3 a noticeable edge over the 360.
Ah! That's what I was looking for! :)

A plain explanation of what is currently "wrong" and a possible solution. You also give yourself and these devs credit for having brains, meaning that you are not offering blind criticism but simply describing what is going on and where it is headed. I'm glad to hear that this is basically going the same way Sony/Nintendo consoles have gone in the past. They start off with game devs not fully getting how to use the hardware, and after 3+ years (sometimes more) they are able to really use the system the way it was meant to be used and live up to the promises projected at launch.

I exclude MS from this equation as they have only one generation of console and it wasn't very inventive - just a concentrated PC for gaming (which is okay). Sony and Nintendo have historically put together hardware that takes devs a while to "unlock." That goes all the way back to the SNES and the PSone. If you look at some of the titles released at the very end of a cycle, you'd be amazed at what they were able to milk out of the hardware.

It's also why we'll almost always be disappointed by launch games. But that's not news to anyone.
 

Zappo said:
The fact that modern games aren't multithreaded much is an issue for both the 360 and the PS3. More for the PS3, in fact. However, it is an issue for dual-core PCs as well.

Still, this is a paradigm that has to change eventually. Making faster CPUs while keeping heat under control is becoming harder and harder, with manufacturers pushing for multi-core architectures instead. Even PCs are going to be dual-core as a standard.

Bottom line, game developers must learn how to multithread. Microsoft and Sony aren't being stupid for putting heavily multi-core CPUs in their machines - they simply know this simple fact.

It's not just a matter of "learning to multithread" (non-programmers tend to hand-wave this; even when you're doing something that can be effectively multithreaded, writing multithreaded code is hard, and despite decades of effort, there hasn't been much progress in making it easier). It's a matter of trying to do things that can be effectively multithreaded. There are very severe limits on how multithreaded something that's strongly dependent on user input (i.e. a game) can be. A lot of games might be able to use two threads effectively; very few will be able to use six (three cores + SMT on the Xenon) or one main thread and seven helper threads (PPE + 7 SPEs on the Cell).

And the other massive performance drag on both CPUs is that they're in-order machines (which PC CPUs haven't been since the original Pentium), rather than out-of-order, which means code that branches a lot (extremely common in games) doesn't perform very well. Code that involves doing the same thing over and over again in predictable ways can be very fast on an in-order CPU (Intel's Itanium is an in-order CPU), but they're not very good for general-purpose computing, or for games. What they are is relatively cheap to design and build (the cheapest Athlon 64 X2 is almost as expensive as an Xbox 360), and something that allows MS and Sony to give impressive numbers to the press.
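For what it's worth, here's the kind of micro-rewrite programmers end up doing on in-order cores: replacing data-dependent branches with arithmetic. A generic C++ sketch (nothing console-specific; it assumes arithmetic right shift on signed 32-bit ints, which every compiler in this space provides, and inputs far from overflow):

```cpp
#include <cstdint>

// Branchy clamp: fine on an out-of-order PC CPU with good branch
// prediction, but every mispredict stalls an in-order pipeline.
int32_t clamp_branchy(int32_t x, int32_t lo, int32_t hi) {
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

// Branch-free clamp: same result, no data-dependent branches.
// (d >> 31) is all-ones when d is negative, all-zeros otherwise.
int32_t clamp_branchless(int32_t x, int32_t lo, int32_t hi) {
    int32_t d = x - lo;
    x -= d & (d >> 31);           // x = max(x, lo): undo the deficit if x < lo
    d = x - hi;
    return x - (d & ~(d >> 31));  // min(x, hi): subtract the excess if x > hi
}
```

Trivial in isolation, but multiply it across every inner loop in a game and it's a lot of extra work just to keep an in-order CPU like Xenon or the Cell's PPE fed.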

But console makers keep ignoring, as they have over and over again, that CPUs and memory get cheaper over time a lot faster than GPUs do (new low-cost GPUs are almost always scaled-down versions of new high-cost GPUs, rather than old midrange or high-end GPUs, while CPUs almost always move down-market), so it's better to spend the component budget on the CPU and memory, whose costs will fall over the console's lifetime.
 

John Crichton said:
I exclude MS from this equation as they have only one generation of console and it wasn't very inventive - just a concentrated PC for gaming (which is okay). Sony and Nintendo have historically put together hardware that takes devs a while to "unlock." That goes all the way back to the SNES and the PSone. If you look at some of the titles released at the very end of a cycle, you'd be amazed at what they were able to milk out of the hardware.

Err... the original PlayStation was a simple and straightforward architecture, certainly much more so than the competing Saturn and N64. The same goes for the GameCube, Dreamcast, and Xbox (and the Wii, for that matter); they weren't just simple relative to the PS2, they were simple, period. That's why the GCN keeps up with the PS2 quite well despite having less theoretical power by most measures -- its only theoretical advantage, though a very significant one, is more memory -- it's far easier to code against a G3 than an Emotion Engine. The Xbox 360 is less wonky than the PS3 (symmetric multiprocessing is far easier to handle than asymmetric multiprocessing), though the GPU design on the 360 is new (unified shaders should be simpler, and are pretty much required in DirectX 10, but they haven't been done before), and they're both fairly wonky by nature of having massively multithreaded, in-order CPUs.
 

John Crichton said:
What should Microsoft have done to make their system better for games while still being able to take less of a loss on each sale of a 360?

It depends on what Intel, AMD, and IBM Microelectronics were offering MS. I'm a lot less concerned about owning all the IP, and about making the system unusual (and therefore hard to hack), than Microsoft was. I also think that MS overestimated the importance of being first to market; I wouldn't have launched the 360 last year, when it was clear neither Sony nor Nintendo would (in fact, they're only launching consoles this year because MS did last year; otherwise we'd see the PS3 in 2007, when it almost makes sense to launch a console with a Blu-ray drive). Heck, I'd've made sure Halo 3 and KotOR 3 were launch titles.

But I'd bet that the best thing IBM could have made for a console last year would have been a tweaked PPC970FX (the low-power G5, seen in the last PowerPC iMacs). Single-core, but with an excellent vector unit. The dual-core version costs too much and uses too much power for a console. Intel could have provided a tweaked Pentium M "Dothan", or possibly two of them for conventional SMP (the Pentium 4 is too power-hungry for a console; the Pentium D is right out); AMD could have provided a tweaked Athlon 64 (though not an X2; too expensive).

John Crichton said:
Why did Nintendo choose the architecture they did for the Wii rather than something "faster?"

Cost, cost, and cost. The Wii's strongly rumored to be launching at under $200, and Nintendo has never taken a loss on hardware.

John Crichton said:
Why is the Cell processor terrible? What would have been better to put in the PS3?

I think I explained above why the Cell's terrible for games, except to add that asymmetric multiprocessing on the Cell is even harder to do well than symmetric multiprocessing on the Xenon or standard dual-core CPUs, and that the Cell's SPEs just aren't good for all that much. When Sony was intending to use two Cells in the PS3 and not have a dedicated GPU, they were halfway decent for processing graphics -- but nowhere near as good as ATi and nVidia's dedicated hardware, which is why they had to scramble and sign on nVidia at the last minute.

Because it's launching in 2006, not 2005, IBM could probably make a dual-core G5 that fits in a console's power envelope. AMD's costs for Athlon 64 X2s are lower now than they were last year. Intel could provide a Core Duo (a Core 2 Duo would be excellent, but they're launching in the next month or two and will still be too expensive by conventional console economics in November).
 

I don't handwave. I am a programmer. ;) The problem is hard, but far from impossible.

By comparison, making single-threaded games constantly better and better is impossible.
 

John Crichton said:
Your pricing is way off, man. Two years ago I bought a Sony 30" CRT for $800 at Circuit City that has an HDMI input and 2 component inputs. Yeah, it was on sale ($100 off that week), but it was brand new, in the box, from a reputable store. I actually saw it online at the time for as low as $700. That was 2 years ago.

These days, a TV that can take a component/HDMI/DVI signal can easily be found for $500 or less. It may not have a 60" screen, but that's okay. And the prices on these TVs are dropping monthly. By this holiday season you'll have a huge choice of TVs that can take a signal produced by a BR or HD player.

[snip]

And the bottom line is that, yes, you will need a TV that can take advantage of the BR or HD tech. But it will actually cost less than the price of the PS3 if you do a little homework and legwork. Open-box buys can easily knock a few hundred off a TV at a place looking to get rid of it. :cool:

And I think that while the hardcore consumer is willing to spend, or more likely already has spent, money on that setup, the casual gamers have not, and likely will not. Things may have changed somewhat in the two years since I stopped selling electronics, but the average customer didn't even come in looking at a flatscreen television. And while flatscreens seem to be more common now, the point is that your average consumer, and thus casual gamer, IMO, is not going to spend thousands of dollars to play video games. And that's why you can't bring up next-gen playability as a benefit: the majority of people won't have sufficient setups to take advantage of it.
 

Zappo said:
By comparison, making single-threaded games constantly better and better is impossible.

Maybe (I suspect we haven't passed the point where a single fast thread is better for gaming performance, and won't soon; moreover, if AMD and Intel were solely concerned with gaming, they'd be putting more effort into fast single cores), but massive multi-threading is not useful for game machines now, and won't be within the expected lifespan of the PS3 and Xbox 360. A second core and/or SMT would be useful. Three SMT cores or 7 helper micro-cores are not.

Games are about the worst case for multi-threading; a game machine typically runs a single application that doesn't parallelize well. Desktops typically run a lot of apps at one time. Servers tend to run apps that parallelize well (or, in the case of web servers, a lot of small apps and/or multiple copies of the same app).
 

LightPhoenix said:
And I think that while the hardcore consumer is willing to spend, or more likely already has spent, money on that setup, the casual gamers have not, and likely will not. Things may have changed somewhat in the two years since I stopped selling electronics, but the average customer didn't even come in looking at a flatscreen television.

He's not talking about flatscreen TVs (or rather, he's not talking about LCDs, DLPs, plasmas, or old-school projection TVs). He's talking about 26" and 30" widescreen CRTs, which are pretty widely available for $500-$800 (if there had been $500 30" CRTs three years ago, I'd have one).
 
