
View Full Version : Playstation 3 launch


Funko
17-05-2005, 10:19:04
http://news.bbc.co.uk/1/hi/technology/4554025.stm


CPU: Cell processor running at 3.2GHz, with 2.18 teraflops of performance
Backward compatible
256MB XDR main RAM at 3.2GHz
256MB of GDDR3 VRAM at 700MHz
Memory Stick Duo, SD and CompactFlash memory slots
Detachable 2.5-inch hard drive
Support for seven Bluetooth controllers
Six USB slots for peripherals
Supports the Blu-ray disc format
Output in HDTV resolutions up to 1080p as standard

King_Ghidra
17-05-2005, 10:33:53
well the techno stuff means very little to me, as consoles perform very differently from equivalent pc hardware speeds.

Either way, it looks a lot better than the old one and if they do deliver on the promise to make it genuinely broadband and network compatible (i.e. without having to buy loads of extra plug-in crap) then it will be just super as far as i'm concerned.

mr.G
17-05-2005, 10:34:32
i like teraflops

Nills Lagerbaak
17-05-2005, 13:54:07
"The console also boasts a new graphics chip from Nvidia, which Sony claims can create movie-quality images in real time in games."

Hmm, I'd like to play a demo of this. I remain highly sceptial.

King_Ghidra
17-05-2005, 13:58:19
sceptial eh, you should see a doctor

Gramercy Riffs
17-05-2005, 14:01:30
It could turn septic.

Nills Lagerbaak
17-05-2005, 14:04:36
http://www.gamespot.com/ps3/action/killzone2/media.html?gcst=killzone2_ot_ps3_051605.asx


Hmm, I think I see where they're going now......

Funko
17-05-2005, 14:07:58
That looks pretty pretty.

Nills Lagerbaak
17-05-2005, 14:12:59
Yes, the other one looks pretty pretty too. Still "movie" quality? I've heard that before.....

King_Ghidra
17-05-2005, 14:19:44
i would tend to agree with nils on the whole, such hype generally leaves me cold, but i have seen things which have proved how much things have moved on. Look at the graphics in gt3 and gt4 on the ps2, they were stunning and certainly photo-realistic-ish. We can realistically expect them to be improved considerably with another generation of hardware.

MattHiggs
17-05-2005, 15:00:40
Any idea what the PS3 will retail at?

Asher
17-05-2005, 15:06:27
The PS3 looks like a George Foreman Grill and/or minifridge with Spiderman font on it.

You'd be right not to believe the hype on Cell, particularly its comparison to the Xbox 360. The PS3's GPU is a generation behind the Xbox's, and Cell has excessive theoretical performance (it's a massive vector processor), but it is not nearly as "efficient" as the more traditional tri-core CPU in the Xbox 360.

JM^3
17-05-2005, 15:15:55
it isn't really about the specs

both are better than what is around now, both are nice

it is about what games come out for what..

Jon miller

mr.G
17-05-2005, 15:17:18
yaaaaay and no tinted windows damnit

Deacon
17-05-2005, 21:51:33
Should be fast enough, provided that the new software doesn't demand more capabilities than the new hardware will provide, and that the bandwidth between components is enough that they can function together quickly. 256MB RAM doesn't sound like as much as it used to.

As has been mentioned before, besides tech specs, features and price matter too. Not to mention corporate management. It's the difference between success and being the next 3DO console or Iridium phone. :)

LoD
17-05-2005, 22:12:42
Originally posted by Asher
You'd be right not to believe the hype on Cell, particularly its comparison to the Xbox 360. The PS3's GPU is a generation behind the Xbox's, and Cell has excessive theoretical performance (it's a massive vector processor), but it is not nearly as "efficient" as the more traditional tri-core CPU in the Xbox 360.

Just one question Asher - how can something be "more traditional" and "a generation ahead" of something else, simultaneously?

Sir Penguin
17-05-2005, 23:45:27
Same way that the Pentium 5 will have the more traditional non-Netburst architecture of the Pentium 3, but will be a generation ahead of the Pentium 4.

SP

Asher
18-05-2005, 00:22:48
Originally posted by LoD
Just one question Asher - how can something be "more traditional" and "a generation ahead" of something else, simultaneously?
Read more carefully: GPU != CPU.

PS3 GPU is a generation behind the Xbox GPU in design style, in that we'll see similar 8-pixel pipeline 48/96-FPU designs a couple years down the road for legacy reasons (old games would run like ass on these new cards, since it has 8 pixel pipes instead of 12/16/24 which is now standard...the games would need to be rewritten for the new design).

The PS3 CPU (Cell) isn't really comparable generation-wise, since CPUs won't be going to the Cell-like design at all.

The PS3 has the Cell design because Sony f*cked up. Their original plan had Sony make a bunch of Cells (probably 2-4 Cells per console) and have a very dumb raster graphics chip, a la PS2's GS. Then the Cell would be the CPU and GPU, doing all the calculations.

Unfortunately, they knew this -- again -- would be no match for a Nvidia/ATI GPU coupled with a faster CPU. So at the "last minute" they ditched the GS idea, got a "stock" Nvidia GPU (complete with the GDDR3 memory controller) and removed all but 1 Cell.

So in the end Sony has a ton of vector processors (http://en.wikipedia.org/wiki/Vector_processor) in the PS3, with only one conventional processor. It doesn't really make sense to have the vast majority of the vector-related work offloaded onto the GPU (now that there's an Nvidia GPU) and still have a vector-based CPU.

The end result is insane theoretical performance due to the vector processors, but the in-game real world performance will be lucky to be remotely competitive to the Xbox 360's CPU. Game logic and AI totally suck on vector processors.

Cruddy
18-05-2005, 01:36:30
Hmm. How does 256 CPU and 256 DDR3 GPU RAM stack up against the arrangements on Xbox 360?

Also, it rather depends on how the hardware of each platform is developed. Which has a better reputation for software quality control - MS or Sony? Sony for sure.

A firm thumbs up to the USB ports. There's going to be some weird and wonderful peripherals for PS3.

I rather doubt it's going to make FX-55 SLI game players too worried though.

Asher
18-05-2005, 02:17:50
Originally posted by Cruddy
Hmm. How does 256 CPU and 256 DDR3 GPU RAM stack up against the arrangements on Xbox 360?
Almost identical. Xbox has 512MB DDR3 total, so developers have more freedom in partitioning the memory. The CPU RAM on the PS3 is RAMBUS, 25GB/s. Slightly higher bandwidth (vs. 22.4GB/s on Xbox for all 512), but with substantially higher latency. But cheaper!

The main differentiation point is the 10MB of 256GB/s embedded DRAM on the Xbox 360's GPU. Something like 40% of all memory writes on the GPU are to the current framebuffer alone (drawing the actual picture). The Xbox 360 has the 256GB/s eDRAM on the chip, so it's quicker and doesn't saturate the main GPU bus bandwidth...freeing it up for other things.

In effect, Xbox 360 gets free anti-aliasing thanks to the eDRAM, and it'll be costly on the PS3.

Also, it rather depends on how the hardware of each platform is developed. Which has a better reputation for software quality control - MS or Sony? Sony for sure.
Depends what you're talking about.

Devs hate Sony's devkits and love MS' devkits. Sony wants them to develop their games on Linux with an immature, buggy POS IDE.

Sony also has a nasty quality reputation from the PlayStation series. The original PlayStations overheated frequently and required additional fan cooling (a staple peripheral market) until a later revision fixed that. The PS2 had a terrible, terrible reputation for optical drive quality and "dirty disk errors".

I rather doubt it's going to make FX-55 SLI game players too worried though.
Why would it?

If they want power, they'd go for the IBM Cell-based workstations. They'll have up to 4 Cells in them (not just 1), and SLI high-end graphics cards.

mr.G
18-05-2005, 09:10:47
butt ugly controls tho

Sir Penguin
18-05-2005, 09:32:41
Is there a console that doesn't have butt-ugly controls?

SP

Funko
18-05-2005, 10:50:32
The dreamcast?

King_Ghidra
18-05-2005, 11:53:02
To reiterate Mr G's point:

"I don't care if it can harness the power of the sun, compute the definitive answer to Pi, is controlled via mental telepathy, or can clean my flat. If the games aren't there then it's just a thing that comes in a box. Stephen T, Vancouver, Canada"

I totally agree.

Everyone knows both consoles are going to be broadly similar in their technological leap forward, even if one does turn out to be slightly better than the other.

To me, the two biggest factors that determine which edges ahead of the other will be 1) games and 2) how much they live up to the hype about being entertainment centres as well as just games boxes.
The one that best meets point 2) in terms of connectivity & compatibility, ease of use, variety of functions, extra hardware required to get the best from it, etc. will definitely have an edge.

However, as with the technology, i can't really help but expect them to be broadly similar in this capacity too. Maybe xbox's quicker release to market will turn out to be a genuine killer factor.

Cruddy
18-05-2005, 12:35:54
Hmm... won't PS3 have a MUCH larger back compatible catalogue?

Cruddy
18-05-2005, 12:44:33
Originally posted by Asher
The main differentiation point is the 10MB of 256GB/s embedded DRAM on the Xbox 360's GPU. Something like 40% of all memory writes on the GPU are to the current framebuffer alone (drawing the actual picture). The Xbox 360 has the 256GB/s eDRAM on the chip, so it's quicker and doesn't saturate the main GPU bus bandwidth...freeing it up for other things.

In effect, Xbox 360 gets free anti-aliasing thanks to the eDRAM, and it'll be costly on the PS3.



Think I've got it. The graphics frame buffer on the Xbox 360 is built into the graphics chip. It's on separate chips on the PS3.

BUT... is a 10MB frame buffer enough on high resolution screens? Can it double buffer on 1200 X 1080 24 bit screens (bearing in mind it will be using 32 bits per pixel to avoid "odd" addressing?). Just... but forget widescreen, too many pixels to fit into 10Mb.

If so, it's possible Xbox could be the slower at max res?

Also, as for getting AA for nothing, I can't help but think NVidia and Sony are going to build a heap of "free" processes in.

Well, if they don't, the Xbox will stuff them on paper.

MattHiggs
18-05-2005, 12:44:59
Originally posted by Cruddy
Hmm... won't PS3 have a MUCH larger back compatible catalogue?

Yep, Microsoft said they would make 'their most popular' titles backward compatible.

Personally I wouldn't take into account backward compatibility in my decision to purchase a console.

Cruddy
18-05-2005, 13:02:07
Hmm... 2 X HDMI video/sound connectors?

Wonder if they're planning a Virtuality headset?

MattHiggs
18-05-2005, 13:26:44
If anyone cares Nintendo announced vague details of their 'Revolution' package.

fp
18-05-2005, 14:01:46
Nobody cares.

King_Ghidra
18-05-2005, 14:44:07
I'd rather hear about it than listen to any more pointless fucking number crunching nonsense

MattHiggs
18-05-2005, 14:54:17
Nintendo are also releasing a new version of the Gameboy SP. It's about the size of an iPod and is aimed at the same kind of market.

Asher
18-05-2005, 14:59:19
Originally posted by Cruddy
Think I've got it. The graphics frame buffer on the Xbox 360 is built into the graphics chip. It's on separate chips on the PS3.

BUT... is a 10MB frame buffer enough on high resolution screens? Can it double buffer on 1200 X 1080 24 bit screens (bearing in mind it will be using 32 bits per pixel to avoid "odd" addressing?). Just... but forget widescreen, too many pixels to fit into 10Mb.
1080p is 1920x1080 = 2073600 pixels x 32 bits = 66355200 bits = 8294400 bytes = ~7.9MB per frame at 1080p resolution (the max resolution).

Few pointers though:
1) You only need one framebuffer in the EDRAM, you only draw to one. Doublebuffering means you draw on the one "in the background" and only display it to the screen when it's done. Singlebuffering means you draw directly to the picture on the screen (which looks like ass).
2) They use 32 bits per pixel not only to avoid odd addressing; the "odd" 8 bits is used as an alpha (transparency) channel.
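As a sanity check of that arithmetic, here's the calculation (plus the double-buffering idea) sketched in a few lines of Python. This is my illustration, not anything from the thread; the buffer "rendering" is just a stand-in.

```python
# Back-of-the-envelope check of the 1080p framebuffer maths above,
# assuming 32 bits (4 bytes) per pixel: 24-bit colour + 8-bit alpha.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(frame_bytes)                    # 8294400 bytes
print(round(frame_bytes / 2**20, 1))  # ~7.9 MB per frame

# Minimal double-buffering sketch: render into the back buffer,
# then flip, so only finished frames ever reach the screen.
front = bytearray(frame_bytes)        # currently displayed
back = bytearray(frame_bytes)         # being drawn "in the background"

back[0] = 255                         # stand-in for actual rendering work
front, back = back, front             # the flip: completed frame goes on screen
```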

Also, as for getting AA for nothing, I can't help but think NVidia and Sony are going to build a heap of "free" processes in.

Well, if they don't, the Xbox will stuff them on paper.
They may claim things are "free", but everything takes memory bandwidth. If they're not going to use embedded memory for the task, it's coming at the cost of something else...and not free. ;)

Asher
18-05-2005, 15:02:52
Originally posted by Cruddy
Hmm... 2 X HDMI video/sound connectors?

Wonder if they're planning a Virtuality headset?
As far as I know, they're just touting "panoramic" gaming...using 2 HDTVs at once.

Asher
18-05-2005, 15:03:48
Originally posted by MattHiggs
If anyone cares Nintendo announced vague details of their 'Revolution' package.
It's a sexy machine to be sure.

Nintendo says "2-3x faster" than the Gamecube...

Which puts it way behind the Xbox 360 and PS3 in power. :(

MattHiggs
18-05-2005, 15:26:01
I read in the paper that they are in discussion with IBM and ATI regarding the specifications.

Cruddy
18-05-2005, 15:50:07
Originally posted by Asher
1080p is 1920x1080 = 2073600 pixels x 32 bits = 66355200 bits = 8294400 bytes = ~7.9MB per frame at 1080p resolution (the max resolution).

Few pointers though:
1) You only need one framebuffer in the EDRAM, you only draw to one. Doublebuffering means you draw on the one "in the background" and only display it to the screen when it's done. Singlebuffering means you draw directly to the picture on the screen (which looks like ass).
2) They use 32 bits per pixel not only to avoid odd addressing; the "odd" 8 bits is used as an alpha (transparency) channel.



My point exactly! Unless Microsoft use non-square pixels or some other work around, it won't double buffer on top resolution settings, will it?

Cruddy
18-05-2005, 15:53:33
Originally posted by Asher
As far as I know, they're just touting "panoramic" gaming...using 2 HDTVs at once.

Well, if that's 2 different scenes at once, looks like a shoo-in candidate for Virtual Reality headsets, doesn't it?

Asher
18-05-2005, 16:04:06
Originally posted by Cruddy
My point exactly! Unless Microsoft use non-square pixels or some other work around, it won't double buffer on top resolution settings, will it?
Of course it will, the 2nd buffer exists in the main memory pool instead of the EDRAM. This is in contrast to the PS3, where both buffers exist in the main memory pool.

The PS2 even had a small (4MB) eDRAM @ 48GB/s for the main framebuffer.

I guess Sony had to cut some corners somewhere, so the EDRAM had to go.

Cruddy
18-05-2005, 16:11:09
Yuk! So Mr GPU either has to copy memory continuously, or strobe different types of memory in the hope that the CPU has its feet up in the air at that precise moment in time.

On paper, PS3 will rock at ultra res, and Xbox 360 will not. I guess we'll just have to wait and see if that turns out to be true.

Asher
18-05-2005, 16:25:36
Originally posted by Cruddy
Yuk! So Mr GPU either has to copy memory continuously, or strobe different types of memory in the hope that the CPU has its feet up in the air at that precise moment in time.

On paper, PS3 will rock at ultra res, and Xbox 360 will not. I guess we'll just have to wait and see if that turns out to be true.
I don't see how you can draw that conclusion. Higher reses require much more bandwidth, and the PS3 will have much less effective bandwidth since it doesn't have the eDRAM to draw the framebuffer on. Something like 40% of the memory bus activity will be saturated with constant calculations/streams of vectors that are avoided on the Xbox 360. It takes WAY more bandwidth for the GPU to draw the image than to transfer the image once it's drawn.

Asher
18-05-2005, 16:27:33
Look at it this way: Is it faster to Xerox an image than it is to draw it by hand each time? ;)

The PS3 is constantly drawing the image on a 22.4GB/s bus.

The Xbox2 draws this image on a 256GB/s bus, then sends a copy of the final product on the 22.4GB/s bus later.

The ~8MB frame is usually compressed by a factor of 2-4 (ATI's lossless framebuffer compression), so you're effectively sending 2-4MB of data 60 times a second = 120-240MB/s of the 22.4GB/s...and this is worst-case 1080p high-res...in standard 720p, it'll be much lower.

By comparison, the PS3 will easily use ~10GB/s of that 22.4GB/s to draw the image constantly.

This gives the Xbox2 over 22GB/s for data streaming (such as textures), and PS3 about 12GB/s for the same at high resses.
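The bandwidth budget in this post can be re-derived in a few lines of Python. All the inputs here are the thread's own estimates (frame size, compression range, bus speed), not measured values.

```python
# Rough re-derivation of the bandwidth figures described above.
# Inputs are the thread's estimates, not measurements.

frame_mb = 8.0          # ~uncompressed 1080p frame, in MB
compression = (2, 4)    # claimed lossless framebuffer compression range
fps = 60
bus_gb_s = 22.4         # shared GDDR3 bus, GB/s

best = frame_mb / compression[1] * fps    # best case: 120 MB/s
worst = frame_mb / compression[0] * fps   # worst case: 240 MB/s
print(f"Xbox 360 final-image transfer: {best:.0f}-{worst:.0f} MB/s "
      f"of {bus_gb_s * 1000:.0f} MB/s")

ps3_draw_gb_s = 10.0    # thread's estimate for drawing the image on the main bus
print(f"PS3 left for textures etc.: ~{bus_gb_s - ps3_draw_gb_s:.1f} GB/s")
```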

Cruddy
18-05-2005, 16:35:10
Yeah, but it's a constant bandwidth hit, isn't it? It means the Xbox 360 memory bus is slower again, because it continually has to transfer.

It's not going to help CPU computation of the vector data to feed into the next frame generation, is it? Both machines have to feed the graphics chip with data from the CPU, but the Xbox has to wait for the finished frame to be buffered as well.

I'll leave it there. You could well be totally right and Xbox 360 will indeed update ultra res faster than a PS3.

We'll never really know for sure, unless a product is launched on both platforms. That would be a REAL comparison... highly unlikely though.

Asher
18-05-2005, 16:46:10
Originally posted by Cruddy
Yeah, but it's a constant bandwidth hit, isn't it? It means the Xbox 360 memory bus is slower again, because it continually has to transfer.
Both designs constantly have to transfer. The Xbox 360 transfers a compressed final image only, the PS3 every single detail. It takes much, much more bandwidth to draw the screen than it does send the final image.

It's like 400MB/s at 1080p to send a compressed final image compared to 10000MB/s.

It's not going to help CPU computation of the vector data to feed into the next frame generation, is it? Both machines have to feed the graphics chip with data from the CPU, but the Xbox has to wait for the finished frame to be buffered as well.
Both GPUs create the vertices now, as well as manipulate them. That's a VS 3.0 feature. They don't get streamed from the CPU anymore.

And in any case, it makes no difference. The Xbox 360 has access to the exact same memory bus to main RAM as the PS3, but it has the additional option of 10MB of ultrafast embedded RAM to use as they wish. If they wanted to, they could draw both framebuffers in the main RAM and use the EDRAM for some constantly-used textures or whatever.

If anything, the PS3 is again at a disadvantage here, if only because the PS3 CPU can only access its 256MB of RDRAM, and can't touch anything in the GPU's RAM pool. This may be an issue when doing per-polygon collision detection (which is fairly standard today) on vertices created by the GPU. Since the X360 has a fully unified 512MB pool, both the CPU and GPU have access to everything.

fp
18-05-2005, 18:05:14
Originally posted by King_Ghidra
I'd rather hear about it than listen to any more pointless fucking number crunching nonsense

Actually I agree with you after reading these last few posts.

Asher
18-05-2005, 18:18:26
I think you guys are lost, this is the geek forum, not the game forum.

fp
18-05-2005, 18:19:41
:nervous:

Cruddy
18-05-2005, 18:24:40
Originally posted by Asher
That's a VS 3.0 feature. They don't get streamed from the CPU anymore.



Please explain what VS means. I'd like to know a lot more about how 3D is generated these days, and DirectX tutorials don't really explain things like that.

fp
18-05-2005, 18:27:28
Visual Studio.

I think.

fp
18-05-2005, 18:28:49
Thusly: http://msdn.microsoft.com/vstudio/

Asher
18-05-2005, 18:32:25
Originally posted by Cruddy
Please explain what VS means. I'd like to know a lot more about how 3D is generated these days, and DirectX tutorials don't really explain things like that.
VS is Vertex Shader in this context. Think of an assembler-like programming language that runs on GPUs for graphics alone.

If you want to learn more, Beyond3D is a great, great place: http://www.beyond3d.com/forum/viewforum.php?f=4

ATI employees post there, as do game developers (e.g. Square Enix developers), etc. Lots of insight on the hardware with some great diagrams/explanations.
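For a flavour of what a vertex shader conceptually does, here's the core operation sketched in Python. This is illustrative only: real vertex shaders are tiny GPU programs (HLSL/GLSL or shader assembly) run once per vertex, massively in parallel, and the matrix and mesh data here are made up.

```python
# Conceptual sketch of a vertex shader: a small program run once per vertex,
# here transforming homogeneous (x, y, z, w) positions by a 4x4 matrix.

def vertex_shader(vertex, mvp):
    """Transform one vertex by a 4x4 model-view-projection matrix."""
    return tuple(
        sum(mvp[row][i] * vertex[i] for i in range(4))
        for row in range(4)
    )

# Identity matrix and a toy triangle; stand-ins for real shader inputs.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
mesh = [(0.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 1.0), (0.0, 1.0, 0.0, 1.0)]

# The GPU applies the same program to every vertex in the mesh.
transformed = [vertex_shader(v, identity) for v in mesh]
print(transformed[1])  # (1.0, 0.0, 0.0, 1.0) -- unchanged by the identity matrix
```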

Asher
19-05-2005, 01:27:59
Apparently the EDRAM on the Xbox 360 isn't just a really fast memory for the framebuffer.

It has 1 million logic gates on the thing as well.

If the primary framebuffer is stored there, it will do "pixel blending, z-test, and anti-aliasing" completely independent of the GPU.

They weren't kidding that AA is free.

The AA is done on the "memory chip" housing the picture of the scene, independent of the CPU/GPU/memory bus... :D

mr.G
19-05-2005, 08:25:59
Originally posted by Sir Penguin
Is there a console that doesn't have butt-ugly controls?

SP yes PS2

Sir Penguin
19-05-2005, 11:05:23
It was actually a rhetorical question. All consoles have terrible, unintuitive controllers.

SP

mr.G
19-05-2005, 17:46:29
Lurker knows

LoD
19-05-2005, 19:29:02
Originally posted by Asher
Read more carefully: GPU != CPU.

Ah, sorry, to my tired eyes it was C==C, my bad.

Asher
19-05-2005, 19:58:32
More GPU info: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423

Because of the extremely large amount of bandwidth available both between the parent and daughter die as well as between the embedded DRAM and its FPUs, multi-sample AA is essentially free at 720p and 1080p in the Xbox 360. If you're wondering why Microsoft is insisting that all games will have AA enabled, this is why.

ATI did clarify that although Microsoft isn't targetting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.

It looks like MS' AA will be free, and Sony will take the typical 20-40% performance hit on AA...

Cruddy
19-05-2005, 21:07:12
So long as you can turn it off, I won't complain. ;)

Not everybody likes AA and if it's a forced choice, MS will find people like me complaining.

Asher
19-05-2005, 21:27:48
You're insane.

Cruddy
19-05-2005, 21:39:45
No, I'd just like to have a small advantage lining up 1,000 metre headshots.

Asher
19-05-2005, 22:13:07
You're using a controller with games that use per-polygon hitboxes. The position of individual pixels doesn't matter.

Cruddy
20-05-2005, 00:29:58
You misunderstand me. Say I'm playing an FPS with a camouflaged character, I would rather blend in myself at range (other players have AA on) but be able to turn it off (so they don't get blended into the background in my view).

Asher
20-05-2005, 00:40:56
You seem to think AA is just a blur filter?

Cruddy
20-05-2005, 15:40:27
A simpler term for anti-aliasing - mixing up midtones to avoid high contrasts.

Asher
20-05-2005, 16:31:34
That's not how modern anti-aliasing works.

Cruddy
20-05-2005, 19:25:15
Well, if it doesn't mix up mid tones to avoid high contrasts, it's not anti-aliasing is it?

Asher
20-05-2005, 19:50:42
In modern 3D AA, an image is rendered at double (2x) or quadruple (4x) the resolution, then filtered and resampled down to the final resolution. On the monitor, the color of a pixel is set to the average of four of the original pixels. This allows the hard edges to be smoothed out.
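A toy version of that resample step, in Python. This is my illustration, not how either console's hardware actually does it (real MSAA is cleverer: it only takes extra samples for edge coverage, not for every pixel):

```python
# Toy supersampling: render at 2x resolution, then average each 2x2 block
# down to one output pixel. Grayscale values stand in for full RGB.

def downsample_2x(image):
    """Average 2x2 blocks of a high-res image into one low-res pixel each."""
    h, w = len(image), len(image[0])
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

# A hard black/white edge at 2x resolution...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]

# ...becomes a softened edge after resampling. The midtones come from
# averaging real samples, not from blurring the finished picture.
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```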

Cruddy
20-05-2005, 21:01:50
Originally posted by Asher
In modern 3D AA, an image is rendered at double (2x) or quadruple (4x) the resolution, then filtered and resampled down to the final resolution. On the monitor, the color of a pixel is set to the average of four of the original pixels. This allows the hard edges to be smoothed out.

All well and good. It's still just selecting mid tones to stick between contrasting pixels.

And the Xbox 360 can't AA render that way, can it? Not with 10MB of graphics memory.

Asher
21-05-2005, 00:26:40
Yes, it can. There are more complicated details to it; it's called Multisample Anti-Aliasing (MSAA).

BTW: http://www.xboxyde.com/news_1610_en.html

A tech video running in real-time on beta Xbox 360 hardware. It's not running at full speed; it was ported to the Xbox 360 from the PC in "under a week".

It was actually designed and optimized for ATI's upcoming PC part, which is very very different from the Xbox 360 GPU.

Asher
25-05-2005, 15:42:17
Excellent article on the Xbox 360 and how it plans to "procedurally" create art assets to shave development costs.

Funko
27-05-2005, 12:50:01
From the BBC:

Which next-generation games console looks the most tempting?
PlayStation 3 57.44%
Revolution 11.28%
Xbox 360 31.28%

34135 Votes Cast

Seen a couple of predictions that reckon the X-Box could get 30% of the world market, pretty impressive leap.

Asher
27-05-2005, 13:27:00
One analyst has predicted MS would get 58% of the market next time around.

Asher
27-05-2005, 13:27:45
Originally posted by Asher
Excellent article on the Xbox 360 and how it plans to "procedurally" create art assets to shave development costs.
Wow, I didn't link it.

http://arstechnica.com/articles/paedia/cpu/xbox360-1.ars

Funko
27-05-2005, 13:37:04
Next time round meaning X-box 3?

They did very well with the X-Box, which I think they intended as an intro to get their name known as a console maker and into the market etc., rather than as a massively profitable thing in itself?

If they grow to 30%+ (which they probably will) with this one then that'll be great growth I think.