That looks just like the Radeon 9500 and 9700 series. Maybe this is just a better version of the 9700 rather than a new card :cry:
The most common rumour is that the RV350 will be an R300 with a faster clock (375 or 400 MHz, depending on who you want to believe)... and that the R400 will be the next card with a significant change in architecture.
In a nutshell, this is all the NV35 is, according to Hellbinder at least:
-600 MHz core
-8 (true) pipelines
-256-bit bus
-500 MHz DDR-II (or perhaps DDR-I at 256 MB)
-All the NV30 hardware bugs fixed (in theory)
I'm sure there are one or two surprises in there. However, with the revelation that the low-k process is screwed up at TSMC, it is not very likely that Nvidia will be able to reach 600 MHz without the use of ANOTHER Dustbuster. Currently the NV35 is based on copper, and they have done a few tweaks to make it run a little cooler and more stable at faster speeds. However, given that the NV30 is already pushing it at 500 MHz, it would seem the changes they have made would simply make the NV35 run like a *normal* chip at 500 MHz.
Now, with the pixel output fixed and the jump to around 30 GB/s of bandwidth, even at 500 MHz it will be pretty darn fast. However, I have not heard that they have changed their AA modes, nor have they addressed some issues with their AF that will become apparent once the NV30 is public.
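As a quick sanity check on that bandwidth figure, here's a rough back-of-the-envelope calculation (the 256-bit bus and 500 MHz DDR numbers are just the rumoured specs from above, nothing confirmed):

```python
# Theoretical peak memory bandwidth for the rumoured NV35 configuration.
# Assumptions: 256-bit bus, 500 MHz memory clock, DDR (2 transfers/clock).
bus_width_bits = 256
memory_clock_hz = 500e6
transfers_per_clock = 2  # DDR moves data on both clock edges

bytes_per_transfer = bus_width_bits / 8  # 32 bytes per transfer
bandwidth = bytes_per_transfer * memory_clock_hz * transfers_per_clock

print(f"Theoretical peak: {bandwidth / 1e9:.0f} GB/s")  # -> 32 GB/s
```

So the "around 30 GB/s" figure checks out: 32 bytes per transfer, a billion transfers per second.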
Well, if this turns out to be true, then like the R350 relative to the R300, the NV35 is simply a polished NV30, and with the R400 rumoured to be delayed until next year, ATi should continue to lead (just) with the R350.
As I've stated previously, Nvidia's NV40 will need to be something very special indeed if they're to get back into the game.
Yes, the NV30 offers 128-bit colour, so the NV35 will too; not sure about the R350. Where's RDR when you need him?
The real world is filled with dramatic contrasts between lights, shadows and colors, from the brightest, harshest white
to the deepest, darkest black. For any game world to convincingly depict a 3D environment, it needs to simulate this
seemingly infinite range in a decidedly finite space. With each added "bit" of light or color information afforded a game,
the quality and accuracy of the resulting image grow exponentially. 32-bit color gives each of the red, green, blue and alpha channels only 256 choices. 128-bit color, with 32 bits of floating-point precision per channel, gives the developer billions of choices for each channel.
For film-quality real-time animated visuals, there's simply no substitute. The GeForce FX delivers this demanding level
of excellence in real time.
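To put that per-channel arithmetic in concrete terms, here's a small illustrative sketch (the scene values are made-up numbers, purely for demonstration) of how an 8-bit channel quantizes high-dynamic-range intensities while a 32-bit float channel keeps them intact:

```python
import numpy as np

# Hypothetical intensities relative to a 0.0-1.0 displayable range:
# two slightly different deep shadows, a midtone, and a bright highlight.
scene = np.array([0.0007, 0.0009, 0.85, 12.5])

# 8-bit channel (traditional 32-bit color): clamp to [0, 1],
# then snap to one of 256 levels.
eight_bit = np.round(np.clip(scene, 0.0, 1.0) * 255) / 255

# 32-bit float channel (128-bit color): stored essentially as-is,
# including values above 1.0 and tiny shadow differences.
fp32 = scene.astype(np.float32)

print(eight_bit)  # [0.   0.   0.851 1.  ] -- shadows crushed, highlight clipped
print(fp32)       # [7.e-04 9.e-04 8.5e-01 1.25e+01] -- detail preserved
```

The two shadow values collapse to the same black and the highlight clips to white at 8 bits per channel, which is exactly the banding and blown-highlight problem the extra precision avoids.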
Thanks for the clarification!
As I remember, the jump from 16-bit to 32-bit had a performance penalty on first-gen cards.
What kind of performance hit will the jump to 128-bit colour cost us? Is it worth it yet? :?: :rolleyes:
Sorry, but I can't help you there; we will have to wait for retail products to be tested, and comparisons made with other cards, before that question can be truly answered.
But I would guess that there must be some sort of performance hit with 128-bit colour, if only because each pixel takes four times the storage and bandwidth of 32-bit. Having said that, if we could see a game running with that depth of colour, well, can you imagine the level of realism?
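To give a rough sense of the cost, here's a quick back-of-the-envelope on framebuffer size and write traffic at 32-bit versus 128-bit colour (the resolution and frame rate are illustrative choices, not benchmarks):

```python
# Framebuffer cost of 32-bit vs 128-bit colour at an illustrative
# resolution and frame rate (not measured figures).
width, height, fps = 1600, 1200, 60
pixels = width * height

for name, bytes_per_pixel in [("32-bit", 4), ("128-bit", 16)]:
    framebuffer_mb = pixels * bytes_per_pixel / 2**20
    # One write of every pixel per frame; real games read and
    # write each pixel several times (overdraw, blending, AA).
    write_gbps = pixels * bytes_per_pixel * fps / 1e9
    print(f"{name}: {framebuffer_mb:.1f} MB per buffer, "
          f"{write_gbps:.2f} GB/s minimum write traffic")
```

Quadrupling every colour read and write is where the hit would come from, even before shader precision enters into it.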