GeForce 6

Submitted by TheBigJ on

nVidia announced the GeForce 6 (NV40) today, and in case anyone hasn't seen it yet, I thought I'd post some links. By the way, this thing is PHAT!

[url]http://www.nvidia.com/object/IO_12687.html[/url]
[url]http://www.anandtech.com/video/showdoc.html?i=2023[/url]
[url]http://www.tomshardware.com/graphic/20040414/index.html[/url]
[url]http://www.nzone.com/object/nzone_nalu_home.html[/url]

nVidia is sticking with the scantily-clad female mascot. This time it's a mermaid named Nalu. Woo!

Submitted by inglis on Fri, 16/04/04 - 2:03 AM

Oh, another 'Revolutionary' and 'Blazingly Fast' video card.

But with ZBrush 2 coming out, it may be a good idea to think about upgrading my GeForce 3.

Submitted by ScORCHo on Fri, 16/04/04 - 2:06 AM

What happened to 5? Or was the FX series instead of that...

Submitted by TheBigJ on Fri, 16/04/04 - 2:10 AM

Yeah, FX was the fifth GeForce series. Maybe they forgot that they would have to name a sixth.

Submitted by Fluffy CatFood on Fri, 16/04/04 - 6:09 AM

I read some of the benchmarks on AnandTech; some were damn impressive. I forget which tests were the ones where it doubled the frame rate of a Radeon 9800 XT. Some of the other tests weren't that impressive, but doubling a 9800 XT is pretty good.

Submitted by Aven on Fri, 16/04/04 - 5:56 PM

I hope they fix up their image quality a little. Have a look at the Far Cry screens on Tom's Hardware :/ If it's just a driver fault then it looks like it may be quite a nice card. As long as you have a power supply to support it :)

Submitted by Maitrek on Fri, 16/04/04 - 11:01 PM

I imagine ATI have a similarly powerful card in the pipeline with the R423 (PCI Express). Graphics tech is just getting ridiculous; most programmers/developers can't keep up with it. Look at Far Cry, for instance: it already runs bloody smooth on a 9800 XT, yet in six months' time or less it'll look 'dated' compared to what's possible with next-gen graphics tech.

I imagine the NV40 is going to cost a motherload too. I mean, 512 registers, 16 pipelines or something like that? We're talking gazillions of transistors here, which is going to cost $$$.

Submitted by TheBigJ on Sat, 17/04/04 - 12:00 AM

It has about 50 million more transistors than the latest P4, about 220 million or something like that. I'm sure it'll cost a pantload.

Submitted by Kalescent on Sat, 17/04/04 - 12:38 AM

Bah, if you absolutely must stay up with the latest graphics cards as soon as they hit the shelves, then kudos to you. Just hold off a couple of months after release and it'll pretty much lose 15-40% of the original release price.

A couple of months... or a couple of hundred bucks... meh. Following Maitrek's comments: if you've got a GeForce 4/FX or Radeon 9800 or thereabouts, you probably won't notice MUCH difference at all when playing games, as the graphics tech is way, way ahead.

Submitted by MoonUnit on Sat, 17/04/04 - 8:13 AM

I know you people are probably interested to see this, so:

Graphics Core: 256-bit
Memory Interface: 256-bit
Memory Bandwidth: 35.2 GB/sec
Fill Rate: 6.4 billion texels/sec
Vertices per Second: 600 million
Memory Data Rate: 1100 MHz
Pixels per Clock (peak): 16
Textures per Pixel: 16
RAMDACs: 400 MHz
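
Those numbers hang together, by the way. Here's a quick sanity check (rough arithmetic; the ~400 MHz core clock is my own guess, inferred from the fill rate, since it isn't in the list above):

[code]
# Sanity check of the quoted specs. The ~400 MHz core clock is an
# assumption inferred from the fill rate, not from the spec sheet.

bus_width_bits = 256            # memory interface
data_rate_mhz = 1100            # effective memory data rate
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_mhz * 1e6 / 1e9
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")     # -> 35.2 GB/s

pixels_per_clock = 16
core_clock_mhz = 400            # assumed
fill_gtexels = pixels_per_clock * core_clock_mhz * 1e6 / 1e9
print(f"Fill rate: {fill_gtexels:.1f} billion texels/s")  # -> 6.4
[/code]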

[img]http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/images/ut…]

http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/page20.asp

Makes saving up to buy a Radeon 9600 seem really pointless...

Submitted by Major Clod on Sat, 17/04/04 - 8:17 AM

Exactly. You can pick up 128MB 9800 Pros for ~$400 these days. I'd definitely save that $300 and go with the Pro rather than the slightly faster XT.

Submitted by Major Clod on Sat, 17/04/04 - 9:58 AM

Radeon 9800 Pro (Powercolor - 128MB) - http://www.pc-express.com.au (VIC) ($386.10)
Radeon 9800 Pro (Powercolor - 128MB) - http://www.i-tech.com.au (NSW) ($379) ***NEW***
Radeon 9800 Pro (Powercolor - 128MB) - http://www.cpl.net.au (VIC) ($465)
Radeon 9800 Pro (Powercolor - 128MB) - http://www.bluefiretechnology.com (VIC) ($455) *Delivery Only*
Radeon 9800 Pro (GeCube - 128MB) - http://www.ticomputers.com.au (NSW) ($428)
Radeon 9800 (GeCube - 128MB) - http://www.below-0.com.au (QLD) ($429)

So they're not Asus or Hercules or some big brand like those, but the performance differences are extremely small. In some cases the GeCube cards have proven to be quicker, and they take to overclocking quite well. There are heaps of benchmarks of these cards on the net. I see no reason to pay an extra $100-$200 to get a Sapphire or another big brand. Quality isn't an issue with the cheaper cards either; we sell plenty of GeCube 9600-9800s at work and never get any returns.

This is quite a good link that's kept up to date with the latest video card prices around Oz.
[url]http://www.atomicmpc.com.au/forums.asp?s=2&c=7&t=153[/url]

Submitted by Kalescent on Sat, 17/04/04 - 6:36 PM

Why not show the stats at 1280x1024x32 and 1024x768x32? Those would be the most frequently used resolutions, I'd say, and I think the differences there would mean almost nil to the human eye, in terms of frame rate anyway. I think prettier images and more texture layers / pixel shader capability etc. are awesome, but I honestly don't see the point in chasing higher and higher resolutions plus hundreds of frames per second when the most the human eye can see is around 60-75 fps.

I remember way back when the TNT2 Ultra came out and it was pushing something like 50-60 fps in Quake at 800x600 / 1024x768, which made it easily playable. Imagine what some of today's graphics cards will run it at... like 500+ fps, given that Quake is obviously pretty old tech nowadays. Still...

GeForce 15! WOW! 89237487 fps, 15GB onboard RAM, only $4995!! WOOoo. Now if only I could get a 42" monitor that runs at 16384x12288 @ 100Hz so I can see some of the goodness! [:D] Or perhaps a game that actually uses 128 textures per pixel!!

/whinge

Submitted by Blitz on Sun, 18/04/04 - 5:37 AM

Except that Quake is such old tech, it gets very little (if any) performance gain running on newer GFX cards :) It's mainly your processor speed that's gonna improve your Quake frames. :)
CYer, Blitz

Submitted by Aven on Sun, 18/04/04 - 8:38 AM

The main thing is that very few benchmarks show the minimum fps in games. I don't care if a card can display HL2 at 1600 at an 80+ fps average. I want to know how low the minimums are and how frequently it hits them. The most annoying aspect is playing a game and having it all of a sudden crawl to a 5 fps lull when you turn to a certain angle :(

Submitted by MoonUnit on Mon, 19/04/04 - 12:47 AM

Ah, now I understand the 400-ish price tag: most of them are PowerColors (which apparently have a habit of breaking).

Submitted by Maitrek on Mon, 19/04/04 - 6:47 AM

Actually, the human eye can only pick up about 25 frames per second. The reason 25fps looks so bad on a computer is that when we see something real, we're actually seeing the total light over a 1/25th of a second period, meaning we pick up motion blur and various crap like that. If you pause a video, you'll notice that a single frame is *heaps* blurry, because a video camera picks up light over a reasonably comparable time frame (depending on the camera's shutter speed), meaning it also picks up artefacts like motion blur. However, a computer renders something in its instantaneous state (more often than not), so it only shows the image in an infinitesimally small time frame.

We can't tell the difference between 70fps and 170fps because the screen doesn't refresh much quicker than 70Hz (70 times per second). This doesn't mean the human eye sees 70fps: 70fps on a 170Hz screen would look noticeably 'fake' next to 170fps running on the same screen. That doesn't mean the human eye can pick up 170fps either; it just means more frames are picked up by the eye over a 1/25th of a second period, so the brain receives a smoother, more realistic image.
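Here's a toy demo of the difference (made-up numbers, nothing from the linked reviews): average a handful of instantaneous renders across a 1/25th-second 'shutter' and you get the smeared frame a camera or the eye would record, instead of the razor-sharp one a GPU normally shows.

[code]
import numpy as np

WIDTH = 64  # one scanline stands in for the whole screen

def render_instant(t):
    """Instantaneous render: a razor-sharp 4-pixel box at time t."""
    frame = np.zeros(WIDTH)
    x = int(200 * t) % (WIDTH - 4)  # box moving at 200 px/s
    frame[x:x + 4] = 1.0
    return frame

def render_integrated(t, shutter=1 / 25, samples=8):
    """Average several instants across the shutter interval -> motion blur."""
    return np.mean(
        [render_instant(t + i * shutter / samples) for i in range(samples)],
        axis=0,
    )

print("lit pixels, instant:   ", np.count_nonzero(render_instant(0.1)))     # 4
print("lit pixels, integrated:", np.count_nonzero(render_integrated(0.1)))  # ~11, smeared
[/code]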

Submitted by Kalescent on Mon, 19/04/04 - 10:13 AM

Sorry, yeah, you're right Maitrek. I understand that; I just got caught up in the whole big wow over 300+ frames per second video cards...

Submitted by Maitrek on Mon, 19/04/04 - 10:35 PM

I just felt like making a Public Service Announcement :)

I'm surprised that they (the tech guys) aren't doing more to 'sacrifice' frame rate and improve image 'believability' with modern graphics cards. I know 3dfx attempted this with its Voodoo 5 graphics cards, but they were mistimed. Now that we have ridiculous shaders and very believable 'still' rendering, we should really move on to making things look good in motion.
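
To put the trade in numbers (a toy cost model, figures invented): the Voodoo 5's T-buffer accumulated several time-offset sub-frames per displayed frame, so whatever raw frame rate the card could manage got divided by the sub-frame count.

[code]
# Toy cost model: accumulating N time-offset sub-frames per displayed
# frame buys motion blur but divides the raw frame rate by N.

def blurred_fps(raw_fps, subframes):
    return raw_fps / subframes

for n in (1, 2, 4, 8):
    print(f"{n} sub-frame(s): {blurred_fps(300, n):5.1f} fps displayed")
# A card pushing 300 'instant' fps could instead show 37.5 fps of
# 8-sample motion-blurred frames.
[/code]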

Submitted by MoonUnit on Thu, 22/04/04 - 6:05 AM

Question: is a "GeForce FX 5900 w/VIVO (Leadtek - 128MB)" for 400 bucks a good deal? From the previous graph it looks like that card might be better than a 9800 XT (which are like 650 minimum).

Submitted by Aven on Thu, 22/04/04 - 6:25 AM

My friend bought his 256MB 9800 XT at the local computer fair for $600. I can't remember the actual brand name, but he hasn't had any problems so far (after 6 weeks).

Submitted by Blitz on Thu, 22/04/04 - 9:12 AM

I think I read somewhere that the GFFX cards are faster at anti-aliasing than the Radeon 9800s, but other than that the 9800s tend to be faster...
Maitrek, I'm interested in what you mean by sacrificing frame rate for believability... particularly with the programmable graphics pipeline now, higher frame rates mean being able to squeeze in more shader instructions, which, if all goes well, == more realistic graphical/lighting effects...
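To put rough numbers on that (all assumed, just to show the scaling; an NV40-ish 400 MHz core, 16 pipes, ~1 instruction per pipe per clock): the per-pixel instruction budget is whatever cycles are left over after hitting your target frame rate.

[code]
# Rough per-pixel shader budget vs. target frame rate. All figures are
# assumptions for illustration, not measured numbers.

core_clock_hz = 400e6
pipelines = 16
width, height, overdraw = 1024, 768, 2.0  # ~2x average overdraw assumed

def cycles_per_pixel(fps):
    pixels_per_sec = width * height * overdraw * fps
    return core_clock_hz * pipelines / pixels_per_sec

for fps in (30, 60, 120):
    print(f"{fps:3d} fps -> ~{cycles_per_pixel(fps):.0f} shader cycles/pixel")
# Halving the target frame rate doubles the instruction budget per pixel.
[/code]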
CYer, Blitz

Submitted by TheBigJ on Thu, 22/04/04 - 7:19 PM

There's very little speed difference between the 5900 and 9800XT. From what I can remember, the 5900 has a faster memory clock and higher fill rate.
