nVidia announced the GeForce 6 (NV40) today and in case anyone hasn't seen it yet I thought I'd post some links. By the way, this thing is PHAT!
[url]http://www.nvidia.com/object/IO_12687.html[/url]
[url]http://www.anandtech.com/video/showdoc.html?i=2023[/url]
[url]http://www.tomshardware.com/graphic/20040414/index.html[/url]
[url]http://www.nzone.com/object/nzone_nalu_home.html[/url]
nVidia is sticking with the scantily-clad female mascot. This time it's a mermaid named Nalu. Woo!
I imagine that ATI have a similarly powerful card in the pipeline with the R423 (PCI Express). Graphics tech is just getting ridiculous; most programmers/developers can't keep up with it. Look at Far Cry, for instance: it already runs bloody smooth on a 9800XT, yet in six months' time or less it'll look 'dated' compared to what's possible with next-gen graphics tech.
I imagine the NV40 is going to cost a mother lode too. I mean, 512 registers, 16 pipelines or something like that? We're talking gazillions of transistors here, which is going to cost $$$.
Bah, if you absolutely must stay up with the latest graphics cards as soon as they hit the shelves then kudos to you, but just hold off a couple of months after release and they'll pretty much lose 15-40% of the original release price.
A couple of months... or a couple of hundred bucks... meh. Following Maitrek's comments - if you've got a GeForce 4/FX or Radeon 9800 or thereabouts you probably won't notice MUCH difference at all when playing games, as the graphics tech is way, way ahead.
I know you people are probably interested to see this, so:
Graphics Core: 256-bit
Memory Interface: 256-bit
Memory Bandwidth: 35.2 GB/sec
Fill Rate: 6.4 billion texels/sec
Vertices per Second: 600 million
Memory Data Rate: 1100 MHz
Pixels per Clock (peak): 16
Textures per Pixel*: 16
RAMDACs: 400 MHz
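Those numbers actually hang together if you do the arithmetic. Quick sanity check below - the 400 MHz core clock is my assumption (that's the figure being thrown around for the NV40), the rest comes straight from the spec list:

[code]
#include <cstdio>

int main() {
    // Numbers from the spec list above; the core clock is an assumption.
    const double coreClockMHz   = 400.0;   // assumed NV40 core clock
    const double memDataRateMHz = 1100.0;  // "Memory Data Rate"
    const double busWidthBits   = 256.0;   // "Memory Interface"
    const int    pixelsPerClock = 16;      // "Pixels per Clock (peak)"

    // Memory bandwidth = data rate * bus width (converted to bytes)
    double bandwidthGBs = memDataRateMHz * 1e6 * (busWidthBits / 8.0) / 1e9;

    // Fill rate = pixel pipelines * core clock
    double fillRateGTexels = pixelsPerClock * coreClockMHz * 1e6 / 1e9;

    printf("Memory bandwidth: %.1f GB/sec (spec says 35.2)\n", bandwidthGBs);
    printf("Fill rate: %.1f billion texels/sec (spec says 6.4)\n", fillRateGTexels);
    return 0;
}
[/code]

So the 6.4 billion texels/sec figure is just 16 pipes ticking over at 400 MHz - nothing magic, just an awful lot of parallel hardware.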
(benchmark chart from FiringSquad's GeForce 6800 Ultra review - the image link is broken, see the page below)
[url]http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/page20.asp[/url]
Makes saving up to buy a Radeon 9600 seem really pointless...
Radeon 9800 Pro (Powercolor - 128MB) - http://www.pc-express.com.au (VIC) ($386.10)
Radeon 9800 Pro (Powercolor - 128MB) - http://www.i-tech.com.au (NSW) ($379) ***NEW***
Radeon 9800 Pro (Powercolor - 128MB) - http://www.cpl.net.au (VIC) ($465)
Radeon 9800 Pro (Powercolor - 128MB) - http://www.bluefiretechnology.com (VIC) ($455) *Delivery Only*
Radeon 9800 Pro (GeCube - 128MB) - http://www.ticomputers.com.au (NSW) ($428)
Radeon 9800 (GeCube - 128MB) - http://www.below-0.com.au (QLD) ($429)
So they're not Asus or Hercules or some big brand like those, but the performance differences are extremely small. In some cases the GeCube cards have proven to be quicker and take to overclocking quite well. There are heaps of benchmarks of these cards on the net. I see no reason to pay an extra $100-$200 to get a Sapphire or other big brand. Quality isn't an issue with the cheaper cards either; we sell plenty of GeCube 9600-9800s at work and never get any returns.
This is quite a good link that is kept up to date with the latest in video card prices around Oz.
[url]http://www.atomicmpc.com.au/forums.asp?s=2&c=7&t=153[/url]
Why not show the stats at 1280 x 1024 x 32 and 1024 x 768 x 32? Those would be the most frequently used resolutions, I'd say, and I think the differences there would mean almost nil to the human eye, in terms of frame rate anyway. I think prettier graphics and more texture layers / pixel shader capability etc. is awesome, but I honestly don't see the point in chasing higher and higher resolutions plus hundreds of frames per second when the most the human eye can see is around 60-75 fps.
I remember way back when the TNT2 Ultra came out and it was pushing something like 50-60 fps in Quake at 800 x 600 / 1024 x 768, which made it easily playable. Imagine what some of today's graphics cards would run it at... like 500+ fps, given that Quake is obviously pretty old tech nowadays.
Still...
GFORCE 15 WOW 89237487 fps 15GB onboard RAM, only $4995!! WOOoo, now if only I could get a 42" monitor that runs at 16384 x 12288 @ 100Hz so I can see some of the goodness! [:D] Or perhaps a game that actually uses 128 textures per pixel!!
/whinge
The main thing is that very few benchmarks show the minimum fps in games. I don't care if a card can display HL2 at 1600 at an 80+ fps average. I want to know what the minimum frame rates are and how frequently it hits them. The most annoying aspect is playing a game and having it all of a sudden crawl to a 5 fps lull when you turn to a certain angle :(
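If anyone ends up writing their own little benchmark harness, reporting the worst frames (not just the average) is pretty easy. Rough sketch of the idea below - the frame times are made-up numbers just so it runs on its own; in a real run you'd record how long each frame actually took:

[code]
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // One entry per frame: how long that frame took, in seconds.
    // Made-up sample values; a real harness would fill this in
    // while the game or timedemo is running.
    std::vector<double> frameTimes =
        {0.012, 0.011, 0.013, 0.200, 0.012, 0.014, 0.180, 0.011};

    const double dipThreshold = 1.0 / 30.0; // slower than 30 fps counts as a dip

    double total = 0.0, worst = 0.0;
    int dips = 0;
    for (double t : frameTimes) {
        total += t;
        worst = std::max(worst, t);   // longest frame = lowest instantaneous fps
        if (t > dipThreshold) ++dips;
    }

    printf("Average fps: %.1f\n", frameTimes.size() / total);
    printf("Minimum fps (worst frame): %.1f\n", 1.0 / worst);
    printf("Frames below 30 fps: %d of %d\n", dips, (int)frameTimes.size());
    return 0;
}
[/code]

A "worst 1% of frames" average would be even better, but a minimum fps plus a dip count already tells you far more than the average ever will.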
Actually the human eye can only pick up about 25 frames per second. The reason 25 fps looks so bad on a computer is that when we see something real we're actually seeing the total light over a 1/25th of a second period, meaning we pick up motion blur and various artefacts like that. If you pause a video, you'll notice that a single frame is *heaps* blurry, because a video camera collects light over a reasonably comparable time frame (depending on the shutter speed), so it also picks up artefacts like motion blur. A computer, however, renders a scene in its instantaneous state (more often than not), so each frame only shows the image over an infinitesimally small time frame.
We can't tell the difference between 70 fps and 170 fps because the screen doesn't refresh much quicker than 70Hz (70 times per second). That doesn't mean the human eye sees 70 fps. 70 fps on a 170Hz screen would look noticeably 'fake' next to 170 fps running on the same screen. That doesn't mean the human eye can pick up 170 fps either; it just means that more frames reach the eye over a 1/25th of a second period, so the brain receives a smoother, more realistic image.
I just felt like making a Public Service Announcement :)
I'm surprised that they (the tech guys) aren't doing more to 'sacrifice' frame rate and improve image 'believability' with modern graphics cards. I know that 3DFX attempted to do this with its Voodoo 5 cards, but they were mistimed - now that we have ridiculous shaders and very believable 'still' rendering, we should really move on to making things look good in motion.
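That's basically what the Voodoo 5's T-buffer was going for - spend your fill rate on rendering several sub-frames per displayed frame and averaging them, so moving objects smear the way they do on film. Very rough toy sketch of the accumulation idea below (the "scene" is just a single bright pixel sliding along a 1D strip - it's an illustration of the averaging, not any real API):

[code]
#include <algorithm>
#include <cstdio>
#include <vector>

const int WIDTH = 16;      // toy 1D "framebuffer"
const int SUBFRAMES = 4;   // sub-frames rendered per displayed frame

// Toy stand-in for a renderer: one bright pixel whose position depends on time.
void renderScene(double time, std::vector<float>& img) {
    std::fill(img.begin(), img.end(), 0.0f);
    int pos = (int)(time * WIDTH) % WIDTH;
    img[pos] = 1.0f;
}

int main() {
    std::vector<float> accum(WIDTH, 0.0f), sub(WIDTH, 0.0f);
    const double frameStart = 0.0, frameLength = 0.25; // time span of one displayed frame

    for (int i = 0; i < SUBFRAMES; ++i) {
        // Sample the scene at evenly spaced instants inside the frame;
        // averaging these smears anything that moved during the frame.
        double t = frameStart + frameLength * (i + 0.5) / SUBFRAMES;
        renderScene(t, sub);
        for (int p = 0; p < WIDTH; ++p)
            accum[p] += sub[p] / SUBFRAMES;
    }

    // The moving pixel ends up spread across several positions: motion blur.
    for (int p = 0; p < WIDTH; ++p)
        printf("%.2f ", accum[p]);
    printf("\n");
    return 0;
}
[/code]

The catch, of course, is that every displayed frame now costs SUBFRAMES renders - which is exactly the frame-rate-for-believability trade, and it's the sort of thing NV40-class fill rates finally start to make affordable.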
I think I read somewhere that the GFFX cards are faster at anti-aliasing than the Radeon 9800s, but other than that the 9800s tend to be faster...
Maitrek, I'm interested in what you mean by sacrificing frame rate for believability... particularly with the programmable graphics pipeline now, higher frame rates mean being able to squeeze in more shader instructions, which, if all goes well, == more realistic graphical/lighting effects...
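To put rough numbers on what I mean (back-of-the-envelope only - I'm assuming one shader instruction per pipeline per clock and a 400 MHz, 16-pipe NV40-class part, both simplifications), frame rate, resolution and shader length all come out of the same cycle budget:

[code]
#include <cstdio>

int main() {
    // Simplifying assumptions: one pixel-shader instruction per pipeline
    // per clock, an assumed NV40-class core clock, and no overdraw.
    const double coreClockHz = 400e6;            // assumed core clock
    const double pipelines   = 16.0;
    const double pixels      = 1280.0 * 1024.0;  // pixels drawn per frame

    const double targets[] = {170.0, 70.0, 30.0}; // candidate frame rates
    for (double fps : targets) {
        double instrPerPixel = (coreClockHz * pipelines) / (pixels * fps);
        printf("%4.0f fps -> roughly %4.0f shader instructions per pixel\n",
               fps, instrPerPixel);
    }
    return 0;
}
[/code]

That's roughly 29 instructions per pixel at 170 fps versus about 160 at 30 fps, so I guess "sacrificing frame rate" and "squeezing in more shader instructions" are really the same budget looked at from opposite ends.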
CYer, Blitz
Oh, another 'Revolutionary' and 'Blazingly Fast' video card.
But with ZBrush 2 coming out, it may be a good idea to think about upgrading my GeForce 3.