View Full Version : Anyone downloaded the new nvidia drivers for Geforce cards?

03-12-2001, 22:38:00
Anyone done it? Anyone seen any improvement? Which files did you download (there are 2 to choose from)?

Would normally download without a second thought, but the first one is 6+ megs and I'm still on a poxy dialup modem.

frustrated :bounce:


03-12-2001, 22:50:13
No but I was going to ask similar questions.

No longer Trippin
03-12-2001, 23:02:59
Nvidia's site is fast. I can pull about 6k on average from it on dialup and CompuServe! I usually get 4 as my fastest average, 5 with a download accelerator... I disabled it like nvidia asks, and I got the fastest download speeds I've ever gotten.
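Those dial-up numbers translate into real waiting time for the 6 meg file. A back-of-the-envelope sketch (the function name is made up for illustration, and it assumes a perfectly steady transfer with no protocol overhead):

```python
def download_minutes(size_mb: float, rate_kb_s: float) -> float:
    """Rough download time in minutes, assuming 1 MB = 1024 KB
    and a constant transfer rate (no overhead, no stalls)."""
    return (size_mb * 1024) / rate_kb_s / 60

# The ~6 MB driver package at the speeds mentioned in this thread:
for rate in (4, 6, 225):  # KB/s: slow dial-up, fast dial-up, cable
    print(f"{rate:>3} KB/s -> {download_minutes(6, rate):.1f} min")
```

So the same file is roughly a 17-26 minute wait on dial-up but under half a minute on cable, which is why the modem users are hesitating.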

04-12-2001, 00:44:11
Me and my cable modem got 225KB a sec so I got the 6 meg file. No noticeable differences so far. I'll let you know if I see anything.

RC said it solved some stability issues he had with The Sims: Hot Date.

04-12-2001, 01:01:03
Define new?
I'm running with the 22.80's right now, some definite performance increases since the 12.40's. You can also enable 8x anisotropic filtering (does wonders in games). Takes a performance hit when enabled though.

Check it: http://www.guru3d.com/detonator-dbase/

(Click next a few times to see benchmarks)

04-12-2001, 01:09:25
Just found the new official ones. :o

Not much different than the ones I already had. Their download site is alright (~250KB/s). Fastest download site I get is download.microsoft.com, where I regularly get 500-900KB/s.

04-12-2001, 09:50:31
Funny, Microshaft is one of the slowest sites from Europe, rarely get much more than 10K even on ADSL.

Resource Consumer
04-12-2001, 10:49:16
I did. Haven't noticed anything startling except that Sims Hot Date now runs for me (hasn't failed to load once). I downloaded version 23.11

Resource Consumer
04-12-2001, 10:49:40
That's the 6 Meg one.

04-12-2001, 11:44:33
What's the difference between the 1 meg file and the 6 meg file? Do you only need the 1 meg file, or is the 6 meg file essential?

Anything to make my pc more stable - since I installed my GF2 my system is about as reliable as the power system of Florida. If there was one good thing about 3dfx cards, it's that they were rock solid stable.

04-12-2001, 12:44:24
I think the six meg one is the right one... if you look closely it says GPU (graphics processing unit or something), whereas the other one has an acronym I don't recognise.

04-12-2001, 21:20:07
The smaller file is for the 'Personal Cinema' thing (has a remote control and a bunch of other useless things). Almost nobody has it.

05-12-2001, 14:39:48
Great, I'll try downloadign teh 6 smeg one at the first opportunigythty.

Whoops, still beered up :gotit:

05-12-2001, 20:20:28
I have the new drivers, upgraded from 21.43 to 23.11, and have noticed no difference in 3D games - everything runs at my monitor's refresh rate anyway, which is around 120fps.

It'll probably make a difference on other cards though (I have a GeForce 3), but nVidia are renowned for making excellent driver updates.

06-12-2001, 08:07:00
I downloaded them yesterday, and haven't really tested them out. I'm just still annoyed that my computer isn't letting me play Worms anymore. It thinks I must want the screen turned off whenever I run it. I wish I could remember how I fixed that last time it happened.

06-12-2001, 09:44:18
The new options for 8x anisotropic filtering in Direct3D and OpenGL make me very happy indeed.

Quincunx + 8x anisotropic simply cannot be beat. :bounce:

06-12-2001, 10:45:01
I'm going to pretend I know what you guys are talking about...

06-12-2001, 13:24:59
It's quite obvious what they're talking about.

06-12-2001, 13:40:05
Basically they are talking about a new feature which makes graphics a whole lot better looking.

06-12-2001, 13:57:30
But if you have something below an Athlon 1.2GHz then don't bother.

07-12-2001, 02:39:46
I've got less than a 1.2GHz (a mere 800MHz, alas).

I can still play games like RTCW in 1152x864x32 with full graphics, plus Quincunx + 8x anisotropic. Max Payne too.

07-12-2001, 09:44:09
I've got a 1.2GHz Athlon

No longer Trippin
07-12-2001, 15:33:56
Processor speed has little to do with it. If you're going for raw frames per second you'll need a strong processor, but the eye can only make out approximately 24 fps anyhow. So it does no good going for anything higher than that. The target fps for Doom 3 on a GeForce 3 is 30 fps, not the insane 75-125 fps people are getting right now on top end cards. Hardware has greatly outpaced the software it runs at the moment.

07-12-2001, 16:25:06
Trippin - I'm sorry but I have to completely disagree with what you've said. The eye can quite easily discern the difference between various framerates above 24fps, and even in this era of 3D cards with "T&L acceleration" you still need a whopping fat load of clock cycles out of your processor to drive everything along.
People use the "24fps is smooth" thing because of cinema. Crap - during camera pans on big screen films it's quite easy to spot the jerkiness. On my own pc, I used to get "only" 27fps in Quake 3 with a Voodoo2 SLI setup. I then upgraded to a GF2 and was getting 40fps, and the whole thing felt so much smoother, less jerky, and easier to aim with.
The only rule in FPS games is the more frames per second, the better. The 2 main ways of getting more are processor speed and graphics card power. There's no escaping it.
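To put numbers on why 27fps vs 40fps feels so different, it helps to look at the time between frames rather than the rate itself. A quick sketch (the function name is invented for illustration):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between screen updates at a given frame rate."""
    return 1000.0 / fps

# The jump described above: 27 fps (Voodoo2 SLI) vs 40 fps (GF2).
# Each frame arrives roughly 12 ms sooner, which reads as "smoother".
for fps in (24, 27, 40, 60):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

At 27fps a new image appears every ~37ms; at 40fps, every 25ms. That shrinking gap between snapshots is the smoothness being argued about here.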

I've only got 350 Mhz :(

07-12-2001, 16:36:58
I've got a 1.6GHz o/c'd Athlon with GeForce 3 and 2 gig of DDR RAM.

Games run quite smooth :)

07-12-2001, 16:47:33
That is so awesome.

09-12-2001, 09:51:41
Trip is confusing CINEMA, in which a mere 24 fps to 32 fps CONTAINS ALL THE BLUR AND MOTION INBETWEEN, with simple refresh rates of computer graphics.

In Computer Graphics, the more often that the screen can be UPDATED, the better. Think of it this way... Each screen is just a single, perfect snapshot. No blur, no motion. To create motion, that screen has to be changed for the next snapshot. If you have trouble figuring out the difference, think of the current low band video phones. They generally run at about 2 to 10 fps, depending on the particulars. VERY jerky, and no 'in between' blurring. (You web camera folks can just take a look at a lo res output from your live camera feed.)

Cinema long ago started 'blurring' their CG and stop motion created effects. The reason for this is that it makes it LOOK RIGHT to us. Especially when mixing the effects with Real Life actors.

Then there is this myth that 24 fps is the threshold for humans. It isn't. First off, for the average joe, it's 25 fps, from my studies in special effects and computer animation. ;) But for SIMPLE things, such as just a color flash (from a strobe, for instance), it's MUCH higher than that. And the high end of the human spectrum can recognize and process complex visual information which is shown for only 1/100 of a second (studies with combat fighter jockeys show the majority tested can recognize a flying aircraft from seeing an image of it at just 1/100th of a second exposure).

Now, if you are just looking to play the current generation of Myst, a little choppiness might be okay. But I would think that you'd want the top of the line for anything where you need to accurately track and hit a moving target... ;) The smoother, the better.

That's just the basics of animation/display techniques, of course. I haven't a clue about the different hardware and specific technologies used these days. Although I could go digging if I needed to, for paying my rent, I suppose. :D

No longer Trippin
09-12-2001, 23:30:52
Hearing your arguments, I have to think that I'm wrong. It sounds like I'm wrong now that I think about it more. I went from having an absolutely shit video card (intel 810/815 chipset - 11 megs video ram and direct AGP, meaning no slot) to tossing the motherboard and getting a GeForce 3. So I really have no experience in the middle ground fps-wise. I just remembered that games used to AIM for 24 fps back in the day. Never really stopped to think that it's probably wrong, since back in the day was Doom.

10-12-2001, 05:43:32
Interesting all round.

So is anyone actually willing to say the new drivers are good?:)

Resource Consumer
10-12-2001, 16:35:17
Well, I don't really notice anything other than that Hot Date runs with them and didn't on the old drivers (a mere 6 months old).

10-12-2001, 17:42:06
Glad to hear the newer drivers are more stable - I've been having awful stability problems with my GF2. Maybe this'll help.

Trip - I think the basic rule of thumb is "more power = more firepower" as far as FPS games go. Also, what one man may consider smooth and be perfectly satisfied with (say, just for argument's sake, 25fps), another man might view as the ultimate in slide-show gameplay, and not be happy unless he's pulling 50fps.

Usually, I think what happens is that you personally become satisfied with the level of performance you get from your pc until you see the very same game running on somebody else's machine much more smoothly, at double the screen res with all the graphics features turned on. Then all of a sudden you go home and think - "my pc is shit, I can only get 25fps"...

That happened to me when I went to see MikeH recently and he'd got his shiny new 1.3GHz Athlon/GF2 monster. Seeing what was meant to be the same game on his machine as on my own (a humble 350MHz P2 with Voodoo2 SLI) was enough to make me go out and buy a GF2!

Computers. Rugh.