
frame rate drops... anyone got some input

Recommended Posts

I get frame rate drops ONLY in Call of Duty games like World at War and MW2, and I have yet to figure out why. I have an XFX 4890 and set my games to run at 60 FPS at 1920x1080; every other game does it fine without hiccups, but Call of Duty does not. My BIOS is up to date and my card has the latest driver. Any ideas why it happens strictly in Call of Duty? Thanks.


I haven't played COD in quite a while, but IIRC one of the biggest frame rate killers was the sound. Do not use EAX/EAX2, even if your card or motherboard is supposed to support it.

If I get a chance tomorrow, I'll ask some of the players at the V.

They played some earlier tonight.


I would guess there are just a ton of explosions and effects going on that require a lot of processing power.

Also, the engine they used allows for bigger levels, so maybe the card has to use its memory more heavily than most games require?


I don't know; I have the Sapphire Tech version of the 4890 and no hiccups whatsoever.

Even MW2 at the same resolution runs flawlessly.


What are your card's specs?


I thought you were running X-fire.

Do you still have vertical sync enabled?


Read Stormy's link again and disable it. ;)
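
For reference, a minimal sketch of what that looks like in a CoD-engine config file. I'm working from memory here, so treat the dvar names as assumptions and verify them against your own config (the CoD titles use Quake-derived dvars, settable in the in-game console or the .cfg file):

```
// Assumed dvar names for Quake-derived CoD engines; verify for your title.
seta r_vsync "0"        // disable vertical sync
seta com_maxfps "60"    // optional: cap the frame rate at 60
```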




Did not know he already asked..

To be fair, they weren't all about frame rates for COD: WaW; at least one was about frame rates in Crysis. :lol:


But amd-rules, there really isn't anything more anyone can tell you that hasn't already been said.


defrag your hard drive

keep your PC clean of viruses

if you're not happy with the frame rates:

lower the resolution

lower other settings

etc., etc. It has all been said before.
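
For the upkeep items above, a hedged sketch of the equivalent Windows commands, run from an elevated command prompt (switches vary a little between Windows versions, and the Defender path assumes a default install):

```
:: Defragment the system drive (exact switches differ by Windows version).
defrag C:

:: Quick antivirus scan with the Windows Defender command-line tool,
:: assuming the default install location.
"%ProgramFiles%\Windows Defender\MpCmdRun.exe" -Scan -ScanType 1
```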


That card is a great card and will play any game.


All you need to do is stop looking at the frame rate numbers in the corner (switch them off). ;)




Agree with Terry. The human eye supposedly can't tell the difference above about 40 FPS (or is it 45?), so what do you care? Sure, your eyes can detect things up to 60 FPS, but it's really not noticeable. Just turn your head (as long as your monitor is at 60 Hz) and at 35 FPS you can't see the flicker (use your peripheral vision to do this). My 4890 gets 60 easily in Crysis and pretty much all games, and even when it drops to 35 I can't tell the difference, and you shouldn't be able to either.

Edited by lugnut

I am going to have to correct you on that. The frame rate discussion is never-ending: the human eye is supposedly unable to discern anything beyond 60 frames per second, but believe it or not, we can notice and tell the difference between, say, 60 and 150 frames per second.


Oh, I agree now! I just didn't read up on it before I said it. Last I heard (a long time ago) we couldn't see beyond the 30 FPS margin, so it was my mistake.

Obviously we can see more than 30 FPS (I should have used common sense first), as the newer LED TVs are 240 Hz, so I should have taken it from there.

So no need for correction; as I said above, I just read up on it (again) and the old myth is false. My mistake.. :tup:


Lug, 30 FPS is usually the cutoff between the human eye perceiving smooth gameplay vs. choppy gameplay.


Between 30 and 60 our eyes can easily discern the difference between, say, 30 and 35, but at 60+ it looks ultra smooth to the human eye. We can't really tell the difference between 60 and 65; the gap has to be bigger for us to detect it, something like 60 to 120.
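
That diminishing-returns effect is easier to see in frame times than in FPS. A quick back-of-the-envelope calculation (plain Python, nothing game-specific):

```python
def frame_time_ms(fps):
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# The same 5 FPS step shrinks as the base frame rate rises:
low = frame_time_ms(30) - frame_time_ms(35)   # ~4.76 ms saved per frame
high = frame_time_ms(60) - frame_time_ms(65)  # ~1.28 ms saved per frame

print(f"30 -> 35 FPS saves {low:.2f} ms per frame")
print(f"60 -> 65 FPS saves {high:.2f} ms per frame")
```

A 5 FPS gain at 30 changes each frame's duration almost four times as much as the same gain at 60, which is why small steps stop being visible at higher frame rates.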

