General Discussions

The Myth behind the GPU Memory Clock..
    DABhand posted on Apr 04, 2011 3:31:53 PM
     
    I have had people ask me to explain why their memory clock isn't showing what they expect in different monitoring tools, etc.

    Here I will explain why what you are seeing is in fact correct.

    This applies to ATI 4700-series and newer and GeForce 400-series and newer cards. When you go to buy one, perhaps online, the listing will say for example that the GeForce 460 has a memory clock of 3400 MHz, but when you buy it and run some sort of application to check the speeds, it shows 1700 MHz.

    People ask

    Is this right, is my GPU perhaps damaged?
    Is my motherboard not good enough for the full speed?

    The answer to both is no. Here is the reason why.

    As you know with RAM, DDR stands for Double Data Rate. So DDR2-800 is in fact running at a 400 MHz clock, but because it transfers data twice per clock cycle that figure is doubled. The same applies to the GDDR3 memory on GPUs before the 4700- and 400-series cards.
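
    To put that arithmetic in one place, here is a minimal Python sketch of the idea; the function name and the example numbers are just for illustration:

        # Effective transfer rate = actual clock x transfers per clock cycle.
        def effective_rate_mhz(actual_clock_mhz, transfers_per_clock):
            return actual_clock_mhz * transfers_per_clock

        # DDR2-800: a 400 MHz clock moving data twice per cycle.
        print(effective_rate_mhz(400, 2))   # 800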

    The newer cards use GDDR5, which effectively behaves as Quad Data Rate (QDR) rather than Double Data Rate (DDR), even though it is still named GDDR5. Yes, I know it seems confusing, but it's just the silly way things get reported, like ISPs using Mb (megabits) instead of MB (megabytes) to describe connection speeds.

    Although it is listed as 3400 MHz, in reality the actual memory clock is 850 MHz; the data clock runs at double that, 1700 MHz, which is what most tools report, and because data moves four times per actual clock cycle you end up at the advertised 3400 MHz effective rate. It's a bit like Intel's Hyper-Threading idea, where a hex-core i7 also presents 6 virtual cores for a total of 12.
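
    Using the same arithmetic, here is a quick sketch of how the example numbers above line up (850, 1700 and 3400 MHz):

        # GDDR5 example from above: 850 MHz actual memory clock.
        actual_clock_mhz = 850

        # What many monitoring tools report: the data clock at double the actual clock.
        reported_mhz = actual_clock_mhz * 2    # 1700

        # What the product listing quotes: four transfers per actual clock cycle.
        effective_mhz = actual_clock_mhz * 4   # 3400

        print(reported_mhz, effective_mhz)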

    So hopefully that clears up the myth a bit and shows you're not being fobbed off :P
    Oh, and don't forget some tuts on ASM and defeating DMA

    Clicky Here for them