  • Overclocking Gigabyte GV-N56GOC-1GI graphics cards in SLI configuration?

    Hi Everyone,

    I have two Gigabyte GV-N56GOC-1GI graphics cards (GeForce 500 Series, PCI Express) in an SLI configuration, and a few questions for the forum community:

    1). What is a safe operating temperature for these cards? Typically they run between 42 and 50 degrees Celsius when not overclocked. However, since I started experimenting with overclocking they have been sitting between 55 and 60 degrees C.


    2). GPU-to-memory-bus ratio: I divided the memory bus frequency by the GPU frequency (giving approximately 4.2). When overclocking, I simply pick a new GPU frequency (say +100MHz over default) and calculate a new memory bus frequency from that ratio. Is this a good approach to overclocking these cards?
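For what it's worth, the ratio bookkeeping above is easy to script. This is only a sketch: the stock core clock below is a placeholder, not the actual GV-N56GOC-1GI spec, and the ~4.2 ratio is just the figure from my measurement.

```python
# Sketch of the ratio approach described above. The stock core clock
# is a placeholder, NOT the GV-N56GOC-1GI's actual spec; the ~4.2
# memory:GPU ratio is the one mentioned in the question.
gpu_stock_mhz = 810.0                  # placeholder stock core clock
mem_stock_mhz = gpu_stock_mhz * 4.2    # implied by the ~4.2 ratio

ratio = mem_stock_mhz / gpu_stock_mhz  # memory bus / GPU frequency

def scaled_memory_clock(new_gpu_mhz):
    """Memory clock that preserves the stock memory:GPU ratio."""
    return new_gpu_mhz * ratio

# A +100 MHz core overclock would then imply:
print(f"{scaled_memory_clock(gpu_stock_mhz + 100):.0f} MHz memory")
```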


    3). I have been using Gigabyte's OC Guru II software, but I am wondering whether I can use MSI Afterburner with my cards instead?


    Any help here will be much appreciated.


    Kind Regards,


    David
    Last edited by Mr Davo; 05-06-2014, 09:35 PM.

  • #2
    Re: Overclocking Gigabyte GV-N56GOC-1GI graphics cards in SLI configuration?

    Do a web search using: "gtx 560" overclock guide (with the quotes), watch the related YouTube videos, and read several of the overclocking guides. If you've already read some of the detailed web reviews of your video card, you should have a pretty good idea how high you can probably overclock your GPU and memory speeds. Below-average cards will overclock somewhat lower, and above-average cards can probably go higher.

    The o/c settings in the GTX 560 Ti reviews, guides or videos won't directly apply to your video card.

    My basic overclocking procedure is to increase the GPU speed in small increments and stress test each setting. If you get screen artifacts, a video driver crash, or a system error, increase the GPU voltage by one click and repeat the stress test. With my video cards, MSI Afterburner enforced a safe fixed maximum voltage.

    If AfterBurner will let you increase gpu and memory voltages, you're good to go.

    You can download MSI AfterBurner and its user manual at MSI Afterburner
    Avoid the latest beta versions, because the betas all have a final expiration date.

    note: I've never heard of using a "GPU to Memory Bus ratio" for video card overclocking.

    Overclocking memory is done differently, because when your GPU drivers/software detect memory errors, the memory is automatically down-clocked and you won't notice anything unless you look closely at stress-test benchmark scores. Once you notice a consistently lower score, lower your memory speed back to the previous highest stable setting and retest to make sure that slightly lower memory setting is still stable.
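That score-watching step can be roughed out in code. A hedged sketch only: the benchmark scores and the 2% threshold are made-up illustrations, not values from any real tool.

```python
def memory_oc_downclocked(baseline_scores, new_scores, tolerance=0.02):
    """Return True if benchmark scores dropped consistently, hinting
    that the driver silently down-clocked the overclocked memory."""
    baseline = sum(baseline_scores) / len(baseline_scores)
    new = sum(new_scores) / len(new_scores)
    return new < baseline * (1 - tolerance)

# A ~4% consistent drop after raising the memory clock:
print(memory_oc_downclocked([1500, 1510, 1495], [1440, 1445, 1438]))
```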

    With my GTX 460 video cards, my gpu speeds went from 790MHz (stock) to about 850MHz using stock gpu voltage settings. Going higher required increasing gpu voltage slightly until I maxed out the gpu voltage at ~ 915MHz. Many of my stress tests were stable up to 925MHz, but the Unigine Heaven benchmark often locked up or crashed above 915MHz.

    I found that increasing memory speeds had much less effect than increasing gpu speeds. YMMV

    My overclocked temps were about 30C at idle and usually in the 50C to low-60C range when running video benchmarks. My case has excellent airflow and cooling, with 20C ambient temperatures.

    Here is the basic procedure that I used to overclock my GTX 460 SLI video cards:
    load your highest stable gpu profile
    :START
    increase gpu speed by 25MHz
    apply the new gpu speed setting
    :TEST
    test for artifacts by running FURMARK or KOMBUSTOR for 15 - 20 minutes
    are there any screen artifacts?
    -- if there are artifacts, goto :HELL
    save your current stable gpu settings as a profile
    goto :START

    :HELL
    increase gpu voltage by 1 tick
    could you increase the gpu voltage?
    -- if you can't increase the gpu voltage any higher, goto :DONE
    goto :TEST

    :DONE
    note: you've maxed out your gpu speed and your most recently saved gpu profile is the best that you can do.

    It's been a long day and I'm not very proud of my gpu overclocking DO-loop.
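If it helps, the same loop translates to Python. The stress-test hook and the voltage/speed numbers below are stand-ins for whatever your overclocking tool exposes, not a real API.

```python
def find_max_stable_oc(gpu_mhz, volt_step, max_volt_ticks,
                       run_stress_test, step_mhz=25):
    """Raise the core clock in step_mhz increments; on artifacts, bump
    the voltage one tick and retest. Returns the last stable clock."""
    best_stable = gpu_mhz
    ticks = 0
    while True:
        gpu_mhz += step_mhz
        # retest at this speed, adding voltage ticks until stable
        while not run_stress_test(gpu_mhz, ticks * volt_step):
            if ticks >= max_volt_ticks:
                return best_stable        # voltage maxed out: :DONE
            ticks += 1                    # :HELL -- one more tick
        best_stable = gpu_mhz             # save the stable "profile"

# Toy stand-in: stable to 900 MHz at stock volts, 925 MHz with one
# voltage tick, roughly like my GTX 460 numbers above. Illustrative only.
def fake_stress_test(mhz, extra_volts):
    return mhz <= 900 + extra_volts * 1000

print(find_max_stable_oc(790, 0.025, 1, fake_stress_test))  # -> 915
```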
    Q9650 @ 4.10GHz [9x456MHz]
    P35-DS4 [rev: 2.0] ~ Bios: F14
    4x2GB OCZ Reaper PC2-8500 1094MHz @5-5-5-15
    MSI N460GTX Hawk Talon Attack (1GB) video card <---- SLI ---->
    Seasonic SS-660XP2 80 Plus Platinum psu (660w)
    WD Caviar Black WD6401AALS 640GB (data)
    Samsung 840 Pro 256GB SSD (boot)
    SLI @ 16/4 works when running HyperSLI
    Cooler Master 120XL Seidon push/pull AIO cpu water cooling
    Cooler Master HAF XB computer case (RC-902XB-KKN1)
    Asus VH242H 24" monitor [1920x1080]
    MSI N460GTX Hawk (1GB) video card
    Logitech Z-5500 Digital 5.1 Speakers
    win7 x64 sp1 Home Premium
    HT|Omega Claro plus+ sound card
    CyberPower CP1500PFCLCD UPS
    E6300 (R0) @ 3.504GHz [8x438MHz] ~~ P35-DS3L [rev: 1.0] ~ Bios: F9 ~~ 4x2GB Kingston HyperX T1 PC2-8500, 876MHz @4-4-4-10
    Seasonic X650 80+ gold psu (650w) ~~ Xigmatek Balder HDT 1283 cpu cooler ~~ Cooler Master CM 690 case (RC-690-KKN1-GP)
    Samsung 830 128GB SSD MZ-7PC128B/WW (boot) ~~ WD Caviar Black WD6401AALS 640GB (data) ~~ ZM-MFC2 fan controller
    HT|Omega Striker 7.1 sound card ~~ Asus VH242H monitor [1920x1080] ~~ Logitech Z-5500 Digital 5.1 Speakers
    win7 x64 sp1 Home Premium ~~ CyberPower CP1500PFCLCD U.P.S



    • #3
      Re: Overclocking Gigabyte GV-N56GOC-1GI graphics cards in SLI configuration?

      Hi profJim,

      Thanks very much for your detailed response. I will take everything on board, and I look forward to trying your overclocking technique.

      Kind Regards,

      Davo
