Showing posts with label NVIDIA.

Friday, January 16, 2009

EVGA X58 SLI - Take Three

Just last week, NVIDIA announced both the GTX 295 and GTX 285. Today we have availability on both and test results for the GTX 285. As we weren't able to get power tests done in time to include in the GTX 295 review, we also have those results available today.

EVGA was kind enough to provide the hardware for this review. They sent us two GTX 285s for single-card and SLI testing. The cards they provided are factory overclocked, but for this article we underclocked them to stock GTX 285 speeds in order to learn what we can expect from non-overclocked variants.

See the original posting for more details.
Source & Image : http://www.anandtech.com/

Tuesday, January 13, 2009

The Art of Overclocking a 3D Card

By SANDRA PRIOR

By contrast to all that fiddling about with CPUs, 3D card overclocking is incredibly simple. It's all done from within Windows, with no rebooting required, and in Vista you'll even find that the system will recover from an unsuccessful overclock without locking up or bluescreening.

For an NVIDIA board, you want an app called nTune. It's an official NVIDIA tool - grab it from nvidia.com/object/sysutility.html. Then you just need to head over to the NVIDIA Control Panel (there should be an option for it if you right-click on your desktop) and click on 'adjust GPU settings' under 'Performance'. If you select 'Custom clock frequencies' you can alter the core clock and the memory clock. As always, do it in tiny increments (10MHz or so) to identify the exact speed ceiling. There's an option in the NV control panel called 'system stability', and there you can run a looped render test to check that the card can cope with the speed hike.
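The step-and-test approach described above can be sketched as a simple search loop. Note that `set_core_clock` and `passes_render_test` here are hypothetical placeholders for the actions you perform manually in nTune (setting a custom clock, then running the looped render test) - this is an illustration of the strategy, not an actual nTune API:

```python
# Sketch of the incremental overclocking strategy described above.
# set_core_clock() and passes_render_test() are hypothetical stand-ins
# for the manual steps in nTune's control panel.

STEP_MHZ = 10  # small increments, as recommended in the text

def find_ceiling(stock_mhz, set_core_clock, passes_render_test, max_mhz=None):
    """Raise the clock in small steps until the stability test fails,
    then back off one step to the last known-good speed."""
    clock = stock_mhz
    while max_mhz is None or clock + STEP_MHZ <= max_mhz:
        candidate = clock + STEP_MHZ
        set_core_clock(candidate)
        if not passes_render_test():
            set_core_clock(clock)  # revert to the last stable speed
            break
        clock = candidate
    return clock
```

The same loop applies to the memory clock; the point is simply that small steps plus a stability test after each step find the ceiling without risking a large, unrecoverable jump.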

ATI cards are similarly straightforward. Load up Catalyst Control Center, again by right-clicking on the desktop and selecting its name. When prompted, choose 'Advanced' rather than the 'Basic' mode. From the list of settings on the left, you want the bottom one - ATI Overdrive. The best option, whether this is your first or fortieth time, is to click 'Run automated clock configuration utility'. This will test the card's GPU and RAM at various speeds, working out what's safe to run at. It'll take a little while, but once done you should notice that Overdrive has added a few extra MHz. Click 'Apply', then fire up a few games and give them a quick run.

Source : https://www.amazines.com/

Review: Nvidia GeForce GTX 295 Graphics Card

What do you do when your chief competitor's high-end graphics card uses two GPUs and is faster than the high-end card from your lineup? Produce a dual-GPU card of your own, of course! This is the tactic Nvidia has employed with the GeForce GTX 295, based on a 55nm version of the GT200 chip found in their GeForce GTX 260 and 280 graphics cards.

The shrink from 65nm to 55nm makes the new GT200 chip smaller and more affordable to produce, as well as more power efficient. Both are necessary to realistically produce a dual-GPU graphics card; the original 65nm GT200 chip is an enormous 576 mm2 and 1.4 billion transistors, making it very expensive to produce.

The power draw is too much to put two of them on a single card and remain within PCIe specs without dramatically lowering clock speeds. At 55nm, the chip is down to perhaps a little more than 400 mm2, and with dramatically reduced power the company can finally put two of them together into a single graphics card.

Granted, it's still an expensive proposition. We're talking about lots of high-speed GDDR3 memory, two printed circuit boards, and still over 800 mm2 and 2.8 billion transistors worth of silicon. It's way bigger and more expensive to produce than ATI's RV770-based Radeon HD 4870 X2 (which is based on 276 mm2 chips and one PCB).
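A quick back-of-the-envelope check ties the per-die figures above to the dual-GPU totals. The 415 mm² per-die area is an assumed round number standing in for the article's "perhaps a little more than 400 mm²":

```python
# Rough totals for two 55nm GT200 dies, using the article's figures.
die_area_mm2 = 415       # assumed value for "a little more than 400 mm2" per die
transistors_per_die = 1.4e9  # 1.4 billion transistors per die

total_area_mm2 = 2 * die_area_mm2            # "still over 800 mm2"
total_transistors = 2 * transistors_per_die  # 2.8 billion

print(total_area_mm2, total_transistors)
```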

Nvidia is so determined to reclaim the performance crown that they're keeping prices aggressive—we're testing a reference board here today, but we're told the MSRP should be around $500. It's faster than a Radeon HD 4870 X2 in most cases, but how much so? And with price drops from ATI, is it the better deal in high-end graphics?

Source & Image : http://www.extremetech.com