
Far Cry 1.2 with Shader Model 3.0

Scott Wasson

AS THE GRAPHICS WARS between ATI and NVIDIA have escalated, one of the most favored weapons of both sides has been performance comparisons in big-name games. We’ve already seen performance previews of Doom III and Half-Life 2, made possible by graphics chip companies, despite the fact that neither game was anywhere near shipping at the time those previews were published.

Today we have something kind of similar, but it involves a shipping game title, the ever-so-sweet shooter Far Cry. The game’s publisher, Ubisoft, has worked with NVIDIA to make the game use the Shader Model 3.0 capabilities built into the GeForce 6800 series graphics cards. The version 1.2 patch for Far Cry should be released soon for all the world to see. NVIDIA supplied us with an early copy of the patch a few days back so we could test Far Cry performance using Shader Model 3.0. Read on to see how the game performs with the new patch, and how the GeForce 6800 series now compares to the Radeon X800 lineup.

To recap…
When we last tested Far Cry on these cards, in our Radeon X800 review, it wasn’t pretty for NVIDIA. The Radeon X800s cleaned up. Here are the results we published then, using version 1.1 of the game.

Ow. The Radeon X800 XT PE walloped the GeForce 6800 Ultra, and the Radeon X800 Pro beat it, too. Only the overclocked “extreme” GeForce 6800 Ultra beat the 12-pipe Radeon X800 Pro. (Those numbers were generated with 4X antialiasing and anisotropic filtering.)

Since Far Cry is one of the few games with really excellent graphics capable of pushing the latest graphics cards to their limits, this performance was one of the reasons we gave the Radeon X800 a slight edge over the GeForce 6800 overall in our review.

Shader Model 3.0 meets Far Cry
Since then, NVIDIA and Ubisoft have worked to implement support for Shader Model 3.0 in the new version of Far Cry. Shader Model 3.0 is a new part of the DirectX 9 specification intended to encompass the capabilities of the GeForce 6800 GPU. ATI’s current graphics chips support Shader Model 2.0, which is effectively a subset of Shader Model 3.0.

SM3.0 includes a number of enhancements to both pixel shaders and vertex shaders. On the pixel shader side, 3.0 allows an expanded graphics programming model with much longer shader programs, plus dynamic branching and looping with conditionals. SM3.0 also requires 32 bits of precision per color channel, up from the maximum of 24 bits available in SM2.0 and in current ATI pixel shaders. On the vertex shader side, Shader Model 3.0 enables vertex texture fetch, a feature useful in creating certain types of special effects, such as displacement mapping. Also, Microsoft has slipped support for some other new features into SM3.0, including geometry instancing, which allows for more efficient organization and transfer of geometry data to the graphics card.
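To make the hardware requirement concrete, here is a minimal sketch of how a Direct3D 9 application can detect these shader models at startup, using the standard device caps query. This is an illustration, not Far Cry’s actual code (you’d link against d3d9.lib to build it):

    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        // Create the Direct3D 9 object and query the HAL device's capabilities.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return 1;

        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // A game can pick its rendering path from these version caps:
        // 3.0-class pixel and vertex shaders mean an SM3.0 path is usable.
        if (caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
            caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
            std::printf("Shader Model 3.0 path available\n");
        else
            std::printf("Falling back to the Shader Model 2.0 path\n");

        d3d->Release();
        return 0;
    }

A game that finds 3.0-class caps can then load its SM3.0 shaders; otherwise it falls back to the 2.0 path, which is presumably what the patched Far Cry does on Radeon hardware.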

To learn more about Shader Model 3.0, let me suggest you read our dueling interviews with NVIDIA’s Tony Tamasi and ATI’s David Nalasco. Both of the interviews include discussions that go into some depth about the relative merits of Shader Models 2.0 and 3.0.

By incorporating Shader Model 3.0 support into Far Cry, NVIDIA and Ubisoft obviously intend to showcase the performance gains possible with the new shader model and the advanced graphics capabilities of CryTek’s game engine. NVIDIA’s presentation on the patch points out four specific levels of the game where SM3.0 enhancements make a difference. The “Training” and “Regulator” levels have lots of grass and foliage in them, and the game now uses geometry instancing to transfer vertex data for these models. In the “Volcano” and “Research” levels, per-pixel lighting is heavily used. With SM3.0, this lighting routine can be handled in a single rendering pass, potentially boosting performance.
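For illustration, here is roughly what geometry instancing looks like at the Direct3D 9 API level. The helper below is hypothetical rather than CryTek’s code; it draws many copies of a single mesh (a clump of grass, say) using per-instance data in a second vertex stream and a single draw call:

    #include <d3d9.h>

    // Hypothetical helper: draw numInstances copies of one indexed mesh in a
    // single call using D3D9 geometry instancing.
    void DrawInstanced(IDirect3DDevice9* dev,
                       IDirect3DVertexBuffer9* meshVB,    UINT vertexStride,
                       IDirect3DVertexBuffer9* perInstVB, UINT instanceStride,
                       IDirect3DIndexBuffer9* ib,
                       UINT vertexCount, UINT triCount, UINT numInstances)
    {
        // Stream 0: the shared mesh geometry, repeated once per instance.
        dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
        dev->SetStreamSource(0, meshVB, 0, vertexStride);

        // Stream 1: per-instance data (e.g., a world position per grass clump).
        dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
        dev->SetStreamSource(1, perInstVB, 0, instanceStride);

        dev->SetIndices(ib);

        // One draw call submits every instance instead of numInstances calls.
        dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                  vertexCount, 0, triCount);

        // Restore non-instanced stream frequencies afterwards.
        dev->SetStreamSourceFreq(0, 1);
        dev->SetStreamSourceFreq(1, 1);
    }

The win comes from replacing hundreds of small draw calls with one, cutting CPU and driver overhead—exactly the sort of savings the grass-heavy Training and Regulator levels stand to gain.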

I would like to share more with you about what exactly Far Cry‘s developers did to make the game take advantage of Shader Model 3.0. In fact, I fired off a round of questions about just that, but I don’t have the answers back yet. I will update this article or write a separate one soon when I get some answers, so keep checking back here. I expect to know more later today.

Now, let’s put these optimizations to the test…

 

Our testing methods
To test Far Cry with Shader Model 3.0, we set up a test system with Windows XP Pro and the RC2 version of Service Pack 2. We needed the Service Pack because it includes DirectX 9.0c, the new version of DirectX that supports Shader Model 3.0. We also received from NVIDIA a new version of its 61.45 graphics drivers with SM3.0 support enabled.

Version 1.2 of Far Cry will apparently come with four built-in demos for benchmarking. Those demos take place on the four levels mentioned in the NVIDIA presentation. Rather than use those pre-recorded demos, however, we elected to record five of our own—one on each of the four levels NVIDIA mentioned, and one on the “Control” level. The demos are downloadable via a link below.

Although Far Cry does include a timedemo benchmarking function, it’s far from ideal. The game doesn’t record user interactions with the environment, so playback varies from the sequence originally recorded. Also, the movement of the game’s bad guys from one run to the next adds some variance to the scores. As ever, though, we did our best to deliver clean benchmark numbers. Tests were run at least three times; the score from the first run was discarded, and the scores from the second and third runs were averaged. Obvious outliers were discarded, and tests were re-run as needed.
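In concrete terms, the scoring rule works like the sketch below, with hypothetical frame rates standing in for real results:

    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Discard the warm-up (first) run, then average the remaining runs.
    double BenchmarkScore(const std::vector<double>& runs)
    {
        if (runs.size() < 3)
            return 0.0;  // need at least three runs for a valid score
        double sum = std::accumulate(runs.begin() + 1, runs.end(), 0.0);
        return sum / static_cast<double>(runs.size() - 1);
    }

    int main()
    {
        // Hypothetical average frame rates from three timedemo passes.
        std::vector<double> runs = {58.1, 61.4, 61.0};
        std::printf("Reported score: %.1f fps\n", BenchmarkScore(runs));
        return 0;
    }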

We set the game’s image quality options as high as possible, including “Very High” for every option except water quality, which we set to “Ultra High.” We tested without antialiasing or anisotropic filtering, and then we tested with 4X AA and 8X anisotropic filtering.

Both ATI and NVIDIA cards were left at their driver default settings for image quality, with the exception that we turned off vertical refresh sync on all cards.

Our test system was configured like so:

Processor                 Athlon 64 3800+ 2.4GHz
System bus                HT 16-bit/800MHz downstream
                          HT 16-bit/800MHz upstream
Motherboard               Asus A8V
BIOS revision             1006
North bridge              K8T800 Pro
South bridge              VT8237
Chipset drivers           4-in-1 v4.51
                          ATA 5.1.2600.220
Memory size               1GB (2 DIMMs)
Memory type               Corsair TwinX XMS3200LL DDR SDRAM at 400MHz
CAS latency (CL)          2
Cycle time (tRAS)         6
RAS to CAS delay (tRCD)   3
RAS precharge (tRP)       2
Hard drive                Seagate Barracuda V ATA/100 120GB
Audio                     Integrated
Graphics                  Radeon X800 Pro 256MB AGP with CATALYST 4.6 drivers
                          Radeon X800 XT 256MB AGP with CATALYST 4.6 drivers
                          GeForce 6800 GT 256MB AGP with 61.45 drivers
                          GeForce 6800 Ultra 256MB AGP with 61.45 drivers
OS                        Microsoft Windows XP Professional
OS updates                Service Pack 2 RC2, DirectX 9.0c

Thanks to Corsair for providing us with memory for our testing. If you’re looking to tweak out your system to the max and maybe overclock it a little, Corsair’s RAM is definitely worth considering.

The test system’s Windows desktop was set at 1152×864 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

Our test application was Far Cry version 1.2, the new patch, running in both Shader Model 2.0 and Shader Model 3.0 modes.

If you have questions about our methods, hit our forums to talk with us about them.

 

Regulator

Research

The GeForce 6800 cards get a boost in both of these levels from the switch to Shader Model 3.0. In Regulator, without AA or aniso, the GeForce 6800 GT outruns the Radeon X800 XT Platinum Edition, even at 1600×1200 resolution. Once edge AA and texture filtering enter the picture, though, things tighten up.

In Research, the results are more dramatic. With Shader Model 2.0, the GeForce 6800 GT can’t keep up with the Radeon X800 Pro when AA and aniso are at work. Switching to SM3.0 changes the outcome, giving the 6800 GT the lead. The Radeon X800 XT PE still leads the entire pack, however.

 

Training

Volcano

We don’t see much difference at all between shader models in Training, but Volcano’s another story. There, the GeForce 6800 Ultra just ties with the Radeon X800 XT PE—until SM3.0 enters the mix. Then, the 6800 Ultra stands alone.

 

Control

Like Training, the Control level doesn’t show much difference at all between Shader Models 2.0 and 3.0, and once again, the ATI and NVIDIA cards are tightly bunched together.

 
Conclusions
The new Far Cry patch does indeed seem to be a decent showcase for Shader Model 3.0’s potential. The GeForce 6800 cards gain up to about 10% in average frame rates when using the SM3.0 code path. No, the differences aren’t going to convert the masses into NVIDIA fanboys overnight, but they do show NVIDIA wasn’t kidding about Shader Model 3.0 offering some advantages.

This patch also seems to have cleaned up some problems with the GeForce 6800 code path in the game. Even with Shader Model 2.0, GeForce 6800 performance is way up. Image quality also seems to be quite a bit better than it was on the GeForce 6800 with Far Cry version 1.1. My eye is accustomed to seeing the game run on a Radeon 9800, and the GeForce 6800 cards looked just fine to me with the new patch. I played through a few levels of the game, including some with lots of pixel shader-laden effects on the walls and floors, and I didn’t notice any corner cutting or blocky shading or texturing. Of course, some of these changes may be attributable to newer NVIDIA drivers, but the effect is the same.


Image quality looks good to me

So what does all of this tell us about the eternal question: “Should I fork over my cash for a GeForce 6800 or a Radeon X800?” I’m not sure, exactly. PC games seem to be approaching the way the console world works, where publishers cut deals to publish exclusive or enhanced games for a given platform. In this case, Ubisoft worked with NVIDIA to make one of the best games of the past six months run smoothly on the GeForce 6800. That’s spectacular, especially because the game still runs very well on Radeon cards.

One thing we do know is that the GeForce 6800 seems to have some built-in headroom for performance gains. We’ll further explore the performance of the new ATI and NVIDIA cards with the latest drivers soon—just as soon as we’re done playing through Far Cry one more time. 
