Poor Watch Dogs PC Performance? Here's Why And How To Fix It

Over the past few weeks, I've spent a significant amount of time with Watch Dogs. The game is infamous for not running very well on PCs -- and after some legwork, I've figured out a hefty chunk of the reason why: 

Ubisoft royally screwed up its suggested VRAM (graphics card frame buffer) settings.

Watch Dogs sucks down far more video memory than any other modern PC game I'm aware of; its VRAM demands are far, far outside the norm for a 1080p title. In the past, I've compared VRAM usage in games like Guild Wars, Battlefield 4, and Total War: Shogun 2. In those games, average VRAM use with all detail levels maxed out was about 1.5GB -- BF4 will break 2GB if you use the supersampling option to render the game internally at 4K, and Total War: Shogun 2 wants a bit more than 2GB of VRAM if you max out every single graphics setting and enable 8x MSAA.

Ubisoft claims that Watch Dogs' "High" texture detail setting requires a 2GB frame buffer, while its "Ultra" textures need 3GB of frame buffer memory. That might technically be true, but these figures should be treated as minimums, not maximums. Playing through the game with High Textures and "Ultra" details (the two settings are controlled separately), my system was dogged by repeated, jerky slowdowns when running a GeForce GTX 770. Switching to "High" Details improved the situation, but didn't resolve it.



A check of GPU-Z and a swap to a GTX 780 (with 3GB of VRAM) demonstrated why -- High Textures + Ultra Details chews up an easy 2.5GB of GPU RAM even at 1080p. High Textures + High Details still consumes about 2.1GB -- enough to send the GTX 770 into periodic paroxysms and long, stuttering pauses. This is often exacerbated by an Alt-Tab issue with Watch Dogs -- if you're playing this game, you don't want to Alt-Tab. Set the game to Borderless fullscreen mode instead, or you'll see gradual performance degradation every time you switch back and forth between the game and the desktop.
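GPU-Z will show this in its sensors tab; if you'd rather log it yourself on an Nvidia card, a few lines against the NVML bindings will do it. This is just a rough sketch, not a polished tool -- it assumes the third-party pynvml package (nvidia-ml-py) is installed, and the five-second polling interval is an arbitrary choice:

```python
# Rough sketch: log how much VRAM is in use while Watch Dogs runs.
# Assumes an Nvidia GPU and the third-party pynvml bindings
# (pip install nvidia-ml-py); the 5-second polling interval is arbitrary.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"VRAM in use: {mem.used / 1024**2:.0f} MB "
              f"of {mem.total / 1024**2:.0f} MB")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second window, tab into the game, and watch how quickly the number climbs toward your card's limit.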


This demo shows stuttering with a GeForce GTX 770 set to High Textures and Ultra Detail.

Unfortunately, the best way to improve performance in Watch Dogs is to ignore Ubisoft's recommendations altogether and opt for lower detail levels, depending on your configuration and monitor resolution. Our recommendations (summed up in a short code sketch after the list):

1GB of Video Memory
1366x768 Resolution
Medium Textures
Low Detail Settings
FXAA

1-2GB of Video Memory
1920x1080 Resolution
Medium Textures
Medium Detail Settings
FXAA

2GB of Video Memory
1920x1080 Resolution
High Textures
Medium Detail Settings
Temporal SMAA

3GB of Video Memory
1920x1080 Resolution
High Textures
High Detail settings
2x TXAA / 4x MSAA (may require stepping down to Medium Detail settings)
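
To make the tiers above concrete, here's a small, purely illustrative Python helper that maps a card's VRAM to the corresponding preset from the list. The function name and the exact cutoffs are my own, not anything Ubisoft or the game provides:

```python
# Purely illustrative helper that mirrors the tiers listed above; the
# function name and cutoffs are my own, not anything Ubisoft provides.
def suggest_settings(vram_gb: float) -> dict:
    """Map a card's VRAM (in GB) to a suggested Watch Dogs preset."""
    if vram_gb >= 3:
        return {"resolution": "1920x1080", "textures": "High",
                "details": "High", "aa": "2x TXAA / 4x MSAA"}
    if vram_gb >= 2:
        return {"resolution": "1920x1080", "textures": "High",
                "details": "Medium", "aa": "Temporal SMAA"}
    if vram_gb > 1:
        return {"resolution": "1920x1080", "textures": "Medium",
                "details": "Medium", "aa": "FXAA"}
    return {"resolution": "1366x768", "textures": "Medium",
            "details": "Low", "aa": "FXAA"}


# A 2GB GTX 770 lands on High textures, Medium details, and Temporal SMAA.
print(suggest_settings(2))
```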

This means that only users with a Radeon R9 290 or GeForce GTX 780 are going to see Watch Dogs in its full glory.

It's hard to know, at this point, how much of the game's wretched memory management is caused by poor optimization, and how much by Ubisoft's engine design decisions. The impact, however, leaves many graphics cards -- including GPUs that are less than a year old -- staggering under the rendering load. For this reason alone, I'd say Ubisoft miscalculated its own requirements: High Textures ought to have been described as requiring at least 3GB of VRAM, and Ultra Textures more like 4GB. Once you enable anti-aliasing, the game will chug even on a GTX 780.

Ubisoft has said that a patch is coming that may address some of these issues. We'll give it a spin when it finally arrives.
Via: Hot Hardware
Comments
basroil 6 months ago

There's a good reason why I'm waiting on buying this game until I can find myself a 4GB version of the 770 at a good price...

rafiki 6 months ago

Find a top-of-the-range, factory-overclocked 4GB 680 if you can; the 770 is the same chip, I believe.

basroil 6 months ago

Yes and no. The 680 OC cards tend to be from older batches that run hotter, and they actually still cost more than the 4GB 770. I missed out on a chance to buy a GTX 780 HoF for $400, so I'm stuck spending the same on the clearly inferior 770.

AustinJankowski 6 months ago

I finally just got everything fine-tuned: a 660 Ti Nvidia reference edition with a mild overclock and an i7 920 at stock clocks on an X58 chipset. Mostly everything is on Ultra, except a few settings turned down to give some headroom for textures, and AA set to Temporal SMAA. I get anywhere from 53 fps (in bad areas) to 60 fps with vertical sync set to 1 frame. Still can't seem to fix the blinking shadow issue, but other than that, no problems.

rafiki 6 months ago

Having predicted all this would happen around the time of the new console announcements, I'm glad I went for a 4GB 670 FTW, which, paired with an i5-4670K, honestly runs the game beautifully at 1080p and Ultra settings (I've even turned the ambient occlusion setting up to max). I've left the AA at the default for now, SMAA I believe. No stuttering. I believe 4GB should probably be the minimum on a GPU now to be "future proof".

JesseLazenby 6 months ago

An i7-4770K and a 4GB 770 Windforce work no problem. It stutters while driving at full speed, but other than that I finished the game with everything maxed and had no big issues at all.

KhaleelSiddiquee 6 months ago

This post is useless, lol. I still play on High with everything else on max on my GTX 760. I get lag here and there, but it's still running fine.

Joel H 6 months ago

I'm willing to grant that this kind of stuttering may drive me more insane than most ;)

CalicalCarlos 6 months ago

The game runs well for me: i7 3770K at 4.4GHz, Asus DCUII HD 7970 3GB. I run it on Ultra settings.

CalicalCarlos 6 months ago

http://www.youtube.com/watch?v=iegFKEo_23E&feature=share&list=UUk8vgHUgQEep7LUomNNcPOw&index=2

JentoPieters 6 months ago

thx for the post, ubisoft f*** things up even more

Joel H 6 months ago

Calical,

Flip HBAO+ on and add real antialiasing (MSAA). You'll slam into a memory wall if you do.

KOwen 6 months ago

I use 2x TXAA and haven't had too many slowdown issues, but like Austin said, I can't figure out how to fix those damn flickering shadows and lights during the day/night cycle. Good thing I'm not prone to seizures.

JPayne 6 months ago

Should ask them why the game supports some Intel HD graphics and not Nvidia dedicated graphics on laptops

rafiki 6 months ago

@ JPayne - what?!

AndrewKay 6 months ago

good article

BJew 6 months ago

What about a 6GB GTX 780?
