Arc Graphics Updates And PresentMon Tested: Intel Makes Huge Strides

hero intel arc q3 23 update
Intel's efforts to improve its graphics drivers continue apace. We probably don't have to tell an audience of enthusiasts that Arc launched in a less-than-ideal state, and that led to a sub-optimal first impression for the brand. Intel has been working hard ever since to correct course, and its regular driver releases have brought steady improvements, bugfixes, and optimizations galore.

The latest driver update, released this past Wednesday, is simply known as version 4644. Intel claims that it achieves an average performance uplift of between 12% and 19% in DirectX 11 games (depending on the host CPU), which we'll discuss in a bit. This update also brings along a new metric that Intel has devised to help gamers determine whether their performance issues in games are due to their GPU or the rest of their system.

todays updates

Generally speaking, you want video games to be bottlenecked by the graphics processor. That's because the GPU is the final step in the render process, and it has the most control over when frames are presented to the display. We call this condition being "GPU-limited". When performance is being held up by some other part of your system, we usually call it being "CPU-limited", even though the culprit may not be the CPU itself; often enough, "CPU-limited" games are actually struggling with main memory latency, system I/O, thread contention, or other issues.

GPU-Busy And Intel's New PresentMon Beta Tool

cp2077 overlay
You want to look like the right, not the left.

Traditionally, we detect CPU-limited games by looking at CPU and GPU usage. When the frame rate is low and GPU usage is also low, we can typically assume that the game is suffering under some other kind of limitation. However, this can be an inexact science. Intel wants to remove all uncertainty by introducing a new metric called "GPU Busy." It's measured in milliseconds, and the intention is to compare it directly against frametimes, or more specifically "time between presents", to see how much of the render time was spent on the GPU.
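
To make that comparison concrete, here's a minimal sketch of the arithmetic (our own illustration, not Intel's code): divide GPU Busy by the time between presents to see what fraction of the frame the GPU actually spent working. The 90% cutoff below is our own arbitrary threshold for illustration, not an Intel guideline.

```python
def gpu_bound_fraction(gpu_busy_ms: float, frame_time_ms: float) -> float:
    """Fraction of the frame interval that the GPU spent rendering."""
    return gpu_busy_ms / frame_time_ms

# Example: a 10 ms frame (100 FPS) where the GPU was busy for only 6 ms.
# The other 4 ms were spent waiting on the CPU, I/O, or something else.
frac = gpu_bound_fraction(gpu_busy_ms=6.0, frame_time_ms=10.0)
print(f"GPU busy for {frac:.0%} of the frame")  # -> 60%
if frac < 0.9:  # arbitrary illustrative cutoff, not an Intel guideline
    print("Likely system-limited; lowering GPU settings won't help much")
```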

gaming work division

If only a small portion of a given frame was spent on the GPU Render part of the process, then it's very likely that your performance is being held up by something else. This can come down to a game's settings, and it will vary drastically between software and hardware configurations. Someone who slaps a modern graphics card into their old Core i7-4790K desktop is going to miss out on the overwhelming majority of the performance of their new graphics card (especially if it's an Arc, considering those GPUs really need resizable BAR support for best performance). Intel's own Tom "TAP" Petersen and Ryan Shrout discuss GPU Busy and additional Arc updates here...


Intel devised this metric so that gamers can use it to optimize their settings. If you're getting bad or inconsistent performance in a game, you can use Intel's new customized PresentMon tool to profile your game's performance and see what percentage of each frame is spent on your GPU.
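
PresentMon can also log its metrics to a CSV file, so you can crunch a whole capture after the fact. Below is a rough sketch of that kind of post-processing; the column names (MsBetweenPresents, MsGPUBusy) are assumptions based on PresentMon's traditional CSV output and may differ between versions, so check the header row of your own capture first.

```python
import csv
import statistics

def summarize_capture(path: str) -> None:
    """Summarize how GPU-bound a PresentMon CSV capture was, frame by frame."""
    fractions = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frame_ms = float(row["MsBetweenPresents"])  # assumed column name
            gpu_ms = float(row["MsGPUBusy"])            # assumed column name
            if frame_ms > 0:
                fractions.append(min(gpu_ms / frame_ms, 1.0))
    print(f"Median GPU-busy share of each frame: {statistics.median(fractions):.0%}")
    share = sum(f < 0.9 for f in fractions) / len(fractions)
    print(f"Frames that were under 90% GPU-busy: {share:.0%}")

# summarize_capture("my_game_capture.csv")
```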

normalized dx11 improvement

Of course, Intel didn't do this simply out of the goodness of its heart. The company says that its new driver drastically improves the efficiency of CPU operations in games, such that frametimes, particularly in DirectX 11 and DirectX 9 titles, become both more consistent and lower overall. That means smoother performance in general, even if the actual 'average FPS' hasn't gone up that much.
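
That distinction between a higher average and more consistent frametimes is easy to demonstrate with numbers. In this quick sketch with made-up frametime logs, both runs have the exact same average FPS, but one of them would feel far worse to play (we're using the 99th-percentile frametime as the "1% low", one common definition):

```python
import statistics

def avg_and_1pct_low_fps(frametimes_ms: list[float]) -> tuple[float, float]:
    """Average FPS, plus FPS at the 99th-percentile (worst 1%) frametime."""
    avg_fps = 1000.0 / statistics.fmean(frametimes_ms)
    p99_frametime = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    return avg_fps, 1000.0 / p99_frametime

# Two made-up runs with an identical 12.5 ms (80 FPS) average frametime:
steady   = [12.5] * 100
stuttery = [10.0] * 95 + [60.0] * 5  # mostly fast, with periodic hitches

for name, run in (("steady", steady), ("stuttery", stuttery)):
    avg, low = avg_and_1pct_low_fps(run)
    print(f"{name}: {avg:.0f} FPS average, {low:.0f} FPS 1% low")
# steady:   80 FPS average, 80 FPS 1% low
# stuttery: 80 FPS average, 17 FPS 1% low -- same average, far worse feel
```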

Intel's idea of measuring GPU time versus CPU time isn't exactly a new one. You've surely run game benchmarks yourself that outline "GPU FPS" versus "CPU FPS," as this kind of thing has become more common. Titles like Shadow of the Tomb Raider, Forza Horizon 5, and Gears 5 (just to name a few) all have detailed breakdowns of the CPU and GPU performance impact of your settings as part of their benchmark tools.

intel presentmon
Intel Presentmon Beta

However, the new PresentMon Beta can be used with any game, and it includes a fully-configurable overlay with real-time graphing, which is handy. In that sense it's sort of like a fancier (albeit less feature-filled) version of the RivaTuner Statistics Server (RTSS) as configured using CapFrameX or MSI Afterburner. It doesn't require an Intel GPU, either; you can use the full program with an AMD or NVIDIA GPU as well.

intel arc control warframe

If you're already comfortable with RTSS, you may not be interested in Intel's new PresentMon Beta. On the flip side, if you're not comfortable with RivaTuner, or especially if you tried to use RTSS and found its ancient interface obtuse and arcane, you should probably give Intel's new PresentMon a try. It doesn't have the capture graphing functions of CapFrameX or OCAT, but if you just want a performance overlay, it's very easy to use and very informative.

Real Game Testing: Verifying Intel's Numbers

Intel has also made some pretty big claims about performance gains across the board, but it's important to keep in mind that the company is comparing its progress against the original public Arc driver, version 3490, released way back in October. It's a fairer comparison than you might think, because most reviews of the Arc GPUs were done around the release of the cards, and so they used that driver or a slightly older pre-release build.

arc a750 with box

We've tested a handful of games on an Arc A750 graphics card, first using the original v3490 driver from last year, and then the just-released v4644 driver. Unfortunately, there's no way to go back to the old driver and re-test exactly as it was, because the v4644 driver includes a mandatory firmware update. That firmware update does seem to have corrected a persistent issue where Arc GPUs would fail to initialize correctly on a warm boot, so that's good.

Given that Intel's focus for this driver was on DirectX 11 performance, we've emphasized those results in our testing, but we've also tested a half-dozen other games using other APIs to see if there were any notable changes. First, let's take a look at the DirectX 11 results:

dx11 benchmarks fixed

Results vary, but these are some pretty decent upgrades, especially in terms of consistency. Fortnite goes from nearly-unplayable to reasonably smooth, and Warframe also saw a tremendous improvement in frametime stability. GTA V, likewise, sees a nice bump in performance; don't read too much into the larger gap between the 1% lows and the average, because we were running a custom test in GTA Online, and the six results we collected (three with the old driver, three with the new) varied pretty widely, as is typical when testing online games.

warframe
Warframe: Plains of Eidolon

Warframe was one of the best results we got from the new driver. An uplift from 78 to 86 average FPS isn't bad in its own right for just a driver change—that's a roughly 10% gain—but the real news here is the improvement in framerate stability. Warframe is kind of a stuttery game to begin with, even on AMD and NVIDIA hardware, but with the old driver it was practically unplayable, with massive stutters up to 2 seconds in length. With the new driver, it's as smooth as it gets at 1440p, or even at 4K if you make use of the game's FSR 2.2 upscaling.
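
For what it's worth, hitches like those two-second stalls are easy to flag programmatically in a frametime log. Here's a quick sketch using our own arbitrary threshold: any frame taking more than four times the run's median frametime counts as a severe spike.

```python
import statistics

def count_stutters(frametimes_ms: list[float], spike_factor: float = 4.0):
    """Count frames that took far longer than the run's median frametime."""
    median = statistics.median(frametimes_ms)
    spikes = [t for t in frametimes_ms if t > spike_factor * median]
    return len(spikes), max(spikes, default=0.0)

# Example: a mostly-13 ms run with a single 2000 ms hitch, like the
# multi-second stalls we saw in Warframe before the driver update.
count, worst = count_stutters([13.0] * 500 + [2000.0])
print(f"{count} severe stutter(s), worst spike: {worst:.0f} ms")  # -> 1, 2000 ms
```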

Notably, Warframe actually has a DirectX 12 renderer, marked as (Beta) in the launcher. This seems to largely be a D3D11on12 hack and historically has not offered improved performance on other GPUs. Well, it doesn't here, either. In our testing, DX12 in Warframe is both less stable (frametime-wise) and also slower than DX11 on every GPU, so don't bother with it, even for Arc.

gtaonline
Does the median count as "off-road"? (GTA Online)

One of the key upgrades that Intel highlighted in its benchmarks above was for Grand Theft Auto V, covering both the built-in benchmark and the still-updated GTA Online component. We didn't bother with the built-in benchmark because, frankly, it isn't representative of gameplay now, ten years on from the title's original release. Instead, we jumped into a GTA Online session and raced our Pfister Comet Safari across the city.

Intel claims a 27% improvement in GTA Online performance. Our measured bump was actually a bit better at 28%, which ain't bad at all. We did see a smaller gain in 1% performance, but as noted above, there was considerable variance between our runs in GTA Online, and we wouldn't worry too much about this. The result is great, regardless, as GTA Online can still be a surprisingly heavy game in 2023.

monhanworld2
Monster Hunter World

Monster Hunter World is not the newest addition to the venerable Monster Hunter franchise, but it's still the prettiest. The more recent MH Rise was designed for the Nintendo Switch, and it lacks the visual flair of the earlier World release. That's true despite the fact that Rise runs on the newer RE Engine, while MH World actually runs on the ancient MT Framework technology that originally debuted with Dead Rising in 2006.

We tested Monster Hunter World because it's still one of the more-demanding DirectX 11 games. The gains here were pretty modest, but also surprisingly consistent across runs. MH World also features some of the most-consistent frametimes in our testing. It runs well on the Arc A750 at 1440p, and with the new driver, the DirectX 12 mode offers slightly improved performance over even these numbers. Just don't try DX12 mode on the old driver; it twice gave us "blue screen of death" errors related to video scheduling.

fortnite
Fortnite (DX11)

On the other hand, DirectX 12 mode couldn't even be toggled in Fortnite using the older driver. Even in DirectX 11 mode, using the game's recommended settings—which set almost everything to "Epic", the highest value—the game was nearly unplayable.

Updating to the latest driver removed much of the stuttering and improved performance considerably, although the game still doesn't run particularly well at these settings. Once again, performance is pretty variable due to both the online nature of the game and the diversity of its environments; we went for the tropical Creeky Compound region due to its demanding foliage.

fortnite frametimes DX12
Average frametime of 46ms isn't ideal (Fortnite DX12 Epic)

Swapping over to DirectX 12 mode and enabling Fortnite's Unreal Engine 5 Lumen and Nanite features absolutely brutalized performance, with averages dropping to the low 20s and 1% lows in the single digits. Suffice it to say that an Arc A750 isn't ready for Unreal Engine 5, at least not at 1440p.

other benchmarks

Outside of our DirectX 11 game tests, we also tested a couple of games each for DirectX 9, OpenGL, Vulkan, and DirectX 12.

counterstrikego
Counter-Strike: Global Offensive

Arc's performance in the most popular game on Steam was a big story around the launch of the GPUs. Even though the cards were putting out hundreds of frames per second, that was still hundreds fewer than competing cards managed, so Intel put some game-specific effort into improving CS:GO. Did it help? Sure.

We're testing using a map made specifically for stress-testing performance in CS:GO, and even in 4K with the in-game settings completely maxed-out, the A750 managed 158 average FPS. Keep in mind that the benchmark includes a section where the player walks directly through multiple smoke grenades, and that's why the 1% lows are relatively poor. The new driver nearly doubles the 1% lows and improves the average by 29%, which is quite good.

20xx screenshot
20XX

20XX is a visually-simple 2D game, so light that it runs very well even on Intel's integrated graphics, and on that basis it might seem nonsensical to include Batterystaple Games' roguelike spiritual sequel to Mega Man X in this kind of testing. However, considering that we're examining frame-time stability, it's important to make sure that these kinds of games can run fluidly as well.

20xx glitch
Not what the game's intro should look like. Fixed in the new driver.

Most players would probably never notice the performance difference between the old driver and the new, but they certainly would notice the image quality difference. The old driver had significant visual bugs in 20XX, and it also caused long load times for whatever reason. 20XX loads in a couple of seconds on NVIDIA and AMD hardware, and with the new driver, those long load times are largely resolved on Arc, too, along with the aforementioned visual bugs.

doom2016 screenshot2
DOOM (2016)

We used Doom (2016) for testing because it's one of the most-demanding OpenGL games there is—even if, thanks to its extreme level of optimization, it isn't particularly demanding in absolute terms. Doom ran well even on the older driver release, so it's no surprise that upgrading to the newer driver made only a margin-of-error improvement.

doom frametimes
Doom in Vulkan on our system is a stutter-fest.

You might think that on an Arc GPU, the better option would be to play in Vulkan mode, and if you only look at average FPS, you'd be right. In Vulkan mode, on both the old and new drivers, Doom (2016) offers a higher average FPS than in OpenGL mode. However, it also offers drastically worse 1% lows. This may be due to the critical Resizable BAR feature apparently not working in Vulkan on Arc when run on AMD platforms; more on this below.

eldenring
Elden Ring

Much has been made of Arc's capable ray-tracing performance. We wanted to test that too, so we loaded up one of our favorite titles of the last decade, Elden Ring, and benchmarked it in the game's starting area of Limgrave, near the Church of Elleh at daybreak. The performance gains from the new driver are clear to see, and since this game was designed with 30 FPS gameplay in mind, we'd say it runs just fine.

You can optionally turn the resolution down to 1080p to run up against the 60 FPS cap, or you can disable ray-tracing altogether to achieve the same at 1440p, but we don't really recommend disabling it; the RTAO implementation has a big effect on immersion. With that said, with the RT effects turned off, the Arc A750 can even run Elden Ring smoothly at 4K with maximum settings, and that ain't bad.

cyberpunk2077
Cyberpunk 2077 Ray-Tracing: Overdrive

It's not on our chart above because it's not really playable, but we also tested Cyberpunk 2077's path-traced "Overdrive" mode. Spoilers: it's not viable. The mode does run, just fine in fact, even on the old driver, but on either driver, performance caps out in the mid-teens even if you reduce the resolution as far as 960×540. Cyberpunk 2077 runs just fine on Arc without path-tracing, but hopefully Intel can take a look at this title and get the real-time ray-tracing working better than it does.

q2rtx
Quake II RTX

We also tested Quake II RTX. Other outlets have reported good results with this modified title on Intel's GPUs, but our experience wasn't great. Check out this frame time graph:

q2rtx frametime
This is not what smooth performance looks like

The frame times are all over the place, and the micro-stutter makes the already-middling performance (37 FPS average) feel more like 20 FPS or worse. It's not just because NVIDIA created the RTX mod; Radeon cards don't suffer this same fate, and the game is totally playable with good performance and image quality on AMD's RDNA 2 and RDNA 3 GPUs.

gpuz arc vulkan rebar
That last line indicates that ReBar is not enabled under Vulkan on our system.

After some investigation, it looks like resizable BAR is not enabled specifically for Vulkan on our test system, which is based on a Ryzen 7 5800X3D CPU in an ASRock X570 Taichi motherboard. We researched the issue and found a thread where some folks described the same behavior not only on AMD systems, but also on Intel machines where the CPU's integrated graphics were disabled. It's a curious bug, and it makes Arc considerably less attractive for folks with AMD systems. Hopefully Intel can get this one worked out.

okinawa rush fixed
Okinawa Rush (Arcade mode)

Finally, we tested two other OpenGL titles: side-scrolling arcade brawler Okinawa Rush, and the hardware-accelerated form of ZDoom, running the awesome "Eviternity" level set. Okinawa Rush, like 20XX above, is a visually-simple (but very attractive) 2D action game like those of yesteryear, and it runs flawlessly on both old and new drivers—exactly as expected, but again, it's important to make sure.

zdoom eviternity
ZDoom (Eviternity)

Meanwhile, ZDoom runs rather poorly in OpenGL, with the infamous "Transcendence" benchmark struggling to hit even 26 FPS. Swapping over to Vulkan greatly improves that to around 38 FPS, but unfortunately, the new driver made no real difference in these numbers. Still, 38 FPS is actually a good result for an entry-level GPU in this benchmark, so we'll give it a pass.

Intel's Latest Arc Updates: The Final Word 

Our benchmark results more-or-less bore out Intel's claims, which is great. It's always nice to validate manufacturers' performance claims. There are still a few rough spots for Arc here and there, but ten months on from release, the cards are practically in a different performance class altogether in many games. Intel's made excellent progress with its drivers, and we hope this continues.

arc balanced builds

With Intel charging just $250 US for an Arc A750, it becomes hard to recommend anything else at that price point. It's true that you can spend another hundred dollars to get a much faster GPU in the Radeon RX 6700 XT, but not everyone has an extra hundred dollars to spend—and that card will be both larger and much thirstier than the A750.

In our testing, we really haven't run into any showstopper issues with games—even obscure ones, including emulators—on the 4644 driver, so we'd say it's probably safe to recommend Arc to most gamers at this point, which isn't something that could necessarily be said when the GPUs launched late last year. Here's to Intel's driver team for making massive strides in recent months, and we're looking ahead with hopeful eyes at the company's second generation of graphics processors looming on the horizon.