Once upon a time, this site had a performance guide for Rise of the Tomb Raider, though it was removed when the original author left the site. Since ROTR's release, developers Crystal Dynamics and Nixxes Software have been busy releasing a myriad of patches, adding an internal benchmark, the ability to run the game in DirectX 12, and a wide variety of performance improvements.
- API: DirectX 11, DirectX 12
- V-Sync: Double and Triple-Buffered options
- Anti-Aliasing: FXAA, SMAA, SSAA
- 4K Support: Yes
- Unlimited FPS: Yes
- Adjustable FOV: No
Rise of the Tomb Raider is a third-person action-adventure game that blends platforming, stealth, shooting, and even RPG elements. Crystal Dynamics handled the primary development of the game, following up the well-received Tomb Raider from 2013, while Nixxes Software handled the PC port, a partnership that has worked out well for both PC gamers and Square Enix.
Rise of the Tomb Raider has won awards for its story, and this second game continues the tradition of letting the player take more ownership of their unique Lara Croft than ever before. Players can change Lara's face, hair, clothes, height, bust size…just kidding. But I bet some of you were about to stop reading and boot up the game right now, weren't you?
Crystal Dynamics continue the excellent tradition of offering a ton of different graphical options, with a game that runs equally well on AMD and Nvidia systems, and even plays nice with multi-GPU solutions (though we had to wait for a few patches for that last one). Settings include multiple AA and anisotropic options, as well as customisable texture quality, shadow quality, ambient occlusion, depth of field, tessellation, screen space reflections, dynamic foliage, bloom, motion blur, and lens flare – they even went so far as to offer multiple levels of their in-house PureHair tech, and not one but two V-sync options!
For my testing, I used patch 1.0.647.2 (Patch #6) on my single Radeon R9 290X at the most popular gaming resolution of 1920×1080. As with Far Cry Primal and The Division, the developers have been kind enough to provide an internal benchmark, and after a few hours of gameplay, I decided that the benchmark provided an ideal scenario that was both representative of actual gameplay and easily reproducible. Additionally, due to the relative newness of DirectX 12, the ONLY way to run a benchmark (and therefore get a comparison to DirectX 11) is to use the internal benchmark. For testing I used the 'Very High' preset, but with Texture Quality bumped down to 'High' and Anti-Aliasing set to 'SMAA'. On these settings I averaged around 62 fps at 1920×1080 on my 290X. When benchmarking an individual setting, all other options were left at those aforementioned values. These settings are all shown in the above slides, and of course V-sync was disabled for the benchmarking.
As you can tell, your biggest power sucker is going to be Anti-Aliasing, with Shadow Quality delivering a pretty big hit once you reach the Very High setting (the "Very High" preset actually only uses the "High" setting). For anti-aliasing, I prefer SMAA, as FXAA introduces a minor blur; it's nowhere near the haze-inducing fogginess of Far Cry Primal's FXAA, but it's still less than ideal. Regardless, FXAA can be enabled at a cost of less than 1 FPS, while SMAA only costs around 1-3. Stay away from the SSAA settings unless you have a mammoth rig, or unless you prefer your games to resemble slide shows. As for shadow quality, dropping from Very High to High should give you a decent boost, and for those trying to run this game on decade-old hardware, the option to remove all shadows is available…but you may end up deciding not to play that way once you see what it does to Lara's eyes… One last thing: the framerates for texture quality don't tell the whole story. I'll go into more detail in the next paragraph, but just know that if you're experiencing a good deal of stuttering in-game, you should try lowering the texture quality before you lower other settings. The "Very High" texture quality setting is extremely demanding, while the "High" setting uses much less VRAM while still providing a great experience.
| Graphics Setting | Performance Impact |
| --- | --- |
| Shadow Quality | Very High |
| Sun Soft Shadows | Low |
| Depth of Field | Low |
| Level of Detail | High |
| Specular Reflection Quality | Low |
| Screen Space Reflections | Negligible |
One thing you should look out for is VRAM usage. I have a 4GB card, but I noticed some fairly severe stuttering when cranking Texture Quality all the way up to "Very High". The game even warns you about how much VRAM that setting will use, so beware: ROTR maxed out my VRAM on the Very High setting, even when I was only running at 1080p. I tried looking around to see how much VRAM Rise of the Tomb Raider would use if given an unlimited reserve, but the best I could find was that VRAM usage can apparently climb as high as 10GB when using the 4K x 4K textures on the "Very High" setting. Anyway, here's the VRAM usage I observed on my 4GB card:
As I mentioned earlier, Nixxes and Crystal Dynamics have been busy continuing the game's development even after launch. To date, they've released over half a dozen game patches, fixing everything from broken quest lines to delivering massive performance improvements, as well as working closely with both Team Red and Team Green to release optimized graphics drivers. A key part of those optimizations has been multi-GPU enhancements, resulting in one of the best-performing multi-GPU games I've experienced, with some users reporting a 100% performance boost over using only one GPU. My own testing showed a 67% improvement at 2560×1440 and a whopping 80% bump at 3200×1800.
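For anyone curious how those scaling percentages are derived, the boost is simply the dual-GPU average framerate over the single-GPU average. A minimal sketch with made-up fps numbers (not measured values from my testing):

```python
# Hypothetical benchmark averages, for illustration only
fps_single = 60.0   # average fps with one GPU
fps_dual = 100.0    # average fps with two GPUs at identical settings

# Multi-GPU scaling: percentage boost over a single card
boost_pct = (fps_dual / fps_single - 1) * 100
print(f"Multi-GPU boost: {boost_pct:.0f}%")  # prints "Multi-GPU boost: 67%"
```

Perfect scaling would be a 100% boost (double the framerate), which is why the reports of 100% gains from some users are so impressive.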
Sadly, this multi-GPU performance does not translate to DirectX 12. In one of the more recent patches, Nixxes added the ability to run the game using the DirectX 12 API. Your mileage may vary, but my own testing agrees with the vast majority of users: DX12 offers worse performance, although only by the smallest of margins. Additionally, there is no multi-GPU support in DX12, which is a shame, because I've really been itching to see what, if any, real-world improvements the DirectX 12 API brings to multi-GPU gaming. I also observed no increase or decrease in game stuttering, so, as it stands, I can't really recommend using DX12, although there is the possibility that you might see slightly better performance if you have a weaker CPU.
Rise of the Tomb Raider is an excellent follow-up to 2013's Tomb Raider, and developers Crystal Dynamics and Nixxes have shown PC gamers much love through the numerous fixes and improvements they've made to an already great game. With its gorgeous graphics, detailed graphical settings, and excellent support for both major graphics card manufacturers, this game is well-deserving of its Excellent rating. If you have any questions or suggestions, be sure to leave them in the comments!
My Testing Rig Specifications
- CPU: Intel Core i7-3930K
- GPU: Radeon R9 290X (Crimson Driver 16.4.1)
- RAM: 16GB DDR3
- OS: Windows 10 Pro 64-bit
Final Verdict for PC Quality:
Rise of the Tomb Raider: EXCELLENT (shite, mediocre, good, excellent)