The Ghosts are back, this time in Tom Clancy’s Ghost Recon Breakpoint. The latest entry in the series comes from Ubisoft Paris, which has been handling the Ghost Recon franchise for quite some time now, and takes place on Auroa, a fictional island in the Pacific Ocean. I have kept up with this title from the early closed beta sessions through launch, and I kept hoping that somewhere along the line Ubisoft would announce that using the Anvil Next 2.0 engine was a joke and that the game was, in fact, built on the Snowdrop engine to take advantage of DX12. But nope, it is still Anvil Next 2.0, and it shows in the results.
Testing Methodology
Ghost Recon Breakpoint carries on the tradition of bringing a built-in benchmark to the table. After starting with the RX 570 and settling on High as the sweet-spot preset, we decided to proceed with both Anti-Aliasing and Temporal Injection disabled to limit variability.
We used FrameView to capture performance over the 105-second benchmark run at the High preset. After discarding an initial warm-up run to account for loading, we took three runs and averaged their average frame rates along with their 99th-percentile results. We report our performance metrics as average frames per second and have moved away from 1% and 0.1% low reporting in favor of the 99th percentile. For those unfamiliar with the metric, the 99th percentile means that only 1 frame out of 100 is slower than this frame rate; put another way, 99% of the frames will achieve at least this frame rate.
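For readers who want to reproduce this kind of number from their own captures, the calculation can be sketched in a few lines. The frame-time values below are hypothetical, and this uses a simple nearest-rank percentile, which may differ slightly from FrameView's exact interpolation method:

```python
import math

# Hypothetical frame-time samples in milliseconds (illustrative only,
# not real Breakpoint benchmark data).
frame_times_ms = [16.6, 16.8, 17.1, 16.5, 25.0, 16.7, 16.9, 33.0, 16.6, 16.8]

def nearest_rank_percentile(values, pct):
    """Smallest value with at least pct% of the samples at or below it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Average FPS: total frames divided by total time in seconds.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# 99th-percentile FPS: take the 99th percentile of frame TIMES (the slow
# tail) and convert to a frame rate -- 99% of frames render at least
# this fast.
p99_fps = 1000 / nearest_rank_percentile(frame_times_ms, 99)
```

Note that the percentile is taken over frame times, not frame rates: the slowest 1% of frames define the cutoff, which is then converted back to FPS.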
Test System
Components | Z370 |
---|---|
CPU | Intel Core i9-9900K @ 5GHz |
Memory | 16GB G.Skill Trident Z DDR4 3200 |
Motherboard | EVGA Z370 Classified K |
Storage | Kingston KC2000 1TB NVMe SSD |
PSU | Cooler Master V1200 Platinum |
Windows Version | 1903 with latest security patches |
Graphics Cards Tested
GPU | Architecture | Core Count | Clock Speed | Memory Capacity | Memory Speed |
---|---|---|---|---|---|
NVIDIA RTX 2080 Ti FE | Turing | 4352 | 1350/1635 | 11GB GDDR6 | 14Gbps |
NVIDIA RTX 2080 SUPER FE | Turing | 3072 | 1650/1815 | 8GB GDDR6 | 15.5Gbps |
NVIDIA RTX 2070 SUPER FE | Turing | 2560 | 1605/1770 | 8GB GDDR6 | 14Gbps |
NVIDIA RTX 2060 SUPER | Turing | 2176 | 1470/1650 | 8GB GDDR6 | 14Gbps |
NVIDIA RTX 2060 FE | Turing | 1920 | 1365/1680 | 6GB GDDR6 | 14Gbps |
ZOTAC Gaming GTX 1660 | Turing | 1408 | 1530/1785 | 6GB GDDR5 | 8Gbps |
NVIDIA GTX 1080 FE | Pascal | 2560 | 1607/1733 | 8GB GDDR5X | 10Gbps |
NVIDIA GTX 1070 FE | Pascal | 1920 | 1506/1683 | 8GB GDDR5 | 8Gbps |
NVIDIA GTX 1060 FE 6GB | Pascal | 1280 | 1506/1708 | 6GB GDDR5 | 8Gbps |
AMD Radeon RX 5700 XT | Navi | 2560 | 1605/1755/1905 | 8GB GDDR6 | 14Gbps |
AMD Radeon RX 5700 | Navi | 2304 | 1465/1625/1725 | 8GB GDDR6 | 14Gbps |
AMD RX Vega 64 | Vega 10 | 4096 | 1247/1546 | 8GB HBM2 | 1.89Gbps |
AMD RX Vega 56 | Vega 10 | 3584 | 1156/1471 | 8GB HBM2 | 1.6Gbps |
MSI RX 580 Armor 8GB | Polaris 20 | 2304 | 1366 | 8GB GDDR5 | 8Gbps |
Sapphire Nitro+ RX 570 4GB | Polaris 20 | 2048 | 1340 | 4GB GDDR5 | 7Gbps |
Drivers Used
Drivers | |
---|---|
Radeon Settings | 19.9.3 |
GeForce | 436.48 |
Preset Scaling At 4K
Testing presets at 4K gives us a couple of quick metrics before diving too deeply into the game. First, it shows how the game looks at the various presets and how performance scales with those settings. Breakpoint shows a distinct visual jump from Low to Medium to High, but past that the improvements fall off to the point that you’ll likely have to compare screenshots to notice a difference. The performance penalty for going past Medium is steep, so High is about as far up the chain as I would recommend. That is why we chose it for testing rather than shooting for Ultra as usual; we are tweaking our approach to these performance evaluations in the hope of bringing better results to our readers.
Preset Scaling At 1080p
This is the exact same idea as the 4K preset scaling, just with an RX 570 at 1080p.
Intel Core Scaling Performance
While this test won’t tell us exactly how many cores and threads the game can and will use, it does show how the game performs as more cores and threads become available. These configurations were tested at the same 1080p settings used for the rest of the results, pairing the CPU with the RTX 2080 Ti Founders Edition. While this does not account for the cache differences you would see moving through Intel’s product stack, it does give us a better idea of how the game benefits from, and behaves with, more cores and threads. The core and thread scaling here surprised me a bit: where games like Assassin’s Creed Odyssey showed a fairly distinct benefit when moving past a hyperthreaded quad core, this one doesn’t improve much. But if you’re on a quad core without hyperthreading and a high-powered GPU, you’ll be in for a stutterfest. I’m still surprised to see this title even run on a simple dual-core configuration. Thanks to the benchmark, these results also include GPU utilization, showing that with a dual core the RTX 2080 Ti sees an average of only 54% utilization, while even the full 8 cores and 16 threads of the i9-9900K at 5GHz aren’t enough to push the GPU past 93% utilization.
Graphics Card Results
1080p
Ultrawide 1080p
1440p
Ultrawide 1440p
UHD 4K
Conclusion
Tom Clancy’s Ghost Recon Breakpoint is an interesting one to wrap up. While the game has good preset scaling and will let you squeeze as much as you can from your hardware, it never quite feels like you get the performance you should. The game can be absolutely gorgeous at times, but move off the main paths and you start to wonder if they even tried. It opens with a visually spectacular scene, but once I was in the game and playing I found the water to be in quite a sad state, and during one side mission the entire area I was circling looked half-baked at best, as if it should have been chained off with an ‘under construction’ sign. The Anvil Next 2.0 engine is really powerful for DX11, but this feels like a game where DX12 not only could be there, but SHOULD be there. Look at what the team working on The Division series has been able to do with Snowdrop and it really makes you question the use of Anvil Next 2.0 in this title.
All of that aside, you can get good performance from this game on more modern hardware, making up your own mind about what eye candy you want as you tailor the details to your liking. But if you’re still holding on to a last-generation 1080p powerhouse, I think you’ll find yourself seriously considering whether it’s time to upgrade, as those cards are really starting to show their age.
The post Tom Clancy’s Ghost Recon Breakpoint PC Performance Explored by Keith May appeared first on Wccftech.
Reference: https://wccftech.com