GPU Performance
With a modest increase in EU hardware (20 EUs, up from 16), the Intel HD 4400 GPU in the Core i7-4500U I’m testing today isn’t tremendously faster than the HD 4000 in the i7-3517U. On average I measured a 15% increase in the subset of game tests I was able to run in Taipei, and a 13% increase across our 3DMark tests. The peak theoretical increase we should see here (taking EU count and frequency differences into account) is around 19%, so it doesn’t look like Haswell is memory bandwidth limited just yet.
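For context, that 19% ceiling falls out of the EU counts and clock speeds directly. Assuming max GPU turbo clocks of 1150MHz for HD 4000 in the i7-3517U and 1100MHz for HD 4400 in the i7-4500U (Intel's published spec values, as far as I'm aware), peak shader throughput scales as:

$$\frac{20\ \text{EUs} \times 1100\ \text{MHz}}{16\ \text{EUs} \times 1150\ \text{MHz}} = \frac{22{,}000}{18{,}400} \approx 1.196$$

or just under a 20% uplift. The measured 13 - 15% gains land reasonably close to that theoretical maximum, which is why memory bandwidth doesn't appear to be the bottleneck yet.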
If we throw 35W Trinity into the mix, HD 4400 narrows the gap, but it still falls well short of AMD's 35W part:
GPU Performance Comparison

| CPU | Metro: LL - Value | Metro: LL - Mainstream | BioShock Infinite - Value | BioShock Infinite - Mainstream | Tomb Raider - Value | Tomb Raider - Mainstream |
|---|---|---|---|---|---|---|
| Core i7-3517U | 15.4 fps | 6.0 fps | 16.4 fps | 7.0 fps | 20.1 fps | 10.2 fps |
| Core i7-4500U | 14.5 fps | 6.5 fps | 17.4 fps | 9.9 fps | 24.6 fps | 12.2 fps |
| A10-4600M | 16.8 fps | 8.0 fps | 25.8 fps | 10.0 fps | 30.1 fps | 12.7 fps |
For light gaming, Intel’s HD 4000 was borderline reasonable. Intel’s HD 4400 takes half a step forward, but it doesn’t dramatically change the playability of games that HD 4000 couldn’t run well. Personally, I’m very interested to see how the 28W Iris 5100-based Haswell ULT part fares later this year.