The GeForce GTX 1060 Founders Edition & ASUS Strix GTX 1060 Review
by Ryan Smith on August 5, 2016 2:00 PM EST

Meet the GeForce GTX 1060 Founders Edition
We’ll start off our look at GeForce GTX 1060 cards with NVIDIA’s Founders Edition card. That there’s a retail reference card for GTX 1060 is actually a bit of a surprise here. NVIDIA has not consistently offered retail reference cards for the GTX x60 family over the years. These cards sell for lower margins, and with their lower TDP and overall simpler design, there’s simply not the same need or market for a retail reference design as with the higher end cards.
As a result, while the Founders Edition cards are a bit harder to swallow with the higher-end GeForces, here the concept makes a bit more sense. It's essentially an NVIDIA-exclusive design separate from the partner cards, albeit at a higher price. But perhaps more importantly, it's one of only two GTX 1060 blower designs being sold, which makes it a unique offering amidst the many open air designs on the market, though at $299 it also sits $50 over the base GTX 1060's MSRP.
GeForce GTX 1060 Cards

| | NVIDIA GTX 1060 Founders Ed. | ASUS STRIX GTX 1060 OC |
|---|---|---|
| Base Clock | 1506MHz | 1620MHz |
| Boost Clock | 1709MHz | 1848MHz |
| Memory Clock | 8Gbps GDDR5 | 8.2Gbps GDDR5 |
| VRAM | 6GB | 6GB |
| Length | 9.75" | 11.75" |
| Width | Double Slot | Double Slot |
| Cooler Type | Blower | Open Air |
| Price | $299 | $314 |
In terms of styling, the GTX 1060 Founders Edition is designed to look like the higher-end Founders Edition cards. This keeps the Founders Edition lineup looking consistent; however, a consistent look is also as far as the similarities go. Absent is the metal shroud of the high-end cards, replaced with a simpler plastic shroud, which is more along the lines of what you'd expect for a cheaper, lower-TDP card. That said, even just handling the card makes it clear that NVIDIA took care to maintain a high build quality; the shroud is solid, with no flimsiness or weak seams. Of all of the plastic NVIDIA blowers I've looked at over the years, this is likely the best, easily ahead of the reference GTX 680 and other efforts.
A lot of this has to do with the fact that while the shroud is plastic, NVIDIA has otherwise built the card like one of their higher-end cards, with plenty of fasteners and some metal – aluminum, I suspect – used at key points. Going under the hood also continues this analogy, as we find a metal baseplate running the length of the card, with an aluminum heatsink for the GPU covering much of the board.
The amusing bit is that the blower itself is larger than the card's PCB, due to the amount of space a blower needs for its fan and heatsink assemblies. The GTX 1060 PCB measures just 6.75" long – not too much longer than the PCIe x16 slot connector – while the shroud adds another 3" to that, bringing the total length to 9.75". NVIDIA keeps the PCB and shroud flush with each other through some careful engineering of the overhanging part of the shroud, but it's nonetheless an interesting sight when the cooler is larger than the board it needs to cool.
Looking at the PCB itself, what we find is a pretty typical design for a Gx106 card. NVIDIA has shifted the 3+1 phase power delivery circuitry to the front of the board, allowing the GPU and its associated GDDR5 RAM to be placed near the rear. Curiously, there are two empty GDDR5 pads here, despite the fact that GP106 and its 192-bit memory bus can only connect to 6 of them. From what I hear, this PCB is designed to accommodate GP104 as well (a potential third-tier GP104 card, perhaps?), which would explain the additional memory pads. In any case this is a simple but functional design, much like the NVIDIA reference PCBs before it.
Moving on, towards the top of the card we find the requisite power connector. NVIDIA's reference PCB doesn't actually place the 6-pin power connector on the PCB itself; rather it's built into the shroud, with an internal cable carrying the power the rest of the way. This is an unusual design choice, as we haven't seen this done before on cards with short PCBs. As far as I know there's no technical reason NVIDIA needed to do this, but it does allow the 6-pin connector to be placed at the far end of the card, which I imagine will please system integrators and others who are accustomed to short, minimally visible cable runs.
Meanwhile, looking at NVIDIA’s display I/O configuration, it’s physically unchanged from the other Pascal reference boards. This means we’re looking at 3x DisplayPort 1.4, 1x HDMI 2.0b, and 1x DL-DVI-D.
SLI Gets Removed
The one connector you won’t find on the GTX 1060 is the SLI connector. NVIDIA has been rethinking their SLI strategy for the Pascal generation, which as we’ve seen for GTX 1080/1070 has resulted in NVIDIA deprecating 3 and 4-way SLI support. GTX 1060 has not escaped this retooling either, with NVIDIA removing (or rather, not building in) most of their traditional SLI support.
Given the struggling state of multi-GPU scaling that I discussed in the GTX 1080 review, I had initially suspected that removing SLI support from the GTX 1060 was a further consequence of those scaling difficulties. However in discussing the matter with NVIDIA, they informed me that the rationale for this decision is based on economics more than technical matters. As it turns out, SLI is not used very often with GTX x60 class cards. Due to the aforementioned scaling issues there's little reason to buy a pair of x60 cards right off the bat as opposed to a single x70/x80 card, and by the time most customers were ready to upgrade again it was 12+ months down the line, at which point buying a next-generation card would make more sense than doubling up on a now-older card.
The end result is that NVIDIA has opted to remove SLI support for the GTX 1060 series, making the feature exclusive to the GTX 1070 and above. For customers looking for more performance this essentially locks them into following what has long been our own recommended path: buy the fastest card first (scale up), and then go SLI after that if you need more performance (scale out). As I noted before this is an economic decision – this was a feature that few people were using in this class of card – but at the same time it is a removed feature, so it’s also not really a positive thing. That said, in keeping with our traditional advice on multi-GPU I don’t consider SLI the best way to go with a mainstream card to begin with, so it’s not a feature I’m going to miss.
In any case, while NVIDIA has removed formal support for SLI from this class of product, it's important to note that they haven't removed multi-GPU support entirely. DirectX 12's built-in mGPU support supersedes SLI in some cases, so game developers can still offer multi-GPU support if they'd like. NVIDIA pulling SLI support only impacts scenarios where NVIDIA's drivers were responsible for mGPU: games using DX11 or lower, and games using DX12's implicit Linked Display Adapter (LDA) mode. Games that use explicit multi-adapter (e.g. Ashes of the Singularity) or explicit LDA can still set up mGPU. So multi-GPU on GTX 1060 isn't truly dead, though I also don't think we're going to see too many game vendors bother to formally qualify a GTX 1060 mGPU setup regardless of what the API allows, since it's going to be so uncommon.
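As a rough illustration of the distinction (this sketch is mine, not something from NVIDIA or the game developers, and it assumes the standard Windows 10 SDK D3D12/DXGI headers), a game can see both paths directly through the API: explicit multi-adapter means enumerating each GPU as its own adapter and creating a device per GPU, while linked (LDA) mode surfaces as a single device reporting more than one node.

```cpp
// Sketch: probing DX12 multi-GPU options without relying on SLI.
// Assumes the Windows 10 SDK (d3d12.h, dxgi1_4.h); error handling trimmed.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // Explicit multi-adapter: every physical GPU appears as its own DXGI
    // adapter, and the game creates (and manages) one device per adapter.
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // Linked (LDA) mode: a driver-linked pair shows up as one device
            // with more than one node. With SLI gone on GTX 1060, the
            // expectation is GetNodeCount() == 1 even with two cards installed.
            std::printf("Adapter %u: %ls, linked nodes: %u\n",
                        i, desc.Description, device->GetNodeCount());
            devices.push_back(device);
        }
    }

    if (devices.size() > 1)
        std::printf("Explicit multi-adapter possible across %zu usable GPUs.\n",
                    devices.size());
    return 0;
}
```

In other words, what disappears with the SLI connector is the driver doing the linking for you; the explicit adapter path remains, but it only does anything if the game itself is written to manage multiple devices.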
189 Comments
osxandwindows - Friday, August 5, 2016 - link
Finally! A timely review from anandtech.
osxandwindows - Friday, August 5, 2016 - link
Now, where is the HTC10 review, the new titan, and the note 7?

Ryan Smith - Friday, August 5, 2016 - link
HTC 10: In progress (Josh is nearly done)
Titan X Pascal: We weren't sampled
Note7: No comment
ddriver - Friday, August 5, 2016 - link
"Titan X Pascal: We weren't sampled"What do you expect? They send units to be reviewed for publicity, which requires the unit be reviewed immediately after it is received, and the review published the moment NDA expires. But if it takes you months after the official release to review stuff - why bother sending you samples? Keep on sloth gear and you might end up having to purchase all the hardware you want to review...
ddriver - Friday, August 5, 2016 - link
And please don't go with the "but we go in depth" stuff - there is nothing preventing you from publishing detailed stuff later on. Because otherwise you are implying some absurdity like "we're too good for timely reviews" which is plain out silly.

zepi - Friday, August 5, 2016 - link
I'm happy to read average results from techpowerup, guru3d, or whatever random site I happen to find my way to. I come to Anandtech to find out WHY the cards perform the way they do, not to answer the question of HOW they perform.
ddriver - Friday, August 5, 2016 - link
Sure, because it is all about your happiness...

mmrezaie - Friday, August 5, 2016 - link
Well mine too. I do not care about others' shallow reviews. I like how Anandtech goes deep in these reviews. Maybe it is a niche portion of visitors, but AT is famous because of these reviews.

Fnnoobee - Friday, August 5, 2016 - link
Deep in reviews? They're not even doing their testing on the latest AMD Crimson drivers, 16.7.3, or even 16.7.2, which released almost a month ago. Yeah, real deep testing there. /s

mkaibear - Friday, August 5, 2016 - link
Ah, the irony of ddriver complaining that anandtech doesn't make him happy, then telling zepi off for pointing out anandtech makes him happy...