We remind you that the GeForce RTX 3060 Ti is great for gaming at 2.5K, and even with RT (+DLSS) it can deliver acceptable comfort at that resolution. The card without the Ti suffix is noticeably slower: the GeForce RTX 3060 is fully suitable for playing at maximum settings (including RT + DLSS) only in Full HD. Admittedly, at that resolution it usually handles games with RT even without DLSS, but 2.5K is a stretch: Nvidia has spaced the GeForce RTX 3060 and GeForce RTX 3060 Ti quite far apart in performance. There are, of course, games where this card will deliver comfortable gameplay at 2560×1440 without lowering the graphics settings, but those will most likely be network games with relatively simple graphics. From the AMD camp, the competitor is the Radeon RX 5700 XT, which naturally competes only in games without ray tracing. In terms of performance, the Gigabyte card is slightly faster than its reference counterpart. As for the GeForce RTX 3060's built-in protection against Ethereum mining, we will discuss that later.
- Aorus GeForce RTX 3060 Elite features and comparison with Palit GeForce RTX 3060 StormX
- Heating and cooling
- Test results in 3D games
- Test results in mining (hashrate)
Gigabyte Technology (the Gigabyte trademark) was founded in 1986 in Taiwan and is headquartered in Taipei. It was originally created as a group of developers and researchers. In 2004, the Gigabyte holding was formed on the basis of the company; it includes Gigabyte Technology (development and production of video cards and motherboards for PCs) and Gigabyte Communications (production of communicators and smartphones under the GSmart brand, since 2006).
- Graphics Processing: GeForce RTX™ 3060
- Core Clock: 1867 MHz (Reference Card: 1777 MHz)
- CUDA® Cores: 3584
- Memory Clock: 15000 MHz
- Memory Size: 12 GB
- Memory Type: GDDR6
- Memory Bus: 192 bit
- Memory Bandwidth: 360 GB/s
- Card Bus: PCI-E 4.0 x16
- Digital Max Resolution: 7680×4320@60Hz
- Card Size: L=296 W=117 H=56 mm
- PCB Form: ATX
- DirectX: 12 Ultimate
- Recommended PSU: 650 W
- Power Connectors: 8-pin ×1, 6-pin ×1
- Output: DisplayPort 1.4a ×2, HDMI 2.1 ×2
- SLI Support: N/A
1. Quick Guide
2. 4-year warranty
3. Metal sticker
The Aorus GeForce RTX 3060 Elite carries 12 GB of GDDR6 SDRAM in six 16-Gbit chips on the front side of the PCB. The Samsung memory chips (GDDR6, K4Z80325BC-HC16) are rated for a nominal operating frequency of 8000 (16000) MHz.
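The numbers in the spec table above are internally consistent, and it is easy to check: the card runs its memory at 15000 MHz effective (below the chips' 16000 MHz rating), i.e. 15 Gbit/s per pin, across a 192-bit bus. A minimal sanity-check sketch:

```python
# Sanity-check the spec table: effective per-pin data rate times bus width
# gives the quoted memory bandwidth. Values come from the spec sheet above.
effective_rate_gbps = 15          # 15000 MHz effective GDDR6 = 15 Gbit/s per pin
bus_width_bits = 192              # memory bus width

bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8  # bits -> bytes
print(bandwidth_gb_s)  # 360.0, matching the 360 GB/s in the table
```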
The comparison with the Palit card is forced on us: it is currently the only other GeForce RTX 3060 model we have had in for testing (Nvidia has no GeForce RTX 3060 Founders Edition and did not provide reviewers with one). Admittedly, the comparison is not especially interesting: the Gigabyte card is a top-tier model (as the Aorus brand already indicates), while the Palit version is very simple (I almost wrote "budget"). The Palit card has 7 power phases in total (5 for the core and 2 for the memory chips, with empty pads for additional phases, so in principle 7 phases on the core are possible). The Gigabyte card has a much more powerful power subsystem: 8 phases for the core and 2 for the memory (10 in total).
The core power circuitry is outlined in green, the memory circuitry in red. A uP9512R PWM controller (uPI Semiconductor) manages the GPU power phases, and a uP1666Q controller drives the two memory power phases. Both controllers are located on the back of the PCB.
On the front side there is a uS5650Q (also from uPI), which is responsible for monitoring.
The well-known Holtek HT32F52342 controller is responsible for the backlighting.
As is traditional for Nvidia video cards, the power converter uses DrMOS transistor assemblies – here, AOZ5332QI parts (Alpha & Omega Semiconductor) rated for a maximum of 50 A each feed the GPU.
The memory power converter uses AON6994 MOSFETs from the same manufacturer, rated for a maximum current of 19 A.
The board has a dual BIOS, so a switch is installed on the top edge.
The OC and Silent BIOS versions differ mainly in their fan settings (curves); their limit frequencies are almost identical (but the TDP limits differ, so core frequencies may vary between applications).
In the absence of a reference card itself, we will compare frequencies and performance with the already mentioned Palit card.
The Gigabyte card's nominal memory frequencies match the reference values, but in OC BIOS mode the GPU Boost frequency is raised by about 5% over the reference (1867 vs. 1777 MHz), while the Gigabyte card's power consumption (156 W) was lower than the Palit card's. Our testing showed, on average, a 3.7% performance gain in games relative to the reference results. In Silent BIOS mode, the card's maximum power consumption was 149 W, with roughly the same operating frequencies.
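The factory uplift follows directly from the spec table (1867 MHz boost vs. the 1777 MHz reference). A one-line check:

```python
# Factory overclock of the OC BIOS relative to the reference boost clock,
# using the two values from the spec table.
reference_boost_mhz = 1777
card_boost_mhz = 1867

uplift_pct = (card_boost_mhz - reference_boost_mhz) / reference_boost_mhz * 100
print(f"{uplift_pct:.1f}%")  # about 5.1% above the reference boost
```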
I also tried manual overclocking. The power limit can be raised by 15%, so the drivers will most likely sustain the frequencies you set. With +134 MHz on the core and +800 MHz on the memory, I got maximum frequencies of 2152/16500 MHz, which gave an average gain of almost 10% in games at 4K relative to the reference figures. The card's power consumption grew only slightly – to 169 W.
The Gigabyte card is powered via two connectors (one 8-pin and one 6-pin). The set of video outputs differs from Nvidia's recommended layout (3 DP + 1 HDMI), giving the consumer two HDMI 2.1 ports instead of one.
The proprietary Gigabyte Aorus Engine utility is quite simple, but it lets you control all the card's parameters, monitor the accelerator's state, and install the backlight control utility.
Starting with the GeForce RTX 30 series, Nvidia's engineers use a more compact PCB and a new cooling system for reference cards (and recommend that partners do the same), in which part of the airflow blows through the heatsink. Gigabyte took advantage of this approach, although the PCB is not as short as on the Founders Edition cards. The cooler is based on a nickel-plated heatsink with heat pipes in direct contact with the GPU.
The memory chips are cooled by a baseplate into which the same heat pipes are pressed, and the VRM power converters are cooled by a separate plate on the same heatsink.
The back plate serves as a PCB protection element and enhances the rigidity of the board as a whole.
Above the heatsink sits a shroud with three ∅80 mm fans whose blades have a special design (grooves along the impellers) to better direct the airflow.
To minimize air turbulence between the fans, the middle fan rotates in the opposite direction, and all three fans create a kind of “gear effect”.
The fans stop under low load if the GPU temperature drops below 50 degrees. At PC startup the fans spin, but once the video driver loads and polls the operating temperature, they switch off. Below is a video demonstrating this.
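This zero-RPM behavior can be sketched as a simple hysteresis rule. Note this is an illustration, not Gigabyte's actual firmware logic: the 50 °C stop threshold comes from the review, while the restart threshold and the hysteresis band are assumptions to show why the fans do not flap on and off around a single temperature.

```python
# Illustrative sketch of a zero-RPM fan policy with hysteresis (not the
# card's real firmware). The 50 C stop point is from the review; the 55 C
# restart point is an assumed value chosen to avoid on/off flapping.
STOP_BELOW_C = 50     # fans switch off below this GPU temperature
START_ABOVE_C = 55    # assumed restart threshold, deliberately higher

def fans_enabled(temp_c: float, currently_on: bool) -> bool:
    """Return whether the fans should spin at the given GPU temperature."""
    if currently_on:
        return temp_c >= STOP_BELOW_C   # keep spinning until we drop below 50 C
    return temp_c > START_ABOVE_C       # stay off until we climb past 55 C

# Idling at 40 C keeps the card passive; a load spike restarts the fans.
print(fans_enabled(40, currently_on=True))   # False: fans stop at idle
print(fans_enabled(52, currently_on=False))  # False: inside the hysteresis band
print(fans_enabled(60, currently_on=False))  # True: fans spin up under load
```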
Thermal Monitoring with MSI Afterburner:
After a 2-hour run under load, the maximum core temperature did not exceed 57 degrees, which is simply an excellent result for a card of this level. Power consumption reached 156 W.
In Silent BIOS mode, the core temperature reached 61 degrees; the difference is not that significant.
With the described manual overclocking, the heating and noise parameters also changed very little, the maximum consumption increased to 169 W.
We filmed the 8-minute warm-up and sped the footage up 50×:
The maximum heating was observed in the central part of the PCB.
The noise measurement technique assumes a noise-insulated, damped room with reduced reverberation. The system unit in which the video cards' noise is measured has no fans and is not a source of mechanical noise. The background level of 18 dBA is the noise floor of the room plus the sound level meter itself. Measurements are taken 50 cm from the video card, at the level of the cooling system.
- Idle mode in 2D: a loaded web browser with the iXBT.com website, a Microsoft Word window, and several messaging apps
- 2D mode with movie viewing: SmoothVideo Project (SVP) is used – hardware decoding with insertion of intermediate frames
- 3D mode with maximum load on the accelerator: using FurMark benchmark
The assessment of the noise level gradations is as follows:
- less than 20 dBA: conditionally silent
- 20 to 25 dBA: very quiet
- 25 to 30 dBA: quiet
- 30 to 35 dBA: clearly audible
- 35 to 40 dBA: loud but bearable
- above 40 dBA: very loud
In 2D idle mode, the temperature was no higher than 40 °C, the fans did not spin, and the noise level equaled the background level – 18 dBA.
When watching a movie with hardware decoding, nothing changed.
Under maximum 3D load (OC BIOS), the core temperature reached 57 °C. The fans spun up to 1929 rpm and the noise rose to 32.5 dBA: clearly audible, but not loud. In the video below you can hear how the noise grows (recorded for a couple of seconds every 30 seconds).
Under maximum 3D load (Silent BIOS), the core temperature reached 61 °C. The fans spun up to only 1409 rpm and the noise rose to just 24.7 dBA: very quiet. In the video below you can hear how the noise grows (recorded for a couple of seconds every 30 seconds).
It should be remembered that the heat generated by the card remains inside the system unit, so it is advisable to use a case with good ventilation.
The backlighting of the Gigabyte Aorus GeForce RTX 3060 Elite is implemented on the top edge, where the Aorus logo lights up (it also signals when the fans are stopped under low load), as well as around the rim of each fan.
Backlight control is done using RGB Fusion software.
The number of lighting scenarios is not large, but some of them are very attractive and further customizable.
The Aorus GeForce RTX 3060 Elite package no longer even includes a traditional user manual (you can download it from the manufacturer's website), but there is a registration card for the 4-year warranty, as well as a branded sticker.
Bench configuration and tests
- Computer based on AMD Ryzen 9 5950X (Socket AM4) processor:
- AMD Ryzen 9 5950X processor (overclocked to 4.6 GHz across all cores);
- Cougar Helor 240 liquid cooling system;
- Asus ROG Crosshair Dark Hero motherboard based on AMD X570 chipset;
- RAM TeamGroup T-Force Xtreem ARGB (TF10D48G4000HC18JBK) 32 GB (4 × 8) DDR4 (4000 MHz);
- Intel 760p NVMe SSD 1TB PCI-E;
- Seagate Barracuda 7200.14 3TB SATA3 hard drive;
- Seasonic Prime Platinum 1300 W power supply;
- Thermaltake Level20 XT case;
- operating system Windows 10 Pro 64-bit (v.20H2); DirectX 12;
- LG 55Nano956 TV (55″ 8K HDR, HDMI 2.1);
- AMD drivers version 21.3.2;
- Nvidia drivers version 465.89;
- VSync is disabled.
Testing tool list
All gaming tests used the maximum graphics quality in the settings.
- Hitman III (IO Interactive / IO Interactive)
- Cyberpunk 2077 (Softclub / CD Projekt RED), patch 1.2
- Death Stranding (505 Games / Kojima Productions)
- Assassin’s Creed Valhalla (Ubisoft / Ubisoft)
- Watch Dogs: Legion (Ubisoft / Ubisoft)
- Control (505 Games / Remedy Entertainment)
- Godfall (Gearbox Publishing / Counterplay Games)
- Resident Evil 3 (Capcom / Capcom)
- Shadow of the Tomb Raider (Eidos Montreal / Square Enix), HDR enabled
- Metro Exodus (4A Games / Deep Silver / Epic Games)
To measure the Ethereum mining hashrate, the T-Rex miner (0.19.14) was used; the average over 2 hours was recorded in two modes:
- default (power limit reduced to 70%, GPU frequency reduced by 200 MHz, memory frequency at default, fans set manually to 70%)
- optimized (power limit reduced to 70%, GPU frequency reduced by 200 MHz, memory frequency increased by 500–1000 MHz depending on the card, fans set manually to 80%)
To test the Aorus GeForce RTX 3060 Elite, we used the same “leaked” driver version 470.05, in which the protection against mining was disabled.
Standard benchmark results without hardware ray tracing at 1920×1200, 2560×1440, and 3840×2160
Most games still do not support ray tracing, and plenty of cards on the market lack hardware RT support; the same goes for Nvidia's DLSS technology. Therefore, our broadest test suite is still run in games without ray tracing. Nevertheless, about half of the cards we regularly test now support RT, so since autumn 2020 we have been testing not only with conventional rasterization but also with RT and/or DLSS enabled. Naturally, the AMD Radeon RX 6000 cards participate in those tests without a DLSS analogue (we are waiting for the company to deliver the promised equivalent and to speed up its ray tracing calculations).
Test results with hardware ray tracing and / or DLSS enabled at 1920×1200, 2560×1440, and 3840×2160
The diagram clearly shows that, for mining purposes, the Radeon RX 5700 XT (and even the Radeon RX 5700) may now be more profitable than the GeForce RTX 3060. This is probably why AMD video cards have become very expensive and disappeared from sale. We also repeat, just in case, that the GeForce RTX 3060 delivers its full hashrate only with a specific Nvidia driver version, only when the card is installed first in the PC (or second/third, but with an HDMI dongle on a video output), and only over PCIe 2.0 or newer.
Our optimized mining settings do not involve heavy overclocking of the video memory, and additional cooling of the cards is also required. GDDR6X heating on the GeForce RTX 3080/3090 must be watched carefully: the memory's maximum is 110 degrees, and it will not last long if it constantly runs above 100 °C.
The Gigabyte Aorus GeForce RTX 3060 Elite (12 GB) is an excellent mid-range graphics card designed for Full HD resolution at high graphics settings. Its cooler runs quietly while providing entirely adequate cooling. The board is of normal length (it fits into any case) but occupies three slots. The set of video outputs includes two HDMI 2.1 ports instead of the usual one.
Clearly, in some games the GeForce RTX 3060 will have enough performance for 2.5K. And considering that 8 GB used to be enough memory for such resolutions, the current 12 GB on the GeForce RTX 3060 is a forced move: on a card with a 192-bit bus you can install either 6 or 12 GB, and 6 GB might not be enough to make full use of DLSS 2.0. That is why this mid-ranger got more memory than the top-end GeForce RTX 3080.
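The 6-or-12 GB constraint follows from how GDDR6 attaches to the bus: a 192-bit bus splits into six 32-bit channels, one chip per channel, and (in the single-sided layout this card uses) the chips come in 8-Gbit (1 GB) or 16-Gbit (2 GB) densities. A small sketch of the arithmetic:

```python
# Why a 192-bit card gets 6 or 12 GB: six 32-bit channels, one GDDR6 chip
# each, with chips available in 8-Gbit (1 GB) or 16-Gbit (2 GB) densities.
bus_width_bits = 192
channel_width_bits = 32                       # one GDDR6 chip per 32-bit channel
chips = bus_width_bits // channel_width_bits  # -> 6 chips

for chip_gbit in (8, 16):
    total_gb = chips * chip_gbit // 8         # Gbit -> GB per chip, times 6
    print(f"{chip_gbit}-Gbit chips -> {total_gb} GB")
# 8-Gbit chips give 6 GB; 16-Gbit chips give 12 GB (what the RTX 3060 uses)
```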
Let me remind you that, as with the older accelerators of the new series, the GeForce RTX 3060 realizes the full benefits of ray tracing in combination with DLSS. At Full HD, the GeForce RTX 3060's performance with RT + DLSS enabled is more than adequate, and even without DLSS the newcomer shows decent speed in many RT games. The GeForce RTX 3060 competes directly with the Radeon RX 5700 XT and the previous-generation GeForce RTX 2070 and GeForce RTX 2060 Super. It is also clear that a sizable performance gap separates the GeForce RTX 3060 Ti from the GeForce RTX 3060, which could later be filled by another intermediate model.
The GeForce RTX 3060, like the entire GeForce RTX 30 family, offers the interesting Nvidia features we have covered many times: support for HDMI 2.1, which can carry 4K at 120 Hz or an 8K image over a single cable; hardware decoding of AV1 video; RTX IO, which in the future can transfer and decompress data from drives directly to the GPU; and Reflex latency-reduction technology, useful for esports players.
As for Nvidia's much-publicized limiter on the GeForce RTX 3060 for mining with the most popular GPU algorithm, the one Ethereum is based on (and Ethereum is mostly what the miners who bought up all the video cards are digging): yes, the limitation exists and works, but it is lifted by a specific driver version (allegedly released by mistake), which is not hard to find. And, of course, nothing prevents using the GeForce RTX 3060 to mine other cryptocurrencies.
But do not forget that under the recently adopted warranty terms, video card manufacturers can refuse warranty service if they determine that the card was used for mining (the rules phrase it as use "for profit"). For Gigabyte cards, we recommend reading the rules here.