I rendered some projects on my 3090, then fitted my new (and very large!) 4090 and ran these non-scientific tests for comparison. For reference, I also ran the same tests on my Mac Studio Ultra, which beats the 3090 in all but one of the Resolve tests.

Latest PC version of Resolve (18.0.4 build 5)
Latest Mac version of Resolve (18.0.2 build 78)

All files were pulled from my NAS over 10Gb Ethernet and rendered out in h264/mp4, onto my PC's secondary SSD and the Mac's internal SSD.

An older SD mp4 (720x480), super scaled x3 within Resolve, with 75% temporal noise reduction plus general sharpening and colour correction:
3090 did it at 53fps, render time 5 mins 5 secs
4090 did it at 92fps, render time 3 mins 2 secs
Mac Studio did it at 59fps, render time 4 mins 34 secs

An older SD mp4 (720x480), super scaled x3 within Resolve, with maximum temporal noise reduction plus general sharpening and colour correction:
3090 did it at 57fps, render time 9 mins 7 secs
4090 did it at 107fps, render time 5 mins 0 secs
Mac Studio did it at 63fps, render time 8 mins 23 secs

A 1080p HD mp4 (1920x1080), super scaled x2 within Resolve, with 75% temporal noise reduction plus general sharpening and colour correction:
3090 did it at 40fps, render time 18 mins 40 secs
4090 did it at 72fps, render time 10 mins 39 secs
Mac Studio did it at 44fps, render time 16 mins 31 secs

A 1080p HD mp4 (1920x1080), with 10% temporal noise reduction and no other correction:
3090 did it at 117fps, render time 8 mins 2 secs
4090 did it at 174fps, render time 5 mins 26 secs
Mac Studio did it at 80fps, render time 11 mins 43 secs

I also did a file conversion/upscale of a 17 min, 29 sec clip using the latest version of Topaz Video Enhance, since it really works the GPU.

I'm not really bothered about other online benchmarks such as gaming and Blender, as they're irrelevant to me, but the tests above are what I do day in, day out, so any extra performance helps my work, and the results are pretty impressive.

I expected to pay over £2000 for the card, as prices were vague until 2pm on launch day, but I got one for £1699. Overall I'm very happy with the 4090 for my use, especially considering Resolve doesn't appear to be fully optimised for it yet, the card having been out for less than 48 hours.

---

The general standard operating rule at Apple is to stay quiet when they're far from launch, so "being quiet" isn't indicative of anything other than normal. I think you've got the "company that begins with 'A'" mixed up: if anyone is laying back to see what they have to cover, it's AMD, not Apple.

It's extremely unlikely Apple is using the 4090 as a benchmark at all. They explicitly said they wanted to have the fastest iGPU. I doubt Apple is trying to directly kill off every dGPU that could be used in a desktop or server on a card; it's more the dGPUs that might show up in a laptop (or most all-in-one desktops). As long as they have to allocate die space to CPU cores, SSD controllers, secure-element cores, Thunderbolt controllers, and a much higher than average number of memory controllers, they just aren't likely to "win" in the biggest-die desktop GPU card space. At some point "more allocation" becomes too big a hurdle: you can't put everything and the kitchen sink on a single die, and eventually you have to make transistor-budget (and package-construction-cost) trade-offs. Nvidia, for instance, tossed NVLink off the 4090 to allocate space to more specialised cores (RT cores, I think).

Nvidia's 4090 will likely top the AMD 7900 on many metrics, but probably not perf/$. Later in 2023 AMD could do a "7950" with double the cache just by stacking extra 3D V-Cache on top, tweak some clocks, and likely close most (if not all) of the gap. Nvidia has already stretched the 4090's overclocking about as far as it goes without more exotic cooling, so there wouldn't be much of a response. It's extremely doubtful Apple wants to jump into the middle of that pissing contest. AMD and Nvidia will mostly sell units that aren't 4090s or 7900s anyway; how well Apple covers the mid-range (7600-77/4070 respectively) is the far more strategically critical piece.

The bigger blunder Apple may have wandered into here is blowing up any ability to add a "compute GPU" card to the Mac Pro (a GPGPU that just adds more compute grunt, with no display outputs), or even an Intel Flex 140-170 if someone needed AV1 hardware de/encode (which Apple is likely to skip this iteration). They are likely to land short of the Nvidia 4090 (and pretty likely the AMD 7900), and they don't have a huge fab-process lead to lean on.
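As a quick cross-check of the Resolve figures quoted above (this check is mine, not from the original poster): for each test, fps multiplied by elapsed render seconds approximates the clip's frame count, so all three machines should land on roughly the same number, and the fps ratio gives the 4090-over-3090 speedup. The figures are copied from the post; the function and variable names are my own.

```python
def frames(fps, minutes, seconds):
    """Approximate frames rendered = fps * elapsed seconds."""
    return fps * (minutes * 60 + seconds)

# test name -> ((3090), (4090), (Mac Studio Ultra)) as (fps, mins, secs)
tests = {
    "SD x3 upscale, 75% NR": ((53, 5, 5), (92, 3, 2), (59, 4, 34)),
    "SD x3 upscale, max NR": ((57, 9, 7), (107, 5, 0), (63, 8, 23)),
    "HD x2 upscale, 75% NR": ((40, 18, 40), (72, 10, 39), (44, 16, 31)),
    "HD, 10% NR only": ((117, 8, 2), (174, 5, 26), (80, 11, 43)),
}

for name, (g3090, g4090, mac) in tests.items():
    speedup = g4090[0] / g3090[0]  # 4090 fps relative to 3090 fps
    counts = [frames(*r) for r in (g3090, g4090, mac)]
    print(f"{name}: 4090 is {speedup:.2f}x the 3090; frame counts {counts}")
```

Running this, the three frame counts for each test agree within a few percent, so the reported numbers are self-consistent, and the 4090's speedup over the 3090 ranges from roughly 1.5x (light noise reduction) to nearly 1.9x (max noise reduction).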