vlad01 wrote:I do it the other way: I connect all my PCs to the same display where I can and switch inputs depending on which PC I'm using. The only issue is that my other PC currently has DVI only, while my screen has DP and HDMI, so I need to get an adapter.
The LG monitor I got last year is 1440p, a great upgrade from 900p, that's for sure!
N0B0DY wrote:I've upped my RAM from 16 to 32 GB ([email protected] QuadChannel). But I couldn't find a kit with the same latency as my original one (CL15) and had to compromise with CL16. Hope it's not a big deal performance-wise...
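For what it's worth, the absolute latency difference between CL15 and CL16 is tiny. A quick back-of-the-envelope check (assuming a DDR4-3200 kit here purely for illustration, since the exact speed isn't legible above) shows it's sub-nanosecond:

```python
# Rough CAS-latency comparison in nanoseconds.
# Assumption: 3200 MT/s is a placeholder speed, not taken from the post above.
def cas_latency_ns(cl: int, transfer_rate_mts: int) -> float:
    # The I/O clock in MHz is half the transfer rate (DDR = double data rate);
    # one CL cycle takes 1000 / clock_mhz nanoseconds.
    clock_mhz = transfer_rate_mts / 2
    return cl * 1000 / clock_mhz

for cl in (15, 16):
    print(f"CL{cl} @ 3200 MT/s: {cas_latency_ns(cl, 3200):.3f} ns")
# CL15 @ 3200 MT/s: 9.375 ns
# CL16 @ 3200 MT/s: 10.000 ns
```

So at the same transfer rate you're looking at roughly 0.6 ns, which in practice is a low-single-digit percentage difference at most, and only in memory-sensitive workloads.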
vlad01 wrote:So now that the 30 series reviews are out what do people think?
To me it looks a lot like a repeat of the 200 series back in '09 or so. Back then there was a massive increase (almost double) in FPUs, a massive increase in transistor count, and a big TDP increase, but only a modest gain in performance. Actually, these gains are even smaller than what the 200 series delivered over the 9800 series.
Overall, not bad in a vacuum and at a decent price point, but technologically kind of meh: the node shrink and the extra transistors aren't doing much for performance or efficiency. In fact, efficiency is almost the same as the 20 series despite the node shrink.
The other thing I have noticed in all the reviews is that the architecture scales poorly below 4K, which is really odd. So anyone getting one of these for 1080p or 1440p is going to be disappointed if they already have a 20 series card or even one of the higher-end Pascal cards. The 3080 was only showing gains of about 20% over the 2080 Ti at 1440p, and IIRC around 15%(?) at 1080p, while 4K was a bit over 30%, which is decent.
As for the yet-to-be-reviewed 3090, I reckon it might only outpace the 3080 by 10%, give or take: it has 19% more FPUs, but those FPUs already scale poorly even at the 3080's lower count.
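The back-of-the-envelope behind that guess can be sketched with a simple scaling-efficiency factor. The ~0.5 efficiency here is my own illustrative assumption (half the extra units turn into FPS), not a measured number:

```python
# Naive estimate of the 3090's gain over the 3080 from FPU count alone.
# Assumption: scaling efficiency of 0.5, chosen only to illustrate how
# sub-linear scaling shrinks a 19% unit advantage to roughly 10%.
def estimated_gain(extra_units: float, efficiency: float) -> float:
    # extra_units: fractional increase in FPUs (0.19 = 19% more)
    # efficiency: fraction of that increase realized as performance
    return extra_units * efficiency

gain = estimated_gain(0.19, 0.5)
print(f"Estimated gain: {gain:.1%}")  # Estimated gain: 9.5%
```

With perfect scaling (efficiency = 1.0) you'd expect the full 19%, so anything around 10% in reviews would confirm the poor scaling at high FPU counts.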
Overall I feel it's better than the 20 series, but I'm still not impressed given the specs on paper and the figures Nvidia claimed pre-review. That's my 2c anyway.