Either way you're gonna grab a 1070 :tongue:
Wow, looks like the next time I feel the urge to buy a CPU I will get Ryzen
https://segmentnext.com/articles/amd...-3rd-gen-cpus/
Let's hope the Ryzen 3 series will be great, so gamers can grab Intel's latest gaming chip a bit cheaper :tongue:
PS
Thanks for posting here guys! I'll get back to Tech threads, as soon as I finish my latest interview for Content ;)
Does anyone here have an RTX 2080? I've been playing on RTX cards lately, and I feel as if modern games aren't handling that power too well for the moment.
Ray Tracing is the kind of marketing bull Nvidia pulls every time it wants to boost sales and stay dominant in the market. Sure, the featured tech is amazing in demos and benchmarks, but what about actual gaming? Yeah, it :wub:s the hardware in real-time gaming.
-Earlier it was "PhysX"; now it's just a joke, and the majority of devs don't even use it. Hell, the last major title to feature it, The Witcher 3, puts the PhysX load on the CPU by default instead, since modern CPUs are far better at physics calculations.
-Then came Hairworks; again, only a few titles included it, and it still hasn't been adopted on a widespread scale among game developers. FC4, TW3, COD, and AMD's TressFX in Tomb Raider. And that's it.
So Ray Tracing is no different and won't be predominant for a long time, if ever. Nvidia just likes to come up with demanding features and then pay ("fund") developers to include them, to lure gamers into buying their tech. Devs go along with it because no one is going to turn down a generous fund that covers their costs and comes with free advertising. But the real deal-breaker here is the price-to-performance ratio; RTX is bloody ridiculous. And the funny thing is that even with an RTX 2080 you can't play BF at ultra settings at 1080p lol.
Well, that's one way to see it, and I partially agree with you. It's demanding as hell and still too early to fully enjoy or implement, but that's the same for every new tech. If it wasn't that good and future-proof, then AMD wouldn't adopt it, and last time I checked, AMD's CEO said they'll surely use it and support it with their cards because it's an important feature. I agree that Hairworks was badly designed and written. PhysX is another story though, and I blame devs for its failure, not NVIDIA.
Anyway, RT is still quite expensive and not performance-friendly, which is why I kept my 1080 Ti. But let's not exaggerate things that much; time will tell, I guess... ;)
I think you misunderstood me; my criticism is of the current RTX cards, not the tech. I'm not exaggerating: there are no AAA games that employ ray tracing as their primary rendering method. A fully ray-traced modern game in real time is still some way off. Current ray tracing games only include a few ray-traced features, and we know how demanding they are right now. Ray tracing an entire game in real time just isn't possible at the moment, and in fact current game engines aren't designed for it.
The current goal of DXR (Nvidia's RTX hardware plus Microsoft's DirectX Raytracing API) is to apply ray tracing in certain areas/scenarios where traditional rasterization does poorly and where ray tracing can offer better visuals at comparatively less performance cost - it's still not an overhaul. So right now it's just a hype train, as Nvidia wants to push for sales after taking a massive 73% hit in profits. Anyway, we're certainly going in the right direction, but it will take time to develop on the gaming end.
Oh I see...Sure, I agree with that approach :thumbsup2
The little samples we have seen so far are certainly promising, but its overall development is indeed very slow. I also believe NVIDIA jumped the gun with Ray Tracing.
Btw, have you seen NVIDIA's latest trick to get rid of the remaining GTX models?
Pascal GeForce GTX cards could get ray tracing with driver update
All new technology is problematic. I remember how annoying Crossfire and SLI setups were even back during Fermi and AMD's Evergreen family. Speaking of which, Eyefinity was actually a very clean launch. The Eyefinity 6 Screen edition was a nightmare to navigate, though.
Any news on the latest updates to Windows 10?
Don't know anything about that, but the latest AMD CPUs and GPU are great.
Aye, I was impressed. Intel is a tiny bit better in gaming tasks still but not by much and for absolutely everything else that matters AMD is head and shoulders better.
The only reason Intel is better in gaming is that games use DirectX (instead of Vulkan) and that devs don't make full use of multithreading and all the available cores. I don't blame game devs for it; multithreading is very hard to do properly without bugs.
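A toy sketch (my own example, nothing from this thread) of why that's hard: several threads incrementing a shared counter without a lock can silently lose updates, because "counter += 1" is a read-modify-write that isn't atomic. The same code with a lock is always correct:

```python
# Toy illustration of a data race: 4 threads each increment a shared
# counter; without a lock, increments can be lost.
import threading

def run(increments, use_lock):
    counter = 0
    lock = threading.Lock()

    def worker():
        nonlocal counter
        for _ in range(increments):
            if use_lock:
                with lock:       # serialize the read-modify-write
                    counter += 1
            else:
                counter += 1     # racy: another thread may interleave here

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print(run(100_000, use_lock=True))   # always 400000
print(run(100_000, use_lock=False))  # may come up short due to lost updates
```

And that's just one shared integer; a game engine juggling thousands of objects across threads multiplies this problem everywhere, which is why devs are cautious.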
AMD is better bang for buck in general IMO, and I don't think the slight edge in gaming that Intel is offering is at all worth it. Just for the sake of future-proofing, I'd take an AMD mobo and CPU. Of course, I'm not really going to be buying anything anytime soon. Still rocking an i7-4790K. Maybe in 2020 I'll rebuild my PC.
I do agree with that. If I were interested in buying a new CPU, it would definitely be AMD, one of the key reasons being that they make sure their CPUs are generally backwards compatible with older motherboards.
I'm on the low end of gaming, so the Intel edge is necessary until such time as I can pick up an AMD chip that provides equal single-core power and software support on the mediocre budget I have available.
Personally, I've always found GPUs make far more of an impact on my FPS than a CPU nowadays. For example, moving from a 3770K to an 8086K did almost nothing for my FPS in AC Origins, despite it being somewhat of a CPU-intensive game.
Older games may be affected ever so slightly, but I doubt the difference is very significant, since only the most expensive CPUs (e.g. i5s/i7s/i9s) have enough overclocking headroom to really affect things, and that's discounting the architecture improvements that newer models have.