

Realistic build for 2k 1080i 70fps on a budget upgrading to 4k


34 replies to this topic

WindSplitter1 #21 Posted 16 June 2019 - 02:17 PM

    Colonel

  • Player
  • 20373 battles
  • 3,638
  • [ORDEM] ORDEM
  • Member since:
    02-07-2016

ValkyrionX, on 16 June 2019 - 08:51 AM, said:

I have to debunk myths and legends about gaming

 

Playing in 2K or 4K is almost pointless, or at least you have to spend a lot on components, including the screen.


To play WoT today in FHD at maximum details without any FPS problems, you just need a 4GB RX 570 and a decent Ryzen or Intel processor with at least 4 cores to take proper advantage of multithreading.


We must also debunk the myth of RAM frequencies: high-frequency RAM gives no real benefit for gaming. It can only help those who work with gigantic files that need to be compressed or decompressed; as far as gaming goes [even competitive], high-frequency RAM is useless.

 


Playing in 2K or 4K means spending a lot on a build, starting from the motherboard: whether Intel or AMD socket, you have to buy something really good. Then there's the processor, which obviously must be top of the range, and the same goes for the RAM; if I had to do a build of this kind I would mount at least 16GB of G.SKILL or HyperX RAM at 3000MHz or more. Obviously you also need a powerful GPU that can handle 2K and 4K: cards like the 1080 Ti, the 2080 Ti, the 2070, or on the AMD side the new Navi 5700 XT or a liquid-cooled, overclocked Vega 64.

 

Of course quality components matter too: the power supply must be powerful, with an 80 Plus Platinum certification and a wattage of at least 800 watts [this depends on the components you buy].

 

Obviously, for a config like this, the cooling system and the case must be able to dissipate the heat of a build that will be under a heavy workload during the game.


The monitor shouldn't be forgotten either: if I built a rig like this I'd buy a 240Hz monitor compatible with FreeSync and G-Sync, maybe from BenQ or Samsung.

 

However, spending so much money to play WoT doesn't make much sense, given that today you can build much less expensive rigs with 144Hz [or less] FHD monitors up to 27 inches.

 

Anyway, I strongly advise you to wait until the end of 2019 for lower prices on the Zen 2 versions of AMD Ryzen and on the new AMD Navi GPUs (RDNA architecture). You'll also have more choice, since by the end of 2019 AMD will have launched almost everything they announced at their conference.

 

Building a PC now is madness. Wait a few months: prices will fall on previous-generation components and you will also have the choice of new generations of processors, motherboards, RAM, and GPUs, as well as better NVMe SSDs with the new PCI-E 4.0 technology.

 

 

edit:

personally I find it senseless to spend maybe 1500 euros on a PC for 2K or 4K and then play at maximum details at about 70 FPS; I'd far rather play at 1080p [or at most 1440p] and get 240 or 144 FPS

 

Mine cost me between 0.9-1.1K€ ("stock" GPU).

 

Intel Core i7-8700

G.SKILL Aegis 8GB ~ 3000MHz

MSI Z370 SLI Plus

 

etc

 

I intend to expand the RAM to 32GB and the GPU to 8GB.

 

For gaming? Heck, no. The old rig was good enough for that. But for video editing, 3D modelling + rendering, etc.? Can't see how that doesn't make sense.

 

Building a 2K, never mind an 8K, rig for the sole purpose of gaming is a waste of thought to begin with.


Edited by WindSplitter1, 16 June 2019 - 02:18 PM.


KanonenVogel19 #22 Posted 16 June 2019 - 02:22 PM

    Warrant Officer

  • Player
  • 390 battles
  • 585
  • Member since:
    04-05-2019

ValkyrionX, on 16 June 2019 - 09:51 AM, said:

We must also debunk the myth of RAM frequencies: high-frequency RAM gives no real benefit for gaming. It can only help those who work with gigantic files that need to be compressed or decompressed; as far as gaming goes [even competitive], high-frequency RAM is useless.

 

That is actually wrong. If you understand how computer hardware and programming works, it becomes quite obvious that the frequency matters quite a lot, up to a limit of course. Thing is, you can't store everything in registers or cache, so things that are not needed at the very moment in the code get moved to RAM. Thus, any application that uses a decent amount of RAM will benefit from higher RAM frequencies.

 

If your CPU makes a large number of small requests to RAM, it's the frequency that will determine the performance. If your CPU makes a small number of large requests to RAM, it's the bandwidth that will determine the performance. In games, we usually don't load large blocks of data at once, but rather small to medium sized blocks, and we do it very often. That's why frequency is important.

 

So what happens if you have a low frequency? Simply put, your CPU will have to spend a lot of time "waiting" for the RAM. In some cases, it can execute other instructions in the meantime, but in other cases, it needs the data from RAM before it can do the other instructions. How beneficial will a high frequency be? That's impossible to say, because it depends on the application itself, and the ratio of regular instructions vs memory fetches. But I think it's very wrong to say that it doesn't give any benefits in gaming, because it does.
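The small-vs-large request trade-off above can be put into a back-of-envelope model. All the figures here (latency, bandwidth, request sizes) are made-up illustrative numbers, and the model deliberately ignores caches, prefetching, and parallelism:

```python
def transfer_time_s(n_requests, bytes_per_request, latency_s, bandwidth_bps):
    """Naive model: every request pays a fixed latency, then streams its payload."""
    return n_requests * (latency_s + bytes_per_request / bandwidth_bps)

# Same total data (64 MB) moved two ways, with hypothetical figures:
# ~80 ns latency per request, ~25 GB/s bandwidth.
many_small = transfer_time_s(1_000_000, 64, 80e-9, 25e9)   # latency-dominated
few_large = transfer_time_s(1_000, 64_000, 80e-9, 25e9)    # bandwidth-dominated

print(f"many small requests: {many_small * 1000:.2f} ms")  # 82.56 ms
print(f"few large requests:  {few_large * 1000:.2f} ms")   # 2.64 ms
```

Even in this crude sketch, moving the same data in many small requests is dominated by the per-request latency term, which is the regime where memory clocks and timings matter most.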



ValkyrionX_TV #23 Posted 16 June 2019 - 03:23 PM

    Lieutenant Colonel

  • Moderator
  • 56025 battles
  • 3,028
  • [RDDT] RDDT
  • Member since:
    02-07-2015

KanonenVogel19, on 16 June 2019 - 02:22 PM, said:

 

That is actually wrong. If you understand how computer hardware and programming works, it becomes quite obvious that the frequency matters quite a lot, up to a limit of course. Thing is, you can't store everything in registers or cache, so things that are not needed at the very moment in the code get moved to RAM. Thus, any application that uses a decent amount of RAM will benefit from higher RAM frequencies.

 

If your CPU makes a large number of small requests to RAM, it's the frequency that will determine the performance. If your CPU makes a small number of large requests to RAM, it's the bandwidth that will determine the performance. In games, we usually don't load large blocks of data at once, but rather small to medium sized blocks, and we do it very often. That's why frequency is important.

 

So what happens if you have a low frequency? Simply put, your CPU will have to spend a lot of time "waiting" for the RAM. In some cases, it can execute other instructions in the meantime, but in other cases, it needs the data from RAM before it can do the other instructions. How beneficial will a high frequency be? That's impossible to say, because it depends on the application itself, and the ratio of regular instructions vs memory fetches. But I think it's very wrong to say that it doesn't give any benefits in gaming, because it does.

 

For gaming, 2666MHz RAM is fine; the gains from RAM above 3000MHz show up mostly in benchmarks, with almost no real benefit in games, only in productivity and work.


Going beyond that is just wasted money.

 

Don't forget that the memory controller, the motherboard, and the speed of communication with solid-state drives also affect RAM performance. The frequency of the RAM affects the real/theoretical bandwidth and the response time in nanoseconds, but the gains from frequencies above 3000MHz do not justify the prices for gaming alone.
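The "response in nanoseconds" point can be made concrete with the standard first-word latency formula: CAS latency is counted in memory-clock cycles, and the memory clock is half the DDR data rate. The specific kits below are just example figures:

```python
def true_latency_ns(cas_latency_cycles, data_rate_mt_s):
    # DDR transfers twice per clock, so the memory clock in MHz is
    # data_rate / 2, and one cycle lasts 2000 / data_rate nanoseconds.
    return cas_latency_cycles * 2000.0 / data_rate_mt_s

print(true_latency_ns(16, 2666))  # DDR4-2666 CL16 -> ~12.0 ns
print(true_latency_ns(16, 3200))  # DDR4-3200 CL16 -> 10.0 ns
print(true_latency_ns(18, 3600))  # DDR4-3600 CL18 -> 10.0 ns
```

Note how a faster kit with looser timings can land on the same true latency as a slower, tighter one, which is part of why the headline frequency alone doesn't tell the whole story.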



tomatoRNGmaster #24 Posted 16 June 2019 - 03:33 PM

    Field Marshal

  • Player
  • 39127 battles
  • 12,552
  • Member since:
    03-18-2013
I'm just wondering which marketing guru came up with 2k and managed to actually get people to use that non-term. What resolution is that even supposed to be?

KanonenVogel19 #25 Posted 16 June 2019 - 03:37 PM

    Warrant Officer

  • Player
  • 390 battles
  • 585
  • Member since:
    04-05-2019

ValkyrionX, on 16 June 2019 - 03:23 PM, said:

For gaming, 2666MHz RAM is fine

 

As I explained, you simply can't state that X MHz is sufficient or not, because it depends on the application. In one game even 1800 MHz will be overkill; in another you'll see performance improvements at 3200 MHz. Not to mention that this also largely depends on whether you're CPU or GPU bound. The point is you can't make a general case out of this. I agree, higher frequencies come at a higher price, and that price needs to be taken into consideration, as do your other hardware components. But stating that higher frequencies are useless is technically and logically wrong.



ValkyrionX_TV #26 Posted 16 June 2019 - 03:40 PM

    Lieutenant Colonel

  • Moderator
  • 56025 battles
  • 3,028
  • [RDDT] RDDT
  • Member since:
    02-07-2015

VonniVidiVici, on 16 June 2019 - 03:33 PM, said:

I'm just wondering which marketing guru came up with 2k and managed to actually get people to use that non-term. What resolution is that even supposed to be?

 



tomatoRNGmaster #27 Posted 16 June 2019 - 03:52 PM

    Field Marshal

  • Player
  • 39127 battles
  • 12,552
  • Member since:
    03-18-2013

ValkyrionX, on 16 June 2019 - 03:40 PM, said:

 

 

So it's ever so slightly wider than regular 1080p. That's... useful, I guess? :unsure: It sounds far more impressive than "full HD plus a bit", though, which I guess is all that matters to the average consumer.
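For reference, the pixel math behind these labels, with DCI 2K (2048x1080) being the "slightly wider than 1080p" format in question, works out like this:

```python
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "DCI 2K":          (2048, 1080),
    "1440p (QHD)":     (2560, 1440),
    "4K (UHD)":        (3840, 2160),
}

base = 1920 * 1080  # Full HD pixel count
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x Full HD)")
```

DCI 2K comes out at only about 1.07x the pixels of Full HD, while 1440p is roughly 1.78x and 4K UHD is exactly 4x, which backs up the "full HD plus a bit" reading.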


Edited by VonniVidiVici, 16 June 2019 - 03:53 PM.


ValkyrionX_TV #28 Posted 16 June 2019 - 03:55 PM

    Lieutenant Colonel

  • Moderator
  • 56025 battles
  • 3,028
  • [RDDT] RDDT
  • Member since:
    02-07-2015

VonniVidiVici, on 16 June 2019 - 03:52 PM, said:

 

So it's ever so slightly wider than regular 1080p. That's... useful, I guess? :unsure:

 

In reality it doesn't change that much; the best resolution to date remains Full HD 1080p.



tomatoRNGmaster #29 Posted 16 June 2019 - 04:04 PM

    Field Marshal

  • Player
  • 39127 battles
  • 12,552
  • Member since:
    03-18-2013

ValkyrionX, on 16 June 2019 - 03:55 PM, said:

 

In reality it doesn't change that much; the best resolution to date remains Full HD 1080p.
 

 

Yeah, it doesn't seem like something particularly noticeable.

Personally I like 1440p for gaming: same aspect ratio as 1080p, but it doesn't look like arse on monitors larger than 24". :great:

 

Nishi_Kinuyo, on 16 June 2019 - 12:50 PM, said:

Oh, and they advise "only" a 750W power supply for the RTX 2080 Ti, and in theory a PC with one of those should never reach the maximum power draw the PSU is capable of delivering.

 

Fun fact: the minimum rated wattage for a system with a 2080 Ti is only 650 watts, which is... cute and incredibly misleading. The system might not actually pull that much wattage, but a 2080 Ti (at least in an overclocked system) is going to hammer the 12V rail, requiring more than PSUs of that wattage can typically deliver.


Edited by VonniVidiVici, 16 June 2019 - 04:05 PM.


Nishi_Kinuyo #30 Posted 16 June 2019 - 04:36 PM

    Lieutenant General

  • Player
  • 10251 battles
  • 7,093
  • [GUP] GUP
  • Member since:
    05-28-2011

VonniVidiVici, on 16 June 2019 - 04:04 PM, said:

Fun fact: the minimum rated wattage for a system with a 2080 Ti is only 650 watts, which is... cute and incredibly misleading. The system might not actually pull that much wattage, but a 2080 Ti (at least in an overclocked system) is going to hammer the 12V rail, requiring more than PSUs of that wattage can typically deliver.

Eh, not really; if the PSU has two 8-pin PCIe connectors then it should have no problem with that.

Besides, a 2080 Ti with two 8-pin PCIe connectors is rated for 375W power draw.

You'd need to do some serious overclocking to cap out that 650W PSU; without it, I estimate a maximum power draw of around 575W, and that's assuming 100% CPU & GPU load.

Regardless, buying the absolute minimum usually isn't the best choice unless you're really limited in budget (and then, why even a 2080 Ti?).
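The ~575 W estimate above can be sanity-checked with a rough component sum, along with the 12V amperage point raised earlier in the thread. All the component draws below are illustrative guesses, not measurements:

```python
# Rough worst-case draws in watts (illustrative figures, not measurements).
draws = {
    "RTX 2080 Ti (2x 8-pin + slot)": 375,  # 150 + 150 + 75 W connector limits
    "CPU under full load": 120,
    "motherboard, RAM, drives, fans": 80,
}

total_w = sum(draws.values())
print(f"estimated system draw: {total_w} W")      # 575 W

# Most of that lands on the 12V rail, so the amp rating matters too:
amps_12v = total_w / 12
print(f"needed on the 12V rail: ~{amps_12v:.0f} A")  # ~48 A
```

Whether a given 650W unit actually delivers ~48 A on its 12V rail depends on the model, which is exactly why a single headline wattage can mislead.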



Scabolcz #31 Posted 16 June 2019 - 04:41 PM

    Staff Sergeant

  • Player
  • 18173 battles
  • 398
  • Member since:
    07-16-2014
4K is overrated; you don't need such a resolution for gaming... You'd be better off with an ultrawide monitor.

I myself have a 1080 and a 3440x1440 monitor at a steady 120-130 fps.

Edited by Scabolcz, 16 June 2019 - 04:42 PM.


tomatoRNGmaster #32 Posted 16 June 2019 - 04:46 PM

    Field Marshal

  • Player
  • 39127 battles
  • 12,552
  • Member since:
    03-18-2013

Nishi_Kinuyo, on 16 June 2019 - 04:36 PM, said:

Eh, not really; if the PSU has two 8-pin PCIe connectors then it should have no problem with that.

Besides, a 2080 Ti with two 8-pin PCI connectors is rated for 375W power draw.

You'd need to do some serious overclocking to cap out that 650W PSU; without it, I estimate a maximum power draw of around 575W, and that's assuming 100% CPU & GPU load.

Regardless, buying the absolute minimum usually isn't the best choice unless you're really limited in budget (and then, why even a 2080 Ti?).

 

Well, I first had a 650-watt PSU which ran absolutely fine until I swapped my 2080 for a 2080 Ti, after which my system would randomly reboot when running resource-intensive games (mostly KC: D at the time). I did some tests and checked what the system was actually using (it pulled at most 600 watts from the wall during gaming), and eventually fixed the issue by swapping the PSU for an 850-watt version of the same model, which had 70 amps on the 12V rail vs 64 amps on the lower-wattage PSU. Hence my conclusion that nVidia's listed minimum wattage is misleading: wattage ain't the issue; they should be listing amperage.

 

My system isn't super heavily overclocked, by the way; I'm too paranoid about liquid to use a custom loop and I don't see the point in AIOs, so I'm limited to what I can achieve on air cooling.


Edited by VonniVidiVici, 16 June 2019 - 04:46 PM.


Pansenmann #33 Posted 16 June 2019 - 05:11 PM

    Field Marshal

  • Player
  • 37028 battles
  • 14,159
  • [WJDE] WJDE
  • Member since:
    08-17-2012

the advantage of 2560x1440 is, imo, that you can scale up 1080p material without it looking too bad.

 

regarding RAM clocks - my Ryzen 5 2600, running at 3.8 GHz, liked it when I overclocked the memory a bit.



Bordhaw #34 Posted 16 June 2019 - 09:18 PM

    Major General

  • Player
  • 15639 battles
  • 5,333
  • Member since:
    01-29-2017

Scabolcz, on 16 June 2019 - 03:41 PM, said:

4K is overrated; you don't need such a resolution for gaming... You'd be better off with an ultrawide monitor.

I myself have a 1080 and a 3440x1440 monitor at a steady 120-130 fps.

 

VonniVidiVici, on 16 June 2019 - 02:33 PM, said:

I'm just wondering which marketing guru came up with 2k and managed to actually get people to use that non-term. What resolution is that even supposed to be?

 

8K will be the limit.

 

It was the same selling point with digital cameras - must have more megapixels.



Captain_Kremen0 #35 Posted 17 June 2019 - 09:50 AM

    Captain

  • Player
  • 39622 battles
  • 2,494
  • [TFMB] TFMB
  • Member since:
    06-04-2011

jack_timber, on 16 June 2019 - 12:03 PM, said:

A query....

I play in full HD; what is the difference between HD and 2K or 4K?

Is there an advantage?

TY.

 

 

 

Yes - your e-cock will be much bigger than everyone else's.

A 4K TV is a bit like Bully's star prize - fecking useless in the real world, where nobody actually transmits in 4K.





