
PC Hardware 7dtd


Stockjunkee84


I'd like to start a thread on how performance varies with different hardware. I know the best GPU/CPU for the game has been discussed before, but I haven't seen much on RAM, or on how the game scales across multiple cores versus single-core performance.

 

I currently run an 8700K @ 4.7 GHz

1080 Ti FTW3

2400 MHz DDR4

 

At a mix of High and Ultra settings, with grass distance and tree detail on medium and high, at 1440p with G-Sync, my fps swings from as high as the 140s to plunging into the mid 40s fairly regularly. I've seen RAM usage of 13 GB (total system, hosting with 2 friends), 10 GB in SP, and VRAM usage at 9 GB.

 

With such heavy RAM usage, has anyone tested whether the game runs more stably with faster RAM?

 

Or whether it can utilize a high number of cores? My 8700K shows high utilization across all cores and runs warmer than in most games I play.
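
One rough way to check the core-scaling question yourself (just a sketch, not anything official; it assumes Python 3 with the psutil package installed) is to log per-core load and total RAM use while you host:

```python
# Minimal sketch: log per-core CPU load and total RAM use while the game is running.
# Assumes Python 3 with the psutil package installed (pip install psutil).
import time
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % load of each logical core over 1 s
    ram = psutil.virtual_memory()
    cores = " ".join(f"{c:3.0f}" for c in per_core)
    print(f"cores: {cores}   RAM used: {ram.used / 2**30:.1f} GiB ({ram.percent:.0f}%)")
    time.sleep(4)  # one sample roughly every 5 seconds
```

If the load spreads fairly evenly across all cores, core count matters; if one core is pegged while the rest idle, single-core speed (and possibly faster RAM) is the better lever.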


I've been playing this game for years and my PC has seen many upgrades in that time. By far the upgrade that made the biggest difference to performance was moving to an NVMe SSD. Other than that, my full spec is:

 

Intel i7-6700 @ 4MHz

nVidia GTX 980 Ti 6GB

16 GB RAM

 

I run a server on that for 1 to 3 other players (usually 1 other guy these days). I have everything maxed and 24 Zombies, except:

 

UMA: Lowest

Reflections: Off

Reflected Shadows: No

Water Quality: Low

 

I get solid 60fps. I also have Depth of Field and Motion Blur off but only because I dislike such effects. UMA is the weird one; I've never managed to spot this setting making any visible difference anywhere in the game but if I set it to anything but Lowest, my fps drops to 45 on Horde nights. Shrug.


Intel i7-6700 @ 4MHz

New record for slowest CPU... 4MHz :p jk

 

I think you meant GHz.

 

 

I'd also like to point out that AMD Ryzen specifically benefits from RAM performance much more than Intel or AMD's older lineup, since its Infinity Fabric interconnect runs in step with the memory clock. So if you have a Ryzen CPU and are trying to squeeze even more performance out of it, get better RAM as well. This benefit carries through to all games, not just this one.
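
For illustration only (this assumes the default 1:1 fabric-to-memory-clock ratio that Ryzen uses at typical DDR4 speeds), the fabric clock simply follows half the DDR4 transfer rate:

```python
# Illustration: the Infinity Fabric clock tracks the memory clock (assumes the default 1:1 ratio).
def fabric_clock_mhz(ddr4_rating: int) -> float:
    return ddr4_rating / 2  # DDR4-3200 transfers twice per clock -> 1600 MHz memory/fabric clock

for rating in (2400, 3000, 3600):
    print(f"DDR4-{rating}: ~{fabric_clock_mhz(rating):.0f} MHz Infinity Fabric clock")
```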


And I am sitting here at the GPU limit, with a 1070. My i5-4460 CPU is at 30% and 50 °C while my GPU is at 97-100% and 76 °C.

Also 1070 here, but with a 2700X on DDR4-3333.

 

Usually also GPU limited on High settings at 1080p. I cap at 60 fps, but on Ultra it occasionally can't hold 60 fps even during a normal day.

 

And yet I read everywhere that the CPU is more important.

The CPU is important in different cases, not all the time.

I have massive frame drops during horde night. I wanted to take a closer look at system usage during the last horde night but forgot about it. Hopefully I won't forget next blood moon. I'm almost sure it's a "few-cores" bottleneck and not the GPU.

 

In A17 I let a large building collapse; that was still on my old i5-3570. 30% CPU usage (one core at 100%, the others at 1-3%), GPU usage dropped to 20%, and 15 fps overall. ;)
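
A quick way to catch exactly that pattern next blood moon (just a sketch, assuming Python with psutil) is to compare the hottest core against the overall average:

```python
# Flags a "few-cores" bottleneck: low overall CPU load while one core is pegged (assumes psutil).
import psutil

per_core = psutil.cpu_percent(interval=5.0, percpu=True)  # sample for 5 s during the horde
average = sum(per_core) / len(per_core)
hottest = max(per_core)
print(f"average load {average:.0f}%, hottest core {hottest:.0f}%")
if hottest > 90 and average < 50:
    print("Looks like a single-/few-core bottleneck rather than a GPU limit.")
```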


And I am sitting here at the GPU limit, with a 1070. My i5-4460 CPU is at 30% and 50 °C while my GPU is at 97-100% and 76 °C.

 

And yet I read everywhere that the CPU is more important.

Before A18, the GPU was barely utilized; the game relied extremely heavily on CPU processing and RAM. With A18, that load is being balanced, and more of the texture streaming and graphical parts of the client are now passed to the GPU. You still have a heavy reliance on the CPU for managing the voxel SI (structural integrity) calculations and AI pathing.


I am torn now, @SylenThunder. I actually wanted to upgrade my CPU, but with Ryzen 4 coming in the first half of 2020 and this game hitting the GPU limit, I might go GPU for Christmas.

 

@Kalen well, that's early access for you. Hardware optimization is done shortly before release in every development cycle. Also, seeing that you spent that much on a CPU, do you play in HD? Because at 1440p or higher you basically wasted your money, since you will usually run into GPU limits well before you run into CPU limits.


I am torn now, @SylenThunder. I actually wanted to upgrade my CPU, but with Ryzen 4 coming in the first half of 2020 and this game hitting the GPU limit, I might go GPU for Christmas.

 

@Kalen well, that's early access for you. Hardware optimization is done shortly before release in every development cycle. Also, seeing that you spent that much on a CPU, do you play in HD? Because at 1440p or higher you basically wasted your money, since you will usually run into GPU limits well before you run into CPU limits.

 

Just out of curiosity, about how long does alpha testing last for these types of games? World of Warcraft (which I was able to beta test) started production in 1999 and launched in 2004. So is a five-year window from the start of production to a finished product released for PC about typical?

 

Looking forward to the Pimps finally moving out of Alpha testing and into a much more finalized product before 2030.


You still have a heavy reliance on the CPU for managing the voxel SI (structural integrity) calculations and AI pathing.
So in practice, if I were to upgrade my CPU I'd have less stuttering and jittering during:

- blood moons

- complex POIs (wandering hordes trying to get in)

- zombie digging operations?


This summer I upgraded to an i9-9900 with an RTX 2080 and an NVMe SSD. I still can't run at max settings without drops in frame rate.

What resolution?

 

 

 

And while I haven't bothered to confirm this yet, I still have my doubts that the game properly uses all cores now. Because if it did, what reason would there be to relieve stress from the CPU and pile even more stress onto the GPU? I can at least confirm that the server no longer uses all threads for at least some CPUs, because the hyper-threaded ones sit idle the whole time on my FX8320.
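
One way to check that from outside the game is to count how many of the server process's threads actually accumulate CPU time; a sketch assuming Python with psutil, where the process name 7DaysToDieServer.exe is only a guess and needs adjusting for your install:

```python
# Counts how many of the dedicated server's OS threads actually do work (assumes psutil).
# The process name "7DaysToDieServer.exe" is a guess -- adjust it for your install.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "7DaysToDieServer.exe":
        threads = proc.threads()  # (id, user_time, system_time) for every OS thread
        busy = [t for t in threads if t.user_time + t.system_time > 1.0]
        print(f"{len(threads)} threads total, {len(busy)} with more than 1 s of CPU time")
        break
```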


@Seraphim No clue at all. I heard something about 2020 in YouTube vids, but I don't follow the dev diary or anything. For me it's done when it's done. I absolutely don't understand how people can rush developers. I'd rather have it finished in 2021 than have a bugfest.

 

It has nothing to do with "rushing" the developers; more so, at what point in a game's development does it finally progress from the drawing board to a stable product ready for market consumption? This game in particular has seen its fair share of trading hands and has been in alpha testing for, what, 3 years now?

 

 

Like I wrote in my previous post, I'm hoping the game is ready for market release before the year 2030.


your cpu does not have hyperthreading or SMT.

That was one of the Bulldozer CPUs, right?

His CPU has some pipeline stages duplicated 8 times, but not 8 "full" cores. Yes, it is not exactly SMT, but it is very similar anyway.

 

But I don't know of a way to determine which core is physical and which one is virtual. Your system only exposes 8 logical cores, not 4 physical plus 4 virtual (even with real SMT), so it's impossible to say what runs on a physical core and what runs on a virtual one.
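
For what it's worth, on a CPU with SMT the OS does at least expose which logical CPUs share a core; a sketch for Linux (it reads the kernel's sysfs topology files, so it won't help on Windows, and what it reports for Bulldozer depends on how the kernel models the modules):

```python
# Shows which logical CPUs share one physical core (Linux only; reads sysfs topology files).
import glob

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list")):
    cpu = path.split("/")[5]         # e.g. "cpu3"
    with open(path) as f:
        siblings = f.read().strip()  # e.g. "3,7": logical CPUs 3 and 7 share one core
    print(f"{cpu}: siblings {siblings}")
```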


And I am sitting here at the GPU limit, with a 1070. My i5-4460 CPU is at 30% and 50 °C while my GPU is at 97-100% and 76 °C.

 

And yet I read everywhere that the CPU is more important.

 

I agree. We ran the game on an i3-8350K (4x 4 GHz), all settings on Ultra (nothing disabled), with a ROG Strix 1070 Ti 8 GB and got a solid 60-70 fps. I don't think the CPU is the key here; I believe you need a massive graphics card and at least 16 GB of RAM, but who really knows.


That was one of the Bulldozer CPUs, right?

His CPU has some pipeline stages duplicated 8 times, but not 8 "full" cores. Yes, it is not exactly SMT, but it is very similar anyway.

 

But I don't know of a way to determine which core is physical and which one is virtual. Your system only exposes 8 logical cores, not 4 physical plus 4 virtual (even with real SMT), so it's impossible to say what runs on a physical core and what runs on a virtual one.

 

There are no physical and virtual cores on Bulldozer; each module has 2 integer units and 1 FPU, and the units share resources such as the L2 cache.

It is a different design compared to current CPUs with SMT, and it failed quite hard.

 

Technically, Bulldozer chips are probably 8-core, because in the past a CPU was a CPU even if it had no FPU at all (FPU calculations were either emulated or you bought a co-processor).


This game in particular has seen its fair share of trading hands and has been in alpha testing for, what, 3 years now?

It's been more than 6 years actually.

 

your cpu does not have hyperthreading or SMT.

I get Intel and AMD mixed up... you know what I'm talking about, it's pretty much the same technology just under a different name because... reasons.

 

That CPU has 4 cores and 4 virtual... according to Task Manager, 4 of them aren't working. My assumption is that either the virtual cores are sitting idle, or the server only uses 2 cores and 2 virtual ones and the rest sit idle (which is very unlikely).

 

My Ryzen has no problem with the client side, though... all 12 threads are used to some degree. I should maybe try loading a dedicated server with it to see whether this is exclusive to the old FX-series CPUs.


What resolution?

 

 

 

And while I haven't bothered to confirm this yet, I still have my doubts that the game properly uses all cores now. Because if it did, what reason would there be to relieve stress from the CPU and pile even more stress onto the GPU? I can at least confirm that the server no longer uses all threads for at least some CPUs, because the hyper-threaded ones sit idle the whole time on my FX8320.

 

3440x1440


Archived

This topic is now archived and is closed to further replies.
