

PC version vs X360 version of WoT

Tags: wot, world of tanks, x360, xbox edition, xbox 360 edition


GordonRamsay #1 Posted 31 July 2015 - 12:07 AM

    Warrant Officer

  • Player
  • 13164 battles
  • 517
  • Member since:
    01-05-2012

Well

just out of curiosity I started playing the X360 version of WoT, and I wonder why it has features that PC players keep asking for but we don't get. Nothing big, mostly visuals like:

- maps with weather (heavy snow, or storms with rain.. making it harder to see enemies)

- night maps

- smoke around the tank/gun when you shoot (more realistic, not that ** we have)

 

They also limit the tier range in platoons: the leader picks a tank first, and it locks the others to within ±2 tiers of his tank, so a platoon can't mix, for example, a tier 10 with a tier 1.. (which I see in almost every battle.. those trolls)
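In code terms the lock they apply is trivial; here's a minimal hypothetical sketch of such a check (not Wargaming's actual code, just the idea):

```python
# Hypothetical sketch of a "platoon tier lock": once the leader picks a tank,
# everyone else must be within 2 tiers of it. Not Wargaming's actual code.

def can_join_platoon(leader_tier: int, member_tier: int, max_spread: int = 2) -> bool:
    """Return True if the member's tank is within the allowed tier spread."""
    return abs(leader_tier - member_tier) <= max_spread

print(can_join_platoon(8, 10))  # True  - tier 10 with a tier 8 leader is fine
print(can_join_platoon(10, 1))  # False - the tier 10 / tier 1 troll platoon is blocked
```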

 

I know it's a different team and a different engine.. I also play WoWs sometimes (friend's PC is broken, so it's on pause..) and it seems WoT_PC has the worst dev team ever :( I don't want to be disrespectful, but seriously, why do all the other WG projects have the things we keep asking for?

 

Like in WoWs - the devs found in early beta that 2-3 aircraft carriers (read: arty) per side would be too much, so they limited platooning and only 1 is allowed per platoon. We have the same problem in WoT.

If there is a platoon with 3 arties it not only ruins the game (because the other side needs to have 3 too), it also often breaks the matchmaking, making it unbalanced; sometimes it gets really crazy and puts 5 vs 5 arties in the same game (why?). If the WoWs devs can do this for their players, why can't the WoT devs do it too? Why do they have dozens of empty excuses?
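The arty cap would be an equally small check; another hypothetical sketch, just to show how simple the rule is:

```python
# Hypothetical sketch of a "max one arty per platoon" rule, like the WoWs carrier limit.

def platoon_allowed(tank_classes: list[str], max_arty: int = 1) -> bool:
    """Reject platoons bringing more than the allowed number of SPGs (arty)."""
    return tank_classes.count("SPG") <= max_arty

print(platoon_allowed(["SPG", "heavy", "medium"]))  # True
print(platoon_allowed(["SPG", "SPG", "SPG"]))       # False - the 3-arty platoon is rejected
```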

 

BTW, after 40 battles in X360 WoT I can say it clearly has better matchmaking. Yes, sometimes you wait in the queue for 2 minutes, but hey.. again, SerB (a WoT dev on PC) says they don't want to change the MM in any way because people would have to wait and he doesn't want that........ and the devs on other projects don't have a problem with it?

It's all under one flag, so it should have the same quality. If someone comes from the X360 to the PC version it shouldn't be a shock for them ;D



CroustibatFR #2 Posted 31 July 2015 - 09:59 AM

    Brigadier

  • Player
  • 23297 battles
  • 4,135
  • Member since:
    09-14-2011

Block Quote

 BTW, after 40 battles in X360 WoT I can say it clearly has better matchmaking.

 

Making statistical observations after 40 battles? Wow.

 

Well, if you find the 360 version so much better, why don't you keep playing it?

 

If someone comes from the 360 version to the PC version, I can tell you the player level will be the shock, not the GFX quality (which, once pushed to the max, is way better than anything the 360 can do, BTW).



Xino9922 #3 Posted 31 July 2015 - 12:44 PM

    Brigadier

  • Player
  • 16028 battles
  • 4,736
  • [LEWD] LEWD
  • Member since:
    02-13-2011

The Xbox 360 version has the worst framerate I've ever seen in a Wargaming product...

 

It barely runs at 25 FPS on my Xbox 360 E (the latest 360 released), and that's a lot more powerful than the earlier versions. I don't even want to think about trying to play it on a 1st generation 360...



GordonRamsay #4 Posted 31 July 2015 - 07:49 PM

    Warrant Officer

  • Player
  • 13164 battles
  • 517
  • Member since:
    01-05-2012

Xino - [edited], all X360s are the same: same clock, same CPU, everything is the same.

I have a 2nd gen (the white one, no WiFi but HDMI, with a 20 GB HDD) and I've only had FPS loss once, when there were many enemies around and some noob was destroying all the objects.. so too many explosions.

 

I can't benchmark it, but I'm pretty sensitive to low FPS, so maybe your X360 is just bad. Probably because the latest versions have cooling problems, due to smaller fans and a smaller case. (I work in a big game shop and handle RMAs too, so if you really want to know.. ask me.) Of course the XONE version is better. But I don't want to support Microsoft that much ;D I got this white X360 almost for free.. so why not :)

 

CroustibatFR - yes, because in the PC version you can see and feel broken matchmaking in every second game. In the X360 version I only had it at 3:00 AM with 5k people on the EU server.

 

Why don't I keep playing the X360 version? Two reasons.. first, you need to pay for Xbox Live Gold, which is crap. Second, there are no clans, no clan wars.. no endgame content.


Edited by GordonRamsay, 31 July 2015 - 07:51 PM.


Kuraikage #5 Posted 31 July 2015 - 08:41 PM

    Major

  • Player
  • 1010 battles
  • 2,806
  • Member since:
    07-30-2014

View PostGordonRamsay, on 30 July 2015 - 11:07 PM, said:

 making it harder to see enemies

- night maps

 

They don't really give you any trouble seeing enemies unless you're running around with no HUD.

Night maps are alright, but they look like night about as much as Dead Space 3 looks like a 'survival horror' game.

 

View PostGordonRamsay, on 30 July 2015 - 11:07 PM, said:

They also limit the tier range in platoons: the leader picks a tank first, and it locks the others to within ±2 tiers of his tank, so a platoon can't mix, for example, a tier 10 with a tier 1.. (which I see in almost every battle.. those trolls)

 

It's an attempt to stop fail-platooning, but it's not a complete fix; unless I'm mistaken, there's an easy trick to do it anyway.

Alongside that, they don't allow, say, a Tier IV to platoon up with a Chaffee, since that Tier IV could otherwise find itself in a Tier X game, I believe.

 

View PostGordonRamsay, on 30 July 2015 - 11:07 PM, said:

BTW, after 40 battles in X360 WoT I can say it clearly has better matchmaking.

 

I'm pretty sure there's no real difference between the MMs, apart from certain tanks, like how the Pz II J can never see a Tier III-only battle, the M24 still sees Tier X, and so on.

 

The reason for the visuals is quite simple.

Everyone playing on either the 360 or the One is basically on the same 'engine', so the console can easily handle such things, whereas on PC you have people running machines capable of designing a Terminator, and others that could barely run a potato.

 

Having played both of them (although my time on PC was limited to grinding the M3 Lee for my brother), I much prefer the 360 version because I'm much more accustomed to a controller than to keyboard/mouse. But both of them are on similar footing: we get fancy stuff like the War! map variants, they get mods, tank skins, all that stuff.

I'm quite interested in getting the One version for the much better looking graphics, the ability to record gameplay (I believe), and so on.

And using only 40 battles to judge the differences between the games sounds like too little.

 



Polkadot9000 #6 Posted 02 August 2015 - 04:15 PM

    Warrant Officer

  • Player
  • 10765 battles
  • 870
  • Member since:
    02-05-2013

As for the graphical side of things, it really isn't fair to compare your experience on PC without saying what your settings are. Your experience will differ HUGELY on PC if you are running the standard render on a Pentium laptop with integrated graphics (so everything cranked to the minimum) compared to running it on a higher-end gaming PC (e.g. an i5-4690K and a GTX 970: a reasonably high-end system, not the most baller ever, but more than enough to max out WoT).

Plus there is of course the variance in PCs that you don't get on an Xbox: the devs have to design the game so it can run on everything from a Windows XP system with a GTX 260 to a 5960X with quad Titan Xs. With Xbox 360s there is no such variance; everyone has the same processor, clock speed, RAM etc. (with some tiny generational variation).



maxim131 #7 Posted 02 August 2015 - 08:57 PM

    Staff Sergeant

  • Player
  • 3391 battles
  • 378
  • Member since:
    01-06-2013

Consoles suck.

 

PC is always better.



Kuraikage #8 Posted 02 August 2015 - 09:00 PM

    Major

  • Player
  • 1010 battles
  • 2,806
  • Member since:
    07-30-2014

View Postmaxim131, on 02 August 2015 - 07:57 PM, said:

Consoles suck.

 

PC is always better.

 

But PC doesn't get Bloodborne ;-;

Woody1999 #9 Posted 02 August 2015 - 10:51 PM

    Lieutenant General

  • Player
  • 15895 battles
  • 6,487
  • [WJDE] WJDE
  • Member since:
    05-15-2011

I have a 1st gen Xbox 360 Elite, running through an S-video connection (yes, that old). It barely runs horrible-looking old games (Virtua Tennis 2009, anyone?) at 20 FPS; I can actually see the individual frames! It's incredibly noisy even at idle, and the hard drive sounds like it's going to die at any moment. The hard drive in my Xbox isn't even 5 years old, yet the hard drive in my PC is 8 years old and counting.

 

Okay, I do look after my PC very well, with regular cleaning and maintenance. Even so, consoles still need a helluva lot of work before they even approach PC level.

 

Woody



h11poc #10 Posted 02 August 2015 - 11:45 PM

    Private

  • Player
  • 12436 battles
  • 17
  • [BROP] BROP
  • Member since:
    09-29-2013

View PostPolkadot9000, on 02 August 2015 - 03:15 PM, said:

As for the graphical side of things, it really isn't fair to compare your experience on PC without saying what your settings are. Your experience will differ HUGELY on PC if you are running the standard render on a Pentium laptop with integrated graphics (so everything cranked to the minimum) compared to running it on a higher-end gaming PC (e.g. an i5-4690K and a GTX 970: a reasonably high-end system, not the most baller ever, but more than enough to max out WoT).

Plus there is of course the variance in PCs that you don't get on an Xbox: the devs have to design the game so it can run on everything from a Windows XP system with a GTX 260 to a 5960X with quad Titan Xs. With Xbox 360s there is no such variance; everyone has the same processor, clock speed, RAM etc. (with some tiny generational variation).

 

I saw the difference from first playing it on an AMD 8-series with a GTX 285, then a 2600K running an HD 7850, and now a 5930K with a 290X. The framerate increase was vast. Someone on the General Discussion forum said it had nothing to do with graphics cards and was all down to the CPU. He was of course wrong. I'm getting a Fury X in a week or so, as I do video editing and AMD kills Nvidia in Sony Vegas editing. I'll also see what it does to the frame rate in WoT, although I currently get 100-120 FPS on maximum settings and more than that would seem overkill.

 



Woody1999 #11 Posted 03 August 2015 - 12:56 AM

    Lieutenant General

  • Player
  • 15895 battles
  • 6,487
  • [WJDE] WJDE
  • Member since:
    05-15-2011

This game is single threaded. The Vishera architecture has relatively low IPC (instructions per clock), which leads to low single-core performance. That's why FX chips aren't that useful for this game.

 

Your 5930K has some of the best single-threaded performance around right now. That's where your massive increase in FPS came from, although the graphics cards helped a lot too.
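Roughly, single-thread speed scales with IPC times clock speed. A toy comparison with made-up relative IPC values, purely to illustrate why a high-IPC chip wins in a single-threaded game:

```python
# Toy illustration: single-thread performance is roughly IPC * clock speed.
# The IPC figures below are invented relative values, not benchmarks.

def single_thread_score(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

print(single_thread_score(ipc=1.0, clock_ghz=4.0))  # lower-IPC core at 4.0 GHz -> 4.0
print(single_thread_score(ipc=1.5, clock_ghz=3.7))  # higher-IPC core at 3.7 GHz -> 5.55
```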

 

Woody



h11poc #12 Posted 03 August 2015 - 02:03 AM

    Private

  • Player
  • 12436 battles
  • 17
  • [BROP] BROP
  • Member since:
    09-29-2013

View PostWoody1999, on 02 August 2015 - 11:56 PM, said:

This game is single threaded. The Vishera architecture has relatively low IPC (instructions per clock), which leads to low single-core performance. That's why FX chips aren't that useful for this game.

 

Your 5930K has some of the best single-threaded performance around right now. That's where your massive increase in FPS came from, although the graphics cards helped a lot too.

 

Woody

 

 

Woody, when the 290X went into the 2600K machine there was an increase in FPS. I just ran the game, and although loading was on core 1 and core 4, switching between the two, all six cores showed activity during play. Core 1 was at 53% usage, cores 2 and 3 were between 15 and 27%, core 4 was at 30-35%, and core 6 seemed to sit at a solid 5%, doing nothing much. This may have been traffic between CPU and GPU or CPU and memory though, so I can't be certain more than one core is being used. You are right about the 5930K though - unfortunately the 5930K doesn't have built-in graphics in any shape or form and relies solely on the GPU for all visual output.



Woody1999 #13 Posted 03 August 2015 - 01:00 PM

    Lieutenant General

  • Player
  • 15895 battles
  • 6,487
  • [WJDE] WJDE
  • Member since:
    05-15-2011

I can only assume that other programs are using some of your CPU. This game is strictly single threaded, so it can only recognise and utilise one core at a time. Your CPU is far more advanced than this game, so it was probably trying to compensate by "splitting" the workload between core 0 (1) and core 3 (4). However, because the game's code doesn't actually support this, it was simply swapping the workload from core to core. Does that make sense?

 

Anyway, surely task manager would show 12 threads rather than 6? 6 physical cores + 6 logical cores. The wonders of hyperthreading. ;)

 

I would expect a better graphics card to increase the quality of the game (graphics settings) but not the overall frame rate. This is because in WoT your CPU is always going to be your bottleneck, even if you have a hexa-core Haswell Extreme monster. I'll use my system as an example:

 

CPU: Core 2 Quad Q8300 clocked at 2.9GHz.

GPU 1: Radeon HD 5450

GPU 2: Radeon R7 260X

 

The Q8300 doesn't have particularly good single-core performance. On GPU 1, I could manage about 30 FPS on minimum settings, but if I tried to increase the graphics settings, FPS would fall dramatically. That's because my terrible old GPU was the bottleneck - it isn't good enough to handle anything above minimum. On GPU 2, I can manage about 40 FPS on pretty much any settings, whether minimum or maximum. That's because my CPU is now the bottleneck - whereas my 2014-era graphics card can easily handle whatever this 2011 game throws at it, my CPU has real problems dishing out commands to the GPU fast enough. That caps my maximum frame rate. I can see this because at all times my core 0 is working at 100%, while my GPU is only working at 60% tops.
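In code form the bottleneck idea is just the minimum of the two limits; the numbers below are only the illustrative figures from the paragraph above, not measurements:

```python
# Sketch of the CPU/GPU bottleneck: the observed frame rate is capped by whichever
# stage can deliver fewer frames per second. Limits below are illustrative only.

def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    return min(cpu_fps_limit, gpu_fps_limit)

print(effective_fps(cpu_fps_limit=40, gpu_fps_limit=30))   # old HD 5450: GPU-bound, ~30 FPS
print(effective_fps(cpu_fps_limit=40, gpu_fps_limit=120))  # R7 260X: CPU-bound, ~40 FPS
```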

 

Your current CPU and GPU are far too powerful for this game. That explains the low CPU usage on core 0 - it's barely having to break a sweat.

 

Woody



h11poc #14 Posted 03 August 2015 - 06:41 PM

    Private

  • Player
  • 12436 battles
  • 17
  • [BROP] BROP
  • Member since:
    09-29-2013

View PostWoody1999, on 03 August 2015 - 12:00 PM, said:

I can only assume that other programs are using some of your CPU. This game is strictly single threaded, so it can only recognise and utilise one core at a time. Your CPU is far more advanced than this game, so it was probably trying to compensate by "splitting" the workload between core 0 (1) and core 3 (4). However, because the game's code doesn't actually support this, it was simply swapping the workload from core to core. Does that make sense?

 

Anyway, surely task manager would show 12 threads rather than 6? 6 physical cores + 6 logical cores. The wonders of hyperthreading. ;)

 

I would expect a better graphics card to increase the quality of the game (graphics settings) but not the overall frame rate. This is because in WoT your CPU is always going to be your bottleneck, even if you have a hexa-core Haswell Extreme monster. I'll use my system as an example:

 

CPU: Core 2 Quad Q8300 clocked at 2.9GHz.

GPU 1: Radeon HD 5450

GPU 2: Radeon R7 260X

 

The Q8300 doesn't have particularly good single-core performance. On GPU 1, I could manage about 30 FPS on minimum settings, but if I tried to increase the graphics settings, FPS would fall dramatically. That's because my terrible old GPU was the bottleneck - it isn't good enough to handle anything above minimum. On GPU 2, I can manage about 40 FPS on pretty much any settings, whether minimum or maximum. That's because my CPU is now the bottleneck - whereas my 2014-era graphics card can easily handle whatever this 2011 game throws at it, my CPU has real problems dishing out commands to the GPU fast enough. That caps my maximum frame rate. I can see this because at all times my core 0 is working at 100%, while my GPU is only working at 60% tops.

 

Your current CPU and GPU are far too powerful for this game. That explains the low CPU usage on core 0 - it's barely having to break a sweat.

 

Woody

 

Thank you Woody - I used Core Temp, which only shows the cores, core activity and temperatures. I do have HT enabled as well. You are more than likely correct, as the CPU would have to communicate with the GPU.

The GPU has a clock speed of 1 GHz, compared to the current 4.0 GHz single-core clock on the 5930K. The reason the GPU is faster for video editing is that we enter the realm of parallel processing, which the GPU is much better at. The 290X has 2,800 cores!! In parallel processing the instructions are predetermined by my render settings, so the random elements of gameplay don't come into it. The GPU is therefore way faster at video rendering than the CPU.

Also, when rendering to DVD I'm using 25 FPS PAL, and in HD 30 FPS. This is easy for the 290X to render.

 

Your 2nd GPU is perfect for the game. I haven't heard what the future holds for console CPUs, but AMD releases Zen very soon and that will bring them back onto a level playing field with Intel at last. Also, HBM2 will be standard on GPUs within 18 months. Comparing 4 GB of HBM1, with its 4096-bit wide bus, to GDDR5, which is slower on a 384-512-bit bus, shows us the speeds that will be available in the not too distant future.
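For a rough sense of scale, peak bandwidth is roughly bus width times the per-pin data rate. A back-of-the-envelope sketch using typical published figures, purely for illustration:

```python
# Back-of-the-envelope peak bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps).
# Figures are typical published specs, used only to illustrate the HBM1 vs GDDR5 gap.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(4096, 1.0))  # HBM1 (e.g. Fury X): 4096-bit at ~1 Gbps -> 512 GB/s
print(peak_bandwidth_gb_s(512, 5.0))   # GDDR5 (e.g. 290X):  512-bit at ~5 Gbps -> 320 GB/s
```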

 

Hopefully AMD and Zen will give the console market and the AMD fanboys a real boost. We'll have to wait and see, of course.

 

 





