
Is The Witcher 3: Wild Hunt maxing out next-gen consoles already?

Thursday, 5th September 2013 14:35 GMT By Ray Willmott

The Witcher 3 developer CD Projekt RED has previously claimed that the RPG maxes out PS4 and Xbox One tech already. Ray Willmott checks out the game’s latest build and quizzes the studio to see if it’s true.

The Witcher 3: Wild Hunt

Developed by CD Projekt RED, The Witcher 3: Wild Hunt will hit PC, PS4 and Xbox One in 2014.

You can check out a gallery of recent gamescom screenshots here.

The latest ‘Killing Monsters’ trailer sees Geralt beating down guards just before they execute a woman on charges of cannibalism.

Game of Thrones actor Charles Dance will have a big role in The Witcher 3, which just sounds ideal if you know the man’s work.

In a behind-closed-doors presentation at gamescom last month, it became clear to me that while Geralt may be older in The Witcher 3, he’s also much wiser. Players will come to understand his human side more than ever, and his unwavering sense of heroism will compel you to consider side-quests that threaten the balance of society.

Completing these quests could have political ramifications, or it could simply see the banishment of monsters terrorising the land. The way Geralt can now track beasts using their footprints, trace them to their lair, and stick a sword in their gut is every bit as poignant as his significant conversational choices, which may forever alter the destiny of a small village or large city.

Clearly, there’s a lot going on in this game from both a narrative and a technical perspective, and it’s true that The Witcher 3 oozes elegance with its next-generation glow. Everything about the game feels fluid, sublime and indicative of a painstakingly created world, inspired by real-world mythology. CD Projekt RED has been quietly building its masterpiece in Poland, intentionally setting the bar high for next-generation open-world RPGs, and every article I see makes the wait for 2014 that much harder.

After seeing the game in action, I got a chance to quiz gameplay producer Marek Ziemak about just how far The Witcher 3 is pushing boundaries on next-gen consoles before they’ve even launched, and much more.

VG247: While bringing The Witcher 3 to next-gen, have you had to make any changes to the game? Have any sacrifices had to be made to fit in with the business practices of Sony and Microsoft?

Marek Ziemak: No, it actually worked the other way around. We actually had loads of new opportunities because of the new platforms. We have got better equipment to play with and can now deliver more things in the game.

VG247: Did you make any major technical changes to either version?

Ziemak: We’re using brand new technology with Red Engine 3. It’s an advanced version of the engine we used in The Witcher 2 and has been rebuilt in many different ways. For example, with the open-world mechanics and system, we had to implement a streaming mechanism which allows players to smoothly travel across the world. We’ve also totally changed the AI system. There are a lot of technical things going on in the background which enable us to meet our needs and it allows us to create group tactics and many various types of enemy behaviours and things like that.
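
Ziemak doesn’t go into implementation detail, but the general idea behind that kind of streaming is simple enough: keep only the world cells around the player resident in memory, loading and unloading them as he moves. The sketch below is a minimal illustration of that pattern under my own assumptions; the cell size, radius and function names are invented, not REDengine 3 code.

```cpp
// Minimal, hypothetical sketch of open-world chunk streaming.
// Not CD Projekt RED's code: cell size, radius and names are invented.
#include <cmath>
#include <cstdio>
#include <set>
#include <utility>

constexpr float kCellSize = 256.0f;  // metres per streaming cell (assumed)
constexpr int   kRadius   = 2;       // keep a 5x5 block of cells around the player

using Cell = std::pair<int, int>;
std::set<Cell> resident;             // cells currently loaded in memory

void loadCell(const Cell& c)   { std::printf("load   (%d,%d)\n", c.first, c.second); }
void unloadCell(const Cell& c) { std::printf("unload (%d,%d)\n", c.first, c.second); }

// Called as the player moves (in practice from a background streaming thread).
void updateStreaming(float x, float y) {
    const int cx = static_cast<int>(std::floor(x / kCellSize));
    const int cy = static_cast<int>(std::floor(y / kCellSize));

    std::set<Cell> wanted;
    for (int dx = -kRadius; dx <= kRadius; ++dx)
        for (int dy = -kRadius; dy <= kRadius; ++dy)
            wanted.insert({cx + dx, cy + dy});

    for (const Cell& c : wanted)      // load whatever just came into range
        if (!resident.count(c)) { loadCell(c); resident.insert(c); }

    for (auto it = resident.begin(); it != resident.end(); )  // drop what fell out of range
        if (!wanted.count(*it)) { unloadCell(*it); it = resident.erase(it); }
        else ++it;
}

int main() {
    updateStreaming(10.0f, 10.0f);    // initial load around the start position
    updateStreaming(600.0f, 10.0f);   // player has moved east: new cells in, old cells out
}
```

The hard engineering is in what the sketch leaves out: asynchronous disk I/O, prioritising cells in the direction of travel, and keeping load spikes invisible, which is presumably where the no-loading-screens claim later in the interview is won or lost.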

VG247: Earlier this year we saw Microsoft introduce various DRM policies on Xbox One that have since been overturned. Given your relationship with GOG.com and the fact it places no DRM restrictions on its titles, what were your thoughts on the whole process? Did it put you off developing Witcher 3 for Xbox One?

Ziemak: Whenever we can, our approach is to create games without DRM. The Witcher 3 will be available on GOG.com, day one, DRM-free. But when it comes to consoles, well, we have to consider the platform we’re developing for. Of course we’re independent as a games studio, but when developing for PlayStation 4 or Xbox One we have to work with the console manufacturers. And in the end, the market makes the decision.

If people buy consoles with DRM restrictions and they are OK with it – because there are a mass of players who don’t care whether a game has DRM or no DRM restrictions – we don’t want to cut those players off. So yes, we will be delivering our products to Sony and Microsoft, despite talks of DRM. If it was possible to deliver a game DRM free on those systems that would be cool, we would love it as it would support our philosophy. But sometimes we have to work with manufacturers and respect their business decisions.

VG247: In a recent interview, CD Projekt members suggested that the team has already managed to max out next-gen systems. I found this a bit concerning as you’re aiming to launch The Witcher 3 very early on in their life cycle. I know the game will be dramatically intense on PC, but can you tell us just how much the game is progressing on next-gen platforms?

Ziemak: Sure. We’re quite advanced in testing, running and experimenting on the next-gen platforms, and at this point we know the game is pretty demanding. Of course, we still have the optimisation phase in front of us, but because of the size and density of the world, we are already close to maxing out the equipment. Of course, if we find more power in the boxes, we will surely use it to make the game even better. I think others share the same opinion as us: if there’s power to be used, then why not use it all? [Laughs]

VG247: You’re currently in talks with Microsoft about cross-save opportunities between Xbox 360 and Xbox One for The Witcher 2 and Witcher 3. How are those progressing? For those who’ve played Witcher 2, what differences can they expect in Witcher 3?

Ziemak: To be honest, I don’t know how the talks with Microsoft are progressing in this instance. We’re looking for ways to allow people to load their save games from previous instalments and bring them into The Witcher 3. This isn’t problematic on PC, as we’re in charge of that situation and control it fully. We’re investigating the possibilities of transfer between Xbox 360 and Xbox One, and if it’s possible, we will move ahead with that.

As for what to expect, players will be able to continue their progress and carry on with the decisions made in the storyline of The Witcher 2. They’ll also be able to collect some stuff from their inventory, such as gold, but we haven’t firmed up the full details yet. We’ll see. Things still need to be fine-tuned and balanced.
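
Ziemak doesn’t describe the format, but conceptually a cross-game import like this boils down to reading a small record of storyline flags and carry-over items when a new game starts. The sketch below is purely illustrative and assumes nothing about CD Projekt RED’s actual save data: the structure and field names (the Letho and Roche/Iorveth flags, the gold figure) are examples standing in for whatever the studio ends up transferring.

```cpp
// Purely illustrative sketch of a Witcher 2 -> Witcher 3 save import.
// Field names and values are invented; the real save format is not public.
#include <cstdio>
#include <string>

struct ImportedSave {
    bool sparedLetho = false;   // example Witcher 2 storyline decision
    std::string allegiance;     // e.g. "Roche" or "Iorveth" (example branch)
    int gold = 0;               // partial inventory carry-over, as Ziemak mentions
};

// Apply the imported record to a new game; how much actually carries over
// (and how it is balanced) is exactly what the studio says is still being tuned.
void startWitcher3Game(const ImportedSave& save) {
    std::printf("Letho spared: %s, allegiance: %s, starting gold: %d\n",
                save.sparedLetho ? "yes" : "no", save.allegiance.c_str(), save.gold);
}

int main() {
    ImportedSave save;          // in practice, parsed from a Witcher 2 save file
    save.sparedLetho = true;
    save.allegiance  = "Roche";
    save.gold        = 3200;
    startWitcher3Game(save);
}
```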

VG247: Could you tell us how diverse the landscape of The Witcher 3 is and describe some of the environments players will get to see?

Ziemak: There’s pretty huge diversity. The world is around 35 times bigger than The Witcher 2’s, and each location is different. We’re putting effort into creating unique elements in this world and producing various points of interest. We believe these things make a world more attractive and interesting. There are a few different kingdoms and realms you can visit as you travel through the game. All of them are connected and part of one coherent experience.

There are no loading screens and the transition between areas is smooth, but each area has a mood of its own. For instance, there are isles whose mood is inspired by Celtic and Norse mythology. There’s also a huge city inspired by medieval Amsterdam, with plenty of the cloak and dagger from the books, some of it featuring the mafia and all the dark business being done in the background of the story. There are plains called No Man’s Land, a ravaged territory run down by war and totally destroyed. This is just an example of the huge diversity of places and moods.

VG247: The Witcher 2 had a lot of customisation available to the player in terms of aesthetics and items. Has that system been reduced to accommodate newer players?

Ziemak: Not at all. In fact, The Witcher 3 will offer a bigger variety of items you can collect. Your abilities will grow, and enemies will get tougher and bigger. The number of options available to the player is huge.

VG247: A combat mod was released for the PC version of The Witcher 2 two years after launch. Does this hint at the combat style being implemented in The Witcher 3?

Ziemak: You mean the mod released by Andrzej Kwiatkowski? Well, he is a combat developer working on The Witcher 3. He is partially responsible for enemy behaviour, combat and balancing the game. It’s a cool mod, and it will influence The Witcher 3 in the sense that he himself is shaping the game.

Our approach to balancing difficulty and enemies is that we like different levels of difficulty for different sets of players. In The Witcher 3, we want to make the learning curve smoother and not frighten anyone who is just starting the game. I can’t answer this question specifically, but it will almost certainly have an impact on the game.

VG247: Final question. As we’ve come to expect from The Witcher series, the PC version will offer mod opportunities. From your experience with next-gen systems, do you think Xbox One and PlayStation 4 will offer more flexibility for modding than current-gen?

Ziemak: I’m afraid I can’t answer this question yet. We haven’t spent too much time exploring this area, as we’re trying to get the core product and core experience onto the platforms. From there, we will be trying to squeeze Red Engine 3 to allow modders to use this part of the technology. But it’s a very hard question and I’m afraid I can’t give you a clear answer at this time.

The Witcher 3 hits PC, PS4 and Xbox One in 2014.


44 Comments


  1. FrankWhite

    Not hard to believe. The Witcher 2 on max settings is pretty demanding, so I imagine they have all the bells and whistles from The Witcher 3 on PC that they can cram into the console versions.

    Of course, over the next 8 years of the next console gen, devs will learn new tricks and ways to optimize both ps4 and xbone hardware beyond what is possible today.

    Still, I don’t think this is just hype, I am willing to bet, TW3 will make use of all 5-6gb of RAM and really push next-gen GPUs. Hell, it will probably look better on next-gen consoles than on most people’s mid-range gaming PCs.

    #1 11 months ago
  2. kezwar

    What a silly question! It does look lovely though. Boggles my mind to think how much devs can push the systems next-gen if this is what we’re getting already.

    #2 11 months ago
  3. Froseidon

    I finally completed Witcher 2 for the first time yesterday (despite having had it on Xbox since it was released) and it just made me want Witcher 3 even more. I’m definitely getting a next-gen console for the release of this, it looks fantastic.

    #3 11 months ago
  4. Clupula

    I do hope they have some way of catching PS4 gamers up with the story before this comes out. It looks like it’ll be fun, but I really do hate jumping in on the third part of a trilogy, yet, I don’t do PC gaming.

    #4 11 months ago
  5. sebastien rivas

    I hope TW3 maxes out all consoles. The Witcher games have always pushed the edge of design and effects capabilities, with strong, well-woven storylines. I do not expect less from The Witcher 3.

    And Ziemak, to help make a choice regarding the last VG247 question:
    yes, you should make it available for modding on consoles if it is possible with the current import/export technology.

    Thanks anyway, TW3 looks lovely and I need that game on my shelf :)

    #5 11 months ago
  6. Dark

    Also this
    http://i.imgur.com/hOFKG1L.png

    #6 11 months ago
  7. Froseidon

    @5 – Speaking of story, I want to know which one they’re going to take or if we’d be able to import or make any sort of choice. I don’t think I have seen a game which diverges so greatly based upon your actions.

    #7 11 months ago
  8. yeoung

    Guess I’ll have to YT Witcher 1 & 2 cutscenes and dive into some wiki pages to know what the hell is going on story-wise. Still very excited to finally be able to get my hands on this franchise though.

    #8 11 months ago
  9. xFidelCashflow

    I hope they bring 1 & 2 over to PS4 somehow, similar to what was done with Mass Effect. PS3 would be more likely, but that hasn’t happened. With PS4 actually getting an entry in the series for the first time, the first two could wind up making the jump. At least I can dream they will :/

    #9 11 months ago
  10. Clupula

    I’m surprised no one ever seems to ask them that question, really. It was the first thing that popped into my head upon learning of a PS4 version.

    #10 11 months ago
  11. lookingglass

    Of course they are close to maxing out the next gen consoles. The new consoles are standard PC architectures and developers know how to use PCs already.

    The only console that might see significant improvement is the Xbox One. And that would all be from cloud gaming which requires a good, consistent Internet connection, something 90% of PS4 users don’t have anyways.

    #11 11 months ago
  12. silkvg247

    It wouldn’t be hard would it given that the new consoles are basically under powered PCs. Design it on PC with all bells and whistles and then gradually turn said bells and whistles off on the “next gen platform” until it can run at 30 solid fps (woo).

    #12 11 months ago
  13. Animeboy413

    LOL

    #13 11 months ago
  14. Lengendaryboss

    @12
    It’s amazing how “teh cloud” has turned certain Xbox trolls into mindless puppets.

    #14 11 months ago
  15. Dragon246

    @15,
    More like this-
    http://www.oneeyeland.com/photo4/conceptual/one_eyeland_cloud_head_by_sean_breithaupt_44450.jpg

    #15 11 months ago
  16. Lengendaryboss

    :D

    #16 11 months ago
  17. pcbros

    Everyone seems to be downplaying the cloud (the latest trend), but don’t forget the Nvidia CloudLight demo.

    I can see this kind of technology being implemented in the future and not just for lighting.

    #17 11 months ago
  18. Lengendaryboss

    @18
    I’m not downplaying anything, I just feel resentment for the poor troll (@12) being brainwashed by it all.

    #18 11 months ago
  19. Phoenixblight

    The Witcher series has always been inefficient; they go for brute force, so it’s no wonder they can “max” next-gen consoles. It is far more impressive when you see Naughty Dog doing their magic and what they can squeeze out. Just look at The Last of Us and compare it to The Witcher 2 on the 360.

    #19 11 months ago
  20. Moonwalker1982

    Judging by what I saw at GC, which was a 45-minute gameplay demonstration at CDPR’s booth, I sure hope not. Because while I have a lot of faith in this developer, I wasn’t impressed at all with the first half of the demonstration. Sure, it was still pre-alpha, but still. The environment and graphics in the first half honestly weren’t all that much better than Skyrim on PC without mods, and I am not kidding.

    The second half was much more impressive when it started to get dark and Geralt entered a forest while it was raining and thundering, the trees realistically moving with the wind and whatnot. That looked absolutely fantastic. They ended the demo with the trailer we all know, but the graphics you see there are NOT what we saw in the first half. I left that demonstration with mixed feelings. If I had to compare this to the footage of Dragon Age 3 so far… DA wins it. DA looks cleaner overall so far: smoother, more crisp, fewer low-res textures and so on.

    #20 11 months ago
  21. super3001

    20

    linear corridor vs wide open area. witcher 2 still looks as good lol

    #21 11 months ago
  22. pcbros

    @19 – I didn’t mean it specifically towards you. But hearing “teh cloud” again and seeing how some people think these consoles might reach their full potential early… I felt I had to throw my 2 cents.

    Although personally, I’d like to see a time when Nintendo, Microsoft and Sony all developed games for the PC. We would have a unified hardware (the PC) and enjoy all those great games. Maybe they could create a hardware standard (ex. everyone should own a GTX660 or better) and developers would have to create games based on the GTX660 or lower. Then in 3-5 years they can increase that standard to let’s say a GTX780 (at that time, the price of a GTX780 would be drastically cheaper).

    I wouldn’t mind upgrading my video card every 3-5 years if it would cost less than $200. And if you think about the game console industry, the last generation lasted about 10 years and now we are paying $400-$500 to upgrade our hardware.

    Also, with the PC, you don’t have to worry about backwards compatibility. I can still play DOS games on my PC. So no worry about buying a huge library of games, only to have them be incompatible with your console “upgrade”.

    I know this has very little to do with the article, but I just had to share my vision :)

    #22 11 months ago
  23. Hcw87

    @22
    Witcher 2 was a linear RPG, unlike for example Skyrim.

    Of course, The Witcher 2 was bigger than The Last of Us, but both games are among the best this generation.

    #23 11 months ago
  24. noamlol2

    “do you think Xbox One and Playstation 4 will offer more flexibility for modding opportunities than current-gen?

    Ziemak: I’m afraid I can’t answer this question yet. We haven’t spent too much time exploring this area as we’re trying to get the core product and core experience onto the platforms. From there, we will be trying to squeeze Red Engine 3 to allow modders to use this part of the technology. But it’s a very hard question and i’m afraid I can’t give you a clear answer at this time.”
    this worries me, what if they release free DLC on PC
    and for consoles it might cost money?

    they really MUST answer this

    #24 11 months ago
  25. Riseer

    @22 TLOU looks better; that said, it’s a corridor game. Kinda like how Infamous SS’s “open world” looks better than most of the Xbone games shown so far.

    #25 11 months ago
  26. Clupula

    @15 – All hail Blast Processing ’13!

    #26 11 months ago
  27. sebastien rivas

    @24

    I have nothing against linearity; actually, I can say I enjoy it, because it is long, hard work through which the author brings his or her point of view about the character(s) and the world. As long as the story is deep, wide, and offers at least a few surprises, then I am up for it.

    Now don’t get me wrong, I liked Skyrim with its open world, but it is not really a non-linear game. Its storylines have, as a whole, a beginning, an end, and a plethora of roots, a trunk and a good load of branches that make the game seem non-linear. But, for example, I could not become a peasant labouring my own patch of ground in Skyrim, nor did I found a family with kids. My point is that Skyrim is a vast, linear game filled with choices, but even the choices stop where the author sees fit to evoke a storyline, or in this case many storylines, which gamers read as a non-linear game.

    So while one author wants you to discover who Geralt is through a line of hurdles, challenges and emotional acres that define the character, the story and its world, the other wants you to open up more choices and experience the game as you would enjoy it, through a line of hurdles, story choices and challenges, though we can’t say as much about the emotional acres that define the character, even while it still defines open-ended stories and the world your character evolves in.

    My final point is that the TW series and Skyrim are like comparing apples to oranges… isn’t it?

    Cheer ;)

    #27 11 months ago
  28. DreadSabot

    The Witcher 2 on full tilt could max out cards that are stronger than both the X1 and PS4 cards, so it’s no surprise that The Witcher 3 would be so demanding. Love the series but plan to play this one on PC, especially for the save importing.

    #28 11 months ago
  29. nollie4545

    Lol to the max. Shouldn’t the question be, ‘are next gen consoles capable of maxxing out TW3?’ Somehow I doubt it. Or its about as likely as an xbox 360 running TW2 with ubersampling enabled.

    More to the point, are the next gen GRAPHICS CARDS, going to be able to max out TW3??

    I don’t see why gamers are moaning out the ass about PCs and having to design games to cater for those with lower end hardware. Seriously, does Crysis 3 look terrible then merely because they had to allow for quality presets that will let you run it sensibly on an Athlon X4 and GTX 460? Of course not, the people with i7s and GTX 780s still got their eye candy and high frame rates, they were not disadvantaged in any way.

    What you guys have to remember is that what these next gen consoles have is totally unspectacular graphics hardware. Its going to blow the minds of 360 owners for sure, but PC gamers have had access to this kind of performance and fidelity for over 2 years, probably closer to 3. All i am saying is that those of you opting for a 500 quid console and hoping to get an experience comparable to that from a 500 quid graphics card are going to be very unhappy bunnies on Christmas day.

    #29 11 months ago
  30. m2stech

    Maxing out a Radeon 67x0 and a 77x0 shouldn’t be much of a challenge.

    #30 11 months ago
  31. Diingo

    Years ago Peter Molyneux said they maxed out the Xbox 360 with Fable 2.

    Obviously that was absolute horsesh!t but i’m sure it fooled many people.

    I highly doubt they’ve already maxed out next gen consoles.

    #31 11 months ago
  32. Pitts

    How is this negative news to some people?

    If the power is there… why not use as much of it as possible?

    #32 11 months ago
  33. sebastien rivas

    @33
    To be honest, I do not know how negative it could be, because to me it means the devs know their stuff and want to give the most to their gamers.
    Though I must be honest, I would say beware of the author of this article, because it sounds biased on many levels, particularly in implying that Sony/MS will, within a short lapse of time, not be able to keep up with next-gen games on their own next-gen consoles, hence that title ending in “… already?”

    Might this be true in the future? We all know yes. Is it already true? I am not sold on that.

    #33 11 months ago
  34. Llewelyn_MT

    The new consoles cost around €500. For that much money you can buy a more powerful PC that still won’t run The Witcher 2 in 1080p with ubersampling enabled. No wonder it’s maxed out already.

    People are deluding themselves thinking the next generation is years more powerful than middle end gaming PCs of 2013.

    #34 11 months ago
  35. Darkwaknight

    What a load of tosh!! Maxed out already… yeah, because The Witcher 2 is so inefficient it can max out a 780!! This is typical PR rubbish; the amount of times I have heard this.

    So we can now confirm that Projekt are using at least 6 cores on the CPU, passing back GPGPU processing and hitting the 176GB/s limit, along with using all the hUMA features to reduce re-reads and writes for coherent CPU/GPU work…??? Makes me laugh!!

    I missed the part where this game is handling so much more (aside from pretty visuals) than, say, GTA V on current gen… this and The Witcher 2 are not even in the same league.

    And as for visuals, yes indeed, this is much better than the PS4 tech demo shown at E3 running in real time

    http://www.youtube.com/watch?v=4j5xxi6cjjU

    The sad thing is a lot of people will believe this, and PC gamers will use it as proof of their misaligned superiority!! FFS. Fanboy journalism at its best and an arrogant team of coders; typical PC work, just throw hardware at the problem!!

    Anyone can lap a track faster with more power, but that is not to say a lower-BHP bike or car with a better rider or driver cannot equal or beat it; no difference here at all.

    #35 11 months ago
  36. nollie4545

    What the hell are you on about?? Seriously, do you understand computer hardware architecture at all??

    Even current gen dual-GPU cards like the 7990 cannot max out a PCI-E 2.0 slot, much less 3.0, which many recent motherboards support. Bandwidth in PCI-E lanes is NOT a limiting factor in gaming.

    There is NO performance advantage to having a GPU on-die or unified memory. System RAM is DDR3, GPU RAM is GDDR5. Two completely different animals which operate in different ways. They cannot function as effectively doing each other’s work.

    There is only one reason the next gen consoles elected for a GPU on die solution and that is: COST.

    If on-die GPU solutions provided a definite performance or efficiency advantage, it would have been done years ago in the PC world. How you expect to jam a processor containing 3.5, much less 7, billion transistors onto the same die as a CPU is beyond me. There is no point. All it does is generate a shit load more heat.

    #36 11 months ago
  37. Darkwaknight

    @37

    I think I could ask you the same question as based on your answer you clearly know nothing about hardware (or software) of any kind.

    If you knew anything you would know that having a CPU & GPU on the same die, along with unified RAM & hUMA, completely removes the Southbridge from the equation, which is your bus limit from your card; not to mention that the DDR3 is the slowest point, so it holds back your GPU’s GDDR anyway, along with the need to keep writing/erasing data and copying it all over the place to work!!

    Along with the fact that 0% of PC games use more than 2 cores and 99% only use 1. By having access to the same memory pool & using 4 or more cores (which will offset the latency from the GDDR5), the “console” can achieve more per clock cycle than is currently possible.

    The heat is only an issue if you do single-core threads (as you are clearly a PC gamer and this is what you are taught from all your benchmarking crap); by passing work out, any unused cores act as a heatsink for the others, and since the frequencies are not above 3GHz, heat is not an issue from the smaller die.

    Just do some checking first before you call someone out as your knowledge is all based on PR crap and Theoretical benchmark rubbish which means little to nothing in the real world.

    For example, on an i5 or i7 all the northbridge work is done by the CPU, thus wasting time and effort it could spend elsewhere, and that is the reason why desktop CPUs have such high frequencies: to go some way towards compensating for this. In these consoles a lot of the work is offset to separate silicon with small caches, allowing the CPU to only do the work that is needed, which is more efficient. On PC you use/waste so much on the OS alone, not to mention all the sound work that is done on the CPU, which is again handled off-die elsewhere in these consoles; even the screen render for I/O is handled by separate hardware. In reality you would need a PC that is twice as powerful (on paper) to even get close to equalling the performance of a console. And with these being designed with a lot more features that are just not available in PCs today, you cannot compare (like no retail 8-core CPUs, or coherent memory allocations, or even system-wide GDDR5); just because all you can buy is x does not mean there are not better options.

    If (and they will, soon) Intel and AMD start pushing single-chip APUs that outperform current “split” hybrid systems, the industry and the consumer will flock to them. Right now GPGPU is barely used (hence why stream processors are becoming the norm in GPU cards), but in 3 years or so most work will be processed through the far more powerful GPU, with all the cores of a CPU used for the cleverer stuff CPUs can do, whereas GPUs do all the heavy work due to their power (and being unaffected by latency).

    #37 11 months ago
  38. nollie4545

    As above- TL;DR.

    You consistently and continually fail to grasp something important here about gaming.

    The bottleneck 99% of gamers will face first, is the GPU. Or, more specifically, how rapidly the thing can crank out frames. To do this requires CORE poke.

    Faster core= more calculations= more frames.

    You are moaning out the ass about CPU this and CPU that.

    I don’t know what PC you have or have played or whatever understanding you might have of them, but I can assure you I am regularly playing games and they are regularly running on many threads across multiple cores. At no time is my CPU being at all bothered by games, all the CPU is doing is handing most of the work to the GPU. I can tell this from several sources:

    1. Various software suites can tell you core CPU loads.

    2. Various software suites can tell you GPU loads.

    3. Various software suites (and your ears!) can tell you how hot everything is getting.

    No matter what game I am playing, the GPU consistently does much more work. Modern games are NOT a problem for modern CPUs. Modern OSs are so lightweight they are NO problem for a modern system to run. In fact, most of us have an antivirus suite running as well, and then we’ll throw in Metro last light or Crysis 3 and game like crazy for hours. At no point does any of this really concern the CPU. In fact, I doubt there is a modern game out there which can stress an i7 to 50% of what something like prime95 would.

    Quite what crap you are on about regarding RAM I have no idea. DDR3 is suitable for a particular task. GDDR5 is a totally different beast, faster yes, but far less flexible, just like a modern GPU, which is a piece of silicon specifically designed to do a lot of parallel processing.

    The conclusion then is that GAMING PERFORMANCE is a product of the GPU’s core count and clock rate. No graphics card out there today is being held back by the need to channel data down a PCI-E lane. That is a fact.

    There is NO performance advantage to system-on-a-chip or on-die GPUs. If there was, the tech would have arrived in the PC world years ago; it isn’t some latest and greatest invention, it’s been done for a decade.

    As I have said a dozen times on here, the ONLY reason both console makers have elected to use a system on a chip from AMD is cost. IT IS CHEAP.

    If you think your console is going to magically whoop a system with a £500 GPU on board, you’re losing the plot. Seriously.

    #38 11 months ago
  39. Darkwaknight

    Again you are just spouting crap about nothing; you plugged a few Lego parts into a board and you think you are a tech engineer.

    You make me laugh with your “I can see cores are being used in task manager so they are”. If you understood how these things work you would know that PCs are inherently powerful beasts (if a good spec) but are hugely wasteful. Your naivety about lightweight OSes?!!? Really, compared to a console with a single focus? Factor in motherboard drivers, graphics card drivers, sound, etc., etc.; there is nothing lightweight about them.

    My example for the CPU is not that they are anything powerful; they do not have to be. A 7990 or 780 is not being maxed out on processing; the bottleneck comes from how fast or slow it can be handed data to crunch from the CPU, and in turn this is restricted by all the bottlenecks. For example, how can a game like BF4 be shipped and run on near enough the same specs as BF3 and yet be doing loads more? Being a gamer, you should see how the CPU has little bearing on BF3 compared to the GPU… because all they do is hurl the workload at the GPU. Due to the latency from the GDDR and all of the physics going on, the CPU will spend stacks of cycles waiting for the return value from the GPU’s work. These games are not designed to parallel-process the data; the perceived bottleneck or performance hit is due to the inefficient method of this work. The CPU is not being taxed, as the GPU is doing all the work and the CPU is waiting around. Factor in the wasted cycles in both instances for the cache copies and you have the same thing as cutting your toenails with a chainsaw. By hurling more core frequency at the issue you get faster returns and thus more work. The point is, with consoles you can work smarter, not harder, to achieve the same results; PC gaming has for too long been caught in the “to be better we need more ROPs or GHz” mindset. It is a solution, but an expensive one.

    My point is, no, I do not think that a £350 console can compete with a £600 GPU. What I do think is that, used effectively, they can and will produce the same results as a PC that is twice as theoretically powerful!!

    #39 11 months ago
  40. nollie4545

    No idea what you are on about. Seriously.

    I have built several machines in my time, with different specs. The GPU is the limiting factor.

    Maybe once you start getting to insane resolutions and have multi-GPU setups, the CPU might become the limiting factor, but for the average gamer who is at 1080, it’s semantics whether he can hit 60fps or 120fps.

    The CPU in a modern gaming platform does virtually no work.

    I don’t know what piss weak CPUs you are on about but since there is virtually no gaming advantage between a 4 core i5 and a 6 core i7, you can only conclude CPU poke is irrelevant.

    NO MODERN GRAPHICS CARD CAN SATURATE PCI-E 2.0. THE END.

    #40 11 months ago
  41. Darkwaknight

    No, you are wrong; 8GB/s is your limit for your PCI-E 2.0 slot. What you are still failing to understand is beyond what you know from just building PCs.

    I am not here to argue with anyone, but I have built and do build desktops/servers/farms and I am a coder myself. The difference now is that on PC they hammer the GPU to do all the work locally, using the fast GDDR onboard to do all the render work. With 176GB/s and (now listen, this is the new bit) by offloading work away from solely the GPU, and more importantly both devices being able to share the same data without having to flush or renew it, the CPU can be used for sections that on a PC the GPU is doing. Modern PC games do not tax the CPU, as the GPU is the powerhouse, and the bottleneck limits what can be done between the two!! THE END!!

    #41 11 months ago
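
For rough context on the figures being traded in the last few comments, here is a back-of-the-envelope comparison using published peak numbers only. Note the comparison itself is apples to oranges: the PCI-E figures describe the link between a CPU and a discrete graphics card, while 176GB/s is the PS4’s local memory bandwidth.

```cpp
// Back-of-the-envelope comparison of the peak bandwidth figures quoted above.
// Published peak numbers only; sustained real-world throughput is lower.
#include <cstdio>

int main() {
    const double pcie2_x16 = 8.0;    // GB/s per direction, PCI-E 2.0 x16 slot
    const double pcie3_x16 = 15.75;  // GB/s per direction, PCI-E 3.0 x16 slot
    const double ps4_gddr5 = 176.0;  // GB/s, PS4 unified GDDR5 (published spec)

    std::printf("PS4 GDDR5 vs PCI-E 2.0 x16: %.0fx\n", ps4_gddr5 / pcie2_x16);  // ~22x
    std::printf("PS4 GDDR5 vs PCI-E 3.0 x16: %.1fx\n", ps4_gddr5 / pcie3_x16);  // ~11.2x
}
```

Whether that gap matters in practice, i.e. whether games are actually limited by the bus or by the GPU itself, is exactly the point the two commenters are arguing.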
  42. nollie4545

    Listen, a modern GPU card is not yet able to saturate a PCI-E 2.0 lane, much less a PCI-E 3.0 one.

    What the CPU can or cannot do is irrelevant, the GPU is the one doing trillions of calculations in parallel processing. GDDR5 is suitable for this kind of task, DDR3 is a different beast. CPUs are not the ideal tool for huge amounts of parallel processing, hence the use of GPU technology.

    There is NO performance advantage on having a system on a chip system, it is done for reasons of cost and simplicity.

    And there is no fucking way your console, with its last gen GPU on die, is going to come near the potential of a PC which cost not much more money.

    There is no performance bottleneck in modern gaming machines from having second rate CPUs or narrow PCI-E lanes. The GPU always runs into a wall first in 99% of situations.

    A GPU and CPU are totally different devices, if one could do both, then intel would not be bothering with ivybridge type systems where both are on the same die.

    This lot has all been benchmarked. Whether the GPU and CPU are on the same silicon is irrelevant, it all comes down to the power of the GPU.

    I can point you at a myriad of benchmark tools which are designed to give complete parallel processing testing, some of which you can run on GPU or CPU. CPUs are consistently less effective at that kind of task. It’s how they are made.

    Once again, there is NO ADVANTAGE to offloading processing needs to the CPU. Its memory is less effective at the task.

    It’s all irrelevant anyway, since Nvidia are going to launch a GPU soon with its own on-board CPU, which will further lower the load on the CPU itself. A CPU does nothing more than shovel data to the GPU, which is what’s displaying the game anyway.

    #42 11 months ago
  43. Darkwaknight

    Look, let me try and simplify it for you with an example.

    Let’s take AI as an example (a task better suited to the weaker but more intelligent CPU).

    In a PC, if this was done by the CPU, then for it to decide whether the character should raise his gun, run away or scratch his arse, it needs to tell the GPU to render the target through this process. If you then shoot your gun, at that point another conversation has to happen to update the GPU with this new action; meanwhile the GPU is still running the previous task, so it has to be told (again) to stop and run the “save ass” process now.
    This results in the CPU having to flush and the GPU doing the same, thus wasting loads of cycles on a process that is a waste.

    Now, if the GPU and CPU have the ability to not only share the RAM but, more importantly, see each other’s cache, then this allows each to “snoop” on the other’s data without the need to flush/copy/repeat to find the same answer. In turn this means that the GPU can use these “wasted” cycles on even more parallel jobs.

    There is FAR too much to go into on how this works, not to mention the API from MS and Sony that have a more refined set-up.

    As an example of how these are NOT standard parts, the GPU in the PS4 has 64 source compute commands; a standard 7870/7970 has only 2 ACEs. The whole thing has been beefed up for a far more parallel GPU than you are used to seeing in the PC world, along with it being very well fed from RAM.

    Now this does not make it more powerful on its own, but it does allow it to do far more work, better, and that will translate on screen.

    And the 64 ACEs would be enough to saturate even a PCI-E 3.0 card; the 16GB/s limit would be smashed. What you will see after the first wave of games, all made to incomplete HW specs and the much older and weaker GNM, or even a lot using the even weaker GNMX API for Sony, is games that will show content and visual fidelity far above what you see now. Hell, even Microsoft are bumping the DX11 API, as on PC SO MUCH is done by this that games developers cannot get right into the GPU (or have the time to maximise one set-up with such a huge array on the market). You are comparing apples to oranges… just wait and see!!

    #43 11 months ago
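
Hardware specifics aside, the copy-versus-share distinction in the AI example above can be shown in plain code. This is a conceptual sketch only, using ordinary C++ structs as stand-ins for CPU- and GPU-visible memory; it illustrates the mechanism being described, not how much it matters in practice, which is what the thread is disputing.

```cpp
// Conceptual sketch: "copy across a bus" vs "share one coherent pool".
// Plain C++ stand-ins only; this is not real GPU or hUMA code.
#include <cstdio>
#include <cstring>

struct AIState { float targetX; float targetY; };

// Discrete-GPU style: the GPU-side work sees a *copy*, so every CPU update
// must be transferred again before the GPU-side data is current.
void gpuWorkOnCopy(const AIState* gpuLocal) {
    std::printf("copy:   %.1f %.1f\n", gpuLocal->targetX, gpuLocal->targetY);
}

// Unified-memory style: both sides read the *same* buffer, with no transfer step.
void gpuWorkShared(const AIState* shared) {
    std::printf("shared: %.1f %.1f\n", shared->targetX, shared->targetY);
}

int main() {
    AIState cpuSide{10.0f, 5.0f};

    AIState gpuLocal;
    std::memcpy(&gpuLocal, &cpuSide, sizeof cpuSide);  // the "bus transfer" each update
    gpuWorkOnCopy(&gpuLocal);

    cpuSide.targetX = 12.0f;   // the CPU's AI changes its decision...
    gpuWorkOnCopy(&gpuLocal);  // ...the copied view is stale until it is re-sent
    gpuWorkShared(&cpuSide);   // ...while a shared view sees the change immediately
}
```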
  44. nollie4545

    You’re getting your pants in a real twist.

    The whole reason Sony and MS have gone with AMD and their cheapskate APU, is cost.

    If you think this thing is going to magically run Crysis 3 at 1080P and maximum settings at 60fps, you’re barking mad.

    I don’t care, nor would any other gamer, how efficient my CPU is during gaming; it’s irrelevant. The GPU is doing the work; it is the bottleneck. However many pixels and frames it can crunch per second is what gives me the ability to turn up all those quality presets and still get a playable frame rate. The CPU has virtually no bearing on this, by comparison. Modern processors are so potent that gaming is the least stressful thing many of them will ever do. The OSes are now ridiculously lightweight; I can see the exact poke required to run them in real time, and it’s nothing.

    As I have said before, if there really was a genuine performance advantage in this technology, it would have happened in the PC world decades ago. System-on-a-chip is not some new invention; it’s been around for donkey’s years.

    No PC gamer lost any sleep worrying about how ‘slow’ a PCI-E lane was.

    #44 11 months ago