AMD boss talks up next Xbox’s graphics and AI

Monday, 18th July 2011 02:17 GMT By Brenna Hillier

AMD’s Neal Robison has dropped hints on what to expect from Microsoft’s new console.

The new console will have graphics on par with James Cameron’s Avatar, Robison told the US version of OXM, as reported by the Examiner.

Commenting that gamers have much to be excited about, Robison reportedly said AI and physics will be a major focus for the console’s powers. Open-world games in particular will benefit from this, with crowds acting as individuals rather than predictable mobs.

It’s not clear how hardware alone is supposed to make this happen, but perhaps Robison means Microsoft is ensuring the console supplies specific capabilities with which developers can implement such features.

The Xbox 360 uses ATI’s Xenos chip to power its graphics, suggesting AMD is likely to be in the know if Microsoft is (inevitably) researching the console’s successor. The original Xbox used Nvidia graphics hardware.

Thanks, CVG.



  1. Hybridpsycho

    *cough* BULLSHIT *cough*

It would cost at least €600.

    #1 3 years ago
  2. Phoenixblight

@1 Obviously. No current game pushes that; not even with a super rig of a computer could you pull it off. When the 360 was announced, MS was saying it would be able to pull off Toy Story 2 graphics in real time. Still waiting, MS. Sony has gotten to Toy Story 2 with pre-rendering, but no actual real-time graphics at that level.

    #2 3 years ago
  3. Dannybuoy

    I’m looking forward to the next gen, but I’m a bit worried that development costs will be so high and take such a long time that we’re mostly going to have huge blockbuster type games. These’ll probably cost quite a lot at retail.

    #3 3 years ago
  4. themadjock

    Waits for Erth to tell us all that his PC can already do graphics like that…

    #4 3 years ago
  5. manamana

    ^ … in 3D.

    #5 3 years ago
  6. NiceFellow

    There is zero chance of this unless the next gen is a lot further away than presumed.

    What is it with wacky PR statements at the moment? You’ve got this, and just a few days earlier the MS guy happily ignoring that Nintendo first-party titles both outsell and outscore MS first-party titles (in fact I’m not sure he even understood what first party is supposed to mean, as he was surely including Gears in his views)… all eyes on Sony now. Surely they can’t let a moment like this go without making some outlandish claims of their own?

    #6 3 years ago
  7. monkeygourmet

    I’m hoping MS and Sony pack absolutely huge amounts of power into their new consoles (high-end graphics cards, at least 8GB of RAM, super-fast processors, etc.) so that they last 10-plus years.

    That way they could sell at a loss for two-plus years, then start to recoup as the console aged.

    Consoles need to have bigger lifespans these days; the 360 and PS3 have proved this.

    With the huge cost of game development, not to mention how many years it takes to make a blockbuster, this would ensure the new machines were as future-proof as possible.

    Then maybe we could get graphics close to Avatar for a £350-£400 launch price.

    MS can definitely afford to take a bit of a hit for a few years when it comes to making a loss at launch; XBLA should be able to pick up some of the slack for them.

    Also, why don’t they purchase OnLive? It’s not like they are busy purchasing first-party games studios…

    I’m starting to hate Nintendo’s “must make a profit at launch” philosophy regarding their console launches. When mobile phones are almost more powerful than a console you are just about to launch, well, that’s just plain wrong.

    #7 3 years ago
  8. Blerk

    Wonder if they’re going to be including a dedicated physics chip? Did they ever catch on for the PC?

    #8 3 years ago
  9. Maximum Payne

    @4 Well, in theory it could.
    IMO maybe not Avatar graphics, but the next-gen consoles will be strong enough for something like Toy Story 1. Just look at God of War 3: it looks better than God of War 2’s CGI, so the next generation of consoles will again be a big leap, just like every generation, because the technology is new. The PS1 and PS2 didn’t have dual cores, and now everything is multicore, so the next console could have, say, 6-8 cores and a graphics card like a 6850 (€170 in my country), and you could play 95% of all games at 1080p/60 frames.

    #9 3 years ago
  10. Shubb9

    @8 Not really. Nvidia bought out Ageia, who made the separate PhysX cards, and then included the drivers in the Forceware package so physics could be handled by the graphics card rather than a separate bit of kit. But to my knowledge Batman: AA was the only top-tier game to implement it well. It really helped the immersion when done well; using the claw to pull the walls open and seeing all the pieces flying out looked awesome. The problem was ATI never got on board, so it was Nvidia or GTFO.

    #10 3 years ago
  11. Blerk

    Ah, now that’s a shame.

    #11 3 years ago
  12. Shubb9

    @11 It really was, because when you had PhysX set to max it added loads of fluttering banners hanging from the ceiling and other incidental things, like more papers on the floor being blown around by the wind. It was a richer world to explore. I definitely want to see more of this in the next generation of consoles, but as it’s not a thing shared between the red and green teams, it looks like it’ll stall again.

    #12 3 years ago
  13. Christopher Jack

    @10 How could AMD (formerly ATI) get on board when Nvidia bought them out?

    #13 3 years ago
  14. Shubb9

    @13 I wasn’t clear enough. Nvidia should have made it more of an open thing rather than trying to make AMD/ATI pay for using their tech, thereby benefiting gamers across the board. I think ATI at the time were set on using Havok. My point was that there was no agreement on a standard, so devs wouldn’t back it, as it would exclude a large portion of the audience (especially as Nvidia has been losing its dominant market share while AMD’s product line-up has improved in recent years); it took Nvidia throwing cash at devs to get them to use it. That’s why it didn’t take off as I would’ve liked. I wasn’t trying to suggest AMD refused to use a good thing; they just didn’t want to pay their rivals for it.
    TL;DR: no cooperation between competitors, unsurprisingly.

    #14 3 years ago

Comments are now closed on this article.