On Wed, 12 Jun 2013 16:40:05 +1000 Manu <[email protected]> wrote:
> On 11 June 2013 20:52, Nick Sabalausky
> <[email protected]> wrote:
>
> > On Tue, 11 Jun 2013 11:39:00 +0200
> > "nazriel" <[email protected]> wrote:
> >
> > > But there seem to be some quirks with those CPUs:
> > >
> > > http://www.mcvuk.com/news/read/ps4-and-xbox-one-s-amd-jaguar-cpu-examined/0116297
> >
> > Wow, given the abilities of the PS3 and 360, that article reads like
> > the comments of a spoiled brat.
>
> Abilities? I think they're thoroughly uninteresting and totally
> underwhelming hardware.
> They're pretty weak. I wish they'd stuck with (multiple) ridiculously
> high clocked PPC's personally.
I don't doubt there's hardware out there with more raw power, but I'm talking "abilities" in terms of observable end results here. Consider, for example, Little Big Planet, the Cod: Modern Warfare series (yum, fish! ;)), the upcoming "Time and Eternity", or really just about anything built on Unreal Engine 3 (no offense intended to MAX-FX 3, of course; I just haven't actually tried Alan Wake). Hell, even Wind Waker, and that's previous generation.

Any extra computational power available typically goes primarily to graphics (some goes to physics, and maybe a few other things too, but in most cases it's primarily graphics). But the thing is, good graphics have more to do with art direction than with computational power. That was always true to a certain extent (the 16-bit Sonics and Marios were far better looking than most of the games on the N64/PS1/Sat), but with modern hardware it's more true now than ever, and that will only continue as hardware gets even more powerful.

The end result is that we've already hit a point where more hardware power buys only increasingly marginal graphical improvements: more resolution and texture detail, more triangles, improved shadows, etc. It's all just tweaking the details. And even that's relevant mainly to the decreasing proportion of games going for a photorealistic or Pixar-like style. Things like Shank, Unfinished Swan, or Terraria (the #2 PSN game last month) wouldn't benefit much from increased horsepower. Even the *#1* PSN game last month, FarCry Blood Dragon (fantastic game, BTW), looks mostly like something off of the XBox 1 (the first XBox 1, not the upcoming new one). Granted, there are other things besides graphics that can still be improved with more raw power (physics, for example), but even for those it's still mostly just improving details at this point.
Even if MS/Sony had gone with top-of-the-line hardware, there aren't a whole lot of truly significant things it would have opened the door for (maybe some gameplay based on good 3D fluid dynamics?), and even those would only be applicable to a minority of titles.

And I think the console manufacturers are well aware of all this. Nintendo definitely is; they long since switched gears from wowing people with high-fidelity graphics to providing fun games with original interfaces at much more reasonable prices. The trouble they had with this on the Wii mainly came from third parties reacting to the lack of focus on graphics-crunching by shunning the system (and yet the Wii still managed alright). Sony has certainly taken note of the value of indie titles, which are definitely not power-hungry in most cases. And for any of the console manufacturers, it would be very difficult not to notice how much harder it's becoming for the computational-power-pushing AAA titles to flourish: increasingly expensive to develop, increasingly dependent on huge sales numbers to stay profitable, increasing risk, and increasing competition from inexpensive indie and otherwise non-AAA-blockbuster titles.

> > I think it sounds encouraging: It means the next gen might not end up
> > pulling a 3DO on price like their predecessors did. It damn near
> > killed the PS3, which took Sony some major work to finally turn
> > around.
>
> Indeed, they're obviously designed to be cheap this time... or they'd
> be better, and more interesting ;)

Yea, and I think it was necessary. Consider the mobile space: those things routinely pack in $600+ worth of hardware and can get away with it because it's at least partially subsidized by the nearly $100/mo cellular contracts many users pay. (Plus many of them can hook up to TVs and add-on gamepads.)
Console hardware has the benefit of not needing to be mobile, but it can't be subsidized nearly as much as mobile can, so packing in truly advanced hardware, significantly beyond what you can get in other devices, would have meant a prohibitive price tag. Sony attempted that in the early days of the PS3, and (along with the system's developer-unfriendly nature) it damn near killed them until they revised the PS3 and did their best to reverse course. The same strategy *did* kill the 3DO.

> > In any case, it is nice that they're using x86. Seems like a smart
> > choice. I'll admit, when I first heard about it I was surprised at
> > least one of them didn't go ARM, but x86 does seem to make more
> > sense for a major games console at this particular point.
>
> Personally, I think it's disappointing. x86's key advantage is being
> able to run crappy desktop code fast.
> Games are not usually 'crappy desktop code', they're carefully tuned,
> purpose-specific code.
> x86 uses MASSIVE amounts of its CPU real estate to tolerate crappy
> code. I'd rather use that CPU real estate on more raw power, and put
> the responsibility on the engine's engineering merits to make the most
> of it.
>
> This move sets a low upper limit, and the bar will start high. I don't
> anticipate you'll see much tier-ing between 1st gen -> 3rd/4th gen
> games this time round.

Keep in mind, indie is a *big* thing these days, and I see no sign of its growth leveling off any time soon. (Even a lot of the AAA houses have started doing more games on an indie-like scale - the new Death Rally, for example ;) ). Combine that with the fact that the console manufacturers have learned (the hard way, in some cases) that being developer-friendly is absolutely critical, and I think x86 is the prudent choice, even if it isn't the most powerful one. From your first DConf talk, it sounds like even your company has been reaping some benefits from this.
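For what it's worth, the "crappy desktop code" vs. "carefully tuned, purpose-specific code" distinction can be made concrete with a toy example. This is purely my own sketch (nothing from anyone's actual engine, and the function names are made up); it assumes x86 with SSE, and that the array length is a multiple of four:

```c
/* Toy contrast: a plain scalar loop (the CPU's out-of-order machinery
 * does the heavy lifting) vs. an explicitly vectorized SSE version,
 * the kind of hand-tuned loop game engines favor. */
#include <xmmintrin.h> /* SSE intrinsics (x86 only) */

/* Scalar add: correct anywhere, tuned by nobody. */
static void add_scalar(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

/* Hand-vectorized add: four float lanes per instruction.
 * Assumes n is a multiple of 4. */
static void add_sse(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i); /* unaligned load of 4 floats */
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
}
```

On PPC the second function would be written against VMX/AltiVec instead (e.g. `vec_add` on `vector float`), which is the unit Manu is praising below.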
If the X Bone (or XBox 3, or "second" XBox 1, or whatever the heck I should be calling it to disambiguate from the one I bought ten years ago) had been using ARM or PPC, it would have been harder (maybe even prohibitively so?) to integrate D when you did.

> ARM might be better/more interesting than x86, but I actually still
> think PPC is a good architecture for the purpose. VMX/SPU is still
> the best SIMD unit.

I don't doubt that. Especially since I really wouldn't know anyway :) The last time I dealt with anything that low-level, it was Parallax's Propeller microcontroller, which is clearly not even in the same ballpark. (A fun little device, though. An 8-core microcontroller at roughly the price point of the BASIC Stamp? Yes, please!)
