
Quick comments on working with each console

by: John
Reddit user and Munkyfun game developer Cory Bloyd has posted about working with some old and current consoles. He goes all the way back from the PlayStation 1 days to the current consoles we have today and offers up quick insight into what it's like to work on each of them.

It goes to show you that not all consoles are the same, of course, and that the quality of the tools can make a difference. As a programmer myself, I'd hate to have worked on the PSX, as the tools didn't include a debugger. You built it and ran it. Ugh. The N64 had a buggy debugger, which is kind of funny. Anyways, here's his quick summary of what each console he worked on had in its favor and what it didn't.
  • PlayStation 1: Everything is simple and straightforward. With a few years of dedication, one person could understand the entire PS1 down to the bit level. Compared to what you could do on PCs of the time, it was amazing. But, every step of the way you said "Really? I gotta do it that way? God damn. OK, I guess... Give me a couple weeks." There was effectively no debugger. You launched your build and watched what happened.
  • N64: Everything just kinda works. For the most part, it was fast and flexible. You never felt like you were utilizing it well. But, it was OK because your half-assed efforts usually looked better than most PS1 games. Each megabyte on the cartridge cost serious money. There was a debugger, but the debugger would sometimes have completely random bugs such as off-by-one errors in the type determination of the watch window (displaying your variables by reinterpreting the bits as the type that was declared just prior to the actual type of the variable --true story; a snippet after this list shows what that looked like).
  • Dreamcast: The CPU was weird (Hitachi SH-4). The GPU was weird (a predecessor to the PowerVR chips in modern iPhones). There were a bunch of features you didn't know how to use. Microsoft kinda, almost talked about setting it up as a PC-like DirectX box, but didn't follow through. That wouldn't have worked out anyway. It seemed like it could be really cool. But man, the PS2 is gonna be so much better!
  • PS2: You are handed a 10-inch-thick stack of manuals written by Japanese hardware engineers. The first time you read the stack, nothing makes any sense at all. The second time you read the stack, the 3rd book makes a bit more sense because of what you learned in the 8th book. The machine has 10 different processors (IOP, SPU1&2, MDEC, R5900, VU0&1, GIF, VIF, GS) and 6 different memory spaces (IOP, SPU, CPU, GS, VU0&1) that all work in completely different ways. There are so many amazing things you can do, but everything requires backflips through invisible blades of segfault. Getting the first triangle to appear on the screen took some teams over a month because it involved routing commands through R5900->VIF->VU1->GIF->GS oddities with no feedback about what you were doing wrong until you got every step along the way to be correct (a simplified sketch of that chain follows this list). If you were willing to twist your game to fit the machine, you could get awesome results. There was a debugger for the main CPU (R5900). It worked pretty OK. For the rest of the processors, you just had to write code without bugs.
  • GameCube: I didn't work with the GC much. It seems really flexible. Like you could do anything, but nothing would be terribly bad or great. The GPU wasn't very fast, but its features were tragically underutilized compared to the Xbox. The CPU had incredibly low-latency RAM. Any messy, pointer-chasing, complicated data structure you could imagine should be just fine (in theory). Just do it. But, more than half of the RAM was split off behind an amazingly high-latency barrier. So, you had to manually organize your data into active vs. bulk (see the example after this list). It had a half-assed SIMD that would do 2 floats at a time instead of 1 or 4.
  • PSP: Didn't do much here either. It was played up as a trimmed-down PS2, but from the inside it felt more like a bulked-up PS1. They tried to bolt on some parts to make it less of a pain to work with, but those parts felt clumsy compared to the original design. Having pretty much the full-speed PS2 rasterizer for a smaller-resolution display meant you didn't worry about blending pixels.
  • Xbox: Smells like a PC. There were a few tricks you could dig into to push the machine. But, for the most part it was enough of a blessing to have a single, consistent PC spec to develop against. The debugger worked! It really, really worked! PIX was hand-delivered by angels.
  • Xbox360: Other than the big-endian thing, it really smells like a PC --until you dig into it. The GPU is great --except that the limited EDRAM means that you have to draw your scene twice to comply with the anti-aliasing requirement? WTF! Holy crap, there are a lot of SIMD registers! 4 floats x 128 registers x 6 register banks = 12K of registers! You are handed DX9 and everything works out of the box. But, if you dig in, you find better ways to do things. Deeper and deeper. Eventually, your code looks nothing like PC-DX9 and it works soooo much better than it did before! The debugger is awesome! PIX! PIX! I Kiss You!
  • PS3: A 95 pound box shows up on your desk with a printout of the 24-step instructions for how to turn it on for the first time. Everyone tries, most people fail to turn it on. Eventually, one guy goes around and sets up everyone else's machine. There's only one CPU. It seems like it might be able to do everything, but it can't. The SPUs seem like they should be really awesome, but not for anything you or anyone else is doing. The CPU debugger works pretty OK. There is no SPU debugger. There was nothing like PIX at first. Eventually some Sony 1st-party devs got fed up and made their own PIX-like GPU debugger. The GPU is very, very disappointing... Most people try to stick to working with the CPU, but it can't handle the workload. A few people dig deep into the SPUs and, Dear God, they are fast! Unfortunately, they eventually figure out that the SPUs need to be devoted almost full time making up for the weaknesses of the GPU.
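To make that N64 watch-window bug concrete, here's a hypothetical reproduction in C. The variable names are invented; the off-by-one behavior is as Bloyd describes it: the debugger displayed a variable by reinterpreting its bits as the type of whatever was declared just before it.

```c
/* Hypothetical reproduction of the N64 watch-window bug described above.
 * The debugger's type lookup was off by one, so it showed a variable's
 * bits reinterpreted as the type declared just BEFORE it. */
#include <stdio.h>
#include <string.h>

int main(void) {
    int   frame_count = 60;    /* declared first...                        */
    float move_speed  = 2.5f;  /* ...so the watch window would have shown  */
                               /* move_speed's bits interpreted as an int. */
    int shown;
    memcpy(&shown, &move_speed, sizeof shown);        /* reinterpret bits */
    printf("actual value:        %f\n", move_speed);  /* 2.500000   */
    printf("watch window showed: %d\n", shown);       /* 1075838976 */
    (void)frame_count;
    return 0;
}
```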
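And here's the PS2 chain from the bullet above as a sketch. To be clear, this is not real SDK code: the struct layouts and function names (the GifTag fields, vif_unpack, vu1_kick, gif_to_gs) are simplified stand-ins for the bit-packed tags and DMA transfers documented in Sony's manuals. The point is the shape of the problem: your data had to survive every hop, and a malformed packet at any stage failed silently.

```c
/* Illustrative sketch of the PS2 triangle-submission chain -- NOT real
 * SDK code. Real GIF tags and DMA tags are bit-packed 128-bit words. */
#include <stdint.h>
#include <stdio.h>

/* Simplified stand-in for a GIF tag, which tells the GS what follows. */
typedef struct {
    uint32_t nloop;  /* number of primitive data loops that follow */
    uint32_t eop;    /* end-of-packet flag                         */
    uint32_t prim;   /* primitive type, e.g. flat-shaded triangle  */
    uint32_t nreg;   /* GS registers written per loop (XYZ, RGBA)  */
} GifTag;

/* Simplified vertex as the GS might receive it: color + screen xyz. */
typedef struct {
    uint8_t  r, g, b, a;
    uint16_t x, y;   /* fixed-point screen coords on real hardware */
    uint32_t z;
} GsVertex;

/* Each hop could silently drop a malformed packet, which is why
 * "no feedback until every step is correct" hurt so much. */
static int vif_unpack(const void *p) { return p != NULL; } /* VIF: unpack to VU mem  */
static int vu1_kick(const void *p)   { return p != NULL; } /* VU1: run microprogram  */
static int gif_to_gs(const void *p)  { return p != NULL; } /* GIF: feed tags to GS   */

int main(void) {
    GifTag tag = { .nloop = 3, .eop = 1, .prim = 0, .nreg = 2 };
    GsVertex tri[3] = {
        { 255,   0,   0, 128, 100, 100, 0 },
        {   0, 255,   0, 128, 200, 100, 0 },
        {   0,   0, 255, 128, 150, 200, 0 },
    };
    /* R5900 -> VIF -> VU1 -> GIF -> GS: one wrong field anywhere and
     * the screen simply stays black. */
    if (vif_unpack(&tag) && vu1_kick(tri) && gif_to_gs(&tag))
        puts("triangle submitted (in spirit)");
    return 0;
}
```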
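The GameCube's active-vs-bulk split is also easy to sketch. Again, this is a hypothetical illustration with invented names and sizes, not real SDK code: the fast pool stands in for the low-latency RAM, the bulk pool for the memory behind the high-latency barrier, and dma_copy for the transfer engine you'd actually use. The rule of thumb: chase pointers in the fast pool all you like, but touch the slow pool only in big sequential transfers.

```c
/* Sketch of the "active vs. bulk" split described above -- hypothetical
 * names and sizes, not real SDK code. */
#include <stdio.h>
#include <string.h>

#define ACTIVE_POOL_SIZE (16u * 1024 * 1024) /* low-latency: live game data   */
#define BULK_POOL_SIZE   (24u * 1024 * 1024) /* high-latency: streamed assets */

static unsigned char active_pool[ACTIVE_POOL_SIZE]; /* pointer-chasing is fine here */
static unsigned char bulk_pool[BULK_POOL_SIZE];     /* never read this directly     */

/* Stand-in for the DMA engine: bulk memory pays off only in large,
 * sequential block transfers, never scattered loads. */
static void dma_copy(void *dst, const void *src, size_t n) {
    memcpy(dst, src, n);
}

int main(void) {
    /* Stage a 64 KB texture from bulk storage into the fast pool before use. */
    dma_copy(active_pool, bulk_pool, 64u * 1024);
    puts("asset staged into low-latency RAM");
    return 0;
}
```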