Wednesday, July 19, 2006

Update in the Industry

So here’s a quick lowdown on what’s going on inside the industry.

Hardware

Sony is trying to retain its grasp on the 60% market share it owns in the video game console business. With Microsoft already two steps ahead of them and an entire year’s worth of a head start in the “next gen” console wars, Sony is going to have to race hard to catch up. Sony, of course, is fumbling right now with a proposed market price for the PS3 well above what it should be. At roughly $500, some people say the PS3 is overpriced for what it offers, while Sony maintains the stance that it is an excellent value.

Under the hood of the two machines are multiple processors that would make a modern-day PC cringe. The special hardware in these two machines is a direct descendant of the supercomputers used in the early ’90s. Ironic that its only use now is video games!

Of course, after looking in depth at both architectures, or at least what’s printed and available for the public to read (actual console details are hard to come by, since most companies like to keep that info a secret), I’ve come to the conclusion that Sony *may* have the better tech. Of course, I have seen reports out there claiming there are unresolved issues with the Sony architecture, but you never know what the truth is and what is simply propaganda.

The reason why I care about this so much should be obvious. I love watching two giants go head to head. Granted, I don’t want either of them to win, or Nintendo for that matter, but I want this battle to rage on for eons, spurring innovation and massive changes in the industry.

Oh, and I almost forgot to talk about Nintendo there for a second. Of course, Nintendo is staying out of this whole “next gen” war simply by saying they are not competing on the same level as Microsoft and Sony. That is definitely true, and a very smart move by Nintendo. By releasing a technologically inferior machine with several innovations, not in tech but in gameplay, Nintendo is basically targeting a new audience here… and I think they’ll win with this strategy!

In the PC world, the battle between NVIDIA and ATI has heated up. ATI has publicly announced that they will be using a new architecture for their chips that unifies the pixel and vertex shaders and introduces a new shader called the geometry shader. They will be taking all three shader stages and unifying them into the appropriately named unified shader.

NVIDIA, on the other hand, wants to keep the three shaders separate. Why, you may ask? Well, I’m just speculating here, but my guess is that the unified shader model ATI is chasing gets compiled and then runs on several chips. The problem with this is that the shaders are then dependent on the compiler and the chip design to make certain optimizations that, in the hands of a programmer, are straightforward but require special attention. What this means is that, while the unified model may be easier to program and easier to write new shaders with, it may be inefficient and slow, versus the NVIDIA approach, which requires programmers to pay special attention to the shaders and increases the chance of them being blazingly fast!

Games

The idea of the Grand Theft Auto series moving to the Xbox 360 platform is sacrilegious to some people, but I welcome the move by Rockstar Games and I congratulate them for expanding their horizons. Combined with the new in-house developed engine and technology, the next installment of the Grand Theft Auto series is sure to make headlines… as long as Take-Two and Rockstar can survive the countless legal proceedings in front of them. Everyone from the SEC to the infamous former First Lady, Hillary Clinton, is hot on the heels of the company, trying to destroy it in any way possible over sex and violence in video games.

When will these people just grow up and realize… it’s just a game!

On to another game with broad popularity. Halo 3 has debuted with its trailer and a subsequent “making of” short film available, well, everywhere. As usual, Bungie is pushing the technological limits of the 360 in order to, well, I don’t know why, but they’re pushing the technological limits again with stuff we barely pay attention to. Who really cares if the Master Chief’s visor has true reflections on it anyway? Yeah, sure, it adds to the realism of the game… but give me some more god-damned explosions! I want to see things blow up!!!

Ahh, yes, Final Fantasy is still around and coming out with another installment as well. Rumor has it they’ve also started developing for the Xbox. Is this true? If it is, that could mean another thorn in the side of Sony as their all-time largest franchise starts to branch out onto other consoles.

That’s all for now, but I will keep you informed of the latest developments as I have time and as they come up.

-Ken Noland

Thursday, July 13, 2006

Online Distribution

Online Distribution?

With publishing methods like Steam, Vapour Online, and even FileFront, it seems like everyone is scrambling to become the largest digital distributor of all. Yeah, sure, there are tons of these guys all over the place. Some small shops have started putting up their own shopping carts and building web-based distribution methods, while apps like Steam and Vapour Online act as their own independent installation mechanisms.

Steam seems to be the poster child for digital distribution and appears to have the largest customer base, yet even Valve still requires a traditional publisher to get to store shelves. In the case of the most popular Steam title, Half-Life 2, they went with EA to put the game out there in your favorite stores. While the market data has not been published, I estimate that for every copy sold in stores, at least two copies are purchased online.

Before Steam popped up, I was paying very close attention to a little-known legal battle between the giant publisher, Vivendi, and the tiny developer, Valve. It was a real-life David vs. Goliath, considering Half-Life 1 was the first game from ex-Microsoft employee and Valve founder Gabe Newell. It was primarily developed using id Software’s Quake engine, modified to support all the animations they were handling. With initial sales of Half-Life 1 well below the publisher’s acceptable standards, it wasn’t until the Counter-Strike mod was developed and published that Half-Life became a hit. Vivendi was on the verge of pulling it from store shelves when a miracle arrived from outside Valve’s control. With resources and popularity growing, they started working on a sequel while fighting the publisher over license agreements with internet gaming cafés. In the midst of it all, Gabe made a bold move and started working on Steam.

When news of Valve’s victory hit the streets, people inside the industry, including me I might add, shouted “Death to the publishers!” After a week of talking it over with my buddies, I realized that this was merely the first step. Publishers aren’t going away any time soon!

While Steam may have something here, being the first big player to market in the online distribution world, they also have a lot of room to improve, most notably the requirement to be online to play a game. This can be a huge inconvenience to the end user.

Everyone is racing to create the largest network of games and mods and trying to compete, head to head in many cases, to gain the largest audience. Since Valve’s bold first moves, file sharing sites such as FilePlanet have tripled their daily bandwidth usage. Is this becoming a reality? Have we finally entered the age of digital distribution?

When little Timmy wakes up Christmas morning, will he be scanning his computer for the latest downloads instead of unwrapping the game? Will he be poring over some PDF file instead of cracking open the manual?

Wait, before I go any further! The game manual shipping with games has died! The last game I know of that came with a real manual was Black & White. Ever since then, the manual typically consists of a page or two about how to install the game and nothing more.

Okay, back to the subject at hand! In short, little Timmy is safe. While hardcore gamers may adopt the distributor of their choice, sometimes basing their decision on how intrusive the distribution software is and how quickly they can get the game, the casual gamer will still head to the local store to pick up the boxed version. Why? Because a box is tangible, something that can be added to the shelf above the computer and put proudly on display, whereas something that is downloaded is just… well… boring!

-Ken Noland

Tuesday, July 11, 2006

A Word About Cell Phones

A word about cell phone game development…

Ah, the joys of working on another device that requires more low-level knowledge of the underlying hardware than you care to know. Qualcomm tried to ease things by coming out with their massive solution, called BREW, but this API leaves a lot to be desired.

A bit of a note before I start slamming BREW: I was using BREW version 2.1.3. There are severe differences between 2.1.3 and 3.1. Most notable are the image processing section and the addition of several exposed features that, put simply, weren’t available in BREW 2.1.3, such as multiple output displays (for phones with more than one screen, which I believe is all phones now) and the ability to stream audio.

I finished working on cell phones over a month ago. I wanted to wait a while before posting anything because I wanted to clear my mind of the bad taste left in my mouth by the BREW SDK, not to mention all the difficulties with the ARM9 compiler (RealView Compilation Tools, directly from the ARM guys themselves).

The task was simple, at first: build a basic sound processing system that we could use to animate characters on screen. Basically, imagine all those nifty visualizations in Windows Media Player, and put them on a screen less than 2 inches wide. In theory, it sounds good!

The first thing we had to do was parse out the sound and get the frequency data. You would think that would be simple. All you have to do is decode the sound file if it is in anything other than pure PCM, run a simple fixed-point version of a Fast Fourier Transform (FFT), and then voilà: you have the frequency info right there. It doesn’t take a ton of processing, and it’s a nice feature to have!

The first thing I did was write a fixed-point FFT. This involves fairly remedial math: taking the Taylor series terms, converting them to their fixed-point equivalents, and then doing all the calculations in fixed-point space. Using 32-bit signed integers, you can easily compute in fixed-point space with whatever granularity you choose for precision. I used 16 bits for the fractional part.
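
For the curious, here’s a minimal sketch in C++ of 16.16 fixed-point math and a radix-2 FFT. The format matches what I describe above, but the names and the table handling are illustrative, not the actual code I wrote:

    // A minimal sketch of 16.16 fixed-point math and an in-place radix-2 FFT.
    // Illustrative only. Assumes n is a power of two and that the input is
    // scaled small enough not to overflow as the stages sum up.
    #include <stdint.h>

    typedef int32_t fx16;                    // 16.16 fixed point
    static const int FX_SHIFT = 16;

    static inline fx16 fx_mul(fx16 a, fx16 b) {
        // 64-bit intermediate keeps the product from overflowing 32 bits
        return (fx16)(((int64_t)a * (int64_t)b) >> FX_SHIFT);
    }

    // 'sin_tab' is a full-wave sine table of length n, precomputed offline
    // (e.g. from a Taylor series) and converted to 16.16.
    void fft_fixed(fx16* re, fx16* im, int n, const fx16* sin_tab) {
        // bit-reversal permutation
        for (int i = 1, j = 0; i < n; ++i) {
            int bit = n >> 1;
            for (; j & bit; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j) {
                fx16 t;
                t = re[i]; re[i] = re[j]; re[j] = t;
                t = im[i]; im[i] = im[j]; im[j] = t;
            }
        }
        // butterfly passes
        for (int len = 2; len <= n; len <<= 1) {
            int step = n / len;              // stride into the sine table
            for (int i = 0; i < n; i += len) {
                for (int k = 0; k < len / 2; ++k) {
                    fx16 wr =  sin_tab[(k * step + n / 4) % n];   // cos(2*pi*k/len)
                    fx16 wi = -sin_tab[(k * step) % n];           // -sin(2*pi*k/len)
                    int lo = i + k, hi = i + k + len / 2;
                    fx16 vr = fx_mul(re[hi], wr) - fx_mul(im[hi], wi);
                    fx16 vi = fx_mul(re[hi], wi) + fx_mul(im[hi], wr);
                    re[hi] = re[lo] - vr;  im[hi] = im[lo] - vi;
                    re[lo] = re[lo] + vr;  im[lo] = im[lo] + vi;
                }
            }
        }
    }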

This worked like a charm. I was able to get the frequency data directly from the PCM data and display that on screen. Then came the next part, and all hell broke loose!

So the first step is to take the encoded sound data and decode it to pure PCM. This turned out to be impossible! BREW simply doesn’t allow it in 2.1.3. All they say is that this feature is purely provider dependent. Grrr… back to basics.

Okay, so we can’t decode in hardware. Can we do it in software? Nope, because then we’d have to build our own sound streaming function. Sound streaming is something I’ll get to later, perhaps in another post, but the basic idea is that you take a signal and output it to the speaker yourself.

Turns out this feature is not available in BREW 2.1.3 either.

All we can do is play sounds one at a time. In order to get the frequency data, we have to open the file, decode the sound in software, extract the frequency info and store it on the phone, then close the file, reopen it using the media interface, and output the sound to the speakers while loading the frequency info on the fly.
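
Roughly, the flow looked like the sketch below. The helper names are hypothetical stand-ins rather than the real BREW file and media calls, but the two-pass structure is the point:

    // Hypothetical stand-ins, not the actual BREW file/media APIs; this only
    // sketches the two-pass flow described above.
    #include <stdint.h>
    #include <stddef.h>
    #include <vector>

    struct FreqFrame { int32_t bins[32]; };              // one frame of FFT output

    // --- hypothetical glue, declared only so the shape of the flow is clear ---
    bool      DecoderOpen(const char* path);             // software decoder (pass 1)
    int       DecoderReadPcm(int16_t* dst, int maxSamples);
    void      DecoderClose();
    FreqFrame AnalyzeFrame(const int16_t* pcm, int n);   // wraps the fixed-point FFT
    void      PlayerPlayFile(const char* path);          // platform media playback (pass 2)
    bool      PlayerIsPlaying();
    int       PlayerPositionMs();
    void      DrawVisualization(const FreqFrame& f);

    static const int kSamplesPerFrame = 1024;
    static const int kFramesPerSecond = 43;              // ~44100 / 1024, assumed rate

    void PrecomputeFrequencyData(const char* path, std::vector<FreqFrame>& out) {
        // Pass 1: decode the whole file in software and cache one FFT per frame.
        if (!DecoderOpen(path)) return;
        int16_t pcm[kSamplesPerFrame];
        int got;
        while ((got = DecoderReadPcm(pcm, kSamplesPerFrame)) > 0)
            out.push_back(AnalyzeFrame(pcm, got));
        DecoderClose();
    }

    void PlayWithVisualization(const char* path, const std::vector<FreqFrame>& freq) {
        // Pass 2: reopen the same file through the media interface for playback,
        // stepping through the cached frames in lockstep with the play position.
        PlayerPlayFile(path);
        while (PlayerIsPlaying()) {
            size_t frame = (size_t)PlayerPositionMs() * kFramesPerSecond / 1000;
            if (frame < freq.size()) DrawVisualization(freq[frame]);
        }
    }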

Project complete!

Then the next project we wanted to do was build a nifty little streaming video system: camera to camera, cell phone to cell phone, to PC, or to any other device that can connect to the internet.

At last, we had all the tools and functions necessary for doing this. We had sockets we could use to connect to the internet, and we had direct access to the camera data. Rather slow access, but access nonetheless! Those were the only two ingredients necessary to complete the task. The compression and decompression were to be based on simple Haar wavelets, bitwise compressed, then zero run-length encoded, and then sent along their merry little way.
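
For a rough idea of the ingredients, here’s a toy sketch in C++ of one Haar level plus a zero run-length encoder. It’s illustrative only, not the codec we actually shipped:

    // One level of a 1-D Haar transform plus a simple zero run-length encoder.
    // Averages and differences, then collapsing the runs of zeros that show up
    // once small detail coefficients are quantized away.
    #include <stdint.h>
    #include <utility>
    #include <vector>

    // One Haar level: averages land in the first half, differences (details)
    // in the second half.
    void haar1d(std::vector<int>& row) {
        const size_t half = row.size() / 2;
        std::vector<int> tmp(row.size());
        for (size_t i = 0; i < half; ++i) {
            tmp[i]        = (row[2 * i] + row[2 * i + 1]) / 2;  // low-pass
            tmp[half + i] =  row[2 * i] - row[2 * i + 1];       // high-pass
        }
        row = tmp;
    }

    // Zero run-length encoding: emit (number of zeros skipped, next value).
    std::vector<std::pair<uint8_t, int16_t> > rleZeros(const std::vector<int>& c) {
        std::vector<std::pair<uint8_t, int16_t> > out;
        uint8_t run = 0;
        for (size_t i = 0; i < c.size(); ++i) {
            if (c[i] == 0 && run < 255) { ++run; continue; }
            out.push_back(std::make_pair(run, (int16_t)c[i]));
            run = 0;
        }
        if (run) out.push_back(std::make_pair(run, (int16_t)0));  // trailing zeros
        return out;
    }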

Sounds complicated? I know, I know. You must be asking yourself what the hell all this means. Put simply, it worked. You can download the video of it in action at Whatif’s website.

I did, however, run into one small stumbling block that was actually easy enough to get around. Turns out that, under TCP/IP, phones can’t “listen” for connections. This means your tiny little phone is incapable of being a server. This is easy to get around, though: just use UDP!

I built up this whole UDP system based around a state machine of sorts. It would “listen” for incoming UDP packets and then trade messages back and forth. It would also attempt a TCP connection, mainly just to determine when a connection had been broken, and otherwise it would continue to run.
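
The “listen” trick looks roughly like this, sketched here with plain BSD sockets standing in for the phone’s socket interface (the port and handshake message are made up):

    // No listen()/accept() needed with UDP: whoever sends us a datagram first
    // becomes our peer, and we just reply to the address we saw.
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>
    #include <string.h>

    int main() {
        int s = socket(AF_INET, SOCK_DGRAM, 0);

        sockaddr_in local;
        memset(&local, 0, sizeof(local));
        local.sin_family = AF_INET;
        local.sin_port = htons(5000);              // assumed port
        local.sin_addr.s_addr = htonl(INADDR_ANY);
        bind(s, (sockaddr*)&local, sizeof(local));

        char buf[1500];
        sockaddr_in peer;
        socklen_t peerLen = sizeof(peer);
        ssize_t n = recvfrom(s, buf, sizeof(buf), 0, (sockaddr*)&peer, &peerLen);
        if (n > 0) {
            const char ack[] = "HELLO_ACK";        // trade messages back and forth
            sendto(s, ack, sizeof(ack), 0, (sockaddr*)&peer, peerLen);
        }
        close(s);
        return 0;
    }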

At last, it all worked! Everything from the compression to the UDP system, and even the camera data, worked as it should have!

In conclusion, I started typing this up to give a sense of where cell phone gaming is in its current state. However, despite all my luck and charm, I’m simply not convinced that the cell phone will be replacing the PSP or the GBA any time soon. Sure, some phones are hardware accelerated; phones like the nifty, and now overly distributed, Razr V3 have ATI chips in them. But while working on cell phones I simply couldn’t bring myself to try to code something 3D on a 2-inch screen. It just doesn’t make sense.

Yeah, okay, sure, streaming video on a 2-inch screen equates to streaming a very small image on a PC. We’ve had the tech to do that for years. Being able to view, in real time, what your buddies are up to when they go to the strip club, well, that’s priceless.

Games, however, on a 2-inch screen are just stupid!

Have you ever tried to play something real-time with that tiny little keypad? Even games like Pac-Man become increasingly difficult when you’re trying to navigate the maze and repeatedly hit the wrong button.

While I was working on cell phones, I thought of a user input system that would be kinda cool. If you tracked the orientation of the phone using the camera, syncing frame by frame, you could, in essence, have a controller similar to the Wii controller with its multi-axis design.
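
A crude way to get that signal is to estimate the global shift between consecutive camera frames. Here’s a toy sketch, with made-up frame size and search range, just to show the idea:

    // Estimate how the phone moved between two frames by finding the (dx, dy)
    // shift with the lowest sum of absolute differences over the overlap.
    #include <stdint.h>
    #include <stdlib.h>

    enum { W = 96, H = 72, RANGE = 4 };            // tiny preview frames, +/-4 px

    void estimateMotion(const uint8_t prev[H][W], const uint8_t cur[H][W],
                        int* bestDx, int* bestDy) {
        long bestSad = -1;
        for (int dy = -RANGE; dy <= RANGE; ++dy) {
            for (int dx = -RANGE; dx <= RANGE; ++dx) {
                long sad = 0;
                for (int y = RANGE; y < H - RANGE; ++y)
                    for (int x = RANGE; x < W - RANGE; ++x)
                        sad += abs((int)cur[y][x] - (int)prev[y + dy][x + dx]);
                if (bestSad < 0 || sad < bestSad) {
                    bestSad = sad; *bestDx = dx; *bestDy = dy;
                }
            }
        }
        // A sustained positive dx means the camera panned one way, a negative
        // dx the other; feed that into the game as a "tilt" axis.
    }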

Now, all of a sudden, games could become more interesting. Can you imagine riding on the subway and watching someone play a game with this style of input, ducking and diving, swerving their phone to avoid the oncoming monster? Oh, that’d be sweet!

Next week… sound processing
-Ken Noland