GamingNEXT has posted their interview with Valve Software's Managing Director and head of the Half-Life 2 project, Gabe Newell! It covers the Source engine, the Steam distribution system and more. Here's a snip:
GamingNEXT - After all of the movies, both engine/tech oriented and pure gameplay, that came from E3 it's safe to say that all of the gaming world is now clamoring for Half-Life 2. Can you tell us a bit about the technology used to power HL2?

Gabe Newell - I did a presentation on the engine for Vivendi a little while back. It was about 100 pages long, and we got through it in about 4 hours. I'm not sure really how to condense that all down, but I'll try. We usually break it down into humans, graphics, interactivity, and AI. Now obviously there's overlap. For example there's a special shader for people's teeth. That could go in the graphics section or the humans section. There's a lot of intelligence in moving creatures over an LOD mesh - so is that AI, interactivity, or graphics? You get the idea.
For humans we wanted to make them look realistic and look consistent. Consistency is an important characteristic, as you need to make their skin tones look as "realistic" as their walk cycle. If something is too good, it actually breaks the illusion of humanness you're trying to create. There are lots and lots of details that go into their skinning and muscles to make it look right. We probably spent more time on their eyes than anything else - for example you have to model them as ellipsoids rather than spheres to make them look right as they rotate within the eye socket. The facial expression system is pretty cool in a lot of ways, not the least of which is that it blends together multiple inputs yet always maintains consistency with a set of rules about what are valid potential facial states. In other words you can push random numbers through the expression system and you won't get a face that a human can't create, and you will get believable transitions between them.
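The idea Newell describes - arbitrary inputs get squeezed through validity rules so the output is always a legal facial state - can be sketched in a few lines. This is a hypothetical illustration, not Valve's actual system; the action-unit names ("smile", "frown", etc.), the [0, 1] weight range, and the mutual-exclusion rule are all invented here for demonstration.

```python
import random

# Hypothetical illustration of a constrained expression blend system.
# Each facial "action unit" gets a weight; rules keep the combined
# state within what a human face could plausibly do.

VALID_RANGE = (0.0, 1.0)  # assumed range for each action-unit weight

# Assumed rule: conflicting pairs can't both be fully active at once
MUTUALLY_EXCLUSIVE = [("smile", "frown"), ("brow_raise", "brow_furrow")]

def constrain_expression(weights):
    """Clamp raw blend inputs into a valid facial state."""
    lo, hi = VALID_RANGE
    state = {k: min(max(v, lo), hi) for k, v in weights.items()}
    # Enforce exclusivity: if a conflicting pair's weights sum past 1,
    # scale the weaker one down so the pair stays believable.
    for a, b in MUTUALLY_EXCLUSIVE:
        if a in state and b in state:
            overlap = state[a] + state[b] - 1.0
            if overlap > 0:
                weaker = a if state[a] < state[b] else b
                state[weaker] = max(lo, state[weaker] - overlap)
    return state

def blend(state_from, state_to, t):
    """Interpolate between two valid states; the result stays valid
    because the constraints are convex (ranges and linear sums)."""
    return {k: state_from[k] * (1 - t) + state_to[k] * t
            for k in state_from}

# Push random numbers through: the output is always a legal expression.
raw = {k: random.uniform(-2.0, 2.0)
       for k in ["smile", "frown", "brow_raise", "brow_furrow"]}
legal = constrain_expression(raw)
```

The point of the sketch is the property Newell mentions: because transitions are interpolations between constrained states, intermediate frames are themselves valid, so you get believable motion from one expression to the next for free.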
Half-Life 2 Interview