Wednesday, June 29, 2011
I still make Source mods and I like it, don't get me wrong -- there's no harm in a little experimentation though, right? Now, if you make levels for Source Engine stuff then you already have the skills to start using Unity. Think of it as learning another language and being bilingual or multilingual. Here are some general concepts from Source, their translations in Unity, and some tips:
> In the Unity editor, hold right-click in the 3D camera view to enable WASD / mouse-look / noclip-style navigation, just like in Hammer. Hold "Shift" to sprint. This is probably the single most important thing to remember, and the easiest one to miss.
> "Y" axis is up. In Source, the "Z" axis was up.
> Unlike Hammer, where you just left-click on an object several times to switch between the move / rotate / scale tools, in Unity they're mapped to the keyboard all Maya-style: Q = Select, W = Move, E = Rotate, R = Scale, SPACE = maximize viewport, F = snap camera to selected object (the equivalent of Ctrl+E or Ctrl+Shift+E in Hammer). You'll also be doing most of your editing in the 3D view with gizmos instead of using 2D views. To get grid snapping, go to Edit >> Snap Settings.
> You have to compile or "build" your game to distribute it. For a web player build, it spits out a .html file and a .unity3d file. Put both files in your Dropbox and send the URL to people to test! Easy.
> A level (.vmf) = a "scene." Some Unity games use one scene for the entire game, and swap out level geometry or teleport the player around.
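To show what that scene swap looks like in code, here's a minimal sketch of a changelevel-style script. The class name and the "Level2" scene name are made up for illustration; the scene would need to be added under File >> Build Settings first.

```csharp
using UnityEngine;

// Hypothetical changelevel-style script: load another scene when
// something enters this object's trigger volume.
public class ChangeLevel : MonoBehaviour {
    void OnTriggerEnter(Collider other) {
        Application.LoadLevel("Level2"); // scene must be listed in Build Settings
    }
}
```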
> There are no brushes / you don't seal your levels. Instead, you use models for everything. There is no VVIS because there are no brushes for the engine to construct visleaves from. There are no level compiles, unless you want to bake lightmaps (like in VRAD) in which case it takes a really really long time compared to what we're used to. (i.e. the Portal 2 / Source 2011 version of VRAD is blindingly fast compared to this, probably because of brushes.)
In terms of visibility optimization, you basically only have 3 tools: level-of-detail meshes / shader fallbacks for your models, a far clip plane on the camera (like the far-z clip on env_fogcontroller), and occlusion culling that works like func_occluder.
> You can scale models! Whoa! Negative scale values can flip the model.
> There's no "tools\skybox" texture. Instead of rendering a hall of mirrors / level leak / "the void" effect, Unity renders your skybox. Edit the skybox texture from "Render Settings" in the Unity editor. There's also one global fog controller, and it's in the Render Settings too.
> You make your own entities. Everything in your Unity scene is an entity, or a "game object." Game objects can be saved and instanced in scenes, much like .vmf instances, and these are all "prefabs." Your entity library is a bunch of prefabs that you've made -- a static palm tree entity, a weapon power-up entity, a door entity, etc.
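To give you a taste of how prefabs get used from code, here's a sketch of spawning a prefab instance at runtime, a bit like a point_template firing ForceSpawn. The "palmTreePrefab" field is a made-up name; you'd drag your actual prefab onto it in the Inspector.

```csharp
using UnityEngine;

// Sketch: instance a prefab when the scene starts.
public class SpawnTree : MonoBehaviour {
    public GameObject palmTreePrefab; // assign in the Inspector

    void Start() {
        // Make a copy of the prefab at this object's position.
        Instantiate(palmTreePrefab, transform.position, Quaternion.identity);
    }
}
```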
> Each entity or "game object" is made of "components." The components do different stuff: the "Mesh Filter" adds a model picker, the "Mesh Renderer" adds a material picker / shadow properties for the model, a "Rigid Body" adds physics simulation, a "Box Collider" adds box collision, etc. These fit together like Lego pieces; a Rigid Body (physics) won't collide with anything unless the object also has a Collider (collision model), etc.
So if you wanted an entity with just a textured model and no collision, only use those first two components I mentioned in your game object or prefab. You can also make your own components, like an "Explode Every x Seconds" component. (You can already see Valve moving towards this model with their vScripts in Portal 2; it's like scripting your own entities.)
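Just to make that concrete, here's roughly what an "Explode Every x Seconds" component could look like. All the names here are invented for illustration; only the MonoBehaviour plumbing is real Unity API.

```csharp
using UnityEngine;

// Hypothetical component: call Explode() on a repeating timer.
public class ExplodeEverySeconds : MonoBehaviour {
    public float interval = 5f; // editable in the Inspector, like an entity keyvalue

    void Start() {
        // First call after 'interval' seconds, then every 'interval' seconds.
        InvokeRepeating("Explode", interval, interval);
    }

    void Explode() {
        // Stand-in for the real effect: spawn particles, deal damage, etc.
        Debug.Log(name + " exploded!");
    }
}
```

Drag it onto any game object or prefab and tweak "interval" in the Inspector, the same way you'd edit a keyvalue in Hammer.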
For example, a basic "player" entity / prefab is a Camera, an FPSInputController to read the keyboard, a MouseLook to read the mouse, a CharacterController to handle humanoid collision and movement, a CharacterMotor to handle movement speed and acceleration and jumping physics, and a MeshFilter & MeshRenderer to render a player model. (Don't worry, it comes packaged with Unity.)
> If you don't want to code your own components, you don't have to. There's a bunch of "visual scripting" plug-ins you can get. The most popular one is called Playmaker.
> You can use the Source Engine folder hierarchy style (\maps\, \materials\, \materialsrc\, etc.) to organize your Unity project. That way you'll be on the same page if you're collaborating with another Source modder.
> You can use whatever scale you want. A lot of people use the default scale based on the provided FPS controller prefab, which is "a human = 2 units tall." So that means, roughly, 1 Unity unit = 32 Hammer units.
> The grid is organized in sets of 10 instead of powers of two. Try to use 10s instead; that's the power of the metric system, right? In general, staying on the grid is much less important, and Unity has much better precision with floating-point coordinates.
> .vmt = a "Material." You can also write your own shaders, e.g. make your own variant of "LightmappedGeneric" that adds parallax mapping. Note that there are no material proxies in Unity, because you can write the game code that directly accesses the UVs on your models.
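For example, a classic material-proxy job like TextureScroll becomes a few lines of script. This is just a sketch (the class name and speed value are mine), but panning a material's UV offset from code is the real Unity mechanism:

```csharp
using UnityEngine;

// Rough stand-in for a TextureScroll material proxy:
// pan the main texture's UVs a little every frame.
public class ScrollTexture : MonoBehaviour {
    public float speed = 0.5f; // UV units per second

    void Update() {
        float offset = Time.time * speed;
        renderer.material.SetTextureOffset("_MainTex", new Vector2(offset, 0));
    }
}
```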
> .vtf = anything. Unity automagically compresses your source images when you compile your game. So you can use .PSD, .JPEG, .PNG, .TIF, whatever. You can also turn off compression if you have a nice normal map or something.
> The lighting tools are mostly the same, except in real-time. Much like Source, there's a point light, spotlight, and a directional light (i.e. light_environment). You can have more than one spotlight that works like env_projectedtexture, and the spotlight texture is called a "light cookie." Yum. One useful trick is to use multiple directional lights to simulate light bouncing off the ground, or to add subtle ambient light to a scene.
> There are two kinds of specular: one is the Phong-style approximation that looks at nearby lights and is dynamic, while the other is the cubemapped variant we're used to. Both are more expensive than you'd think. There's also no buildcubemaps or env_cubemap in the default Unity editor. Source has had Phong light probes / spherical harmonics since Episode One; Unity should be implementing light probes in an upcoming version.
> For terrain, Unity has a displacement mesh on steroids -- the "Terrain" tool can handle everything for you, and you can paint detail props (rocks, flowers) and procedurally generated trees. It works much like displacements in Source.
> Trigger volumes (trigger_once, trigger_multiple) are basically collision meshes that don't collide with anything. They're extremely useful, and many games use the triggers as the basis for their game logic.
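A trigger_once equivalent in Unity is just a Box Collider with "Is Trigger" checked plus a small script, something like this sketch (the class name is mine, not Unity's; the standard FPS controller prefab carries the "Player" tag this checks for):

```csharp
using UnityEngine;

// Hypothetical trigger_once equivalent: fires one time, for the player only.
public class TriggerOnce : MonoBehaviour {
    bool fired = false;

    // Called by Unity when a collider enters this trigger volume.
    void OnTriggerEnter(Collider other) {
        if (fired || other.tag != "Player") return;
        fired = true;
        Debug.Log("Player entered " + name); // fire your "output" here
    }
}
```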
> For a first project, try a simple FPS where you just walk around a level and explore, maybe a Dear Esther type thing. Unity comes with some primitives you can use to block out an environment.
If you're a Source modder who can speak Unity or vice-versa, please add comments / corrections... And if you've made the jump, post your game for us to play!