Someday I’ll write up a collected thing with a bunch of what I’ve learned, but here’s some salient things before I forget.
For my initial modeling, I set my body mesh up with two modifiers: mirror, and subdivision.
When it was time to weld the body to the hair, I applied the mirror modifier, but kept the subdivision modifier, and used the same subdivision modifier on the hair.
I maintain the subdivision modifier forever and ever, because editing subdivision surfaces is way easier than editing meshes. This does mean there’s somewhat of a precision loss for rigging, UV mapping, and weight painting, but it isn’t so bad that it makes me wish I had a higher-resolution mesh to reference, and the advantages of staying in subdivision mode are well worth it.
The downside is that it complicates the FBX export slightly, although it isn’t too bad.
A lot of folks swear by the Cats Blender Plugin, but I’ve found it to cause more problems than it helps with. Instead, I use ApplyModifierForObjectWithShapeKeys to “bake” my subdivision mesh into triangles, and then when I export the FBX I use the following options:
- Limit to: (whatever you feel is appropriate, I just export my whole scene personally)
- Include: Armature and Mesh
- Apply scalings: FBX scale (Update: this turns out to be incredibly important; see below!)
- Add leaf bones (this seems to be necessary for PhysBones to work right)
The “apply scalings” setting is pretty important; otherwise, PhysBones' scale will be off substantially, among other things.
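If you'd rather not click through the export dialog every time, the settings above can be scripted with Blender's FBX export operator. This is a sketch, not my actual workflow; I'm assuming "FBX scale" corresponds to the `'FBX_SCALE_ALL'` ("FBX All") value of `apply_scale_options`, so double-check the enum against your Blender version:

```python
# Hypothetical scripted version of the export settings listed above.
# Run from within Blender (Scripting workspace or headless with --python).
import bpy

bpy.ops.export_scene.fbx(
    filepath="/tmp/avatar.fbx",            # wherever your export lives
    object_types={'ARMATURE', 'MESH'},     # Include: Armature and Mesh
    apply_scale_options='FBX_SCALE_ALL',   # Apply scalings: FBX scale (my guess at the mapping)
    add_leaf_bones=True,                   # seems necessary for PhysBones
)
```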
Anyway, after doing the export, I can then undo the ApplyModifierForObjectWithShapeKeys and my mesh is back to a lovely subdivision surface, perfect for continued tweaking and fiddling.
Note that for most people it probably makes more sense to use Cats, especially if you’re setting up the materials in Blender; however, I’m doing some overly-fiddly/complicated texturing and normal-mapping stuff externally, I’m not making use of normal maps on Quest, and my base mesh is low-poly enough that it doesn’t require decimation on desktop. Cats does have some very nice exporting options, including baking normal maps as part of the decimation process, and it generally makes for a better export workflow when you’re doing as much as possible (especially texturing) within Blender itself.
I’m probably making things way more difficult than they need to be, really.
The VRChat avatar system wants to adjust the sizes and positions of things to match some reference skeleton which it builds based on where it thinks your physical limbs are. There is no way to override this directly in general, and there’s no way to determine where it thinks those limbs are either. This is why VRChat provides “height” and “shoulder width” measurement options, but both of them are still pretty much secret sauce.
One of the more obnoxious side effects of this system is that it also wants to move your eyes to where it thinks they physically are, which means that if you have your avatar’s eyes mapped, they might be adjusted incorrectly. However, it seems that correcting my FBX scale has also corrected the eye positioning. More experimentation is necessary. Update: So, it turns out that this was actually due to the armature scale being wildly off, and fixing the FBX export settings seems to have gotten the game to honor my authored eye positions. Huh.
PumkinAvatarTools has a rather nice pose editor, which makes it much easier to edit hand poses than using Unity’s built-in animation editor. (It’s also what identified that my FBX scale was set incorrectly.) Unfortunately it doesn’t have any options to automatically mirror poses between the left and right hands, but it isn’t too hard to copy-paste values manually.
The hand armature is especially fiddly when it comes to the finger and thumb positioning, due to the hidden/unspecified measurements mentioned above. This is especially problematic when your avatar’s hands don’t map cleanly onto a standard human hand, such as with a critter’s mitts.
Fortunately, if you don’t need individual finger tracking (as is available on some controllers such as the Valve Index), there’s a cheat to get past this, which I learned from the official Avatar 3 Walkthrough. Somewhat buried in that article, when talking about AFK animations, is this little gem:
In order for the AFK animation to “take over” the character, you have to disable input from things like IK. You don’t want to track your hands when you’re AFK. For that, we’ll need to use another State Behavior. This one is called VRC Animator Tracking Control.
This can be used to override the hand animation for any gesture! So what you need to do is open your gesture controller (which handles the hand animations) and, for every hand gesture, add an appropriate VRC Animator Tracking Control behaviour script, setting “left fingers” or “right fingers” to Animation as appropriate.
This makes the authored gesture pose completely override the tracking data from the controller, rather than adding on top.
I have no idea how this interacts with hardware finger tracking, but that’s a bridge I’ll cross when I come to it. Hopefully by that time I’ll have a better idea of how the hand armature should be set up in the first place, which is the real fix. (Although I suspect that finger tracking will always be fundamentally incompatible with critter mitten-hands anyway…)
Update: It turned out that the hand animation override was also unnecessary after I fixed my FBX armature scale! Go figure. It’s still good to know about the animation overrides, though, since they’re part of making advanced gestures work, and they might still be necessary for critter mitts anyway (although they don’t seem to be needed with non-finger-tracking controllers). I’ll have to have an Index-wielding friend try it out.
For my Quest decimation I’ve been using Polytool for Unity, which costs $25 as of this writing. Its decimation isn’t perfect, but it works well enough. I disable the mesh and texture atlasing, because I modeled my stuff not to need it in the first place (because I have a terminal case of graphics-programmer brain, apparently).
Apparently the Cats plugin can do all that stuff too, but it does way too much and I’m not happy with how it mucks with the rest of my Blender scene (renaming armatures and so on), not to mention it really, really wants you to set up the initial material from Blender, which is incompatible with how I want to do my texturing.
Anyway, something that was really confusing to me which took me way too long to figure out is that many of the VRChat Quest shaders account for vertex colors. If you’re only going to be using an albedo map for your coloration, you’ll need to make sure that your vertex colors are all white. It took me embarrassingly long to realize this; I didn’t even realize Blender had vertex colors, and it only even occurred to me when I stepped through the Quest shader sources to see that it was using them as an attribute.
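As far as I can tell from stepping through the shader, the vertex color is simply multiplied into the albedo, which is why anything other than white darkens or tints your texture. A toy illustration of that blend (my reading of the shader, not its actual source):

```python
def shade(albedo, vertex_color):
    """Per-channel multiply, as the VRChat Quest shaders appear to do."""
    return tuple(a * v for a, v in zip(albedo, vertex_color))

# White vertex colors leave the albedo map untouched...
assert shade((0.8, 0.5, 0.2), (1.0, 1.0, 1.0)) == (0.8, 0.5, 0.2)

# ...but a stray black vertex color wipes it out entirely.
assert shade((0.8, 0.5, 0.2), (0.0, 0.0, 0.0)) == (0.0, 0.0, 0.0)
```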
I ended up paying $39 for UVPackmaster. It was well worth it. My UV unwrap process is pretty simple:
- Set up the seams to keep separate parts separate
- Run Blender’s default UV unwrap
- Scale islands up and down based on the texel density you need (for example, scaling up the eyes and any high-detailed features that people will be looking closely at)
- Use UVPackmaster to repack everything into a lovely tight packing of the UV space while preserving the relative island scales
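The "texel density" trade-off in the scaling step can be put in numbers: for a face with a given surface area in world units and a given area in [0,1]² UV space, the density in texels per world unit is texture_size × √(uv_area / world_area). A small helper to sanity-check island scaling (the function and its name are mine, not from any of the tools mentioned):

```python
import math

def texel_density(world_area, uv_area, texture_size=4096):
    """Texels per world unit for a face, given its 3D area and UV-space area."""
    return texture_size * math.sqrt(uv_area / world_area)

# A 1x1 meter quad mapped across the whole UV square gets the full 4096 px/m.
assert texel_density(world_area=1.0, uv_area=1.0) == 4096.0

# Shrink its island to a quarter of the UV square (half the side length) and
# the density halves: scaling islands up/down is exactly this trade-off.
assert texel_density(world_area=1.0, uv_area=0.25) == 2048.0
```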
I also like using Blender’s “color grid” generated image to inspect the UVs to make sure that things are relatively appropriately dense.
If I end up reworking a part of the mesh so much that I need to re-unwrap it, I just re-unwrap that section and use UVPackmaster’s “Pack To Others” command, which finds the best new spot for it (probably close to where it was before, if you’ve only re-unwrapped a single mesh piece).
For some parts I’ve also reoriented the islands to make them more sensible; for example, I rotated the UV island for the front of the necklace so that it’s much easier to put text or an image on it.
I am extremely not a fan of Substance Painter for a number of reasons. So what I’ve been doing in order to work on my textures in Photoshop is:
- In Blender’s UV editor, do UV > Export UV Layout and save it as, say, uvmap.png
- In the Texture Paint window layout, create a separate texture for each group of parts that you want to texture easily (e.g. skin, hair, teeth, etc.); each texture should be your full texture size (I use 4096×4096) and be filled with a color whose alpha is 0
- For each part group texture, select the appropriate polygons in the mesh editor, and then flood-fill them with a demonstrative color and alpha of 1
- Import all of the above images into layers in Photoshop
Now you have a convenient UV layout reference to know where your parts are; I keep uvmap.png on top and the masks in a folder, and if I need to quickly select a mask while working on a part of the texture, I can ⌘-click the mask’s layer thumbnail.
For the more complicated things (such as plaid lines) you can use a more traditional texture paint workflow to draw a reference/sketch image that you can then import into Photoshop and, combined with the UVMap reference, paint things with precision and the nice Photoshop tools.
For my desktop version I’m just using Unity’s standard PBR shader, which packs the metallic and specular maps into a single texture. Unfortunately it puts the specular map on the alpha channel, which is incredibly difficult to work with directly.
So I wrote a simple C++ program that lets me remap the channels as appropriate. In my Photoshop setup I’m using red for metallic and green for specular, so running e.g. remap-channels poiu.png MS-poiu.png r0gg gives me a combined material parameter map appropriate for Unity PBR. (I put a copy of the specular map on the blue channel so that I can quickly see that it’s formatted correctly from the Unity editor, instead of accidentally selecting the red-green map; the PBR shader ignores the green and blue channels, so that’s okay, if a little wasteful. Use a packing mode of r01g if you want it to be more efficient and still visibly obvious.)
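The core of the remapping is simple enough to sketch in a few lines of Python. This is a toy per-pixel version of the idea, not the actual CImg-based tool: each character of the spec string picks the source for one output channel, where 'r'/'g'/'b'/'a' copy that input channel and '0'/'1' are constants.

```python
def remap_pixel(pixel, spec):
    """Remap an (r, g, b, a) pixel according to a spec string like 'r0gg'."""
    sources = {'r': pixel[0], 'g': pixel[1], 'b': pixel[2], 'a': pixel[3],
               '0': 0.0, '1': 1.0}
    return tuple(sources[c] for c in spec)

# Red = metallic, green = specular in the Photoshop layers...
src = (0.9, 0.3, 0.0, 1.0)

# 'r0gg': metallic in R, spec copied to B (for eyeballing in Unity) and A.
assert remap_pixel(src, "r0gg") == (0.9, 0.0, 0.3, 0.3)

# 'r01g': the leaner variant, with a constant-1 blue channel.
assert remap_pixel(src, "r01g") == (0.9, 0.0, 1.0, 0.3)
```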
The C++ program requires CImg. On my Mac I install that with brew install cimg, and I also have a build script in my project directory to build my tools (with source files kept in src/) and run my various asset-packing pipelines.
I have a YouTube playlist where I’ve been keeping a daily-ish vlog-ish thing of my avatar progress. I feel like I’m at the point where I can start texturing it for reals.
At some point I might also make a custom shader so that I can do more with pigmentation; for now my plan is to just have a bunch of baked colormaps adjacent to some baked material+normal maps and swap those out piecewise, but what I’d like to do is pack different color traits into the different color channels and use clever lerp tricks to build a fully-realized color scheme, although I need to do a bunch of math to see if I can actually fit everything into one or two four-channel textures. I’d explain more but holy crap I am suddenly having a pain flare and need to get to bed, oops.
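For future me, the gist of the lerp idea: each channel of a packed trait texture acts as a blend weight for one trait color, applied over the base in sequence. A rough sketch of that math (the structure and names here are my shorthand, not a finished shader):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def apply_traits(base, trait_colors, masks):
    """Blend per-trait colors over a base color using packed channel weights.

    `masks` is one pixel of the packed trait texture: each channel is the
    blend weight for the corresponding trait color.
    """
    color = base
    for trait, weight in zip(trait_colors, masks):
        color = tuple(lerp(c, t, weight) for c, t in zip(color, trait))
    return color

base = (0.5, 0.5, 0.5)
traits = [(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]  # e.g. two pigmentation traits

# A mask of (1, 0) paints the first trait fully and leaves the second off.
assert apply_traits(base, traits, (1.0, 0.0)) == (1.0, 0.0, 0.0)
```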