Lighting, Rotations, Animations, and Transparency
Hey Folks! Adding a few more finishing touches to the renderer today, as I remembered the space prototype has objects that get rotated, animated, and have transparency.
So far, everything I'd been testing had zero rotation, no animation, and full opacity. Some of the calculations I was using start to break when things don't fit that bill. Fortunately, these are fairly standard problems, and there's a lot of info out there on how to solve them.
My first attempt was to use Unity's built-in functions, and their normal mapping shader tutorial, to convert normal maps (surface bumps) from the object's local space to the scene's world space. Basically, if the surface points to the right on the object, and you rotate the object 90 degrees, that surface has to rotate, too.
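That object-to-world conversion boils down to rotating the normal vector by the object's rotation. Here's a tiny Python stand-in for the shader math (the function name is mine, not Unity's; a real shader would use the full object-to-world matrix, but a single-axis rotation shows the idea):

```python
import math

def rotate_normal_z(normal, degrees):
    """Rotate an object-space normal about the Z axis into world space.

    Stand-in for the matrix multiply a shader would do; handles only
    a single-axis rotation for clarity.
    """
    r = math.radians(degrees)
    c, s = math.cos(r), math.sin(r)
    x, y, z = normal
    return (c * x - s * y, s * x + c * y, z)

# A surface pointing right (+X) on the object...
n = (1.0, 0.0, 0.0)
# ...after rotating the object 90 degrees, points along +Y in the world.
print(rotate_normal_z(n, 90.0))
```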
This actually worked pretty well! Right out of the box. And I thought I was home free. However, I forgot about transparency.
As I was testing out the animated crew to see if the calculations held up for more complex arm, leg, and body movements, I noticed that the crew had no alpha channel. Anywhere the crew was supposed to be invisible was rendering a solid color, obscuring the floor.
This turned out to be a combination of things. First of all, my shaders were all opaque, and I had to set up blend modes for them to ensure alpha channels were honored. And even after doing that, I had to make a few more tweaks (material render order) to ensure the crew rendered after the floor. Before doing this, the crew would render fine, but then the floor would render everywhere not inside the crew sprites, leaving ugly black areas.
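The blend mode in question is the standard "over" operator (SrcAlpha, OneMinusSrcAlpha in Unity's Blend syntax), and the math makes it obvious why draw order matters. A quick Python sketch, with made-up example colors:

```python
def blend_over(src, dst):
    """Standard alpha blending: SrcAlpha, OneMinusSrcAlpha.

    src and dst are (r, g, b, a) tuples with straight (non-premultiplied)
    alpha; src is the thing being drawn on top of dst.
    """
    sa = src[3]
    out_rgb = tuple(s * sa + d * (1.0 - sa) for s, d in zip(src[:3], dst[:3]))
    out_a = sa + dst[3] * (1.0 - sa)
    return out_rgb + (out_a,)

floor = (0.2, 0.2, 0.2, 1.0)
crew  = (0.8, 0.5, 0.3, 0.0)   # fully transparent crew pixel

# Floor first, then crew on top: floor shows through the transparent pixel.
print(blend_over(crew, floor))   # → the floor color
# Wrong order: the opaque floor drawn after the crew stomps it everywhere.
print(blend_over(floor, crew))   # → the floor color again; crew is lost
```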
Once I solved that, there was still one more problem: normal maps didn't have alpha channels.
Well, they do, but they're occupied by normal info. Texture compression historically gave the alpha channel better precision, so normal maps evolved to store data there instead of where you'd expect it. And Unity followed this convention. You get nicer normal maps, but you lose transparency info.
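Concretely, in that packed format the normal's X ends up in the alpha channel and Y in green, and Z gets rebuilt from the fact that the normal has unit length. Roughly what Unity's UnpackNormal does, sketched in Python:

```python
import math

def unpack_dxt5nm(g, a):
    """Reconstruct a tangent-space normal from a Unity-style packed map.

    X lives in the alpha channel, Y in green (each stored as 0..1);
    Z is rebuilt from the unit-length constraint. Approximately what
    Unity's UnpackNormal does for compressed normal maps.
    """
    x = a * 2.0 - 1.0
    y = g * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

# A flat, straight-up pixel: green = 0.5, alpha = 0.5.
print(unpack_dxt5nm(0.5, 0.5))   # → (0.0, 0.0, 1.0)
```

With the alpha channel spoken for like this, there's simply no room left in the texture for transparency.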
My first attempt to solve this was to just abandon Unity's normal-packing scheme and use the old-fashioned layout: red channel for the horizontal direction, green for vertical. Then I could jam the color texture's alpha into the normal map, and proceed with blending as usual.
Unfortunately, Unity's built-in functions for converting object to world space didn't seem to play well with that.
So a bit of tinkering later, I found a compromise: keep each object's normal map in the Unity format when it first gets loaded, so the built-in functions can convert the normal data into usable world space coordinates. Then, instead of converting that back to Unity's preferred format, keep it in the old-fashioned red-green channels, use the color texture's alpha, and blend everything together as we render the whole scene's normals.
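The whole compromise pipeline, end to end, can be sketched like this (function names are my own; the rotation is a single-axis stand-in for the full object-to-world transform):

```python
import math

def world_normal_rg(packed_g, packed_a, color_alpha, rot_degrees):
    """Sketch of the compromise pipeline:
    1. unpack the Unity-format normal (X in alpha, Y in green),
    2. rotate it into world space,
    3. re-store it as plain red/green channels, carrying the color
       texture's alpha along for transparency.
    """
    # 1. Unpack the Unity-style normal from 0..1 to -1..1.
    x = packed_a * 2.0 - 1.0
    y = packed_g * 2.0 - 1.0
    # 2. Rotate about Z (stand-in for the object-to-world transform).
    r = math.radians(rot_degrees)
    wx = math.cos(r) * x - math.sin(r) * y
    wy = math.sin(r) * x + math.cos(r) * y
    # 3. Repack to 0..1 red/green, with the color map's alpha.
    return ((wx + 1.0) * 0.5, (wy + 1.0) * 0.5, color_alpha)

# A rightward-facing surface on an object rotated 90 degrees,
# on a fully opaque pixel of the color texture:
print(world_normal_rg(0.5, 1.0, 1.0, 90.0))
```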
Today's image shows you how that looks. The left side is the red channel of these normals (surface horizontal direction), the right side is unlit color info. You can see our crew standing in the middle, with alpha channels working in both scene views.
However, there's still one more hurdle to overcome. My shader that calculates lighting on these scene renders still uses Unity's old normal map convention. I'll need to update it to use the red-green channels instead. Seems pretty straightforward, but my first crack at it didn't work. It's close, but something's off.
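For what it's worth, the fix should amount to reading the normal out of red/green instead of green/alpha before the usual Lambert dot product against the light direction. A hypothetical sketch of that lighting step, in Python rather than shader code:

```python
import math

def diffuse_from_rg(r, g, light_dir):
    """Hypothetical lighting step for the red/green normal format:
    read X from red and Y from green (instead of Unity's green/alpha
    packing), rebuild Z, then do a standard Lambert dot product.
    """
    nx = r * 2.0 - 1.0
    ny = g * 2.0 - 1.0
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    lx, ly, lz = light_dir
    mag = math.sqrt(lx * lx + ly * ly + lz * lz)
    lx, ly, lz = lx / mag, ly / mag, lz / mag
    return max(0.0, nx * lx + ny * ly + nz * lz)

# A flat pixel lit straight-on gets full brightness...
print(diffuse_from_rg(0.5, 0.5, (0.0, 0.0, 1.0)))   # → 1.0
# ...while a pixel facing right gets nothing from a light overhead.
print(diffuse_from_rg(1.0, 0.5, (0.0, 0.0, 1.0)))   # → 0.0
```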
Alas, might be time to put this down for the weekend, and resume with a fresh brain. We're pretty close, though! Theoretically, solving this last bit means I can start porting this rendering code to the main project, which won't be trivial, but should yield some cool results!
Have a good one, all!