December 2015

Lights, Part Placement, Ship Save/Load, and Happy Holidays!

Hey Folks! I managed to get lights working today, along with a few more basic part-placement features, ship loading/saving, and some project housekeeping.

Each item can have one or more lights added to it, for things like general room illumination and (hopefully) more lively ship parts like consoles, reactors, and other powered items. A lot of the reason I bothered with this Unity experiment is that I pictured some really moody ship interiors where the player could immerse themselves in atmosphere.

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-18.jpg) Mind the programmer art.

The shot above shows off some of the floor, substructure, wall, and light parts in action. There are some glitches in the normal maps due to my quickly hand-painting them, and the sprites are pretty simple placeholders. But I think there's potential there for simple art with interesting lighting to produce visually interesting spaces.

Long interplanetary travel is going to be the bulk of this game, and watching crew/managing ship stuff will be the main game loop. My hope is that with this lighting system, more professional art, and more piece variety, players can not only be entertained by the gameplay, but also be able to admire their precious ship's beauty from the inside.

In the part-placement system, I made some updates so that item rotations are in 90-degree increments, and I can tap the rotate key to cycle through rotations while placing a part. The rotation also persists after placing a part, so it's easier to place multiple parts with the same orientation. And I added the ability to remove parts from the scene with a right-click.
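
For the curious, the logic behind both features is dead simple. A rough sketch (PartPlacer, the R key, and the field names are stand-ins, not the actual code):

    using UnityEngine;

    // Hypothetical placement controller, just to show the shape of it.
    public class PartPlacer : MonoBehaviour
    {
        Transform heldPart;     // the part currently following the cursor
        int rotationSteps = 0;  // persists between placements

        void Update()
        {
            // Tap R to cycle through 90-degree increments.
            if (heldPart != null && Input.GetKeyDown(KeyCode.R))
            {
                rotationSteps = (rotationSteps + 1) % 4;
                heldPart.rotation = Quaternion.Euler(0f, 0f, rotationSteps * 90f);
            }

            // Right-click removes the part under the cursor.
            if (Input.GetMouseButtonDown(1))
            {
                RaycastHit hit;
                if (Physics.Raycast(Camera.main.ScreenPointToRay(Input.mousePosition), out hit))
                    Destroy(hit.collider.gameObject);
            }
        }
    }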

Ships can now save/load their layouts to a file, so I can stop using the buggy layout I created before, with its many holes and overlaps. I also did some UI cleanup to keep the part-picker UI in front and the background behind it, and removed some placeholder stuff.

Not quite playable yet, but starting to be more usable!

Happy Holidays! And OOO.

Finally, I'll be heading home for the holidays, so my internet presence will be diminished for about two weeks. I should be back to normal on the 4th of January, and may sneak an hour of work in here and there as time allows. So if it seems quiet here, fear not, I'll be back in a couple weeks.

My best wishes to everyone these holidays, and much prosperity to all in the new year!

Shadow Mesh, Go!

Looks like my hunch last night was right: it was the normal mesh, not the collision mesh, that casts shadows. It took some finagling, but I was able to sort out the appropriate position and scale info to build shadow meshes from a list of points specified in the data.

So far, it's looking pretty good. I'm getting wall shadows as expected, and I can even do complex shapes if I want. E.g. I made the shadow points match the full thickness of the wall sprite, instead of just a flat line from one side to the other. This should be handy for sprites with more thickness or other non-straight edges.
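
The generation itself boils down to extruding the point list into vertical quads. A simplified sketch of the idea (names and details are approximations, not the exact code):

    using UnityEngine;
    using UnityEngine.Rendering;

    // Extrude a polyline of points (in the part's local space) into a
    // shadow-casting mesh that never renders on its own.
    public static class ShadowMeshBuilder
    {
        public static void Attach(GameObject part, Vector2[] points, float height)
        {
            var verts = new Vector3[points.Length * 2];
            for (int i = 0; i < points.Length; i++)
            {
                verts[i * 2]     = new Vector3(points[i].x, points[i].y, 0f);
                verts[i * 2 + 1] = new Vector3(points[i].x, points[i].y, -height); // toward camera
            }

            // Two triangles per segment between consecutive points. If the
            // shadow only casts from one side, the winding needs flipping
            // (or duplicate the triangles reversed).
            var tris = new int[(points.Length - 1) * 6];
            for (int i = 0; i < points.Length - 1; i++)
            {
                int v = i * 2, t = i * 6;
                tris[t]     = v;     tris[t + 1] = v + 1; tris[t + 2] = v + 2;
                tris[t + 3] = v + 2; tris[t + 4] = v + 1; tris[t + 5] = v + 3;
            }

            var mesh = new Mesh { vertices = verts, triangles = tris };
            mesh.RecalculateNormals();

            var caster = new GameObject("ShadowCaster");
            caster.transform.SetParent(part.transform, false);
            caster.AddComponent<MeshFilter>().mesh = mesh;

            var mr = caster.AddComponent<MeshRenderer>();
            mr.sharedMaterial = new Material(Shader.Find("Standard"));
            mr.shadowCastingMode = ShadowCastingMode.ShadowsOnly; // invisible, shadows only
        }
    }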

There's one problem: my shadow mesh and my sprite seem to be flipped horizontally relative to each other. From what I can tell, my prefab cube must have flipped UVs in the horizontal direction. I'm looking into Blender now to see if there's something I did to flip them. Though, I suspect it may just be a coordinate-system thing: Unity's camera points in the Z+ direction, so I'm actually looking at the underside of the object.

During all this, I also did some patching-up of my JSON handling code, and now I can write out any JSONs that I want based on current game data. This probably won't be used much in the final game, but for development, it can be handy for quickly adding bulk data to a large number of items. Just read in the base data, have my code procedurally write extra data to each object, and save out to JSON again.
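
The bulk-data trick looks roughly like this (ItemData, DrawOrder, and the paths are placeholders for the real data classes):

    using System.Collections.Generic;
    using System.IO;
    using LitJson;

    public class ItemData { public string Name; public int DrawOrder; }

    public static class BulkDataTool
    {
        public static void Run(string inPath, string outPath)
        {
            // Read in the base data...
            var items = JsonMapper.ToObject<List<ItemData>>(File.ReadAllText(inPath));

            // ...procedurally write extra data to each object...
            for (int i = 0; i < items.Count; i++)
                items[i].DrawOrder = i;

            // ...and save out to JSON again.
            File.WriteAllText(outPath, JsonMapper.ToJson(items));
        }
    }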

I also fixed the scrollwheel sensitivity on the scrolling parts list, so now a single mouse wheel increment moves the list by about one full part button. Should make scrolling easier.

However, I'm still having trouble detecting when the mouse is over that GUI panel. Unity handles mouse clicks and scroll events on the panel magically, but when I try to manually check whether the mouse is over it, I keep coming up empty. I'm mainly interested in this so I can prevent mouse events from affecting the scene while manipulating the menu. Mouse wheel scrolling both zooms the camera and scrolls the part list, but it should only do one or the other based on where the pointer is.

For the life of me, though, I can't figure out how to do this. Seems to be a glaring omission in the Unity UI system, based on my searches. (I.e. I may have to create custom OnEnter/OnExit handlers for the panel in question)
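
If I go the custom-handler route, it'd look something like this. (Apparently EventSystem.current.IsPointerOverGameObject() might also be the built-in answer, but I haven't tried it yet.)

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Attach to the panel; other code can poll the flag before letting
    // mouse events through to the scene.
    public class PanelHoverTracker : MonoBehaviour,
        IPointerEnterHandler, IPointerExitHandler
    {
        public static bool MouseOverPanel { get; private set; }

        public void OnPointerEnter(PointerEventData eventData) { MouseOverPanel = true; }
        public void OnPointerExit(PointerEventData eventData)  { MouseOverPanel = false; }
    }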

We're starting to get there. I think I'm quite close to being able to place correctly-lit sprite parts soon. The next step will be getting the grid-fitting rules active, so parts only go where they're supposed to.

That's all for tonight. Have a good one, all!

Grid Placement, Resolution, and Shadows

I continued working on the prototype today, focusing on units, scaling, and grid placement.

Up until now, I've been using an orthographic camera (i.e. 2D projection) viewing a 3D scene. Most of my objects are either quads (squares) or cubes, stretched to match the texture dimensions. The result was pretty much identical to 2D engines like HaxeFlixel (indeed, the rendering behind the scenes is identical). The difference is that using Unity's 3D meshes allows for things like mesh materials that support lighting, shadow-casting, etc. Unity's 2D setup only allows a subset of these features.

One downside to this 3D-viewed-as-2D setup is that the on-screen elements can get stretched/squashed if you're not careful. This can ruin pixel art as it gets aliased (or anti-aliased). Fortunately, there are some tricks to setting it up correctly. Most of my morning was spent getting these settings fixed, as my first attempts were fairly arbitrary "looks good" eyeballing.

Once I had sorted out some pixels-per-unit and units-per-screen ratios, I moved on to grid placement. This was just a matter of figuring out the appropriate grid size in terms of Unity units, and adjusting item positions so they snap to the nearest grid point.

This had a few hurdles. First, I was mistakenly changing pixels-per-unit when changing camera zoom and screen size. Second, the mesh-scaling I was doing to match sprite sizes to the images was causing some alignment issues. Meshes are placed according to their center, not the top-left, so a scaled mesh is going to align differently with meshes of a different scale. I had to add some adjusters to my code for this, too.
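
Boiled down, the snapping plus center-pivot adjustment looks something like this (gridSize and the half-size offset being shorthand for those adjusters):

    using UnityEngine;

    public static class GridSnap
    {
        // gridSize is the world-space size of one 32-pixel tile, which
        // depends on the pixels-per-unit ratio. Since meshes are placed by
        // their center, offset by half the part's size after snapping.
        public static Vector3 Snap(Vector3 pos, float gridSize, Vector2 partSize)
        {
            float x = Mathf.Floor(pos.x / gridSize) * gridSize + partSize.x * 0.5f;
            float y = Mathf.Floor(pos.y / gridSize) * gridSize + partSize.y * 0.5f;
            return new Vector3(x, y, pos.z);
        }
    }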

With that done, I was able to move items around the scene and place them on a grid that matched the 32-pixel minimum tile size. I also fixed the bug that caused parts to be placed when I clicked a button to select a new part type.

Lastly, I decided to start looking at whether I could dynamically generate shadow-casters for my parts based on the JSON data. If you recall from my earlier shadow testing, diagonal walls had some problems. Meshes can cast shadows wherever they are opaque, and I created a special mesh cube that "smeared" the texture down the sides. This works well if the wall is flush against the sprite's edge, but a diagonal cuts across the middle, and the smearing doesn't cover that hole.

To solve this, I pictured a system where the user specifies a couple of coordinates in their part data, and the game extrudes that into a shadow-casting mesh. Kinda like this:

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-16.png) Rough shadow-casting mesh concept.

Basically, the user draws the gray diagonal wall sprite, and the orange box is the bounds of the image. Within their part data, they provide a list of coordinates (green dots), and the game will connect these dots with a shadow-casting mesh at runtime. The white numbers are the coordinate system of the image, so the user would write something like "[0, 0, 1, 1]" in this case.

All the game is doing is reading those dot coordinates and extruding them toward the camera (up from the floor), so that the light (white dot) gets blocked.

So far, the mesh creation was pretty easy. I'm able to apply any arbitrary collection of dots to a game object that I want. The tricky part is figuring out how to get them to cast a shadow.

So far, I've been trying to replace the item's collision mesh. For some reason, I thought this was what was casting the shadows. But now that I write this, I think I was mistaken. It's probably the regular mesh (which is why the texture on the mesh can cast a shadow in the first place). I must've confused something I did before as a test with a collision mesh casting shadows in the editor.

Anyway, maybe I'll try adding this green-dot-shadow-mesh to the existing mesh, and see if that works. In theory, since this shadow mesh is flat and its edge faces the camera, it should be invisible to the player (even if the sprite wasn't covering it).

Hopefully, I'll get that working tomorrow!

Triple-Boot Is Go! Also, Part Selection and Placement

MacBook Pro is officially booting into OSX El Capitan, Windows 10, and Ubuntu 14.04!

As of last evening's news, I was installing the non-"+mac" .iso for Ubuntu for [reasons], and holding my breath. Installation completed without too much trouble, and then came a reboot. Most importantly, OSX still worked. The Ubuntu installation still worked. But Windows 10 was not happy. It complained of "Recovery...0xc000000e...need restore disk to continue" and could not boot.

On a hunch, I went back to the topic I read about fixing the hybrid disk to be GPT again. Following the steps to convert the hybrid MBR into a protected MBR, I rebooted and...it worked! All three OSes!

So to quickly summarize (from memory here, so forgive errors): a mid-2014 MacBook Pro was able to triple-boot OSX, Win 10, and Ubuntu after a bit of partition massaging. I basically shrunk the OSX partition from within OSX, then created partitions after it for Windows 10 and Ubuntu. Both were FAT32, and this step may have caused OSX to hybridize the partition table.

Next, I followed the link above to convert the hybrid MBR to protected, and was able to install Win 10 (EFI mode) from USB. After that, I think I installed the rEFInd boot loader. I forget why.

I then installed Ubuntu using a non "+mac" .iso, using a USB in EFI mode. (The +mac forced BIOS mode, instead of EFI, which is bad for modern EFI OSes like OSX, Win 10, etc.) This killed Windows 10 momentarily, but re-running the gdisk script (above) to fix the hybrid MBR to be protected restored Windows.

Phew! Now, time to get back to coding a game!

I did some of that today, too. Namely, I got my scrolling part-selector buttons to attach a ship part to the cursor when clicked, and the part would follow the cursor around. Then, when I clicked the scene, the part was deposited there, and a new copy started following the cursor.
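
The loop behind that is only a few lines. Roughly (CursorPlacer and the z handling are my shorthand, not the exact code):

    using UnityEngine;

    public class CursorPlacer : MonoBehaviour
    {
        public GameObject partPrefab;  // the currently selected part type
        GameObject heldPart;

        void Update()
        {
            if (heldPart == null)
                heldPart = (GameObject)Instantiate(partPrefab);

            // Project the mouse into the scene (orthographic camera).
            Vector3 world = Camera.main.ScreenToWorldPoint(Input.mousePosition);
            world.z = 0f;
            heldPart.transform.position = world;

            // Click deposits the part; next frame spawns a fresh copy.
            if (Input.GetMouseButtonDown(0))
                heldPart = null;
        }
    }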

It's crude, doesn't follow any grid-placement rules, and clicking a button to change parts adds a mesh over the button. So still a work in progress.

But yay! Starting to feel like a ship editor again. Now for those pesky grid-placement rules...

Butchering My MacBook Pro

Hey Folks! Hope everyone had a good weekend. Ours was a bit stressful, as we realized we needed a complete list of all of my and Rochelle's entries to the US for some paperwork. Not just months and years, but exact dates.

As someone who's been living in Canada since 2004 with family to visit in the US, well...let's just say it's a long list! And one we're still working on.

Back at the office, I decided to take another crack at a triple-booting MacBook Pro (MBP). "Crack" is no mistaken choice of words, either. The MBP disk partition situation is a real jungle, particularly if you let BootCamp do its thing. I've been learning all kinds of stuff about EFI, UEFI, Master Boot Records, HFS+, and other likely-immediately-obsolete techno-jargon. Apple seems to be keeping slightly ahead of folks' ability to build tools for MBP to support OSX, Windows, and Linux.

My latest discovery is that recent (2014+) MBPs use a more modern EFI than the old EFI 1.1 spec. Though, it may still be a hybrid EFI. And Windows has recently (as of 8) embraced EFI, so BootCamp's "helpful" hybridization of disk partition schemes may actually make things worse.

With that in mind, I set about manually cleaning up BC's hybrid disk to make it more EFI-friendly for Windows. Except recent Macs also have a "helpful" System Integrity Protection (SIP) feature. Basically, it locks users out of certain root folders/commands to avoid malware hijacking the machine. Despite my snarky quotes, this is good in most cases. But if you need to muck with the partition scheme, not so much.

To make matters worse, it appears recent MBPs (or at least mine) have the "helpful" feature of internet-based recovery HD. Again, great for when you need it, but it appears this means I have no local recovery. As a result, I have to download OSX setup (i.e. wait several minutes) every time I use recovery mode to bypass SIP for a root command. (Also, it seems internet recovery is stuck at the OSX version my hardware came with, despite upgrades to OSX since then. I've heard some horror stories about reverting OSX versions like this, and if so, that stinks. But if that's just a myth, then no probs.)

Anyway, I finally got Windows working using gdisk to fix the hybrid partition scheme. (Setting a "protected" Master Boot Record.) Windows 10 is working! OSX is working! Wait, where did my Lubuntu go?

I installed Lubuntu before Windows, but it's no longer booting. ("Not a bootable device.") Huh.

A few more hours of research, and it appears that I may have been using the wrong .iso for Lubuntu. When you download the iso for *buntus, you get a raft of choices. 64-bit, 32-bit, alternatives of each, and +mac versions of each of those. Bizarrely, it turns out the lubuntu-14.04.1-desktop-amd64+mac.iso is not for use with modern Macs. It's a misnomer these days, as it forces BIOS mode instead of EFI. Oof.

So, I set about downloading and creating a USB live drive using lubuntu-14.04.1-desktop-amd64.iso instead, and so far, so good. In the "live" mode, I can verify it booted in EFI mode instead of BIOS. Now, for the ultimate test: does installation work, and will it bork my MBP?

Tune in tomorrow!

Hoo-Wee! GUI!

Hey Folks! With some trepidation, I delved into Unity's GUI system today. And after a day of tutorials and experimenting, I have to say, I'm pretty impressed!

Originally, I approached the GUI like I always do: hack something together quick and dirty. I started thinking about how Unity handles scene hierarchy, scrolling, clicking, whether I wanted buttons or just raycast clicks...and it dawned on me: I should really see how Unity prefers to do this stuff first.

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-11.jpg) And I'm glad I did.

It turns out, Unity's GUI system is pretty robust, and not too hard to figure out. It took a video tutorial to get my bearings (and I hate video tutorials), but after some training wheels, I think I'm getting the hang of it.

The shot above shows a scrolling list of buttons, each tied to an item loaded from my items JSON. The user can just click and drag the list, and it scrolls like an Apple menu. I.e. with inertia and bounces when it reaches the end. I couldn't care less about the bounces, but I'm impressed that I didn't need to write special code for this type of scroller. I didn't even need to write code to line the buttons up vertically, align text, or practically anything. I just set some layout rules in the editor, and had my code populate the list at runtime.
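
The runtime population amounts to something like this (PartListBuilder and the names are placeholders; the layout rules live entirely in the editor):

    using UnityEngine;
    using UnityEngine.UI;

    public class PartListBuilder : MonoBehaviour
    {
        public RectTransform content;  // the ScrollRect's content object
        public Button buttonPrefab;

        public void Populate(string[] itemNames)
        {
            foreach (string itemName in itemNames)
            {
                Button b = (Button)Instantiate(buttonPrefab);
                b.transform.SetParent(content, false);  // layout group lines it up
                b.GetComponentInChildren<Text>().text = itemName;

                string captured = itemName;  // copy for the closure
                b.onClick.AddListener(() => Debug.Log("Picked " + captured));
            }
        }
    }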

The one thing I did end up doing in code was adjusting the button preview images to scale according to the sprite size. I may be doing this wrong, as it was just the first way I figured out how to add images at runtime (the tutorial did it a different way).

And by all accounts, there's a lot more functionality in there. Automatic 9-slicing of images for resizable frames, using shader materials on UI, having some UI live in the scene and some overlaid atop it...there's a lot of thought behind this system. It's a huge leap forward from what I'm used to. I suppose we'll see if I produce a better UI as a result :)

I did three other smaller tasks today, with mixed results.

First, I took another crack at setting up a triple-boot MacBook. No dice, though. I thought I could partition/format in a different order to fool Windows into being okay, but I ended up in the same place (Windows refused to install on the bootcamp partition after Linux was done). I have a feeling I forgot to switch Linux to Logical instead of Primary partition, which may have been it. If not, maybe I need to see if Linux just installs afterwards. Or, as some have suggested, just boot Linux from USB when I need it.

Second, I tried to figure out why item rotations were wonky in my loaded data. I still haven't, but I'm starting to see more of the problem. I'm not too concerned about this yet, as this data was produced in another app, and I might sidestep this issue if I start using this Unity version of the code to place items.

Finally, I figured out how to make mouse clicks hit items in the scene. It turned out to be pretty simple: just a raycast from the camera along its look-at direction (which for a 2D camera is just the Z axis). And conveniently, Unity adds collider meshes to each object by default, so the camera raycast just intersects those colliders.
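
In code, it's about this much (a minimal version):

    using UnityEngine;

    public class ClickPicker : MonoBehaviour
    {
        void Update()
        {
            if (!Input.GetMouseButtonDown(0)) return;

            // Cast from the camera through the mouse position, and see
            // which collider the ray hits first.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
                Debug.Log("Clicked " + hit.collider.gameObject.name);
        }
    }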

In fact, I think those colliders are also the basis of Unity's navmesh system (AI pathfinding), so I may get a two-for-one by using meshes: selection and automatic navmesh setup.

In fact in fact, the primary purpose of the colliders may constitute a third benefit: physics! I haven't yet decided how or if I'll use physics. But two use cases that interest me are:

  • Particles/FX - It'd be interesting to see if sparks, ricochets, and other sprites could be worth having.
  • Zero-g - I've sort of pictured the ship layouts being flat, like a boat. This would mean that thrusters in the stern (a la Millennium Falcon) would send crew and loose items flying aftward under acceleration, and forward under "braking." I.e. literally, loose cannons. My realism angel/devil says grav-compensators are dirty, so I'm curious to try it.

    That said, I could always impose a design standard of accelerating with the same belly thrusters used for lift-off, and the ship must reorient belly-forward when braking, too. We'll have to see.

That's all for now. Have a good weekend, all!

Porting Data from Haxe to Unity

Today, I wanted to solve some of the problems I was seeing in yesterday's render. Namely, the incorrect depth-sorting of sprites (such as the floor and struts), and rotations. For this to happen, though, I needed to be more sophisticated in how I load data.

To date, I've just been parsing the ship JSON file I created in the past, and hacking together parts from the image and transform data it held. It was enough to get a list of parts and their X/Y positions, but that's about it. The items JSON file I had was where I defined more information for each item, such as the depth-sorting order, socketing info, etc. I needed to get more data into the engine.

Loading data isn't hard, as I've discovered. The LitJSON library makes it pretty trivial for me to parse my old JSON files. It took a bit of time for me to decide how I'd store the raw data I was parsing, and whether to convert it to game-ready objects right away or not. In Haxe, that's basically what I did. Here, however, I may just load the raw data, and piece-together game objects as they are needed. This saves me some upfront time in coding, and also lets me separate code a bit more.

Most of this raw data loading/storing went without a hitch. But there were a few gotchas. Most of them revolved around fancy tricks I was doing in Haxe, and whether they could be done in C#. And for the most part, they can. I just had to relearn them.

One example was the ability to specify a generic type in a function, so different types could be used by the same function. I used this for writing JSON parsing code, since it was almost identical for each data type, and just a few variables differed each time.
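
In C#, that boils down to something like this (LoadList and the type names are placeholders):

    using System.Collections.Generic;
    using System.IO;
    using LitJson;

    public static class DataLoader
    {
        // One loader for every data type; the type parameter does the rest.
        public static List<T> LoadList<T>(string path)
        {
            return JsonMapper.ToObject<List<T>>(File.ReadAllText(path));
        }
    }

    // Usage:
    //   List<ItemData> items = DataLoader.LoadList<ItemData>(itemsPath);
    //   List<ShipData> ships = DataLoader.LoadList<ShipData>(shipsPath);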

Another example was how to get an arbitrary property on an object without knowing what it is at compile time. (I.e. reflection) I'll admit, the Haxe way of doing this is a bit easier to understand, but I managed to get this working, too.
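
For reference, the reflection version looks about like this (assuming public fields on plain data classes; roughly the C# analog of Haxe's Reflect.field()):

    using System.Reflection;

    public static class FieldReader
    {
        // Fetch a field's value by name at runtime.
        public static object GetByName(object obj, string fieldName)
        {
            FieldInfo fi = obj.GetType().GetField(fieldName);
            return fi != null ? fi.GetValue(obj) : null;
        }
    }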

Once I figured these out, I had Unity loading all my old JSON files, and was ready to grab depth-sorting info for each ship part I drew. And a few tries later, it worked! My floor grill was rendering over my framework, and under my walls. And shadows were looking good!

Rotations are still an issue, though. I still haven't figured out why my horizontal walls look wrong but the vertical ones look right, especially since they all get similar rotation treatment. I'll have to investigate more tomorrow.

We're getting there, though! Still no logic in the game. No data-editors, no AI, no interactions, etc. However, I'm hoping this will be the easy part, as it's mainly just copying old code and translating Haxe to C# (which are pretty similar in syntax).

Except for that data editor. The GUI in Haxe was a nightmare, and I shudder to think what Unity has in store. We'll see!

Unity Mapping and Lighting

I decided to put hardware aside today, and do some more prototyping in Unity. And I'm happy to report I managed some serious progress!

First, I fixed a niggling issue in the LitJSON library that seemed to force all floats to doubles (which is annoying for any sort of Vector2/Vector3/graphics code).
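
I patched the library itself, but for reference, LitJSON also has a RegisterImporter hook that should handle this without edits. Something like:

    using LitJson;

    public static class JsonSetup
    {
        public static void Init()
        {
            // Map JSON doubles onto float fields during import.
            JsonMapper.RegisterImporter<double, float>(input => (float)input);
        }
    }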

Then, I set about making some code to read the JSON and create prefabs for each ship part, set the material properties, then rotate, position, and scale them accordingly. It's still not quite there, but it's getting close:

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-09.jpg) Somehow reminds me of Lunch Atop a Skyscraper

The above is a mostly correct loading of the JSON ship I built in the HaxeFlixel editor, with sprites that have diffuse (color) and normal (bump) maps, plus shadow-casting.

There were quite a few hoops that needed jumping through. But thankfully, they were mostly one-liner fixes that took longer to find than to write. For example:

  • How to apply a custom diffuse and normal texture to a material at runtime.
  • How to instantiate a prefab vs. a quad primitive at runtime.
  • How to enable "cutout" transparency on a mesh renderer at runtime.
  • How to enable normal maps on a mesh renderer at runtime.
  • How to fix the normal map swizzling Unity does when reading a PNG at runtime.

In case you hadn't noticed, the pattern is "at runtime." Unity is definitely geared towards doing stuff in the editor, and I'm geared towards doing stuff in code. So a lot of this was uncovering the little details Unity's editor does when toggling various UI settings.
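
For anyone hunting the same secrets, here's roughly what those one-liners look like gathered in one place. Treat the details (especially the normal-map swizzle) as my notes rather than gospel:

    using UnityEngine;

    public static class RuntimeMaterials
    {
        public static Material MakeCutoutMaterial(Texture2D diffuse, Texture2D normal)
        {
            var mat = new Material(Shader.Find("Standard"));
            mat.SetTexture("_MainTex", diffuse);

            // Switch the Standard shader to "Cutout" mode, as the editor does.
            mat.SetFloat("_Mode", 1f);
            mat.SetOverrideTag("RenderType", "TransparentCutout");
            mat.EnableKeyword("_ALPHATEST_ON");
            mat.renderQueue = 2450;

            // Enable the normal map; the keyword is the part the editor
            // toggles silently for you.
            mat.SetTexture("_BumpMap", SwizzleNormal(normal));
            mat.EnableKeyword("_NORMALMAP");

            return mat;
        }

        // Unity's importer repacks normal maps (x into alpha) for DXT5nm,
        // but a PNG read at runtime is plain RGB. Repack to match. The
        // source texture must be readable.
        static Texture2D SwizzleNormal(Texture2D src)
        {
            Color[] pixels = src.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                Color c = pixels[i];
                pixels[i] = new Color(1f, c.g, 1f, c.r); // x -> a, y stays in g
            }
            var dst = new Texture2D(src.width, src.height, TextureFormat.ARGB32, true);
            dst.SetPixels(pixels);
            dst.Apply();
            return dst;
        }
    }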

Each of these secrets I uncover is thankfully a one-time thing, as I can reuse those tricks on any future code. And I'm getting pretty close to the point where I've learned how to do everything I was doing in HaxeFlixel.

Next up: I need to figure out why some sprites have wrong rotations. Then, I want to add a JSON that defines each ship prefab's settings (like quad vs. cube, draw order), and some sort of way to customize how the sprite blocks light, for diagonal walls.

I'm feeling pretty confident after today!

Hardware Day

Unfortunately, I didn't get much dev time today. I spent a large chunk of the day partitioning and installing OSes on my MacBook. And by the time the afternoon rolled around, I had a mostly broken MacBook :(

Fortunately, restoring the original hard drive wasn't too hard. I just had to reboot into recovery mode, repair the main partition, delete the others, and resize the main partition to use the whole disk again. Back to normal!

However, that was a rough first try.

I was following this triple-boot tutorial, and it basically has you start installing Windows via Boot Camp, stop midway to install Linux, then resume Windows.

Unfortunately, the "install Linux" step was light on details. I was faced with a choice of how to use the partition I created for it, and chose to create a "primary" partition for the root, and a swap partition. Looking back, I'm thinking I made a mistake by choosing "primary."

It's also possible that by doing things in this order, Windows gets shoved to the back of the disk's partition list, causing it to disappear from the boot options. Though, I haven't specifically run into that issue yet. It's just something I remember from attempts on my older machines.

Anyway, it's bad that I wasted time, but good that I'm back to square one. I'll give it another shot tomorrow!

Unity Data Progress, MacBook Ugh

Hey Folks! Hope everyone had a good weekend. Ours was a bit stressful as we take in the mounting complexity of our upcoming move. It's going to involve a lot of traveling around to get government stuff setup, plus finding an apartment, new car (importation issues), etc. It could be a low-progress couple of months ahead...

Back in the dev world, there is fortunately better news. I figured out how to get Unity to load non-binarized data at runtime. Better yet, I managed to get it to load my JSON from the HaxeFlixel project!

As I mentioned last week, one of the problems I was having was finding a place to put files that both the Editor and the compiled app would see equally. As it turns out, that's the StreamingAssets folder I alluded to last week.

"Didn't that fail," you ask? Yes, but I was doing it wrong. (Of course!) It turns out I can just have Unity tell me where that path points to, and then use standard C# file access methods. E.g. System.IO.File.ReadAllText()

Once I had that working, it was just a matter of reading my JSON format. And thankfully, there's a great library out there for this. It's called LitJSON, and so far, it "just works." Within minutes, I had it reading a json file, instantiating an object from its data, and writing it back out to a new json. Not bad!
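
The whole pipeline in miniature (ItemData and the file name are placeholders):

    using System.IO;
    using LitJson;
    using UnityEngine;

    public class ItemData { public string Name; public float X; public float Y; }

    public class DataTest : MonoBehaviour
    {
        void Start()
        {
            // Note: this is a plain folder on desktop platforms; Android
            // wraps StreamingAssets in the .apk, which needs other handling.
            string path = Path.Combine(Application.streamingAssetsPath, "items.json");

            ItemData item = JsonMapper.ToObject<ItemData>(File.ReadAllText(path));
            File.WriteAllText(path + ".out", JsonMapper.ToJson(item));
        }
    }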

There's still some concern about whether this will have cross-platform issues. Ostensibly, no, but I haven't tested anything outside of Windows yet.

Elsewhere in dev, I began setting up my triple-boot MacBook today. I say "began" because most of the day was eaten up just downloading updates. Yosemite needed an update to El Capitan, because I've heard updating after a triple-boot setup may require reinstalling some EFI boot loaders.

Then, I downloaded a Windows 10 installation disk to a USB stick. I'm sort of hoping I can install it without buying it, as a 30-day trial, until I'm sure it works in triple-boot. But we'll have to see.

I had to download the Lubuntu .iso and write it onto a USB stick. And unfortunately, doing so seems to have killed my first USB stick. I spent altogether too much time trying to salvage it, then just gave up and used a new one. So far, I think the new stick is working.

I also had to download Boot Camp's support files for Windows. This seemed to take an inordinately long amount of time, and I had to try a few times to find a stick that would fit it. (I eventually had to use the same stick as Win10, and hope it lives side-by-side in harmony.)

Finally, I have to do the song and dance where I start Boot Camp to install Win 10, interrupt it on the first reboot to partition a bunch of stuff and install Lubuntu, then let it reboot back into Windows to finish Boot Camp stuff.

So far, I've gotten to the Lubuntu installation's partition setup screen, but I bailed out. I need more time than I have left today to figure out what I need there. (E.g. swap partition, boot partition, other partitions?)

All this, of course, hoping I don't bork my MacBook. Even if that happens, though, I think reinstalling a fresh OSX probably won't be too bad. Especially since this one is direct from Apple, unlike my last, second-hand MBP.

Tomorrow, I'll probably pick up where I left off with Lubuntu, then Win10, and see if I can get it to boot-select. Might need the rEFInd boot loader on there, too.

And at the same time, I'll probably be trying to get the JSON-loaded objects to appear in the world in different positions and rotations, with different materials. Ideally, materials based on PNGs specified in the JSON, using some base Material type.

In theory :)

Unity and Data Files

So I may have encountered my first mark against Unity: data access.

I started my search today by looking up how Unity could load and parse JSON files. I was a bit disappointed to learn that Unity doesn't have this out of the box. Nor does it have XML parsing. There were quite a lot of third-party solutions, though, so not bad yet.

It wasn't until later that I learned why: Unity assumes you're baking all of your data into assets at compile-time. That's a problem.

In Haxe, I can just drop some files into my assets folder, and they come out the other side one-to-one when I build the game. I can add JSON, XML, PNG, or whatever to the game, and players can access them easily. Plus, Haxe has some handy XML and JSON parsers built-in.

Unity, on the other hand, has something called Resources, which get binarized into asset packages that players cannot read. Only the game can read them (or an editor, if I made one). There is something called the "StreamingAssets" folder, which kind of works like Haxe. Files I drop in there come out in the build as-is.

However, getting access to files in this folder has been tricky. So far, it looks like I have to make a WWW(?) call to stream the data, and then wait for it to load. This is a bit more complicated than I'm used to, and I haven't succeeded yet. I'll have to look into this a bit more, but so far, score one point for Haxe and against Unity.

In other news, my replacement MacBook Pro arrived today! I still have to climb the mountain that is reinstalling the various OSes and tools, but at least I have a (theoretically) working Mac again.

Anyway, have a good weekend, all. And see you Monday!

Getting to Know More About You(nity)

Had an interesting semi-breakthrough today in my Unity testing.

I grabbed a bunch of the ShipEdit sprite assets and moved them over to the Unity project, and decided to try and recreate the ship editor scene. Namely, could I get my sprites to look good with lighting and shadows? Well...

IMAGE(http://i.imgur.com/vqeoYj2.gif) Also not a bad start for my poltergeist game...

As you can see, that's pretty darned close to what I was trying to achieve all this time. It's far from perfect, but we've got the main features:

  • Normal (bump) mapped sprites.
  • Colored lighting.
  • Shadow-casting.

How did I do it? Basically, by making it a 3D scene. But maybe not like you're thinking. Each of the vertical/horizontal wall pieces is a 3D mesh cube, but the cube is larger than the opaque part. Namely, it's a cube with UV projection planar along the z-axis:

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-03.jpg) Cube with planar UV projection.

What this does is take the image I apply as a material (my wall sprite), and sort of "smear" it along the cube's z-axis. Anywhere the image touches the edges, it creates a solid section, and anywhere the image is transparent, the cube is transparent. And since Unity's Standard mesh shader allows "Cutout" rendering mode with casting/receiving shadows, any of these opaque sides cast shadows accordingly. Nice!

Does this mean I can just paint any old wall I want, and turn it into a shadow-casting mesh? Not exactly. Check out this diagonal:

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-03a.jpg) Not such a solid wall after all.

In the diagonal case, we have a big hole. This is because the diagonal wall only has texture along the edges where the corners of the image are (hence the solid edges). The main part of the wall cuts across the inside of the image, so it doesn't appear on the mesh's sides. (The mesh is just hollow.)

The kludge I came up with was to insert a diagonal plane where the hole was, and set it to cast shadows. This is easy enough to do in the editor, but I'm not sure if a player/modder is going to go through this trouble. (Although, Unity is free...hmm.)

Anyway, it'd be nice if this could all be defined in an image and text file pair, so no special tools are needed. In theory, this may be possible if the user specifies some points that make a line through the image, and the game just extrudes those into a mesh/plane accordingly.

I'll have to give that some thought.

Once again, I'm coming away from this test impressed. Most of the really low-level and technical stuff "just works," and the issues I need to solve are fairly shallow. Things like game logic, art asset specs, etc. It feels really productive. I even allayed one of my concerns from yesterday: creating a build is as easy as choosing a target and selecting "build." The above animation was recorded from a Windows build.

Plus, in the process of learning, I'm stumbling across useful features I didn't plan on having. Things I thought would be extra work are coming for free, like material specularity, physics, etc. It's all starting to look doable.

Of course, I'm basically making a 3D game at this point, which is maybe a bit risky. E.g.

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-03b.jpg) This is not your father's pixel art.

I've already had to download and teach myself Blender to solve that UV-mapped cube issue, so I could be increasing the project complexity. But then again, maybe I'm decreasing it by not having to write low-level rendering code and other features?

I think one of the next tests should probably be to see about things like JSON parsing to see if I can recreate my editor. If so, I may just have to drop the $5000 for a pro license and make a go of it. Kinda scary, but also promising!

Getting to Know You(nity)

Okay, so I've given myself a full day of exploring Unity and...not bad, actually. I took most of the morning to finish knocking out a Pacman tutorial using Unity's 2D mode, and it was pretty straightforward. There's a lot of functionality packed into Unity, and (importantly) it works out of the box.

Probably my biggest complaint so far was the way everything in the scene had to be manually added via UI widgets in the editor. Extremely tedious for things like adding physics bounds for the Pacman maze, pellets, nodes for the lights, etc.

After some poking around, I discovered that some devs just bypass most of the editor use by doing everything in code. This is a bit closer to what I'm used to in Flash and Haxe. Just draw the assets and write the data in external files, and have the code load it in based on rules and loops.

So once I figured out how to do that...

IMAGE(http://bluebottlegames.com/img/screenshots/screenshot-2015-12-02.jpg) Oh my god, it's full of wabbits!

This is a screenshot of my Pacman tutorial with 1000 sprites loaded and randomly placed around the scene. Each sprite has a random rotation, a diffuse (color) map, a normal (bump) map, some sort of specularity (glossy highlight) that apparently is on by default, and they slowly rotate over time. There are 4 point lights with different colors in the scene. And all this runs at ~60fps.
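
The test itself is barely any code. Something in this spirit (a reconstruction, not the exact script; the lit material with diffuse/normal maps is set up in the editor):

    using UnityEngine;

    public class SpriteStressTest : MonoBehaviour
    {
        public Material litSpriteMaterial;  // diffuse + normal map assigned in editor

        void Start()
        {
            for (int i = 0; i < 1000; i++)
            {
                var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
                quad.transform.position = new Vector3(Random.Range(-10f, 10f),
                                                      Random.Range(-6f, 6f), 0f);
                quad.transform.rotation = Quaternion.Euler(0f, 0f, Random.Range(0f, 360f));
                quad.GetComponent<Renderer>().sharedMaterial = litSpriteMaterial;
                quad.AddComponent<SlowSpin>();
            }
        }
    }

    public class SlowSpin : MonoBehaviour
    {
        void Update() { transform.Rotate(0f, 0f, 10f * Time.deltaTime); }
    }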

More importantly, I got all this to work in about one day of effort.

Even more importantly, I got all this to work in one day without having ever really used Unity before.

My current feeling is that this might be worth further exploration. I still have concerns. E.g.

  • How straightforward is it to make the Mac and Linux versions of this? Do I have to tweak anything or write alternate code? Or just choose a different target and hit "compile?"
  • Ditto for mobile, though that's more "nice to have" than PC for now.
  • Can I get nice-looking ship lighting with these tools? Unity's built-in lighting/shadow system is actually pretty powerful, but it doesn't really work with sprites. I can do some hacky tricks to give each sprite thickness so they cast shadows, but then they end up shadowing themselves or causing alignment issues where they touch the floor. Though, I may be able to skip shadow-casting and just use clever light placement.
  • I also tried a 2D shadow-casting asset from the Unity asset store, but I wasn't 100% convinced it'd work for me. It seemed to use its own special materials and tricks which might invalidate normal maps, but maybe I was interpreting it wrong.
  • How hard is it to do things like parse JSON files, add my own fragment shaders, etc.? I suspect it's as easy as Haxe, but I don't know how yet.

I'm still optimistic. More so, in fact, than I was when I started this test. So I'll keep trying!

Taking a Break from HaxeFlixel/OpenFL/OpenGL

I spent most of the day still trying to get framebuffer objects working. The goal was to try and get the ship scene to render in full color, and another copy with the normal maps, and then use a shader to combine the two with lighting data to produce highlights and shadows.

Unfortunately, I'm still hitting walls no matter what I do.

Today, I thought I discovered something I had missed before in the flixel-demos/PostProcess project. By all accounts, it was using framebuffers successfully to apply post-process effects to the whole screen. I could certainly work with that! But after some testing, it appears this only works with OpenFL legacy. Since everything else I'm doing requires OpenFL next, this is a non-starter.

I also tried one more quick test where I moved that framebuffer code over to my working OpenGLView test project. And while it compiled and ran, I ended up with a blank screen. I suspect I'm rendering to the framebuffer, as desired, but I can't figure out how to work with that render afterwards.

So I'm taking a break. From HaxeFlixel, from OpenFL, from OpenGL. I've learned a lot in this process, and even had some successes. But I need to do something that works. To see results.

So tonight, I dusted off Unity and I'm going to give it another go. I'm still not certain it's the right way to go, since I'll have to relearn a lot. But on the other hand, just about everything it does is guaranteed to work, so it's mostly about learning to use its tools instead of patching and side-stepping holes in those tools.

Plus, the last time I didn't really give Unity a fair shake. Just a few hours one afternoon, and then switched back. Maybe this time I should try to get something up and running, so I have a better idea of how it compares to my HaxeFlixel option.

So, prepare to not see anything exciting for a day or two as I learn how to crawl, walk, and then run again :)