Saturday, October 20, 2012

Navigating The Flow

Hey Readers,
It's been a busy couple of weeks. My main focus has been on getting in some kind of backend interface to control all of the different commands that are going to be available for the user. I needed something that would be flexible since there are going to be so many different possible commands.

User Contexts

I decided to go with something I call User Contexts. The idea behind the user context is simple: it's basically a state object which contains all the logic to execute commands for that context. For example:

Main User Context => has two options, Build Item and Move Item
*Build Item
    Build User Context => choosing Build Item spawns a new Build User Context from the Main
    *Build Wall
    *Build Floor
*Move Item
    Move Item Context
    *Click on location to move item to

So each user context knows about its current state and how to construct the next level of user contexts based on the available commands. When a new context is created it's returned to the main game state as the current context. In addition, each previous user context is linked to the next context so that when the current user context is finished it can easily jump back to the last context.
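The linked-context idea above can be sketched roughly like this; all class and function names are my own illustrative assumptions, not the actual codebase:

```cpp
#include <memory>
#include <string>

// Hypothetical sketch of a User Context: each context knows its own state
// and holds a link back to the context that spawned it.
class UserContext {
public:
    explicit UserContext(std::string name, UserContext* previous = nullptr)
        : m_name(std::move(name)), m_previous(previous) {}
    virtual ~UserContext() = default;

    // When this context finishes, the game state can jump back here.
    UserContext* Previous() const { return m_previous; }
    const std::string& Name() const { return m_name; }

private:
    std::string m_name;
    UserContext* m_previous; // link to the last context
};

// Example: choosing Build Item in the Main context spawns a child Build
// context, which is then returned to the game state as the current context.
inline std::unique_ptr<UserContext> SpawnBuildContext(UserContext& main) {
    return std::make_unique<UserContext>("Build", &main);
}
```

The key design point is that popping back up the chain is just following the `Previous()` link, so no context needs global knowledge of the hierarchy.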

Interfacing with the User

This was all well and good for the back end, but I needed a way to convey this to the user. To keep things simple I decided to build a console UI element which could be easily written to. I didn't want to waste a lot of time fooling around with creating new UIs for each User Context, so I thought this would be an easy alternative that would still convey the possible options to the user. I set up a function in the User Context which takes in a UIConsole so that each context can print whatever information it needs to convey. There are also a number of functions for overriding what the functionality should be when there's mouse or keyboard input.
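A minimal sketch of that hookup might look like this; the member names and the exact UIConsole interface are assumptions for illustration:

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for the console UI element: contexts just write
// lines of text to it.
struct UIConsole {
    std::vector<std::string> lines;
    void Print(const std::string& text) { lines.push_back(text); }
};

// A context prints its available options and can override input handling.
struct BuildContext {
    virtual ~BuildContext() = default;

    // Each context writes whatever it needs to convey to the console.
    void PrintOptions(UIConsole& console) const {
        console.Print("1) Build Wall");
        console.Print("2) Build Floor");
    }

    // Input overrides let each context decide what clicks and keys mean.
    virtual void OnMouseClick(int /*x*/, int /*y*/) {}
    virtual void OnKeyPress(char /*key*/) {}
};
```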

Status of the Game World

While I was mucking around with setting up this user context console window I thought it would be great to have another console window for displaying information about the currently selected object. I haven't taken this one too far yet, but my thought is to set up a status component I can give to each of my game objects so they can convey useful state to the user. Some useful info might be how many resources are left in a crate, what ability a character is currently using, or how much is left to do on an object under construction.
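Since the status component is still just a thought, here's only a rough sketch of what it could look like, with every name being a placeholder:

```cpp
#include <string>
#include <vector>

// Hypothetical status component: any game object can carry one and push
// lines of useful state to the object status console.
struct StatusComponent {
    virtual ~StatusComponent() = default;
    virtual std::vector<std::string> GetStatusLines() const = 0;
};

// Example implementation for a resource crate.
struct CrateStatus : StatusComponent {
    int resourcesLeft = 0;
    std::vector<std::string> GetStatusLines() const override {
        return { "Resources left: " + std::to_string(resourcesLeft) };
    }
};
```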

In the coming weeks I'd like to flesh out a few more gameplay features: obtaining resources, the job system, item destruction. We'll see how far I get though. I've posted another video showing the progression I've made. You can see the User Context console is in the top right corner and the object status console is in the bottom right corner.

Wednesday, October 3, 2012

Having Fun With Behaviour Trees

Hey Readers,

The last two weeks I've been working on resource management and progressing on the job system.

Job System

My initial plan for the job system is essentially that every job has a set of abilities. These abilities can be learned by having that job assigned. The user issues requests which need to be executed, rather than using the abilities directly. I iterate over all the characters looking for one that has an ability which can execute the request. Initially I implemented the abilities as a chain of actions. This worked well for my basic needs; my first ability was a build item ability, which consisted of an action chain with a move action and a build item action. This worked, but then I started looking at resource management (as in using resources to build something, not in-game assets) and quickly realized that my simple chain of actions was not going to get the job done.

Behaviour Trees

What I decided to do was to make each ability its own Behaviour Tree. This way I can define very detailed behaviour for how each ability will work. For example, my initial flow for the build ability is:
  • Check to see if character is holding construction resource
  • If they are
    • Check to see if character is in range of the item to build
    • If they are
      • Start using resources to build item
    • If they aren't
      • Move towards build area
  • If they aren't
    • Check to see if resources available on map
    • If there is
      • Check to see if character is in range of resource
      • If they are
        • Pick up as much resources as character can carry
      • If they aren't
        • Move towards closest resource pile
    • If there isn't
      • Fail out; the job can't be completed right now.
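The flow above can be sketched as plain branching logic. The real implementation uses Decision and Action nodes rather than if statements, and all names here are illustrative:

```cpp
// Possible outcomes of one evaluation of the build ability's tree.
enum class BuildStep {
    Build, MoveToBuildSite, PickUpResources, MoveToResources, Fail
};

// World state the decisions query, condensed into flags for the sketch.
struct BuildQuery {
    bool holdingResources = false;
    bool inBuildRange = false;
    bool resourcesOnMap = false;
    bool inResourceRange = false;
};

inline BuildStep EvaluateBuildAbility(const BuildQuery& q) {
    if (q.holdingResources)  // holding construction resources?
        return q.inBuildRange ? BuildStep::Build : BuildStep::MoveToBuildSite;
    if (q.resourcesOnMap)    // any resources available on the map?
        return q.inResourceRange ? BuildStep::PickUpResources
                                 : BuildStep::MoveToResources;
    return BuildStep::Fail;  // job can't be completed right now
}
```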

If the job can't be completed it's unbound from the character ability and returned to the bottom of the command queue.

Actions and Decisions

The way I've gone about implementing this is to set up two templated classes, Action and Decision. They both take a templated type which has the implementation for that type of action or decision. Decisions are designed to be the logic which decides which path to take in the tree. Actions are designed to be small pieces of logic at the end of a chain of decisions. This is all then linked together with fast delegates. Every decision has a pass and a fail delegate. Decisions and actions then have a function for getting a delegate instance which can be hooked into the pass or the fail.
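A rough sketch of that wiring, using std::function in place of the fast delegates (the actual codebase uses a fast-delegate library; everything here is an assumption for illustration):

```cpp
#include <functional>

// A Decision takes a policy type with a Decide() and routes to a pass or
// fail delegate depending on the result.
template <typename Impl>
struct Decision {
    Impl impl;
    std::function<void()> onPass; // hooked to the next node in the tree
    std::function<void()> onFail;

    void Run() { impl.Decide() ? onPass() : onFail(); }
    std::function<void()> GetDelegate() { return [this] { Run(); }; }
};

// An Action takes a policy type with an Execute(); it sits at the end of a
// chain of decisions.
template <typename Impl>
struct Action {
    Impl impl;
    void Run() { impl.Execute(); }
    std::function<void()> GetDelegate() { return [this] { Run(); }; }
};

// Tiny demo: one decision routing between two actions.
struct AlwaysTrue { bool Decide() { return true; } };
struct SetFlag { bool* flag = nullptr; void Execute() { *flag = true; } };

inline bool DemoDecisionTree() {
    bool built = false, failed = false;
    Action<SetFlag> build{ SetFlag{ &built } };
    Action<SetFlag> fail{ SetFlag{ &failed } };
    Decision<AlwaysTrue> root;
    root.onPass = build.GetDelegate();
    root.onFail = fail.GetDelegate();
    root.Run();
    return built && !failed;
}
```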


These behaviour trees are then assigned to an ability. Currently the only implemented ability is the build ability. The idea is that every job has a number of abilities that can be used by the character that has that job equipped. For example, the build ability is part of the Handyman job. The user doesn't actually control these abilities. What they do is generate a request:
  • The user chooses a tile they want to build.
  • They generate a build request. This request goes into a global request queue that the characters have access to.
  • On the character update, if they aren't already fulfilling a request they will check the request queue and see if any of their abilities can be used to perform a request.
  • If an ability can be used, that ability is bound to the request and the character can start performing the request.
  • When the request is completed or can no longer be executed the ability is unbound and the request is deleted.
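The steps above can be sketched roughly like this; the request-matching by a string type is a simplification, and all names are assumptions:

```cpp
#include <deque>
#include <optional>
#include <string>
#include <vector>

// A request the user generates, e.g. a build request for a tile.
struct Request { std::string type; };

// An ability advertises which request type it can perform.
struct Ability {
    std::string handles;
    bool CanPerform(const Request& r) const { return r.type == handles; }
};

struct Character {
    std::vector<Ability> abilities;
    std::optional<Request> bound; // currently bound request, if any

    // On update, an idle character scans the global queue for a request
    // that one of its abilities can perform.
    void Update(std::deque<Request>& queue) {
        if (bound) return; // already fulfilling a request
        for (auto it = queue.begin(); it != queue.end(); ++it) {
            for (const auto& ability : abilities) {
                if (ability.CanPerform(*it)) {
                    bound = *it;     // bind the request to this character
                    queue.erase(it); // no longer up for grabs
                    return;
                }
            }
        }
    }
};
```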
I have a video demoing the evaluation of the build ability. The blue tiles are spots where I've laid down a build request and the crate in the corner is a resource crate that contains some construction resources.

Apologies for the frame rate. I recorded it on my mobile dev machine, a Toshiba Netbook, so the recording software taxed the system a bit.

Friday, September 21, 2012

I Love This Bar

Holy smokes, it's been a long time since my last blog post.

I've kind of switched gears in my codebase and I wanted to have something to show before I wrote up my next post. For the last year or so I've really been concentrating on getting a lot of the core components I needed to write a game at the level I'm used to from working on games professionally. After finishing most of the core front end code I finally felt like I had enough blocked out that I could start to write a game. So I pulled the trigger and got down to work.

The game I'm working on takes a lot from simulation-heavy games like Towns or Gnomoria, which both draw a lot from Dwarf Fortress, a game which I've read a lot about but never played. The game I'm building differs a fair bit from those, though, in that I'm not trying to simulate a medieval world; what I'm hoping to simulate is a bar/night club. I spent a lot of my youth bar hopping and I feel there's a lot of material that can be pulled from that environment and translated into a really fun game. The basic premise is that you start off with 4 characters and an empty lot, and you have to assign jobs to your characters, build up a bar, get patrons, build up a regular clientele, and just try to have the best bar in town.

I've done some research into what's out there, but most of it just feels like FarmVille in a bar environment. That is not what I want to build. What I want to build is something for more core gamers, with lots of different jobs to master, experience building, and stats to track. I want to create an environment that feels somewhat alive.

So far I have a small lot working, with characters that respond to commands. Currently the only "job" I've implemented is "The Handyman" which will be responsible for construction in the game. The characters are able to build walls and that's about it.

Over the course of the next year I'm hoping to crank out something that's a lot of fun to play and filled with lots of quirky humor. I'll try to be better about the weekly updates. In the meantime here's a few screenshots of my progress. I'm not sure where those bright spots are coming from; they don't appear in game. Also, all art is just placeholder.


Friday, August 17, 2012

Quest for an Asset Viewer

Week 1

What a tough 2 week sprint!

The first task to pick off was getting screen saving and loading working. This was actually really easy to get going because serialization was built into my sprocket system. So in order to save, all I had to do was serialize out the sprocket data to XML, and for loading I just had to write some code to place the loaded data into the right containers. I also ported over the code for throwing up the save/load dialogs. I kinda cheated when I originally set up the save/load dialogs: I just call the default Win32 dialogs rather than create my own. I neatly packaged it up in my platform code so it should be easy to abstract out when I do eventually get the code running cross-platform.

The next big thing was fixing the broken window drag functionality. I had a really tough time with this one for stupid reasons. It made for a really bad morning. Basically, the problem was that all my mouse handling code was in local space. This works great for some things, but when you're trying to move an object and the local space is changing every time you move the object, it creates some serious jitter! I didn't have a good way to expose the screen space deltas, which is what I needed to move things properly. What I wound up having to do was calculate the delta based on the parent's coordinate space. I can't help but feel that's going to bite me down the road, but for now the mouse drag is actually working better than before I started refactoring the UI system.
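The fix works because the parent's coordinate space stays put while the child moves, so the delta doesn't feed back into its own frame of reference. A minimal sketch, with all names assumed:

```cpp
struct Vec2 { float x = 0, y = 0; };

struct Element {
    Vec2 position;                 // offset within the parent
    const Element* parent = nullptr;

    // World-space origin of this element (walks up the parent chain).
    Vec2 Origin() const {
        Vec2 o = parent ? parent->Origin() : Vec2{};
        return { o.x + position.x, o.y + position.y };
    }
};

// Delta measured in the PARENT's space: the parent's origin is stable while
// the child is dragged, so consecutive deltas don't jitter.
inline Vec2 ParentSpaceDelta(const Element& e, Vec2 prevScreen, Vec2 currScreen) {
    Vec2 o = e.parent ? e.parent->Origin() : Vec2{};
    return { (currScreen.x - o.x) - (prevScreen.x - o.x),
             (currScreen.y - o.y) - (prevScreen.y - o.y) };
}
```

Had the delta been computed in the dragged element's own local space, the origin would shift every frame the element moves, which is exactly the jitter described above.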

Once drag was working I was able to hook up the scale and translate buttons in the gizmo I created for manipulating UI Elements. I decided to leave rotation alone for now. Currently my UIClickable sprocket doesn't take rotation into account. I don't imagine it'd be a tough change, since I already have a transformation matrix; it's just that I'm only using the translation element of it. Still, I know it's going to be a lot more complicated than just multiplying by a rotation matrix.

I also spent some time just getting all the UI elements working with the UI Editor. While I was mucking around with that I needed to get arrays working with the protoform editor for certain create params. I managed to get them working; the only problem is that they only work for simple types. I'm already using comma-separated lists for complicated types like Float3, which take multiple parameters for initialization, and the way I'm parsing arrays is also a comma-separated list. This shouldn't be too tough to clean up in the future; just use brackets or something to separate items between commas.

Another big thing for this week was building a better way of viewing loaded assets, as well as providing a means to select what textures to use in a UI element. The old way to select a texture for an element was to open up the XML package, check what id I had set for that particular asset, and then type that id into the texture field in the protoform editor. I was about to start laying out the components in code when I realized this was a perfect test case for the editor! I always find that when I actually start using a tool I find all sorts of bugs and come up with new feature ideas. The first bug/problem I ran into was that objects sorted in the order that they were placed. This was no good; I needed to be able to control the layering. For a simple solution I decided to use the z value I had been ignoring up until this point to sort the objects. I wrote some code that does a sort whenever an object's z value changes.
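The z-sorting can be sketched like this; the names and the re-sort-on-change trigger are illustrative assumptions:

```cpp
#include <algorithm>
#include <vector>

struct EditorObject { int id = 0; float z = 0; };

// Re-sort the draw list whenever an object's z value changes, so layering
// is controlled by z instead of placement order.
inline void SetZ(std::vector<EditorObject>& drawList, int id, float newZ) {
    for (auto& obj : drawList)
        if (obj.id == id) obj.z = newZ;
    // Stable so objects with equal z keep their placement order.
    std::stable_sort(drawList.begin(), drawList.end(),
                     [](const EditorObject& a, const EditorObject& b) {
                         return a.z < b.z;
                     });
}
```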

So once I had the sorting issue worked out I set about creating the asset entry UI element. This was easy: lay out a UISprite background, a UISprite for a thumbnail placeholder, and a UIText for the asset name, and I was done. Then I realized that I had no way to load the object up and actually hook some code into it to replace the placeholder elements. I decided that the best thing to do was just start pounding out some code and see how it felt. I had the objects loaded into a UIContainer object and set up some code so I could clone that container, letting me create multiple objects from that template. I set up a UIAssetViewer class that held a bunch of these containers and did the logic for swapping out the placeholders. It works, but I think it still needs some refinement.

Week 2

In the second week I managed to get the asset panels loaded and populating a UI container. There was another big cleanup operation: I needed to get scroll bars fixed up and switch my window class to use the UIComponentContainer for storing all the objects parented to it. This wasn't too bad of a change; it was mostly copy and paste with a few little fix ups. The scroll bar, on the other hand...

I spent the majority of the week working on getting this damn scroll bar working. The funny thing is it's still not working properly. The big problem was that I was trying to debug something that was a few layers deep. Getting all the math working properly was just turning out to be a huge pain. I managed to get something kind of working and that's going to have to do for now. The way it works is that it's designed to just move the components in the container around. It seems easy enough, but for some reason I just couldn't get the math right for the life of me. I think it's one of those silly things where if I sit down and draw it out on paper it's going to become painfully obvious where the problem is coming from.

I also spent a little bit of time on cleanup. I cleaned up how I was calculating a lot of my element positions so that it's more data driven. This will make reskinning my UI elements a breeze. As it stood before, I would have had to adjust a lot of math, but now it should just work.

WIP shot while I was getting the dragging working.

Showing off a few of the different UI elements.

First shot of the Asset Viewer working.

Adjusted the dimensions a bit to something that fit better on screen.

Showing off the scaled down UI. It makes better use of space but kind of loses the pixellated look I liked.

Wednesday, August 1, 2012

Fun In The Land Of UI

I spent the last 2 weeks working on a 2 week sprint to try and get the beginnings of a UI editor in place.

Week 1

The first step was cleaning up all the UI Elements I'd accrued over the last year or so. The big problem was that as my codebase evolved I didn't want to spend the time to go back and clean up old systems. On one hand this worked to my advantage because I was able to iterate much faster. On the other hand it left me with a lot of cruft in my codebase.

The first thing I wanted to clean up was the UIClickable. Maybe I should go into a bit of background before I get into the details, though. Almost every UIComponent is made up of 4 core sprockets: TransformSprocket, UIDimensions, UIClickable, and UIRenderable.

TransformSprocket: The transform sprocket is a common sprocket that just wraps a matrix; I use it in any component that needs a transform of some kind.

UIDimensions: This guy contains a width and height and is used to calculate the bounding box of a UI element.

UIClickable: Things get a little messy here (bad naming choice). This guy basically takes care of any input functions a UI component might need; currently these are mouse events and keyboard events.

UIRenderable: I haven't been a fan of this guy from the start, but basically any UI element that can be displayed has one of these and gets linked up through the IO system with a render callback which handles any necessary rendering.

Ok, so where was I? Ah yes, the UIClickable. Previously I had an old message passing system which packed any necessary data into a message class, so any function which took a message would have to pull out the data it needed from the Message object. The problem with this, of course, is that the person implementing the function doesn't actually have any idea what's in the message, since it's really just a blob of data. Over the course of the project this evolved into my Energon IO system, which takes strongly typed messages so you know exactly what you're getting. The problem, though, was that I still had a lot of the old message handling going on and UIClickable had paths for both. So the first step was ripping out all the old message handling and cleaning up any of the UI elements that were still using that code path.

While I was cleaning up the code paths, another problem I had with the UIClickable was that all of my coordinates for an object were based on the parent's coordinate system; so for a button at 0,0 that lived in a window, the 0,0 would refer to the center of the window. To accomplish this I had put all of the code in the mouse event callbacks of every UI element, which meant a lot of duplicated code, and in some cases the math was slightly different than in other elements.

To clean this up I moved all of the conversion-to-local-space calculations into the UIClickable. Since every sprocket can access its parent, and from there the TransformSprocket and UIDimensions, it wasn't much work to refactor all this code. The downside was that not only did I have to rip out all the old code paths in the UI elements, I also had to refactor the current code paths to take into account that this was now being calculated inside the UIClickable.

Another thing I put in that I didn't have before was mouse over events. I think this gives a really nice touch to the UI; everything feels more alive when there's some kind of texture change or effect on mouse over. This led me to rework my buttons a little more. As it stood, the buttons had support for textures on different states, i.e. mouse over, mouse down, idle. However, I hadn't actually hooked those textures up to the UISprite which is part of the button. So I set about adding a simple state machine for the button and setting up the sprite to take a different texture depending on the state the button was in. I set up some safeguards so that if a texture for a state isn't set in the create params it will just use the idle texture, and of course if there isn't an idle texture set it will assert.
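The texture safeguards described above might look something like this sketch; the state names match the post, everything else is assumed:

```cpp
#include <cassert>
#include <map>
#include <string>

enum class ButtonState { Idle, MouseOver, MouseDown };

struct ButtonTextures {
    // Per-state textures from the create params; states may be missing.
    std::map<ButtonState, std::string> textures;

    const std::string& Select(ButtonState state) const {
        auto it = textures.find(state);
        if (it != textures.end()) return it->second;
        // Safeguard: no texture for this state, fall back to idle...
        auto idle = textures.find(ButtonState::Idle);
        // ...and assert if there isn't even an idle texture set.
        assert(idle != textures.end() && "button needs at least an idle texture");
        return idle->second;
    }
};
```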

At this point I took a little break from the UI and had a look at revamping some of the code I had in place for hot swapping assets. What I had in place previously was a system which allowed you to specify a directory and a callback and if any files in that directory changed the callback would be called with the name of the file allowing you to perform any needed code to reload the asset.

This didn't really work very well for assets that were nested within multiple directories, or when multiple asset types lived in one folder. What I decided to do instead was change my FileWatcher task to register with the AssetInventory callbacks based on asset types. In this way there was a callback for Textures, Shaders, VoxelSprites, etc. I set up callbacks for all the main asset types I'm currently loading and plan on expanding it down the road with user-specified type callbacks. So now it queries the AssetInventory every so often for the loaded assets. It then calculates an initial timestamp for any new assets. If the asset is already registered with the watcher, it checks to see if the timestamp has changed. If the timestamp has changed then the asset is reloaded and swapped out in the AssetInventory. This way it's seamless to all the other systems, since all assets in the engine are passed around as handles which point back to the AssetInventory.
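The timestamp-polling part of that loop can be sketched like this; the names are assumptions and the timestamp is simplified to an integer:

```cpp
#include <map>
#include <string>

// Hypothetical sketch of the FileWatcher's change detection: track a
// timestamp per asset and report which assets need a reload.
struct FileWatcher {
    std::map<std::string, long long> lastSeen; // asset name -> timestamp

    // Returns true when the asset's timestamp moved, i.e. it changed on
    // disk and should be reloaded and swapped in the AssetInventory.
    bool CheckAsset(const std::string& name, long long timestamp) {
        auto it = lastSeen.find(name);
        if (it == lastSeen.end()) {
            lastSeen[name] = timestamp; // new asset: record initial stamp
            return false;
        }
        if (it->second != timestamp) {
            it->second = timestamp;     // changed: trigger a reload
            return true;
        }
        return false;
    }
};
```

Because everything else only holds handles into the AssetInventory, swapping the reloaded asset behind the handle is invisible to the rest of the engine.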

At least... all assets should be handles. The reality of it was that I had written a lot of my code that works with textures before the AssetInventory was in place. So this meant refactoring a lot of my material system to work with handles to textures instead of pointers. While there's a bit of a hit dereferencing the handles I think it's worth it in the long run. It'll help a lot when I get into streaming textures, so that I'll be able to seamlessly load in different mip levels by just swapping out textures in the AssetInventory.

Finally, I started fleshing out the UIEditor. I set up a new game state that's selectable from the main screen. I also added in some of the components I had in the level editor: a protoform type list box for selecting what type of sprocket I'd like to create, and a protoform property window, which allows editing of the create params for a sprocket type. I also started to ponder how I would handle hooking up these UI screens I create to actual gameplay logic. I settled on a solution where I would create an interface sprocket containing a number of inputs and outputs that I could link up in my IO Editor component to handle most of the logic that would be necessary for the screen. I also pondered setting up a scripting language to use with the front end. I'd like to get a scripting language in eventually for gameplay, so I thought it might be as good a time as any. I put that on the back burner for now though, because it comes with its own set of problems that I'm sure will keep me busy for a few weeks.

Week 2

While I had added the protoform property window in the previous week, with all my changes to the UI nothing worked on it. It didn't even render! So I spent most of the day getting all the UI sprockets it uses working again. It was mostly easy work, just a lot of ripping out code and waiting for builds.

The next thing I wanted to get going was a gizmo for quick editing the UI sprocket. My plan was basically a border around the object with 3 buttons in 3 of the corners. A button for translation, rotation and scale. Clicking and dragging on the appropriate button would perform that transformation. I was able to get a button spawning but it's a bit of a hack for right now and currently the gizmo isn't hooked up to anything.

I also began pondering what my next sprint would be. I think I'd like to spend one more week fleshing out the basics of the UI Editor and then start working on an Animation Editor. It's kind of a big jump but I'd really like to get going on gameplay mechanics again and one of the things that was holding me back was getting in some placeholder animations. I think an animation editor would help get me going again.

Unfortunately this week was mostly all about the builds. I was changing a lot of core headers, which caused 15-20 minute builds on this little netbook; that's a good chunk of my commute! I ran into a lot of problems with my protoforms, which are essentially an XML tree of variable, input, and output definitions with state stored alongside them. I use protoforms as a way of serializing objects in and out, as well as editing the create params for a sprocket before I actually spawn it. All of this is done via a few handy macros that do some code generation to create functions that the protoforms query to get reflection data on all of the create params, inputs, and outputs an object can have.

I also started cleaning up another element of my UI Sprockets. Every UI Sprocket took in an FEContext as a create param. The FEContext had pointers to all the necessary systems that the sprocket would need to create itself: things like memoryAllocators, renderContext, fontSystem, assetInventory. The thing is, though, that I added a ConstructionContext that gets automatically stored with every create param, and this already has all the necessary systems in it. I added that a while after I started building my first set of UI Sprockets, so none of them were really taking advantage of the ConstructionContext.

I spent a lot of time at the end of the week thinking about refactoring how the protoforms currently work. One thing that really bugs me about them is that they encompass both the reflection data and the state data. I'd like to separate them so each class only has one set of reflection data, with the actual state for each of those pieces of data stored separately in the protoform as a tree of state data. For example, I have a VariableInfo; it stores things like the name, the type, the data offset from the beginning of the create params, the size, etc. However, it also stores the actual state of the variable inside an object that manages converting to and from a string representation of the variable. I'd like to make these two concepts very separate. It'd be a big overhaul of the system as it stands now, though, so for now I'll just leave it how it is.

So that brings us up to date. I'm currently working on a one week sprint that cleans up some of the leftover things like addressing the gizmo a bit more in depth, saving/loading screens and properly handling multiple UI objects. I also need to address properly parenting a child UI Sprocket to a parent UI Sprocket, but that's all for another week.

Before Mouse Over

 After Mouse Over

First shot of my gizmo for editing. I know it looks awful; it's just placeholder.

 Showing a spawned button

 Illustrating a pressed button on the gizmo, when dragging that button the width and height of the UI Element gets changed.
 A bunch of spawned buttons of different sizes.

Thursday, July 12, 2012

And Now For Something Completely Different

Hey Readers,

I have something totally different for you this week. To give you some background, my other hobby is costuming, and on top of that it's mostly armor-based costuming. I build things like Transformers, Iron Man, that sort of thing. The piece of software I use to do this is called Pepakura. It allows a user to "unfold" a 3D model into 2 dimensions so that it can be printed out on cardstock and re-assembled. The problem is that it's been designed for building little papercraft models, not costuming. This results in a lot of little things I don't like about the software. Rather than just accept those things, I started planning out a better piece of software designed with costuming in mind.

The problem with this, though, is that even if I built this piece of software, no one is going to use it because all of the existing content is built for Pepakura. So I decided the number one thing I needed was an importer that could load pdo (the Pepakura file format) files into my software. The problem with doing this is that pdo files are a closed format, and besides one hacked-together Python script I found that could import parts of the 3D data, there wasn't really any information about the format out in the wild.

So I set about reverse engineering the format and started building up a spec of how it was laid out in memory. I've never done anything like this before, so it was a slippery slope trying to figure out where to begin. I've done lots of work with binary files, though, so I have a pretty solid grounding in how a lot of different formats are laid out, and that gave me a bit of a base to start from. I started with a simple box model and just started poking and prodding in a hex editor. Change a vertex, re-export, see what changed. Change a color, re-export, see what changed. Slowly but surely a picture started to emerge. Within a few days I had a clear picture of what the 3D data looked like in memory. One thing that really held me up was that I expected all of the floating point data to be stored as floats, but they decided to store it as doubles. I can't imagine why they would think they'd need that level of precision; were they expecting someone to unfold an earth-sized object? Another thing that held me up a bit was that the strings are encoded. I saw patterns that looked like null-terminated strings: a string length, then a set of characters ending with a 0. When I looked at the data, though, the strings were gibberish. With a bit of fiddling around I figured out that if I subtracted a constant from each element of the string I could decode it into the original string value. It turned out that this constant changes from file to file, but after a quick search of the header I found that one of the blocks of data I was unsure about stored the decoder value.
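The decoding step is simple enough to show directly; the function name is mine, but the subtract-a-constant scheme is exactly what's described above:

```cpp
#include <string>

// Decode a pdo string by subtracting the per-file decoder constant from
// every character (the constant is stored in the file header).
inline std::string DecodePdoString(const std::string& encoded,
                                   unsigned char decoder) {
    std::string out;
    out.reserve(encoded.size());
    for (unsigned char c : encoded)
        out.push_back(static_cast<char>(c - decoder));
    return out;
}
```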

That was about as far as I got a few months ago; I got distracted by other things and there it lay dormant. A couple weeks ago, though, I got the itch again. So for the last couple weeks I've been poking and prodding at the data structures and I finally have an almost complete picture of how everything is stored. I've also set up a pdo viewer option in my project for viewing pdo files. Now the next big question for me is: do I go back to my voxel earth project, or do I put some love into the costume designer project I planned out so many months ago?

Anyways, I've compiled how things are laid out in memory below. Everything that's important for rendering a pdo is there; there are a few odd-and-end flags that haven't been decoded yet. If anyone has any questions or feedback on what some of those unknown fields do, please leave a comment.

Apologies if some of the formatting is messed up; it seems to be a problem with Blogger. Also, the comment for each piece of data is directly underneath that piece of data.
    This is the main data structure. The pdo file is made up of a header,
    followed by all the object data, then a list of materials that are applied to
    those objects, the unfold patterns, then text and images that are placed in
    the document, and finally the settings.
    PDOHeader                header
    ObjectCollection         objectCollection
    MaterialCollection       materialCollection
    PatternCollection        patternCollection
    TextCollection           textCollection
    ImageCollection          imageCollection
    Settings                 settings

    u8 version[9]
    version info, "version 3" for pdo 3 files.
    u8 moreData[13]
    StringData applicationId
    I think this is an application id, in most files it seems to be "Pepakura
    Designer 3".
    u32 decoder
    This is stored as a u32 but is a char that when subtracted from each element
    in a string decodes the string.
    u32 unknown
    u8 moreData[62]

    u32 strLength
    number of characters contained in the string
    u8 nullTerminatedString[]
    array of characters
    double x
    x position of the collection of objects
    double y
    y position of the collection of objects
    double width
    width of the collection
    double height
    height of the collection
    u32 numObjects
    number of objects to read
    ObjectData dataArray[]
    array of the data objects
    u32 numMaterials
    number of materials
    MaterialData materialArray[]
    array of the material objects

    u8 unknown
    double unknown
    unknown, maybe a scale value
    u8 unknown
    double x
    x position of the collection of patterns
    double y
    y position of the collection of patterns
    double width
    width of the collection of patterns
    double height
    height of the collection of patterns
    u32 numPatterns
    number of patterns stored in the array
    PatternData patternArray[]
    collection of pattern data

TextCollection
    u32 numTextData
    number of text objects
    TextData textArray[]
    collection of text objects

ImageCollection
    u32 numImages
    number of image objects
    ImageData imageArray[]
    collection of image objects

ImageData
    double x
    x position of the image
    double y
    y position of the image
    double width
    width of the image
    double height
    height of the image
    BitmapData bmpData
    stores the actual texture data

Settings
    These are all of the common settings for the application. Oddly, it seems
    to be a mish-mash of which options are saved and which aren't. Any that I
    haven't decoded yet are marked as unknown.
    u32 unknown
    u8 showFlaps
    Whether or not all flaps are visible
    u8 showEdgeId
    Turn on/off viewing of the edge id's
    u8 unknown
    Unknown flag.
    u8 isTextureOn
    Turn textures on/off
    u8 useAngleThreshold
    Use the angle threshold to hide edges whose fold angles are below the given
    threshold
    u8 angleThreshold
    The angle threshold for hiding edges, goes from 0 to 180
    u8 unknown
    Unknown flag
    u8 unknown
    Unknown flag
    u8 unknown
    Unknown flag
    u8 doDrawWhiteLine
    If set, a white line is drawn under the folds.
    u32 mountainMode
    0 if it's solid, 1 if it's off, and I think 2 if it's a pattern.
    u32 valleyMode
    0 if it's solid, 1 if it's off, and I think 2 if it's a pattern.
    u32 cutLineMode
    0 if it's solid, 1 if it's off
    u32 unknown
    unknown flag
    u32 pageType
    The paper type. It's an enum I haven't mapped out yet, but it matches how
    it's listed in the combo box in the print options.
    u32 isLandscape
    Should it print out landscape.
    u32 sideMargin
    Sets how big the side margins are.
    u32 topMargin
    Sets how big the top/bottom margins are.
    double mountainPattern[6]
    If the mountain mode has been set to 2 each number dictates how far to
    draw/skip the pattern in the lines.
    double valleyPattern[6]
    If the valley mode has been set to 2 each number dictates how far to
    draw/skip the pattern in the lines.
    u8 addOutlinePadding
    Add padding to how the texture is laid out across the folds.
    double globalScale
    The global scale for the file.
    StringData comment
    The comment string set in the options.
    StringData author
    The author string set in the options.
    u32 unknown
    Unknown flag.
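The mountainPattern/valleyPattern arrays read like dash patterns. Here's a hedged sketch of how the six draw/skip distances might be expanded into drawn intervals along a fold line; the alternating draw/skip interpretation is my guess, and `dash_segments` is my own name:

```python
def dash_segments(pattern, total_length):
    # Expand a repeating draw/skip pattern (like mountainPattern or
    # valleyPattern) into (start, end) drawn intervals along a line.
    segments = []
    pos = 0.0
    i = 0
    drawing = True
    while pos < total_length:
        step = pattern[i % len(pattern)]
        if step <= 0:
            break  # avoid looping forever on a degenerate pattern
        if drawing:
            segments.append((pos, min(pos + step, total_length)))
        pos += step
        drawing = not drawing
        i += 1
    return segments
```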

ObjectData
    This is a collection of data that describes the 3d object.
    StringData name
    Name of the object
    VertexCollection vertices
    Collection of vertices that make up the object
    FaceCollection faces
    Collection of faces that make up the object
    EdgeCollection edges
    Collection of edges that make up the object

VertexCollection
    u8 unknown
    u32 numVertices
    Number of vertices in data array
    VertexData vertexArray[]
    Array of the vertex data.

FaceCollection
    u32 numFaces
    Number of faces in data array.
    FaceData faceArray[]
    Array of the face data

EdgeCollection
    u32 numEdges
    The number of edges in data array
    EdgeData edgeArray[]
    Array of the edge data.

VertexData
    The vertex data stores the 3d position for the original 3d model. I'm
    surprised they didn't stick more in here, like normals or uvs, but I guess
    it keeps the data format simple.
    double x
    x position in 3d space
    double y
    y position in 3d space
    double z
    z position in 3d space
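Reading the vertex array is then a straightforward unpack. A minimal sketch, assuming little-endian doubles and the count-then-array layout above (the function name is mine):

```python
import struct

def read_vertex_collection(buf, offset):
    # u32 numVertices, then numVertices * (double x, double y, double z).
    (count,) = struct.unpack_from("<I", buf, offset)
    offset += 4
    verts = []
    for _ in range(count):
        verts.append(struct.unpack_from("<3d", buf, offset))
        offset += 24  # 3 doubles, 8 bytes each
    return verts, offset
```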

FaceData
    Face data, these are the faces that make up the polys of the 3d model and
    provide information for drawing the 2d unfolds. They allow for an arbitrary
    number of indices, so a poly can have any number of points, though in
    practice I've only seen 3 and 4: tris and quads.
    u32 materialId
    The index into the material collection which represents what material this
    face has on it.
    u32 unknown
    double normal.x
    The x component of the normal vector for this face
    double normal.y
    The y component of the normal vector for this face
    double normal.z
    The z component of the normal vector for this face
    double unknown
    u32 numIndices
    The number of indices this face has, which tells how many points are in the
    poly.
    FaceIndexData indexArray[]
    The array of index information.

EdgeData
    Edge data, I'm not really sure what this is useful for; all the data is
    better represented in the 2d unfold edge data. One useful piece of info is
    that there is one EdgeData for each edge in the 3d model. Since an edge is
    always shared by 2 faces, both faces are indexed in the data structure,
    which makes this chunk handy for quickly calculating the fold angle between
    the two faces.
    u32 faceIndex0
    The face index for this edge.
    u32 vertIndex0
    The vertex index inside the face for this edge.
    u32 faceIndex1
    The second face index for this edge.
    u32 vertIndex1
    The vertex index inside the second face for this edge.
    u16 isEdgeShared
    This is either 1 or 0; it's the same value as in the unfolded edge data
    u16 unknown
    u16 unknown
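Since both adjoining faces are indexed here, the fold angle of an edge can be computed from the two faces' stored normals. This is just standard vector math, not something the format prescribes:

```python
import math

def dihedral_angle(n0, n1):
    # Angle in degrees between two faces, from their normal vectors.
    dot = sum(a * b for a, b in zip(n0, n1))
    len0 = math.sqrt(sum(a * a for a in n0))
    len1 = math.sqrt(sum(b * b for b in n1))
    cos_theta = max(-1.0, min(1.0, dot / (len0 * len1)))  # clamp for acos
    return math.degrees(math.acos(cos_theta))
```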

FaceIndexData
    Faces are usually made up of 3 or 4 of these FaceIndexData objects. They
    store data for the 2d description of the unfolded face as well as the
    vertex index into the 3d data for rendering the 3d model.
    u32 vertexIndex
    The index into the 3d vertex data.
    double x
    The x position of the face index in the 2d view.
    double y
    The y position of the face index in the 2d view.
    double u
    The u texture coordinate for mapping the diffuse texture to the object.
    double v
    The v texture coordinate for mapping the diffuse texture to the object.
    u8 unknown
    double edgeSize
    I believe this is the distance of the edge made up of this face index to the
    following face index.
    double leftEdgeAngle
    The left angle of the fold for the edge made up of this face index to the
    next face index.
    double rightEdgeAngle
    The right angle of the fold for the edge made up of this face index to the
    next face index.
    u32 outerColor[3]
    The outer color of this edge if an edge color has been set.
    u32 innerColor[3]
    The inner color of this edge if an edge color has been set.
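If edgeSize really is the distance from this face index's 2d point to the next one, it can be sanity checked against the x/y fields. A trivial helper, assuming that interpretation:

```python
import math

def edge_length_2d(p0, p1):
    # Distance between two consecutive FaceIndexData 2d points; this is
    # what I believe the edgeSize field stores.
    return math.hypot(p1[0] - p0[0], p1[1] - p0[1])
```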

MaterialData
    Materials can be applied on a per object basis. They store information
    describing the color, texture, and, I believe but haven't fully tested,
    ambient and specular color values.
    StringData name
    The name of the material
    float colorData[20]
    A collection of color values stored as floats. Interesting side note: these
    are the only floating point values in the file actually stored as floats,
    as opposed to doubles, which seem to be the pdo standard for floating point
    numbers. I didn't spend much time investigating these values, but I believe
    they're the diffuse/ambient/specular color values, probably stored as 4
    floats each, rgba.
    u8 hasBitmap
    If this value is set to 1 then the material also has a diffuse texture stored
    as a bitmap
    BitmapData bmpData
    optional piece of data, if the previous value is 1 then this will be a bitmap
    representing the diffuse texture for the material

BitmapData
    Images in a pdo document have their data embedded in the file. These
    bitmaps are used in materials and also in images which can be placed around
    the document. The raw data is rgba, stored as a zlib-compressed chunk.

    u32 width
    The width of the bitmap in pixels.
    u32 height
    The height of the bitmap in pixels.
    u32 compressedDataLength
    The length of the stored compressed image data.
    u8 compressedDataArray[]
    The image data is compressed using zlib.
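A minimal sketch of reading a BitmapData chunk with Python's zlib module; the little-endian byte order and the function name are my own assumptions:

```python
import struct
import zlib

def read_bitmap_data(buf, offset):
    # BitmapData: u32 width, u32 height, u32 compressedDataLength,
    # followed by a zlib stream that inflates to raw rgba pixels.
    width, height, comp_len = struct.unpack_from("<3I", buf, offset)
    offset += 12
    pixels = zlib.decompress(buf[offset:offset + comp_len])
    assert len(pixels) == width * height * 4  # raw data is rgba
    return width, height, pixels, offset + comp_len
```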

PatternData
    Every unfolded object in the 2d window is contained in the file as a
    pattern. It's stored as a collection of edges that make up the structure of
    the unfold.
    u32 objectData
    This is the index of the object that the pattern belongs to. This can be
    referenced from the object collection
    double x
    x position of the pattern
    double y
    y position of the pattern
    double width
    width of the pattern
    double height
    height of the pattern
    StringData name
    Name of the pattern assigned in the editor
    u32 numEdges
    Number of unfolded edges in the unfold pattern
    UnfoldEdge edgeArray[]
    The array of edges

UnfoldEdge
    These objects represent an unfolded edge on the 2d area.
    u8 unknown
    u8 unknown
    u32 faceId0
    The edge data is strangely stored as an index into the 3d faces and then an
    index to a vertex in that face; the other endpoint of the edge is the next
    vertex in that face.
    u32 vertexId0
    The vertex index into the face that represents the starting point of the edge
    u8 isSharedEdge
    This is a boolean representing whether or not this edge is shared. AKA, the
    edge is not an outer edge on the pattern but an inner edge.
    u32 faceId1
    Optional, if isSharedEdge is 1, then the face that the shared edge belongs to
    u32 vertexId1
    Optional, if isSharedEdge is 1, then the vertex index into the shared edge's
    face that represents the starting point of the edge
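The face/vertex indirection above can be resolved into an actual 2d segment once the FaceIndexData points are loaded. A sketch using my own in-memory representation, with each face stored as a list of (x, y) points in FaceIndexData order:

```python
def unfold_edge_points(faces, face_id, vert_index):
    # The edge starts at the indexed 2d point and ends at the next
    # point in the same face, wrapping around at the end of the face.
    face = faces[face_id]
    return face[vert_index], face[(vert_index + 1) % len(face)]
```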

TextData
    These are the text objects that can be placed within the 2d area of the pdo
    double x
    x position of the text
    double y
    y position of the text
    double width
    width of the text
    double height
    height of the text
    double unknown   
    u8 color[4]
    color of the text, stored as 4 u8's rgba
    u32 fontSize
    the size of the font, aka point size
    StringData fontName
    name of the font
    u32 numStrings
    number of strings
    StringData stringArray[]
    This is a collection of strings which are stored in this text object

Model almost decoded.

There we go, model decoded

Hmm, those unfolds don't quite look right

Hmm the bounding boxes are looking right

Hey there we go, unfolds rendering correctly

Found the normals for the model!

Figured out which edges were the internal edges.