Bioshock Infinite

I stay pretty busy at DigiPen, especially a week out from our gold milestone.  Despite that, I still try to play SOME games, both to stay current with what’s going on in the industry and to stay sane in the face of a hectic work schedule.  After getting home from GDC, I promptly bought Tomb Raider (not quite finished, but expect a review soonish) and Bioshock Infinite, despite what a terrible idea that was for productivity.  I just completed Bioshock Infinite this weekend, and I felt pretty strongly about the ending, so I thought I’d write about it.  So, if you haven’t beaten the game yet, this is your warning.  SPOILERS AHEAD!

So, overall, I really liked the game.  I think it took the mechanics, themes, and ideas from the previous games and pushed them in a genuinely interesting direction.  I found Bioshock 2 to be decent, but a pretty lackluster sequel compared to the original.  It would have been easy for Infinite to be more of the same, but it wasn’t.  I felt more compelled by the story in this game than I did in the first Bioshock.  And despite being an FPS, Bioshock is really all about the story in my mind.  Which brings me to the ending…


I think this is going to be a pretty unpopular opinion, but I absolutely hated the very end of this game.  The idea of a multiverse of this game world was really intriguing, and I honestly didn’t fully see it coming that Elizabeth was Anna.  So, even up to the point that we’re wandering the infinite lighthouses, I was still hooked.  But then, out of nowhere, at the very end we get Butterfly Effect’ed.  Seriously?  We spend the whole game building a relationship with these characters, getting to know them, building empathy for them.  Irrational took a lot of care to put tons of emotion into Elizabeth’s facial expressions and to give her a range of unique interactions throughout the world to really flesh out her personality.  And while it sucks to have a scenario where one or both characters get killed tragically… that happens in good stories sometimes; it throws a wrench into the works.  But to just blink both characters, and this entire, rich universe, out of existence angers me so much.  And the worst part is that I would totally love to play another game in Columbia with Elizabeth in tow.

Maybe it’s for the best.  Rather than make a sequel to this game that falls flat like Bioshock 2, maybe we’ll get yet another great setting like Columbia and yet another great set of characters like Booker and Elizabeth.  Maybe.  I don’t know.  It feels a little like how I imagine those crazy people that are still angry at BioWare felt about the ending of Mass Effect 3.  Except that I’m not an insane zealot.  Irrational obviously knows what they’re doing most of the time.  Regardless, in the short term, the end of this game left me feeling really unsatisfied.  Maybe with more time to reflect I’ll change my opinion, but for right now, it’s an amazing game that ends with a whimper.

GoodGraphics25.png, More Particles, Gloss Maps, And Starting Cleanup

I spent the last week down in San Francisco for GDC, and the amount of knowledge gained from the various talks I attended is staggering.  I can’t wait for the videos and slides to get uploaded to the Vault and to start in earnest at trying to get some of those tricks and techniques integrated into my own graphics engine.  I’m thinking that a switch over to generous usage of Compute Shaders and Deferred Contexts should help me speed up performance quite nicely in my architecture.  Hopefully future posts will be able to show if that pans out as expected.

However, until that happens, back in the real world I’m finishing up feature implementation and tying up loose ends heading into gold submission for the game.  Since I’ve been back (and, honestly, a little bit while I was in San Francisco), I’ve been working at improving particle systems, working with artists to get whatever else we can into the engine, and handling lingering technical issues.  Let’s start with particles!  First, a video of my more recent efforts.

Work has progressed pretty steadily on particle systems since they first got implemented.  I’ve fixed a pretty bad heap corruption that the memory manager was hiding (decrement index 0 and then calculate distance from camera!), added actual camera distance calculation to make alpha blending work properly, billboarded the sprites, written some extra particle operators to handle special-case functionality we want, and then spent a huge amount of time just tweaking values to find really good effects.  It’s the kind of thing that, given time, makes me want to write a genetic algorithm to find good effects of a certain type.  You know, rather than spending hours and hours tinkering until I get a good fire, I could just go to sleep and wake up to some good options.  Dreams, right?  But the fire is really a test case and not something that’s necessarily planned to be in the game, so let’s talk about something that is.  Here’s another video!
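For the curious, the camera distance piece boils down to something like the sketch below. This is a minimal illustration with hypothetical names, not the engine’s actual code: alpha-blended particles have to be drawn back to front, so each frame you compute every particle’s (squared) distance from the camera and sort descending on it before building the vertex buffer.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical minimal particle record; names are illustrative only.
struct Particle {
    float x, y, z;    // world position
    float camDistSq;  // squared distance to camera, refreshed each frame
};

// Alpha blending is order-dependent, so sort farthest-first before drawing.
void SortForAlphaBlend(std::vector<Particle>& particles,
                       float camX, float camY, float camZ) {
    for (Particle& p : particles) {
        const float dx = p.x - camX, dy = p.y - camY, dz = p.z - camZ;
        // Squared distance is enough for ordering; no sqrt needed.
        p.camDistSq = dx * dx + dy * dy + dz * dz;
    }
    std::sort(particles.begin(), particles.end(),
              [](const Particle& a, const Particle& b) {
                  return a.camDistSq > b.camDistSq;  // back to front
              });
}
```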

One way we’re looking to add player feedback to the game is to use particle systems to “drain” power from power nodes and then to “push” that power into mechanics objects.  So, Hayden and I wrote an operator that takes in a variable control point and has particles update their velocities each frame to direct them toward that point.  The inspiration here is Team Fortress 2’s medic gun.  While we still have some work to do to tighten up the player tracking and to get a nice curve on it like TF2, I still think it’s already looking pretty decent.
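The core of such an operator can be sketched as below. This is a guess at the shape of it, not the actual operator: each frame, lerp the particle’s velocity toward a vector pointing at the control point, where the hypothetical `blend` parameter controls how hard the particles home in.

```cpp
#include <cassert>
#include <cmath>

// Illustrative types; the real engine's math library differs.
struct Vec3 { float x, y, z; };

// Bend a particle's velocity toward a moving control point. `speed` is the
// desired travel speed, `blend` in [0,1] is the per-frame homing strength
// (both are assumptions for this sketch).
void SteerTowardPoint(Vec3& vel, const Vec3& pos, const Vec3& target,
                      float speed, float blend) {
    // Direction from the particle to the control point.
    Vec3 dir = { target.x - pos.x, target.y - pos.y, target.z - pos.z };
    const float len = std::sqrt(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z);
    if (len < 1e-6f) return;  // already at the target
    dir.x /= len; dir.y /= len; dir.z /= len;

    // Lerp current velocity toward the desired velocity (dir * speed).
    vel.x += (dir.x * speed - vel.x) * blend;
    vel.y += (dir.y * speed - vel.y) * blend;
    vel.z += (dir.z * speed - vel.z) * blend;
}
```

A small `blend` gives the lazy swooping arc of the TF2 medic beam; `blend = 1` snaps particles straight onto the target every frame.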

The last “big” thing I’ve worked on since the last update is gloss maps.  My artists have wanted them for a while, and I finally worked with them to make it happen.  And it was pretty easy, too.  The idea here is to have an additional map that contains the specular exponent (the ns in the specular equation) per texel.  As a result, a single material can have variable shininess across its surface, which is pretty cool.  I had actually separated the depth texture from the normals texture in the pre-light accumulation stage a while ago, so that the scene normals could be stored in an RGBA8_UNORM and the depth in an R32_FLOAT, which made it easy to integrate the gloss maps.  Since they only store a single value per pixel, I was able to stuff them into the alpha channel of the normal maps, change 3 lines in my shaders, and everything worked.  Pretty simple!
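In shader terms the change is tiny. Here’s a CPU-side sketch of the math (in the engine this lives in the light accumulation pixel shader): the [0,1] gloss value sampled from the alpha channel gets remapped to a usable exponent range and fed straight into the Phong specular term. The remap range is an assumption for illustration, not the engine’s actual values.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// glossSample: [0,1] value read from the normal buffer's alpha channel.
// nDotH: clamped N.H (Blinn-Phong half-vector dot product).
// The 1..128 exponent range is a stand-in; tune to taste.
float SpecularTerm(float nDotH, float glossSample) {
    const float minExp = 1.0f, maxExp = 128.0f;
    const float ns = minExp + glossSample * (maxExp - minExp);
    return std::pow(std::max(nDotH, 0.0f), ns);
}
```

With gloss 0 the highlight is broad and dull (exponent 1); with gloss 1 it tightens into a sharp hot spot (exponent 128), all within one material.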

Now I just need to implement texture animation (I know I mentioned I’d already have it finished, but expect it tomorrow?) and then I’m on to writing detection routines for things like available resolutions, MSAA levels, etc., to finish outstanding TCRs.  You know, stuff I should have done a long time ago.  But I did also get 3D positional sound working in the engine this week (have I mentioned I also do the audio programming for this project?), so that was pretty exciting.  And maybe makes up for it?  I’m not sure.

Anyway, that’s everything that’s been happening since the last update.  Oh, and you’ll notice in the second video and the screen capture that Max has finished his gravity flipping mechanic.  So, there should be some cool levels utilizing it soon.  While I plan to continue posting as I make progress on my graphics engine, I’m concerned that the content from here to the end of the semester won’t be too exciting.  Texture animation and bug fixing?  It needs to be done, but it isn’t very flashy.  Maybe I’ll find time to slide in some extra post processing.  We’ll see!

GoodGraphics24.png, Particles, Particles, PARTICLEZ!

It seems like I just made one of these, huh?  And while this is just a first pass at the system, running on programmer art, I’m excited enough to post about it.  After that, hopefully this post isn’t a huge letdown.

So, particle effect systems!  I put them off all year because I just had so much on my plate, and all of it seemed super important and core to just getting the game to display.  And a good portion of it was features that have since been deprecated due to changes in design direction.  Not that I’m mad; it’s the nature of the beast here at DigiPen and I accept that.  But now that we’re in the home stretch, it’s become a real crunch to finally get particles done for the huge polish factor they can add.  And after about a week at it (although I had to spend a fair amount of that week tracking down what turned out to be two major buffer underruns that were causing huge stability issues), I finally have it working and, I think, good enough to show people.  So, here you go!
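For anyone who hasn’t written one: the skeleton of a system like this is a fixed-capacity pool with an emit/update/kill loop. The sketch below is a bare-bones stand-in (the real system is operator-driven and far more elaborate), but it shows the lifecycle, and its `Emit` comment hints at exactly the kind of out-of-bounds write that causes the stability issues mentioned above.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Minimal particle record; illustrative only.
struct SimpleParticle {
    float pos[3];
    float vel[3];
    float life;  // seconds remaining; <= 0 means dead
};

class ParticlePool {
public:
    explicit ParticlePool(std::size_t capacity) { particles_.reserve(capacity); }

    void Emit(const SimpleParticle& p) {
        // Guarding against capacity matters: writing past a fixed buffer is
        // exactly the kind of underrun/overrun that silently corrupts the heap.
        if (particles_.size() < particles_.capacity())
            particles_.push_back(p);
    }

    void Update(float dt) {
        for (std::size_t i = 0; i < particles_.size(); ) {
            SimpleParticle& p = particles_[i];
            p.life -= dt;
            if (p.life <= 0.0f) {
                // Swap-and-pop keeps the pool dense without shifting elements.
                particles_[i] = particles_.back();
                particles_.pop_back();
            } else {
                for (int a = 0; a < 3; ++a) p.pos[a] += p.vel[a] * dt;
                ++i;
            }
        }
    }

    std::size_t AliveCount() const { return particles_.size(); }

private:
    std::vector<SimpleParticle> particles_;
};
```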

I also need to give a huge shout out to Hayden Jackson for all the help he gave me while developing this.  The insight, feedback, and source code was absolutely invaluable, and I never would have designed a system that was nearly as elegant in the given time frame.  So, thanks dude!

Next up is finally integrating my spritesheet animation system from last year, and that might unfortunately be the last graphical feature I have time to add before I need to clean up loose ends and fix outstanding technical requirements before submission.  We’ll see, as I’d really like to find the time to add HDR and SSAO, but I’m trying to be realistic here.  Either way, I’m off to GDC next week and I’m glad I got particles in before that or else I’d have spent all week being driven crazy by it.  Anyone else that’s going, feel free to hit me up for grabbing some drinks!  Otherwise, expect more posts on getting prepped for final submission after I get back.

Edit:  I realized that a screen capture is a terrible way to show off a particle system, so I took this short video of it in action.  Enjoy!

GoodGraphics23.png, More Lighting, More Transparency, And Performance

So, a lot of backend updates have been happening in the last month, but the visual part of this update actually happened the day after the last GoodGraphics post.  But school’s been busy (what’s new?).  And then I got Premake and FXC working and wanted to post about that.  And then school’s been busy…  it isn’t like it’s a pattern or anything.

The visual part is that after talking with designers and artists, the simplest solution to dealing with transparency in the new lighting system was to just not illuminate transparent objects at all.  While it would be nice to have light partially “hit” and partially go through transparent objects, it just isn’t feasible for me to do it in a way that both looks good and keeps reasonable performance at this point in the year.  So, instead I just exclude transparent objects from the depth buffer used by the light accumulator.  The loss of not illuminating transparent objects is pretty minor compared to the gain of seeing light sources through them.

One of the major backend changes I’ve made is to start using instanced drawing.  I had originally, and incorrectly, assumed that I could write a very optimized material system to generate the fewest possible state changes and data transfers, couple that with the reduced draw call overhead since DirectX 9, and be alright.  Turns out that while draw call overhead has been reduced, it is still pretty significant, and making a separate draw call for every object takes a toll.  As a result, I’ve switched my lighting over to using DrawIndexedInstanced and it has made a huge difference.  A lot of our levels can now handle the entire light accumulation pass in a single draw call, and frame times have been cut in half.  So, I’m pretty stoked about that.
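The shape of that change looks roughly like this. The struct layout and names are assumptions for illustration; the only real requirement is that the per-light data matches the instance input layout declared for the light volume’s vertex shader, and that all visible lights get packed into one buffer so a single `DrawIndexedInstanced` replaces one draw call per light.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Per-instance data for one point light; 32 bytes, matching a hypothetical
// instance input layout of two float4 elements.
struct PointLightInstance {
    float posRadius[4];  // xyz = world position, w = volume radius
    float color[4];      // rgb = color, a = intensity
};

// Pack every visible light into one contiguous instance buffer. The returned
// count is what you'd pass as InstanceCount to the single instanced call, e.g.
//
//   context->DrawIndexedInstanced(sphereIndexCount,
//                                 (UINT)count, 0, 0, 0);
//
// instead of issuing one Draw per light.
std::size_t BuildLightInstances(const std::vector<PointLightInstance>& visible,
                                std::vector<PointLightInstance>& instanceData) {
    instanceData.assign(visible.begin(), visible.end());
    return instanceData.size();
}
```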

Beyond that, I’ve been working to move as much calculation from pixel shaders to vertex shaders to cut down overall instruction counts without sacrificing visual fidelity, and it’s been pretty successful.  The only major outstanding calculation left in a pixel shader that I think I can move is the inverse view projection matrix multiplication in the light accumulation shader that uses depth to recreate pixel position in world space.  And I think the information in this thread has everything I need to solve that.  Here’s hoping!
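For reference, the per-pixel calculation in question is:

```latex
% World-space position from a sampled depth d and the pixel's NDC coordinates:
\tilde{p} = M_{VP}^{-1}
\begin{pmatrix} x_{ndc} \\ y_{ndc} \\ d \\ 1 \end{pmatrix},
\qquad
p_{world} = \frac{\tilde{p}_{xyz}}{\tilde{p}_w}
```

A common way to hoist this out of the pixel shader (I can’t confirm it’s the exact approach that thread describes) is to compute a view-frustum ray per vertex of the full-screen or light-volume geometry, let the rasterizer interpolate it, and reduce the per-pixel work to `eye + ray * linearDepth`, trading a full matrix multiply for a single multiply-add.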

So, that’s it.  I’ve nearly got our particle system working (finally), so hopefully there’ll be a new post up soon about that.  And if all goes well, it should be a two for one with texture animation as well (which I wrote last year, but still haven’t gotten around to integrating into this engine yet).  I’d really like to get it implemented and post about it before heading off to GDC next week, but you know, school.  We’ll see what happens.

Premake, FXC, And You

When I joined Team Fizz over last summer, I was introduced to the awesomeness that is Premake.  Shortly after that, this post on Bobby Anguelov’s blog introduced me to FXC and everything awesome that it could bring to the table.  I instantly wanted to use it.  The problem then became: how could I get Premake to generate projects and include the necessary custom build steps for my shader files to be able to utilize FXC?  I figured this would be a pretty common issue for people, and that it would either be intuitive to set up or there would be a lot of information online on how to make it happen.  I was completely wrong on both counts, and I wasn’t able to make this work until just last night.

Semi-recently, Premake added support for two features that I felt could potentially solve my issue.  The buildrule function was supposed to let me attach a custom build rule to files via a configuration declaration, and this sounded like exactly the solution I was looking for.  I could just have the shader files in my project, the configuration declaration could match files by extension to apply the build rule to, and everything would work.  I’m not sure if we just didn’t set it up properly (the documentation for this feature isn’t great) or if it just doesn’t work right, but it didn’t do what I needed it to.

The second option was project reference through the external function.  While less optimal, the plan was to manually create a project for my shaders, set the custom build tool on each shader file to use FXC, and then have Premake just add this project into the sln when it creates and links in all other projects.  This is what ended up working for me, so I thought I would document what I did in case anyone else runs into this issue.  I know I would have appreciated it about 6 months ago.

In your build script, assuming a current enough version of Premake, you can use the external command to reference a pre-existing project into the sln being made.  Just slide the command in anywhere that you would add “project” for a project that you wanted Premake to autogenerate, and make sure that you add the name to the startup project’s links so that the proper build dependency is made.  Here is what mine looks like:
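The snippet itself didn’t survive here, so below is a reconstruction pieced together from the field values described in the next paragraph. Treat it as a sketch: the exact syntax varies by Premake version (in current premake5 the command is spelled `externalproject`).

```lua
-- Reference a pre-existing vcxproj instead of generating one.
external "Shader"
    location "Shader"                                  -- folder at the sln root
    uuid "7CF9442C-12F8-4675-9CE2-D54FEC4C31D0"        -- ProjectGuid from the vcxproj
    kind "StaticLib"
```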

Shader is the name of my vcxproj, which is what comes after external.  It’s also in a folder named Shader at the root of my sln directory, which is what goes in location.  Setting the kind to StaticLib (and making sure you actually set the project to match) might not be necessary, but it fit the rest of our engine’s architecture.  The uuid can be found by opening your vcxproj in Notepad and looking for this:

  <PropertyGroup Label="Globals">
    <ProjectGuid>{7CF9442C-12F8-4675-9CE2-D54FEC4C31D0}</ProjectGuid>
    <RootNamespace>Shader</RootNamespace>
  </PropertyGroup>


Once you have the project made and set up, and your build script ready, you just need to set your shader files’ properties to use FXC.  The link to Bobby’s blog from earlier in this post details the process completely, so I won’t repeat it here.

So, there you have it.  I won’t say this is the best way to accomplish what I wanted, or even necessarily a good way, but it’s a way and it works for me.  Hopefully if anyone finds themselves in the situation I was in, it will be helpful.  And if anyone has a better way to solve this, please let me know; I would love to get that kind of feedback and improve things.