A Bit of Work History

I graduated from the University of Waterloo in Computer Science, Combinatorics and Optimization (double major).  My work terms were with Telesis North, Corel, and Microsoft.

The work terms weren’t limited to just learning development.  My second term with Telesis North went south due to financial troubles at the company, and with help from the co-op department, I secured the remainder of the term (and the next one) with Corel.

Corel was interesting because I was limited to using Visual Test for writing the software that verified the printer and vector filters.  The automated tests, however, would eventually cause CorelDraw! to crash.  To resolve this, I created a DLL, loaded from a Visual Test script, that would launch CorelDraw! under a debugger and monitor for unhandled exceptions and exception loops (an interesting way to hang an application).  Another Visual Test script would then start the debugger script to run the application.  If the application crashed, the crash would be caught and logged, the process killed, and the test would start a new instance to continue testing.
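
The core of that monitor is just the Win32 debugging API.  Here is a minimal sketch of the idea as a standalone program rather than a DLL; the executable name, the exception-loop threshold, and the logging are placeholders, not the actual Visual Test code.

```cpp
// Minimal sketch: launch an application under the Win32 debugging API and
// watch for unhandled (second-chance) exceptions and exception loops.
// The target name, threshold, and logging below are illustrative only.
#include <windows.h>
#include <stdio.h>

int main()
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };

    // Launching with DEBUG_ONLY_THIS_PROCESS makes us the target's debugger.
    if (!CreateProcessA("CorelDrw.exe", NULL, NULL, NULL, FALSE,
                        DEBUG_ONLY_THIS_PROCESS, NULL, NULL, &si, &pi))
        return 1;

    PVOID lastAddr = NULL;
    int repeats = 0;

    for (;;)
    {
        DEBUG_EVENT ev;
        if (!WaitForDebugEvent(&ev, INFINITE))
            break;

        DWORD continueStatus = DBG_CONTINUE;

        if (ev.dwDebugEventCode == EXCEPTION_DEBUG_EVENT)
        {
            PVOID addr = ev.u.Exception.ExceptionRecord.ExceptionAddress;

            if (!ev.u.Exception.dwFirstChance)
            {
                // Second chance: nothing in the application handled it.
                printf("Unhandled exception 0x%08lX at %p; killing process\n",
                       ev.u.Exception.ExceptionRecord.ExceptionCode, addr);
                TerminateProcess(pi.hProcess, 1);
            }
            else
            {
                // Crude "exception loop" heuristic: the same address faulting
                // over and over usually means a hang, not recovery.
                repeats = (addr == lastAddr) ? repeats + 1 : 0;
                lastAddr = addr;
                if (repeats > 100)
                {
                    printf("Exception loop at %p; killing process\n", addr);
                    TerminateProcess(pi.hProcess, 1);
                }
                else
                {
                    // Pass it back to the application's own handlers first.
                    continueStatus = DBG_EXCEPTION_NOT_HANDLED;
                }
            }
        }
        else if (ev.dwDebugEventCode == EXIT_PROCESS_DEBUG_EVENT)
        {
            break;  // the target exited, normally or after termination
        }

        ContinueDebugEvent(ev.dwProcessId, ev.dwThreadId, continueStatus);
    }

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```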

Microsoft

Electronic Bill Presentment and Payment/TransPoint

The Microsoft work terms were with the Electronic Bill Presentment and Payment group, before it joined with First Data Corporation and became TransPoint.  The experience was phenomenal.  The product let you view and pay your bills online, no cheques.  It was a great idea that never took off in North America (at least not then).  I worked on tools to automate website testing by simulating user behaviour, and on the transfer layer between billers and the site.  The original transfer layer used the first version of MSMQ, which had trust relationship requirements between domains that billers were not fond of.  Much has changed since then.

I returned to Microsoft, and the now-TransPoint group, after university.  I had the pleasure of working with others to make the site localization-friendly, as the product had been licensed to Australia Post.  We went to Australia for two weeks to help them set up the system and provide additional hands-on training.  It wasn’t to last, however.  The product was licensed away to CheckFree, and the group disbanded.

Natural Language Group

I continued at Microsoft in the Natural Language Group.  The simplest description is “the red and green squiggles in Word.”  We were porting the engine from C to C++.  In the midst of this came Microsoft’s massive security push, which meant reviewing the code for security flaws.  My time there was short, however, as the project did not suit me.  I moved on to Xbox Live.

Xbox Live

Xbox Live was an amazing experience.  I joined after the Xbox had shipped but before Xbox Live had launched.  I started out automating integration tests for the different components of Xbox Live from the console side.  As we approached launch, however, we found consoles failing to connect to Xbox Live.  I got to help track down the failing consoles, contacting customers who had registered them to get more information.  We worked with manufacturing, repair and refurb, customer support, and ops to trace the cause of the problem.  We put together a system to identify, track, and remedy the issues, providing a free console repair even if the console was out of warranty.  I wrote the software that integrated the information provided by ops, repair and refurb, and manufacturing to track these consoles in a database customer support could use to verify the failures.

I also tested the beta site for the Halo 2 release, including the beta dashboard for creating beta accounts and general client/server functionality.

Xbox 360 changed the Xbox Live experience by adding much more UI functionality: system software running in the background, an overlay of notifications, the blades, and a broader suite of Live features, including the marketplace, account creation, a more sophisticated troubleshooter, and so on.  I took on the role of UI Test Lead, running a team of three to verify that functionality.  I also worked with the Xbox base team responsible for non-Live UI functionality and with the group verifying localization of the UIs.

BioWare

Then an opportunity opened up at BioWare that I could not pass up: a tools position.  Before Xbox 360 launched, I moved to BioWare in Edmonton to work on Mass Effect 1.  My very first day was a bit of a shock: while going through documentation with human resources, I found out BioWare had been bought.  My gut guess was Microsoft, but it turned out to be Elevation Partners, a firm whose partners included John Riccitiello and Bono.

Mass Effect 1

Mass Effect 1 was BioWare’s first game with the Unreal Engine.  I started by automating the verification of content.  There were a lot of easy ways to break content, and we needed a way to identify and address them as quickly as possible.  I also worked on our automated build process in Visual Build Pro.  Code reviewing XML diffs of build scripts was atrocious; please consider this when designing a build system.  The most challenging problems, however, were two classes of hitches and fitting the game on a single disk.

The first class of hitches rarely manifested until we started running from media.  The code frequently loaded objects from packages on disk with blocking function calls, and a substantial amount of work on the single-asset loading path went into fixing loading by name.

The second was audio.  We used a different audio engine to support streaming audio from disk.  That engine, however, had a top-level lock that was held while reading from the disk, so if the main thread needed to issue an audio command while the lock was held (or the request itself required disk IO), the game would hitch.  We moved the audio to a different thread, then I changed the locking logic to release the top-level lock before a disk read and reacquire it after reading.  To prevent IO-related contention, I added a spin lock so the audio system could not trigger another IO read concurrently.  Finally, we changed how audio data was stored in packages so the control data and sound bank offsets were seek-free, and no additional IO was required during package loading.
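
The shape of that lock juggling is roughly the sketch below (illustrative only, not the actual audio engine code): give up the engine’s top-level lock around the blocking read, and use a separate spin lock so only one streaming read is in flight at a time.

```cpp
// Illustrative sketch of the locking change, not the actual engine code.
#include <atomic>
#include <cstddef>
#include <mutex>

std::mutex gTopLevelLock;                         // the engine's single big lock
std::atomic_flag gIoInFlight = ATOMIC_FLAG_INIT;  // serializes streaming reads

// Placeholder for the blocking disk read.
void ReadFromDisk(void* dest, std::size_t bytes) { (void)dest; (void)bytes; }

// Called by the streaming code with gTopLevelLock already held.
void StreamChunk(void* dest, std::size_t bytes)
{
    // Give up the engine lock before the slow part so the main thread can
    // keep issuing audio commands while we wait on the disk.
    gTopLevelLock.unlock();

    // Only one streaming read at a time; a second request spins here
    // instead of piling another read onto the disk.
    while (gIoInFlight.test_and_set(std::memory_order_acquire))
        ;  // spin

    ReadFromDisk(dest, bytes);  // the blocking read

    gIoInFlight.clear(std::memory_order_release);

    // Reacquire before touching engine state again; anything cached before
    // the unlock has to be revalidated.
    gTopLevelLock.lock();
}
```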

The last challenge was fitting on a single disk.  Mass Effect 1 was the only game in the series to fit onto one.  Some planets were cut, and the data went through numerous transformations and rearrangements to get it to fit, factoring out common content and compressing more aggressively.  We also cut back on the filler used to align data on ECC block boundaries, squeezing out more space at the cost of performance.  We managed to keep the space monkeys.

Mass Effect 2

Mass Effect 2 was a substantially smoother experience, applying what we had learned from Mass Effect 1.  We reviewed alternative audio engines and went with Wwise from Audiokinetic; I was responsible for integrating it with our IO and memory systems.  It greatly improved the game experience, and the audio designers preferred it too.  The switch cascaded, however, into a requirement to revamp our VO pipeline to get the voice-over audio into the Wwise project.  I replaced substantial portions of the VO pipeline, parallelizing it, adding text-to-speech (for lines without recorded voice-over yet), and checking the data for issues that would interfere with lip-sync generation.  The VO pipeline was a complicated beast integrating SQL Server (now a heavily documented 30-page SQL query), Perforce, Wwise, and Unreal.  Our Wwise project was so huge that we integrated Google PerfTools into it to address fragmentation that was causing out-of-memory issues (since then, Audiokinetic has added some great features to Wwise that make it much harder to get into that state).

Mass Effect 2 PS3

Next up was Mass Effect 2 for PS3.  This was an interesting engineering effort, as the split memory architecture imposed a hard constraint on how we managed memory.  On the Xbox we could adjust our texture pool size to absorb variations in main memory requirements, but not so on the PS3.  We did roughly a year and a half’s worth of Unreal integrations, updating our content as we went, and spent man-months addressing content issues exposed by those integrations.  I changed FNames to be thread-safe and to require less memory (1/2 MB), added property stripping to compile properties out of classes on the consoles (13 MB), added property reordering to remove gaps between properties in structs and classes (3.5 MB), moved editor-only code to separate projects (1 MB), and stripped out a lot of dead content.  I also assisted with refinements to IO streaming for audio to use less memory and to stream to/from VRAM.  These changes also shaved a gigabyte of data from the disk size compared to Mass Effect 2 for Xbox 360.  Finally, I wrote a database shader cache: any user who needed to compile a shader would check the database first to see if it had already been compiled and, if not, compile it and send the result back to the database.
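
The shader cache idea is simple enough to sketch (the types and store interface below are illustrative, not the actual code): key each shader by a hash of its source and compile options, check the shared store first, and publish a local compile back for everyone else.

```cpp
// Illustrative sketch of a shared shader cache, not the actual implementation.
#include <cstdint>
#include <functional>
#include <optional>
#include <string>
#include <vector>

using ShaderKey  = std::uint64_t;              // hash of source + options
using ShaderBlob = std::vector<std::uint8_t>;  // compiled shader bytecode

// Stand-in for the shared database; in practice this talks to a server.
struct ISharedShaderStore {
    virtual std::optional<ShaderBlob> Find(ShaderKey key) = 0;
    virtual void Publish(ShaderKey key, const ShaderBlob& blob) = 0;
    virtual ~ISharedShaderStore() = default;
};

// Placeholder hash; a real cache would hash the preprocessed source.
ShaderKey HashShader(const std::string& source, const std::string& options)
{
    return std::hash<std::string>{}(source + '\n' + options);
}

// Placeholder compile step standing in for the platform shader compiler.
ShaderBlob CompileLocally(const std::string& source, const std::string& options)
{
    (void)options;
    return ShaderBlob(source.begin(), source.end());
}

ShaderBlob GetCompiledShader(ISharedShaderStore& store,
                             const std::string& source,
                             const std::string& options)
{
    const ShaderKey key = HashShader(source, options);

    // Someone may already have compiled this exact shader; reuse their work.
    if (std::optional<ShaderBlob> cached = store.Find(key))
        return *cached;

    // Otherwise pay the compile cost once and share the result.
    ShaderBlob blob = CompileLocally(source, options);
    store.Publish(key, blob);
    return blob;
}
```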

Mass Effect 3

Mass Effect 3 integrated the changes from Mass Effect 2 PS3.  From there, we had a little over a year left to finish the game.  I replaced the IO layer to support more asynchrony when reading from disk, supporting parallel reads from different IO devices and removing an entire thread from the PS3 implementation.  I added live switching of the VO, text, and speech languages without restarting the game.  I contributed a substantial amount of code to a framework initiative to consolidate our tools libraries and, finally, owned developing Kinect support for Mass Effect 3.
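
One way to picture the per-device asynchrony is the sketch below (the types and queueing are illustrative, not the actual IO layer): each device gets its own request queue and worker, so a read from one device never waits behind a read from another, and callers don’t block at all.

```cpp
// Illustrative sketch of per-device asynchronous reads, not the actual IO layer.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <utility>
#include <vector>

struct IoRequest {
    std::string path;                                    // what to read
    std::function<void(std::vector<char>)> onComplete;   // runs on the worker
};

// One of these per physical device (optical drive, hard drive, ...).
class IoDeviceQueue {
public:
    IoDeviceQueue() : worker_([this] { Run(); }) {}

    ~IoDeviceQueue() {
        { std::lock_guard<std::mutex> lock(mutex_); done_ = true; }
        cv_.notify_one();
        worker_.join();
    }

    // Callers return immediately; the read happens on this device's worker.
    void Submit(IoRequest req) {
        { std::lock_guard<std::mutex> lock(mutex_); queue_.push(std::move(req)); }
        cv_.notify_one();
    }

private:
    void Run() {
        for (;;) {
            IoRequest req;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !queue_.empty(); });
                if (queue_.empty())
                    return;  // shutting down and drained
                req = std::move(queue_.front());
                queue_.pop();
            }
            // The blocking read happens here, on this device's thread only.
            std::vector<char> data;  // placeholder: read req.path into data
            req.onComplete(std::move(data));
        }
    }

    std::mutex mutex_;
    std::condition_variable cv_;
    std::queue<IoRequest> queue_;
    bool done_ = false;
    std::thread worker_;  // declared last so the other members exist first
};
```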

Building Kinect into Mass Effect 3 was intense.  The first foray into Kinect happened with another team at BioWare, who demonstrated the functionality to progressively larger audiences within the company.  With the help of a developer from that team, we put together the functionality for the E3 demo; it was working so well that our rep thought we had faked it in the video we sent.  The scariest problem we encountered, however, was Liara: she wasn’t matching well when players issued commands.  We found that Kinect expected a different pronunciation, so we hacked together a solution for E3, but we needed a better long-term solution for the shipped game.  For the E3 demo we used dynamic grammars, grammars constructed entirely from within code.  We switched to static grammars for the shipped game since we could provide pronunciations in our custom lexicons for those grammars to use.  I built a pipeline for compiling the grammars in parallel that could build almost 3,000 grammars (for 7 speech locales) in minutes.  The greatest challenge, however, was finding the pronunciations we needed for those locales.  I wrote a tool to manage the lexicon files in the PLS format, added text-to-speech to verify pronunciations (converting between UPS and IPA encodings), and included a legend of commonly encountered phones with sample words.  Taking a corpus of almost a million words across 5 languages, we determined what Kinect would use for the default pronunciations as a starting point.  With the audio from commands recorded by the game, we would iterate on the pronunciations in the lexicon to improve matches.  Problematic audio was attached to bugs to track problem phrases and verify resolutions when fixes were made.  We had a lot of support and feedback from Microsoft, and an amazing team at BioWare working on the Kinect functionality in Edmonton, Montreal, and the localization offices.  We learned a lot of interesting tricks for improving matches in Kinect.
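
For the curious, PLS (the W3C Pronunciation Lexicon Specification) is just XML, so a custom pronunciation entry looks roughly like the snippet below; the transcription shown for “Liara” is only an illustration, not the one we shipped, and Microsoft’s UPS alphabet can be used in place of IPA.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative PLS entry; the pronunciation below is made up. -->
<lexicon version="1.0"
         xmlns="http://www.w3.org/2005/01/pronunciation-lexicon"
         alphabet="ipa" xml:lang="en-US">
  <lexeme>
    <grapheme>Liara</grapheme>
    <phoneme>liˈɑɹə</phoneme>
  </lexeme>
</lexicon>
```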

The Galaxy is Saved

With the Mass Effect series complete and a rather intense knowledge handoff to others at BioWare, we moved back to Ontario to be closer to family.  It has been a fantastic summer spending time with the kids.  I have a tan!

4 thoughts on “A Bit of Work History”

  1. Hey Brent!

    Nice post! Thanks for some technical details on the Mass Effect development. I have a question though: you mentioned that you “moved editor-only code to separate projects” for Mass Effect 3. What does that mean exactly? Did you actually split the single engine project into “Game” and “Editor”?

    ~Robert.

    • Wow, that’s taxing my memory! As I recall we did have separate projects, but we had also done an extraordinary amount of work on the Unreal build tool to make it faster, and those changes may have been related. When it came to building the final game, though, it was still monolithic. I also did a pass on Epic’s projects and marked up a lot of functionality as #if WITH_EDITOR.

      • Hey Brent!

        Thanks a lot for the answer!

        I am an Unreal developer myself and a huge fan of the game series, so knowing how it was done from the technical side is very nice, and your article is one of a very small number of in-depth articles on the web; that’s why it drew my attention 🙂

        I’ve always wondered how much modification studios like BW actually apply to third-party software like Unreal. I know there are studios that are completely fine with a vanilla (or almost vanilla) version of the engine, so the nature of such changes and the reasons behind them are interesting to investigate. So if you don’t mind sharing some experience, I would like to ask a couple more questions.

        Did you actually follow Epic’s standard pipeline when building content and use the built-in tools, or did you manage most of your content outside of the editor? Is the majority of the game code written in UnrealScript or C++?

        Thanks for your time,

        ~Robert
