
Date: January 22nd, 2020

Squadron 42 Monthly Report: November & December 2019


This is a cross-post of the report that was recently sent out via the monthly Squadron 42 newsletter. We’re publishing this a second time as a Comm-Link to make it easier for the community to reference back to.

Attention Recruits,

What you are about to read is the latest information on the continuing development of Squadron 42 (SCI des: SQ42).
Operatives around the world collected the intel needed to provide you with this progress report. Through their efforts, we’ve uncovered information on AI improvements, a hefty debrief on cinematics, and an as-needed UI display element.

The information contained in this communication is extremely sensitive and it is of paramount importance that it does not fall into the wrong hands. Purge all records after reading.

UEE Naval High Command

AI


Ship AI had a busy couple of months working on NPC planetside traversal:

“Flying on the surface of a planet is definitely different from open space; the expectation is that ships fly straight at a level altitude and bank when steering to different headings.” -Ship AI

The first iteration involved creating a flight path suitable for the terrain height and local slope. The information generated is used to correct the path to ensure ships always have a specified minimal clearance to the terrain below while steering according to the inclination of the terrain. For example, a ship will favor a flight path through a valley rather than over a ridge. Hand-placed and procedurally generated objects (rocks, buildings, etc.) are not part of the terrain elevation information and are obtained by a different query. In the next iteration, the team will integrate this information so spaceships can adapt their flight paths accordingly.
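
For illustration, here is a minimal sketch of the clearance-correction idea described above, assuming a hypothetical heightAt() terrain query rather than the engine's actual elevation data; it is not the shipped system:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

// heightAt(x, y) stands in for the engine's terrain elevation query.
void EnforceTerrainClearance(std::vector<Vec3>& path,
                             float minClearance,
                             const std::function<float(float, float)>& heightAt)
{
    for (Vec3& p : path)
    {
        const float ground = heightAt(p.x, p.y);
        // Raise any waypoint that dips below the required clearance; points
        // already above it are left alone so the path can still hug valleys.
        p.z = std::max(p.z, ground + minClearance);
    }
}
```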

The same considerations apply to in-atmosphere dogfighting. The prototype of this was shown during the recent CitizenCon playthrough, with three security ships engaging the fugitive Carrack. The team also experimented with a stunt maneuver to add flair and personality to AI behavior.
Work was also completed on Collision Avoidance. This system runs in the background during AI flight and provides steering correction if a collision is likely to happen within a given time horizon. In the current development version, the system considers the collision resolution of all piloted vehicles, vehicles without a pilot (abandoned vehicles, derelicts), static obstacles, and procedurally generated asteroids.
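
As a rough illustration of the time-horizon check described above (hypothetical types, not the shipped system), a constant-velocity closest-approach test might look like this:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true if two vehicles on constant-velocity paths pass within their
// combined radius inside the given time horizon, i.e. a steering correction
// would be warranted.
bool CollisionWithinHorizon(Vec3 posA, Vec3 velA, float radiusA,
                            Vec3 posB, Vec3 velB, float radiusB,
                            float horizonSeconds)
{
    const Vec3  relPos = Sub(posB, posA);
    const Vec3  relVel = Sub(velB, velA);
    const float speed2 = Dot(relVel, relVel);

    // Time of closest approach, clamped into [0, horizon].
    float t = (speed2 > 1e-6f) ? -Dot(relPos, relVel) / speed2 : 0.0f;
    t = std::clamp(t, 0.0f, horizonSeconds);

    const Vec3  closest = {relPos.x + relVel.x * t,
                           relPos.y + relVel.y * t,
                           relPos.z + relVel.z * t};
    const float minDist = radiusA + radiusB;
    return Dot(closest, closest) < minDist * minDist;
}
```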

Recently, the system was heavily reworked to minimize its impact on performance. This included ensuring all vehicles and obstacles were organized in memory according to their current host zone to make the search for the surrounding participants faster. It also makes sure the collision avoidance update never gets an exclusive lock on the cached information when creating and processing individual AI collision problems. This is a critical aspect, as the updates of all AI entities run concurrently, so having an exclusive lock can heavily impact performance.
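
A minimal sketch of the locking scheme described above, using illustrative types (ZoneObstacleCache, Obstacle) rather than the actual engine code: obstacle data is bucketed per zone, writers take an exclusive lock only when rebuilding a bucket, and the per-AI avoidance updates that run concurrently only ever take shared read locks:

```cpp
#include <cstdint>
#include <shared_mutex>
#include <unordered_map>
#include <vector>

struct Obstacle { /* position, radius, velocity, ... */ };
using ZoneId = std::uint64_t;

class ZoneObstacleCache
{
public:
    // Called from the (infrequent) cache-maintenance update.
    void ReplaceZone(ZoneId zone, std::vector<Obstacle> obstacles)
    {
        std::unique_lock lock(m_mutex);   // exclusive, writers only
        m_zones[zone] = std::move(obstacles);
    }

    // Called concurrently from every AI entity's avoidance update. Returns a
    // copy so the shared lock is held only briefly.
    std::vector<Obstacle> CopyZone(ZoneId zone) const
    {
        std::shared_lock lock(m_mutex);   // shared, never exclusive
        const auto it = m_zones.find(zone);
        return it != m_zones.end() ? it->second : std::vector<Obstacle>{};
    }

private:
    mutable std::shared_mutex m_mutex;
    std::unordered_map<ZoneId, std::vector<Obstacle>> m_zones;
};
```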


AI (Social)


The Social AI Team made several improvements to path following, including the ability to specify movement speed when following a path edge, to add branching paths, to wait at a path point (continually playing a function until events have fired), to trigger a random Subsumption function from a weighted list, and to enable or disable Subsumption secondary activities while following a path edge.
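
For reference, triggering a random Subsumption function from a weighted list boils down to weighted random selection. Here is a minimal sketch with illustrative names (WeightedFunction, PickWeighted), not the actual tool code:

```cpp
#include <random>
#include <string>
#include <vector>

struct WeightedFunction
{
    std::string subsumptionFunction;  // function to trigger
    float       weight;               // relative likelihood
};

const WeightedFunction* PickWeighted(const std::vector<WeightedFunction>& entries,
                                     std::mt19937& rng)
{
    if (entries.empty())
        return nullptr;

    std::vector<float> weights;
    weights.reserve(entries.size());
    for (const auto& e : entries)
        weights.push_back(e.weight);

    // Probability of each entry is proportional to its weight.
    std::discrete_distribution<std::size_t> pick(weights.begin(), weights.end());
    return &entries[pick(rng)];
}
```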

Work also continued on the bartender, with the first iteration of mixing station tech added. This extends the character’s ability to not only pick up and serve items from the fridge, but to prepare new drinks at the mixing station. Several other small improvements were also made to make the overall experience smoother.

Finally for Social AI, the base idle system in the actor state machine was improved by moving all decision-making to the control state. This should make the whole system more robust and stable over the network, potentially avoiding the desync issues seen in the past and helping the team debug potential issues more efficiently. Further on the optimization side, usable caching was updated to listen for tile-created events from navigation meshes.

AI (Character Combat)


The team’s primary focus over the past few months was on the overall player vs. NPC combat experience. This involved implementing new behavior tactics for shotgun-wielding NPCs, giving them the tactical choice to get much closer to the target than before to make proper use of the weapon. They also began improving vision perception so that the designers can more easily adjust the size of vision cones and how fast a target is perceived inside them.
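
A minimal sketch of the tunables mentioned above, with illustrative names and values (VisionConeParams, perceptionPerSec) standing in for the real perception system: the designer-adjustable cone is a half-angle plus range, and awareness fills at a configurable rate while the target stays inside it:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Len(Vec3 a)         { return std::sqrt(Dot(a, a)); }

struct VisionConeParams
{
    float halfAngleRad;     // designer-tunable cone size
    float range;            // designer-tunable cone length
    float perceptionPerSec; // how fast a visible target is perceived
};

// eyeDir is assumed to be a non-zero facing direction.
bool InVisionCone(const VisionConeParams& p, Vec3 eyePos, Vec3 eyeDir, Vec3 target)
{
    const Vec3  toTarget = Sub(target, eyePos);
    const float dist     = Len(toTarget);
    if (dist > p.range || dist < 1e-3f)
        return false;
    const float cosAngle = Dot(toTarget, eyeDir) / (dist * Len(eyeDir));
    return cosAngle >= std::cos(p.halfAngleRad);
}

// Awareness accumulates toward 1.0 while the target is visible.
float UpdateAwareness(const VisionConeParams& p, float awareness, bool visible, float dt)
{
    if (visible)
        awareness += p.perceptionPerSec * dt;
    return std::clamp(awareness, 0.0f, 1.0f);
}
```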

They also made improvements to the system that generates cover locations and usage, enabling NPCs to choose to only partially cover themselves. Small optimizations were also made to AI hearing perception and specific designer movement requests.



Audio


The Audio Team worked on and reviewed all sound for chapter four, paying particular attention to the Idris, improving mark-up and SFX where required.


Cinematics


The pre-holiday period was a busy one for the Cinematics Team and included story scene implementation and production quality passes for the camera, staging, and lighting (most scenes received their third of four planned passes).

Alongside the usual pipeline tasks, they worked on upcoming tech developments, specifically enabling players to interrupt certain scenes and NPC conversations. They kicked this off by marking up scenes with a number of interrupt points to indicate where a scene could stop and resume, and considered how a scene would resume if the player doesn’t ‘behave’. The goal is to do this with minimal (or no) additional animation fragments.
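
As a rough sketch of the markup idea (hypothetical data layout, not the actual scene format), resuming at an interrupt point amounts to finding the latest marked point at or before the moment the player interrupted:

```cpp
#include <vector>

struct InterruptPoint
{
    float timeSeconds;   // a spot where the scene may stop and later resume
};

// points are assumed sorted by timeSeconds.
const InterruptPoint* FindResumePoint(const std::vector<InterruptPoint>& points,
                                      float currentTime)
{
    const InterruptPoint* best = nullptr;
    for (const auto& p : points)
    {
        if (p.timeSeconds <= currentTime)
            best = &p;      // latest point we have already passed
        else
            break;
    }
    return best;            // nullptr if interrupted before any marked point
}
```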

Cinematics also tested new animation code from Engineering that will allow performance-captured scenes to work from directions other than those they were initially directed for. For example, if the actor representing the player character approached a bridge officer from the right, this code allows the performance of the bridge officer to mirror in real-time should the player approach from a different angle. Most games either use a locomotion solution (where the NPC constantly turns to face the player via a systemic set of motions) or use rigid performance capture data that only works from a specific direction. This solution bridges that gap as much as possible. Of course, there are limitations; the team are currently testing the restrictions, performance implications, and the authenticity vs. player agency trade-off.

Cinematics also worked with Character Art and Engineering to elevate character quality to the final production level. Character Art are finalizing the bridge officer, using Executive Officer Sofia Kelly as the testbed for the new skin and hair. This led to the prioritization of scenes where the character and camera move around a lot, in order to test temporal anti-aliasing (TSAA) stability.

Since last year, the TSAA solution has advanced considerably and now creates a highly stable final image that intelligently smooths harsh white pixels, such as specular flicker. However, it was also smoothing out carefully crafted character eye-lights. So, Engineering delivered a solution to ensure eye-light reflections in character eyeballs are not smoothed out and the bright dots are kept alive, which is vital to giving characters an additional flicker of life. Currently, characters are lit as if on a movie set, with something similar to a virtual dedolight employed to cast eye-light.

Another area that received a quality push was comms calls. Squadron 42 features real-time render-to-texture comm feeds from NPCs. So, if the player has a comms call with Old Man, the wing leader will actually be in his cockpit and ‘filmed live’ when responding. The team wanted to determine a ‘gold standard’ for key video comms partners the player will interact with, including their wing leader, air traffic control, and an Idris captain.

These comms calls have several steps:
  1. Figuring out where the camera that films the NPC would be located. For the best possible framing, they want to ground the camera as much as possible and avoid it floating in a seemingly magical way.
  2. Lighting that area well while keeping it ‘cheap’ to avoid dozens of shadows and performance-affecting variables.
  3. Setting interference FX, such as comms call line ‘open’ and ‘closed’.
On the animation side, this required R&D to figure out what to keep from the motion-captured cockpit performances (all actors were sitting down in cockpit mock-ups on set) and when to apply the piloting inverse kinematics to have the characters hold the throttle and flight stick. They’re still exploring whether they want g-force on those conversation partners on top of the actor performances.


Engineering


In the UK, the Actor Team began implementing the new personal inner thought system. Now a radial menu, it allows all player functionality (such as taking off a helmet or equipping weapons) to be accessed more easily than in the current iteration. It’s context sensitive too, so the options will change depending on the location or available actions. For example, in a cockpit, additional options such as exiting the seat or engaging landing gear become available. They also added a ‘favorites’ section for commonly used commands and enabled keyboard/button shortcuts to be assigned to any command.
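
A minimal sketch of how such a context-sensitive option list could be assembled, using made-up option names and context flags rather than the actual implementation:

```cpp
#include <string>
#include <vector>

struct ActorContext
{
    bool inPilotSeat   = false;
    bool wearingHelmet = false;
};

std::vector<std::string> BuildInnerThoughtOptions(const ActorContext& ctx,
                                                  const std::vector<std::string>& favorites)
{
    std::vector<std::string> options = favorites;  // pinned favorites first

    options.push_back("Equip Weapon");
    if (ctx.wearingHelmet)
        options.push_back("Remove Helmet");

    // Context-sensitive additions, e.g. while seated in a cockpit.
    if (ctx.inPilotSeat)
    {
        options.push_back("Exit Seat");
        options.push_back("Engage Landing Gear");
    }
    return options;
}
```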

Regarding melee combat, the team moved on to body dragging, starting with interacting with the body, repositioning it, and basic player movement. For actor status, they finished the temperature system and moved on to hunger and thirst, setting up the initial stats.
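
As a simple illustration of that kind of stat setup (the structure and drift rates here are invented, not the game's values):

```cpp
#include <algorithm>

struct ActorStatus
{
    float hunger = 0.0f;   // 0 = sated, 1 = starving
    float thirst = 0.0f;   // 0 = hydrated, 1 = dehydrated

    // Stats drift toward their maximum over time (illustrative rates only).
    void Tick(float dtSeconds)
    {
        hunger = std::min(1.0f, hunger + dtSeconds / 3600.0f);
        thirst = std::min(1.0f, thirst + dtSeconds / 1800.0f);
    }

    void Eat(float amount)   { hunger = std::max(0.0f, hunger - amount); }
    void Drink(float amount) { thirst = std::max(0.0f, thirst - amount); }
};
```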

Over in Frankfurt, Engineering spent time in December and January on game physics. This included exposing a linear air and water resistance scale, making damping linear (and not step dependent), using the box pruner to accelerate tri-mesh collisions, and general optimization. Regarding attachment support, they moved boundary brushes and entities to the object container entity and added the ability to query for the attached points during a previous attachment. They also exposed the surface area on geometries and optimized loading times and calculations while verifying parts.
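
The step-dependency point is worth a quick illustration: applying a fixed damping factor per physics step gives different results depending on how finely the timestep is subdivided, whereas an exponential form depends only on total elapsed time. A minimal, generic sketch (not the engine code):

```cpp
#include <cmath>
#include <cstdio>

// Step-dependent: a fixed per-step factor gives different results when the
// same interval is split into more substeps.
float DampPerStep(float v, float factorPerStep, int steps)
{
    for (int i = 0; i < steps; ++i)
        v *= (1.0f - factorPerStep);
    return v;
}

// Step-independent: depends only on total elapsed time, however it is split.
float DampExponential(float v, float k, float dt)
{
    return v * std::exp(-k * dt);
}

int main()
{
    // Per-step damping: one big step vs ten small ones diverge (~5.0 vs ~5.99).
    std::printf("per-step   : %.3f vs %.3f\n",
                DampPerStep(10.0f, 0.5f, 1),
                DampPerStep(10.0f, 0.05f, 10));

    // Exponential damping: identical either way (~6.065).
    float v = 10.0f;
    for (int i = 0; i < 10; ++i)
        v = DampExponential(v, 0.5f, 0.1f);
    std::printf("exponential: %.3f vs %.3f\n",
                DampExponential(10.0f, 0.5f, 1.0f), v);
}
```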

The team continued with the new graphics pipeline and render interface (Gen12), made robust changes to the Vulkan layer to prepare for use with scene objects, added PSO and layout cache, improved API validation, and ported tiled shading light volume rasterization to the new pipeline. They also continued the parallel refactor of existing render code in support of the new graphics pipeline and APIs and worked on the global render state removal and device texture code refactor.
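
As a generic illustration of what a PSO cache does (this is not the Gen12 or Vulkan code, and the types are invented): compiled pipeline states are expensive to create, so they are stored under a hash of their full description and reused on subsequent requests:

```cpp
#include <cstdint>
#include <functional>
#include <memory>
#include <string>
#include <unordered_map>

struct PipelineDesc
{
    std::string shaderCombination;
    std::string renderStateBits;   // blend/depth/raster state, serialized

    std::uint64_t Hash() const
    {
        return std::hash<std::string>{}(shaderCombination + '|' + renderStateBits);
    }
};

struct CompiledPipeline { /* backend pipeline handle would live here */ };

class PipelineCache
{
public:
    std::shared_ptr<CompiledPipeline> GetOrCreate(const PipelineDesc& desc)
    {
        const std::uint64_t key = desc.Hash();
        auto it = m_cache.find(key);
        if (it != m_cache.end())
            return it->second;                              // reuse cached pipeline

        auto pipeline = std::make_shared<CompiledPipeline>();  // expensive creation
        m_cache.emplace(key, pipeline);
        return pipeline;
    }

private:
    std::unordered_map<std::uint64_t, std::shared_ptr<CompiledPipeline>> m_cache;
};
```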

For planet terrain shadows, multi-cascade and blending support was added to improve detail and range. This will also eliminate various existing shadow artifacts. Development continued on volumetric terrain shadow support for fog, with the team coming up with a multi-scattered ambient lighting solution that ties into unified raymarching.
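
A minimal, generic sketch of cascade selection with blending (a standard technique, illustrative code only, not the engine's shadow system): the cascade covering the pixel's view depth is chosen, and a blend weight is computed inside a transition band so neighboring cascades fade into each other rather than showing a hard seam:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct CascadeChoice
{
    std::size_t index;       // cascade to sample
    float       blendToNext; // 0 = this cascade only, 1 = fully the next one
};

// Assumes at least one cascade; far planes are sorted near-to-far.
CascadeChoice SelectCascade(const std::vector<float>& cascadeFarPlanes,
                            float viewDepth, float blendBandMeters)
{
    for (std::size_t i = 0; i < cascadeFarPlanes.size(); ++i)
    {
        const float far = cascadeFarPlanes[i];
        if (viewDepth <= far)
        {
            const bool  lastCascade = (i + 1 == cascadeFarPlanes.size());
            const float intoBand    = viewDepth - (far - blendBandMeters);
            const float blend       = lastCascade
                ? 0.0f
                : std::clamp(intoBand / blendBandMeters, 0.0f, 1.0f);
            return {i, blend};
        }
    }
    return {cascadeFarPlanes.size() - 1, 0.0f};  // beyond the last cascade
}
```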

Multi-scattering improvements were made via a specialized sky irradiance LUT to improve unified raymarching. Updates were made to the jittered lookup and TSAA to vastly improve quality and reduce the number of raymarching steps. They also made lighting consistent so that planet atmosphere affects ground fog layers and vice versa, experimented with ozone layer support in-atmosphere, and added support for scattering queries (transparent objects).

Finally, work continued on planetary clouds, ocean rendering, frozen oceans, and wind bending for static brushes.



Graphics


In the final weeks of 2019 the Graphics Team focused on stability, with great progress made on profiling and optimizing the renderer. This work will continue into the first quarter of 2020.


Level Design


The Social Team continued work on the narrative interactions, applying additional polish to the various scenes in the form of gestures, head turns, and eye-looks to make them appear as natural as possible. This ongoing process will take a while to complete as there are several hours of conversations throughout Squadron 42 that allow the player to feel a part of the world, rather than just a spectator.

The Level Design Team are focused on the FPS-heavy sections of the game, working alongside the various feature teams to make sure AI behavior is realistic in all encounters. The pre-combat behaviors were also worked on, such as patrolling, investigation, and reactions to audio attenuation on differing materials.

The space and dogfight teams progressed with the AI behaviors of specific types of ship. Combat maneuvers are constantly being iterated on, while AI awareness of its own ship emissions, remaining ordnance, component damage, etc. is in progress. The team are also delivering level prototypes for some of the ‘exotic’ gameplay puzzles specific to Squadron 42 that rely on underlying core mechanics.



Narrative


With CitizenCon wrapped up, the Narrative Team finished off the year focusing on SQ42. Alongside ongoing text requirements for mission objectives and UI, they reviewed progress on the various levels and synced with Design on the overall game experience. They also worked with the Animation Team to provide additional material for background NPC scenes. For example, they added moments where named characters interact with random NPCs to create additional character or story flavor.


QA


The QA Team provided dedicated tools support for the Cinematics Team. This mainly involved investigating various crashes from loading cutscene levels and DataCore issues within the editor roll-up bar. They also set up a new track view checklist to better maintain and test the editor. They’re planning to run these tests weekly and whenever a track-view-specific QA test is required. Finally for QA, cinematic client captures of different scene playthroughs for review by the designers and animators are ongoing and will continue into the new year.


Tech Animation


November and December saw Tech Animation rework Mannequin setups to clean up older usable iterations and address bugs. They worked with the animation and code teams on the updated bartender and implemented socket/attach point tech for props in Maya. They created multiple prop rigs for the usable and cinematics teams, including plates, trays, maintenance ladders, and ship turrets. Support was given to the Weapons Team along with new rigs, updates to older rigs, and in-engine bug fixes. The necessary rig additions for the new real digital acting tech were completed too.
New toolsets were authored to help visualize the complex engine-side attachment systems inside Maya. These assist the animators in their daily workflow and make it far easier to author props too. Development continued on the facial rigging pipelines; this lengthy initiative will ultimately yield great things when the team authors their own in-house face rigs compatible with the DNA systems. They also worked on the comms calls with Cinematics.


Tech Art


Frankfurt’s Tech Art Team worked closely with the UK’s Tech Animation on the design and implementation of the new runtime Rig Logic plugin for Maya. In contrast to previous hard-coded versions, the new plugin is entirely data-driven and utilizes so-called rig definition files that can be changed as needed without the assistance of an engine programmer. The removal of this bottleneck will give more creative freedom to the artists and significantly shorten iteration times. While the Maya runtime node is responsible for driving rigs based on facial and/or body animation, a bespoke command was also implemented. This allows the convenient querying and editing of all parts of the rig definition data and forms the foundation of the team’s new rigging pipeline and tools.


User Interface (UI)


The UI Team’s main SQ42 focus of the past couple of months was the actor status display. This is one of the first UI elements that will hide when not in use, meaning the screen isn’t cluttered with things players don’t need immediate knowledge of. They also continued creating visual targets and concepts for the final looks of various UI elements, including the visor the player wears throughout much of the game and target selection. The logo for an important faction in the SQ42 narrative was completed too.
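
A minimal sketch of the hide-when-not-in-use behavior (illustrative only, with an assumed timeout value): the element shows whenever a relevant status value changes and hides again after a quiet period:

```cpp
struct AutoHideDisplay
{
    float hideAfterSeconds = 5.0f;  // assumed timeout, not the game's value
    float idleTime         = 0.0f;
    bool  visible          = false;

    void NotifyStatusChanged()      // e.g. health, stamina, or oxygen updated
    {
        visible  = true;
        idleTime = 0.0f;
    }

    void Tick(float dtSeconds)
    {
        if (!visible)
            return;
        idleTime += dtSeconds;
        if (idleTime >= hideAfterSeconds)
            visible = false;        // nothing new to report, clear the screen
    }
};
```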


VFX


VFX continued to work with Art and Design to flesh out key locations and gameplay loops. This included revisiting some of the older effects and making use of the more recent GPU particle collision improvements. The artists also iterated on the workflow for gas clouds, with several improvements coming online to improve the overall process.

Conclusion

WE’LL SEE YOU NEXT MONTH



End Transmission
