THE GREAT WAR - RECREATING THE WESTERN FRONT FROM THE AIR - PART 4 / by sandy sutherland

In this post, I cover the first part of the process involved in populating my terrain with objects

A tiny tease of Substance terrain textures in engine - colour, roughness and a normal map for each tile. There is also a water mask map that will be used to boost specular.

SET DRESSING - APPLYING AN OBJECT LIBRARY TO REAL WORLD DATA

The key to set dressing this giant terrain was to keep things as procedural as possible - I wanted to minimize manual placement of objects at all costs. The objects I was placing were things like trees, buildings, trench lines, barbed wire, craters, rubble, vehicles, artillery and fires. I would be creating a library of these objects and then distributing several million copies of them all over the terrain.

Secondly, I needed a combination of randomly placed objects and very precisely placed objects - not just position, but precise rotation and scale as well. Thirdly, I wanted to use real world data to inform the distribution of trees and buildings, along with other terrain features and landmarks. Again, this is where the Open Street Maps data comes into play.

Finally, for performance reasons, I needed to reuse objects and meshes as much as possible. I made heavy use of Unreal Engine's Hierarchical Instanced Static Meshes (HISMs). For a great overview on how to use HISMs, I highly recommend this video by @TechArtAid

 

THE FIRST APPROACH - STORING ARBITRARY DATA IN MESHES

I'll warn you up front that this is not the method I ended up using. I've included this description just for people's interest - it's a method that could work, but there is an easier way I only discovered very recently. I also wrote this description down a while before posting, as will become obvious...

So as I've discussed in my previous post on texturing, I had all sorts of data extracted from Open Street Maps. This included buildings and trees, probably the two most common objects my terrain would be covered with.

BUILDINGS:

So I had all these building footprint polygons, and I needed to somehow get them into a format where I could replace them with several hundred thousand instanced mesh buildings in Unreal Engine. Each instanced building had to have the same alignment/rotation, size, and building type tag as the ones in the .osm files. Typically, instanced meshes in Unreal Engine are used for foliage - things like grass and trees - often placed by painting onto terrains and meshes with totally random size and placement. You can go in and manually move and rotate individual instances after the fact, but there was no way that was an option in this case: I needed everything placed correctly up front.

So the problem had two parts: for each building, first I had to extract the building's rotation/size/type properties, and then I had to define that building's position, all in a format that could be read and accessed in Unreal.

My solution - generate two new meshes in Houdini, and store each building's key values as point data in the meshes. Then, once the meshes are imported into Unreal, make a Blueprint that loops over each point, extracts the required info, and generates an instanced mesh building at each point position.

HOW IT WORKS:

The Houdini graph I built loops over each building footprint polygon and does a few different things:

  • First, resample the polygon edges, adding more points
  • Make each point's normal aim at its neighbour, then convert the normal vectors to rotation values
  • Tally up the rotation values, and find the most common one
  • This becomes the overall rotation direction for that building - store it as a float value called "rotation"
  • (The above step is very much the same process I used for defining field rotations in a previous blog post)
  • Next, rotate the polygon by the negative of the rotation value - this cancels out the rotation, making the polygon world aligned
  • Find the bounding box X and Y size of the polygon (Z-up world). Store these two values as "xSize" and "ySize"
  • Rotate the polygon back to how it was
  • Replace the building polygon with a single point at its centroid, and store the 3 new values as attributes on that point

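The per-building loop above can be sketched in plain Python (a stand-in for the Houdini graph, with hypothetical names - not the actual VEX/SOP network). I've left out the resample step for brevity: with one edge per wall the tally still works, whereas in Houdini resampling effectively weights each wall's vote by its length.

```python
import math
from collections import Counter

def footprint_to_point(poly):
    """Reduce a building footprint polygon (a list of (x, y) tuples)
    to (centroid, rotation, xSize, ySize) - a sketch of the per-building
    Houdini loop described above."""
    n = len(poly)

    # Angle of each point's edge toward its neighbour, folded into
    # [0, 90) degrees so parallel and perpendicular walls vote together.
    angles = []
    for i in range(n):
        (x0, y0), (x1, y1) = poly[i], poly[(i + 1) % n]
        a = math.degrees(math.atan2(y1 - y0, x1 - x0))
        angles.append(round(a % 90.0, 1))

    # Tally the angles - the most common one becomes the building rotation.
    rotation = Counter(angles).most_common(1)[0][0]

    # Rotate by the negative of that angle so the footprint is axis aligned...
    r = math.radians(-rotation)
    aligned = [(x * math.cos(r) - y * math.sin(r),
                x * math.sin(r) + y * math.cos(r)) for x, y in poly]

    # ...then the bounding box of the aligned footprint gives the size.
    xs, ys = [p[0] for p in aligned], [p[1] for p in aligned]
    x_size, y_size = max(xs) - min(xs), max(ys) - min(ys)

    # A single point at the original footprint's centroid carries the values.
    cx, cy = sum(p[0] for p in poly) / n, sum(p[1] for p in poly) / n
    return (cx, cy), rotation, x_size, y_size
```

Running this on a 10 x 4 rectangle rotated 30 degrees returns a rotation of 30 and sizes of 10 and 4, which is the behaviour the graph relies on.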
The bit that might be slightly abstract at first is that I'm not using the position attribute to actually store position values. An attribute is simply a container. Position is just a container for a vector, which itself is just 3 floating point numbers (X,Y,Z). 

So what I can do now is create two meshes: one for storing the actual building positions, and the other one can store my three values for rotation/xSize/ySize in its position attribute. Visually this second mesh looks like a total mess, but it doesn't matter - I'm using the mesh itself as a container that Unreal can read and understand.

So to clarify, I create:

Position Mesh: the points of this new mesh just define building positions

Data Mesh: this mesh stores the rotation and X/Y size info for each building

So the 3 XYZ position components of the data mesh actually store:

  • X = rotation
  • Y = xSize
  • Z = ySize

The two meshes have corresponding point counts and point IDs.
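The spawn side can be sketched like this - a hypothetical pure-Python stand-in for the Blueprint loop, walking the two meshes by their matching point IDs:

```python
def build_instances(position_points, data_points):
    """Mimic the Blueprint loop: for each matching point ID on the two
    meshes, emit one instance transform per building.
    position_points: (x, y, z) world positions from the position mesh.
    data_points: (rotation, xSize, ySize) smuggled in as 'positions'
    on the data mesh."""
    assert len(position_points) == len(data_points)  # point counts correspond
    instances = []
    for pos, (rotation, x_size, y_size) in zip(position_points, data_points):
        instances.append({
            "location": pos,
            "yaw": rotation,            # X component of the data point
            "scale": (x_size, y_size),  # Y and Z components
        })
    return instances
```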

Right, I think I've covered that enough!

I needed to be able to access these values in Unreal. There is a very handy Blueprint function I will cover in a moment, but vertex position, UVs, normals and tangents can all be accessed per point in Unreal. Different attributes have different levels of precision - position and UVs, for example, are stored more precisely than normals, so I didn't use normals to store data.

You may remember I also had another attribute I was creating, which defined the building type. I can store that value in the U component of my UV coordinates.
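As a sketch of that encoding (the tag list here is made up - the post doesn't show the actual building types):

```python
# Hypothetical building type tags; the index into this list is what
# gets written into the U component of each point's UVs.
BUILDING_TYPES = ["house", "church", "barn", "factory"]

def type_to_uv(building_type):
    # Store the type index in U; V is unused in this scheme.
    return (float(BUILDING_TYPES.index(building_type)), 0.0)

def uv_to_type(uv):
    # UVs round-trip as floats, so round before indexing back.
    return BUILDING_TYPES[int(round(uv[0]))]
```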

Now I should point out that there are probably a bunch of other ways I could have got this data into Unreal - saved it as a text file or an .xml file or something. I guess this is a good example of how I think visually: it just seemed like saving data in mesh points was the most logical thing to do, especially as Houdini is so powerful at processing geometry.

These two meshes never actually get rendered in UE4 - again, they are just data containers.

The last thing I did was to project the positions mesh I had created down onto the height map mesh, also in Houdini. The data out of Open Street Maps was totally flat, so projecting it down meant it now fit the contours of the height map. When imported into Unreal, the position mesh lined up nicely with the terrain, and my Blueprint could then do its thing, copying instanced buildings onto the points of the mesh.
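A minimal sketch of that projection, assuming the height map is a simple grid of height values (Houdini rays the points against the actual height map mesh, so this nearest-sample lookup is just an approximation):

```python
def project_down(points_xy, heightmap, cell_size):
    """Snap flat OSM points onto the terrain: look up the height map
    value under each (x, y) and use it as the point's new Z.
    heightmap is a row-major grid of heights, cell_size the grid spacing."""
    out = []
    for x, y in points_xy:
        col = min(int(x / cell_size), len(heightmap[0]) - 1)
        row = min(int(y / cell_size), len(heightmap) - 1)
        out.append((x, y, heightmap[row][col]))
    return out
```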

 Green shapes are building footprints as they are in Open Street Maps. The yellow boxes are the bounding cubes I generate and align to each building after I've generated a rotation value for the building.  

 

THE OTHER WAY - HOUDINI CSV FILES AND UNREAL ENGINE DATA TABLES

So if you bothered reading the bit above, you will see I had a hunch that storing my object data in a spreadsheet format was probably a better way to go. Thankfully, the guys at SideFX are always a few steps ahead, and there is a CSV file exporter built into Houdini, which I only discovered in the last week.

All my buildings data was ready to go - all I had to do was save it out as a CSV instead of storing that data in meshes. I decided I would save a CSV for each tile, initially for buildings, but I'll do the same for trees and all the other objects I want to place.

In Unreal Engine, all I had to do was define a data structure that corresponded to the data I was writing out from Houdini, import the CSVs, and Unreal would create a Data Table asset that I could then access.
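As a sketch of that round trip: UE4's Data Table CSV import expects the first column to be a unique row name, with the remaining columns matching the struct's property names. The column names below are my own guesses at such a struct, not the actual one from the project:

```python
import csv
import io

# Hypothetical column layout - the first column is the UE4 row name,
# the rest must match the Data Table struct's property names.
FIELDS = ["Name", "PosX", "PosY", "PosZ", "Rotation", "XSize", "YSize", "Type"]

def write_tile_csv(buildings, stream):
    """Write one tile's buildings as a UE4-importable Data Table CSV."""
    w = csv.writer(stream)
    w.writerow(FIELDS)
    for i, b in enumerate(buildings):
        w.writerow([f"building_{i}", *b["pos"], b["rotation"],
                    b["xSize"], b["ySize"], b["type"]])

# Example: one building on a tile, written to an in-memory buffer.
buildings = [{"pos": (120.5, 64.0, 3.2), "rotation": 30.0,
              "xSize": 10.0, "ySize": 4.0, "type": 2}]
buf = io.StringIO()
write_tile_csv(buildings, buf)
```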

I made some quick changes to the Blueprint I had made to extract the data from the old meshes, and instead directed it to read the data from the data tables. All that was needed was a tiny bit of adjustment to some of the values, to account for Houdini and UE4 having different coordinate systems. Then it all just kind of worked - I had buildings that lined up exactly to the Substance textures on the terrain. Woo!
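The coordinate adjustment is roughly this (assuming the common Houdini-to-Unreal convention: swap Y and Z, which flips handedness, and scale metres to centimetres - the exact tweaks in the Blueprint depend on how the terrain was exported):

```python
def houdini_to_ue4(p):
    """Convert a point from Houdini's coordinate system (Y-up,
    right-handed, metres) to Unreal's (Z-up, left-handed, centimetres).
    Swapping the Y and Z axes flips handedness; x100 converts m to cm."""
    x, y, z = p
    return (x * 100.0, z * 100.0, y * 100.0)
```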

Now, I haven't actually made the building assets yet - right now it's all just cubes - but again, this is just getting the workflow in place.

One of the awesome benefits of this method is just how quickly I can update things. I can make changes in Houdini - say, to the type of buildings I want in an area - save off the CSV again, refresh the asset in UE4 and boom, the new building distribution is in engine. The Blueprint takes about half a second to process and spawn the buildings onto the terrain.

 The Unreal Engine blueprint that reads the data tables and spawns each individual building. 

 

IN ENGINE

And finally, here are a small handful of in-engine pics

 Box Buildings, in engine!

 Another cluster of towns

 The view from a higher altitude. This is only showing 2 tiles worth of terrain textures and their buildings. 
