
We are Gbanga, a Swiss-based serious game studio and gamification agency that creates mixed-reality apps, augmented reality games and spatial computing games that entertain a worldwide community of players. The games are mass-customised, individually tailor-made experiences based on the player's context, mood and preferences.

This article gives an insight into how we work with shaders, using the example of one of our browser games.

For a recent project, we were challenged with creating an environment containing lush grass fields. Rendering performant yet visually satisfying grass landscapes is a well-known problem in the computer graphics and gaming industry. Of course, there were some additional requirements:

  • The grass must be placed on a spherical object
  • There must be a mowing mechanic interacting with the grass
  • The system must perform well on mobile devices as well as on desktop, in a browser

As inspiration, we stumbled across this blog post by James Smyth. It is a great point of entry, since it already runs in a regular browser environment, is based on a shader implementation and is very lightweight.

Different methods of visualizing grass blades, with different visibility at different angles

How to turn 2D grass blades into volumetric 3D blades

James Smyth implemented the visualization on a flat disk. Our requirements are different: we have a seamless sphere, variable in size, and we want to configure some areas that are overgrown with grass and others that are bare. Additionally, because our surface is a sphere, there will always be an angle at which the camera points straight down. This makes 2D grass blades a bad fit, as you would not see the sides of the blades from above. Therefore, we decided to create volumetric 3D blades by adding two vertices to the existing geometry.

The main parts that changed were:

  • Point sampling
  • Orienting the grass blades along the sphere normal
  • Volumetric grass blades

Point Sampling

This is done by randomly sampling polar coordinates, combined with rejection sampling against a texture. This allows us to spawn only the visible blades at maximum density, rather than spreading blades over a wider area and permanently hiding some of them. For the texture lookup, we used the standard equirectangular projection.
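To illustrate the idea, here is a minimal TypeScript sketch of such a sampler; the function and parameter names, as well as the `coverage` mask format, are assumptions for this example and not our actual code.

```typescript
// Minimal sketch: `coverage` is assumed to be a grayscale equirectangular
// mask with values in [0, 1] marking where grass may grow.
type Vec3 = { x: number; y: number; z: number };

function sampleBladePositions(
  coverage: Float32Array,
  width: number,
  height: number,
  count: number,
  radius: number
): Vec3[] {
  const points: Vec3[] = [];
  let attempts = 0;
  while (points.length < count && attempts++ < count * 100) {
    // Random direction on the sphere in polar coordinates.
    const theta = 2 * Math.PI * Math.random();    // longitude
    const phi = Math.acos(2 * Math.random() - 1); // polar angle, area-uniform

    // Equirectangular UV for the coverage lookup.
    const u = theta / (2 * Math.PI);
    const v = phi / Math.PI;
    const texel =
      coverage[Math.floor(v * (height - 1)) * width + Math.floor(u * (width - 1))];

    // Rejection sampling: keep the candidate only where the mask allows grass.
    if (Math.random() > texel) continue;

    points.push({
      x: radius * Math.sin(phi) * Math.cos(theta),
      y: radius * Math.cos(phi),
      z: radius * Math.sin(phi) * Math.sin(theta),
    });
  }
  return points;
}
```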

Blade Orientation

Our sampling already provides the normals. From there, we can simply calculate a vector perpendicular to the normal and construct a local space in which to build the blade, as sketched below. The volumetric blade is crafted by adding two faces to the 2D base model.

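A rough sketch of how such a blade-local frame can be derived from the sphere normal (the helper names are hypothetical, not our production geometry code):

```typescript
type Vec3 = [number, number, number];

const cross = (a: Vec3, b: Vec3): Vec3 => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];
const normalize = (a: Vec3): Vec3 => {
  const l = Math.hypot(a[0], a[1], a[2]);
  return [a[0] / l, a[1] / l, a[2] / l];
};

// Any reference vector not parallel to the normal works; from it we derive
// two perpendicular axes that span the plane tangent to the sphere.
function bladeFrame(normal: Vec3): { tangent: Vec3; bitangent: Vec3 } {
  const reference: Vec3 = Math.abs(normal[1]) < 0.99 ? [0, 1, 0] : [1, 0, 0];
  const tangent = normalize(cross(reference, normal));
  const bitangent = normalize(cross(normal, tangent));
  return { tangent, bitangent };
}
```

Each blade vertex can then be placed as the sampled position plus offsets along the tangent (width), the normal (height) and the bitangent (the small thickness that forms the two additional faces).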

A few tweaks for the shader

Adjustment 1: Fixing the non-seamless sway animation

Due to the new target shape, some parts of James Smyth's original version did not work well on the spherical surface. In the vertex shader, the grass sway was not seamless. This was easily fixed by adjusting the wave size of the sway animation to a multiple of PI.
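One way to read this fix: if the sway phase is driven by an angular coordinate, choosing the spatial frequency so that a whole number of waves fits around the full circle makes the sine wave match up at the 0/2π seam. A hypothetical sketch, with illustrative constants:

```typescript
const WAVE_COUNT = 4;        // whole number of waves around the sphere (illustrative)
const SWAY_AMPLITUDE = 0.05; // illustrative value

// Because WAVE_COUNT is an integer, sin(...) has the same value at
// longitude 0 and 2π, so the sway wraps seamlessly around the sphere.
function swayOffset(longitude: number, timeSeconds: number): number {
  return SWAY_AMPLITUDE * Math.sin(WAVE_COUNT * longitude + timeSeconds);
}
```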

Adjustment 2: Pragmatically restricting cloud movement

The other issue was that the cloud shadow is stretched towards the poles. This is an instance of the general problem of mapping a 2D texture onto a sphere. For our application, it was sufficient to restrict the cloud movement to the equator.
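A sketch of that pragmatic restriction (parameter names are illustrative): the cloud texture offset is animated only along u, the equator direction, and never drifts towards the poles where the equirectangular mapping would stretch it.

```typescript
// Returns a UV offset for the (repeating) cloud texture.
function cloudUvOffset(timeSeconds: number, speed = 0.01): [number, number] {
  const u = (timeSeconds * speed) % 1; // scroll along the equator
  const v = 0;                          // no movement towards the poles
  return [u, v];
}
```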

Adjustment 3: Extended fragment shader

The fragment shader worked fine; we only extended it to hide individual blades once they are mowed.
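Conceptually, the extension boils down to sampling an extra "mow mask" texture at the blade's root UV and discarding the fragment where it has been painted. A hedged sketch (GLSL embedded in TypeScript; the uniform, varying and colour values are illustrative, not our actual shader):

```typescript
export const mowFragmentChunk = /* glsl */ `
  precision mediump float;

  uniform sampler2D uMowMask; // equirectangular mask painted by the player
  varying vec2 vSphereUv;     // UV of the blade's root position on the sphere

  void main() {
    float mowed = texture2D(uMowMask, vSphereUv).r;
    if (mowed > 0.5) discard; // hide blades where the mask was painted
    gl_FragColor = vec4(0.2, 0.6, 0.2, 1.0); // placeholder grass colour
  }
`;
```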

Interactive grass mowing

The concept of mowing grass is simple: every position the player passes is painted onto an additional texture, and the shader renders every painted position invisible. This allows us to hide individual blades of grass even though they all share a single mesh.

The equirectangular projection gave us a closed-form solution for the point sampling and allowed us to paint an ellipse on the UV texture, representing a nice circular brush on the 3D sphere. This produces a uniform trail over the whole sphere.
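A sketch of what such brush painting can look like on a canvas-backed mow mask (function and parameter names are illustrative; wrapping at the u = 0/1 seam is omitted for brevity). Because the equirectangular map stretches horizontally towards the poles, a circular brush on the sphere becomes an ellipse in UV space whose horizontal radius grows by 1 / cos(latitude).

```typescript
function paintMowBrush(
  ctx: CanvasRenderingContext2D, // 2D context of the mow-mask canvas
  u: number,                     // horizontal UV of the player, in [0, 1]
  v: number,                     // vertical UV of the player, in [0, 1]
  brushRadiusUv: number          // brush radius as a fraction of the texture
): void {
  const { width, height } = ctx.canvas;
  const latitude = (v - 0.5) * Math.PI; // -PI/2 .. +PI/2
  // Widen the brush horizontally to compensate for the equirectangular stretch.
  const radiusX = (brushRadiusUv / Math.max(Math.cos(latitude), 1e-3)) * width;
  const radiusY = brushRadiusUv * height;

  ctx.fillStyle = 'white';
  ctx.beginPath();
  ctx.ellipse(u * width, v * height, radiusX, radiusY, 0, 0, 2 * Math.PI);
  ctx.fill();
}
```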

Possible future improvements

Since the application is web-based and targets mobile devices, the performance requirements are critical. The final version is good enough for most modern devices and is capable of rendering between 50'000 and 100'000 grass blades. Nevertheless, we would like to highlight some potential improvements.

Occlusion culling

On a sphere, a single point of view can obviously never see the whole surface; even in an optimistic scenario, only about 50% of the surface is visible. Clumping the blades into a single mesh has many advantages, but it means we cannot simply omit parts of it.

LOD/Patching 

For a more general and scalable solution, one could implement an LOD system. Only grass blades close to the camera need to be volumetric; the rest can be rendered in 2D or, beyond a certain distance, replaced with images containing multiple blades at once.

Texture painting

The current implementation of texture painting is very inefficient and scales poorly. The raycast we use for texture sampling could be replaced by a closed-form calculation based on the player's position. The algorithm for manipulating the texture runs on the CPU and recomputes the updated texture on almost every update. The straightforward next step would be to use another shader for the texture manipulation and let the specialized graphics hardware do the hard work.
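The closed-form lookup that could replace the raycast is simply the equirectangular projection applied to the player's position. A sketch (assuming the sphere is centred at the origin; axis and v-orientation conventions may differ from the actual project):

```typescript
function positionToEquirectUv(x: number, y: number, z: number): [number, number] {
  const radius = Math.hypot(x, y, z);
  const longitude = Math.atan2(z, x);     // [-PI, PI] around the equator
  const latitude = Math.asin(y / radius); // [-PI/2, PI/2] towards the poles
  const u = longitude / (2 * Math.PI) + 0.5;
  const v = latitude / Math.PI + 0.5;
  return [u, v];
}
```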

We are the serious games agency Gbanga. As specialists in gamification and playful experiences, we build games for clients on a contract basis. We would love to work with you on your next game!