Wave Particles et cetera

This software project is an adaptation of Wave Particles, a method introduced by Cem Yuksel in 2007. To show that the algorithm accounts for reflections at shorelines, the animation features a narrow river/canal. However, as you can see, my imitation does not achieve the same degree of realism and graphical sophistication:

Additionally, I experimented with and incorporated the following techniques into the project:

Similar to the 3D animation I made earlier, the user can control all the entities: helicopter, hornet, catamaran, tugboat, and turrets.

Boat in canyon (as in video), source code (C++) * wavepart_1st.zip 3.5 MB
Multivehicle animation, modelling, source code (C++) * wavepart.zip 4.6 MB
* Credits: multithreading (pthreads) implemented together with James Cosford; 3D models adapted from 3D Warehouse; textures from cgtextures; sounds from findsounds.

Faith is believin' what you know ain't so.
Samuel Langhorne Clemens


Cut mesh, friction forces, edge network

My implementation of Wave Particles deviates from Cem Yuksel's original method in several respects:

• A particle subdivision results in 2 new particles (instead of 3).

• For every partially submerged object, I compute the cut of its surface mesh along the plane that approximates the wave height underneath the object. The cut is a mesh that contains the submerged surface of the floating object, including an edge network along the boundary at water-surface level. The buoyancy is determined by the volume of the cut mesh and applied as a force at its center of gravity. The surface of the cut mesh determines the friction as the object glides through the water: we simply iterate over the triangles and quads of the cut mesh. The edge network of the cut mesh determines where new wave particles are emitted. (Cem Yuksel's original suggestion was to project the semi-submerged 3D object onto a 2D grid and then iteratively determine the boundary of the object, in order to prevent particle generation underneath the object.)
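The 2-way subdivision can be sketched as follows. This is a minimal illustration, not the actual code (the struct and function names are hypothetical): each child covers half the parent's angular extent and carries half the amplitude, so the summed wave height is preserved.

```cpp
#include <cmath>
#include <vector>

struct WaveParticle {
    double x, y;        // position on the water plane
    double dirX, dirY;  // unit propagation direction
    double amplitude;
    double spreadAngle; // angular extent covered by this particle
};

// Subdivision into 2 children (instead of Yuksel's 3): each child
// covers half the parent's angular extent, offset to either side,
// and carries half the amplitude.
std::vector<WaveParticle> subdivide(const WaveParticle& p) {
    double base = std::atan2(p.dirY, p.dirX);
    double half = p.spreadAngle / 4.0;   // offset to each child's center
    std::vector<WaveParticle> children;
    for (int s = -1; s <= 1; s += 2) {
        WaveParticle c = p;
        double a = base + s * half;
        c.dirX = std::cos(a);
        c.dirY = std::sin(a);
        c.amplitude   = p.amplitude   / 2.0;
        c.spreadAngle = p.spreadAngle / 2.0;
        children.push_back(c);
    }
    return children;
}
```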
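The buoyancy part can be sketched as follows, assuming the cut mesh is available as a closed, outward-wound triangle list (the quads mentioned above would be split into triangles first); the names are illustrative. The volume comes from the divergence theorem: sum the signed volumes of the tetrahedra that each triangle spans with the origin.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

// Signed volume of the tetrahedron spanned by the origin and one
// triangle of a closed mesh (divergence theorem): dot(a, cross(b, c)) / 6.
static double signedTetraVolume(const Vec3& a, const Vec3& b, const Vec3& c) {
    return (a.x * (b.y * c.z - b.z * c.y)
          - a.y * (b.x * c.z - b.z * c.x)
          + a.z * (b.x * c.y - b.y * c.x)) / 6.0;
}

// Buoyancy force magnitude for a closed cut mesh: rho * g * volume.
// tris holds 3 vertices per triangle, outward-facing winding assumed.
double buoyancy(const std::vector<Vec3>& tris,
                double rho = 1000.0, double g = 9.81) {
    double vol = 0.0;
    for (std::size_t i = 0; i + 2 < tris.size(); i += 3)
        vol += signedTetraVolume(tris[i], tris[i + 1], tris[i + 2]);
    return rho * g * vol;
}
```

The same triangle loop can accumulate the volume-weighted centroid (to place the force at the center of gravity) and the submerged area (for the friction term).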

Shortcomings of my implementation: no outsourcing to the GPU, no shading to render the water surface, and no clever event timeline to schedule the subdivision and reflection events.

View from turret

The grass and bushes are rendered as sprites. During the initialization of the game, the locations of these sprites are distributed using blue noise in 2D, which guarantees a minimum distance between neighboring sprites.
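A minimal sketch of the blue-noise distribution, assuming plain dart throwing (rejection sampling); the names are illustrative, and the actual implementation may use an acceleration structure instead of the brute-force distance check.

```cpp
#include <cstdlib>
#include <vector>

struct P2 { double x, y; };

// Dart throwing: accept a random candidate only if it keeps a minimum
// distance to all previously accepted sprite locations. This yields a
// blue-noise (Poisson-disk-like) distribution.
std::vector<P2> blueNoise(int target, double minDist, double w, double h,
                          int maxTries = 10000) {
    std::vector<P2> pts;
    for (int t = 0; t < maxTries && (int)pts.size() < target; ++t) {
        P2 c = { w * (std::rand() / (double)RAND_MAX),
                 h * (std::rand() / (double)RAND_MAX) };
        bool ok = true;
        for (const P2& p : pts) {
            double dx = c.x - p.x, dy = c.y - p.y;
            if (dx * dx + dy * dy < minDist * minDist) { ok = false; break; }
        }
        if (ok) pts.push_back(c);
    }
    return pts;
}
```

Since this runs once at initialization, the O(n²) distance check is acceptable for a few thousand sprites.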

During gameplay, the top vertices of the sprites are subject to a displacement sampled from Perlin noise. Since the noise function is smooth, the vegetation appears to move in the wind. The semi-transparent sprites are rendered in descending order of their depth in the camera perspective. Beyond a certain depth threshold, the vegetation is faded out or not drawn at all.
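A minimal sketch of the wind animation and depth sorting; a smooth hash-based value noise stands in for the actual Perlin noise here, and the struct is illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Smooth 1D value noise standing in for Perlin noise: hash the integer
// lattice points, then blend with a smoothstep. Returns values in [0, 1).
double smoothNoise(double x) {
    int i = (int)std::floor(x);
    double f = x - i;
    double a = std::sin(i * 127.1) * 43758.5453;       a -= std::floor(a);
    double b = std::sin((i + 1) * 127.1) * 43758.5453; b -= std::floor(b);
    double t = f * f * (3.0 - 2.0 * f);                // smoothstep blend
    return a + t * (b - a);
}

struct Sprite { double x, y, z; double topOffset; double camDepth; };

// Displace the top vertices by time-varying noise (phase shifted by
// position, so neighboring plants do not move in lockstep), then sort
// the semi-transparent sprites back-to-front for correct blending.
void animateAndSort(std::vector<Sprite>& sprites, double time) {
    for (Sprite& s : sprites)
        s.topOffset = 0.3 * (smoothNoise(time + 0.1 * s.x) - 0.5);
    std::sort(sprites.begin(), sprites.end(),
              [](const Sprite& a, const Sprite& b) {
                  return a.camDepth > b.camDepth;      // farthest first
              });
}
```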

Perlin noise is used in two more places in the game: the shape and color of the fireballs, and the fluctuation of the helicopter. Stefan Gustavson's implementation of Perlin and simplex noise is used.

Hornet from Halo

In order to display the scene on a 3D monitor, the scene is rendered twice, once from the perspective of each eye. The pseudocode shows how simple it is:

while (1) {
  // BEGIN: animation/physics, user control, ...
  // END: animation/physics, user control, ...
  tensor<double> cam = opengl_camera;
  double alp = 0.0325;              // toe-in angle per eye (radians)
  double ofs = 0.22;                // horizontal eye offset
  for (int c0 = 0; c0 < 2; ++c0) {  // c0 == 0: left eye, c0 == 1: right eye
    tensor<double> eye = tensor<double>::matrix(4,4).as(
                 cos(alp), 0.0, (c0?1:-1)*sin(alp), 0.0,
                      0.0, 1.0,                0.0, 0.0,
       (c0?-1:1)*sin(alp), 0.0,           cos(alp), 0.0,
            (c0?1:-1)*ofs, 0.0,                0.0, 1.0);
    // BEGIN: drawing
    // END: drawing
  }
}

If I do not issue commands, the people will behave themselves
if I do not preach but remain silent, the people will find serenity
if I do not meddle, the people will prosper by themselves
if I do not impose my desires, the people will return to simplicity.
Lao Tzu