19 Jul 2016

Using Shaders to Generate Dynamic Textures in the Viewer API


By Michael Ge (@hahakumquat)

A week ago, I wrote a post on injecting dynamic THREE.js textures into a viewer scene by creating a plane mesh with a canvas-based material. While the solution works for simple surfaces, it fails for two major reasons:

  1. Simply applying a texture to a previously uncolored fragment usually yields an incorrect projection (since the UV mapping coordinates have most likely not been defined).
  2. A model's fragment often has complex geometry that cannot be imitated by built-in THREE.js shapes. 

With WebGL shaders, however, we can solve both problems by using pre-existing geometry and defining how each pixel should be rendered.

A Brief Introduction to Shaders

Coming from more of a 3D animation background, I always thought of shaders as scary, mystical things that "just worked." But in fact, the concept behind writing shader code is simple. There are only two types of shaders: vertex shaders and fragment shaders. Vertex shaders are programs that define where each vertex of a surface should be positioned, while fragment shaders define how every point on the surface is colored. Note that a "point" is different from a "vertex": points span the entire surface, while vertices define its boundaries. These programs output vectors based on parameters such as position, the current UV coordinate, etc. Aerotwist's intro to shaders is a great way to get started relatively painlessly using built-in THREE.js functionality.
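For a concrete picture, here is a minimal pass-through pair (a sketch of my own, not code from this project). THREE.js injects variables like position, projectionMatrix, and modelViewMatrix into every ShaderMaterial program:

    // Minimal vertex shader: transform each vertex from model space into
    // clip space using the matrices THREE.js provides.
    void main() {
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }

    // Minimal fragment shader: paint every point on the surface solid red.
    void main() {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    }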

Generating the Material

First, we'll use THREE.js's native ShaderMaterial instead of a preset mesh material to texture a fragment. Here, we pass in the vertex shader, fragment shader, and uniform variables. The uniforms include the width and height of the fragment I want to project onto, the canvas texture, and a corner variable that is (er, initially was) used for offsetting every point so that it has nonnegative coordinates. The width, height, and corner values come from a _bounds box computed by the genBounds function.
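As a rough sketch of that setup (the original gist isn't reproduced here, so genBounds and canvasTexture are stand-ins for the post's actual code):

    // Bounding box of the target fragment; genBounds is a stand-in for the
    // post's helper, assumed to return min/max corners in world space.
    var bounds = genBounds(fragId);
    var width  = bounds.max.x - bounds.min.x;
    var height = bounds.max.y - bounds.min.y;

    var material = new THREE.ShaderMaterial({
        uniforms: {
            width:   { type: 'f',  value: width },
            height:  { type: 'f',  value: height },
            corner:  { type: 'v2', value: new THREE.Vector2(bounds.min.x, bounds.min.y) },
            texture: { type: 't',  value: canvasTexture } // THREE.Texture wrapping the heat-map canvas
        },
        vertexShader:   vertexShaderSource,   // GLSL strings, sketched below
        fragmentShader: fragmentShaderSource,
        side: THREE.DoubleSide
    });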

The color and displacement of every point are then determined by the shaders.

The Shaders

For my heat map, I wanted every point on the canvas to translate to its scaled position on the actual rooftop, but with a bit of math, shaders offer much more complex and interesting visualizations. 

The final position of any point is determined by the gl_Position variable in the vertex shader. THREE.js magically passes in the position of the current vertex (position), as well as the projection and model-view matrices required to transform that position into the correct perspective. During these calculations, I also compute a vec2 called vUv to pass into the fragment shader; a sketch of the full vertex shader follows the discussion below.

Now, my original intent when calculating vUv was to project the point onto the xy-plane (bird's-eye view), shift it by an offset defined by the minimum corner of the roof, and divide the result by the roof's width and height. In theory, this should yield values between 0 and 1, the range the fragment shader expects when sampling a texture. The calculation looked something like:

vUv = vec2(abs(projection.x - corner.x) / width, abs(projection.y - corner.y) / height);

But for reasons unknown, it doesn't perform as expected. The code in the gist seems to work, but if anyone can figure out how to sensibly calculate the UV coordinates, I'd love to know.
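For reference, here is a sketch of the vertex shader's overall structure, using the intended calculation above (again, the working gist computed vUv somewhat differently):

    // Uniforms supplied by the ShaderMaterial above.
    uniform float width;
    uniform float height;
    uniform vec2 corner;

    // Passed to the fragment shader, interpolated across the surface.
    varying vec2 vUv;

    void main() {
        // Bird's-eye projection: drop z, shift by the roof's minimum corner,
        // and scale into [0, 1].
        vUv = vec2(abs(position.x - corner.x) / width,
                   abs(position.y - corner.y) / height);
        // Standard transform into clip space; position, projectionMatrix, and
        // modelViewMatrix are injected by THREE.js.
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }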

Either way, a UV vec2 of scaled values between 0 and 1 is passed to the fragment shader. There, gl_FragColor simply samples the texture at the UV coordinate specified by vUv.
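The fragment shader itself is only a few lines (a sketch, matching the uniforms above):

    uniform sampler2D texture; // the canvas-backed heat-map texture
    varying vec2 vUv;          // interpolated UV from the vertex shader

    void main() {
        // Sample the canvas texture at this point's scaled coordinate.
        gl_FragColor = texture2D(texture, vUv);
    }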

Setting the Material

The fragment proxy gives us access to a mesh's material, so we simply update the model with our newly created shader material. Here, I look up the roof mesh's fragment proxy and recolor the roof with the appropriate material.
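A sketch of that last step, assuming roofFragId (the roof's fragment id, e.g. captured from a selection event) is already known; the exact proxy and material calls can vary between viewer versions, so treat the details below as assumptions rather than the post's exact code:

    // Assign the shader material to the roof's fragment and force a redraw.
    var fragList = viewer.model.getFragmentList();
    fragList.setMaterial(roofFragId, material);
    viewer.impl.invalidate(true);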

Implications

We now have a way to display textures on meshes that weren't intended to have textures! The possibilities really are endless: videos can be projected onto any surface, data can be pulled into a scene to create an IoT viewer, and even entire webpages can be placed in a scene. The specific use case is up to you, but the ability to take a texturable asset, import it into THREE.js, and project onto it via either shaders or standard materials makes the viewer an especially powerful tool.
