Creating a Virtual Reality Texture Painting App
Sept 1, 2024

I've always thought that painting models would be more intuitive in VR. Here's my journey to creating a VR texture painting app using Wonderland Engine and WebGL.

Many 3D software packages let you paint directly onto your 3D models, but I have yet to see one that controls the brush with VR. The current process lets you paint on a 2D projection of the 3D model, a big improvement over the direct UV map edits of the olden days, but it still has issues when painting occluded sections of the mesh.

< Toe beans image here >

Here were a few of my goals coming into the project:
  • Any texture painting app should be able to handle large textures. Nowadays, textures for character models are 1024 x 1024 pixels at a minimum, with a very common dimension being 4096 x 4096!
  • As with all VR apps, it needs to run at a minimum of 60 FPS, even on underpowered hardware (cough cough Quest headsets), or else users risk motion sickness and throwing up, a very unpleasant user experience.
  • It should support all the key functions of comparable modeling suites, at least for handpainting purposes. Substance Painter has very nice procedural smart materials, for example, and excellent baking capabilities, but those are out of scope for this project. I ended up implementing these features:
      • Real-time painting
      • Brush radius control
      • Brush opacity & falloff control
      • Color picker
      • X-mirror symmetry
      • Undo/redo system
      • Layer system with the ability to reorder, toggle visibility, and change blending modes
      • Custom UI for VR controllers
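Of these, the radius, opacity, and falloff settings all boil down to a single per-pixel weight. Here is a rough sketch of the idea, not the app's actual code: `brushWeight` is a hypothetical name, and the linear falloff between an inner hard radius and the outer radius is my own simplification.

```javascript
// Hypothetical brush weight: full opacity inside the hard inner radius,
// fading linearly to 0 at the outer radius. `falloff` in [0, 1] controls
// how much of the radius is soft (0 = hard-edged brush).
function brushWeight(distance, radius, falloff, opacity) {
  const hard = radius * (1 - falloff);
  if (distance >= radius) return 0;
  if (distance <= hard) return opacity;
  return opacity * (1 - (distance - hard) / (radius - hard));
}
```

The painted color is then blended into the texture per pixel using this weight as the alpha.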

Why shaders?

I started this project with the VR game engine Wonderland, which handles all the rendering, raycast and intersection testing, and controller input, so why did I bring in WebGL?

For real-time painting, I need to update the texture every frame the model is within range of the brush. Because I want to color all the pixels within the given radius, a single operation can touch many disparate triangles, and therefore many pixels scattered all over the texture. Under a single-threaded framework, I would need to check each pixel of the (at least) 1024 x 1024 texture to see if it's in range of the brush. That's over a million operations each frame at minimum! I brought in WebGL to take advantage of the GPU's parallelism: each pixel gets its own fragment shader invocation, hardware-accelerating the process.
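To make that cost concrete, here is what the single-threaded version would look like. This is a hedged JavaScript sketch, not the app's code; `paintNaive` and the per-texel `worldPos` lookup are hypothetical.

```javascript
// Naive single-threaded painting sketch (for cost illustration only).
// Assumes worldPos(x, y) returns the 3D surface point texel (x, y) maps to.
function dist(a, b) {
  const dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// texture: RGBA bytes (size * size * 4). Every texel gets a distance test
// against the brush: size^2 checks per frame, i.e. 1M+ at 1024 x 1024.
function paintNaive(texture, size, worldPos, brushPos, radius, color) {
  let touched = 0;
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      if (dist(worldPos(x, y), brushPos) < radius) {
        const i = (y * size + x) * 4;
        texture[i] = color[0];
        texture[i + 1] = color[1];
        texture[i + 2] = color[2];
        texture[i + 3] = 255;
        touched++;
      }
    }
  }
  return touched;
}
```

On the GPU, that inner distance test becomes the fragment shader, and all texels are checked in parallel.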

UV Textures as Shader Output
Dynamic painting with shaders has been explored before, such as in these devlogs by TNTC and Lone Developer. Both videos were very helpful in understanding how to hijack the shader pipeline for texture painting.

In the normal shader pipeline, vertices in 3D world space are projected to 2D screen-space coordinates in the vertex shader, and then each pixel of the render output is colored in the fragment shader using interpolated vertex attributes. Since our output is a UV-unwrapped texture and not a render of the scene, we can hijack the pipeline by having the vertex shader pass out the UV coordinates as the screen-space position instead.

< image of process >

#version 300 es
precision mediump float;

in vec3 a_vertexPos;
in vec2 a_vertexUVs;

out vec3 f_worldPos;

void main() {
    f_worldPos = a_vertexPos;
    // UV coordinates range from 0 to 1, clip-space coordinates from -1 to 1
    vec2 rangeFixedUVs = 2.0 * a_vertexUVs - vec2(1.0);
    gl_Position = vec4(rangeFixedUVs, 0.0, 1.0);
}

Here is pseudocode for the fragment shader; we'll go deeper into it when we get to brush settings:

void main() {
    if (distance(brush_pos, f_worldPos) < radius) {
        out_color = brush_color;
    } else {
        out_color = /* current pixel color in the texture */;
    }
}

One wrinkle: WebGL can't sample the texture a draw call is currently rendering into, so reading the "current pixel color" means keeping a second copy of the texture.
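That second copy leads to the classic ping-pong pattern: the shader reads from texture A while rendering into texture B, and the two swap roles after each paint pass so the next pass sees the stroke that was just applied. A minimal sketch of that bookkeeping (the names here are mine, not Wonderland's or WebGL's):

```javascript
// Hypothetical ping-pong pair: `read` is the texture the shader samples,
// `write` is the render target. swap() exchanges the roles after each pass.
function makePingPong(texA, texB) {
  let read = texA, write = texB;
  return {
    get read() { return read; },
    get write() { return write; },
    swap() { [read, write] = [write, read]; },
  };
}
```

In the real app, `texA` and `texB` would be WebGL textures attached to framebuffers; here they can be anything, which keeps the swap logic easy to reason about.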
More to come! Documentation will be updated in the future.