WebGL Point Cloud Sandbox

UPDATE – 08 NOVEMBER 2016

I found some time to optimize my native WebGL FBO engine and turn a few parameters into UI controls. The user can now upload an OBJ or PLY file, apply tessellation iterations to add more vertices to the cloud, play with it using a brush tool, take snapshots, or create slow-motion effects by adjusting velocities and gravity, … I think it’s almost ready for a first release as a Point Cloud Sandbox.
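As a rough illustration of how simulation parameters can be turned into UI controls, here is a minimal sketch using dat.GUI. The uniform and object names (uGravity, uDamping, velocitySimulation, renderMaterial) are placeholders, not the engine's actual ones.

    // Hypothetical sketch: exposing simulation parameters as dat.GUI controls.
    // uGravity / uDamping / uPointSize and the material names are assumed, not real.
    var params = { gravity: -9.81, damping: 0.98, pointSize: 2.0 };
    var gui = new dat.GUI();
    gui.add(params, 'gravity', -20, 0).onChange(function (v) {
      velocitySimulation.uniforms.uGravity.value = v;   // velocity pass uniform
    });
    gui.add(params, 'damping', 0.9, 1.0).onChange(function (v) {
      velocitySimulation.uniforms.uDamping.value = v;   // slow-motion-style damping
    });
    gui.add(params, 'pointSize', 1, 10).onChange(function (v) {
      renderMaterial.uniforms.uPointSize.value = v;     // point sprite size
    });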

 

UPDATE – 21 JULY 2016

At last, my native WebGL particles now support (classic) shadow maps, and Phong lighting is fully integrated with the ThreeJS pipeline using a point-sprite technique with generated depth.
A live demo is definitely coming, but I still need to expose some additional parameters and build a fresh GUI… in the meantime, here is another video preview:

 

FIRST RESEARCH – 4 MAY 2016

I have written particle systems in many different environments. This time, I am bringing that experience to a web page using WebGL and JavaScript.

To get started, I looked for examples and found quite a lot with different approaches, but I was not happy with their functionality or the structure of their source code. That’s why I wrote a new FBO-oriented system from scratch that uses the draw buffers extension (WEBGL_draw_buffers) in a proper way. With this GL extension I can output to more than one texture from the fragment shader and handle more data in a GPGPU fashion, with position maps, velocity maps, extra-parameter maps, …
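For context, this is roughly what enabling the extension and binding several color attachments looks like in raw WebGL 1; it is a generic sketch, not the engine's actual code, and it assumes positionTex and velocityTex are float textures that already exist.

    // Sketch: multiple render targets with WEBGL_draw_buffers (WebGL 1).
    var ext = gl.getExtension('WEBGL_draw_buffers');
    if (!ext) { throw new Error('WEBGL_draw_buffers not supported'); }

    // One FBO with two color attachments: a position map and a velocity map.
    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, ext.COLOR_ATTACHMENT0_WEBGL, gl.TEXTURE_2D, positionTex, 0);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, ext.COLOR_ATTACHMENT1_WEBGL, gl.TEXTURE_2D, velocityTex, 0);
    ext.drawBuffersWEBGL([ext.COLOR_ATTACHMENT0_WEBGL, ext.COLOR_ATTACHMENT1_WEBGL]);

    // The simulation fragment shader (GLSL ES 1.0) then writes one texel per attachment:
    //   #extension GL_EXT_draw_buffers : require
    //   ...
    //   gl_FragData[0] = vec4(newPosition, 1.0);
    //   gl_FragData[1] = vec4(newVelocity, 1.0);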

To illustrate this approach, I scanned myself with a Kinect 2 device to get a PLY file with encoded vertex colors. I then use ThreeJS to load and parse the PLY file, and I create data textures storing 32-bit values for the vertex positions, velocities, colors and other parameters such as distances from the mouse, sizes, … To process this data, I use the classic texture-swapping (ping-pong) technique as the input/output of my framebuffer, because WebGL (OpenGL ES 2.0) has no compute feature like OpenCL or DirectCompute (more here).
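A minimal sketch of these two building blocks, assuming a 2016-era ThreeJS API; the variable names are mine and the real engine certainly differs. First, vertex positions are packed into a 32-bit float DataTexture (one RGBA texel per vertex); second, two render targets are swapped every frame, since WebGL 1 cannot read and write the same texture in one pass.

    // Sketch: pack vertex positions into a float DataTexture.
    var size = Math.ceil(Math.sqrt(vertexCount));            // square texture side
    var data = new Float32Array(size * size * 4);
    for (var i = 0; i < vertexCount; i++) {
      data[i * 4 + 0] = positions[i * 3 + 0];
      data[i * 4 + 1] = positions[i * 3 + 1];
      data[i * 4 + 2] = positions[i * 3 + 2];
      data[i * 4 + 3] = 1.0;
    }
    var positionTexture = new THREE.DataTexture(data, size, size, THREE.RGBAFormat, THREE.FloatType);
    positionTexture.needsUpdate = true;

    // Sketch: texture ping-pong between two float render targets.
    var rtA = new THREE.WebGLRenderTarget(size, size, { type: THREE.FloatType });
    var rtB = rtA.clone();
    function step() {
      simulationMaterial.uniforms.uPositions.value = rtA.texture; // read previous state
      renderer.render(simulationScene, simulationCamera, rtB);    // write next state (r7x-style call)
      var tmp = rtA; rtA = rtB; rtB = tmp;                        // swap for the next frame
    }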

I also made a complete integration of my FBO particle system with the ThreeJS lighting pipeline.
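The render side of such a system usually looks like the sketch below (again assuming 2016-era ThreeJS, where addAttribute was the attribute API): each vertex carries only a reference UV, and the vertex shader fetches the simulated position from the FBO texture before handing the point to the usual ThreeJS matrices. This is a generic outline, not the project's actual material.

    // Sketch: a Points object whose positions come from the simulation texture.
    var size = 512, vertexCount = size * size;
    var geometry = new THREE.BufferGeometry();
    var refs = new Float32Array(vertexCount * 2);
    for (var i = 0; i < vertexCount; i++) {
      refs[i * 2 + 0] = (i % size) / size;                 // texel column
      refs[i * 2 + 1] = Math.floor(i / size) / size;       // texel row
    }
    geometry.addAttribute('ref', new THREE.BufferAttribute(refs, 2));
    geometry.addAttribute('position', new THREE.BufferAttribute(new Float32Array(vertexCount * 3), 3));

    var material = new THREE.ShaderMaterial({
      uniforms: { uPositions: { value: null }, uPointSize: { value: 2.0 } },
      vertexShader: [
        'uniform sampler2D uPositions;',
        'uniform float uPointSize;',
        'attribute vec2 ref;',
        'void main() {',
        '  vec3 pos = texture2D(uPositions, ref).xyz;',            // simulated position
        '  vec4 mvPosition = modelViewMatrix * vec4(pos, 1.0);',
        '  gl_PointSize = uPointSize * (300.0 / -mvPosition.z);',  // size attenuation
        '  gl_Position = projectionMatrix * mvPosition;',
        '}'
      ].join('\n'),
      fragmentShader: 'void main() { gl_FragColor = vec4(1.0); }'
    });
    scene.add(new THREE.Points(geometry, material));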

The particles are rendered as spherical billboards with generated normals and depth. Later, I’ll try to get a Lagrangian render of these particles (like the one I made in Unity, here).
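For reference, here is a rough GLSL sketch of the general technique: a point sprite shaded as a small sphere, with a normal reconstructed from gl_PointCoord and the true surface depth written through EXT_frag_depth. The uniform and varying names are assumptions, and the fixed-direction Lambert term is only a stand-in for the full lighting integration.

    // Sketch (GLSL ES 1.0 fragment shader): spherical billboard with generated normal and depth.
    #extension GL_EXT_frag_depth : enable
    precision highp float;
    uniform mat4 projectionMatrix;
    uniform float uPointRadius;     // sprite radius in view-space units (assumed uniform)
    varying vec3 vViewPosition;     // sprite center in view space, from the vertex shader
    varying vec3 vColor;

    void main() {
      // Map gl_PointCoord from [0,1]^2 to [-1,1]^2 and reject fragments outside the disc.
      vec2 uv = gl_PointCoord * 2.0 - 1.0;
      float r2 = dot(uv, uv);
      if (r2 > 1.0) discard;

      // Reconstruct the sphere normal (y flipped, since gl_PointCoord grows downwards)
      // and the fragment's position on the sphere surface in view space.
      vec3 normal = vec3(uv.x, -uv.y, sqrt(1.0 - r2));
      vec3 viewPos = vViewPosition + normal * uPointRadius;

      // Write the depth of the sphere surface instead of the flat sprite depth.
      vec4 clipPos = projectionMatrix * vec4(viewPos, 1.0);
      gl_FragDepthEXT = 0.5 * (clipPos.z / clipPos.w) + 0.5;

      // Simple Lambert term with a fixed light direction, as a placeholder.
      float diffuse = max(dot(normal, normalize(vec3(0.3, 0.8, 0.5))), 0.0);
      gl_FragColor = vec4(vColor * diffuse, 1.0);
    }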

Now, I’m working on PSM (Particle Shadow Map) (more here).

The source code is not fully cleaned up yet, but I’ll try to push a live demo online soon.
Here is a short video preview running in Google Chrome.

Join the conversation
  • Jean Claude Robert - May 5, 2016 reply

    Super Nice as usual ! And Nils Frahm for the bgr-music perfect match 🙂

  • jnt - July 29, 2016 reply

    “Mindblowing!”

  • Beats Away - January 8, 2017 reply

    you are amazing. your work of particle visualization+customization is the best on youtube. i.e. the world. Looking forward to learn more about your fbo library!
