Handmade Network » Phil

Recent Activity

2023 Demo Video of &sol!

This showcases the highlights of my handmade shader catalog along with numerous features.

  • Audio-driven visuals
  • Multipass scenes
  • Compute shaders
  • Post-processing effects
  • Cross-application texture sharing
  • Dynamic audio processing controls
  • Shader hot-loading
  • Auto-generated UI & UBO via a meta-program
  • Free-flying camera controls

I'd love to hear year-end recaps from folks about their handmade projects.
I took a look through my HMN profile and am pleasantly surprised with everything I've done.

The year started off with me finishing a beta version of "Intent: Reinventing Desktop UX", kicked off by the first wheel reinvention jam.
A demo of this can be seen here: https://phtest.tv/projects/intent/

At the end of spring and beginning of summer, I went on the road with High Pulp Music doing live concert projection visuals: https://phtest.tv/projects/high_pulp/
-- For this I got to use my handmade audiovisual shader design and performance app "Sol": https://handmade.network/p/191/sol/
-- I finished porting it from C++ to Odin this year so I can keep making progress on this experimental, research-oriented application.
-- A live recording of our full audio/visual set will be coming out soon!

The year continued with me dabbling with a few game prototypes (demos not yet ready) and tooling around them.
-- I ported the raylib targeted 2d physics library "Physac" to Odin: https://github.com/thePHTest/odin-physac
-- I updated the sokol odin bindings generator to be wrapperless: https://github.com/floooh/sokol-odin
--- Also made an odin+sokol+microui demo: https://github.com/floooh/sokol-odin/pull/2/commits/f0ca2c5e0e12d404a852813dba01dbb786a76673

November was focused on doing the technical production for Handmade Seattle.

And finally, I finished the year by working on a small pixel sorter application to practice releasing something high quality on Windows+macOS+Linux and to continue learning sokol.
This will come out within the next month!
You can see a short clip of the kind of effects it can achieve here.

Also added multi-file select to process the frames of a video. Let it snow!

Added writing images to disk and some simple animation controls. Now I can put together videos like this:

Vastly improved the performance of restarting the pixel sorter when parameters change: the thread pool can now be cleared without killing its threads, and the image data's lifetime is managed better.
Also added an overlay to see the selected pixels (could improve the blending for this if you have any suggestions).

The pixel sorter is almost to the polishing and wrap-up phase. Only one more feature left now that the motion vector animation is running on the GPU.

Also not programming (yet), but here's a proof of concept of using CV signals from a synth to control visuals in TouchDesigner. I'll certainly be bringing this idea into my visual performance project &sol soon!
From the images you can see the patch cables running from the synth outputs to an almost empty eurorack. The only module at the moment is a DC-coupled audio interface that can properly send and receive control voltages. Next I'll have to try the reverse: manipulating image/video and sending the parameters as control voltage into the synth for generative audio.
And yes, the default image in TouchDesigner is a banana.

Sokol with Odin is now very polished! I updated the binding generation to be wrapperless using the new #by_ptr directive to represent const references in bindings.
I've also made an Odin sokol-microui example program. I submitted this as another example to sokol-odin for reference. https://github.com/floooh/sokol-odin/pull/2

Next up is attempting to get this working on web with wasm.

The pixel sorter now supports a combination horizontal + vertical sort mode. It leads to interesting results.

Finishing up the pixel sorter program to practice taking something to the finish line for a release.
Added most of the main features: different animations via task ordering (sequential vs interleaved) and sleep timing, various sorting modes (by luminance, saturation, or channel min & max), opening image files via file dialog, and auto-centering and scaling the image for the viewport.

Made a configurable pixel sorter. Specify the # threads + # tasks + channel threshold + # of ms to sleep between lines to get different animations and results.

Ported a 2D physics header from C to Odin. Turns out, it's pretty fun to play with.

Getting back to work on my project &sol after 2 months on the road. Bindings with Spout https://github.com/leadedge/Spout2 are now working in Odin for texture sharing with other applications. I can now start to use this for live performance instead of the old C++ version of my app!

Here's a short video showcasing the recent progress on "Intent", kicked off by the wheel reinvention jam as a way of re-thinking Desktop UX. It is now at a place where I can start experimenting with using it as a daily driver to switch between different contexts on Windows by managing the open and shown windows & processes. The video should make it clear what the current model and capabilities are if you are curious!

Short demo showing the latest progress on porting my project &sol to Odin.

  • I can now switch between different scenes and interact with an imgui to change parameters.
  • The scene gui and pipeline to upload the structure to the gpu/shaders is all auto-generated using core:reflect.
  • Post-processing shader is also now working.
  • Some simple audio analysis has been added (summing ranges of frequencies, i.e. lows/mids/highs).
  • Next steps are to add compute shaders, a simple Render Graph representation, and serializing the scene data.

Jam Project "Intent": Reinventing Desktop UX


Most desktop UX is a blank canvas with the whole universe of computing open to its users. On top of that, the manipulative tactics of social media, video streaming platforms, and other software designed to constantly capture its users' attention lead to an extremely distracting environment full of well-trodden time sinks.

The goal of "Intent" is to reject this paradigm and bring mindfulness to the way we interact with our computing devices. Instead of sitting down at your computer and habitually opening the usual vices, "Intent" immediately confronts you with the question: "What do you want to accomplish?" Select a session and do what you set out to do without distraction.

Select a task and "Intent" will launch a pre-configured list of applications, websites, and folders to get to the task at hand. Pause sessions to hide the windows, switch contexts, and resume later. Set a timer and "Intent" will automatically close them down when time has expired.

Read more and provide feedback at https://handmade.network/forums/jam/t/8128-intent__reinventing_desktop_ux

New forum thread: Intent: Reinventing Desktop UX

Got to project some of my visuals for a music performance the other night 😁 &sol

Got a physarum network up and running! My most complicated scene yet. &sol

now with more evolution. &sol

Experimenting with cellular automata using compute shaders. &sol

Today's shader sketch: working on a new effect. It took me way too long to figure out this code. &sol

    vec2 st = gl_FragCoord.xy / u_resolution.xy;
    int iters = 32;
    float ratio = 0.5 + 0.5*sin(0.5*u_time);
    for (int i = 0; i < iters; i++) {
      if (st.x < ratio) {
        st.x = st.x / ratio;
      } else {
        float temp = st.x - ratio;
        st.x = (1.0 - st.y);
        st.y = temp / (1.0 - ratio);
      }
    }

today's shader sketch. It's too easy for me to get stuck playing with polar coordinates. I'm starting to brainstorm different coordinate spaces I could play with. &sol

With music this time! (It's hard to get these clips under 8MB) &sol

the daily shader practice continues. I'm finding inspiration in other artists and challenging myself to figure out their techniques. &sol

Keeping up the daily practice. I was in the zone today and iterated a lot on this one. It took a while to resolve all of the visual artifacts, and there is still some noticeable banding that I'm not entirely sure how to resolve. &sol

I'm keeping up my daily practice of creating shaders. Didn't feel very inspired today, so I played around with tweaking an older piece. &sol

attempting to fold space. &sol

some shader coding fun today after being inspired by the revision party stream &sol :

Nothing too fancy yet but geometry shaders and SSBOs are working nicely in my rendering pipeline now : ) (Audio On!) https://www.youtube.com/watch?v=XfTz7IOmabs

Haven't been working on the engine much and have been playing with this gpu terrain marcher instead: https://www.youtube.com/watch?v=MP_Oq9agyww Wish I could have flown the camera around but having the window focused caused the framerate to drop and therefore OBS capture to fail. I'm not sure why having the window focused causes such a difference (input polling?) &sol

Compute shaders with SSBOs are now integrated properly into the rendering pipeline for my visual synthesizer. I'm excited for the potential to make new things with this. https://www.youtube.com/watch?v=0xsza0XkTcU

Got this compute shader particle system up and running with 8 million points. Not the best recording but it shows the idea. I'll probably add in some audio reactivity for this next https://www.youtube.com/watch?v=tJERW-Wp5V4

Made this audio-reactive GPU raymarched scene. Pretty happy that I figured out how to properly raymarch the scene by clamping the step to the minimum of the SDF distance or the distance to the next "grid cell". Also removed the moiré patterns that appeared when zoomed out or on the horizon by adding a jitter to the blocks. I still get some artifacts in the shadows when I move the camera too far from the origin; I'm not entirely sure why that starts to occur. I think I'm going to work on MIDI controller input next to control the camera more fluidly with faders & knobs and have more input options overall. (YT compression is less than desirable for this recording....)


Made this interactable multiview today for my shader creation / live performance app &sol: https://www.youtube.com/watch?v=ZEZeyVLfZoI

Now auto-generating UI for hotloaded shader uniforms in my visual editor / performance tool! Next step is to think on how to parse tags on the uniforms for the kind of UI elements to generate. https://streamable.com/1islhm

Huge step for my shader/vj project! Integrated Dear ImGui with my backend and started auto-generating UI with datadesk. Here's an audio-reactive visual to show it off 🙂 https://youtu.be/lQHwS_aw1bo