Handmade Network » Phil

Recent Activity

The pixel sorter now supports a combination horizontal + vertical sort mode. It leads to interesting results.

Finishing up the pixel sorter program to practice taking something to the finish line for a release.
Added most of the main features: different animations via task ordering (sequential vs. interleaved) and sleep timing, various sorting modes (by luminance, saturation, or channel min & max), opening image files via a file dialog, and auto-centering and scaling the image to fit the viewport.

Made a configurable pixel sorter. Specify the number of threads, number of tasks, channel threshold, and milliseconds to sleep between lines to get different animations and results.
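For anyone curious how the threshold part works, here's a minimal single-row sketch in Python. This is my own illustration, not code from the program: `sort_row`, `luminance`, and the weights are assumptions, and the real sorter also supports saturation and channel min/max modes.

```python
# Illustrative sketch of threshold-based pixel sorting on a single row.
# Contiguous runs of pixels brighter than the threshold get sorted;
# everything else stays put, which produces the streaky "glitch" look.

def luminance(px):
    r, g, b = px
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights

def sort_row(row, threshold):
    out = list(row)
    start = None  # index where the current bright run began
    # the appended black pixel is a sentinel that closes a trailing run
    for i, px in enumerate(out + [(0, 0, 0)]):
        if i < len(out) and luminance(px) > threshold:
            if start is None:
                start = i
        elif start is not None:
            out[start:i] = sorted(out[start:i], key=luminance)
            start = None
    return out
```

Sorting each row independently gives a horizontal mode; running the same pass over columns would give a vertical one.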

Ported a 2D physics header from C to Odin. Turns out, it's pretty fun to play with.

Getting back to work on my project &sol after 2 months on the road. Bindings to Spout (https://github.com/leadedge/Spout2) are now working in Odin for texture sharing with other applications. I can now start to use this for live performance instead of the old C++ version of my app!

Here's a short video showcasing the recent progress on "Intent", kicked off by the Wheel Reinvention Jam as a way of rethinking desktop UX. It's now at a place where I can start experimenting with using it as a daily driver to switch between different contexts on Windows by managing which windows and processes are open and shown. The video should make the current model and capabilities clear if you're curious!

Short demo showing the latest progress on porting my project &sol to Odin.

  • I can now switch between different scenes and interact with an ImGui interface to change parameters.
  • The scene GUI and the pipeline that uploads the structure to the GPU/shaders are all auto-generated using core:reflect.
  • Post-processing shader is also now working.
  • Some simple audio analysis has been added (summing range of frequencies i.e. lows/mids/highs).
  • Next steps are to add compute shaders, a simple Render Graph representation, and serializing the scene data.
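The band-summing idea in the audio-analysis bullet can be sketched roughly like this (Python rather than Odin, and the function name, bin math, and band edges are my own assumptions):

```python
# Rough sketch: collapse an FFT magnitude spectrum into lows/mids/highs
# by summing the bins whose frequencies fall in each range.

def band_energies(magnitudes, sample_rate, fft_size,
                  bands=((20, 250), (250, 2000), (2000, 8000))):
    hz_per_bin = sample_rate / fft_size  # frequency covered by one bin
    energies = []
    for lo_hz, hi_hz in bands:
        lo_bin = int(lo_hz / hz_per_bin)
        hi_bin = int(hi_hz / hz_per_bin)
        energies.append(sum(magnitudes[lo_bin:hi_bin]))
    return energies  # [lows, mids, highs]
```

Each of the three sums can then drive a shader uniform directly.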

Jam Project "Intent": Reinventing Desktop UX


Most desktop UX is a blank canvas with the whole universe of computing open to its users. On top of that, the manipulative tactics of social media, video streaming platforms, and other software designed to constantly capture users' attention lead to an extremely distracting environment full of well-trodden time sinks.

The goal of "Intent" is to reject this paradigm and bring mindfulness to the way we interact with our computing devices. Instead of sitting down at your computer and habitually opening the usual vices, "Intent" immediately confronts you with the question: "What do you want to accomplish?" Select a session and do what you set out to do without distraction.

Select a task and "Intent" will launch a pre-configured list of applications, websites, and folders to get to the task at hand. Pause sessions to hide the windows, switch contexts, and resume later. Set a timer and "Intent" will automatically close them down when time has expired.
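To make that workflow concrete, here's a hypothetical Python sketch of what a session definition and launcher could look like. None of these field names, paths, or behaviors come from the actual project; it's purely illustrative.

```python
# Purely hypothetical sketch of an "Intent"-style session: a config that
# lists what to open, plus a timer that closes everything afterwards.
import subprocess
import time
import webbrowser

session = {
    "name": "Write blog post",
    "apps": ["notepad.exe"],               # applications to launch
    "urls": ["https://handmade.network"],  # websites to open
    "minutes": 25,                         # close it all when time is up
}

def run_session(cfg, dry_run=True):
    procs = []
    if not dry_run:
        for app in cfg["apps"]:
            procs.append(subprocess.Popen([app]))
        for url in cfg["urls"]:
            webbrowser.open(url)
        time.sleep(cfg["minutes"] * 60)
        for p in procs:  # timer expired: shut the session down
            p.terminate()
    return len(cfg["apps"]) + len(cfg["urls"])  # number of things opened
```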

Read more and provide feedback at https://handmade.network/forums/jam/t/8128-intent__reinventing_desktop_ux

Got to project some of my visuals for a music performance the other night 😁 &sol

Got a physarum network up and running! My most complicated scene yet. &sol

Now with more evolution. &sol

Experimenting with cellular automata using compute shaders. &sol

Today's shader sketch was working on an effect. It took me way too long to figure out this code. &sol

    // iteratively split the square at `ratio` and remap each half
    vec2 st = gl_FragCoord.xy / u_resolution.xy;
    int iters = 32;
    float ratio = 0.5 + 0.5*sin(0.5*u_time);
    for (int i = 0; i < iters; i++) {
      if (st.x < ratio) {
        st.x = st.x / ratio;        // stretch the left slice to fill the square
      } else {
        float temp = st.x - ratio;  // fold the right slice, flipped and rescaled
        st.x = (1.0 - st.y);
        st.y = temp / (1.0 - ratio);
      }
    }

Today's shader sketch. It's too easy for me to get stuck playing with polar coordinates. I'm starting to brainstorm different coordinate spaces I could play with. &sol

With music this time! (It's hard to get these clips under 8MB) &sol

The daily shader practice continues. I'm finding inspiration in other artists and challenging myself to figure out their techniques. &sol

Keeping up the daily practice. I was in the zone today and iterated a lot on this one. It took a while to clean up all of the visual artifacts, and there's still some noticeable banding that I'm not entirely sure how to resolve. &sol

I'm keeping up a daily practice of creating shaders. Didn't feel very inspired today, so I played around with tweaking an older piece. &sol

Attempting to fold space. &sol

Some shader coding fun today after being inspired by the Revision party stream &sol:

Nothing too fancy yet, but geometry shaders and SSBOs are working nicely in my rendering pipeline now :) (Audio on!) https://www.youtube.com/watch?v=XfTz7IOmabs

Haven't been working on the engine much and have been playing with this gpu terrain marcher instead: https://www.youtube.com/watch?v=MP_Oq9agyww Wish I could have flown the camera around but having the window focused caused the framerate to drop and therefore OBS capture to fail. I'm not sure why having the window focused causes such a difference (input polling?) &sol

Compute shaders with SSBOs are now integrated properly into the rendering pipeline for my visual synthesizer. I'm excited about the potential to make new things with this. https://www.youtube.com/watch?v=0xsza0XkTcU

Got this compute shader particle system up and running with 8 million points. Not the best recording, but it shows the idea. I'll probably add some audio reactivity next: https://www.youtube.com/watch?v=tJERW-Wp5V4

Made this audio-reactive gpu raymarched scene. Pretty happy with how I figured out how to properly raymarch the scene by clamping the step to the minimum of the sdf or the step to the next "grid cell". Also removed moire patterns appearing when zoomed out or on the horizon by adding a jitter to the blocks. I still get some artifacts in the shadows when I move the camera too far from the origin. Not entirely sure why that starts to occur. I think I'm going to work on midi controller input next to control the camera more fluidly with faders & knobs and have more input options overall. (YT compression is less than desirable for this recording....)
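The step-clamping idea described above can be sketched like this. It's a CPU-side Python illustration under my own assumptions, not the author's shader code: `step_to_next_cell` finds the distance to the nearest grid plane along the ray, and the march step takes the minimum of that and the SDF distance.

```python
# Sketch of clamping a raymarch step so it never skips a grid cell.
import math

def step_to_next_cell(pos, dirn, cell=1.0, eps=1e-9):
    """Distance along `dirn` from `pos` to the nearest grid-plane crossing."""
    t = math.inf
    for p, d in zip(pos, dirn):
        if abs(d) < eps:
            continue  # ray is parallel to this axis's planes
        if d > 0:
            boundary = (math.floor(p / cell) + 1) * cell
        else:
            boundary = math.floor(p / cell) * cell
            if boundary == p:
                boundary -= cell  # sitting on a plane: aim for the next one
        t = min(t, (boundary - p) / d)
    return t

def march_step(sdf_dist, pos, dirn, cell=1.0, bias=1e-4):
    # never step farther than the SDF allows, but also never overshoot a
    # cell edge; the tiny bias pushes the ray across the boundary
    return min(sdf_dist, step_to_next_cell(pos, dirn, cell) + bias)
```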


Made this interactive multiview today for my shader creation / live performance app &sol: https://www.youtube.com/watch?v=ZEZeyVLfZoI

Now auto-generating UI for hotloaded shader uniforms in my visual editor / performance tool! Next step is to think about how to parse tags on the uniforms to pick which kind of UI elements to generate. https://streamable.com/1islhm

Huge step for my shader/vj project! Integrated Dear ImGui with my backend and started auto-generating UI with datadesk. Here's an audio-reactive visual to show it off 🙂 https://youtu.be/lQHwS_aw1bo