Taking a little break from the wasm debugger to integrate harfbuzz into &orca
I finally published my end of jam writeup about &babbler here: https://forkingpaths.dev/posts/24-10-02/wheel_reinvention_jam_writeup.html
I explain the motivations of the project, how it works, how it differs from the wheel it's trying to reinvent, what insights I got from it and the new questions it raised.
I half-hacked variables at the last minute, as demonstrated by this little counter. I think that will be all for this jam! &babbler is obviously just a tiny computing toy at this point, but it's already pretty fun to play with.
Today I added a first spatial query to &babbler. You can detect if a card p points at another card q in a given direction d with a query of the form (when p points d at q). p, q, and d can also be placeholders so you can react to several spatial relationships at once. The query itself triggers the drawing of "pointing whiskers" so you can visualize how cards point to each other.
Since it's impractical to precompute all possible spatial relationships, I introduced a notion of "responders" to the facts database, i.e. callbacks that can generate new facts on demand when a query matches a specific pattern. Next step is to add new spatial responders (e.g. proximity, clustering, etc.) and hopefully allow such responders to be defined in user-code.
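For illustration, here's a minimal C sketch of what a responder could look like (names and structure are invented for the example, this isn't babbler's actual API):

```c
// Hedged sketch of the "responder" idea: instead of precomputing every spatial
// relationship, a callback generates matching facts on demand when a query asks.
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, angle; } card;

typedef struct {
    const char* relation;  // e.g. "points"
    int subject, object;   // card indices
} fact;

#define MAX_FACTS 256
typedef struct {
    card* cards;
    int cardCount;
    fact facts[MAX_FACTS];
    int factCount;
} fact_db;

static void db_assert(fact_db* db, fact f)
{
    if(db->factCount < MAX_FACTS) { db->facts[db->factCount++] = f; }
}

// Responder for (when p points at q): walks the cards and asserts a "points"
// fact for every pair whose heading roughly aims at the other card.
static void points_responder(fact_db* db)
{
    for(int p = 0; p < db->cardCount; p++)
    {
        for(int q = 0; q < db->cardCount; q++)
        {
            if(p == q) { continue; }
            float dx = db->cards[q].x - db->cards[p].x;
            float dy = db->cards[q].y - db->cards[p].y;
            float heading = atan2f(dy, dx);
            if(fabsf(heading - db->cards[p].angle) < 0.1f)
            {
                db_assert(db, (fact){ "points", p, q });
            }
        }
    }
}

int main(void)
{
    card cards[2] = { {0, 0, 0}, {10, 0, 0} }; // card 0 aims along +x at card 1
    fact_db db = { cards, 2 };
    points_responder(&db);
    for(int i = 0; i < db.factCount; i++)
    {
        printf("card %d %s at card %d\n", db.facts[i].subject, db.facts[i].relation, db.facts[i].object);
    }
    return 0;
}
```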
Reproducing a small DynamicLand tutorial in &babbler! This shows the use of claim, when and wish clauses, how to apply illuminations to the cards, and how to use placeholders in when clauses to match several claims.
Func facts! Not much progress today, but we're now registering claim clauses to a "facts" database, and matching when clauses with existing facts &babbler.
&babbler is a little experiment based on my misunderstanding of DynamicLand's Realtalk. Here's a first jab at the UI, where you can put cards on a canvas and write code on them. Got immediately sidetracked into writing a little structure editor for the cards...
Switching between wasm and internal bytecode views, and adding breakpoints &orca
Stepping through my wasm interpreter's internal bytecode (wip wasm3 replacement for &orca)
I wrote a blog post describing Orca's vector graphics backend: https://orca-app.dev/posts/240426/vector_graphics.html &orca
!til late addition to my learning jam project on &colors: how to derive the HSL/HSV color models from an RGB cube: https://youtu.be/1zwXnf0G2II
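For reference, the derivation boils down to a few lines of code (the usual textbook formulation, not taken from the video):

```c
// Standard RGB -> HSV conversion. r, g, b in [0,1];
// returns h in degrees [0,360), s and v in [0,1].
#include <math.h>

typedef struct { float h, s, v; } hsv;

hsv rgb_to_hsv(float r, float g, float b)
{
    float max = fmaxf(r, fmaxf(g, b)); // value = position along the cube's grey diagonal
    float min = fminf(r, fminf(g, b));
    float chroma = max - min;          // distance from the grey axis

    float h = 0;
    if(chroma > 0)
    {
        if(max == r)      { h = fmodf((g - b) / chroma + 6.0f, 6.0f); }
        else if(max == g) { h = (b - r) / chroma + 2.0f; }
        else              { h = (r - g) / chroma + 4.0f; }
        h *= 60.0f;                    // sextant index -> degrees
    }
    float s = (max > 0) ? chroma / max : 0;
    return (hsv){ h, s, max };
}
```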
!til the sRGB curve is slightly different from a simple gamma curve, and the difference is actually noticeable on darker &colors. So here's a follow-up video on the proper way to do sRGB conversions: https://youtu.be/T54SX-QwFpc
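For reference, the standard piecewise sRGB transfer functions look like this (a short linear toe near black, then a power curve, rather than a plain pow(x, 1/2.2) gamma):

```c
#include <math.h>

// linear -> sRGB encoding
float linear_to_srgb(float x)
{
    return (x <= 0.0031308f) ? 12.92f * x
                             : 1.055f * powf(x, 1.0f / 2.4f) - 0.055f;
}

// sRGB -> linear decoding
float srgb_to_linear(float x)
{
    return (x <= 0.04045f) ? x / 12.92f
                           : powf((x + 0.055f) / 1.055f, 2.4f);
}
```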
!til here's part two of my learning jam project &colors where I explain gamma correction and sRGB encoding, and update the vector graphics renderer of &orca to be gamma correct: https://youtu.be/cFoi1OLHFQ0
!til a bit about &colors, so here are some ramblings about the XYZ color space and xy chromaticity diagram: https://youtu.be/IOJe1ugkc9Y
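For reference, the chromaticity coordinates simply normalize luminance out of XYZ, which is what flattens the color solid onto the familiar 2D diagram:

```latex
x = \frac{X}{X + Y + Z}, \qquad y = \frac{Y}{X + Y + Z}, \qquad z = 1 - x - y
```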
The tiger is back. Running pretty ok already without much optimization. I could probably cull a lot more stuff. &orca
Just added strokes (converting strokes to filled paths for now) &orca
Added MSAA in the compute shader. It might be more efficient to let a fragment shader "automatically" do it, but I'd have to change the simple final blit pass with a draw indirect call that generates triangles for each touched tile... will test later! &orca
Finally, propagating tile-local winding numbers to get global winding numbers. We can now fill paths!
Now I've got a bunch of cleanup to do. And then strokes, texturing, anti-aliasing, clipping, translucency (not necessarily in that order) &orca
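For illustration, the propagation step can be sketched as a per-row prefix sum over tile winding deltas (a simplified CPU version of the idea, not the actual shader code):

```c
// Each tile stores the winding delta contributed by curves crossing it; a prefix
// sum along the row gives the global winding number at each tile's left edge.
void propagate_winding(const int* tileDelta, int* tileWinding, int tilesPerRow, int rowCount)
{
    for(int row = 0; row < rowCount; row++)
    {
        int winding = 0; // winding number entering the row from outside the path
        for(int tile = 0; tile < tilesPerRow; tile++)
        {
            int index = row * tilesPerRow + tile;
            tileWinding[index] = winding; // global winding at this tile's start
            winding += tileDelta[index];  // add the tile-local contribution
        }
    }
}
// Tiles with a non-zero starting winding and no curves can be filled wholesale;
// tiles containing curves still get per-pixel coverage.
```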
Same, but merging the tile bins across paths, and rasterizing only the tiles inside the bounding box of the path control points &orca
Another building block for orca's webgpu vector graphics renderer: binning monotonic curves per-path and per-tile and computing tile-local winding numbers &orca
I just finished adding GLES bindings to &orca, so I figured I could try porting my old WebGL fluid sim experiments to it, and it was very pleasingly easy: https://www.youtube.com/watch?v=NqfEtRGEdYU
@bvisness wrote a small breakout game in &orca using our vector graphics renderer! Some platform discrepancies to iron out still, but it's shown here running on windows and macos.
I added a logging API and a basic in-app console in &orca, that displays log entries emitted by the guest webassembly module. It doesn't look like much yet, but it's already a nice quality of life improvement. The entries are also structured, meaning it could later be used to filter them or to jump to the line that generated the message, set a breakpoint, etc.
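For illustration, a structured entry might carry something like this (a sketch with invented names, not the actual &orca API), where keeping source location and severity as fields instead of baking them into a string is what makes filtering or jump-to-line possible later:

```c
typedef enum { LOG_LEVEL_ERROR, LOG_LEVEL_WARNING, LOG_LEVEL_INFO } log_level;

typedef struct log_entry
{
    log_level level;
    const char* file;      // guest source file that emitted the entry
    const char* function;
    int line;
    const char* message;   // formatted message text
} log_entry;

// Host-side console collects and displays entries.
void console_push(log_entry entry);

// A guest-side logging call can then be a thin wrapper that fills in the
// source location automatically.
#define log_info(msg) console_push((log_entry){ LOG_LEVEL_INFO, __FILE__, __func__, __LINE__, (msg) })
```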
As a small jam experiment, I added an overlay to &orca that shows you the source code of the webassembly module it's running, and displays colored dots next to the functions that are executed. The color itself indicates the time spent in that function during the frame, relative to the other functions of the module (i.e. green = less weight, red = more weight).
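For illustration, the green-to-red mapping can be as simple as a lerp on each function's share of the heaviest function's frame time (a plausible sketch, not necessarily the exact mapping used in the overlay):

```c
typedef struct { float r, g, b; } color;

// Map a function's frame time, relative to the most expensive function,
// to a dot color going from green (low weight) to red (high weight).
color weight_color(float functionTime, float maxFunctionTime)
{
    float t = (maxFunctionTime > 0) ? functionTime / maxFunctionTime : 0;
    return (color){ t, 1.0f - t, 0.0f };
}
```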
I implemented another method for the &orca gpu vector graphics renderer. Same ghostscript tiger as before (1600x1200 window, 8x msaa), approximately 6x faster: https://twitter.com/forkingpathsdev/status/1644636281459056642?s=20 (edit: deleted and reuploaded, because I can't do twitter right and I posted the video of the previous method)
I wrote about the styling system I use in the &orca UI toolkit, which solves some problems I had with stack based approaches: https://www.forkingpaths.dev/posts/23-03-10/rule_based_styling_imgui.html
Porting my vector graphics renderer to opengl on windows (All glyphs are re-rendered from vector outlines each frame).
Squeezed in one last feature for &orca before the finish line. Launching apps from the browser, with a choice to cache them for offline use.
Not much visible progress on &orca today, but I did a lot of background work! The guest module can now export event handlers (e.g. OnMouseUp() or OnFrameResize()), and these get called automatically (i.e. no need to register input handlers). So now I can click to change the direction of rotation of my triangle. Phew! But to be fair, I warned you that there wasn't much eye candy today!
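For illustration, the guest side could look roughly like this (handler names are the ones from the post, the signatures are assumptions):

```c
typedef struct { float width, height; } frame_size;

// Exported from the wasm module; called by the runtime when a mouse button is released.
void OnMouseUp(int button)
{
    // e.g. flip the triangle's direction of rotation here
}

// Exported from the wasm module; called by the runtime when the frame is resized.
void OnFrameResize(frame_size size)
{
    // e.g. update the projection to the new size here
}
```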
Phew! I finally figured out a way to do cross-process graphics. So now I can give each tab process a GLES context that can be displayed inside the main &orca process. To celebrate I'm having some remote triangles!
Intro video to my jam entry &orca https://youtu.be/1HM-VUi2YlE
Day3 - The &orca launcher now loads metadata from the app bundles it finds in the local apps folder. You can click an icon to select an app and see its banner and description. Double clicking creates a new tab backed by a separate process.
Just a very crude UI mockup for the first day! The OS makes handling animations during resize overly painful, and I spent way too much time on this :lol_sobbing: &orca
First draft of a completion panel in Quadrant's structure editor. It's not hooked up to the type system and symbol tables yet (so for now it only suggests syntactic constructs), and there are still a lot of little UX decisions to be made in order to provide a fluid completion workflow, but I think the general idea is working pretty well. &quadrant
Doing some UX experimentation for the &quadrant structure editor: here the editor is automatically inserting "holes" for missing tokens, with syntactic/semantic hints, while trying to retain a "text-like" editing flexibility.
Here's a short presentation I made for the Sound and Music Computing Conference 2022. I go over the goals and approach of my temporal programming environment &quadrant and then demo how you can pilot a Max/MSP patch from a Quadrant program using OSC.
Follow-up to my first contact with fluid sim: I implemented what I understood of the multigrid method. Still unsure I got it 100% right (need more testing), but it seems to address some obvious issues of the first try!
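For illustration, the smoother at the heart of each multigrid level is typically a Jacobi sweep on the pressure Poisson equation (a standard textbook discretization on a uniform 2D grid, not the code from the demo):

```c
// One Jacobi sweep for laplacian(p) = div:
// p_new = (sum of the four neighbors - h^2 * div) / 4, for interior cells.
void jacobi_sweep(const float* p, float* pNext, const float* div,
                  int nx, int ny, float h)
{
    for(int j = 1; j < ny - 1; j++)
    {
        for(int i = 1; i < nx - 1; i++)
        {
            int idx = j * nx + i;
            pNext[idx] = ( p[idx - 1] + p[idx + 1]
                         + p[idx - nx] + p[idx + nx]
                         - h * h * div[idx] ) * 0.25f;
        }
    }
}
// Multigrid restricts the residual to a coarser grid, solves there (recursively),
// then prolongates the correction back, which removes the low-frequency error
// that plain Jacobi sweeps converge on very slowly.
```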
Lately I've been trying out using minimal C to WebAssembly as small practice sessions in my spare time. I'm also kinda curious about fluid simulation, so this week-end I took the opportunity to learn some basics and whip up a small demo. It was a lot of fun 😄
Some progress update on Quadrant's language features: variadic parameters, polymorphic procedures, modules, and foreign system: https://youtu.be/AmO9hczGkYU &quadrant
I rewrote the layout strategy of &quadrant. A cell can now adapt its layout to the layout of its children (so it can avoid unwanted hanging indents or break lines when its contents get too large). I also took the opportunity to add smooth animation to the cells and cursor's position.
A little more detailed presentation/demo of my prototype temporal programming environment &quadrant. https://youtu.be/_wGPEDwp1AA
Prototype of a programming environment I'm working on, called &quadrant. The end goal is to write temporal scenarios for live shows/art installations. It has a structure editor, a non-textual DSL that gets compiled to a bytecode which is then executed by a VM. The VM and the editor communicate, which allows monitoring and controlling the execution of the program from the editor.
Day6 devlog (I couldn't work on my Jam project on day5): https://youtu.be/DmmKoz8XiJo I added a simple UI autolayout system and implemented dial knobs and a curve editor.
Day3: added code reloading, and a basic project settings file that lets me specify a build command, so the dsp code can be rebuilt from the application with a keyboard shortcut. Also did some fancy overlapped dll loading to avoid shutting down the audio during reload.
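For illustration, the overlapped reload boils down to something like this (a simplified sketch; the real thing also has to synchronize the swap with the audio thread):

```c
// Load the new library and grab its entry point *before* releasing the old one,
// then swap the pointer the audio callback uses, so audio never has to stop.
#include <dlfcn.h>

typedef void (*dsp_process_proc)(float* buffer, int frameCount);

typedef struct
{
    void* handle;
    dsp_process_proc process;
} dsp_module;

int dsp_module_reload(dsp_module* module, const char* path)
{
    // load the freshly built library alongside the old one
    void* newHandle = dlopen(path, RTLD_NOW | RTLD_LOCAL);
    if(!newHandle) { return -1; }

    dsp_process_proc newProcess = (dsp_process_proc)dlsym(newHandle, "dsp_process");
    if(!newProcess) { dlclose(newHandle); return -1; }

    // swap in the new entry point, then release the old library
    void* oldHandle = module->handle;
    module->process = newProcess;
    module->handle = newHandle;
    if(oldHandle) { dlclose(oldHandle); }
    return 0;
}
```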
Day2: I'm definitely not a fan of the CoreAudio API, and there are still a lot of kinks to iron out, but it will do for now!
I got some basic structured editing working for a scripting tool I'm making.