Automaking Prolog

I’ve been gradually building some little tools for myself to enhance the web development experience in Prolog. My “day job” is still primarily Clojure, but I really enjoy using Prolog and I’d like to bring the experience of building web apps up to par.

A couple of years ago, I made a pack for generating CSS from Prolog terms (in the style of Garden for Clojure). It was nice enough, and it made it possible to write more of a web app in (SWI-)Prolog, in conjunction with the built-in html_write library for generating the HTML with DCGs (analogous to Clojure’s Hiccup). Still, the lack of a nice story for client-side JavaScript made building the sort of very interactive things I do with Clojure + ClojureScript not as feasible.
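To give a flavour of the html_write style, here’s a minimal sketch (the predicate name and CSS class are my own illustration, not from the actual apps):

```prolog
:- use_module(library(http/html_write)).

%% A DCG that emits a styled heading, Hiccup-style: Prolog terms in,
%% HTML tokens out.
greeting(Name) -->
    html(h1(class(title), ['Hello, ', Name, '!'])).

%% Render it at the toplevel:
%% ?- phrase(greeting(world), Tokens),
%%    with_output_to(string(S), print_html(Tokens)).
```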

I’d looked at doing some shenanigans using pengines and Vue to get some sort of “live” experience while mostly writing Prolog, but I didn’t love that approach.

More recently, having started using Girouette in Clojure to do styling in the manner of Tailwind and really liking that approach, I ported Girouette to Prolog (a pretty fun exercise in itself!). With this, I had a very nice way of generating styled HTML in Prolog, but the interactive piece was still missing.

I got a final piece of inspiration from reading about Phoenix’s “LiveView” and Rails’s “Hotwire”. This seemed much easier than the “boil the ocean” approach of running Prolog on the client side, so I hacked together a little single-JS-file library that implements just enough of the “live HTML” idea to work.

I put together a little Proof-of-Concept TODO-list type thing (source here) and then, more recently, a Wordle “assistant” (source).

Building that was quite fun; I was pleasantly surprised by how well just that tiny bit of JavaScript worked to let me build a usable little site. It also has the nice attribute of degrading gracefully: if you use the sites with JavaScript turned off, they still work fine! Deploying is also very smooth: I just pull the repo with the code on the server, connect a toplevel to the running process, run make., and boom, Prolog automatically recompiles the changed predicates and the app is updated seamlessly.

There was just one nagging little thing bugging me in my dev experience. While the “tailwind” watcher would automatically update the CSS when the files containing the code changed, I would have to remember to also reload the code so that, when the page refreshed, the server would send the corresponding new HTML. It’s easy enough to do (C-c C-z to switch to the *prolog* buffer, make. to recompile, refresh, continue), but it was easy to forget and would knock me out of my flow. Having the CSS update just as the file changed made me subconsciously expect the rest of the code to work the same way, I think.

That, in addition to some comments on the SWI-Prolog Discourse, planted the idea in my head that I should build something that would automatically run make. whenever any dependent files changed. Prolog already knows which files are of interest, and I’d used inotify before, so it seemed straightforward enough to do. The wrinkle was that depending on the inotify pack would limit my tool to Linux. I’d previously had issues with someone unable to use my tailwind_pl library because it depended on inotify, even though inotify was only needed at “dev time”, not if one just wanted to generate the CSS once. Since there isn’t (yet?) a way to have conditional dependencies in packs, I’d handled the tailwind case by splitting it into two libraries: the main tailwind_pl pack that contains the file-watcher code, and tailwind_pl_generate, which depends only on my css_write pack and should work anywhere.
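The “Prolog already knows which files are of interest” part falls out of the system being reflective. A sketch of the idea, where watch_file/1 is a hypothetical stand-in for an actual watcher binding:

```prolog
%% source_file/1 enumerates every file loaded into the running
%% system, which is exactly the set make/0 considers reloading.
watch_loaded_files :-
    forall(source_file(File),
           watch_file(File)).   % hypothetical watcher registration

%% When the watcher reports any change, just call make/0; it
%% recompiles only files whose modification time is newer than
%% their load time.
on_change(_File) :-
    make.
```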

In this case, though, it would be nice if I could just make it work on multiple platforms. As I’ve written about before, SWI’s FFI is quite nice, so it was short work to put together a little “foreign module” in C that uses inotify on Linux and kqueue on macOS to monitor for file changes, then to write a tiny bit of Prolog that uses that module to run make as needed. Pretty slick!
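Hooking a C module up on the Prolog side takes just a couple of directives. Something along these lines, where the module and predicate names are illustrative rather than automake’s actual API:

```prolog
:- module(file_watcher, [watch/2]).

%% Load the compiled foreign module (built against inotify, kqueue,
%% or the Windows backend, depending on platform) from the pack's
%% foreign-library search path.
:- use_foreign_library(foreign(file_watcher)).

%% watch(+Path, :Goal) is assumed to be defined in the C module:
%% it registers Path with the OS watcher and calls Goal whenever
%% Path changes.
```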

However…I didn’t want to leave Windows users out in the cold (they already suffer so much, the poor dears). I don’t have access to a Windows machine, but Jan suggested cross-compiling with MinGW-w64, plus Wine to test. He’d already made a Docker image, which he uses for creating the Windows builds of SWI, so it was straightforward enough to get that all set up.

Well, in theory. In actuality, it took me longer to figure out how to use Docker (the various flags I needed to pass and the environment variables I needed to set to build a DLL) than it took to actually write the code, but I persevered. Using the API was pretty simple, although Windows’ version of inotify/kqueue, as best I can tell, only lets you monitor directories for changes, not particular files. That’s not a huge deal for my use case: it might create redundant watches and be triggered unnecessarily, but that’s not really much of a problem.

Once I had this working, though, I wanted someone to be able to test it on Windows. However…just running pack_install/2 failed, since it needed a toolchain to build the foreign library, and my very generous test subject didn’t have MinGW or MSYS installed. That sparked my next idea: I’d started trying to use sourcehut more, so why not take advantage of its features and set up a .build.yml to do the cross-compilation?

That would take care of two things, if I could get it to work. Firstly, Windows users would be able to just download the archive with the DLL in it and wouldn’t need a toolchain installed. Secondly, I could make the build stage generate a zip, so I could use a simpler pack_install incantation. Previously, I’d had to install the pack from git, which necessitated passing several options to pack_install and was kind of a pain, because sourcehut only automatically generates a tar.gz archive for releases, and library(pack) gets confused by the double suffix.
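The difference in incantations looks roughly like this (the URLs are placeholders, not the pack’s real locations):

```prolog
%% Installing from git: extra options so library(pack) knows what
%% it's dealing with (placeholder URL).
?- pack_install(automake,
                [ url('https://git.example.org/automake'),
                  git(true)
                ]).

%% Installing from a built .zip release: just hand over the URL.
?- pack_install('https://example.org/automake-0.1.0.zip').
```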

So, with ropes.pl as a convenient example, I dove in. It took the better part of a day of watching jobs fail, editing the manifest, re-running, and watching it fail in a new way, but I finally got it working! I have to say that the debugging experience was actually fantastic. The fact that sourcehut lets you edit a failing build manifest and re-submit it without having to make a new commit definitely saved my sanity, not to mention that it lets you ssh into the failed build server to poke around and see what’s what. Without those two things, I would probably either still be at it now or have finally renounced all my worldly possessions and moved to the woods.

As all my Prolog posts seem to conclude: I really do like this language and ecosystem. I would submit that it has Christopher Alexander’s “Quality Without a Name”; it just makes one feel warm and fuzzy to work with. If you happen to be a Prolog developer, check out automake and let me know what you think! The next thing I’d really like to add is some sort of “hook” to be run after make; that would let me use automake instead of inotify directly for my tailwind watcher, and would also let me rig up some neat way to make the browser automatically update as well…but I do have actual work that I need to get to at some point.


For more implementation details, see part two.