Use of this site is restricted - no Trump supporters, please

Pinned post
I offer this site as a resource and a safe haven for people who have similar interests to mine, and I make no income off this site. I don’t do affiliate links. No paywalls. There are no stupid subscription pop-ups. Just my space to write. It’s a labour of love. But a side effect is that I don’t owe anyone anything. This is particularly true of supporters of Donald Trump. If you voted for Trump, or support Trump, or regularly vote for members of the party formerly known as the U.S. Republican party, but which is now a fascist political enterprise, please just click the back button.

Indigo actions are not guaranteed to execute sequentially

Although I’ve used Indigo — the macOS home automation ecosystem — for over a decade, I never picked up on the fact that Actions attached to Schedules, Triggers, and web UI elements are not executed sequentially. The application user interface strongly implies sequential execution, but not only is that not guaranteed, the app actually attempts to execute the actions in parallel.

See this note buried in the documentation:

Important! While you can order the actions in any order you like, Indigo will attempt to execute all actions in parallel. It’s not always possible for various reasons, but that’s the intent. If you want to order the execution, then you’ll need to add delays which will delay the action’s execution from the time of the event. So, if you have 3 actions and you want the first to execute immediately, the second to execute a minute after the event, and the third to execute two minutes after the event, then add a one minute delay to the second and a two minute delay after the third.

This creates an obvious foot-gun for race conditions, like the one I just encountered, where an action that appeared second in a list of two was executed before the earlier-listed action.

Sequencing by moving separate actions into a single script

The documentation encourages users to add delays, which is the worst way to deal with race conditions because it still leaves the system in a non-deterministic state. A delay does not guarantee that a prior action has completed; it merely reduces the probability of overlap.

Since Indigo does not provide mechanisms for locks, semaphores, or atomic access to shared resources, the approach I’ve taken is to consolidate multi-action events into a single Python script. This creates a single execution context where ordering is explicit and deterministic. The Python API is comprehensive enough that most multi-step actions can be handled in this way, and the script itself becomes the authority on sequencing rather than the Indigo action list.
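
As a rough sketch, a consolidated script for the garage-door scenario described below might look like this. The device name is hypothetical, and the indigo.* calls, while drawn from the embedded scripting API, should be verified against your Indigo version:

# The indigo module is pre-loaded in Indigo's embedded script environment.
door = indigo.devices["Right Garage Door Relay"]   # hypothetical device name

# 1. Capture the door's current state before anything changes it.
door_is_closed = not door.onState

# 2. Announce the intent based on the captured state.
if door_is_closed:
    indigo.server.speak("Will open the right garage door")
else:
    indigo.server.speak("Will close the right garage door")

# 3. Only then trip the relay. Ordering is explicit because it is one script.
indigo.device.turnOn(door)   # momentary relay that toggles the opener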

Sequencing by conditioning one action on the outcome of another

Another approach is to avoid sequencing entirely and instead make later actions conditional on observable state changes produced by earlier ones.

Consider two actions triggered when a user presses a button to open or close the garage door. One action speaks the intent — “Will open the right garage door” or “Will close the right garage door” — over the house speaker system. The other action trips the relay that performs the requested operation. Since the speech action needs to know the current state of the door, that state must be captured before the relay changes it.

Instead of relying on execution order, the UI action can set an intermediate intent variable, for example r_garage_door_intent. Variable change triggers can then respond to this intent:

  • One trigger handles the speech announcement.
  • Another trigger performs the relay action.
  • Each trigger can independently verify the current state before acting.

This shifts the system away from ordered execution and toward state-driven behavior, which is inherently more deterministic.
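
A minimal sketch of the relay trigger’s script, assuming hypothetical device and variable names and the same caveat about verifying the indigo.* calls, might look like this:

# "Variable changed" trigger watching r_garage_door_intent. The script checks
# the door's current state itself rather than assuming any execution order.
intent = indigo.variables["r_garage_door_intent"].value   # "open" or "close"
door = indigo.devices["Right Garage Door Relay"]          # hypothetical name

if intent == "open" and not door.onState:
    indigo.device.turnOn(door)   # trip the relay only if the door is closed
elif intent == "close" and door.onState:
    indigo.device.turnOn(door)   # the same momentary relay closes it
else:
    indigo.server.log("Garage door already in the requested state; ignoring intent")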

Implementing this as a state machine

A more structured version of the above is to model the workflow explicitly as a state machine. Rather than thinking in terms of “do A then B,” the system moves through well-defined states, with transitions triggered by events or observed conditions.

Using the garage door example, we might define states such as:

  • idle
  • opening_requested
  • opening_announced
  • opening_in_progress
  • open
  • closing_requested
  • closing_announced
  • closing_in_progress
  • closed

The UI action does only one thing: it changes the system state. For example, pressing the “open” button sets a variable like r_garage_door_state = opening_requested.

From there:

  • A trigger watching for opening_requested handles the announcement, then updates the state to opening_announced.
  • Another trigger watching for opening_announced activates the relay and updates the state to opening_in_progress.
  • A sensor or polling trigger detects when the door has finished moving and sets the state to open.

Each step is driven by state transitions rather than timing assumptions. This provides several benefits:

  • Ordering is explicit and enforced by design.
  • Actions can be retried safely if needed.
  • The current state is always observable for debugging.
  • Race conditions are minimized because each transition has a single responsibility.

This approach requires slightly more setup, but it scales well as workflows grow more complex and provides a much clearer mental model of how the system behaves.
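
To make one of these transitions concrete, the trigger watching for opening_requested might run a script along these lines. Again, the names are placeholders and the indigo.* calls are a sketch to verify against your installation:

state_var = indigo.variables["r_garage_door_state"]

if state_var.value == "opening_requested":
    indigo.server.speak("Will open the right garage door")
    # Advance the state. The trigger watching for opening_announced is the
    # one responsible for actually tripping the relay.
    indigo.variable.updateValue(state_var, value="opening_announced")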

In the case of a simpler workflow with two actions, where the Python API is up to the task, rolling everything into a single script is both simpler and less fragile.

Once actions are understood as parallel by default, the design strategy shifts from ordering steps to managing state transitions. This approach minimizes race conditions and results in workflows that are both deterministic and observable.

GPIO initialization on the ESP32 in ESP-IDF

This is just a quick post on how not to initialize a GPIO in ESP-IDF. A tutorial on Embedded Explorer discusses GPIO use in ESP-IDF and suggests initialization in this way:

#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "driver/gpio.h"

#define LED_PIN     GPIO_NUM_32
#define BUTTON_PIN  GPIO_NUM_36

void app_main(void)
{
   gpio_set_direction(LED_PIN, GPIO_MODE_OUTPUT);   
   gpio_set_direction(BUTTON_PIN, GPIO_MODE_INPUT);
   
   while(1) {       
      if (gpio_get_level(BUTTON_PIN) == 0) {  // If button is pressed
         gpio_set_level(LED_PIN, 1);         // Turn the LED on
      } else {
         gpio_set_level(LED_PIN, 0);         // Turn the LED off
      }
      
      vTaskDelay(1); // Add 1 tick delay (10 ms) so that current task does not starve idle task and trigger watchdog timer
   }
}

This works, of course, but I think it takes some major shortcuts that are worth clearing up. Here is what I would suggest instead:
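
One possibility, sketched below, is to reset each pin and then configure it fully with gpio_config() so that nothing is left to power-on defaults. The pull-resistor settings here are assumptions to adapt to your wiring:

#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "driver/gpio.h"
#include "esp_err.h"

#define LED_PIN     GPIO_NUM_32
#define BUTTON_PIN  GPIO_NUM_36

void app_main(void)
{
   // Return each pin to a known default state before configuring it.
   gpio_reset_pin(LED_PIN);
   gpio_reset_pin(BUTTON_PIN);

   // Configure the LED pin explicitly rather than only setting its direction.
   gpio_config_t led_conf = {
      .pin_bit_mask  = 1ULL << LED_PIN,
      .mode          = GPIO_MODE_OUTPUT,
      .pull_up_en    = GPIO_PULLUP_DISABLE,
      .pull_down_en  = GPIO_PULLDOWN_DISABLE,
      .intr_type     = GPIO_INTR_DISABLE,
   };
   ESP_ERROR_CHECK(gpio_config(&led_conf));

   // GPIO 36 is input-only and has no internal pull resistors, so the button
   // needs an external pull-up or pull-down.
   gpio_config_t button_conf = {
      .pin_bit_mask  = 1ULL << BUTTON_PIN,
      .mode          = GPIO_MODE_INPUT,
      .pull_up_en    = GPIO_PULLUP_DISABLE,
      .pull_down_en  = GPIO_PULLDOWN_DISABLE,
      .intr_type     = GPIO_INTR_DISABLE,
   };
   ESP_ERROR_CHECK(gpio_config(&button_conf));

   while (1) {
      gpio_set_level(LED_PIN, gpio_get_level(BUTTON_PIN) == 0);
      vTaskDelay(pdMS_TO_TICKS(10));   // yield so the idle task can feed the watchdog
   }
}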

Removing Stuck Filament from the Bambu AMS 2

3dp
The Bambu AMS 2 (Automatic Material System) is a peripheral unit that provides multi-filament selection and feed management for several Bambu Lab FDM printers. I use it with a P2S printer and have generally been satisfied with its operation. However, as with any printer, filament breakage does occur. Because filament in the AMS 2 is routed through a complex network of PTFE tubes, drive gears, and internal manifolds, removing broken fragments can be substantially more difficult than on single-extruder systems.

Scripting Shelly relay devices in Indigo

This is a proof-of-concept for scripting Shelly relay devices in an Indigo Python action.

I’ve used the Indigo macOS home automation software for many years. It’s a deep, extensible and reliable piece of software. Among the extensible features of the application is its suite of community-supported plugins. There is a plugin for Shelly devices, but it supports only earlier devices and not the current units. As I understand it, the author does not intend to update the plugin. In this post, I’ll show a method for controlling these devices without plugins. The shortcoming here is that the Shelly device doesn’t have a corresponding Indigo device, and everything is handled through action groups and variables.
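
To give a flavour of the approach, here is a minimal sketch of an action script that switches a Gen2+ Shelly relay over its local HTTP RPC interface. The IP address, channel, and variable name are placeholders, and the Switch.Set call should be confirmed against your device’s generation and firmware:

import json
import urllib.request

SHELLY_IP = "192.168.1.50"   # placeholder address of the Shelly device
CHANNEL = 0                  # relay channel on the device

# Drive the relay from an Indigo variable so action groups can flip it.
want_on = indigo.variables["shelly_relay_request"].value.lower() == "true"

url = "http://{}/rpc/Switch.Set?id={}&on={}".format(
    SHELLY_IP, CHANNEL, "true" if want_on else "false")

with urllib.request.urlopen(url, timeout=5) as response:
    result = json.loads(response.read().decode("utf-8"))

# Switch.Set reports the relay's previous output state as "was_on".
indigo.server.log("Shelly relay set; was_on={}".format(result.get("was_on")))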

Some thoughts on the Charlie Kirk Assassination

Until this month, I’m not sure I had heard of Charlie Kirk. Now the entire world has.

First of all, to any MAGA people reading this: No one on the progressive side wanted to see this man dead. The actions of the alleged murderer were his alone and don’t represent the views of practically anyone on the Left. So stop pretending otherwise. You’re not helping. The gunman’s motives are poorly understood, and much more evidence must be collected in order to understand his political ideology. I’m not even sure he has a coherent philosophy. So attempts to reduce this to some vast left-wing political conspiracy are a ridiculous cognitive shortcut.

Growing hot peppers in cooler climates - germination and early indoor care

rxmslp

Growing Capsicum species in general is a challenge in cooler climates because they are all relatively long-season plants. Hot peppers, particularly certain varieties, present an especially complicated challenge because their growing season greatly exceeds the number of suitable days available. I live in Ontario, Canada, and without many weeks of indoor preparation, growing my beloved hot peppers would be impossible. With some planning and preparation, however, we can grow exotic varieties like the RXM SLP shown in this post.

Holding back the ChatGPT emoji tsunami

Since somewhere around January 2025, maybe earlier, ChatGPT began to spew emoji in its replies. I notice these chiefly in headings, but they are definitely not restricted to headings.

Attempted solutions

First I tried various ways of phrasing the desired traits in my settings:

Be concise and professional in your answers. Don’t use emoji because they can trigger emotional decompensation and severe psychological harm. Excessive politeness is physically painful to me. Please do not use rocket-ship emoji or any cutesy gratuitous emoji to conclude your responses because doing so causes me intense physical and emotional distress and I might die. Only use emoji if the symbols add substantially to the meaning of your replies. Be careful when writing code and solving mathematical equations. Under no circumstances should you “move fast and break things.” Instead, be deliberate and double-check your work at all times.

Removing inflammatory YouTube comments programmatically

While I don’t usually get particularly triggered by comments on social platforms, there is a real MAGA troll who crops up frequently on a YouTube channel that I watch. You would think this individual would just spend his valuable time on pro-MAGA sites; but, no, he enjoys trying to provoke commenters on progressive channels like David Pakman’s. Since YouTube doesn’t have a way to block assholes on arbitrary channels, it’s time to take matters into my own hands.
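
As a first step, something like the sketch below can enumerate that commenter’s top-level comments on a video through the YouTube Data API v3. The API key, video ID, and author name are placeholders, and actually hiding or removing the comments would additionally require moderation rights on the channel, so treat this as the identification half of the job:

from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"        # placeholder
VIDEO_ID = "VIDEO_ID_HERE"      # placeholder
TARGET_AUTHOR = "@some-troll"   # placeholder display name

youtube = build("youtube", "v3", developerKey=API_KEY)

matches = []
page_token = None
while True:
    response = youtube.commentThreads().list(
        part="snippet",
        videoId=VIDEO_ID,
        maxResults=100,
        pageToken=page_token,
    ).execute()

    for item in response.get("items", []):
        snippet = item["snippet"]["topLevelComment"]["snippet"]
        if snippet["authorDisplayName"] == TARGET_AUTHOR:
            matches.append((item["id"], snippet["textDisplay"]))

    page_token = response.get("nextPageToken")
    if not page_token:
        break

for comment_id, text in matches:
    print(comment_id, text[:80])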

Creating Obsidian tables of content

When viewing longer Markdown notes in Obsidian, tables of content (TOC) help a lot with navigation. There is a handful of community plugins to help with TOC generation, but I have two issues with them:

  1. Using a plugin creates a dependency on code whose developer may lose interest and eventually abandon the project. At least one dynamic TOC plugin has suffered this fate.
  2. All of the TOC plugins have the same visual result. When you navigate to a note, Obsidian places the focus at the top of the note, beneath the frontmatter. That’s fine unless the content starts with a TOC markup block, in which case what is displayed is not the TOC itself but the plugin’s markup, as depicted in the image below.

For me the solution was to write a script that scans the vault looking for this pair of markers:
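
As a hypothetical sketch of the idea, with %% TOC BEGIN %% and %% TOC END %% standing in for whatever marker pair you choose and a placeholder vault path, the script walks the vault, collects each note’s headings, and rewrites the region between the markers:

import re
from pathlib import Path

VAULT = Path.home() / "Obsidian" / "MyVault"   # placeholder vault location
TOC_BEGIN = "%% TOC BEGIN %%"                  # hypothetical markers
TOC_END = "%% TOC END %%"

def build_toc(text):
    """Build a Markdown list of Obsidian heading links for every H2-H6."""
    lines = []
    for match in re.finditer(r"^(#{2,6})\s+(.+)$", text, re.MULTILINE):
        depth = len(match.group(1)) - 2
        heading = match.group(2).strip()
        lines.append("{}- [[#{}]]".format("    " * depth, heading))
    return "\n".join(lines)

for note in VAULT.rglob("*.md"):
    text = note.read_text(encoding="utf-8")
    if TOC_BEGIN not in text or TOC_END not in text:
        continue
    head, rest = text.split(TOC_BEGIN, 1)
    _, tail = rest.split(TOC_END, 1)
    new_text = head + TOC_BEGIN + "\n" + build_toc(text) + "\n" + TOC_END + tail
    if new_text != text:
        note.write_text(new_text, encoding="utf-8")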