Last weekend I wrote about trying to make lights move with home-grown code. That experiment went fairly well, but I didn't get much time to get back to it until Friday night. And by Saturday morning, I was a little discouraged. I had problems that made me feel like this simple thing wasn't as simple as I thought.
As I said, the premise is pretty straightforward: you blast 512 bytes down the wire at a regular interval, and all of the listening devices react. If I were outputting DMX right from the computer, that would be mostly true, but as I said, I couldn't make that work directly with the cheap USB controller that I have. I also have an Art-Net and sACN interface that takes the bits over Ethernet and then puts them out over a DMX cable, and that I could get to work... mostly.
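That premise, one universe of 512 channel values refreshed on a fixed tick, can be sketched in a few lines. This is just an illustration, not the actual code (which is C#); the names `universe` and `build_frame` are made up, and a real sender would wrap the payload in an sACN or Art-Net header before putting it on the wire.

```python
REFRESH_HZ = 40          # the send rate mentioned later in the post
TICK = 1.0 / REFRESH_HZ  # 25 ms between frames

# one DMX universe: channels 1-512, each a value from 0 to 255
universe = bytearray(512)

def build_frame() -> bytes:
    # snapshot the current channel values; the transport layer
    # (sACN or Art-Net over UDP) would add its header around this
    return bytes(universe)
```

The key point is that the sender doesn't wait for anything to change; it just keeps emitting whatever the current state is, every tick.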
At first I got obsessed with the scalability of this, which was the wrong place to put energy. I would simulate a hundred universes (see previous post), and the lighting output was like a video game that dropped to an unusably low frame rate. I would never need this, but the problem was intriguing because it didn't seem like this should be hard. I started to look at the code of the sACN library that I was using, and it essentially uses a binary writer to turn all of the bytes into one array. I'm not sure the way he's doing it is the most efficient.
But there were two other timing issues that were causing pain. The first was that I was allowing the sending code to pile up in its execution loop, meaning it would start the sending process again even if the previous run had not finished. Not only does this consume more resources, but it makes it possible for packets to actually leave the computer out of order, since the code is running in parallel. That was easy enough to fix, and it immediately helped reduce the CPU load. I also realized that the inputs used to change values, sliders on the screen, were potentially trying to update the data more often than the data was sent (40 times per second, if you were wondering). That just creates even more overhead, especially given the front-end I'm using.
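The overlap fix amounts to making the tick handler non-reentrant: if the previous send is still in flight, drop this tick instead of stacking another one on top. Here's a minimal sketch of that idea, assuming a hypothetical `Sender` class; the author's actual implementation is in C# and may differ.

```python
import threading

class Sender:
    """Sends one frame per tick, skipping a tick if the previous
    send hasn't finished, so sends never overlap or reorder."""

    def __init__(self):
        self._busy = threading.Lock()
        self.sent = 0
        self.skipped = 0

    def on_tick(self, frame: bytes) -> None:
        # non-blocking acquire: if the last send is still running,
        # skip this tick rather than piling up concurrent sends
        if not self._busy.acquire(blocking=False):
            self.skipped += 1
            return
        try:
            self._send(frame)
            self.sent += 1
        finally:
            self._busy.release()

    def _send(self, frame: bytes) -> None:
        pass  # the real network write would go here
```

The slider problem has a similar shape: rather than pushing every slider movement into the send path, the sliders can simply mutate the current state, and the 40 Hz loop picks up whatever the latest values are on its next tick.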
And let me talk about that for a moment. I'm trying to use Blazor Server for this experiment. I've used Blazor for my music player and for my word game, but both of those are Blazor WebAssembly, meaning you have a compiled thing running in the browser and sending data back and forth to a server. The coding is similar for the UI, but the server version keeps all of the state on the server, and streams just the parts of the UI you need to the browser. The advantage of this is that you have direct access to server resources, like the network connection to lighting gear. The downside is that there's a lot of overhead to maintain that browser state, though in this situation it would be unlikely that more than one person would need it at a time. And I like the idea of being able to operate a show from anything on the network with a browser. I'm not married to it though... all of the code that I've written could be glued to any UI technology. I just like what I'm using because it's platform agnostic and can run anywhere (Windows, Mac, Linux).
So my "user interface" for all of this testing is three sliders on a page. One controls the pan action on a light, one dims one of my tube lights, and a third acts as the main dimmer. I had all kinds of weird results with the main fader until I got the math right, but I'm super happy with the structure of the code. Once that was working predictably, though, I ran into another problem. Sometimes the last packet sent out, say setting the dimming to full off or on, wouldn't be "heard" by the lights. I could see that it was sending the 0 or 255 out on the wire, and if I wiggled the fader slightly the lights would "catch up," but what's the deal?
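For the curious, the main-fader math that finally behaved is the standard grand-master scaling: each channel's value is proportionally reduced by the master level. A one-line sketch (my illustration, not the author's code):

```python
def apply_master(channel_value: int, master: int) -> int:
    """Scale a 0-255 channel value by a 0-255 grand-master fader.
    Integer math keeps the result in the valid DMX range."""
    return (channel_value * master) // 255
```

With the master at 255 every channel passes through unchanged; at 0 everything goes dark; anywhere in between, all channels dim proportionally.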
For my initial implementation, I was sending data using the sACN protocol, because it's simpler than Art-Net. The multicast nature of it also seems to work well across my network, which has a goofy wireless hop through a WiFi extender I'm using as an access point for the wired DMX interface. This also means packet loss is more likely. The optional part of sACN is sending synchronization packets. The idea is that if you have light fixtures (or video walls) across multiple universes, you want them to apply the data all at the same time, so your strobe on one universe starts flashing at the same moment as a strobe on another universe. I wasn't doing that since it was optional, but it appears to be expected by this interface I have. Once I started doing it, the lights reliably did whatever my faders asked, every time. At this point on Saturday night, my confidence returned.
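The synchronized-send pattern looks roughly like this: every data packet carries a synchronization address, receivers buffer the data instead of applying it, and a single sync packet at the end tells everything to apply at once. A hedged sketch of the flow, with `send_data` and `send_sync` standing in for whatever the real sACN library exposes:

```python
def send_frame(universes, sync_universe, send_data, send_sync):
    """Sketch of the sACN synchronized-send pattern: tag every data
    packet with a sync address, then fire one sync packet so all
    receivers apply the buffered data at the same moment."""
    for universe, data in universes.items():
        send_data(universe, data, sync_address=sync_universe)
    send_sync(sync_universe)  # "apply everything now"
```

This would also explain the "last packet not heard" symptom: an interface waiting for a sync packet has the data but no signal to act on it, until the next change nudges it along.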
So what would it take, I thought, to implement the Art-Net interface as well? I found a library for that too, and stubbed out the code to use it. The interface is pretty simple: one method to send a universe of data, another to send a sync packet, and another to send a polling signal (to let devices know that you're sending certain universes). Ten minutes later, I could see that I was flooding the network with data, but my interface wasn't listening. So I decided to point specifically at the interface's address, and bam!, it was working. Even more interesting, I was able to simulate a hundred universes and there was no degradation in performance at all. CPU usage was still substantial, but there was no perf hit. I'll revisit that, and see how the two libraries are implementing the creation of the bytes.
I now understand a little more about how the two protocols work. sACN does multicasting, which means that it sends data to a specific IP for each universe, in a defined range. The listening devices then listen for the corresponding universes on those IPs. Each packet has a couple of bytes describing which universe to listen to for synchronization packets, so a device will listen there as well. It's really elegant. Art-Net, on the other hand, does broadcasting, which means it sends the data to every IP in the subnet range. And I think that's why I couldn't get it to work: my laptop isn't on the subnet associated with Art-Net, which appears to default to 2.0.0.0/8.
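The sACN (E1.31) addressing scheme is simple enough to show directly: the multicast group for a universe is 239.255.x.y, where x and y are the high and low bytes of the universe number. A small sketch of that mapping:

```python
def sacn_multicast_ip(universe: int) -> str:
    """E1.31 multicast group for a universe: 239.255.hi.lo, where
    hi/lo are the high and low bytes of the universe number."""
    if not 1 <= universe <= 63999:
        raise ValueError("sACN universe numbers run from 1 to 63999")
    return f"239.255.{universe >> 8}.{universe & 0xFF}"
```

So universe 1 lives at 239.255.0.1, universe 256 at 239.255.1.0, and so on; a fixture only has to join the groups for the universes it's patched to.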
Before I commit to building a real UI, I have one more experiment to try. I need to apply an "effect" to a number of fixtures and see if my data structure works with it. So for example, panning a light back and forth, or dimming off and on gradually. I think this is the same design pattern as the main fader. The short explanation is that when a fixture's parameter is calculated, an adjustment is applied. Changing the main fader value fires an event that performs that calc on any parameter that is subscribed to the event. It's a rare object-oriented programming win for me.
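That subscribe-and-recalculate pattern can be sketched like this. The class names are mine, not the author's, and the real version is C# events rather than a Python list of subscribers, but the shape is the same: changing the master fires a recalculation on every parameter subscribed to it.

```python
class Parameter:
    """A fixture parameter whose output is its base value
    adjusted by the current master/effect level."""

    def __init__(self, base: int):
        self.base = base
        self.output = base

    def recalc(self, master: int) -> None:
        # proportional grand-master scaling, kept in the 0-255 range
        self.output = (self.base * master) // 255

class Master:
    """Stands in for the event source; setting its value
    notifies every subscribed parameter."""

    def __init__(self):
        self.value = 255
        self._subscribers = []

    def subscribe(self, param: Parameter) -> None:
        self._subscribers.append(param)

    def set(self, value: int) -> None:
        self.value = value
        for p in self._subscribers:  # the "event" firing
            p.recalc(value)
```

An effect engine would slot into the same hook: instead of a fader setting the value once, a timer sweeps it up and down, and every subscribed parameter follows along for free.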