My mother works as an IT consultant in the Bay Area. (It's a pretty rad story: she taught me my computing fundamentals as a kid, but could never break into the industry, thanks to its biases and such, until ten years ago, after my folks relocated to the Bay Area during a tumultuous time for all of us. I stayed in Seattle and broke into tech a few years later.)

She just handed-me-down a glorious Dell XPS 13, onto which I immediately installed my standard Arch Linux + Awesome WM dev environment.

Hmm. Beautiful screen, but I can't see anything. I squinted through the Arch install process, but once I got my window manager and favorite dev tools up and running, something needed to be done.

Now, it turns out that I really didn’t need to do this and it was just a fun exercise: https://wiki.archlinux.org/index.php/HiDPI

Anyway:

This code scales the workspace and window decorations — I need to write a similar function for the menus. It assumes that a theme's font is stored in the theme.font field in the format "fontname 7", which is what I found to be the case in the Arch theme. I didn't do any other checking or testing. Without further ado, here's what I added to my ~/.config/awesome/rc.lua:

local theme_name = "arch"

-- Load the theme table and bump its font size before handing it to beautiful.
local function scale_theme_font()
  local theme = awful.util.get_themes_dir() .. theme_name .. "/theme.lua"
  local theme_config = gears.protected_call(dofile, theme)
  -- theme.font looks like "fontname 7"; grab the trailing point size
  local fs_start, fs_end = string.find(theme_config.font, "%d+$")
  local scaled_fontsize = tonumber(string.sub(theme_config.font, fs_start, fs_end)) + 5
  -- splice the scaled size back onto the font name
  local scaled_font = string.sub(theme_config.font, 1, (fs_start - 1)) .. tostring(scaled_fontsize)
  theme_config.font = scaled_font
  return theme_config
end

local theme = scale_theme_font()
beautiful.init(theme)

The harmonizer needs a little tuning; I'll be playing around with the parameters of the ugens. But it sounds pretty decent as long as you stay within a certain range, and if you push it you can generate some pretty fun-sounding artifacts. Try beatboxing into it while humming your bassline.

I've hard-coded the bus my microphone is on, so you'll need to edit that if you wanna paste this code into your REPL or your CIDER-connected Emacs/Vim. But anyway, here's this:


(definst voice1 []
  (sound-in:ar 1))

(definst harmonize-input
  [note 60 velocity 80 gate 1 bus 1]
  (let [in    (sound-in bus)
        ;; track the frequency of whatever is coming in on the mic bus
        pitch (pitch:kr in
                        :exec-freq 500
                        :down-sample 0.1)
        amp   (/ velocity 127)
        env   (env-gen (adsr 0.01 0.1 1 0.3) gate :action FREE)
        ;; how far the requested MIDI note sits from the input's pitch,
        ;; converted to a playback ratio for the pitch shifter
        orig-midinote (cpsmidi pitch)
        interval      (- note orig-midinote)
        ratio         (midiratio interval)
        sig           (pitch-shift:ar in :pitch-ratio ratio)]
    (* env (* amp [sig sig]))))

;; spin up a voice of harmonize-input for each note from the first
;; connected MIDI device
(def harmonizer
  (midi-poly-player harmonize-input
                    (:overtone.studio.midi/full-device-key
                     (first (midi-connected-devices)))
                    :none))
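
To shut it up while you tweak things, I believe the matching helper is midi-player-stop (that's an assumption on my part; check overtone.studio.midi), and plain stop will always bail you out:

;; assumption: midi-player-stop tears down the poly-player's note handlers
(midi-player-stop)
;; the sledgehammer: free everything playing on the server
(stop)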

 

I've translated tutorial 4! That took several months longer than I intended. There were extenuating circumstances.

My Clojure skills, as well as my understanding of both SuperCollider and Overtone, have improved dramatically since I tried translating Tutorial 3. This only took me an afternoon, and everything worked as expected.

The only thing I got stuck on, for a couple of hours, was triggered params. In sclang, you can prepend 't_' to a parameter name in the UGen graph function (the second argument to SynthDef) and it will automatically create a triggered control UGen. Overtone does not do this, and from a design perspective I think that's a good call. But how on earth do I get, say, a gate to automatically revert to 0 after I set it to something else?

I went digging through source code, but all I really needed to do was re-read the doc string for Overtone's defsynth.

 

Params can also be given rates. By default, they are :kr,
however another rate can be specified by using either a
pair of [default rate] or a map with keys :default and :rate:

(defsynth foo [freq [440 :kr] gate [0 :tr]] ...)
(defsynth foo [freq {:default 440 :rate :kr}] ...)

Anyway, all you have to do to get a TrigControl instead of a Control UGen is to set the parameter's rate to :tr, using either of the syntaxes demonstrated above. See also: http://doc.sccode.org/Classes/SynthDef.html
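
Here's a tiny toy of my own (not from the tutorial) to show the behavior. Because trig is declared at the :tr rate, setting it to 1 with ctl fires the envelope once and the control snaps back to zero on its own:

(defsynth ping [freq 440 trig [0 :tr]]
  ;; the perc envelope re-fires every time trig goes nonzero
  (out 0 (pan2 (* (env-gen (perc 0.01 0.4) trig)
                  (sin-osc freq)))))

(def p (ping))
(ctl p :trig 1)            ;; ping!
(ctl p :freq 660 :trig 1)  ;; ping again, a fifth up -- no need to zero trig first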

I’m really loving Overtone. Once I really started to understand how SuperCollider works, I found that Overtone is usually pretty faithful to its design. You just write the same thing in Clojure. Sometimes there are small differences like the one above, but they’re typically improvements and they’re documented.

My code is here.

I can’t wait to get to buffers. Yeehaw.

More soon!

Being a musician is what brought me to computing in the first place. I tell prospective employers and clients 'computing is my other guitar' and it goes over well. It also happens to be true. Within five years I see myself working in pro audio, or perhaps writing some sonar/computer-vision wizardry. Of course, I'll also be making and performing music on a regular basis for the rest of my life.

So my interest brought me to SuperCollider and the like, but I lack classical math training. I dropped out of high school and didn’t even develop serious tech skills until I was 22 or so.

The concepts of DSP are implicit in working with SuperCollider, and in making music in general. I've long felt I needed a more explicit understanding, but I was daunted by the prospect of teaching myself the math behind DSP the way I've taught myself to code. I dunno what my block was, but I'd always lose momentum. Perhaps I wasn't driven by the same obsessive curiosity. I've skimmed the material and understand it conceptually, but if I'm gonna do the kind of work I want to end up doing, I'm going to need to walk through the concepts step by step and internalize them.

At a Clojure meetup recently, I mentioned my interest in DSP and lamented that I was gonna have to figure out some way to get to school in order to learn the math I need. A fellow attendee said, "Oh, you can teach yourself that too," and told me to read 'Understanding Digital Signal Processing' by Richard Lyons. He told me it lays out all the math concepts you need and is easy to digest without a prior background.

So far I have found that to be true, and I’ve got the momentum I’ve been searching for. In addition to taking notes, I’m gonna write code that demonstrates to myself the ideas being presented.

I’ve published my first such exercise on github:

https://github.com/beatboxchad/dsp_exercises/blob/master/muzak.c
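
To give a flavor of what I mean (this little sketch is mine, separate from the C file above), the very first kind of exercise is just generating a discrete sine wave, x[n] = sin(2*pi*f*n/fs), one sample at a time:

;; n samples of a sine wave at freq Hz, sampled at fs Hz
(defn sine-samples [freq fs n]
  (mapv #(Math/sin (/ (* 2 Math/PI freq %) fs)) (range n)))

;; eight samples of a 1 kHz tone at an 8 kHz sample rate: exactly one cycle
(sine-samples 1000 8000 8)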

At the end of September I lost my big sister, and although I have continued work on my SuperCollider, Clojure, and other projects, I've done a terrible job of documenting it. I retreated back to Seattle from the Bay Area to mope among friends. I could say a lot on this subject, but this blog is intended to be more of an account of my professional and hobby work than a personal writing space. Suffice it to say that I had a bit of trouble functioning for most of October and November and had to change the timelines on my plans, both personal and professional. I've played a lot of music and spent most of my time on friends' couches. I am starting to get my wind back; hence this post.

I still haven't finished translating the SuperCollider tutorials into Overtone, though I will. My chops are up to the point where I could do it in a few days' coding (I have stayed busy). But there seems to be a bug in how the UGens get compiled and sent to the SC server. I haven't done the legwork of verifying it (which could turn into the legwork of fixing it), but I intend to get to that soon as well. *EDIT*: there's no such bug. I was making several different kinds of mistakes in my early attempts at translating SC code into Overtone.

I was just really excited to make some noise, so I instead pored over the Overtone code until I could figure out how to instantiate nodes for synthdefs already loaded on the server, and still take advantage of Overtone’s studio system with all the groups, the mixer synth, the FX chain and other conveniences. Thus, I’m currently writing all my effects and synths in SuperCollider and just controlling everything from Overtone. It’s a hack, but at the time it was the path of least resistance and allowed me to continue to develop my skills.

So I got some code written and a rig built up to the point where I could take it to a jam session and get sound in one end and out the other with a couple of simple effects. Having achieved that milestone, I'm currently in a long grind of building up a library of nifty effects and synth tones. In addition, I've begun an earnest study of DSP, I'm making music with Renoise and some VST plugins (just to get used to designing synths without simultaneously thinking about code, though I'll mess around with using SuperCollider synthdefs as well), and I'm showing up at jam sessions with my guitar and trumpet.

The rig will start out as my personal effects rig for the trumpet. But where before I was chasing it obsessively, I'm currently letting the project breathe a little. A band/project has formed around all this with a skilled drummer and his music-producer brother, but I'll let that project speak for itself as it develops (I mention it here because it accounts for my more relaxed pace on the technical side of things — there's writing and rehearsing afoot).

Around October it started to become especially urgent that I find work, so that’s divided my attention a little. In that pursuit I have:

  • written a toy in Clojure to mess around with large XML data sources, teaching myself how to use transducers in the process (see the sketch after this list)
  • worked on a couple of Rails projects
  • written a Python script that fingerprints the websites of businesses retrieved from the Google Places API, looking for issues I can call them about
  • written a Luminus-based webapp with Vue.js on the frontend to track my contacts with these prospects
  • intermittently panicked
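
The transducer toy boils down to something like this (a rough sketch of mine, with made-up element names, and it assumes clojure.data.xml is on the classpath): compose one transform, then pour the parsed tree through it without building intermediate sequences.

(require '[clojure.data.xml :as xml]
         '[clojure.java.io :as io])

;; one composed transform: keep the :entry elements, dig out their :title
;; children, and pull the text out of each
(def extract-titles
  (comp (filter #(= :entry (:tag %)))
        (mapcat :content)
        (filter #(= :title (:tag %)))
        (map (comp first :content))))

(defn titles-from-file [path]
  (with-open [rdr (io/reader path)]
    ;; into is eager, so the lazy parse gets realized before rdr closes
    (into [] extract-titles (:content (xml/parse rdr)))))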

I think that more-or-less covers everything I’ve been up to that I’d like to share. As always, expect more soon, but for real this time.

As part of a project that has been torturing me with its lack of existence since 2013, I need to learn SuperCollider and Overtone. This is what brought me to Clojure in the first place. I'll be blogging about that project plenty, but enough of that for now.

So I've been translating SuperCollider code from Eli Fieldsteel's very excellent YouTube SuperCollider tutorial series into Clojure for use with Overtone. It's the domain-specific knowledge that gets you: I can't be productive in Overtone until I understand how SuperCollider works. So I've got to start with SuperCollider, but keep my chops up in Clojure. Hence, this exercise.

Tutorial 3 talks about composing UGens together to define and control synths in SuperCollider. Yesterday and today I translated the final pulseTest synth into Clojure. What I've got sounds almost like the example code run from the SuperCollider IDE, but something's off with the ampHz-driven pulsing. It seems to be pulsing at about half the speed, and occasionally I get a very loud low-frequency boom when I kick off the synth. I'd appreciate it if anyone can spot my error or help me track down the cause. Here's what I have:

(defsynth pulsetest "A super awesome synth that almost works the same as it would in SC"
  [ampHz 4
   fund 40
   maxPartial 4
   width 0.5]

  (let [amp1 (* 0.75 (lf-pulse:kr ampHz 0   0.12))
        amp2 (* 0.75 (lf-pulse:kr ampHz 0.5 0.12))

        freq1 (round 
                ; http://doc.sccode.org/Classes/UGen.html#-exprange explains
                ; that it uses a LinExp ugen to do the work. Here, we do the
                ; same manually 
                (lin-exp:kr :in    (lf-noise0:kr 4) 
                            :dstlo fund 
                            :dsthi (* fund maxPartial)) fund)
        freq2 (round 
                ; see above
                (lin-exp:kr :in    (lf-noise0:kr 4) 
                            :dstlo fund 
                            :dsthi (* fund maxPartial)) fund)

        freq1 (* (+ 1 (lf-pulse:kr 8)) freq1)
        freq2 (* (+ 1 (lf-pulse:kr 6)) freq2)

        sig1  (* amp1 (pulse:ar freq1 width))
        sig2  (* amp1 (pulse:ar freq2 width))

        sig1  (free-verb:ar sig1 0.7 0.8 0.25)
        sig2  (free-verb:ar sig2 0.7 0.8 0.25)]
    (do
      (out 0 sig1)
      (out 1 sig2))
    )
  )


Here is the SC version:

 

(
SynthDef.new(\pulseTest, {
        arg ampHz=4, fund=40, maxPartial=4, width=0.5;
        var amp1, amp2, sig1, sig2, freq1, freq2;
        amp1 = LFPulse.kr(ampHz,0,0.12) * 0.75;
        amp2 = LFPulse.kr(ampHz,0.5,0.12) * 0.75;
        freq1 = LFNoise0.kr(4).exprange(fund, fund * maxPartial).round(fund);
        freq2 = LFNoise0.kr(4).exprange(fund, fund * maxPartial).round(fund);
        freq1 = freq1 * (LFPulse.kr(8)+1);
        freq2 = freq2 * (LFPulse.kr(6)+1);
        sig1 = Pulse.ar(freq1, width, amp1);
        sig2 = Pulse.ar(freq2, width, amp2);
        sig1 = FreeVerb.ar(sig1, 0.7, 0.8, 0.25);
        sig2 = FreeVerb.ar(sig2, 0.7, 0.8, 0.25);
        Out.ar(0, sig1);
        Out.ar(1, sig2);
}).add;
)
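
For anyone comparing the two, here are my hedged guesses at the culprits. First, sig2 gets scaled by amp1 instead of amp2, so both channels pulse in unison, which could easily read as half the pulse rate. Second, SC's .exprange knows LFNoise0 spans -1 to 1, while lin-exp defaults its source range to 0..1, so negative noise values map below fund (plug in -1 and you get fund/maxPartial, i.e. 10 Hz), which might be the boom. This throwaway synth is mine, and the :srclo/:srchi keywords are my guess at the kwargs, mirroring :dstlo/:dsthi; it lets you hear the two mappings side by side:

(defsynth exprange-check [fund 40 maxPartial 4]
  (let [noise  (lf-noise0:kr 4)
        ;; what .exprange actually does: map the noise's full -1..1 range
        f-good (lin-exp:kr :in noise :srclo -1 :srchi 1
                           :dstlo fund :dsthi (* fund maxPartial))
        ;; what the defaults give: a 0..1 source range, so negative noise
        ;; values land well below fund
        f-bad  (lin-exp:kr :in noise
                           :dstlo fund :dsthi (* fund maxPartial))]
    (out 0 [(* 0.1 (sin-osc f-good))
            (* 0.1 (sin-osc f-bad))])))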

 

And that's how far I am. I'm going to proceed to another tutorial for now. I'm also going to find a theme and plugin that make this code look a little nicer.

You can track my progress at the git repo I have set up for this project.

I was about to write another post about my journey with SuperCollider and Overtone, and I probably will right after this, but first I had a thought about learning to code that merited its own post.

Learning the syntax and semantics of a programming language, especially a modern one like Ruby, is indeed achievable within the context of a bootcamp. But the real challenge of programming isn't learning how to talk to computers. It's learning how computers talk to each other (roughly OSI layers 3-6), how programs talk to each other (layer 7 on the network side, plus a bunch of other concepts and markup languages and, as you get more advanced, OS-specific API knowledge), and the general ecosystem in which your programs live (a sysadmin/tech-support background helps here).

There’s a blog post that pretty accurately captures all the extra stuff you pick up as a web developer, but it tries to sell you more bootcamps and classes. I can’t vouch for the content of those classes, but I can tell you from experience that this isn’t a problem you have to throw money at. Linux is free and will run on equipment you would have retired.

This "extraneous" knowledge amounts to several years of active computing experience. You just have to spend a couple of years messing around with computers, trying to run Arch and get everything done, stopping to troubleshoot esoteric problems, and so on. Somebody with an IP address in China logs in and deletes your home directory for lulz. You grieve years of writing, recordings, and other personal data. You learn iptables, set up denyhosts, start using SSH keys, change the way you structure your network, and start backing up your data (ask me how I know!).

Or you can take some classes, I suppose. I chose the former route.

Either way, as far as I can tell, there’s no shortcut. Until you get this contextual knowledge, you’re gonna struggle. You know a language, but have nothing to talk about for lack of any frame of reference. It’s like learning the words in Spanish for “fruit”, “dress”, and “bathroom” but being from an alien planet where they don’t eat, wear clothing, or defecate.

I think the main problem with a lot of these training courses, and the culture of learning to code, is that they gloss over this part of the journey. People get frustrated, conclude that they’re just not smart enough for the craft, and give up.

I think that’s the wrong conclusion. You just have to realize that on your way to true competence, you’re gonna have to take on some pretty massive side quests as part of the grind. You probably have what it takes. This is just a very long game.

And when you've mastered the "basics", there's a whole other level to get at. For a professional software developer, it should be a given that they know their network, they know how operating systems work, they know how various flat-file and database storage systems work, they know a couple of markup languages and a couple of Layer 7 protocols, and a ton of other miscellaneous stuff I'm forgetting. Taking all that as a given, here's my frontier.

Keep learning. We practice a deep craft.

I’m still working on music coding stuff, but my day job took a lot of my attention in March.

But then a few months later I quit! My last day was July 15th. Now I’m a freelancer. That feels awesome.

I’m probably going to rewrite this blog in something other than WordPress just to be clever and flex my dev chops.

But not yet. I’ve got work to drum up, and I’d rather work on music stuff. If it ain’t broke…

Later!

I tried just calling calculate_tempo_frames() from the JACK timebase callback in jack_audio_driver in lieu of setting transport info, but I don't think that's the right approach. I need to fix the calculation that gives JACK the right transport info. The Engine::calculate_tempo_frames function doesn't yield BBT info. So I need to dig in and figure out how that information is calculated in the engine, and glue that to the transport callback. I think it's in Engine::set_tempo, and I already see some possibilities.

This reminds me: the bug occurs when quantization is set to 8th, as opposed to cycle. Note to self: verify that, and finish writing out the conditions the trouble appears under in the first place.

Despite knowing that the test was about remembering quantization boundaries after tempo changes when sooperlooper is JACK timebase master, I proceeded to write it as if I were testing sooperlooper's loop stretching. Haha, sometimes brains just won't get unstuck (well, mine at least).

This code won't stay one big test forever, but I'm writing it all in one blob at first just to get my thoughts out and to experiment. I've had a lot of fun writing this, and just as I predicted, it has really forced me to learn some of the boring how-to-glue-it-together parts of SuperCollider instead of pasting in code for weird noises and then playing my guitar for an hour.

I still intend to write a blog post describing the exact thing I'm trying to fix in sooperlooper, and how I intend to fix it. Incidentally, I found a couple of other interesting little bugs to fix when there's time. You can't set the tap tempo over OSC. I found a place where it had been commented out (can't remember where), and when I uncommented it and recompiled, sooperlooper responded to the OSC tap-tempo message with a segfault. I sense somebody's been down this road before… 😀

No matter. I used MIDI instead for now and put that on my list for later.

Here’s what I have so far:


// SETUP:
// 1. Start an instance of the sooperlooper engine
// 2. create 8 loops
// 3. set playback sync, sync, and timestretch on all loops
// 4. make sooperlooper timebase master
// 5. set the tempo TODO
//
// RECORDING:
//
// load a soundfile into each loop AND/OR record a pattern into each loop
// -- this will be a simple bleep or bloop, something that can be detected
// -- or a breakbeat.
//
// TESTING LOOP STRETCH (no reason, it works perfectly. What's up with my brain?):
//
// send tap-tempo signal to change up the tempo a bunch.
//
// listen to test events and record whether they happened at the time expected. Try to do this
// automatically, signal detection.
// It'll probably work just fine.
//
// TESTING QUANTIZATION
//
// After some tempo changes, try to record or overdub. While doing so, query
// the loop states over OSC. See if they change when we took the action or if
// they got stuck or mistimed. (I remember, they used to get stuck). okay,
// how do I test when it records? OSC docs.

//################# SETUP ###############
MIDIClient.init;
~engine = NetAddr.new("localhost", 9951);

"sooperlooper &".systemCmd();

SystemClock.sched(2.0, {

8.do({~engine.sendMsg("/loop_add", 1,1)});

// Global settings
~engine.sendMsg("/set", "sync_source", -1);
~engine.sendMsg("/set", "smart_eighths", 0);
~engine.sendMsg("/set", "jack_timebase_master", 1);
~engine.sendMsg("/add_midi_binding", "0 on 55 set tap_tempo -2 0 1 norm 0 127");

~engine_midi = MIDIOut.findPort("sooperlooper-sooperlooper", "sooperlooper-sooperlooper");
MIDIOut.connect(0,~engine_midi);

for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "sync", 1)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "playback_sync", 1)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "tempo_stretch", 1)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "relative_sync", 1)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "quantize", 2)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "mute_quantized", 1)});
for (0,7, {arg i; ~engine.sendMsg(format("/sl/%/set", i), "overdub_quantized", 1)});

nil;
});

~player = Routine({
MIDIOut(0).noteOn(0,55);
});

// No idea why 0.exit() works or whether it's the best convention, but this allows
// me to just run this code like a script and not have the sclang interpreter
// just hanging around

SystemClock.sched(5.0, {"pkill sooperlooper".systemCmd(); 0.exit(); nil;});