Forked threads are out of sync


I’m sonifying environmental data and have two objects that play notes. They should be playing in sync, but they’re drifting. I’m starting with a small test where they each play on every beat; the beatPlayer is basically a metronome. The child clocks should be synced, right? So why aren’t the beats lining up?

I call them here:


And the classes are:

import numpy as np

class BeatPlayer:
    def __init__(self, beatLength, inst, pitch, nbeats, vol, playNthbeat=1):
        self.instrument = inst
        self.beatLength = beatLength
        self.nbeats = nbeats
        self.vol = vol
        self.playNthbeat = playNthbeat
        self.pitch = pitch  # static pitch, could modify this later

    def playBeats(self):
        # a pitch on every Nth beat, None (a rest) in between
        pitchArray = np.full(self.nbeats, None)
        pitchArray[::self.playNthbeat] = self.pitch
        for thisBeat in range(self.nbeats):
            self.instrument.play_note(pitchArray[thisBeat], self.vol, self.beatLength)
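For reference, the playNthbeat slicing fills every Nth slot with the pitch and leaves None in between (the None slots are the rests), e.g. with nbeats=8 and playNthbeat=2:

```python
import numpy as np

# every 2nd slot gets the pitch; the rest stay None (rests)
pitch_array = np.full(8, None)
pitch_array[::2] = 60

print(list(pitch_array))  # [60, None, 60, None, 60, None, 60, None]
```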

class DataPlayer:
    def __init__(self, dur, inst, data, CO2max, CO2min, refScale, vol=0.7, tieNotes=False):
        self.instrument = inst
        self.vol = vol
        if tieNotes:
            pitchlist = pitch_list(CO2min, CO2max, data, refScale)
            [self.pitches, self.durs] = tieSamePitches(pitchlist, dur)
        else:
            self.pitches = pitch_list(CO2min, CO2max, data, refScale)
            if type(dur) == np.ndarray:
                self.durs = dur
            else:
                self.durs = len(data) * [dur]

    def playData(self, vol):
        # note: the vol argument is unused here; self.vol is what gets played
        for pitch, dur in zip(self.pitches, self.durs):
            self.instrument.play_note(pitch, self.vol, dur)
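tieSamePitches lives in the other file; it just merges runs of the same pitch into one longer note and sums their durations. Roughly along these lines (a from-memory sketch, not the exact code):

```python
import numpy as np

def tieSamePitches(pitchlist, dur):
    # Merge consecutive repeated pitches into one longer note.
    # dur can be a single value or a per-note array of durations.
    if np.isscalar(dur):
        durs = [float(dur)] * len(pitchlist)
    else:
        durs = list(np.asarray(dur, dtype=float))
    tiedPitches, tiedDurs = [], []
    for pitch, d in zip(pitchlist, durs):
        if tiedPitches and pitch == tiedPitches[-1]:
            tiedDurs[-1] += d  # extend the held note
        else:
            tiedPitches.append(pitch)
            tiedDurs.append(d)
    return tiedPitches, tiedDurs
```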

Are you setting beatLength and dur to the same value when you create the objects?

Maybe you can post a full, runnable version? The child clocks definitely shouldn’t drift!

Yes, beatLength and dur are the same value (1). I pass an array of floats to playData, and an int to BeatPlayer, but I changed the array so it only contained ints and it didn’t resolve the problem.

The code pulls in external datasets and functions from another file, so it’s not easy to post a runnable version here, as I’m not set up on GitHub (yet). I’ll make some dummy data and post a runnable version if I can’t get it working this morning.

Ok, I’ve identified the problem. The clocks are fine and running in sync. I tested it with only percussive instruments with sharp attacks and they’re perfectly together on the beat. But when I use something like a flute or a clarinet for the DataPlayer, the attack sounds like it varies with the pitch (longer attack on lower pitches) and it gives the auditory illusion that they are drifting.

Thanks for your quick reply!

Ah that makes sense.

Yeah, with soundfonts, the same recordings get transposed to produce different pitches. Especially if you were using a super low note for flute or clarinet, it might be transposing a normal note down by an octave or two. It does this by changing the playback speed, which stretches out the attack and creates a sense of delay.
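To put rough numbers on it: resampling-based transposition changes the playback rate by a factor of 2^(semitones/12), and everything in the sample, attack included, gets stretched by the inverse of that. A quick sketch (the 30 ms attack is just an assumed figure):

```python
def playback_rate(semitones):
    # pitch-shift by resampling: rate doubles per octave up, halves per octave down
    return 2 ** (semitones / 12)

def stretched_attack_ms(attack_ms, semitones):
    # the whole sample plays back at `rate`, so the attack lasts attack_ms / rate
    return attack_ms / playback_rate(semitones)

print(stretched_attack_ms(30, -24))  # a 30 ms attack, two octaves down -> 120.0 ms
```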

Was it really low notes?

Yep, it was on the low notes. I’m switching to SuperCollider for synths now, excited to try this out!