
Midi files not working with Quantiloop

Started by go2ldook, October 16, 2019, 10:28:41 PM


go2ldook

I have been successfully using MIDI files attached to recordings to trigger my Pigtronix Infinity Looper. I am trying to do the same thing with Quantiloop. It is responding to preset changes in my MIDI preset for a song and getting tempo data, but CCs from the running MIDI file are not working. They are still passing to other devices... my loop switcher is changing presets as expected, so I know the file is running in the background. I have the correct channel selected for Quantiloop, and I have customized the CCs to be the same as the commands I used for the Infinity.

Is there something I am missing as far as accomplishing this task between the two apps?

arlo

So Quantiloop responds to messages from a MIDI preset, and other devices respond to messages from a MIDI file, but Quantiloop doesn't respond to messages from a MIDI file -- is that right?

go2ldook

That is correct. And I have confirmed that I used the same MIDI channel for Quantiloop that I had assigned for my Infinity Looper, and assigned the CC messages to identical functions. Quantiloop is also receiving tempo data from BandHelper. As a check, I may turn tempo off and see what happens. The MIDI file has a tempo as well... I wonder if it would be "seen" by Quantiloop.

This is the one thing standing in the way of a really optimal looper solution. The use of the MIDI file with the Infinity looper is amazing. I get to play guitar (using a click track, of course), and the loops are triggered perfectly every time. Much less stress, and I can focus on my performance. But that looper is somewhat limiting with only 2 channels, and it has some issues that I have learned to accept. Quantiloop, if I can get it working, would provide incredible control.

arlo

While playing the MIDI file, you can click the MIDI icon in the top toolbar to open the MIDI Status window, then look in the Activity Log to see the data being sent. Does that show that the data is being sent to all ports?

go2ldook

Not sure what I did, but it's working now. I am having to tweak the way the commands work, but it shows promise.

go2ldook

So it is working now... with issues. The MIDI changes for my guitar pedals happen reliably, like they always have. With Quantiloop it is hit or miss, especially early MIDI changes, like recording the first track.

In BandHelper, it is sending MIDI clock data and the tempo is set per song. In Quantiloop I have the tempo set to be controlled by BandHelper, and that comes through. But I run into these intermittent issues that make me uncomfortable looping live. Another odd behavior in the BandHelper app that I get intermittently is that the Recording won't play. I have Recording 1 as a backing track with the MIDI file attached, and this has worked almost flawlessly with my Infinity hardware looper. With Quantiloop, instead of the normal count-in timer under the recording icon I will get something that reads like -44000:00 (I can screenshot that when I get home), and the recording won't start. I have to click on a couple of songs to get it working again.

And a last thing I see: I have struggled with an inherent offset between the running MIDI file and the MP3 backing track, despite the track being created in a sequencer with the same tempo and the MIDI CCs placed in the right positions (the right measure position). So I generally have to adjust either the MIDI file temporally or the MP3 track. I have found the MIDI file seems to run in pretty good sync with BandHelper's tempo, so lately I have focused on adjusting the MP3. To do this I run the audible tempo from BandHelper concurrently with my MP3 track (normally the audible tempo is off). I then tweak the sequenced track until the tempo and track waveforms line up. For some reason this is causing BandHelper to crash. Usually I can get enough beats to serve my purpose, but I wondered if this is fixable as well.




Ahiru

Quote from: go2ldook on October 26, 2019, 08:51:24 AM
<snip> I have struggled with an inherent offset between the running Midi file and the Mp3 backing track, despite the track being created in a sequencer with same tempo and putting the Midi CCs in the right positions (the right measure position). So I generally have to either adjust the Midi file temporally or the mp3 track. <snip>

Yes, this is correct. I ran a lot of tests on this and concluded that events from a MIDI SMF file precede the audio Recording events by about 215 msecs (corrected: I had it reversed in the original post!). (There is also an anomaly for an event at the very start of the song, where the delta is about 182 msecs.) I use the MIDI (SMF) file to control lights. After entering lighting events in a track on my DAW corresponding to backing track events (e.g. on exact measures, etc.), as a final step I have to shift the entire lighting MIDI file by a fixed number of msecs so the lighting changes will look 'in the pocket' with the audio.
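
For anyone who would rather script that shift than nudge tracks inside the DAW, here is a minimal sketch of the idea using the third-party Python library mido. The file names, the 100 ms figure and the single constant tempo are placeholder assumptions, not Ahiru's actual workflow:

    import mido

    def shift_midi(in_path, out_path, delay_ms, bpm=120.0):
        """Delay every performed event in an SMF by a fixed number of milliseconds."""
        mid = mido.MidiFile(in_path)
        # Convert the millisecond delay to ticks using the file's resolution.
        # (Assumes one constant tempo; a file with tempo changes needs per-segment math.)
        ticks_per_ms = mid.ticks_per_beat * (bpm / 60.0) / 1000.0
        delay_ticks = round(delay_ms * ticks_per_ms)
        for track in mid.tracks:
            for msg in track:
                if not msg.is_meta:
                    # Delta times are relative, so padding the first real event
                    # pushes the whole track later by the same amount.
                    msg.time += delay_ticks
                    break
        mid.save(out_path)

    shift_midi('lighting.mid', 'lighting_shifted.mid', delay_ms=100)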

I would implore Arlo: if he does anything to change this delta, please also add a global parameter that allows us to enter our own offset adjustment between audio and SMF, so I don't have to re-adjust many dozens of SMF files. :D (Of course it could also be due to something in the iOS audio/MIDI libraries that BandHelper has little control over, and thus Apple could change it... I hold my breath whenever testing new iOS releases to make sure relative latencies don't change!)

go2ldook

According to my measurements, using waveform analysis to line up a click with a triggered midi note, the issue is further complicated by the fact that the offset varies according to tempo. I created a tempo table in Excel to help me deal with this and try to predict the offset.



Ahiru

Quote from: go2ldook on October 27, 2019, 06:17:00 AM
According to my measurements, using waveform analysis to line up a click with a triggered midi note, the issue is further complicated by the fact that the offset varies according to tempo. I created a tempo table in Excel to help me deal with this and try to predict the offset.


Similar here...  I delay the MIDI lighting track by 100msecs, but for each song I have to calculate what that is in terms of the DAW's native 'tics', and so I have my spreadsheet: plug in BPM and 100 msecs target, get tics to shift. :) 
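
In case it helps anyone reproduce that spreadsheet, the conversion is just the following (a sketch; the 480 PPQ resolution is only an example and is project-specific):

    def ms_to_ticks(delay_ms, bpm, ppq=480):
        """Convert a fixed millisecond delay to DAW ticks at a given tempo."""
        ms_per_quarter = 60000.0 / bpm          # one quarter note in milliseconds
        return round(delay_ms / ms_per_quarter * ppq)

    print(ms_to_ticks(100, bpm=120))            # -> 96 ticks at 120 BPM, 480 PPQ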

My waveform analysis showed the absolute delta msecs between the Recording and SMF events was fixed, regardless of the song BPM.  I don't use native BH timing for anything (tics, clock, etc.).  So I only need to fuss over getting that fixed Recording/SMF delta handled for each song.

BTW, even though the absolute delta is 215 msecs, I delay the lighting MIDI track by only 100 msecs...  that's because in my system there are lots of other downstream paths that add their own latency to both the audio (not so much) and lights (more, going to a second iPad running the separate lighting app, then DMX over Wi-Fi, plus consideration of physical lighting turn-on time).  So in the end I came to the 100 msec figure by trial and error, implying all that downstream lighting stuff adds about 115 msecs relative to audio.

I was also surprised at how sensitive my eyes+ears are to relatively small delays between lights and sound... just dozens of msecs can make it look like the lighting guy is drunk. I know studies have shown that for audio 10 to 20 msecs is too much slop; maybe lighting/sound sync is similar. (I remain puzzled at how light shows can be made to look tightly synchronized at the back of large venues, given the big difference between sound and light propagation speeds. ???)
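
(For a rough sense of those numbers, assuming ~343 m/s for sound and treating light as effectively instantaneous:

    SPEED_OF_SOUND_M_S = 343.0
    for distance_m in (10, 30, 60, 100):
        lag_ms = distance_m / SPEED_OF_SOUND_M_S * 1000
        print(f"{distance_m:>3} m: sound arrives ~{lag_ms:.0f} ms after the light")

So at the back of a 100 m venue the sound is already roughly 290 ms behind whatever the lights do.)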

arlo

This is interesting. If the MIDI file is delayed, and the amount of the delay varies according to tempo, that can only be a factor of the MIDI player, since an audio player has no concept of the tempo of the file it's playing (it just blindly repeats the series of amplitude changes in the file). Furthermore, the delay must occur after the MIDI player starts playing, rather than being a delay required to start playing, which also wouldn't be affected by tempo. If the delay for a given song is consistent across performances, that would also suggest the delay is part of what the player is doing on purpose, rather than a delay caused by a hardware resource constraint, which would probably vary from one day to the next. If the delay were found to be consistent between older and newer devices, that would further support this explanation.

Have you tried turning on Settings > Audio & MIDI > Low-Latency Recordings? (That would also affect the MIDI player.)

Could you try measuring from when you start the playback to when the first MIDI message is sent, with or without a recording playing, and see if the absence of a recording changes anything? (If you use the low-latency setting, and your recording picks up the sound of your finger hitting the screen, you could use that to measure from. Playback starts on touch down instead of touch up with the low-latency setting.)

Quote
I remain puzzled at how light shows can be made to look tightly synchronized at the back of large venues given the big difference between sound and light propagation speeds

That is a good question. I just did a web search and found one explanation, in the Sep 30, 2016 answer on this page:

https://www.quora.com/Do-bands-playing-in-big-arenas-use-their-own-speakers-or-the-PA-system

Ahiru

Quote from: arlo on November 08, 2019, 04:18:33 PM
If the MIDI file is delayed, and the amount of the delay varies according to tempo, that can only be a factor of the MIDI player...

To clarify, the delay does NOT vary by tempo in my system. The MIDI always precedes audio playback (order corrected) by 215 msecs regardless of song tempo. Likewise, I must always delay the MIDI file by a net (additional) 100 msecs, regardless of the song tempo. What does vary is how that 100 msecs translates to some number of 'tics' within the DAW, since those tics are dependent on BPM; but that's just conversion math, not an actual variable delay.

Quote from: arlo on November 08, 2019, 04:18:33 PM
Have you tried turning on Settings > Audio & MIDI > Low-Latency Recordings? (That would also affect the MIDI player.)

I've never tried the Low-Latency Recordings setting... will check that out.  What does that do under the hood?
(Even if it helps, I probably won't use it, since I don't want to edit about 60 MIDI files. :P Moreover, even if it results in perfect sync of MIDI and audio from the iPad running BandHelper, I still have downstream lighting delays that have nothing to do with BandHelper and will probably need to be addressed with some relative delay.)

Quote from: arlo on November 08, 2019, 04:18:33 PM
That is a good question. I just did a web search and found one explanation, in the Sep 30, 2016 answer on this page:

https://www.quora.com/Do-bands-playing-in-big-arenas-use-their-own-speakers-or-the-PA-system

Interesting article about synchronizing the lights and sound... I had always assumed the multiple speaker systems in a large venue were synchronized by delaying the signal to 'back of house' speakers, so that the output of those back speakers would match the (delayed) sound of the front stage speakers, which would not help with the light/sound sync in the rear of the venue. The cited article seems to say all speakers are 'real-time' and the speakers in the back just 'overwhelm' the sound from the front; though that would address the light/sound sync, I'm skeptical that's what's done... will do some more searching!

arlo

I'm glad to hear the delay doesn't vary by tempo -- that simplifies things!

What I'm seeing is that it does indeed take a while (tens or hundreds of milliseconds) for the recording to start playing, while the SMF playback starts faster, so the two play out of sync. I'm seeing the same problem when playing a recording at the same time as the tempo. I made a recording of a 90 bpm click and am starting that simultaneously with the built-in tempo at 90 bpm, and the offset is easy to hear.
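
(For anyone who wants to measure that offset rather than trust their ears, here is a rough sketch using SciPy cross-correlation; it assumes two mono WAV captures at the same sample rate, and the file names are placeholders:

    from scipy.io import wavfile
    from scipy.signal import correlate

    rate_a, click = wavfile.read('tempo_click.wav')      # built-in tempo clicks, recorded
    rate_b, rec = wavfile.read('recording_click.wav')    # the Recording's clicks
    assert rate_a == rate_b

    corr = correlate(rec.astype(float), click.astype(float), mode='full')
    lag_samples = int(corr.argmax()) - (len(click) - 1)
    print(f"recording lags tempo by ~{lag_samples / rate_a * 1000:.1f} ms")

With a periodic click the correlation can lock onto the wrong beat, so trim both files to start near the same click first.)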

I think the best next step is to try speeding up the recording playback. I realize this will mess up the offsets you have set up, but I think it's better in the long run to have these items play in sync by default. Then if we need to add a setting to intentionally delay the recording, to account for delays in the MIDI chain, I could do that.

Ahiru

Quote from: arlo on November 14, 2019, 01:53:03 AM
I'm glad to hear the delay doesn't vary by tempo -- that simplifies things!

What I'm seeing is that it does indeed take a while (tens or hundreds of milliseconds) for the recording to start playing, while the SMF playback starts faster, so the two play out of sync. I'm seeing the same problem when playing a recording at the same time as the tempo. I made a recording of a 90 bpm click and am starting that simultaneously with the built-in tempo at 90 bpm, and the offset is easy to hear.
Looking back at my testing around July 2018, that's also what I saw (in contrast to my incorrect, reversed description earlier in this thread, which I'll edit to correct): MIDI events are played about 215 msecs before the corresponding audio events.

(In the example image below, the top channel is the audio file (Recording) and the bottom is the corresponding MIDI event (SMF file) playing a low-latency synth. These events line up in the DAW, but played from BH you can see about a 214 msec delta.)

Quote from: arlo on November 14, 2019, 01:53:03 AM
I think the best next step is to try speeding up the recording playback. I realize this will mess up the offsets you have set up, but I think it's better in the long run to have these items play in sync by default. Then if we need to add a setting to intentionally delay the recording, to account for delays in the MIDI chain, I could do that.
That would probably be best for your overall user community (particularly those who use this feature in the future). For me it will still mean converting about 70 MIDI files in my DAW, then reattaching them to recordings, etc. etc. :P But I could get excited about it if you also added a msec-accurate (or at least tens-of-msecs-accurate) optional +/- adjustment to the relative Recording/SMF timing, with (preferably) global effect. As mentioned above, I will still need to add a relative (and constant) offset between SMF and Recording to account for non-BH-related downstream latencies for lighting control. (For a 'corrected' BandHelper that eliminated the current 215 msec delta, instead of delaying the MIDI by 100 msecs, I'd need to delay the Recording (relative to the SMF) by about 115 msecs.) But not requiring the PIA of applying these micro delays to the MIDI tracks within the DAW, and instead just setting one value in BH... great!

One other consideration might be how a 'correction' would affect Recording playback timing relative to already implemented Automation events. Changing things by a few hundred msecs might not affect something like manually recorded lyrics scrolling synchronization enough to matter, but it could get somewhat bothersome for other, more precise kinds of Automation events, such as fx patch changes, for some customers.

(Also, I'd observed a slightly different kind of offset issue at the very beginning of a Recording/SMF; as you do your analysis, I recommend you focus on events after the first second or so of the file so you don't get fooled by some special cases at file start.)

arlo

Good point about the automation tracks. I guess I shouldn't improve the recording playback latency without also offering tools to bring both automation tracks and SMF files back into alignment. (I don't think that's needed for recording+tempo setups because there's no way to delay the tempo playback.)

arlo

Okay, I figured out that the delay was coming from the pitch shift functionality, even when it's not in use for a given recording. I reworked the audio player to leave the pitch shift functionality out of the flow unless a recording needs it, and that decreased the delay of the recording relative to the tempo clicks from about 110 ms (on an iPad 2 with iOS 12) to something I couldn't hear or see in a DAW waveform display. If I play a recording along with a MIDI file and watch the MIDI data in an external monitor, that is also noticeably more accurate, although that's not as good of a test as playing a real song through a real lighting system.
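
(As a conceptual sketch of that change, not BandHelper's actual code, the point is that the latency-adding stage only enters the playback chain when a recording actually uses it; the unshifted path never pays the shifter's buffering cost:

    def decode(buf):
        return buf                              # placeholder decode stage

    def pitch_shift(buf, semitones):
        return buf                              # placeholder; real DSP adds latency here

    def build_chain(pitch_semitones):
        stages = [decode]                       # always present
        if pitch_semitones != 0:                # bypass the shifter when unused
            stages.append(lambda buf: pitch_shift(buf, pitch_semitones))
        return stages

    def play(buf, pitch_semitones=0):
        for stage in build_chain(pitch_semitones):
            buf = stage(buf)
        return buf
)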

Ahiru, I wonder how comparable my numbers are to yours, due to different hardware. If they were comparable, this change would mean that your audio would still play about 100 ms after the MIDI, and you would still need some delays in your MIDI files. With a more complex MIDI system than I have, I would expect the MIDI to end up behind, not ahead, of the audio. In that case I think the simplest solution would be a "delay audio" setting that delays the playback of recordings or tempo clicks to allow better syncing with downstream MIDI equipment. A setting to delay the MIDI playback could also be possible, but it seems less likely that someone would have to adjust in that direction. What do you think? What iPad are you measuring this with?

For automation tracks, I have a wish list item to select multiple events and offset them by a given amount, so that should enable readjusting in either direction as needed.