Damn, I just composed a reply and hit some mysterious key and lost it. Thank you for your posting; you are one of the few people who has been able to restate my challenge clearly and acknowledge the key problem: determining the state of each track's musical rendering at the start of the nth bar.
I am learning flute, mainly Baroque pieces at the moment. Excellent for technique, my music teacher tells me. However, I do not have a ready source of YouTube material to groove along with as backing for duets etc. I don't mind entering the music myself in MIDI format, as it is very versatile (i.e. you can alter the tempo without changing the pitch, and achieving/modelling accurate pitch is very important for flute). However, I don't know of a MIDI player that lets the operator begin at the nth bar and play to the mth bar, over and over. I would like to be able to play the harder subsections of a tune and have the accompaniment work in with that, and play it at a slower tempo at first so I don't practise in errors.
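The tempo-without-pitch-change point falls out of how MIDI stores timing: events are scheduled in abstract ticks, and only the tick-to-milliseconds conversion depends on tempo. A minimal sketch of that idea (the function and the `practiceFactor` parameter are my own invention for illustration, not from any library):

```cpp
#include <cstdint>

// MIDI stores timing as ticks, so slowing a piece down is just a change in
// the tick-to-milliseconds conversion; pitch is untouched because notes are
// numbers, not audio samples. practiceFactor = 0.5 would mean half speed.
double tickToMs(uint32_t ticks, double usPerQuarter,
                uint32_t ticksPerQuarter, double practiceFactor) {
    // usPerQuarter comes from the MIDI tempo meta event (e.g. 500000 = 120 bpm)
    return ticks * (usPerQuarter / 1000.0) / ticksPerQuarter / practiceFactor;
}
```

At 120 bpm with 480 ticks per quarter note, one quarter note (480 ticks) maps to 500 ms at full speed and 1000 ms at half speed, with every note number left alone.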
A number of problems have become evident. First, the composing software introduces rounding errors into the delta delays, and the parser adds small errors of its own, so the parser gradually drifts away from the exact beat. I discovered this when I added an automatic metronome beat based on the tempo calculations and found the parser was about a whole beat out by the end of a longish tune with many small note values. Not only that, but the multiple tracks were drifting at slightly different rates, so the first notes in a bar did not necessarily have exactly the same timings, especially towards the end of a longish piece. I tackled this with a quantizing routine that aligns the first note of each bar to the calculated start of that bar rather than to what the accumulated delta times dictate. In practice the delta times run a little quick, so when a first note in a bar is detected it is given the calculated bar time instead of its own delta time, which simply makes it wait a bit longer before playing; this can happen at different moments on the different tracks. That brought the beats back to strict tempo and made it much easier to determine what was happening MIDI-wise and music-wise at a particular bar, since I could then calculate the exact time of any given bar. The drift between tracks is less significant, as the errors probably cancel out to a large extent and the tracks stay relatively "in time" as far as our ears are concerned.
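The quantizing idea can be sketched in a few lines. This is only an illustration of the approach described above, with names of my own choosing; the key point is that tempo and time signature give an exact bar length, so the start of bar n can be computed directly instead of trusting accumulated deltas:

```cpp
#include <cstdint>

// Start time of a given bar in milliseconds, computed from first principles:
// tempo (microseconds per quarter note) times beats per bar. This value never
// accumulates rounding error, unlike a running sum of delta times.
double barStartMs(uint32_t bar, double usPerQuarter, int beatsPerBar) {
    double msPerBar = (usPerQuarter / 1000.0) * beatsPerBar;
    return bar * msPerBar;
}

// If an event is the first note of a bar, snap its scheduled time to the
// computed bar boundary; otherwise keep the delta-derived time as-is.
double quantizeEvent(double deltaDerivedMs, bool firstNoteOfBar,
                     uint32_t bar, double usPerQuarter, int beatsPerBar) {
    if (!firstNoteOfBar) return deltaDerivedMs;
    return barStartMs(bar, usPerQuarter, beatsPerBar);
}
```

Since the deltas tend to run slightly fast, the snapped time is usually a little later than the delta-derived one, which matches the "wait a bit longer" behaviour described above.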
The second major problem was the number of tracks whose state needed capturing. Given that each MIDI track is stored sequentially in its own block, not interleaved, each track needs its own state stored, and the pointers into the tracks have no simple numerical relationship that would guarantee the events stay synchronized in an absolute sense. So I have an array of 16 state records, one for each possible channel, the idea being that once a state has been captured I can use it to begin the tune at that point.
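For what it's worth, a per-track state record along those lines might look something like this. The field names are my own guesses at what needs capturing so a resume sounds right, not anything from a real library:

```cpp
#include <cstdint>

// Hypothetical per-track parser state: everything needed to resume playback
// of this track from the captured point and have it sound correct.
struct TrackState {
    uint32_t filePos;        // byte offset into this track's chunk
    uint32_t ticksElapsed;   // absolute elapsed time in MIDI ticks
    uint8_t  runningStatus;  // last status byte, for running-status events
    uint8_t  program;        // current patch, so the right voice plays
    uint8_t  volume;         // CC7 value in effect at the capture point
};

const int MAX_TRACKS = 16;       // one slot per possible channel
TrackState snapshot[MAX_TRACKS]; // the captured states, one per track

// Copy the live parser state for one track into the snapshot array.
void captureState(int track, const TrackState& live) {
    if (track >= 0 && track < MAX_TRACKS) snapshot[track] = live;
}
```

The important design point is that each track gets its own complete record, precisely because the byte offsets into the separate track blocks carry no timing relationship between them.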
Given that the MIDI file will most likely have multiple tracks, each with its own individual drift that needs correcting (even to know what is in a bar and what is not, some form of quantizing has to take place), it seems to me that I must do a sort of "silent fast-forward parse": the file is parsed in real time, but without the Arduino Mega2560 talking to the Fluxamasynth board, and the corrected delta times are not acted upon until the nth bar is reached, whereupon a snapshot of the parser's state is taken and used as the new starting point. Click the IR remote and voila, the tune begins... if you get my drift.
I hope I have not bored you to tears, and that if I have made some fundamental blunder in the logic you can help me out. Am I on the right track? (Please excuse the indulgence in the pun, couldn't resist it.)