you're always learning things when you're recording on your own like this. i'm a guitarist, really. but i need to play the role of arranger, engineer, mixer, producer....a lot of it reduces to figuring stuff out on the fly....but once you learn it, you can keep going.
i've been having weird issues with this mix, so i sat down over the last two days and did a very careful a/b test of the real-time vs. regular (offline) render out of cubase. this is an old version of cubase, fwiw - sx3.
my analytical understanding of the situation is that there are going to be a few trade-offs. intuitively, the regular render ought to be the superior one, because the calculation isn't constrained by a real-time deadline and can be more exact. however, i've studied a little bit of floating point arithmetic, and i know the rounding error can become quite substantial over long chains of operations. so, i had some concern that the offline path might be feeding the plugins more information than they were built for, producing more error rather than less. this is maybe a little counter-intuitive - you'd think the more exact you are, the better it would be - but when you get into the nitty gritty of the way floating point arithmetic works in a computer, it can go the other way around: the more exactness you demand, the greater the accumulated error. there are no philosophical conclusions to draw from this, it's a user error. but i'm the user, here, and need to try and understand whether i'm making that error or not. based on what i'd learned from adjusting the latency, it seemed to me that the plugins i'm using are optimized for real-time playback, but i wanted to be sure of it.
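to make the floating point worry concrete: here's a tiny python sketch of the general phenomenon (nothing to do with cubase's actual internals - purely an illustration of how rounding error piles up over a long chain of single-precision operations).

```python
import numpy as np

# repeatedly add the same small value in 32-bit float. each addition rounds,
# and because 0.1 isn't exactly representable in binary floating point, the
# running sum drifts away from the double-precision reference as the number
# of operations grows - the same kind of error that accumulates inside any
# long chain of dsp calculations.
x = np.float32(0.1)
n = 1_000_000
sum32 = np.float32(0.0)
for _ in range(n):
    sum32 = np.float32(sum32 + x)

sum64 = float(x) * n  # double-precision reference for the same sum
error = abs(float(sum32) - sum64)
print(f"32-bit sum: {float(sum32):.4f}")
print(f"64-bit reference: {sum64:.4f}")
print(f"accumulated error: {error:.4f}")
```

the point isn't the exact numbers, just that the error scales with how much arithmetic you pile on - which is the shape of the concern i had about the offline render.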
starting with the a/b, i could definitely hear a difference in the reverb in one of the first sections, but my intuition was that it was probably mostly randomness in the plugin. i kept going and couldn't hear a difference anywhere else in the song, except that the distortion sounded a little less defined in the regular render. but it was extremely subtle. this confused me a little.
i ended up doing a null test and it came back flat - except for that reverb (and especially the flange over the reverb). i was using the same plugin elsewhere and it nulled there. why didn't it null here?
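for anyone who hasn't done one: a null test is just phase-flipping one render and summing it against the other - identical renders cancel to digital silence, and any residual is exactly where they differ. a minimal python sketch with synthetic signals (the "renders" and the injected difference are purely illustrative):

```python
import numpy as np

sr = 44100
t = np.arange(sr) / sr
rng = np.random.default_rng(0)

# two "renders": bit-identical, except for a tiny difference in the
# back half (standing in for a plugin that doesn't render the same twice)
render_a = np.sin(2 * np.pi * 440 * t).astype(np.float32)
render_b = render_a.copy()
render_b[sr // 2:] += (1e-4 * rng.standard_normal(sr - sr // 2)).astype(np.float32)

# phase-flip and sum == subtract; measure the peak of what's left
residual = render_a - render_b
peak_db = 20 * np.log10(np.max(np.abs(residual)) + 1e-12)
print(f"peak of residual: {peak_db:.1f} dBFS")
```

the first half cancels perfectly; the residual in the back half sits way down near the noise floor - flat-looking on a meter, which is exactly why the next step is to amplify it.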
a little research proved my intuition correct - most modulation and time-based effects have deliberate randomness engineered in, so that each render comes out a little different. that means i'm going to get neurotic mixing this thing down, surely.
i then amplified what looked like a flat signal up 48 db (which is a lot of gain) and it came back with a lot of fuzz, and a very defined difference in the fuzz on the cello. the conclusion is that the distortion is definitely rendering differently, and i was able to hear it in the a/b. but it's extremely minimal, and not likely to be noticed by anybody but the composer.
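for scale: 48 db is a linear gain of roughly 251x, so even a residual sitting way down around -90 dbfs (a number i'm picking purely for illustration, not a measurement from my session) gets lifted into clearly audible territory:

```python
# convert a db gain to a linear factor: lin = 10 ** (db / 20)
gain_db = 48.0
gain_lin = 10 ** (gain_db / 20)

# boosting in the log domain is just addition; multiplying the samples
# by ~251 is the same as adding 48 db to the residual's level
residual_db = -90.0  # illustrative residual peak level
boosted_db = residual_db + gain_db
print(f"linear gain: {gain_lin:.1f}x")
print(f"{residual_db:.0f} dBFS residual -> {boosted_db:.0f} dBFS after boost")
```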
regardless, i've got enough evidence in front of me to stick with the real-time render. the real-time render is supposed to sound the way you're mixing it, and i think it's pretty accurate (now that it's at the right latency). asking the plugin to render offline seems to be asking it to be more exact than it was engineered to be, thereby producing error in the wavetable calculation. it's minimal, but it should be avoided on principle.