i'm not able to make sense of this. it sounded "right" this morning through the mixer as a soundcard, now it's gone wonky on me...
i noticed that changing the buffer size when using the mixer as a sound card is modifying the bottom end, which doesn't make any sense to me. it's almost proof that i'm running through an external server that's put a limiting effect on everything i'm playing. that's about the only thing that would explain this.
what the buffer size does is tell your operating system how often to look for the next piece of the stream. this is primarily going to make a difference when recording, because it introduces a delay. however, it can also make a difference in the way that plugins calculate the sound, which is why i can hear the difference when i'm monitoring the mix.
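to put a number on that delay: it's just buffer size over sample rate. a quick sketch (i'm assuming a 44100 hz sample rate here, which is the usual default, not something i've confirmed on the unit):

```python
# rough delay introduced by the buffer: the os fetches one
# buffer at a time, so each chunk is at least
# buffer_size / sample_rate seconds behind real time.

def buffer_latency_ms(buffer_size, sample_rate=44100):
    return buffer_size / sample_rate * 1000.0

for size in (256, 512, 1048576):
    print(f"{size:>8} samples -> {buffer_latency_ms(size):.1f} ms")
```

so 256 samples is about 6 ms, 512 about 12 ms, and a comically huge buffer would be almost 24 seconds. noticeable when recording, but it's still just a delay, not a tone change.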
it really, really shouldn't make any difference at all when i'm simply streaming a song. nothing's being calculated. whether it takes 256 or 512 or 1048576 samples at a time, it's taking exactly the same thing one way or another.
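the point being: cutting the same stream into different-sized chunks gives you the same bytes back when you put them together. a trivial sketch (the byte pattern is just a stand-in for an audio stream):

```python
# chunking a stream differently shouldn't change its contents:
# the concatenation of the chunks is identical either way.

data = bytes(range(256)) * 64   # stand-in for an audio stream

def chunks(stream, size):
    return [stream[i:i + size] for i in range(0, len(stream), size)]

# 256-sample chunks and 512-sample chunks reassemble to the same stream
assert b"".join(chunks(data, 256)) == b"".join(chunks(data, 512)) == data
```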
the fact that it *is* making a difference indicates that something *is* being calculated. and, while i may be uncovering some hidden algorithm in foobar or a stealth effect on the unit, i don't think either of those things is true.
again: it sounds like a limiting effect. and it's fucking up the bottom end. and i don't really know what to do about it....
i'm considering just disconnecting that computer from the internet altogether.
there's no wireless card in it. i made sure of that.
waveforms on the bottom end have longer periods. so, if you're calculating something on the bottom end, you could conceivably erase it by taking chunks that are too small to see a full cycle. conversely, a distortion effect on the low end could conceivably smooth itself out if you take chunks that are too big. so, it seems like i'm running into a contradiction: i want a smooth bass part and a jagged guitar part.
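to make that concrete: samples per cycle is just sample rate over frequency. a quick sketch (44100 hz is again my assumption, and the two notes are just illustrative picks):

```python
# how many samples one full cycle of a waveform occupies.
# a chunk shorter than one cycle never "sees" the whole wave.

def samples_per_cycle(freq_hz, sample_rate=44100):
    return sample_rate / freq_hz

print(f"55 hz (low bass a): {samples_per_cycle(55):.0f} samples per cycle")
print(f"330 hz (guitar e):  {samples_per_cycle(330):.0f} samples per cycle")
```

a 256-sample buffer holds almost two full cycles of the guitar note but less than a third of one bass cycle, while a 4096-sample buffer holds several bass cycles. that's the shape of the contradiction.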
but none of that should happen when i'm playing an already calculated part.