192 kHz and 96 kHz are used for marketing purposes and have nothing to do with improved quality.
Equipment manufacturers talk up 96 kHz to make their products appear better than mere 44.1/48 kHz products. Recording studios use it to attract business because the uninitiated customer believes it is bound to be better. Don't forget, the average consumer thinks MP3s are better than CD because they are a newer innovation. It is a marketing world.
If you stood in front of a guitar amp, you could not hear anything above 20 kHz, so recording it at 96 kHz is still not going to let you hear anything above 20 kHz. Sampling at 44.1 kHz captures everything up to about 22 kHz (half the sample rate), which is already above what you can hear (by you, I mean a human being).
So there is no audible difference between the captured audio at 96 kHz and at 44.1 kHz. If you can hear a difference, it can only be the A-D/D-A converter processing the signal differently, not the sample rate itself.
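If you want to check the arithmetic yourself, here is a quick Python sketch. The 20 kHz figure is the usual textbook upper limit for human hearing, and the half-the-sample-rate limit is standard sampling theory; nothing else is assumed.

```python
# Quick sanity check: the highest frequency a sample rate can capture is
# half the rate (the Nyquist limit). Compare that with the ~20 kHz upper
# limit of human hearing.
HEARING_LIMIT_HZ = 20_000

for rate_hz in (44_100, 48_000, 96_000, 192_000):
    nyquist_hz = rate_hz / 2
    margin_khz = (nyquist_hz - HEARING_LIMIT_HZ) / 1000
    print(f"{rate_hz/1000:>5.1f} kHz sampling captures up to "
          f"{nyquist_hz/1000:.2f} kHz ({margin_khz:+.2f} kHz above hearing)")
```

Every one of those rates already covers the full audible band; the higher rates only add ultrasonic headroom that nobody can hear.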
So, to keep this scientific, perhaps you could let us know which microphone, monitors and converters you use, and most importantly, tell us what actually sounds different between 44.1 kHz and 96 kHz.
It could well be time to stop fooling yourself with the placebo effect of pressing the 96 kHz button.
A commercial studio may have to produce 96 kHz audio when a customer requests it, and will oblige, but that is a business decision; it has nothing to do with quality.
A quick apology to the OP of this thread, as it has been hijacked a little by another topic, but the original thread topic had run its course anyway.