Vienna Symphonic Library Forum

  • MIR Icon movement and Automation

    Is it fair to say that emulating the stage movement of an instrument or vocalist during a clip or phrase is not possible within MIR?
    i.e. with a static orchestra setup, a vocalist moving from down-stage left to up-stage right over the course of a song or phrase cannot be automated within MIR?


  • What looks easy at first glance is a daunting scenario when you look at it from a technical perspective: Remember, MIR is not your standard algorithmic reverb. ;-)

    The moment you drop a MIR Icon at a certain position on a Venue's stage, the engine chooses the closest set of raw Ambisonics impulse responses, takes into account the directivity patterns of the instrument Profile as well as the Icon's width and rotation, and creates a unique set of individual IRs according to the chosen output format and the assigned RoomEQ. _This_ result is passed to the convolution engine, where the actual room signal is calculated.

    As soon as you change _any_ of these parameters, MIR has to recalculate all of them. This happens amazingly fast thanks to some clever optimizations, but it is not real-time: there can be no continuous movement without interruptions, which is why the engine does not allow automation of these settings.
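To illustrate the point, here is a purely conceptual Python sketch. The function names and the toy IR model are made up for illustration and have nothing to do with MIR's actual implementation; the sketch only shows that the impulse response depends on the source position, so any movement invalidates it:

```python
import math

def render_ir_for_position(x, y, length=8):
    # Toy stand-in for the expensive step: deriving a position-specific
    # impulse response. (Hypothetical model; MIR builds its IRs from
    # measured Ambisonics data, directivity, width/rotation and RoomEQ.)
    dist = math.hypot(x, y) + 1.0
    return [math.exp(-n / dist) / dist for n in range(length)]

def convolve(dry, ir):
    # Plain FIR convolution: the cheap, streamable part of the process.
    out = [0.0] * (len(dry) + len(ir) - 1)
    for i, s in enumerate(dry):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out

# A different position yields a different IR, so any movement forces a
# full rebuild before convolution can continue. That rebuild is the step
# that cannot run continuously in real time.
ir_a = render_ir_for_position(2.0, 1.0)
ir_b = render_ir_for_position(5.0, 3.0)
```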

    ... we thought this was a reasonable limitation, considering that musicians tend to stay in their seats during a typical orchestral performance ;-D.  If you really, really need changes to a source's position, then please use a General Purpose profile, set the width to the desired value, and use one of the countless tools that allow for panning automation _before_ the signal enters MIR Pro.
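A minimal sketch of that pre-MIR panning automation, assuming a simple constant-power pan law (illustrative only, not a VSL API):

```python
import math

def pan_automation(mono):
    """Sweep a mono phrase from hard left to hard right using a
    constant-power pan law, sample by sample."""
    n = len(mono)
    out = []
    for i, s in enumerate(mono):
        t = i / (n - 1) if n > 1 else 0.0   # 0 -> 1 across the phrase
        angle = t * math.pi / 2
        out.append((s * math.cos(angle), s * math.sin(angle)))  # (L, R)
    return out

# The resulting stereo signal is what would then be fed into MIR Pro
# on a General Purpose profile:
panned = pan_automation([0.5] * 8)
```

In practice the sweep would be drawn as an automation curve in the host DAW rather than computed per sample, but the gain math is the same.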

    HTH,


    /Dietz - Vienna Symphonic Library
  • Dietz,

    thank you for your response. I knew I was asking for the heavens and the stars with that one. I believe a possible workaround is to combine your suggestion of standard panning with multiple run-throughs at static locations in MIR, tracing out the path I'm looking for, and then crossfade those multiple recordings into one continuous track. Sort of like joining the dots...

    Still I think MIR is the only solution that comes close to meeting this challenge.
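This "joining the dots" idea, several static-position renders blended into one continuous track, can be sketched as an equal-power crossfade. The helper names and the sample data are hypothetical; the two lists stand in for bounces made with the Icon parked at two stage positions:

```python
import math

def equal_power_crossfade(render_a, render_b):
    """Blend two equal-length renders so the overall level stays
    roughly constant while the source 'moves' from A to B."""
    assert len(render_a) == len(render_b)
    n = len(render_a)
    out = []
    for i, (a, b) in enumerate(zip(render_a, render_b)):
        t = i / (n - 1) if n > 1 else 1.0   # 0 -> 1 across the clip
        out.append(a * math.cos(t * math.pi / 2) +  # render A fades out
                   b * math.sin(t * math.pi / 2))   # render B fades in
    return out

# Hypothetical bounces of the same phrase, one per static MIR position:
stage_left = [0.5] * 8
stage_right = [0.5] * 8
moving = equal_power_crossfade(stage_left, stage_right)
```

Chaining several such crossfades, one per pair of adjacent positions, traces out the full path.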

    Again, thank you for your response, it was very helpful.

    Spike


  • Hi Spike,

    thanks for your friendly message!

    Rest assured that we have pondered _many_ ideas in this regard over the years, but in the end there was neither enough time nor enough interest to actually pursue a solution. MIR's "players" will stay in their seats, for the time being. ;-)

    Enjoy MIR Pro!

    Kind regards,


    /Dietz - Vienna Symphonic Library