Orchestrating with modeled instruments

While researching modeled instruments in the context of orchestral scoring, I ran into this thread, which opens with the original poster (Rohan De Livera) explaining his process for orchestrating exclusively with modeled solo instruments, along with examples:

As it seems that the whole modeling thing is still pretty experimental, and even more so in classical style contexts, I figured it might be an idea to create a thread for the gathering of, and discussion around, this particular subject.

Sounds like too much of a headache to me. Seems like proof that live players are the way to go. :grin: I think just paying a real player to add into your sample mix is good enough for me - or using Dorico Pro with NotePerformer, which is realistic enough, and Steinberg is continually improving it. So just write the score and hopefully get some live players to perform it.

Yeah, as it is, that’s pretty much what it comes down to. All that effort, for what is essentially still the old synthesis vs sampling trade-off: expression vs quality. And that’s provided one has the “feel” and skill to create good performances one way or another, which is basically what one hires musicians for.

However, the quality and realism of modeled instruments is constantly improving, and if one can spend countless hours learning to play real instruments, why not virtual instruments? Of course, I can certainly see why people focused strictly on composing and orchestration are not willing to invest that kind of time in any sort of instrument, but other than that, there’s still a problem here: Even as someone crazy enough to pick up the violin, cello, and opera at 40+, I’m kind of reluctant to invest countless hours in learning to play expressive MIDI controllers at this point. The instruments don’t quite seem to be up to it yet - but there’s a bit of a Catch-22 situation here; how will we ever know what these instruments can actually do, if hardly anyone bothers learning to play them properly?

Really good points. For my part, I’m just a composer/musician and artist. It’s not that I’m afraid of technology; I’m just not interested in needing a computer science degree on top of my music education to be able to create. I want to focus on the creation - the programming, sampling and all the rest is enough to keep my hands full!

I don’t see the difficulty here. I’ve always wanted better control, more dynamics and so on - I bought a VL1m years ago to pursue that, and even in its infancy that tech greatly inspired me. Since then I’ve always built patches, even using traditional sampling methods, that use a breath controller - and when SampleModeling released their brass, I was delighted. Other than the percussion, guitar and world instruments and a few VSL and Spitfire sounds, virtually all of what I use is modeled instruments. I do a cartoon series, and there is nothing - nothing at all - as responsive, agile or flexible as modeled instruments. And I’m on a deadline, of course, but I don’t feel like they slow me down - I feel like I can play in a muted bass trombone part, have it be funny, and be done - without editing. But this is something I’ve wanted for years, so it may be that I’m motivated to use them in ways that others might not be. I know what they will and won’t do, and supplement them where needed, and I love the result. And not a word from clients about how things sound except “great”. What would slow me down would be if things didn’t sound convincing or emotive, or if I had to change what I wanted to write because the library wouldn’t do it.

In related news, I acquired and installed SWAM Solo Strings yesterday. First impressions are pretty much what I expected: the “bite” from the bow is there if you just play realistically, and the sound is basically nice, but it seems overly “close-mic’ed” or something, so it will need some processing. Finally, the built-in vibrato still sounds very mechanical and “uninspired” to my ears, so I’ll be looking into other ways of doing that.

But first, controllers and setup…

The single best thing you can do for the vibrato is something you can’t do with traditional libraries. I have a TEControl breath controller that senses head movement front/back and left/right (nod and tilt), and I map the nod to vibrato speed. With a small head movement I can speed it up for intensity or slow it down. You could do it with faders too, of course. But do it.
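
If the instrument can’t learn that controller movement directly, a tiny MIDI remapper will do it. A rough sketch in Python with mido - the port names and CC numbers here are pure assumptions, so substitute whatever your breath controller and instrument actually use:

```python
import mido  # pip install mido python-rtmidi

NOD_CC = 3        # hypothetical CC number the breath controller sends for "nod"
VIB_RATE_CC = 20  # hypothetical vibrato-rate CC on the receiving instrument

# Forward everything, rewriting the nod CC to the vibrato-rate CC on the fly.
with mido.open_input('TEControl BC') as inp, mido.open_output('loopMIDI Port') as out:
    for msg in inp:
        if msg.type == 'control_change' and msg.control == NOD_CC:
            msg = msg.copy(control=VIB_RATE_CC)
        out.send(msg)
```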

Been looking for something like this to use as a bow controller for modeled strings…

Doepfer actually has one - the A-198 - which has both position and pressure, but it’s intended for analog modular, and comes with a Eurorack interface module. There was a MIDI+CV version, R2M, but that’s no longer available, as the MCU they used went out of production. :-/

There is the Sensel Morph, though, which is apparently X-Y + pressure. Excellent!
https://www.thomann.de/intl/sensel_morph.htm

Then it dawned on me… I have a Wacom pen screen! X-Y + pressure as well, and it also adds 2D tilt sensing for the stylus. There seem to be some solutions for using these things for MIDI:

Update:
Unfortunately, it looks like most tablet-to-MIDI utilities are gone, abandonware, and/or Mac-only, but I found some potentially useful stuff on GitHub. At worst, I’ll have to roll my own, I suppose.
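
For the record, rolling one’s own doesn’t look too bad. A rough sketch with Python and mido - the tablet-reading part is a stub, since the real values would have to come from the tablet driver (Wintab on Windows), and the CC assignments are just examples:

```python
import time
import mido  # pip install mido python-rtmidi

def read_pen():
    """Stub: return (x, y, pressure), each normalized to 0..1.
    Real values would come from the tablet driver (e.g. Wintab)."""
    return 0.5, 0.5, 0.0

out = mido.open_output('loopMIDI Port')  # virtual port name is an assumption

while True:
    x, y, pressure = read_pen()
    out.send(mido.Message('pitchwheel', pitch=int((x - 0.5) * 2 * 8191)))
    out.send(mido.Message('control_change', control=1, value=int(y * 127)))          # CC1: dynamics
    out.send(mido.Message('control_change', control=11, value=int(pressure * 127)))  # CC11: expression
    time.sleep(0.005)  # ~200 Hz update rate
```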

Maybe Expressive E’s Touché?

Well, yes, I have one, and I’m going to try using it for bow control in modeled strings. Technically, its output could be translated to position + weight, but it only has a tiny fraction of the length of a violin bow, so I’m not sure “bowing mode” would work - but it might work in the “normal” dynamics mode.

Right! Grabbed my son’s Leap Motion, and set it up as a MIDI controller under Windows 10, which basically comes down to some simple steps:

  1. Install the driver/SDK.
  2. Install the Geco MIDI app.
  3. Install a virtual MIDI loopback solution, if you don’t have one already. (I went with loopMIDI.)
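
To sanity-check the chain before pointing any instruments at it, it’s handy to just dump whatever arrives on the virtual port. A minimal sketch with Python and mido (the port name is whatever you created in loopMIDI):

```python
import mido  # pip install mido python-rtmidi

print(mido.get_input_names())  # confirm the virtual port actually exists

# Dump everything Geco sends, to verify CC numbers and value ranges.
with mido.open_input('loopMIDI Port') as port:  # name as created in loopMIDI
    for msg in port:
        print(msg)
```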

However, a few issues/warning signs immediately present themselves:

  • You need to use V2, as Geco is not compatible with the later versions. (Latest is 4.1.0.)
  • …which in turn means that Geco can’t take advantage of the improved tracking, lower latency and other improvements in the more recent SDKs/drivers.
  • …but OTOH, those releases are Windows-only, so for Mac and Linux users, the latest version is 2.3.1 either way.
  • The V2 SDK is broken by the Windows 10 Fall Creators Update, so you need to apply a hotfix for that, or the device won’t be detected at all. (See https://forums.leapmotion.com/t/resolved-windows-10-fall-creators-update-bugfix/6585)

Now, that part sorted, on to actually setting it up and using it! My setup, and some observations from the technical side of things:

  • Latency is impressively low! Something like a typical wireless “office” mouse on a 60 Hz screen: there is definitely lag, but it’s probably not even noticeable unless you’re used to gaming mice and 120+ Hz screens.
  • Finger tracking is a bit glitchy if all fingers are not clearly visible, and touching/overlapping hands tends to break tracking entirely, but with mostly open and non-interfering hands, it’s impressively accurate.
  • The device can’t reliably tell if a hand is facing up or down as it first enters view, and thus, it can’t reliably tell left from right either. Also, any brief tracking loss will restart detection. These things combined mean hands can suddenly flip or be swapped, at which point you need to move your hands away from the device and start over to get back in sync. Thus, it’s best to explicitly configure Geco for single hand use, to at least remove that variable.
  • Due to the previous issue, even in single hand mode, Geco can still get confused as to whether the hand is facing up or down at times. It would have been nice to have an option to just assume that the hand is always facing (mostly) down, to eliminate the remaining glitches where it gets stuck thinking your hand has turned 180°.
  • Open/closed hand tracking in Geco is not very reliable - or rather, you should be using fist vs “jazz hands” for (more) solid detection. The “fingers together” shape in the UI causes glitching between open/closed as soon as you rotate your hand. I gave up on that feature for now, which leaves me with 6 DOF: 3D position + 3D rotation.
  • I’d have wanted the option of a “grip strength” axis instead of the open/close logic, for a seventh analog axis.
  • More options for curve/linearity, range, deadzone etc would have been handy in Geco, but it’s flexible enough to get the basic data you want - close enough for feeding directly into modeled instruments.
  • For the purposes I have in mind, I wouldn’t even use hard deadzones anyway, as I specifically want the “constant human noise” to make things feel and sound more like real instruments. The exponential curves are enough that I can use one axis for pitch bend (horizontal position in my case), and have a playable balance of controlled bends and accurate but “human” intonation. (See the sketch below.)
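
For the curious, the curve I mean is basically just a sign-preserving power function. A minimal sketch in Python (the exponent and the mapping to 14-bit pitch bend are my own choices, not anything Geco-specific):

```python
def bend_curve(x, k=3.0):
    """Map a normalized hand position x in -1..+1 to a 14-bit pitch bend value.

    k > 1 keeps small motions subtle ("human" intonation around the center)
    while still allowing full, controlled bends at the extremes - no hard
    deadzone needed."""
    y = (abs(x) ** k) * (1.0 if x >= 0 else -1.0)
    return int(round(y * 8191))  # MIDI pitch bend spans -8192..+8191
```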

For my first experiments with Infinite Woodwinds 2.0, I configured Geco like this:

  • L/R position: Pitch bend, scale 0.8 inverted, offset 50%, curve 20%
  • U/D position: CC1 (modwheel), scale 1.2 inverted, offset 100%, curve 10%
  • B/F position: CC16/CC17 (flutter/growl), scale 0.5, curve 50%
  • Pitch: (Unused)
  • Roll: CC21 (Vibrato depth), scale 2.0 inverted, offset 50%
  • Yaw: CC20 (Vibrato rate), scale 2.0 inverted, offset 50%

I’m probably going to use pitch for mute wah-wah for instruments that support it. I first used it for vibrato rate, but that just didn’t make sense somehow, and it feels more logical to use yaw for that.
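
In case anyone wants to replicate this outside Geco, the mapping boils down to something like the sketch below. Note that how Geco actually combines scale/offset/curve internally is my interpretation, so treat the math as an approximation:

```python
def shape(v, scale=1.0, invert=False, offset=0.5, curve=0.0):
    """Map a normalized axis value v (0..1) to a CC value (0..127).

    The way scale/offset/curve interact here is a guess at Geco's behavior:
    curve 0 = linear (higher = more exponential), scale stretches around the
    center, offset shifts the resting point."""
    if invert:
        v = 1.0 - v
    v = v ** (1.0 + 4.0 * curve)
    v = (v - 0.5) * scale + offset
    return int(round(max(0.0, min(1.0, v)) * 127))

# The axis -> target assignments from the list above. Pitch bend would use the
# 14-bit -8192..+8191 range instead of 0..127, as in the earlier bend_curve().
MAPPING = {
    'L/R':  ('pitchbend', dict(scale=0.8, invert=True, offset=0.5, curve=0.2)),
    'U/D':  (('cc', 1),   dict(scale=1.2, invert=True, offset=1.0, curve=0.1)),
    'B/F':  (('cc', 16),  dict(scale=0.5, curve=0.5)),  # CC17 gets the same shape
    'roll': (('cc', 21),  dict(scale=2.0, invert=True, offset=0.5)),
    'yaw':  (('cc', 20),  dict(scale=2.0, invert=True, offset=0.5)),
}
```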

Great review, David! As I suspected, it’s not the holy grail of expressive controllers. Impressive about the low latency, though. How would you compare it in real live use vs the TEC Breath Controller, with the 4 parameters you get there?

Thanks!

Indeed; I was worried about it being based on processing of a video stream, as that typically introduces substantial latency, but I suspect this is pretty much frame-by-frame, at 120 Hz. (And on that note: yes, it runs pretty hot…! :smiley: Even in standby mode, it stays comfortably warm.)

Now, that’s an interesting question. Playing with the Leap Motion and the BBC2 feels kind of similar in some ways, but they each have distinctive strengths and weaknesses, and I suspect the best option would be to use both!

  • Typical PitchBend usage: The Leap Motion wins hands down here. You can REALLY play the pitch bend, much better than with wheel, stick, Touché, or BBC2. With a bit of an exponential curve, it’s accurate enough to just have subtle “human intonation” when holding the hand right above the device, while allowing for fluent and controlled bends when desired. It’s perfectly possible to play notes this way, glissando/shift/theremin style, with a bit of practice.
  • True vibrato/shake via PitchBend: BBC2 is the ONLY device I’ve tried so far that is at all usable for this. I wired head tilt to pitch bend, which doesn’t work great for me - BUT, since the device actually measures acceleration in the mouthpiece, it also responds to head shake as a side effect, and that’s the interesting part! The experience feels eerily similar to performing actual vibrato on cello or violin, or when singing, and similarly, it’s also usable for quick “onset slides” and similar articulations.
  • Dynamics/expression: Depends… Using the height axis on the Leap Motion, straight or inverted, just doesn’t do it for me, as there are no natural points of reference. It’s much better than the wheel, especially for swells and general (slow) expressive playing, and it’s even possible to play jazz style saxophone and the like with some training, but that amounts to some VERY aggressive hand-waving. The BBC2 is massively superior here; much more agile and intuitive. So, Leap Motion might work for some instruments, in some cases, but so far, I still prefer the BBC2 for actual expressive playing.
  • Other controls: Leap Motion wins. With the exception of the “special use case” for BBC2 tilt for bend/vibrato, I find the tilt sensing kind of awkward and difficult to use for anything that isn’t better handled by pedals or something, and the bite sensor is tricky to use in a controlled analog fashion. Meanwhile, moving and tilting your hand essentially gives you 6 (mostly) independent dimensions of control with pretty much equal precision, so it’s massively superior to basically everything if you want to control multiple parameters simultaneously.

Now, since Leap Motion doesn’t really seem to have sufficiently accurate tracking to use individual fingers and whatnot for MIDI, it’s effectively 6 DOF as it is… Maybe a 3DConnexion puck could be an option? However, that essentially deals in force, whereas Leap Motion deals in position, so definitely a very different experience. Force generally gives you better precision and MUCH quicker response than position, but it’s two derivatives away from position (force->velocity->position), so it’s a different “style” of control entirely. Been meaning to get one for move/pan/zoom etc in various applications (graphics mostly), so I might try this out at some point.

Interesting! I guess the final verdict will come in a few months, when you see which one you actually prefer to use in action when adding expression to your music! :slight_smile:

Yeah, I’m expecting it to take a while to figure all this out. It takes years to learn to play real instruments expressively, so it would be very strange if MIDI controllers would somehow Just Work™!

I actually think most users have completely unrealistic expectations when it comes to modeled instruments, in no small part because other virtual instruments are carefully crafted to do what’s desired, rather than accurately render the input.

While a $4000 violin bow, or the many years of training necessary to make proper use of it, won’t raise many eyebrows, people seem to expect a few hundred dollars’ worth of electronics to deliver infinite precision, and turn the player into a virtuoso overnight. Not going to happen! :smiley:

If only all instruments were as easy to sample or model realistically… as percussive instruments (including piano)! :smiley:

Well, I think the problem with that is that an instrument that is easy to sample or model is inherently not very expressive - so if more expression is what you’re after, the real instruments in that category aren’t appropriate either! :smiley:

BTW, I did some more research on the 3DConnexion devices, and unfortunately, it looks like at least some of the devices have very low (~30 Hz) poll rates, which is IMHO a show-stopper, even for non-musical use cases. :confused: They don’t even mention what their actual poll rates are, so I’m not sure what the deal is with the bigger models, but there are vague indications that they can do somewhere between 60 and 100 Hz, which is… “ok.” I can’t quite comprehend why this should even be an issue, as these are pretty expensive devices! Your average gaming mouse at a fraction of the cost is doing 1000+ Hz these days.

We keyboardists have always secretly envied the guitar, because of the piano’s lack of expressive capability. Granted, the piano is KING for dynamic expression, as well as range. But that’s it! Unless you count the pedals, which I personally really don’t. :stuck_out_tongue:

Btw, crazy idea: what if we took a gaming mouse (with its extreme resolution and granularity) and used its movements as a kind of X/Y expression controller? Is it possible?

And on that note, I should finish my analog triple pedal build, and throw that in the mix as well! :slight_smile: All three pedals are analog (though pretty short stroke, as piano pedals are), so that could be interesting.

I have a bunch of piano VIs that support half-pedaling, and I think some have sostenuto logic as well, which would be interesting to try some time… (I’ve only ever played real pianos where the middle pedal is just a half-blow pedal with a latch feature.)

As for the mouse, that just crossed my mind the other day, actually, but the problem is that it’s relative positioning only, which drastically limits its usefulness in this context. I think a tablet or pen screen (as I suggested earlier) would be much more useful here - another thing I should give a go one of these days.

That said, I suppose you could look at deltas from the mouse instead (well, that’s what they actually deal with on the hardware level anyway), and apply a leaky integrator to that. That would pretty much work as a 2D accelerometer. Not sure if it would provide any advantages over an actual accelerometer, though.
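
Something like this, I imagine - a leaky integrator over the deltas, with decay and gain constants to taste (get_delta() here stands in for however you actually read raw mouse deltas, e.g. Raw Input on Windows):

```python
class LeakyAxis:
    """Accumulate relative mouse deltas into a bounded, slowly re-centering
    position - roughly the "2D accelerometer" idea described above."""
    def __init__(self, leak=0.995, gain=0.01):
        self.leak = leak   # per-update decay toward center; closer to 1 = slower
        self.gain = gain   # scales raw counts into the -1..+1 range
        self.pos = 0.0

    def update(self, delta):
        self.pos = self.pos * self.leak + delta * self.gain
        self.pos = max(-1.0, min(1.0, self.pos))
        return self.pos    # feed to pitch bend / a CC, curved as before

# Usage sketch, one axis: x = LeakyAxis(); bend = bend_curve(x.update(get_delta()))
```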
