Re: Real Strat/LPC 3 Suggestion, Probably Too Late ...
Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Sun Apr 15, 2012 9:25 pm    Post subject: Re: Real Strat/LPC 3 Suggestion, Probably Too Late ...

While RealStrat and RealLPC work flawlessly for leads, I've been struggling for a while to use them for rhythm guitar. Mind you, I can compose and program a rhythm track as easily as I can compose and program a lead, but my problem (and hence, my suggestion) comes into play when it's time to program a doubled rhythm guitar track. In other words, when I want to achieve the effect of one rhythm guitar panned hard right and the other panned hard left. Currently, I'm attempting to do this by duplicating my original rhythm guitar MIDI track, assigning it to a second instance of RealStrat/RealLPC and painstakingly massaging the duplicate: nudging notes, applying humanization, even shifting the entire track forward by a few ticks. The aim is the effect of a rhythm guitar played a second time, with enough variation between the two tracks to prevent phasing problems and to create the "doubled rhythm guitar" effect (something achieved in real life by simply having your guitarist play the second rhythm part a second time through a different amp) ... and I'm not quite getting the results I was hoping for. This is even more problematic when you have a variety of keyswitches in play and have to make sure each one still lines up with the notes you're trying to affect with them.

I realize RealGuitar actually already does this (with a "doubled guitar" patch), and that there are a couple of problems with simply creating a "doubled" patch for RealStrat and RealLPC. First of all, you're still going to want to record each rhythm guitar in mono and feed each mono track to a different amp, so a stereo "doubled" patch wouldn't work. Plus, you might want to feed different notes to different sides at various points in your song, so you'd at least want independent channels for each side.

Sample-based VIs like Shreddage and Electri6ity (not to namecheck the competitors!) seem to achieve this by having a second patch or patches with some sort of built-in humanization feature that internally alters the incoming MIDI from a single rhythm guitar track, so that one rhythm guitar is varied enough from the other to sound realistically doubled. Obviously, RealStrat and RealLPC don't operate like this. But how about a "humanization" button that automatically alters the incoming MIDI information (while leaving things like keyswitches intact)? That way, you'd simply set up two instances of RealStrat/RealLPC, turn the humanization feature on in one of them, duplicate a rhythm guitar MIDI track and feed one copy to each instance. You could even include controls to tweak the amount of humanization, if desired, or simply set it up a la RealGuitar so you can flip a switch and automatically get the effect. It would be both a huge time-saver and an effective way to program doubled guitar tracks with a minimum of fuss.
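
To make the idea concrete, here's a rough Python sketch of the kind of processing I mean. It's only an illustration, not anything MusicLab has implemented: the note representation, the keyswitch range and the nudge amounts are all assumptions.

Code:
import random

KEYSWITCH_MAX_PITCH = 23  # assumption: keyswitches sit below the playable range (below C1)

def humanize(notes, timing_ticks=5, velocity_range=6, seed=None):
    """Return a copy of a track with note timing and velocity nudged
    randomly, leaving keyswitch notes exactly where they are."""
    rng = random.Random(seed)
    out = []
    for n in notes:
        if n["pitch"] <= KEYSWITCH_MAX_PITCH:
            out.append(dict(n))  # keyswitch: copy untouched
            continue
        out.append({
            "start": max(0, n["start"] + rng.randint(-timing_ticks, timing_ticks)),
            "duration": max(1, n["duration"] + rng.randint(-timing_ticks, timing_ticks)),
            "pitch": n["pitch"],
            "velocity": min(127, max(1, n["velocity"] + rng.randint(-velocity_range, velocity_range))),
        })
    return out

# The duplicated track gets humanized and fed to the second instance:
left = [{"start": 0, "duration": 240, "pitch": 40, "velocity": 100}]
right = humanize(left, seed=42)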

Sergey_MusicLab
Moderator

Joined: 17 Jun 2003
Posts: 2781
PostPosted: Mon Apr 16, 2012 5:42 am

Armageddon,

Have you tried setting the pick position in the two RealStrats (or RealLPCs) to opposite values (e.g. '-5' and '+5')? That will eliminate the 'phasing' problem.

Regards,
Sergey
_________________
MusicLab Support.

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Mon Apr 16, 2012 7:24 pm

Sergey, I haven't! Will that replicate the sound of an actual guitarist playing a second rhythm track, though? That's more along the lines of what I was asking.

Also, I forgot to mention that I am, of course, using two different (complementary) amps, not the exact same amp sound for both guitars. I'm also using one instance of RealLPC and one of RealStrat on at least one song, and they're obviously different-sounding from the get-go. Using MIDI humanization on one of the two tracks (I'm still experimenting with the amount) seems to get me into the ballpark of the "doubled guitar sound" I'm trying to achieve, but since you've already shown you can create a "double-tracked" sound with RealGuitar, I was wondering what it would take to create one for RealStrat/RealLPC (even though I understand the process would have to be completely different).

It may also hearten you to know that this comes on the heels of me trying out both Shreddage and Electri6ity after my initial attempts with RealStrat/RealLPC failed. While both libraries give you the ability to create a "double-tracked" rhythm guitar -- in both cases, by simply loading a second patch or patches into Kontakt, panned in the opposite direction, with a humanization/delay filter that creates the "double-tracked" sound -- there is absolutely no contest when it comes to actual programming and keyswitches, not to mention realism. And before obtaining RealStrat and RealLPC, I was using Ministry of Rock ... not even in the same ballpark.

Regarding your suggestion, though: I have already switched "Alter Samples" for the second track, changed the volume of certain articulations (pick noise, fret noise, etc.) and varied the playing velocity, and it seems to help ... it just doesn't quite sound like a realistic, humanized second take. I will definitely try your suggestion!

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Wed May 02, 2012 12:35 pm

I followed your advice and tried opposite pick positions for each of my two rhythm guitars. While this does alter the tone considerably (tone already wasn't an issue on this particular track, since I normally use a different amp for each rhythm guitar, and here I used an instance of RealLPC for the left-side guitar and RealStrat for the right-side rhythm guitar), what it doesn't do is replicate the sound of two independently played rhythm guitar takes, which is a huge part of the "double-tracked" sound.

So I go back to my original suggestion: would it be possible to incorporate a "humanization" button or feature in RealStrat and RealLPC that allows a perfectly quantized MIDI track (keyswitches and all) to be altered and randomized by the VI enough to distinguish it from the non-humanized track, while retaining the placement of keyswitches so that no extra MIDI editing is needed? As I said in my previous post, many current guitar VIs have this function, though in their case they usually load two patches into a multi within a sampler (like Kontakt). That wouldn't work with the Real instruments, but an adjustable humanization function like EZDrummer's (one that can be tweaked, and turned on and off manually or via MIDI automation) would likely not only work, it would add an extra edge of realism to patterns and recorded MIDI tracks even if you weren't trying to double-track rhythm guitars. It would also save a great deal of time and frustration for those of us who ARE trying to create a "double-tracked" rhythm guitar sound and are currently resorting to carefully humanizing MIDI tracks by hand: moving keyswitches to match the "humanized" notes, making sure the humanized notes don't cross unwanted velocity switches, and making sure the track is humanized enough without becoming so different that it no longer works with the other track. And it would give us the ability to preview and adjust the humanization in real time.
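
One of the fiddlier parts of doing this by hand is keeping randomized velocities inside the same velocity-switch zone as the original note, so an articulation doesn't flip by accident. A tiny sketch of that check, with purely hypothetical switch points (the real ones depend on the patch):

Code:
def clamp_to_velocity_zone(original, humanized, thresholds=(64, 108)):
    """Pull a randomized velocity back into the same velocity-switch zone
    as the original note. 'thresholds' are hypothetical switch points."""
    zones = [0, *thresholds, 128]
    for lo, hi in zip(zones, zones[1:]):
        if lo <= original < hi:
            return min(hi - 1, max(lo, humanized))
    return humanized

# A note at velocity 100 randomized up to 111 is pulled back below the (hypothetical) 108 switch:
print(clamp_to_velocity_zone(100, 111))   # -> 107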

Sergey_MusicLab
Moderator

Joined: 17 Jun 2003
Posts: 2781
PostPosted: Wed May 02, 2012 1:22 pm

Armageddon,

Will think about humanization option.

What do you mean by 'MIDI track to be altered and randomized'?
Give us more details about how you do that manually in a MIDI track to get the desired result: timing (note length, start time), velocity, etc.

Note that humanization in timing can only be done with delay (after the notes/events reach the VI), so you'll have to shift the 'humanized' track ahead in time (the greater the humanization value, the greater the shift).
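
To put that constraint in concrete terms: a plugin can only postpone incoming events, never pull them earlier, so 'early' notes have to be faked with a fixed internal delay that the random offset then eats into. A rough sketch of the arithmetic, with made-up numbers (not MusicLab's implementation):

Code:
import random

BUFFER = 10     # ticks of fixed delay inside the VI (assumed value)
HUMANIZE = 10   # maximum random spread; must not exceed BUFFER

def delayed_offset(rng=random.Random()):
    """Every event is postponed by BUFFER plus a random amount within
    +/- HUMANIZE, so the result is always a non-negative delay."""
    return BUFFER + rng.randint(-HUMANIZE, HUMANIZE)

TRACK_SHIFT = -BUFFER  # nudge the humanized MIDI track this many ticks earlier in the host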

Please email us at supportbox@musiclab.com about all of that.

Regards,
Sergey
_________________
MusicLab Support.

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Sun May 06, 2012 9:52 am

Sergey, I'm currently trying to achieve the "double-tracked" rhythm guitar sound with the "Humanization" function in Digital Performer. Unfortunately, due to having several keyswitches on my RealStrat/RealLPC tracks, the only way I can do this is to go into the MIDI track itself, select a short passage, apply humanization, then move the keyswitches so that they hit where they're supposed to. Too much humanization and the two rhythm guitar tracks -- the one that's actually quantized and locked to the grid and the humanized one -- don't play together properly, and the humanized track actually sounds sloppy. Too little and your "doubled guitar" sound collapses, even if it's a RealLPC and a RealStrat playing through two different amps. As you can imagine, this is a very arduous task.

As you suggested in your post, I first tried this with just a little humanization, nudging the second track forward by about 5 ticks to create a tiny bit of delay. This separated the tracks a bit, but again, just delaying the second track creates a "delayed" sound that doesn't necessarily let the two tracks work together. Plus, if you had a real-life guitar player playing the same part twice, it wouldn't actually be a delay; it would be the variances in playing (note velocity, timing, length, etc.) that set the two tracks apart.

So far, the only way I've been able to get close to the "doubled guitar" sound is with these Humanization settings: Note Onset +/- 4%, Note Duration +/- 4%, On Velocity +/- 4%. That's for the average-length note. For longer strums (a bar or longer) or for fast picking (rock or metal "chugs"), I turn Onset, Duration and Velocity down to 3%, since Humanization seems to affect longer notes more radically, and since fast repetition seems to require more cohesion. Bear in mind, after this I still have to go in and manually nudge some of the chords, particularly the long ones, closer together if the Humanization spreads them too far apart or alters the velocity too radically. I then audition the humanized track against the quantized track, with the two instances of RealStrat/RealLPC panned hard left and right (where they'd be once I printed them), to make sure they're still working together properly and make more adjustments if I need to.
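
For anyone who wants to batch-apply roughly those settings outside Digital Performer, here's a small sketch. It is not how DP's Humanization works internally; the timing reference (a percentage of each note's own duration), the PPQ value and the long-note/fast-picking rule are all assumptions based on the description above.

Code:
import random

PPQ = 480            # ticks per quarter note (assumed)
BAR = 4 * PPQ

def humanize_dp_style(notes, rng=random.Random(7)):
    """Roughly the settings described above: +/-4% onset, duration and
    velocity for ordinary notes, dialed back to 3% for long strums
    (a bar or more) and for fast picking (16th-note chugs or shorter)."""
    out = []
    for n in notes:
        long_or_fast = n["duration"] >= BAR or n["duration"] <= PPQ // 4
        pct = 0.03 if long_or_fast else 0.04
        t_spread = max(1, int(n["duration"] * pct))
        v_spread = max(1, int(127 * pct))
        out.append({
            "start": max(0, n["start"] + rng.randint(-t_spread, t_spread)),
            "duration": max(1, n["duration"] + rng.randint(-t_spread, t_spread)),
            "pitch": n["pitch"],
            "velocity": min(127, max(1, n["velocity"] + rng.randint(-v_spread, v_spread))),
        })
    return out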

I'm not sure if this is the best way to go about it, or if I'm even doing it right. Usually I can tell by panning both rhythm guitar instances to hard center: if I hear phasing, I know the humanized track needs a little more variation. So far, the method described above seems to be the best way to achieve "just enough" separation between the two tracks without losing cohesion.

I'm finishing up a track using this method (with an instance of RealLPC for the left-side rhythm guitar and an instance of RealStrat for the right, plus a RealLPC lead guitar), and as soon as it's done, I'll send it to you!

Sergey_MusicLab
Moderator

Joined: 17 Jun 2003
Posts: 2781
PostPosted: Sun May 06, 2012 10:09 am

Armageddon,

Great! Thanks for the details. We'll be glad to hear the result, and will then think about a programmable 'humanization' option.

Regards,
Sergey
_________________
MusicLab Support.

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Sun May 06, 2012 10:47 am

Sergey, it's been great to actually converse with you guys -- even if it's only via a forum -- about your products. It shows in the actual instruments, too!

I think the biggest hurdle with humanization for the Real instruments will be getting them to recognize chords and keyswitches and deal with them appropriately. Some of the elements that instruments like Electri6ity use for their humanization you already have in place, like "Alter Samples" and pick position, things a user can already switch to a different setting to alter the sound a bit without humanization. I'm not sure exactly HOW the competition does this, but some of it is just delay (though Electri6ity actually employs a "humanization engine"), which doesn't impart quite the same sound. Also, the humanization has to be enough to separate the track from the original guitar track, but not so much that it sounds sloppy and unusable.
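
The chord half of that is exactly where a naive per-note randomizer falls down: it smears the notes of a strummed chord apart. One way a humanize engine could handle it (purely a sketch, assuming chords can be detected by grouping notes whose starts fall within a few ticks of each other) is to give every note of a detected chord the same timing offset:

Code:
import random

def humanize_chord_aware(notes, window=10, spread=8, rng=random.Random(3)):
    """Group notes that start within 'window' ticks into one chord and
    move the whole chord by a single random offset, so the voicing stays
    tight while the chord as a unit drifts off the grid."""
    notes = sorted(notes, key=lambda n: n["start"])
    out, i = [], 0
    while i < len(notes):
        chord = [notes[i]]
        while i + 1 < len(notes) and notes[i + 1]["start"] - chord[0]["start"] <= window:
            i += 1
            chord.append(notes[i])
        offset = rng.randint(-spread, spread)
        for n in chord:
            out.append({**n, "start": max(0, n["start"] + offset)})
        i += 1
    return out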

Out of desperation, I've even tried stuff like Waves' Doubler plug. Again, if you're doing two rhythm guitar tracks and at one point, you want to have them diverge and play different things, something like a doubler or delay (or even a chorus) plug doesn't really work.

What's REALLY frustrating is that there isn't even a general resource around for this. Most MIDI users either use a stereo guitar patch or an instrument like Electri6ity, or just aren't aware of how to create a "double-tracked" guitar sound -- or, if they're exceptional keyboard players (and I'm not), they do what a real guitar player does and just lay down two separate MIDI tracks.

You guys actually did a great job with the double-tracked acoustic sound in RealGuitar 2, but again, with an acoustic you can just print that stereo track from your VI and not have to add anything else, like different amps.

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Mon May 14, 2012 4:43 pm

I just submitted three versions of the aforementioned test mix to you: one where the right- and left-side rhythm guitars are playing the exact same MIDI track, one where I nudged the right-side rhythm guitar ahead by about five ticks to introduce a delay between the two rhythm guitars, and one where I actually humanized the right-side rhythm guitar, so that you can hear the difference between the three methods. I also sent along my EZDrummer User's Manual, which illustrates the "humanization function" I proposed above. Hope this helps!

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Mon May 21, 2012 9:25 am

Something I've forgotten to mention in my other posts, which might have a bearing on both my own problem and my suggestions for a humanization engine: I've been doing all my RealStrat/RealLPC rhythm guitars in Solo mode rather than in Chord mode. While I love the Chords feature on these instruments, my rhythm guitars also have extensive single-note articulations and single-note bridge mutes (as if you were playing a real rhythm guitar), so I elected to do these parts in Solo mode and form my two-note power chords manually, rather than via the Chord engine, as I'm sure many other users do. Unfortunately, the Real instruments aren't set up to play in two different modes simultaneously, nor do they offer a way to automatically switch between the two modes, so the only options are either putting chords together manually in Solo mode or splitting one rhythm guitar MIDI track into two tracks, assigning the single-note data to one RealStrat/LPC set to Solo mode and the chord data to another instance set to Chord mode.

I bring this up because the chords tend to be fifty to seventy percent of my problem: once humanized, the Solo-mode (manually written) chords tend to fall apart the most, and even when I go back and nudge the data by hand, the chords are where I lose the most cohesion between the quantized tracks and the humanized ones. I have not yet tried simply splitting the MIDI data into two tracks and processing them through two instances of RealStrat/LPC set to different modes.
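
If it helps anyone thinking about trying that split, the grouping can be automated with the same chord-detection idea; this is only a sketch, and the ten-tick chord window is an arbitrary assumption.

Code:
def split_solo_and_chords(notes, window=10):
    """Route notes that sound alone to a Solo-mode instance and notes
    that start together (within 'window' ticks) to a Chord-mode instance.
    Returns (solo_notes, chord_notes)."""
    notes = sorted(notes, key=lambda n: n["start"])
    solo, chords, i = [], [], 0
    while i < len(notes):
        group = [notes[i]]
        while i + 1 < len(notes) and notes[i + 1]["start"] - group[0]["start"] <= window:
            i += 1
            group.append(notes[i])
        (chords if len(group) > 1 else solo).extend(group)
        i += 1
    return solo, chords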

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Mon Jun 11, 2012 11:56 am

Sergey_MusicLab wrote:
Armageddon,

Have you tried setting the pick position in the two RealStrats (or RealLPCs) to opposite values (e.g. '-5' and '+5')? That will eliminate the 'phasing' problem.


I actually just tried this (+7/-7) and the results are startling! You actually seem to get stereo separation between the rhythm guitars, not to mention a noticeable difference in tone. My tests so far use two instances of RealLPC with identical settings (except for the aforementioned pick positions!) playing the same MIDI rhythm guitar track, and it almost sounds like the doubling employed by other guitar VIs/sample sets. I'm gonna attempt a mix with this setup and see what happens ...

Edit #1: +7/-7 pick position seems to be too much; I'll try your suggested +5/-5 next ...

curtisgraham


Joined: 02 Feb 2009
Posts: 95
PostPosted: Mon Jun 11, 2012 7:06 pm

I have employed this strategy as well, and it sounds good until you use it on a long song or across multiple songs ... all the stereo separation sounds great, but it also starts to have an effect similar to the machine-gun sound of old drum synths. Not that it sounds that similar, but it really lacks the life of a real player, because the change is all within the same range. It's usable in small doses, but I think the proposed humanization would be the subtle refinement that makes the tracks come a little more to life. Great suggestion, by the way ... it would improve my workflow, as I currently do all this manually, transforming each MIDI track. I love the idea.

Curtis

curtisgraham


Joined: 02 Feb 2009
Posts: 95
PostPosted: Mon Jun 11, 2012 7:10 pm

Another technique I use is to take the MIDI information (say, the velocity) and transform it into fader data (which is what pick position responds to), so the pick position moves according to the velocity of the track. Applying this to even one track makes it move more like a real player, and it starts to come to life. All of this is done within the Environment in Logic Pro. If you're a Logic user, I can walk you through the process if you like. It doesn't always work, but I've gotten some interesting results with techniques like this.
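
Outside the Logic Environment, the same transform can be approximated by generating a controller event alongside each note, scaled from its velocity. The CC number and the output range below are placeholders; how you actually map pick position to a controller depends on your own RealStrat/RealLPC setup.

Code:
PICK_POSITION_CC = 20  # hypothetical CC number mapped to pick position in the VI

def velocity_to_pick_cc(notes, cc=PICK_POSITION_CC, lo=40, hi=90):
    """Emit one CC event per note, mapping velocity (1-127) into a
    narrower controller range so the pick position wanders with playing
    intensity instead of jumping across its whole travel."""
    events = []
    for n in notes:
        value = lo + (hi - lo) * (n["velocity"] - 1) // 126
        events.append({"time": n["start"], "cc": cc, "value": value})
    return events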

Curtis

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Sun Jun 17, 2012 12:34 pm

I've tried changing pick positions at both +7/-7 (all the way out) and at +5/-5 (Sergey's suggested values). Something I noticed in both cases is that it made the left-side guitar actually start ahead of the beat by a wide margin. I'm not sure exactly how much, but even at +5/-5 the guitars sound "mushy" (too much stereo separation, with nothing actually landing on the beat itself). I'm currently trying -1 for the left-side guitar (which you would ideally want to land as close to the beat as possible) and messing around with the value for the right-side guitar, to see whether it's possible to get decent separation using pick positions alone. I still think you have to alter the notes themselves, via velocity, duration and start times, to really achieve the effect of double-tracked guitars. The trick is figuring out how much humanization is actually "just enough".

Something else I've been doing in my pick-position experiments is turning up "Alter Samples" on the right-side guitar (it defaults to "2"; I've tried settings between 3 and 5), which does reduce that "machine gun" effect a bit (Alter Samples only affects repeated notes).

Armageddon


Joined: 29 Dec 2009
Posts: 62
PostPosted: Fri Aug 03, 2012 2:09 pm

Based on the screenshots and info for the new RealLPC, I assume the Humanize feature (or "Sound Humanize Engine", as it looks like you're calling it) is actually going to be part of RealLPC/RealStrat/RealGuitar 3? Also, the new GUIs look great!
