Novice Mainstage User

Hi to all you Mainstage experts,

I'm trying to set up a concert with electric guitar and vocals live over a set of pre-recorded songs, and am having a conceptual problem.
Is it best to have one strip of amp and pedalboard settings, one strip of vocal settings and one Suitcase Playback song file, OR one amp, one vox and 10 Suitcase strips accessing each Playback song?
I'm finding it quite challenging, even with the aid of a MacPro tutorial. I had a look at the file uploaded recently (which seems to have separate settings for each song file) but I'm still quite confused.
My main problem with option two (which I have set up and working) is that it requires a certain amount of fiddling on the laptop keyboard to select a Playback song when, ideally, I should be talking to the audience.

Best to you all from London UK
 
Is it best to have one strip of amp and pedalboard settings, one strip of vocal settings and one Suitcase Playback song file, OR one amp, one vox and 10 Suitcase strips accessing each Playback song?
Huh? What is a 'Suitcase strip'? This sounds like the instrument channel strip of Mainstage, which has the Suitcase instrument loaded by default. If you always see 'Suitcase', you should give your channel strips meaningful names.

However, if you mean that you have 10 different playbacks, i.e. 10 songs, then you want 10 patches. Why? You have to start the playbacks somehow, and a comfortable way is to select the appropriate patch and let the playback start automatically. If this is not what you want, you can start it on demand. In the simplest setup, you have one playback per patch.

For your guitar and vocals it depends on whether they always use the same settings or the settings change from song to song.

Let us look at the structure of Mainstage:
As hierarchical levels we have Concert, Sets and Patches. The highest level is the Concert. What you add here shows up in every Set and every Patch. The next level is the Set. What you add here shows up in each Patch that resides in this Set. The Patch itself is the lowest level. Think of it like a folder hierarchy: Concert and Set are containers, Patches can be played:
Concert
- Set A
-- Patch 1
-- Patch 2
-- Patch 3
- Set B
-- Patch 4
-- Patch 5
- Set C
-- Patch 6
-- Patch 7
-- Patch 8
and so on.
It is a matter of organization. Same settings for guitar and vox throughout the whole gig? Put those on concert level and they are available in all sets and all patches. If there is nothing other than playback, each patch (= song) would only have one individual channel strip with a Playback plugin.

Always the same guitar and vox settings but you don't want them on concert level? Create a set with your two channelstrips and put all patches (songs) into this set. Then your guitar and vox settings appear in all patches within this set.

Different guitar and vox settings for each song? Create the channel strips in each patch (= song) individually, so you can adjust them individually.

Different guitar and vox settings within a song, for example intro, verse, chorus, solo, etc.? Create a set per song and put all the parts as patches into this set. Then a set represents a song and the patches are sounds for the parts. In this case the playback runs on set level and you can change the sound for the individual song parts while the playback runs.

Always the same vocal settings but different guitar for each song? Put the vocals on concert level and individual guitar strips in each patch.
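If it helps, the visibility rules above can be sketched as a tiny data model. This is purely illustrative: the names `Concert`, `Set`, `Patch` and `strips_for` are my own, not anything Mainstage exposes.

```python
# Illustrative sketch of Mainstage's three-level hierarchy:
# a selected patch "sees" its own channel strips plus those of
# its set and of the concert.

from dataclasses import dataclass, field


@dataclass
class Patch:
    name: str
    strips: list = field(default_factory=list)   # patch-level channel strips


@dataclass
class Set:
    name: str
    strips: list = field(default_factory=list)   # set-level channel strips
    patches: list = field(default_factory=list)


@dataclass
class Concert:
    strips: list = field(default_factory=list)   # concert-level channel strips
    sets: list = field(default_factory=list)


def strips_for(concert, set_, patch):
    """Everything available in a selected patch: concert + its set + itself."""
    return concert.strips + set_.strips + patch.strips


# "Same guitar and vox throughout the gig" => put them on concert level;
# each song patch then only needs its own Playback strip.
concert = Concert(strips=["Guitar Amp", "Vocal FX"])
song1 = Patch("Song 1", strips=["Playback: Song 1"])
set_a = Set("Set A", patches=[song1])
concert.sets.append(set_a)

print(strips_for(concert, set_a, song1))
# -> ['Guitar Amp', 'Vocal FX', 'Playback: Song 1']
```

Moving a strip from concert level down to a set or patch simply narrows where it appears; that is the whole organizational decision described above.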

I think you get the system.

There are two basic ways for sharing resources like reverb, delay, compressor etc. One is the bus, that can sit on either of the three levels. The second is the Alias. You can copy a channelstrip and paste it as alias wherever you need it.

I hope this was clear; if not, don't hesitate to ask for a better explanation.

Btw, you should read, practice and understand chapter 8 of the Mainstage manual, "Playing Back Audio in MainStage". Actually this single chapter tells you almost everything I told you above about the Mainstage hierarchical system.

I could just show you the well-known sign ...
:rtfm:
... but this is not our style here. So you haven't seen it, ok? :)


... it requires a certain amount of fiddling on the laptop keyboard for selecting a Playback song
From the English Mainstage Manual, page 128, "Using the Playback Plugin":
You start playback by sending a Play command to the Playback plug-in using a screen control, such as a button, mapped to the Play/Stop parameter of the plug-in. To stop playback, you send a Stop command using the same parameter. Alternatively, you can set the plug-in to start when you select the patch or set, or when the Play action is triggered.
This is usually done with a controller. You sound as if you were not familiar with Mainstage screen controls and hardware controllers. Don't you have a controller? Such a device is essential for Mainstage. As you said, you want to connect with the audience and not show them a busy operator at work. Changing a patch or starting a playback can be almost invisible to people, a tiny movement of your foot or hand if your controller is set up in an economical way.
 
Many thanks Peter for your reply.

As with most of Apple's user manuals, what is omitted is more important than what is included (see, for example, the helpful files provided by Edgar Rothermich about Logic).

It seems that I have neglected to set up a "Controller" - all else is familiar. Now I need a controller which will start and stop Playback without the Playback box being onscreen. As you suggest, a "button" that starts and stops playback should have been pre-mapped to the start/stop control of the Playback. This is where I'm failing to understand precisely how to do this. If you have the time or inclination, I would be very grateful for the English-idiot-type explanation (go to the xxxx screen, do xxxxx etc).

Best regards
 
BTW, to which manual do you refer when you, so drolly, insert the RTFM icon? Surely not the Mainstage Users manual in which Chapter 8 is entitled "Performing Live with Mainstage".
Don't get me wrong, I have been using Logic since you were in short trousers, and I have realized that one's way of working with the program differs from everyone else's. This is why this list, and the email group before this web site was born, is SO VALUABLE: one may have a crisis of comprehension about certain aspects of certain manuals which may be readily explained by other users. No-one knows all there is to know about the programs, but some can help others to achieve enlightenment.
I apologize for my "Novice" status and hope that readers will understand my difficulties.
Best regards
 
As you suggest, a "button" that starts and stops playback should have been pre-mapped to the start/stop control of the Playback. This is where I'm failing to understand precisely how to do this. If you have the time or inclination, I would be very grateful for the English-idiot-type explanation (go to the xxxx screen, do xxxxx etc).
Allow me to cover some basics, ignoring the risk that you might tell me again that they were unnecessary. But nothing works in Mainstage if you miss a couple of basic rules.

First, don't think from the channel strip towards the mapping as you know it from Logic. Mainstage works the other way around. Of course your channel strips need to be created, but for the mapping you have to work from Layout mode to Edit mode, towards the channel strips. Otherwise you see nothing and have nothing to map.

Second, don't see the graphics area as a fancy optional playground. And, important for a mature Logic user, don't expect a neater version of the Logic Environment. It isn't (unfortunately). But this area is the center of Mainstage. Without you doing something here, there are no mappings. Mainstage does not work on stage as expected if there is nothing in this area. Well, it can, but those are rather unusual cases.

Now we are ready to go:

Every parameter that gets controlled needs a screen object. There are only a few exceptions, which we don't discuss here. To control playback, you have to create at least one screen object in Layout mode. In this case a simple button is appropriate if you don't use a MIDI controller. With a controller, you may choose a footswitch symbol.

After you have created your button by dragging it from the object browser to the layout area, you can use the learn function to assign a MIDI controller, set the input manually (which is rather confusing), or leave that step out if you don't use a MIDI controller yet. And you have a couple of options in the left column, right under the MIDI stuff, to control the appearance of the button.

The next step is to switch to Edit mode and select the button you just created. Then go to the patch list and select the level for which you want to do the mapping. Wait! It must be the level your channel strip belongs to! It can be concert level, a set or a patch. This is the same patch hierarchy I mentioned in the other post, with the same rules.

After you have made your choice (the object is still selected), go to the list below the graphics window. You may have to click some tabs until you see the list of channel strips. If you see just a few fields instead, you selected a patch level that has no channel strips associated with it. We assume that you see the list. Select the channel strip, select the plugin and finally the parameter you want to control. Depending on the type of parameter you can get additional options like setting high/low values or editing the response curve.

Always this direction: Layout mode, create screen control. Edit mode, select screen control, choose patch level, choose channelstrip and parameter, adjust if possible and necessary.
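The end result of that workflow can be pictured as a record pointing from one screen control to one parameter at one hierarchy level. Again, this is only my own illustration (the class `Mapping` and its field names are invented, not a Mainstage API):

```python
# Illustrative sketch of what a finished Mainstage mapping amounts to:
# a screen control (made in Layout mode) points at one parameter of one
# plugin, on one channel strip, at one level of the hierarchy.

from dataclasses import dataclass


@dataclass
class Mapping:
    screen_control: str   # created in Layout mode
    level: str            # "concert", "set" or "patch" - where the strip lives
    channel_strip: str
    plugin: str
    parameter: str


# A button that starts/stops the Playback plugin of one song's patch:
play_button = Mapping(
    screen_control="Button 1",
    level="patch",                   # must be the level the strip belongs to
    channel_strip="Playback Song 1",
    plugin="Playback",
    parameter="Play/Stop",
)

print(play_button.parameter)  # -> Play/Stop
```

The point of the sketch: every field except `screen_control` is chosen in Edit mode, which is why the workflow always runs from Layout to Edit and never the other way.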

That's it. There are more techniques like multimapping but you don't need them right now.

I hope this description was correct and understandable.


BTW, to which manual do you refer when you, so drolly, insert the RTFM icon? Surely not the Mainstage Users manual in which Chapter 8 is entitled "Performing Live with Mainstage".
I refer to my Mainstage 2 User Manual, English PDF, chapter 8, which starts at page 123. If yours is different, then it is newer or older. Your number 8 is my number 9, and since differences between manuals are known, I added the title of the chapter to avoid confusion.


Don't get me wrong, I have been using Logic since you were in short trousers
Given the fact that I wear short trousers today, you must have used Logic for about six hours. You know pretty much for such a short period!

Ah - just to clarify the situation - if you have used Logic that long, we could be about the same age. I turned 58 a couple of weeks ago; decide for yourself if you are my father or my son. Not a brother please, I have a sister and this is enough. If you don't want to be related, this is ok for me.


I apologize for my "Novice" status and hope that readers will understand my difficulties.
There are a lot of good people here masquerading as novices. Not your fault. We would have been happy to transfer the login data from the mailing list to the web platform, but it was technically not possible due to the encrypted passwords. Everybody had to register as a new user, and there are still well-known people from the mailing list registering here. The forum status means nothing.

So, finally, welcome again to the LUG!
 
Dear Peter,
I thank you for your extreme helpfulness in trying to understand my problem - it is not your fault that you were unable to, more a problem of poor references on my part. I have opened the Playback plug-in (in Edit mode) and noticed that there is a box which required ticking (something like "enable song start from key command", instantiated by the keyboard Space Bar). THIS WAS THE VITAL PIECE OF INFORMATION THAT I WAS MISSING!
Having checked all my Playback instances, and muted where necessary, I am able to sing and play to my heart's content with my correct backing track playing.
Now correct me if I've overlooked something in the manuals, but nowhere did it say that you needed to check the Playback editor box to enable playback to start by key command. All this waggle about MIDI controllers etc has been a red herring.

I apologize to you for assuming that you were naught but a callow youth - I plead exasperation with my inability to explore ALL possibilities when troubleshooting. You have taken an inordinate amount of time in trying to explain what I still find puzzling in the layout of the program.

Thank you once again for your time and effort.

Best regards
 
Now correct me if I've overlooked something in the manuals, but nowhere did it say that you needed to check the Playback editor box to enable playback to start by key command.
I don't know a 'Playback editor box' and do not use playback myself, but if you want to start it by key command and you see an option to turn this on, it is certainly a good idea to do it. I did not search for it in the manual because you had already found the function. If I ever need playback, I will map the play function to my foot controller and step on a switch.

All this waggle about MIDI controllers etc has been a red herring.
This is your current opinion. If you ask about Mainstage, people will instantly talk about controllers because this is just the way it works. The only exception may be keyboarders who use it for their sounds and can easily switch the patches manually as long as they play within a song-after-song context. Mainstage is built for controllers and depends on them. If you don't use controllers, well, then you don't actually use Mainstage but rather a kind of weird Logic without a sequencer and a crazy mapping.

The program is specialized to actively control a performance. And only this; there is not much room for other usage that makes sense. I think you will do more with Mainstage sooner or later. For your first requirement, playing audio files, you don't need Mainstage, not even a computer. But this can quickly change once you get ideas about what else you could do.

If singing along to a playback with a more or less fixed setting for the guitar is exactly what you need, you may consider using Logic for that. It is less CPU hungry. With big colored markers you can see your current position from a great distance, better than with Mainstage. You can always switch to Mainstage if your requirements increase and Logic starts to get inefficient because of a more complicated setup and, probably, too much Environment work.

Below is an example for a typical Mainstage application where you can see why people use controllers.
Setup
20100703-m9748k36b5mtmtd9w5w6r12uxr.jpg



Inspiring Dancers
20100703-r8j9d6wt3q41gr9b673gb2j7a6.jpg

Standing near the PA I was not sure if everything worked as intended, but afterwards the dancers told me that the sound was excellent for them and they really loved it. Mainstage with good hardware and software can bring you far beyond the average stage sound. In this case I thought that dancers want fullness and stereo, so I designed the sound like a mix but with powerful left/right movements, and this paid off, got them moving. Such ideas do not always work, but with Mainstage you have the chance to prepare the sound in your familiar studio environment. On stage you deliver your sum; there is not much a guy at the FOH can mess up. Sometimes they look disappointed ;-)


Playing technique or just lazy?
20100703-cp1q1h17i1eafeh9wr191qwr1a.jpg



Everybody needs a controller.
Even those girls, they look
pretty much out of control:
20100703-xtp5xa6fkbty41t61n6rf114gd.jpg


Here is the Mainstage application for that. Not much on screen.
Doesn't take long if you already know what you want:
20100703-pnan5jtj6ijjam8jmpb2eser1.jpg

Here is a short explanation:

Vertical level meters (just grey in the screenshot) tell me the signal strength of the current instrument. The big level meter shows the main output. Always use a display for that; unlike most other performers you send a sum to the FOH and do not want to fight the desk. Right of it are MIDI indicators, just in case there is a problem. The patch list tells me which instrument is selected: either an e-bass, an acoustic guitar or a wind controller. There were only three patches, all of them with sounds that fit into an overall theme, a kind of "frame composition" that was used for improvisation.

The thing at the bottom represents the foot controller. An instrument (actually a patch) gets selected by the upper three switches. Tap tempo is always available. All other controls are for hardware and software synths and for audio effects and change their meaning depending on the patch.

Actually there were two additional programs involved (Max and Bidule), but what you see is a typical Mainstage concert, glued together in a couple of hours for a particular event and probably never used again. With other software it can take days if not weeks to get similar functions, and it is hard to reach the same level of stability.

Well, this was a long post again and I talked about everything you did not ask for. But it may help newbies (in the very best meaning) to understand what we are talking about, when we discuss Mainstage here in the forum. It is not about doing things right or wrong or who knows better. This is a relatively new application. Whenever we discuss it, we first need to find out what EXACTLY a musician wants to do and then think about ways to do it.

This is different from Logic, where sometimes a vague question is enough for a couple of good answers because many applications and techniques for a DAW are widely known. Mainstage is no DAW in the common sense. If you want, it is a big instrument that can play everything.
 
Originally Posted by Graeme Douglas:
All this waggle about MIDI controllers etc has been a red herring.

Quoted from Peter Ostry:
This is your current opinion. If you ask about Mainstage, people will instantly talk about controllers because this is just the way it works. The only exception may be keyboarders who use it for their sounds and can easily switch the patches manually as long as they play within a song-after-song context. Mainstage is built for controllers and depends on them. If you don't use controllers, well, then you don't actually use Mainstage but rather a kind of weird Logic without a sequencer and a crazy mapping.

Peter,

I have a Waves GTR Ground. The effort involved in trying to set that little piece of hardware up in Mainstage (it took about four weeks with no workable results) was what prompted me into trying to reduce racks full of equipment, like you have in your photographs, to a laptop, a MIO ULN-2 and an electric guitar, purely for the sake of being able to get onto an aeroplane without giving myself a hernia and incurring excess baggage charges. This set of gigs will be as simple as possible. In future, I will attempt to map this Waves hardware to Mainstage so it is useable.

I realize that it will be useful to have my set list playable with 101 different amplifiers, 201 different effects pedals, and FX synced to song tempos but this is not what I originally asked, and is something that I will investigate at a future date when time is not quite so pressing.

Using Logic would be possible, but loading songs is still time-intensive - there are only so many jokes that one can tell an audience whilst 40 Gb of software instrument in the various instances of Addictive Drums and Kontakt plug ins are loading. (Please don't take all this literally; I am exaggerating for the sake of dramatic effect!)

I hope that this exchange has been useful for readers - it's always gratifying to learn new ways of working, or to be able to modify previous ways.

Best regards
 
Dang Peter, I'm inspired!
Have you got any audio of one of these performances online?
Not yet. Recording only my Mainstage part would be senseless. There was no possibility to use the FOH recording path because there were too many performers who did not play through the desk: flutes, singers, cello, guitars etc. And nobody had the time (or interest) to set up two mics and record the whole thing. But my system is in development and I think I will soon have some recordings with a smaller group.
 
... trying to reduce racks full of equipment, like you have in your photographs, to a laptop, a MIO ULN-2 and an electric guitar, purely for the sake of being able to get onto an aeroplane without giving myself a hernia and incurring excess baggage charges.
Forget about my hardware. I carry a small studio with me, and this makes sense in my case and no sense in yours. But for Mainstage it makes little difference whether you control a bunch of hardware or a bunch of software. You must work via layout -> screen controls -> mapping or Mainstage will simply not do what you want, because it is built for this workflow and nothing else. You cannot migrate your Logic knowledge to Mainstage; it is another program.


I realize that it will be useful to have my set list playable with 101 different amplifiers, 201 different effects pedals, and FX synced to song tempos but this is not what I originally asked
But ignoring the very different structure of Mainstage won't let you play with 2 different amplifiers, 2 different effect pedals and unsynced song tempo either. You won't play at all.

Get a car with the steering wheel on the other side and try to insist on the drivers position you are used to. You will certainly run into problems. I know what I am talking about because my car has the steering wheel and everything else on the other side. I decided to sit there because this little trick offers some advantages.


Using Logic would be possible, but loading songs is still time-intensive - there are only so many jokes that one can tell an audience whilst 40 Gb of software instrument ...
Mhm ... without knowing your set list ... I am not sure. Do you have so many different settings? Your initial question did not sound like that. We started with vocals and guitar as the variables. Playback on stage should not run like a mix; you would bounce your playback tracks and use single stereo files for playback, without any plugins. Such projects load in no time, and you can have many songs in a row with, for example, just two variable guitar and vocal tracks. Those wouldn't carry much data; maybe they are only two channel strips. Of course, I am just guessing and cannot tell details without knowing your actual requirements.
 
Peter and others,

Let me summarize what I have learned about Mainstage and its operation in concert.

What I originally asked readers was to set up MS to play a selection of pre-recorded backing tracks over which I would sing and play guitar in real time, with guitar and vocal FX generated by the various song tempos (tempi?). I have achieved this with your help and used it in a live situation.

Now I will follow Peter's suggestions and create separate patches with separate guitar and vocal settings for each distinct song. This, as Peter points out, is the main promise of MS as a live performance tool. It was a question of time that prevented me from setting this up originally.

I'm sorry that my use of the English language, and its profound ability to confuse non-native (and sometimes native) speakers, has created miscomprehension and what might be read as ill-feeling on this topic. I would be happy to have this end now.
 