
Thursday, May 5, 2011

Why Do We Soundcheck? Part Two

Making it easier on junior engineers makes your band sound better.
Earlier today I went over the basic preparations required to make a sound check run smoothly. Now, we’ll get down to the nitty gritty of how you and the house technical staff can best work together to bring your audience a truly memorable and killer show.
Follow the procedure.
The venue engineers will guide you through the process of sound checking, but knowing what’s coming saves time, and demonstrating familiarity with the process provides reassurance to the technical team and the promoters.
The first step involves finding the peak level for each source, so that no matter what happens in the show, the channels won’t overload and will operate in their butter zone for the majority of dynamic levels. Thus, when the engies call for you to play, tell them, “OK, here come some peaks,” then really dig in and give them the loudest thing you could imaginably play during your set.
For those using pedals, be sure to stage your gain such that turning on or off any given effect doesn’t cause any significant volume changes. For those using patches or samples, be sure all your sounds produce a fairly consistent level. Depending on your medium, normalizing or gently compressing your sounds in advance can help this tremendously. Start with a clean and clear tone or sound, and play long, resonant notes.
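If you work with samples or backing tracks, a little scripting can handle that consistency before you ever leave the house. What follows is only a rough sketch of the idea, not something from the original post: it assumes Python with the third-party soundfile and numpy packages installed, a folder called samples full of WAV files, and an arbitrary target peak of -1 dBFS.

    # Hypothetical example: peak-normalize a folder of WAV samples so every
    # sound triggers at a consistent level. Paths and the -1 dBFS target are
    # illustrative choices, not requirements.
    import glob
    import numpy as np
    import soundfile as sf

    TARGET_DBFS = -1.0                       # leave a touch of headroom below full scale
    target_peak = 10 ** (TARGET_DBFS / 20)   # convert dBFS to a linear amplitude

    for path in glob.glob("samples/*.wav"):
        audio, rate = sf.read(path)          # float samples in the range [-1.0, 1.0]
        peak = float(np.max(np.abs(audio)))
        if peak == 0:
            continue                         # skip silent files
        sf.write(path.replace(".wav", "_norm.wav"),
                 audio * (target_peak / peak), rate)
        print(f"{path}: peak {20 * np.log10(peak):+.1f} dBFS -> {TARGET_DBFS:+.1f} dBFS")

Peak normalization only evens out the loudest moments; gentle compression changes the dynamics themselves, so treat a script like this as a starting point and let your ears make the final call.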
Once you’ve caught the engineers’ eyes, you can move on to step two, the tonal and textural stuff. With the input level in place, the engineers will start applying inserts and EQ to the channel, so this is the time to begin playing throughout the full range of the instrument, loud and soft, high and low, staccato and legato, clean and effected. [Vocalists who use effects, please see NOTE #1 at the bottom.]
When this process of level and tone setting has been iterated across every source, you’ll move on to the third step, building your monitor mixes. The monitor engie will ask each player in turn what (s)he wants in his/her wedge. When answering, try to follow the same order of channels you just worked through, indicating for each source how loud or “forward” it should be in the mix. If one or two sources are significantly more or less crucial to monitor than the rest, be sure to note that after providing the complete list of what you’ll need to hear. For example:
• A singing guitarist might say: “I’d like just a touch of kick and snare, a moderate amount of bass, a bit of my own guitar, a fair amount of the other guitar, and plenty of keys and my voice. The most important thing for me to hear is my own voice and the keyboard. If you have to cut something, pull the drums out.”
• A drummer who also plays a sample pad could reply: “I don’t need any of my kit, but please give me a bunch of bass, a middling amount of guitars and keys, a ton of my sample pad, and just a little vocals. The most important source for me is my sample pad, and the vocals can go away if they have to.”
Once the monitor engie has basic mixes up for all the wedges, the band should play and loop a part of a song that uses the most sources possible. [Why? See NOTE #2 at the bottom.] While continuing to loop the part, each member should catch the engie’s eye, one at a time, to fine tune his/her mix.
When it’s your turn, point at a source, then point up or down, then point at the wedge where you want the change. This tells the engineer, “I want that source turned up (or down) in this mix.” Keep looping the part until all the monitor mixes are to taste. A smile and a nod or the “OK/perfect” hand sign indicates to the engie that your mix is to your liking. Repeat as necessary until you’ve covered all your sources.
You may have to tolerate some feedback during your check while the engineers maximize their gain staging and chase down trouble frequencies. Keep playing through the feedback [Again, see NOTE #2.] so the engineers can find the ringing frequency amidst the dynamic and shifting signal. Realize that although standing over a ringing wedge is obnoxious and even occasionally painful, it certainly beats gambling on feedback returning mid-show, so taking a few extra minutes to properly and sufficiently ring out a wedge is always time well spent.
When you and your bandmates are happy with your monitor mixes, it’s time for step four. Stop the loop you’re playing, and tell the engineers that your mixes are good and that you’re ready to play a song and check the house. Pick a couple songs from your set and play them from beginning to end, without stops. [See NOTE #2.] If your wedges need further tuning, signal the monitor engineer while you play.
After a couple songs, ask the front of house engineer if (s)he has heard enough and has the mix dialed in. Once the FOH engineer is happy, ask to play one more tune so the band can hear the sound in the house. Take turns, one band member at a time, walking through the house. If you have any requests for the FOH engineer, now is the time to make them. When every band member has walked the house, and the band and engineers are all sufficiently satisfied, the check is complete.
If your gear must be moved before your set, follow this procedure immediately, well before you consider heading to the green room or the bar or to dinner:
• Mark the onstage position of your gear, using 2-3” strips of tape placed at right angles at the corners of each piece of gear, including mic stands and stage monitors. Using bright and uncommonly-colored tape helps these “spike” marks show up under low stage lighting and allows your setup to be easily distinguished from other acts’.
• Once the positions of all your gear have been spiked, carry or roll your gear from the stage to your cases at side of stage. Don’t bring your cases to your gear, as that, as before, both doubles the amount of carrying you have to do and screws up the backline staging occurring both onstage and next to it. Start with the delicate items (guitars, cymbals, electronics, etc.) and work towards the rugged stuff (stands, drum hardware, amps, etc.), all the while making sure that the gear will be accessible in the reverse order from side of stage come show time, so the last things to get uncased before your set are also the most delicate. This protects your gear as much as possible and allows for a quicker changeover later on.
• If you are sharing any of your own gear with other bands, be sure that it’s properly labeled and placed somewhere immediately visible to them when they arrive. If you aren’t sharing anything, carefully remove every last piece of gear you care about; an unlabeled power strip or 1/4″ cable or hi-hat clutch left on a stage quickly becomes the property of whomever picks it up.
• Do a quick “owner’s walk” of the stage to be sure you’ve collected every last thing you brought with you. Once this is done, enjoy your break! The engineers, not to mention the bartenders and waitstaff and barbacks and management and security and promoters and bathroom attendants, usually don’t get such a break, so saying thank you and shaking their hands on your way out, as a demonstration of your appreciation for the work they performed, is always a welcome and valued show of respect.
NOTE #1: Unless you are using very particular, obscure, and uncommon effects, you will be much better off by letting the engineers use their house effects. Reverbs and delays are ubiquitous and the engineer will likely put some on your voice even without you asking.
Unless you are running an advanced real-time autotuning rig, or absolutely have to punch in a flanger at only a few crucial moments in a couple songs, or have a calibrated and road-proven harmonizer that needs different settings recalled between each song, let the engineers handle your effects. Otherwise, you’re stuck with the inferior preamp (if there even is one) in your unit, and the monitor and FOH engineers’ hands are severely tied since they can’t separate your vocals from the effect in the channel, meaning the effect will carry into any monitors and submixes, will throw off any inserts and EQ on the channel, and will, sometimes drastically, lower the gain before feedback. Your other option is to pick up a quality mic splitter so that you can send two channels to the engineers, one clean and one effected. Thus, the clean channel hits a proper mic preamp, inserts, and EQ before any effects, meaning both your monitor and FOH mixes will be cleaner, more flexible, and deliver more gain before feedback.
NOTE #2: An engineer cannot fix or adjust something that isn’t happening or being played, so stopping a song to ask for an adjustment is usually completely counterproductive. Instead, keep the tune going and ask over a vocal mic, loudly if need be, for what you need. Alternatively, or if you need something during the show, just catch the engineer’s eye (as any engineer worth their pay will look across the stage periodically, just like a good driver scanning his/her mirrors), and indicate what you need with a couple hand signs. If there is a monitor desk at side of stage, you can also always walk over to talk to the engie, so long as you keep playing. Remember, if in the process of dialing in a mix, an engineer doesn’t want to hear something, they can always mute the related channel, but if they need to hear something when you’ve stopped playing, nothing can progress.



Why Do We Soundcheck? Part One

One of the single greatest indicators of a band’s degree of experience, expertise, and general savoir-faire is the manner in which it conducts sound checks. So for those of you looking to take a simple step towards professionalism, or for anyone who prefers eating, sleeping, showering, and hitting on that cute bartender to playing quarter notes to an empty room while feedback swirls around you like a class three tornado, I offer these pointers on how to perform a painless, efficient sound check.
First, the basics:
Send all relevant information ahead of you.
The minute your show is booked, you should send your rider and contract to the promoter and venue. Ask the venue, in writing, to forward your rider to the sound and lighting engineers. When in doubt, include more, not less, information: you can never be clear or specific enough. By seeing your rider well in advance, the venue crew can better prepare for your arrival and save crucial time going over basic requirements. Riders should include, at the very least:
• Stage Plot: a detailed picture of the position of all performers, instruments, monitors, and power drops. Drawing it on a bar napkin with a crayon does not count; use any word processor that can draw shapes and lines and save the image as a JPG.
• Input List: a complete account of every input source, including all mics, DIs, and wireless systems. Just about any engineer you will meet will appreciate you following the industry standard convention of channel/source order, so stick with that unless you have a really good reason not to. (A sample list follows these bullets.)
• Technical Requirements: a description of what, at minimum, constitutes an acceptable sound system and backline. If you need to clear 1200 decibels SPL, or your drummer must have a 62″ plasma subwoofer to accompany his monitor, or your keyboard must be a Chrysler Grand Caravan, or you’re sponsored by SlimJim and have to use one of their mics, or the system must respond up to microwave frequencies, this is where to say so.
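To make the input list concrete, here is a purely hypothetical example for a four-piece band (drums, bass, guitar, keys, two vocalists), laid out in the conventional drums-first order; swap in your own sources and counts:

    Ch  Source           Type
    1   Kick             mic
    2   Snare            mic
    3   Hi-Hat           mic
    4   Rack Tom         mic
    5   Floor Tom        mic
    6   Overhead L       mic
    7   Overhead R       mic
    8   Bass             DI
    9   Electric Guitar  mic
    10  Keys L           DI
    11  Keys R           DI
    12  Lead Vocal       mic
    13  Backing Vocal    mic

Listing channels this way lets the engineers patch the stage box and label the console before you even arrive.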
Extra Credit: Send sound files along with the rider. This lets the engineers hear who you are and what you’re about. It also provides a reference so they know if their mixes are hitting the mark. If you have decent audience recordings of any shows you think were mixed particularly well, those can be of great use to an engineer, in addition to your studio tracks. Similarly, sending photos or video of lighting you’ve enjoyed in the past helps the lighting designer plan and program his looks.
Bring complete instruments.
Cables, rugs, power strips, picks, sticks, mallets, bows, reeds, and tambourines are all part of your instrument: a guitarist without a pick isn’t a guitarist, a drummer without a rug isn’t a drummer, a keyboardist without a power cable isn’t a keyboardist, and so on. If you need it, and the venue hasn’t guaranteed you, in writing, that it will be there, then bring it. In fact, bring two. In general, unless otherwise specified by a contract, musicians should bring, at the very least:
• Drummers: sticks, cymbals, snare, felts, multitool, drum key, and hi-hat clutches to fit various diameter stems. If you’re bringing a kick drum, then you need to bring a rug. Having a spare kick pedal and extra cymbal stands around never hurts either.
• Guitarists and bassists: picks, power strip, 1/4″ cables, change of strings, multitool, strap, 9V batteries. If you’re savvy, you also travel with at least one spare 1/4″ cable, one spare power strip, one spare IEC cable, and a bunch of 9V DC transformers.
• Keyboardists: stand, power strip, 1/4″ cables. Really slick guys have extra 1/4″s and power strips, as well as redundant power cables for their keyboards.
• Computer Musicians and DJs: table or stand, power strip, 1/4″ cables. If your outputs are XLR, then don’t sweat the 1/4″s. If they’re TRS, bring TRS-to-XLR male adaptors. Again, spare 1/4″ cables, power strips, power supplies, and output adaptors are all good to bring along.
• Winds & Strings: reeds, bows, rosin, change of strings, and stands are all requisite. Spares of all are a bonus.
• If you use music stands, bring your own clip lights with extension cords and spare bulbs even if there are stands at the venue.
Extra Credit: EVERY player, regardless of instrument, should bring an extra 1/4″ cable, power strip, pack of picks, pair of 5A drumsticks, roll of gaff tape, Sharpie, pen, multitool, and zip ties in an “emergency kit.” All of this fits in a small bag and weighs just a couple of pounds, but it’s well worth it for the gloating you can do after you save your idiot bandmate’s ass when he forgets something crucial.
Extra Credit: Bring your own vocal mics and DIs. Professional quality gear please – no RatShack, no B******er, no PG series, no cheap/unreliable/flimsy stuff. This affords you the delight of not having to taste the mouth of whomever last used the venue’s mics, and it means you’ll always have spares on hand in case something goes down during the check or the show.
Load in intelligently.
Place your cases to side of stage so the full stage is available for gear and cable runs. By bringing your gear from its case at side of stage to its onstage position, you both keep from cluttering the stage right when it needs to be its cleanest for cable running and dressing, and you save yourself the trip of carrying an empty case offstage. The exceptions to this rule are amps and cabinets in top-over-style cases. Leave the top at side of stage and roll the amp into position on the bottom/caster plate. (Don’t forget to lock your casters once your amp is in place!)
Start by helping each other move the large and heavy objects like drums and amp cabinets. Once all the big stuff is staged, break off and finish the details of your rig by yourself. Any pedal setup involving more than two pedals should live attached to a pedalboard, with its patching and power configured for plug-and-play readiness.
Extra Credit: Use exclusively ATA-rated hardcases in standard sizes. Such cases’ dimensions, durability, and recessed hardware make packing, stacking, and moving much easier. ATA-rated cases also meet the most demanding specifications for the transport of delicate materials, so your gear travels as safely as possible.
Be on time!
It might sound obvious, but lateness screws up dozens, hundreds, or even thousands of people’s days. It pushes back all the other checks at least, and the door and show times at worst. If you’re lucky enough to be the headliner, you will likely check first; checks usually run in the opposite order from that of the show, so that when the first opener finishes their check, everything is set and ready to go come show time. When other acts are late or run long at check, it’s almost always the poor opening act that suffers the most, as it’s ultimately their check that will get shortened, or even cut entirely.
Extra Credit: Be 30 minutes early. If the venue is open, load your gear to side of stage, and begin prepping the gear, making sure to leave room for any other acts’ backline. If the venue is closed, line up your gear neatly by the stage door, big and heavy stuff first, small stuff last.
In the next installment, I’ll go over the nuts and bolts of how to make the most out of your check once you step on stage.



Tuesday, April 19, 2011

How to Choose a Wireless Microphone for Public Speaking

If you're conducting a presentation, lecture or any other public speaking engagement in a large conference room or convention hall, you'll need a wireless microphone to help project your voice and enable you to move freely. Here's what to look for when choosing a wireless microphone for your class or speech.

INSTRUCTIONS

    • 1
Sample a wireless handheld microphone. The built-in transmitter eliminates the need for a constricting cable. This relatively inexpensive system works well for public speaking in smaller rooms.
    • 2
Select a wireless mic with a body pack. These models, commonly used by entertainers, allow presenters to move freely on the stage. Attach the battery-operated pack to your waistband, clip the mic near your collar, and speak normally.
    • 3
Consider a lavaliere microphone. You clip it to your lapel or other clothing. If you conduct interviews during your speaking engagement, try this option. A lavaliere microphone helps prevent the "popping" sound common with many handheld microphones. These lightweight microphones should be placed about 6 inches from your chin for best sound quality, and come with a body pack transmitter.
    • 4
      Look at the variations on body pack transmitter microphones. You can choose from headset mics for speakers who move around a lot and wander into the audience, or clip-on mics that can be mounted on an instrument or piece of furniture for more stationary speakers.
    • 5
Study the differences between UHF and VHF microphone transmission. Microphones operating on the VHF band (similar to the lower-numbered channels on broadcast television) share that spectrum with many consumer devices, such as older cordless phones, so wireless mics tuned to this band tend to suffer interference and dropouts more easily. UHF systems (up around the higher-numbered broadcast television channels) offer far more usable frequencies, which is why most professional wireless microphones operate there.
    • 6
Locate a good frequency for your system. Higher-priced systems have an automatic select button that will do this for you. Otherwise, you'll need to select the clearest one by checking each available frequency through the transmission box.



Monday, April 4, 2011

8 Reasons We Love Music On The Journey


Music lets us feel as free as a bird.


Every time I listen to Pink Floyd’s Shine on You Crazy Diamond, I think back to the Makgadikgadi Pans in Botswana. These pans cover 6,177 square miles, and from the middle, you can see the curvature of the Earth.
Once, I drove my Land Rover over the pans and decided to stop in the middle and sleep under the stars. Pink Floyd played on the CD player and it made the whole experience surreal.
When I listen to Joan Baez I’m taken back to a surfing trip and a remote beach down the East Coast of Africa. I can smell the freshly caught crayfish crackling over the coals and hear the waves whooshing in the dark.
Music reminds us of our travels, just as much as it inspires us to travel. It urges us to unfold the maps, pack the backpack and hit the road.
Here are 8 reasons we love music on the journey:
1. Music is a universal language
Local music breaks down the barriers of language and ethnicity. Music is a universal language, which uplifts the spirit and helps to make friends.
People all over the world identify with music. The essence of ancient cultures lies in music, and if we are lucky, we can experience a bit of it.
I was one of the very few Asians to have been invited to a traditional Gule Wamkulu dance in Malawi. These dancers belong to a secret society; their true identities are known only to themselves, and no one in the village knows who is behind the masks.
Their energetic dancing accompanied by awe-inspiring drum rhythms will always be a part of my memories.
2. Music reminds you of the people
Music reminds you of the people you met along the journey, like a little romance I had with a girl from Holland, captured perfectly by Natalie Imbruglia while we were waiting for the bus. It was a bittersweet farewell.
She promised to phone me, lying naked on the floor. (It never happened…)
3. Music can make the world yours
Music shortens those long hot and smelly journeys in the back of a crowded bus. Music can drown out the noise of a big city. Listening to your favorite tracks can enhance the experience of seeing a natural or man-made wonder for the first time.
Make that “touristy” spot a piece of your own travel memories, without the camera-toting holidaymakers. For me it was Moby at Victoria Falls in Zambia.
4. Music can ignite your imagination
Music documents what happens around us.
Recently, I boarded a plane at the start of a new journey. As the plane hurtled down the runway on take-off, Lynyrd Skynyrd’s Free Bird was reminding me that I was free as a bird, and that I would never change.
Too many places to see. Music fueled the excitement. The promise of new adventure. It made the farewell to my family easier.
5. Music can enhance the present
Music makes the world a little more interesting. It helps your thoughts and imagination to be a little bit more creative.
I was sitting at the Stockholm train station, listening to Vaya Con Dios’s Don’t Cry for Louie. I saw this shady guy with a trench coat and dark glasses, and he had two tarty looking women with him.
Pimp and prostitutes. I saw Louie. I had the urge to walk over to one of the women and ask her to sing for me, in that lovely low and sexy voice. The fear of a slap in the face, and possible arrest for public disturbance stopped me.
And Louie would have been pissed off.
6. Music can scar you
Music is a form of traveling on its own. Traveling without moving.
Sometimes, music can have a negative impact as well. Malawi has cured me of Peter Tosh and Bob Marley forever.
I was staying at a backpackers in Nkhata Bay, and couldn’t help but notice some other “beach boys” who spoke with overdone Jamaican accents. The local Rastas.
They were rolling joint after joint of “electric spinach” and listening to some shockingly bad, very loud distortions of Buffalo Soldier and Redemption Song. Over and fucking over.
7. Music can be useful
I once used music as a weapon. I arrived at this nice little campsite in the Drakensberg in South Africa, on the Lesotho border.
I set up camp away from the rest of the crowd when a noisy family decided to come and disturb my peace. They settled right next to me, and the kids proceeded to kick up a lot of noise.
I put Tom Waits’ Bone Machine on at top volume and took a few sips of rum straight out of the bottle while I gave the mother the evil eye.
Needless to say, they packed up and left me in peace. I always reserve old Tom for warding off witchdoctors and noisy kids.
8. Music is travel
Mostly, I have fond memories of my travels when I am listening to music at home and working towards the next trip. It is like a form of traveling on its own. Traveling without moving.
How has music enhanced your journeys? Share your stories in the comments!



Thursday, March 31, 2011

Sound Designing Description

Mohd Fauzi Mohd Hanif
Sound Designer of Ilham Nurul Resources
MALAYSIA


What is Sound Design?
You may assume that it’s about fabricating neat sound effects. But that doesn’t describe very accurately what Ben Burtt and Walter Murch, who invented the term, did on "Star Wars" and "Apocalypse Now" respectively. On those films they found themselves working with Directors who were not just looking for powerful sound effects to attach to a structure that was already in place. By experimenting with sound, playing with sound (and not just sound effects, but music and dialog as well) all through production and post production what Francis Coppola, Walter Murch, George Lucas, and Ben Burtt found is that sound began to shape the picture sometimes as much as the picture shaped the sound. The result was very different from anything we had heard before. The films are legends, and their soundtracks changed forever the way we think about film sound.
What passes for "great sound" in films today is too often merely loud sound. High fidelity recordings of gunshots and explosions, and well fabricated alien creature vocalizations do not constitute great sound design. A well-orchestrated and recorded piece of musical score has minimal value if it hasn’t been integrated into the film as a whole. Giving the actors plenty of things to say in every scene isn’t necessarily doing them, their characters, or the movie a favor. Sound, musical and otherwise, has value when it is part of a continuum, when it changes over time, has dynamics, and resonates with other sound and with other sensory experiences.
What I propose is that the way for a filmmaker to take advantage of sound is not simply to make it possible to record good sound on the set, or simply to hire a talented sound designer/composer to fabricate sounds, but rather to design the film with sound in mind, to allow sound’s contributions to influence creative decisions in the other crafts. Films as different from "Star Wars" as "Citizen Kane," "Raging Bull," "Eraserhead," "The Elephant Man," "Never Cry Wolf" and "Once Upon A Time In The West" were thoroughly "sound designed," though no sound designer was credited on most of them.
Does every film want, or need, to be like Star Wars or Apocalypse Now? Absolutely not. But lots of films could benefit from those models. Sidney Lumet said recently in an interview that he had been amazed at what Francis Coppola and Walter Murch had been able to accomplish in the mix of "Apocalypse Now." Well, what was great about that mix began long before anybody got near a dubbing stage. In fact, it began with the script, and with Coppola’s inclination to give the characters in "Apocalypse" the opportunity to listen to the world around them.
Many directors who like to think they appreciate sound still have a pretty narrow idea of the potential for sound in storytelling. The generally accepted view is that it’s useful to have "good" sound in order to enhance the visuals and root the images in a kind of temporal reality. But that isn’t collaboration, it’s slavery. And the product it yields is bound to be less complex and interesting than it would be if sound could somehow be set free to be an active player in the process. Only when each craft influences every other craft does the movie begin to take on a life of its own.
A Thing Almost Alive
It is a common myth that the time for film makers to think seriously about sound is at the end of the film making process, when the structure of the movie is already in place. After all, how is the composer to know what kind of music to write unless he/she can examine at least a rough assembly of the final product? For some films this approach is adequate. Rarely, it works amazingly well. But doesn’t it seem odd that in this supposedly collaborative medium, music and sound effects rarely have the opportunity to exert any influence on the non-sound crafts? How is the Director supposed to know how to make the film without having a plan for using music?
A dramatic film which really works is, in some senses, almost alive, a complex web of elements which are interconnected, almost like living tissues, and which despite their complexity work together to present a more-or-less coherent set of behaviors. It doesn’t make any sense to set up a process in which the role of one craft, sound, is simply to react, to follow, to be pre-empted from giving feedback to the system it is a part of.
The Basic Terrain, As It Is Now
Many feature film directors tend to oscillate between two wildly different states of consciousness about sound in their movies. On one hand, they tend to ignore any serious consideration of sound (including music) throughout the planning, shooting, and early editing. Then they suddenly get a temporary dose of religion when they realize that there are holes in the story, weak scenes, and bad edits to disguise. Now they develop enormous and short-lived faith in the power and value of sound to make their movie watchable. Unfortunately it’s usually way too late, and after some vain attempts to stop a hemorrhage with a bandaid, the Director’s head drops, and sound cynicism rules again until late in the next project’s post production.
What follows is a list of some of the bleak realities faced by those of us who work in film sound, and some suggestions for improving the situation.
Pre-Production
If a script has lots of references in it to specific sounds, we might be tempted to jump to the conclusion that it is a sound-friendly script. But this isn’t necessarily the case. The degree to which sound is eventually able to participate in storytelling will be more determined by the use of time, space, and point of view in the story than by how often the script mentions actual sounds. Most of the great sound sequences in films are "pov" sequences. The photography, the blocking of actors, the production design, art direction, editing, and dialogue have been set up such that we, the audience, are experiencing the action more or less through the point of view of one, or more, of the characters in the sequence. Since what we see and hear is being filtered through their consciousness, what they hear can give us lots of information about who they are and what they are feeling. Figuring out how to use pov, as well as how to use acoustic space and the element of time, should begin with the writer. Some writers naturally think in these terms, most don’t. And it is almost never taught in film writing courses.
Serious consideration of the way sound will be used in the story is typically left up to the director. Unfortunately, most directors have only the vaguest notions of how to use sound because they haven’t been taught it either. In virtually all film schools sound is taught as if it were simply a tedious and mystifying series of technical operations, a necessary evil on the way to doing the fun stuff.
Production
On the set, virtually every aspect of the sound crew’s work is dominated by the needs of the camera crew. The locations for shooting have been chosen by the Director, DP, and Production Designer long before anyone concerned with sound has been hired. The sets are typically built with little or no concern for, or even awareness of, the implications for sound. The lights buzz, the generator truck is parked way too close. The floor or ground could easily be padded to dull the sound of footsteps when feet aren’t in the shot, but there isn’t enough time. The shots are usually composed, blocked, and lit with very little effort toward helping either the location sound crew or the post production crew take advantage of the range of dramatic potential inherent in the situation. In nearly all cases, visual criteria determine which shots will be printed and used. Any moment not containing something visually fascinating is quickly trimmed away.
There is rarely any discussion, for example, of what should be heard rather than seen. If several of our characters are talking in a bar, maybe one of them should be over in a dark corner. We hear his voice, but we don’t see him. He punctuates the few things he says with the sound of a bottle he rolls back and forth on the table in front of him. Finally he puts a note in the bottle and rolls it across the floor of the dark bar. It comes to a stop at the feet of the characters we see. This approach could be played for comedy, drama, or some of both as it might have been in "Once Upon A Time In The West." Either way, sound is making a contribution. The use of sound will strongly influence the way the scene is set up. Starving the eye will inevitably bring the ear, and therefore the imagination, more into play.
Post Production
Finally, in post, sound cautiously creeps out of the closet and attempts meekly to assert itself, usually in the form of a composer and a supervising sound editor. The composer is given four or five weeks to produce seventy to ninety minutes of great music. The supervising sound editor is given ten to fifteen weeks to smooth out the production dialog; spot, record, and edit ADR; and try to wedge a few specific sound effects into sequences that were never designed to use them, being careful to cover every possible option the Director might want because there "isn’t any time" for the Director to make choices before the mix. Meanwhile, the film is being continuously re-edited. The Editor and Director, desperately grasping for some way to improve what they have, are meticulously making adjustments, mostly consisting of a few frames, which result in the music, sound effects, and dialog editing departments having to spend a high percentage of the precious time they have left trying to fix all the holes caused by new picture changes.
The dismal environment surrounding the recording of ADR is in some ways symbolic of the secondary role of sound. Everyone acknowledges that production dialog is almost always superior in performance quality to ADR. Most directors and actors despise the process of doing ADR. Everyone goes into ADR sessions assuming that the product will be inferior to what was recorded on the set, except that it will be intelligible, whereas the set recording (in most cases where ADR is needed) was covered with noise and/or is distorted.
This lousy attitude about the possibility of getting anything wonderful out of an ADR session turns, of course, into a self-fulfilling prophecy. Essentially no effort is typically put into giving the ADR recording experience the level of excitement, energy, and exploration that characterized the film set when the cameras were rolling. The result is that ADR performances almost always lack the "life" of the original. They’re more-or-less in sync, and they’re intelligible. Why not record ADR on location, in real-world places which will inspire the actors and provide realistic acoustics? That would be taking ADR seriously. But like so many other sound-centered activities in movies, ADR is treated as basically a technical operation, to be gotten past as quickly and cheaply as possible.
Taking Sound Seriously
If your reaction to all this is "So, what do you expect, isn’t it a visual medium?" there may be nothing I can say to change your mind. My opinion is that film is definitely not a "visual medium." I think if you look closely at and listen to a dozen or so of the movies you consider to be great, you will realize how important a role sound plays in many if not most of them. It is even a little misleading to say "a role sound plays" because in fact when a scene is really clicking, the visual and aural elements are working together so well that it is nearly impossible to distinguish them. The suggestions I’m about to make obviously do not apply to all films. There will never be a "formula" for making great movies or great movie sound. Be that as it may...
Writing For Sound
Telling a film story, like telling any kind of story, is about creating connections between characters, places, objects, experiences, and ideas. You try to invent a world which is complex and many layered, like the real world. But unlike most of real life (which tends to be badly written and edited), in a good film a set of themes emerge which embody a clearly identifiable line or arc, which is the story.
It seems to me that one element of writing for movies stands above all others in terms of making the eventual movie as "cinematic" as possible: establishing point of view. The audience experiences the action through its identification with characters. The writing needs to lay the ground work for setting up pov before the actors, cameras, microphones, and editors come into play. Each of these can obviously enhance the element of pov, but the script should contain the blueprint.
Let’s say we are writing a story about a guy who, as a boy, loved visiting his father at the steel mill where he worked. The boy grows up and seems to be pretty happy with his life as a lawyer, far from the mill. But he has troubling, ambiguous nightmares that eventually lead him to go back to the town where he lived as a boy in an attempt to find the source of the bad dreams.
The description above doesn’t say anything specific about the possible use of sound in this story, but I have chosen basic story elements which hold vast potential for sound. First, it will be natural to tell the story more-or-less through the pov of our central character. But that’s not all. A steel mill gives us a huge palette for sound. Most importantly, it is a place which we can manipulate to produce a set of sounds which range from banal to exciting to frightening to weird to comforting to ugly to beautiful. The place can therefore become a character, and have its own voice, with a range of "emotions" and "moods." And the sounds of the mill can resonate with a wide variety of elements elsewhere in the story. None of this good stuff is likely to happen unless we write, shoot, and edit the story in a way that allows it to happen.
The element of dream in the story swings a door wide open to sound as a collaborator. In a dream sequence we as film makers have even more latitude than usual to modulate sound to serve our story, and to make connections between the sounds in the dream and the sounds in the world for which the dream is supplying clues. Likewise, the "time border" between the "little boy" period and the "grown-up" period offers us lots of opportunities to compare and contrast the two worlds, and his perception of them. Over a transition from one period to the other, one or more sounds can go through a metamorphosis. Maybe as our guy daydreams about his childhood, the rhythmic clank of a metal shear in the mill changes into the click clack of the railroad car taking him back to his home town. Any sound, in itself, only has so much intrinsic appeal or value. On the other hand, when a sound changes over time in response to elements in the larger story, its power and richness grow exponentially.
Opening The Door For Sound, Efficient Dialog
Sadly, it is common for a director to come to me with a sequence composed of unambiguous, unmysterious, and uninteresting shots of a location like a steel mill, and then to tell me that this place has to be made sinister and fascinating with sound effects. As icing on the cake, the sequence typically has wall-to-wall dialog which will make it next to impossible to hear any of the sounds I desperately throw at the canvas.
In recent years there has been a trend, which may be the insidious influence of bad television, toward non-stop dialog in films. The wise old maxim that it’s better to say it with action than words seems to have lost some ground. Quentin Tarantino has made some excellent films which depend heavily on dialog, but he’s incorporated scenes which use dialog sparsely as well.
There is a phenomenon in movie making that my friends and I sometimes call the "100% theory." Each department-head on a film, unless otherwise instructed, tends to assume that it is 100% his or her job to make the movie work. The result is often a logjam of uncoordinated visual and aural product, each craft competing for attention, and often adding up to little more than noise unless the director and editor do their jobs extremely well.
Dialogue is one of the areas where this inclination toward density is at its worst. On top of production dialog, the trend is to add as much ADR as can be wedged into a scene. Eventually, all the space not occupied by actual words is filled with grunts, groans, and breathing (supposedly in an effort to "keep the character alive"). Finally the track is saved (sometimes) from being a self-parody only by the fact that there is so much other sound happening simultaneously that at least some of the added dialog is masked. If your intention is to pack your film with wall-to-wall clever dialog, maybe you should consider doing a play.
Characters need to have the opportunity to listen.
When a character looks at an object, we the audience are looking at it, more-or-less through his eyes. The way he reacts to seeing the object (or doesn’t react) can give us vital information about who he is and how he fits into this situation. The same is true for hearing. If there are no moments in which our character is allowed to hear the world around him, then the audience is deprived of one important dimension of HIS life.
Picture and Sound as Collaborators
Sound effects can make a scene scary and interesting as hell, but they usually need a little help from the visual end of things. For example, we may want to have a strange-sounding machine running off-camera during a scene in order to add tension and atmosphere. If there is at least a brief, fairly close shot of some machine which could be making the sound, it will help me immensely to establish the sound. Over that shot we can feature the sound, placing it firmly in the minds of the audience. Then we never have to see it again, but every time the audience hears it, they will know what it is (even if it is played very low under dialogue), and they will make all the appropriate associations, including a sense of the geography of the place.
The contrast between a sound heard at a distance, and that same sound heard close-up can be a very powerful element. If our guy and an old friend are walking toward the mill, and they hear, from several blocks away, the sounds of the machines filling the neighborhood, there will be a powerful contrast when they arrive at the mill gate. As a former production sound mixer, if a director had ever told me that a scene was to be shot a few blocks away from the mill set in order to establish how powerfully the sounds of the mill hit the surrounding neighborhood, I probably would have gone straight into a coma after kissing his feet. Directors essentially never base their decisions about where to shoot a scene on the need for sound to make a story contribution. Why not?
Art Direction and Sound as Collaborators
Let’s say we’re writing a character for a movie we’re making. This guy is out of money, angry, desperate. We need, obviously, to design the place where he lives. Maybe it’s a run-down apartment in the middle of a big city. The way that place looks will tell us (the audience) enormous amounts about who the character is and how he is feeling. And if we take sound into account when we do the visual design then we have the potential for hearing through his ears this terrible place he inhabits. Maybe water and sewage pipes are visible on the ceiling and walls. If we establish one of those pipes in a close-up it will do wonders for the sound designer’s ability to create the sounds of stuff running through and vibrating all the pipes. Without seeing the pipes we can still put "pipe sounds" into the track, but it will be much more difficult to communicate to the audience what those sounds are. One close-up of a pipe, accompanied by grotesque sewage pipe sounds, is all we need to clearly tell the audience how sonically ugly this place is. After that, we only need to hear those sounds and the audience will make the connection to the pipes without even having to show them.
It’s wonderful when a movie gives you the sense that you really know the places in it. That each place is alive, has character and moods. A great actor will find ways to use the place in which he finds himself in order to reveal more about the person he plays. We need to hear the sounds that place makes in order to know it. We need to hear the actor’s voice reverberating there. And when he is quiet we need to hear the way that place will be without him.
Starving The Eye, The Usefulness Of Ambiguity
Viewers/listeners are pulled into a story mainly because they are led to believe that there are interesting questions to be answered, and that they, the audience, may possess certain insights useful in solving the puzzle. If this is true, then it follows that a crucial element of storytelling is knowing what not to make immediately clear, and then devising techniques that use the camera and microphone to seduce the audience with just enough information to tease them into getting involved. It is as if our job is to hang interesting little question marks in the air surrounding each scene, or to place pieces of cake on the ground that seem to lead somewhere, though not in a straight line.
Sound may be the most powerful tool in the filmmaker’s arsenal in terms of its ability to seduce. That’s because "sound," as the great sound editor Alan Splet once said, "is a heart thing." We, the audience, interpret sound with our emotions, not our intellect.
Let’s assume we as film makers want to take sound seriously, and that the first issues have already been addressed:

1) The desire exists to tell the story more-or-less through the point of view of one or more of the characters.
2) Locations have been chosen, and sets designed which don’t rule out sound as a player, and in fact, encourage it.
3) There is not non-stop dialog.
Here are some ways to tease the eye, and thereby invite the ear to the party:
The Beauty of Long Lenses and Short Lenses
There is something odd about looking through a very long lens or a very short lens. We see things in a way we don’t ordinarily see them. The inference is often that we are looking through someone else’s eyes. In the opening sequence of "The Conversation" we see people in San Francisco’s Union Square through a telephoto lens. The lack of depth of field and other characteristics of that kind of lens puts us into a very subjective space. As a result, we can easily justify hearing sounds which may have very little to do with what we see in the frame, and more to do with the way the person ostensibly looking through that lens FEELS. The way we use such a shot will determine whether that inference is made obvious to the audience, or kept subliminal.
Dutch Angles and Moving Cameras
The shot may be from floor level or ceiling level. The frame may be rotated a few degrees off vertical. The camera may be on a track, hand held, or just panning. In any of these cases the effect will be to put the audience in unfamiliar space. The shot will no longer simply be "depicting" the scene. The shot becomes part of the scene. The element of unfamiliar space suddenly swings the door wide-open to sound.
Darkness Around the Edge Of the Frame
In many of the great film noir classics the frame was carefully composed with areas of darkness. Though we in the audience may not consciously consider what inhabits those dark splotches, they nevertheless get the point across that the truth, lurking somewhere just outside the frame, is too complex to let itself be photographed easily. Don’t forget that the ears are the guardians of sleep. They tell us what we need to know about the darkness, and will gladly supply some clues about what’s going on.

Extreme Close-ups and Long Shots
Very close shots of people’s hands, their clothing, etc. will tend to make us feel as though we are experiencing things through the point of view of either the person being photographed or the person whose view of them we are sharing. Extreme long shots are wonderful for sound because they provide an opportunity to hear the fullness or emptiness of a vast landscape. Carroll Ballard’s films "The Black Stallion" and "Never Cry Wolf" use wide shots and extreme close-ups wonderfully with sound.
Slow Motion
Raging Bull and Taxi Driver contain some obvious, and some very subtle uses of slow motion. Some of it is barely perceptible. But it always seems to put us into a dream-space, and tell us that something odd, and not very wholesome, is happening.
Black and White Images
Many still photographers feel that black and white images have several artistic advantages over color. Among them, that black and white shots are often less "busy" than color images, and therefore lend themselves more to presenting a coherent feeling. We are surrounded in our everyday lives by color and color images. A black and white image now is clearly "understood" (felt) to be someone’s point of view, not an "objective" presentation of events. In movies, like still photography, painting, fiction, and poetry, the artist tends to be most concerned with communicating feelings rather than "information." Black and white images have the potential to convey a maximum of feeling without the "clutter" of color.
Whenever we as an audience are put into a visual "space" in which we are encouraged to "feel" rather than "think," what comes into our ears can inform those feelings and magnify them.
What Do All Of These Visual Approaches Have In Common?
They all are ways of withholding information. They muddy the waters a little. When done well, the result will be the following implication: "Gee folks, if we could be more explicit about what is going on here we sure would, but it is so damned mysterious that even we, the storytellers, don’t fully understand how amazing it is. Maybe you can help us take it a little farther." That message is the bait. Dangle it in front of an audience and they won’t be able to resist going for it. In the process of going for it they bring their imaginations and experiences with them, making your story suddenly become their story. Success.
We, the film makers, are all sitting around a table in pre-production, brainstorming about how to manufacture the most delectable bait possible, and how to make it seem like it isn’t bait at all. (Aren’t the most interesting stories always told by guys who have to be begged to tell them?) We know that we want to sometimes use the camera to withhold information, to tease, or to put it more bluntly: to seduce. The most compelling method of seduction is inevitably going to involve sound as well.
Ideally, the unconscious dialog in the minds of the audience should be something like: "What I’m seeing isn’t giving me enough information. What I’m hearing is ambiguous, too. But the combination of the two seems to be pointing in the direction of a vaguely familiar container into which I can pour my experience and make something I never before quite imagined." Isn’t it obvious that the microphone plays just as important a role in setting up this performance as does the camera?
Editing Picture With Sound In Mind
One of the many things a film editor does is to get rid of moments in the film in which "nothing" is happening. A desirable objective most of the time, but not always. The editor and director need to be able to figure out when it will be useful to linger on a shot after the dialog is finished, or before it begins. To stay around after the obvious "action" is past, so that we can listen. Of course it helps quite a bit if the scene has been shot with these useful pauses in mind. Into these little pauses sound can creep on its stealthy little toes, or its clanking jackboots, to tell us something about where we have been or where we are going.
Walter Murch, film editor and sound designer, uses lots of unconventional techniques. One of them is to spend a certain period of his picture editing time not listening to the sound at all. He watches and edits the visual images without hearing the sync sound which was recorded as those images were photographed. This approach can ironically be a great boon to the use of sound in the movie. If the editor can imagine the sound (musical or otherwise) which might eventually accompany a scene, rather than listen to the rough, discontinuous, often annoying sync track, then the cutting will be more likely to leave room for those beats in which sound other than dialog will eventually make its contribution.
Sound’s Talents
Music, dialogue, and sound effects can each do any of the following jobs, and many more:

  • suggest a mood, evoke a feeling
  • set a pace
  • indicate a geographical locale
  • indicate a historical period
  • clarify the plot
  • define a character
  • connect otherwise unconnected ideas, characters, places, images, or moments
  • heighten realism or diminish it
  • heighten ambiguity or diminish it
  • draw attention to a detail, or away from it
  • indicate changes in time
  • smooth otherwise abrupt changes between shots or scenes
  • emphasize a transition for dramatic effect
  • describe an acoustic space
  • startle or soothe
  • exaggerate action or mediate it

At any given moment in a film, sound is likely to be doing several of these things at once.
But sound, if it’s any good, also has a life of its own, beyond these utilitarian functions. And its ability to be good and useful to the story, and powerful, beautiful and alive will be determined by the state of the ocean in which it swims, the film. Try as you may to paste sound onto a predetermined structure, the result will almost always fall short of your hopes. But if you encourage the sounds of the characters, the things, and the places in your film to inform your decisions in all the other film crafts, then your movie may just grow to have a voice beyond anything you might have dreamed.
So, what does a sound designer do?
It was the dream of Walter Murch and others in the wildly creative early days of American Zoetrope that sound would be taken as seriously as image. They thought that at least some films could use the guidance of someone well-schooled in the art of sound in storytelling, not only to create sounds but also to coordinate the use of sound in the film. This someone, they thought, would brainstorm with the director and writer in pre-production to integrate sound into the story on the page. During shooting that person would make sure that the recording and playing back of sound on the set was given the important status it deserves, and not treated as a low priority, which is always the temptation in the heat of trying to make the daily quota of shots. In post production that person would continue the fabrication and collection of sounds begun in pre-production, and would work with other sound professionals (composers, editors, mixers), and the Director and Editor, to give the film’s soundtrack a coherent and well coordinated feeling.
This dream has been a difficult one to realize, and in fact has made little headway since the early 1970s. The term sound designer has come to be associated simply with using specialized equipment to make "special" sound effects. On "THX-1138" and "The Conversation" Walter Murch was the Sound Designer in the fullest sense of the word. The fact that he was also a Picture Editor on "The Conversation" and "Apocalypse Now" put him in a position to shape those films in ways that allowed them to use sound in an organic and powerful way. No other sound designers on major American films have had that kind of opportunity.
So, the dream of giving sound equal status to image is deferred. Someday the Industry may appreciate and foster the model established by Murch. Until then, whether you cut the dialog, write the script, record music, perform foley, edit the film, direct the film or do any one of a hundred other jobs, anybody who shapes sound, edits sound, or even considers sound when making a creative decision in another craft is, at least in a limited sense, designing sound for the movie, and designing the movie for sound.

