Sonic Design: Exercises
21/4/2025-21/7/2025
Ruthlene Chua Zhen Si 0365222
Bachelor of Design (Honours) in Creative Media
- Instruction
- Lectures
- Class Summary
- Task
- Adjustment of the sound equalizer using Adobe Audition
- Adjusting different sound effects to suit their environment using the equalizer in Adobe Audition
- 2 Environment Audio Fundamentals
- Feedback
- Reflections
- The required task for this week is to write a reflection blog about what we have learnt through the lecture videos below:
- Nature of Sound:
- Understanding how sound behaves and what contributes to its creation.
- Capturing Sound:
- How sound is recorded using various techniques and equipment.
- Processing Sound:
- Analyzing and manipulating sound for specific purposes and media.
- Converting Sound to Digital:
- Preparing audio files for use in digital applications.
- Pro Tools:
- Using a digital audio workstation (DAW) or sound editing software for audio production.
- 1: Production
- Sound is created through the vibration of an object, which causes surrounding air molecules to vibrate.
- For example: Our vocal cords vibrate and create sound waves when we speak. Similarly, when music is played loudly, the speaker cone vibrates, pushing the air around it; this vibration of air is what starts the sound wave.
- 2: Propagation
- These vibrations travel through the air as sound waves, moving from the source toward the listener.
- Sound travels through a medium, usually air, through waves. These waves create alternating areas of high pressure called compression and low pressure called rarefaction. As the vibrations move through the air, the molecules shift back and forth, passing the energy along until the sound reaches its destination.
- When the sound waves reach our ears, they cause the eardrums to vibrate. These vibrations are translated into electrical signals by the inner ear and sent to the brain, which allows us to recognize and understand the sound.
- The outer ear: Consists of the visible part of the ear and the ear canal. Its main function is to gather sound waves from the environment and direct them into the ear.
- The middle ear: Includes the eardrum, a thin membrane that vibrates when sound waves hit it. This section also contains three tiny bones known as the malleus, incus, and stapes. These bones help to amplify and transmit the vibrations from the eardrum to the inner ear.
- The inner ear: Contains the cochlea, a spiral-shaped structure filled with fluid and tiny hair cells. These cells convert vibrations into electrical signals that are sent to the brain. The inner ear also includes the semicircular canals and the vestibule, which are responsible for maintaining balance.
- 3: Perception
- The sound waves reach our ears, stimulate the eardrums, and are interpreted by the brain as sound.
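To make the idea of production and propagation concrete, here is a small Python sketch (my own illustration, not from the lecture) of a pure tone represented as a series of pressure values; positive samples correspond to compression and negative samples to rarefaction:

```python
import math

def sine_wave(freq_hz, duration_s, sample_rate=44100):
    """Generate a pure tone as a list of samples in [-1, 1].

    Each sample is the instantaneous air-pressure deviation of the wave:
    positive values correspond to compression, negative to rarefaction.
    """
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 440 Hz tone (concert A) lasting 10 ms:
tone = sine_wave(440, 0.01)
```

This is what a digital recording ultimately is: thousands of these pressure snapshots per second, which our ears would turn back into vibration if played through a speaker.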
- Ear training = essential
- Good mixes come from good ears. It’s not just about using gear or plugins; you need to actually hear what’s going on in the sound.
- Frequencies
- Bass: low-end, can make things sound full or muddy
- Mids: where most instruments sit, but too much = boxy
- Highs: clarity and sparkle, but too much = harsh
- Mixing problems
- Muddiness = too much low-mid energy
- Harshness = too much high-mid energy
- Dull mix = not enough top end
- Thin mix = missing low frequencies
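As a quick reference for the bands above, here is a tiny Python sketch (my own; the boundary frequencies are rough conventions, not hard rules from the lecture):

```python
def frequency_band(freq_hz):
    """Roughly classify a frequency into the bands discussed above.

    The cutoffs (250 Hz and 4 kHz) are approximate conventions.
    """
    if freq_hz < 250:
        return "bass"   # low end: fullness, or muddiness in excess
    elif freq_hz < 4000:
        return "mids"   # where most instruments sit; too much sounds boxy
    else:
        return "highs"  # clarity and sparkle; too much sounds harsh

print(frequency_band(100))  # bass
```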
- Any sound editing software or digital audio workstation (DAW) typically comes with a set of common tools that are essential for sound design.
- For example, imagine you're doing graphic design. There are basic functions you would expect to find, like creating a drop shadow. Even if the software doesn’t have a specific plugin or menu option for drop shadows, you can still create one manually by making a copy of the object, placing it behind the original, lowering its opacity, and adding some feathering. The same principle applies to sound design tools: even the simplest tools can achieve complex results if you know how to use them creatively.
- While these tools might seem basic or even boring at first, they are actually crucial to sound design.
- Purpose: To make sounds richer, more interesting, and of higher quality.
- How to apply it: Layering involves taking two or more sounds and placing them on top of each other. That’s the basic definition. However, just like in visual design, successful layering requires a good sense of balance and blending. In audio, layering allows you to create entirely new, unique sounds by blending different elements together — that's the ultimate goal.
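In digital audio, layering boils down to something very simple: summing the sample values of each track and keeping the result in range so it doesn't clip. A minimal sketch (my own illustration, not from the lecture):

```python
def layer(*tracks):
    """Mix several equal-length sample lists by summing them,
    then rescale so the result stays within [-1, 1]
    (this avoids digital clipping)."""
    mixed = [sum(samples) for samples in zip(*tracks)]
    peak = max(abs(s) for s in mixed)
    if peak > 1.0:
        mixed = [s / peak for s in mixed]
    return mixed
```

The creative part, of course, is choosing and balancing *which* sounds to stack, just as in visual design.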
- Purpose: To stretch or compress the duration of a sound without necessarily changing its pitch.
- How it works: When you stretch a sound, it becomes longer, and when you compress it, it becomes shorter. Sometimes stretching can also affect the pitch — for example, stretching a voice recording will make it slower and deeper, while compressing it will make the voice faster and higher. Most DAWs let you control whether or not the pitch changes when adjusting the time.
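The naive version of time stretching described above, where the pitch changes along with the length, can be sketched in a few lines of Python (my own illustration; real DAWs use far more sophisticated algorithms to keep pitch constant):

```python
def stretch(samples, factor):
    """Naively time-stretch a sound by resampling with linear interpolation.

    factor > 1 makes the sound longer (and, with this naive method,
    lower-pitched); factor < 1 makes it shorter and higher-pitched.
    """
    n = int(len(samples) * factor)
    out = []
    for i in range(n):
        pos = i / factor           # position in the original sound
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a * (1 - frac) + b * frac)  # blend neighbouring samples
    return out
```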
- Purpose: To raise or lower the pitch of a sound.
- How it works: In sound design, pitch shifting is used to achieve different effects. For instance, raising the pitch can create a chipmunk-like voice, while lowering it can produce a deep, scary, monster-like sound. The way you shift the pitch depends on the target effect you’re aiming for — it’s very subjective and creative.
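There is a simple formula behind the chipmunk/monster trick: playing a sound 2^(n/12) times faster raises its pitch by n semitones, since each octave (12 semitones) doubles the frequency. A small sketch (my own; the function name is made up):

```python
def playback_rate(semitones):
    """Speed factor that shifts pitch by the given number of semitones
    when a sound is simply played back faster or slower.

    +12 semitones (one octave up) doubles the speed; -12 halves it.
    """
    return 2 ** (semitones / 12)

print(round(playback_rate(12), 3))  # 2.0
```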
- Purpose: To create strange or unusual sounds by playing them backward.
- How it works: Reversing a sound can produce interesting and unexpected effects. When combined with layering techniques, reversed sounds can add unique textures and energy to a project.
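Digitally, this is the simplest effect of all; a sound is just a sequence of samples, so reversing it is one line (my own sketch):

```python
def reverse(samples):
    """Play a sound backwards by reversing its sample order."""
    return samples[::-1]
```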
- Purpose: To create sound effects manually when no suitable recordings are available.
- How it works: If the perfect sound can't be found or recorded, sometimes the best solution is to mimic it with your mouth. This technique can be surprisingly effective and gives you a lot of creative freedom.
- Using sound allows you to tell a story in ways that visuals alone never could. Adding sound effects to your project brings your film to life, making the experience much more immersive and emotionally powerful.
- Sound effects are without a doubt one of the most powerful assets for storytelling. They can dramatically enhance the viewer’s experience and make every moment feel more real and intense.
- Co-Pulse: One of the most widely used sound effect packs.
- YouTube: A good source for free or royalty-free sound effects (be sure to check usage rights).
- Use the right sound effects at the right moments.
- For example, if your scene shows a mountain hike, you could add sounds of wind, birds chirping, and trees swaying.
- The key is to match the environment naturally to what the audience sees on screen.
- A good rule of thumb: don't use more than three sound effects per scene.
- Sometimes, even just one well-placed sound can make a scene feel complete.
- Look carefully at your clip, identify the main subject, and design your sound around it.
- For example: if the focus is an elephant statue, add an elephant sound effect; if it’s a landscape scene with waves crashing, add ocean waves and perhaps some soft wind sounds.
- The most important thing to remember: make the sound effects loud and obvious enough.
- Most people will be watching videos on their phones with the volume below 50%. If the sound is too subtle, they won't hear or feel it at all.
- The required task for this week is to write a reflection blog about what we have learnt through the lecture videos below:
- The word “diegetic” comes from the Greek word “diegesis,” which basically means storytelling. Think of it as everything that exists inside the world of the film — the stuff the characters can actually hear or see.
- The director acts like a narrator, using sound to build the world around the characters.
- Internal diegetic sound is when we hear a character’s thoughts. The other characters can’t hear it, but it’s still part of the story world.
- Diegetic sound helps show what a character is experiencing
- it can be used to give us insight into their mental state.
- Sound can also be used in unexpected ways to create emotion or surprise.
- Nondiegetic sound is anything the audience hears that the characters don’t. This includes background music, narration (when the narrator isn’t part of the story), and added sound effects.
- Sound effects are also added for dramatic or funny moments, like an exaggerated swoosh or punch sound in a comedy.
- Sometimes, sound starts off nondiegetic and then becomes part of the scene, or the other way around. This is called transdiegetic sound.
- It’s a clever way to blur the line between what’s real and what’s not, perfect for dream sequences or emotional moments.
- Some films don’t follow the usual rules and play around with sound in really creative ways:
- In Magnolia, all the characters sing along to a song that normally would just be background music. It becomes this emotional moment where reality and feeling mix together.
- In Psycho, Norman Bates hears his mother’s voice in his head, which is really his own, giving us insight into his fractured mind.
- In Joker, Arthur sings along to music we thought was just playing in the background, hinting that maybe it’s all in his head.
- In La La Land, background music in a restaurant turns into something a character hears inside her own imagination, leading her to make a bold decision.
- How we hear sound in film:
- Acousmatic sound: You can hear it, but you don’t see where it’s coming from. This could be part of the story (like birds chirping offscreen) or not (like the movie score).
- Visualized sound: You see and hear the source of the sound at the same time.
- Filmmakers use these techniques to guide what we feel, what we focus on, and how we understand a scene.
- We always say we’re watching a movie, but really we’re also listening to one.
- Sound design is just as powerful as visuals when it comes to storytelling. It helps create the mood, the meaning, and the connection we feel with what’s on screen.
- The required task for this week is to write a reflection blog about what we have learnt through the lecture videos below:
- When we hear a sound, like a dog barking, our brains quickly connect it to something familiar.
- Put a few sounds together, and we start to build a full scene in our heads, almost like watching a movie without the screen.
- Distance – We can sense whether something is close or far based on how it sounds.
- Space – Certain sounds make us feel like we’re in a wide-open space or a small, enclosed room.
- Direction – We can tell where a sound is coming from: left, right, behind, or above us.
- Temperature, Time, and Era – Sounds can suggest warmth or cold, a time of day, or even a specific moment in history.
- Emotion – A single sound can make us feel calm, tense, joyful, or nostalgic.
- Music plays a big role in soundscapes; it blends with other sounds to help set the mood.
- It adds emotion and helps the scene feel more alive.
- Learned sounds are ones we recognize because of what we’ve seen or heard before, like the sound of a cash register making us think of money.
- Instinctual sounds are hardwired into us: high-pitched sounds often feel safe, while deep, low-pitched ones can feel threatening.
- These instincts come from how we, as humans, evolved to respond to our surroundings.
- Soundscapes are more than just background noise.
- They help us tell stories, shape how we feel, and make us feel like we’re somewhere else, just through sound.


