Overview of post-production
Post-production is where raw footage becomes a finished television program. Everything that happens after the cameras stop rolling falls into this phase: editing, visual effects, sound design, color work, and final delivery. It's the stage that shapes how a story actually feels to the viewer.
This process requires tight collaboration among editors, VFX artists, sound designers, colorists, and more. Each specialist handles a different layer of the final product, and their work has to come together seamlessly before anything reaches an audience.
Editing techniques
Linear vs non-linear editing
Linear editing means assembling footage sequentially on physical tape. You have to lay down shots in order, which makes rearranging scenes difficult and time-consuming. This method dominated early television production.
Non-linear editing (NLE) uses digital systems that let you access any part of your footage at any time. You can rearrange scenes, try different cuts, and undo changes instantly. NLE is now the industry standard; Avid Media Composer, Adobe Premiere Pro, and DaVinci Resolve are the dominant tools.
The shift from linear to non-linear editing fundamentally changed how editors work. Instead of committing to decisions early, editors can now experiment freely and refine cuts through multiple iterations.
Continuity editing
Continuity editing keeps the visual and narrative flow smooth between shots so the audience stays immersed in the story. The goal is to make cuts invisible.
Key techniques include:
- Match cuts that align action or composition across two shots
- Eyeline matches that ensure characters appear to look at the right place when the camera angle changes
- Consistent screen direction so movement flows logically from shot to shot (the 180-degree rule)
- Shot/reverse shot patterns for dialogue scenes
This style of editing is the backbone of narrative television, from dramas to sitcoms. When continuity editing is done well, you don't notice it at all.
Montage and parallel editing
Montage strings together a series of short shots to compress time or communicate complex ideas quickly. A training montage, a "getting ready" sequence, or a time-lapse of a city changing across seasons are all common examples.
Parallel editing (also called cross-cutting) intercuts between two or more events happening simultaneously. This technique builds tension by showing the audience things that characters in each storyline can't see. Think of a scene cutting between a detective racing to a location and a victim in danger.
Both techniques manipulate the viewer's sense of time and are frequently used in action sequences, season finales, and character development arcs.
Visual effects
CGI and compositing
Computer-generated imagery (CGI) creates visual elements that weren't captured on set. Compositing layers multiple pieces of footage and CGI together into a single frame so everything looks like it belongs in the same scene.
The range of CGI in television is huge:
- Subtle work like set extensions (making a small set look like a massive castle), removing crew reflections, or adding weather effects
- Complex work like creating dragons in Game of Thrones or the Upside Down creatures in Stranger Things
As TV budgets have grown, the gap between film-quality and TV-quality VFX has narrowed significantly. Shows now routinely feature effects work that would have been feature-film exclusive a decade ago.
Color grading and correction
These are two related but distinct processes:
- Color correction fixes technical problems: matching exposure between shots, correcting white balance, ensuring visual consistency across a scene
- Color grading is the creative step, where a colorist adjusts the overall look to establish mood and atmosphere
Color grading can define a show's entire visual identity. Breaking Bad used warm, desaturated tones to evoke the New Mexico heat. Ozark is famous for its cold blue palette. Some shows use color to distinguish timelines or locations within the narrative, giving viewers an instant visual cue about where and when a scene takes place.
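The split between the two steps can be illustrated with a toy sketch: correction neutralizes a technical problem (a color cast), while grading applies a deliberate creative push. Real tools operate on full images in wide-gamut color spaces; this works on single RGB triples for clarity.

```python
def correct_white_balance(pixel, gray_reference):
    """Correction: scale each channel so a known neutral gray
    card reads as truly neutral (equal R, G, B)."""
    target = sum(gray_reference) / 3
    gains = [target / c for c in gray_reference]
    return [min(255, p * g) for p, g in zip(pixel, gains)]

def grade_warm(pixel, amount=0.1):
    """Grading: a creative choice, here warming the image by
    boosting red and pulling back blue."""
    r, g, b = pixel
    return [min(255, r * (1 + amount)), g, max(0, b * (1 - amount))]

# A gray card captured with a blue cast (blue channel too high):
gray_card = [110, 118, 140]
corrected = correct_white_balance(gray_card, gray_card)
# After correction, all three channels match: the cast is gone.
```

The same gains derived from the gray card would then be applied to every pixel in the shot; grading happens afterward, on the corrected image.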
Motion graphics
Motion graphics are animated design elements layered into video content. You see them constantly:
- Title sequences that establish a show's brand and tone
- Lower thirds identifying speakers in news broadcasts or documentaries
- On-screen statistics during sports coverage
- Transitions and bumpers between segments in reality TV
These elements do more than look good. They convey information efficiently and create visual consistency across an entire series or broadcast.
Sound design
Dialogue editing
Dialogue editing cleans up the audio recorded on set. This means removing background noise, smoothing out inconsistencies in volume, and ensuring every word is clear.
When on-set audio is unusable (too much wind, a plane overhead, a flubbed line), the production turns to automated dialogue replacement (ADR). During ADR, actors re-record their lines in a studio while watching the original footage, syncing their new performance to their lip movements on screen. It's more common than most viewers realize.
Foley and sound effects
Foley is the art of recreating everyday sounds in sync with on-screen action. Foley artists watch the footage and physically perform sounds in a studio: walking on different surfaces for footsteps, rustling fabric for clothing movement, handling props for object interactions.
Sound effects fill out the rest of the audio landscape. These can be recorded specifically for a project or pulled from existing sound libraries. Explosions, door slams, ambient city noise, rain on a window: all of these might be added in post.
Together, Foley and sound effects create the immersive audio environment that makes a scene feel real. Without them, even beautifully shot footage can feel flat and lifeless.
Music and scoring
Music is one of the most powerful tools in post-production for shaping how an audience feels.
- Original scoring involves a composer writing custom music tailored to specific scenes and emotional beats
- Licensed tracks are pre-existing songs selected to complement the mood or era of a scene
- Theme songs and recurring musical cues become part of a show's identity (think of the Succession theme or the Stranger Things synth score)
Background music sets pacing and tone throughout an episode. A well-placed musical shift can turn a mundane scene into something tense, heartbreaking, or triumphant.
Post-production workflow

Offline vs online editing
Post-production typically happens in two phases:
- Offline editing uses low-resolution copies (proxies) of the footage. Working with smaller files means faster processing, which lets editors focus on creative decisions: scene order, pacing, performance selection. The result is a "rough cut" that locks in the structure of the episode.
- Online editing swaps in the full-resolution files and handles fine-tuning. This is where final color grading, visual effects integration, and technical polish happen before delivery.
This two-phase approach lets the creative work happen quickly without being slowed down by massive file sizes, while still ensuring the final product is technically pristine.
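The handoff between the two phases is often called "conforming": the locked offline timeline is relinked from proxy media to the full-resolution masters. A minimal sketch, assuming a simple `_proxy` filename convention (an illustration, not a universal standard):

```python
def conform_to_full_res(timeline):
    """Replace each proxy path in an edit timeline with the
    matching full-resolution master file."""
    conformed = []
    for clip in timeline:
        if clip.endswith("_proxy.mov"):
            conformed.append(clip.replace("_proxy.mov", ".mov"))
        else:
            conformed.append(clip)  # already points at a master
    return conformed

rough_cut = ["ep01_sc04_tk02_proxy.mov", "ep01_sc05_tk01_proxy.mov"]
online_cut = conform_to_full_res(rough_cut)
# → ["ep01_sc04_tk02.mov", "ep01_sc05_tk01.mov"]
```

In practice, NLEs relink by timecode and clip metadata rather than filename alone, but the principle is the same: the creative decisions carry over unchanged while the media underneath is swapped.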
Collaborative tools
Modern TV post-production relies on tools that keep large teams coordinated:
- Project management software tracks tasks, deadlines, and communication across departments
- Cloud-based platforms allow editors, colorists, and sound designers to access and share files remotely
- Version control systems track every change so teams can compare iterations or revert to earlier versions
- Integrated software suites connect editing, VFX, and sound tools within a single ecosystem
Version control
Keeping track of different versions is critical when multiple people are working on the same project. A single episode might go through dozens of iterations, and productions often need to maintain several distinct cuts:
- A director's cut reflecting the director's creative preferences
- A network cut adjusted to meet broadcaster notes and time requirements
- International versions with different content edits for various markets
Version control systems prevent the chaos of lost work, overwritten files, or confusion about which cut is current.
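At its core, the problem these systems solve is simple bookkeeping: every cut accumulates dated iterations, and there must be exactly one answer to "which version is current." A toy sketch of that structure (version labels and notes are illustrative; real systems attach actual media and richer metadata):

```python
cuts = {}  # cut name -> chronological list of iterations

def save_iteration(cut_name, date, note):
    """Record a new iteration of a named cut."""
    cuts.setdefault(cut_name, []).append({"date": date, "note": note})

def current(cut_name):
    """The latest saved iteration of a given cut."""
    return cuts[cut_name][-1]

save_iteration("directors_cut", "2024-03-01", "first assembly")
save_iteration("directors_cut", "2024-03-08", "tightened act two")
save_iteration("network_cut", "2024-03-10", "trimmed to meet runtime")

current("directors_cut")["note"]  # → "tightened act two"
```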
Technical considerations
Video codecs and formats
A codec (compressor-decompressor) determines how video data is compressed for storage and decompressed for playback. A container format (such as MOV, MXF, or MP4) defines the file structure: how the video, audio, and metadata streams are packaged together and how the file interacts with different software and hardware.
Common codecs in TV production include:
- ProRes, common for high-quality editing and finishing
- H.264, widely used for delivery and streaming
- HEVC (H.265), which offers better compression than H.264 at similar quality and is increasingly used for 4K content
The choice of codec depends on where the content will end up. A show destined for streaming has different requirements than one airing on broadcast television or being archived for physical media.
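The practical consequence of codec choice is storage and bandwidth: file size is roughly bitrate times duration. A quick sketch using ballpark bitrates (illustrative figures, not official specifications):

```python
# Approximate bitrates in megabits per second (illustrative only):
BITRATE_MBPS = {
    "ProRes 422 HQ (1080p)": 220,  # mezzanine/editing quality
    "H.264 (1080p stream)": 8,     # typical streaming delivery
    "HEVC (4K stream)": 15,        # better compression, higher resolution
}

def file_size_gb(codec, minutes):
    """Approximate file size in gigabytes for a given duration."""
    megabits = BITRATE_MBPS[codec] * minutes * 60
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

# A 45-minute episode:
file_size_gb("ProRes 422 HQ (1080p)", 45)  # ≈ 74 GB
file_size_gb("H.264 (1080p stream)", 45)   # ≈ 2.7 GB
```

The two-orders-of-magnitude gap is why editing uses mezzanine codecs on fast local storage while delivery uses heavily compressed ones.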
Audio mixing and mastering
Mixing balances all the audio elements (dialogue, music, sound effects, ambient sound) so they work together clearly. The mixer decides what the audience hears prominently and what sits in the background.
Mastering ensures consistent volume levels and frequency balance across an entire episode and across a full season. This step also accounts for different playback environments, from TV speakers to surround sound systems to headphones.
Broadcast standards (set by organizations like SMPTE and the EBU) regulate audio levels and dynamic range, so the final mix must comply with specific technical requirements.
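A concrete example of such a requirement: EBU R 128 specifies an integrated loudness target of -23 LUFS. A minimal compliance-check sketch (the ±0.5 LU tolerance used here is common for produced, non-live content, but the exact deliverable spec always comes from the broadcaster):

```python
TARGET_LUFS = -23.0   # EBU R 128 integrated loudness target
TOLERANCE_LU = 0.5    # assumed tolerance; check the actual delivery spec

def check_loudness(measured_lufs):
    """Compare a measured integrated loudness against the target."""
    deviation = measured_lufs - TARGET_LUFS
    if abs(deviation) <= TOLERANCE_LU:
        return "pass"
    return f"fail: {deviation:+.1f} LU from target"

check_loudness(-23.2)  # → "pass"
check_loudness(-20.5)  # → "fail: +2.5 LU from target" (mix too loud)
```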
Aspect ratios and resolutions
Aspect ratio is the proportional relationship between the frame's width and height:
- 16:9 is the standard widescreen ratio for modern television
- 4:3 was the traditional TV ratio, now mostly seen in older content or used as a deliberate stylistic choice
Resolution refers to the number of pixels in the image. TV content now ranges from standard definition (SD) up through HD (1080p), 4K (2160p), and even 8K. Productions must consider multi-platform delivery, since the same episode might air on broadcast TV, stream on a phone, and be available in 4K on a smart TV.
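The relationship between resolution and aspect ratio is simple arithmetic: reduce width:height by their greatest common divisor. A small sketch:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

def megapixels(width, height):
    return width * height / 1_000_000

aspect_ratio(1920, 1080)  # → "16:9" (HD)
aspect_ratio(3840, 2160)  # → "16:9" (4K UHD: same shape, 4x the pixels)
aspect_ratio(640, 480)    # → "4:3"  (traditional SD)
megapixels(3840, 2160)    # ≈ 8.3, versus ≈ 2.1 for 1080p
```

This is why 4K UHD fills the same screen shape as HD: the ratio is identical, only the pixel density changes.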
Post-production roles
Editor
The editor assembles raw footage into a coherent narrative. This role goes far beyond technical assembly: editors make critical decisions about pacing, shot selection, performance choices, and the overall rhythm of a program. They work closely with the director (and often the showrunner) to realize the creative vision.
Many editors specialize in particular genres, since the editing demands of a fast-paced comedy differ significantly from those of a slow-burn drama or a documentary.
VFX supervisor
The VFX supervisor oversees all visual effects work, serving as the bridge between the creative team and the VFX artists. Their responsibilities include:
- Ensuring effects align with the director's vision and the show's visual style
- Coordinating VFX work across potentially multiple vendor studios
- Managing the VFX budget and schedule within the broader post-production timeline
- Advising during production on how to shoot scenes that will require effects later
Sound designer
The sound designer creates and implements the overall audio concept for a program. They oversee dialogue editing, sound effects, Foley, and music integration, making sure all audio elements serve the story.
This role involves close collaboration with composers, music supervisors, and the director to ensure the soundtrack enhances the emotional and narrative experience.
Emerging technologies
Virtual production
Virtual production uses real-time 3D environments (often displayed on massive LED walls) combined with live-action footage during filming. The Mandalorian popularized this approach using Unreal Engine and StageCraft technology.
This technique blurs the boundary between production and post-production. Because actors and crew can see the digital environment on set in real time, many creative decisions that used to happen in post now happen during the shoot. The result is often reduced post-production VFX work and faster turnaround.

AI in post-production
Artificial intelligence is increasingly used to automate repetitive, time-consuming tasks:
- Automated color matching and correction
- Audio cleanup and noise reduction
- Content analysis and metadata tagging (identifying scenes, faces, objects)
- Rough assembly of footage based on script analysis
These tools speed up workflows, but the creative decisions still rest with human editors and artists. AI in post-production is currently more about efficiency than replacing creative judgment.
Cloud-based workflows
Cloud-based post-production allows teams to collaborate remotely, accessing project files from anywhere with an internet connection. This approach provides scalable computing power for rendering and processing, which means a production doesn't need to invest in expensive on-premises hardware.
Cloud workflows also make it easier to integrate different post-production tasks, since editors, colorists, and sound designers can all work on the same project simultaneously from different locations.
Post-production for different genres
Drama vs reality TV
Drama post-production is meticulous. Editors carefully shape performances, maintain narrative coherence, and build emotional arcs. Visual effects and color grading tend to be more complex, and the overall production values are higher.
Reality TV involves a different challenge: constructing compelling storylines from hundreds of hours of unscripted footage. Editors play an outsized creative role, essentially building the narrative in the edit room. Turnaround times are typically much tighter, and adaptability matters more than polish.
News vs sports
News editing prioritizes speed and accuracy. Editors work under extremely tight deadlines, sometimes turning around packages in minutes. Graphics focus on delivering information clearly.
Sports production relies heavily on live editing and instant replay technology. On-screen graphics are dynamic, displaying real-time statistics, scores, and player information. Both genres demand the ability to adapt quickly to unfolding events.
Commercials vs music videos
Commercials require precise storytelling within very short timeframes (typically 15 to 60 seconds). Every frame and every cut must serve the message. Post-production is highly polished, with close collaboration between editors and marketing teams.
Music videos often push creative boundaries, experimenting with non-linear narratives, unconventional visual effects, and bold color grading. The edit is driven by the music's rhythm and energy rather than traditional narrative structure. Both formats demand high production values relative to their length.
Budgeting and scheduling
Time management in post-production
Effective post-production requires detailed scheduling across all phases: editing, VFX, sound design, color, and delivery. Producers build timelines that account for potential bottlenecks like rendering time, client review cycles, and approval processes.
Balancing quality with deadlines is a constant tension. A VFX-heavy episode needs more time than a dialogue-driven one, and the schedule has to reflect that.
Cost estimation
Post-production budgets break down into specific categories:
- Editing (editor salaries, facility costs)
- VFX (often the largest variable cost, especially for genre shows)
- Sound (mixing stages, Foley sessions, music licensing or composition)
- Equipment and software licenses
- Contingency for unexpected overages
Balancing creative ambition with financial reality is a core part of producing television. A showrunner might want 500 VFX shots per episode, but the budget might only support 200.
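The arithmetic behind these trade-offs is straightforward: sum the category budgets, then add a contingency percentage for overages. A minimal sketch with made-up figures:

```python
def estimate_budget(categories, contingency_pct=10):
    """Total a post budget with a contingency buffer on top."""
    subtotal = sum(categories.values())
    contingency = subtotal * contingency_pct / 100
    return subtotal + contingency

# Illustrative per-episode figures (not drawn from any real production):
episode_post = {
    "editing": 80_000,
    "vfx": 250_000,       # often the largest variable cost
    "sound": 60_000,
    "equipment_software": 25_000,
}
estimate_budget(episode_post)  # → 456500.0
```

The same structure makes trade-offs explicit: if each VFX shot costs roughly the same, cutting the shot count from 500 to 200 scales that line item proportionally.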
Outsourcing vs in-house
Productions weigh the benefits of maintaining in-house post-production facilities against outsourcing to specialized vendors:
- In-house offers more control, tighter security, and easier communication
- Outsourcing provides access to specialized skills (high-end VFX, complex sound design) and can be more cost-effective for specific tasks
- Many productions use a hybrid approach, keeping a core in-house team while outsourcing specialized or overflow work
Quality control and delivery
Technical specifications
Before delivery, the final product must pass technical quality control (QC). This involves checking:
- Video and audio levels against broadcast or platform standards
- Color accuracy and consistency
- Proper frame rates and resolution
- Compatibility with target playback systems
Industry standards from organizations like SMPTE (Society of Motion Picture and Television Engineers) and EBU (European Broadcasting Union) define the technical requirements for professional content.
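Much of technical QC is mechanical comparison against a spec sheet, which is why it is commonly automated. A minimal sketch (the spec values are illustrative, not any real broadcaster's requirements):

```python
# Hypothetical platform delivery spec:
PLATFORM_SPEC = {
    "resolution": (3840, 2160),
    "frame_rate": 23.976,
    "codec": "HEVC",
}

def qc_check(deliverable, spec=PLATFORM_SPEC):
    """Return a list of QC failures; an empty list means pass."""
    failures = []
    for key, required in spec.items():
        actual = deliverable.get(key)
        if actual != required:
            failures.append(f"{key}: got {actual}, need {required}")
    return failures

episode = {"resolution": (3840, 2160), "frame_rate": 25.0, "codec": "HEVC"}
qc_check(episode)  # → ["frame_rate: got 25.0, need 23.976"]
```

Real QC also covers things a dictionary comparison can't, such as visual artifacts and audio sync, which still require human review.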
Content standards
Beyond technical QC, content must comply with network or platform guidelines. This includes:
- Age ratings and content warnings
- Copyright clearances for music, footage, and other licensed material
- Product placement regulations
- Cultural sensitivities for different markets
Some shows require multiple versions edited for different audiences or regions.
Distribution formats
The final step is preparing deliverables for every distribution channel. A single show might need:
- Multiple resolution versions (SD, HD, 4K)
- Closed captions and subtitles in various languages
- Audio descriptions for accessibility
- DRM (Digital Rights Management) protections for digital distribution
Each platform (broadcast network, streaming service, international distributor) has its own delivery specifications, and the post-production team must meet all of them.