This is part 2 of my After Effects diary on the Xian project, and in it I’ll be looking at colour grading. If you missed part 1, I mentioned that the Xian project marked several milestones for me, and that I had been trying to cram so many different subjects together that it had taken me two years and four attempts just to get part 1 out. The mistake I had been making was to confuse two distinct topics – compositing and colour grading – which is interesting in hindsight, as the theme of this article is the importance of keeping them separate.

You gotta keep ‘em separated

While colour grading might sound like a simple and familiar topic, the advice I was given on Xian (in 2011) completely changed the way I approach compositing and grading, and so it’s something I’ve been planning to share for a while. Because this is such a long article (even after splitting it in half) I feel I should clarify a few things. Firstly, this is not a step-by-step tutorial on colour grading. Secondly, as I don’t have the original project files for Xian, I’m not going to be doing a project breakdown on how Xian itself was graded. Instead, I’ll be explaining the significance of the Xian project by looking back at how I approached colour grading before and after it, and analysing why this was such a big change.

If you just want the short version then it’s pretty simple: colour grading is a separate step – both creatively and technically – that comes after compositing. But although this might not sound especially groundbreaking, explaining why this was a change from the way I previously worked introduces several different threads that eventually merge together with Xian. Let’s start by jumping back to the 1980s.

Ignorance is bliss

I don’t mind admitting that when I started my first job in 1997, I didn’t even know that colour grading existed – but I don’t think that was unusual.
The Panasonic MX 10 was a common feature in the VHS edit suites of the 1980s, but lacked any colour correction tools.

When I first developed an interest in video production, at high school, we learned that you had to white balance the video camera in every new location. I think everyone in my high school class learned this the hard way: at some stage we all forgot to do a white balance and ended up with footage that was either blue or orange. In the 1980s there wasn’t any economical way to adjust the colours of video recordings – especially with the VHS and U-matic video suites that were common in those days. Desktop video processors like the Panasonic MX 10 and MX 12 did not have any colour correction functions. Even at high school we were taught that different lights had different colours, and that the colour of daylight changes throughout the day, but white balancing seemed to be a purely technical process. If you were using a domestic video camera and you shot something with the wrong white balance, you were stuck with footage that was the wrong colour. This didn’t change at University, even though it had more advanced equipment. Outside of a high-end, online edit suite, there was no readily accessible way to change the colours of analogue video footage.

University is also where I was introduced to film. We learned that there were different types of film stock, balanced for either daylight or tungsten lighting, and that you could compensate for different lights by using coloured filters on the camera. But to me, this seemed analogous to the white balance on a video camera and again it struck me as a purely technical exercise. At this stage of my education, we were still struggling to read a light meter and just exposing the film correctly was a major achievement.

A typical product shot from Kodak, for their Vision3 range. The T denotes tungsten-balanced stocks, the D is for daylight.
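For readers curious what white balancing actually does to the signal, here’s a rough sketch in Python. It isn’t how any particular camera implements it – the function names and the use of green as the anchor channel are purely illustrative – but it shows the basic idea: scale the red and blue channels so that a reference patch that should be white comes out neutral.

```python
def white_balance_gains(reference_white):
    """Given an (r, g, b) sample of a patch that should be white,
    return per-channel gains that neutralise the colour cast.
    Green is used as the anchor channel (an illustrative choice)."""
    r, g, b = reference_white
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    """Scale a pixel by the gains, clipping to the 8-bit range."""
    return tuple(min(255, round(c * k)) for c, k in zip(pixel, gains))

# Tungsten light leaves whites looking orange: red too high, blue too low.
gains = white_balance_gains((220, 180, 120))
print(apply_gains((220, 180, 120), gains))  # the reference patch becomes neutral grey
```

Shooting with the wrong balance and fixing it later means applying gains like these to every frame – trivial today, but simply not possible on an analogue tape deck.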
There was a poster on a wall that detailed the complicated process needed to go from shooting negative film to projecting it, and this was the first time I saw a mention of colour grading – which in the film world is also known as colour timing. Because I didn’t shoot anything on negative film, I didn’t go through the whole process myself. Purely from looking at the poster, I knew there was a step called colour timing, but once again I made a vague assumption that it was to do with white balancing, and any creative possibilities were completely overlooked. All of this means that up to the point I graduated, I’d only thought of colour as something related to white balancing – a necessary if tedious technical process.

Not “American Cameraman”

While I was studying at University, I began to read American Cinematographer. This was really my first introduction to the idea that film (and video) production could involve a particular look. It’s interesting for me to look back and see that this isn’t something that was ever spelled out clearly. American Cinematographer had been around for over 70 years before I first saw a copy, so it was a bit like picking up a book half-way through, or starting a TV show from series 4, and trying to work out who everyone is and what’s going on. I don’t mind admitting that it took me a while to deduce that the role of the cinematographer is not simply to hold the camera. I suppose if it was, it would be called “American Cameraman”. While the term “Director of Photography” is more common these days, I was surprised to learn that the DoP isn’t even allowed to touch the camera on Hollywood productions, or they risk being fined by their union. So once I grasped the concept that the Director of Photography was not just a cameraman, I started to understand what it was they actually did – and I became fascinated by the amount of effort and devotion that went into controlling the image that was recorded.
American Cinematographer has been published for 99 years. It’s not a murder mystery, but the first time I picked one up I had a lot of figuring out to do…

So it was American Cinematographer that introduced me to the notion of giving a production a “look”. The creative and emotional aspects of a script would be interpreted into colours, lights, filters, film stocks and processing techniques. Paragraphs could be written about the decision to use a ¼ over a ½ filter, providing me with hours of contemplation as I tried to figure out what that meant. What captivated me was the clear expertise and passion of everyone involved, and the level of attention given to the tiniest details, even though I didn’t understand most of it. For me, discovering American Cinematographer was similar to someone finding a book on art history and trying to figure out what it was about, without ever having seen a painting before. By the time I was finishing my course, I was aware that movies had their own unique look that was an integral part of the film itself.

One notable example I remember is Spike Lee’s film “Clockers”, from 1995. Clockers was the first film shot by Malik Hassan Sayeed, who had previously worked in the lighting department. In order to give Clockers a unique look, they decided to shoot it on a reversal film stock developed for NASA, and then cross-process it – which basically means developing it in different chemicals than the ones you’re meant to use. Half-way through the shoot, some representatives from Kodak turned up on set and said they should stop filming, because they didn’t think that it would work. Sayeed ignored their advice and shooting continued – and luckily it did work. While the film was not a commercial success, the striking visual style was widely admired.
I sometimes wonder how I would feel if I was working on the biggest project of my life, and a few developers from Adobe turned up at my office to tell me that what I was doing in After Effects was wrong, and wouldn’t render…

“Clockers” was shot on a film stock that was originally made for NASA. Kodak had to make a special batch for Spike Lee’s film that included edge numbers. Cross-processing the unusual film stock gave the movie a very distinctive look.

While American Cinematographer taught me a lot about the creative approach to lighting and filming, I generally thought of it as something that took place during production. I still wasn’t aware of the creative possibilities in post production. The most high-profile example of a distinctive colour grade from that time would be “Se7en”, which used a bleach-bypass process and made it fashionable for years to come, but that was still a chemical technique that took place in the laboratory, and didn’t seem relevant to desktop video production. So while I hadn’t realized that colour was something that could be controlled during post-production, at least I had become aware of the creative attention that was given to colour and lighting during the shoot.

Out of the lab and onto the desktop

When I started my first job, desktop non-linear edit suites were only offline tools – in other words, their image quality wasn’t good enough to produce final masters. The emergence of desktop video is the subject of another series of mine, so if you’re interested in that period then please check out the series here. The TV commercial that I examined in the desktop video series marked my first experience with colour grading. The commercial had been shot on 35mm film, and initially all of the footage was transferred to tape using a fast “single light” process with no colour grading. Even so, it was the highest quality footage I’d ever worked with.
But once the edit was approved, the selected takes were fully colour-graded for the final master. The colour grading was done in a Pandora Pogle grading suite, direct from the 35mm negative using an Ursa Gold telecine. Sitting on the couch and watching the colourist manipulate the image was absolutely mind-blowing. It wasn’t just that I didn’t know how to do it, it was that I didn’t even know it could be done! I walked out of the colour grading suite in complete awe of the entire process. I was a colour grading convert.

A few weeks later I was back in the same grading suite, but this time we were working on a corporate video. Instead of grading directly from film, we were doing a “tape-to-tape” grade – manipulating the colours of a video that had been shot and mastered on Betacam SP. Someone told me that the ability to colour grade video was relatively new, but I don’t know exactly when the technology first became available. It was very expensive, but miraculous.

Curves. The learning kind, not the colour correction kind.

As I mentioned above, around this time desktop edit suites were only offline tools. In our case, the edit suite came bundled with After Effects for animated titles, as the text capabilities of non-linear suites were very limited. While these days After Effects is used for design, motion graphics, visual effects, compositing and even character animation, it’s funny to think that twenty years ago it was mainly sold to editors as a way to animate lower thirds!

After Effects 3.1 from around 1997. The good old days actually weren’t that good.

Because our edit suite was an offline tool, at first I didn’t consider the compositing capabilities in After Effects – if we needed compositing done then we went to either a Quantel or a Flame suite.
While it can be tedious to read about “the olden days”, it’s worth reminding younger readers that many compositing features weren’t originally included with After Effects, and if you wanted to do chromakey with version 3.1 you needed to purchase the “production bundle”, which was sold separately. But the early desktop video technology advanced very quickly, and over the next year our edit suite improved from being roughly VHS quality to being capable of Digital Betacam, broadcast quality outputs. At the same time, I dove head-first into After Effects and online communities, and basically learned as much as I could, as fast as I could. While a lot of that period is a bit of a blur to me now, a quick online search shows that by 2002 I’d figured out enough to write my first tutorial for the Creative Cow, using version 5.5.

The point of reminiscing about primitive technology from 20 years ago is to acknowledge that an entire generation of desktop video professionals basically learned on the job. Practically all of the tools and techniques that I use every day didn’t exist in the same way back in 1997. And before you think that’s a bit dramatic, let’s consider some of the more obvious features of After Effects that we take for granted. In After Effects 3.1 – the version I began with in 1997 – there was no RAM preview. You could only have one mask on each layer. There were no text layers – if you wanted to adjust the kerning of type, you needed to do it in Photoshop or Illustrator. There were no 3D layers, no parenting, no expressions, no scripts, no camera tracking. After Effects was only 8 bit, with no colour management. The CC plugins were sold separately, the keying plugins were sold separately, and – to bring us back to the main topic of the article – there weren’t a lot of colour correction tools.
In fact, if you’ve ever wondered why there are two different Hue Saturation plugins, it’s because After Effects originally only came with this one (on the left):

What has Photoshop ever done for us? The HSL filter on the left came with After Effects version 3, until the developers decided that they liked the Photoshop version better. They kept the old one for compatibility, and included the Photoshop-compatible one as well, starting with version 4.

Then someone at Adobe realized that the Photoshop team had written a Hue Saturation tool that was much better, so the After Effects team asked if they could share it because hey, everyone’s working together for Adobe. The Photoshop guys said only if the After Effects team washed their cars for a month; they refused, and eventually it was agreed to settle the matter with a game of poker. Luckily for us, the After Effects team won and left the Photoshop office with both their trousers and a brand new Hue Saturation plugin, which they included with version 4. Actually I made all of that up just in case you’re getting bored, but it’s worth knowing that although the older HSL filter is technically obsolete, it’s still useful because you can keyframe and add expressions to the individual parameters.

But the point is that an entire generation of early adopters developed their skills as the hardware and software developed too. As each new version of After Effects was released, and each new computer was faster and had more RAM, the quality and complexity of the projects being attempted in AE grew as well. Keeping pace with all of this was the quality and availability of online communities, and the steady stream of internet tutorials that taught new and exciting techniques. Over time, everything slowly improved – the software got better, the hardware got faster, and I became more experienced.
By the time I resigned from my first job in 2005, I had been routinely using After Effects to produce broadcast work and detailed visual fx composites for many years. But while my After Effects skills were developing, I still considered myself an editor, and when it came to compositing there was always an inferiority complex with the expensive Henrys and Flames down the road. In fact, the more complex and advanced my After Effects projects became, the more I worried about how inferior they were to what was being done in the traditional, expensive online suites. Probably the most striking aspect of this inferiority complex had to do with colour grading. Right from the time I started working to today, there’s been a pervasive sentiment that After Effects simply isn’t very good at colour grading. While this may have been understandable in 1997, it’s still common to work with people who dismiss the colour tools that come with After Effects – even though there are so many!

All the colour correction effects that come with CC 2018

While I was always interested in tutorials on colour grading, I noticed that many of them seemed to be written from the perspective of broadcast editors. Most focused on tools such as waveform monitors and vectorscopes, and emphasized topics such as broadcast limits and safe colours. It was easy to find tutorials on colour correction and matching one shot to another, but there didn’t seem to be any that discussed the creative aspects of colour grading. And possibly because After Effects didn’t have a waveform monitor or a vectorscope, it was easy for authors and people commenting online to dismiss After Effects as simply not being very good for grading. At the same time, whenever I was lucky enough to attend a colour grading session in a dedicated suite, I would always quiz the colourist for as much information as possible.
All the colourists I’ve ever spoken to said that they mostly ignored the waveform monitors and vectorscopes, which suggested to me that the approach to creative colour grading was different from the approach to technical colour correction. As After Effects continued to advance, and the list of colour correction plugins continued to increase, I generally stuck with the basic levels effect for colour correction – and then the Hue Saturation effect to boost the colours. I used the curves effect mainly for overall contrast, as I found the original version a little too clunky to precisely adjust colours, and the lack of a histogram also frustrated me. After a few years, I began using the “colour balance” effect more and more – but that was generally it.

This ad was on TV a lot around 20 years ago. I hated the way it looked – all the whites were tinted yellow – but because I’d come up with similar results when experimenting with the “tritone” plugin, I felt reassured that there were other people out there who were also just playing around with After Effects.

Because I was doing predominantly corporate work, most of the “grading” that I had to do was simply balancing black and white levels, matching contrast and then slightly boosting the saturation, but every so often a job would come along that allowed a bit more experimentation. A popular trend around the late 90s was to mostly desaturate the image and just tint the mid-tones (think of the “tritone” and “CC Toner” effects). A professional colourist told me that they always kept the whites white and the blacks black, and about the same time an unusually ugly TV commercial appeared that had all the highlights tinted yellow, which I took as a sign that the colourist was right.

Two worlds collide

At some point in the early 2000s, I was lucky enough to win a copy of the “composite wizard” plugins. While the plugins themselves were a valuable asset, what impressed me more was the manual.
The company that originally released “composite wizard” also made other plugins, including the Primatte keyer. All of the manuals for their products were printed together in a single book, so although I only owned “composite wizard” I also had the manual for Primatte. This proved to be a gold-mine of valuable information on keying and compositing, as it included a huge amount of information on how different keying plugins worked as well as advanced compositing techniques. While I’ve never owned or used Primatte, just owning the manual was enough to teach me techniques such as lightwraps, interlayer blurs, as well as different types of edge feathering, matte choking and so on. The upshot of all this is that as the complexity of my After Effects projects continued to grow, compositing and colour grading became part of the same process. When I first began using After Effects in 1997, I would consider a project to be complicated if it had more than three or four layers. But 10 years later, I would be building entire 3D worlds inside After Effects and a composition with 100 layers in it would not be unusual (but it would be slow…) One of the main purposes of compositing is to seamlessly combine images that come from different sources. Different layers in a composition can vary in many ways – resolution, sharpness, focus, noise & grain as well as colour. To blend disparate images together can mean removing grain from noisy images, adding grain to clean images, defocussing sharp images and so on. And of course, one of the biggest qualities to match is the colour – which has several different qualities including RGB levels, overall contrast as well as saturation. Anyone who’s ever worked with greenscreen footage will have quickly worked out just how different colours can be between different images. Most studio-shot footage is quite warm. Footage shot in front of a greenscreen will understandably be quite green. 
Even the simplest composite – a figure shot against greenscreen laid over a different background – can be instantly improved just by balancing the colours between the two layers. But in addition to the technical process of colour matching different layers, there’s also the creative process of adding an overall colour grade. Grading terms can be emotional – warm, cold, romantic and so on – or simply descriptive – moonlight, tropical sunset, foggy.

This is an example of a simple composite with two layers. (left) Raw footage, showing very different colour balances. (centre) Using only the levels effect, both layers now have matching black and white points and gamma levels. (right) A creative night-time colour grade has been applied.

Until the Xian project, I had been combining everything together – each individual layer in a composite would have effects applied to both colour correct AND colour grade the image. Whenever multiple layers interacted, such as with shadows, lightwraps or atmos effects, the creative colour grade would be part of those layers. This was simply the logical outcome of years of development – software, hardware and personal development. But it’s not the best way to do it.

House of cards

The problem with this approach is that the composite becomes so complex that the way layers interact with each other becomes difficult to control. Getting a perfect result usually means all the colour settings are balanced on a knife edge. Adjusting one layer can throw another layer out, or the entire composite just looks slightly off but you really don’t know why. And if the look isn’t right or needs adjusting, then you quickly find yourself back at square one – having to adjust ALL the layers again. If you’re working on multiple shots in a scene then keeping the look consistent between different composites can also become difficult. Combining compositing and colour grading into one process creates a house of cards that can quickly fall apart.
Of course, that’s not to say it can’t be done, and I had been doing it for years. With simple composites of only a few layers, it’s not too difficult to pull it all together. And to be honest, even now I’ll still put together simple composites in one After Effects composition. But beyond a certain point – especially where the expectation is photorealism, seamless integration, or consistency across multiple shots – it’s going to cause problems.

Notes from a Small Island

It was my work colleague who put me on the right path, having just returned to Australia from a period with Weta Digital in New Zealand. Based on the workflows they used there, he looked at the work I had been doing on Xian and made a number of suggestions. In short, the best approach for high-end compositing is to keep the worlds of compositing and colour grading separate. Instead of baking a particular colour “look” into each layer of the composite, the idea is to create a neutral composite with as much detail as possible, and then colour grade separately. While I’m sometimes ambivalent about the difference between the terms “colour correction” and “colour grading”, here the distinction is important. Colour correction is used in the compositing process to balance all of the different elements together, but not to give them a distinctive colour look. The aim is to maintain as much detail as possible, and to bring the layers together in a “neutral” state. As stated above, this not only includes colour but also things like noise, grain, sharpness and even movement. While the levels plugin can be used to bring black and white points into alignment, the saturation of all layers should also match, as well as the overall contrast. The more consistent each shot is, the easier it is to apply a consistent colour grade across multiple shots.
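The levels-based balancing described above can be sketched in a few lines of Python. The parameter names are mine rather than Adobe’s, but the transfer function itself is the standard Levels-style remap: stretch the input black and white points to the output range, with gamma bending the mid-tones.

```python
def levels(value, in_black, in_white, gamma=1.0, out_black=0.0, out_white=255.0):
    """Levels-style transfer function: remap [in_black, in_white]
    to [out_black, out_white], with gamma shaping the mid-tones."""
    x = (value - in_black) / (in_white - in_black)
    x = max(0.0, min(1.0, x))      # clip values outside the input range
    x = x ** (1.0 / gamma)         # gamma > 1 lifts the mid-tones
    return out_black + x * (out_white - out_black)

# A layer with milky blacks (30) and dull whites (235) is stretched to
# full range, so its black and white points match the other layer's.
print(levels(30, in_black=30, in_white=235))    # 0.0
print(levels(235, in_black=30, in_white=235))   # 255.0
```

Applying the same kind of remap to every layer, per channel, is the “neutral state” in a nutshell: nothing stylised, just consistent black points, white points and contrast.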
In many ways this is a return to the way things were done before desktop video emerged – as I demonstrated in my series on the desktop video revolution, colour grading and compositing weren’t just different skills, they were done by different companies in different buildings. It’s also how things are still done on large scale productions. In the world of feature films and TV series, the visual fx and compositing work might be spread out across multiple companies all around the world, while colour grading has always been a distinct, final stage. All finished composites will be assembled together into a “Digital Intermediate”, which is the complete but ungraded film. The DI is then graded – and sometimes upscaled at the same time – to produce the final “master”. The grading will take place in a single, dedicated colour grading studio – usually with the Director and Director of Photography present.

While anyone who routinely works in the film and TV world may think this article is a case of stating the obvious, there are many more After Effects users out there working in different fields, who are responsible for outputting final deliveries, and to whom this two-step approach may not seem so clear. Certainly I was one of them, and the same can be said for most of the people and companies I’ve worked with. There’s a remarkably large world of video production outside of Hollywood and broadcast TV. While there are many ways in which the workflows for smaller corporate projects can be very different to large scale Hollywood blockbusters, there are also many cases where the basic steps are the same.

Solid Advice

Following the lead from Hollywood, and keeping compositing and colour grading separate, was a significant change from the way I’d been working, and it helped to increase not only the quality of my work but also the speed. But there was one other significant lesson that I learned on Xian, relating to exactly how the colour grading was done.
This DaVinci Resolve grading suite from the Blackmagic website is pretty awesome. It’s easy to look at a room like this, compare it to the laptop in your bedroom running After Effects, and be convinced that After Effects simply “isn’t very good at grading”.

Dedicated colour grading suites are often impressive and intimidating. Even with 20 years of technological advancements, the colour grading suites from the 1990s would be recognisable to anyone in a modern DaVinci suite. While it’s been many years since I attended a colour grading session in a dedicated suite, the promotional videos and photos on the Blackmagic website suggest that they still look like the flight deck of a jet airplane. As I mentioned above, there’s always been a pervasive sentiment that After Effects just wasn’t very good at colour grading. When comparing After Effects 3.1 to a Pandora Pogle in 1997, this seemed like a pretty fair assessment. But by 2011 After Effects had 16 and 32 bit colour modes, colour management, and a whole host of colour plugins. Was it still a fair call? Again, looking at a modern, dedicated colour grading suite suggests that grading requires all sorts of specialized, precision tools, while some of the plugins that come with After Effects seem a bit dated.

While I was used to hearing that After Effects wasn’t good at colour grading, I never felt like I was in a position to judge for myself. From what I could see, all of the necessary tools were there – even if they didn’t look as fancy as those in a DaVinci suite. But because colour grading had such an air of mystique around it, I simply didn’t know. I assumed that there must be some critical functionality or specialised tools that dedicated suites have, which After Effects didn’t, and that therefore it just wasn’t “good enough”. So I was interested to see that nearly all of the grading in the Xian project was done with nothing more than coloured solids.
My work colleague – the one who’d just returned from Weta – would mock up frames in Photoshop, and then I’d use them as style references in After Effects. The final colour grade didn’t require dedicated grading software, or a specialized studio, or fancy graphs, or a waveform monitor, or any other piece of expensive equipment. All it took was a bunch of heavily feathered masks on coloured solids, combined with a few blending modes and generally low opacities. In the years since Xian, I’ve noticed more and more that creative colour grading doesn’t have to rely on plugins at all – some artfully arranged solids and some suitably feathered masks can transform basic images into beautifully stylised art.

Seeing the light

The reason the Xian project was such a personal milestone is that it fundamentally changed the way I had been working. In 1997, the first corporate video I worked on was fully graded by a completely different company. When we mastered out the approved edit, we called it a “submaster”. This was taken to a dedicated grading suite – along with an EDL – and it was the final graded version that was given the label “Master”. But after that, the desktop video revolution meant videos could be completely produced – from offline to online, including compositing, animation, motion graphics and visual effects – on a single computer. In my case, this meant that the traditionally distinct role of colour grading became jumbled up into a single, sometimes convoluted, digital process. But to manage the complexity and high expectations of the Xian project, I followed the guidance of my colleague and returned to the traditional approach of creating a neutral submaster – or intermediate – and then colour grading separately from compositing. While this was hardly new – indeed it’s the conventional approach – re-discovering the significance of creative colour grading, as opposed to colour correction, felt like a huge revelation.
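To make the solids-and-blending-modes idea concrete, here’s a single-pixel sketch in Python of roughly what a blending-mode composite computes when a coloured solid sits over the image at low opacity. The function names are mine, and a real feathered mask would vary the opacity per pixel; this just shows the arithmetic.

```python
def blend_multiply(base, solid):
    """Multiply blending mode for 8-bit RGB values."""
    return tuple(a * b / 255.0 for a, b in zip(base, solid))

def grade_with_solid(base, solid_colour, blend, opacity):
    """Blend a coloured solid over a pixel, then mix the result back
    with the original according to the solid's opacity."""
    blended = blend(base, solid_colour)
    return tuple(round(a + opacity * (b - a)) for a, b in zip(base, blended))

# A cool blue solid, multiplied at 30% opacity, nudges a warm pixel
# towards a night-time look by darkening the reds and greens most.
print(grade_with_solid((200, 180, 160), (120, 150, 255), blend_multiply, 0.30))
# (168, 158, 160)
```

Swap in a different blend function, colour and opacity, and feather the mask that drives the opacity, and you have the essence of the grade – no plugins required.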
In some ways I ended up back where I started, but as with any good road movie – it’s about the journey, not just the destination. If you missed part 1 on Xian, you can catch up here. And if you enjoyed this article, please check out my other ProVideo Coalition posts! The post AE Project Diary: 6) Xian part 2. Color Grading appeared first on ProVideo Coalition.