Seven, with Poppy Ackroyd.

A few weeks ago I shot this video with Poppy and John at the Whitespace Gallery, in Edinburgh.

Seven from Poppy Ackroyd on Vimeo.

I shot it with available light, with the GH1 mainly handheld, wide-open with the Nikkor AF-D 50/1.4. I think we ran through the track four times, to make sure I’d have enough redundancy to make a decent cut of the track.

As per usual, I edited in Final Cut Express 4.0.1. I decided to transcode the AVCHD footage to the Apple Intermediate Codec (AIC) for simplicity, and since I’d exposed the footage on the bright side, I didn’t think I would lose any dynamic range in the conversion. I believe I was right, even though the whites did display some banding, both before and after transcoding.

The colour grading was done using free filters. To do this, I make a new Sequence, drop the edited sequence into the new sequence’s timeline, and do all of my grading on this encapsulation of the edit, which keeps the cut itself editable. First I increased the exposure with Image Control/Brightness. Then I used CHV/Silk and Fog, with the radio button set to “silk”, to put a subtle glow on the highlights, which made the footage look a bit more filmic. Next I used CoreMelt’s Pigment RGB Levels and Curves to adjust the exposure curve: this is – so far – the closest I have found a video filter come to working like Photoshop’s curves, though it is still limited. Finally I converted the image to B&W using TMTS Color’s Black & White, which gives you a proper RGB mix in the monochrome conversion, so you can make skin tones look smooth.

I’m quite pleased with how the high-key look came out, using only available light and manual controls. If you use Final Cut, you might as well download these free plugins, as they do seem to perform pretty well.

And of course, look out for more from Poppy HERE.

Tobias Feltus:

Using your hacked GH1 with Final Cut Express (and fighting colour shifts).

Working on the music video for Herself’s Here We Are was the first time that I attempted to streamline my hacked GH1’s workflow. Since my struggles dragged out into several days of trial and error, I hope that my findings will be helpful to others.

I chose to shoot the video in AVCHD 1080p50 because my current firmware produces ridiculously heavy MJPGs which have an excellent dynamic range, but are not practical to work with, as I can only get just over 3 minutes’ footage on an 8GB card: limitations which remind me of Super8. By contrast, my AVCHD *.MTS files seem to have an average bitrate of around 14 Mb/s, similar to the original settings. I also chose to work with Final Cut Express 4, as I own a copy and am content to make fun things with the pauper’s tools.
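Taking those figures at face value, a back-of-envelope calculation (assuming “just over 3 minutes” means about 190 seconds, and decimal gigabytes) gives the bitrate the MJPEG mode implies:

```shell
# Rough MJPEG bitrate implied by "just over 3 minutes on an 8 GB card".
# FOOTAGE_SECONDS is an assumption, not a measured figure.
FOOTAGE_SECONDS=190
CARD_BITS=$((8 * 1000 * 1000 * 1000 * 8))  # 8 GB expressed in bits
echo "$((CARD_BITS / FOOTAGE_SECONDS / 1000000)) Mb/s"
```

That works out to roughly 336 Mb/s – about 24 times the AVCHD bitrate – which explains why the MJPGs are impractical.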

The camera was set up in front of tungsten lights, with a lovely Zeiss prime. No fancy CP or ZF glass, but rather the cheapest prime on the market: the M42 Tessar 50/2.8. Frown? No, it is one of the sharpest pieces of glass out there, and can be picked up for the price of a couple of pints of beer. It was never favoured because it is not particularly bright, but if you don’t intend to shoot at a wide aperture, there is little point in spending a fortune on something only just as good. The video was shot at f/5.6, focussed at 0.7m.

So, the first thing that you need to do is follow this link, and download Panasonic’s QuickTime plugin. This will allow all of your software to read the camera’s *.MTS files without the need to transcode them (so yes, don’t ‘log & transfer’ the footage in Final Cut Express). The plugin gets installed in YourDrive/Library/QuickTime (not your UserName’s Library), and you will need to restart your machine before it becomes accessible. Copy the AVCHD folder from your card to a hard drive and call it something like “avchd archive”. The folder structure of AVCHD is a bit confusing: it is essentially a playlist of individual files, which only becomes useful when a clip runs over the 4GB file-size limit of a FAT-formatted card and has to be split. Everything you should really need is contained in AVCHD/BDMV/STREAM, and all the files will have a *.MTS suffix.
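If you prefer doing the copy in Terminal, a quick sketch – the card path here is hypothetical, and on Panasonic cards the AVCHD folder usually sits inside PRIVATE:

```shell
# Hypothetical card mount point and archive destination; adjust for your setup.
CARD="/Volumes/CAM_SD"
DEST="$HOME/Movies/avchd archive"

# Copy the whole AVCHD folder off the card (uncomment to run for real):
# mkdir -p "$DEST" && cp -R "$CARD/PRIVATE/AVCHD" "$DEST/"

# The clips themselves all sit in one folder:
STREAM_DIR="$DEST/AVCHD/BDMV/STREAM"
echo "Look for *.MTS files in: $STREAM_DIR"
```

Keeping the whole folder (rather than just the STREAM contents) preserves the playlist structure in case a long clip was split across files.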

Now, I decided to convert my *.MTS files to ProRes 422 HQ using MPEG Streamclip (which you can download for free). After editing I found that I could also have used the *.MTS files directly in Final Cut Express’s timeline, and I am not sure whether there is a disadvantage to doing this. The reason I started exploring these codecs is that my initial import, Logged and Transferred in Final Cut into its Apple Intermediate Codec, looked washed out and shifted in tone when I exported the edit from Final Cut. ProRes is a QuickTime codec which you should be able to find on the internet; the main download from Apple (here) appears to require one of their Pro Apps to run, but I shall leave this detail to you, if you need it.
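For those who prefer the command line, a hedged alternative to Streamclip is ffmpeg’s prores_ks encoder (profile 3 is 422 HQ). This sketch only prints the command for a clip, so you can inspect it before running anything; the filename is a placeholder:

```shell
# Dry run: build the ffmpeg command to transcode one *.MTS clip to
# ProRes 422 HQ (prores_ks profile 3), keeping audio as uncompressed PCM.
# The function prints the command rather than executing it.
transcode_cmd() {
  printf 'ffmpeg -i "%s" -c:v prores_ks -profile:v 3 -c:a pcm_s16le "%s"\n' \
    "$1" "${1%.MTS}.mov"
}
transcode_cmd "00000.MTS"
```

Wrap it in `for f in *.MTS; do … done` (and actually execute the command) to batch a whole STREAM folder.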

I found that editing the footage was smooth: the ProRes, Apple Intermediate Codec and raw *.MTS files all play back without needing to be rendered. I did a tiny bit of grading before exporting, but found that no sharpening was needed, as it destroys that film-like quality you can get with a good lens. The main issue I had was with the low quality of the native Apple Intermediate Codec, which was perfectly good for HDV footage but seems to lose a lot – especially in the shadows – with my current setup. Assuming the quality of each of your clips is the same, do the grading by making a new Sequence, placing your edit on its timeline, and then applying your filters to this new Sequence.

[Comparison frames: Apple Intermediate Codec vs ProRes 422 HQ]

Ranting Addendum: I still feel that many aspects of working in time-based editors (audio and video) are kept obscure in order to preserve the talents of old practitioners and keep their ‘secrets’ magical. For some reason I was unable to find any tutorial on exactly what I was trying to do, and colour grading in Final Cut Express bears almost no resemblance to how one works with an image in Photoshop or Aperture. To me the important thing is to have a clear ‘vision’ of what you are after, as tinkering can be slow and clumsy. When you are happy, you then need to export a ‘Master’ file.

Now, one often-debated issue with any image editing is monitor calibration. Since most of our output will end up online or on a DVD, viewed on a TV or projector, a “broadcast” calibrated monitor really should not be necessary, as every future output device will be different. Ideally you want your monitor calibrated so that you are happy with the way things look in a “correct” manner, and hope that future viewers’ displays are close enough to also look good. Forums are littered with ‘pros’ saying that only broadcast monitors can show the right colours, to which I raise some fingers. Of course they have an opinion, but what are ‘real’ colours? The only time they exist in digital photography is in the interfacing between display and print. There is no right and wrong. The LaCie display at the lab is calibrated to reflect the Epsilon’s print output accurately, and that is all that matters. I can’t get mine to match the lab’s, which also does not really matter, as final proofing is always done on that display.

Now that my rant is over… Final output is, once again, a big issue. From what I have found, there are differing gamma settings between Final Cut’s Canvas and QuickTime, among others. I was able to get my ‘Master’ export to look like the Canvas by exporting it in ProRes 422 (again, an AIC export shifted drastically in colour), but it took a lot of work to figure out how to compress the ProRes file for web use with the same colours. I can’t stress enough how important it is to make multiple saves at this stage, as something went wrong the other day and I lost a lot of work on the grading of the video.

The solution I have come to is the following:
1. Calibrate your monitor – I used my Spyder 3, and went for a ‘normal’ setting of 2.2 gamma and a white point of 5000K, as this is what I normally use for print output. Using this, do your grading visually in Final Cut Express. Even though calibration shouldn’t make any difference for web output, it does give you a known input colour profile to work with.
2. Export your Master as ProRes 422 HQ. I left the audio settings as AAC 44.1kHz 320kbps HQ. In the Filter field I set ColorSync with my current monitor profile as the input and sRGB as the output. This gave me a very close match.
3. QuickTime’s own H.264 encoder just did not seem to work. I have the feeling that it is designed for ‘normal’ colours, and hence changes anything else erratically. After several days of mucking around and being very frustrated, I found this blog, which recommends using another QuickTime component called x264, which you can download here. Again, this gets installed in YourDrive/Library/QuickTime. You should now be able to open your ProRes master in QuickTime Pro and export it as a QuickTime movie, setting the encoder to x264 and limiting your bandwidth if needed. I found that this, finally, produced good colours. Remember to check in the plugin’s Settings that you have the correct framerate; I ticked Add Gamma 2.2, set b-frames to Optimal, and enabled Use 3rd Pass.
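If you would rather sidestep QuickTime entirely, the same job can be sketched with ffmpeg’s libx264 encoder, tagging BT.709 colour metadata so players interpret the colours consistently. The function below just prints the command (a dry run), and the filenames are placeholders:

```shell
# Dry run: build an ffmpeg command that compresses the ProRes master for
# the web with libx264, tagging BT.709 colour metadata. Prints the
# command rather than executing it; paths are placeholders.
encode_web_cmd() {
  printf 'ffmpeg -i "%s" -c:v libx264 -crf 18 -pix_fmt yuv420p ' "$1"
  printf -- '-color_primaries bt709 -color_trc bt709 -colorspace bt709 '
  printf -- '-c:a aac -b:a 320k "%s"\n' "$2"
}
encode_web_cmd "master_prores.mov" "web_h264.mp4"
```

CRF 18 is a conservative quality setting; raise it (or add a bitrate cap) if you need smaller files.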

During the compression nightmares I wondered whether I should have spent the £35 and bought Compressor: economically it would have been the cheaper option, but now maybe I have written something that can help someone else save the cash. Never having used Compressor, I don’t know whether it would let me preview the output accurately and, most importantly, choose which parts of the image are compressed most heavily. H.264 compression is good – it is, after all, how digital video is now spooled to our TVs – but it is not really able to display colour gradients smoothly. All of my work is subtle, and often dark: frustratingly, I think this makes me more prone to noticing the rainbowing in gradients and the blocking of shadows which, probably, no one else will care about.

The video of Here We Are can be viewed HERE.
