(If you’re just here to find the tool, scroll straight to the bottom. If you want a little bit of the history of how this tool came to be, read on.)
Over the years, I’ve tried my hand a few times at getting better relighting results in Nuke.
As much as I love working in Nuke, you can’t really say it has ever been good at rendering CG. It can do a great job now with plugins like Vray or Arnold for Nuke, but these can get a bit pricey. Foundry added a half-baked raytrace node a couple of years ago, but it doesn’t support much except pure mirror reflections. And while rendering in Nuke has never been great, relighting has been worse, always requiring complicated setups to obtain anything usable.
Of course, filtering issues have always broken relighting tools, and the tool I will share below is no exception. Motion blur, anti-aliasing, or depth of field will introduce artifacts, because filtered pixels blend values from different surfaces, and lighting computed from a blended normal or position corresponds to no real surface. The technique could probably be implemented in C++ to work on deep samples, which would improve results. Maybe I will attempt that myself in the future, but it is a bit beyond what I was hoping to achieve here.
My relighting journey started in 2014, when Foundry introduced Colorway. It had some cool relighting tricks, I think mostly based on AOVs. When I saw the demo, it was fairly obvious how a lot of it could be achieved in Nuke, but a few things baffled me. In one scene, they changed the texture on one object, and the change showed up in another object’s reflection. That was pretty neat. At that time I started playing with outputting a lot of non-standard AOVs from Vray to see what I could cheat in Nuke. Rendering a reflection of UV maps let me change textures in reflections, but it was limited to one ray bounce and couldn’t support roughness (I’m not sure Colorway did either; I never tried it myself). One thing that came out of it was that I started putting STMaps in environment maps and rendering a reflection pass of that, so that I could change environment reflections in comp.
Again, this took additional time to set up, additional time to render, and couldn’t handle roughness well (only by blurring the reflection map; some tools on Nukepedia do that quite well).
About a year ago, I started putting all this together and trying to ditch the reflection pass in favor of calculating it on the fly in Nuke. I wrote some expressions that would calculate an STMap based on camera normals, a pass often included in renders, and used an STMap node to sample the right area of an HDRI. It was promising, but a bit limited. I also ran out of free time and ended up never finishing it.
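In case you’re curious how that normals-to-STMap trick works, here is a minimal BlinkScript sketch of the idea (my own reconstruction, not the original expressions; the kernel name and the lat-long conventions are assumptions). It converts a normals pass into UV coordinates that a downstream STMap node can use to look up a lat-long HDRI:

    // Hypothetical sketch, not the original setup: convert a normals pass
    // into lat-long UVs so a downstream STMap node can sample an HDRI.
    // Assumes src holds normalized world-space normals in rgb, oriented the
    // same way as the HDRI; for reflections you would first reflect the
    // view vector through the normal and feed that direction in instead.
    kernel NormalsToLatLongST : ImageComputationKernel<ePixelWise>
    {
      Image<eRead, eAccessPoint, eEdgeClamped> src;  // normals pass
      Image<eWrite> dst;                             // STMap output (uv in rg)

      void process() {
        float pi = 3.14159265f;
        float3 n = normalize(float3(src(0), src(1), src(2)));
        // Azimuth drives u, elevation drives v, both remapped to 0-1
        float u = atan2(n.x, -n.z) / (2.0f * pi) + 0.5f;
        float v = asin(clamp(n.y, -1.0f, 1.0f)) / pi + 0.5f;
        dst() = float4(u, v, 0.0f, 1.0f);
      }
    }

Plugging this output into the stmap input of an STMap node, with the lat-long HDRI as the source, then gives you an environment lookup per pixel.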
Fast forward to 2019: I never really managed to get this idea out of my head. I started learning Unreal Engine and was amazed by the results coming out of it in real time on my old, cheap laptop (I’m not talking about the new RTX ray tracing here, just the “regular” Unreal graphics). I started reading more about real-time PBR lighting. It turns out that these shaders cheat a lot: many shortcuts are taken and physical properties are simplified to make the code run as fast as possible, but in the end it still looks quite good.
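To give one concrete example of the kind of shortcut I mean (my illustration, not something pulled from Unreal’s source): real-time shaders typically replace the full Fresnel equations with Schlick’s approximation, a cheap polynomial that is close enough for rendering. In BlinkScript-style code, as a helper you might define inside a kernel, it looks like this:

    // Schlick's approximation to Fresnel reflectance, a classic real-time
    // PBR shortcut. F0 is the reflectance at normal incidence (roughly
    // 0.04 for common dielectrics) and cosTheta is dot(N, V).
    float3 fresnelSchlick(float cosTheta, float3 F0)
    {
      float f = pow(1.0f - cosTheta, 5.0f);
      return F0 + (float3(1.0f, 1.0f, 1.0f) - F0) * f;
    }

A handful of cheap operations instead of the exact physics, and the difference is rarely visible.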
I didn’t know much about how to achieve that in Nuke, but I found this website: https://learnopengl.com, which breaks it all down fairly well. I read the whole thing multiple times, often googling parts that I did not understand, before I felt ready to give it a try myself in Nuke. A few weeks ago I took the plunge and started piecing it together little by little, using pieces of my old R&D to stand in for the parts I had not implemented yet. At the beginning it really felt like I had no idea what I was doing. It took a lot of trial and error, and some help from people on LinkedIn when I felt like I was getting stuck (thanks to Sam Hodge and Patrik Hadorn for sharing their experience and links to papers), but eventually it started to look pretty cool.
I’m changing jobs next week and I am not sure how much time I will have to play with this in the near future. While it’s achieving decent results, it’s in no way production-ready. It has a few quirks, like requiring you to enter the resolution in the BlinkScript node, having to use the same resolution for the different maps, and having to enter the camera position in a knob instead of just using a camera input. I ran out of time, and this was a learning exercise anyway, so I got what I wanted from it (the learning). I wasn’t originally planning on releasing it, but judging from the feedback on LinkedIn, a lot of people might be interested in giving it a try, so you can get it here: PBR Relighting Gist.
This is the exact setup (at 720p) that I used to produce the video above. I also included a gizmo that I used to make the spherical textures, since you would need it to obtain the same result, and a link to the HDRI, courtesy of Sam Hodge.
Have fun playing with it, and if somebody brings it up to production-ready quality, let me know; it would be cool to see.