I worked on the finishing touches for the comp. I added smaller details like chromatic aberration and film grain, and added motion to the plate using subtle camera shake and a match-move to transfer the camera motion from a plate I shot. My results, my breakdown, and my Nuke tree can be seen below!
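For anyone curious what the chromatic aberration pass is doing underneath: the idea is just offsetting the red and blue channels in opposite directions, which in Nuke you'd typically do with a pair of Transform nodes per channel. Here is a toy one-scanline sketch of that idea (not my actual Nuke setup; the wrap-around is just to keep the example simple):

```python
def chromatic_aberration(red, green, blue, shift=1):
    """Offset the R and B channels of a scanline in opposite directions
    to fake lens fringing. Toy model of a per-channel Transform; values
    wrap around purely for simplicity."""
    r = red[shift:] + red[:shift]        # shift red one way
    b = blue[-shift:] + blue[:-shift]    # shift blue the other way
    return r, green, b
```

In a real comp you'd scale the shift with distance from the optical center so the fringing only appears toward the frame edges.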
This week, along with getting the new pillar shaders in shot, I was able to make some more adjustments to the comp. It was suggested that I add some movement to the plate, so I filmed some footage and used a match move to apply that motion to my still back plate. I also tweaked the shaders a little bit to adjust the dust/dirt level. The results of this week can be seen below.
I worked on adding some detailed maps to the car shaders to make the car appear more dusty/dirty. To create the maps, I used Substance. This was the first time I had ever used the software, and I am super impressed with its capabilities. I was able to get some nice results despite my inexperience. We also decided to make the car red to help it pop against the cool blue background. I received the first renders of the tire burn smoke from Will this week and did a test comp of the effect in the shot. Here, you can see the shaders in motion and the test with the effect. Underneath the video, I will include the maps I used in the diffuse and specular channels to give the car the appearance of dust.
Substance Maps for the Car
The maps below have been used as a mask in a layered texture for the diffuse color (to switch between red and dust), and have also been remapped and used in the specular roughness channels.
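For anyone curious what "remapped" means here: it's the same linear remap that Maya's remapValue node performs, squeezing the full 0 to 1 range of a mask into a narrower roughness range. A minimal sketch (the 0.3 to 0.7 range is a made-up example, not my actual values):

```python
def remap(value, old_min, old_max, new_min, new_max):
    """Linearly remap value from [old_min, old_max] to [new_min, new_max],
    like Maya's remapValue node (example ranges only)."""
    t = (value - old_min) / (old_max - old_min)
    return new_min + t * (new_max - new_min)

# e.g. push a full-range dust mask into a narrower roughness band
print(remap(0.5, 0.0, 1.0, 0.3, 0.7))  # 0.5
```

Keeping the remap in the shading network (rather than baking it into the texture) makes it easy to dial the dust level without a round trip to Substance.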
I was also able to use Substance to look dev the pillars that come bursting through the ground. I used a basic concrete material and altered the parameters to get more variation in the maps of each pillar. Two of my shaders can be seen applied to the models below. There are an additional two variations, but to preserve detail, I filled the frame with two pillars instead of four. The maps for one of my pillars can be seen below.
Substance Maps for the Pillars
This week I worked mostly on my car shaders, as well as a test comp of the shaders in shot. Overall, I created 19 different shaders for various parts of the car. The turntable of my car can be seen below, along with the material attributes for each shader. In the upcoming week, I plan to bring the car into Substance Painter and start creating some maps to give the car a dirtier/dustier look.
Car Turntable w/ Shaders
Once my shaders were in a good spot, I rendered out a single frame with some basic layers and AOVs to get a comp started for the final shot. Luckily, the lighting and camera match I did for my white ball was close enough and gave me a good start for the color corrections on my car renders. The result, breakdown, and Nuke tree of my first test comp can be seen below.
Comp Test: Final Shot
I dedicated the beginning of this week to tracking the camera in the drone shots for our first driving sequence. I was able to get my solve error down to 0.45, and the cube geometry seemed to be sticking well in my Nuke 3D scene. However, when I exported the camera/scene from Nuke and imported it into Maya, I was getting a camera with a focal length of 1.9 (which I didn't even know was possible, since the Maya camera doesn't allow the user to set a focal length value lower than 2.5!). Additionally, the perspective in Maya was very off and did not match my Nuke 3D scene at all. I realized this error was due to not accounting for the drone's sensor size. Since the drone uses a micro-sensor (CMOS 1/2.3"), I needed to convert the focal length to its 35 mm equivalent. To do this, I used a resource Prof. Bridget provided (https://www.digified.net/focallength/). Once I calculated the conversion and re-tracked my footage in Nuke, the perspective error was fixed! The initial perspective discrepancy as well as a playblast of my camera track can be watched below.
Nuke vs. Maya
Camera Track Playblast
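The conversion itself is just a crop-factor multiply based on sensor diagonals. Here is a small sketch of the math behind that calculator (the 1/2.3" sensor dimensions and the 4.5 mm lens below are typical published values used as an example, so check your drone's actual specs):

```python
import math

# Full-frame (35 mm) sensor is 36 x 24 mm.
FULL_FRAME_DIAG = math.hypot(36.0, 24.0)  # ~43.27 mm

def equivalent_focal_length(focal_mm, sensor_w_mm, sensor_h_mm):
    """Scale a lens's physical focal length by its sensor's crop factor
    to get the 35 mm equivalent."""
    sensor_diag = math.hypot(sensor_w_mm, sensor_h_mm)
    crop_factor = FULL_FRAME_DIAG / sensor_diag
    return focal_mm * crop_factor

# A 1/2.3" sensor is roughly 6.17 x 4.55 mm (assumed example values).
print(round(equivalent_focal_length(4.5, 6.17, 4.55), 1))  # ~25.4
```

This also explains the absurd 1.9 mm camera in Maya: that was the tiny physical focal length of the micro-sensor lens coming through without the crop-factor conversion.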
When we reshot on Saturday, Will was able to bring along his friend from the film department who shot some plates using his drone. I am SUPER excited to work with this footage. It looks really cool and I think it will greatly contribute to the dynamic and cinematic quality of our commercial. The most recent cut of all of our plates can be seen below. *NOTE! The plates have not been properly color corrected yet, so they are mismatched.
I also cleaned and stitched all the HDRIs together from our shoot. I took 3 different HDRIs for the first sequence to increase the information for reflections. The last HDRI from sequence 1 can also be used for sequence 2 (the corner where the car turns). I took a final HDRI in the parking lot for the last shot. The results of my HDRIs are below. Keep in mind these are the 8-bit PNG versions of the 32-bit radiance files, so the color space is a little different.
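The color difference comes from the transfer function: the radiance files store linear-light values with highlights well above 1.0, while the PNGs are clamped and sRGB-encoded into 8 bits. A quick sketch of that encoding, using the standard sRGB formula:

```python
def linear_to_srgb(x):
    """Standard sRGB transfer function: encode a linear value for display."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055

def to_8bit(linear_value):
    """Clamp and quantize a linear HDR value to an 8-bit sRGB code value."""
    # The clamp is where HDR highlight detail is lost in the PNG preview.
    clamped = min(max(linear_value, 0.0), 1.0)
    return round(linear_to_srgb(clamped) * 255)
```

So a 5.0 sun value and a 1.0 white wall both land on code value 255 in the preview, which is why the PNGs read differently from the radiance files in the lighting setup.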
Camera and Lighting
Additionally, I matched the camera for our final parking lot shot. I used the old camera matching method of matching with a box instead of camera tracking because our final shot is a still plate. I was able to get a good match and start on the lighting match as well. I'm not super satisfied with the lighting match just yet, as I had to finesse it a little bit more in comp than I would have liked, but it's a good start. I will be tweaking the lighting in Maya a little more so that the renders match more closely from the get-go. My camera and lighting matches are shown below.
Since we were unsatisfied with our results from both of our shooting sessions at the dirt road location, we decided to scout a new location that offered more visual variety and related to our concept more. I suggested going to the road under the Talmadge bridge because there are a lot of pillars (to reinforce our concept) and opportunities for better compositions. We arrived at the location this morning to do some reference shots and we plan on shooting our plates tomorrow. The lighting around 7:30 am was really beautiful, so I think when we shoot tomorrow, we will arrive early, set up quickly, and get our plates done in this golden hour (really about 30 minutes) of sunlight. That was part of the reason we scouted today, so we could plan and work quickly tomorrow. The roads in this location were flatter, so the shots from Aidan's car were MUCH more successful. We also used a wider-angle lens to reduce the motion even more. I think these shots work a lot better and I'm excited to shoot the plates and HDRs tomorrow. Our results can be seen below.
Also, tomorrow we will be taking some drone videos as either B-roll shots, or if we have time, as an additional shot with our car and effects. I'm really excited about using the drone to shoot, so hopefully we will get some really cool shots!
We pitched our idea to The Mill and they liked our idea; however, their biggest critique was that our plates were not interesting enough. We expected this feedback, as we knew they were mostly just tests. We went out today to try and get some more interesting shots/camera moves. We also decided to change our commercial to two shots so each artist could focus on one shot each and really refine/perfect those shots, so we cut Shot 1. For Shot 2, we wanted to make it more dynamic, so we tried shooting out of the back of Aidan's car with Timmy holding the camera on a shoulder rig. For the last shot, we moved the spin out to a parking lot near our road with some foundation pillars, to help give context to our effects and add interest to the shot. Here are our results (references with Will's Jeep and clean plates).
Ultimately, after unsuccessfully trying to stabilize the footage, we decided that we will need to reshoot or decide on a different approach for this shot. We were able to get a decent track off of the footage, but without the stabilization and with the addition of our animation and effects, this shot would be WAY too busy, so we're going to reshoot again on Wednesday.
Today, my group and I went to our filming location to shoot some plates. In the first shot, we wanted to create a low-angle establishing shot near the car's back tire, so we could show off the dust effects Will and Aidan plan to create. The car would then speed off into the distance. In the second shot, the car would be moving towards the camera, swerving left and right to avoid the pillars that are shooting through the ground. Then in the third shot, the car comes in from frame right and spins out, covering the frame in a dust cloud. Here's the reference (with Will's Jeep). Timmy and I both set up the camera and composition.
Then we shot plates for each shot and I was able to process them, get a camera track, export a camera to Maya, and animate some previs geometry. The track is VERY rough and I did not refine it at all, so there is sliding. However, we know these plates won't work for our final shots, since the weather was not ideal, so this was just for a test anyway.
Plates, Tracking, and PreVis Test
This week, we got our groups for project 1. The purpose of this project is to create a car commercial and integrate it into a live action plate. I will be working with Timmy, Aidan, and Will for this assignment. We got together this week to come up with a concept.
After browsing through some car commercials, Timmy found a Lamborghini commercial that we all really liked. We decided to draw inspiration from these shots and effects while forming our concept.
What we liked most about this commercial was the idea of abstract pillar-like objects bursting through the ground as the car swerves to avoid these unexpected obstacles. We especially liked these three shots and are planning to use them as guides when we shoot our own plates.
We decided to create a Jeep commercial with a similar look and feel while promoting a more adventurous vehicle. For the model, we chose a Jeep Renegade because Will actually drives one, so we would have access to the car for texture and scale reference. We found a model on Hum3d.com and plan on purchasing it.
Tomorrow we plan to location scout and shoot some plates. There is a dirt road across the Talmadge bridge in South Carolina we plan to use as our location. I'll post the results of our shoot tomorrow!
This week, I worked with David to establish a workflow for rendering out different layers in Mantra. The effect David created needed to be rendered in Mantra in order to properly display the material transition, so we set up the same passes that I used in Maya/Arnold in his Houdini file. Once this workflow was established, I started to do some test composites to see the effectiveness of the integration. Despite working with two separate renderers in two separate 3D packages, I think we achieved some pretty good results. A breakdown can be seen below.
Final Comp with Effect
Ball Comp Breakdown (click through)
Bucket Comp Breakdown
Ideally, we would have been able to get all of the render layers working within Mantra before compositing; however, due to some layers either being broken or missing, I had to create some workarounds in Nuke, specifically for the ground reflection from the effect, the glow reflection on the ball from the bucket, and the shadow of the ball on the bucket. My Nuke tree was too big to screen grab, so I've broken down the different sections of it below, along with descriptions of how I used each.
Nuke Tree Breakdown
I edited the shader to make the object seem more metallic in an effort to match my reference more closely. Here are some renders of the new shader in light and shadow, as well as the material attributes.
I was also able to render out a motion test with an RBD simulation from my effects partner, David, and create a composite that integrates the new renders. Here's my new motion test with the updated shaders.
A click-through of my breakdown and Nuke node network can be seen below.
Here are the results from my motion test with the material applied to the sphere. I fixed the environment shadow maps, so the shadows on the ball appear cleaner and more accurate. I am happy with the way the ball looks as it moves through the light and shadow; however, I still need to tweak the shader. I plan to work on that next.
Also, I was able to match the lighting in Houdini as well, so that my partner can use Mantra to render his effect within Houdini. We plan to render our objects separately and composite them together. This is because David's effect is very shader dependent within Houdini/Mantra, so we have been unable to export the simulation into Maya while preserving the look of the simulation. Here's a render of the lighting match in Houdini/Mantra.
Today, I compiled my renders for the motion test of the white sphere. Seeing the ball in motion helped me recognize some more technical problems that I needed to fix. The edges of my shadow projection image are not clean enough, contributing to the strange stretching appearance, so I need to go back into Photoshop and clean it up some more. Here's the video of the motion test.
I also began some preliminary tests on the ball material. The look we're going for is similar to a dirty lead ball that still has some shininess, but is mostly matte due to its grimy appearance. My reference as well as my first pass can be seen below.
First Shader Pass
*Note! I have remapped the values of some of my maps within Hypershade. The images above represent the source files, but the values have been edited within the node network.
I'm not super happy with the shader's specular quality; I think it needs a little more shine and variation. I would also like to increase the displacement map a little more to add more variety to the object's profile. Right now, I have a motion test with the material rendering on the farm and I will post those results tomorrow.
After class today, I tried to implement the environment shadow technique that Prof. Bridget showed us, and was fairly successful. To create the environment shadow, I captured a render from a camera that was placed in the same location as the key light. Then, I took that image into Photoshop and painted the shadows in as a black and white map (shadows being white, background being black). I then applied the map as a surface shader projection onto the ball. That render can then be used as a mask for a color correction on the key light layer. My result can be seen below, as well as my environment shadow map and updated nuke tree.
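In comp terms, the environment shadow render just acts as a mask on a darkening color correction of the key-light layer: where the mask is white, the graded (darkened) version shows through; where it's black, the original stays. A toy per-pixel sketch of that KeyMix-style blend (the 0.4 gain is an arbitrary example value, not my actual grade):

```python
def masked_darken(key_value, shadow_mask, shadow_gain=0.4):
    """Darken the key-light pass only where the environment-shadow mask
    is white, mimicking a masked ColorCorrect/KeyMix in Nuke.
    shadow_gain is an assumed example value."""
    corrected = key_value * shadow_gain
    # mask = 1 -> fully corrected, mask = 0 -> untouched original
    return key_value * (1.0 - shadow_mask) + corrected * shadow_mask
```

Since the mask itself is a render of the painted projection, any roughness in the Photoshop paint shows up directly as roughness in the shadow edge, which is why clean map edges matter so much.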
I currently have an animation test rendering on the farm and will post my composite with movement as soon as it's ready!
Today, I worked on breaking up my render into separate passes for compositing. The different passes I created were an object beauty pass, key light mask, fill light mask, object shadow mask, ambient occlusion, ground reflection, and ground reflection mask. I plan to also create a layer for the environment shadow, but I believe Prof. Bridget is going over that in class on Thursday, so I will wait until then. All of these passes were created using my white ball test sphere so that I could match the reference plate in Nuke before bringing in my new ball to composite. My first attempt at passes can be seen below. A gray background has been added to fill in the background.
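As a simplified per-pixel picture of how passes like these recombine: lighting layers add together, occlusion multiplies, and reflection layers sit on top. This is a toy model of the general rebuild math, not my exact Nuke tree:

```python
def rebuild_beauty(key, fill, ao, ground_reflection=0.0):
    """Toy per-pixel beauty rebuild from separate lighting passes.
    Light contributions are additive, ambient occlusion multiplies the
    result, and the ground reflection layer is added on top.
    (A simplification of the Nuke merge operations, not the exact tree.)"""
    return (key + fill) * ao + ground_reflection
```

The point of splitting the render this way is that each term can be graded independently in Nuke, so the key can be warmed or darkened to match the plate without re-rendering.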
And here is my first attempt at compositing the CG white ball into the scene.
I'm pretty pleased with how the lighting match and shadow look; however, I still need to work on the ground reflection and occlusion. Additionally, I had the workflow wrong with the key and fill layers (I don't need the masks, just separate passes), so in my next tests I will fix that. I plan to start look-developing the shader for my actual ball and creating an environment shadow pass, so I'm hoping to create some better test renders and composites soon. I've included my Nuke tree at the bottom of this post. *Disclaimer* I'm still learning proper Nuke tree organization techniques, so if something is messy or wrong, please let me know!
After taking my background plate and reference images, I brought them into Maya to establish a camera and lighting match. My results can be seen below.
White Ball Material Attributes
For the white sphere, I was having trouble getting the bounce light on the bottom left to match the reference image, but I believe that issue is because I am using a flat gray shader instead of a material that more closely matches the ground plane in the background plate. I did some tests with image projections on the ground and that seemed to help resolve the issue.
I also did a few tests comparing the capabilities of a projection sphere and an AI Sky Dome, using both the ball HDR and the lat-long 360 HDR I took. Besides the images and projection type, I kept all of the settings the same between tests. Overall, I liked the results of the AI Sky Dome using the lat-long 360 HDR the best, as it seemed to match the original lighting scenario the closest. The projection sphere method also yielded pretty noisy results, so more optimization is needed to get a cleaner result.
Additionally, I worked on blocking in the animation of the ball rolling through the scene. There are two different versions of the playblast, one with a flat gray material and the other with a checkered material (to test the ball's rolling speed).
I plan to work on matching the lighting and shadows, as well as creating some material tests, in the coming days. I also seem to be having some color management issues in my workflow, so I plan to ask Prof. Bridget about resolving that.
Our first assignment for the class is to integrate a rolling ball into a background plate that contains both light and shadow. We are required to capture a full set of images (back plate, shadow plate, gray ball, cube, chrome ball, and HDR). Today, I went out to take some photos for project one and got some pretty good sets of images! My favorite set so far can be seen below. I will upload the other sets as I finish processing them. I was able to take my HDR images using the chrome ball technique as well as with the Ricoh Theta S 360 camera.