In this assignment, texture mapping, normal mapping, bump mapping, and an introduction to procedural textures such as Perlin noise are added to the ray tracer. While having many (18, to be exact) test files seems like a burden at first, they gracefully cover very subtle edge cases. They are also simple yet inclusive scenes, so one can iterate on alternative solutions without waiting too long for renders when something goes wrong. The variety of cases led to several different bugs, and while the correct renders are good eye candy now, they do not leave much room for discussion.
The difference between bilinear interpolation and the nearest-neighbor approach is quite evident in this example. It is also the starting point for me to check whether I assign the UV values of the spheres correctly; it turns out to be working just fine. I have also observed that the spheres' TBN matrices rarely change after forward transformations. This must have something to do with whether the transformation is a rotation, scale, or translation; however, I did not have time to run experiments to verify which transformations act this way.
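For reference, this is roughly what the sphere UV assignment looks like. This is a sketch under one common convention (y as the polar axis); the exact orientation and seam placement depend on the scene format, and the names here are illustrative rather than my actual code.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// p: hit point in the sphere's local space, c: center, r: radius.
void sphereUV(const Vec3& p, const Vec3& c, double r, double& u, double& v) {
    const double PI = 3.14159265358979323846;
    double theta = std::acos((p.y - c.y) / r);        // polar angle in [0, pi]
    double phi   = std::atan2(p.z - c.z, p.x - c.x);  // azimuth in (-pi, pi]
    u = (-phi + PI) / (2.0 * PI);                     // both mapped into [0, 1]
    v = theta / PI;
}
```

A point on the equator, for instance, lands at v = 0.5 regardless of where the seam is placed.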
This is normal mapping on the sphere, where one immediately notices the detail added by the normal map.
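The core of normal mapping can be sketched as follows: the sampled RGB values (assumed here to be in [0, 255]) are remapped to [-1, 1] and then expressed in world space through the TBN frame. The function names are illustrative, not my actual code.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 normalized(Vec3 v) {
    double len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// T, B, N: world-space tangent, bitangent, normal; rgb: the sampled texel.
Vec3 shadingNormal(Vec3 rgb, Vec3 T, Vec3 B, Vec3 N) {
    Vec3 n = { rgb.x / 255.0 * 2.0 - 1.0,   // remap [0, 255] -> [-1, 1]
               rgb.y / 255.0 * 2.0 - 1.0,
               rgb.z / 255.0 * 2.0 - 1.0 };
    Vec3 w = { T.x*n.x + B.x*n.y + N.x*n.z, // TBN * n: tangent to world space
               T.y*n.x + B.y*n.y + N.y*n.z,
               T.z*n.x + B.z*n.y + N.z*n.z };
    return normalized(w);
}
```

The typical bluish tint of normal maps comes from the z (blue) channel dominating, which maps to a normal close to the unperturbed one.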
This cube example is handy to test out the UV values for triangles.
Normal mapping on the triangle is also straightforward, just like the sphere case. Moreover, the detail added to the geometry is a massive bargain given the small computational cost.
Without the bump mapping, the sphere is missing many details.
And the bump-mapped sphere looks even better when it is combined with a texture that contributes to the diffuse reflectance.
The level of detail achieved without changing the geometry is quite powerful.
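One common way to apply the bump perturbation is to push the normal against the height-map gradient expressed in the tangent frame. This sketch assumes an orthonormal (T, B, N) frame; dhdu and dhdv are the height derivatives sampled from the bump texture.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 normalized(Vec3 v) {
    double len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// Perturb N by the height gradient; zero derivatives leave N untouched.
Vec3 bumpNormal(Vec3 N, Vec3 T, Vec3 B, double dhdu, double dhdv) {
    return normalized({ N.x - dhdu*T.x - dhdv*B.x,
                        N.y - dhdu*T.y - dhdv*B.y,
                        N.z - dhdu*T.z - dhdv*B.z });
}
```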
Perlin noise is quite handy since one can generate textures on the fly. Here, the method is quite established, so there were no surprises. The implementation is quite straightforward, as laid out in the lecture. It is also possible to use Perlin noise as the diffuse and the normal map.
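A compact version of the method, written in the style of Ken Perlin's reference implementation, looks roughly like this; the permutation table is shuffled from a fixed seed, and the class layout here is illustrative rather than my exact code.

```cpp
#include <algorithm>
#include <cmath>
#include <numeric>
#include <random>

class Perlin {
    int p[512]; // doubled permutation table to avoid index wrapping
public:
    explicit Perlin(unsigned seed = 42) {
        int perm[256];
        std::iota(perm, perm + 256, 0);
        std::mt19937 rng(seed);
        std::shuffle(perm, perm + 256, rng);
        for (int i = 0; i < 512; ++i) p[i] = perm[i & 255];
    }
    static double fade(double t) { return t*t*t*(t*(t*6 - 15) + 10); }
    static double lerp(double t, double a, double b) { return a + t*(b - a); }
    static double grad(int h, double x, double y, double z) {
        h &= 15; // pick one of 16 gradient directions
        double u = h < 8 ? x : y;
        double v = h < 4 ? y : (h == 12 || h == 14 ? x : z);
        return ((h & 1) ? -u : u) + ((h & 2) ? -v : v);
    }
    double noise(double x, double y, double z) const {
        int X = (int)std::floor(x) & 255,
            Y = (int)std::floor(y) & 255,
            Z = (int)std::floor(z) & 255;
        x -= std::floor(x); y -= std::floor(y); z -= std::floor(z);
        double u = fade(x), v = fade(y), w = fade(z);
        int A = p[X]+Y, AA = p[A]+Z, AB = p[A+1]+Z,
            B = p[X+1]+Y, BA = p[B]+Z, BB = p[B+1]+Z;
        return lerp(w,
            lerp(v, lerp(u, grad(p[AA],   x,   y,   z),
                            grad(p[BA],   x-1, y,   z)),
                    lerp(u, grad(p[AB],   x,   y-1, z),
                            grad(p[BB],   x-1, y-1, z))),
            lerp(v, lerp(u, grad(p[AA+1], x,   y,   z-1),
                            grad(p[BA+1], x-1, y,   z-1)),
                    lerp(u, grad(p[AB+1], x,   y-1, z-1),
                            grad(p[BB+1], x-1, y-1, z-1))));
    }
};
```

A handy sanity check when debugging: classic Perlin noise is exactly zero at integer lattice points, since the dot product of a gradient with a zero offset vanishes.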
To make the ray tracer easier to debug, I hand-feed a fixed seed to all random number generators. The good thing is that it becomes quite easy to debug only the noise function, and even to verify some of the calculations by hand. However, visually, I am stuck with the same patterns unless the noise scale changes.
Some high-frequency noise is achieved just by scaling with the noise scale parameter.
Here, Perlin noise is used for bump-mapping. Another more evident result below with the spheres:
The killeroo example has bumpy walls. It also introduces UV coordinates that are greater than 1, which means the texture is being tiled. When I first tried this example, there was a lighter shade of purple, red, and green around the region where the tiles intersect. It was quite interesting that just a patch on these tiles had lighter shading. Then I figured out that I had not compensated for the tiling: while fetching texels from UV values, one should subtract floor(u) and floor(v) from u and v, respectively, to tile the texture.
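The fix is a one-liner; subtracting the floor wraps any coordinate into [0, 1), and it works for negative coordinates too:

```cpp
#include <cmath>

// Wrap a texture coordinate into [0, 1) for tiling.
// Also correct for negative values: wrap01(-0.25) gives 0.75.
double wrap01(double t) { return t - std::floor(t); }
```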
Here, bump mapping is applied to transformed objects. In this scene, the sphere does not yield correct results if the TBN vectors are normalized during the intersection test. It is worth noting that this sphere only has scaling applied to it, making it an ellipsoid at no cost.
These are some blend modes mapped onto these ellipsoids and spheres. All of the ellipsoids are actually spheres, so this is an excellent benchmark for checking whether texture mapping works correctly on spheres. The sphere on the left is the "replace_all" case, in which the texture color replaces all the shading.
This example, combined with its motion-blurred counterpart, required much troubleshooting. It has negative UV values as well as texture coordinates just above 1, so one should handle these edge cases. A background texture is present in this example as well. It is easy to implement: if a camera ray does not hit any object in the scene, the UV values are simply x / width and y / height.
This one looks great, and it also has some triangles that caused some headaches. The subtle part is that some of the colors turn out to be NaNs, due to a deficiency in their TBN matrices. I discussed how to handle this extensively below.
In these two examples above, I failed to follow the instructions laid out in the homework. In the beginning, I assumed "replace_all" meant using the same texture for both diffuse and normal mapping at the same time. Observing the images, I looked at where the specular highlights begin on the sphere on the left and concluded that this has nothing to do with normals. Yet I struggled for a couple of hours just to see that "replace_all" means we only use the texture color without applying any shading... The moral of the story is not about "replace_all" at all; it is to read the assignment carefully next time.
This bump mapping example is wrong because the TBN matrix is not multiplied by the forward transformation of the objects. The normals are incorrect and live in another space, as evidenced by the blackish shading toward the top of the image. For transformed objects, we find the normal by multiplying it with the inverse transpose of the transformation matrix. The tangent T and bitangent B vectors, however, must be multiplied by the forward transformation matrix. Since T and B are both vectors, their fourth component should be 0 during the multiplication.
In this quite mysterious example, I first thought something must be wrong with the transformation matrices from the last assignment. However, debugging showed that I actually got an intersection on the top planes, which made me look in other directions. The real problem is that my triangle intersection tests did not do backface culling; this image is what happens when you leave it out. I added the check, and it worked smoothly.
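The culling test itself is a single dot product. With counter-clockwise winding, the geometric normal is cross(b - a, c - a); if it points along the ray direction, the ray sees the back of the triangle and the hit can be rejected (a sketch, assuming CCW winding):

```cpp
struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Front face: the triangle normal opposes the ray direction.
bool frontFacing(Vec3 rayDir, Vec3 a, Vec3 b, Vec3 c) {
    return dot(rayDir, cross(sub(b, a), sub(c, a))) < 0.0;
}
```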
This error also did not seem evident at first, and debugging it took some time. In the end, I figured out that, while assigning the components of the TBN matrix, I made a typo so that one of the indices was wrongly assigned. Although this one seems mysterious as well, it is rather dull that it was the result of a typo.
As I was rushing through the implementation, I overlooked some good practices. This is one of those subtle bugs in which I directly used glm::normalize on the texture RGB values instead of properly dividing them by 255 first.
These cubes with Perlin noise on them do not look right. During bump mapping with Perlin noise textures, one has to use an epsilon value to move the intersection point around so that the shifted points can land in different lattice cells; this is necessary to estimate the gradient vector. Here, the epsilon was too small for the (x, y, z) points to change lattice cells. I found 1e-3 to be a reasonable value.
While not part of the assignment, to see whether I could read textures correctly, I made a test texture and mapped it onto the cube example. The one on the left is bilinear filtering, while the one on the right uses point filtering. Also, due to the compression applied by Paint, the PNG and JPG versions of the texture differ. These are the textures I used, shown at their original size. They are 23x11 so that I could see whether the library I use stores pixels in row-major or column-major order. Also, each corner has a different color so that I could easily tell whether I set the UV values right on each triangle.
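The bilinear variant boils down to blending the four surrounding texels. A sketch over a row-major grayscale image (tex holds W*H values, and u, v are assumed to be already inside [0, 1]):

```cpp
#include <algorithm>
#include <cmath>

double bilinearFetch(const double* tex, int W, int H, double u, double v) {
    double x = u * (W - 1), y = v * (H - 1);
    int x0 = static_cast<int>(std::floor(x));
    int y0 = static_cast<int>(std::floor(y));
    int x1 = std::min(x0 + 1, W - 1), y1 = std::min(y0 + 1, H - 1);
    double fx = x - x0, fy = y - y0;  // fractional weights
    double top = tex[y0*W + x0] * (1 - fx) + tex[y0*W + x1] * fx;
    double bot = tex[y1*W + x0] * (1 - fx) + tex[y1*W + x1] * fx;
    return top * (1 - fy) + bot * fy;
}
```

Point filtering is the same lookup with x and y rounded to the nearest texel instead of blended.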
This spaceship looks severely damaged, and its UVs look way off, but it has nothing to do with my driving skills, I swear. The black dots are normals that did not pass the dot-product check, although they should have been flipped. When I debugged single pixels with these dots, I figured out that I need a small epsilon value (1e-6 works fine) in the dot-product check that decides when to flip the normals. This is not the only compelling case with this scene, though:
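A sketch of the tolerant flip: dot products that are only numerically near zero (grazing hits) are treated as back-facing too, instead of slipping through an exact comparison.

```cpp
struct Vec3 { double x, y, z; };

double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Flip the shading normal toward the viewer, with an epsilon so grazing
// hits are not missed by the comparison.
Vec3 faceViewer(Vec3 n, Vec3 rayDir, double eps = 1e-6) {
    if (dot(n, rayDir) > -eps)          // facing away from the camera
        return { -n.x, -n.y, -n.z };    // flip toward it
    return n;
}
```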
Here's everyone's favorite, the White Stripes. After singling out these pixels, which was quite hard due to the motion blur, I figured out that these colors are actually NaN. When I debugged the shading code, I saw that the normals become NaN during bump mapping and, to my surprise, the TBN values are NaN as well. The du and dv from the texture were both zero, so there was no bumping on these pixels anyway, but the presence of NaN values made the normals NaN, resulting in NaN colors. Investigating further, I saw that for some triangles the matrices used during TBN construction were not invertible at all. Remember, these pixels have zero du and dv values, as I double-checked by hand from the texture. For such pathological cases, I create an orthonormal basis around the normal vector and assign T and B to the other two vectors of that basis.
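The fallback basis can be built with two cross products: pick any axis not parallel to the normal, and the rest follows (a sketch of the idea, not my exact code):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
Vec3 normalized(Vec3 v) {
    double len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// Build an orthonormal T and B around a unit normal n when the UV-derived
// tangent matrix is singular.
void orthonormalBasis(Vec3 n, Vec3& t, Vec3& b) {
    // Pick a helper axis that is guaranteed not parallel to n.
    Vec3 a = std::fabs(n.x) > 0.9 ? Vec3{0, 1, 0} : Vec3{1, 0, 0};
    t = normalized(cross(a, n));
    b = cross(n, t); // already unit length: n and t are orthonormal
}
```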
The Galactica example has very subtle details that have to be taken care of, negative UV values being one of them. I flip the signs of the UVs if all of them are negative during the intersection test, and I handle the remaining edge cases while fetching texels.
While concluding this post, I realized that I had not shared any runtimes. Since the examples were aimed at edge cases and texture mapping is relatively cheap, many of the scenes rendered on the order of seconds. Only the Galactica scene with motion blur required about a minute, which seems reasonable given the resolution, the number of samples, and the geometry. Now the ray tracer can create wallpapers in a minute or so.