dipityPix app

Friday, July 31, 2015

Is copy and paste a net time saver?

Many programmers have advocated disabling copy and paste because the resulting errors can be so painful.  That was indeed the source of the lack of total internal reflection in the little cube from an earlier post:

The photo on the left has a strong internal reflection at the bottom of the cube
I tracked a bunch of rays and it turned out total internal reflection was occurring, but the polarization was being transformed incorrectly.  This code:

            r_s = 1-r_s;
            r_p = 1-r_s;



Should have been:

            r_s = 1-r_s;
            r_p = 1-r_p;

Such typos are often so deadly because they take values that are in range (0 to 1 in this case) and keep them in range.  Fixing this at least addresses the most glaring problem (the sphere looks a bit better too):
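For context, the two coefficients come from the dielectric Fresnel equations.  Here is a minimal sketch (not my renderer's actual code; the struct and function names are made up) showing that both reflectances always land in [0, 1], which is exactly why a range check never catches the swapped subscript:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch of dielectric Fresnel power reflectances for the
// s and p polarizations at an interface going from index n1 to n2.
struct Fresnel { double r_s, r_p; };

Fresnel fresnel(double n1, double n2, double cos_i) {
    double sin_t = n1 / n2 * std::sqrt(std::max(0.0, 1.0 - cos_i * cos_i));
    if (sin_t >= 1.0) return {1.0, 1.0};  // total internal reflection
    double cos_t = std::sqrt(1.0 - sin_t * sin_t);
    double rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t);
    double rp = (n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i);
    return {rs * rs, rp * rp};  // both always in [0, 1]
}
```

At 45 degrees into n = 1.5 glass this gives r_s around 0.09 and r_p around 0.009; swap the subscripts in the `1 - r` update and you get different but still perfectly in-range numbers, so nothing blows up and the bug hides.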


Thursday, July 30, 2015

Yet another renderer validation experiment

Now that I have the polarization orientation of the iPad right, I did another crude experiment.

  
Left: photo.  Right: rendering.
There is one glaring difference: the reflection at the bottom of the cube the ball is sitting on.  It looks like total internal reflection in the photo and definitely not in the rendering.  It could be sloppy position calibration, or a refractive index error in the input.  Or, of course, a bug in the renderer.  It looks like to go further I will need to do a very careful scene setup and use a camera that has raw output (sadly the iPhone doesn't, as far as I know).

Question: what is the best BRDF model for materials such as glossy paint?  My first impulse is to use the Disney model, but I bet there is something specific to architectural materials that is used in practice these days?



Wednesday, July 29, 2015

Testing near Brewster's angle

Yesterday's picture had specular reflections that were way off.  I tried a bunch of tests on my Fresnel equations and they seemed OK.  I tried Brewster's angle for the glass, and this found one big problem: my model!  Note that with LCD screens, no polarizers are needed to see Brewster's angle in action!  Here are two photographs of the iPad at Brewster's angle:

The iPad produces almost perfectly polarized light, and at Brewster's angle only one of the polarizations is reflected.  By rotating the iPad you can change the polarization of the incident light by 90 degrees.
It occurred to me that maybe I just had the polarization 90 degrees wrong in my input model.  That is definitely one of my problems!  So, more complete testing tomorrow.
Rendered images.  Left: my incorrect input, with the polarization exactly wrong.  Right: fixed model.  Not exactly at Brewster's angle, so some reflection is visible.
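For anyone reproducing this: Brewster's angle is where the p-polarized reflectance goes to zero, at tan(theta_B) = n2/n1.  A quick sketch (the helper name is mine):

```cpp
#include <cmath>

// Brewster's angle in degrees for light going from index n1 into n2.
// At this incidence the p-polarized reflectance is zero, so the
// reflected light is purely s-polarized.
double brewster_deg(double n1, double n2) {
    return std::atan2(n2, n1) * 180.0 / 3.14159265358979323846;
}
```

For air into glass (n = 1.5) this comes out to about 56.3 degrees.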

Tuesday, July 28, 2015

First validation with polarization

As can be seen on the left (photo), LCD screens like the one on this iPad 1 are fully linearly polarized.  The big black disk is a linear polarizer rotated 90 degrees from the screen.  My first attempt at a serious polarized rendering has some obvious problems!  It could be that the polarization infrastructure is wrong.  Or the Fresnel equations.  Or both :)  I'll start with Brewster's angle tests tomorrow.
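The crossed-polarizer photo is just Malus's law in action: an ideal linear polarizer whose axis is at angle theta to the light's polarization transmits cos²(theta) of the intensity, hence the black disk at 90 degrees.  As a one-liner (my naming, a sketch rather than renderer code):

```cpp
#include <cmath>

// Malus's law: intensity passed by an ideal linear polarizer whose axis
// is at angle theta (radians) to the incident linear polarization.
double malus(double intensity, double theta) {
    double c = std::cos(theta);
    return intensity * c * c;
}
```

At 0 degrees everything passes; at 90 degrees (the disk in the photo) essentially nothing does.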

Left: photo.  Right: rendering.  Lots of problems!

Monday, July 27, 2015

Measuring refractive index of a sphere

I wasn't sure my sphere was simple glass, and from what I see on the web its refractive index could be anywhere from 1.5 to 1.7.  I set up a reasonably well-measured scene to match it visually (the lighting doesn't matter, just the ray bending) and got:

The index is around 1.55.  I don't think more resolution is likely with this method.
Nick P suggested I use a laser pointer, which would have been easier and more accurate.  Wish I had read that first :)  I'll try it on the glass sheet.
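The laser-pointer idea works nicely on a flat sheet because a slab displaces the beam sideways without changing its direction, and that offset pins down n.  A sketch of the geometry (the function name and parameterization are mine):

```cpp
#include <cmath>

// Lateral displacement of a beam passing through a flat slab of
// thickness t, hitting it at angle theta_i (radians) from the normal.
// Snell's law gives sin(theta_t) = sin(theta_i) / n, and the shift is
//   d = t * sin(theta_i - theta_t) / cos(theta_t).
double slab_shift(double n, double t, double theta_i) {
    double theta_t = std::asin(std::sin(theta_i) / n);
    return t * std::sin(theta_i - theta_t) / std::cos(theta_t);
}
```

Measure d for a known t and theta_i, then solve for the n that reproduces it; the shift grows monotonically with n, so a simple bisection does the job.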

Another step toward validation

My progression toward a sufficiently accurate renderer continues.

There are some known problems with my renderer/scene/camera before we even start:
  • The scene is too dim for the iPhone camera (so, ironically, the renderer is not noisy enough)
  • The sensitivity of the camera is not modeled
  • The spectral curves of the iPad are just a poor guess
  • No polarization (the iPad emits polarized light)
  • The iPad is being modeled as a diffuse source, but its brightness should vary with angle
  • I just guessed the refractive index of the glass plate and ball (1.5)
  • The camera FOV and position are just in the ballpark; the view direction is way off
Left: rendered image.   Right: photograph.

Saturday, July 25, 2015

A first step toward validating a renderer

I just set up a small environment, and bought a Macbeth color checker and a glass ball (Amazon).  No light source setup (just a white background), no camera calibration, no refractive index estimation.

Real scene, photographed with iPhone 6+, random room illumination
First rendering

Monday, July 20, 2015

Glass spheres in a ray tracer

In an earlier post I wondered whether the image projected on the floor was a bug.  I got a dielectric sphere from Amazon and set up an ugly-duckling Cornell Box made out of foam-core.  Apparently spheres are more effective lenses than I realized, and it is probably not a bug.

The projected image of the iPad is more visible on the floor than I expected.
I'll be doing a fuller spectral validation of my renderer this week.  It's fun in an artsy-crafty, first-grade, maker-movement kind of way!

Sunday, July 19, 2015

Real Utah teapots

The "Utah teapot" is an iconic 3D graphics model made by hand measurement of Martin Newell's teapot in 1975, so it is now a 40-year-old model!  It became popular mainly because Utah distributed it on the internet (Utah was the 4th site on the ARPANET, so they were already used to that), and so it was the only model most people had for some time.  The real teapot is on display in Mountain View, and details can be found at Wikipedia.  Because the teapot was manufactured, you can get one of its cousins.  They are occasionally for sale on eBay and come in two sizes, as this screenshot from an eBay listing at the time of this writing shows (it is selling the big one for $119; searching for "Melitta Teapot" will show various types, including some of the right shape when you are lucky):


I am guessing the smaller teapot is the size of the "real" teapot, but I don't know (Newell was a married grad student at the time, and it was pre-Grande-size drinks).

Here's the one I own, with some money for scale:


My real agenda for this post is to ask: who has the best model of the teapot?  The classic one is not solid, and I know many people have made solid models, but I don't know where to get them.

Friday, July 10, 2015

When you see diffuse inter-reflection

Of course we see diffuse inter-reflection all the time in painted rooms, but mostly it just adds a big "ambient" term, and it's hard to tell what is there.  Games like Mirror's Edge made isolated effects more obvious by cranking up the effect, both through model design and through multiplication.  If you want to crank it up through model design, I saw a good real-world juxtaposition today: two scenes about 100 meters apart with the same overall environmental lighting.


On the left there is surely a big "splash" of indirect light at the base of the container.  But there are several things working against it:
  • The container's albedo is pretty low (I would guess 25%)
  • There is no hue to the container
  • There is almost as much direct light hitting the pavement (Sun angle around 30 degrees)
On the right we have another situation that emphasizes the indirect blur:
  • MUCH more direct sunlight hitting the yellow object (Sun direction almost tangent to the wall)
  • Hue contrast from the indirect illuminator
  • The indirect illuminator has relatively high albedo (it may in fact be fluorescent)

Monday, July 6, 2015

Slightly simplified Cornell Box data

I just put my new renderer through the Cornell Box test using this data from the Cornell lab.  I didn't want to type in all those numbers for the boxes, nor deal with the slightly wonky walls that aren't necessarily planar (the real walls aren't).  So here is my geometry for it (in millimeters, like the Cornell measurements):

Box itself: (0,0,0) to (555,555,555)

Small block: (0,0,0) to (165,165,165), rotate around Y -0.314 radians, move (130,0,65)
Big block: (0,0,0) to (165,330,165), rotate around Y  +0.3925 radians, move (265,0,295)
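Transcribed as data, using a hypothetical axis-aligned-box-plus-transform representation (rotate about Y first, then translate, matching the numbers above; the types are a sketch, not my renderer's actual ones):

```cpp
// Simplified Cornell Box geometry in millimeters.  Each block is an
// axis-aligned box rotated about the Y axis and then translated.
struct Vec3  { double x, y, z; };
struct Block { Vec3 lo, hi; double rot_y_rad; Vec3 translate; };

const Block room        {{0, 0, 0}, {555, 555, 555},  0.0,    {0, 0, 0}};
const Block small_block {{0, 0, 0}, {165, 165, 165}, -0.314,  {130, 0, 65}};
const Block big_block   {{0, 0, 0}, {165, 330, 165},  0.3925, {265, 0, 295}};
```

(The rotations are about 18 degrees one way and 22.5 degrees the other.)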

For fun I made the materials of the blocks glass and copper.  I used the metal Fresnel equations from this very nice post by Sébastien Lagarde.

This is still brute force, so the rays just have to be lucky enough to hit the light (100k initial rays per pixel).


Friday, July 3, 2015

Debugging refraction in a ray tracer

That picture I suspected had a refraction bug did.  It's always hard to tell; that is one reason refraction hacks can work so well.  I found a good debugging case, so I will share it.

Suppose you have a plate (a closed box shape in my code) of glass of refractive index n.  What ray will go into it with an easy-to-understand outcome?  Here is one:
Pink is the cross section of the solid glass pane which continues to the right.
Recall Snell's law: n sin(theta) = n' sin(theta').  That implies that for an incident ray from vacuum (my renderer will kill inhabitants immediately) at 45 degrees into a medium with refractive index sqrt(2), the refracted ray will have sin(theta') = 1/2.  There can't be any refraction on the next bounce, because you would need sin(theta') = sqrt(6)/2, so you will get total internal reflection all the way until the other end of the glass plate.  I add some green exponential attenuation and voila (hopefully right):
Refraction looks plausible.  Maybe that quasi reflection of the sphere in the floor is a caustic?
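The arithmetic in that setup is easy to check: the 45-degree entry into n = sqrt(2) refracts to 30 degrees, and at the side faces the internal 60-degree incidence would need sin(theta') = sqrt(6)/2 > 1 to exit, which is impossible, so the ray must totally internally reflect.  A tiny sketch (my naming):

```cpp
#include <cmath>

// Sine of the refracted angle from Snell's law, n1 sin(i) = n2 sin(t).
// A result greater than 1 means no refracted ray exists, i.e. total
// internal reflection.
double refracted_sin(double n1, double n2, double sin_i) {
    return n1 / n2 * sin_i;
}
```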

Thursday, July 2, 2015

Shadow rays and I are getting a divorce

Like most batch-rendering people, I have a love-hate relationship with shadow rays.  They are wonderfully simple and can easily get you effects like accurate shadows during an eclipse.  And yet, without getting fancy, you start having car interiors turn black because the shadow rays hit the windows.  And brute-force methods give better pictures given huge numbers of rays.  But I think I am finally believing my own BS and thinking computers are fast enough.  In my new rendering project I am abandoning shadow rays, so the floor under the glass is automatically illuminated.  Note the quasi-TV is the only light source in the room.  No importance sampling here yet; just getting the functionality in.  (I think there is still a bug in the glass, but it's getting close.)