# Pete Shirley's Graphics Blog

## Friday, January 23, 2015

I saw this effect in my basement recently and it's one of those rare "big" color bleeds. As a bonus there is glossy reflection on the ceiling, so I decided it was worth a picture.

## Wednesday, December 17, 2014

### Winner of U of Utah Ray Tracing class image contest

Yesterday I attended Cem Yuksel's end-of-semester image contest from his ray tracing class. The images were all very impressive (it's clearly a good class... see the link above), and I found the winner, by Laura Lediaev, so impressive that I asked her if I could post it here. Here's her description:

*This is a fun scene with candy. There are two main components to this scene - the glass teapot, and the candies. I spent over 30 hours creating this teapot practically from scratch. I started with the Bezier patch description, which I used to create a mesh, and went to work duplicating surfaces, shrinking them to create the inner surfaces, doing some boolean work for cutting out holes, then fusing together all the seams vertex by vertex. The candies started out as a single candy prototype which I sculpted starting from a cube. I then created a huge array of candy copies and used a dynamics simulation to drop the candies into one teapot, and onto the ground in front of the other teapot. The backdrop is just a ground with a single wall (a.k.a. an infinite plane). I have two area lights, and an environment image which is creating the beige color of the ground and some interesting reflections. Can you spot the reflection of a tree in the left teapot handle? The challenge with rendering this scene is all the fully specular paths, which are rays that connect the camera to a light while only hitting specular surfaces such as glass or mirrors. The only way to do this using the rendering methods that we learned in the class is brute force path tracing which takes an extraordinary amount of time. The image has roughly 30,000 samples per pixel.*

## Tuesday, December 16, 2014

### Cool website using hex color

A 24-bit RGB triple such as red (255,0,0) is often represented as a hex string: two hex digits cover 16^2 = 256 values, so you only need six digits for the three channels. Recall that the hex digits are (0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F). So the color (255, 0, 0) would be (FF, 00, 00), or as a string FF0000. By convention people usually put a "#" at the front to tag it as hex: #FF0000. Alain Chesnais made me aware of a clever site that uses the fact that times are also six digits when seconds are included. For example, 101236 is 36 seconds after 10:12. If one interprets that as a hex color, it is valid (note each pair tops out at 59 rather than 255, with hours at most 23, so the colors are all somewhat dark). There is a website that makes this concrete so you can start internalizing hex codes. The dark ones anyway! Here's a screenshot.

As it gets closer to the minute rollover you'll get a dark blue.


## Monday, December 8, 2014

### Empirical confirmation of diffuse ray hack

Benjamin Keinert sent a nice note regarding an earlier post on quickly getting "lambertianish" rays. He empirically confirmed the hack is exactly Lambertian. Cool, and thanks Benjamin. I always find such demonstrations more convincing than proofs, which can have subtle errors (fine, yes, I am an engineer). I include his full email with his permission.

I think I found a simple informal "engineer's style proof" that it is indeed lambertian (assuming the sample rejection method yields a uniform distribution on a sphere, which it should).

Instead of using a rejection method I sampled the uniform distribution on a sphere using spherical Fibonacci point sets [2] and constructed a cosine hemisphere sampling variant of it.

Without loss of generality, it should be sufficient to show that the mapping is Lambertian for a single normal (0,0,1), given uniformly distributed points on a sphere and rotational invariance.

Sorry, rapid prototyping code, oldschool OpenGL, PHI = (sqrt(5.0)*0.5 + 0.5):

```cpp
// PDF: 1/(4*PI)
float3 uniformSampleSphereSF(float i, float n) {
    float phi = 2*PI*(i/PHI);
    float cosTheta = 1 - (2*i+1)/n;
    float sinTheta = sqrt(1 - cosTheta*cosTheta);
    return float3(cos(phi)*sinTheta, sin(phi)*sinTheta, cosTheta);
}

// PDF: cos(theta)/PI
float3 cosineSampleHemisphereSF(float i, float n) {
    float phi = 2*PI*(i/PHI);
    float cosTheta = sqrt(1 - (i+0.5)/n);
    float sinTheta = sqrt(1 - cosTheta*cosTheta);
    return float3(cos(phi)*sinTheta, sin(phi)*sinTheta, cosTheta);
}

[...]

void test() {
    [...]
    // Enable additive blending etc.
    [...]
    uint n = 1024;
    glBegin(GL_POINTS);
    for (uint i = 0; i < n; ++i) {
        glColor4f(0,1,0,1); // Green
        float3 p = normalize(uniformSampleSphereSF(i, n) + float3(0,0,1));
        glVertex3fv(&p[0]);
        glColor4f(1,0,0,1); // Red
        float3 q = cosineSampleHemisphereSF(i, n);
        glVertex3fv(&q[0]);
        // Additive blending => Yellow == "good"
    }
    glEnd();
}
```

This little function results in the attached image (orthogonal projection of cosine distributed points on a hemisphere -> uniformly distributed points on a circle).

With some more effort one can show that normalize(uniformSampleSphereSF(i, n) + float3(0,0,1)) = cosineSampleHemisphereSF(i, n) - instead of using additive blending.

[1] http://psgraphics.blogspot.de/2014/09/random-diffuse-rays.html

[2] Spherical Fibonacci Point Sets for Illumination Integrals, Marques et al.


### Complex Christmas Tree Lighting

Here is an example of why most renderers use approximations for complex lighting configurations: as lighting gets more complex, it also gets harder to figure out if it is right!

## Tuesday, December 2, 2014

### More on lazy ray tracing tricks

I got some questions on yesterday's post. First, you will need a bunch of samples per pixel (like hundreds). You can reduce that by jittering, but again, you bought that fancy computer-- watch movies while your sampling progresses.

Second, don't branch. Once you go random, you may as well "path" trace so there is no ray tree. Instead of

```cpp
color(ray) = R*color(reflected_ray) + (1-R)*color(refracted_ray)
```

instead do:

```cpp
if (drand48() < R)
    return color(reflected_ray);
else
    return color(refracted_ray);
```

If you want motion blur, you can add a time to the ray, ray.t = t0 + drand48()*(t1-t0), and add some moving primitives or a moving camera. For a translating sphere, it would be center = p0 + time*(p1-p0).


## Monday, December 1, 2014

### Lazy ray tracing tricks

If you are doing some ray tracing assignments, I have some lazy tricks for making your pictures look better with very little code. Note these may be slow, but that's what overnight runs are for.


**Hollow glass primitive.** This is to get thin glass balls like Turner Whitted's famous image. Make a sphere with radius R centered at **C**. Make another at **C** with radius -0.9R (or -0.95, or whatever). The sphere intersection code depends only on R^2, so the negative radius won't matter. The surface normal is **N** = (**hitPoint** - **C**)/radius. The negative radius will make an inward-facing normal so the refraction code works out.

**Soft shadows from point sources.** If you have a point source at position **P**, instead of sending a shadow ray toward **P**, send one to a random point within radius R of **P**. To do this, just use a rejection method: pick random points in the cube [-1,1]^3 until one lands in the unit sphere.

```cpp
do {
    x = 2*drand48() - 1;
    y = 2*drand48() - 1;
    z = 2*drand48() - 1;
} while (x*x + y*y + z*z > 1); // now (x,y,z) is random in the unit sphere

newrandomlightpoint = P + R*vec3(x,y,z);
```

**Fuzzy (glossy) reflections.** You have a hit point **P** and a reflected ray direction **V**. The "end" of the ray is **Q** = **P** + **V**. Now find a random point **Q**' within a sphere centered at **Q** (see soft shadows). The new reflection ray is **V**' = **Q**' - **P**. Make it a unit vector if your code requires it.

**Depth of field (defocus blur).** Randomly perturb the eye point within a sphere centered at the eye point. This will work better with some ways people implement cameras than with other, equally good, ways.

**Directionally varying ambient.** Instead of a constant, use Color1*max(0, dot(N, V1)) + Color2*max(0, dot(N, V2)).

Making V1 "up" with the sky (background) color and V2 "down" with a ground color will look pretty good.

**Directionally varying background.** If you miss everything, and are too lazy for an environment map (I am!), use something that varies. For example, (0.2, 0.2, 1.0) + (1.0 - fabs(v.z))*(0.6, 0.6, 0.0). Fool with that.
