Thursday, October 9, 2008

Double-sided dielectrics for ray tracing

No, neither I nor this blog is dead yet.

Sometimes ray tracing models will take a thin dielectric (like a car windshield) and represent it with one surface. If we assume this dielectric surface is locally planar, then the direction of the ray will not be changed by the double refraction, but it will be offset. This is shown in the image above. Our challenge is to calculate the vector o in the figure. The new origin of the ray will just be hit_point + o.

The thickness t is provided by the model, possibly with a texture. First let's find a unit vector parallel to o by subtracting off the portion of v in the direction of n:

a = -normalize(v - dot(v,n)n)

Now we need to find the distance that is the length of o, which is q-w. We can see that

q = t tan(Theta)

w = t tan(Theta')

Snell's law tells us that

sin(Theta) = ref_idx sin(Theta').

Assuming v is a unit vector we also have

cos(Theta) = -dot(v,n)

That allows us (with a bunch of sqrts) to get all the sines and cosines we need above.
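Putting the pieces together, the whole offset computation can be sketched as below. This is a minimal illustration, not code from the post: the function and variable names are my own, NumPy is assumed, and v and n are assumed to be unit vectors with n facing against v.

```python
import numpy as np

def offset_origin(hit_point, v, n, t, ref_idx):
    """Offset the origin of a ray passing straight through a thin dielectric.

    hit_point: where the ray hits the surface
    v: unit incoming ray direction
    n: unit surface normal, facing against v (so dot(v, n) < 0)
    t: slab thickness
    ref_idx: index of refraction
    """
    cos_theta = -np.dot(v, n)
    sin_theta = np.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    # Snell's law: sin(Theta) = ref_idx * sin(Theta')
    sin_theta_p = sin_theta / ref_idx
    cos_theta_p = np.sqrt(max(0.0, 1.0 - sin_theta_p * sin_theta_p))
    # q = t tan(Theta), w = t tan(Theta')
    q = t * sin_theta / cos_theta
    w = t * sin_theta_p / cos_theta_p
    # a = -normalize(v - dot(v,n) n): unit vector parallel to o
    tangent = v - np.dot(v, n) * n
    norm = np.linalg.norm(tangent)
    if norm < 1e-12:                 # normal incidence: no lateral offset
        return hit_point
    a = -tangent / norm
    return hit_point + (q - w) * a   # |o| = q - w
```

As a sanity check, at 45-degree incidence with ref_idx = 1.5 and t = 1 we get q = 1 and w ≈ 0.535, so the origin shifts by about 0.465 against the tangential component of v; at normal incidence the offset vanishes.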

Friday, May 23, 2008

New pastures

It has been months since my last blog post. My excuse is that I have left my professor job at the University of Utah and joined NVIDIA. I’ll still be based in Utah and will be an Adjunct Prof at the U. Many people have been surprised that I have left a job-for-life that I enjoy, and that I have gone to a GPU company. Given that I went to school at age 5 and have never left, and that I have been a ray tracing guy for the last 20 years, I can understand their surprise. This post is to explain why I’ve made this move. Historically, graphics people have self-selected into interactive and batch programmers based largely on whether responsiveness or visual quality was more important to them, and I’ve been squarely in the latter category. However, recent GPU graphics has gotten close to the line where I find the visual quality compelling, and I believe it will soon be possible to cross that line. I think the place to help cross that line is in industry given how close this line is, and given the need for development in the context of compelling geometric models.

As for NVIDIA, the main question I asked myself to help make the decision was the one expressed in my last blog post: are hybrid methods a natural architecture for interactive graphics? After much contemplation, I have concluded the answer is yes. For complex geometry, rasterization and hi-res screens have many advantages over ray tracing for viewing “rays”, and the solution already works. It is yet to be shown that ray tracing can be similarly efficient for such sampling rates, and I am skeptical that a magic bullet will be found to change that. I also believe that most lighting effects missing from current games do not need such high geometric complexity, given the success of PDI’s renderer which uses lower LODs for illumination. That being said, is NVIDIA hardware flexible enough to support interesting hybrid algorithms? Well, I took the job :) While I cannot yet say if and when NVIDIA is going to use ray tracing, I am confident that we will use whatever is best for getting great interactive graphics, and if ray tracing is part of that solution we will make it fast. NVIDIA is supportive of me continuing this blog so I hope to get back on track with posts soon.

Thursday, January 24, 2008

Ray tracing for massively multiplayer online role-playing games (MMORPGs)?

I have been playing various video games lately with an eye to which would benefit most from ray tracing in the near term. I tried World of Warcraft a couple of days ago (and so did my daughter, so I fear I am out $20 a month for the rest of my life). I speculate that such an MMORPG is perfect for ray tracing now due to 1) its current emphasis on geometry, and 2) the need to support low-end hardware.

I'd like to get my hands on a dataset from one of these MMORPG games. If anybody has a legitimate copy or is in one of the MMORPG companies and would like to help me find out if this is feasible on current medium-end CPUs, let me know.