Friday, May 23, 2008

New pastures

It has been months since my last blog post. My excuse is that I have left my professor job at the University of Utah and joined NVIDIA. I'll still be based in Utah and will be an Adjunct Prof at the U. Many people have been surprised that I have left a job-for-life that I enjoy, and that I have gone to a GPU company. Given that I went to school at age 5 and have never left, and that I have been a ray tracing guy for the last 20 years, I can understand their surprise. This post is to explain why I've made this move.

Historically, graphics people have self-selected into interactive and batch programmers based largely on whether responsiveness or visual quality was more important to them, and I've been squarely in the latter category. However, recent GPU graphics has gotten close to the line where I find the visual quality compelling, and I believe it will soon be possible to cross that line. Given how close that line is, and given the need for development in the context of compelling geometric models, I think industry is the place to help cross it.

As for NVIDIA, the main question I asked myself to help make the decision was the one I posed in my last blog post: are hybrid methods a natural architecture for interactive graphics? After much contemplation, I have concluded the answer is yes. For complex geometry and hi-res screens, rasterization has many advantages over ray tracing for viewing "rays", and that solution already works. It is yet to be shown that ray tracing can be similarly efficient at such sampling rates, and I am skeptical that a magic bullet will be found to change that. I also believe that most lighting effects missing from current games do not need such high geometric complexity, given the success of PDI's renderer, which uses lower LODs for illumination. That being said, is NVIDIA hardware flexible enough to support interesting hybrid algorithms? Well, I took the job :) While I cannot yet say if and when NVIDIA is going to use ray tracing, I am confident that we will use whatever is best for getting great interactive graphics, and if ray tracing is part of that solution we will make it fast. NVIDIA is supportive of me continuing this blog, so I hope to get back on track with posts soon.


Nick said...

Congratulations on your new gig, I look forward to seeing what you come up with!

Ben said...

Just wanted to say thanks, and you will be missed. I for one benefited greatly from your teaching at the U.

Anonymous said...

The HW-raytracing monster will be closer now :p

EpSiLoN74 said...

Do you really think that only industry has the means to take the final step and cross that line? What can academic research do in the meantime?

Biagio said...


Take it as a joke, but on Saturday, 17 February 2007, someone wrote:
"... I predict it means the z-buffer will soon exist only in software renderers at studios, so if you are an undergraduate don't bother learning OpenGL unless you need to use it in a project or job soon. I doubt OpenGL will die entirely due to legacy issues, but it is not something to learn unless you have to. I speak as a person that spent much of my early years writing FORTRAN code!"

I remember this quote because I referred to it in the last chapter of my master's thesis ... I hope you are not going to code in FORTRAN again :D

However, I (still) agree with you. Maybe this is the right direction, for two reasons:
1- GPUs currently seem to be improving faster than multicore CPUs
2- solutions like GPU deferred shading for ray casting can easily be merged with a good CPU packet ray tracer to get a faster hybrid approach

And in my opinion (but I'm just a student :D ) this can open interesting new scenarios in research, such as 'hybrid' data structures and new ways of casting rays on the GPU.

Congratulations and have a nice time at NVIDIA!

(I'm waiting for your next post to cite in my PhD thesis)

Hector Yee said...

Congratulations! Hope to see real time global illumination in some game soon.


Bartholomule said...

Wow. I guess I have another reason to look towards NVIDIA when buying graphics hardware.

I feel bad for the future students that will miss out on your classes. I enjoyed all five of them that I took and learned quite a bit from at least four of them. :)

Forrest said...

Well, it's good news for NVIDIA and bad news for me. I learned this just after I received an offer from the Utah graphics track..... And now there seem to be few people in Utah still doing graphics ...

