Tuesday, April 1, 2014


I'll be at NAB (the big TV conference) next week to help goHDR with their booth: I put a little money into this company years ago when I was looking for where I could bet on HDR, and I still like the odds.  It's got me thinking about trends in TVs and whether HDR is finally going to "arrive".  At NAB two years ago there were two big things TV manufacturers were pushing: 4K and dark blacks.  In my opinion the dark blacks had gone past their useful sweet spot; one manufacturer had to close off a room with black curtains so you could see how black their blacks were.  While impressive, it was a very unnatural viewing condition.  But 4K was the word most often being pushed, and the common demo was a huge 4K display with 4K content.  At huge sizes it is totally worth it.  But do I really want a 4K TV in my house?  I don't think so, or I would already watch more on my TV and less on my laptop.  However, 4K is exactly what I want on my computer, so I see a market there.  Maybe we'll finally see the convergence of TVs and computers, but this has been predicted for a long time so I won't hold my breath.

My impression of the history of TV is that manufacturers keep pushing whatever dimension they can and the public does or doesn't bite.  There is an overlapping parallel in the movies.  The pushes have come in roughly this order:
  • Pictures at all (clearly needed!)
  • Color
  • Spatial resolution
  • 3D
  • Temporal resolution (frame rate)
  • Dark blacks
  • Bright whites (the key part of HDR in my opinion)
The industry will push each of these until people stop buying.  I think 4K is past where they stop buying.  The same goes for 3D and dark blacks.  And for temporal resolution (24fps is surprisingly good).  I don't see any technological reason not to have bright whites (backlit LEDs), as has been demonstrated by various manufacturers (old graphics nerds may remember the awesome Brightside demos), so I expect the industry to push these hard next.  And I think I will buy one when it is cheap.  I think the industry will try it because it is the only "big thing" left to try!  We'll see if this is the year "HDR" is a touted word at NAB.  I think it will be, but it could be that 4K still has some legs.

Wednesday, November 20, 2013

Scattering in a constant medium.

I had a sign error in this one.  Here's the derivation of the right answer.

Suppose we have a participating medium like smoke or fog that has constant density, and we send a photon-like particle into it.  It interacts with the volume randomly, so we want to randomly decide where, if anywhere, it gets scattered/absorbed.

Suppose we parameterize a ray along which the particle travels with "t" going from 0 to infinity.  For any given small part of the ray we have the following:

If at t the particle has not scattered, the probability it scatters in the small region of length dt on the interval [t, t+dt] is C*dt.  C is proportional to the density of the medium: double the density of particles and the probability of scattering doubles (because dt is small, the particles can't "line up", and each might block the particle independently).

For the cumulative distribution function P(t), which is the probability the particle has scattered at or before t, we can see:

P(t + dt) = P(t) + (1-P(t))*C*dt

That last term has the (1-P(t)) as the probability the particle is "live" at t, so it has the potential to scatter.


Rearranging gives:

C*(1-P(t)) = (P(t + dt) - P(t)) / dt

The right-hand side is (in the limit) the definition of the derivative, so we have the differential equation:

P'(t) = C*(1-P(t))

With the initial condition P(0) = 0, this yields

P(t) = 1 - exp(-C*t)

If we have a random number r = random(0,1), we find the t of random scattering by solving:

r = 1 - exp(-C*t)

-C*t = log(1-r)

t = - (1/C)*log(1-r)

Note that r and 1-r have the same distribution (replacing "random(0,1)" with "1-random(0,1)" has the same random behavior) so this is often written

t = -(1/C)*log(r)
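
As a sanity check, here is a small Python sketch of my own (the function name and the numeric check are mine, not from the derivation) that samples scattering distances with t = -(1/C)*log(r) and compares the empirical mean to 1/C, the mean free path of the exponential distribution:

```python
import math
import random

def sample_scatter_distance(C, rng=random.random):
    """Sample a free-flight distance in a constant-density medium.

    P(t) = 1 - exp(-C*t), inverted: t = -(1/C) * log(r), r in (0,1].
    """
    r = rng()
    r = max(r, 1e-12)  # guard against r == 0, where log blows up
    return -math.log(r) / C

# Quick check: the mean of an exponential distribution with rate C is 1/C.
random.seed(42)
C = 2.0
n = 200_000
mean_t = sum(sample_scatter_distance(C) for _ in range(n)) / n
print(mean_t)  # should be close to 1/C = 0.5
```

With 200,000 samples the empirical mean lands within a few thousandths of 0.5, which is a quick way to catch exactly the kind of sign error that prompted this post.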

Random directions with cosine distribution

I messed up this formula in my course this week.  I promised to fix it here.  Here it is!

Suppose you want to emit a ray from a Lambertian (ideal diffuse) surface.  The probability density of a direction is proportional to the cosine of the angle with the normal.  If we use a spherical coordinate system (theta, phi), where phi is the angle "around" the z-axis and theta is the angle "down from" the z-axis, we can see from symmetry that all phi are equally likely, so

phi = 2*PI*random(0,1)

Now theta cannot just be treated as a 1D random variable with density proportional to the cosine of theta.  The "differential measure" on the sphere is sin(theta)*dtheta*dphi.  The sin(theta) goes along with the theta term, so the pdf of theta as a 1D density function is proportional to sin(theta)*cos(theta):

p(t) = k*sin(t)*cos(t)    // t is theta

We know INT p(t) dt = 1, with t ranging from 0 to PI/2 (hemisphere).  We will have to compute the cumulative distribution function and can then normalize.  The key bit is this: if we call r = random(0,1), we can get a Theta by solving:

r = INT_0^Theta p(t) dt

The basic idea behind that is that if r = 0.7 for example, we want 70% of the "mass" of the density function to be below the Theta we choose.

r = INT_0^Theta k*sin(t)*cos(t) dt

r = 0.5*k*sin^2(Theta)

Since the integral over the whole hemisphere must be 1, we need 0.5*k*sin^2(PI/2) = 1, so k = 2 and the normalization falls out easily:

sin^2(Theta) = r

In spherical coordinates we have z = cos(Theta), so with r1 = random(0,1) for phi and r2 = random(0,1) for theta, a unit vector with cosine density has the Cartesian coordinates:

x = cos(2*PI*r1)*sqrt(r2)
y = sin(2*PI*r1)*sqrt(r2)
z = sqrt(1-r2)
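
Here is a small Python sketch of my own putting those three formulas together (the function name and the statistical check are mine).  For a cosine-weighted hemisphere, E[cos(theta)] works out to 2/3, which gives a quick numerical check; the vector should also always have unit length:

```python
import math
import random

def cosine_sample_hemisphere(rng=random.random):
    """Unit vector about +z with pdf proportional to cos(theta)."""
    r1, r2 = rng(), rng()
    phi = 2.0 * math.pi * r1
    sin_theta = math.sqrt(r2)       # sin(Theta) = sqrt(r2)
    x = math.cos(phi) * sin_theta
    y = math.sin(phi) * sin_theta
    z = math.sqrt(1.0 - r2)         # cos(Theta) = sqrt(1 - r2)
    return (x, y, z)

random.seed(7)
n = 200_000
mean_z = sum(cosine_sample_hemisphere()[2] for _ in range(n)) / n
v = cosine_sample_hemisphere()
length = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
print(mean_z)   # should be close to E[cos(theta)] = 2/3
print(length)   # should be 1.0
```

Using the same r1 for both the cosine and sine of phi (rather than two different random numbers) is exactly the kind of detail that is easy to mess up at the whiteboard.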

Saturday, July 6, 2013

Intro to physically based rendering course notes.

I have finally put my SIGGRAPH Asia 2012 course notes online.  I may give this course again in Hong Kong, and it's for raw beginners.  If anyone has suggestions for changes or additions to the notes, let me know.

Sunday, June 9, 2013

Andrew Glassner's new project

I have been following developments in online education and the exponential increase in college costs with interest.  I recently became frustrated with prototyping image processing codes and asked people (with little hope!) whether there was a decent environment to do that in (really, do I need 300 ugly lines for "hello world"?).  This led me to "processing", which is super cool (for a full image blur program look at this example).

But this also led me to a new online course by Andrew Glassner.  It cuts intro programming right down to what is fun and cool about programming, without all the yuck present in modern programming environments (yes powerful, but YUCK!  Do you really want a 15 year old to see that?).

I looked through Andrew's course and it looks to me like a nice combination of online course and medieval apprentice system.   I wish I could have taken this course when I was starting!

Tuesday, March 5, 2013

Aaron Lefohn's and my bet

Seven years ago Aaron Lefohn and I made a bet.  He crushed me.  Ironically, the French Laundry is probably not what it used to be, but we'll find the 2013 analog!  Here's the email snippet:

> (Aaron) My guess is that a hybrid rasterization/ray-tracing pipeline will end up
> winning.

(Pete) I bet you a dinner at the French Laundry that in 2013, it will be rt and
not hybrid.  What do you say?  My main argument is kiss.  On the other
hand, rasterization works really well for procedural geometry so your idea
could definitely win.

Monday, October 15, 2012

Gortler's graphics book is out

Steve (Shlomo) Gortler's intro graphics text is out.  It's a sweet book, and is nicely priced.  Shlomo was nice enough to sit down and take me through his favorite part: handling points, vectors, and transforms.  One of the things that is a pain in the neck in graphics is managing coordinate systems.  Most books assume a canonical coordinate system and then introduce some PHIGS-like hierarchy.  This book makes coordinate systems a first-class citizen, while keeping the math elegant, as those of you who've read Shlomo's papers know is his habit.  I think he may have found the sweet spot of how to handle this, and he says teaching from it has gone well.  I would love to hear others' experience and am anxious to give it a try in code (the code he showed me makes me think it may be really clean).

My favorite part is surely his color chapter.  I've often tried to explain why the weird XYZ space is the way it is, and what the relationship is between the weighting functions (like x(lambda)) and the lights (nonphysical for XYZ).  He's totally nailed it.  This treatment totally lives up to the "Foundations" part of the title.  I have not yet read the rest of the book, but this chapter alone was worth buying it in my opinion.