Tuesday, February 23, 2010

The VFX Predictinator, Part 3

Just joining us? Please read "The VFX Predictinator Part 1" and "Part 2"

Let's summarize the criteria behind our formula, which correctly predicts the winner of the Academy Award for visual effects:
  • Critical Acclaim
  • Domestic Box Office
  • Academy Nominations
  • Month of Release
  • Sequel Score
  • Previous Sequel Was Oscar Winner
  • Primary VFX Are Creatures
  • Facial Animation Acting
  • Lead Actor Prestige

And here are the numerical guts of the formula:

Final VFX Predictinator Score =
  • ((RT Score / sum of all nominees' RT scores) × 5)^2
  • + domestic box office (millions) / total box office of all nominees (millions)
  • + Academy nominations × 0.25 (counted only if the film has 4 or more)
  • + ((month of release / sum of all nominees' release months) × 2.5)^2, capped at 1
  • - 0.5 if the film is a sequel
  • - 1 if a prior film in the series won the VFX Oscar
  • + 1 if the primary effects are organic creatures
  • + 0.75 if those creatures include facial acting
  • + 1 if the lead actor is an Academy Award winner

Huh? You might be thinking that's an awfully complicated way to manipulate the data. Yes, it uses odd comparative values that get multiplied by random numbers and then squared. Yes, we've assigned arbitrary values to various criteria. And yes, we don't write formulas for a living, which probably explains a lot. Needlessly complex? Probably. 100% accurate? Definitely.
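For those who'd rather read code than algebra, here's a minimal sketch of the formula in Python. The function and argument names are our own inventions, and we're assuming "Total Month of Release" means the sum of all nominees' release months (mirroring how the RT and box office terms work); the weights themselves come straight from the formula above.

# A minimal sketch of the Predictinator. Names are ours; weights are
# the formula's. We assume "Total Month of Release" is the sum of all
# nominees' release months, mirroring the RT and box office terms.
def predictinator_score(
        rt_score,             # film's Tomatometer score (e.g. 92)
        rt_total,             # sum of all nominees' Tomatometer scores
        box_office,           # domestic box office, in millions
        box_office_total,     # sum of all nominees' box office, in millions
        academy_noms,         # film's total Academy Award nominations
        release_month,        # month of release, 1 through 12
        release_month_total,  # sum of all nominees' release months
        is_sequel=False,
        prior_sequel_won=False,   # a prior film in the series won the VFX Oscar
        creature_fx=False,        # primary VFX are organic creatures
        facial_acting=False,      # those creatures include facial acting
        lead_actor_oscar=False):  # lead actor already has an acting Oscar
    score = ((rt_score / rt_total) * 5) ** 2
    score += box_office / box_office_total
    if academy_noms >= 4:  # nominations only count at 4 or more
        score += academy_noms * 0.25
    # The asterisked footnote: the month term is capped at 1.
    score += min(((release_month / release_month_total) * 2.5) ** 2, 1)
    score += -0.5 if is_sequel else 0
    score += -1 if prior_sequel_won else 0
    score += 1 if creature_fx else 0
    score += 0.75 if facial_acting else 0
    score += 1 if lead_actor_oscar else 0
    return score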

The Data
Before I list all 20 years' worth of nominees and their scores, proving that the formula works, I need to remind myself of something my wife told me when she first saw Part 2 of my analysis from 2008. I believe she said, "That's a gigantic, boring mess of charts. I don't think anyone is actually looking at all of those. Don't do that next time." She was right. If you're interested, click on the images below, which show our data for all 20 years of analysis. Below the images, we'll highlight a few individual competitions and see how the formula really works.


Now that you've seen all the data, let's take a closer look at some case studies.

Case Study: Blowout Years

Let's look at a couple of instances where the Oscar-winning film's Predictinator score was far above its competition. One blowout year was 2001, when "Fellowship of the Ring" won the Oscar with a Predictinator score of 11.6, while its competitors "A.I." and "Pearl Harbor" earned 4.2 and 2.0, respectively. Tipping the scales for "Fellowship" were its staggering number of additional Oscar nominations, its month of release, and its very high Tomatometer rating (92% vs. "Pearl's" 25%). Similarly, in 1997, "Lost World" and "Starship Troopers" (with scores of 1.6 and 4.4, respectively) didn't have a chance against the winner, "Titanic" (9.7). The James Cameron film had a lot going for it: box office, acclaim, 14 Oscar nominations, and a late release date. And in 1991, Cameron's "Terminator 2" won with a score of 10.0, while its competition, "Hook" (3.9) and "Backdraft" (3.7), didn't stand a chance. "T2" had enormous box office, acclaim, character animation, and additional Oscar nominations, and won even though it was a sequel.

Case Study: Close Calls
What about the years in which the Predictinator scores were neck and neck? One narrow margin of victory came in 1995, when "Babe" (11.1) edged out "Apollo 13" (10.5). The films were fairly evenly matched across the criteria, with similar critical acclaim and release months, but "Apollo" had stronger box office and a lead actor who had already won an acting Oscar. What pushed the family-friendly kid movie to victory was the fact that its primary effects were character-based and included extensive facial animation.

A three-way close call took place in 1999, when all three nominees posted strong numbers. "Stuart Little" and "Star Wars Episode I" fared better than "The Matrix" on character animation points, and "The Matrix" also had to contend with a March release, which tanked its Month of Release score. However, "The Matrix" eked out the win with a stronger Tomatometer score and more additional Academy Award nominations than its competitors.
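To see just how punishing that March release was, you can work the month term by hand. Assuming "Total Month of Release" is the sum of the nominees' release months (March = 3 for "The Matrix," May = 5 for "Episode I," December = 12 for "Stuart Little," for a total of 20):

"The Matrix": ((3 / 20) × 2.5)^2 = 0.375^2 ≈ 0.14
"Stuart Little": ((12 / 20) × 2.5)^2 = 1.5^2 = 2.25, capped at 1

"Stuart Little" banked the full point while "The Matrix" scraped together about 0.14, a gap its Tomatometer and nomination advantages had to erase.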

And what about possibly the most controversial year of all, when "Golden Compass" (4.98) toppled "Transformers" (4.91)? As you can probably guess, this was one of the closest battles in the 20 years of analysis. While "Transformers" had the edge in acclaim and box office, "Compass" had the slight advantage of a later release date and the significant advantage that its primary effects were character-based and featured facial animation. (Or it could just be that some members of the Academy have 'issues' with Michael Bay.)

Case Study: Tough Years
There were some 'tough' years, too. By 'tough,' I mean years where the Predictinator was put through the wringer, testing its validity. We focused our attention carefully on these years, tweaking the formula until it picked the correct winner.

1996 proved quite difficult when it came to landing "Independence Day" on top. While the Roland Emmerich film had strong box office helping its numbers, "Dragonheart" was hot on its tail with character animation and CG facial animation, two strong criteria for which "ID4" earned zero points. And in 2006, "Superman Returns" had extremely strong numbers (a strong Tomatometer score and solid box office). Its primary competition was "Pirates 2," which rendered weak scores on the Tomatometer but was narrowly buoyed by character animation and facial character animation. 2006 illustrated the power of the critical acclaim criterion within the formula, and how it nearly allowed "Superman Returns" to topple "Pirates 2." And, as previously mentioned in Part 2, it was difficult to 'get' "What Dreams May Come" to win in 1998, and "Death Becomes Her" to win in 1992.

The Predictinator nearly let "Superman Returns" win the Oscar over "Pirates 2."

What It All Means
This is how it all boils down: the VFX Predictinator will guarantee an Academy Award win if the film is critically acclaimed, is a huge box office hit, grabs several additional Oscar nominations, and is released late in the calendar year. It can't be a sequel and it must contain lots of organic character work that includes facial performance. Oh, and the lead should probably have an acting Oscar under his belt. Easy, right?
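For fun, here's what that sure-thing profile looks like when run through the Python sketch from earlier. Every input number below is invented to fit the description; these are not real figures for any film.

# A hypothetical shoo-in: acclaimed, huge hit, plenty of nominations,
# December release, not a sequel, creature-heavy with facial acting,
# and an Oscar-winning lead. All numbers are made up for illustration.
dream_film = predictinator_score(
    rt_score=95, rt_total=180,          # towers over the other nominees
    box_office=500, box_office_total=700,
    academy_noms=10,
    release_month=12, release_month_total=20,
    creature_fx=True, facial_acting=True,
    lead_actor_oscar=True)
print(round(dream_film, 2))  # 13.93 -- deep in blowout territory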

This, however, is a fairly obvious conclusion. What makes the Predictinator interesting is not solely the criteria themselves, but the weight and value assigned to each criterion. The heavy weighting of performance-based organic characters, actor prestige, critical acclaim, and Oscar nominations matches the subjective values of the heavily actor-based Academy. It truly is an honor to be nominated by your peers in the visual effects community, but to win the Oscar, your film needs to be loved by actors who enjoy (and wish to reward) lively character-based visual effects.

Two questions now arise from this analysis. First, could this formula, or at least the ideas behind it, work for predicting other categories? Second, what are the scores of the 2009 Academy Award nominees for Achievement in Visual Effects, and which film does The VFX Predictinator predict will take home Oscar gold? We'll find out in Part 4. And here is Part 4.

Tuesday, February 02, 2010

And the Nominees Are...

Here are the nominees for Best Visual Effects for the 82nd Academy Awards:

“Avatar”
Joe Letteri, Stephen Rosenbaum, Richard Baneham and Andrew R. Jones

"District 9"
Dan Kaufman, Peter Muyzers, Robert Habros and Matt Aitken

“Star Trek”
Roger Guyett, Russell Earl, Paul Kavanaugh and Burt Dalton