
- Critical Acclaim
- Domestic Box Office
- Academy Nominations
- Month of Release
- Sequel Score
- Previous Sequel Was Oscar Winner
- Primary VFX Are Creatures
- Facial Animation Acting
- Lead Actor Prestige
And here are the numerical guts of the formula:
(((RT Score / Sum of all noms' RT Scores) X 5)^2) + (BO (millions) / BO Total of all noms) + (Academy Noms (only if 4 or more) X .25) + (((Month of Release / Total Month of Release) X 2.5)^2)* + (Sequel = -.5) + (Prior Sequel won Oscar = -1) + (Primary FX organic creatures = 1) + (Primary organic creatures include facial acting = .75) + (Lead Actor an Academy Award Winner = 1) = Final VFX Predictinator Score
*value has an upper limit of 1
"Huh?" you might be thinking. "That's an awfully complicated way to manipulate the data." Yes, it uses odd comparative values that are multiplied by random numbers and then squared. Yes, we've assigned arbitrary values to various criteria. And, yes, we don't write formulas for a living, which probably explains a lot. Needlessly complex? Probably. 100% accurate? Definitely.
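For anyone who wants to poke at the arithmetic themselves, here's a minimal sketch of the formula in Python. The `Nominee` fields and the `predictinator_score` name are our own labels for illustration, and we're reading "Total Month of Release" as the sum of all nominees' release months, the same comparative treatment the Rotten Tomatoes and box office terms get.

```python
from dataclasses import dataclass

@dataclass
class Nominee:
    title: str
    rt_score: float                    # Rotten Tomatoes score (0-100)
    box_office: float                  # domestic box office, in millions
    academy_noms: int                  # total Academy Award nominations
    release_month: int                 # 1 = January ... 12 = December
    is_sequel: bool
    prior_sequel_won_oscar: bool
    fx_are_organic_creatures: bool
    creatures_have_facial_acting: bool
    lead_actor_is_oscar_winner: bool

def predictinator_score(film: Nominee, field: list) -> float:
    """Score one nominee against the full field of that year's VFX nominees."""
    total_rt = sum(n.rt_score for n in field)
    total_bo = sum(n.box_office for n in field)
    total_month = sum(n.release_month for n in field)

    score = ((film.rt_score / total_rt) * 5) ** 2        # critical acclaim, comparative and squared
    score += film.box_office / total_bo                  # share of the field's box office
    if film.academy_noms >= 4:                           # nomination bonus only kicks in at 4 or more
        score += film.academy_noms * 0.25
    score += min(((film.release_month / total_month) * 2.5) ** 2, 1)  # month of release, capped at 1
    if film.is_sequel:
        score -= 0.5
    if film.prior_sequel_won_oscar:
        score -= 1
    if film.fx_are_organic_creatures:
        score += 1
    if film.creatures_have_facial_acting:
        score += 0.75
    if film.lead_actor_is_oscar_winner:
        score += 1
    return score
```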
The Data
Before I list all 20 years' worth of nominees and their scores, proving that the formula works, I need to remind myself of something my wife told me when she originally saw Part 2 of my analysis from 2008. I believe she said, "That's a gigantic, boring mess of charts. I don't think anyone is actually looking at all those. Don't do that next time." She was right. If you're interested, click on the images below, which show our data for 20 years of analysis. Below the images, we'll highlight a few individual competitions and see how the formula really works.


Now that you've seen all the data, let's take a closer look at some case studies.
Case Study: Blowout Years


What about the years in which the Predictinator scores were neck-and-neck? A narrow margin of victory came in 1995, when "Babe" (11.1) edged out "Apollo 13" (10.5). The films were fairly evenly matched across the criteria, with similar critical acclaim and release months, but "Apollo" had the stronger box office and a lead actor who had already won an acting Oscar. What pushed the family-friendly kid movie to victory was that its primary effects were character-based and included extensive facial animation.
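To see how those character bonuses can tip a close race, here's a quick, purely hypothetical comparison reusing the Python sketch above. Both films and every number below are made up for illustration (they are not our actual 1995 worksheet values); the point is simply that the +1 creature and +0.75 facial-animation bonuses can outweigh a box office and actor-prestige advantage.

```python
# Hypothetical round numbers only -- not the actual 1995 inputs -- reusing the
# Nominee class and predictinator_score() from the sketch above.
creature_film = Nominee(
    title="Creature-driven family film", rt_score=90, box_office=100, academy_noms=4,
    release_month=7, is_sequel=False, prior_sequel_won_oscar=False,
    fx_are_organic_creatures=True, creatures_have_facial_acting=True,
    lead_actor_is_oscar_winner=False,
)
fx_drama = Nominee(
    title="Effects-driven drama", rt_score=90, box_office=200, academy_noms=6,
    release_month=7, is_sequel=False, prior_sequel_won_oscar=False,
    fx_are_organic_creatures=False, creatures_have_facial_acting=False,
    lead_actor_is_oscar_winner=True,
)
field = [creature_film, fx_drama]
for film in field:
    print(f"{film.title}: {predictinator_score(film, field):.2f}")
# The drama has double the box office, more nominations, and an Oscar-winning lead,
# yet the +1 creature and +0.75 facial-animation bonuses still put the creature film on top.
```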


Case Study: Tough Years
There were some 'tough' years, too. By 'tough,' I mean years where the Predictinator was put through the wringer, testing its validity. We focused our attention on these 'tough' years carefully, tweaking the formula until it called the correct winner.
In 1996, it proved quite difficult to land "Independence Day" on top. While the Roland Emmerich film had a strong box office helping its numbers, "Dragonheart" was hot on its heels with character animation and CG facial animation, two strong criteria in which "ID4" scored zero points. And in 2006, "Superman Returns" posted extremely strong numbers, with a high Tomatometer score and solid box office. Its primary competition was "Pirates 2," which rendered weak scores on the Tomatometer but was narrowly buoyed by character animation and facial character animation. 2006 illustrated the power of the critical acclaim criterion within the formula, and how it nearly allowed "Superman Returns" to topple "Pirates 2." And, as previously mentioned in Part 2, it was difficult to 'get' "What Dreams May Come" to win in 1998, and "Death Becomes Her" to win in 1992.
What It All Means
This is how it all boils down: the VFX Predictinator will guarantee an Academy Award win if the film is critically acclaimed, is a huge box office hit, grabs several additional Oscar nominations, and is released late in the calendar year. It can't be a sequel and it must contain lots of organic character work that includes facial performance. Oh, and the lead should probably have an acting Oscar under his belt. Easy, right?
This, however, is a fairly obvious conclusion. What makes the Predictinator interesting is not the criteria themselves so much as the weight and value assigned to each criterion. The heavy weighting of performance-based organic characters, actor prestige, critical acclaim, and Oscar nominations matches the subjective values of the heavily actor-based Academy. It truly is an honor to be nominated by your peers in the visual effects community, but to win the Oscar, your film needs to be loved by actors who enjoy (and wish to reward) lively character-based visual effects.
Two questions now arise from this analysis. First, could this formula, or at least the ideas behind it, work for predicting other categories? Second, what are the scores of the 2009 Academy Award nominees for Achievement in Visual Effects, and which film does The VFX Predictinator predict will take home Oscar gold? We'll find out in Part 4. And here is Part 4.