The DGA predicts Best Director 88% of the time. PGA matches Best Picture 80% since 2009. SAG Supporting categories hit 93-95%.
Those numbers are useful. But seeing every guild winner mapped next to every Oscar winner, year by year, tells you something the percentages alone can't. It shows you the pattern breaks. The years where three guilds aligned and still lost. The years where a single guild dissent predicted the upset.
The Precursors Tracker puts all of that data in one table.
What the Precursors Tracker Does
It's a historical comparison tool. Pick a category and you see two columns: guild winner on the left, Oscar winner on the right, going back decades. Every match is visible. Every mismatch jumps out.
This isn't a prediction engine. It's the raw evidence that prediction engines are built on. When someone says "DGA is 88% accurate," this is where that number comes from. You can count the matches yourself.
Key Features
Side-by-Side Historical Comparison
The core of the tool is a simple, readable table. Guild winners on one side. Oscar winners on the other. Years running down the left column.
For Best Director:
- 2024: DGA Christopher Nolan, Oscar Christopher Nolan. Match.
- 2023: DGA Daniel Kwan & Daniel Scheinert, Oscar Daniel Kwan & Daniel Scheinert. Match.
- 2022: DGA Jane Campion, Oscar Jane Campion. Match.
- 2021: DGA Chloé Zhao, Oscar Chloé Zhao. Match.
- 2020: DGA Sam Mendes, Oscar Bong Joon-ho. Mismatch.
That 2020 row is the kind of thing you notice instantly in a table but might miss in a statistics summary. The DGA backed Sam Mendes for 1917, but the Academy went with Bong Joon-ho, whose Parasite went on to sweep Best Picture.
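If you wanted to reproduce this kind of match/mismatch view from your own data, a minimal Python sketch might look like this. The row layout and field names (`year`, `dga`, `oscar`) are illustrative assumptions, not the tracker's actual schema; the sample rows use the real DGA and Oscar results for those years.

```python
# Sample Best Director rows (real results; field names are assumptions).
ROWS = [
    {"year": 2024, "dga": "Christopher Nolan", "oscar": "Christopher Nolan"},
    {"year": 2023, "dga": "Daniel Kwan & Daniel Scheinert",
     "oscar": "Daniel Kwan & Daniel Scheinert"},
    {"year": 2022, "dga": "Jane Campion", "oscar": "Jane Campion"},
    {"year": 2020, "dga": "Sam Mendes", "oscar": "Bong Joon-ho"},
]

def tabulate(rows):
    """Return (year, dga, oscar, 'Match'/'Mismatch') tuples, newest first."""
    return [
        (r["year"], r["dga"], r["oscar"],
         "Match" if r["dga"] == r["oscar"] else "Mismatch")
        for r in sorted(rows, key=lambda r: r["year"], reverse=True)
    ]

for year, dga, oscar, verdict in tabulate(ROWS):
    print(f"{year}  DGA: {dga:32}  Oscar: {oscar:32}  {verdict}")
```

The single equality check per row is the whole trick: once names are normalized, every mismatch is just a row where the two columns differ.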
Multi-Guild Coverage
The tracker covers the major guilds:
- DGA (Directors Guild of America) for Best Director
- PGA (Producers Guild of America) for Best Picture
- SAG (Screen Actors Guild) for all four acting categories
- BAFTA (British Academy) for cross-reference
- Critics Choice for additional context
Each guild gets its own column so you can see not just whether the Oscar matched one guild, but how many guilds agreed. When PGA, DGA, and SAG all pick the same film and the Oscar matches, that's the Triple Crown effect at work. When they split and the Oscar picks the odd one out, that's a genuine upset worth studying.
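The per-year agreement check can be sketched in a few lines. This assumes a simple dict of guild picks for one year; the guild names come from the list above, but the film titles are placeholders, not real results.

```python
from collections import Counter

# Hypothetical picks for one year (titles are placeholders).
picks = {"PGA": "Film A", "DGA": "Film A", "SAG Ensemble": "Film A",
         "BAFTA": "Film B"}

def consensus(picks):
    """Return (most-picked film, number of guilds agreeing on it)."""
    top, count = Counter(picks.values()).most_common(1)[0]
    return top, count

def triple_crown(picks, oscar_winner):
    """True when PGA, DGA, and SAG Ensemble all match the Oscar winner."""
    return all(picks.get(g) == oscar_winner
               for g in ("PGA", "DGA", "SAG Ensemble"))

top, n = consensus(picks)                # 'Film A', 3
aligned = triple_crown(picks, "Film A")  # True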
Pattern Recognition at a Glance
Numbers like "88% accuracy" are averages across 77 years. But accuracy isn't evenly distributed. The DGA went 12 for 14 from 2011 to 2024, missing only in 2013 and 2020. Before that, it went 17 for 20 from 1991 to 2010. The tool lets you see these streaks and their breaks without doing the math yourself.
You can also spot era effects. SAG Ensemble has been a poor Best Picture predictor overall (41%), but there are five-year stretches where it ran hot. The table shows you those runs in context.
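Streaks and era effects like these are easy to compute once a guild's history is reduced to a list of booleans (matched the Oscar or not). A minimal sketch; the `history` values below are placeholders, not real DGA or SAG data.

```python
# Hypothetical match/miss history, oldest first: True = guild matched Oscar.
history = [True, True, False, True, True, True, False, True, True, True]

def longest_streak(history):
    """Length of the longest unbroken run of matches."""
    best = run = 0
    for matched in history:
        run = run + 1 if matched else 0
        best = max(best, run)
    return best

def rolling_accuracy(history, window=5):
    """Hit rate over each sliding window of `window` years."""
    return [sum(history[i:i + window]) / window
            for i in range(len(history) - window + 1)]

longest_streak(history)    # 3
rolling_accuracy(history)  # e.g. [0.8, 0.8, 0.6, ...]
```

The rolling window is what exposes "ran hot" stretches that a single all-time percentage averages away.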
How to Use It
Step 1: Pick Your Category
Start with the category you care most about. If you're researching Best Picture, look at the PGA column. For Best Director, start with DGA. For acting races, SAG is your primary reference.
Step 2: Study the Mismatches
The matches are reassuring but boring. The mismatches are where the insight lives. When DGA and Oscar diverge, ask: What was different about that year? Was there a controversy? A preferential ballot surprise? A late-breaking narrative shift?
For example, in 2020, Sam Mendes won the DGA for 1917 but Bong Joon-ho won the Oscar for Parasite. That mismatch happened because Parasite was making history as the first non-English language Best Picture winner. The Academy was willing to break from the guild consensus for a once-in-a-generation moment.
Understanding why guilds and Oscars diverge tells you what conditions might cause a divergence in 2026.
Step 3: Cross-Reference Multiple Guilds
The strongest signal isn't one guild matching. It's all of them agreeing. Scroll across the columns and look for years where every guild picked the same winner and the Oscar agreed. Then look for years where the guilds split. Those split years are the closest historical analogues to close races in 2026.
Real Example
You're trying to predict Best Picture 2026. One Battle After Another is the frontrunner at 65%. But you want to know: how often does the frontrunner actually hold when guilds split?
Open the Precursors Tracker. Look at PGA vs. Oscar for Best Picture. Find the years where the PGA winner didn't match the Oscar (2016, 2017, 2020). In each case, the PGA winner was a frontrunner going into the ceremony. The Big Short, La La Land, 1917. All were beaten by films that had different guild support or late momentum.
Now check whether those upset winners had SAG Ensemble support. Spotlight did. Parasite did. Moonlight didn't; SAG Ensemble went to Hidden Figures that year. That gives you a rule of thumb: when PGA and SAG Ensemble disagree, the PGA pick is vulnerable, and the SAG pick often has a real shot.
That kind of cross-guild pattern analysis takes minutes in the tracker. It takes hours with scattered Wikipedia pages and award databases.
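The same query can be scripted once the table is in hand. A sketch under the assumption that each year is one record; the field names are made up for illustration, but the winners are the real guild and Oscar results for those three years.

```python
# One record per year (real winners; record layout is an assumption).
records = [
    {"year": 2016, "pga": "The Big Short", "sag_ensemble": "Spotlight",
     "oscar": "Spotlight"},
    {"year": 2017, "pga": "La La Land", "sag_ensemble": "Hidden Figures",
     "oscar": "Moonlight"},
    {"year": 2020, "pga": "1917", "sag_ensemble": "Parasite",
     "oscar": "Parasite"},
]

def pga_upsets(records):
    """Years where the PGA winner lost the Best Picture Oscar."""
    return [r for r in records if r["pga"] != r["oscar"]]

def sag_backed_upsets(records):
    """Among PGA upsets, those where the Oscar winner had SAG Ensemble."""
    return [r for r in pga_upsets(records)
            if r["sag_ensemble"] == r["oscar"]]

upset_years = [r["year"] for r in pga_upsets(records)]       # [2016, 2017, 2020]
sag_years = [r["year"] for r in sag_backed_upsets(records)]  # [2016, 2020]
```

Two filters over the same records: the first finds the upsets, the second tests the SAG Ensemble rule of thumb against them.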
Guest vs. Free Access
Without an account, you can't access the Precursors Tracker. The historical tables require signing in.
With a free account, you get the complete table with all guilds, all categories, and all years of available data.
Related Tools
- Awards Calendar: Know exactly when each guild votes so you can watch the Precursors Tracker update with fresh 2026 results.
- Payout Simulator: Once you've identified the likely winners from guild data, use the Guilds strategy to auto-fill your picks.
Key Takeaways
- The tracker shows guild winners and Oscar outcomes side by side across decades of data
- Mismatches are more valuable than matches. They reveal the conditions that cause upsets.
- Cross-referencing multiple guilds in the same year shows when consensus is strong and when it's fragile
- The Triple Crown effect (PGA + DGA + SAG alignment) has a 95%+ Oscar hit rate historically
- The 2026 mandatory viewing rule could introduce new divergence patterns. This is the year to watch the data closely.