Don't put too much faith in this. Their methodology is to present the top 100 rated kit wines that were submitted to their annual winemakermag competition. It's good info for that, but to me that's it. Here are the limitations I see, in no particular order:
1. You don't know when each submitted kit was made — it could have been aged three years or three days.
2. It's based on what's entered into the competition by people like us. It is not Consumer Reports-style independent testing: they don't buy kits, make them per the instructions, and then taste. That would be a great comparison, but it's impractical.
3. The winemakers could make any tweaks they wanted, good or bad.
It's a great title, but misleading. If one manufacturer's kits are entered more often than another's, that manufacturer will likely have more on the list — it doesn't mean their kits are better. These aren't the best kits manufactured in 2016, and you can't really use the list for much without specific age and tweak information. They use the UC Davis scoring system, and it would be nice if they posted each score. But then again, we don't know enough about how each wine was made or aged for the scores to be useful.