There are always going to be certain things that aren't measurable. Just as a unified theory of the universe continues to elude humankind's greatest minds, a unified theory of basketball is likely to elude the greatest basketball minds. There will always be room for "the eye test", but we have to keep in mind that it is fallible. Sometimes the gut is wrong, and it takes hard statistical proof to get the gut to change its mind.
The crazy part is that EPV is just scratching the surface of what can be done with these data sets. Over time we'll be able to have a better understanding of the impact of the little things.
The eye test can tell us that Tim Duncan adds a lot of value because he does the little things, but with a data set this huge, it will be possible to assign a value that can give us a comparison to other players in the league. Is Tim Duncan better at setting screens than Tyson Chandler or DMC? How much better?
You can answer the first question with the eye test, but since it's subjective, anyone else can disagree with your perspective and both views are equally valid. However, if a large part of the population feels that one player in particular is better at setting screens than another, a consensus begins to form, and it's mostly acceptable to play the majority-rules card. (After all, reality is pretty much just the consensus of people's perspectives.) The second question cannot be answered without a data set like the one Synergy Sports is compiling. It will be able to tell us (and I'm making this up) that, all things being equal, a play where Duncan sets a screen has historically averaged 1.09 PPP, Chandler 1.02 PPP, and DMC 1.11 PPP.
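The mechanics of that kind of comparison are simple once the plays are logged. Here's a minimal sketch of computing points per possession (PPP) by screener; every play and number below is invented for illustration, not real Synergy data:

```python
# Hypothetical example: PPP grouped by screener.
# The plays and point totals here are made up for illustration only.
from collections import defaultdict

# Each record: (screener on the play, points scored on that possession)
plays = [
    ("Duncan", 2), ("Duncan", 0), ("Duncan", 3),
    ("Chandler", 2), ("Chandler", 0),
    ("Cousins", 3), ("Cousins", 0), ("Cousins", 2),
]

totals = defaultdict(lambda: [0, 0])  # screener -> [points, possessions]
for screener, points in plays:
    totals[screener][0] += points
    totals[screener][1] += 1

ppp = {name: pts / n for name, (pts, n) in totals.items()}
for name, value in sorted(ppp.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value:.2f} PPP")
```

The hard part isn't the arithmetic, it's reliably tagging millions of possessions with who set the screen in the first place.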
Right now this very micro analysis is limited by resources (essentially time, programming ability, and computer processing power), but those limits will diminish over time. Eventually we will have comparable data on things like screens, but we'll also be able to answer questions like: is Amir Johnson a better screener when he sets a screen for the player to go left, or to go right? Maybe we'll find things out that we have never thought of before. Maybe what your center does with his hands in the paint affects the chances that an opposing player will hit a corner three. Maybe we'll find that it is statistically better NOT to leave your feet when the opposing player shoots... ever. Or maybe it's good to jump at the rim, but NOT good to jump outside the paint (or six feet out, or whatever).
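The left-vs-right screen question is just a finer-grained version of the same grouping: tag each possession with the screen direction, then average within each group. A hypothetical sketch with invented numbers:

```python
# Hypothetical: one screener's possessions split by screen direction.
# All values are invented for illustration only.
plays = [
    ("left", 2), ("left", 0), ("left", 2), ("left", 3),
    ("right", 0), ("right", 2), ("right", 0),
]

by_direction = {}  # direction -> (total points, possessions)
for direction, points in plays:
    pts, n = by_direction.get(direction, (0, 0))
    by_direction[direction] = (pts + points, n + 1)

for direction, (pts, n) in by_direction.items():
    print(f"{direction}: {pts / n:.2f} PPP over {n} possessions")
```

With real tracking data the limiting factor is sample size: slice the data finely enough and each bucket gets too small to trust.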
The limit of the eye test is mostly time. It's virtually impossible to watch every minute of every basketball game. It's also virtually impossible to process the actions of ten individuals simultaneously. In order for any one person to achieve the same knowledge as Synergy's database, they'd have to watch every game, multiple times... AND have perfect memory in order to compare game over game. There's still a benefit to watching game tape (and there ALWAYS will be), but Synergy allows for types of analysis that just can't be done by a single human brain in any reliable way. Also, the eye test always works better for outsiders. It's easy for us to see that Rudy Gay shouldn't have been taking contested jumpers or holding the ball for so long, but a tool that helps quantify the opportunity cost of those actions could be the difference between a player understanding and changing his habits (or, if he doesn't or won't, recognizing it early and shipping him out).
It still takes the human brain to work out WHAT to analyze. I'm curious as to what effect covering floor space has on the game. I'm curious about a lot of defensive positioning and defensive choices. Last year the Sloan Sports Conference put out a stat about how big men affect shooters when they are within 5 feet. It wasn't a perfect stat, but it did help quantify the proficiency of post defenders in a way that was impossible before, except in a mostly subjective way. As the tools to analyze the data are refined, the analysis (or stats) that comes out will be MORE reflective of what's actually taking place on the court (aka accurate!).
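The core of that "within 5 feet" idea can be sketched from tracking-style coordinates: measure the distance from shooter to nearest big man, then compare FG% on the shots where that distance is small. Everything below is invented data, and the actual Sloan study's methodology is more involved than this:

```python
# Hypothetical sketch: opponent FG% when a big man is within 5 feet
# of the shooter. Positions are made-up (x, y) coordinates in feet.
import math

shots = [
    {"shooter": (4.0, 3.0), "big": (5.0, 4.0), "made": False},
    {"shooter": (6.0, 2.0), "big": (14.0, 8.0), "made": True},
    {"shooter": (3.0, 5.0), "big": (4.5, 5.5), "made": True},
    {"shooter": (8.0, 6.0), "big": (9.0, 6.5), "made": False},
]

def dist(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

contested = [s for s in shots if dist(s["shooter"], s["big"]) <= 5.0]
fg_pct = sum(s["made"] for s in contested) / len(contested)
print(f"FG% with a big within 5 ft: {fg_pct:.0%}")
```

Even this toy version shows why the real stat was imperfect: raw distance ignores whether the defender is actually between the shooter and the rim, or just standing nearby.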
I for one am super pumped about this, and hope that the NBA continues to foster this type of analysis. In time even regular Joes (or Jills) will be able to afford a subscription to the data (I think RR already has one), AND afford the type of computers it takes to crunch the numbers (probably about ten years until the kind of computer Harvard is using is available in a laptop... just guessing). Once that happens, there will be a considerable number of people with the energy and ability to do fun things with the data, and much more will come out of it. Right now technological constraints limit the analysis to academics and NBA in-house guys.