Seasonal Fluctuations in Playing Time
There are several stats I check regularly as basic sanity checks on the data. Typically, I look at the average number of hours played and the gender distribution to make sure they are close to the numbers I've seen before. I had also found previously that age doesn't correlate with the number of hours played each week, so I was terribly surprised when I checked this in the most recent data set and found a significant correlation (r = -.13, p < .001). When I plotted it out, the correlation indeed looked strong.
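For anyone curious, the check itself is simple to script. Here's a minimal sketch in Python; the survey data obviously aren't included here, so the parallel lists of ages and weekly hours below are hypothetical toy values for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical toy data: respondent age and hours played per week.
ages = [16, 18, 22, 27, 33, 41, 52]
hours = [30, 28, 25, 23, 24, 21, 19]
print(f"r = {pearson_r(ages, hours):.2f}")
```

With real survey data you'd also want the p-value, which a stats package (e.g. SciPy's `pearsonr`) reports alongside r.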
So I went back through the past survey phases one by one to see how long this had been the case. And then something more puzzling showed up: the correlation does not appear in the data set from the phase before (r = -.03, p = .21), or the one before that.
Over the past 6 years, I had come to expect relative stability from phase to phase, especially because the sample sizes tend to be large. And then I realized that this may be driven by teenagers and college students being out of school for the summer: I started this phase in early June, while the previous phase started in late March. I ran some numbers, and playing time overall was higher in the June sample than in the March sample (23.5 vs. 22.3 hours), and most of the difference came from the 22-and-under crowd (as the graphs show).
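The subgroup comparison can be sketched the same way: split respondents at an age cutoff and compare mean playing time in each group. The cutoff of 22, the record layout, and the numbers below are my own illustrative assumptions, not the actual survey data.

```python
def mean_hours_by_group(records, cutoff=22):
    """Split (age, hours) records at the cutoff and return the mean
    weekly hours for the younger and older groups."""
    young = [h for age, h in records if age <= cutoff]
    older = [h for age, h in records if age > cutoff]
    return sum(young) / len(young), sum(older) / len(older)

# Hypothetical June-phase records: (age, hours played per week).
june = [(16, 30), (19, 29), (22, 27), (25, 22), (34, 21), (48, 20)]
young_mean, older_mean = mean_hours_by_group(june)
print(f"22 and under: {young_mean:.1f}, over 22: {older_mean:.1f}")
```

Running the same split on the March-phase records would show whether the gap between the two groups shrinks when school is in session.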
I then went back to last year's data and found the same pattern: there was a correlation between age and playing time if a phase was run in the summer months, but not during the regular school year. So what I used to say about the correlation between age and hours played per week isn't entirely correct. Age is not correlated with hours played per week, except when school is out in the summer, in which case younger players do play significantly more than older players and a correlation between age and game-play appears.