PRORAGIS Statistics

July 1, 2012, Department, by Bill Beckner


An understanding of the data included in the PRORAGIS survey can help parks and recreation professionals understand its applicability to their organizations.

Mark Twain popularized the saying that there are three kinds of lies: “lies, damned lies, and statistics.” The difficulty of verifying a statistic-based statement has led to the satirical comment that “45 percent of people make up statistics on the spot” (you can insert any number you choose into that phrase). However, statistics, used and documented correctly, carry significant weight when cited in a discussion.

Recently, I received a question from a member asking how valid the statistics in PRORAGIS are, given the current low response rates. This is an insightful and valid question, because you may use a statistic from the Miscellaneous Benchmarking Ratios table below and have its validity questioned by an elected official. If you have no explanation of the data, it loses its value.

Looking at the bigger picture, PRORAGIS is gaining about 10 profiles per month. The current count of completed profiles is 250+ out of approximately 3,000 NRPA member departments, or about 8.3 percent of the population. Since this is not a random survey, we need profiles from at least 20 percent of the population to close in on a 90- or 95-percent confidence level. That means we actually need 600 completed surveys to achieve the minimum desired confidence.
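The coverage arithmetic above can be sketched in a few lines. The figures (3,000 member departments, 250 completed profiles, a 20 percent target) come from this article; note that the 20 percent threshold is the author's stated rule of thumb for a non-random survey, not a general statistical formula.

```python
# Sample-coverage arithmetic for the PRORAGIS profile counts cited above.
population = 3000        # approximate NRPA member departments
completed = 250          # completed PRORAGIS profiles so far
target_fraction = 0.20   # coverage the article says is needed

current_fraction = completed / population
needed = int(population * target_fraction)

print(f"Current coverage: {current_fraction:.1%}")  # 8.3%
print(f"Profiles needed:  {needed}")                # 600
print(f"Shortfall:        {needed - completed}")    # 350
```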

The example used here comes from a segment of the profiles. There were 45 profiles completed by agencies that defined themselves as special districts, out of a population of 450+. So this particular sample is 10 percent of the population, but still only half of the sample needed. Despite this paucity of information, some of the data points are much more reliable than others. The data points used include:

Average: The average is the sum of all responses divided by the number of respondents in the sample. The lower the rate of participation, the less reliable the average generally is, because a few high-end or low-end numbers can distort the result. Row 3 is a good example. The revenue per visitor reported shows a median of $9.00, a lower quartile of $2.00, and an upper quartile of $22.00 per visitor. Yet the average revenue per visitor is $31.00. This is well above the upper quartile and can only reflect one or more departments with much higher revenue collection numbers. Thus this number is not very useful as a measure, since it will very likely change significantly as more profiles are added.

Median: The median is the middle value of the ordered responses; half of the respondents fall below it and half above. As such, it changes every time a new profile is added. While it is generally more reliable than the average, it can be suspect if the range is too broad.

Lower and Upper Quartile: The quartiles provide a better and more reliable picture, even with fewer respondents. The lower quartile is the value below which the bottom 25 percent of respondents fall; the upper quartile is the value above which the top 25 percent fall. So, of all respondents, 25 percent are under the lower quartile, 25 percent are over the upper quartile, and 50 percent are in the middle.

In Row 1, Operating Expenditures per Capita, 25 percent of the respondents spend $67.00 or less per resident while another 25 percent of respondents spend $340.00 or more per resident. The fact that the average is close to the median generally indicates the range has few significantly higher or lower responses.

Conversely, in Row 2 we can see that the median and the quartiles are fairly balanced, while the average indicates that a number of agencies in the upper quartile have much more parkland per 1,000 residents.

Table 1. Miscellaneous Benchmarking Ratios

| Ratio                                    | Median | Average | Lower Quartile | Upper Quartile |
|------------------------------------------|--------|---------|----------------|----------------|
| Operating Expenditures per Capita        | $205   | $216    | $67            | $340           |
| Acreage of Parkland per 1,000 Population |        |         |                |                |
| Revenue per Visitor                      | $9     | $31     | $2             | $22            |

Actual Data  

Best of all is the actual data from peer agencies that you can access through the custom reports using the side-by-side reporting process. These are the actual numbers reported by each department and, as such, should be unassailable.

Table 2. Example Side-by-Side Report

| Miscellaneous Benchmarking Ratios | Special District 1 | Special District 2 | Special District 3 |
|-----------------------------------|--------------------|--------------------|--------------------|
| Operating Expenditures per Capita |                    |                    |                    |
Table 2 above shows the Operating Expenditures per Capita for three special district agencies. You can compare any or all of the data records in the profile for any specific agency. So, if you want hard, accurate data to help you make a case for your budget, argue for additional facilities, or assess staffing levels, encourage your peers to establish a profile. That way you can benchmark, manage, and plan using entirely accurate data even as PRORAGIS expands toward its highest level of confidence.