1) The School or the Family: Which is the cause? Does this survey reflect correlation without causality?
A— An early look at the data prompted the question: Are these life outcomes simply a reflection of the type of families that attend ACCS schools, rather than being cultivated by the school experience? Researchers at the University of Notre Dame used a statistical tool called “linear regression” to identify and factor out causes other than the school. This approach is not perfect, but it does eliminate most of the contribution from factors outside of the school. In this report, you will see two colors of bars: Red bars (or red-outlined bars) represent results that are statistically adjusted to isolate the school as the causal factor for the response. This is done by raising or lowering the reported percentage of respondents based on the regression coefficients for other factors such as family income, religion, and marital situation. Blue bars (or blue-outlined bars) represent the unadjusted data: the actual responses derived from the survey.
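To illustrate the idea, here is a minimal sketch of regression adjustment using entirely hypothetical data and variable names (this is not the Notre Dame model or its actual covariates): including both school attendance and a background factor in one regression yields a school coefficient with that background factor held constant.

```python
import random

random.seed(0)

# Hypothetical data: an outcome influenced by BOTH the school (true effect
# 0.10) and a family-background covariate (true effect 0.05).
n = 2000
income = [random.gauss(0, 1) for _ in range(n)]      # background covariate
school = [random.randint(0, 1) for _ in range(n)]    # 1 = attended the school
y = [0.10 * s + 0.05 * inc + random.gauss(0, 0.1)
     for s, inc in zip(school, income)]

# Ordinary least squares on predictors [1, school, income]:
# build the normal equations X'X b = X'y ...
rows = [[1.0, float(s), inc] for s, inc in zip(school, income)]
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(3)]

# ... and solve the 3x3 system by Gaussian elimination with pivoting.
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(XtX[r][col]))
    XtX[col], XtX[piv] = XtX[piv], XtX[col]
    Xty[col], Xty[piv] = Xty[piv], Xty[col]
    for r in range(col + 1, 3):
        f = XtX[r][col] / XtX[col][col]
        XtX[r] = [a - f * b for a, b in zip(XtX[r], XtX[col])]
        Xty[r] -= f * Xty[col]
beta = [0.0, 0.0, 0.0]
for r in (2, 1, 0):
    beta[r] = (Xty[r] - sum(XtX[r][c] * beta[c] for c in (1, 2) if c > r)) / XtX[r][r]

# The coefficient on `school` estimates the school effect with the
# covariate factored out; it recovers roughly the true 0.10.
school_effect = beta[1]
print(round(school_effect, 2))
```

The point of the sketch is that the school coefficient is estimated *net of* the covariate, which is the sense in which regression “factors out” causes other than the school.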
2) Is all of the data corrected to reflect only school effect?
A— No. When you see results reported with blue bars, you are seeing actual results. Red bars are adjusted to reflect only the school’s effect. We report blue when isolating the school’s impact would artificially change the data and create misconceptions. For example, when the correction is applied to the “earned a BA or higher in college” question, it may indicate that, say, 68% of ACCS respondents earned a BA, when the actual number was 89%. It would simply be inaccurate to report that only 68% earned a BA when 89% did. This type of question requires reporting actual data to be truthful.
3) The scale is not 0 to 100 on the charts. For example, a scale of 30% to 60% may be chosen instead of 0-100%. Doesn’t this overstate the differences?
A— The scales shown more accurately depict the differences in most cases. The view that a 0 to 100 scale is more scientific oversimplifies the nature of a survey of behaviors, attitudes, and life choices, especially given the “percent over the median” measure being reported. These types of scales are not linear. The closer a result gets to 100% or 0%, the larger the underlying difference required to move it, because each school segment is approximately one-sixth of the dataset that forms the median. In a normal curve with its peak at the median, the tails (near 0 or 100) are increasingly difficult to approach, creating something similar to a logarithmic or exponential-type scale. A result of 100% is not realistic, especially when the underlying data point is itself factored into the median.

A more realistic understanding of the data requires narrowing the scale to show relative differences between the segments. For example, when you notice a 5% difference between public and evangelical schools, but a 15% difference between ACCS and evangelical schools, the threefold difference is more meaningful than the absolute scores. If anything, because the scales are not linear, these charts may understate the actual differences.

Finally, the types of questions on this survey are not comparable to those in, for example, a traffic flow study or other mechanistic scientific analysis. The questions themselves may influence the curve, so relative answers are the best indicator of truth, and relative differences are more visible on a compressed scale. There could be situations where the answers are not that different in reality; the researchers at Notre Dame used their judgment to adjust for this. Regardless, readers can assess each chart individually by considering the response answers included at the bottom of each chart.
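One standard way to see why equal percentage-point gaps are “harder” to achieve near the ends of the scale is the log-odds (logit) transform, on which equally difficult changes take equal steps. The percentages below are illustrative only, not figures from the survey:

```python
import math

def logit(p):
    """Log-odds of a proportion: the scale on which equal steps
    represent comparably 'hard' changes in a bounded percentage."""
    return math.log(p / (1 - p))

# The same 5-point gap, near the middle vs. near the top of the scale:
mid_gap = logit(0.55) - logit(0.50)   # moving from 50% to 55%
top_gap = logit(0.95) - logit(0.90)   # moving from 90% to 95%

# The 90% -> 95% move is several times larger in log-odds terms,
# so a fixed gap near the tail reflects a bigger underlying difference.
print(round(mid_gap, 2), round(top_gap, 2))
```

This is the sense in which a chart with a linear percentage axis can understate differences that sit close to 0% or 100%.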
4) If the ACCS, a membership organization, sponsored this research, is it reliable?
A— Every survey introduces some bias. As surveys go, this one stands out because the bias is more limited than most, largely because the ACCS did not create or field the original survey, which had been run twice before. Here is a summary of the typical types of survey bias and how this survey was conducted:
A) Sample bias: The ACCS provided every name given to us by our member schools. The list represented all the alumni, with contact information, from the schools that participated. Since we needed a wide age range (23 to 44), we asked for all graduates. We stopped collecting names only when we met the threshold count given to us by Notre Dame, and we provided all collected names without selectivity. About 70% of the names collected were from ACCS-accredited schools; the others were from member schools. The pool of alumni was as neutral as this method of collection could produce. The sampling process by Notre Dame yielded about 300 completed surveys. To help reduce the standard deviation, we invested additional dollars to make sure our sample size was as large as or larger than the other segments surveyed, and the surveys for the different segments were run within the same year. This sample size is considered statistically valid with a relatively small error, and it is about the same as most major national polling organizations use for election polling.
The ACCS had nothing to do with gathering the comparative data from other segments— that was done by an outside foundation with which ACCS has no relationship.
B) Survey Question Bias: The ACCS could not have injected bias into the survey questions themselves because the survey had existed and been fielded twice previously. The questionnaire was developed by a team at an unrelated foundation with which ACCS has no formal relationship. As we initially assessed the survey, our analyst believed the questions were well formed, following best survey practices, with care given to prompting unbiased responses.
C) Methodological Bias: The ACCS had no input on the methodology used. The University of Notre Dame sociology department and its subcontractors are qualified, neutral third parties, and therefore we have every reason to believe the methodology is sound and unbiased. The methodologies mirrored those used in the survey prepared by an outside, unrelated foundation. The ACCS paid the direct costs of surveying our alumni, and respondents were compensated in ways similar to the other educational segments in the survey to minimize selection bias.
D) Reporting Bias: The survey was extensive (about 89 printed pages) and generated hundreds of charts, which were provided to the ACCS by the University of Notre Dame from their statistical system. The ACCS selected the slides that showed the most significant differences, or that reflected a measure of the seven profiles we sought to highlight. We did request that Notre Dame report the underlying data as “percent over the median” to make the results understandable and comparable. Because the ACCS analyst took the survey results, noticed trends, and suggested groupings to Notre Dame in order to create the profiles, some bias may have been introduced at this point. However, Notre Dame made its own determination about the validity and appropriateness of the groupings. We provide charts showing the underlying data, so any reader can judge the validity of the profiles on their own. In all, with so few things under ACCS control, it is hard to imagine a process that would have yielded more objective results.
E) What data did you selectively leave out of the report? We reported nearly all slides that were unfavorable to the ACCS in a significant way, provided they were relevant to the outcomes we reported. The survey looked extensively at things like high school mission trips, high school coursework, and details about where alumni volunteer, among other things. We left out results that did not pertain to the seven outcomes we isolated. We also left out some results that were very favorable to the ACCS but would mislead the reader. For example, we left out the percentage of students who took physics or chemistry in high school, because ACCS schools disproportionately require those courses. We tended to focus on differences, but the full report does include several charts that reflect salient similarities. For example, responses to “I TRY TO STRENGTHEN MY RELATIONSHIP WITH GOD” are similar across all Christian school alumni. We included that chart for balance because it was the one “similar” point among questions like “I HAVE SO MUCH IN LIFE TO BE THANKFUL FOR,” which showed significant differences for ACCS alumni.
5) What is the margin of error or standard deviation for this survey?
A— For simplicity, the results are reported without standard deviations. The text accompanying the charts in the report will typically comment on the significance of the differences; keep in mind that two bars that are relatively close together likely do not represent a significant difference. The predictive value of any single response is another matter: because each result is based on a sample of about 300, any specific alumnus surveyed in the future is more likely than not to answer within the averages shown.
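For readers who want a rough figure, the textbook margin of error for a simple random sample of about 300, at the worst case of p = 0.5, can be computed as follows (this is a back-of-the-envelope approximation under standard polling assumptions, not a number taken from the report):

```python
import math

# 95% margin of error for an estimated proportion, assuming a simple
# random sample: MOE = z * sqrt(p * (1 - p) / n), worst case at p = 0.5.
n = 300          # approximate completed surveys per segment
p = 0.5          # worst-case proportion (largest margin of error)
z = 1.96         # z-score for a 95% confidence level
moe = z * math.sqrt(p * (1 - p) / n)
print(f"{moe:.1%}")  # roughly +/- 5.7 percentage points
```

By this approximation, two bars closer together than about 5 or 6 percentage points should be read with particular caution, which is consistent with the guidance above.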
6) Does the report indicate I can expect these results for my child?
A— ACCS-accredited schools, since they are heavily represented in this survey, are the most likely to yield these results. Because individual schools vary, a school of another type may far exceed its category’s averages; a particular evangelical school, for example, may meet or exceed the ACCS results. And many of the differences are heavily influenced by the home environment (red vs. blue bars): parents control many of the variables that yield these results.
7) Can I get access to the underlying data from this survey?
A— The full report contains all of the available public data. Given the complexity of the data collection and reporting process, the ACCS does not have access to the raw data. Notre Dame’s researchers have collected and retained the data. Outside researchers and members of the press may be able to obtain the charts directly output by the statistical software, at our discretion, by contacting the ACCS offices.
8) Who can I speak to for more information about this survey?
A— The ACCS lead on this project is David Goodwin. He is available to answer questions from the press and researchers interested in this data. Visit the ACCS website, ClassicalChristian.org, for contact information.