
Evaluation Focus and Question


Team Read is a program that pairs second- through fifth-grade students with high-school mentors to improve reading skills, and it therefore involves a range of stakeholders, including both students and coaches. For this reason, the evaluation conducted by Margo Jones focused on the impact of Team Read, including improvement in reading skills, the effect on coaches, and the program's current strengths and weaknesses (Cornelia Ng & Lindenberg, 2000). The evaluation design developed for this assessment combined surveys and interviews with a quantitative assessment of students' performance both within and outside the program. It may therefore be concluded that the design Jones chose is appropriate to the evaluation's focus.



The overall evaluation process can be divided into three major subjects of assessment:

  • The measurement of the coaches’ satisfaction rates;
  • The measurement of program efficacy through the comparison of the results of pre-program and post-program skill evaluation;
  • The measurement of program efficacy through the comparison of reading skills among program participants and all district students not involved in Team Read (Cornelia Ng & Lindenberg, 2000).

The major quantitative indicators of success are the results of three major reading-skills assessment tools: the Developmental Reading Assessment (DRA), the Iowa Test of Basic Skills (ITBS), and the Washington Assessment of Student Learning (WASL). Considering the evidence-based model of educational assessment, along with the adoption of the No Child Left Behind policy, it is reasonable to assume that such quantitative measurements remain the most efficient means of quality assessment (Gall et al., 2015). Thus, the measurement method should be regarded as appropriate in the given setting.


According to the researchers, data gathering, although not the only one, is one of the primary aspects of a proper evaluation process (Rossi et al., 2004). Thus, given the evaluation goals, Jones was primarily concerned with gathering quantitative data by collecting pre-test and post-test results in 1999 and 2000, both among Team Read participants and among all district students. The subsequent analysis was based primarily on the mean differences between these results. The level of satisfaction among the coaches was measured through questionnaires completed by the participants (Cornelia Ng & Lindenberg, 2000). Given the setting, one may conclude that such data-collection approaches are the most appropriate way to obtain tangible quantitative results from which recommendations for stakeholders can be drawn.
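The mean-difference comparison described above can be sketched as follows. The score values here are purely illustrative and are not taken from the Team Read report:

```python
# Hypothetical pre-test and post-test reading scores for the same group of
# students (illustrative values only; not the actual Team Read data).
pre_scores = [12, 15, 9, 14, 11, 13]
post_scores = [16, 18, 12, 17, 14, 15]

def mean(values):
    """Arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

# The mean difference: average post-test score minus average pre-test score.
# A positive value suggests an overall gain in reading skills.
mean_difference = mean(post_scores) - mean(pre_scores)
print(round(mean_difference, 2))
```

The same subtraction of group means would apply to the participant-versus-district comparison, with the caveat (noted below) that tests differ across grades.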

Methodological Approach

Since the program evaluation addressed three major goals, three different methodological approaches were employed to analyze the collected data. As far as reading-skill efficacy was concerned, Jones applied statistical analysis to the quantitative pre-test and post-test results (Cornelia Ng & Lindenberg, 2000). Jones noted the possible limitations of such an assessment, emphasizing the qualitative differences between the reading tests administered to students in different grades. The measurement of coaches' satisfaction was more subjective, as it relied on a questionnaire with a small number of questions about the coaches' experience. Finally, analyzing the strengths and weaknesses of the program was a complex endeavor that included interviewing primary stakeholders and interpreting the collected data in the context of the existing research literature on cross-age tutoring (Cornelia Ng & Lindenberg, 2000). Hence, it may be concluded that all the analytical approaches were appropriate in the given setting, but the outcomes cannot be considered fully reliable because part of the analysis was subjective.
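The statistical comparison of pre-test and post-test results for the same students is commonly carried out with a paired t-test. A minimal stdlib-only sketch, using made-up scores rather than the report's data, might look like this:

```python
import math
from statistics import mean, stdev

# Hypothetical paired scores for the same students before and after the
# program (illustrative only; not the actual Team Read results).
pre = [31, 28, 35, 30, 27, 33, 29, 32]
post = [34, 30, 38, 33, 29, 36, 31, 35]

# In a paired design, the per-student gain is the unit of analysis.
gains = [b - a for a, b in zip(pre, post)]

# t statistic: mean gain divided by the standard error of the gains.
n = len(gains)
t_stat = mean(gains) / (stdev(gains) / math.sqrt(n))

print(f"mean gain = {mean(gains):.2f}, t = {t_stat:.2f}, df = {n - 1}")
```

The resulting t statistic would then be compared against a t-distribution with n - 1 degrees of freedom to judge significance; that lookup step is omitted here to keep the sketch dependency-free.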


According to the research, the overall logic model of program evaluation encompasses such constituents as inputs, activities, outputs, outcomes, context, and impact (Frechtling, 2007). In the present evaluation, Jones prioritized outcomes. Undeniably, the evaluation produced statistically meaningful grounds for reconsidering the program's efficacy. However, one of its major weaknesses is the lack of a roadmap to follow in order to secure the program's long-term impact.


References

Cornelia Ng, S., & Lindenberg, M. (2000). Team Read (B): Evaluating the efficacy of a cross-age tutoring program in Seattle school district. Web.


Frechtling, J. A. (2007). Logic modeling methods in program evaluation. Wiley.

Gall, M. D., Gall, J. P., & Borg, W. R. (2015). Applying educational research: How to read, do and use research to solve problems of practice (7th ed.). Pearson.

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). SAGE Publications.
