The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.
Program Description
Overview | The Kindergarten Summer Camp (KSC) program is designed to boost reading achievement among low-income children in Baltimore, Maryland, by addressing one of the fundamental problems contributing to the growth of the achievement gap: summer learning loss in reading. The program aims to increase literacy skills and enjoyment of reading through an approach that integrates art and science activities into the curriculum. |
Start Date | June 2004 |
Scope | local |
Type | summer/vacation |
Location | urban |
Setting | public school |
Participants | kindergarten students |
Number of Sites/Grantees | three elementary schools in 2004 |
Number Served | 90 kindergarteners annually (30 from each of the three schools) |
Components | KSC provides a free, 6-week, full-day (8 a.m.–2 p.m.) summer enrichment camp in literacy and the fine arts to exiting kindergarteners from three partner schools in low-income Baltimore neighborhoods. Each day begins with breakfast, after which children participate in 3 hours of literacy instruction led by a team of three program staff members, consisting of one teacher and two college student interns (one from Johns Hopkins University and one from Maryland Institute College of Art). Class sizes for the program are limited to 10 children to maximize individual attention. Based on participants’ needs, the literacy instruction incorporates language and word study, shared reading, interactive writing, guided reading, and independent writing. After this literacy instruction, children have lunch and physical activity/recess. The afternoon begins with a read-aloud session and culminates with a science and art session. The college student interns lead activities, designed with the input of an art teacher/community artist, that include theme-based visual and performing arts and weekly field trips. These themes, which aim to connect and build on morning literacy activities, include colors, animals, marine life, space, and plants. Field trips to places such as museums, the zoo, and the aquarium help to launch the weekly theme by providing concrete experiences from which the children can draw meaning. For example, when the theme was marine life, students began the week with a field trip to the National Aquarium. Each morning of that week, students then participated in read-alouds about various ocean animals and wrote about their experiences (real and imaginary). Each afternoon, the science and art curricula allowed them to expand their understanding by going deeper into the science content and creating artwork to reinforce it. |
At the end of each summer, the program hosts a community art show, designed to showcase participants’ work and serve as a culminating experience, in each of the schools. In addition, a downtown gallery exhibits select participants’ works in conjunction with the art of interns and other Baltimore-based community artists. Prior to their work in the program, college interns participated in 4 weeks of training on curricula/instruction, assessment, classroom management, parent involvement, basic first aid, and citizenship/team building. School teachers hired by the program also participated in the final week of training to get an overview of the curricula, mentoring strategies, and planning time with the interns. Interns also participated in weekly professional development workshops led by a fine arts supervisor, the KSC director, an education specialist, and/or the on-site certified teachers. These problem-based workshops were designed to provide a direct response to classroom issues. |
Funding Level | $181,000 in 2004 |
Funding Sources | Maryland State Department of Education 21st Century Community Learning Centers program, RGK Foundation, USDA, and Federal Work Study |
Evaluation
Overview | Evaluators conducted a randomized evaluation to investigate the impact of KSC on the academic achievement of low-income children, with particular attention paid to reading acceleration. |
Evaluator | Geoffrey D. Borman and N. Maritza Dowling, University of Wisconsin–Madison; Ron Fairchild and Jody Libit, Johns Hopkins University Center for Summer Learning |
Evaluations Profiled | Halting the Summer Achievement Slide: A Randomized Evaluation of the Kindergarten Summer Camp |
Evaluations Planned | The program will be evaluated each summer through 2006. Follow-up assessment data will be collected to determine if there is a long-term impact. |
Report Availability | Borman, G. D., Dowling, N. M., Fairchild, R., & Libit, J. (2005). Halting the summer achievement slide: A randomized evaluation of the Kindergarten Summer Camp. Baltimore, MD: Johns Hopkins University Center for Summer Learning. |
Contacts
Evaluation | Geoffrey D. Borman, Ph.D. Associate Professor Educational Leadership and Policy Analysis, Educational Policy Studies, and Educational Psychology University of Wisconsin–Madison 1161D Educational Sciences Building 1025 West Johnson Street Madison, WI 53706-1796 Tel: 608-263-3688 Fax: 608-265-3135 |
Program | Ron Fairchild Executive Director Center for Summer Learning Johns Hopkins University 3105 N. Charles Street Baltimore, MD 21218 Tel: 410-516-6228 Fax: 410-516-6222 Email: rfairchild@jhu.edu |
Profile Updated | September 15, 2005 |
Evaluation: Halting the Summer Achievement Slide: A Randomized Evaluation of the Kindergarten Summer Camp
Evaluation Description
Evaluation Purpose | To assess the effectiveness of KSC in preventing the summer achievement slide of youth participants. |
Evaluation Design | Experimental: The total sample includes pretest and posttest data from 128 children (treatment and control groups) from high poverty urban schools in Baltimore. Each school principal was asked to recruit children to apply to the program. All applicants were entered into a computerized database, which was programmed to randomly assign two of every three children to the treatment group and the remaining child to the control group. When multiple children in the same family applied, their applications were counted as one entity, so that children from the same family would be assigned to the same group. The final sample consisted of 93 children in the treatment group and 35 children in the control group. Treatment effects were calculated by comparing posttest outcomes between the treatment and control groups, statistically controlling for children’s gender and pretest scores to enhance the precision of the treatment effect estimates. Evaluators also attempted to estimate treatment effects solely for the youth who actually attended the program by statistically adjusting for the fact that some participants never attended the program. These statistical adjustments were based on the work of Orr (1999). The sample was predominantly African American (77%) and poor, as indicated by free lunch eligibility status (90%). Males were slightly overrepresented in the sample (55%) relative to females (45%). At baseline, the treatment and control groups were statistically equivalent to one another on all three demographic characteristics measured (race, gender, and free lunch eligibility) and on achievement measures. Sample members with and without complete data were statistically equivalent on race, gender, and free lunch status, although treatment participants with complete data tended to have higher program attendance rates than treatment participants with incomplete data. Reference: Orr, L. L. (1999). Social experiments: Evaluating public programs with experimental methods. Thousand Oaks, CA: Sage. |
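The family-clustered 2:1 random assignment described above can be sketched in a few lines of Python. This is an illustrative sketch only; the evaluators' computerized database is not publicly described, and the function and field names below are hypothetical.

```python
import random

def assign_groups(applicants, seed=None):
    """Randomly assign applicants 2:1 to treatment vs. control,
    keeping children from the same family in the same group.

    `applicants` is a list of (child_id, family_id) tuples.
    """
    rng = random.Random(seed)
    # Group applications by family so siblings are assigned together,
    # mirroring the profile's "counted as one entity" rule.
    families = {}
    for child_id, family_id in applicants:
        families.setdefault(family_id, []).append(child_id)
    family_ids = list(families)
    rng.shuffle(family_ids)
    assignment = {}
    # Two of every three family units go to treatment, one to control.
    for i, fid in enumerate(family_ids):
        group = "treatment" if i % 3 != 2 else "control"
        for child_id in families[fid]:
            assignment[child_id] = group
    return assignment
```

Randomizing family units rather than individual children preserves the 2:1 allocation while guaranteeing siblings never land in different groups.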
Data Collection Methods | Secondary Source/Data Review: Evaluators collected demographic data from school records on treatment and control group children’s gender, race/ethnicity, and eligibility status for the free lunch program. For the treatment children, evaluators also recorded program attendance rates. Surveys/Questionnaires: Surveys were collected from 33 parents and 18 interns to assess satisfaction with the program and perceived impacts on participating youth. Tests/Assessments: The following assessments were collected from children at pretest and posttest: two assessments from the Dynamic Indicators of Basic Early Literacy Skills (letter naming fluency and phoneme segmentation fluency), Word List A, the Developmental Reading Assessment (DRA), and dictation. The tests were selected for their field-tested validity and their ability to identify specific literacy areas (including phonemic awareness, phonics, identification of high-frequency words, and fluency/comprehension) for targeted instruction. The measures are used to inform summer instruction and assess progress. |
Data Collection Timeframe | Data were collected between May and September of 2004. |
Findings
Formative/Process Findings
Recruitment/Participation | Excluding the 22 program “no-shows,” participants attended an average of 72% of program days. Including the no-shows (each with a 0% attendance rate), the overall attendance rate was 55%. |
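The two attendance figures are consistent with the sample sizes reported elsewhere in this profile: 93 treatment children, 22 of whom never attended, leaves 71 attendees averaging 72% of days. A quick arithmetic check (the counts come from the profile; the calculation itself is illustrative):

```python
treatment_n = 93      # children assigned to treatment
no_shows = 22         # never attended (0% attendance)
attendee_rate = 0.72  # average attendance among those who showed up

attendees = treatment_n - no_shows                # 71 children
overall = attendees * attendee_rate / treatment_n  # no-shows count as 0%
print(round(overall * 100))  # prints 55
```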
Satisfaction | One hundred percent of parent respondents noted that they were satisfied with the reading instruction that their child received during the program and that the program did a good job of combining academics and enrichment. Of the interns who worked with the children, 100% agreed that they had an overall quality experience working at the program. |
Summative/Outcome Findings
Academic | The average treatment group child outperformed the average control group child by 15 percentile points on the Word List A posttest. This difference was statistically significant (p < .01). After adjusting these effects to account for treatment group members who never attended the program, evaluators found somewhat larger effect sizes. The average program participant outperformed the average control group child by 12 percentile points on the DRA posttest. This difference was statistically significant (p < .05). After adjusting these effects to account for treatment group members who never attended the program, evaluators found somewhat larger effect sizes. For the two Dynamic Indicators of Basic Early Literacy Skills assessments and the dictation assessment, control group children made somewhat greater gains than treatment group children, but these differences did not attain statistical significance. Of surveyed interns, 94% agreed that they had made an academic impact on their students. |
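The no-show adjustment the evaluators attribute to Orr (1999) is commonly implemented as a Bloom-style adjustment: the intent-to-treat estimate is divided by the share of the treatment group that actually participated. The report does not give the adjusted values, so the numbers below are only a hedged illustration of why the adjusted effects come out "somewhat larger":

```python
def bloom_adjust(itt_effect, treatment_n, no_shows):
    """Scale an intent-to-treat effect to an effect on participants
    by dividing by the participation rate (Bloom/Orr no-show
    adjustment). Assumes no-shows received no treatment effect."""
    participation = (treatment_n - no_shows) / treatment_n
    return itt_effect / participation

# Illustrative only: a 15-percentile-point ITT effect with
# 22 no-shows among 93 treatment children.
print(round(bloom_adjust(15, 93, 22), 1))  # prints 19.6
```

Because the participation rate is below 1, the adjusted (treatment-on-the-treated) estimate is always at least as large as the intent-to-treat estimate.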
Youth Development | Ninety-four percent of parents agreed that their children seemed more confident at the end of the summer program. One hundred percent of interns surveyed agreed that they had made a personal impact on their students. |