

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



Program Description

Overview The Quantum Opportunities Program (QOP) is designed to help at-risk youth make a “quantum leap” up the ladder of opportunity through academic, developmental, and community service activities, coupled with a sustained relationship with a peer group and a caring adult, offered to them over their 4 years of high school. The QOP framework strives to address some of the deficits found in poverty areas by (a) compensating for both the perceived and real lack of opportunities characteristic of disadvantaged neighborhoods, (b) providing interactions and involvement with persons who hold prosocial values and beliefs, (c) enhancing participants’ academic and functional skills to equip them for success, and (d) reinforcing positive achievements and actions.

The program was implemented on a pilot basis by the Ford Foundation and Opportunities Industrialization Centers of America (OICA), a nonprofit organization serving the poor, unemployed, underemployed, and youth. The pilot was implemented in five cities: San Antonio, Texas; Philadelphia, Pennsylvania; Milwaukee, Wisconsin; Saginaw, Michigan; and Oklahoma City, Oklahoma. The United States Department of Labor (DOL) and the Ford Foundation also tested QOP on a larger scale via a demonstration with two sites under private management and administration (Philadelphia, Pennsylvania, and Yakima, Washington) and five sites under federal management and administration (Cleveland, Ohio; Fort Worth, Texas; Houston, Texas; Memphis, Tennessee; and Washington, D.C.).
Start Date 1989–1993 for the pilot and 1995–2001 for the demonstration—six of the seven sites began the demonstration in 1995, while the Washington, DC site began in 1996
Scope national
Type after school, mentoring, summer/vacation, before school, weekend, comprehensive services
Location urban, rural
Setting public schools and community-based organizations
Participants high school students
Number of Sites/Grantees There were five sites for the pilot and seven sites for the demonstration. A local community-based organization (CBO) in each city implemented and operated a QOP program. Each CBO teamed with one to three high schools.
Number Served 125 in the pilot and 50, 80, or 100 at each site for a total of 580 for the demonstration
Components Both the pilot and demonstration offered services to single cohorts of entering high school freshmen throughout their time in high school. The pilot offered continuing services to these youth for 4 years, while the demonstration offered services for 5 years. The pilot targeted youth from families receiving welfare, while the demonstration targeted youth with low grades who attended high schools with high dropout rates. Eligible youth were enrolled in QOP and served even if they transferred to other schools, dropped out of school, became incarcerated, or became inactive in QOP for long periods of time. QOP used an “anti-attrition” program design in which program staff kept in touch with participants even during periods when these youth were uninterested in the program or moved away. Services were provided year-round to enrollees and were designed to be intensive and to address barriers to success. Although services were meant for youth up until graduation, enrollees who graduated received limited additional services focused primarily on the transition from high school to post-secondary education or training.

The three primary activities offered to participating youth were supplemental academic activities (e.g., tutoring, computer-based instruction), developmental activities (e.g., activities designed to instruct youth about health, alcohol, drug abuse, sex, family planning, arts, career, and college planning), and community service activities aimed at improving conditions in the communities in which youth live. The QOP model prescribed an annual participation goal of 250 hours for each of these activities, for a total of 750 hours per year. Secondary aspects of the program model included financial incentives (stipends, accrual accounts, enrollee bonuses), staff bonuses (tied to youth participation), and supportive services (snacks, transportation assistance, and other services as needed).

In addition, youth received intensive case management and mentoring to help them graduate from high school and enroll in postsecondary education or training. Each youth’s program was coordinated by a caring adult, a “case manager,” who served as mentor, role model, disciplinarian, broker, and problem solver. QOP case managers referred enrollees to community health and mental health services, summer jobs programs, and local agencies that provide housing, food, income support, or child care. The program mantra was “once in QOP, always in QOP,” which led case managers to devote time to helping their youth, even if the youth was no longer interested in QOP. The program model specified roughly 15 to 25 enrollees per case manager.
Funding Level $1.1 million was spent for the pilot, $5 million for the DOL demonstration sites, and $4.1 million for the Ford Foundation demonstration sites plus Ford-funded OICA technical assistance to all sites. In addition, DOL sites were required to provide $200,000 per year per site in matching funds for the first 4 years.
Funding Sources Ford Foundation, U.S. Department of Labor


Evaluation

Overview The pilot was evaluated by Brandeis University from 1989 to 1993, with the goal of understanding the program’s impacts. The demonstration was evaluated by Mathematica Policy Research and Berkeley Policy Associates from 1995 to 2004, with the goal of understanding both the program’s implementation and its impacts on participants.
Evaluators Andrew Hahn, Brandeis University

Allen Schirm, Myles Maxfield, Nuria Rodriguez-Planas, Laura Castner, and Christina Tuttle, Mathematica Policy Research, Inc.

Mary Vencill, Berkeley Policy Associates

Vida Maralani, Department of Sociology, University of California, Los Angeles
Evaluations Profiled Evaluation of the Quantum Opportunities Program: Did the Program Work?

The Quantum Opportunities Program Demonstration: Implementation and Short-Term Impacts

The Quantum Opportunities Program Demonstration: Initial Post-Intervention Impacts
Evaluations Planned To measure longer-term impacts, data will be collected in winter 2005. The final report will be produced in late 2005.
Report Availability Hahn, A., Leavitt, T., & Aaron, P. (1994). Evaluation of the Quantum Opportunities Program: Did the program work? Waltham, MA: Brandeis University.

Lattimore, C. B., Grotpeter, J. K., & Taggart, R. (1998). Blueprints for violence prevention, book four: Quantum Opportunities Program. Boulder, CO: Center for the Study and Prevention of Violence.

Schirm, A., Rodriguez-Planas, N., Maxfield, M., & Tuttle, C. (2003). The Quantum Opportunities Program demonstration: Short-term impacts. Washington, DC: Mathematica Policy Research. Available at www.mathematica-mpr.com/education/qop.asp.

Maxfield, M., Castner, L., Maralani, V., & Vencill, M. (2003). The Quantum Opportunities Program demonstration: Implementation findings. Washington, DC: Mathematica Policy Research. Available at www.mathematica-mpr.com/education/qop.asp.

Maxfield, M., Schirm, A., & Rodriguez-Planas, N. (2003). The Quantum Opportunities Program demonstration: Implementation and short-term impacts. Washington, DC: Mathematica Policy Research. Available at www.mathematica-mpr.com/education/qop.asp.

Schirm, A., & Rodriguez-Planas, N. (2004). The Quantum Opportunities Program demonstration: Initial post-intervention impacts. Washington, DC: Mathematica Policy Research.


Contacts

Evaluation Andrew Hahn, Ph.D.
Center for Human Resources
Heller Graduate School, Brandeis University
Waltham, MA 02254-9110
Tel: 617-736-3774
Fax: 617-736-3851

Allen Schirm, Ph.D.
Senior Fellow
Mathematica Policy Research, Inc.
600 Maryland Avenue SW, Suite 550
Washington, DC 20024-2512
Tel: 202-484-4686
Fax: 202-863-1763
Email: aschirm@mathematica-mpr.com
Program C. Benjamin Lattimore
Opportunities Industrialization Centers of America, Inc.
1415 Broad Street
Philadelphia, PA 19122
Tel: 215-236-4500
Fax: 215-236-7480
Profile Updated December 15, 2004

Evaluation 1: Evaluation of the Quantum Opportunities Program: Did the Program Work?



Evaluation Description

Evaluation Purpose To examine the impact of the QOP pilot on participants.
Evaluation Design Experimental: Prior to program implementation, researchers selected 50 eighth-grade students from families receiving public assistance who were entering ninth grade at each of the five pilot sites. From these 50 students, 25 from each site were selected at random to be eligible to participate in the program (QOP members) and were recruited into the program. The remaining 25 youth at each site formed the control group. Evaluators eliminated one site (Milwaukee) from the analysis because there was no evidence that Milwaukee QOP members received a substantial amount of services, and the Milwaukee site was not able to retain contact with enough members of its experimental and control groups to provide for sound analysis. Of the remaining 100 participants and 100 controls, evaluators were able to reach 88 QOP members and 82 controls, who were included in the follow-ups and retained in the final sample used in the analyses. Analysis of the two groups at sample entry indicated that the groups were largely free of systematic differences and there was no evidence of attrition bias.
Data Collection Methods Surveys/Questionnaires: Questionnaires were administered in the fall of 1989 (9th grade), 1990 (10th grade), 1991 (11th grade), and 1992 (12th grade) to QOP members and control group members to gather information about demographic characteristics, work experience, school experiences, health knowledge, and personal attitudes and opinions. Two additional questionnaires that focused on future plans and post-secondary school outcomes were administered in the spring and fall of 1993, after youth were scheduled to have graduated from high school.

Tests/Assessments: QOP members and control group members were administered tests assessing their academic skill levels (Test of Adult Basic Education Form 5 Level) and functional skill levels (APL 40 Item Version Survey – CCP Tier Mastery Test). Tests were administered in the fall of 1989, 1990, and 1991 and in the spring of 1993.
Data Collection Timeframe Data were collected between 1989 and 1993.


Findings:
Summative/Outcome Findings

Academic QOP members across sites were significantly more likely (p < .10) than control group youth to have graduated from high school (63% vs. 42%), although the data by site indicate these differences reached statistical significance only in the Philadelphia site.

QOP members across sites were significantly more likely (p < .10) than control group youth to have enrolled in some type of post-secondary school 6 months after high school graduation (42% vs. 16%). These differences were most pronounced in the Philadelphia site, were still statistically significant in Oklahoma City, but were not large enough to reach statistical significance in San Antonio or Saginaw.

Post-secondary enrollment effects held for both 2-year and 4-year college enrollment. The rate of 4-year college attendance among QOP members was more than three times higher than the control group rate (18% vs. 5%) and their rate of 2-year college attendance was more than twice as high (19% vs. 9%) 6 months after high school graduation. Both of these differences were significant (p < .10). Both sets of effects were again most pronounced in the Philadelphia site.

Results from the 1st year (freshman year in high school) showed that test scores for many of the academic and functional skills tested declined for both the experimental and control groups. After 2 years, the experimental group’s average scores for all 11 academic and functional skills were higher than control group scores, and 5 of these differences were significant (p < .10). By the time most of the sample members were leaving high school in the spring of 1993, average experimental group scores on all 11 skills were higher than control group scores, and all of these differences were statistically significant (p < .10). There were variations in these effects by site during the high school years, with the Philadelphia site showing strong effects, Oklahoma City and Saginaw showing slightly positive effects, and San Antonio showing no positive effects.

For orientation toward and expectations for post-secondary education, there were no significant differences between the two groups after 1 year. After 2 years, however, the experimental group demonstrated significantly higher (p < .10) educational expectations than the control group, and by the time most youth were leaving high school, this difference was even more pronounced. There were variations in these effects by site during the high school years, with the Philadelphia site showing strong effects, Oklahoma City and Saginaw showing slightly positive effects, and San Antonio showing no positive effects.

There were no significant differences between the experimental and control groups during the high school years for the likelihood of being reported as a school dropout. However, QOP group members were significantly less likely (p < .10) than control group youth to have dropped out of high school (23% vs. 50%) at the time of the fall 1993 survey (which counted dropouts as those who had not finished high school and were not currently in school). This difference was less pronounced according to the earlier spring 1993 survey, in which youth were asked whether they had ever dropped out of school. The dropout differences were most pronounced at the Philadelphia site, and were still statistically significant in Oklahoma City, though not large enough to be statistically significant in either San Antonio or Saginaw despite trends in the positive direction in these latter two sites.

There were no significant differences between the experimental and control groups for self-reported school grades or assessments of their need for reading and math help.
Community Development During the 6 months after finishing QOP, significantly more experimental group members than control group members (p < .01) served as volunteer tutors, counselors, or mentors (28% vs. 8%) and gave time to nonprofit, charitable, school, or community groups (41% vs. 11%). Also during this time period, more experimental group members than control group members participated in a community project (21% vs. 12%), although this difference was not significant. Differences in these indicators tended to be largest in Philadelphia and Saginaw.
Prevention Though there was no evidence of program effects on the likelihood of having children during the high school years, there was evidence that QOP members were less likely to have children than control group members by the time of the post-high-school follow-up. By this time period, 24% of QOP members had children compared to 38% of control group members (p < .10). The QOP effect appeared to be largest in San Antonio, followed by Oklahoma City and Saginaw, and smallest in Philadelphia, though none of the single-site differences reached statistical significance.

Control group members were significantly (p < .10) more likely to express a need for help with an alcohol or drug problem in the fall 1993 survey (no QOP members expressed such a need), but the actual number saying they had this need was very small.

No significant program effects were found for contraceptive knowledge or AIDS knowledge during the high school years.
Workforce Development There were no significant differences between the experimental and control groups’ assessments of their need for help in training for or finding a good job.
Youth Development Significantly more QOP members than control group members (p < .01) had received honors or awards (34% vs. 12%) in the past 12 months at the time of the fall 1993 survey. The difference was greatest in the Philadelphia and San Antonio sites.

Significantly more QOP members than control group members (p < .10) agreed or strongly agreed with the statements, “I am hopeful about the future” (98% vs. 86%) and “my life has been a success” (74% vs. 51%) at the time of the fall 1993 survey.

Though not statistically significant, self-assessments by members of both the QOP and control groups at the time of the fall 1993 survey were highly positive. Ninety-three percent of QOP members and 82% of control group members strongly agreed or agreed with the statement that their family life was happy. Only 9% of QOP members and 17% of control group members strongly agreed or agreed with the assertion that they were lonely. More than half of both groups disagreed with the statement that they were bothered about things.

A lower percentage of QOP members (5%) than control group members (13%) reported uncertainty regarding what steps to take in the future at the time of the fall 1993 survey, but a slightly higher percentage of control group members professed to know their future steps exactly (37% vs. 35%). These differences were not statistically significant.

Evaluation 2: The Quantum Opportunities Program Demonstration: Implementation, Short-Term Impacts, and Initial Post-Intervention Impacts



Evaluation Description

Evaluation Purpose To answer the following questions: (a) How well was the QOP model implemented in the demonstration sites? (b) How much did QOP cost? (c) How much time did enrollees spend on program activities? (d) How did QOP affect enrollees? The report also seeks to determine whether the implementation, cost, and participation findings suggest why QOP had the impacts that it did.
Evaluation Design Experimental and Non-Experimental: Implementation of the QOP demonstration was assessed through site visits and interviews with staff, school and community-based organization administrators, and youth participants. To estimate QOP’s impacts on high school academic performance and graduation, postsecondary education or training, and risky behaviors, data were collected from a group of youth enrolled in QOP and from a statistically equivalent control group of youth who did not participate in QOP. The QOP and control groups were formed at the start of the demonstration by randomly assigning each of the nearly 1,100 youth eligible for the program to one group or the other. The final evaluation sample had 580 QOP enrollees and 489 controls. Post-intervention data were collected from both groups approximately 2 years after the end of the demonstration, when most sample members were 21 or 22 years old (1 year after the end of the demonstration in the Washington, DC, site, when most sample members there were 20 or 21 years old).

There was just one statistically significant difference between the means of baseline characteristics for the QOP and control groups for the whole demonstration. Compared with the control group, the QOP group had fewer youth in the middle third of the eighth-grade grade point average distribution. The evaluators found that their impact estimates were not sensitive to whether they adjusted for differences in baseline characteristics between the QOP and control groups.

Response rates for all types of data collected were generally 7 to 10 percentage points higher among the QOP group than the control group. Evaluators statistically adjusted for these patterns of nonresponse between the two groups when estimating the program’s impacts.
Data Collection Methods Interviews/Focus Groups: Informal interviews were conducted with staff in person during site visits or over the telephone during annual QOP staff conferences in order to study the program’s implementation.

Observation: Site implementation was observed during a visit lasting several days at each site in each of the first 4 years of the demonstration.

Secondary Source/Data Review: QOP and control groups’ high school transcripts were collected approximately 4 years after the beginning of the demonstration. Time spent in the academic, developmental, and community service components of QOP was collected through a Management Information System at each site.

Surveys/Questionnaires: Evaluators surveyed QOP and control group members in person in the spring of the 4th academic year of the demonstration (response rate of 84%). The survey collected data on risky behaviors and factors that help youth resist negative influences in their social environment. Seven to 10 months later, evaluators conducted a telephone survey with these youth (response rate of 83%), including such topics as risky behaviors, and (for the enrollee group) attitudes toward QOP.

Evaluators surveyed QOP and control group members by telephone approximately 2 years after youth’s scheduled high school graduation (1 year after the end of the demonstration in Washington, DC). The response rate for the survey was 75% of the original sample. The survey collected data on risky behaviors, factors that help youth resist negative influences in their social environments, high school graduation, and postsecondary activities.

Tests/Assessments: Reading and mathematics achievement tests developed from the National Education Longitudinal Study and scored by the Educational Testing Service were administered to QOP and control group members in the spring of the 4th academic year of the demonstration (response rate of 84%).
Data Collection Timeframe Data were collected between 1995 and 2003.


Findings:
Formative/Process Findings

Activity Implementation Only two sites offered the prescribed number of hours of educational, developmental, and community service activities. The other five sites offered fewer than the prescribed number of hours for at least one program component, frequently the community service component.

Most sites did not implement the education component effectively, according to the prescribed program guidelines. In particular, few sites regularly assessed academic performance via achievement tests, no site developed individualized education plans based on assessment results, no site implemented a sustained program of course-based tutoring, and only three sites successfully implemented computer-assisted instruction.

The developmental component was relatively well implemented according to the prescribed program guidelines, with sites offering many different developmental activities. Although these activities were intended to focus on life skills that would enable the youth to avoid risky behaviors, this component included many purely recreational activities at most sites. Participants found recreational activities to be fun, and case managers found them to be useful for fostering program participation.

The community service component at most sites was not well implemented, according to the prescribed program guidelines. The most common reasons for deviations were the enrollees’ lack of interest and the case managers’ belief that enrollees needed other QOP services more. Most sites decided to reallocate their resources away from community service to mentoring, case management, and educational activities.

Enrollee stipends for time spent on the three program components were well implemented according to the prescribed program guidelines, and appeared to be an effective way to attract the enrollees to program activities in the first year or two of the demonstration. As enrollees aged, case managers found that other incentives, such as recognition, attention, and prizes, could replace the stipends. With respect to the account funds that accrued over time for payout to youth who completed high school and transitioned into post-secondary education or training, Job Training Partnership Act accounting regulations prohibited DOL-funded CBOs from establishing these accounts for enrollees. Instead, these CBOs kept informal records of accrual account balances and paid those balances to qualifying enrollees at the end of the demonstration. According to case managers, providing periodic information on account balances was an important factor in motivating program participation. Account balances at the end of the demonstration ranged from a few hundred dollars to nearly $10,000, with most being in the range of $1,000 to $3,000.

Most sites supplied many of the most commonly needed supportive services, including afternoon snacks and transportation to program activities. On the other hand, most sites did not meet their enrollees’ needs for child care, health and mental health services, substance abuse treatment, and family counseling.
Costs/Revenues QOP cost approximately $25,000 per enrollee, on average, for the full 5 years of the demonstration. The 5-year expenditure per enrollee for the DOL-funded sites ranged from $18,000 to $22,000. For Ford Foundation sites, the figure for Yakima was $23,000 and for Philadelphia was $49,000.

The QOP stipend cost involved paying each youth approximately $1.25 for every hour devoted to educational activities, developmental activities that were not purely recreational, and community service. A matching amount was deposited in an accrual account and promised to the enrollee when he or she earned a high school diploma or GED certificate and enrolled in college, a certified apprenticeship program, an accredited vocational/technical training program, or the armed forces.
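The stipend-and-match arithmetic described above can be sketched as follows; the $1.25 hourly rate and the dollar-for-dollar match come from the text, while the 200-hour figure is purely a hypothetical illustration:

```python
# Sketch of the QOP stipend-and-accrual arithmetic described above.
# The $1.25/hour rate and the matching deposit are from the program
# description; the hours used below are hypothetical.

STIPEND_RATE = 1.25  # dollars per eligible hour


def stipend_and_accrual(eligible_hours: float) -> tuple[float, float]:
    """Return (cash stipend paid out, matching amount added to the accrual account)."""
    stipend = eligible_hours * STIPEND_RATE
    accrual = stipend  # a matching amount was deposited in the accrual account
    return stipend, accrual


# Example: a youth logging 200 eligible hours in a year would earn a
# $250 stipend, with another $250 accruing toward the post-high-school payout.
stipend, accrual = stipend_and_accrual(200)
print(stipend, accrual)  # 250.0 250.0
```

At this rate, even a youth meeting the full 750-hour annual goal would accrue under $1,000 per year, consistent with the typical end-of-demonstration balances of $1,000 to $3,000 reported above.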

The Philadelphia site spent more than twice as much per enrollee as did any other site. Most of the additional spending was for staff compensation. A case manager in the Philadelphia site received about twice the compensation of a case manager in any other site.
Program Context/Infrastructure All sites implemented a version of QOP that deviated from the program model, two substantially and five only moderately. Evaluators cited two main reasons why these programs did not closely adhere to the QOP model. First, with the exception of the Philadelphia site, where the program was operated by the CBO that helped to design the QOP model, local CBOs found implementing QOP to be difficult, primarily because QOP was substantially more comprehensive, intensive, and complex than their traditional programs. Second, neither DOL nor Ford required the local CBOs to be faithful to the QOP model, given the desire to see how the model would play out under each site’s local needs and conditions.
Recruitment/Participation According to the QOP participation data for the first 4 years of the demonstration, enrollees spent an average of 174 hours per year on the three components of QOP, not including the one-on-one mentoring most youth received. The 174 hours represented 23% of the annual goal of 750 hours. Enrollees spent an average of 72 hours per year on education (29% of the goal), 76 hours on developmental activities (30% of the goal), and 26 hours on community service (11% of the goal). The average time spent on QOP activities fell steadily from 247 hours in the 1st year of the demonstration to 89 hours in the 4th year. The percentage of enrollees spending no time at all on QOP activities increased steadily from 1% in the 1st year to 36% in the 4th year.
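The participation shortfall reported above is simple arithmetic against the program model's 750-hour annual goal; a quick check of the overall figure (hours from the text, with reported percentages rounded):

```python
# Check the overall participation share quoted above against the QOP
# model's annual goal (250 hours per component, 750 hours total).
ANNUAL_GOAL_HOURS = 750
AVG_HOURS_PER_YEAR = 174  # average across the three components, years 1-4

share = round(100 * AVG_HOURS_PER_YEAR / ANNUAL_GOAL_HOURS)
print(f"{share}% of the annual goal")  # prints "23% of the annual goal"
```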

Participation ranged from highs of 345 hours per year per enrollee in the Yakima site and 244 hours in the Philadelphia site to a low of 68 hours in the Fort Worth site. The average annual participation for the two Ford-funded sites was 294 hours, and the average annual participation in DOL-funded sites was 126 hours.

The roughly 12% of enrollees who spent 100 or fewer hours on QOP activities during the entire demonstration reported being uninterested in those activities or having other after school activities, such as playing a sport, working, or caring for other family members.

A distinguishing characteristic of not only the Philadelphia site but also the other Ford-funded site, Yakima, was that enrollee participation was much higher than at other sites, perhaps because case manager compensation was based entirely on enrollee participation. Compared with the average enrollee in a DOL-funded site, the average enrollee in Philadelphia and Yakima spent 1.9 and 2.7 times as many hours, respectively, on QOP activities during the demonstration.
Staffing/Training All sites implemented the prescribed ratio of about 15 to 25 enrollees per case manager.

Case managers developed deep personal relationships with the 40–60% of enrollees who attended some program activities regularly and addressed a wide range of barriers facing those youth.

Most case managers stayed with the program for several years, and many stayed for the entire 5 years of the demonstration.

Staff compensation varied between the sites funded by DOL and those funded by the Ford Foundation. Case managers at Ford sites were compensated according to the participation of their youth. Those at DOL sites were paid fixed salaries agreed on in advance, irrespective of their youths’ involvement.

The demonstration revealed the practical limitation of QOP’s policy of case managers being on duty or on call for large numbers of hours each week. Such a policy is limited by the case managers’ personal lives, the physical difficulties of providing services to enrollees who moved far away, and the legal limits on case manager overtime.


Summative/Outcome Findings

Academic QOP participants showed no significant improvements relative to control group youth in achievement test scores, grades, or credits earned.

There were no significant differences between QOP and control group youth in the likelihood of graduating from high school with a diploma or the likelihood of completing high school by earning either a diploma or a GED. No significant program effects on high school completion were found within any site or among any of the subgroups defined by the observed baseline characteristics of sex, age at entry into ninth grade, or grade point average in the eighth grade.

QOP participants were nine percentage points more likely than controls to have ever engaged in postsecondary education or training, including college attendance, vocational or technical school attendance, apprenticeship enrollment, and armed forces enlistment (62% vs. 53%, p < .05). QOP participants were also seven percentage points more likely to have ever attended college (37% vs. 30%, p < .10) and six percentage points more likely to have completed at least one quarter at college (33% vs. 27%, p < .10). Estimated impacts declined and became insignificant at higher levels of educational attainment (e.g., completing at least 1 year at college). These effects were generally larger for younger enrollees (those who were age 14 or younger when they entered ninth grade) and enrollees in the bottom two thirds of the eligible grade distribution. These effects were also concentrated in the Cleveland and Philadelphia sites, with none of the other five sites showing beneficial effects.
Prevention QOP was not significantly related to reductions in disciplinary actions during high school.

By the time of the post-intervention survey, QOP participants showed no significant decreases in the likelihood of teen parenting, binge drinking, committing a crime, or being arrested or charged with a crime as compared to the control group. QOP enrollees were significantly less likely than control group members to have used an illegal drug (12% vs. 18%, p < .05).
Youth Development QOP participants were significantly (p < .01) more likely (by 31 percentage points) to report participation in “special programs other than your normal high school classes … [that try] to help students stay in school, make good grades, stay away from illegal drugs, prepare for work or college, and make good decisions in life” on the survey. However, more than half (53%) of QOP enrollees failed to report participating in such a program.

No significant program effects were found for the likelihood that enrollees perceived themselves as being positively influenced by a caring adult or other resiliency factors such as having an optimistic outlook on the future or believing that risky behaviors are wrong.

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project