Main Content

Syllabus

The information contained on this page is designed to give students a representative example of material covered in the course. Any information related to course assignments, dates, or course materials is illustrative only. For a definitive list of materials, please check the online catalog 3-4 weeks before the course start date.

PADM 550 POLICY AND PROGRAM EVALUATION: (3 credits) The course will cover the theoretical issues in and basic methods of policy and program evaluation (retrospective policy analysis).

Overview

This course is designed to introduce students to the basic methods of policy and program evaluation. These methods are used in needs assessment, monitoring social programs, and assessing their effectiveness and efficiency. Methodological issues in randomized experiments, quasi-experiments, and efficiency measurement will be discussed. The class will also cover social, political, and ethical contexts of evaluation.

The primary goal of this course is to help students become informed consumers of the products of evaluation research. You will also learn the basic skills of designing and conducting evaluation projects and become familiar with some of the theoretical issues in evaluation research.

During the semester, students will be given written assignments. You will also conduct a policy or program evaluation in an area of your choice.

Prerequisite for the Course

As you know, PADM 503: Research Methods is the prerequisite for this course. I assume that you have learned all the concepts presented in PADM 503 and are comfortable applying the analytical methods you learned; in this course, you will apply some of those methods. I understand that it is not easy to absorb all the details of research methods and their applications in one course, and you may have forgotten some of them. If that is the case, I recommend the following:

  • Make sure that you still have the textbook used in PADM 503, or have access to it or to a similar introductory textbook; you may need to consult one from time to time. You may also use the online PADM 503 course materials in ANGEL if it is within a year of the course start date.
  • There are many resources available on the Internet for refreshing your knowledge of research methods; you can find them at university websites, Wikipedia, and YouTube by conducting keyword searches. Many of these resources are correct and comprehensive, but I cannot guarantee that all are. When in doubt, consult a good textbook (preferably the one used in PADM 503) or the PADM 503 course materials. Please note that if you apply incorrect information from the Internet in your assignments, you will lose points.
  • If you still have questions about how to apply research methods, please ask me when we cover the relevant topics during the course. Ask your questions before I post an assignment; after an assignment is posted, I cannot answer questions directly pertinent to its contents.

Course Objectives

At the end of the course, students should be able to

  • explain and apply the basic knowledge of the different phases and forms of program evaluation: needs assessment, process evaluation (monitoring), outcomes and impact assessments, and efficiency evaluation;
  • using that basic knowledge, describe the various research designs and methods used in evaluation research;
  • conduct basic analyses in the above-listed different phases and forms of program evaluation by using the designs and methods of evaluation research;
  • analyze the social and political contextual issues in program evaluation research; and
  • identify the ethical issues in program evaluation research.

You should read the assigned chapters in the textbooks every week and practice with as many of the exercises in the textbooks as you can. It is essential that you ask the instructor, via email, any questions about topics covered in the lessons.

Course Materials

Most World Campus courses require that students purchase materials (e.g., textbooks, specific software, etc.). To learn about how to order materials, please see the Course Materials page. You should check LionPATH approximately 3–4 weeks before the course begins for a list of required materials.

In addition to the required textbook, you will read parts of other books and some articles during the semester, as indicated in the course schedule at the end of this syllabus.

Supplemental Reading Sources

The following list of readings is meant to provide you with leads regarding the topics that will be covered in the course. Some sections from the books listed are among the required or recommended readings in the schedule at the end of this syllabus.

  • General Policy/Program Evaluation Texts

Alkin, M. C. (2012). Evaluation roots: A wider perspective of theorists’ views and influences (2nd ed.). Los Angeles, CA: Sage.

Bamberger, M., Rugh, J., & Mabry, L. (2011). RealWorld evaluation: Working under budget, time, data, and political constraints. Los Angeles, CA: Sage.

Boulmetis, J., & Dutwin, P. (2007). The ABCs of evaluation. Indianapolis, IN: Jossey-Bass.

Calley, N. G. (2010). Program development in the 21st century: An evidence-based approach to design, implementation, and evaluation. Los Angeles, CA: Sage.

Chen, H. (2004). Practical program evaluation. Thousand Oaks, CA: Sage.

Davidson, E. J. (2004). Evaluation methodology basics. Thousand Oaks, CA: Sage.

Emison, G. A. (2007). Practical program evaluations. Washington, DC: CQ Press.

Fink, A. (2004). Evaluation fundamentals (2nd ed.). Thousand Oaks, CA: Sage.

Fitzpatrick, J., & Christie, C. (2009). Evaluation in action: Interviews with expert evaluators. Los Angeles, CA: Sage.

Holden, D.  J., & Zimmerman, M. A. (2009). A practical guide to program evaluation and planning: Theory and case examples. Los Angeles, CA: Sage.

Kettner, P. M., Moroney, R. M., & Martin, L. L. (2012). Designing and managing programs: An effectiveness-based approach (4th Ed.). Los Angeles, CA: Sage.

King, J. A., & Stevahn, L. (2012). Interactive evaluation practice: Mastering the interpersonal dynamics of program evaluation. Los Angeles, CA: Sage.

Mathison, S. (2004). Encyclopedia of evaluation. Thousand Oaks, CA: Sage.

McDavid, J. C., & Hawthorn, L. R. L. (2005). Program evaluation and performance measurement. Thousand Oaks, CA: Sage.

Morra Imas, L. G., & Rist, R. C. (2009). Road to results: Designing and conducting effective development evaluations. [a World Bank electronic book]. (Available at the University Libraries e-Journals.)

Shaw, I., Greene, J. C., & Mark, M. M. (Eds.). (2006). The SAGE handbook of evaluation. Thousand Oaks, CA: Sage.

Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. Indianapolis, IN: Jossey-Bass.

Sylvia, R. D., & Sylvia, K. M. (2004). Program planning and evaluation for the public manager (3rd ed.). Long Grove, IL: Waveland Press. (This is a good practitioner-oriented book.)

United States General Accounting Office, Program Evaluation and Methodology Division (1991, March). Designing evaluations.

Vedung, E. (1997). Public policy and program evaluation. New Brunswick, NJ: Transaction Publishers. (This book offers a perspective of evaluation that is broader than the one described in our main textbook [Rossi et al.].)

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2007). Handbook of practical program evaluation. Indianapolis, IN: Jossey-Bass.

Zimmerman, M., & Holden, D. J. (2008). A practical guide to program evaluation planning. Thousand Oaks, CA: Sage.

  • Experiments and Quasi-Experiments in Evaluation Texts

Duflo, E., Glennerster, R., & Kremer, M. (2007). Using randomization in development economics research: A toolkit. In T. P. Schultz & J. Strauss (Eds.), Handbook of development economics (Vol. 4,  pp. 3895–3962). Amsterdam: Elsevier. (This is a good discussion of randomization and related methods.)

Glazerman, S., Levy, D., & Myers, D. (2003). Nonexperimental versus experimental estimates of earnings impacts. Annals of the American Academy of Political and Social Science, 589(1), 63–93.

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2003). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.

  • Meta-Analyses Texts

Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings. Thousand Oaks, CA: Sage.

Lipsey, M. W., & Wilson, D. B. (2000). Practical meta-analysis. Thousand Oaks, CA: Sage.

  • Statistics for Evaluation Texts

Bingham, R. D., & Felbinger, C. L. (2002). Evaluation in practice: A methodological approach (2nd ed.). New York: Chatham House. (This is a textbook on quantitative analytical methods used in evaluation.)

Langbein, L., & Felbinger, C. L. (2006). Public program evaluation: A statistical guide. Armonk, NY: M.E. Sharpe.

Macfie, B. F., & Nufrio, P. M. (2005). Applied statistics for public policy. New York: M.E. Sharpe. (This is a good refresher book on statistics.)

  • Needs Assessment Texts

Altschuld, J. W. (2009). Needs assessment: Phase II—collecting data. Los Angeles: Sage.

Altschuld, J. W., & Eastmond, J. N. (2009). Needs assessment: Phase I—getting the process started. Los Angeles: Sage.

Altschuld, J. W., & Kumar, D. D. (2009). Needs assessment: An overview. Los Angeles: Sage.

Altschuld, J. W., & White, J. L. (2009). Needs assessment: Analysis and prioritization. Los Angeles: Sage.

Soriano, F. I. (2012). Conducting needs assessments: A multidisciplinary approach (2nd Ed). Los Angeles: Sage.

Stevahn, L., & King, J. A. (2009). Needs assessment: Phase III—taking action for change. Los Angeles: Sage.

Witkin, B. R. (1995). Planning and conducting needs assessments. Thousand Oaks, CA: Sage.

  • Logic Models Texts

Frechtling, J. A. (2007). Logic modeling methods in program evaluation. Indianapolis, IN: Jossey-Bass.

Knowlton, L. W., & Phillips, C. C. (2012). The logic model guidebook: Better strategies for great results (2nd Ed.). Los Angeles: Sage.

  • Qualitative Methods for Evaluation Texts

Patton, M. Q. (2001). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Los Angeles: Sage.

Preskill, H., & Catsambas, T. T. (2006). Reframing evaluation through appreciative inquiry. Thousand Oaks, CA: Sage.

  • Concept Mapping and Evaluation Texts

Kane, M., & Trochim, W. M. K. (2007). Concept mapping for planning and evaluation. Thousand Oaks, CA: Sage.

  • Critical Perspectives on Evaluation Texts

Fischer, F. (1995). Evaluating public policy. Chicago: Nelson-Hall Publishers.
(This is a critical theoretical assessment of the mainstream evaluation methods. Fischer proposes an alternative approach to policy evaluation.)

Morçöl, G. (2002). A new mind for policy analysis: Toward a post-Newtonian and postpositivist epistemology and methodology. Westport, CT: Praeger.
(This is a critical theoretical assessment of some of the methods used in evaluation and their theoretical underpinnings.)

  • Social Network Analysis and Evaluation Texts

Durland, M. M., & Fredericks, K. A. (2006). Social network analysis in program evaluation. San Francisco: Jossey-Bass.

  • Evidence-Based Policy Texts

Donaldson, S. I., & Christie, C. A. (2009). What counts as credible evidence in applied research and evaluation practice? Los Angeles: Sage.

Pawson, R. (2006). Evidence-based policy. Thousand Oaks, CA: Sage.

  • Confirmative Evaluation Texts

Dessinger, J. C., & Moseley, J. L. (2007). Confirmative evaluation: Practical strategies for valuing continuous improvement. Indianapolis, IN: Jossey-Bass.

Using the Library

Many of the University Libraries resources can be utilized from a distance. Through the Libraries website, you can

  • access magazine, journal, and newspaper articles online using library databases;
  • borrow materials and have them delivered to your doorstep—or even your desktop;
  • get research help via email, chat, or phone using the Ask a Librarian service; and
  • much more. 

You must have an active Penn State Access Account to take full advantage of the Libraries' resources and services. The Off-Campus Users page has additional information about these free services.

Technical Requirements

Statistical Software Requirements

Before beginning this course, you should have learned basic data management and analysis skills, particularly using Excel spreadsheets and SPSS (Statistical Package for the Social Sciences) software. If you need to refresh your Excel or SPSS skills, you can study the tutorials available at lynda.psu.edu. Penn State students use the resources at this website free of charge.

If you need a desk reference for SPSS, the most recent editions of the following books may be useful:

  • George, D., & Mallery, P. (2013). IBM SPSS Statistics 21 Step by Step: A Simple Guide and Reference (13th Ed.). Pearson.

You will need access to the graduate student version of SPSS. You can purchase the software at the following links:

Please note that SPSS Statistics software is installed on all CLM (Classroom and Lab Computing) lab computers on all Penn State University campuses that participate in CLM.

World Campus Technical Requirements

Operating System: Windows Vista, Windows 7, or Windows 8 (the tablet-only RT version is not supported); Mac OS X 10.5 or higher
Processor: 2 GHz or higher
Memory: 1 GB of RAM
Hard Drive Space: 20 GB free disk space
Browser: We recommend the latest ANGEL-supported version of Firefox or Internet Explorer. To determine if your browser fits this criterion, and for advice on downloading a supported version, please refer to the ITS knowledge base article Supported Browsers and Recommended Computers. Note: Cookies, Java, and JavaScript must be enabled, and pop-up blockers should be configured to permit new windows from Penn State websites. Due to nonstandard handling of CSS, JavaScript, and caching, older versions of Internet Explorer (such as IE 6 or earlier) do not work with our courses.
Plug-ins: Adobe Reader [download from Adobe]; Flash Player (v7.0 or later) [download from Adobe]
Additional Software: Microsoft Office (2007 or later)
Internet Connection: Broadband (cable or DSL) connection required
Printer: Access to a graphics-capable printer
DVD-ROM: Required
Sound Card, Microphone, and Speakers: Required
Monitor: Capable of at least 1024 × 768 resolution

If you need technical assistance at any point during the course, please contact the Service Desk.

For registration, advising, disability services, help with materials, exams, general problem solving, visit World Campus Student Services!

Assignments and Class Paper

Analytical Assignments

There will be two analytical assignments during the semester. I will post the assignments on ANGEL on the dates indicated in the schedule below. These assignments should be completed individually. I will grade your papers according to the accuracy of knowledge presented, clarity and coherence in writing, and the appropriateness of the applications of concepts and methods to specific cases.

Quizzes/Tests

There will also be three quizzes/tests during the semester. I will post the questions for these quizzes on ANGEL on the dates indicated in the schedule below. You will have only a limited time to answer the questions; this is why you should have read all the required readings for the week, in addition to the readings from all previous lessons, before beginning the quiz.

Class Paper

You will also write a class paper and submit it at the end of the semester. You will have the following options for this paper:

  • Option 1: Design a needs assessment, process evaluation, or impact assessment study individually and write a paper.
  • Option 2: Conduct a meta-evaluation or a meta-analysis of evaluation studies individually and write a paper.

You should consult with me about the topic you want to choose for your paper early in the semester. I recommend that you submit an outline of your project to me as early as possible. I also recommend that you keep me informed about the progress you are making in your research project during the semester.

You will make a presentation on your paper to your classmates and me in one of the last two weeks of the semester (see the schedule). The purpose of this presentation is to give you feedback so that you can improve the paper before you submit it on the date indicated in the schedule. Your presentation will not be evaluated by me or your classmates for a grade, but you should still be as clear and organized as possible in your presentation.

Your paper should be between 9 and 10 pages in length (double-spaced, list of references and appendices not included).

Read the sample student papers in the Class Paper folder (link available in the left menu) carefully. You can learn from them how the information is organized when reporting the results of each kind of analysis (process evaluation, impact analysis, and meta-analysis). You may emulate the formats used in these papers, but, of course, the information you report in your paper should be original.

Grading

Grade Distribution

1. 2 Analytical Assignments: 40% (20% each)
2. 3 Quizzes/Tests: 30% (10% each)
3. Class Paper: 30%
Total: 100%

Letter Grades and Point Equivalents

A: 94.00 and above
A-: 90.00–93.99
B+: 87.00–89.99
B: 83.00–86.99
B-: 80.00–82.99
C+: 77.00–79.99
C: 70.00–76.99
D: 60.00–69.99
F: Below 60.00
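As an illustration of how the grade distribution and the letter-grade scale above combine, here is a minimal sketch. The function name and the sample scores are hypothetical; only the weights and cutoffs come from the syllabus.

```python
# Illustrative sketch: combine component scores (each 0-100) using the
# weights from the grade distribution above, then map the percentage to
# a letter grade using the scale above. Sample scores are hypothetical.

def course_grade(assignments, quizzes, paper):
    """Two analytical assignments (20% each), three quizzes/tests
    (10% each), and the class paper (30%). Returns (percentage, letter)."""
    pct = (0.20 * sum(assignments)   # 2 assignments -> 40% total
           + 0.10 * sum(quizzes)     # 3 quizzes/tests -> 30% total
           + 0.30 * paper)           # class paper -> 30%
    cutoffs = [(94, "A"), (90, "A-"), (87, "B+"), (83, "B"),
               (80, "B-"), (77, "C+"), (70, "C"), (60, "D")]
    letter = next((grade for low, grade in cutoffs if pct >= low), "F")
    return round(pct, 2), letter

print(course_grade(assignments=[90, 85], quizzes=[80, 90, 100], paper=88))
```

For example, scores of 90 and 85 on the assignments, 80, 90, and 100 on the quizzes, and 88 on the paper yield 88.4%, which falls in the B+ band.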

Please refer to the University Grading Policy for Graduate Courses for additional information about University grading policies.

Note: If you are planning to graduate this semester, please communicate your intent to graduate to your instructor. This will alert your instructor to the need to submit your final grade in time to meet the published graduation deadlines. For more information about graduation policies and deadlines, please go to the Graduation Information on the My Penn State Online Student Portal.

Information Sources

  • Recommended Information Sources on the Internet

Penn State’s electronic databases are very useful for literature searches and finding policy-related information. In particular, I recommend the ProQuest Direct and Worldwide Political Science Abstracts databases. The databases Policy File (Public Policy Research & Analysis) and CQ Researcher are good for finding policy-related literature and policy and program evaluation reports. In addition to these sources, I recommend the following as information sources for class projects, but you should search for additional sources of information and conduct surveys and interviews when necessary.

  • General Sources for Evaluation Studies

Online Evaluation Resource Library
Virtual Library: Evaluation
Western Michigan University Evaluation Center
Campbell Collaboration (Look under “The Campbell Library.”)

  • Statistics

The University of California at Berkeley’s Survey Documentation and Analysis Databases
U. S. Bureau of the Census
U.S. Bureau of Labor Statistics
Gateway to the Statistics of 100 Federal Agencies

  • Public Policy Think Tanks, Research Institutions, and Government Agencies

Rand Corporation
The Urban Institute
The Brookings Institution
The American Enterprise Institute
CATO Institute
Pew Charitable Trusts
The U.S. Government Accountability Office (Look under “Reports and Testimonies.”)
Pennsylvania Department of Auditor General (Look under “Reports Online.”)

  • Writing Style

Recommended style manual

American Psychological Association. (2010). Publication manual of the American Psychological Association (6th ed.). Washington, DC: Author.

You can find quick answers to your questions when using the American Psychological Association (APA) style at the following links:

Tutorial for the basics of the APA Manual

APA Style Essentials

Psychology with Style—A Hypertext Writing Guide

The University of Illinois at Urbana-Champaign, The Center for Writing Studies, APA Style Resources

In analytical assignments, quizzes, and class papers, you must follow the guidelines in the document Guidelines for Writing Class Papers. You should also use an appropriate writing style consistently (APA or University of Chicago). I recommend the APA style, but if you are more comfortable with and proficient in the Chicago style (a.k.a. the Turabian style), you may use it, as long as you use it accurately and consistently. I expect you to be familiar with the basic style guidelines of APA or Chicago; if you have any questions about style, ask me. I will not deduct points for mistakes in the mechanics of style (I will help you correct them), but I will deduct points if you do not give citations (i.e., no sources are cited or no reference list is provided) or do not use them systematically.

All assignment and project papers should be written clearly, coherently, and concisely. All papers must be paginated properly. Class project papers must be divided into appropriate sections with clear section headings. (See the APA, Chicago, or MLA style guidelines for appropriate use of section headings.) Appendices must be used sparingly. When used, an appendix must include only the material (tables, figures, lists, etc.) that cannot fit into the main body of the text and that is necessary to explain or illustrate one or more points made there; it must be kept to the necessary minimum length and must be referred to and discussed in the main text. If these requirements are not met in a paper, I will deduct points from your grade.

  • Learning Center

For information about writing support for a Penn State graduate course, visit the Graduate Writing Center Information page.

Course Schedule
Week 1: Definition, History, and Social Context of Evaluation Research

Required:

  • Go through the Orientation page (link is available in the left menu)
  • Rossi, Lipsey, & Freeman: Chapters 1 and 12
  • The Colorado Trust, "The Importance of Culture in Evaluation: A Practical Guide."

Recommended:

  • Alkin, Evaluation Roots (Read this book to gain a deeper understanding of the
    history and trends in evaluation research.)
  • Fischer, Evaluating Public Policy: Chapter 1 (This is a critical theory view of program evaluation.)
  • Fitzpatrick, Sanders, & Worthen, Program Evaluation. (Read the entire book for an understanding of alternative models of evaluation, the contexts of evaluation, and practical advice.)

None.

Week 2: Tailoring Evaluations, Identifying Issues, and Formulating Questions

Required:

  • Rossi, Lipsey, & Freeman: Chapters 2 and 3

Recommended:

  • Fischer, Evaluating Public Policy: Chapters 2–10
  • Chen, Practical Program Evaluation: Chapters 4 and 5 (These chapters are particularly good because they include practical advice on how to engage stakeholders in designing a program evaluation study.)

First quiz is due.

Week 3: Needs Assessment

Required:

  • Rossi, Lipsey, & Freeman: Chapter 4
  • Elliott, N. L., Quinless, F. W., & Parietti, E. S. (2000). Assessment of a Newark neighborhood: Process and outcomes. Journal of Community Health Nursing, 17(4), 211–224.
  • Holton, E. F., Bates, R. A., & Naquin, S. S. (2002). Large-scale performance-driven training needs assessment: A case study. Public Personnel Management, 29(2), 249–267.

Recommended:

  • McDavid & Hawthorn, Program Evaluation and Performance Measurement: Chapter 6
  • Lacey, J. (2009). Alcohol brief interventions: Exploring perceptions and training needs. Community Practitioner, 82(6), 30–33.

For more detailed information on needs assessments, read the following books in their entirety:

  • Altschuld, J. W., Eastmond, J. N. (2009). Needs assessment: Phase I—getting the process started. Los Angeles: Sage.
  • Altschuld, J. W., & Kumar, D. D. (2009). Needs assessment: An overview. Los Angeles: Sage.
  • Altschuld, J. W., & White, J. L. (2009). Needs assessment: Analysis and prioritization. Los Angeles: Sage.
  • Stevahn, L., & King, J. A. (2009). Needs assessment: Phase III—taking action for change. Los Angeles: Sage.
  • Witkin, B. R. (1995). Planning and conducting needs assessments. Thousand Oaks, CA: Sage.

None.

Week 4: Program Theory

Required:

  • Rossi, Lipsey, & Freeman: Chapter 5
  • Engel-Cox, J. A., Van Houten, B., Phelps, J., & Rose, S. W. (2008). Conceptual model of comprehensive research metrics for improved human health and environment. Environmental Health Perspectives, 116(5), 583-592.
  • Bond-Zielinski, C., & Moss, M. (2009). Using the logic model to develop family and consumer sciences programming. Journal of Family and Consumer Sciences, 101(1), 53–56.

Recommended:

  • Bamberger et al., RealWorld Evaluation: Chapter 9 (This chapter includes discussions of causality issues, mixed methods, etc.)
  • Chen, Practical Program Evaluation: Chapters 2, 4, and 5
  • McDavid & Hawthorn, Program Evaluation and Performance Measurement: Chapter 2 (The authors use a somewhat different terminology and discuss “program technologies.”)
  • United Way of America, Measuring Program Outcomes: A Practical Approach (In parts of this book, the authors discuss how to develop logic models and apply them. The logic model was developed by the United Way, and this book is where it was originally formulated.)

First analytical assignment will be posted.

Week 5: Assessing and Monitoring Program Process

Required:

  • Rossi, Lipsey, & Freeman: Chapter 6
  • Carswell, S. B., & Hanlon, T. E. (2009). A preventive intervention program for urban African American youth attending an alternative education program: Background, implementation, and feasibility. Education & Treatment of Children, 32(3), 445–469.
  • Young, D. R., et al. (2009). Process evaluation results from a school- and community-linked intervention: The Trial of Activity for Adolescent Girls (TAAG). Health Education Research, 23(6), 976–986.

Recommended:

  • Vedung: Chapters 9 and 13

First analytical assignment is due.

Week 6: Measuring and Monitoring Program Outcomes

Required:

  • Rossi, Lipsey, & Freeman: Chapter 7
  • Hendricks, M., Plantz, M. C., & Pritchard, K. J. (2008). Measuring outcomes of United Way–funded programs: Expectations and reality. In J. G. Carman & K. A. Fredericks (Eds.), Nonprofits and evaluation. New Directions for Evaluation, 119, 13–35.

Recommended:

  • Vedung: Chapters 9 and 13
  • United Way of America, Measuring Program Outcomes: A Practical Approach

Second quiz is due.

Week 7: Assessing Program Impact: Randomized Experiments

Required:

  • Rossi, Lipsey, & Freeman: Chapter 8
  • McKay, H., Sinisterra, L., McKay, A., Gomez, H., & Lloreda, P. (1978). Improving cognitive ability in chronically deprived children. Science, 200(4339), 270–278.
  • “Getting out the Youth Vote: Results from Randomized Field Experiments”

Recommended:

  • Bingham & Felbinger: pp. 57–59; pp. 77–78
  • Solberg, A. (1983). Community posthospital follow-up services. Evaluation Review, 7, 96–109. (This is an example of a post-test-only control group design.)
  • Duflo, E., “Scaling Up and Evaluation.” (This is a methodologically sophisticated application of experimental design in complex social conditions.)
  • Banerjee A., et al. “Remedying Education.” (This is a methodologically sophisticated application of experimental design in complex social conditions.)
  • Langbein & Felbinger: Chapters 3, 4, and 5
  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2003). Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin. (Entire book.)
  • Morçöl, G. (2002). A new mind for policy analysis: Toward a post-Newtonian and postpositivist epistemology and methodology. Westport, CT: Praeger. pp. 37–47.

None.

Week 8: Assessing Program Impact: Quasi-Experimental Designs

Required:

  • Rossi, Lipsey, & Freeman: Chapter 9
  • Newcomb, T. M. (1984). Conservation program evaluations: The control of self-selection bias. Evaluation Review, 8, 425–440. (This is an example of the pretest-posttest comparison group design. The researcher used a t-test to compare the groups.)
  • Powell, J. V., Aeby, V. G., Jr., & Carpenter-Aeby, T. (2003). A comparison of student outcomes with and without teacher-facilitated computer-based instruction. Computers & Education, 40, 183–191. (This is an example of a posttest-only comparison group design.)

Recommended:

  • Bingham & Felbinger: pp. 109–119; pp. 120–121
  • Moran, G. E. (1985). Regulatory strategies for workplace injury reduction: A program evaluation. Evaluation Review, 9, 21–33. (This is an example of an interrupted time-series analysis. The author used regression analysis with dummy variables.)
  • Langbein, L., & Felbinger, C. L. (2006). Public program evaluation: A statistical guide. Armonk, NY: M.E. Sharpe. Chapter 6
  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2003). Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin. (Entire book.)
  • Morçöl, G. (2002). A new mind for policy analysis: Toward a post-Newtonian and postpositivist epistemology and methodology. Westport, CT: Praeger. pp. 47–54.

Second analytical assignment will be posted.

Week 10: Assessing Program Impact: Reflexive Designs

Required:

  • Rossi, Lipsey, & Freeman: Chapter 9 (pp. 289–295 only)
  • Goetz, E. G. (1995). A little pregnant: The impact of rent control in San Francisco. Urban Affairs Review, 30(4), 604–612. (This is an example of a simple time-series design. Fully understanding it requires some background in time-series analysis, which we do not cover in this course, but you can still follow what the author tried to do. Ignore the mathematical parts of this article.)

Recommended:

  • Bingham & Felbinger, pp. 153–154; pp. 166–167
  • Langbein, L., & Felbinger, C. L. (2006). Public program evaluation: A statistical guide. Armonk, NY: M.E. Sharpe. Chapter 7
  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2003). Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin. (Entire book.)
  • Morçöl, G. (2002). A new mind for policy analysis: Toward a post-Newtonian and postpositivist epistemology and methodology. Westport, CT: Praeger. pp. 47–54.

Second analytical assignment is due.

Week 11: Detecting, Interpreting, and Analyzing Program Effects; Meta-Analysis

Required:

  • Rossi, Lipsey, & Freeman: Chapter 10
  • Ennett, S. T., Tobler, N. S., Ringwalt, C. L., & Flewelling, R. L. (1994). How effective is drug abuse resistance education? A meta-analysis of Project DARE outcome evaluations. American Journal of Public Health, 84(9), 1394-1401.
  • Wilson, S. J., & Lipsey, M. W. Effects of school-based social information processing programs (a Campbell Collaboration paper)

Recommended:

  • Langbein & Felbinger: Chapter 9
  • Bingham & Felbinger, pp. 225–226; pp. 237–239
  • Hunter & Schmidt (entire book)
  • Lipsey, M. W., & Wilson, D. B. (2000). Practical meta-analysis. Thousand Oaks, CA: Sage. Entire book.

None.

Week 12: Measuring Efficiency

Required:

  • Rossi, Lipsey, & Freeman: Chapter 11
  • Malitz, D. (1984). The costs and benefits of Title XX and Title XIX family planning services in Texas. Evaluation Review, 8, 519–536. (This is an example of ex-post cost-benefit analysis.)
  • Graveley, E. A., & Littlefield, J. H. (1992). A cost-effectiveness analysis of three staffing models for the delivery of low-risk prenatal care. American Journal of Public Health, 82(2), 180–184. (This is an example of ex-post cost-effectiveness analysis.)

Recommended:

  • Bingham & Felbinger, pp. 181–182; 194–196; 239
  • Jones, Bumbarger, Greenberg, Greenwood, & Kyler, “The Economic Return on PCCD’s Investment in Research-Based Programs: A Cost-Benefit Assessment of Delinquency Prevention in Pennsylvania.”
  • Roman, Chalfin, Reid, J., & Reid, S. “Impact and Cost-Benefit Analysis of the Anchorage Wellness Court”

Third quiz is due.

Week 13: Preparation for Class Papers
  • None.

None.

Week 14: Presentations of Class Papers
  • None.

Upload your PowerPoint presentation to the Discussion Forum for Class Paper Presentation.

Week 15: Revisions on Class Papers
  • None.

Make revisions to your paper and prepare to submit it next week.

Week 16: Class Paper is Due
  • None.

Submit your paper through turnitin.com. See the turnitin submission information at the course website.

Formal instruction will end on the last day of class. Provided that you have an active Penn State Access Account user ID and password, you will continue to be able to access the course materials for one year, starting from the end date of the academic semester in which the course was offered (with the exception of library reserves and other external resources that may have a shorter archival period). After one year, you might be able to access the course based on the policies of the program or department offering the course material, up to a maximum of three years from the end date of the academic semester in which the course was offered. For more information, please review the University Course Archival Policy.

Academic Integrity

According to Penn State policy G-9: Academic Integrity, an academic integrity violation is “an intentional, unintentional, or attempted violation of course or assessment policies to gain an academic advantage or to advantage or disadvantage another student academically.” Unless your instructor tells you otherwise, you must complete all course work entirely on your own, using only sources that have been permitted by your instructor, and you may not assist other students with papers, quizzes, exams, or other assessments. If your instructor allows you to use ideas, images, or word phrases created by another person (e.g., from Course Hero or Chegg) or by generative technology, such as ChatGPT, you must identify their source. You may not submit false or fabricated information, use the same academic work for credit in multiple courses, or share instructional content. Students with questions about academic integrity should ask their instructor before submitting work.

Students facing allegations of academic misconduct may not drop/withdraw from the affected course unless they are cleared of wrongdoing (see G-9: Academic Integrity). Attempted drops will be prevented or reversed, and students will be expected to complete course work and meet course deadlines. Students who are found responsible for academic integrity violations face academic outcomes, which can be severe, and put themselves at jeopardy for other outcomes which may include ineligibility for Dean’s List, pass/fail elections, and grade forgiveness. Students may also face consequences from their home/major program and/or The Schreyer Honors College.

How Academic Integrity Violations Are Handled
World Campus students are expected to act with civility and personal integrity; respect other students' dignity, rights, and property; and help create and maintain an environment in which all can succeed through the fruits of their own efforts. An environment of academic integrity is requisite to respect for oneself and others, as well as a civil community.

In cases where academic integrity is questioned, the Policy on Academic Integrity indicates that procedure requires an instructor to inform the student of the allegation. Procedures allow a student to accept or contest a charge. If a student chooses to contest a charge, the case will then be managed by the respective college or campus Academic Integrity Committee. If that committee recommends an administrative sanction (Formal Warning, Conduct Probation, Suspension, Expulsion), the claim will be referred to the Office of Student Accountability and Conflict Response.

All Penn State colleges abide by this Penn State policy, but review procedures may vary by college when academic dishonesty is suspected. Information about Penn State's academic integrity policy and college review procedures is included in the information that students receive upon enrolling in a course. To obtain that information in advance of enrolling in a course, please contact us by going to the Contacts & Help page.

Notes from the instructor:

I will enforce the academic integrity policies of the Pennsylvania State University. These policies can be found at the following web site: Academic Integrity at Penn State. The following items are particularly important. You should understand the meaning of these terms and avoid committing the defined acts:

Plagiarism: The fabrication of information and citations; submitting others’ work from professional journals, books, articles, papers, electronic sources of any kind, or the submission of any products from commercial research paper providers regardless of what rationales a vendor uses; submission of other students’ papers or lab results or project reports and representing the work as one’s own; fabricating, in part or total, submissions and citing them falsely. Note: Copying and pasting any materials from the World Wide Web without proper citation is plagiarism.

Acts of Aiding and Abetting: Facilitating acts by others; unauthorized collaboration on work; permitting another to copy from an exam; permitting another to copy from a computer program; writing a paper for another; inappropriately collaborating on a homework assignment or exam without permission or when prohibited, etc.

Submitting Previous Work: Submitting a paper, case study, lab report, or any assignment that had been submitted for credit in a prior or concurrent class without the knowledge and permission of the instructor(s).

Failure to Cite Electronic Resources Regardless of the Source: All electronic resources must be cited in every report, paper, project, portfolio, or any other document submitted for evaluation by an instructor.

The issues of academic integrity are also discussed in the document “Guidelines for Writing Class Papers.” I will discuss these issues at the beginning of the class (see the schedule).

General Course Policies

Policies Regarding the Protection of Human Subjects

If you are planning to conduct an empirical study that involves “human subjects” (interviews, surveys, or even secondary data analysis), you must read the policies and guidelines of the Penn State Office of Research Protections (ORP) at Research at Penn State. Normally, class projects are exempt from Institutional Review Board (IRB) reviews (see the policies on IRB reviews at Research at Penn State). However, you must keep in mind that you should not present or publish the findings/results of your class project outside the class at any time. If there is a possibility that you may use your results outside the class in the future, you should submit a proposal for IRB approval before you begin the empirical part of your project. If you need any clarification regarding the procedures to follow in your project, consult with me and/or the ORP experts (contact information is available on the ORP website, or call 814-865-1775).

Students with Disabilities

Penn State welcomes students with disabilities into the University's educational programs. Every Penn State campus has resources for students with disabilities. The Student Disability Resources (SDR) website provides contacts for disability services at every Penn State campus. For further information, please visit the SDR website.

In order to apply for reasonable accommodations, you must contact the appropriate disability resources office at the campus where you are officially enrolled, participate in an intake interview, and provide documentation based on the documentation guidelines. If the documentation supports your request for reasonable accommodations, your campus's disability resources office will provide you with an accommodation letter. Please share this letter with your instructors and discuss the accommodations with them as early in your courses as possible. You must follow this process for every semester that you request accommodations.

Disclaimer: Please note that the specifics of this Course Syllabus are subject to change, and you will be responsible for abiding by any such changes. Your instructor will notify you of any changes.
