Guide for Design and Implementation of Hybrid–Flexible (HyFlex) Models in Adult Education
  • Section 5

    Program Evaluation

    A common problem in education program design is that evaluation is left to the end of the plan or omitted entirely because its importance is often not understood. Research shows that education evaluation is particularly useful because it can provide information for data-driven decision-making to improve the program (Benedict, 1973). With new models, such as HyFlex, program evaluation and research are even more important. An in-depth discussion of HyFlex program evaluation is not possible in this guide because implementation is so new and, in many education organizations, still in a pilot phase, but we will touch upon possible HyFlex program evaluation topics and some instruments that programs can use to evaluate their own pilots.

    First, however, it is important to understand the difference between formative and summative evaluation. Formative education program evaluation takes place during the development or piloting of a new course, model, or program. Summative education program evaluation takes place once the program is fully developed, to determine its success or efficacy. Summative program evaluation is often conducted by an independent, objective program evaluator or evaluation organization. At this point in the development of HyFlex models for adult education, most, if not all, evaluation is formative.

    Second, the beginning of a good program evaluation is a clear and agreed-upon statement of goals and objectives, especially measurable objectives. This drives what will be observed and measured. It is also helpful to have a clear vision of what kinds of decisions need to be made so that data can be collected to inform those decisions. In the case of the HyFlex model, which has four underlying principles, some of the data may be collected to determine the extent to which one or more principles are in evidence in the implementation of the model. For example, one of the principles is “Equivalency: Provide learning activities in all participation modes which lead to equivalent learning outcomes,” so as part of a formative program evaluation you may want to study the extent to which learners who choose primarily one or another of the three modes have equivalent learning outcomes. To do so, it would be important to have a clear and measurable set of learning outcomes and a way to pre- and post-assess them.
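
    As a concrete illustration of how equivalency might be examined in data, the short Python sketch below computes average pre-to-post assessment gains for learners grouped by their primary participation mode. It is a minimal, hypothetical example: the record layout, field names, and scores are invented for illustration and are not drawn from any particular program, assessment, or learning management system.

# Minimal sketch: comparing pre/post assessment gains by primary participation mode.
# All learner records, field names, and scores here are hypothetical placeholders.
from statistics import mean

learner_records = [
    # (learner_id, primary_mode, pre_score, post_score)
    ("L01", "in-person", 42, 68),
    ("L02", "synchronous online", 40, 65),
    ("L03", "asynchronous online", 45, 60),
    ("L04", "in-person", 38, 70),
    ("L05", "asynchronous online", 50, 72),
    ("L06", "synchronous online", 44, 69),
]

# Group learning gains (post-assessment minus pre-assessment) by primary mode.
gains_by_mode = {}
for learner_id, mode, pre, post in learner_records:
    gains_by_mode.setdefault(mode, []).append(post - pre)

# Report the average gain and number of learners for each mode.
for mode, gains in sorted(gains_by_mode.items()):
    print(f"{mode}: mean gain {mean(gains):.1f} points (n={len(gains)})")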

    Below are some topics to consider for a formative program evaluation of your HyFlex model.

    Potential Program Evaluation Topics

    As you think about your formative program evaluation design, you may want to engage an expert to help with both the design of the evaluation and its implementation. It is especially important to have someone with expertise in formative evaluation, beginning with the program goals, objectives, and decisions that may benefit from information (data) to guide program improvement. (This, after all, is the purpose of formative program evaluation.)

    Here is a range of evaluation topics to consider as you think about what you want your formative evaluation to accomplish:

    Teachers’ Perspectives

    You may wish to gather teachers’ perspectives on the opportunities and challenges of using a particular HyFlex model for a specific course. Challenges might include using digital technology in the classroom and remotely; engaging both in-person and online learners; assuring that all three modes provide equivalent outcomes for learners; and assuring that all learners have high-quality internet connections so they can learn remotely. You may also wish to solicit teachers’ perspectives on opportunities they have had to observe improved attendance, retention, course completion, or learning gains using the HyFlex model, as well as their observations about which kinds of learners are particularly successful or unsuccessful in HyFlex courses. And you will certainly want to know their overall degree of satisfaction with teaching in the HyFlex mode as compared with solely in-person, remote, or other hybrid or blended models.

    Learners’ Perspectives

    You will certainly want to have learners’ perspectives on the benefits and challenges of using a particular HyFlex model for a specific course, including: the extent to which they believe the HyFlex course has met their needs, goals, or objectives; the degree to which they found particular HyFlex modes engaging or not; the digital technology challenges they faced in the classroom and remotely; and their overall degree of satisfaction with the HyFlex model as compared with traditional in-person, entirely remote, or other hybrid or blended models. You may wish to include in a learner evaluation survey a question about what learners liked and did not like about each mode.

    Teachers’ and Program Administrators’ Questions

    Teachers and administrators may wish to know whether there is evidence of learners’ attainment of course learning objectives (intended learning outcomes). They may wish to see a comparison of adult learners’ outcome attainment with that in identical or similar courses delivered solely in person, as hybrid or blended models, or as distance education. Adult education research from two studies (Inverso et al., 2017; Rose et al., 2019) suggests that a combination of online and face-to-face instruction is more effective for learning than distance education or in-person classes alone, so it may be of interest to replicate those evaluations in their own programs. Teachers and administrators may also be interested in learner participation patterns, for example, data on learners’ use of each of the three HyFlex modes, or evidence of learner participation in in-person discussions compared with synchronous and asynchronous online discussions. Program administrators who chose the HyFlex model in part because they hoped to serve more learners with limited classroom space might be interested in how many students the HyFlex program can serve compared with an in-person-only model. Many teachers and administrators will also be interested in a comparison of learners’ performance across the different participation modes.

    Potential HyFlex Program Evaluation Instruments

    The HyFlex model presents many possibilities for observation and measurement instruments, some of which can be administered in more than one mode. Among these are oral or written learner surveys; learner or teacher focus groups; case studies; online or in-person quizzes, tests, or written exams; and systematic teacher direct-observation checklists. In addition, some programs may want to consider competency-based direct performance measures for certain curriculum learning objectives, or direct unobtrusive measures, such as attendance in the in-person and online synchronous modes, hours automatically logged in the asynchronous mode, or the number of times learners answer questions or make comments in the online or in-person modes.
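
    To illustrate how such unobtrusive measures might be summarized, the Python sketch below tallies synchronous attendance and asynchronous hours from a simple activity log. The log format, field names, and values are hypothetical placeholders; in practice this information would come from your own attendance records or learning management system reports.

# Minimal sketch: summarizing unobtrusive participation measures from an activity log.
# The log entries, field names, and values below are hypothetical placeholders.
from collections import defaultdict

activity_log = [
    # (learner_id, mode, value): value is 1 per session attended in the two
    # synchronous modes, and hours logged for asynchronous entries.
    ("L01", "in-person", 1),
    ("L01", "asynchronous online", 2.5),
    ("L02", "synchronous online", 1),
    ("L02", "asynchronous online", 4.0),
    ("L03", "in-person", 1),
]

synchronous_sessions = defaultdict(int)   # sessions attended per learner
asynchronous_hours = defaultdict(float)   # hours logged per learner

for learner_id, mode, value in activity_log:
    if mode == "asynchronous online":
        asynchronous_hours[learner_id] += value
    else:
        synchronous_sessions[learner_id] += value

# Report a simple participation summary for each learner.
for learner_id in sorted(set(synchronous_sessions) | set(asynchronous_hours)):
    print(f"{learner_id}: {synchronous_sessions[learner_id]} synchronous sessions, "
          f"{asynchronous_hours[learner_id]:.1f} asynchronous hours")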

    Conclusion

    As Brian Beatty (2019) has noted, HyFlex instructional formats are relatively recent in higher education. They are newer still in adult foundational education. In his chapter on evaluation, he has summarized some of the findings on HyFlex in higher education, and some of these may be relevant to adult foundational education HyFlex programs. As of this writing, there have been no published formative or summative evaluations of HyFlex models in adult foundational education. We hope that in future versions of this guide we may be able to report differently.

    Questions to Consider

    1. For what goals, objectives, questions, or decisions do you need data in order to continuously improve your HyFlex model?

    2. For each of these, what kinds of data will need to be collected, and how do you plan to collect them?

    References

    Beatty, B. J. (2019). Hybrid-Flexible Course Design (1st ed.). EdTech Books. https://edtechbooks.org/HyFlex.

    Benedict, L. G. (1973). The Fortune/Hutchinson evaluation methodology: A decision oriented approach. American Educational Research Association, New Orleans, Louisiana.

    Inverso, D. C., Kobrin, J., & Hashmi, S. (2017). Leveraging technology in adult education. Journal of Research and Practice for Adult Literacy, Secondary, and Basic Education, 6(2), 55–58.

    Rose, G. L., Wang, C., Sainz, A., & Joshi, S. (2019). Technology use and integration in adult education and literacy classrooms. Adult Education Research Conference. https://newprairiepress.org/aerc/2019/papers/2
