The rapid evolution of instructional design, its relative novelty, and trends impacting it serve to cloud understanding and complicate practice. This study sought insight into an area of instructional design practice in higher education by exploring a subset of survey data gathered in early 2018. In part, the survey asked instructional designers and leaders of instructional design teams working in higher education settings which design models and theoretical frameworks guided their work. Nearly two hundred individuals provided responses. The answers offered most often included models with long histories relative to instructional design at large, such as ADDIE and Backward Design. Technology's impact on instructional design was also apparent in the inclusion of technology-focused frameworks including TPACK and SAMR. Statistical testing failed to reveal significant relationships between the quantity of models reported in use and other characteristics of designers; however, some relationship may exist involving education and time in the field. Altogether, this may suggest, as reported by a small number of subjects, that the design process can, or even ought to, remain ill-defined and fluid to best respond to the unique needs presented by each subject matter expert or design project.
Instructional design’s novelty as a profession creates what Sharif and Cho (2015) call a shroud of obscurity, which limits our collective understanding of practices in the field. Reiser (2001) indicates that instructional design was not recognized as a separate field until the 1960s. In the five decades since, instructional design has evolved rapidly as demand has grown. The United States Bureau of Labor Statistics (2016) projects an average expected growth rate of 7% through 2024. Certain trends, while contributing to increased demand for instructional design, have also complicated it and further challenge our understanding. The proliferation of online learning, expanding technology toolsets, increased availability and access, and the emergence of an array of learning environments are among the most notable trends (Allen & Seaman, 2016; Kim et al., 2007). A role in overall program design and, by extension, an impact on the future of online instruction at an institutional level often also position instructional designers as leaders (Shaw, 2012). Nevertheless, while perhaps failing to address nuances, definitions of instructional design such as “the systematic process of translating principles of learning and instruction into plans for instructional materials and activities” (Smith & Ragan, 1993, p. 2) are concise, even pedestrian, and yet remain widely accepted. An updated definition added only that the process should be reflective and iterative, in addition to systematic (Smith & Ragan, 2005). Accepted as readily as these simple definitions is the notion that the design process, and education itself, are not so simple. Caplan (2004), referring to the digital arena of education, describes online course design as “a complex endeavor” necessitating “a highly organized, concerted effort” (p. 186).
Education is described by Lohr and Ursyn (2010) as more complicated than rocket science, and instructional design is styled as “a special kind of problem solving” (Simon, 1998, p. 345), which requires practitioners to possess a variety of abilities and personal characteristics (Hatcher, 2008). Yanchar and Gabbitas (2011) encourage instructional designers to demonstrate adaptability, versatility, and openness to various sources of insight in order to be successful. Simply put, much is required of instructional designers as they work to translate outcomes into learning activities and measures.
Using prescribed steps or frameworks, conceptually situated between model and method, designers create and recreate instructional sequences which have proven successful in past efforts (Andrews & Goodson, 1980). In this way instructional designers develop content in a manner described by Friesen (1973), which involves the application of a set of logical steps aimed at accomplishing specified learning outcomes. The practice of applying a model to the design of instruction is “a kind of game plan for…development efforts” (Andrews & Goodson, 1980, p. 4) which assists practitioners in navigating complexities, offers a degree of scalability, and supports quality control. Considering the value promised by using a model, it is not surprising that the number of accepted steps found in the design of instruction grew, “as task analysis, objective specification, and criterion-referenced testing were linked together to form processes” (Reiser, 2001, p. 61). These processes then evolved into models, bridging instructional design’s evolution from the 1960s into the 1970s and beyond.
This paper provides insight into current practice relative to model and theoretical framework use within higher education settings, based on data gleaned from a survey of instructional design professionals and other individuals leading instructional design teams. A review of literature is included to concisely highlight the evolution of instructional design models, describe the purposes these models commonly serve, and comment briefly on the application of, and challenges inherent in, the use of models. Our methods, findings, and discussion follow, including potential implications for practice; we conclude with the limitations of our efforts and directions for future research and inquiry.
Review of Literature
Relatively early on, as instructional design evolved into a field unto itself, systematic approaches to design quickly became standard practice. Early influences supported the emergence of such systems – models meant to standardize and guide designers’ work. Reiser (2001) credits Skinner (1954) with creating a minor revolution in education by being the first to describe content as programmed instructional materials and setting forth the idea of presenting instruction in small steps. Skinner (1954) also noted the importance of interaction, positing that learners be questioned and, in response to their answers, be provided with immediate feedback. What could be called a feedback loop here suggests motivational implications. As each subsequent “step was small, it was thought that learners would answer all questions correctly and thus be positively reinforced” (Reiser, 2001, p. 59). Skinner (1954) identified learning as behavioral change and therefore focused on reinforcement to both encourage and recognize observed behavioral change, and thereby to recognize learning. Bloom et al. (1956) likewise contributed to early model design by providing a way to categorize learning objectives and arrange them hierarchically. The perspective of Bloom et al. (1956) was later supported by Mager (1962), with focus placed on the importance of objectives and their successful authoring by educators. Equally important was the development of a shared understanding of formative and summative assessment in the field. Initially, however, these assessment monikers referred not to the evaluation of student performance, but to the instructional materials themselves. Reiser (2001) indicates that only a small number of early instructional products, despite being produced systematically, were tested in any meaningful way.
The 1960s additionally brought forth another key evolution in the design of instruction, the emergence of criterion-referenced testing. Testing until that time was “designed to spread out the performance of learners, resulting in some students doing well and others doing poorly” (Reiser, 2001, p. 60). In contrast, criterion-referenced tests measure an individual’s behavior independent of others. This sort of assessment also became a key element of instructional design processes.
The 1970s could fairly be described as a decade of model development in instructional design. Andrews and Goodson (1980) provided three pivotal outcomes by investigating models in use at the time, identifying the purposes for those models, and offering conclusions germane to model design and application. The research identified and categorized forty examples of instructional design models (Appendix B), which had emerged during the prior decade, and established four purposes for instructional design: improving teaching and learning, oversight of design through monitoring, evaluation of processes, and testing or building of learning from a theory-based perspective. Finally, as noted above, Andrews and Goodson offered assertions relating to design models, the possible reasoning behind the relatively large number of models, and what imperatives both suggested to practitioners. In some cases, models were identified as “generic in that they may be applied across differing purposes, emphases, origins, uses and settings” (p. 12). However, it was further concluded that some “models were not models at all in that they fail to describe, explain, or predict elements in their referent system” (p. 13). The combination of these two factors chiefly explains the quantity of models existing at the time and also indicates the need for caution, as others were likely yet to come. Essentially, because so few models had been in use for significant periods of time, and were therefore not thoroughly vetted, some designers were simply inclined to invent their own model rather than trust one already in existence (Andrews & Goodson, 1980). As an emerging field, instructional design, or, more aptly, its practitioners, were working toward establishing value. One of the two principles asserted by Friesen (1973) suggests a design process with external expertise, e.g., an instructional designer, is necessary.
The other Friesen principle, however, indicates instruction can be created effectively without such support, by “a master teacher, working alone to create an inspired work of art” (p. 2). Andrews and Goodson focused on the notion that a design process supported by an instructional designer, in addition to a subject matter expert, is the more effective approach. This push and pull between a subject matter expert’s skill as an educator and the expertise a designer brings continues to impact practice in the present (Tate, 2017). Nevertheless, a recent study of student perceptions supports instructional designer-aided course development. In an investigation of student perceptions of quality across four models of course design, spanning a 3-year period, Brown, Lewis, and Toussaint (2018) found significant support for instructional designer-supported course design across all eight of their tested standards.
The implications of this evolution in model development and use are key. On the one hand, the use of a model – any model – can be valuable in providing a frame within which to work, a way to represent that work to others, a means to scale an effort, and a way to ensure consistency. However, the choice of a model and its application can raise questions. If, for example, a designer relies on a single model – one which may not have been appropriately tested, which would be deemed not a model at all by the standards of Andrews and Goodson (1980), or which is perhaps even the invention of the designer – its efficacy could be questionable. Rigidity can also contribute to conflict between designer and subject matter expert (Tate, 2017). In an updated investigation of popular instructional design models, Mutlu (2016) asserts that much of current instructional design practice fits within the broad umbrella of ADDIE. The ADDIE model originates among the models included in Andrews and Goodson (1980), specifically Branson (1975). While the nature of Mutlu’s (2016) investigation suggests some consolidation of models used in practice, and even alignment with a shared guiding process (i.e., ADDIE), the need for a more comprehensive analysis, encompassing more models, is also acknowledged. At the heart of Mutlu’s findings is the observation that while “developmental attempts of instructional designers will result in different variations of…models” (p. 6159), some similarities are inevitable.
Methods
In January of 2018, a survey was distributed (Appendix A) via email to individuals subscribed to various lists including the Arizona State University Blackboard Users Group, Michigan Blackboard Users Group (MiBUG), Professional and Organizational Development (POD) Network, and the University Professional and Continuing Education Association (UPCEA). The survey was also distributed to a list of individuals in teaching and learning/e-learning/instructional design leadership roles in higher education across the United States. The target population, instructional designers working in higher education, was selected for convenience and to align with the researchers’ interests.
A web-based questionnaire, created with and hosted on Qualtrics®, was selected to efficiently collect data from many respondents (Trochim, 2006; Wyse, 2012). The instrument was developed to gather information about instructional designers, and others working in the instructional design field under other job titles, within higher education. Items were adapted with permission from surveys conducted earlier by Intentional Futures (2016) and Sharif and Cho (2015). Other concepts underpinning survey items were inspired by the prior work of Miller (2007) and Gibby, Quiros, Demps, and Liu (2002). Earlier inquiry by the authors, unrelated to this research but partially inspiring its direction, involved interacting with instructional design staff from several other institutions, including Michigan State University, Virginia Tech, the University of Arizona, and SUNY Polytechnic Institute, about their practices related to online course design. In essentially all cases, learning design was occurring by virtue of what Hixon (2008) refers to as “team-based course development” (p. 2), an approach which is common at schools focused exclusively on online education (Hixon, 2008). At more traditional institutions, however, faculty typically have “freedom and responsibility to design and develop courses autonomously” (Hixon, 2008, p. 2). Despite these insights into process, the precise models or guiding theoretical frameworks were not ascertained. Therefore, our intent here was to determine which models and frameworks were generally guiding instructional design practice. Information collected by the survey includes the models, frameworks, and theories in use by the designers and, when applicable, their teams. Individual characteristics (e.g., gender), experience, education, and role were also collected. Additional variables (e.g., leadership and role diversification scores) were derived from the data.
Sampling and Data Procedures
Before distributing the survey to the recipients described above, the instrument entered a pilot phase amongst a group of approximately fifteen doctoral students and faculty, as well as the instructional design staff of a Midwestern university’s teaching and learning center. Edits and suggestions were received and implemented; following this process, the survey link was distributed as previously explained. The sample was chiefly one of convenience, as participants were selected because they were available and, presumably, willing (Creswell, 2015). Snowball sampling may have also occurred, as recipients of the invitation were asked to forward it to others whom they believed matched the indicated criteria (Goodman, 1961).
Data was collected over a four-week timeframe in early 2018. Though the survey instrument consisted of four question blocks and used conditional branching to assure that individuals who met specific criteria were exposed to a certain set of questions, this inquiry focuses primarily on data gathered by a single item. An open-ended question, number 18, asked subjects to “Please indicate which theoretical framework(s) or model(s) from the literature underpin your instructional design practice.”
Findings
Nearly three hundred (297) individuals responded to the survey, yielding 254 completed submissions, of which 247 proved usable (subjects who opted not to provide gender, indicated non-binary, or skipped the question were removed). Most respondents were female (67%); 30% indicated being male. A large majority of respondents, more than 95%, in fact, were employed full-time. The gender and employment demographics, as well as the findings related to formal education, are relatively consistent with earlier studies such as Intentional Futures (2016) and Bean (2014), as well as U.S. Bureau of Labor Statistics data. Formal instructional design education was indicated by 72% (178) of subjects; 59 specified not having formal education in the field. Collectively, the group possesses substantial credentials; more than 60% (153) have earned graduate degrees and another 29% (72) hold terminal degrees. In other words, just one in ten survey respondents possessed only a baccalaureate degree or less education.
The focus of this investigation relates specifically to the area of models, frameworks, and theories which were reported to be currently in use among the instructional designers who responded to the survey. Of 247 possible, 196 subjects (79%) indicated a theoretical framework, model, etc. when prompted. The rate of response to this item was among the lowest across the entire instrument. Among those subjects offering a response, 111 (57%) indicated multiple (>1) theories and/or models, 85 (43%) offered one response; the greatest number of elements offered by any single respondent was eleven.
ADDIE was the most often indicated instructional design model, reported by 81 subjects (41%). Backward design was mentioned frequently as well, by 58 subjects (30%). Frequencies dropped noticeably from there, as numerous other models and theories were reported. The most popular of those mentioned were Bloom’s Taxonomy, Quality Matters, Constructivism, and six others, as noted in Table 1, below.
Table 1
Models Reported by Participants
Response                         Number of Subjects    Percent (%) of Total (n = 196)
ADDIE                            81                    41.3
Backward Design                  58                    29.6
Bloom’s Taxonomy                 18                     9.2
Quality Matters                  16                     8.2
Constructivist/Constructivism    12                     6.1
Dick & Carey                     11                     5.6
Fink                             11                     5.6
Knowles, Adult Learning          10                     5.1
SAMR                              7                     3.6
TPACK                             4                     2.0
In addition to the responses quantified above, some subjects opted to respond differently to the question, offering narrative input. Though these responses represented a small minority, their content gravitated toward two themes: either a rather wide range of practices was in place, or essentially no formal models or frameworks were in use. In each case, it was noted that the respective condition existed because each project, course, or subject matter expert is handled individually and uniquely addressed based on the needs presented.
To explore relationships between the data gathered regarding models and frameworks in use and other factors, we compared subjects who responded to the item with those who did not. Across genders, males and females responded proportionately: approximately 23% of males (18) did not respond, and 22% of females (37) likewise provided no answer. Other characteristic groups were not as consistent in their response, or lack thereof.
Among subjects indicating formal instructional design education (174), 19% (33) did not offer a response, compared to 30% (17) of respondents who indicated a lack of formal ID education. Continuing the focus on education, comparisons were made based on subjects’ perceptions of how well their education prepared them for instructional design work. Of the largest grouping of subjects (89), who indicated their education prepared them for most aspects of instructional design practice, 20% (18) did not indicate a model or framework. Thirteen percent (4) of subjects indicating that their education prepared them for all aspects of practice offered no response. Those who indicated being prepared for only some aspects of instructional design work (63) left the item blank with the greatest frequency, 25% (16).
By virtue of conditional branching, subjects who indicated formal instructional design education were also asked how long ago that education had been obtained. The majority (161) obtained their education within the prior fifteen years. Within this group, 20% did not respond. Among those with instructional design education from sixteen or more years ago (28), however, just three subjects (approximately 10%) did not answer. Results of similar analysis, based on indicated years of experience in the field, a question posed to all subjects, were also calculated. Those subjects with the least experience in instructional design, less than five years (68), were the most likely to leave the response blank, doing so in 32% of cases. The highest response rate was achieved by those indicating 6-10 years of experience, also the largest grouping of respondents (73); just 15% of these respondents did not provide a response to the survey question.
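As a rough illustration of the group comparisons above (the survey report does not specify its statistical procedures, so the choice of a chi-square test of independence, the 2x2 grouping, and the exact cell counts, reconstructed from the figures reported here, are all assumptions), the response-rate difference by formal instructional design education can be sketched in plain Python:

```python
# Hedged sketch: chi-square test of independence on reported response counts.
# Assumed 2x2 table, drawn from figures reported above: 174 subjects with
# formal ID education (33 gave no response to the item) and 59 without
# (treating the reported ~30% non-response as 17 of 59).

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: formal ID education yes / no; columns: responded / did not respond.
table = [[174 - 33, 33],
         [59 - 17, 17]]

chi2 = chi_square_2x2(table)
critical = 3.841  # chi-square critical value for df = 1, alpha = .05

print(f"chi-square = {chi2:.2f}; significant at .05? {chi2 > critical}")
```

Under these assumed counts the statistic is about 2.5, below the .05 critical value, consistent with the report that statistical testing failed to reveal significant relationships.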
Discussion
As noted at the outset of this article, the collective understanding of instructional design is clouded to some extent by its relative youth as a field, as well as by its increasing complexity in response to outside trends. Therefore, our research seeks to provide additional understanding of instructional design by surfacing the models and frameworks practitioners report as guiding their practice within the context of higher education. To claim a need for understanding is vague, however, and does not necessarily get at the heart of the matter. What was sought here was specifically to learn which instructional design models are in vogue among those working in and leading instructional design teams, though some insight into what may be impacting the decision to use, or not to use, a particular model may also have been gained.
The findings suggest that ADDIE, though noted no more or less prominently than the other 39 models addressed in 1980 by Andrews and Goodson, has perhaps risen in modern practice to a degree of prominence in instructional design. The second most commonly indicated response is an instructional design model best known as Backward Design, though also commonly referred to as Understanding by Design, or the Wiggins and McTighe (1998) model. At first glance, the gravitation toward this model may seem far more current than the subject matter of Andrews and Goodson, and somehow more novel than ADDIE, given Wiggins and McTighe’s notable efforts over the last twenty years. However, Wiggins and McTighe (1998) acknowledge that the idea of designing curriculum and learning activities with the end in mind is hardly new, crediting Tyler (1949) with providing the earliest descriptions of this approach. The prominent response of ADDIE, Backward Design, or both suggests that just as designers recreate instructional sequences which have proven successful in past efforts (Andrews & Goodson, 1980), they may also settle on and repeat design approaches which have successfully yielded those sequences. From this point in the data the responses splinter somewhat, as can be observed in the findings section, and include a myriad of frameworks, not all of which are necessarily aimed holistically at instructional design.
Bloom’s Taxonomy, the third most-offered response, though sometimes addressed as an instructional design framework (May, 2018), is not a model in the same process-driven sense as ADDIE. Rather, Bloom’s Taxonomy supports the situating of learning, serving to determine the level or nature of the learning at hand rather than accounting for the entirety of a course- or unit-level instructional design process. This distinction between model and framework is also evident in other pairings and singular mentions, such as the technology-focused frameworks TPACK and SAMR.
The evolution of technology over the past seventy years has played an undeniable role in education, and likewise in the practice of instructional design. The challenges and affordances associated with technology are predominant themes in instructional design literature (Gibby et al., 2002; Miller, 2007; Sharif & Cho, 2015). A sort of love-hate relationship thus exists with technology, at once acknowledging its advantage and power while recognizing its potential as a distractive force, often included but not always proven to be of value. Addressing technology purposefully therefore makes sense as an aim of modern instructional design. However, neither TPACK nor SAMR is generally considered in the literature to be an instructional design model or framework. Rather, TPACK is a framework for technology integration, representing necessary teacher knowledge (Koehler & Mishra, 2009). SAMR, while also not an instructional design model, is a framework for technology implementation which can support a designer in critically considering the selection and purposing of technology (Hamilton et al., 2016).
Arguably the most interesting findings, and those identifying the greatest need for additional research, concern the subjects who offered no response, who listed several models, theories, and frameworks, or who offered narrative replies indicating a fluid, ill-defined approach to instructional design guided by the unique needs presented. This may point to a practice of instructional design which aligns with White (2000), one which better allows for a “collaborative relationship between the instructor and instructional designer…” and “just-in-time instructional development” (p. 59). Just 19% of subjects with instructional design education neglected to offer a response to the question, while 30% of those lacking such education offered no response.
These findings also illuminate the complexity of teaching and learning within higher education, as goals and outcomes vary across vastly different fields and disciplines. These goals may also range from training and preparation to education, all of which align with various theories of learning and, subsequently, different approaches to instructional design (Christensen, 2008). This stands in contrast to corporate or government contexts, in which one might see more emphasis on training.
A related finding, regarding subjects’ perceptions of the preparedness afforded by their education, was also interesting. Those subjects who looked upon their education more favorably were more likely to offer a response. Also of note are response rates relating to the elapsed time since completion of one’s instructional design education. Subjects with more recently completed education were twice as likely not to respond as those whose education was obtained longer ago. Whether this suggests a shifting focus in instructional design curricula, an evolution which occurs over time in practice, or something else entirely cannot be determined. What may be appropriate to posit is that an instructional designer’s time in the field, and the passage of time between their education and the present, are factors impacting their practice, while other individual characteristics (e.g., gender) are not.
This study affords only the ability to speculate as to what precisely is guiding the adoption of particular models and use of frameworks, as well as why such a sizable group of respondents did not provide such information. Future research might seek to expand the data set and determine if the use of a particular model (or lack thereof) relates to the success, or perceived success of instructional design work, from designers or the faculty with whom they work. An expanded study might look at other sectors wherein instructional designers are found, such as corporate training, perhaps affirming or contradicting findings within higher education. In this way, the research might add other perspectives to Brown et al. (2018). Moreover, qualitative inquiry in the form of phenomenological study of designers at work, interacting with subject matter experts, and the components of a project, may yield additional insight.
References
Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Babson Survey Research Group.
Andrews, D. H., & Goodson, L. A. (1980). A comparative analysis of models of instructional design. Journal of Instructional Development, 3(4), 2–16.
Bean, C. (2014). The accidental instructional designer: Learning design for the digital age. American Society for Training and Development.
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain. David McKay Co., Inc.
Brown, V. S., Lewis, D., & Toussaint, M. (2018). Students' perceptions of quality across four course development models. Online Learning, 22(2), 173–195.
Bureau of Labor Statistics, United States (2016). Occupational outlook handbook. Retrieved July 3, 2017, from https://www.bls.gov/ooh/education-training-and-library/instructional-coordinators.htm
Caplan, D. (2004). The development of online courses. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning (pp. 175–194). Athabasca, AB, Canada: Athabasca University.
Christensen, T. (2008). The role of theory in instructional design: Some views of an ID practitioner. Performance Improvement, 47(4), 25–32.
Creswell, J. (2015). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th edition). Pearson, Inc.
Friesen, P. A. (1973). Designing instruction: A systematic or “systems” approach using programmed instruction as a model. Santa Monica, CA: Miller Publishing.
Gibby, S., Quiros, O., Demps, E., & Liu, M. (2002). Challenges of being an instructional designer for new media development: A view from the practitioners. Journal of Educational Multimedia and Hypermedia, 11(3), 195–219.
Goodman, L. A. (1961). Snowball sampling. Annals of Mathematical Statistics, 32(1), 148–170.
Hamilton, E. R., Rosenberg, J. M., & Akcaoglu, M. (2016). The substitution augmentation modification redefinition (SAMR) model: A critical review and suggestions for its use. TechTrends, 60(5), 433–441.
Hatcher, J. A. (2008). The public role of professionals: Developing and evaluating the civic-minded professional scale. (Publication No. 3331248) [Doctoral dissertation, Indiana University]. Proquest LLC.
Hixon, E. (2008). Team-based online course development: A case study of collaboration models. Online Journal of Distance Learning Administration, 11(4), 1–8.
Intentional Futures. (2016). Instructional design in higher education: A report on the role, workflow, and experience of instructional designers. https://intentionalfutures.com/static/instructional-design-in-higher-education-report-5129d9d1e6c988c254567f91f3ab0d2c.pdf
Kim, M. C., Hannafin, M. J., & Bryan, L. A. (2007). Technology-enhanced inquiry tools in science education: An emerging pedagogical framework for classroom practice. Science Education, 91(6), 1010–1030.
Koehler, M., & Mishra, P. (2009). What is technological pedagogical content knowledge (TPACK)? Contemporary Issues in Technology and Teacher Education, 9(1), 60–70.
Lohr, L., & Ursyn, A. (2010). Visualizing the instructional design process: Seven usability strategies for promoting creative instruction. Design Principles and Practices: An International Journal, 4(2), 427–436.
Mager, R. (1962). Preparing objectives for programmed instruction. Fearon Publishers.
Miller, J. L. (2007). The new education professionals: The emerging specialties of instructional designer and learning manager. International Journal of Public Administration, 30(5), 483–498.
Mutlu, G. (2016). A qualitative analysis and comparison of the two contemporary models of instructional design. Journal of Human Sciences, 13(3), 6154–6163.
Reiser, R. A. (2001). A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49(2), 57–67.
Sharif, A., & Cho, S. (2015). 21st-Century instructional designers: Bridging the perceptual gaps between identity, practice, impact and professional development. International Journal of Educational Technology in Higher Education, 12(3), 72.
Shaw, K. (2012). Leadership through instructional design in higher education. Online Journal of Distance Learning Administration, 15(3).
Simon, H. (1998). What we know about learning. Journal of Engineering Education, 87(4), 343–348.
Skinner, B. F. (1954). The science of learning and the art of teaching. Harvard Educational Review, 24(2), 86–97.
Smith, P., & Ragan, T. (2005). Instructional design. John Wiley & Sons.
Smith, P., & Ragan, T. (1993). Instructional design. Prentice-Hall.
Tate, E. (2017). Easing instructional designer-faculty conflicts. Inside Higher Ed. Retrieved May 30, 2017, from https://www.insidehighered.com/digital-learning/article/2017/05/03/easing-conflicts-between-instructional-designers-and-faculty
Trochim, W. (2006). Types of surveys. Research Methods Knowledge Base. Retrieved from http://www.socialresearchmethods.net/kb/survtype.php
White, C. (2000). Collaborative online course development: Converting correspondence courses to the web. Educational Technology, 40(6), 58–60.
Wiggins, G., & McTighe, J. (1998). What is backward design? Understanding by Design, 1, 7–19.
Wyse, S. (2012). Advantages and disadvantages of surveys. Snap Surveys. Retrieved from http://www.snapsurveys.com/blog/advantages-disadvantages-surveys
Yanchar, S. C., & Gabbitas, B. W. (2011). Between eclecticism and orthodoxy in instructional design. Educational Technology Research and Development, 59, 383–398.
Appendix A — Instructional Designer Survey Instrument
Q1 By continuing, you grant consent for your responses to be included in reporting and data analysis. Any identifiable information provided will be removed prior to compiling results. Do you wish to continue?
Skip To: Q2 If By continuing, you grant consent for your responses to be included in reporting and data analysis... = Yes
Skip To: End of Survey If By continuing, you grant consent for your responses to be included in reporting and data analysis... = No
Q2 Are you currently working in an instructional design role (including management of instructional design staff)?
- Yes (1)
- No (2)
Skip To: End of Survey If Are you currently working in an instructional design role (including management of instructional... = No
Skip To: Q3 If Are you currently working in an instructional design role (including management of instructional... = Yes
Q3 Please indicate your current level of employment
- Full-time (40 hours/week, 10 months or more per year) (1)
- Three-quarter time (30 hours/week, 10 months or more per year) (2)
- Half-time (20 hours/week, 10 months or more per year) (3)
- Less than half-time (4)
- Other, please specify: (5) ________________________________________________
Q4 Please indicate your gender:
- Male (1)
- Female (2)
- Non-binary/third gender (3)
- Prefer not to say (4)
- Prefer to self-describe: (5) ________________________________________________
Q5 Do you have formal instructional design education (e.g. a degree in instructional design or a closely related field)?
- Yes (1)
- No (2)
- Other, please specify: (3) ________________________________________________
Skip To: Q7 If Do you have formal instructional design education (e.g. a degree in instructional design or a clo... = Yes
Skip To: Q6 If Do you have formal instructional design education (e.g. a degree in instructional design or a clo... = No
Q6 My formal education prepared me for work in the field of instructional design in:
- All aspects
- Most aspects
- Some aspects
- Only a few aspects
- Other, please specify: ________________________________________________
Q7 Approximately how long ago did you complete your formal education in instructional design?
- <5 years (1)
- 5-10 years (2)
- 11-15 years (3)
- 16-20 years (4)
- 21-25 years (5)
- >25 years (6)
Q8 Please indicate your highest level of completed education:
- High School
- Associate’s Degree
- Bachelor’s Degree
- Master’s Degree
- Doctoral Degree
- Other, please specify: ________________________________________________
Q9 Please select the option which best indicates your years of experience in instructional design:
- <5 years (1)
- 5-10 years (2)
- 11-15 years (3)
- 16-20 years (4)
- 21-25 years (5)
- >25 years (6)
Q10 Do you manage other employees?
- Yes, formally. (1)
- Yes, informally (the other employee(s) do not report to me, but I assign work to them) (2)
- No (3)
Skip To: Q11 If Do you manage other employees? = Yes, formally.
Skip To: Q11 If Do you manage other employees? = Yes, informally (the other employee(s) do not report to me, but I assign work to them)
Skip To: Q13 If Do you manage other employees? = No
Q11 Approximately how many other employees do you manage?
- 1-2 (1)
- 3-4 (2)
- 5-6 (3)
- more than 6 (4)
Q12 Which of the following best describes the function(s) of the employees you manage (select all that apply)?
- Instructional Design
- Audio/Video/Graphic Production
- Coding/Programming
- Technical Support
- Administrative/Clerical
- Project Management
- Other, please specify:
Q13 About how much of your time at work is invested in instructional design activities, not including management of other instructional designers?
- ≤20% (1)
- 21%-40% (2)
- 41%-60% (3)
- 61%-80% (4)
- >80% (5)
Q14 In addition to instructional design work, which of the following functions do you also perform (select all that apply)?
- Audio/Video authoring/editing or Graphic design (1)
- Coding/Programming (including HTML) (2)
- Committee work (e.g. assessment/accreditation councils, oversight groups, etc.) (3)
- Faculty development (e.g. designing and/or conducting workshops/training) (4)
- Instructor (e.g. teaching one or more courses on a regular basis) (5)
- Personnel management (e.g. hiring, performance review, etc.) (6)
- Scholarly activity (e.g. research, publishing) (7)
- Server administration (e.g. LMS, database, web server) (8)
- Technical Support (9)
- Other, please explain: (10) ________________________________________________
Q15 Which of the following best describes your area of specialization in your current instructional design role?
- Online learning design (1)
- Classroom learning design (2)
- Blended learning design (3)
- General learning design, including classroom, online, and blended (4)
- Other, please specify: (5) ____________________________________
Q16 Which of the following best describes the design model in use in your current setting?
- The same design model is applied to each project (i.e. a template is used) (1)
- The design model varies slightly, project by project, based on needs (2)
- The design model varies greatly, project by project, based on needs (3)
- No formal design model is used (4)
- Other, please specify: (5) _______________________________________________
Q17 Which of the following describes ownership of the design model in your current setting (select all that apply)?
- I created the model/models my team and I use (1)
- I was given the model/models I/my team use(s) (2)
- I do not have authority to change the design model(s) (3)
- I have authority to make changes to the design model(s) (4)
- I and others have authority to make changes to the design model(s) (5)
- Other, please specify: (6) ________________________________________________
Q18 Please indicate which theoretical framework(s) or model(s) from the literature underpin your instructional design practice:
______________________________________________________________
Q19 Would you be interested in being interviewed to further discuss your answers to this survey?
- Yes, my email address is: (1) ________________________________________________
- No (2)