
The Modelling Framework for Experimental Physics: description, development, and applications

Dimitri R Dounas-Frazer and H J Lewandowski

Published 19 October 2018 © 2018 European Physical Society

Focus on Modelling in Physics Instruction

Citation: Dimitri R Dounas-Frazer and H J Lewandowski 2018 Eur. J. Phys. 39 064005. DOI: 10.1088/1361-6404/aae3ce


Abstract

The ability to construct, use, and revise models is a crucial experimental physics skill. Many existing frameworks describe modelling in science education at introductory levels. However, most have limited applicability to the context of upper-division physics lab courses or experimental physics. Here, we review the Modelling Framework for Experimental Physics, a theoretical framework tailored to labs and experimentation. A key feature of the framework is recursive interaction between models and apparatus. Models are revised to account for new evidence produced by apparatus, and apparatus are revised to better align with the simplifying assumptions of models. Another key feature is the distinction between the physical phenomenon being investigated and the measurement equipment used to conduct the investigation. Models of physical systems facilitate explanation or prediction of phenomena, whereas models of measurement systems facilitate interpretation of data. We describe the framework, provide a chronological history of its development, and summarise its applications to research and curricular design. Ultimately, we argue that the Modelling Framework is a theoretically sound and well-tested tool that is applicable to multiple physics domains and research purposes. In particular, it is useful for characterising students' approaches to experimentation, designing or evaluating curricula for lab courses, and developing instruments to assess students' experimental modelling skills.


1. Models and modelling in physics education

The importance of modelling in physics education has been recognised for over 30 years. Johnson-Laird (1980), Hestenes (1987), and Halloun and Hestenes (1987) played key roles in the development of early theories of learning and curricular design. Today, students' development of mental models (Özcan 2015, Körhasan et al 2016, Brewe et al 2018) and representational competence (Fredlund et al 2012, McPadden and Brewe 2017, Opfermann et al 2017) continue to be active areas of research in lecture or studio settings. Meanwhile, undergraduate physics lab courses have received increasing attention over the last decade (Schumacher 2007). Consider, for example, multiple large survey studies of students' views about lab courses (Hanif et al 2009, Deacon and Hajek 2011, Coppens et al 2016) or experimental physics (Wilcox and Lewandowski 2018). Despite these trends, research on student engagement in modelling during lab courses is sparse, and there is a need for theoretical frameworks and empirical studies that focus on students' ability to model experimental systems.

Since the 1980s, conceptions of models originally rooted in cognitive science have been supplemented with philosophical arguments (Gilbert and Justi 2016). Knuuttila (2011) argues that models are tools for generating knowledge. As tools, models consist of representations that are expressed externally via a material medium, e.g. ink and paper. Mental models, on the other hand, can be precursors to external representations, and are also part of the modelling process (Gilbert and Justi 2016). To facilitate knowledge generation, models help people ask questions, recognise and explain patterns in data, and make judgements about those explanations (Passmore and Svoboda 2012). Models can also be used to communicate new knowledge or understanding to others (Schwarz 2009). However, tools do not spontaneously work on their own. Indeed, Giere (2009) and Gouvea and Passmore (2017) argue for focusing on the agents and ends of modelling: people use models to represent part of the world for some purpose.

In physics education, the purpose of modelling is often for students to describe, explain, or predict the behaviour of physical phenomena (Etkina et al 2006a). Students use evidence to generate models, and they use models to search for new evidence. Such intertwining of evidence- and model-based reasoning helps students make sense of phenomena (Russ and Odden 2017). Koponen (2007) argues that modelling in physics is a bidirectional process through which models both inform, and are informed by, experimental apparatus. Apparatus are designed to isolate particular aspects of a model, and data are used to verify the model or identify its limitations. Koponen further claims that two models are at play during experimentation: one to formulate explanations or predictions about a physical system that is the target of investigation, and another to interpret the experimental data generated by the measurement equipment. Thus, in physics education, the practice of modelling should involve recursive interactions between apparatus, evidence, and models of both phenomena and equipment. More broadly, we view such interactions as key features of experimental physics practice. Although many theoretical frameworks for modelling exist (Brewe 2008, Windschitl et al 2008, Schwarz 2009, Passmore and Svoboda 2012, Gilbert and Justi 2016, Fuhrmann et al 2018), few capture these key features.

Here, we review the Modelling Framework for Experimental Physics, which was explicitly designed to characterise physicists' use of models when conducting experiments. In the next section, we describe the framework using a Malus's law experiment to explicate various modelling subtasks and processes. In section 3, we compare the Modelling Framework to two commonly used pedagogical frameworks in physics education: Modelling Instruction (Brewe 2008, Megowan-Romanowicz 2011) and the Investigative Science Learning Environment (ISLE) (Etkina and Van Heuvelen 2007). Then, in sections 4 and 5, we provide an overview of the framework's initial development and its subsequent validation and use as a research tool. Section 6 discusses applications to the design of undergraduate lab courses. Finally, in section 7, we argue that the Modelling Framework is a dual-purpose tool that is theoretically sound, well-tested, and versatile. By 'versatile,' we mean that the framework is applicable to multiple physics domains, can be used to characterise student and instructor behaviours, and can inform the design and evaluation of curricula for lab courses.

2. Description of the Modelling Framework for Experimental Physics

A visual representation of the Modelling Framework for Experimental Physics is given in figure 1. The framework can be thought of as a flowchart that consists of five subtasks: making measurements, constructing models of equipment and phenomena, making comparisons between data and predictions, proposing causes for discrepancies, and enacting revisions to models and apparatus. Because modelling is a recursive process, the flowchart is cyclical. In the framework, the goal is to achieve sufficiently good agreement between data and predictions, consistent with other work on modelling in physics education (Etkina et al 2006a).


Figure 1. Modelling Framework for Experimental Physics. The version of the framework presented here was previously published by Dounas-Frazer et al (2018) and is adapted from a visualisation originally developed by Zwickl et al (2014). CC-BY 3.0.


Below, we elaborate on each subtask of the Modelling Framework, drawing on examples from a Malus's law experiment to explicate abstract concepts. This experiment involves shining laser light through two linear polarizers, one that is fixed and one that is free to rotate (the analyser). The intensity of the laser light transmitted through both polarizers is proportional to $\cos^2\theta$, where θ is the angle between the polarisation axes of the two polarizers. Thus, by rotating the analyser, students can vary the intensity of the transmitted light. A schematic of the experiment is provided in figure 2.
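For a quick numerical feel for this prediction, here is a minimal Python sketch (ours, not from the original article) that evaluates the idealised Malus's law relation at a few analyser angles.

```python
import numpy as np

def malus_intensity(theta_deg, I0=1.0):
    """Idealised Malus's law: I = I0 * cos^2(theta).

    Assumes perfectly polarised, monochromatic light and ideal polarizers
    (extinction ratios of exactly zero); see subtask (ii) below.
    """
    return I0 * np.cos(np.radians(theta_deg)) ** 2

for angle in (0, 30, 45, 60, 90):
    print(f"theta = {angle:2d} deg -> I/I0 = {malus_intensity(angle):.3f}")
```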

  • (i)  
    Make measurements. The measurement equipment interacts with the physical system and produces raw data.
    Malus's law example. The physical system apparatus includes a laser and two linear polarising filters. The measurement equipment consists of a photodetector and voltmeter. Making measurements involves shining light through the polarizers onto the photodetector and producing a voltage on the voltmeter. See figure 2(A).
  • (ii)  
    Construct models. The physical and measurement system apparatus are associated with distinct models. Each model is created from the key features of the system, relevant physics principles and concepts, particular parameter values, and appropriate assumptions and simplifications that make the model tractable while limiting its scope of applicability.
    Malus's law example. The simplest model of the physical system involves treating the laser light as a perfectly monochromatic plane wave with uniform intensity. Polarizers are similarly idealised as having extinction ratios of exactly zero. The simplest measurement system model treats the measurement equipment as a black box: the photodetector instantaneously converts incident laser light to an electrical signal that is sent to the voltmeter, which produces a voltage linearly proportional to the power of the light.
  • (iii)  
    Make comparisons. The physical system model is used to generate predictions, and the measurement system model is used to analyse and interpret raw data. Interpreted data are compared to predictions, and domain-specific criteria are used to determine whether the level of agreement is good enough. For example, Bailey (2017) found that, compared to particle physics, independent measurements of the same quantity in nuclear physics typically have a larger spread, which may both contribute to and reflect different standards about acceptable levels of precision in these two subfields. Moreover, regardless of subfield, physicists 'tend not to make new measurements unless they are expected to be more accurate than previous measurements' (Bailey 2017, p 160600), which further suggests that standards for good enough agreement are phenomenon-dependent and change in time. Applying accepted scientific criteria for determining what counts as 'good enough' is part of the modelling process (Giere 2009, Gouvea and Passmore 2017). If the agreement is good enough, then there is no need to revise models or apparatus. However, additional data are sometimes needed in order to properly judge the quality of agreement; in this case, more measurements are required. If there is a discrepancy between data and predictions, the experimenter may propose and enact changes to models or apparatus to resolve the discrepancy. In this sense, the 'Maybe' and 'No' pathways in figure 1 respectively correspond to efforts to reduce statistical and systematic sources of uncertainty.
    Malus's law example. The physical system model is used to predict Malus's law: $I = I_0\cos^2\theta$, where $I_0$ and $I$ are the intensities of light incident on the analyser and photodetector, respectively. The measurement system model is used to interpret raw data by converting voltage as a function of angle to normalised intensity as a function of angle. Experimental data are compared to a cosine-squared curve; see figure 2(B). Goodness-of-fit metrics can help determine quality of agreement, and for many such metrics physicists have established criteria for acceptable fits (a minimal fitting sketch follows this list).
  • (iv)  
    Propose causes. When discrepancies between data and predictions must be resolved, physicists generate hypotheses about potential sources of those discrepancies. This modelling subtask can be challenging for people who are unfamiliar with nonideal behaviour of equipment or assumptions that limit the predictive power of models (Zwickl et al 2015, Dounas-Frazer et al 2018).
    Malus's law example. Common sources of discrepancy between data and prediction include nonzero background due to ambient light, polarizers with nonzero extinction ratios, or laser light with nonzero ellipticity. Discrepancies may also arise if the light is sufficiently powerful to saturate the photodetector or if changes in light power are faster than the slew rate limit of the detector.
  • (v)  
    Enact revisions. The Modelling Framework describes four pathways for revision: one may revise the models or apparatus of the physical or measurement systems. These pathways are consistent with other conceptions of modelling as a process that informs the design and execution of experiments (Koponen 2007, Gilbert and Justi 2016, Russ and Odden 2017). Depending on which proposed causes are deemed most likely or easiest to implement, some revisions may be prioritised over others (Stanley et al 2017). Once revisions are enacted, the modelling cycle repeats until data and predictions are brought into good enough agreement (the sketch after this list illustrates one such model revision).
    Malus's law example. Apparatus revisions include covering the photodetector to shield it from ambient light, or placing a neutral density filter in the optical path of the laser light to decrease its overall intensity. Model revisions include adding a fit variable to Malus's law to represent offsets due to ambient light or ellipticity, or determining the nonlinear calibration of the photodetector for high light intensities. See parts (C) and (D) of figure 2.
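To make the comparison and revision subtasks concrete, the following minimal Python sketch (ours, not part of the framework or the original course materials) fits synthetic 'voltmeter' data first with the idealised Malus's law model and then with a revised model that includes an ambient-light offset. The noise level, background offset, and reduced chi-squared criterion are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic interpreted data: normalised intensity versus analyser angle,
# contaminated by a constant ambient-light background (a hypothetical
# discrepancy source from the 'propose causes' subtask).
theta = np.radians(np.arange(0, 181, 10))
sigma = 0.01  # assumed per-point measurement uncertainty
data = np.cos(theta) ** 2 + 0.05 + rng.normal(0.0, sigma, theta.size)

def ideal(theta, I0):
    """Initial physical system model: I = I0 cos^2(theta)."""
    return I0 * np.cos(theta) ** 2

def revised(theta, I0, I_offset):
    """Revised model: Malus's law plus an ambient-light offset term."""
    return I0 * np.cos(theta) ** 2 + I_offset

def reduced_chi2(model, params, theta, y, sigma):
    """Goodness-of-fit metric; values near 1 suggest 'good enough' agreement."""
    residuals = (y - model(theta, *params)) / sigma
    return np.sum(residuals ** 2) / (theta.size - len(params))

for model in (ideal, revised):
    popt, _ = curve_fit(model, theta, data)
    chi2 = reduced_chi2(model, popt, theta, data, sigma)
    print(f"{model.__name__:>7s} model: params = {np.round(popt, 3)}, "
          f"reduced chi-squared = {chi2:.1f}")
```

Running the sketch, the idealised model yields a reduced chi-squared far above 1, triggering the 'No' pathway; adding the offset term brings it near 1, i.e. good enough agreement under the assumed criterion.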


Figure 2. Hypothetical Malus's law experiment. (A) Initial setup. P1 and P2 are polarizers. PD is a photodetector. VM is a voltmeter. $I_0$ and $I$ are the intensities of laser light incident on P2 and PD, respectively. θ is the angle between the polarisation axes of P1 and P2. α is related to the linear responsivity of PD. (B) Disagreement between data and prediction. (C) Revised setup. NDF is a neutral density filter. $I_{\text{offset}}$ is a modification to Malus's law. f(V) represents the nonlinear calibration of PD output to optical power. (D) Improved agreement between data and prediction.


3. Comparison to other frameworks

The Modelling Framework is not the only framework for characterising model-based reasoning in physics. Modelling Instruction and ISLE are two popular curricula that aim to develop students' modelling abilities (among other learning goals), and each is based on a particular conception of modelling and its role in experimentation.

Modelling Instruction has its roots in early theoretical work and curriculum design by Hestenes (1987), Halloun and Hestenes (1987), and Wells et al (1995). Since then, Modelling Instruction has been adapted for use in high school physics classes throughout the United States and beyond (Megowan-Romanowicz 2011), as well as in introductory undergraduate courses on mechanics and electricity and magnetism (Brewe 2008, McPadden and Brewe 2017). As described by Brewe (2008), Modelling Instruction was designed to develop students' model-based understanding of the physical world. To this end, students engage in cycles of model construction and revision, called 'Modelling Instruction cycles'. These cycles begin by introducing students to a phenomenon that cannot be explained by models with which they are already familiar. Students then collect data related to the phenomenon, develop and coordinate multiple representations of the data, apply those representations to solve physics problems, identify common characteristics of those representations, and use those characteristics to construct new models or revise old ones. Further, Megowan-Romanowicz (2011) describes the central role of whiteboards in facilitating peer-to-peer collaboration and communication throughout Modelling Instruction cycles. Students use whiteboards to collaboratively generate mathematical, graphical, and diagrammatic representations of models. An overarching aim of Modelling Instruction is to reinforce the interconnected nature of key concepts in the introductory physics curriculum. Over the course of a semester, students repeatedly revisit and refine a limited number of general models. Doing so maintains a focus on fundamental physics principles and reinforces their interconnected nature (Brewe 2008).

ISLE is both a curriculum and a philosophy for secondary and post-secondary physics labs at the introductory level. As described by Etkina and Van Heuvelen (2007), ISLE was designed to develop students' competence with scientific abilities, including modelling and the ability to create, use, and transition between multiple representations of phenomena. To do so, students engage in cycles of experimentation, called 'ISLE cycles', during which they make observations about a phenomenon, look for patterns, generate several plausible explanations, make predictions, conduct tests, compare predictions to data, and, ultimately, revise or eliminate explanations that are inconsistent with their data. Throughout, students are asked to represent their ideas using pictures, diagrams, tables, graphs, and language. As students collect additional data, they transition from generating qualitative explanations for patterns to proposing quantitative rules. ISLE cycles focus on carefully selected phenomena, where 'careful selection means that experiments are simple, focus on one phenomenon and the pattern is easily identified' (Etkina and Van Heuvelen 2007, p 4). ISLE cycles culminate in the construction of physical principles, like Newton's second law of motion, and the application of those principles to physics problems and to scenarios beyond the classroom context. A major feature of the ISLE curriculum is that students use rubrics to self-assess their scientific abilities (Etkina et al 2006b). Recently, ISLE cycles have been used as an analytic framework to characterise student and expert approaches to explaining a surprising physics phenomenon (Čančula et al 2015).

Modelling Instruction, ISLE, and the Modelling Framework conceptualise modelling in similar ways. For instance, all three frameworks describe modelling as a cyclic process that involves making measurements or observations, generating explanations or predictions, testing predictions, and revising models in order to better describe or explain phenomena. Nevertheless, the Modelling Framework differs from Modelling Instruction and ISLE in several respects. First, it explicitly acknowledges the measurement system apparatus and corresponding model. Second, it describes revisions to the measurement and physical system apparatus. To understand these differences, it is helpful to consider how learning goals of introductory labs differ from those of upper-division labs. In introductory labs, major learning goals often include supporting students to construct, discover, or understand physical principles, and to apply those principles when solving problems or making sense of the 'real world'. On the other hand, upper-division labs typically emphasise skill development rather than conceptual learning (Wilcox and Lewandowski 2017). The main purpose of many such courses is to provide students with experience designing and conducting experiments that involve sophisticated equipment and complicated phenomena (Dounas-Frazer et al 2018). Similarly, many upper-division lab instructors perceive their role as preparing students to succeed in graduate-level research or in research and development careers (Dounas-Frazer and Lewandowski 2017). In these learning environments, then, the apparatus itself is the 'real world', and 'problem solving' entails using apparatus and models to achieve a good enough match between data and prediction à la Koponen (2007). Thus, some differences between Modelling Instruction, ISLE, and the Modelling Framework arise because the framework was designed with upper-division lab contexts in mind (section 4), although it can also be applied in introductory contexts (see Vonk et al 2017).

Another difference between Modelling Instruction, ISLE, and the Modelling Framework is that the framework is not tightly coupled to a particular curriculum or pedagogical approach. Although we describe multiple curriculum design applications of the framework in section 6, none represent an official curriculum that is intended to be replicated at scale. Thus, the framework is not in tension with Modelling Instruction or ISLE. For example, Modelling Instruction cycles or ISLE cycles could be adapted to take into account the existence of measurement systems or the possibility of apparatus revisions. Likewise, curricula whose design has been informed by the Modelling Framework could incorporate student-centred teaching approaches that use whiteboards to facilitate collaborative model construction or rubrics to facilitate self-assessment. Indeed, Gandhi et al (2016) designed an introductory lab course in which students collaboratively constructed and revised both models and apparatus, and regularly self-assessed their own growth as learners.

The potential compatibility of the Modelling Framework with a variety of pedagogical approaches speaks to one dimension of its versatility. The versatility of the framework is also evidenced by its use as an analytic tool for education research purposes, similar to how Čančula et al (2015) have used ISLE cycles. In the following sections, we describe the development, validation, and application of the framework in both educational and research contexts.

4. Development of the Modelling Framework

The Modelling Framework (figure 1) was originally developed in the context of the transformation of the Advanced Laboratory course at the University of Colorado Boulder (CU) (Zwickl et al 2013, 2014). Those efforts began in late 2010 and employed the following approach: define learning goals, develop curricula that align with those goals, and assess student learning. The Modelling Framework emerged from the first two phases of the transformation process. To identify consensus learning goals for the course, Zwickl et al (2013) worked with 21 faculty members via individual interviews or group meetings. Learning goals were also informed by comparison of the Advanced Laboratory to other laboratory courses and a review of the physics education literature. Using this process, Zwickl et al identified modelling as a major learning goal for the course. They further argued that modelling had three components: modelling the physical system, modelling the measurement system, and comparing data and predictions.

After identifying modelling as a learning goal for the Advanced Laboratory, Zwickl et al (2014) developed laboratory activities designed to engage students in the practice of modelling during optics experiments (section 6). To guide their designs, they created the first iteration of the Modelling Framework. This iteration was informed by a review of the physics education literature and Zwickl and Lewandowski's own experience and expertise as experimental physicists. The first iteration differs from that presented in figure 1 in several ways, most of them aesthetic. Nevertheless, the main features have persisted across all versions of the framework: measurement and physical systems are distinct, revisions include changes to both apparatus and models, and the need for revision is triggered by comparisons that fail to yield 'good enough' agreement between interpreted data and model predictions. Thus, from its inception, the Modelling Framework aligned with philosophical conceptions of modelling, such as Koponen's (2007) bidirectionality.

In addition to modelling, Zwickl et al identified several other learning goals for the Advanced Laboratory, including scientific communication and experimental design. These learning goals are also important to many other lab instructors (AAPT Committee on Laboratories 2015, Dounas-Frazer et al 2018). Modelling is not distinct from communication or design (Passmore and Svoboda 2012, Gilbert and Justi 2016). For example, communication and design are major features of the Modelling Instruction and ISLE curricula (Etkina and Van Heuvelen 2007, Megowan-Romanowicz 2011). However, the Modelling Framework was not specifically designed to describe students' use of models when communicating scientific arguments or designing physics experiments. Therefore, applications that require simultaneous attention to modelling and communication or design should combine the Modelling Framework with other relevant frameworks. Despite these limitations, the framework has proved to be a useful tool for education research and curriculum design.

5. Validation and use of the Modelling Framework as a research tool

After the Modelling Framework was developed, it was validated and used as a research tool in a series of four studies, three focused on students (Zwickl et al 2015, Dounas-Frazer et al 2016, Stanley et al 2017) and one on instructors (Dounas-Frazer et al 2018). These studies demonstrate that the Modelling Framework is appropriate for characterising modelling in upper-division lab contexts. They also resulted in small changes that improved the descriptive power of the framework.

5.1. Student approaches to modelling during optics and electronics activities

Zwickl et al (2015) conducted think-aloud interviews during which eight students verbalised their reasoning while testing the power output of a light-emitting diode. One major contribution of this investigation was validation of the Modelling Framework's focus on measurement system models. During interviews, student engagement with the measurement apparatus (i.e. a photodetector and oscilloscope) included constructing models, revising models, interpreting output, and identifying limitations of the equipment. Zwickl et al also described two common barriers to students' use or revision of models. First, some students did not articulate crucial assumptions and corresponding limitations of their models, preventing them from making model-based refinements to their experiment. Second, lack of familiarity with the concept of solid angle and the unit of steradian prevented some students from appropriately comparing their measurements to numerical values specified in the data sheet for the light-emitting diode. These barriers are consistent with other work that has demonstrated how students' prior knowledge impacts their ability to construct and evaluate models (Stewart et al 2005, Fortus et al 2016, Ruppert et al 2017).

Next, Dounas-Frazer et al (2016) expanded on previous work by demonstrating that the Modelling Framework is applicable to a context other than optics, namely, electronics. In that work, eight pairs of students from two institutions participated in think-aloud interviews that involved troubleshooting a malfunctioning electric circuit. Troubleshooting can be thought of as a type of modelling in which the physical system (i.e. the malfunctioning circuit) is revised in order to bring its performance into alignment with expectations informed by a physical system model (i.e. the model of the functional circuit). Dounas-Frazer et al examined students' modelling behaviours during two key episodes in the troubleshooting process: isolating the source of malfunction to a particular subsystem, and evaluating the performance of the repaired circuit. In the former episode, students engaged in constructing models and making comparisons more often than other subtasks; in the latter episode, making comparisons, proposing causes, and enacting revisions were the most common subtasks. Thus, not only did students engage in multiple modelling cycles throughout the troubleshooting process, but different phases of troubleshooting also corresponded to different combinations of subtasks. Because troubleshooting is a central feature of electronics lab courses (Dounas-Frazer and Lewandowski 2017), this study suggests that so, too, is modelling.

Transitioning from clinical research settings to an actual classroom, Stanley et al (2017) used the Modelling Framework to study student use of models in an Electronics Laboratory course specifically designed to engage students in modelling circuits (Lewandowski and Finkelstein 2015). They analysed 45 student lab notebooks for evidence of students' documented engagement in modelling subtasks. Three notebook entries were selected for analysis, each corresponding to a different activity: voltage divider circuit (high scaffolding), photometer circuit (medium scaffolding), and voltage-controlled electromagnet (low scaffolding). Student engagement in modelling tracked the level of scaffolding in the lab guide. Compared to open-ended prompts, explicit prompts resulted in more thorough engagement in modelling. In particular, most students did not enact and document revisions to their models or apparatus unless specifically asked to do so. Further, Stanley et al found that students often neglected to make and document comparisons between expected and actual performance of circuits even when prompted, potentially due to lack of clarity about what constitutes 'good enough' agreement between data and predictions. Making comparisons and judging the quality of agreement is crucial for making informed decisions about whether and how to iteratively improve an experiment (Holmes et al 2015). Therefore, there is a clear need to better understand and support student reasoning about which standards for 'good enough' agreement are appropriate in different experimental physics contexts (see Giere 2009, Gouvea and Passmore 2017). In addition to identifying areas for curricular improvement and future study, Stanley et al demonstrated that the Modelling Framework is useful for understanding student modelling in formal educational settings.

5.2. Instructor perspectives on modelling in optics and electronics lab courses

Shortly after it was created, the Modelling Framework informed a set of nationally endorsed recommendations for undergraduate physics lab courses in the United States (AAPT Committee on Laboratories 2015). However, until recently, research on modelling in upper-division physics labs focused almost exclusively on the perspectives and behaviours of instructors and students from CU (Dounas-Frazer et al 2016, Stanley et al 2017, Zwickl et al 2014, 2015).

To get a better idea of whether and how modelling is taken up in lab courses across the United States, Dounas-Frazer et al (2018) conducted interviews with 19 optics instructors and 16 electronics instructors from 27 institutions. During these interviews, instructors described how various subtasks of the Modelling Framework aligned with their learning goals or activity design. Making measurements, constructing models, and comparing data to predictions were each identified as important learning goals by a majority of instructors in the study. Enacting revisions and proposing causes were less commonly identified as important. Limited class time was cited as a barrier to student revisions to experiments. Meanwhile, many instructors said that students are unfamiliar with the nonideal behaviour of devices, and therefore struggle to propose causes for discrepancies between data and predictions. This finding is consistent with students' failure to recognise or articulate assumptions and limitations of models during experimental physics think-aloud interviews (Zwickl et al 2015).

Further, Dounas-Frazer et al (2018) found that modelling is taken up differently in optics compared to electronics. For example, with respect to the subtask of making comparisons, optics instructors were more likely to describe engaging students in rigorous statistical analyses (e.g. fitting curves to data and evaluating the goodness of fit). In contrast, electronics instructors said that comparisons typically involved qualitative checks of circuit performance because building functional circuits was more important than achieving precise agreement between predicted and measured output voltages. Electronics instructors often framed particular modelling subtasks or the whole Modelling Framework as necessary aspects of troubleshooting, in alignment with students' approaches to repairing malfunctioning circuits during think-aloud interviews (Dounas-Frazer et al 2016). Overall, Dounas-Frazer et al (2018) not only demonstrated the versatility of the framework for describing the goals and activities of a national sample of lab courses, but also shed light on the different purposes of modelling across two experimental physics domains.

5.3. Empirically motivated changes to the Modelling Framework

Over the course of the investigations described in sections 5.1 and 5.2, the Modelling Framework itself was modified in order to better capture the experimental modelling process. While many changes were aesthetic, some represented shifts in understanding of what modelling entails. In the first visualisation of the framework, Zwickl et al (2014) did not explicitly include 'limitations' as part of system models, even though they recognised that students should be able to articulate model limitations. After Zwickl et al (2015) observed that students' unarticulated assumptions and unrecognised model limitations were barriers to modelling, they revised the framework to include 'limitations, simplifications, and assumptions' as an explicit part of system models. The 'Maybe' pathway was added by Dounas-Frazer et al (2018) based on experience observing students troubleshoot electric circuits (Dounas-Frazer et al 2016). When confronted with a disagreement between data and predictions, students sometimes collected additional data in order to be sure that the discrepancy was significant, rather than immediately trying to explain it. These and other empirically motivated changes to the Modelling Framework likely improve its usefulness as a well-tested tool for describing the experimental modelling process.

6. Applications of the Modelling Framework to curriculum design

The Modelling Framework (figure 1) is a dual-purpose tool that can both characterise people's reasoning about experimental systems and inform curriculum design for laboratory courses (Zwickl et al 2014). To date, we are aware of applications of the framework to course design in three undergraduate contexts: a fourth-year Advanced Laboratory (Zwickl et al 2014), a third-year Electronics Laboratory (Lewandowski and Finkelstein 2015), and first-year introductory courses (Vonk et al 2017).

The first educational application of the Modelling Framework coincided with its development, as described in section 4. After establishing that students' ability to model experimental systems was a learning goal for the CU Advanced Laboratory (Zwickl et al 2013), Zwickl et al (2014) developed and used the framework to guide their transformation of that course. Soon after, Lewandowski and Finkelstein (2015) also transformed the CU Electronics Laboratory in order to meet similar learning goals. Both courses aimed to engage students in constructing models, making predictions and comparisons, and revising models and apparatus. Informed by previous work on scaffolded inquiry (Kirschner et al 2006, Hmelo-Silver et al 2007, Buck et al 2008, Etkina et al 2008), the lab guides explicitly prompted students to engage in these modelling subtasks. For example, the lab guide for an Advanced Laboratory polarisation activity prompted students to use the Jones formalism to model the propagation of laser light through polarising filters and wave plates, and to derive predictions like Malus's law (Zwickl et al 2014). To model the measurement system, students used manufacturer documentation to understand the operation and limitations of a photodetector, and to appropriately convert output voltage into measurements of optical power. Similarly, in the Electronics Laboratory, the lab guide for a voltage divider circuit prompted students to revise their circuit schematic and equation for the transfer function in order to include the input resistance of a digital multimeter (a minimal sketch of this revision follows below). The level of explicit scaffolding in the lab guides faded over the course of the semester in order to provide students with more control over how they modelled their circuits (Lewandowski and Finkelstein 2015). In both courses, lab guides prompted students to reason about model limitations or revisions in order to explain or minimise systematic biases in their measurements.
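The following minimal Python sketch (ours, not from the lab guide) illustrates that voltage divider revision; the 10 MΩ multimeter input resistance and megaohm resistor values are illustrative assumptions.

```python
def divider_ideal(R1, R2):
    """Ideal voltage divider transfer function: Vout/Vin = R2 / (R1 + R2)."""
    return R2 / (R1 + R2)

def parallel(Ra, Rb):
    """Equivalent resistance of two resistors in parallel."""
    return Ra * Rb / (Ra + Rb)

def divider_loaded(R1, R2, R_meter=10e6):
    """Revised model: the multimeter's input resistance loads the divider,
    replacing R2 with R2 in parallel with R_meter."""
    R2_eff = parallel(R2, R_meter)
    return R2_eff / (R1 + R2_eff)

# With megaohm-scale resistors, loading by the meter is no longer negligible:
R1 = R2 = 1.0e6  # ohms (illustrative values)
print(f"ideal:  Vout/Vin = {divider_ideal(R1, R2):.3f}")   # 0.500
print(f"loaded: Vout/Vin = {divider_loaded(R1, R2):.3f}")  # ~0.476
```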

At the introductory level, Vonk et al (2017) used the Modelling Framework to design model-making activities in introductory algebra- and calculus-based physics courses. Here 'model-making' refers to devising an experiment to determine the relationship between two variables (e.g. wavelength and frequency of a wave on a string), collecting and analysing data, constructing a model to relate the variables, and using the model to make predictions. Instead of working with apparatus, students used direct measurement videos (DMVs) to explore phenomena and collect data. DMVs are 'short high-quality videos that show a scientifically interesting event', which are analysed using online tools like digital rulers and stopwatches (Vonk et al 2017, p 4). Because students did not use apparatus, they engaged in only some modelling subtasks: making measurements, constructing models of physical phenomena, and making comparisons. In a study of 116 students' performance on a model-making assessment, Vonk et al showed that students who completed activities using DMVs designed to bolster model-making skills outperformed those who did not. This work suggests that the Modelling Framework has implications for a wide range of course formats and activity types beyond physics labs and apparatus-based activities.

7. Summary and ongoing work

The Modelling Framework for Experimental Physics (figure 1) was developed to describe the process of constructing, using, and revising models when conducting physics experiments. It was informed by, and is consistent with, other theoretical conceptions of scientific modelling. Multiple studies have demonstrated that there is a good empirical mapping between the framework and students' approaches to completing experimental physics tasks and instructors' learning goals and activity design in upper-division lab courses. Additionally, the framework has been used to inform the design and evaluation of lab courses and introductory algebra- and calculus-based physics courses. For these reasons, we argue that the Modelling Framework is a theoretically sound, well-tested, and versatile tool.

The multi-year process through which the Modelling Framework was developed has yielded some insights about barriers to engaging students in modelling during upper-division physics labs. In these courses, students' prior knowledge about relevant physics concepts or model limitations impacts their ability to appropriately compare data to predictions or propose causes for discrepancies, and their ability to judge the quality of agreement between data and predictions impacts whether they revise their models or apparatus. We believe that it is possible to strike an effective balance between learning physics concepts and theories and engaging in modelling-based experimentation. However, the dearth of research on teaching and learning in physics labs (National Research Council 2012) makes it difficult to know which instructional practices work well for particular learning goals, physics domains, and student populations.

In ongoing work, we are using the Modelling Framework to inform the design of standardised and scalable assessments of students' experimental modelling abilities (Dounas-Frazer et al 2018). Interviews with instructors point to the need for process-based instruments that assess students' competence with multiple subtasks and the iterative nature of modelling, as well as their rationale for choosing one modelling pathway over another (e.g. deciding to revise the apparatus rather than collect more data). We aim to develop instruments that are compatible with a recent model for centralised data collection and large-scale deployment of research-based assessments (Wilcox et al 2016). Doing so will enable us to identify individual courses or types of courses that are successfully improving students' experimental modelling abilities, which in turn will pave the way for further research on effective modelling-oriented teaching practices in physics labs. Ultimately, we hope that this work will create more opportunities for students to authentically engage in experimentation during their undergraduate physics education.

Acknowledgments

We thank Benjamin Pollard and Laura Ríos for useful discussions about the ideas presented here. This material is based upon work supported by the National Science Foundation under Grant Nos. DUE-1726045 and PHY-1734006.
