



CBE Life Sci Educ. 2010 Fall; 9(3): 333–341.

How to Build a Course in Mathematical–Biological Modeling: Content and Processes for Knowledge and Skill

Anne-Marie Hoskinson

Department of Biological Sciences, Minnesota State University–Mankato, Mankato, MN 56001

John Jungck, Monitoring Editor

Received 2010 Mar 17; Revised 2010 May 13; Accepted 2010 Jun 6.

Abstract

Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical–biological models requires cooperation between biologists and mathematicians and mastery of the modeling process itself. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill them. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics across many types of assessments.

INTRODUCTION

Almost no domain of modern biology can be taught without input and collaboration from mathematics. The success of molecular biology results from collaboration with computational mathematicians, whose algorithms can extract solutions from intractably large problems. Ecologists have benefited from analytical and topological mathematics to model the functioning and stability of complex ecosystems. Biological systems are emergent systems, displaying complexity across multiple levels of organization. Integration of mathematics and biology is not only useful but also essential to understanding biological systems (Cohen, 2004). Biologists and mathematicians, researchers and educators, have echoed this call for collaboration between mathematics and biology at the undergraduate level and above (Jungck, 1997; Tanner et al., 2003; Steen, 2005).

Mathematics and biology intersect in the sphere of models. Models are simplified representations of systems, and mathematical models formalize such representations with equations. Mathematical modeling is as important to biology as it is to physics or chemistry, for similar reasons: biological systems can span several orders of magnitude in space and time, and mathematical models can provide insight and focus where qualitative models cannot (Phillips and Milo, 2009). In addition, ethical considerations make mathematical models of some biological systems the only option for gaining insight. However, few mathematicians are formally trained as biologists, and few biologists have rigorous mathematical backgrounds. Building robust models of biological systems requires both an understanding of existing models and mastery of the modeling process.

Science process learning supports students' acceptance and understanding of scientific concepts and ideas (Stamp et al., 2006; Lombrozo et al., 2008) and works in a wide variety of settings (Senocak et al., 2007). Process learning is important for at least two reasons. First, although scholars and learners can access many resources that describe what we know about biology, mathematics, and models, there are fewer resources that describe how we know. There is already far too much content to cover in the time allotted for our courses. It is precisely because content knowledge of biology and mathematics is increasing so rapidly that teaching and learning the processes of science and mathematics are so important. Understanding complex biological systems requires mastery of what Holling (1996) termed the science of the integration of parts, or the interrelationships within complex systems. Second, our students bring misconceptions and predictable errors to our mathematics and science classes (Gray et al., 2005; Stamp et al., 2006; Colburn, 2007; Tariq, 2008). Misconceptions can be very resistant to modification and can obstruct higher learning. Teacher-centered strategies for countering misconceptions are notoriously ineffective. Instead, students must confront their own conceptions and errors, understand them, construct a new framework of understanding, and replace the faulty concepts (Nazario et al., 2002; Lombrozo et al., 2008). In other words, they must learn science process skills.

Likewise, collaboration between biologists and mathematicians can result in richer, more sophisticated understanding of complex biological systems (Cohen, 2004; May, 2004; Phillips and Milo, 2009). An analysis of >190 studies (reported in Johnson et al., 2000) strongly demonstrates that emphasizing processes of cooperative learning increases students' academic achievement, higher-level reasoning, retention, and motivation more than competitive or individual learning does. The American Association for the Advancement of Science (1991); many professional associations, such as the Mathematical Association of America (Steen, 2005) and the Ecological Society of America; and the two principal financial grantors in the United States (National Science Foundation and National Institutes of Health) urge researchers and educators to foster interdisciplinary collaborative experiences at the undergraduate level. However, undergraduate biology majors are rarely exposed to effective collaboration or taught how to practice it.

Several researchers have published general prescriptive models describing processes of implementing cooperative learning (Johnson and Johnson, 1992; Smith, 1995; Johnson et al., 1998a; Tanner et al., 2003). Accounts of what worked and what did not during and after cooperative-learning implementation are few (but see Schlegel and Pace, 2004; Phillips et al., 2007). The perceived trade-off between time spent teaching content versus process favors the measurable results that content teaching targets: facts and concepts checked off a list, proportions of undergraduates scoring well on a class exam or graduate school admissions test. On these kinds of metrics, outcomes of process learning are perceived as intangible or unmeasurable. For these and many other reasons, professors, administrators, and even our students resist cooperative-learning processes (Herreid, 1998).

To explore these issues, beginning in fall 2006, I designed a new course in mathematical models of biology for undergraduate biology majors in the School of Biology at the Georgia Institute of Technology. The school's purpose for the course was to teach biology majors how to build mathematical models, using a broad survey of mathematical models from many domains of biology as the foundation. Therefore, the school's goals focused on both student knowledge and skills from the outset. Trained as a mathematical biologist, I was quite comfortable and familiar with many models. The modeling process, however, requires both science and craft, and an understanding of both the modeling and collaborative processes. I wondered whether the modeling process could be taught effectively, and whether and how cooperative skills might be taught as well. Through iteration (four successive semesters), I developed a set of heuristics (sensu Starfield et al., 1994), "plausible or reasonable approach[es] … often proved to be useful, rule[s] of thumb," for designing and implementing a course that emphasized both content and process of mathematical modeling and cooperative learning. The purposes of this article are to offer tools for other instructors to prototype their own courses designed to incorporate both content and process learning, and to demonstrate how content and process learning support and complement each other.

BUILDING A COURSE: OBJECTIVES

A model of the course had to be built upon the School of Biology's objectives: to teach students models and a modeling process. Content and processes were undefined. Therefore, the starting point was to identify existing guidelines for modeling processes and cooperative processes (Table 1). We used learning outcomes, feedback, and assessments (described below) in an iterative, dynamic process of refining course objectives, which then helped us develop and refine our choices of instructional models and model exercises (Table 2).

Table 1.

A prototype of curriculum choices to build both content and process competence in model-building and cooperative skills

Model competence
    Objectives: Quantitatively representing hypotheses; graphically and verbally representing vague problems
    Content: Model components (objective, assumptions, variables and parameters); instructional model criteria (accessibility, ubiquity, novelty factor)
    Process: Scenarios; heuristics; incomplete information; planning for mistakes; frequent group presentations (informal and formal); learning journal

Cooperative skills competence
    Objectives: Communicating results targeted to audience; practicing collaboration
    Content: Ground rules
    Process: Team projects; in-class problem solving and prototyping

Table 2.

Course progression (top to bottom) showing how targeted model competencies drive the resulting choices of instructional models used and exercises (e.g., in-class problem solving, homework)

Each row lists the model competencies built, the model choice, and the model exercises.

1. Competencies: Build discrete models; generate descriptive figures; plan for sensitivity analysis; understand how probabilities influence models.
   Model: Single-species population models: density-independent (assumes paradise), density-dependent.
   Exercises: First model: a discrete population viability analysis (PVA) with missing information and variable ranges. Second model: the same population/system with a probabilistic PVA.

2. Competencies: Turn words into equations; build continuous models; check assumptions; graphical analysis.
   Model: Multispecies population models: Lotka–Volterra (L-V) predator–prey (P-P), interspecific competition.
   Exercises: Read Volterra (1926) and generate the two difference equations from the description. Build a model of two-species competition.

3. Competencies: Turn words into equations; interpret equations in plain language; understand systems of ordinary differential equations (ODEs); analyze a model's stability.
   Model: Compartment models: S-I-R (susceptible-infected-recovered; Anderson and May, 1979).
   Exercises: Generate mathematical models of measles, human immunodeficiency virus, and tuberculosis, and teach the class your variant. Build vaccination into your model.

4. Competencies: Interpret equations in plain language; master systems of ODEs; seek common ideas and notation among models.
   Model: Michaelis–Menten enzyme kinetics; revisit L-V P-P.
   Exercises: Come up with a procedure to find V_max.

5. Competencies: Understand Markov processes; build spatial models; explore scenarios.
   Model: Spatially explicit, contagious-process models (e.g., forest fires, infectious diseases, invasive species).
   Exercises: Prototype a contagious-process model.

6. Competencies: Master Markov processes; check assumptions; explore scenarios; compare evidence to model prediction.
   Model: Sequence evolution.
   Exercises: WHIPPO (BioQUEST Curriculum Consortium, 2010).

7. Competencies: Compare evidence to model prediction; turn words into equations; analyze stability.
   Model: Game theory: Chicken, Hawk-Dove, Prisoner's dilemma, evolutionarily stable strategies.
   Exercises: Write a payoff matrix for the final scene in The Good, the Bad, and the Ugly; explain how cooperation could arise.

8. Competencies: Explain model results in plain language; use models to make decisions; recognize caveats and cautions; explore scenarios.
   Model: Decision analysis.
   Exercises: Roundtable exercise: students read a paper about a conservation problem addressed with a model, then adopted roles for a stakeholder meeting whose objective was to reach a consensus solution (based on Ralls and Starfield, 1995, and Starfield et al., 1995).
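To make the table's first exercise concrete, a probabilistic PVA can be prototyped in a handful of lines. The sketch below is my own illustration, not a course artifact: it is written in Python rather than the spreadsheets the course used, and every parameter value (initial population, growth statistics, quasi-extinction threshold) is invented.

```python
import random

def pva(n0, growth_mean, growth_sd, years, threshold, trials, seed=0):
    """Probabilistic population viability analysis: estimate quasi-extinction
    risk by simulating many trajectories with a stochastic growth rate.
    All parameter values used here are hypothetical."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(years):
            r = rng.gauss(growth_mean, growth_sd)  # environmental stochasticity
            n = n * (1 + r)
            if n < threshold:                      # quasi-extinction threshold
                extinct += 1
                break
    return extinct / trials

risk = pva(n0=50, growth_mean=0.01, growth_sd=0.2, years=100,
           threshold=10, trials=1000)
print(f"estimated quasi-extinction probability: {risk:.2f}")
```

Note how little separates this from the deterministic first model: replacing a fixed growth rate with a random draw is the entire change, which is why the paired deterministic/probabilistic exercise works well for novices.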

The modeling process can be as fluid as the scientific process; as with the scientific process, there is broad agreement on its fundamental elements (Starfield et al., 1994; Nicolson et al., 2002; Haefner, 2005). Because by definition models are functional representations and simplifications of complex biological systems, they should be built with the target audience in mind, beginning with a clear objective, or purpose (Ralls and Starfield, 1995; Starfield et al., 1995). Scenarios and observations need to be translated into equations. Variables and parameters should be defined, with forethought to both simplicity and tractability. Assumptions (simplifying hypotheses) should be made explicit. The model should be tested for its sensitivity to changes in variable and parameter inputs through multiple scenarios. Each prototype should be verified against the model's objective(s) (Starfield et al., 1994). The entire process should be transparent (Nicolson et al., 2002; Gotelli, 2008), for the sake of both the modelers and the intended audience. Like the scientific process, the modeling process is flexible, nonlinear, and iterative.

Building on the work of Johnson et al. (1998b), Johnson and Johnson (2000), and Tanner et al. (2003), many researchers have identified five essential elements of cooperative learning.

  1. Students share collective goals and an individual stake in the group's success (positive interdependence). They understand that these goals can only be achieved when each group member succeeds and contributes.

  2. Students interact in person. Introductory classes and students who are inexperienced in academic discourse need structured "face time," typically during class times.

  3. Students must be both individually and collectively accountable.

  4. Students must be given opportunities to assess whether the group is meeting its common goals, how group members are contributing, and what adjustments should be made.

  5. Students actually need to be taught cooperative skills with the same attention and rigor as academic content.

Cooperative groups are therefore much more than students divided among groups working on a common project. Cooperative groups must be deliberately and thoughtfully constructed, their objectives well defined, and their dynamics attended to (Phillips et al., 2007). Instructors must think deliberately about structuring cooperative activities into the curriculum and assessing the individual and group outcomes. Just as important, instructors must teach interpersonal skills and structure group interactions in ways that allow students to practice listening, critiquing, questioning, and demonstrating skeptical scientist skills (Tanner et al., 2003; Phillips et al., 2007).

Course objectives that emphasize students mastering both content and processes of modeling and cooperation facilitated focus on students and their learning, rather than lengthy lists of content to cover (Wiggins and McTighe, 2005). Choosing course objectives using questions about learning outcomes also meant that anyone teaching the course in the future could choose models (content) with which he or she was familiar, as long as chosen content fulfilled the course objectives. Over the four semesters this course was offered, the learning outcomes evolved to include knowledge and skills in the following areas.

Quantitatively Representing Hypotheses (Jungck, 1997)

Both students and some professors think of biology as the least quantitative of the sciences, but fields such as genomics, ecosystem dynamics, bioinformatics, and drug and vaccine discovery are driven by breakthroughs in quantitative understanding (Cohen, 2004). This course objective included practicing skills in prototyping, model parameterization, sensitivity analyses, calibration and validation, analysis and interpretation of results, and choosing appropriate scenarios and simulations.

Graphically and Verbally Representing Vague Problems

This may be especially important in the emergent biological sciences (Steen, 2005). It is often challenging for novices, who are unfamiliar with the assumptions behind vague problems.

Communicating Model Experiment Results Targeted to the Audience, in the Most Economic and Efficient Ways Possible

Models, with their simplifying assumptions and uncertainties, make for particularly skeptical recipients (Starfield and Bleloch, 1991; Ralls and Starfield, 1995). Students need practice in simplifying both the mathematics and biology of their models and in presenting their results orally and in writing.

Cooperation and Collaboration

Science is a collaborative practice. Students need repetition and feedback from instructors and peers about how to accomplish shared goals.

CONTENT

Model Choices

Once course objectives were defined, I chose mathematical–biological models for the course based on three criteria: accessibility for novices, ubiquity, and a paradox or novelty factor (Table 1). The School of Biology's primary objective for this course was to expose biology majors to a quantitative survey of mathematical models from many domains of biology, so accessibility was important.

I began each semester with single-species population models because both the models and the modeling software (spreadsheets) were accessible to novice modelers. The advantages of using spreadsheets as a modeling environment with novice modelers include a short ramp-up time and easy communication, both among team members and between the team and its audience. Students grasped the modeling fundamentals (define the objective; identify variables, parameters, and assumptions) of these simple discrete and continuous models, and the simple continuous models also facilitated a review of differential and integral calculus. It was also easy for instructors to model the modeling process by having students question a single assumption and then find out how the model and its output changed. For example, a key assumption of density-independent population models is paradise: space, food, and water do not limit population growth. By asking what happened when space, food, or water did limit population growth, students could see the biological and mathematical consequences of assumptions. Multispecies interactions were then introduced by removing the assumption of no interaction from single-species models.
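The consequence of relaxing the paradise assumption can be sketched directly. The following is my own illustration, not a course exercise (Python rather than the course's spreadsheets, with invented parameter values): the density-independent model grows without bound, while adding a carrying capacity K caps growth.

```python
# Discrete single-species population models (hypothetical parameter values).
# Density-independent growth assumes "paradise": space, food, and water
# never limit growth. Density-dependent growth relaxes that assumption.

def density_independent(n0, r, steps):
    """N[t+1] = N[t] + r*N[t]; the population grows without bound."""
    n = n0
    for _ in range(steps):
        n = n + r * n
    return n

def density_dependent(n0, r, k, steps):
    """N[t+1] = N[t] + r*N[t]*(1 - N[t]/K); growth slows as N approaches K."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / k)
    return n

paradise = density_independent(n0=10, r=0.3, steps=50)
limited = density_dependent(n0=10, r=0.3, k=500, steps=50)
print(round(paradise))  # exponential: astronomically large
print(round(limited))   # settles near the carrying capacity K
```

Changing the single assumption changes one term in the update rule, which is exactly the kind of one-assumption experiment described above.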

I also identified common themes among mathematical models from different domains of biology. For example, mass action is a property of many different biological models (in addition to models in chemistry, physics, and sociology). In the Lotka–Volterra predator–prey model, the mass-action term describes how predators encounter prey and convert them to more predators by consuming them. In epidemiological models, mass action describes how a disease spreads within a population from infected to susceptible individuals. In the Michaelis–Menten model of enzyme kinetics, substrate–enzyme complex formation is governed by mass action. Although the three different models have very different objectives focused on biological systems that can differ by orders of magnitude, mass action terms are mathematically similar, with similar biological interpretations. Giving students many opportunities to move between biological and mathematical insight was important to developing their modeling process skills.
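The shared mass-action structure can be shown side by side. The sketch below is my own illustration (not from the course; all rate constants are invented): one Euler step each for a Lotka–Volterra predator–prey model and an S-I-R model, where the interaction term in both has the same bilinear form, rate × x × y.

```python
# The same mass-action term, rate * x * y, drives encounters in two very
# different biological models (hypothetical rate constants throughout).

def lotka_volterra_step(prey, pred, dt=0.01, a=1.0, b=0.1, c=1.5, d=0.075):
    """Predators encounter prey in proportion to prey*pred (mass action)."""
    dprey = a * prey - b * prey * pred   # mass-action loss of prey
    dpred = d * prey * pred - c * pred   # mass-action gain of predators
    return prey + dt * dprey, pred + dt * dpred

def sir_step(s, i, r, dt=0.01, beta=0.5, gamma=0.1):
    """A disease spreads in proportion to S*I (mass action)."""
    new_inf = beta * s * i               # mass-action transmission term
    ds = -new_inf
    di = new_inf - gamma * i
    dr = gamma * i
    return s + dt * ds, i + dt * di, r + dt * dr
```

Mathematically, b*prey*pred and beta*s*i are the same object; biologically, one is predation and the other is transmission, which is the kind of cross-domain theme students were asked to recognize.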

Finally, I chose some models to illustrate paradoxes, absurdities, or limitations of mathematical models. For example, graphical analysis of a simple linear model of obligate mutualisms shows an unstable two-species equilibrium. However, obligate mutualisms are common and persistent in nature. When students connected the mathematical model to the biological realities, they were able to examine their modeling process and identify model assumptions as oversimplifying the system. In game theory, the classic Prisoner's dilemma seems to preclude the rise of cooperation; clearly, cooperation operates at many scales of interaction—within humans and within other species—and can even spread rapidly (Fowler and Christakis, 2010). Conversely, the Prisoner's dilemma, with its assumption that agents have perfect and complete knowledge of one another's choices, can be found in the most unlikely system: strains of RNA viruses (Turner and Chao, 1999), which certainly cannot know anything. Both these examples force students to confront naïve conceptions about mathematics, models, and biology and to replace misconceptions with robust knowledge.
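The dilemma's apparent preclusion of cooperation is easy to verify with a payoff matrix. The numbers below are the conventional textbook values, not taken from the course: defection is a best response to either move, yet mutual cooperation pays each player more than mutual defection.

```python
# Prisoner's dilemma with the conventional payoff ordering T > R > P > S.
# (These particular payoff values are illustrative, not from the course.)
PAYOFF = {
    ("C", "C"): (3, 3),  # reward for mutual cooperation (R)
    ("C", "D"): (0, 5),  # sucker's payoff (S) vs. temptation (T)
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # punishment for mutual defection (P)
}

def best_response(opponent_move):
    """Return the move maximizing my payoff against a fixed opponent move."""
    return max("CD", key=lambda my: PAYOFF[(my, opponent_move)][0])

# Defection dominates: it is the best response to either opponent move,
# even though (C, C) pays each player more than (D, D).
print(best_response("C"), best_response("D"))
```

This is the naïve-conception collision in miniature: the mathematics says defect, while biology shows cooperation everywhere, so the model's assumptions must be interrogated.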

Over four semesters, I added or dropped some models but generally progressed through single-species population models; multispecies models (predation, competition, and mutualism); epidemiological models; enzyme kinetics; game theory and evolutionary dynamics; and spatial models (summarized in Table 2).

Assessment

I used frequent assessments—individual and group, informal and formal (for grade), formative, and summative—to gauge content and process knowledge (course objectives). The first purpose of assessments was to gauge student mastery of the modeling process. Formative assessments included cold-calling on a student to identify the objective, and then another student to identify an assumption, and so on. I chose the responders (instead of calling on a raised hand) but kept anxiety low by moving rapidly. There was no score attached to a cold call. Much class time was devoted to students formulating solutions in small groups, so I also visited groups and questioned group members about their process. These frequent formative assessments of student process invited students to be responsible for their learning (Nicol and Macfarlane-Dick, 2006). A summative assessment of modeling process mastery was an exam question asking students to develop and apply a list of criteria to evaluate modeling papers.

The second kind of assessment evaluated student mastery of the mathematical models themselves. For example, student learning was evaluated by whether each student could choose, and justify, a discrete versus a continuous model (i.e., population models). Another evaluation asked students to explain systems of differential equations to other group members or to the class (i.e., disease and other compartment models, enzyme kinetics). Modeling assignments built on in-class problem-solving exercises. Each student practiced turning plain language into equations, interpreting systems of equations, suggesting starting points for sensitivity analysis, or summarizing the group's findings for the rest of the class. By practicing interpretation and making short presentations, students received frequent feedback from instructors and peers about their individual mastery of the models. The summative assessment for model mastery was a final project, conceived by each group: an original mathematical (differential or integral) model of a biological phenomenon. Groups parameterized their models, chose scenarios and experiments, ran their models, gathered results, and presented them in a 15-min group presentation.

We (co-instructors and I) used results of our frequent assessments to gauge our teaching effectiveness and make adjustments when necessary. For example, the first semester the course was taught, one of us introduced Leslie matrices to support age-structured models in the second week of classes, beginning with matrix theory. From that day's minute-papers, we learned that the level of instruction was too sophisticated and students learned little but felt intimidated. In subsequent semesters, we delayed Leslie matrices until several weeks later, in age-structured models.

Ground Rules

Class ground rules were posted on the syllabus and discussed the first day. The syllabus included two instructor-defined ground rules: 1) all members of a group are responsible for all of the group's product; and 2) members assist each other in learning, e.g., software, models, graphing, writing, and listening. Part of the first class was invested in helping students define other class ground rules. These helped foster cooperative skills such as patience, transparency, compassion, and assertiveness. Students on a team all earned the same score for the single project report (cooperative learning element 1: group and individual accountability). In the first two semesters, I neglected opportunities for group processing during the course. As a result, there were many more group problems requiring instructor intervention when students began working on student-selected final modeling projects. In the third and fourth semesters, I made structured group processing the final component of all group projects (cooperative learning element 4).

Learning by Doing

To support model and model-building mastery, I built problem solving and rapid prototyping into the curriculum. Approximately half the instructional time (totaling 3 h/wk) was devoted to solving problems and generating rapid prototypes of models. Because students worked in small cooperative groups of three or four, instructors could provide student-centered feedback and adjustment on model development, the modeling process, and cooperative processes. Like many curriculum strategies, this tool benefited both content and process goals.

PROCESS

Team Projects

Because one course objective was for students to learn and practice cooperative skills, all work for this class except an individual midterm examination occurred in groups of three or four members. Group sizes of three or four maximized the potential for each team member to contribute. In the second semester the course was offered, enrollment temporarily increased to 60, and I tried group sizes of five and six but found that one member frequently dominated the proceedings or freeloaded. After the second semester, enrollment returned to a maximum of 40, and group size did not exceed four. Contrary to students' (and some colleagues') conceptions of group work and scores, individual differences in proficiency do emerge, especially when group members shuffle after every one or two assignments.

Learning Journals

The course's objectives focused on content and process mastery, but students are strongly accustomed to content-rich courses. Consequently, I heard frequently from students (especially early in the semester) that they were unsure of what they were learning. Because transparency helps learning, all students subsequently turned in a weekly observation about what they had learned, applied, or noticed that week (ungraded). Some students had to be coached away from reporting about the week's activities or the failings of others and toward processing their understanding of what they had learned. Typically, a few weeks into the semester, students began to see the value of this metacognitive exercise in their process of becoming skilled biological modelers and collaborators. By the end of the semester, >90% of students used the weekly learning journals to reflect on how much they had learned about the content and processes of modeling and cooperation.

Modeling Heuristics

Heuristics, or rules of thumb, for simplifying both models and the modeling process can be especially useful for novices (Nicolson et al., 2002). Novice modelers often struggle to distinguish the essential or important details of their learning, and experts find it difficult to distill the fundamentals. I distributed a list of modeling heuristics to students and then referred to it frequently during the semester. The heuristics for modeling included keeping the objective in mind, keeping a list of assumptions, planning for sensitivity analyses, keeping things simple without oversimplifying, and several others. These helped students constrain problems as they parameterized their models. They also facilitated group and individual processing when stuck. The sources for these heuristics were books, chapters, and papers on modeling (Starfield, 1991; Starfield and Bleloch, 1991; Nicolson et al., 2002; Haefner, 2005; Gotelli, 2008). Over the four semesters, several students also contributed to the list; "Cut through Gordian knots" became a favorite for finding a simple way through a difficult problem.

MERGING CONTENT AND PROCESS

Over the four semesters, I explored how best to target content and processes of models and cooperation. In this section, I offer several curriculum heuristics that served instructors and students well. In the next section, I illustrate these steps in action with a narrative of one model cycle.

Map Out a General Learning Path

Drawing from field-specific knowledge, examples from primary literature, even our own experiences, each of us has some knowledge of the big ideas in our field and how students learn the processes of our disciplines. The purposes of this heuristic are to make explicit the often implicit understandings and assumptions we have about what we teach and how students learn, to emphasize student learning over instructor knowledge, and to relieve the anxiety of de-emphasizing content. Initially, this step was informal. After the first and especially second semesters, when I had both results of assessments and a feel for models, exercises, and teaching practices that worked, I revisited this step to clarify course objectives and content and process choices.

Decide Which Mistakes to Induce

One of the insights of process learning is that right answers aren't always possible. Certainly, perfect recall of facts isn't ever possible, so process learning supports discipline mastery. Students learn by getting predictable wrong answers, discovering their mistakes, and correcting both the knowledge and the path to it. Using my map of a general learning path in this course, I could see, and target, common mistakes that novice modelers often make. I chose teaching examples and assignment scenarios specifically so students could make the mistakes I wanted them to make. To avoid seeming manipulative, I made this explicit on the first day of class and reminded students often that one of my goals was to get them to make as many mistakes in as short a time as possible, so that they could learn to correct themselves in a low-stakes environment.

Simple Models, Complex Scenarios

Because one course objective was process mastery, I redesigned assignments, replacing those that asked for exact solutions with models that explored scenarios containing certain (carefully chosen) elements of uncertainty. Groups had to define their model's objective, its variables, and its assumptions; build the model; and use it to make predictions or recommendations. This approach encouraged basic modeling-process skills and group processing.

Incomplete Information

Modelers need to know what data they have, what data they lack and must find, and what data are extraneous to building a model. By omitting key information, I forced students to determine what they knew and did not know. Students began to see the value of cooperative learning when team members thought of novel approaches and scenarios.

Short Papers, Short Deadlines

Typical assignment deliverables were one to two pages of text and one page of figures and tables per small group. Under this space constraint, students had to negotiate with one another about what would go into the modeling report and what would not. Typical turnaround time for the modeling reports was 1 wk from the date assigned.

Evaluation Rubrics Must Support Content and Process

Over the four semesters, typical evaluation rubrics evolved from rewarding very specific, solution-seeking work to rewarding scenario exploration. Students were accustomed to lists of content to memorize, so careful attention to including process criteria and standards was important.

Plan for Frustration

Even when structured well, process learning can be frustrating for students. The open-ended modeling scenarios were similarly frustrating to them, as were instructors' refusals (for the most part) to tell students the answer, or how to do it, or what we were looking for. There is dignity in this academic struggle, and students must be allowed and invited to be wrong. Students also need to know that we are not trifling with their time, so I did not allow students to wander too far astray if that detour wouldn't help their understanding of the model. Over the four semesters, I noticed that individual and group frustration peaked at approximately the fifth week of the semester. I made it a point to offer reassurance and a course-objective check-in. Humor also was useful.

AN EXAMPLE MODEL CYCLE

Although model choices progressed in complexity and sophistication through the semester (Table 2), I followed a similar approach in each teaching and learning cycle. I began with an example, a case, or data from papers to illustrate the need for a mathematical model. That prefaced an ≈10-min minilecture developing a model from the given scenario:

"Suppose there are 100 mockingbirds on campus, and they grow by 50% next year. How large will the population be? How large will it be 50 years from now?"

From a starting question, students practiced turning words into equations. They came up with operational definitions for objectives (to predict the population some time in the future or past), variables (growth rate), and assumptions (simplifying hypotheses, such as a closed population, equal fitness among individuals, asexual reproduction). Rather than lecture on these, I prompted for the concept with a question or an example: "What are we assuming about birds on the Georgia Institute of Technology campus and birds on the Emory University campus?" This encouraged students to move between the mathematical and biological worlds. Through the semester, I followed the same modeling process: finding or stating the objective, defining variables and assumptions, validating and interpreting results, and planning for sensitivity analysis. Typically, students had 15–20 min in teams to explore a scenario or solve a problem. This allowed me to visit each group and question or probe students both for model understanding and cooperative processes.
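The mockingbird minilecture can be sketched in a few lines of code that follow the same steps students practiced: state the objective, name the variables, and make the assumptions explicit. This is only an illustrative sketch; the function name and parameters are mine, not from the course materials.

```python
# A minimal sketch of the mockingbird minilecture model (hypothetical
# names), following the course's modeling steps.

def project_population(n0: float, r: float, t: int) -> float:
    """Discrete geometric growth: N_t = N_0 * (1 + r)^t.

    Assumptions (stated explicitly, as the course requires): a closed
    population, equal fitness among individuals, asexual reproduction,
    and a constant per-year growth rate r.
    """
    return n0 * (1.0 + r) ** t

# Objective: predict the campus mockingbird population t years from now.
# Variables: n0 = 100 birds, r = 0.50 per year.
print(project_population(100, 0.50, 1))   # next year: 150 birds
print(project_population(100, 0.50, 50))  # 50 yr: absurdly large, so question the assumptions
```

The absurd 50-year projection is itself a teaching moment: it moves students back from the mathematical world to the biological one, where the closed-population and constant-growth assumptions break down.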

For the first assignment, I focused mostly on building competence in the modeling process, such as clearly stating a model objective and its nontrivial assumptions and exploring scenarios by methodically choosing parameter and variable ranges. The assignment objective for students was essentially to perform a population viability analysis using a discrete, deterministic population model. In the first semester, I used data from a real species (white-tailed deer in Georgia) and asked a question ("What will the population be in 10 yr?"). Students simply searched the Internet for missing information, unaware of the assumptions behind most data (e.g., sample sizes and confidence intervals), and solved the problem. But most mathematical models of biological systems are not built because the problem is easy or solvable; often crucial information is missing. In subsequent semesters, I invented a fictitious species with fictitious (but deliberately chosen) life-history traits. I deliberately omitted key information about the species so that students would explore a range of variable values, and I constrained the output to choosing among four recommended actions. Each recommendation was constructed to invite a sensitivity analysis of one of the four model variables. Groups had 1 wk to build the model, run scenarios, and turn in a report. Group members signed off, indicating their participation in and understanding of the product.
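A sketch of this kind of exercise may help: a discrete, deterministic projection plus a one-at-a-time sensitivity sweep over a parameter whose value was deliberately withheld. The species, parameter names, and values below are hypothetical stand-ins, not the course's actual assignment data.

```python
# Hypothetical population-viability sketch: discrete, deterministic,
# no age structure. All parameter values are illustrative.

def project(n0, survival, fecundity, harvest, years=10):
    """Project abundance with yearly survival, per-capita births, and a
    fixed yearly harvest; returns the abundance trajectory."""
    traj = [n0]
    for _ in range(years):
        n = max(traj[-1] * survival * (1.0 + fecundity) - harvest, 0.0)
        traj.append(n)
    return traj

def sensitivity(base, param, values, **kwargs):
    """One-at-a-time sensitivity analysis: vary a single parameter across
    a range (as students did when key information was missing) and record
    the final abundance for each value."""
    out = {}
    for v in values:
        args = dict(base, **{param: v})
        out[v] = project(**args, **kwargs)[-1]
    return out

base = dict(n0=200.0, survival=0.8, fecundity=0.4, harvest=15.0)
# Because survival was not given, explore a plausible range of values
# before choosing among the recommended management actions.
print(sensitivity(base, "survival", [0.7, 0.75, 0.8, 0.85, 0.9]))
```

Run this way, the model shows how a recommendation can hinge on a single uncertain variable: low survival values send the trajectory toward zero, while high values let the population grow despite the harvest.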

Each iteration of a model assignment (both within a semester and among semesters) emphasized scaffolding of model mastery, modeling-process development, and development of the cooperative process. Because I constrained each modeling problem to address specific competencies, most students in most groups made most of the expected mistakes. This, in turn, made assessment more efficient: instructors did not have to repeat the same feedback to each group separately. Instead, I developed short lists of suggested model improvements for the next model assignment (e.g., plan for a sensitivity analysis) and focused group feedback on those deficiencies particular to the individual group.

OUTCOMES

Throughout the four semesters this course was taught, I used a variety of assessments to judge whether students were learning both mathematical modeling content and processes of modeling and cooperation. In spring 2008, I contacted all current and former students via e-mail, asking for their perceptions about what they had learned and retained, both mathematical–biological and interpersonal (cooperative). I used the Student Assessment of Learning Gains (SALG) (Carroll et al., 2007), a web-based assessment tool that focuses on whether and how a course has influenced student learning. The SALG site is both anonymous and confidential for students, and instructors can customize questions to focus on specific curricular tools.

Because I was interested both in whether students thought they had learned content and process and whether they thought those gains were durable, I used a different response URL for each of the four semesters. All respondents from all four semesters answered the same 43 questions on a 1–5 Likert scale. I asked respondents about their skills and knowledge of both mathematical–biological models, and modeling and cooperative processes. Respondents also were invited to leave comments if they chose. I invited all 175 students who completed the course to respond. Eighty-nine students (51%; range, 43–58%) responded to the questionnaire invitation, a good response rate (Dommeyer et al., 2004), especially given the delay of up to 16 mo.

Students reported strong gains in learning modeling content (mean of response to each question = 4.10/5, N = 89). They reported similar gains in modeling-process skills, including applying the model-building steps (mean = 4.25/5) and their abilities to approach vague problems (mean = 4.12/5). Finally, they reported gains in their cooperative skills: collaboration in modeling assignments and group assignments helped them understand their roles in a group and work with peers (mean = 4.13/5 and 4.18/5, respectively). Although most responses to most questions showed similarly strong endorsement of the course curriculum and of its impact on both knowledge and skills, responses were not uniformly positive on any question in any semester. Notably, opinions of the curriculum and self-reported gains in content and process were as strong among students who had finished 12–16 mo before being surveyed as among students just finishing the course.

Another metric of course success came from student midterm exams—the single individual summative assessment each semester. In each of the four semesters, I asked two questions. In the first, I asked students to define a short list of criteria they would use to evaluate a model, to prioritize their list, and to explain their top three criteria to a colleague unfamiliar with models. The purpose of this question was to gauge student mastery of the modeling process. More than 96% of students in each semester chose 1) model objective, 2) variables, 3) assumptions, and 4) sensitivity analysis among their top six criteria. I asked a second question designed to gauge modeling content knowledge. Although the question varied among semesters, students again showed high mastery (>85% able to provide a response showing mastery).

The most significant metric of success was a final group project and presentation, designed to assess mastery of all four course objectives. Over the course of ≈6 wk, student groups generated a biological question that could be explored with a differential-equation model, developed and tested an original model, and then presented their findings in an open seminar. Many students found it difficult or intimidating to describe variables and formulas—much less systems of differential equations—early in the semester. By the final project, all students were competent in explaining systems of differential equations in plain language.
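To illustrate the kind of differential model a final project might explore, here is a sketch of the classic Lotka–Volterra predator–prey system (dx/dt = ax − bxy, dy/dt = cbxy − dy), integrated with a simple Euler step, together with the plain-language reading students were expected to give. The parameter values are mine and purely illustrative; no student project data are shown.

```python
# Illustrative sketch: Lotka-Volterra predator-prey equations solved
# numerically with a simple (Euler) time step. Parameters are invented.

def lotka_volterra(x0, y0, a, b, c, d, dt=0.001, steps=20000):
    """In plain language: prey grow exponentially when alone (rate a) and
    are eaten in proportion to predator encounters (b); predators convert
    eaten prey into offspring (efficiency c) and die at a constant rate
    (d) when prey are absent. Returns final (prey, predator) abundances."""
    x, y = x0, y0
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (c * b * x * y - d * y) * dt
        x, y = x + dx, y + dy
    return x, y

prey, pred = lotka_volterra(x0=10.0, y0=5.0, a=1.0, b=0.1, c=0.5, d=0.5)
print(prey, pred)  # both populations remain positive as they cycle
```

Being able to narrate each term of such a system, as the docstring does, is precisely the plain-language competence the final presentations demonstrated.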

CONCLUSIONS

Learning mathematics and biology is hard enough for students, and the staggering volume of biological content knowledge has already grown beyond any instructor's ability to cover it. Yet the trade-off between teaching time devoted to content and to process is real. Here, I have presented the rationale for a course that combined both content and process learning of mathematical–biological models. My objective was to make the case that we can stop trying to cover the mountain of content and instead make targeted, deliberate choices about the content we teach while also teaching important and oft-forgotten process skills. Although my inductive approach differs from the traditional deductive approach of scientific reports, my intention was to offer a way of imagining mathematical–biological curricula that are effective, produce durable learning, and are energizing for us to teach and for students to learn.

As scientists and instructors, we ask whether curriculum innovations and student learning are linked. More than half of all students who completed the course responded to questions probing their content and process skills, even though the majority of them had graduated. Their perceptions of their own skills and knowledge remained high, even at 1 yr postcompletion and beyond. Although most of the data presented here are student-reported, and therefore potentially biased, one of the process skills students were taught was metacognition. Two intriguing questions emerge from this report: whether, and to what degree, student perceptions of learning correlate with objective measures of learning, and whether learning metacognitive processes increases students' accuracy at self-assessment of their learning.

I now teach the course at a different university, where factors from the university's mission to the student population's preparedness and even life experiences differ from those of the original institution. I have had to adjust my expectations about student performance and the pacing of the class, but these are nearly universal curricular concerns among academics. Like many colleagues who continually seek to improve their own teaching and their students' learning, I adjust and improve (I hope) each time the course is taught. The course objectives still drive content and process choices, made to increase student competence in analyzing and building models and in creating productive collaborations.

ACKNOWLEDGMENTS

I thank the School of Biology at the Georgia Institute of Technology for supporting a novel and somewhat risky course design. My co-instructors taught me much about teaching skeptics and making room for all to contribute. Donna Llewellen (of Georgia Tech's Center for Excellence in Teaching and Learning (CETL)) and Rupal Thazhath provided support and encouragement when obstacles to success in this course seemed insurmountable. Tristan Utschig (CETL) introduced SALG and Wisconsin Center for Education Research's resources to me. The American Society for Microbiology Biology Scholars Residency assisted me in turning these results into a manuscript. At Minnesota State University–Mankato, Stephen Bohnenblust and Candace Raskin provided helpful critiques on several drafts. I am grateful to Stephen Druschel and two anonymous reviewers whose suggested changes helped me make better sense and structure of the course. Tony Starfield was the inspiration and original mentor for the course concept. My deepest gratitude goes to 175 students who turned themselves inside-out to learn a lot about mathematics and biology, a lot about themselves, and inspired me to rise with them.

REFERENCES

American Association for the Advancement of Science. Science for All Americans. Washington, DC: Oxford University Press; 1991.

Anderson R., May R. Population biology of infectious diseases: part I. Nature. 1979;280:361–367.

BioQUEST Curriculum Consortium. WHIPPO Problem Space: the WHIPPO-1 Dataset. Problem Spaces. Beloit, WI: BioQUEST; 2010.

Carroll S., Seymour E., Weston T. Student Assessment of Learning Gains. Madison, WI: Wisconsin Center for Education Research; 2007.

Cohen J. Mathematics is biology's next microscope, only better; biology is mathematics' next physics, only better. PLoS Biol. 2004;2:2017–2023.

Colburn A. Constructivism and conceptual change, part II. Sci. Teach. 2007;74:14.

Dommeyer C., Baum P., Hanna R., Chapman K. Gathering faculty teaching evaluations by in-class and online surveys: their effects on response rates and evaluation. Assessment & Evaluation in Higher Education. 2004;29(5):611–623.

Fowler J. H., Christakis N. A. Cooperative behavior cascades in human social networks. Proc. Natl. Acad. Sci. USA. 2010;107:5334–5338.

Gotelli N. J. A Primer of Ecology. 4th ed. Sunderland, MA: Sinauer; 2008.

Gray S. S., Loud B. J., Sokolowski C. P. Undergraduates' errors in using and interpreting variables: a comparative study. In: Proceedings of the 27th Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education; 2005; Virginia Institute of Technology. PME-NA; pp. 1–7.

Haefner J. W. Modeling Biological Systems: Principles and Applications. 2nd ed. New York: Springer Science+Business Media; 2005.

Herreid C. F. Why isn't cooperative learning used to teach science? Bioscience. 1998;48:553–559.

Holling C. S. Surprise for science, resilience for ecosystems, and incentives for people. Ecol. Appl. 1996;6:733–735.

Johnson D. W., Johnson R. T. Implementing cooperative learning. Contemp. Educ. 1992;63:173–180.

Johnson D. W., Johnson R. T., Holubec E. Cooperation in the Classroom. Edina, MN: Interaction Book; 1998a.

Johnson D. W., Johnson R. T., Smith K. A. Cooperative learning returns to college: what evidence is there that it works? Change. 1998b;30:27–35.

Johnson D. W., Johnson R. T., Stanne M. B. Cooperative Learning Methods: A Meta-Analysis. Minneapolis, MN: Cooperative Learning Center, University of Minnesota; 2000.

Johnson R. T., Johnson D. W. How can we put cooperative learning into practice? Sci. Teach. 2000;67:39.

Jungck J. Ten equations that changed biology: mathematics in problem-solving biology curricula. Bioscene. 1997;23:11–36.

Lombrozo T., Thanukos A., Weisberg M. The importance of understanding the nature of science for accepting evolution. Evol. Educ. Outreach. 2008;1:290–298.

May R. M. Uses and abuses of mathematics in biology. Science. 2004;303:790–793.

Nazario G. M., Burrowes P. A., Rodriguez J. Persisting misconceptions: using pre- and post-tests to identify biological misconceptions. J. Coll. Sci. Teach. 2002;31:292–296.

Nicol D. J., Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud. High. Educ. 2006;31:199–218.

Nicolson C. R., Starfield A. M., Kofinas G. P., Kruse J. A. Ten heuristics for interdisciplinary modeling projects. Ecosystems. 2002;5:376–384.

Phillips M., Gildensoph L. H., Myers M. J., Norton C. G., Olson A. M., Tweeten K. A., Wygal D. D. Investigative labs in biology: the importance of attending to team dynamics. J. Coll. Sci. Teach. 2007;37:23–27.

Phillips R., Milo R. A feeling for the numbers in biology. Proc. Natl. Acad. Sci. USA. 2009;106:21465–21471.

Ralls K., Starfield A. Choosing a management strategy: two structured decision-making methods for evaluating the predictions of stochastic simulation models. Conserv. Biol. 1995:175–181.

Schlegel W. M., Pace D. Using collaborative learning teams to decode disciplines: physiology and history. In: Pace D., Middendorf J., editors. Decoding the Disciplines: Helping Students Learn Disciplinary Ways of Thinking. New York: Wiley & Sons; 2004. pp. 75–83.

Senocak E., Taskesenligil Y., Sozbilir M. A study on teaching gases to prospective primary science teachers through problem-based learning. Res. Sci. Educ. 2007;37:279–290.

Smith K. A. Cooperative learning: effective teamwork for engineering classrooms. In: Proceedings of the ASEE/IEEE Frontiers in Education Conference; 1–4 November 1995; Atlanta, GA. pp. 2b5.13–2b5.18.

Stamp N., Armstrong M., Biger J. Ecological misconceptions, survey III: the challenge of identifying sophisticated understanding. Bull. Ecol. Soc. Am. 2006;87:168–175.

Starfield A. M. Qualitative rule-based modeling. Bioscience. 1991;40:601–604.

Starfield A. M., Bleloch A. L. Building Models for Conservation and Wildlife Management. 2nd ed. Edina, MN: Burgess Press; 1991.

Starfield A., Roth J., Ralls K. "Mobbing" in Hawaiian monk seals (Monachus schauinslandi): the value of simulation modeling in the absence of apparently crucial data. Conserv. Biol. 1995;9:166–174.

Starfield A., Smith K., Bleloch A. L. How to Model It: Problem-Solving for the Computer Age. Edina, MN: Burgess International Group; 1994.

Steen L. A. Math & Bio 2010: Linking Undergraduate Disciplines. Washington, DC: The Mathematical Association of America; 2005.

Tanner K., Chatman L. S., Allen D. Approaches to cell biology teaching: cooperative learning in the science classroom—beyond students working in groups. Cell Biol. Educ. 2003;2:1–5.

Tariq V. N. Defining the problem: mathematical errors and misconceptions exhibited by first-year bioscience undergraduates. Int. J. Math. Educ. Sci. Technol. 2008;39:889–904.

Turner P. E., Chao L. Prisoner's dilemma in an RNA virus. Nature. 1999;398:441–443.

Volterra V. Fluctuations in the abundance of a species considered mathematically. Nature. 1926;118:558–560.

Wiggins G., McTighe J. Understanding by Design. Upper Saddle River, NJ: Prentice Hall; 2005.

Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2931681/
