
Department of Mathematics

Research

At the Center for Quantitative Education, we have multiple ongoing research projects on how best to teach students in a technological world. Much of the research is done in collaboration with graduate students in both the Mathematics Department and the College of Education. A recent Ed.D. thesis, for example, considered under what circumstances students will choose to watch an instructional video and when they prefer to read text, with implications for the design of effective videos. Our work has led to many papers and conference presentations, and many Ph.D. and M.S. students have already completed degrees while working with the Center. In addition, the Q-Center supports two or three post-docs at any given time who conduct pure mathematics research with members of the mathematics faculty and also collaborate part-time on educational research and development. This helps build a stronger community of mathematicians who understand and can use educational research to improve instruction. Post-docs from these positions have gone on to jobs at a variety of schools, including those that value teaching (e.g., James Madison University) and those that are viewed as more research-oriented (e.g., Ohio State University and University of California, Riverside). Some of the Center's current research topics are:

Data Mining and Retention

Our main objective is to identify the general characteristics of groups of students within typical first-year mathematics courses and then adapt aspects of the courses to best suit their needs. Data mining provides a method for getting to know students by making sense of the large amounts of information they generate.

When you use online homework, you automatically accumulate a very large database of student responses. Dr. Andrew Bennett and Rachel Manspeaker have used data mining techniques to identify specific student patterns with a goal of improving retention. After reducing the dimensionality of the data with a singular value decomposition and then applying k-medoids clustering, they identified several different groups of students based on their work during the first four weeks of class. Rachel conducted interviews with 19 representative students spread across these groups to identify common patterns of study. Note that this is not traditional learning-styles analysis, which proceeds from a predetermined sense of how students learn. This research started with actual student behaviors and used "black box" methods to create classifications based on actual performance: Overachievers, Smart Slackers, Employees, Rote Memorizers, and Sisyphean Strivers. This has enabled us to identify students in need of intervention within a few weeks of the start of class. Unfortunately, even by that point students have established inefficient study patterns which we have found difficult to amend. We have now devised an online questionnaire based on the interview data to help classify students prior to enrollment. With this we hope to get students into the appropriate learning situations (e.g., studio vs. traditional college algebra) for their individual needs.
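As a rough illustration of the kind of analysis involved, the sketch below reduces a synthetic student-by-problem score matrix with a singular value decomposition and then clusters the reduced data with a small k-medoids routine. The data, the number of retained dimensions, and the number of clusters are invented for the example and are not the Center's actual parameters.

import numpy as np

# Hypothetical matrix: one row per student, one column per online-homework item,
# entries are scores from the first four weeks (sizes are purely illustrative).
rng = np.random.default_rng(0)
scores = rng.random((200, 60))

# Reduce dimensionality with a truncated singular value decomposition.
U, s, Vt = np.linalg.svd(scores - scores.mean(axis=0), full_matrices=False)
k = 5                                  # keep the first few singular directions
reduced = U[:, :k] * s[:k]             # each student becomes a point in k dimensions

# Minimal k-medoids (PAM-style) clustering on the reduced data.
def k_medoids(X, n_clusters, n_iter=50, seed=0):
    local_rng = np.random.default_rng(seed)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
    medoids = local_rng.choice(len(X), n_clusters, replace=False)
    for _ in range(n_iter):
        labels = dist[:, medoids].argmin(axis=1)                    # assign to nearest medoid
        new_medoids = medoids.copy()
        for c in range(n_clusters):
            members = np.where(labels == c)[0]
            if len(members):
                # the new medoid minimizes total distance to the other members
                new_medoids[c] = members[dist[np.ix_(members, members)].sum(axis=1).argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids

labels, medoids = k_medoids(reduced, n_clusters=5)
print(np.bincount(labels))   # size of each candidate student group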

This leads to the related retention issue of properly placing students. Placements based solely on ACT scores have been problematic, particularly since we now have around 150 freshmen annually placing into Calculus II by self-selection; the ACT exam includes no calculus (and relatively little pre-calculus). In 2009 we implemented an online placement exam based on the online homework system and asked all entering freshmen to take it before coming to orientation. Over 2800 students took the exam in 2009 and over 3200 this year. Working with two M.S. students, Nina Ostapyuk and Theang Ho, we matched the results from a prior year against success on exams in freshman math classes to develop a more accurate placement system. This led to more accurate placement of students into the correct first course, giving them a successful freshman year and improving retention rates and ultimately graduation rates.
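One simple way to build such a placement model, sketched below with entirely synthetic data, is to fit a classifier that predicts success in the first course from placement-exam subscores and then use the predicted probability to suggest a starting course. The subscores, cutoff, and course names here are hypothetical and are not the model actually used at K-State.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical prior-year data: per-topic placement subscores and an indicator of
# whether the student succeeded on exams in their first math course.
rng = np.random.default_rng(1)
placement_scores = rng.random((500, 4))                      # four made-up subscores per student
succeeded = (placement_scores.mean(axis=1)
             + 0.1 * rng.standard_normal(500) > 0.5).astype(int)

# Fit a simple model of course success from placement results ...
model = LogisticRegression().fit(placement_scores, succeeded)

# ... and use the predicted probability of success to suggest a first course.
new_student = rng.random((1, 4))
p_success = model.predict_proba(new_student)[0, 1]
suggested = "Calculus I" if p_success > 0.7 else "College Algebra"   # cutoff is illustrative
print(round(p_success, 2), suggested)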

Assessment

In order for assessment to be most useful, teachers need to be able to measure students' understanding in real time. We are building models that provide such feedback, updated as each assignment is passed back to the students. A Bayesian learning model, constructed from student performance, is being analyzed to uncover information that may help instructors gauge the understanding of the class as a whole and of individual students.

To measure student learning, we need clear goals about what it means to learn the material. Marilyn Carlson's group in Research Innovations in Mathematics and Science Education has devised a Precalculus Concept Assessment (PCA) that measures the conceptual understanding of precalculus students. However, the exam requires an hour to administer, so it is difficult to find time to give it at both the beginning and end of a course. Ph.D. student Drew Cousino and Dr. Andrew Bennett developed a method for obtaining this information in real time without requiring separate exams. They gave the PCA as both a pre-test and post-test in Studio College Algebra. By comparing online homework results, results from "clicker questions" during lecture, and problem-by-problem data from exams with PCA results, they have built a Bayesian model that tracks whether students are learning conceptually in something close to real time.
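The Q-Center's model itself is not reproduced here, but the sketch below shows the general flavor of such a Bayesian update, in the spirit of Bayesian knowledge tracing: a per-student probability of conceptual mastery is revised as each graded item (homework, clicker question, or exam problem) comes in. The slip, guess, and learning parameters and the item stream are invented for the example.

# A minimal sketch of one possible Bayesian update; parameter values are hypothetical.
P_SLIP = 0.10    # chance a student who understands the concept still misses an item
P_GUESS = 0.25   # chance a student who does not understand the concept gets it right
P_LEARN = 0.15   # chance the concept is learned between consecutive items

def update(p_mastery, correct):
    """Posterior probability of mastery after observing one graded item."""
    if correct:
        evidence = p_mastery * (1 - P_SLIP) + (1 - p_mastery) * P_GUESS
        posterior = p_mastery * (1 - P_SLIP) / evidence
    else:
        evidence = p_mastery * P_SLIP + (1 - p_mastery) * (1 - P_GUESS)
        posterior = p_mastery * P_SLIP / evidence
    # allow for learning between items
    return posterior + (1 - posterior) * P_LEARN

p = 0.3                                   # prior, e.g. from a pre-test
for outcome in [1, 0, 1, 1, 1]:           # homework, clicker, and exam items as they arrive
    p = update(p, outcome)
    print(round(p, 3))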

Longitudinal Research

For the last several years, we have been working with members of the Physics and Electrical Engineering departments to determine how the concepts students learn in their mathematics courses transfer to their major disciplines. Using qualitative research methods, we have been tracking a group of engineering students as they progress through the required math service courses.

Most students take mathematics because they need it later in their major, but we rarely assess how well our teaching actually helps the students after our class is over. Dr. Dean Zollman and Dr. Sanjay Rebello in Physics and Dr. Steve Warren in Electrical Engineering have been collaborating with members of the Q-Center to address this issue for several years. Using data-mining and clustering techniques, we have succeeded in demonstrating transfer between courses in the same discipline, but we have had significantly more difficulty demonstrating how students transfer ideas between disciplines (e.g., from math to physics). Consequently, we are adopting a more qualitative approach.

Differentiated Instruction

Students who have an interest in the context of an applied mathematical concept tend to retain the information longer and develop a deeper understanding of the material. We are attempting to appeal to different students' interests through "Choose Your Own Homework" assignments.

At several conferences on mathematics and other disciplines sponsored by the MAA subcommittee on Curriculum Renewal Across the First Two Years (CRAFTY), participants commented that it would be better if their students had more opportunities to do applied (word) problems in their own discipline. Members of the Q-Center, particularly Danielle McNaney, have developed some "Choose Your Own Homework" assignments that let students choose what context they want to use for applications. An example is shown below.

Select the type of problem you would like to answer: Business, Education, Political Science, or Agriculture.

The Business version of one such problem reads:

The Business Exponential worksheet shows the total number of Starbucks stores open at the end of each given year since 1994. In cell B3 enter the formula =A3-1994. Drag this formula down to cell B15. Column B now displays the number of years since 1994. Graph columns B and C on an x-y scatter plot.

(a) Using what you learned in Studio 4, add an EXPONENTIAL trendline with the equation and R² value displayed on the graph. What is the exponential equation that best models the data?

(b) Use the equation you found to estimate how many stores Starbucks will have open in 2015.

(c) Create a second trendline, this time a QUADRATIC function (polynomial of degree 2). Make sure to display the equation and R² value on the graph. Enter the equation in the space provided. Does the exponential function or the quadratic function appear to be a better model of the data? Refer to the techniques in the studio to decide which model, exponential or quadratic, is better.
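The worksheet itself is done in a spreadsheet, but the same exponential-versus-quadratic comparison can be sketched in a few lines of Python. The store counts below are made up for illustration and are not the worksheet's actual data.

import numpy as np

# Hypothetical store counts; x is years since 1994.
x = np.arange(0, 13)
y = np.array([425, 677, 1015, 1412, 1886, 2498, 3501, 4709, 5886, 7225, 8569, 10241, 12440])

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Exponential trendline y = a * e^(b x), fit on log(y) as a spreadsheet does.
b, log_a = np.polyfit(x, np.log(y), 1)
y_exp = np.exp(log_a) * np.exp(b * x)

# Quadratic trendline (degree-2 polynomial).
coeffs = np.polyfit(x, y, 2)
y_quad = np.polyval(coeffs, x)

print("exponential R^2:", round(r_squared(y, y_exp), 4))
print("quadratic   R^2:", round(r_squared(y, y_quad), 4))

# Estimate for 2015 (x = 21) from the exponential model, mirroring part (b).
print("2015 estimate:", int(np.exp(log_a) * np.exp(b * 21)))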

Normally, letting students pick their own problems would create an organizational nightmare for grading, but because the work was submitted online, grading actually became easier. Instead of having different graders handle different sections, we assigned them different problems and contexts. With no paper to handle and no hunting for where students had written their answers, grading went substantially faster for the Choose Your Own Homework assignments.

Student reaction, however, was not as positive as we had hoped. While some students enjoyed being able to choose their own context, others complained that it was difficult to type a mathematical explanation into a text box. Others objected that other work turned in online allowed multiple attempts, while these assignments, being hand-graded, gave them just one chance. While Dani has finished her work and is now teaching at a charter school in North Carolina, other members of the Q-Center team have continued working on these assignments (especially Dr. Andrew Bennett and Rekha Natarajan, the College Algebra Coordinator).

We found much more success when we revised the assignments so that the answers require less mathematical notation, and we began giving students one chance to fix their errors. Because of the efficiencies of managing the answers online, it actually took less time for graders to grade these assignments twice than to grade a traditional paper assignment once. One issue we are still addressing is that freshmen are not yet invested enough in their majors for problems in specific academic areas to appeal to them as much as we anticipated. We are therefore looking at revising the contexts to be more in line with freshman life.

 

Video Feedback

Because of the popularity of YouTube, many educators have suggested adding short video tutorials to online homework assignments. The Q-Center has investigated the effect these tutorials have on student learning.

At conferences on online homework systems, a common request is to add video. Q-Center members Cris Gawlik, an Ed.D. student, and Dr. Andrew Bennett responded by adding video feedback to a variety of problems from online assignments in College Algebra and in Math for Elementary School Teachers. We then examined the records of which students chose to view the videos, and Cris followed up by interviewing 15 students from Math for Elementary School Teachers about the online feedback, including the videos.

The main point we learned is that students are very sensitive to the length of the video. A video of 120 seconds will be watched, but a 270-second video seems to exceed their attention span. In general, students were less welcoming of video feedback than we expected. It appears that students are more willing to watch a video prior to beginning the assignment, when they are in "learning mode." Once the assignment begins, they switch to "getting it done mode" and don't want to do anything that seems to take them away from the specifics of finishing the assignment. If they have been unsuccessful three or more times on an assignment (really six or more, since they get two shots at each problem set), they will reluctantly conclude they need to learn more to complete the assignment and become more willing to watch a video. This leaves us with the issue of how to design a 120-second video that can help a student who is struggling badly with the material.

How Students Behave Online

We have centuries of experience in organizing material for a "real-world" classroom, but students working online bring different attitudes, and we need to understand and use those attitudes to improve student learning.

Decisions about structuring the online homework assignments have been heavily influenced by observing how students behave online. For example, assignments that can be completed in about 20 minutes (less for College Algebra and longer for Differential Equations) tend to maximize student effort. Assignments that require less time lead to students finishing quickly without necessarily attaining mastery, while longer assignments prompt students to settle for lower scores instead of going back and trying again. We have started studying how different ways of configuring the assignments affect student behavior. Ed.D. student Bill Weber (now at FHSU) has looked at how changing the scoring, from simply giving students their best score overall to requiring a certain minimum score on the assignment, affects student behavior. While this is a fairly minor change, it did increase the likelihood of students trying to improve their scores. Interestingly, it did not do so uniformly for different types of students. Using the classifications discussed in the section on Data Mining above, we found that minor changes in the structure of the assignment made for relatively large behavioral changes for Overachievers and Sisyphean Strivers but were ignored by other students. This is helping us understand how to adapt online (and offline) instruction to promote student learning for the different types of students.
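As a small illustration of the two scoring configurations compared in that study, the sketch below applies both rules to one hypothetical student's attempt history; the scores and the 80% threshold are invented for the example and are not the actual course settings.

# Illustrative comparison of the two scoring rules (all numbers are made up).
attempts = [0.55, 0.70, 0.85]            # one student's scores across repeated attempts

def best_score(attempts):
    """Original rule: the assignment grade is simply the best attempt."""
    return max(attempts)

def minimum_required(attempts, threshold=0.80):
    """Revised rule: credit is recorded only once some attempt meets the minimum score."""
    return max(attempts) if any(a >= threshold for a in attempts) else 0.0

print(best_score(attempts), minimum_required(attempts))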