Summary
I completed the actions outlined in my reflective memo from last year. I continued to use various active learning techniques (PRS and concept tests) and forms of regular feedback (e.g. mud cards and concept test results). One new addition was a quiz recap directly following completion of the quiz. Based on student performance and subject evaluations, I believe that most of the students met most of the learning objectives. One exception is their understanding of the frame dependence of stagnation quantities, which remains inadequate; this is a long-standing challenge.
Learning Objectives
1. What are the learning objectives (expressed as measurable outcomes) for this subject?
The subject learning objectives are listed on the course Web page. I have no changes to recommend to these.
2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?
I did not specifically implement any CDIO syllabus items, since these are largely covered in the systems portion of Unified. However, several of the in-class experiments I did required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis (see the in-class demonstrations/experiments under item 3 below). I would say both skills were taught at the "Introduce" level.
Teaching and Assessment Methods
3. What teaching and assessment methods did you use and what evidence indicates these methods were successful or not?
- Prepared lecture notes were available on the Web for all of the material. These notes have evolved over several years, starting from a set of handwritten lecture notes; each year I augment them where mud responses and other feedback reveal specific areas of difficulty. This year, I worked with Robin to increase the font size of the embedded equations so they are more readable, and I added to the discussions of cp and cv and of the frame dependence of stagnation quantities. In general I am quite happy with the notes. In the end-of-term evaluations (PDF), 97% of the respondents rated the Web page (for all of Unified) Very Effective (70%) or Somewhat Effective (27%), and 100% rated the prepared lecture notes (for all of Unified) Very Effective (88%) or Somewhat Effective (12%).
- I used 25 concept questions over the 12 lectures, with responses collected on the PRS system. Student performance on and answers to these questions were posted on the Web page. I continue to find these very useful for engaging the class in the material while I am lecturing. 100% of the respondents on the SEF rated the lectures Very Effective (64%) or Somewhat Effective (36%), and 95% rated the in-class exercises Very Effective (53%) or Somewhat Effective (43%). Several positive comments about the PRS/concept questions also appeared in the written comments (PDF) from the end-of-term evaluations. In general my teaching reviews were good, so I think the students found the lectures helpful.
- I used mud cards for each lecture, responded to them the evening the lecture was delivered, and posted the responses on the Web, linked to the relevant areas of the online notes. See for example the T1 mud responses. 82% of the respondents on the end-of-term evaluations rated the mud cards Very Effective (30%) or Somewhat Effective (53%), so the majority found them only somewhat effective. Nonetheless, I still found the mud cards valuable for providing feedback on each lecture; even when there were very few respondents they were helpful, so I will continue to use them.
- I wrote short assessments of how each lecture went. See for example the T3 mud responses. These were mostly helpful for me, although I also used them to stress the important points from each lecture. I am not sure how many students read these responses; in general, I think we have saturated the students in terms of available material on the Web, and further information goes unread.
- I did four small demonstrations/in-class experiments. The students seemed to like these activities since they allowed them to apply the subject material to a real problem. All followed the same form: I asked the students to estimate something (using the concepts from class), then ran the experiment, and then discussed the result in light of their estimates. The activities thus had three primary objectives: to engage the students in the material we were working on and show them how to apply it, to highlight the various simplifications and assumptions in the engineering models we use, and to give the students practice in estimating the parameters required as inputs to the models (e.g. the volume of the room, or the weight of something).
- I had a formal self-assessment activity during the last recitation, in which I asked the students to grade themselves on the subject learning objectives. I did this largely as an effort to get them to think more deeply about the success criteria in advance of the exam, since, based on past history, their self-assessments do not correlate well with their exam performance.
- I wheeled the CFM56 into class for one of the lectures. The students very much enjoyed this. I find that it prompts a degree of interest and a depth of questioning that I do not get otherwise.
- I used homework and exams to assess student learning. Each homework problem and exam question was coded to specific subject learning outcomes. The overall weighting was 10% class participation, 30% homework, and 60% quizzes. In addition, the in-class and self-assessment use of the PRS system, together with the collection of and responses to all of the mud cards, gave me a large amount of data on class performance. As in the past in Unified, we also collected data on time spent on various activities (its primary use is to make sure the class stays within bounds).
Data Source | Formative | Summative | Feedback for Me |
---|---|---|---|
PRS System | X | . | X |
Time spent | X | . | X |
Self-assessment | X | . | X |
Muddiest Part of the Lecture | X | . | X |
Homework | X | X | X |
Quizzes | . | X | X |
End-of-term SEF data | . | . | X |
Class attendance | X | . | . |
Student Learning
4. How well did the students perform on each subject learning objective? (Where possible, make reference to specific data to support your conclusion.)
Performance on the homework and time spent (see plots below) were both good, with students on average performing well. Performance on the quiz is shown in the table below, with each question labeled by the learning objectives it addressed. Middle-B performance for the quiz was 78%, and the class average (75.4%) was close to this standard; a sketch of the weighted-average calculation appears after the table. More than half of the points on the quiz were devoted to assessing conceptual understanding (versus the mechanics of solving a problem), consistent with the learning objectives.
- To be able to state the First Law and to define heat, work, thermal efficiency, and the difference between various forms of energy.
As evidenced by the performance on questions 1e, 1f, and 1h, most of the students achieved this learning objective. One deficiency surfaced with flow work and shaft work (1d), but I think that, had I directly asked what these forms of work were, the students would have done well; this particular question asked for an extension of these ideas and their application to a cycle.
- To be able to identify and describe energy exchange processes (in terms of various forms of energy, heat and work) in aerospace systems.
- To be able to explain at a level understandable by a high school senior or non-technical person how various heat engines work (e.g. a refrigerator, an IC engine, a jet engine).
Both of these learning objectives were well addressed in homework #1 and on quiz question #2. I was pleased with the student performance in this area. In particular, the students took home the message (delivered in lecture and recitations) that they were required to describe the workings of heat engines in terms of energy. Reading the answers on the quiz (from most, but not all, students) was a pleasant surprise.
- To be able to apply the steady-flow energy equation or the First Law of Thermodynamics to a system of thermodynamic components (heaters, coolers, pumps, turbines, pistons, etc.) to estimate required balances of heat, work and energy flow.
Historically the students do very well on this learning objective, since it stresses the mechanical elements of solving a thermodynamics problem rather than conceptual understanding; the very good performance on quiz problem 3 is evidence of this. There was, however, a weakness with the frame dependence of stagnation quantities, as seen on problem 4 (see the note after this list). Because I had one fewer lecture this year (it was replaced with the review session after the quiz), I spent a little less time on this topic. It is always the most difficult topic for the students conceptually, and their performance suffered.
- To be able to explain at a level understandable by a high school senior or non-technical person the concepts of path dependence/independence and reversibility/irreversibility of various thermodynamic processes, to represent these in terms of changes in thermodynamic state, and to cite examples of how these would impact the performance of aerospace power and propulsion systems.
The students performed well in this area, as evidenced by quiz questions 1a, 1b, 1c, and 1g. Indeed, the biggest change from last year was the performance in understanding the requirements for and implications of quasi-equilibrium. This was the weakest part of the quiz performance last year, so I focused on it more in lecture, to good effect I think.
- To be able to apply ideal cycle analysis to simple heat engine cycles to estimate thermal efficiency and work as a function of pressures and temperatures at various points in the cycle.
Their homework performance and quiz performance (3a, 3b, 3c) on this learning objective were very good.
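For context on the frame-dependence difficulty noted under the steady-flow objective above (my addition, not part of the subject notes): assuming a calorically perfect gas, the issue can be traced to a single relation,

$$T_t = T + \frac{V^2}{2 c_p}$$

The static temperature T is the same for all observers, but the velocity V is measured relative to a particular frame of reference, so observers moving relative to one another assign different stagnation temperatures (and hence different stagnation pressures) to the same flow.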
Quiz | 1a | 1b | 1c | 1d | 1e | 1f | 1g | 1h | 2 | 3a | 3b | 3c | 4a | 4b | Total |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
LO# | 4,5 | 4,5 | 4,5 | 1,4 | 1,4 | 1,4 | 5 | 1 | 2,3 | 4,6 | 4,6 | 4,6 | 4 | 4 | na |
Mean | 95% | 82% | 78% | 41% | 84% | 85% | 85% | 93% | 75% | 98% | 94% | 80% | 63% | 35% | 75.4% |
Weight | 5% | 5% | 5% | 5% | 5% | 5% | 5% | 5% | 16% | 6% | 12% | 6% | 8% | 12% | 100% |
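As a sanity check on the table (my addition; the means and weights are copied from the rows above), the Total is the weight-weighted average of the per-question means. A minimal sketch in Python:

```python
# Per-question means (percent correct) and weights (percent of quiz score),
# copied from the quiz table above, in question order 1a..4b.
means   = [95, 82, 78, 41, 84, 85, 85, 93, 75, 98, 94, 80, 63, 35]
weights = [ 5,  5,  5,  5,  5,  5,  5,  5, 16,  6, 12,  6,  8, 12]

# Weighted average: sum of weight * mean, divided by the total weight (100%).
total = sum(w * m for w, m in zip(weights, means)) / sum(weights)
print(f"{total:.2f}%")  # -> 75.35%, consistent with the table's 75.4%
# (a small difference is expected since the per-question means are rounded)
```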
Continuous Improvement
5. What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?
- The major action was introducing a quiz recap/discussion directly following completion of the quiz, under the presumption that this was a "teachable moment". Most of the students found this helpful, as indicated by the responses to the PRS question "I found the quiz review after the exams helpful, and think they are a better use of time than adding another lecture; 1=strongly agree, 2=agree, 3=neutral, 4=disagree, 5=strongly disagree" shown below. I too found this to be a good addition, worth the trade of an additional lecture hour. It enabled me to explain how the quiz was designed, to address questions, and to discuss the scoring rubrics.
- I improved my delivery of quasi-equilibrium processes (as evidenced by the exam performance and mud responses) by focusing on the physical implications and on physical examples. The students had a much easier time understanding the time to reach equilibrium when it was applied to a coffee cup (a one-line statement of the underlying criterion appears after this list).
- I worked with Robin to increase the font size of the embedded equations in the Web notes so they are more readable.
- I also added new discussions of cp and cv and of the frame dependence of stagnation quantities to the notes, based on previous years' mud responses.
- I improved my presentation of cv and cp, in particular the implications of their being functions of temperature only for ideal gases (see the note after this list). I did this with reference to an experiment; the physical example seemed easier for the students to grasp. I have added this example to the notes.
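For reference (my addition; these are standard results, not necessarily the exact presentation used in lecture), the physics behind the quasi-equilibrium and cv/cp items above can each be stated in one line. A process is quasi-equilibrium when it is slow compared with the time the system takes to re-equilibrate,

$$\tau_{\text{process}} \gg \tau_{\text{equilibration}}$$

and for an ideal gas the internal energy depends on temperature alone, $u = u(T)$, so

$$c_v = \frac{du}{dT}, \qquad c_p = c_v + R$$

are both functions of temperature only.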
6. How do you use feedback from students and colleagues to improve your subject?
The most valuable feedback I get is class performance on the PRS questions, which I use in real time to modify my lectures. Second most valuable is the feedback from the mud cards, which I use to modify the next lecture and to update the course notes.
7. What will you continue or change?
- I will continue to use the suite of teaching and assessment methods.
- I will continue to schedule a quiz recap directly following the quiz.
- I will add some additional discussion to the notes focusing on the physical implications of quasi-equilibrium processes. This went well in lecture this year and I need to capture it in the notes.
- I will continue to seek a new strategy for discussing the frame dependence of stagnation quantities. I will schedule a meeting with Darmofal and Greitzer to see if we can brainstorm some ideas.
Information Sharing
8. To whom have you forwarded this reflective memo?
- Prof. Zoltan Spakovszky (16.050 instructor)
- Students from this class
- Students who will take the class next year
- Other members of the Unified staff