HIED 840: Discussion Posts
Lesson 1 & 2
Lesson 1: Initial post is due Sunday at 11:59 p.m. ET of Week One.
To begin our journey in assessment, let’s share our previous experience in this area. Please first introduce yourself with your name, institutional affiliation (if any), and academic and/or professional background. Then, respond to one of the two options below:
• Option 1: Have you ever designed, implemented, or been involved in assessment of student learning or program effectiveness? What was the purpose of the assessment? Did the assessment address any particular issue or concern? How was it implemented? Who participated in it? What were the major findings? What have you learned about assessment?
• Option 2: If you have not done the above, think about other related experiences. Have you ever participated in a survey, focus group, interview or consultation that could be used to assess a program or student learning? The topics could be student performance or achievement, health and wellness, evaluation of senior administrators, campus environment and safety, student services, student engagement, leadership development, service learning, etc. Try to respond to the same questions listed under Option 1.
Lesson 2: Follow-up posts are due Sunday at 11:59 p.m. ET of Week Two.
Read and comment on posts submitted by your peers. How are their experiences different from and similar to yours? Share comparative comments, suggestions, and ideas. Be sure to respond to the comments/questions that you receive.
Grading Rubric:
The attached Discussion Forum Rubric will be used for grading purposes.
Hi everyone, I am currently a high school Chemistry and Biology teacher in Wuhan, China. I graduated from Meredith College in 2015 with a B.S. in Biology and a minor in Studio Art. I have been a science teacher for the past five years in China (3.5 in Beijing and 1.5 so far in Wuhan). I enjoy working with Chinese students and hope to eventually continue working with international students within a higher education setting. I also enjoy research, so I’m hoping to pursue the IR Certificate as well.

I have never worked for a higher education institution, so I have not been involved in assessment of student learning at that level, but I have been involved in assessment of student learning at the high school level. As a teacher, I am consistently collecting evidence of individual student learning through both summative and formative assessment. I will use a Fall 2019, 21-week semester Chemistry class with 36 students as an example. For summative assessments, I mainly collect test, quiz, homework, and lab report grades. I collect 15 test grades, one per chapter plus a midterm and a final exam. The students generally receive about 18 quiz and homework grades and 16 lab report grades. For formative assessments, I observe the students’ performance in lab procedures, class projects, presentations, and class participation. Generally, I am assessing collaborative skills, instances of critically thinking through problems, and topical questions. For projects and presentations, the students receive rubrics before they complete the project and grades based on those rubrics. Finally, I document subjective general notes on each student’s behavior, attitude, and other noteworthy attributes during week 12 and week 21.

The purpose of these assessments is to gauge the students’ engagement in class activities, extent of material mastery, and level of preparation for their external examination. The assessments generally highlight areas of concern rather than address concerns. At the end of the 21-week semester, each student receives an overall grade for coursework and quizzes, exams, and conduct, along with a final comment (a rough illustration of how such category grades might be combined appears after the reference below). I also provide a prediction for the student’s external examination score. This is then sent to the administration in order to place the students in appropriate classes and help them in their assessments. For this particular class, the findings reflected that the students were strong in lab skills but weak in both formal testing skills and homework completion, so their predicted external assessment scores were quite low. Following the assessments, I adjusted my assignments and collaborated more closely with their homeroom teachers to ensure students were being held accountable for their work.

I have learned a fair amount about assessment through my few years of teaching and completely agree with the textbook that it is a continuum with “three phases: planning, implementing, and improving and sustaining the process” (Banta & Palomba, 2015, p. 15). As I am not an administrator, I am not involved in institutional evaluations or program effectiveness, but I look forward to exploring assessment through that lens.

Reference

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.
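As a rough illustration of how category grades like these might be rolled into a single course grade, here is a minimal Python sketch. The weights and scores below are made up for the example; the post above does not specify the actual weighting scheme.

```python
# A minimal sketch, assuming hypothetical category weights; the post does not
# say how tests, quizzes/homework, lab reports, and conduct are actually combined.
def overall_grade(tests, quizzes_homework, lab_reports, conduct,
                  weights=(0.40, 0.25, 0.25, 0.10)):
    """Combine category averages into a single 0-100 course grade."""
    category_averages = [
        sum(tests) / len(tests),
        sum(quizzes_homework) / len(quizzes_homework),
        sum(lab_reports) / len(lab_reports),
        conduct,
    ]
    return sum(w * avg for w, avg in zip(weights, category_averages))

# Example with the grade counts described above: 15 tests, ~18 quizzes/homework,
# 16 lab reports, plus a single conduct score (all scores here are invented).
print(round(overall_grade([82] * 15, [90] * 18, [88] * 16, 95), 1))  # -> 86.8
```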
Assessment Plan Part 1
Lesson 2: Initial post is due Sunday at 11:59 p.m. ET at the end of Lesson Two.
As an individual student, you will participate in a whole class Discussion Board. You will write a post of approximately two pages (double spaced) about the process of re-writing and improving the learning outcomes with your group and what you as an individual learned about writing learning outcomes. You will receive an individual grade on this assignment.
Lesson 3: Follow-up posts are due Sunday at 11:59 p.m. ET at the end of Lesson Three.
Read and comment on posts submitted by your peers. How are their experiences different from and similar to yours? Share comparative comments, suggestions, and ideas. Be sure to respond to the comments/questions that you receive.
In Adelman’s introduction, he discusses the importance of “Tuning” when developing student learning outcomes. Tuning “is an effort by faculty in specific disciplines to craft a template of subject-matter reference points and student learning outcomes in their fields” (Adelman, 2015, p. 4). This is a practice deeply reliant on collaboration and cooperation with peers within a department and interdepartmentally. With this in mind, working together with classmates to re-write and improve the learning outcomes of the M.Ed. Higher Education program at Penn State is a relevant exercise that will surely help me if I am tasked with developing student learning outcomes in the future.

Our group utilized a Google document, Zoom, and now Microsoft Teams to work synchronously and asynchronously on re-writing and improving the learning outcomes. Through the use of comments, questions, and group editing throughout the document, the process of revision is clearly documented. We began by listing verbs that are unclear and should be avoided and discussing the possibility of including specific examples to illustrate the learning outcomes. Adelman’s text was a valuable resource while restructuring the verbs and syntax of the statements. Next, we evaluated the learning outcomes by addressing the eight questions in Cornell University’s 8-item review checklist (Banta & Palomba, 2014). These questions complemented Adelman’s use of diction to ensure the outcome statements are “SMART”: specific, measurable, attainable, results-focused, and time-focused or time-framed (Banta & Palomba, 2014, p. 70). We then reviewed the learning outcome examples from Penn State and East Carolina University and found the length of the statements and the level of specificity to be particularly helpful (Penn State World Campus, n.d.). Bloom’s Revised Taxonomy was helpful in ensuring the tone of the learning objectives reflected the appropriate level of thinking (Anderson & Krathwohl, 2001).

The discussions between our group members included specific questions, constructive feedback, and group agreement on the answers. For example, the question of including a specific time frame for the learning outcomes sparked thoughtful feedback from each member of the team, with references supporting their opinions on the subject. This careful evaluation of each detail of the project fosters trust in the members of my team and an overall tone of respect.

I learned that the word chosen to describe an outcome must be a specific, active verb. This task seems fairly straightforward while reading about its importance in the required texts, but actually identifying descriptive verbs and restructuring the sentences to clearly reflect expectations is challenging (a toy illustration of this kind of verb check appears below). I was also surprised by how complex this task seemed when I was trying to identify the best vocabulary and sentence structure on my own. The ability to list many specific verbs and then work together to make the objectives concise is what made our team’s list of objectives so strong. As I continue learning about assessment within higher education, I am excited for the opportunity to work with a set team of peers throughout the semester who have already exhibited diligence in completing assignments and in defending and challenging each other’s ideas.
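To make the verb-screening step concrete, here is a toy Python sketch of the kind of check our group did by hand. The list of vague verbs is a hypothetical example for illustration only, not an official list from Adelman or from Bloom’s taxonomy.

```python
# A toy sketch: screening a draft outcome statement for vague verbs that are
# hard to measure. The verb list is a hypothetical example, not an official checklist.
VAGUE_VERBS = {"understand", "know", "appreciate", "learn", "be aware of"}

def flag_vague_verbs(outcome: str) -> list:
    """Return any vague verbs found in a draft learning-outcome statement."""
    lowered = outcome.lower()
    return [verb for verb in VAGUE_VERBS if verb in lowered]

draft = "Students will understand the principles of program assessment."
print(flag_vague_verbs(draft))  # ['understand'] -> replace with an action verb such as 'analyze'
```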
References

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing (abridged ed.). Boston, MA: Allyn and Bacon. https://www.apu.edu/live_data/files/333/blooms_taxonomy_action_verbs.pdf

Banta, T. W., & Palomba, C. A. (2014). Engaging faculty and students in assessment. In Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Penn State World Campus. (n.d.). Lesson 2: Assessment and learning outcomes. In HIED 840 Canvas: Assessing Student Outcomes and Evaluating Academic Outcomes: Spring 2021. https://psu.instructure.com/courses/2114043/modules/items/31057881
Lesson 5 & 6 - Survey Review
Note: this discussion will span Lessons 5 & 6.
For Lesson 5: Initial post is due Sunday at 11:59 p.m. ET at the end of Lesson 5.
Conduct an Internet search and find a locally designed, high-quality survey from an institution you are/were affiliated with. These are the types of surveys you might look at:
• Learning Outcomes Survey
• Student Evaluation of Teaching
• Campus Climate Survey
• Exit Surveys (at institution level)
• Alumni Survey (institutional wide or by academic programs)
• Employer Survey (by student affairs or academic department)
Then explain why it is a well-designed survey and provide evidence to support your argument. Your discussion post should include the following information:
1. Name of the institution and the survey
2. Website link to the survey (or upload the survey as an attachment)
3. Purpose of the survey
4. Survey population and data collection process
5. Examples of best practices in survey design
6. Use of the survey results
7. Possible improvement to the survey
8. Would you recommend this survey to other institutions? Why?
For Lesson 6: Follow-up post is due Sunday at 11:59 p.m. ET at the end of Lesson 6.
For this portion of the discussion forum you will be reviewing the information presented by your fellow classmates in Lesson 5 and engaging them in discussion. Questions to prompt that discussion include:
• Do you agree that the identified survey is a “quality” survey? Why or why not?
• Do you have any experience with the surveys identified by your classmates?
  ◦ Did you have similar results?
  ◦ If you had issues, what were they and how did you address them?
Name of Institution and Survey: North Carolina State University (NCSU) Exit Survey for departing employees
Survey Example: Employee Exit Survey (at institutional level)
Purpose of the survey: To assess environmental factors that may contribute to employee satisfaction, retention, and turnover.
Survey population and data collection process: The survey population is departing employees of the university, including “all full-time and part-time SHRA, EHRA and Post-Doc employees separating from the university” (Employee Relations, 2012, para. 2). They are asked to provide feedback about their experience at the university and their reasons for leaving. The survey is embedded in the offboarding process. The employee’s department is responsible for distributing the survey, which may be taken online with automatic submission, completed as an electronic PDF and emailed to employeerelations@ncsu.edu, or printed as a hard copy and mailed to the Employee Relations physical address.
Examples of best practices in survey design: The survey is presented in sections with clear and succinct subheadings. Each question or prompt within a section is relevant to its subheading. Each prompt is formatted as a question. The possible answers are straightforward, organized, and often repeat the key word the question is assessing. Simple and specific vocabulary is used throughout the survey. The open-ended questions are short and include specific guidance so respondents comprehend the question.
Use of the survey results: The results of the survey accompany notes from the departing employee’s exit interview and are “forwarded to the appropriate administrator (Vice Chancellor or Dean) 30 days after separation” to be used to improve environmental factors that may contribute to employee turnover (Employee Relations, 2012, para. 1). The survey asks questions to evaluate reasons for leaving, workload, salary and benefits, administration, supervisors, communication, future plans, and comments and recommendations. The responses to these prompts should provide a narrative of the employee’s experience to aid administrators in improving the department or addressing problems.
Some possible improvements to the survey:
Shorten the survey. The survey covers a wide range of topics and is eight pages long. The length of this survey could result in non-response error for some sections. The four short-answer questions are the last survey prompts, so they likely yield less reliable feedback than if they were asked at the beginning of the survey.
Would you recommend this survey to other institutions? Why? Yes. I think the flexibility in taking and submitting the survey makes it accessible to employees of all ages. The questions are clearly written and address the employee’s personal and professional reasons for leaving as well as their opinion on the institution’s direction as a whole. The answer choices are specific to NC State, but they could easily be adapted to suit another institution by switching the name of the institution. I especially appreciate the wide range of content covered in the questions, with only four short-answer prompts. The survey displays many instances of best practices in survey design.
References:
Employee Relations. (2012). Exit interview program. North Carolina State University. https://er.hr.ncsu.edu/exit-interview-program/
Lesson 9 & 10 - NSSE
Note: This discussion will span both Lessons 9 & 10.
Topic: Review the NSSE 2018 Multi-Year Report of Volkwein University (NSSE18 Multi-Year Report VU FR.xlsx), which compares Engagement Indicator scores and High Impact Practice participation rates of its first-year students between 2015 and 2018.
For Lesson 9: Initial post is due Sunday at 11:59 p.m. ET at the end of Lesson 9.
Address ALL of the following questions in your initial post:
1. Summarize 4-5 significant changes you have noticed in student engagement indicators. What do the results show?
2. What questions do you have when you review the data?
3. If this happened on your campus, who would you collaborate with to discuss the data, identify potential factors that impact student engagement, and improve student engagement?
4. Based on the results, what actions would you suggest to improve student engagement at VU? Consider academic and student support actions.
For Lesson 10: Follow-up posts are due Sunday at 11:59 p.m. ET at the end of Lesson 10.
Consider the initial posts from Lesson 9. What patterns did your classmates uncover that you didn’t highlight? What new perspectives have you gained? Do you have additional suggestions for your classmates?
I noticed significant changes in the following student engagement indicators (EIs).
I chose these seven student EIs because they showed significant differences in their mean values. Interestingly, the EIs that showed significance fell under either the Academic Challenge theme or the Experiences with Faculty theme. However, the five I will focus on are those within the Academic Challenge theme. There were several other significant differences of means, but I chose to focus on Academic Challenge and Experiences with Faculty to frame my overall evaluation. It appears there was a conscious effort by the academic departments to increase academic challenge. None of these changes are inherently good or bad, but understanding them could be indicative of first-year students’ (FYS) participation in High Impact Practices (HIPs) over the course of their college career.

To explain my methodology for calculating p-values, I’ll walk through one example. When evaluating the Assigned Writing EI, there appears to be a significant difference between the mean number of pages calculated from the three survey questions. To test whether the difference between 2018 and 2015 is significant, we can run a two-sample t-test for the difference in means. The null hypothesis is that the two population means are equal, and the alternative hypothesis is that they are not equal. After running the (two-sided) test, the test statistic is 2.29 with degrees of freedom (DF) = 1,346. The p-value is 0.02198. Therefore, at the 5% significance level, we reject the null hypothesis and conclude that there is a difference between the mean number of pages between 2015 and 2018 (a short sketch reproducing this p-value appears after the references below).

I have questions regarding the strategy of the institutional leaders because only half of the HIPs measured experienced an increase in FYS engagement. Of the categories that did not increase, FYS participation in research with faculty was stagnant between 2015 and 2018, internship or field experience (plan to do) decreased by 4%, and study abroad (plan to do) decreased by 7%. It appears as though the institution drastically increased the rigor of its academics while the quality of interactions and student-faculty interactions declined slightly, which in turn may have influenced student participation in high-impact practices.

References

NSSE. (2018). Annual results. http://hdl.handle.net/2022/23391

The Trustees of Indiana University. (2020). Engagement indicators. Evidence-Based Improvement in Higher Education. https://nsse.indiana.edu/nsse/survey-instruments/engagement-indicators.html
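To show how the p-value above can be reproduced, here is a short Python snippet using SciPy’s t distribution; the test statistic and degrees of freedom are the values reported in my walkthrough, and nothing else is assumed.

```python
from scipy import stats

# Reproducing the two-sided p-value from the reported test statistic and
# degrees of freedom (t = 2.29, df = 1,346), as stated in the post above.
t_stat = 2.29
df = 1346
p_value = 2 * stats.t.sf(abs(t_stat), df)  # two-tailed area under the t distribution
print(f"p = {p_value:.5f}")  # roughly 0.022, below the 0.05 significance level
```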