Show simple item record

dc.contributor.author	Katubiya, Gladys
dc.contributor.author	Soeder, Alison
dc.contributor.author	Stewart, Tracy
dc.contributor.other	Dan Su
dc.date.accessioned	2022-05-02T20:12:14Z
dc.date.available	2022-05-02T20:12:14Z
dc.date.issued	2022
dc.identifier.uri	https://hdl.handle.net/11274/13626
dc.description	Texas A&M University-Commerce	en_US
dc.description.abstract	A significant part of university programs is the institution's responsibility to influence its students' values, skills, and attitudes through departmental goals, objectives, and priorities. At Texas A&M University-Commerce, these objectives are updated and managed in a digital assessment management platform. This study analyzes the differences in user feedback between the first year of campus implementation and the second year of use of the assessment software. The data gathered from software evaluations demonstrate changes in the number of respondents, the number of satisfied users, the number of dissatisfied users, and software strengths and weaknesses. The study uses qualitative and quantitative feedback; descriptive statistics and t-tests are applied to compare the two years of data, identify trends, and make recommendations. Participants included Institutional Effectiveness (IE) Authors and individuals involved in the IE review process who completed and submitted the IE Fall Feedback Survey. The study assessed Likert-scale and open-ended questions addressing experiences with the assessment software and also measured users' satisfaction with the platform's features and training resources. Results are evaluated by calculating percentage distributions and frequencies. The study shows a decrease in total respondents between 2020 and 2021. In 2021, more than 80% of respondents were satisfied with the communication and information about the year's IE cycle, similar to the results documented in 2020. In both years, respondents found the platform productive and well-structured. More respondents in 2021 than in 2020 noted dissatisfaction with some of the software's features. The results guide the Institutional Effectiveness and Research department in providing recommendations on how best to use the assessment software, as well as bringing attention to assessment obstacles and ensuring efficient monitoring of student learning outcomes and assessment data.	en_US
dc.description.abstract	2nd Place Winner for Education, Humanities, Social Sciences, and Business
dc.language.iso	en_US	en_US
dc.title	A Comparative Analysis of Year One and Year Two Usage of a Data Management Program as an Assessment Tool	en_US
dc.type	Presentation	en_US

