Determining progress in writing competency by assessing students’ argumentation
2016 (English). In: Proc. 46th ASEE/IEEE Frontiers in Education Conference, Piscataway, NJ: IEEE Press, 2016. Conference paper (Refereed).
A central problem in evaluating the quality of education in professional competencies, such as writing skills, is detecting and measuring progression. We have previously defined course-level learning outcomes for academic writing competency in computer science; these are used in a writing across the curriculum (WAC) / writing in the disciplines (WID) program. However, to assess whether the program is effective, i.e., whether the participating students’ writing skills progress throughout their education, we need a different set of criteria. Such criteria must capture the quality of a text from an academic perspective. They must also be easy to apply, and it must be possible to compare evaluations of different texts.
There are many, sometimes conflicting, definitions of what ‘good academic writing’ or ‘quality’ in academic communication is. In this paper, we define it in terms of how the material is structured, how well arguments are presented, and how critical thinking is used to strengthen those arguments. Under this definition, argumentative skill can serve as an indicator of quality in academic communication. Our criteria for measuring writing competency are thus heavily based on assessing students’ use of argumentative skills in written texts, and are similar to criteria previously used to assess the quality of student participation in classroom discussions.
This paper presents a framework for quantitative and qualitative evaluation of texts written by computer science students. We have related the criteria in our framework to general definitions of academic writing and to our previously defined goals for writing competencies. The framework provides a grading scheme that can be used to assign a score to a text, corresponding to the level of academic quality exhibited in that text. The results of our framework thus enable comparisons between different student texts. We have used the framework to evaluate writing progression for a group of IT engineering students over three years.
Keywords: Proficiency assessment, Computer Science Education, Writing across the curriculum
National category: Educational Sciences; Computer Science
Research subject: Computer Science with specialization in Computer Science Education Research
Identifiers: URN: urn:nbn:se:uu:diva-304403; OAI: oai:DiVA.org:uu-304403; DiVA: diva2:1032927
Conference: FIE 2016, October 12–15, Erie, PA