Exploring the attitude of ESP learners towards using automated writing evaluation to assess their writing

Tamer Gamal Ahmed Abd El Rasoul, Marwa Adel Aboelwafa, Abeer Refky Seddeek

Abstract


This study explores the attitudes of English for Specific Purposes (ESP) learners towards using automated writing evaluation (AWE) to assess their writing. A mixed-methods approach combining qualitative and quantitative data was employed. The sample consisted of 201 second-year students from the College of Engineering at the Arab Academy for Science, Technology and Maritime Transport, Egypt. A post-experiment questionnaire was used to investigate the students' attitudes towards using AWE to assess their writing. The results revealed that the students held positive attitudes towards the AWE software Grammarly, since it encouraged them to self-correct their errors and revise their writing before submitting it to their teachers. Based on these findings, further research is recommended on the pedagogical use of AWE tools in writing classes and on writing instructors' attitudes towards using such tools.


Received: 20 March 2023

Accepted: 29 April 2023

Published: 07 June 2023


Keywords


Automated Writing Evaluation (AWE); Assessing ESP Writing Performance; English for Specific Purposes (ESP).


DOI: http://dx.doi.org/10.21622/ilcc.2023.03.1.157



Copyright (c) 2024 Tamer Gamal Ahmed Abd El Rasoul, Marwa Adel Aboelwafa, Abeer Refky Seddeek

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Insights into Language, Culture and Communication
E-ISSN: 2812-491X
P-ISSN: 2812-4901 

Published by:

Academy Publishing Center (APC)
Arab Academy for Science, Technology and Maritime Transport (AASTMT)
Alexandria, Egypt
ilcc@aast.edu