Research Article

Analysis of Raters’ Scoring Behaviours in the Assessment of Writing

Year 2023, Volume: 9 Issue: 2, 362 - 404, 23.10.2023
https://doi.org/10.31464/jlere.1328127

Abstract

This study aims to obtain descriptive data on the scoring behaviours of foreign language teachers in the assessment of writing. In this case study, data on the teachers' demographic characteristics, the scores they assigned, and the discourse describing their scoring process were collected through a semi-structured interview form. The descriptive analysis of the data revealed that the teachers' scoring behaviours varied greatly, that most of the teachers did not use a scale during scoring, that those who did use a scale developed various scales with different sections and point values, which indicates that the teachers were indecisive, and that the teachers mostly focused on the formal (grammatical) dimension of the written product. The study also revealed a significant relation between the raters' scores and their gender, as well as between their scores and their institution.
Keywords: foreign language teaching, writing assessment, scoring, raters

Supporting Institution

-

Project Number

-

Thanks

-


Yazma Becerisinin Değerlendirilmesinde Puanlayıcı Davranışlarının İncelenmesi


Abstract

The aim of this study is to obtain descriptive data on the scoring behaviours of foreign language teachers in the assessment of writing. In this qualitative study, the semi-structured interview technique was used to collect, from 73 teachers, demographic information, the score each assigned to a B1-level text written by learners of English and French, and a paragraph in which they described their process of evaluating that text. Examination of the data through the SPSS program, document review, and descriptive analysis showed that the teachers' scoring behaviours varied considerably; indeed, most teachers did not use any criteria while scoring. Moreover, the fact that the teachers who did use criteria developed scales consisting of different numbers of sections and carrying different or identical point values is notable in that it reflects the teachers' indecision. It was also concluded that the teachers mostly focused on the formal dimension of the written product, and that only a limited portion of the participating teachers distinguished between error types or considered the correspondence between errors and points. Finally, a relation was found between the participants' gender and institution and the scores they gave, but no significant relation between their age or length of experience and their scores.


Details

Primary Language English
Journal Section Research Articles
Authors

Yusuf Polat 0000-0001-9341-6643

Nejla Gezmiş 0000-0003-4909-1460

Project Number -
Early Pub Date October 21, 2023
Publication Date October 23, 2023
Submission Date July 16, 2023
Published in Issue Year 2023 Volume: 9 Issue: 2

Cite

APA Polat, Y., & Gezmiş, N. (2023). Analysis of Raters’ Scoring Behaviours in the Assessment of Writing. Journal of Language Education and Research, 9(2), 362-404. https://doi.org/10.31464/jlere.1328127

________________________________________________

Journal of Language Education and Research (JLERE)
Dil Eğitimi ve Araştırmaları Dergisi

https://dergipark.org.tr/en/pub/jlere

ISSN: 2149-5602
Copyright © Journal of Language Education and Research