Abstract
The assessment of oral proficiency in instructional contexts offers valuable opportunities when it is designed to foster communicative performance in a specific target language use domain. A task-based Tourism English oral performance assessment was developed using a specialized spoken dialogue system (SDS) in which the computer is programmed to act as a hotel guest and examinees respond as a hotel employee. Drawing on a mixed-methods research design, this study examined whether the task administration conditions and the scoring rubric are appropriate for providing evidence of Tourism English ability. Analysis of 30 L2 students’ oral performances, their post-test questionnaire responses, and semi-structured individual interviews indicates that students found the tasks engaging and relevant to their future profession. Similarly, four raters’ verbal reports from semi-structured individual interviews suggest that the test tasks effectively elicited ratable speech samples that can be argued to represent students’ oral communication skills in a hospitality setting.
Appendices
1.1 Appendix A
1.1.1 Sample Transcript of a Test Task
1 c: Hi, I am calling from room 263. It is very hot in my room, the A/C is not working.
2 u: Oh I’m sorry to hear that how can I help you?
2 c: You are not helping me.
3 u: Okay I can change your room if you’d like.
3 c: That won’t work for me. Just do something about the A/C.
4 u: Okay I’m.. I am sending a technician to fix the problem.
4 c: You should have considered getting this fixed before you assigned me to this room.
5 u: Okay. Sorry. I can give you a refund if you like.
5 c: Giving my money back won’t fix the problem.
6 u: I know. But I am trying to find a solution.
6 c: Look, I really appreciate you trying to help me. Is there a chance you can suggest a reasonable solution?
7 u: Okay. Then I can move you to a new room.
7 c: That’s too much work for me. My luggage is very heavy.
8 u: All right. Let me see.
8 c: Are we still talking about the A/C?
9 u: Yes.
9 c: I don’t think you are even trying to help me.
10 u: if you don’t want to move, then I can.. I can send a technician to fix the problem but you don’t want it either.
10 c: Alright, if you say so.
Note: c = computer, u = user. Learner (u) turns are reproduced verbatim.
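The guest (c) turns above follow a fixed complaint script that advances regardless of how the examinee responds, which keeps the pressure sequence comparable across test takers. The chapter does not publish the SDS implementation; the following is only a minimal illustrative sketch of that scripted-turn design, with all names hypothetical:

```python
# Hypothetical sketch of a scripted role-play "guest": the system delivers
# pre-authored complaint turns in order, logging examinee turns but not
# branching on them. This is NOT the chapter's actual implementation.

class ScriptedGuest:
    def __init__(self, turns):
        self.turns = list(turns)  # pre-authored guest utterances, in order
        self.i = 0                # index of the next turn to deliver
        self.log = []             # record of examinee turns for later rating

    def respond(self, user_utterance):
        """Store the examinee's turn and return the next scripted guest
        turn, or None once the role play is finished."""
        self.log.append(user_utterance)
        if self.i >= len(self.turns):
            return None
        turn = self.turns[self.i]
        self.i += 1
        return turn

guest = ScriptedGuest([
    "You are not helping me.",
    "That won't work for me. Just do something about the A/C.",
    "Alright, if you say so.",
])
print(guest.respond("How can I help you?"))  # -> "You are not helping me."
```

A real SDS would add speech recognition, timing constraints, and some contingency on the examinee's utterance; the point of the sketch is only the fixed turn sequence visible in the transcript.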
1.2 Appendix B
1.2.1 Rating Scale
| Skills | 4 | 3 | 2 | 1 |
|---|---|---|---|---|
| Interactional competence | Task performed competently, with almost always appropriate responses given in each task | Task performed generally competently, with usually appropriate responses given in each task | Task performed somewhat competently, with somewhat appropriate responses given in each task | Task not completed due to limitations in responses given in each task |
| Fluency | Appropriate speech rate (i.e., no unnatural language-related pauses) | Mostly appropriate speech rate (i.e., few unnatural language-related pauses) | Somewhat appropriate speech rate (i.e., some unnatural language-related pauses) | Inappropriate speech rate (i.e., too many unnatural language-related pauses) |
| Pronunciation | Very effective pronunciation and prosodic patterns, with only minimal errors in production | Above-average pronunciation and prosodic patterns, but with occasional errors in production | Average pronunciation and prosodic patterns, but with errors in production affecting the delivery | Limited range of pronunciation and prosodic patterns, with errors in production highly affecting the delivery |
| Grammar/vocabulary | A high degree of grammatical accuracy in both simple and complex structures and domain-specific vocabulary use | Sufficient grammatical accuracy in both simple and complex structures and domain-specific vocabulary use | Somewhat sufficient grammatical accuracy in both simple and complex structures and domain-specific vocabulary use | Insufficient grammatical accuracy in both simple and complex structures and limited domain-specific vocabulary use |
1.3 Appendix C
1.3.1 Test Takers’ Questionnaire Responses in ‘Overall’ and ‘Preference’ Categories
| Category | Questionnaire item | 1* | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| Overall | 1. I really enjoyed the tasks. | 0.0% | 0.0% | 9.5% | 61.9% | 28.6% |
| | 2. The topic was relevant to real-life hotel situations. | 4.8% | 4.8% | 9.5% | 14.3% | 66.7% |
| | 3. If I know I’ll have a speaking test like this, I might study to improve my speaking. | 0.0% | 0.0% | 19.0% | 33.3% | 47.6% |
| Authenticity | 4. I think the interaction I had with the computer was natural. | 0.0% | 4.8% | 14.3% | 28.6% | 52.4% |
| | 5. The computer did not allow me to think about what to say. | 0.0% | 9.5% | 42.9% | 33.3% | 14.3% |
| | 6. I needed to respond quickly as in real life telephone conversation with a customer. | 0.0% | 4.8% | 9.5% | 33.3% | 52.4% |
| | 7. There were instances when the computer did not respond to what I said. | 0.0% | 0.0% | 19.0% | 47.6% | 33.3% |
| Preference | 8. I would prefer to discuss with a human partner next time. | 23.8% | 57.1% | 4.8% | 9.5% | 4.8% |
| | 9. I would prefer to discuss with a computer partner next time. | 4.8% | 14.3% | 19.0% | 28.6% | 33.3% |
| | 10. I’d like to see more tasks like this in the future. | 4.8% | 4.8% | 28.6% | 33.3% | 28.6% |
| Self-evaluation | 11. The test allowed me to demonstrate my oral communication skills at a hotel situation. | 0.0% | 0.0% | 28.6% | 38.1% | 33.3% |
| | 12. The computer understood me well. | 0.0% | 4.8% | 38.1% | 47.6% | 9.5% |
| | 13. I was able to understand the computer well. | 4.8% | 9.5% | 28.6% | 28.6% | 28.6% |
| | 14. The difficulty of the task was appropriate for me. | 0.0% | 19.0% | 19.0% | 42.9% | 19.0% |
| | 15. The computer’s speech was fast for me. | 0.0% | 4.8% | 9.5% | 19.0% | 66.7% |
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Karatay, Y. (2023). Using Spoken Dialog Systems to Assess L2 Learners’ Oral Skills in a Local Language Testing Context. In: Yan, X., Dimova, S., Ginther, A. (eds) Local Language Testing. Educational Linguistics, vol 61. Springer, Cham. https://doi.org/10.1007/978-3-031-33541-9_12
DOI: https://doi.org/10.1007/978-3-031-33541-9_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-33540-2
Online ISBN: 978-3-031-33541-9
eBook Packages: Education (R0)