Feedback Generation for Automatic User Interface Design Evaluation

  • Conference paper
  • First Online:
Software Technologies (ICSOFT 2021)

Abstract

Over the last decades, interest in studying User Interfaces (UI) has increased. Learning UI design, however, is a difficult process, and novice UI designers need guidance throughout it to obtain better results. Feedback is among the most important factors for improving knowledge and skill acquisition. Nevertheless, providing individual feedback is complex: it is time-consuming and requires a fair amount of expertise. This paper presents the Feedback ENriched user Interface Simulation (FENIkS) as a solution to this problem. FENIkS is a UI design simulation tool based on model-driven engineering. Students design the UI through different models while automatically receiving feedback on how design principles have been applied through several design options. From the models it is possible to generate a working prototype, enriched with feedback that explains the application of the design principles. This paper describes the foundations of FENIkS for automatic UI design evaluation, which in turn enables the automatic generation of feedback. It explains FENIkS' design: the meta-model and how design options, design principles and types of feedback are used to automatically generate feedback. The perceived usability was evaluated positively, and the results of the experimental evaluation demonstrated that FENIkS improves students' understanding of design principles.
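
To make the feedback mechanism described above concrete, the sketch below shows a minimal, hypothetical rule-based check in Python: each design option chosen by the student is compared against the design principle it relates to, and a feedback message is generated for both correct and incorrect choices. This is an illustrative assumption of how such a check could look, not the actual FENIkS implementation; the names DesignPrinciple, DesignOption, generate_feedback and the "Guidance" example are invented for illustration.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class DesignPrinciple:
        name: str          # e.g. "Guidance"
        explanation: str   # short explanation shown to the student

    @dataclass
    class DesignOption:
        name: str                    # e.g. "show field labels"
        principle: DesignPrinciple   # principle this option relates to
        correct_choice: bool         # value that satisfies the principle

    def generate_feedback(chosen_options: Dict[str, bool],
                          catalog: List[DesignOption]) -> List[str]:
        """Return one feedback message per design option the student has set."""
        messages = []
        for option in catalog:
            chosen = chosen_options.get(option.name)
            if chosen is None:
                continue  # option not configured by the student
            if chosen == option.correct_choice:
                messages.append(
                    f"[OK] '{option.name}' respects the principle "
                    f"'{option.principle.name}': {option.principle.explanation}")
            else:
                messages.append(
                    f"[Hint] '{option.name}' goes against the principle "
                    f"'{option.principle.name}': {option.principle.explanation}")
        return messages

    if __name__ == "__main__":
        guidance = DesignPrinciple(
            "Guidance", "users should always know what input a field expects")
        catalog = [DesignOption("show field labels", guidance, True)]
        for line in generate_feedback({"show field labels": False}, catalog):
            print(line)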



Author information

Corresponding author

Correspondence to Jenny Ruiz.



Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Ruiz, J., Snoeck, M. (2022). Feedback Generation for Automatic User Interface Design Evaluation. In: Fill, HG., van Sinderen, M., Maciaszek, L.A. (eds) Software Technologies. ICSOFT 2021. Communications in Computer and Information Science, vol 1622. Springer, Cham. https://doi.org/10.1007/978-3-031-11513-4_4


  • DOI: https://doi.org/10.1007/978-3-031-11513-4_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-11512-7

  • Online ISBN: 978-3-031-11513-4

  • eBook Packages: Computer Science, Computer Science (R0)
