
Platform for real-time subjective assessment of interactive multimedia applications

Multimedia Tools and Applications

Abstract

With the advent of cloud computing and the remote execution of interactive applications, there is a need to evaluate the Quality of Experience (QoE) and how it is influenced by network condition variations, media encoding parameter settings, and related optimization algorithms. However, current QoE assessment focuses mainly on audiovisual quality in non-interactive applications, such as video-on-demand services. Where experiments do aim to quantify interactive quality, they typically target games, using an ad hoc test setup to assess the impact of network variations on the playing experience. In this paper, we present a novel platform enabling the assessment of a broad range of interactive applications (e.g., thin client remote desktop systems, remotely rendered engineering applications, games). Dynamic reconfiguration of media encoding and decoding is built into the system, allowing the media encoding to adapt to the network conditions and the application characteristics; evaluating the influence of these automatic adaptations is a key asset of our approach. We discuss a range of possible use cases and present a performance study of our implementation, showing that the platform is capable of highly controllable subjective user assessment. Furthermore, we present results obtained by applying the platform to a subjective evaluation of an interactive multimedia application. Specifically, the influence of visual quality and frame rate on interactive QoE was assessed for a remotely executed race game.
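The abstract does not specify the platform's adaptation logic. Purely as an illustration of the kind of dynamic encoder reconfiguration described above, the sketch below maps measured network conditions to encoding settings; all names, thresholds, and the quality/frame-rate trade-off are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class NetworkState:
    bandwidth_kbps: float  # measured available bandwidth
    rtt_ms: float          # measured round-trip time

def choose_encoding(state: NetworkState):
    """Map measured network conditions to (frame_rate, quality) settings.
    Thresholds are illustrative only."""
    if state.bandwidth_kbps >= 4000 and state.rtt_ms < 50:
        return 30, "high"    # ample headroom: keep both smoothness and detail
    if state.bandwidth_kbps >= 1500:
        return 25, "medium"  # moderate link: reduce quality before frame rate
    # Constrained link: an interactive application may trade visual quality
    # for responsiveness (or vice versa, depending on its characteristics).
    return 15, "low"

print(choose_encoding(NetworkState(bandwidth_kbps=2000, rtt_ms=80)))  # (25, 'medium')
```

A subjective-assessment platform would then vary such a policy per test session and correlate the chosen settings with the QoE scores reported by participants.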


Notes

  1. For example, at a frame rate of 5 fps the inter-frame interval is 200 ms. The polling latency is therefore uniformly distributed between 0 ms and 200 ms, yielding an average value of 100 ms and a standard deviation of 57.7 ms.
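The figures in this footnote follow directly from the properties of a uniform distribution on [0, T]: mean T/2 and standard deviation T/√12. A quick check (function name hypothetical):

```python
import math

def polling_latency_stats(frame_rate_fps: float):
    """Analytic mean and standard deviation (in ms) of the polling latency,
    modelled as uniform on [0, T] where T is the inter-frame interval."""
    interval_ms = 1000.0 / frame_rate_fps
    mean_ms = interval_ms / 2.0             # E[U(0, T)] = T / 2
    std_ms = interval_ms / math.sqrt(12.0)  # std[U(0, T)] = T / sqrt(12)
    return mean_ms, std_ms

mean_ms, std_ms = polling_latency_stats(5)
print(f"mean = {mean_ms:.1f} ms, std = {std_ms:.1f} ms")  # mean = 100.0 ms, std = 57.7 ms
```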


Author information


Correspondence to Bert Vankeirsbilck.

Additional information

Bert Vankeirsbilck is funded by a Ph.D. grant of the IWT – Vlaanderen. Chris Develder is supported in part by a post-doctoral fellowship of the Research Foundation – Flanders (FWO-Vl.).

Rights and permissions

Reprints and permissions


Cite this article

Vankeirsbilck, B., Verslype, D., Staelens, N. et al. Platform for real-time subjective assessment of interactive multimedia applications. Multimed Tools Appl 72, 749–775 (2014). https://doi.org/10.1007/s11042-013-1395-y
