Publications of the Department of Audiovisual Technology

The following list (automatically generated by the University Library) contains the publications from 2016 onwards. Publications up to 2015 can be found on a separate page.

Note: To search through all publications, select "Show All" and then use the browser's search function (Ctrl+F).

Results: 162
Created on: Thu, 18 Apr 2024 23:03:14 +0200


Wedel, Simon; Koppetz, Michael; Skowronek, Janto; Raake, Alexander
ViProVoQ: towards a vocabulary for video quality assessment in the context of creative video production. - In: MM '19, (2019), pp. 2387-2395

This paper presents a method for developing a consensus vocabulary to describe and evaluate the visual experience of videos. As a first result, a vocabulary characterizing the specific look of cinema-type video is presented. Such a vocabulary can be used to relate perceptual features of professional high-end image and video quality of experience (QoE) to the underlying technical characteristics and settings of the video systems involved in the creative content production process. For the vocabulary elicitation, a combination of different survey techniques was applied in this work. As a first step, individual interviews on image quality in the context of cinematography were conducted with experts from the motion picture industry. The data obtained from the interviews was used for the subsequent Real-time Delphi survey, in which an extended group of experts worked out a consensus on key aspects of the vocabulary specification. Here, 33 experts were supplied with the anonymized results of the other panelists, which they could use to revise their own assessments. Based on this expert panel, the attributes collected in the interviews were verified and further refined, resulting in the final vocabulary proposed in this paper. Besides attribute-based sensory evaluation of high-quality image, video, and film material, applications of the vocabulary include the development of dimension-based image and video quality models and the analysis of the multivariate relationship between quality-relevant perceptual attributes and technical system parameters.
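The core of a Real-time Delphi round, as described in the abstract, is feeding anonymized group statistics back to each panelist so they can revise their own rating. A minimal sketch of one such aggregation step; the panelist names, ratings, and the spread-based consensus rule are illustrative assumptions, not details from the paper:

```python
import statistics

# Hypothetical panelist ratings for one vocabulary attribute on a 1..5 scale.
ratings = {"expert_1": 4, "expert_2": 2, "expert_3": 5, "expert_4": 3}

def anonymized_feedback(ratings):
    """Group statistics shown back to every panelist, without any names."""
    values = sorted(ratings.values())
    return {"median": statistics.median(values),
            "min": values[0],
            "max": values[-1],
            "n": len(values)}

feedback = anonymized_feedback(ratings)

# After seeing the feedback, panelists may revise their ratings; a simple
# (assumed) stopping rule declares consensus once the spread is small enough.
consensus = (feedback["max"] - feedback["min"]) <= 1
print(feedback, consensus)
```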



https://doi.org/10.1145/3343031.3351171
Lestari, Purji; Schade, Hans-Peter
Boundary matched human area segmentation for Chroma keying using hybrid depth-color analysis. - In: 2019 IEEE 4th International Conference on Signal and Image Processing (ICSIP 2019), (2019), pp. 761-767

https://doi.org/10.1109/SIPROCESS.2019.8868469
Zhou, Jun; Qi, Lianyong; Raake, Alexander; Xu, Tao; Piekarska, Marta; Zhang, Xuyun
User attitudes and behaviors toward personalized control of privacy settings on smartphones. - In: Concurrency and computation, ISSN 1532-0634, Volume 31 (2019), issue 22, e4884, pages 1-14

https://doi.org/10.1002/cpe.4884
Singla, Ashutosh; Ramachandra Rao, Rakesh Rao; Göring, Steve; Raake, Alexander
Assessing media QoE, simulator sickness and presence for omnidirectional videos with different test protocols. - In: 26th IEEE Conference on Virtual Reality and 3D User Interfaces, (2019), pp. 1163-1164

https://doi.org/10.1109/VR.2019.8798291
Raake, Alexander; Skowronek, Janto; Soloducha, Michal
Telecommunications applications. - In: Sensory evaluation of sound, (2019), pp. 227-267

Singla, Ashutosh; Göring, Steve; Raake, Alexander; Meixner, Britta; Koenen, Rob; Buchholz, Thomas
Subjective quality evaluation of tile-based streaming for omnidirectional videos. - In: Proceedings of the 10th ACM Multimedia Systems Conference (MMSys'19), (2019), pp. 232-242

https://doi.org/10.1145/3304109.3306218
Fremerey, Stephan; Huang, Rachel; Göring, Steve; Raake, Alexander
Are people pixel-peeping 360° videos? - In: Electronic imaging, ISSN 2470-1173, Vol. 31 (2019), 10, art00002, pp. 220-1-220-6

In this paper, we compare the influence of a higher-resolution Head-Mounted Display (HMD) like the HTC Vive Pro on 360° video QoE to that obtained with a lower-resolution HMD like the HTC Vive. Furthermore, we evaluate the difference in perceived quality for entertainment-type 360° content in 4K/6K/8K resolutions at typical high-quality bitrates. In addition, we evaluate which video parts people focus on while watching omnidirectional videos. To this aim, we conducted three subjective tests. We used the HTC Vive in the first and the HTC Vive Pro in the other two tests. The results from our tests show that the higher resolution of the Vive Pro seems to enable people to judge the quality more easily, as indicated by a smaller deviation between the resulting quality ratings. Furthermore, we found no significant difference between the quality scores for the highest bitrate at 6K and 8K resolution. We also compared the viewing behavior for content viewed for the first time with the behavior when the same content is viewed again multiple times. The different representations of the contents were explored similarly, probably because participants find and compare specific parts of the 360° video that are suitable for rating the quality.
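One common way to quantify whether two viewings of a 360° video were "explored similarly" is to compare the distributions of head-direction samples. A minimal sketch using histogram intersection over yaw angles; the data, bin width, and the choice of histogram intersection as the similarity measure are assumptions for illustration, not the paper's actual analysis:

```python
import numpy as np

# Hypothetical yaw angles (degrees) of the viewing direction, sampled while
# watching a 360° video for the first time vs. on a repeated viewing.
rng = np.random.default_rng(1)
yaw_first = rng.normal(loc=0.0, scale=40.0, size=1000) % 360
yaw_repeat = rng.normal(loc=10.0, scale=45.0, size=1000) % 360

# Histogram intersection over 36 ten-degree bins: 1.0 means identical
# exploration patterns, 0.0 means no overlap at all.
bins = np.linspace(0, 360, 37)
h1, _ = np.histogram(yaw_first, bins=bins)
h2, _ = np.histogram(yaw_repeat, bins=bins)
h1 = h1 / h1.sum()
h2 = h2 / h2.sum()
similarity = np.minimum(h1, h2).sum()
print(round(similarity, 2))
```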



https://doi.org/10.2352/ISSN.2470-1173.2019.10.IQSP-220
Ramachandra Rao, Rakesh Rao; Göring, Steve; Vogel, Patrick; Pachatz, Nicolas; Villamar Villarreal, Juan Jose; Robitza, Werner; List, Peter; Feiten, Bernhard; Raake, Alexander
Adaptive video streaming with current codecs and formats: extensions to parametric video quality model ITU-T P.1203. - In: Electronic imaging, ISSN 2470-1173, Vol. 31 (2019), 10, art00015, pp. 314-1-314-6

Adaptive streaming is fast becoming the most widely used method for video delivery to end users over the internet. The ITU-T P.1203 standard is the first standardized quality-of-experience model for audiovisual HTTP-based adaptive streaming. This recommendation has been trained and validated for H.264 and resolutions up to and including Full HD. This paper provides an extension of the existing standardized short-term video quality model mode 0 to new codecs, i.e. H.265, VP9, and AV1, and resolutions larger than Full HD (e.g. UHD-1). The extension is based on two subjective video quality tests. In the tests, a total of 13 different source contents of 10 seconds each were used. These sources were encoded at resolutions ranging from 360p to 2160p and various quality levels using the H.265, VP9, and AV1 codecs. The subjective results from the two tests were then used to derive a mapping/correction function for P.1203.1 to handle the new codecs and resolutions. It should be noted that the standardized model was not re-trained with the new subjective data; instead, only a mapping/correction function was derived from the two subjective test results so as to extend the existing standard to the new codecs and resolutions.
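The general shape of such a mapping/correction approach can be sketched as follows: the untouched model's per-condition scores are remapped onto new subjective MOS via a fitted function. All data values and the first-order (linear) form are illustrative assumptions; the actual P.1203 extension's mapping may differ:

```python
import numpy as np

# Hypothetical per-condition data for one new codec (all values illustrative):
# raw scores from the existing P.1203.1 mode-0 model, and MOS from a new
# subjective test with that codec, both on a 1..5 scale.
model_scores = np.array([2.1, 2.8, 3.4, 3.9, 4.3])
subjective_mos = np.array([2.6, 3.2, 3.7, 4.1, 4.4])

# Fit a first-order correction: mos ~= a * model_score + b.
# The underlying standardized model stays untouched; only its output is remapped.
a, b = np.polyfit(model_scores, subjective_mos, 1)

def corrected_score(raw):
    """Apply the codec-specific mapping, clipped to the 1..5 MOS range."""
    return float(np.clip(a * raw + b, 1.0, 5.0))

print(round(corrected_score(3.0), 2))
```

This mirrors the abstract's key design choice: deriving a cheap per-codec output mapping from limited subjective data instead of re-training the standardized model itself.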



https://doi.org/10.2352/ISSN.2470-1173.2019.10.IQSP-314
Göring, Steve; Zebelein, Julian; Wedel, Simon; Keller, Dominik; Raake, Alexander
Analyze and predict the perceptibility of UHD video contents. - In: Electronic imaging, ISSN 2470-1173, Vol. 31 (2019), 12, art00009, pp. 215-1-215-6

Display resolutions such as 720p, Full HD, 4K, and 8K have been increasing steadily in recent years. However, many video streaming providers currently stream videos at a maximum of 4K/UHD-1 resolution. Considering that typical viewers watch their videos in living rooms, where viewing distances are quite large, the question arises whether more resolution is even recognizable. In the following paper we analyze the problem of UHD perceptibility in comparison with lower resolutions. As a first step, we conducted a subjective video test that focuses on short uncompressed video sequences and compares two different testing methods for pairwise discrimination of two representations of the same source video in different resolutions. We selected an extended stripe method and a temporal switching method. We found that temporal switching is more suitable for recognizing UHD video content. Furthermore, we developed features that can be used in a machine learning system to predict whether there is a benefit in showing a given video in UHD or not. Evaluating different models based on these features for predicting perceivable differences shows good performance on the available test data. Our implemented system can be used to verify UHD source video material or to optimize streaming applications.
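A prediction system of the kind described - per-clip features feeding a binary "UHD benefit / no benefit" classifier - can be sketched as below. The synthetic features, labels, and the choice of logistic regression are assumptions for illustration; the paper does not specify its features or model here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: two per-clip features (think high-frequency
# energy and local contrast) and a label saying whether the UHD version was
# distinguishable from a lower resolution in the subjective test.
n = 200
features = rng.normal(size=(n, 2))
labels = (features[:, 0] + 0.5 * features[:, 1]
          + rng.normal(scale=0.3, size=n) > 0).astype(float)

# Minimal logistic regression trained by gradient descent, standing in for
# the (unspecified) machine-learning system from the paper.
w = np.zeros(2)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(features @ w + b)))   # predicted probabilities
    w -= 0.5 * (features.T @ (p - labels) / n)       # gradient step on weights
    b -= 0.5 * float(np.mean(p - labels))            # gradient step on bias

accuracy = float(np.mean(((features @ w + b) > 0) == (labels > 0.5)))
print(round(accuracy, 2))
```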



https://doi.org/10.2352/ISSN.2470-1173.2019.12.HVEI-215
Keller, Dominik; Seybold, Tamara; Skowronek, Janto; Raake, Alexander
Assessing texture dimensions and video quality in motion pictures using sensory evaluation techniques. - In: 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX), (2019), 6 pp.

The quality of images and videos is usually examined with well-established subjective tests or instrumental models. These often target content transmitted over the internet, such as streaming or videoconferencing, and address human preferential experience. In the area of high-quality motion pictures, however, other factors are relevant. These are mostly not error-related but aimed at creative image design, which has received comparatively little attention in image and video quality research. To determine the perceptual dimensions underlying movie-type video quality, we combine sensory evaluation techniques extensively used in food assessment - the Degree of Difference test and Free Choice Profiling - with more classical video quality tests. The main goal of this research is to analyze the suitability of sensory evaluation methods for high-quality video assessment. To understand which features in motion pictures are recognizable and critical to quality, we address the example of image texture properties, measuring human perception and preferences with a panel of image-quality experts. To this aim, different capture settings were simulated by applying sharpening filters as well as digital and analog noise to exemplary source sequences. The evaluation, involving Multidimensional Scaling, Generalized Procrustes Analysis as well as Internal and External Preference Mapping, identified two separate perceptual dimensions. We conclude that Free Choice Profiling combined with a quality test offers the highest level of insight relative to the effort needed. The combination enables a quantitative quality measurement including an analysis of the underlying perceptual reasons.
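Multidimensional Scaling, as used in this evaluation, turns a matrix of pairwise dissimilarity judgments into coordinates in a low-dimensional perceptual space. A minimal sketch of classical (Torgerson) MDS; the Degree-of-Difference matrix below is invented for illustration and is not data from the study:

```python
import numpy as np

# Hypothetical aggregated Degree-of-Difference ratings for five processed
# versions of a clip (0 = identical, larger = more different); symmetric.
D = np.array([
    [0.0, 1.0, 2.0, 3.0, 3.5],
    [1.0, 0.0, 1.2, 2.4, 3.0],
    [2.0, 1.2, 0.0, 1.5, 2.2],
    [3.0, 2.4, 1.5, 0.0, 1.0],
    [3.5, 3.0, 2.2, 1.0, 0.0],
])

# Classical MDS: double-center the squared dissimilarities, then use the
# top eigenvectors (scaled by sqrt of eigenvalues) as 2-D coordinates.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]
coords = eigvecs[:, order[:2]] * np.sqrt(np.maximum(eigvals[order[:2]], 0))

print(coords.shape)
```

Each row of `coords` is one stimulus in a two-dimensional perceptual space, matching the study's finding of two separate perceptual dimensions in spirit, though the real analysis worked on actual panel data.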



https://doi.org/10.1109/QoMEX.2019.8743189