COSMN: Clustering-Based Optimization for 360-Degree Live Streaming over Mobile Networks
DOI: https://doi.org/10.4108/eetinis.131.9499

Keywords: Mobile Network, Field Of View, 360-Degree Video, Clustering-Based Optimization, Quality of Experience, COSMN

Abstract
The rapid growth of 360-degree video streaming has transformed how users experience immersive content, especially on mobile devices. However, delivering high-quality 360-degree video streams to mobile devices is challenging due to their constrained computational resources, limited bandwidth, and the need for real-time processing. This paper introduces COSMN (Clustering-Based Optimization for 360-Degree Live Streaming over Mobile Networks), a framework designed to tackle these challenges. COSMN leverages a clustering-based optimization approach to dynamically adapt video streaming to the viewer’s region of interest (ROI), minimizing resource consumption while maintaining high visual quality for the most relevant portions of the video. The framework divides the 360-degree video into multiple tiles and clusters these tiles based on user viewing patterns. By predicting user behavior with clustering algorithms, COSMN prioritizes bandwidth and processing power for the tiles within the viewer’s ROI. The system also integrates adaptive bitrate streaming techniques to ensure seamless playback under varying network conditions. Experimental results demonstrate that COSMN significantly reduces bandwidth usage and computational load on mobile devices while providing a smooth and immersive viewing experience. Compared with traditional 360-degree live streaming methods, COSMN achieves superior performance in terms of latency, video quality, and resource efficiency. This work paves the way for scalable 360-degree live streaming solutions on mobile platforms, making immersive video experiences more accessible and practical for everyday users.
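To make the clustering-based tile prioritization described in the abstract concrete, the sketch below illustrates one plausible realization: recent viewport centres are clustered with k-means, and tiles near the resulting cluster centres (the predicted ROI) receive higher bitrates than peripheral tiles. The tile-grid size, bitrate ladder, and function names are illustrative assumptions, not the authors' COSMN implementation.

```python
# Minimal sketch of clustering-based tile prioritization for tiled 360-degree
# streaming. Grid size, bitrate ladder, and traces are assumptions for
# illustration only; they do not reflect the COSMN implementation.
import numpy as np
from sklearn.cluster import KMeans

TILE_ROWS, TILE_COLS = 4, 8            # assumed 4x8 equirectangular tile grid
BITRATES_KBPS = [400, 1200, 3000]      # assumed per-tile ladder: low / mid / high

def tile_of(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Map a viewport centre (yaw in [-180,180), pitch in [-90,90)) to a tile index."""
    col = int((yaw_deg + 180.0) / 360.0 * TILE_COLS) % TILE_COLS
    row = int((pitch_deg + 90.0) / 180.0 * TILE_ROWS) % TILE_ROWS
    return row, col

def cluster_viewports(samples: np.ndarray, k: int = 2) -> np.ndarray:
    """Cluster recent viewport centres (N x 2 array of yaw, pitch in degrees).

    A production system would handle the yaw wrap-around at +/-180 degrees;
    this sketch ignores it for brevity.
    """
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit(samples).cluster_centers_

def allocate_bitrates(cluster_centers: np.ndarray) -> np.ndarray:
    """Assign the highest bitrate to tiles holding a cluster centre, the middle
    bitrate to their 8-neighbours, and the lowest bitrate everywhere else."""
    plan = np.full((TILE_ROWS, TILE_COLS), BITRATES_KBPS[0], dtype=int)
    for yaw, pitch in cluster_centers:
        r, c = tile_of(yaw, pitch)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = (r + dr) % TILE_ROWS, (c + dc) % TILE_COLS
                plan[rr, cc] = max(plan[rr, cc], BITRATES_KBPS[1])
        plan[r, c] = BITRATES_KBPS[2]
    return plan

if __name__ == "__main__":
    # Synthetic head-movement samples concentrated around two regions of interest.
    rng = np.random.default_rng(0)
    roi_a = rng.normal([30.0, 0.0], [10.0, 5.0], size=(50, 2))
    roi_b = rng.normal([-120.0, 10.0], [10.0, 5.0], size=(30, 2))
    centers = cluster_viewports(np.vstack([roi_a, roi_b]), k=2)
    print(allocate_bitrates(centers))
```

In this sketch the resulting bitrate plan would then be fed to an adaptive-bitrate controller that scales the whole plan up or down to fit the available network throughput, which is the role the abstract assigns to the adaptive bitrate streaming component.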
License
Copyright (c) 2025 Hung Nguyen Viet, Hoang Bui Huy, Cong Tran Thanh, Huong Truong Thu

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
This is an open-access article distributed under the terms of the Creative Commons Attribution CC BY 3.0 license, which permits unlimited use, distribution, and reproduction in any medium so long as the original work is properly cited.