Andreas Wilms 78481ca337 init

🌐 Interactive Point Cloud Study

Bachelor Thesis
📍 Chair of Networked Embedded Systems and Communication Systems
📅 Completed: September 2024
🏫 University of Augsburg
🔗 Thesis Info Page


🎓 Thesis Title

"Improving Crowdsourcing-based QoE Studies for Point Clouds via Interaction Incentives"


🧠 Summary

Point clouds serve as the digital foundation for numerous modern technologies, including:

  • Autonomous driving
  • Manufacturing applications
  • Virtual Reality (VR)

In VR, point clouds enable highly immersive, realistic environments thanks to their spatial accuracy. Users can engage with virtual content more naturally, and industries such as manufacturing benefit by simulating environments for training, maintenance, and operation without real-world risk.

💡 Example: A virtual training module where a trainee practices machinery maintenance using point cloud-rendered equipment.

However, despite their potential, widespread adoption of point cloud applications in VR is limited due to:

  • Poor user experience
  • Low perceived quality
  • Hardware limitations
  • Bandwidth constraints

These constraints often motivate point cloud size reduction techniques, which improve performance but degrade the Quality of Experience (QoE).
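One common size reduction technique is voxel-grid downsampling. As a minimal illustrative sketch (not code from this repository), points falling into the same cubic voxel are merged into their centroid, shrinking the cloud at the cost of spatial detail:

```javascript
// Voxel-grid downsampling sketch: merge all points within each cubic
// voxel into a single centroid point. Smaller voxelSize keeps more
// detail; larger voxelSize reduces the cloud more aggressively.
function voxelDownsample(points, voxelSize) {
  const voxels = new Map(); // voxel key -> { sum: [x, y, z], count }
  for (const [x, y, z] of points) {
    const key = [
      Math.floor(x / voxelSize),
      Math.floor(y / voxelSize),
      Math.floor(z / voxelSize),
    ].join(',');
    const v = voxels.get(key) ?? { sum: [0, 0, 0], count: 0 };
    v.sum[0] += x; v.sum[1] += y; v.sum[2] += z;
    v.count += 1;
    voxels.set(key, v);
  }
  // One representative point (the centroid) per occupied voxel.
  return [...voxels.values()].map(({ sum, count }) =>
    sum.map((s) => s / count)
  );
}
```

The detail lost by this merging step is exactly the kind of degradation a QoE study asks users to judge.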


Why Evaluate QoE?

To ensure a good user experience, user-centered evaluations are essential. Yet, technical metrics alone do not reflect true user satisfaction.

📉 Current Challenges

  • QoE studies are often costly, limited in scale, and lack quality feedback.
  • Current methods do not scale well or incentivize active user engagement.

💡 My Approach

This thesis explores the feasibility and optimization of using crowdsourcing to evaluate point cloud QoE.

Key Contributions:

  • Designed and implemented a full interactive point cloud QoE study using a crowdsourcing approach.

  • 🎯 Investigated how interaction incentives can:

    • Increase user engagement
    • Improve feedback quality
    • Enhance overall study effectiveness
  • 🔍 Analyzed whether these mechanisms result in better and more scalable QoE evaluations.
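To make the incentive idea concrete, here is a hypothetical sketch of how a study client might log viewer interactions and summarize them into an engagement score. The class, event names, and scoring rule are illustrative assumptions, not the mechanism implemented in the thesis:

```javascript
// Hypothetical interaction logger for a point cloud study client:
// each rotate/zoom/pan event on the viewer is recorded, and a simple
// score summarizes how actively a participant inspected the stimulus
// before rating it.
class InteractionLogger {
  constructor() {
    this.events = [];
  }
  record(type, timestampMs) {
    this.events.push({ type, timestampMs });
  }
  // Crude engagement proxy: number of distinct interaction types,
  // weighted by the (log-damped) total event count.
  engagementScore() {
    const types = new Set(this.events.map((e) => e.type));
    return types.size * Math.log1p(this.events.length);
  }
}
```

A score like this could, for example, flag ratings from participants who never inspected the stimulus at all.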


📈 Outcomes

  • Developed a complete framework: Design → Implementation → Deployment → Analysis
  • Demonstrated that interactive incentives significantly improve participant engagement and data reliability in crowdsourced QoE studies.
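Ratings collected in such a study are typically aggregated per stimulus into a Mean Opinion Score (MOS), the standard summary statistic in QoE research. A minimal sketch with illustrative names:

```javascript
// Aggregate crowdsourced ratings into a Mean Opinion Score (MOS)
// per stimulus. Ratings are assumed to use the common 1-5 scale.
function meanOpinionScores(ratings) {
  // ratings: array of { stimulus, score }
  const byStimulus = new Map(); // stimulus -> { sum, n }
  for (const { stimulus, score } of ratings) {
    const agg = byStimulus.get(stimulus) ?? { sum: 0, n: 0 };
    agg.sum += score;
    agg.n += 1;
    byStimulus.set(stimulus, agg);
  }
  // stimulus -> mean score
  return new Map(
    [...byStimulus].map(([s, { sum, n }]) => [s, sum / n])
  );
}
```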

📄 Full Thesis (PDF)

