DataScape VR was developed as a VR-based data visualization platform designed to provide a more intuitive, interactive approach to data analysis. Traditional 2D data representations often limit spatial understanding and trend discovery, making it difficult for analysts to uncover complex relationships. By leveraging an immersive 3D environment, DataScape VR allows users to walk through, manipulate, and interact with data in real time, adding a new dimension to data exploration.
This project was written in C# (using VS Code as the editor) and built in Unity for deployment on VR headsets with six degrees of freedom (6DOF). The platform supports real-time data visualization, interactive exploration, and dynamic filtering, making it a valuable tool for data analysts, business professionals, engineers, and students.
The project focused on creating a fully interactive virtual reality data environment, with core functionalities including:
3D Data Visualization: Developed tools to display datasets as bar graphs, scatter plots, and heatmaps, allowing users to walk through and interact with their data in real time.
User Interaction and Navigation: Implemented teleportation and continuous movement for navigation, enabling a smooth user experience in the virtual space.
Data Manipulation Tools: Designed grab, zoom, rotate, and filter mechanics to refine data visualization and provide deeper insights.
Haptic and Audio Feedback: Integrated haptic feedback on selection and spatial 3D sound cues to enhance immersion and interaction.
Optimized Performance for VR: Ensured smooth rendering at 72+ FPS using baked lighting and reduced draw calls, maintaining visual clarity without sacrificing performance.
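To illustrate the 3D visualization approach, here is a minimal Unity C# sketch of building a bar graph from a dataset. This is not DataScape VR's actual code; the class name `BarGraphBuilder`, the field names, and the use of cube primitives are all assumptions for illustration.

```csharp
using UnityEngine;

// Hypothetical sketch: spawns one cube per data point to form a 3D bar graph.
// Names and scaling factors are illustrative, not DataScape VR's actual API.
public class BarGraphBuilder : MonoBehaviour
{
    [SerializeField] private float barWidth = 0.2f;
    [SerializeField] private float spacing = 0.3f;
    [SerializeField] private float heightScale = 0.05f;

    public void Build(float[] values)
    {
        for (int i = 0; i < values.Length; i++)
        {
            GameObject bar = GameObject.CreatePrimitive(PrimitiveType.Cube);
            bar.transform.SetParent(transform, false);

            float height = values[i] * heightScale;
            // Position each bar along one axis; offset by half its height
            // so the base sits on the graph's floor plane.
            bar.transform.localScale = new Vector3(barWidth, height, barWidth);
            bar.transform.localPosition = new Vector3(i * spacing, height / 2f, 0f);
        }
    }
}
```

In a real implementation, the primitive cubes would likely be replaced by pooled, instanced meshes to keep draw calls low, in line with the performance goals above.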
DataScape VR was developed following an iterative design approach, ensuring seamless integration of data processing, user interaction, and VR optimization.
Concept Development and Prototyping
Identified key challenges in traditional data visualization and conceptualized a VR-based solution.
Developed early UI wireframes for menu navigation and data interaction.
3D Data Representation and Environment Setup
Created a VR workspace designed to dynamically display datasets with customizable viewing options.
Built scalable graphing tools that allow users to transition between different visualization modes.
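One simple way to support transitions between visualization modes is an enum-driven switcher. The sketch below assumes each mode lives under its own root object; the names `VisualizationMode` and `VisualizationSwitcher` are hypothetical, not the project's actual types.

```csharp
using UnityEngine;

// Hypothetical sketch of switching between visualization modes at runtime.
public enum VisualizationMode { BarGraph, ScatterPlot, Heatmap }

public class VisualizationSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject barGraphRoot;
    [SerializeField] private GameObject scatterPlotRoot;
    [SerializeField] private GameObject heatmapRoot;

    public void SetMode(VisualizationMode mode)
    {
        // Exactly one visualization root is active at a time.
        barGraphRoot.SetActive(mode == VisualizationMode.BarGraph);
        scatterPlotRoot.SetActive(mode == VisualizationMode.ScatterPlot);
        heatmapRoot.SetActive(mode == VisualizationMode.Heatmap);
    }
}
```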
Programming and User Interaction Development
Implemented ray-cast selection mechanics, allowing users to interact with individual data points using VR controllers.
Designed real-time filtering options to dynamically adjust datasets based on user input.
Programmed in C# within VS Code, ensuring modular and scalable code structure.
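The ray-cast selection mechanic can be sketched as follows in Unity C#. This is an illustrative approximation only: the `PointSelector` class, the controller field, and the layer-mask setup are assumptions, and the actual project may route input through an XR interaction framework instead.

```csharp
using UnityEngine;

// Hypothetical sketch of ray-cast selection from a VR controller.
public class PointSelector : MonoBehaviour
{
    [SerializeField] private Transform controller;    // the VR controller's pose
    [SerializeField] private float maxDistance = 10f;
    [SerializeField] private LayerMask dataPointLayer; // layer containing data points

    void Update()
    {
        // Cast a ray forward from the controller each frame.
        Ray ray = new Ray(controller.position, controller.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance, dataPointLayer))
        {
            // A data point was hit; a real implementation would highlight it
            // or display its details in a tooltip.
            Debug.Log($"Selected data point: {hit.collider.name}");
        }
    }
}
```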
Performance Optimization for VR
Maintained a frame rate of 72+ FPS to reduce the risk of motion sickness.
Applied baked lighting and GPU-based optimizations to reduce computational load.
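Much of this optimization (baked lightmaps, static batching) is configured in the Unity editor rather than in code, but some frame-pacing settings can be applied at startup. The sketch below is an assumption about how that might look; note that on many XR runtimes the compositor, not `Application.targetFrameRate`, ultimately controls frame pacing.

```csharp
using UnityEngine;

// Hypothetical sketch of VR performance settings applied at startup.
public class VrPerformanceSettings : MonoBehaviour
{
    void Awake()
    {
        // Target the headset's native refresh rate (72 Hz on some standalone headsets).
        Application.targetFrameRate = 72;
        // Disable VSync so the XR runtime can control frame pacing.
        QualitySettings.vSyncCount = 0;
    }
}
```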
Testing and User Feedback
Conducted multiple user tests to refine interaction mechanics and optimize visual clarity.
Adjusted UI elements based on feedback to improve ease of use and accessibility.
Enhanced Data Analysis Capabilities: Provided users with real-time insights through interactive 3D data manipulation.
Improved User Engagement in Data Exploration: Allowed spatial navigation of complex datasets, improving comprehension and decision-making.
Optimized VR Performance: Achieved stable frame rates and smooth interactions, making the platform suitable for extended use.
Scalable for Future Enhancements: Built a modular system architecture, enabling integration with external data sources and advanced analytics features.
Developing DataScape VR provided valuable insights into VR development, user interaction design, and performance optimization. Through this project, I gained hands-on experience in:
Translating traditional 2D data analysis techniques into a 3D immersive environment.
Optimizing real-time rendering and interaction mechanics for virtual reality.
Designing intuitive user interfaces and navigation systems within a 3D space.
Moving forward, I plan to expand DataScape VR by integrating AI-driven analytics and machine learning models. This would provide users with real-time trend detection, predictive modeling, and automated anomaly detection, further enhancing the platform’s capabilities. Additionally, multi-user collaboration features could be introduced, allowing teams to analyze datasets together in a shared VR space.