Publications – Angelos Barmpoutis
https://abarmpou.github.io/angelos
Professor of Digital Arts and Sciences

Assessing the Influence of Passive Haptics on User Perception of Physical Properties in Virtual Reality
https://abarmpou.github.io/angelos/page/assessing-the-influence-of-passive-haptics-on-user-perception-of-physical-properties-in-virtual-reality/ (14 Feb 2024)

This paper presents a pilot study that explores the role of low-cost passive haptics in how users perceive physical properties such as the size and weight of objects within virtual reality environments. An A/B-type study was conducted using an air hockey simulation in which participants experienced two versions: one adhered to conventional VR settings, while the other incorporated a tangible surface (a real table). Statistical analysis of the data collected from post-study questionnaires indicated a shift in the perception of size and weight when users were exposed to the haptic-enhanced simulation, with virtual objects perceived as larger or heavier. It was also noted that the observed shift in user perception was stronger when the simulation with the tangible surface was experienced first. The paper presents details on the implementation of the air hockey simulation and the setup of the testing environment, as well as the statistical analysis performed on the collected data, offering practical recommendations for future applications.
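As a rough illustration of how questionnaire responses from such an A/B comparison can be analyzed, the sketch below applies a Mann-Whitney U test to hypothetical Likert ratings of perceived object size in the two conditions. The choice of test, the rating scale, and all values are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch only: the paper does not specify which statistical test was used.
# Hypothetical Likert ratings (1-7) of perceived object size, one list per condition.
from scipy.stats import mannwhitneyu

size_conventional = [3, 4, 3, 4, 3, 4, 3, 4, 3, 4]   # conventional VR condition (hypothetical)
size_tangible     = [5, 5, 4, 6, 5, 4, 5, 6, 5, 5]   # tangible-surface condition (hypothetical)

# One-sided test: are ratings higher when the tangible surface is present?
stat, p = mannwhitneyu(size_tangible, size_conventional, alternative="greater")
print(f"U = {stat:.1f}, p = {p:.4f}")
```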

Investigating how interaction with physical objects within virtual environments affects knowledge acquisition and recall
https://abarmpou.github.io/angelos/page/investigating-how-interaction-with-physical-objects-within-virtual-environments-affects-knowledge-acquisition-and-recall/ (13 Feb 2024)

The use of passive haptics in virtual reality environments has been shown to improve procedural learning across various application domains, such as first-responder training, kayaking, and others (Calandra et al. 2023; Barmpoutis et al. 2020). In this paper we go one step further and quantify the effect of passive haptics on knowledge acquisition and recall. We developed a specialized virtual reality application for learning various chemical compounds and their components. Participants engaged in activities that involved precise mixing and proportioning of chemical components to form targeted compounds (see Figure 1). Employing an A/B test framework, participants were randomly assigned to two identical virtual reality environments, differing only in the substitution of the VR controller with a physical jar.

Post-study surveys were administered to gauge user perceptions regarding interaction accuracy and realism, as well as their ability to recall acquired knowledge (specifically, the list of ingredients) from their virtual experience. This pilot study, conducted at the University of Florida Reality Lab, involved 12 subjects. Rigorous statistical analyses, including chi-square tests, were performed on the collected data, with detailed results outlined in this paper.
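As a hedged sketch of the kind of chi-square analysis mentioned above, the snippet below runs a chi-square test of independence on a hypothetical contingency table of recall outcomes per condition; the actual tables and counts from the study are not reproduced here.

```python
# Hypothetical contingency table; rows are conditions (physical jar, VR controller),
# columns are whether the ingredient list was recalled (yes, no).
from scipy.stats import chi2_contingency

observed = [[8, 4],
            [5, 7]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```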

Two key findings emerged from the study: (a) the presence of the physical jar significantly heightened perceived interaction accuracy, particularly in precise liquid pouring tasks, and (b) users exhibited a remarkable 33% improvement in knowledge recall when utilizing the physical jar as opposed to a conventional VR controller. These results establish a compelling, statistically significant correlation between the integration of passive haptic objects in VR and knowledge acquisition and recall. Furthermore, this study lays the groundwork for a larger-scale study in the future.

Reinscribing the 3rd dimension in epigraphic studies and transcending disciplinary boundaries
https://abarmpou.github.io/angelos/page/reinscribing-the-3rd-dimension-in-epigraphic-studies-and-transcending-disciplinary-boundaries/ (31 Dec 2023)

Over the past decade, archaeology and epigraphy have been reconsidering their modus operandi. Prompted and facilitated by technological advances, motivated by new research questions, and challenged by growing calls to engage with contemporary audiences, they have been experimenting with methodological approaches and interdisciplinary collaborations. Within this context, the Digital Epigraphy and Archaeology (DEA) project has been developing 3D digitization techniques that accommodate various types of artifacts, incorporating multidisciplinary approaches to achieve a more holistic stance towards the objects of study, and focusing on the reproducibility and accessibility of both its techniques and its 3D models.

This paper presents the DEA's introspective and reembodied ways of preserving and studying the past by reconsidering historical artifacts and their digital re-materialization. The following sections discuss the project's approach to copies and digital copies, 3D digitization and enhanced visualization processes, comprehensive cloud services, and 3D printing, presenting the DEA's steps toward facilitating and advancing archaeology and epigraphy. Through such approaches that combine traditional rigor with technological novelty and affordances, the team's vision is to popularize archaeology and epigraphy within and beyond academia and to pinpoint the significance of the world's heritage to new generations of students and the public.

Prostate Capsule Segmentation in Micro-Ultrasound Images Using Deep Neural Networks
https://abarmpou.github.io/angelos/page/prostate-capsule-segmentation-in-micro-ultrasound-images-using-deep-neural-networks/ (18 Apr 2023)

Prostate cancer is the most common internal malignancy among males. Micro-Ultrasound is a promising imaging modality for cancer identification and computer-assisted visualization. Identifying the prostate capsule area is essential in active surveillance monitoring and treatment planning. In this paper, we present a pilot study that assesses prostate capsule segmentation using the U-Net deep neural network framework. To the best of our knowledge, this is the first study on prostate capsule segmentation in Micro-Ultrasound images. For our study, we collected multi-frame volumes of Micro-Ultrasound images, and expert prostate cancer surgeons then annotated the capsule border manually. The lack of clear boundaries and the variation of shapes between patients make the task challenging, especially for novice Micro-Ultrasound operators. In total, 2,099 images were collected from 8 subjects, 1,296 of which were manually annotated and split into a training set (1,008), a validation set (112), and a test set from a different subject (176). The performance of the model was evaluated by calculating the Intersection over Union (IoU) between the manually annotated area of the capsule and the segmentation mask computed by the trained deep neural network. The results demonstrate high IoU values for the training set (95.05%), the validation set (93.18%), and the test set from a separate subject (85.14%). In 10-fold cross-validation, the IoU was 94.25% and the accuracy was 99%, validating the robustness of the model. Our pilot study demonstrates that deep neural networks can produce reliable segmentations of the prostate capsule in Micro-Ultrasound images and paves the way for the segmentation of other anatomical structures within the capsule, which will be the subject of our future studies.
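For reference, the Intersection over Union metric used above can be computed as in the minimal sketch below, assuming the predicted and annotated masks are binary NumPy arrays of equal shape; this is an illustration, not the study's evaluation code.

```python
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection over Union between a predicted and a manually annotated binary mask."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    return float(intersection) / float(union) if union > 0 else 1.0

# Small hypothetical example
a = np.array([[0, 1, 1], [0, 1, 0]])
b = np.array([[0, 1, 0], [0, 1, 1]])
print(f"IoU = {iou(a, b):.2f}")  # 0.50
```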

Developing Mini VR Game Engines as an Engaging Learning Method for Digital Arts & Sciences
https://abarmpou.github.io/angelos/page/developing-mini-vr-game-engines-as-an-engaging-learning-method-for-digital-arts-sciences/ (11 Mar 2023)

Digital Arts and Sciences curricula have been known for combining topics of emerging technologies and artistic creativity for the professional preparation of future technical artists and other creative media professionals. One of the key challenges in such an interdisciplinary curriculum is the instruction of complex technical concepts to an audience that lacks a prior computer science background. This paper discusses how developing small custom virtual and augmented reality game engines can become an effective and engaging method for teaching various fundamental technical topics from Digital Arts and Sciences curricula. Based on empirical evidence, we demonstrate examples that integrate concepts from geometry, linear algebra, and computer programming into 3D modeling, animation, and procedural art. The paper also introduces an open-source framework for implementing such a curriculum on Quest VR headsets, and we provide examples of small-scale focused exercises and learning activities.
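The snippet below illustrates the kind of small, focused exercise such a mini engine makes possible, composing a 4x4 rotation-and-translation matrix and applying it to a triangle's vertices; it is a generic example and not code from the paper's open-source framework.

```python
import numpy as np

def model_matrix(angle_deg: float, tx: float, ty: float, tz: float) -> np.ndarray:
    """4x4 transform: rotation about the y-axis followed by a translation."""
    a = np.radians(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[  c, 0.0,   s,  tx],
                     [0.0, 1.0, 0.0,  ty],
                     [ -s, 0.0,   c,  tz],
                     [0.0, 0.0, 0.0, 1.0]])

# Homogeneous vertices of a triangle (one vertex per column)
triangle = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [0.0, 0.0, 0.0],
                     [1.0, 1.0, 1.0]])

transformed = model_matrix(90.0, 0.0, 0.0, -2.0) @ triangle
print(np.round(transformed, 3))
```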

AI-driven Human Motion Classification and Analysis using Laban Movement System
https://abarmpou.github.io/angelos/page/ai-driven-human-motion-classification-and-analysis-using-laban-movement-system/ (14 Jul 2022)

Human movement classification and analysis are important in health sciences and arts research. Laban movement analysis is an effective method for annotating human movement in dance, describing communication and expression. Technology-supported human movement analysis employs motion sensors, infrared cameras, and other wearable devices to capture critical joints of the human skeleton and facial key points. However, these technologies are not mainstream, and the most popular form of motion capture is conventional video recording, usually from a single stationary camera. Such video recordings can be used to evaluate human movement or dance performance, and any method that can systematically analyze and annotate this raw video footage would be of great importance to the field. Therefore, this research offers an analysis and comparison of AI-based computer vision methods that can annotate human movement automatically. The study trained and compared four different machine learning algorithms (random forest, k-nearest neighbors, neural network, and decision tree) through supervised learning on existing video datasets of dance performances. The developed system was able to automatically produce annotations in the four dimensions of Laban movement analysis (effort, space, shape, and body). The results demonstrate that the produced annotations are accurate in comparison to manually entered ground-truth Laban annotations.
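A hedged sketch of the four-classifier comparison described above is given below using scikit-learn; the feature vectors (e.g., pose keypoints per frame window) and Laban labels are random placeholders, since the actual dataset and feature extraction are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X = np.random.rand(200, 34)          # hypothetical pose-keypoint features
y = np.random.randint(0, 4, 200)     # hypothetical labels for one Laban dimension

models = {
    "random forest":  RandomForestClassifier(n_estimators=100),
    "k neighbors":    KNeighborsClassifier(n_neighbors=5),
    "neural network": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
    "decision tree":  DecisionTreeClassifier(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```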

Virtual Kayaking: A study on the effect of low-cost passive haptics on the user experience while exercising
https://abarmpou.github.io/angelos/page/virtual-kayaking-a-study-on-the-effect-of-low-cost-passive-haptics-on-the-user-experience-while-exercising/ (16 Jul 2020)

This paper presents the results of a pilot study that assesses the effect of passive haptics on the user experience in virtual reality simulations of recreation and sports activities. A virtual reality kayaking environment with realistic physics simulation and water rendering was developed that allowed users to steer the kayak using natural motions. Within this environment the users experienced two different ways of paddling: a) a pair of typical virtual reality controllers, and b) a custom-made "smart paddle" that provided the passive haptic feedback of a real paddle. The results of this pilot study indicate that the users learned how to steer the kayak faster with the paddle, which they found more intuitive to use and more appropriate for this application. The results also demonstrated an increase in the perceived level of enjoyment and realism of the virtual experience.

Kayaking is an outdoor activity that can be enjoyed with easy motions and minimal skill, and it can be performed on equal terms by people who are physically able and those with disabilities [1]. For this reason, it is an ideal exercise for physical therapy, and its efficacy as a rehabilitation tool has been demonstrated in several studies [1-6]. Kayaking simulations offer a minimal-risk environment, which, in addition to rehabilitation, can be used in training and recreational applications [5]. The mechanics of boat simulation in general have been well studied and have led to the design of high-fidelity simulation systems in the past decades [3,7]. These simulators immerse the users by rendering a virtual environment on a projector [1,4,6] or a computer screen mounted on the simulator system [2,8]. Furthermore, the users can control the simulation by imitating kayaking motions using remote controls equipped with accelerometers (such as Wii controllers) [5] or by performing the same motions in front of a kinesthetic sensor (such as a Kinect sensor) [4,6].

Recent advances in virtual reality technologies, and in particular the availability of head-mounted displays as self-contained, low-cost consumer devices, have led to the development of virtual experiences that are far more immersive than conventional virtual reality setups with wall projectors and computer displays. Kayaking simulations have been published as commercial game titles on these virtual reality platforms [13]. However, the use of head-mounted displays in intensive physical therapy exercises bears the risk of serious injuries due to the lack of user contact with the real environment. These risks could potentially be reduced if the users maintained continuous contact with the surrounding objects, such as the simulator hardware, the paddle(s), and the floor of the room, with the use of passive haptics. Additionally, the overall user experience can be improved through sensory-rich interaction with the key components of the simulated environment.

This paper assesses the role of passive haptics in virtual kayaking applications. Passive haptics can be implemented in virtual reality systems by tracking objects of interest in real time and aligning them with identically shaped virtual objects, which results in a sensory-rich experience [9,10]. This alignment between real and virtual objects allows users to hold and feel the main objects of interaction including hand-held objects, tables, walls, and various tools [11,12].
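The sketch below shows the core of this alignment idea: each frame, the tracked pose of the real object is composed with a fixed calibration offset and applied to its virtual twin. The function and variable names, and the use of SciPy rotations, are assumptions for illustration rather than the implementation used in this work.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def virtual_pose(tracker_pos, tracker_quat, offset_pos, offset_quat):
    """Compose the tracker pose with a local offset (tracker mount -> virtual object origin)."""
    rot = R.from_quat(tracker_quat)                       # tracker orientation in world space
    pos = np.asarray(tracker_pos) + rot.apply(offset_pos)
    quat = (rot * R.from_quat(offset_quat)).as_quat()
    return pos, quat

# Example: tracker strapped to the paddle shaft, virtual paddle origin 0.3 m along the shaft
pos, quat = virtual_pose([1.0, 0.9, -0.5], [0, 0, 0, 1], [0.0, 0.0, 0.3], [0, 0, 0, 1])
print(pos, quat)
```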

In this paper we present a novel virtual reality kayaking application with passive haptic feedback on the key objects of interaction, namely the paddle and the kayak seat. These objects are tracked in real time with commercially available tracking sensors firmly attached to them. Although the users' real-world view is occluded by the head-mounted display, the users can see the virtual representations of these objects and naturally feel, hold, and interact with them. As a result, the users can perform natural maneuvers during the virtual kayaking experience by interacting with our "smart" paddle using the same range of motions as in real kayaking.

The proposed system was assessed with a pilot user study (n=10) that tested the following hypotheses: a) The use of passive haptics helps users learn kayaking faster and operate the simulation better compared to the conventional controller-based interaction. b) The use of passive haptics improves the level of immersion while kayaking in virtual reality.

The study was undertaken at the Realities Lab of the Digital Worlds Institute at the University of Florida. The volunteers who participated in this experiment were randomly assigned to the study group or the control group and experienced the proposed virtual kayaking system with or without passive haptics, respectively. Data collection was performed with pre- and post-test surveys. In addition, the progress of each individual user during kayaking was recorded, and the collected timestamps were analyzed.
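As a hedged example of how such timestamps might be compared between the two groups, the snippet below runs an independent-samples t-test on hypothetical times to complete a first steering task; the actual analysis and values from the study are not reproduced here.

```python
from scipy.stats import ttest_ind

# Hypothetical time (seconds) to complete the first steering task
time_paddle     = [42.1, 38.5, 45.0, 40.2, 37.8]   # passive-haptics (study) group
time_controller = [55.3, 60.1, 49.7, 58.4, 52.9]   # controller (control) group

t, p = ttest_ind(time_paddle, time_controller)
print(f"t = {t:.2f}, p = {p:.4f}")
```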

The results from this study are presented in detail and indicate that the use of passive haptics in this application has a statistically significant impact on the user experience, affecting the users' enjoyment and learning progress as well as the perceived level of realism of the virtual reality simulation.

Assessing the Role of Virtual Reality with Passive Haptics in Music Conductor Education: A Pilot Study
https://abarmpou.github.io/angelos/page/assessing-the-role-of-virtual-reality-with-passive-haptics-in-music-conductor-education-a-pilot-study/ (10 Jul 2020)

This paper presents a novel virtual reality system that offers immersive experiences for instrumental music conductor training. The system utilizes passive haptics that bring physical objects of interest, namely the baton and the music stand, into a virtual concert hall environment. Real-time object and finger tracking allows the users to behave naturally on a virtual stage without significant deviation from the typical performance routine of instrumental music conductors. The proposed system was tested in a pilot study (n=13) that assessed the role of passive haptics in virtual reality by comparing our proposed "smart baton" with a traditional virtual reality controller. Our findings indicate that the use of passive haptics increases the perceived level of realism and that the virtual appearance of the physical objects affects the perception of their physical characteristics.

The use of computer systems in instrumental music conductor education has been a well-studied topic even outside the area of virtual reality [1]. Several systems have been proposed that offer targeted learning experiences [2,3], which may also combine gamified elements [6]. In the past decades, several visual interfaces have been designed using the technologies available at each period of time [4,5,7], most recently including eye tracking [8] and augmented and virtual reality platforms [3].

Recent advances in real-time object tracking and the availability of such systems as mainstream consumer products have opened new possibilities for virtual reality applications [13,14]. It has been shown that the use of passive haptics in VR contributes to a sensory-rich experience [15,16], as users now have the opportunity to hold and feel the main objects of interaction within a given immersive environment, such as tools, handles, and other instruments. For example, tracking the location of a real piano can help beginners learn how to play it using virtual reality [20]. However, the use of passive haptics in virtual environments for music education is an understudied area, because it requires precise real-time tracking of objects that are significantly smaller than a piano, such as hand-held musical instruments, bows, batons, etc.

In this paper, we present a novel system for enhancing the training of novice instrumental music conductors through a tangible virtual environment. For the purposes of the proposed system, a smart baton and a smart music stand have been designed using commercially available tracking sensors (VIVE trackers). The users wear a high-fidelity virtual reality headset (HTC VIVE), which renders the environment of a virtual concert hall from the conductor's standpoint. Within this environment, the users can feel the key objects of interaction within their reach, namely the baton, the music stand, and the floor of the stage, through passive haptics. A real-time hand and finger motion tracking system continuously tracks the left hand of the user, in addition to the tracking of the baton, which is usually held in the right hand. This setup creates a natural user interface that allows the conductors to perform naturally on a virtual stage, thus creating a highly immersive training experience.

The main goals of the proposed system are the following: a) Enhance the traditional training of novice instrumental music conductors by increasing their practice time without requiring additional space allocation or time commitment from music players, which also makes the approach cost-effective. b) Provide an interface for natural user interaction that does not deviate from the traditional environment of conducting, including the setting, the tools, and the user behavior (hand gestures, head pose, and body posture), thus making the acquired skills highly transferable to the real-life scenario. c) Generate quantitative, just-in-time feedback on the timeliness of the conductor's body movement relative to the corresponding music signals, since such feedback is essential in any educational setting. d) Last but not least, recreate the conditions of a real stage performance, which may help the users reduce stage fright within a risk-free virtual environment [9,10,11,12].
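As a purely hypothetical sketch of goal (c), the snippet below compares timestamps of detected baton downbeats against the beat times of the music track and reports the signed offsets; beat detection itself and all values are assumed inputs, not part of the described system.

```python
import numpy as np

def timing_offsets(gesture_beats, music_beats):
    """For each music beat, the signed offset (seconds) to the nearest detected gesture beat."""
    gesture_beats = np.asarray(gesture_beats)
    music_beats = np.asarray(music_beats)
    nearest = np.abs(gesture_beats[None, :] - music_beats[:, None]).argmin(axis=1)
    return gesture_beats[nearest] - music_beats

offsets = timing_offsets([0.52, 1.48, 2.55, 3.47], [0.50, 1.50, 2.50, 3.50])
print(offsets, f"mean absolute offset = {np.abs(offsets).mean():.3f} s")
```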

A small-scale pilot study (n=13) was performed in order to assess the proposed system and, in particular, the role of passive haptics in this virtual reality application. The main focus of the study was to test whether the use of passive haptics increases the perceived level of realism in comparison to a typical virtual reality controller, and whether the virtual appearance of a real physical object, such as the baton, affects the perception of its physical characteristics. These hypotheses were tested using A/B tests followed by short surveys. The statistical significance of the collected data was calculated, and the results are discussed in detail. The reported findings support our hypotheses and set the basis for a larger-scale future study.

Discover DaVinci – A Gamified Blockchain Learning App
https://abarmpou.github.io/angelos/page/discover-davinci-a-gamified-blockchain-learning-app/ (3 May 2020)

Discover DaVinci is a novel augmented reality system that incorporates blockchain technology with experiential learning to engage participants in an interactive discovery of Leonardo da Vinci's oeuvre. In the true spirit of this "Renaissance man", Discover DaVinci explores new ideas and technologies "ahead of their time".

In order to illustrate the emerging potential at the intersection of art and blockchain, we present a case study of a new interactive system produced at the University of Florida Digital Worlds Institute.

The technologies of mobile computing, augmented reality (AR), and blockchain are starting to merge, creating new opportunities and scenarios for interacting with our environment. In AR we can look at virtual objects superimposed on a real environment and resize them, rotate them, and explore and interact with them on multiple levels. By combining AR and blockchain, we can create a system capable of keeping track of digital assets located virtually in 3D space (i.e., spatial computing). The global scale of blockchain and related technologies heightens the potential for trade and digital distribution, offering creators a fully automated and trusted way to keep track of their creations without a "middle-man".

Discover DaVinci is a novel educational tool that teaches concepts of blockchain technology through an augmented reality experiential learning game.

This project was developed in collaboration with several units from the University of Florida and industry partners:
• Digital Arts & Sciences Faculty (Computer Science and Digital Worlds Institute)
• Digital Worlds Studios’ Artists and Programmers
• Gator Blockchain Club (gatorblockchainclub.com) – Student-run blockchain club at the University of Florida
• Center for Innovation and Entrepreneurship (College of Business)
• Creative Campus Committee at the University of Florida
Industry Partners:
• DLUX, decentralized content network (dlux.io)
• Steem (steem.com), and Steemit (steemit.com)
• A-Frame, web VR platform (aframe.io)

Discover DaVinci utilizes the format of a digital, collectible trading and drafting card game with AR elements on the STEEM blockchain. Although each player "owns" their cards, all transactions are public. Every collectible card is a unique token owned by the player – a digital asset registered to the player's account. The aim is to draw new question cards daily, answer the questions about Leonardo da Vinci, collect the special AR invention cards, and ultimately submit the accumulated card collection into a drawing for prizes. The app was developed to honor the 500th anniversary of Leonardo da Vinci's death by promoting new and innovative technologies.
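The sketch below is a conceptual, in-memory model of a collectible card registered to a player's account; it does not use the STEEM blockchain API, where each card would instead be recorded as a public transaction, and all names are hypothetical.

```python
from dataclasses import dataclass, field
from uuid import uuid4

@dataclass
class CardToken:
    owner: str                       # player's account name
    card_type: str                   # e.g., "question" or "AR invention"
    token_id: str = field(default_factory=lambda: uuid4().hex)

@dataclass
class Player:
    account: str
    collection: list = field(default_factory=list)

    def draw_daily_card(self, card_type: str) -> CardToken:
        card = CardToken(owner=self.account, card_type=card_type)
        self.collection.append(card)
        return card

player = Player("alice")
player.draw_daily_card("question")
print(len(player.collection), player.collection[0].token_id[:8])
```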

Custom Virtual Reality System with Real-Time Therapist Interactions to Enhance Home Exercise Performance and Adherence
https://abarmpou.github.io/angelos/page/custom-virtual-reality-system-with-real-time-therapist-interactions-to-enhance-home-exercise-performance-and-adherence/ (12 Feb 2020)

Purpose/Hypothesis:
Following lower extremity (LE) joint replacement, patients are increasingly prescribed virtual reality-based home exercise programs (HEP). One goal of virtual reality (VR) use is to promote HEP adherence. Exercise adherence, as well as exercise performance, is increased with human interaction and real-time therapist feedback, which are not commonly incorporated in commercially available VR systems. To address these limitations, a custom VR system was developed using an infrared camera for motion tracking, avatar streaming, and real-time remote therapist interactions. The primary aim of this study was to evaluate the effect of this custom VR system on HEP performance in adults post LE joint replacement. We also examined patient and therapist opinions of the VR system's feedback features and its ability to improve HEP adherence.

Number of Subjects:
14 patients (11 female; 62.5±7.5 years) with unilateral hip (n=6) or knee (n=8) replacements (4.6±5.9 months post-surgery) and 11 therapists (6 PT, 4 OT, 1 COTA; female; >2 yrs experience) participated.

Materials/Methods:
Subjects completed two randomly ordered LE exercise conditions using either the custom VR system or a conventional HEP with diagrams and written instructions while therapists observed remotely via video streaming. Four standing exercises were performed (hip flexion, abduction, extension, squats). Instructions and verbal feedback were standardized, and 3-D LE motions were recorded. Exercise performance was assessed by calculating peak joint angles and movement velocities. The effect of remote therapist interaction and verbal feedback on exercise performance during the VR condition was assessed by calculating peak joint angles during aberrant, compensatory movements (i.e., trunk lean). Exercise performance during the two conditions was compared using paired t-tests. Patient and therapist preferences were assessed using standardized questionnaires with open-ended and Likert scale-based items.
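A hedged sketch of the outcome computation described above is shown below: peak joint angle and peak angular velocity from one recorded joint-angle time series, followed by a paired t-test on hypothetical per-subject peaks for the two conditions. All values are placeholders and not data from the study.

```python
import numpy as np
from scipy.stats import ttest_rel

def peak_angle_and_velocity(angles_deg, fs_hz):
    """Peak joint angle (deg) and peak angular velocity (deg/s) for one repetition."""
    angles = np.asarray(angles_deg, dtype=float)
    velocity = np.gradient(angles) * fs_hz        # finite-difference derivative
    return angles.max(), np.abs(velocity).max()

# Hypothetical per-subject peak hip-flexion angles (deg) under each condition
peaks_vr           = [92.1, 88.4, 95.0, 90.3, 87.6]
peaks_conventional = [91.5, 89.0, 94.2, 91.1, 88.0]

t, p = ttest_rel(peaks_vr, peaks_conventional)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```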

Results:
Peak joint angles during the two conditions were not different (p>.05), but movements were slower with VR use for 3 of 4 exercises (p<.05), and compensations were reduced with remote therapist interactions and verbal feedback. All patient and therapist participants (100%) reported preferences for remote interactions, including verbal feedback and interactions with streaming avatars that display real-time movements. 79% of patients and 91% of therapists agreed that the VR system could improve HEP adherence.

Conclusion:
A custom VR system that incorporates real-time remote therapist interactions improved HEP performance in individuals post LE joint replacement. Both patients and therapists reported high preferences for real-time interactions.

Clinical Relevance:
VR systems should consider the role of real-time therapist interactions to promote engagement and adherence to HEPs, as well as provide opportunities for feedback to enhance exercise performance. Further, web-based systems can allow for multi-user group exercise sessions and engagement for those in rural areas.
