Presentation given by Mark Billinghurst at the 2024 XR Spring Summer School on March 7th, 2024. This lecture covers different evaluation methods that can be used for Social XR/AR/VR experiences.
5. Typical Research Questions
• Is collaboration with AR/VR better than video conferencing?
• What is the impact of a particular input method?
• How should people be represented in Social XR interfaces?
• What communication cues can be added to improve collaboration?
• How can you effectively collaborate in hybrid interfaces?
• And more….
6. ISMAR Paper Trends
• ISMAR papers surveyed from 2008 – 2017
• Collaboration identified as a new trend
• Only 9/526 papers (1.7%) addressed collaboration
Kim, K., Billinghurst, M., Bruder, G., Duh, H. B. L., &
Welch, G. F. (2018). Revisiting trends in augmented
reality research: A review of the 2nd decade of ISMAR
(2008–2017). IEEE transactions on visualization and
computer graphics, 24(11), 2947-2962.
7. AR User Studies
• Key findings
• < 10% of all AR papers include a user study
• Few collaborative user studies
• 12/291 user study papers < 5%
• Less than half used HMD
• Most studies in lab/indoor
• 1/15 studies outdoor, 3/15 field studies
Dey, A., Billinghurst, M., Lindeman, R. W., &
Swan, J. (2018). A systematic review of 10 years
of augmented reality usability studies: 2005 to
2014. Frontiers in Robotics and AI, 5, 37.
11. Existing AR Collaborative Studies
• Many papers use a combination of subjective and objective measures
• Typically have a small number of subjects
• Typically fewer than 20, mostly university students
• Most involve pairs of users
• Less than half of the studies use HMDs
• Split between HMDs and HHDs
• Most experiments in controlled environments
• Lack of experimentation in real-world conditions; few heuristic evaluations or pilot studies
• Most evaluation is in a remote collaboration setting
• 30% in face-to-face collaboration
12. Opportunities
• Need for increased user studies in collaboration
• More use of field studies, natural user experiences
• Need a wider range of evaluation methods
• Use a more diverse selection of participants
• Increase number of participants
• More user studies conducted outdoors are needed
• Report participant demographics, study design, and experimental task
13. Example: Collocated Communication Behaviours
• Is there a difference between AR-based & screen-based FtF collaboration?
• Hypothesis: FtF AR produces similar behaviours to FtF non-AR
Billinghurst, M., Belcher, D., Gupta, A., & Kiyokawa, K. (2003). Communication behaviors in colocated
collaborative AR interfaces. International Journal of Human-Computer Interaction, 16(3), 395-423.
14. Experiment Design
• Building arranging task
• Both people have half the requirements
• Conditions
• Face to Face – FtF with real buildings
• Projection – FtF with screen projection
• Augmented Reality – FtF with AR buildings
Face to Face Projection Augmented Reality
15. Measures
• Objective
• Performance time
• Communication Process Measures
• The number and type of gestures made
• The number of deictic phrases spoken
• The average number of words per phrase
• The number of speaker turns
• Subjective
• Subjective survey
• User comments
• Post experiment interview
How well could you work with your partner?
(1 = not very well, 5 = very well)
How easy was it to move the virtual objects?
(1 = not very easy, 5 = very easy)
What is that? (pointing)
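The communication process measures listed above can be computed from coded transcripts. A minimal sketch, where the toy transcript and the deictic word list are illustrative assumptions rather than the coding scheme used in the study:

```python
import re

# Toy transcript of one pair: (speaker, utterance).
transcript = [
    ("A", "Put that one over there"),
    ("B", "Which one?"),
    ("A", "This tall building next to here"),
]

# Illustrative deictic word list (an assumption, not the study's scheme).
DEICTIC = {"this", "that", "these", "those", "here", "there"}

def words(utterance):
    """Lowercase word tokens of one utterance."""
    return re.findall(r"[a-z']+", utterance.lower())

phrases = [words(u) for _, u in transcript]

# Number of phrases containing at least one deictic word.
deictic_phrases = sum(1 for p in phrases if DEICTIC & set(p))
# Average number of words per phrase.
avg_words = sum(len(p) for p in phrases) / len(phrases)
# Number of speaker turns (runs of consecutive utterances by one speaker).
speaker_turns = sum(1 for i in range(1, len(transcript))
                    if transcript[i][0] != transcript[i - 1][0]) + 1

print(deictic_phrases, round(avg_words, 2), speaker_turns)  # 2 4.33 3
```

The same counting approach extends to gesture coding if gestures are logged as typed events per utterance.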
16. Results
• Performance time
• Sig. diff. between conditions – AR slowest
• Communication measures
• No difference in number of words/turns
• Sig. Diff. in deictic phrases (FtF same as AR)
• Sig. Diff. in pick gestures (FtF same as AR)
• Subjective measures
• FtF manipulation same as AR
• FtF rated easier to work with than AR/Projection
Percentage Breakdown of Gestures
Subject Survey Results
17. Lessons Learned
• Positive Lessons
• Communication process measures valuable
• Gesture, speech analysis
• Collect user feedback/interviews
• Lessons for Improvement
• Stronger statistical analysis
• Make observations
• Fewer mistakes
• Surveys could be stronger
• Validated surveys
• Better interview analysis
• Thematic analysis
“AR’s biggest limit was lack of peripheral
vision. The interaction physically (trading
buildings back and forth) as well as spatial
movement was natural, it was just a little
difficult to see.
By contrast in the Projection condition you
could see everything beautifully but
interaction was tough because the interface
didn’t feel instinctive.”
“working solo together”.
18. Example 2: Virtual Communication Cues (2019)
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
20. Conditions
• Baseline: In the Baseline condition, we showed only the head and hands of the
collaborator in the scene. The head and hands were presented in all conditions
• Field-of-view (FoV): We showed the FoV frustum of each collaborator to the
other. This enabled collaborators to understand roughly where their partner was
looking and how much area the other person could see at any point in time.
• Head-gaze (FoV + Head-gaze ray): FoV frustum plus a ray originating from the user's head to identify the center of the FoV, providing a more precise indication of where the other collaborator was looking.
• Eye-gaze (FoV + Eye-gaze ray): A ray originating from the user's eye to show exactly where the user was looking.
21. Task
• Search task
• Find specific blocks together
• Two phases:
• Object identification
• Object placement
• Designed to force collaboration
• Each person seeing different information
• Within-subject Design
• Everyone experiences all conditions
23. Measures
• Performance (Objective)
• Rate of Mutual Gaze
• Task completion time
• Observed (Objective)
• Number of hand gestures
• Physical movement
• Distance between collaborators
• Subjective
• Usability Survey (SUS)
• Social Presence Survey
• Interview
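The objective measures above could be derived from logged tracking data. A minimal sketch, assuming a hypothetical per-frame log format holding both users' head positions and the ID of the object each user's gaze ray currently hits:

```python
import math

# Hypothetical per-frame log (format is an assumption for illustration).
frames = [
    {"head_a": (0.0, 1.6, 0.0), "head_b": (1.0, 1.6, 1.0),
     "gaze_a": "block_3", "gaze_b": "block_3"},
    {"head_a": (0.1, 1.6, 0.0), "head_b": (1.0, 1.6, 0.9),
     "gaze_a": "block_3", "gaze_b": "block_7"},
]

def mutual_gaze_rate(frames):
    """Fraction of frames in which both users' gaze hits the same object."""
    hits = sum(1 for f in frames
               if f["gaze_a"] is not None and f["gaze_a"] == f["gaze_b"])
    return hits / len(frames)

def mean_distance(frames):
    """Average Euclidean distance between the two users' head positions."""
    dists = [math.dist(f["head_a"], f["head_b"]) for f in frames]
    return sum(dists) / len(dists)

print(mutual_gaze_rate(frames))  # 0.5 for the two frames above
```

Physical movement per user could be accumulated the same way, by summing `math.dist` between consecutive head positions.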
24. Data Collected
• Participants
• 16 pairs = 32 people
• 9 women
• Aged 20 – 55, average 31 years
• Experience
• No experience with VR (6), no experience with AR (10), never used an HMD (7)
• Data collection
• Objective
• 4 (conditions) × 8 (trials per condition) × 16 pairs = 512 data points
• Subjective
• 4 (conditions) × 32 (participants) = 128 data points.
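The data-point counts follow directly from the within-subject design; a quick arithmetic check:

```python
# Study design parameters from the slide.
conditions = 4
trials_per_condition = 8
pairs = 16
participants = pairs * 2  # 32 people

# One objective data point per (condition, trial, pair).
objective_points = conditions * trials_per_condition * pairs
# One subjective data point per (condition, participant).
subjective_points = conditions * participants

print(objective_points, subjective_points)  # 512 128
```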
27. Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze rated greater ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
28. Example 3: Scaling Up (2020)
• IEEE VR 2020
• Large scale virtual conference
• 1965 attendees
Ahn, S. J., Levy, L., Eden, A., Won, A. S., MacIntyre,
B., & Johnsen, K. (2021). IEEEVR2020: Exploring
the first steps toward standalone virtual
conferences. Frontiers in Virtual Reality, 2, 648575.
29. Tools Used
• Mozilla Hubs
• 3D social VR
• Twitch
• Streaming
• Slack
• Text messaging
• Social Network
• Text-based
30. Analysis
•Subjective Survey
• Demographics
• Likert scale questions
• Conference effectiveness
• Media appropriateness
• Social Presence
• Open ended responses
• Thematic analysis
• Observation
• User behaviour
33. Thematic Analysis
• Look for common themes in the text from the open-ended questions
• Themes observed
• Fun and Playful Connections and Conversations
• Split Views on Posters in Hubs
• New Ways to Attend Conference Talks in Hubs
• Infrastructure Challenges
“The BOFs were super
enjoyable and a real hit for
learning and networking.”
“It was intimidating that there
were so few other people
there. Most often it was just
me and the presenter.”
“I think the experience would have
been vastly better with a better
connection”
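Once open-ended responses are coded, theme frequencies reduce to a tally. A minimal sketch with hypothetical theme codes (the codes and tags below are illustrative, not the study's actual coding):

```python
from collections import Counter

# Hypothetical codes assigned to open-ended responses during thematic
# analysis; each response may be tagged with more than one theme.
coded_responses = [
    ["fun_connections"],
    ["talks_in_hubs", "infrastructure"],
    ["infrastructure"],
    ["posters_split_views", "fun_connections"],
]

# Flatten the tags and count how often each theme occurs.
theme_counts = Counter(code for response in coded_responses for code in response)
for theme, n in theme_counts.most_common():
    print(theme, n)
```

In practice the coding itself is done by human analysts; the tally only summarizes how prevalent each identified theme is.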
34. Field Observations
• Process
• Moving between rooms
• Short interviews
• Observe and code behaviors
• Observation styles
• Broad observation – observe whole room
• Spotlight - focus on one participant for 10 minutes
• Categories of Behavior
• spatial (how attendees interacted in room), interactions (attendees interacted with each other)
• harassment (toxic interactions), communication (how attendees talked about their experience)
35. Field Observations
• Spatial navigation issues
• Difficulty in navigating space and interacting with each other
• Need to remove HMD to use keyboard
• Evolving interactions over time
• Learning interaction methods over time
• HMD use dropped by end of conference
• Limitations of social interactions
• Most users moving to less social platforms (Twitch)
• Audio issues – being heard anywhere
• Democratization of Academic Conferences
• Increased diversity and removal of status
• Significantly increased participation
36. Example 4: More Detail (2022)
• Evaluating large scale social VR
• Using wider range of measures
Moreira, C., Simões, F. P., Lee, M. J., Zorzal, E. R.,
Lindeman, R. W., Pereira, J. M., ... & Jorge, J.
(2022). Toward VR in VR: Assessing Engagement
and Social Interaction in a Virtual Conference. IEEE
Access, 11, 1906-1922.
37. IEEE VR 2021
• Fully online virtual conference – 1200+ attendees
• Tools
• Virbela 3D platform – virtual avatars, desktop or HMD viewing
• Discord for chat/messaging
• Twitch/YouTube for video streaming
46. Key Lessons Learned
• There is a need for more Social XR evaluation studies
• Use a variety of subjective and objective measures
• Focus on the communication measures, not performance
• There are opportunities for new evaluation methods
• Adapt the tools to the number of participants
47. New Tools
• New types of sensors
• EEG, ECG, GSR, etc.
• Sensors integrated into AR/VR systems
• Integrated into HMDs
• Data processing and capture tools
• iMotions, etc
• AR/VR Analytics tools
• Cognitive3D, etc
48. Sensor Enhanced VR HMDs
• HP Omnicept – eye tracking, heart rate, pupillometry, and face camera
• Project Galea – EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
49. Multiple Physiological Sensors into HMD
• Incorporate range of sensors on HMD faceplate and over head
• EMG – muscle movement
• EOG – Eye movement
• EEG – Brain activity
• EDA, PPG – Heart rate
51. Cognitive3D
• Data capture and analytics for VR
• Multiple sensor inputs (eye tracking, HR, EEG, body movement, etc.)
53. Moving Beyond Questionnaires
• Move data capture from post experiment to during experiment
• Move from performance measures to process measures
• Richer types of data captured
• Physiological Cues
• EEG, GSR, EMG, Heart rate, etc.
• Richer Behavioural Cues
• Body motion, user positioning, etc.
• Higher level understanding
• Map data to Emotion recognition, Cognitive load, etc.
• Use better analysis tools
• Video analysis, conversation analysis, multi-modal analysis, etc.
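As one example of turning a raw physiological stream into a process measure, here is a minimal sketch that counts skin conductance responses (SCRs) in a GSR trace by finding upward threshold crossings. The signal values and the threshold are illustrative assumptions, not a validated analysis method:

```python
# Toy GSR samples (microsiemens); values are illustrative only.
gsr = [2.00, 2.01, 2.02, 2.30, 2.45, 2.40, 2.35, 2.36, 2.70, 2.60, 2.55]
THRESHOLD = 0.05  # minimum rise per sample to count as the start of an SCR

def count_scrs(signal, threshold):
    """Count onsets of rising runs in the successive-sample differences."""
    rising = [b - a > threshold for a, b in zip(signal, signal[1:])]
    # A response starts where the signal begins rising after not rising.
    return sum(1 for i, r in enumerate(rising)
               if r and (i == 0 or not rising[i - 1]))

print(count_scrs(gsr, THRESHOLD))  # 2
```

Real pipelines (e.g. the commercial tools named above) add filtering, tonic/phasic decomposition, and artifact rejection; the point here is only that continuous in-experiment signals yield event-level measures that questionnaires cannot.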
54. Research Opportunities
• Types of Studies
• Need for increased user studies in collaboration
• More use of field studies, natural user experiences
• Use a more diverse selection of participants
• Evaluation measures
• Need a wider range of evaluation methods
• Establish correlations between objective and subjective measures
• Better tools
• New types of physiological sensors
• Develop new analytics