UNIVERSITÀ DEGLI STUDI DI TRIESTE
Dipartimento di Ingegneria e Architettura
Laurea Magistrale in Ingegneria Informatica
Design and implementation of a Virtual Reality
application for Computational Fluid Dynamics
December 2017
Candidate: Lorenzo D'Eri
Supervisor: Prof. Francesco Fabris
Academic Year 2016/2017
”The most likely way for the world to be destroyed, most experts agree, is by accident.
That’s where we come in; we’re computer engineers. We cause accidents.”
- Nathaniel Borenstein, co-creator of MIME
Abstract (Italian)
The project discussed in this thesis consists of the design and development of an application that allows visualization, interaction and manipulation of Computational Fluid Dynamics data in Virtual Reality.
The first part of the thesis outlines the state of the art of Virtual Reality applications in the scientific domain. The research carried out shows that, although there are examples of applications that use Virtual Reality for data visualization, especially in the medical and military fields, none of them offers a complete, self-contained solution. This part of the thesis also provides an overview of the software and hardware technologies that were used during the development discussed in this work.
The thesis then describes the two artifacts that constitute the output of the project: on one hand, a Virtual Reality application built on the Unity3D graphics and physics engine; on the other, a further developed version of ParaUnity, a pre-existing plug-in for the CFD visualization software ParaView. The final system composed of these two artifacts makes it possible to export computational fluid dynamics data from ParaView and import it into a Virtual Reality environment, in which several forms of interaction and manipulation have been implemented.
In conclusion, the thesis highlights the design choices made in shaping the applications. It is shown how these choices led to a general, flexible and easily extensible solution: qualities of great importance in view of further development of the functionality that the resulting software system can offer to the user.
Abstract
The project discussed in this thesis consists of the development of a working prototype that allows visualization, interaction and manipulation of Computational Fluid Dynamics datasets in a 3D Virtual Reality environment.
In the first part of this thesis, background information about the state of the art
of Virtual Reality applications for scientific purposes is presented. Research shows
that, although there are ongoing development efforts, especially in the military
and medical fields, no full-fledged solution is currently available on the market.
Moreover, the first part also outlines the main software and hardware technologies
that have been used in the development phase of the project. The thesis then pro-
ceeds to describe the two pieces of software that make up the output of the project:
a Virtual Reality application built upon the Unity 3D graphics and physics engine,
and an improved version of a pre-existing plug-in for the well-known CFD visual-
ization software ParaView, called ParaUnity. The final system in which these two
applications are included is effectively able to export CFD datasets from ParaView
and import them in a Virtual Reality environment, in which various forms of inter-
action and manipulation have been implemented.
In conclusion, the thesis underlines the design choices made in engineering the applications, which led to a general, flexible and easily expandable solution; this is of utmost importance in view of further development of the features and capabilities that the resulting software system can offer to the user.
Acknowledgements
To Cri, who has always been there for me.
To my family: my mother Sabina, my father Angelo and my brother Giorgio, who
have supported me in any possible way.
To Fede, not only a friend but also the best late-night software engineering consultant
I could have hoped for.
To my friends I Cazzilli: Seb, Ste and Fede, who probably spent this year training
at gnagno so that I now have no chance of winning against them.
To Simo, who willingly spent his study breaks wearing blond wigs and yellow shirts
with me in front of hundreds of thousands of people.
To Ventura and R´emi, for sharing not only the journey, but also countless cups of
coffee and an embarrassing amount of French cheese.
To Antonis, for believing in us and in our project even in times of crisis.
To prof. Fabris, whose passion for teaching should serve as an example for every
teacher and student.
To myself, for never giving up on my daily dose of Italian pasta.
Contents
Abstract (Italian) ii
Abstract iii
Acknowledgements iv
List of Figures ix
List of Tables x
Abbreviations xi
1 Introduction 1
1.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Project Management . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.2.1 Time management . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.2 Versioning and productivity tools . . . . . . . . . . . . . . . . 6
1.2.2.1 Github . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.2.2.2 Waffle . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2 Background 8
2.1 Literature Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.1.1 Virtual Reality applications . . . . . . . . . . . . . . . . . . . 9
2.1.2 Virtual Reality for scientific data . . . . . . . . . . . . . . . . 10
2.2 Technologies Used . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.1 HTC Vive . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.1.1 HTC Vive software . . . . . . . . . . . . . . . . . . . 11
2.2.2 ParaView and VTK . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2.2.1 Virtual Reality Capabilities in ParaView . . . . . . . 14
2.2.3 Unity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.2.3.1 Object behaviors in Unity . . . . . . . . . . . . . . . 15
2.2.3.2 Virtual Reality Capabilities in Unity . . . . . . . . . 17
2.2.4 ParaUnity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3 Unity Application 19
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.1.1 Why Unity? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2 Application Architecture . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.2.1 Top-level Managers . . . . . . . . . . . . . . . . . . . . . . . . 22
3.3 Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.3.1 Controllers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.3.2 Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.3.2.1 Lighting . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.3.3 ParaView object loading . . . . . . . . . . . . . . . . . . . . . 29
3.3.3.1 3D model creation . . . . . . . . . . . . . . . . . . . 30
3.3.3.2 3D model material . . . . . . . . . . . . . . . . . . . 31
3.3.4 Static Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
3.3.5 Radial Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
3.3.6 Object interaction . . . . . . . . . . . . . . . . . . . . . . . . 36
3.3.6.1 Bounding box . . . . . . . . . . . . . . . . . . . . . . 38
3.3.7 Resizing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
3.3.8 Animation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.3.9 Slicing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.3.10 Logger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
4 ParaUnity 49
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
4.2 Initial state . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
4.2.1 Plug-in implementation . . . . . . . . . . . . . . . . . . . . . . 52
4.2.1.1 TCP Socket . . . . . . . . . . . . . . . . . . . . . . . 52
4.2.1.2 Exporting the objects . . . . . . . . . . . . . . . . . 53
4.2.2 Initial communication protocol . . . . . . . . . . . . . . . . . 54
4.3 Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.3.1 From Disk I/O to RAM I/O . . . . . . . . . . . . . . . . . . . 55
4.3.1.1 Performance Results . . . . . . . . . . . . . . . . . . 58
4.3.2 Asynchronous communication for time-dependent data . . . . 59
4.3.2.1 Performance Results . . . . . . . . . . . . . . . . . . 60
5 Conclusions 62
5.1 Final system architecture . . . . . . . . . . . . . . . . . . . . . . . . . 62
5.2 Objectives achieved . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
5.3 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
A Setup Instructions 67
A.1 Building ParaView and ParaUnity . . . . . . . . . . . . . . . . . . . . 67
A.1.1 Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
A.1.2 Obtain the source code . . . . . . . . . . . . . . . . . . . . . . 68
A.1.3 Compile Qt . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
A.1.4 Compile ParaView . . . . . . . . . . . . . . . . . . . . . . . . 68
A.1.5 Compile ParaUnity . . . . . . . . . . . . . . . . . . . . . . . . 69
A.1.6 Loading the plug-in in ParaView . . . . . . . . . . . . . . . . 70
A.2 Building the Unity Application . . . . . . . . . . . . . . . . . . . . . 70
A.2.1 Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
A.2.2 Obtain the source code . . . . . . . . . . . . . . . . . . . . . . 70
A.2.3 Compile the application . . . . . . . . . . . . . . . . . . . . . 70
A.2.4 Exporting an object from ParaView to Unity . . . . . . . . . . 71
B Code of the Unity Application 72
B.1 AnimationManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
B.2 BoundingBoxBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 75
B.3 ControllerBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
B.4 ControllersManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
B.5 EnvironmentManager . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
B.6 FloorBehavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
B.7 FunctionsExtensions . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
B.8 Globals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
B.9 HeadsetManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
B.10 Interactable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
B.11 LightManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
B.12 LogManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
B.13 ModeManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
B.14 ParaviewObjectLoader . . . . . . . . . . . . . . . . . . . . . . . . . . 95
B.15 PointerRadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . . 99
B.16 RadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 100
B.17 SizeManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
B.18 SlicingManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
B.19 SlicingPlaneBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 106
B.20 StaticMenuManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
B.21 UIPointerRadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . 111
B.22 XDocumentLoader . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
C Code of ParaUnity 115
C.1 Unity3D.h . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
C.2 Unity3D.cpp . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Bibliography 130
List of Figures
1.1 Gantt chart of the project . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2 GitHub repository for the Unity Application . . . . . . . . . . . . . . 7
1.3 Waffle board for the Unity Application . . . . . . . . . . . . . . . . . 7
2.1 HTC Vive kit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2 HTC Vive controller buttons . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 Example of Unity’s interface . . . . . . . . . . . . . . . . . . . . . . . 16
3.1 Screenshot of the resulting Unity Application . . . . . . . . . . . . . 20
3.2 Architectural diagram of the main Unity scene . . . . . . . . . . . . . 24
3.3 Screenshot of the two controllers in action . . . . . . . . . . . . . . . 27
3.4 Screenshot of the environment of the application . . . . . . . . . . . . 28
3.5 Screenshot of a mesh colored according to its values . . . . . . . . . . 32
3.6 Screenshot of the opened static menu . . . . . . . . . . . . . . . . . . 34
3.7 Screenshot of the opened radial menu . . . . . . . . . . . . . . . . . . 36
3.8 Screenshot of the bounding box . . . . . . . . . . . . . . . . . . . . . 39
3.9 Screenshots of four different frames of an ongoing animation . . . . . 44
3.10 Screenshot of the mock implementation of the slicing plane . . . . . . 46
3.11 Screenshot of a message being logged through the logger . . . . . . . 47
4.1 ParaUnity buttons on ParaView’s GUI . . . . . . . . . . . . . . . . . 50
5.1 Final system architecture diagram . . . . . . . . . . . . . . . . . . . . 63
List of Tables
2.1 ParaView-VTK Architecture (simplified) . . . . . . . . . . . . . . . . 14
4.1 Performance comparison between disk I/O and RAM I/O . . . . . . . 59
4.2 Performance comparison for the improved communication protocol. . 61
Abbreviations
CFD Computational Fluid Dynamics
VR Virtual Reality
AIRC Aerospace Integration Research Centre
SATM School of Aerospace, Transport and Manufacturing
HMD Head Mounted Display
API Application Programming Interface
OS Operating System
VRTK Virtual Reality ToolKit
GUI Graphical User Interface
OOP Object Oriented Programming
OOD Object Oriented Design
UI User Interface
TCP Transmission Control Protocol
RAM Random Access Memory
XML Extensible Markup Language
UML Unified Modeling Language
Chapter 1
Introduction
The work presented in this thesis consists of the development of a Virtual Real-
ity application for visualizing and interacting with Computational Fluid Dynamics
datasets.
With Virtual Reality technologies improving rapidly, particularly driven by the gam-
ing and entertainment market, industries have begun to research possible applica-
tions across various fields, such as medicine, learning and engineering. Chapter 2
will present the current state of the art regarding Virtual Reality applications and
will provide examples of the research being done.
One field for which Virtual Reality technologies offer some promising possibilities
is Computational Fluid Dynamics. In fact, given that CFD datasets are usually
large, complex, three-dimensional and often time-dependent, it is evident that an
immersive and interactive experience in Virtual Reality could provide a new, more
efficient way to visualize and interact with such data. This is among the reasons why
Cranfield University’s Aerospace Integration Research Centre (part of the School of
Aerospace, Transport and Manufacturing) has founded a Virtual Reality R&D
division. The work described in this thesis is one of the first three projects of this
division.
The project itself consisted of the development of an application that could load
CFD data, present them to the user in a virtual environment and provide some
degrees of interaction and manipulation. As described in Chapter 3, the development
platform of choice was Unity. Such application would receive the CFD objects from
ParaView and VTK (both presented, along with Unity, in Chapter 2), and then act
independently as a stand-alone application.
The Virtual Reality kit for the development is an HTC Vive (further detailed in
Section 2.2.1), which consists of a head-mounted display, two haptic controllers and
two cameras for three-dimensional tracking.
As explained in Chapter 4, a plugin for ParaView called ParaUnity was used (and
further developed) to enable the communication between the two software systems.
The importance of this lies in the fact that the user does not need to know about
Unity to run the application, as they are presented with the same familiar interface
of ParaView with the addition of an export button.
In Chapter 5 the final state of the overall system is presented. As one can see, it
consists of a complete, self-contained software system that provides the user with the
possibility of interacting with ParaView objects in a virtual environment. Moreover,
a summary of the obtained results is provided and some ideas for future development
are proposed to the reader.
1.1 Objectives
Being a rather open-ended project, i.e. a project in which there is no strict and well-
defined set of software requirement specifications, the objectives of the development
have been purposefully kept broad and general, so as to reflect the idea that the project
could follow an exploratory approach.
Nonetheless, there are still some guidelines that have been followed from the begin-
ning to the end of the project:
• The project shall result in a working prototype of a Virtual Reality application.
• The application shall allow the handling of CFD data; in particular, it shall
provide:
– visualization of the data,
– interaction with the data,
– some basic forms of manipulation of the data.
• The application shall allow the import of data from ParaView.
• The application shall run at least on Windows (version 7 or later),
and optionally on Linux.
• The application shall support a HTC Vive kit.
• The code should be designed to be maintainable, flexible and expandable.
• The application should be easy to use, as it is aimed at CFD scientists with
little to no prior VR experience.
In Section 5.2 these objectives will be discussed in light of the work done.
1.2 Project Management
If on one hand the open-ended nature of the project can foster creativity and inno-
vation, given that the developer has much more control over the whole process from
design to deployment, on the other hand it requires careful planning to avoid failures and to
maximize efficiency.
It is, in fact, the job of a Software Engineer to not only design the code, but to also
take all the necessary steps to organize the work and prepare all the tools that he or
she may need.
1.2.1 Time management
The project spans several months, from March to early September. To organize it,
a number of high-level tasks was defined. These were then further divided into
units of work, each given a duration and a time allocation. The result of this planning can be seen in the Gantt
chart in Figure 1.1.
The chart clearly shows the division of the project into phases. Initially, there was
a preparatory research phase in which the existing literature was reviewed and
documented, and time was also dedicated to becoming familiar with ParaView
and VTK. This was followed by a second phase of environment setup, with the
objective of preparing the deployment machine with all the necessary software and
setting up the development environment for an optimal work-flow. Part of this phase
was also spent exploring compatibility with a Linux operating system.
These two initial phases have been a collaborative effort with the other two MSc
students working on the same topic, and therefore required careful job division and
collaboration to ensure better time efficiency.
The third phase marked the start of the actual individual development of the Unity
application. This lasted from late May to late July, when the work had to be finalized in
time for the Technical Presentation. The time planning shows that the development
followed a feature-based approach, in which each work unit consists of a feature as
perceived by the user. The most critical point of this phase was the integration with
ParaUnity (the development of which proceeded more or less in parallel with the
Unity Application). In fact, this required having both pieces of software at a very
specific point of their development, and would be a necessary prerequisite for any
further feature. Both the development of the Unity Application and of ParaUnity
were followed by careful validation testing.
The next phase began after the Technical Presentation, and was completely dedi-
cated to refactoring the code, ensuring higher flexibility and maintainability. This
was then followed by the writing of the thesis and the rest of the documentation,
in time for the thesis submission deadline in mid-August.
[Gantt chart spanning March to early September 2017. Task groups and their tasks: Preparatory Research (Literature Review; ParaView training; VTK documentation study; Lit. Review submission); Environment Setup (Cross-platform testing; Configure, build and install; Manual writing); Unity Application (Unity training; Scene setup; Static menu; Radial menu; ParaUnity integration; Object interaction; Scaling; Animation; Materials; Tests); ParaUnity (Configure, build and install; Unity integration; RAM export; Communication protocol; HTC Vive support; Tests); Technical Presentation; Code refactoring (Class redefinition; Generalization; Regression tests); Documentation (Code commenting; Thesis redaction; Revision and correction); Thesis submission; Poster presentation.]
Figure 1.1: Gantt chart of the project
1.2.2 Versioning and productivity tools
As with any project of this size, and especially one following a prototyping work-
flow, some productivity tools were required to ensure organization and efficiency
during all phases of the development. Among these, GitHub and Waffle have proved
particularly useful in managing some aspects of the project.
1.2.2.1 Github
The codebase for the project has been hosted on GitHub, a web-based version control
platform built on Git. The reasons for this choice were multiple: on one hand,
Git’s versioning system gives the advantage of having fine-grained incremental back-
ups, accessible from any machine, with the version history being fully explorable via
GitHub’s web interface. This, along with the merging capabilities that Git natively
provides, resulted in a smooth initial phase of collaborative development with the
two other students working on similar projects. Moreover, thanks to Git’s branch-
ing system the project benefited from a more organized development, in which each
feature was independently developed on one branch.
As an example, the home page of the GitHub repository for the Unity Application
is shown in Figure 1.2.
1.2.2.2 Waffle
Waffle is an online automated project management tool that interfaces with GitHub
repositories, handling GitHub issues and pull requests. Its main feature is the ability
to create a board-like view for every issue in the repository, thus providing the
developer with an interactive to-do list that interfaces directly with GitHub.
During the development, it has been used extensively as a productivity tool, mainly
to keep track of the features and the bugs that were being worked on. In Figure 1.3
the board view for an initial state of the Unity Application repository is shown.
Figure 1.2: GitHub repository for the Unity Application
Figure 1.3: Waffle board for the Unity Application
Chapter 2
Background
2.1 Literature Review
In a world where technology is ever-changing and rapidly evolving, one of the greatest
and most fascinating challenges is to try and combine results coming from different
areas of research to obtain a newer and more interesting perspective on a subject.
This is especially true when new technologies involve different ways of interfacing
with the digital world, thus providing revolutionary means to visualize and interact
with data.
Virtual Reality is a perfect example of one such technology. It consists of all the
tools that use a computer to generate images, sounds and other sensory inputs to
immerse the user into a virtual world. It can be said, as stated in [1], that:
From an evolutionary perspective, virtual reality is seen as a way to over-
come limitations of standard human-computer interfaces; from a revo-
lutionary perspective, virtual reality technology opens the door to new
types of applications that exploit the possibilities offered by presence
simulation.
Ever since the creation of the first toolkits for Virtual Reality, many fields have
benefited from this new approach to the digital world. One of these, and the
one this thesis focuses on, is Computational Fluid Dynamics. As stated in [2],
Computational Fluid Dynamics is a specific application of fluid mechanics that uses
methods of numerical analysis to solve problems involving fluid flows.
In the past, great effort has been spent in developing software to visualize and
analyze scientific data such as that coming from CFD applications, each piece of
software focusing on specific aspects and offering different capabilities. Lately, some
of these programs have been further developed (often through the use of additional
plugins, as seen in Section 2.2.4) to offer a certain degree of VR integration. How-
ever, at the current stage, there are still no stable and widespread versions of such
programs that have been developed with native VR capabilities.
2.1.1 Virtual Reality applications
Concepts that could be referred to as Virtual Reality can be dated back as far as the
first half of the twentieth century, but VR technologies in their current, modern form
have been around for the last two decades. In fact, in 1996, Steve Bryson published
an article in which he stated that “[by] immersing the user in the solution, virtual
reality reveals the spatially complex structures in computational science in a way
that makes them easy to understand and study” [3]. The same paper, on the other
hand, highlighted the challenges in performance that such technology would pose.
Pushed by the gaming industry and the consumer market [4], Virtual Reality has
evolved ever since then to achieve what Bryson had predicted. Currently, there exist
VR applications in many fields, one obvious example of which being the military.
For instance, some of the most common military uses for Virtual Reality include
flight simulators for pilot training, such as those detailed in [5] and [6], or more
general-purpose simulators aimed at infantry like the one described in [7]. Another
field in which Virtual Reality is becoming more and more widespread is medicine. In
fact, VR environments are currently being used for simulating surgical procedures
(as explained in [8]) and for visualizing and interacting with medical data (such as
MRI [9] and CAT scans [10]). In the last few years, programs aimed at bringing
Virtual Reality to schools and universities are also being developed and tested, as
clearly explained in [11].
On the other hand, as far as Computational Fluid Dynamics data are concerned,
there have been different attempts to develop specific CFD software with native VR
capabilities in the past (see [12] and [13]), and some development effort is currently
being made to integrate existing applications with Virtual Reality (see Section 2.2.2).
However, as stated before, results are nowhere near a full integration for general CFD
problems at this time.
2.1.2 Virtual Reality for scientific data
Most of the applications presented in Section 2.1.1, especially those aimed at visual-
izing scientific data in a virtual environment, share a common work-flow regarding
how data is handled (as explained in [14]):
• Generation: data is generated from sensors or external software.
• Pre-processing, importing, post-processing: data is moved to the visu-
alization software.
• Rendering: the software analyses data to create its graphical representation.
• Visualization: data is presented to the user via a VR kit.
• Feedback: through sensors, gestures, specific hardware such as controllers, and
a GUI, the user can modify the visualization in real time.
The same work-flow is also adopted by the software system discussed in this the-
sis, where the Visualization and Feedback phases are implemented in the Unity
Application described in Chapter 3.
2.2 Technologies Used
Throughout the development of this project, a certain number of technologies, both
hardware and software, have been used extensively. It is therefore the purpose of this
section to provide an introduction to such technologies in order to allow a better
understanding of the whole software system involved.
2.2.1 HTC Vive
The Virtual Reality kit that has been used for the development is an HTC Vive kit
[15], developed by HTC and Valve Corporation.
The kit, officially released on 5 April 2016, consists of a head-mounted display (with
a camera), two wireless haptic controllers and two cameras for position tracking (as
shown in Figure 2.1).
The interaction between the cameras, the controllers and the HMD allows for room-
scale tracking, which means that the user’s movements and position in the real space
are mapped in the virtual environment, thus creating a more immersive experience
[16].
From a technical standpoint the kit’s headset uses one screen per eye, with a display
resolution of 1080x1200 and a refresh rate of 90 Hz. The kit collects sensory inputs
from a total of 70 infrared sensors (32 from the headset and 19 from each controller)
along with gyroscopes and accelerometers. As seen in Figure 2.2, each controller
includes several forms of input: regular buttons, a 2D axis radial touchpad, a trigger
and two grip buttons.
2.2.1.1 HTC Vive software
Software-wise, the HTC Vive ships with a runtime application system called SteamVR,
a VR platform developed by Valve as an extension of Steam. Developers can interface
with SteamVR via OpenVR [17], a C++ API that mainly exposes the events
generated by the controllers, the headset's view and the SteamVR dashboard.
As far as OS support is concerned, the HTC Vive kit was initially developed for
Microsoft Windows platforms, although work is currently being carried out on macOS
and Linux support.
Figure 2.1: HTC Vive kit
Source: Photo courtesy of HTC
2.2.2 ParaView and VTK
ParaView is an open-source, multi-platform data analysis and visualization applica-
tion [18]. It is entirely written in C++, it features both 2D and 3D representations
and can be controlled either via its interactive interface or programmatically. In
its latest versions, ParaView also features highly optimized parallel computation in
order to run on supercomputing clusters.
The software architecture is designed to follow a modular pattern, in order to en-
courage further development of new features through additional plugins or through
Figure 2.2: HTC Vive controller buttons; in the order they are numbered: menu
button, touchpad, system button, hair trigger, grip button.
Source: Photo courtesy of HTC
whole applications. One of these, Immersive ParaView, released in 2011, provides a
certain degree of integration with VR technologies, as described in [19] and [20].
As thoroughly explained in [21] and shown in Table 2.1, one of ParaView’s underlying
layers consists of Visualization Toolkit (VTK), which is a C++ open-source software
system for 3D computer graphics, image processing, and visualization [22]. VTK
provides the basic visualization and rendering algorithms, thus being the underlying
engine on which ParaView relies.
At a higher level of abstraction, ParaView exposes wrappers to most of its features
through APIs in Python, Java and Tcl.
Table 2.1: ParaView-VTK Architecture (simplified), from the top layer down:
ParaView
VTK
OpenGL | MPI | OpenVR | etc.
2.2.2.1 Virtual Reality Capabilities in ParaView
From version 5.3, released in March 2017, ParaView offers some native Virtual
Reality capabilities, although they are intended for developers and are not included
in the general distribution binaries. The implemented features include the possibility
to send the loaded objects to a VR environment, in which some forms of interaction
such as grabbing, moving and scaling are provided to the user.
However, since it is only a beta version, not all types of objects can be simulated in
VR, and the features offered come with some disadvantages. From the user’s point of
view, given that the Virtual Reality environment is implemented in the main thread
(since it requires synchronization with VTK’s event-handling routines), ParaView’s
GUI is completely unresponsive while the objects are displayed on the headset. Fur-
thermore, from a developer’s perspective, it must be noted that both ParaView and
VTK have a large, complex and structured code-base, on which layers of code have
been added over the years by following a very strict set of conventions. Unfortu-
nately this means that, although offering great performance potential, developing
natively in ParaView-VTK can often be a very time-consuming task, which makes it
inappropriate for the open-ended project discussed in this thesis.
Another limitation hindering ParaView's VR capabilities lies in the fact that, to
obtain a VR-ready version of the software, the user is forced to build it from source,
along with all its dependencies and those required to communicate with SteamVR,
thus making it a rather impractical solution for general distribution.
2.2.3 Unity
Unity is a cross-platform, all-purpose graphics and physics engine used mainly for
simulations, video games and animations. It was first released in 2005 by Unity
Technologies, and ever since then it grew to become a de-facto standard for inde-
pendent and industry-level game development. It offers both 3D and 2D capabilities,
although for this project only the 3D version has been used.
Unity ships with a highly workflow-centered editor, which provides the developer
with all the necessary tools to build a scene, populate it with objects and define
behaviors for each object. Every Unity project is characterized by a certain number
of scenes, which represent environmental units; its assets, i.e. the resources that have
to be loaded in each scene (3D models, textures, scripts, images, ...); and its
GameObjects, which are instances of the objects. The Unity editor's interface, as
can be seen in Figure 2.3, allows for a hierarchical definition of the objects and their
properties at a much higher level of abstraction than the underlying code. All of
these features contribute to making prototyping a simpler and quicker process in Unity.
From a coding perspective, Unity mainly supports two programming languages
for scripting: C# and UnityScript (which is a proprietary language derived from
JavaScript). For the purpose of this project all scripts for Unity have been devel-
oped in C#.
2.2.3.1 Object behaviors in Unity
In Unity, each object can have zero or more behaviors, which define the way the
object responds to inputs. When defining a new behavior, one is essentially attaching
a script to an object.
Each script consists of a class which inherits from the MonoBehaviour class. This is
an abstract class that defines several useful methods, the most relevant being:
Figure 2.3: Example of Unity’s interface
• Awake(): a method that is called immediately after the object has been cre-
ated, usually at the beginning of a scene.
• Start(): a method similar to Awake, with the exception that it is ensured to
be called after all the Awakes of the other objects have been executed. This
allows inter-object communication at start-up.
• Update(): a method that is called at every frame, i.e. several tens of times
per second.
By overriding these methods and defining new ones, the whole behavior of each
object can be defined.
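As a minimal illustration, the hypothetical behavior below (not taken from the application's codebase) uses all three methods to slowly spin the object it is attached to:

using UnityEngine;

// Hypothetical example of a behavior script: it spins the GameObject it is
// attached to, using the three lifecycle methods described above.
public class SpinBehaviour : MonoBehaviour {

    public float degreesPerSecond = 30f;
    private Transform cachedTransform;

    void Awake() {
        // Called right after the object is created: initialize local state only.
        cachedTransform = transform;
    }

    void Start() {
        // Called after every object's Awake(): safe to talk to other objects here.
        Debug.Log("SpinBehaviour ready on " + gameObject.name);
    }

    void Update() {
        // Called once per frame; Time.deltaTime makes the rotation frame-rate independent.
        cachedTransform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}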
2.2.3.2 Virtual Reality Capabilities in Unity
Unity does not yet natively support Virtual Reality kits, although this can be easily
achieved via some of the official plug-ins available on the Unity Asset Store.
For this project, two plug-ins have been included in the application: SteamVR,
developed by Valve, which exposes the OpenVR API to the Unity Application, and
Virtual Reality ToolKit (VRTK), a community-developed plug-in which consists of a set of
predefined objects and classes that abstract the underlying VR hardware and offer
the developer transparent access not only to the HMD and the controllers, but
also to the events generated by the kit.
2.2.4 ParaUnity
As stated in Section 2.2.2, ParaView's capabilities can be extended through plug-ins.
One such plug-in, named ParaUnity and developed as open-source C++ software
by Rudolf Biczok [23], bridges the gap between ParaView and Unity.
In fact, if built and loaded in ParaView, it adds two buttons to the GUI through
which any object currently loaded in the pipeline can be sent to a pre-prepared Unity
scene, which may be either a built executable application or an instance of a project
opened in the Unity editor.
As will be further discussed in Chapter 4, ParaUnity suffers from a certain number
of limitations in terms of performance, compatibility and lack of features. However,
since it is open-source code, it was decided to include its further development in
this project to improve the performance of the overall software system.
It is worth mentioning that in its current implementation ParaUnity requires the use
of an older version of Qt, a cross-platform GUI development framework extensively
used by ParaView. This imposed the constraint of linking the plug-in against Para-
View 5.2, released in November 2016, which was also the latest version of ParaView
to offer no native VR capabilities. However, since ParaUnity aims specifically
at making ParaView VR-ready through Unity, this requirement did not hinder the
development of the project in any way.
Chapter 3
Unity Application
3.1 Introduction
As already mentioned in Chapter 1, the main objective of this thesis was to develop
an application, either as an extension of ParaView or as a more stand-alone piece of
software, that would let the user visualize and interact with CFD data coming from
ParaView and VTK. The output of the project was a Unity application, which can
be seen opened in the Unity Editor in Figure 3.1.
In Section 2.2 all the necessary software, technologies and environments have been
introduced: ParaView, VTK, Unity and ParaUnity. In this chapter, a brief explana-
tion of the reasons behind the choice of Unity will be given, followed by a high-level
view of the Unity Application along with some more in-depth implementation de-
tails.
3.1.1 Why Unity?
When initially planning the general development for the project, two alternative
paths were identified: either continuing the development of ParaView's latest
Figure 3.1: Screenshot of the resulting Unity Application
version, which included some beta VR capabilities as native features, or using a
different environment for Virtual Reality and somehow connecting ParaView to it.
Given the open-ended nature of the project, along with the fact that a prototyping
programming work-flow had been chosen, and given the complex, structured way in
which ParaView’s and VTK’s codebases are organized, the second route seemed a
better solution in terms of the number of features that could be implemented
over the course of the project. With this in mind, Unity has been chosen as the
graphics engine of the application.
As already discussed in Section 2.2.3, Unity is an environment designed specifically
for prototyping and gray boxing¹, and through plug-ins it can effectively be made
VR-ready. Furthermore, as will be further explained in Chapter 4, there exists an
open-source plug-in for ParaView which can bridge the gap between ParaView and
Unity, although it required some further development. For these reasons, along with
the large availability of pre-built, community-made tutorials and tools, and the
inherent cross-platform compatibility offered by Unity, proceeding in this direction
seemed the most appropriate choice.
¹ In Unity, gray boxing means designing a scene in which all the objects consist of
gray boxes (the default 3D object in Unity) that nevertheless behave roughly as they
would in a final version. This makes it possible to quickly test an environment or a
scene without spending time on 3D models, textures and general artistic design,
focusing only on the mechanics and the features; it can be considered a kind of 3D
sketch of an environment.
3.2 Application Architecture
The application’s architecture has been designed to be as flexible and expandable
as possible, so that any additional feature could be added to the VR environment
with minimum effort.
The project consists of a single scene, called the Main scene, in which all the objects
are inserted with the hierarchy shown in Figure 3.2. Each feature is controlled by
Managers, which implement the business logic of the application through behavior
scripts (as further detailed in Section 3.3). Managers communicate with each other
through references, and are designed to expose a simple and coherent interface to
the upper layers of the application.
As stated in Section 2.2.3, Unity offers two different programming languages for the
behavior scripts: C# or UnityScript. For the purpose of this project, all scripts have
been developed in C#.
Some managers have children objects, which are usually the actual models rendered
in the Virtual Reality environment. It is important to notice that the parent-child
relationship between the managers and their children has nothing to do with the
concept of inheritance in programming. In fact, defining a parent in Unity simply
means that the child’s position, rotation and scale will be relative to the parent’s;
this is especially useful to handle animations (as seen for example in Section 3.3.5).
Inter-object communication happens through a set of globally shared variables. In
fact, although Unity provides ways of traversing the scene in order to look for
specific objects, this has a non-negligible computational cost, and is often advised
against by the community. Instead, each object that needs to be accessed from
outside registers itself with a Globals static class. Then, at runtime, the application
begins with an initialization process in which each manager stores the references
to the objects it needs to communicate with as private fields, for faster subsequent
accesses. On the other hand, all non-manager objects need not know about the
external parts of the software, in accordance with the SOLID principles of Object-
Oriented Programming (as described in [25]).
Internally, both Managers and objects have one or more behavioral scripts attached,
in which the overridden implementation of Awake, Start and optionally Update
are defined. Furthermore, methods have been implemented to trigger and process
custom events, such as the loading of an object from ParaView. Such methods follow
the naming convention OnEventName, and are called through a callback system.
3.2.1 Top-level Managers
At the current stage of development, the following top-level managers have been
defined, as seen in Figure 3.2:
• VRManager: core of the VRTK plugin (see Section 2.2.3.2), responsible
for the Virtual Reality hardware (sensors, controllers, camera) and their 3D
representations.
• ModeManager: exposes utility methods to check whether the Unity appli-
cation is running as a stand-alone executable (“Player mode”) or inside the
Unity editor (“Editor mode”). Used by ParaViewManager.
• EnvironmentManager: responsible for the virtual environment in which
the user is immersed, i.e. the room. It exposes methods to toggle the room’s
visibility, used by StaticMenuManager.
• StaticMenuManager: manages the static menu that the user can summon
by pressing the Menu button on the controller. It’s responsible for toggling its
visibility and for binding the menu buttons to the appropriate functions.
• LightManager: responsible for the lighting of the environment. Currently
exposes a function to set the intensity of the ambient lighting.
• ParaViewManager: the most fundamental part of the application: it is
responsible for communicating with a ParaView instance, interpreting its data
and creating an appropriate object in the Unity VR environment. Further
details about the communication with ParaView are discussed in Section 4.2.2.
• AnimationManager: handles the animation of the 3D models, in case of
time-dependent datasets. It exposes functions to play, pause and loop the
animation, while keeping the state of the current frame.
• SizeManager: responsible for resizing the 3D object, it allows upscaling,
downscaling and optional autoresizing to a custom size at object loading.
• SlicingManager: mock implementation of a slicing plane, its purpose is fur-
ther discussed in Section 3.3.9.
• LoggerManager: handles the visibility of an optional overlay text that dis-
plays info messages, warnings and errors directly in the headset’s display. It
is mainly intended for developers and it is turned off by default.
More technical information about these entities will be provided in Section 3.3.
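As an illustration of how lightweight some of these managers can be, a ModeManager-style check could be sketched as follows; the method names are assumptions, while Unity's Application.isEditor flag is what distinguishes the editor from a built player:

using UnityEngine;

// Hypothetical sketch of a mode check; Application.isEditor is true when the
// application runs inside the Unity editor ("Editor mode") and false in a build.
public class ModeManagerSketch : MonoBehaviour {
    public bool IsEditorMode() { return Application.isEditor; }
    public bool IsPlayerMode() { return !Application.isEditor; }
}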
Figure 3.2: Architectural diagram of the main Unity scene
3.3 Features
After giving a high-level description of the software’s architecture, it is worth ana-
lyzing the most important features from a more in-depth perspective.
3.3.1 Controllers
As explained in Section 2.2.1, the Virtual Reality kit chosen for the development
of the application is a HTC Vive. In particular, it is worth reminding that each
controller features five different buttons, one of which is also a 2D circular trackpad.
In order to make the user experience easy and intuitive, while at the same time
including as many features as possible in the controllers, a usability study has been
carried out to determine the most appropriate button configuration. The results of
these tests showed that no users had difficulties with an asymmetric configuration,
in which the two controllers offer different functionalities, and the test users found
themselves using primarily the trackpad button and the trigger. Consequently, the
following choices have been made:
• By default, the left controller would host a radial menu with all the features
related to the ParaView object (see Section 3.3.5), while the right controller
would be primarily used as a pointer device to interact with the UI elements
(see Section 3.3.4).
• An “Inverted Controllers” mode would be provided, aimed at left-handed
users.
• Both controllers would have the ability to grab the object.
• The buttons would be bound to the functions:
– Menu button: open the static menu,
– Trigger: when pressed and held while the controller is inside the object,
the user grabs the object; if pressed while the pointer is shown, it behaves similarly
to a mouse click,
Unity Application 26
– Trackpad: when touched on the left controller, spawns the radial menu,
which can be interacted with by moving the finger over the trackpad and
confirmed by clicking the button; when pressed on the right controller,
shows the pointer to interact with the static menu,
– Grip button: reserved for applying/confirming ParaView-related fea-
tures, such as clipping,
– System button: open the SteamVR dashboard (default behavior, can-
not be overridden).
If on one hand this means that both controllers have to be used to access all
the features of the application (which is nearly always the case for popular VR
software), on the other hand it ensures an immersive and intuitive experience, even for
users who are not familiar with the kit or with Virtual Reality in general.
Figure 3.3 shows a screenshot of the controllers in action. As one can see, the radial
menu is showing over the left controller, given that the trackpad is being touched,
and the pointer of the right controller is currently active.
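The asymmetric binding and the "Inverted Controllers" mode can be sketched as follows; this is an illustrative example rather than the project's actual ControllerBehaviour, and the OnTrackpadTouched hook is an assumed entry point called by the input layer:

using UnityEngine;

// Illustrative sketch of the controller roles; not the project's actual code.
public class ControllerRoleSketch : MonoBehaviour {
    public bool isLeftController;       // set per controller in the editor
    public static bool invertedMode;    // toggled by the "Invert hands" menu entry

    // True if this controller hosts the radial menu, false if it hosts the UI pointer.
    public bool HostsRadialMenu {
        get { return isLeftController != invertedMode; }
    }

    // Assumed hook, invoked by the input layer when the trackpad is touched.
    public void OnTrackpadTouched() {
        if (HostsRadialMenu) {
            // spawn the radial menu over this controller (Section 3.3.5)
        } else {
            // show the pointer used to interact with the static menu (Section 3.3.4)
        }
    }
}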
3.3.2 Environment
The environment in which the user is immersed is designed to simulate a large room
with natural and artificial ambient lighting.
Given that the focus of the user will most likely be the object, one could think that
the choice for the environment is not important. However, virtual reality develop-
ment guidelines suggest that the experience of being immersed in an environment
that fails to simulate reality (for example, a completely white room or an infinite
floor) can, in the long run, end up alienating the user. Instead, a familiar setup
with proper lighting is important to reduce perception dissonance between the real
and the simulated world.
Bearing this in mind, it is also important to remark that the application discussed
in this thesis is meant to be further developed and improved, and the focus of the
Figure 3.3: Screenshot of the two controllers in action
project has been primarily on the features related to ParaView and the interaction
with the object. Therefore, it was decided to develop a simple environment with
minimal characteristics. The result can be appreciated in Figure 3.4.
In any case, the objects that make up the environment can optionally be hidden,
so that a user would find themselves walking on a plain white floor with no walls
around. This is currently toggled from the Static Menu (see Section 3.3.4), and is
implemented in the EnvironmentManager with the following function:
Figure 3.4: Screenshot of the environment of the application
public void ToggleShow(bool show) {
    // Show or hide all the walls
    foreach (GameObject wall in walls)
        wall.SetActive(show);
    // Switch the floor between its textured and its plain material
    floor.SetMaterial(show);
}
3.3.2.1 Lighting
As stated before, lighting in the scene comes both from an artificial directional
light and from ambient lighting. Both of them are meant to be managed by a
LightManager.
Currently, the user can adjust the brightness of the environment via a slider on the
static menu. The function that implements this is:
public void SetIntensity(float value) {
    RenderSettings.ambientLight = new Color(value, value, value, 1);
}
3.3.3 ParaView object loading
As stated before, the most important feature is that the application must be able to
communicate with ParaView, load its objects and render them in the virtual world.
In the current implementation, this task is handled by a ParaviewObjectLoader.
This is responsible for the following:
• Setting up a TCP listener for incoming connections from ParaView (a minimal sketch is shown after this list),
• Implementing the communication protocol for import/export synchronization
(see Section 4.3.2),
• Reading the data from memory,
• Calling the appropriate methods to construct the 3D model from the data,
• Registering the object as global and triggering appropriate events.
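A minimal, self-contained sketch of the listener set-up is given below; the port number, the threading model and the content of the notification are assumptions rather than details taken from the project's code:

using System.Net;
using System.Net.Sockets;
using System.Threading;

// Hypothetical sketch of the TCP listener; port and threading model are assumptions.
public static class ParaViewListenerSketch {
    public static void StartListening(int port) {
        TcpListener listener = new TcpListener(IPAddress.Loopback, port);
        listener.Start();
        Thread worker = new Thread(() => {
            while (true) {
                using (TcpClient client = listener.AcceptTcpClient()) {
                    // Read the export notification (e.g. the name and size of the
                    // object placed in shared memory by ParaUnity), then hand it
                    // over to the main thread so the 3D model can be built there.
                }
            }
        });
        worker.IsBackground = true;   // do not keep the application alive on quit
        worker.Start();
    }
}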
As this is the only scenario that is highly dependent on external input (i.e. an
object being sent from ParaView, an external application), the design of the object
loader required careful handling of its asynchronous nature. For this purpose, a
custom event named ParaviewObjectLoaded has been designed, so that the other
managers can register their callbacks. This is important because some code (for
example the autoresizing of the object) must only be executed after an object has
been loaded from ParaView, which can happen at any moment after the application
has started and does not depend on the application itself.
A similar consideration can be made for the object's destruction: if the user wants
to substitute the currently loaded object with another one, the first must be deleted
beforehand, since the application currently supports only one loaded object at a
time. This is achieved through another event, called
ParaviewObjectUnloaded, which is necessary for the other managers to clean up
any reference to the object and to update their state before importing another one.
As an example, the code to register a loaded ParaView object and call the appro-
priate callbacks is:
public static void RegisterParaviewObject(GameObject paraviewObj) {
    // Register the object in the shared globals
    Globals.paraviewObj = paraviewObj;
    // Trigger the registered callbacks
    if (ParaviewObjectLoadedCallbacks != null)
        ParaviewObjectLoadedCallbacks(paraviewObj);
}
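A hypothetical subscriber following the same pattern might register to both events as sketched below; the delegate names and their location in Globals are assumptions based on the snippet above:

using UnityEngine;

// Hypothetical subscriber; delegate names are assumptions based on the snippet above.
public class AutoResizeSketch : MonoBehaviour {
    void Start() {
        Globals.ParaviewObjectLoadedCallbacks   += OnParaviewObjectLoaded;
        Globals.ParaviewObjectUnloadedCallbacks += OnParaviewObjectUnloaded;
    }

    private void OnParaviewObjectLoaded(GameObject obj) {
        // Runs only once an object has actually arrived from ParaView,
        // e.g. auto-resize it to a comfortable default size.
    }

    private void OnParaviewObjectUnloaded(GameObject obj) {
        // Drop any cached reference and reset state before a new import.
    }
}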
3.3.3.1 3D model creation
To import an object, it must first be read from memory; in the current implemen-
tation, data is stored in RAM in X3D format (see Section 4.3.2). This means that
in order to read the data the application must make the appropriate calls to the
Windows API, and subsequently pass the result of this operation to an entity that
can convert X3D strings into 3D models.
The reading phase is carried out by an object that exists at a lower level of abstrac-
tion: the XDocumentLoader. This exposes a Load() function, which takes the name
and the size of the object as arguments and returns a properly formatted XDocument
(an XML representation of the object). The Load() function is implemented
as follows:
public static XDocument Load(string objectName, uint objectSize) {
    // Attach to the object in shared memory
    sHandle = new SafeFileHandle(hHandle, true);
    Attach(objectName, objectSize);
    // Parse the raw data into XML
    XDocument doc = XDocument.Parse(pBuffer);
    Detach();
    return doc;
}
The XDocument is then passed to an external library, called X3DLoader, which han-
dles the conversion.
3.3.3.2 3D model material
One of the most important aspects of CFD data visualization is coloring. In fact,
more often than not, it can be extremely useful to color an object according to the
value of a certain variable (such as pressure, temperature, ...) on each face of the
mesh.
Such information about the color is included in the X3D representation of the object,
but must be rendered in Unity through the use of shaders. For this purpose, a custom
shader that implements Vertex Color Shading has been included in the project.
In order to programmatically apply a material based on such shader to the object,
the design choice has been to include a non-visible object, called MaterialPlace-
holder, from which the ParaviewObjectManager can read information about the
material chosen by the developer and apply it to any loaded object. This gives the
developer freedom to customize the material through the editor, even when no
ParaView objects are loaded in the scene.
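The pattern can be sketched as follows; the component and field names are illustrative assumptions, and Unity's standard Renderer.sharedMaterial property is used to apply the placeholder's material to every mesh of the loaded object:

using UnityEngine;

// Illustrative sketch of the placeholder pattern (component and field names are assumed).
public class MaterialPlaceholderSketch : MonoBehaviour {
    public Material vertexColorMaterial;   // assigned in the editor, e.g. the custom shader's material
}

public static class MaterialApplierSketch {
    // Apply the placeholder's material to every renderer of a freshly loaded object.
    public static void Apply(GameObject loadedObject, MaterialPlaceholderSketch placeholder) {
        foreach (Renderer r in loadedObject.GetComponentsInChildren<Renderer>())
            r.sharedMaterial = placeholder.vertexColorMaterial;
    }
}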
Figure 3.5 shows an example of a mesh colored according to pressure, blue being
lower values and red being higher values.
Figure 3.5: Screenshot of a mesh colored according to its values
3.3.4 Static Menu
As seen in the previous sections, some of the aspects of the VR environment can be
changed at runtime. This happens through a menu, which is called StaticMenu to
distinguish it from the radial menu (see Section 3.3.5).
By default the menu is hidden, so as not to interfere with the object visualization,
but can be toggled through the Menu button on either of the two controllers. The
behavior defined in the StaticMenuManager class mimics that of the SteamVR
dashboard, which is the default system menu of the whole SteamVR environment:
when opened, the static menu positions itself at a fixed distance directly in front of
the user, facing them. If the user then moves around, the menu stays in its place in
the 3D world until hidden and shown again. In usability testing, this proved
to be the most natural and unobtrusive solution for the users.
The code responsible for positioning and showing the menu is particularly interest-
ing, because it shows Unity’s way of handling positions and rotations:
private void Show() {
    // Get the XZ projection of the headset camera's forward direction
    Vector3 forwardXZ = new Vector3(headset.forward.x, 0, headset.forward.z);
    // Get the point at 2.5 m in front of the headset, along the forwardXZ vector
    Vector3 headsetFront = headset.position + distance * forwardXZ.normalized;
    // Set the menu position to that point, keeping the menu's y coordinate
    position = new Vector3(headsetFront.x, position.y, headsetFront.z);
    // Set the rotation so that the menu faces the camera
    rotation = Quaternion.LookRotation(position - headset.position);
    // Show the menu
    gameObject.SetActive(true);
}
As can be seen in Figure 3.6, the static menu currently offers the following function-
alities:
• Light intensity slider: allows the user to adjust the brightness of the scene
(see Section 3.3.2.1),
• Show room toggle: enables or disables the walls and the floor’s texture (see
Section 3.3.2),
• Show log toggle: enables or disables the overlay info text (see Section 3.3.10),
• Enable bounding box toggle: enables or disables an explicit bounding box
drawn around the object (see Section 3.3.6.1),
• Invert hands toggle: inverts the functionality of left and right controllers
(see Section 3.3.1),
• Close menu: closes the menu,
• Quit: gracefully terminates the application.
Figure 3.6: Screenshot of the opened static menu
3.3.5 Radial Menu
All of the features related to the interaction with ParaView’s objects are imple-
mented through a radial menu attached to the left controller. The idea behind this
design choice is that such features must be quickly accessible to the user, so that
the interaction feels more natural and more intuitive.
The radial menu itself is a custom implementation of one of VRTK’s prefabricated
objects. Its behavior offers a good example of Unity’s parenting mechanism: the
object is a child of the controller, therefore its coordinates are relative to those
of the controller. This means, given that the controller’s position is updated by
VRTK through data coming from sensors, that the menu is effectively following the
controller, as if it was an additional part of it. Such behavior is at the base of the
concept of Mixed Reality: the controller exists in both the real and the simulated
world, but in the latter it is enriched by completely virtual features.
RadialMenu’s behavior defines the callbacks related to each menu entry, which are
then bound to the appropriate functions in the Unity editor. All the callbacks follow
a similar structure:
public void OnActionButtonPressed() {
    appropriateManager.action();
}
This code snippet shows the way managers and objects communicate with each
other, delegating the implementation of the business logic of each routine to the
appropriate entity.
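As a concrete instance (a sketch; the exact method names in the project may differ), the entry that scales the model up would simply delegate to the SizeManager described in Section 3.3.7:

// Sketch: the manager reference is assigned through the Unity editor.
public void OnScaleUpButtonPressed() {
    sizeManager.ScaleUp();
}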
Currently, the radial menu supports the following features:
• Scale Up: while held, increases the size of the model (see Section 3.3.7).
• Scale Down: while held, decreases the size of the model (see Section 3.3.7).
• Play/Pause: if the model is a time-dependent model, i.e. it consists of mul-
tiple frames, this button plays and pauses the animation (see Section 3.3.8).
The button’s icon is updated according to the current state of the animation
and it is disabled if the object is static.
• Slice: when pressed, shows the mock implementation of the slicing plane (see
Section 3.3.9).
Such features can be seen in Figure 3.7, where they are represented by self-explanatory
icons.
Figure 3.7: Screenshot of the opened radial menu
3.3.6 Object interaction
Among the project objectives outlined in Section 1.1, one of the most important was
the ability to interact with ParaView's objects. In fact, this is essential for any true
Virtual Reality application: in the real world, objects can be touched, moved, rotated
and manipulated. Therefore, when a user enters a VR environment, they naturally
expect the same laws to apply, and when they do, the experience feels more immersive. Moreover,
one must bear in mind that the scope of the application discussed in this thesis is
the analysis of CFD data, and therefore the ability to manipulate the dataset in a
3D environment is an enormous advantage over the classical visualization on a 2D
screen.
Bearing this in mind, basic forms of interaction with the model have been implemented,
with two exceptions: gravity and solidity. On one hand, having the model
react to gravity is impractical and counter-productive, as it means that it cannot be
left floating at eye level for detailed examination; a similar consideration applies to
solidity: given that with current technologies it is not possible to simulate touch
through the controller, the most commonly accepted way of grabbing an object is
by keeping a button pressed (the trigger, in this case) while holding the controller
inside of it.
Apart from this, the object is designed to behave like a rigid body, meaning that
it can be grabbed by any of its parts, and it can be translated and rotated in 3D
space.
To achieve this, the object is given an Interactable behavior at loading time. This,
internally, gives it rigid body properties, and defines a collider. In Unity, a collider
is a region surrounding the object that represents its boundaries, for the purpose of
determining collision. As an object enters or exits another object’s collider, an event
is triggered. This can be bound to appropriate callbacks to implement grabbing, as
in the case being discussed. For performance reasons, the collider’s shape for the
object was chosen to be the minimum bounding box (see Section 3.3.6.1).
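A minimal sketch of this setup, performed at loading time, could look as follows; the InteractableObject component name is an assumption, while BoxCollider and Rigidbody are standard Unity components:

// Sketch: make a freshly loaded ParaView object grabbable.
private void MakeInteractable(GameObject loadedObject) {
    // Box-shaped collider used as a trigger to detect the controller
    BoxCollider boxCollider = loadedObject.AddComponent<BoxCollider>();
    boxCollider.isTrigger = true;
    // Kinematic rigid body: no gravity and no physics response, as discussed above
    Rigidbody body = loadedObject.AddComponent<Rigidbody>();
    body.isKinematic = true;
    body.useGravity = false;
    // Custom behaviour defining OnBeginInteraction()/OnEndInteraction()
    loadedObject.AddComponent<InteractableObject>();
}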
The code for implementing the object grabbing lies both in the controller and in
the interactable object itself, although the object’s end is more interesting. In fact,
whenever a controller is inside the bounding box, and the trigger is pressed, the
method OnBeginInteraction() is called. Similarly, when the trigger is released,
the method OnEndInteraction() is invoked.
public void OnBeginInteraction(Controller controller) {
    // The object becomes integral with the controller
    this.transform.parent = controller.transform;
    // Save a reference to the controller
    attachedController = controller;
}

public void OnEndInteraction(Controller controller) {
    if (controller == attachedController) {
        // Release the object
        this.transform.parent = null;
        // Reset the reference
        attachedController = null;
    }
}
It is worth noting that saving and checking the reference to the controller is necessary
to implement the possibility of passing the object from one controller to the other
without releasing it. In fact, one can see how if a second OnBeginInteraction()
is called from the other controller before the interaction with the first ends, the
reference is updated. Therefore, when the trigger of the first controller is released,
the object remains attached to the second controller, as expected.
3.3.6.1 Bounding box
As explained before, each ParaView object is surrounded by a bounding box that
acts as a collider. This allows interaction with the controllers.
It is worth mentioning, however, that the fact that the bounding box is effectively
bigger than the mesh itself can be counter-intuitive for the user. In fact, it means
that there are regions of space that are outside the mesh but inside the bounding box,
thus making the object grabbable without the controller being inside of it. To limit
the effects of this, and ease the process of grabbing, an option on the static menu
lets the user enable a visible bounding box. This is rendered as a semi-transparent
box, as seen in Figure 3.8 and is shown only when the controller is actively colliding
with it.
Figure 3.8: Screenshot of the bounding box
Another feature of the current implementation of the bounding box is its ability
to adapt in the case of animated models. In fact, in each frame of the animation
the model can have a different size and shape, thus requiring a different bounding
box. To solve this problem, it has been chosen to implement a function that actively
resizes the bounding box depending on the current shape of the object: such function
is bound as a callback to the NextFrameLoaded event, which is triggered by the
animation manager after every frame (see Section 3.3.8). The code that resizes the
bounding box is the following:
public void FitColliderToChildren() {
    // Temporarily reset the parent's orientation, since the bounds will
    // be given in world coordinates but the collider is in local coordinates.
    Quaternion oldRotation = transform.rotation;
    transform.rotation = Quaternion.identity;
    bool hasBounds = false;
    Bounds bounds = new Bounds(Vector3.zero, Vector3.zero);
    // Iterate over all active children.
    foreach (Renderer childRenderer in GetComponentsInChildren<MeshRenderer>()) {
        if (hasBounds) {
            bounds.Encapsulate(childRenderer.bounds);
        } else {
            bounds = childRenderer.bounds;
            hasBounds = true;
        }
    }
    // Set collider center and size (with conversion from global to local coordinates)
    BoxCollider collider = GetComponent<BoxCollider>();
    collider.center = Scale(bounds.center - transform.position, transform.localScale.Reciprocal());
    collider.size = Scale(bounds.size, transform.localScale.Reciprocal());
    // Restore the object's orientation
    transform.rotation = oldRotation;
}
This function is a good example of the concept of global and local coordinates
in Unity. In fact, the bounding box is considered as an independent object, and
therefore expressed in global coordinates, while the Paraview Object itself is a child
of ParaViewManager, and therefore it uses local coordinates relative to its parent.
For this reason, some of the entities involved require transformations before being
applied.
3.3.7 Resizing
In general, the size of CFD models can span several orders of magnitude depending
on the object being studied: from micrometers in the case of the flow of red
blood cells to tens of meters for the aerodynamic analysis of aircraft. However, all the
cases in this spectrum need to be correctly represented in the virtual environment
at a scale which is appropriate for the user.
At the same time, it is often interesting to analyze finer details of a model, like a
particularly turbulent area or a small zone of higher pressure. This means that the
user should be able to enlarge and reduce the mesh accordingly.
For these purposes, a SizeManager has been defined and implemented. Currently, it
offers the following functions:
• Autoresize: automatically resize the mesh to a fixed maximum length at
loading time. This can be disabled in the editor, and the parameter for the
maximum length can be changed.
• Scale up: increase the size of the mesh.
• Scale down: decrease the size of the mesh.
By analyzing popular implementations of zooming and scaling functions, it has been
observed that the most natural solution for the user is to modify the dimensions
relative to their current values, instead of linearly. In fact, a linear resizing often
feels too slow for large models and too fast for small models, and can be problematic
when the size of the model approaches zero. Consequently, one can observe the
implementation of the ScaleUp function as an example:
public void ScaleUp() {
    obj.transform.localScale *= 1 + scaleSpeed * Time.deltaTime;
}
The previous line of code provides a simple example on how time is handled in
Unity. In fact, Unity exposes a static class Time, of which the most important
field is deltaTime. This field returns the time in seconds it took to complete the
last frame (i.e. the delay between two subsequent calls to Update()), as this value
depends on the computational complexity of the operations performed in the last
frame and is thus not known at compile time.
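Under the same multiplicative approach, the ScaleDown counterpart presumably divides by the same frame-rate-compensated factor, so that scaling up and down by equal amounts cancel out and the scale never reaches zero (a sketch, not necessarily the project's exact code):

public void ScaleDown() {
    // Dividing by the factor used in ScaleUp() keeps the two operations symmetric
    obj.transform.localScale /= 1 + scaleSpeed * Time.deltaTime;
}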
3.3.8 Animation
As stated before, time-dependent datasets are a very important and interesting
subset of CFD models, and the most natural way to represent them is through an
animation.
For this purpose, an AnimationManager has been designed, and it currently offers
the following functionalities:
• Play the animation,
• Pause the animation,
• Loop the animation.
When a time-dependent object is exported from ParaView, each frame is encoded
separately as an independent model. Then, through the TCP connection between
ParaView and Unity (see Section 4.3.2), a message is sent to Unity with the total
number of frames to be imported. They are then read and inserted into the scene. In
the current implementation, all frames are children of the same root object, and only
one of them is showing at any given time. Therefore, animating the mesh consists
in hiding the current frame and showing the next one. This is done in the Update()
function that every script inherits from the MonoBehaviour class, with the visibility
toggling logic being encapsulated in the ShowNextFrame() method.
// Called at every frame
void Update() {
    if (isPlaying) {
        // DELAY_COUNT slows the animation
        if (delay >= DELAY_COUNT) {
            delay = 0;
            ShowNextFrame();
        } else {
            delay++;
        }
    }
}

private void ShowNextFrame() {
    // Hide all frames
    foreach (Transform child in obj.transform)
        child.gameObject.SetActive(false);
    // Show next one
    obj.transform.GetChild(currentFrame).gameObject.SetActive(true);
    // Update counter
    currentFrame = (currentFrame + 1) % obj.transform.childCount;
    // Trigger event
    if (NextFrameLoadedCallbacks != null)
        NextFrameLoadedCallbacks();
}
As one can notice from the last lines of the ShowNextFrame function, the
AnimationManager defines a custom event called NextFrameLoaded, to which any other
manager can bind their callbacks. This is necessary as changing the frame is ba-
sically equivalent to changing the model, and therefore some updating might be
necessary in the state of other parts of the software. As seen in Section 3.3.6.1, an
example of this is the recalculation of the bounding box dimensions, which must be
adapted to the new shape of the model at every frame.
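For example, the subscription can be performed with a single line, typically in the bounding box script's Start() method (a sketch based on the delegate shown in the code above; the animationManager field is an assumption):

void Start() {
    // Recompute the collider whenever the AnimationManager shows a new frame
    animationManager.NextFrameLoadedCallbacks += FitColliderToChildren;
}

void OnDestroy() {
    // Unsubscribe to avoid calling into a destroyed object
    animationManager.NextFrameLoadedCallbacks -= FitColliderToChildren;
}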
In the current implementation, in case of time-independent models, i.e. models that
consist of only one frame, the Play/Pause button on the radial menu (see Section
3.3.5) is disabled, and therefore none of the AnimationManager’s routines are called.
Figure 3.9 displays four different frames of an ongoing animation. The model shown
in the screenshots is taken from ParaView’s official tutorial examples.
Figure 3.9: Screenshots of four different frames of an ongoing animation
3.3.9 Slicing
As any user of ParaView knows, the application is used not only for visualizing
the data, but also to manipulate it in many ways through the use of filters. Such
functions, which include slicing, contouring, wireframing, etc., are necessary to offer
a more detailed view about the values of the dataset being analyzed. It is therefore
important to be able to port these features in the Virtual Reality environment, so
that this can become as independent from ParaView as possible, without requiring
the user to change and re-export the data multiple times.
As discussed in Section 3.2, the application has been engineered to be flexible and
expandable, so that any new feature could be easily added without requiring major
refactoring. As a proof-of-concept of this, a mock implementation of one of the
most important filters in ParaView has been realized: the slicing function. Slicing
consists in dividing a mesh into two parts along a plane, which is often followed by
hiding one of the parts. This can be useful to show inner details of the model.
Given that this project focused on defining and implementing the general envi-
ronment and architecture of the software system, the design choice has been to
implement a stub of the feature, rather than a fully working function, to be used as
an example for future development. In its current implementation, a slicing plane
(which is just a semi-transparent plane) can be summoned by pressing one of the
buttons of the radial menu. Its position is integral with the right controller, so that
moving and rotating it over the mesh is natural and intuitive for the user. Pressing
the button again hides the plane.
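A sketch of this toggling logic (with hypothetical names) could be as simple as parenting the plane to the right controller, exploiting the same mechanism used for the radial menu:

// Sketch: show or hide the mock slicing plane attached to the right controller.
public void ToggleSlicingPlane(Transform rightController) {
    bool show = !slicingPlane.activeSelf;
    if (show) {
        // Parent the plane to the controller so it follows its movement and rotation
        slicingPlane.transform.SetParent(rightController, false);
        slicingPlane.transform.localPosition = Vector3.zero;
        slicingPlane.transform.localRotation = Quaternion.identity;
    }
    slicingPlane.SetActive(show);
}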
Figure 3.10 shows a screenshot of the slicing plane being held through a simple cube.
It is worth mentioning that, although the plane just described does not perform
any slicing operation, there exists a fully-working version developed by a fellow
student for his MSc Thesis. This has already been integrated in the application
being discussed in this paper, in a different development branch, as is discussed in
detail in his thesis.
3.3.10 Logger
When immersed in the Virtual Reality environment, it can be difficult for the user
to be notified about the state of the application. For this specific reason, it has been
chosen to implement a logging mechanism appropriate for a VR experience.
The object responsible for this feature is the LoggerManager, which effectively acts
as a wrapper around Unity’s own Debug static class. In the current implementation,
the LoggerManager offers the following functionalities:
Figure 3.10: Screenshot of the mock implementation of the slicing plane
• Info log: shows a message in a default color of black,
• Warning log: shows a message in a default color of yellow,
• Error log: shows a message in a default color of red,
• Toggle visibility: enables and disables the log messages (disabled by default)
The log can be enabled through a toggle button in the static menu (see Section
3.3.4), and it appears as a fixed overlay text in the bottom-center of the head-
mounted display. Even if the camera is moved, i.e. if the user moves their head, the
text moves along with it, thus appearing to be fixed on the screen. An example of
a message being logged can be seen in Figure 3.11.
Figure 3.11: Screenshot of a message being logged through the logger
To correctly position the text and make it integral with the camera, the manager
accesses VRTK's API, which exposes a reference to the active camera object,
regardless of the VR hardware being used. This camera is then assigned to the canvas
of the text overlay:
protected virtual void SetCanvasCamera() {
    // Get VR camera
    Transform camera = VRTK_DeviceFinder.HeadsetCamera();
    // Attach to the text
    textCanvas.worldCamera = camera.GetComponent<Camera>();
}
As an example, the code that implements a simple information logging function is
provided:
public void Log(string message) {
    // Set text and color
    text.text = message;
    text.color = defaultColor;
    // Also debug to console
    Debug.Log(message);
}
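The warning and error variants presumably differ only in the colour applied and in the Unity console call used; the following is a sketch under that assumption (the warningColor and errorColor fields are assumptions):

public void LogWarning(string message) {
    text.text = message;
    text.color = warningColor;   // yellow by default, configurable in the editor
    Debug.LogWarning(message);
}

public void LogError(string message) {
    text.text = message;
    text.color = errorColor;     // red by default, configurable in the editor
    Debug.LogError(message);
}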
Additionally, some parameters of the log message can be changed through Unity's
editor, such as the font size, the position and the colors used for warnings and errors.
Chapter 4
ParaUnity
4.1 Introduction
The application described in Chapter 3 is able to receive objects from ParaView
and insert them in a Virtual Reality environment. However, as already discussed
in Section 2.2.4, ParaView has no native knowledge of Unity. It has therefore been
necessary to find a way to bridge the gap between the two software systems in order
to allow the import/export process and make communication possible.
The work presented in this chapter consists largely in the further development of an
existing plug-in for ParaView, called ParaUnity, created by Rudolf Biczok in 2016
and available as open-source code on GitHub [23]. The plug-in is entirely written in
C++ and requires an older version of ParaView to be executed (specifically, version
5.2, released in November 2016). The reason behind this is that ParaUnity is built
using Qt version 4 as its GUI and utility library, which is unsupported in the latest
versions of ParaView.
The building process of the plug-in results in a DLL file, which can then be loaded
from ParaView's plug-in management tools. This essentially adds two buttons to
the main interface of the program, which can be used for sending the object to a
pre-prepared Unity scene, either in a running instance of the Unity editor or in a
compiled executable application. These buttons can be seen in Figure 4.1.
Figure 4.1: ParaUnity buttons on ParaView’s GUI.
The button with the P (for Player) spawns an instance of the executable Unity
application, while the button with the E sends the object to the Editor.
The next section of this chapter will describe the initial state of the plug-in, as it was
before the beginning of this project. Then, a description of the main improvements
that have been made to ParaUnity will be provided.
4.2 Initial state
In the latest version of its official GitHub repository, ParaUnity is distributed
alongside an example Unity project into which ParaView's objects can be loaded. Such
a project features a very basic scene, which suffers from the following limitations:
• The object can only be visualized, as no form of interaction (grabbing, rotating,
etc.) has been implemented.
• No ParaView features (slicing, contouring, etc.) are available.
• Objects are rendered with a plain white texture, regardless of their color in
ParaView.
• No environments or menus have been included.
• The application is only compatible with the Oculus Rift VR kit.
It is immediately evident that, with such limitations, most of the project objectives
outlined in Section 1.1 are not satisfied. This is the reason behind the development
of the whole application described in Chapter 3: to provide the user with a complete,
immersive scene in which they could freely interact with the object. By effectively
substituting the Unity project included in the repository with the one discussed in
this thesis, all of these limitations have been overcome.
Furthermore, considering the code of the plug-in itself, two additional performance
limitations have been identified:
• Object export/import is done through disk I/O, which can be several orders
of magnitude slower than RAM I/O, depending on the hardware and the
structure of the data.
• Communication between ParaView and Unity is completely synchronous, meaning
that the whole model has to be completely exported before Unity can start
importing it, thus creating non-negligible idle times.
To overcome these problems, the initial project has been expanded to include some
further development on the plug-in, as described in Section 4.3.
4.2.1 Plug-in implementation
ParaUnity is a much simpler piece of software, compared to the Unity application,
and therefore the vast majority of its logic is included in just one class, called
Unity3D. This class offers the following functionalities:
• Connecting to an existing TCP Socket and communicating through it (see
Section 4.2.2).
• Spawning an instance of a pre-prepared Unity application from its executable,
or connecting to a running instance of the Unity Editor.
• Requesting an X3D representation of the currently loaded objects from VTK.
• Writing data to disk.
Most of these features are implemented using Qt libraries or the Windows API.
4.2.1.1 TCP Socket
In its initial stage, ParaUnity communicated with external software through single-use
TCP connections. This was done by instantiating a QTcpSocket, which is part of
the Qt Network library, and writing a byte representation of the message to it. It
is worth noting that in this implementation ParaUnity does not wait for a reply,
nor does it have any method to read one, as the TCP connection is destroyed immediately
after sending the message. This can be seen in the following implementation:
bool Unity3D::sendMessage(const QString& message, int port) {
    // Creates the socket
    QTcpSocket *socket = new QTcpSocket(this);
    // Connects to localhost
    socket->connectToHost("127.0.0.1", port);
    // Checks connection status
    if (!socket->waitForConnected()) {
        return false;
    }
    // Writes message
    socket->write(QByteArray(message.toLatin1()));
    // Waits and disconnects
    socket->waitForBytesWritten();
    socket->waitForDisconnected();
    return true;
}
Connections are always made to localhost, although the port can vary (as described
in Section 4.2.2).
4.2.1.2 Exporting the objects
To obtain an X3D representation of the data loaded in ParaView, ParaUnity uses a
vtkX3DExporter, which is part of the VTK library. In fact, after instantiating the
exporter, the plug-in links it to the current render window, sets the output to a file
with an appropriate filename and sends the command to write the data to disk, as
can be seen in the following code:
void Unity3D::exportScene(/* ... */) {
    /* ... */
    // Instantiate the exporter
    vtkX3DExporter *exporter = vtkX3DExporter::New();
    /* ... */
    // Set its input to the render window
    exporter->SetInput(renderProxy->GetRenderWindow());
    // Set the filename
    exporter->SetFileName(outputFilename);
    // Write to file
    exporter->Write();
    /* ... */
}
4.2.2 Initial communication protocol
As stated before, ParaUnity and Unity communicate through both a local TCP
socket and disk I/O operations. The actual communication protocol, as it was in
the initial stage of development, can be summarized as follows (a sketch of the
Unity-side registration is given after the list):
1. Unity starts, and immediately creates a TCP Listening socket on a random
port.
2. In a standardized location, Unity creates an empty directory, the name of
which is the port number.
3. ParaUnity starts and searches the standardized location for files. For every file
it finds (i.e. every registered port), it checks for the status of the connection
on that port by sending a test message.
4. If ParaUnity finds an active Unity instance, it starts exporting the scene in
the instance’s directory.
5. When ParaUnity is done exporting, it sends a message to Unity with the path
of the file (or the path of the directory, in case of time-dependent data).
6. When Unity receives the message, it reads the file from that location, interprets
it into an object and inserts it into the scene.
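To make steps 1 and 2 more concrete, the Unity side of this handshake could be sketched as follows (using System.Net, System.Net.Sockets and System.IO; the registrationRoot parameter stands for the standardized location and is an assumption):

// Sketch: listen on a random free port and register it in the standardized location.
private TcpListener RegisterInstance(string registrationRoot) {
    TcpListener listener = new TcpListener(IPAddress.Loopback, 0);
    listener.Start();
    int port = ((IPEndPoint)listener.LocalEndpoint).Port;
    // ParaUnity scans registrationRoot, finds this entry and exports the scene here
    Directory.CreateDirectory(Path.Combine(registrationRoot, port.ToString()));
    return listener;
}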
4.3 Development
As explained in Section 4.2.1, the official ParaUnity plug-in suffered from perfor-
mance limitations, which could hinder the user experience especially with large
datasets. To overcome this problem, a custom version of ParaUnity has been de-
signed, with particular focus on two major aspects: data I/O and inter-process
communication.
4.3.1 From Disk I/O to RAM I/O
The average mechanical hard drive can perform around 300 random-access I/O
operations per second. With solid-state drives, this value can go up to about 2000.
However, this is still several orders of magnitude lower than the speed of average
RAM, which can perform about 40 million random-access I/Os every second [26].
With these numbers in mind, it becomes clear that one of the performance bottlenecks
of the initial implementation of ParaUnity is disk I/O.
To alleviate this issue, and take advantage of the much-higher speed offered by
main memory, it has been decided to move the transmission of the data to RAM.
On a Windows machine, one of the possible solutions (and the one adopted for the
project) is to interface with the Windows API and store the data as Named Shared
Memory objects [27]. The work-flow can be summarized as follows:
1. Process A creates a file mapping in the main memory of the system, i.e. re-
quests the Operating System to allocate a specific amount of memory and give
it a custom name.
2. Process A maps part of the memory in its process space with the memory
requested in the previous step, so that they can be effectively treated as the
same part of memory.
3. Process A writes the data to the portion of its space which is mapped to shared
memory; consequently, the data becomes accessible to any process through the
name set in Step 1.
4. Process B, which must know the name and the size of the shared object be-
forehand, accesses the file mapping in shared memory by name.
5. Process B maps the area of the shared memory in which the object is written
to a portion of memory in the process space.
6. Process B can now access the object as if it were in its own memory space.
7. When both processes are done accessing the files, Process A frees the memory
in shared memory and they both unmap their view of the file.
This has been implemented in ParaUnity by setting the vtkX3DExporter to return a
string of data instead of writing to a file, and copying that string to shared memory
with the following logic:
void Unity3D::writeSceneToMemory() {
    // Create object in shared memory
    handle = CreateFileMapping(
        /* ... other parameters ... */
        objectSize,
        objectName
    );
    // Map memory to process space
    pBuffer = (char *) MapViewOfFile(
        handle,
        /* ... other parameters ... */
        objectSize
    );
    // Write to memory
    CopyMemory((void *) pBuffer, exporter->GetOutputString(), objectSize);
}
Both the object name and the object size are then transmitted to Unity through the
TCP socket, as explained in Section 4.3.2. Unity is then able to access the shared
memory object with the following logic (in ParaViewLoader, see Section 3.3.3):
public static XDocument Load(string objectName, uint objectSize) {
    // Attach to the object in shared memory
    handle = OpenFileMapping(
        /* ... other parameters ... */
        objectName
    );
    // Map memory to process space
    pBuffer = MapViewOfFile(
        handle,
        /* ... other parameters ... */
        objectSize
    );
    // Access the data and parse it
    XDocument doc = XDocument.Parse(pBuffer);
    Detach();
    return doc;
}
4.3.1.1 Performance Results
To test the effectiveness of shared memory mapped files versus common files, a
normalized performance testing platform has been designed. The tests have been run
a total of 100 times, and the average values have been calculated and are displayed
in Table 4.1 for datasets of 1 byte, 1 Kb, 1 Mb and 1 Gb. To measure the elapsed
time of the operation Windows API’s function QueryPerformanceCounter has been
used, which has a precision of a tenth of a nanosecond. The machine on which the
tests have been carried out features a Samsung 850 EVO 500GB SSD and 8 Corsair
DDR3 4GB 667MHz RAM modules.
As the table shows, writing to RAM is consistently faster than writing to the SSD,
with the advantage ranging from roughly 25x for megabyte-sized data to almost 180x
for gigabyte-sized data. In practical terms, for a particularly large dataset the raw
I/O time can therefore drop from minutes to a few seconds.
Table 4.1: Performance comparison between disk I/O and RAM I/O

Data Size    Write time (ms)
             SSD          RAM
1 B          2.25658      0.03998
1 KB         2.81049      0.04102
1 MB         7.79258      0.31661
1 GB         65582.1      367.192
Unfortunately, such a tremendous performance improvement cannot be fully appreciated
in the complete, full-fledged system. In fact, although disk I/O is indeed the
first bottleneck of the import/export process, there is a second bottleneck: data
encoding. As mentioned before, data exported from ParaView is encoded
in the X3D format, which is a human-readable, ASCII-based XML format. It is
therefore unsurprising that, as shown by Visual Studio's profiling tools after the RAM
I/O implementation, about 85% of the total export time is spent in the calls to
vtkX3DExporter, which is part of the VTK library and thus out of the scope of
ParaUnity.
4.3.2 Asynchronous communication for time-dependent data
Another performance limitation highlighted in Section 4.2.1.2 is due to the fact that,
in the case of a time-dependent model, all frames are exported from ParaView before
Unity is notified and can start importing. In considering possible improvements on
this aspect, it has been decided to redesign the communication protocol in order
to asynchronously update Unity about the availability of each exported frame. This
way, Unity can import the model frame by frame while it is being exported, thus
greatly reducing the idle time.
The improved protocol is as follows (steps that have not been changed from the
initial version are repeated for completeness; a sketch of the Unity-side handling of
a frame message follows the list):
1. Unity starts, and immediately creates a TCP Listening socket on a random
port.
2. In a standardized location, Unity creates an empty file, the name of which is
the port number.
3. ParaUnity starts, and searches the standardized location for files. For every
file it finds (i.e. every registered port), it checks for the status of the connection
on that port by sending a test message.
4. If ParaUnity finds an active Unity instance, it starts exporting the first frame as
a named shared memory object.
5. When ParaUnity is done exporting the frame, it sends Unity a message through
the TCP Socket, containing: the name of the object, the size of the first frame,
the index of the first frame and the total number of frames.
6. If there are other frames, ParaUnity starts exporting the next frame.
7. When Unity receives the message, it reads the frame from shared memory, interprets
it into an object and inserts it into the scene. It then sends a confirmation to
ParaUnity formatted as "<frameNumber> OK".
8. When ParaUnity receives a confirmation for the i-th frame, it frees the memory
in shared memory relative to that frame and, if there are other frames, exports
the next one.
9. All steps from step 5 are repeated until there are no other frames left.
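The Unity side of steps 7 and 8 could be structured as in the following sketch (using System.Net.Sockets, System.Text and System.Xml.Linq); the message fields and the AddFrame call are assumptions, while the acknowledgement format is the one defined above:

// Sketch: import one frame announced by ParaUnity and acknowledge it.
private void OnFrameMessage(string objectName, uint objectSize,
                            int frameIndex, int totalFrames, NetworkStream stream) {
    // Read and parse the frame from the named shared memory object (see Section 3.3.3)
    XDocument frame = ParaViewLoader.Load(objectName, objectSize);
    paraviewObjectManager.AddFrame(frame, frameIndex, totalFrames);
    // Acknowledge, so that ParaUnity can free the shared memory and export the next frame
    byte[] ack = Encoding.ASCII.GetBytes(frameIndex + " OK");
    stream.Write(ack, 0, ack.Length);
}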
Looking at the improved version of the communication algorithm, it is immediately
evident that, by making the communication between ParaUnity and Unity
asynchronous, much of the idle time during the import/export process has been
eliminated.
4.3.2.1 Performance Results
Given that the protocol is intrinsically integrated in the system, the only
suitable way of testing its performance is to measure the whole round-trip time (RTT)
of certain pre-defined sets of data. Therefore, 10 repeated tests have been performed
with 3 datasets of different sizes: 10 MB, 170 MB and 2 GB. All sets had exactly
100 frames. The average results for the old and the new protocol are shown in Table
4.2.
Table 4.2: Performance comparison for the improved communication protocol.

Data Size    RTT (s)                              Improvement
             Old protocol    Improved protocol
10 MB        4.15            3.81                 8.11%
170 MB       37.84           32.24                14.78%
2 GB         421.11          352.72               16.24%
Given that the revisions to the protocol effectively parallelize the import/export
process, an upper bound on the performance increase would be 50%, corresponding to
complete parallelization. However, in practice this result cannot be achieved due to:
• The additional overhead resulting from a more complex communication between
the two processes,
• The fact that the load is not balanced between the two processes (importing
in Unity is in general faster than exporting from ParaUnity, and
therefore there is an idle time which cannot be avoided).
All things considered, as the data in Table 4.2 shows, the revised algorithm for the
communication between ParaUnity and Unity offers an improvement of around 15%
for large enough datasets.
Chapter 5
Conclusions
In Chapters 3 and 4 a detailed analysis of the two main parts of the final system
has been provided. In particular, both the Unity Application and ParaUnity have
been discussed from an architectural and a more in-depth perspective, along with
an overview of the communication protocol between the two processes.
This chapter focuses on analyzing the final overall system and its workflow, and
on evaluating the capabilities it provides and the objectives it achieves.
5.1 Final system architecture
A UML diagram of the architecture of the resulting system is shown in Figure 5.1.
The entity in the top-left corner is the ParaView and VTK software system, in which
the CFD dataset is loaded. When the user clicks the appropriate button on ParaView's
GUI, the data is processed by VTK and sent to ParaUnity. There, a TCP
connection with Unity is established, while at the same time the data is exported
as a Named Shared Memory Object in the system’s RAM. When ParaUnity notifies
Unity about the availability of data in the RAM, the latter starts the importing
process, and the model is then loaded in the running scene. It can be noticed that
Unity is also responsible for back-and-forth communication with the VR kit, in or-
der to collect data from sensors and user input and update the view on the HMD
accordingly.
From such work-flow it can be concluded that the proposed system succeeds in
bridging the gap between ParaView and VR technologies, thanks to an improved
version of ParaUnity and a full-fledged Unity Application.
Figure 5.1: Final system architecture diagram
5.2 Objectives achieved
After having described the final system in Section 5.1, it is necessary to compare
its capabilities with the project objectives outlined in Section 1.1. As said before,
the work described in this thesis followed a rather open-ended, exploratory approach
instead of a strictly defined set of software requirements, but it is nonetheless
important to validate the results against the guidelines defined at the beginning of
the development, which will be analyzed one by one in the remaining part of this
section.
The project shall result in a working prototype of a VR application: as
described in Chapter 3, one of the two outputs of this project is a fully working
Unity Application, with a satisfactory set of features.
The application shall allow the handling of CFD data: all the CFD mod-
els imported in the Unity Application can be visualized, interacted with (grabbed,
moved, rotated, ...) and manipulated (scaled, animated, ...). More forms of
manipulation, such as mesh slicing, are currently being investigated in MSc thesis
projects by fellow students.
The application shall allow the import of data from ParaView: through
ParaUnity, data loaded in ParaView and VTK can be exported to the Unity Appli-
cation.
The application shall run compatibly at least on Windows: one of Unity’s
main strengths is its cross-platform compatibility. In fact, most of a Unity project can
be transparently compiled to many desktop and mobile platforms (Windows, Linux,
MacOS, Android, iOS, ...). The current implementation of the Unity application
is completely compatible with Windows (version 7 or greater). Additional cross-
platform compatibility can be easily achieved by generalizing the access to RAM,
which currently uses the Windows API.
The application shall support a HTC Vive kit: since the HTC Vive kit is the
one that has been used throughout the whole development process, this objective is
satisfied. Moreover, thanks to the flexibility of VRTK, adding support to different
VR kits would only require remapping the control buttons and their callbacks.
The code should be designed to be maintainable, flexible and expandable:
as said before, much of the development effort has been spent in designing a codebase
which could then be expanded. In its current state, the application is designed
to facilitate adding new features, increasing its cross-platform
compatibility, or modifying parts of its functionality.
The application should be easy to use: one of the main points being made in
Chapter 3 is the focus on the intuitiveness of the interactions between the user and
the application. In discussing design choices for most of the features, usability tests
have been carried out to highlight possible difficulties and to rework the interface
into its most natural configuration.
5.3 Future work
All considered, the project described in this thesis can be seen as an interesting
starting point for further development in the field of Virtual Reality for Computational
Fluid Dynamics. In fact, given that little to no software on the market
currently offers this kind of capability, it is probably worth investing more development
effort in enriching the application, especially by adding more CFD-related
features, so that it could become a production-ready piece of software.
Being open-ended by nature, the possibilities for further development are virtually
endless, and everything is designed to allow and ease such process. As stated before,
the main path to follow to improve the application would be porting as many
ParaView features as possible, and reimplementing them natively into the application.
In fact, data manipulation such as contouring or clipping must currently be done in
ParaView, and the objects have then to be exported again to Unity. Moving their
scope to the latter would make the application more independent and self-contained,
and would be a further step towards a complete CFD application in Virtual Reality.
Moreover, another suggestion for an interesting feature would be the possibility of
displaying the chosen variable with which the mesh is colored. Currently, Unity has no
knowledge about variables and their values, and only receives information about
the colors of each face and vertex of the mesh. Giving the user the possibility
of changing the visualization of the object in real time would therefore foster a
more productive kind of interaction with the application.
As far as performance is concerned, the current bottleneck in the import/export
process, as explained in Section 4.3.1, is given by the necessity to encode data in
the X3D format. It could be interesting to research different ways to represent the
data before transferring it, so that the whole process could be sped up. Moreover, a
different representation format could also be useful to further generalize the Unity
Application, by making it compatible with other CFD visualization software besides
ParaView.
Appendix A
Setup Instructions
This appendix provides the instructions to set up, install and run the software system
described in this thesis. They refer to a machine with VR-ready hardware running
Windows 10.
A.1 Building ParaView and ParaUnity
This section provides the instructions for building a working copy of ParaView with
the ParaUnity plug-in. It is a simplified and adapted version of the readme file of
the official ParaUnity repository [23].
A.1.1 Prerequisites
• CMake 3.8.1
• Visual Studio 2015 x64 Community Edition
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...MyIntelliSource, Inc.
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVshikhaohhpro
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...kellynguyen01
 
A Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxA Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxComplianceQuest1
 
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataAdobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataBradBedford3
 
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfThe Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfkalichargn70th171
 

Último (20)

Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
 
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number Systems
 
Salesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantSalesforce Certified Field Service Consultant
Salesforce Certified Field Service Consultant
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.com
 
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
 
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
 
Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)Introduction to Decentralized Applications (dApps)
Introduction to Decentralized Applications (dApps)
 
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
 
DNT_Corporate presentation know about us
DNT_Corporate presentation know about usDNT_Corporate presentation know about us
DNT_Corporate presentation know about us
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
A Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxA Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docx
 
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataAdobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
 
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfThe Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
 

Design and implementation of a Virtual Reality application for Computational Fluid Dynamics

  • 4. Abstract The project discussed in this paper consists in the development of a working prototype that allows visualization, interaction and manipulation of Computational Fluid Dynamics datasets in a 3D Virtual Reality environment. In the first part of this thesis, background information about the state of the art of Virtual Reality applications for scientific purposes is presented. Research shows that, although there are current development efforts, especially in the military and medicine fields, no full-fledged solutions are currently available on the market. Moreover, the first part also outlines the main software and hardware technologies that have been used in the development phase of the project. The thesis then proceeds to describe the two pieces of software that make up the output of the project: a Virtual Reality application built upon the Unity 3D graphics and physics engine, and an improved version of a pre-existing plug-in for the well-known CFD visualization software ParaView, called ParaUnity. The final system in which these two applications are included is effectively able to export CFD datasets from ParaView and import them in a Virtual Reality environment, in which various forms of interaction and manipulation have been implemented. In conclusion, the thesis underlines the design choices made in engineering the applications, which led to a general, flexible and easily expandable solution; this is of the utmost importance in view of further developments in the features and capabilities that the resulting software system can offer to the user.
  • 5. Acknowledgements To Cri, who has always been there for me. To my family: my mother Sabina, my father Angelo and my brother Giorgio, who have supported me in any possible way. To Fede, not only a friend but also the best late-night software engineering consultant I could have hoped for. To my friends I Cazzilli: Seb, Ste and Fede, who probably spent this year training at gnagno so that I now have no chance of winning against them. To Simo, who willingly spent his study breaks wearing blond wigs and yellow shirts with me in front of hundreds of thousands of people. To Ventura and R´emi, for sharing not only the journey, but also countless cups of coffee and an embarrassing amount of French cheese. To Antonis, for believing in us and in our project even in times of crisis. To prof. Fabris, whose passion for teaching should serve as an example for every teacher and student. To myself, for never giving up on my daily dose of Italian pasta. iv
  • 6. Contents Abstract (Italian) ii Abstract iii Acknowledgements iv List of Figures ix List of Tables x Abbreviations xi 1 Introduction 1 1.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 1.2 Project Management . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 1.2.1 Time management . . . . . . . . . . . . . . . . . . . . . . . . 4 1.2.2 Versioning and productivity tools . . . . . . . . . . . . . . . . 6 1.2.2.1 Github . . . . . . . . . . . . . . . . . . . . . . . . . . 6 1.2.2.2 Waffle . . . . . . . . . . . . . . . . . . . . . . . . . . 6 2 Background 8 2.1 Literature Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8 2.1.1 Virtual Reality applications . . . . . . . . . . . . . . . . . . . 9 2.1.2 Virtual Reality for scientific data . . . . . . . . . . . . . . . . 10 2.2 Technologies Used . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 2.2.1 HTC Vive . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 2.2.1.1 HTC Vive software . . . . . . . . . . . . . . . . . . . 11 2.2.2 ParaView and VTK . . . . . . . . . . . . . . . . . . . . . . . . 12 2.2.2.1 Virtual Reality Capabilities in ParaView . . . . . . . 14 2.2.3 Unity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15 2.2.3.1 Object behaviors in Unity . . . . . . . . . . . . . . . 15 v
  • 7. Contents vi 2.2.3.2 Virtual Reality Capabilities in Unity . . . . . . . . . 17 2.2.4 ParaUnity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17 3 Unity Application 19 3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 3.1.1 Why Unity? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 3.2 Application Architecture . . . . . . . . . . . . . . . . . . . . . . . . . 21 3.2.1 Top-level Managers . . . . . . . . . . . . . . . . . . . . . . . . 22 3.3 Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25 3.3.1 Controllers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25 3.3.2 Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 3.3.2.1 Lighting . . . . . . . . . . . . . . . . . . . . . . . . . 28 3.3.3 ParaView object loading . . . . . . . . . . . . . . . . . . . . . 29 3.3.3.1 3D model creation . . . . . . . . . . . . . . . . . . . 30 3.3.3.2 3D model material . . . . . . . . . . . . . . . . . . . 31 3.3.4 Static Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32 3.3.5 Radial Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . 34 3.3.6 Object interaction . . . . . . . . . . . . . . . . . . . . . . . . 36 3.3.6.1 Bounding box . . . . . . . . . . . . . . . . . . . . . . 38 3.3.7 Resizing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 3.3.8 Animation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42 3.3.9 Slicing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44 3.3.10 Logger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45 4 ParaUnity 49 4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49 4.2 Initial state . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50 4.2.1 Plug-in implementation . . . . . . . . . . . . . . . . . . . . . . 52 4.2.1.1 TCP Socket . . . . . . . . . . . . . . . . . . . . . . . 52 4.2.1.2 Exporting the objects . . . . . . . . . . . . . . . . . 53 4.2.2 Initial communication protocol . . . . . . . . . . . . . . . . . 54 4.3 Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55 4.3.1 From Disk I/O to RAM I/O . . . . . . . . . . . . . . . . . . . 55 4.3.1.1 Performance Results . . . . . . . . . . . . . . . . . . 58 4.3.2 Asynchronous communication for time-dependent data . . . . 59 4.3.2.1 Performance Results . . . . . . . . . . . . . . . . . . 60 5 Conclusions 62 5.1 Final system architecture . . . . . . . . . . . . . . . . . . . . . . . . . 62 5.2 Objectives achieved . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
  • 8. Contents vii 5.3 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65 A Setup Instructions 67 A.1 Building ParaView and ParaUnity . . . . . . . . . . . . . . . . . . . . 67 A.1.1 Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67 A.1.2 Obtain the source code . . . . . . . . . . . . . . . . . . . . . . 68 A.1.3 Compile Qt . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68 A.1.4 Compile ParaView . . . . . . . . . . . . . . . . . . . . . . . . 68 A.1.5 Compile ParaUnity . . . . . . . . . . . . . . . . . . . . . . . . 69 A.1.6 Loading the plug-in in ParaView . . . . . . . . . . . . . . . . 70 A.2 Building the Unity Application . . . . . . . . . . . . . . . . . . . . . 70 A.2.1 Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 A.2.2 Obtain the source code . . . . . . . . . . . . . . . . . . . . . . 70 A.2.3 Compile the application . . . . . . . . . . . . . . . . . . . . . 70 A.2.4 Exporting an object from ParaView to Unity . . . . . . . . . . 71 B Code of the Unity Application 72 B.1 AnimationManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72 B.2 BoundingBoxBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 75 B.3 ControllerBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . . 78 B.4 ControllersManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81 B.5 EnvironmentManager . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 B.6 FloorBehavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 B.7 FunctionsExtensions . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 B.8 Globals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 B.9 HeadsetManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87 B.10 Interactable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87 B.11 LightManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90 B.12 LogManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 B.13 ModeManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94 B.14 ParaviewObjectLoader . . . . . . . . . . . . . . . . . . . . . . . . . . 95 B.15 PointerRadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . . 99 B.16 RadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 100 B.17 SizeManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103 B.18 SlicingManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105 B.19 SlicingPlaneBehaviour . . . . . . . . . . . . . . . . . . . . . . . . . . 106 B.20 StaticMenuManager . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107 B.21 UIPointerRadialMenuBehaviour . . . . . . . . . . . . . . . . . . . . . 111
  • 9. Contents viii B.22 XDocumentLoader . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112 C Code of ParaUnity 115 C.1 Unity3D.h . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115 C.2 Unity3D.cpp . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117 Bibliography 130
  • 10. List of Figures 1.1 Gantt chart of the project . . . . . . . . . . . . . . . . . . . . . . . . 5 1.2 GitHub repository for the Unity Application . . . . . . . . . . . . . . 7 1.3 Waffle board for the Unity Application . . . . . . . . . . . . . . . . . 7 2.1 HTC Vive kit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 2.2 HTC Vive controller buttons . . . . . . . . . . . . . . . . . . . . . . . 13 2.3 Example of Unity’s interface . . . . . . . . . . . . . . . . . . . . . . . 16 3.1 Screenshot of the resulting Unity Application . . . . . . . . . . . . . 20 3.2 Architectural diagram of the main Unity scene . . . . . . . . . . . . . 24 3.3 Screenshot of the two controllers in action . . . . . . . . . . . . . . . 27 3.4 Screenshot of the environment of the application . . . . . . . . . . . . 28 3.5 Screenshot of a mesh colored according to its values . . . . . . . . . . 32 3.6 Screenshot of the opened static menu . . . . . . . . . . . . . . . . . . 34 3.7 Screenshot of the opened radial menu . . . . . . . . . . . . . . . . . . 36 3.8 Screenshot of the bounding box . . . . . . . . . . . . . . . . . . . . . 39 3.9 Screenshots of four different frames of an ongoing animation . . . . . 44 3.10 Screenshot of the mock implementation of the slicing plane . . . . . . 46 3.11 Screenshot of a message being logged through the logger . . . . . . . 47 4.1 ParaUnity buttons on ParaView’s GUI . . . . . . . . . . . . . . . . . 50 5.1 Final system architecture diagram . . . . . . . . . . . . . . . . . . . . 63 ix
  • 11. List of Tables 2.1 ParaView-VTK Architecture (simplified) . . . . . . . . . . . . . . . . 14 4.1 Performance comparison between disk I/O and RAM I/O . . . . . . . 59 4.2 Performance comparison for the improved communication protocol. . 61 x
  • 12. Abbreviations CFD Computational Fluid Dynamics VR Virtual Reality AIRC Aerospace Integration Research Centre SATM School of Aerospace, Technology and Manufacturing HMD Head Mounted Display API Application Programming Interface OS Operating System VRTK Virtual Reality ToolKit GUI Graphical User Interface OOP Object Oriented Programming OOD Object Oriented Design UI User Interface TCP Transfer Communication Protocol RAM Random Access Memory XML Extensible Markup Language UML Unified Modeling Language xi
  • 13. Chapter 1 Introduction The work presented in this thesis consists in the development of a Virtual Reality application for visualizing and interacting with Computational Fluid Dynamics datasets. With Virtual Reality technologies improving rapidly, particularly driven by the gaming and entertainment market, industries have begun to research possible applications across various fields, such as medicine, learning and engineering. Chapter 2 will present the current state of the art regarding Virtual Reality applications and will provide examples of the research being done. One field for which Virtual Reality technologies offer some promising possibilities is Computational Fluid Dynamics. In fact, given that CFD datasets are usually large, complex, three-dimensional and often time-dependent, it is evident that an immersive and interactive experience in Virtual Reality could provide a new, more efficient way to visualize and interact with such data. This is among the reasons why Cranfield University's Aerospace Integration Research Centre (part of the School of Aerospace, Technology and Manufacturing) has founded a Virtual Reality R&D division. The work described in this thesis is one of the first three projects of that division. The project itself consisted in the development of an application that could load CFD data, present them to the user in a virtual environment and provide some
  • 14. Introduction 2 degrees of interaction and manipulation. As described in Chapter 3, the development platform of choice was Unity. Such application would receive the CFD objects from ParaView and VTK (both presented, along with Unity, in Chapter 2), and then act independently as a stand-alone application. The Virtual Reality kit for the development is a HTC Vive (further detailed in Section 2.2.1), which consists in a head-mounted display, two haptic controllers and two cameras for three-dimensional tracking. As explained in Chapter 4, a plugin for ParaView called ParaUnity was used (and further developed) to enable the communication between the two software systems. The importance of this lies in the fact that the user does not need to know about Unity to run the application, as they are presented with the same familiar interface of ParaView with the addition of an export button. In Chapter 5 the final state of the overall system is presented. As one can see, it consists in a complete, self-contained software system that provides the user with the possibility of interacting with ParaView objects in a virtual environment. Moreover, a summary of the obtained results is provided and some ideas for future development are proposed to the reader. 1.1 Objectives Being a rather open-ended project, i.e. a project in which there is no strict and well- defined set of software requirement specifications, the objectives of the development have been purposefully kept wide and general, as to reflect the idea that the project could follow an exploratory approach. Nonetheless, there are still some guidelines that have been followed from the begin- ning to the end of the project: • The project shall result in a working prototype of a Virtual Reality application.
  • 15. Introduction 3 • The application shall allow the handling of CFD data; in particular, it shall provide: – visualization of the data, – interaction with the data, – some basic forms of manipulation of the data. • The application shall allow the import of data from ParaView. • The application shall run compatibly at least on Windows (version 7 or greater), and optionally on Linux. • The application shall support a HTC Vive kit. • The code should be designed to be maintainable, flexible and expandable. • The application should be easy to use, being it aimed at CFD scientists with little to no prior VR experience. In Section 5.2 these objectives will be discussed in light of the work done. 1.2 Project Management If on one hand the open-ended nature of the project can foster creativity and inno- vation, given that the developer has much more control over the whole process from design to deploy, on the other it requires careful planning to avoid failures and to maximize efficiency. It is, in fact, the job of a Software Engineer to not only design the code, but to also take all the necessary step to organize the work and prepare all the tools that he or she may need.
  • 16. Introduction 4 1.2.1 Time management The duration of this project spans several months, from March to early September. To organize it, a certain number of high-level tasks has been defined. These have then been further divided into units of work and given a duration and a time allocation. The result of this planning can be seen in the Gantt chart in Figure 1.1. The chart clearly shows the division of the project into phases. Initially, there was a preparatory research phase in which the existing literature was reviewed and documented, and time was also dedicated to becoming familiar with ParaView and VTK. This was followed by a second phase of environment setup, with the objective of preparing the deployment machine with all the necessary software and setting up the development environment for an optimal work-flow. Part of this phase was also spent exploring compatibility with a Linux operating system. These two initial phases were a collaborative effort with the other two MSc students working on the same topic, and therefore required careful job division and collaboration to ensure better time efficiency. The third phase marked the start of the actual individual development of the Unity application. This lasted from late May to late July, when the work had to be finalized in time for the Technical Presentation. The time planning shows that the development followed a feature-based approach, in which each work unit consists in a feature as perceived by the user. The most critical point of this phase was the integration with ParaUnity (the development of which proceeded more or less in parallel with the Unity Application). In fact, this required having both pieces of software at a very specific point of their development, and was a necessary prerequisite for any further feature. Both the development of the Unity Application and of ParaUnity were followed by careful validation testing. The next phase began after the Technical Presentation, and was completely dedicated to refactoring the code, ensuring higher flexibility and maintainability. This was then followed by the writing of the thesis and the rest of the documentation, in time for the thesis submission deadline in mid-August.
  • 17. Introduction 5 Figure 1.1: Gantt chart of the project, spanning March to September 2017 and covering the phases Preparatory Research, Environment Setup, Unity Application, ParaUnity, Technical Presentation, Code refactoring, Documentation, Thesis Submission and Poster Presentation, each with its sub-tasks.
  • 18. Introduction 6 1.2.2 Versioning and productivity tools As any project of this size, and especially being it part of a prototyping work- flow, some productivity tools were required to ensure organization and efficiency during all phases of the development. Among these, GitHub and Waffle have proved particularly useful in managing some aspects of the project. 1.2.2.1 Github The codebase for the project has been hosted on GitHub, a web version control repository based on Git. The reasons for this choice were multiple: on one hand, Git’s versioning system gives the advantage of having fine-grained incremental back- ups, accessible from any machine, with the version history being fully explorable via GitHub’s web interface. This, along with the merging capabilities that Git natively provides, resulted in a smooth initial phase of collaborative development with the two other students working on similar projects. Moreover, thanks to Git’s branch- ing system the project benefited from a more organized development, in which each feature was independently developed on one branch. As an example, the home page of the GitHub repository for the Unity Application is shown in Figure 1.2. 1.2.2.2 Waffle Waffle is an online automated project management tool, that interfaces with GitHub repositories, handling GitHub issues and pull requests. Its main feature is the ability to create a board-like view for every issue in the repository, thus providing the developer with an interactive to-do list that interfaces directly with GitHub. During the development, it has been used extensively as a productivity tool, mainly to keep track of the features and the bugs that were being worked on. In Figure 1.3 the board view for an initial state of the Unity Application repository is shown.
  • 19. Introduction 7 Figure 1.2: GitHub repository for the Unity Application Figure 1.3: Waffle board for the Unity Application
  • 20. Chapter 2 Background 2.1 Literature Review In a world where technology is ever-changing and rapidly evolving, one of the greatest and most fascinating challenges is to try and combine results coming from different areas of research to obtain a newer and more interesting perspective on a subject. This is especially true when new technologies involve different ways of interfacing with the digital world, thus providing revolutionary means to visualize and interact with data. Virtual Reality is a perfect example of one such technology. It consists of all the tools that use a computer to generate images, sounds and other sensory inputs to immerse the user into a virtual world. It can be said, as stated in [1], that: From an evolutionary perspective, virtual reality is seen as a way to over- come limitations of standard human-computer interfaces; from a revo- lutionary perspective, virtual reality technology opens the door to new types of applications that exploit the possibilities offered by presence simulation. 8
  • 21. Background 9 Ever since the creation of the first toolkits for Virtual Reality, many fields have benefited from such new approach to the digital world. One of these, and the one this thesis focuses on, is Computational Fluid Dynamics. As stated on [2], Computational Fluid Dynamics is a specific application of fluid mechanics that uses methods of numerical analysis to solve problems of flows of fluids. In the past, great effort has been spent in developing software to visualize and analyze scientific data such as that coming from CFD applications, each piece of software focusing on specific aspects and offering different capabilities. Lately, some of these programs have been further developed (often through the use of additional plugins, as seen in Section 2.2.4) to offer a certain degree of VR integration. How- ever, at the current stage, there are still no stable and widespread versions of such programs that have been developed with native VR capabilities. 2.1.1 Virtual Reality applications Concepts that could be referred to as Virtual Reality can be dated back as far as the first half of the twentieth century, but VR technologies in their current, modern form have been around for the last two decades. In fact, in 1996, Steve Bryson published an article in which he stated that “[by] immersing the user in the solution, virtual reality reveals the spatially complex structures in computational science in a way that makes them easy to understand and study” [3]. The same paper, on the other hand, highlighted the challenges in performance that such technology would pose. Pushed by the gaming industry and the consumer market [4], Virtual Reality has evolved ever since then to achieve what Bryson had predicted. Currently, there exist VR applications in many fields, one obvious example of which being the military. For instance, some of the most common military uses for Virtual Reality include flight simulators for pilot training, such as those detailed in [5] and [6], or more general-purpose simulators aimed at infantry like the one described in [7]. Another field in which Virtual Reality is becoming more and more widespread is medicine. In fact, VR environments are currently being used for simulating surgical procedures (as explained in [8]) and for visualizing and interacting with medical data (such as
  • 22. Background 10 MRI [9] and CAT scans [10]). In the last few years, programs aimed at bringing Virtual Reality to schools and universities are also being developed and tested, as clearly explained in [11]. On the other hand, as far as Computational Fluid Dynamics data are concerned, there have been different attempts to develop specific CFD software with native VR capabilities in the past (see [12] and [13]), and some development effort is currently being made to integrate existing applications with Virtual Reality (see Section 2.2.2). However, as stated before, results are nowhere near a full integration for general CFD problems at this time. 2.1.2 Virtual Reality for scientific data Most of the applications presented in Section 2.1.1, especially those aimed at visual- izing scientific data in a virtual environment, share a common work-flow regarding how data is handled (as explained in [14]): • Generation: data is generated from sensors or external software. • Pre-processing, importing, post-processing: data is moved to the visu- alization software. • Rendering: the software analyses data to create its graphical representation. • Visualization: data is presented to the user via a VR kit. • Feedback: through sensors, gestures, specific hardware such as controller and a GUI, the user can modify the visualization in real time. The same work-flow is also adopted by the software system discussed in this the- sis, where the Visualization and Feedback phases are implemented in the Unity Application described in Chapter 3.
  • 23. Background 11 2.2 Technologies Used Throughout the development of this project, a certain number of technologies, both hardware and software, has been used extensively. It is therefore the purpose of this section to provide an introduction to such technologies in order to allow a better understanding of the whole software system involved. 2.2.1 HTC Vive The Virtual Reality kit that has been used for the development is an HTC Vive kit [15], developed by HTC and Valve Corporation. The kit, officially released on 5 April 2016, consists in a head-mounted display (with a camera), two wireless haptic controllers and two cameras for position tracking (as shown in Figure 2.1). The interaction between the cameras, the controllers and the HMD allow for Room scale tracking, which means that the user’s movements and position in the real space are mapped in the virtual environment, thus creating a more immersive experience [16]. From a technical standpoint the kit’s headset uses one screen per eye, with a display resolution of 1080x1200 and a refresh rate of 90 Hz. The kit collects sensory inputs from a total of 70 infrared sensors (32 from the headset and 19 from each controller) along with gyroscopes and accelerometers. As seen in Figure 2.2, each controller includes several forms of input: regular buttons, a 2D axis radial touchpad, a trigger and two grip buttons. 2.2.1.1 HTC Vive software Software-wise, the HTC Vive ships with a runtime application system called SteamVR, a VR Platform developed by Valve as an extension of Steam. Developers can in- terface SteamVR via OpenVR [17], a C++ API that mainly exposes the events
  • 24. Background 12 generated by the controller, the headset’s view and the SteamVR dashboard. As far as OS support is concerned, the HTC Vive kit was initially developed for Microsoft Windows platforms, although current effort is being carried on for MacOS and Linux support. Figure 2.1: HTC Vive kit Source: Photo courtesy of HTC 2.2.2 ParaView and VTK ParaView is an open-source, multi-platform data analysis and visualization applica- tion [18]. It is entirely written in C++, it features both 2D and 3D representations and can be controlled either via its interactive interface or programmatically. In its latest versions, ParaView also features highly optimized parallel computations in order to support clusters of supercomputers. The software architecture is designed to follow a modular pattern, in order to en- courage further development of new features through additional plugins or through
  • 25. Background 13 Figure 2.2: HTC Vive controller buttons; in the order they are numbered: menu button, touchpad, system button, hair trigger, grip button. Source: Photo courtesy of HTC whole applications. One of these, Immersive ParaView, released in 2011, provides a certain degree of integration with VR technologies, as described in [19] and [20]. As thoroughly explained in [21] and shown in Table 2.1, one of ParaView’s underlying layers consists of Visualization Toolkit (VTK), which is a C++ open-source software system for 3D computer graphics, image processing, and visualization [22]. VTK provides the basic visualization and rendering algorithms, thus being the underlying engine on which ParaView relies. From a higher abstraction point of view over ParaView’s architecture, the software exposes wrappers to most of its features through APIs in Python, Java and TCL.
  • 26. Background 14 Table 2.1: ParaView-VTK Architecture (simplified), layers from top to bottom: ParaView; VTK; OpenGL, MPI, OpenVR, etc. 2.2.2.1 Virtual Reality Capabilities in ParaView From version 5.3, released in March 2017, ParaView offers some native Virtual Reality capabilities, although they are intended for developers and are not included in the general distribution binaries. The implemented features include the possibility to send the loaded objects to a VR environment, in which some forms of interaction such as grabbing, moving and scaling are provided to the user. However, since it is only a beta version, not all types of objects can be simulated in VR, and the features offered come with some disadvantages. From the user's point of view, given that the Virtual Reality environment is implemented in the main thread (since it requires synchronization with VTK's event-handling routines), ParaView's GUI is completely unresponsive while the objects are displayed on the headset. Furthermore, from a developer's perspective, it must be noted that both ParaView and VTK have a large, complex and structured code-base, on which layers of code have been added over the years by following a very strict set of conventions. Unfortunately, this means that, although offering great performance potential, developing natively in ParaView-VTK can often be a very time-consuming task, which is inappropriate for the open-ended project discussed in this thesis. Another limitation hindering ParaView's VR capability lies in the fact that, to obtain a VR-ready version of the software, the user is forced to build it from source, along with all its dependencies and those required to communicate with SteamVR, thus making it a rather unpractical solution for general distribution.
  • 27. Background 15 2.2.3 Unity Unity is a cross-platform, all-purpose graphics and physics engine used mainly for simulations, video games and animations. It was first released in 2005 by Unity Technologies, and ever since then it grew to become a de-facto standard for inde- pendent and industry-level game development. It offers both 3D and 2D capabilities, although for this project only the 3D version has been used. Unity ships with a highly workflow-centered editor, which provides the developer with all the necessary tools to build a scene, populate it with objects and define behaviors for each object. Every Unity project is characterized by a certain number of scenes, which basically represent environmental units, the assets, i.e. the resources that have to be loaded in each scene (3D models, textures, scripts, images, . . . ) and the GameObjects, which are instances of the objects. Unity editor’s interface, as can be seen in Figure 2.3, allows for a hierarchical definition of the objects and their properties which can be done on a much higher abstraction level than the underlying code. All of these features contribute to making prototyping a simpler and quicker process on Unity. From a coding perspective, Unity mainly supports two programming languages for scripting: C# and UnityScript (which is a proprietary language derived from JavaScript). For the purpose of this project all scripts for Unity have been devel- oped in C#. 2.2.3.1 Object behaviors in Unity In Unity, each object can have zero or more behaviors, which define the way the object responds to inputs. When defining a new behavior, one is essentially attaching a script to an object. Each script consists of a class which inherits from a MonoBehaviour class. This is an abstract class that defines several useful methods, the most relevant being:
  • 28. Background 16 Figure 2.3: Example of Unity's interface • Awake(): a method that is called immediately after the object has been created, usually at the beginning of a scene. • Start(): a method similar to Awake, with the exception that it is ensured to be called after all the Awakes of the other objects have been executed. This allows inter-object communication at start-up. • Update(): a method that is called at every frame, i.e. several tens of times per second. By overriding these methods and defining new ones, the whole behavior of each object can be defined.
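As an illustration of how these lifecycle methods are typically used, the following minimal behavior script is given. It is a sketch written for this description only: the class name and the light-adjusting logic are purely illustrative and are not part of the application's code base.

using UnityEngine;

// Minimal illustrative behavior: one reference resolved across Awake/Start,
// then used every frame. In a real project each MonoBehaviour lives in its own file.
public class ExampleBehaviour : MonoBehaviour
{
    private Light sceneLight;   // reference to another object, resolved at start-up

    void Awake()
    {
        // Called right after the object is created: only self-initialization here.
        Debug.Log("Created: " + gameObject.name);
    }

    void Start()
    {
        // Called after every Awake has run, so other objects can safely be looked up.
        sceneLight = FindObjectOfType<Light>();
    }

    void Update()
    {
        // Called once per rendered frame.
        if (sceneLight != null)
        {
            sceneLight.intensity = Mathf.PingPong(Time.time, 1.0f);
        }
    }
}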
  • 29. Background 17 2.2.3.2 Virtual Reality Capabilities in Unity Unity does not yet natively support Virtual Reality kits, although this can be easily achieved via some of the official plug-ins available on the Unity Asset Store. For this project, two plug-ins have been included in the application: SteamVR, developed by Valve, which exposes the OpenVR API to the Unity Application, and Virtual Reality ToolKit, a community-developed plug-in consisting of a set of predefined objects and classes that abstract the underlying VR hardware and offer the developer transparent access not only to the HMD and the controllers, but also to the events generated by the kit. 2.2.4 ParaUnity As stated in Section 2.2.2, ParaView's capabilities can be extended through plug-ins. One such plug-in, named ParaUnity and developed as open-source C++ software by Rudolf Biczok [23], bridges the gap between ParaView and Unity. In fact, if built and loaded in ParaView, it adds two buttons to the GUI through which any object currently loaded in the pipeline can be sent to a pre-prepared Unity scene, this being either a built executable application or an instance of a project opened in the Unity editor. As will be further discussed in Chapter 4, ParaUnity suffers from a certain number of limitations in terms of performance, compatibility and lack of features. However, since it is open-source code, it was decided to include its further development in this project to improve the performance of the overall software system. It is worth mentioning that in its current implementation ParaUnity requires the use of an older version of Qt, a cross-platform GUI development framework extensively used by ParaView. This imposed the constraint of linking the plug-in against ParaView 5.2, released in November 2016, which was also the latest version of ParaView to offer no native VR capabilities. However, since ParaUnity aims specifically
  • 30. Background 18 at making ParaView VR-ready through Unity, such requirement did not hinder the development of the project in any way.
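Returning briefly to the VRTK plug-in introduced in Section 2.2.3.2, the sketch below shows the kind of access it gives to controller events from a behavior script. It assumes the VRTK 3.x API, in which a VRTK_ControllerEvents component attached to a controller exposes C# events such as TriggerPressed and TriggerReleased; the listener class itself is illustrative and is not part of the application's code.

using UnityEngine;
using VRTK;

// Illustrative listener attached to the same GameObject as VRTK_ControllerEvents.
public class TriggerListener : MonoBehaviour
{
    private VRTK_ControllerEvents controllerEvents;

    void Start()
    {
        controllerEvents = GetComponent<VRTK_ControllerEvents>();
        controllerEvents.TriggerPressed += OnTriggerPressed;
        controllerEvents.TriggerReleased += OnTriggerReleased;
    }

    void OnDestroy()
    {
        // Unsubscribe to avoid dangling handlers when the object is destroyed.
        controllerEvents.TriggerPressed -= OnTriggerPressed;
        controllerEvents.TriggerReleased -= OnTriggerReleased;
    }

    private void OnTriggerPressed(object sender, ControllerInteractionEventArgs e)
    {
        Debug.Log("Trigger pressed");
    }

    private void OnTriggerReleased(object sender, ControllerInteractionEventArgs e)
    {
        Debug.Log("Trigger released");
    }
}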
  • 31. Chapter 3 Unity Application 3.1 Introduction As already mentioned in Chapter 1, the main objective of this thesis was to develop an application, either as an extension of ParaView or as a more stand-alone piece of software, that would let the user visualize and interact with CFD data coming from ParaView and VTK. The output of the project was a Unity application, which can be seen opened in the Unity Editor in Figure 3.1. In Section 2.2 all the necessary software, technologies and environments have been introduced: ParaView, VTK, Unity and ParaUnity. In this section a brief explana- tion of the reasons behind the choice of Unity will be given, followed by a high-level view of the Unity Application along with some more in-depth implementation de- tails. 3.1.1 Why Unity? When initially planning the general development for the project, two alternative paths have been identified: either continuing the development of ParaView’s last 19
  • 32. Unity Application 20 Figure 3.1: Screenshot of the resulting Unity Application version, which included some beta VR capabilities as native features, or using a different environment for the Virtual Reality and somehow connect ParaView to it. Given the open-ended nature of the project, along with the fact that a prototyping programming work-flow had been chosen, and given the complex, structured way in which ParaView’s and VTK’s codebases are organized, the second route seemed a better solution in terms of number of features that could have been implemented over the course of the project. With this in mind, Unity has been chosen as the graphics engine of the application. As already discussed in Section 2.2.3, Unity is an environment designed specifically for prototyping and gray boxing1 , and through plug-ins it can be effectively made VR-ready. Furthermore, as will be further explained in Chapter 4, there exists an open-source plug-in for ParaView which can bridge the gap between this and Unity, although it required some further development. For these reasons, along with the 1 In Unity, the principle of gray boxing is to design a scene in which all the objects consist of gray boxes (the default 3D object in Unity), although they behave similarly to how they should in a final version. This is meant to quickly test an environment or a scene without spending time on 3D models, textures, and general artistic design, but only by focusing on the mechanics and the features. It can be considered a kind of 3D sketch for an environment.
  • 33. Unity Application 21 large availability of pre-built community-made tutorials and tools, and the inherent cross-platform compatibility offered by Unity, the decision to proceed in this direction seemed the most appropriate. 3.2 Application Architecture The application's architecture has been designed to be as flexible and expandable as possible, so that any additional feature could be added to the VR environment with minimum effort. The project consists in only one scene, called Main scene, in which all the objects are inserted with the hierarchy shown in Figure 3.2. Each feature is controlled by Managers, which implement the business logic of the application through behavior scripts (as further detailed in Section 3.3). Managers communicate with each other through references, and are designed to expose a simple and coherent interface to the upper layers of the application. As stated in Section 2.2.3, Unity offers two different programming languages for the behavior scripts: C# or UnityScript. For the purpose of this project, all scripts have been developed in C#. Some managers have children objects, which are usually the actual models rendered in the Virtual Reality environment. It is important to notice that the parent-child relationship between the managers and their children has nothing to do with the concept of inheritance in programming. In fact, defining a parent in Unity simply means that the child's position, rotation and scale will be relative to the parent's; this is especially useful to handle animations (as seen for example in Section 3.3.5). Inter-object communication happens through a set of globally shared variables. In fact, although Unity provides ways of traversing the scene in order to look for specific objects, this has a non-negligible computational cost, and is often advised against by the community. Instead, each object that needs to be accessed from outside registers itself to a Global static class. Then, at runtime, the application
  • 34. Unity Application 22 begins with an initialization process in which each manager stores the references to the objects it needs to communicate with as private fields, for faster subsequent accesses. On the other hand, all non-manager objects need not know about the external parts of the software, in accordance to the SOLID principles of Object- Oriented Programming (as described in [25]). Internally, both Managers and objects have one or more behavioral scripts attached, in which the overridden implementation of Awake, Start and optionally Update are defined. Furthermore, methods have been implemented to trigger and process custom events, such as the loading of an object from ParaView. Such methods follow the naming convention OnEventName, and are called through a callback system. 3.2.1 Top-level Managers At the current stage of development, the following top-level managers have been defined, as seen in Figure 3.2): • VRManager: core of the VRTK plugin (see Section 2.2.3.2), responsible for the Virtual Reality hardware (sensors, controllers, camera) and their 3D representations. • ModeManager: exposes utility methods to check whether the Unity appli- cation is running as a stand-alone executable (“Player mode”) or inside the Unity editor (“Editor mode”). Used by ParaViewManager. • EnvironmentManager: responsible for the virtual environment in which the user is immersed, i.e. the room. It exposes methods to toggle the room’s visibility, used by StaticMenuManager. • StaticMenuManager: manages the static menu that the user can summon by pressing the Menu button on the controller. It’s responsible for toggling its visibility and for binding the menu buttons to the appropriate functions. • LightManager: responsible for the lighting of the environment. Currently exposes a function to set the intensity of the ambient lighting.
  • 35. Unity Application 23 • ParaViewManager: the most fundamental part of the application: it is responsible for communicating with a ParaView instance, interpreting its data and creating an appropriate object in the Unity VR environment. Further details about the communication with ParaView are discussed in Section 4.2.2. • AnimationManager: handles the animation of the 3D models, in case of time-dependent datasets. It exposes functions to play, pause and loop the animation, while keeping the state of the current frame. • SizeManager: responsible for resizing the 3D object, it allows upscaling, downscaling and optional autoresizing to a custom size at object loading. • SlicingManager: mock implementation of a slicing plane, its purpose is fur- ther discussed in Section 3.3.9. • LoggerManager: handles the visibility of an optional overlay text that dis- plays info messages, warnings and errors directly in the headset’s display. It is mainly intended for developers and it is turned off by default. More technical information about these entities will be provided in Section 3.3.
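To make the communication pattern described in Section 3.2 more concrete, the following simplified sketch shows a Global static class and two managers using it: one registering itself in Awake, the other caching the reference in Start. The actual implementation of these classes is listed in Appendix B and differs in its details; the member names used here are illustrative only.

using UnityEngine;

// Simplified sketch of the shared registry; the real Globals class is in Appendix B.8.
public static class Globals
{
    public static GameObject ParaviewObject;            // set when an object is loaded
    public static LightManager LightManagerInstance;    // registered by the manager itself
}

public class LightManager : MonoBehaviour
{
    void Awake()
    {
        // Register during Awake, so the reference exists before any Start runs.
        Globals.LightManagerInstance = this;
    }

    public void SetAmbientIntensity(float value)
    {
        RenderSettings.ambientLight = new Color(value, value, value, 1);
    }
}

public class StaticMenuManager : MonoBehaviour
{
    private LightManager lightManager;

    void Start()
    {
        // Cache the reference once, instead of traversing the scene at runtime.
        lightManager = Globals.LightManagerInstance;
    }

    public void OnBrightnessSliderChanged(float value)
    {
        lightManager.SetAmbientIntensity(value);
    }
}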
  • 36. Unity Application 24 Figure 3.2: Architectural diagram of the main Unity scene
  • 37. Unity Application 25 3.3 Features After giving a high-level description of the software's architecture, it is worth analyzing the most important features from a more in-depth perspective. 3.3.1 Controllers As explained in Section 2.2.1, the Virtual Reality kit chosen for the development of the application is an HTC Vive. In particular, it is worth recalling that each controller features five different buttons, one of which is also a 2D circular trackpad. In order to make the user experience easy and intuitive, while at the same time including as many features as possible in the controllers, a usability study has been carried out to determine the most appropriate button configuration. The results of these tests showed that no users had difficulties with an asymmetric configuration, in which the two controllers offer different functionalities, and the test users found themselves using primarily the trackpad button and the trigger. Consequently, the following choices have been made: • By default, the left controller would host a radial menu with all the features related to the ParaView object (see Section 3.3.5), while the right controller would be primarily used as a pointer device to interact with the UI elements (see Section 3.3.4). • An "Inverted Controllers" mode would be provided, aimed at left-handed users. • Both controllers would have the ability to grab the object. • The buttons would be bound to the following functions: – Menu button: open the static menu, – Trigger: when pressed while inside the object and held, the user is grabbing the object; if pressed while the pointer is shown, behaves similarly to a mouse click,
  • 38. Unity Application 26 – Trackpad: when touched on the left controller, spawns the radial menu, which can be interacted with by moving the finger over the trackpad and confirmed by clicking the button; when pressed on the right controller, shows the pointer to interact with the static menu, – Grip button: reserved for applying/confirming ParaView-related features, such as clipping, – System button: open the SteamVR dashboard (default behavior, cannot be overridden). If on one hand this means that both controllers have to be used to access all the features of the application (which is nearly always the case for popular VR software), on the other it ensures an immersive and intuitive experience, even for users who are not familiar with the kit or with Virtual Reality in general. Figure 3.3 shows a screenshot of the controllers in action. As one can see, the radial menu is showing over the left controller, given that the trackpad is being touched, and the pointer of the right controller is currently active. 3.3.2 Environment The environment in which the user is immersed is designed to simulate a large room with natural and artificial ambient lighting. Given that the focus of the user will most likely be the object, one could think that the choice of the environment is not important. However, virtual reality development guidelines suggest that the experience of being immersed in an environment that fails to simulate reality (for example, a completely white room or an infinite floor) can, in the long run, end up alienating the user. Instead, a familiar setup with proper lighting is important to reduce perception dissonance between the real and the simulated world. Bearing this in mind, it is also important to remark that the application discussed in this thesis is meant to be further developed and improved, and the focus of the
  • 39. Unity Application 27 Figure 3.3: Screenshot of the two controllers in action project has been primarily on the features related with ParaView and the interaction with the object. Therefore, it has been chosen to develop a simple environment with minimal characteristics. The result can be appreciated in Figure 3.4. In any case, the objects that make up the environment can optionally be hidden, so that a user would find themselves walking on a plain white floor with no walls around. This is currently toggled from the Static Menu (see Section 3.3.4), and is implemented in the EnvironmentManager with the following function:
  • 40. Unity Application 28 Figure 3.4: Screenshot of the environment of the application public void ToggleShow(bool show) { // Hide all walls foreach (GameObject wall in walls) wall.SetActive(show); // Set default material to floor floor.SetMaterial(show); } 3.3.2.1 Lighting As stated before, lighting in the scene comes both from an artificial directional light and from ambient lighting. Both of them are meant to be managed by a LightManager.
  • 41. Unity Application 29 Currently, the user can adjust the brightness of the environment via a slider on the static menu. The function that implements this is:
public void SetIntensity(float value) {
    RenderSettings.ambientLight = new Color(value, value, value, 1);
}
3.3.3 ParaView object loading As stated before, the most important feature is that the application must be able to communicate with ParaView, load its objects and render them in the virtual world. In the current implementation, this task is handled by a ParaviewObjectLoader. This is responsible for the following: • Setting up a TCP listener for incoming connections from ParaView, • Implementing the communication protocol for import/export synchronization (see Section 4.3.2), • Reading the data from memory, • Calling the appropriate methods to construct the 3D model from the data, • Registering the object as global and triggering appropriate events. As this is the only scenario that is highly dependent on external input (i.e. an object being sent from ParaView, an external application), the design of the object loader required careful handling of its asynchronous nature. For this purpose, a custom event named ParaviewObjectLoaded has been designed, in order for the other managers to register their callbacks. This is important as some code (for example the autoresizing of the object) must only be executed after an object has been loaded from ParaView, which could happen at any moment after the beginning of the application and does not depend on the application itself.
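As a sketch of how another manager can consume this event, the snippet below shows a possible subscription to ParaviewObjectLoaded. The class hosting the delegate and the handler body are assumptions made for illustration; the delegate name is the one used in the RegisterParaviewObject snippet shown below.
using UnityEngine;

// Hypothetical subscriber: registers a handler for ParaviewObjectLoaded and
// reacts once the 3D model is available in the scene.
public class ExampleSubscriber : MonoBehaviour
{
    void OnEnable()
    {
        ParaviewObjectLoader.ParaviewObjectLoadedCallbacks += OnParaviewObjectLoaded;
    }

    void OnDisable()
    {
        ParaviewObjectLoader.ParaviewObjectLoadedCallbacks -= OnParaviewObjectLoaded;
    }

    private void OnParaviewObjectLoaded(GameObject paraviewObj)
    {
        // e.g. react as soon as the object exists (autoresize, bounding box, ...)
        Debug.Log("Loaded ParaView object: " + paraviewObj.name);
    }
}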
  • 42. Unity Application 30 A similar consideration can be made for the object's destruction. Supposing that the user wants to substitute the currently loaded object with another one, the former must first be deleted, as the application currently supports only one loaded object at a time. This is achieved through another event, called ParaviewObjectUnloaded, which is necessary for the other managers to clean up any reference to the object and to update their state before importing another one. As an example, the code to register a loaded ParaView object and call the appropriate callbacks is:
public static void RegisterParaviewObject(GameObject paraviewObj) {
    // Register object in globals
    Globals.paraviewObj = paraviewObj;
    // Trigger callbacks
    if (ParaviewObjectLoadedCallbacks != null)
        ParaviewObjectLoadedCallbacks(paraviewObj);
}
3.3.3.1 3D model creation To import an object, it must first be read from memory; in the current implementation, data is stored in RAM in X3D format (see Section 4.3.2). This means that in order to read the data the application must make the appropriate calls to the Windows API, and subsequently pass the result of this operation to an entity that can convert X3D strings into 3D models. The reading phase is carried out by an object that exists at a lower level of abstraction: the XDocumentLoader. This exposes a Load() function, which takes the name and the size of the object as arguments and returns a properly formatted XDocument (an XML representation of the object). The Load() function is implemented as follows:
  • 43. Unity Application 31 public static XDocument Load(string objectName , uint objectSize) { // Attach to the object in shared memory sHandle = new SafeFileHandle (hHandle , true); Attach(objectName , objectSize); // Parse the raw data into XML XDocument doc = XDocument.Parse(pBuffer); Detach (); return doc; } The XDocument is then passed to an external library, called X3DLoader, which han- dles the conversion. 3.3.3.2 3D model material One of the most important aspects of CFD data visualization is coloring. In fact, more often than not, it can be extremely useful to color an object according to the value of a certain variable (such as pressure, temperature, ...) on each face of the mesh. Such information about the color is included in the X3D representation of the object, but must be rendered in Unity through the use of shaders. For this purpose, a custom shader that implements Vertex Color Shading has been included in the project. In order to programmatically apply a material based on such shader to the object, the design choice has been to include a non-visible object, called MaterialPlace- holder, from which the ParaviewObjectManager can read information about the material chosen by the developer and apply it on any loaded objects. This gives free- dom to customize the material through the editor, even when no Paraview objects are loaded in the scene. Figure 3.5 shows an example of a mesh colored according to pressure, blue being lower values and red being higher values.
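The following sketch illustrates how the material configured on the MaterialPlaceholder might be applied to a freshly loaded object. The helper class name and the exact lookup are assumptions made for illustration, not the literal project implementation.
using UnityEngine;

// Hypothetical helper: copies the vertex-color material chosen on the
// MaterialPlaceholder onto every mesh of the imported ParaView object.
public static class PlaceholderMaterialApplier
{
    public static void Apply(GameObject paraviewObj, GameObject materialPlaceholder)
    {
        // The placeholder only carries the material configured in the editor
        Material vertexColorMaterial = materialPlaceholder.GetComponent<Renderer>().sharedMaterial;

        // Assign it to all renderers of the loaded model
        foreach (MeshRenderer meshRenderer in paraviewObj.GetComponentsInChildren<MeshRenderer>())
            meshRenderer.sharedMaterial = vertexColorMaterial;
    }
}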
  • 44. Unity Application 32 Figure 3.5: Screenshot of a mesh colored according to its values 3.3.4 Static Menu As seen in the previous sections, some of the aspects of the VR environment can be changed at runtime. This happens through a menu, which is called StaticMenu to distinguish it from the radial menu (see Section 3.3.5). By default the menu is hidden, so as not to interfere with the object visualization, but can be toggled through the Menu button on any of the two controllers. The behavior defined in the StaticMenuManager class mimics that of the SteamVR dashboard, which is the default system menu of the whole SteamVR environment: when opened, the static menu positions itself at a fixed distance directly in front of the user, facing them. If then the user moves around, the menu stays in its place in the 3D world until hidden and shown again. By usability testing, this has proven to be the most natural and unobstructive solution for the users.
  • 45. Unity Application 33 The code responsible for positioning and showing the menu is particularly interesting, because it shows Unity's way of handling positions and rotations:
private void Show() {
    // Get the XZ projection of the forward direction of the headset camera
    Vector3 forwardXZ = new Vector3(forward.x, 0, forward.z);
    // Get the position at a fixed distance (2.5 m) in front of the headset, along the XZ plane
    Vector3 headsetFront = headset.position + distance * forwardXZ.normalized;
    // Set the menu position to headsetFront, keeping the y coordinate
    position = new Vector3(headsetFront.x, position.y, headsetFront.z);
    // Set the rotation so that the menu faces the camera
    rotation = Quaternion.LookRotation(position - headset.position);
    // Show the menu
    gameObject.SetActive(true);
}
As can be seen in Figure 3.6, the static menu currently offers the following functionalities: • Light intensity slider: allows the user to adjust the brightness of the scene (see Section 3.3.2.1), • Show room toggle: enables or disables the walls and the floor's texture (see Section 3.3.2), • Show log toggle: enables or disables the overlay info text (see Section 3.3.10), • Enable bounding box toggle: enables or disables an explicit bounding box drawn around the object (see Section 3.3.6.1), • Invert hands toggle: inverts the functionality of left and right controllers (see Section 3.3.1), • Close menu: closes the menu,
  • 46. Unity Application 34 • Quit: gracefully terminates the application. Figure 3.6: Screenshot of the opened static menu 3.3.5 Radial Menu All of the features related to the interaction with ParaView’s objects are imple- mented through a radial menu attached to the left controller. The idea behind this design choice is that such features must be quickly accessible to the user, so that the interaction feels more natural and more intuitive. The radial menu itself is a custom implementation of one of VRTK’s prefabricated objects. Its behavior offers a good example of Unity’s parenting mechanism: the object is a child of the controller, therefore its coordinates are relative to those of the controller. This means, given that the controller’s position is updated by
  • 47. Unity Application 35 VRTK through data coming from sensors, that the menu is effectively following the controller, as if it was an additional part of it. Such behavior is at the base of the concept of Mixed Reality: the controller exists in both the real and the simulated world, but in the latter it is enriched by completely virtual features. RadialMenu’s behavior defines the callbacks related to each menu entry, which are then bound to the appropriate functions in the Unity editor. All the callbacks follow a similar structure: public void OnActionButtonPressed () { appropriateManager .action (); } This code snippet shows the way managers and objects communicate with each other, delegating the implementation of the business logic of each routine to the appropriate entity. Currently, the radial menu supports the following features: • Scale Up: while held, increases the size of the model (see Section 3.3.7). • Scale Down: while held, decreases the size of the model (see Section 3.3.7). • Play/Pause: if the model is a time-dependent model, i.e. it consists of mul- tiple frames, this button plays and pauses the animation (see Section 3.3.8). The button’s icon is updated according to the current state of the animation and it is disabled if the object is static. • Slice: when pressed, shows the mock implementation of the slicing plane (see Section 3.3.9). Such features can be seen in Figure 3.7, where they are represented by self-explanatory icons.
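As a concrete illustration of this delegation pattern, the callbacks for the entries listed above might look like the fragment below, in the same style as the snippets in this chapter. The method and field names are hypothetical and serve only to show how each entry forwards to its owning manager.
// Hypothetical radial menu callbacks: each entry simply delegates to the
// manager that implements the corresponding feature.
public void OnScaleUpButtonHeld()      { sizeManager.ScaleUp(); }
public void OnScaleDownButtonHeld()    { sizeManager.ScaleDown(); }
public void OnPlayPauseButtonPressed() { animationManager.TogglePlayPause(); }
public void OnSliceButtonPressed()     { slicingManager.ToggleSlicingPlane(); }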
  • 48. Unity Application 36 Figure 3.7: Screenshot of the opened radial menu 3.3.6 Object interaction Among the project objectives outlined in Section 1.1, one of the most important was the ability to interact with ParaView's objects. In fact, this is essential for any true Virtual Reality application: in the real world objects can be touched, moved, rotated and manipulated. Therefore, when a user enters a VR environment, they naturally expect the same laws to apply, which makes the experience more immersive. Moreover, one must bear in mind that the scope of the application discussed in this thesis is the analysis of CFD data, and therefore the ability to manipulate the dataset in a 3D environment is an enormous advantage over the classical visualization on a 2D screen. With this in mind, basic forms of interaction with the model have been implemented, with two exceptions: gravity and solidness. On one hand, having the model
  • 49. Unity Application 37 react to gravity is impractical and counter-productive, as it means that it cannot be left floating at eye level for detailed examination; a similar consideration applies to solidness: given that with current technologies it is not possible to simulate touch through the controller, the most commonly accepted way of grabbing an object is by keeping a button pressed (the trigger, in this case) while holding the controller inside of it. Apart from this, the object is designed to behave like a rigid body, meaning that it can be grabbed by any of its parts, and it can be translated and rotated in 3D space. To achieve this, the object is given an Interactable behavior at loading time. This, internally, gives it rigid body properties, and defines a collider. In Unity, a collider is a region surrounding the object that represents its boundaries, for the purpose of determining collisions. As an object enters or exits another object's collider, an event is triggered. This can be bound to appropriate callbacks to implement grabbing, as in the case being discussed. For performance reasons, the collider's shape for the object was chosen to be the minimum bounding box (see Section 3.3.6.1). The code for implementing the object grabbing lies both in the controller and in the interactable object itself, although the object's side is the more interesting one. In fact, whenever a controller is inside the bounding box and the trigger is pressed, the method OnBeginInteraction() is called. Similarly, when the trigger is released, the method OnEndInteraction() is invoked. A simplified sketch of the controller side is shown below; the object-side implementation follows it.
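The controller-side logic sketched here is an illustrative reconstruction, not the literal project code: Unity's trigger-collider callbacks track the interactable object the controller is currently inside of, while the VR trigger button (e.g. bound through VRTK's controller events) starts and ends the interaction. Class and field names are assumed.
using UnityEngine;

// Illustrative controller behavior (names assumed).
public class Controller : MonoBehaviour
{
    private Interactable hoveredObject;   // object whose bounding box we are currently inside of
    private Interactable grabbedObject;   // object currently being held

    void OnTriggerEnter(Collider other)
    {
        Interactable interactable = other.GetComponent<Interactable>();
        if (interactable != null)
            hoveredObject = interactable;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.GetComponent<Interactable>() == hoveredObject)
            hoveredObject = null;
    }

    // Called when the VR trigger button is pressed
    public void OnTriggerButtonPressed()
    {
        if (hoveredObject != null)
        {
            grabbedObject = hoveredObject;
            grabbedObject.OnBeginInteraction(this);
        }
    }

    // Called when the VR trigger button is released
    public void OnTriggerButtonReleased()
    {
        if (grabbedObject != null)
        {
            grabbedObject.OnEndInteraction(this);
            grabbedObject = null;
        }
    }
}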
  • 50. Unity Application 38
public void OnBeginInteraction(Controller controller) {
    // The object becomes integral with the controller
    this.transform.parent = controller.transform;
    // Save reference to the controller
    attachedController = controller;
}

public void OnEndInteraction(Controller controller) {
    if (controller == attachedController) {
        // Release the object
        this.transform.parent = null;
        // Reset reference
        attachedController = null;
    }
}
It is worth noting that saving and checking the reference to the controller is necessary to implement the possibility of passing the object from one controller to the other without releasing it. In fact, if a second OnBeginInteraction() is called from the other controller before the interaction with the first one ends, the reference is updated. Therefore, when the trigger of the first controller is released, the object will still be integral with the second controller, as expected. 3.3.6.1 Bounding box As explained before, each ParaView object is surrounded by a bounding box that acts as a collider. This allows interaction with the controllers. It is worth mentioning, however, that the fact that the bounding box is effectively bigger than the mesh itself can be counter-intuitive for the user. In fact, it means that there are regions of space that are outside the mesh but inside the bounding box, thus making the object grabbable without the controller being inside of it. To limit the effects of this, and ease the process of grabbing, an option on the static menu lets the user enable a visible bounding box. This is rendered as a semi-transparent
  • 51. Unity Application 39 box, as seen in Figure 3.8 and is shown only when the controller is actively colliding with it. Figure 3.8: Screenshot of the bounding box Another feature of the current implementation of the bounding box is its ability to adapt in the case of animated models. In fact, in each frame of the animation the model can have a different size and shape, thus requiring a different bounding box. To solve this problem, it has been chosen to implement a function that actively resizes the bounding box depending on the current shape of the object: such function is bound as a callback to the NextFrameLoaded event, which is triggered by the animation manager after every frame (see Section 3.3.8). The code that resizes the bounding box is the following:
  • 52. Unity Application 40 public void FitColliderToChildren () { // Temporarily reset the parent ’s orientation , since the bounds will // be given in world coordinates but the collider is in local coordinates . Quaternion oldRotation = transform.rotation; transform.rotation = Quaternion.identity; bool hasBounds = false; Bounds bounds = new Bounds(Vector3.zero , Vector3.zero); // Iterate over all active children. foreach(Renderer childRenderer in GetComponentsInChildren <MeshRenderer >()) { if (hasBounds) bounds. Encapsulate ( childRenderer .bounds); else { bounds = childRenderer .bounds; hasBounds = true; } } // Set collider center and size (with conversion from global to local) BoxCollider collider = GetComponent <BoxCollider >(); collider.center = Scale(bounds.center - transform.position , transform. → localScale.Reciprocal ()); collider.size = Scale(bounds.size , transform.localScale.Reciprocal ()); // Reset the objects orientation transform.rotation = oldRotation ; } This function is a good example of the concept of global and local coordinates in Unity. In fact, the bounding box is considered as an independent object, and therefore expressed in global coordinates, while the Paraview Object itself is a child of ParaViewManager, and therefore it uses local coordinates relative to its parent. For this reason, some of the entities involved require transformations before being applied. 3.3.7 Resizing In general, the size of CFD models can span over several orders of magnitude de- pending on the object being studied: from micrometers in the case of the flow of red
  • 53. Unity Application 41 blood cells to tens of meters for the aerodynamic analysis of aircraft. However, all the cases in this spectrum need to be correctly represented in the virtual environment at a scale which is appropriate for the user. At the same time, it is often interesting to analyze finer details of a model, like a particularly turbulent area or a small zone of higher pressure. This means that the user should be able to enlarge and reduce the mesh accordingly. For this purpose, a SizeManager has been defined and implemented. Currently, it offers the following functions: • Autoresize: automatically resize the mesh to a fixed maximum length at loading time. This can be disabled in the editor, and the parameter for the maximum length can be changed. • Scale up: increase the size of the mesh. • Scale down: decrease the size of the mesh. By analyzing popular implementations of zooming and scaling functions, it has been observed that the most natural solution for the user is to modify the dimensions relative to their current values, instead of linearly. In fact, a linear resizing often feels too slow for large models and too fast for small models, and can be problematic when the size of the model approaches zero. Consequently, one can observe the implementation of the ScaleUp function as an example:
public void ScaleUp() {
    obj.transform.localScale *= 1 + scaleSpeed * Time.deltaTime;
}
The previous line of code provides a simple example of how time is handled in Unity. In fact, Unity exposes a static class Time, of which the most important field is deltaTime. This field returns the time in seconds it took to complete the last frame (i.e. the delay between two subsequent calls to Update()), as this value
  • 54. Unity Application 42 depends on the computational complexity of the operations performed in the last frame and is thus not known at compile time. 3.3.8 Animation As stated before, time-dependent datasets are a very important and interesting subset of CFD models, and the most natural way to represent them is through an animation. For this purpose, an AnimationManager has been designed, and it currently offers the following functionalities: • Play the animation, • Pause the animation, • Loop the animation. When a time-dependent object is exported from ParaView, each frame is encoded separately as an independent model. Then, through the TCP connection between ParaView and Unity (see Section 4.3.2), a message is sent to Unity with the total number of frames to be imported. They are then read and inserted into the scene. In the current implementation, all frames are children of the same root object, and only one of them is shown at any given time. Therefore, animating the mesh consists in hiding the current frame and showing the next one. This is done in the Update() function that every script inherits from the MonoBehaviour class, with the visibility toggling logic being encapsulated in the ShowNextFrame() method.
  • 55. Unity Application 43 // Called at every frame void Update () { if(isPlaying) { // DELAY_COUNT slows the animation if (delay >= DELAY_COUNT) { delay = 0; ShowNextFrame (); } else { delay ++; } } } private void ShowNextFrame () { // Hide all frames foreach (Transform child in obj.transform) child.gameObject.SetActive(false); // Show next one obj.transform.GetChild( currentFrame ).gameObject.SetActive(true); // Update counter currentFrame = ( currentFrame + 1) % obj.transform.childCount; // Trigger event if ( NextFrameLoadedCallbacks != null) NextFrameLoadedCallbacks (); } As one can notice from the last lines of the ShowNextFrame function, the Anima- tionManager defines a custom event called NextFrameLoaded, to which any other manager can bind their callbacks. This is necessary as changing the frame is ba- sically equivalent to changing the model, and therefore some updating might be necessary in the state of other parts of the software. As seen in Section 3.3.6.1, an example of this is the recalculation of the bounding box dimensions, which must be adapted to the new shape of the model at every frame. In the current implementation, in case of time-independent models, i.e. models that consist of only one frame, the Play/Pause button on the radial menu (see Section 3.3.5) is disabled, and therefore none of the AnimationManager’s routines are called.
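For completeness, the public interface that the Play/Pause radial menu entry relies on can be sketched as follows, in the same fragment style as the snippets above. The method names are assumed; the isPlaying flag is the one checked in the Update() loop shown above.
// Hypothetical public interface of the AnimationManager.
public void Play()  { isPlaying = true; }
public void Pause() { isPlaying = false; }

// Convenience toggle used by the Play/Pause entry of the radial menu
public void TogglePlayPause() { isPlaying = !isPlaying; }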
  • 56. Unity Application 44 Figure 3.9 displays four different frames of an ongoing animation. The model shown in the screenshots is taken from ParaView’s official tutorial examples. Figure 3.9: Screenshots of four different frames of an ongoing animation 3.3.9 Slicing As any user of ParaView knows, the application is used not only for visualizing the data, but also to manipulate it in many ways through the use of filters. Such functions, which include slicing, contouring, wireframing, etc., are necessary to offer a more detailed view about the values of the dataset being analyzed. It is therefore important to be able to port these features in the Virtual Reality environment, so that this can become as independent from ParaView as possible, without requiring the user to change and re-export the data multiple times.
  • 57. Unity Application 45 As discussed in Section 3.2, the application has been engineered to be flexible and expandable, so that any new feature could be easily added without requiring major refactoring. As a proof-of-concept of this, a mock implementation of one of the most important filters in ParaView has been realized: the slicing function. Slicing consists in dividing a mesh into two parts along a plane, which is often followed by hiding one of the parts. This can be interesting to show inner details of the model. Given that this project focused on defining and implementing the general environment and architecture of the software system, the design choice has been to implement a stub of the feature, rather than a fully working function, to be used as an example for future development. In its current implementation, a slicing plane (which is just a semi-transparent plane) can be summoned by pressing one of the buttons of the radial menu. Its position is integral with the right controller, so that moving and rotating it over the mesh is natural and intuitive for the user. Pressing the button again hides the plane. Figure 3.10 shows a screenshot of the slicing plane being held through a simple cube. It is worth mentioning that, although the plane just described does not perform any slicing operation, there exists a fully working version developed by a fellow student for his MSc Thesis. This has already been integrated into the application being discussed in this paper, in a different development branch, as is discussed in detail in his thesis. 3.3.10 Logger When immersed in the Virtual Reality environment, it can be difficult for the user to be notified about the state of the application. For this specific reason, it has been chosen to implement a logging mechanism appropriate for a VR experience. The object responsible for this feature is the LoggerManager, which effectively acts as a wrapper around Unity's own Debug static class. In the current implementation, the LoggerManager offers the following functionalities:
  • 58. Unity Application 46 Figure 3.10: Screenshot of the mock implementation of the slicing plane • Info log: shows a message in a default color of black, • Warning log: shows a message in a default color of yellow, • Error log: shows a message in a default color of red, • Toggle visibility: enables and disables the log messages (disabled by default) The log can be enabled through a toggle button in the static menu (see Section 3.3.4), and it appears as a fixed overlay text in the bottom-center of the head- mounted display. Even if the camera is moved, i.e. if the user moves their head, the text moves along with it, thus appearing to be fixed on the screen. An example of a message being logged can be seen in Figure 3.11.
  • 59. Unity Application 47 Figure 3.11: Screenshot of a message being logged through the logger To correctly position the text and make it integral with the camera, the manager accesses VRTK’s API, which exposes a reference to the active camera object, re- gardless of the VR hardware being used. This is then attached to the camera of the text overlay: protected virtual void SetCanvasCamera () { // Get VR camera Transform camera = VRTK_DeviceFinder . HeadsetCamera (); // Attach to the text textCanvas.worldCamera = camera.GetComponent <Camera >(); }
  • 60. Unity Application 48 As an example, the code that implements a simple information logging function is provided: public void Log(string message) { // Set text and color text.text = message; text.color = defaultColor ; // Also debug to console Debug.Log(message); } Additionally, some parameters of the log message can be changed through Unity’s editor, such as the font size, the position and the color for warning and errors.
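The warning and error variants listed earlier presumably mirror the same structure as Log(). A hedged sketch, with assumed field names and following the default colors described in this section (yellow for warnings, red for errors), could look like this:
// Hypothetical warning/error counterparts of Log().
public void LogWarning(string message)
{
    text.text = message;
    text.color = warningColor;
    Debug.LogWarning(message);
}

public void LogError(string message)
{
    text.text = message;
    text.color = errorColor;
    Debug.LogError(message);
}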
  • 61. Chapter 4 ParaUnity 4.1 Introduction The application described in Chapter 3 is able to receive objects from ParaView and insert them in a Virtual Reality environment. However, as already discussed in Section 2.2.4, ParaView has no native knowledge of Unity. It has therefore been necessary to find a way to bridge the gap between the two software systems in order to allow the import/export process and make communication possible. The work presented in this chapter consists largely in the further development of an existing plug-in for ParaView, called ParaUnity, created by Rudolf Biczok in 2016 and available as open-source code on GitHub [23]. The plug-in is entirely written in C++ and requires an older version of ParaView to be executed (specifically, version 5.2, released in November 2016). The reason behind this is that ParaUnity is built using Qt version 4 as its GUI and utility library, which is unsupported in the latest versions of ParaView. The building process of the plug-in results in a dll file, which can then be loaded from ParaView’s plug-in management tools. This essentially adds two buttons on the main interface of the program, that can be used for sending the object to a 49
  • 62. ParaUnity 50 pre-prepared Unity scene, either in a running instance of the Unity editor or in a compiled executable application. These buttons can be seen in Figure 4.1. Figure 4.1: ParaUnity buttons on ParaView's GUI. The button with the P (for Player) spawns an instance of the executable Unity application, while the button with the E sends the object to the Editor. The next section of this chapter will describe the initial state of the plug-in, as it was before the beginning of this project. Then, a description of the main improvements that have been made to ParaUnity will be provided. 4.2 Initial state In the latest version on its official GitHub repository, ParaUnity is distributed alongside an example of a Unity project into which ParaView's objects can be loaded. Such
  • 63. ParaUnity 51 project features a very basic scene, which suffers from the following limitations: • The object can only be visualized, as no form of interaction (grabbing, rotating, etc.) has been implemented. • No ParaView features (slicing, contouring, etc.) are available. • Objects are rendered with a plain white texture, regardless of their color in ParaView. • No environments or menus have been included. • The application is only compatible with the Oculus Rift VR kit. It is immediately evident that, with such limitations, most of the project objectives outlined in Section 1.1 are not satisfied. This is the reason behind the development of the whole application described in Chapter 3: to provide the user with a complete, immersive scene in which they could freely interact with the object. By effectively substituting the Unity project included in the repository with the one discussed in this thesis, all of these limitations have been overcome. Furthermore, considering the code of the plug-in itself, two additional performance limitations have been identified: • Object export-import is done through disk I/O, which can be several orders of magnitude slower than RAM I/O, depending on the hardware and the structure of the data. • Communication between ParaView and Unity is completely synchronous, meaning that the whole model has to be completely exported before Unity can start importing it, thus creating non-negligible idling times. To overcome these problems, the initial project has been expanded to include some further development on the plug-in, as described in Section 4.3.
  • 64. ParaUnity 52 4.2.1 Plug-in implementation ParaUnity is a much simpler piece of software, compared to the Unity application, and therefore the vast majority of its logic is included in just one class, called Unity3D. This class offers the following functionalities: • Connecting to an existing TCP Socket and communicating through it (see Section 4.2.2). • Spawning an instance of a pre-prepared Unity application from its executable, or connecting to a running instance of the Unity Editor. • Requesting an X3D representation of the currently loaded objects from VTK. • Writing data to disk. Most of these features are implemented using Qt libraries or the Windows API. 4.2.1.1 TCP Socket In its initial stage, ParaUnity communicated with external software through single-use TCP connections. This was done by instantiating a QTcpSocket, which is part of the Qt Network library, and writing a byte representation of the message to it. It is worth noting that in this implementation ParaUnity does not wait for a reply, nor does it have any method to read one, as the TCP connection is destroyed immediately after sending the message. This can be seen in the following implementation:
  • 65. ParaUnity 53 bool Unity3D :: sendMessage (const QString& message , int port) { // Creates the socket QTcpSocket *socket = new QTcpSocket(this); // Connects to localhost socket -> connectToHost ("127.0.0.1", port); // Checks connection status if (!socket -> waitForConnected ()) { return false; } // Writes message socket ->write(QByteArray(message.toLatin1 ())); // Waits and disconnects socket -> waitForBytesWritten (); socket -> waitForDisconnected (); return true; } Connections are always done to localhost, although the port can vary (as described in 4.2.2). 4.2.1.2 Exporting the objects To obtain an X3D representation of the data loaded in ParaView, ParaUnity uses a vtkX3DExporter, which is part of the VTK library. In fact, after instantiating the exporter, the plug-in links it to the current render window, sets the output to a file with an appropriate filename and sends the command to write the data to disk, as can be seen in the following code:
  • 66. ParaUnity 54 void Unity3D :: exportScene (/* ... */) { /* ... */ // Instantiate the exporter vtkX3DExporter *exporter = vtkX3DExporter :: New(); /* ... */ // Set its input to the render window exporter ->SetInput(renderProxy -> GetRenderWindow ()); // Set the filename exporter -> SetFileName ( outputFilename ); // Write to file exporter ->Write (); /* ... */ } 4.2.2 Initial communication protocol As stated before, ParaUnity and Unity communicate through both a local TCP Socket and disk I/O operations. The actual communication protocol, as it was in the initial stage of development, can be summarized as follows: 1. Unity starts, and immediately creates a TCP Listening socket on a random port. 2. In a standardized location, Unity creates an empty directory, the name of which is the port number. 3. ParaUnity starts and searches the standardized location for files. For every file it finds (i.e. every registered port), it checks for the status of the connection on that port by sending a test message. 4. If ParaUnity finds an active Unity instance, it starts exporting the scene in the instance’s directory.
  • 67. ParaUnity 55 5. When ParaUnity is done exporting, it sends a message to Unity with the path of the file (or the path of the directory, in case of time-dependent data). 6. When Unity receives the message, it reads the file from that location, interprets it into an object and inserts it into the scene. 4.3 Development As explained in Section 4.2.1, the official ParaUnity plug-in suffered from performance limitations, which could hinder the user experience especially with large datasets. To overcome this problem, a custom version of ParaUnity has been designed, with particular focus on two major aspects: data I/O and inter-process communication. 4.3.1 From Disk I/O to RAM I/O The average mechanical hard drive can perform around 300 random-access I/O operations per second. With solid-state drives, this value can go up to about 2000. However, this is still several orders of magnitude lower than the speed of an average RAM, which can perform about 40 million random-access I/Os every second [26]. With these numbers in mind, it becomes clear that one of the performance bottlenecks of the initial implementation of ParaUnity is disk I/O. To alleviate this issue, and take advantage of the much higher speed offered by main memory, it has been decided to move the transmission of the data to RAM. On a Windows machine, one of the possible solutions (and the one adopted for the project) is to interface with the Windows API and store the data as Named Shared Memory objects [27]. The work-flow can be summarized as follows: 1. Process A creates a file mapping in the main memory of the system, i.e. requests the Operating System to allocate a specific amount of memory and give it a custom name.
  • 68. ParaUnity 56 2. Process A maps part of the memory in its process space with the memory requested in the previous step, so that they can be effectively treated as the same part of memory. 3. Process A writes the data to the portion of its space which is mapped to shared memory; consequently, the data becomes accessible to any process through the name set in Step 1. 4. Process B, which must know the name and the size of the shared object be- forehand, accesses the file mapping in shared memory by name. 5. Process B maps the area of the shared memory in which the object is written to a portion of memory in the process space. 6. Process B can now access the object as if it were in its own memory space. 7. When both processes are done accessing the files, Process A frees the memory in shared memory and they both unmap their view of the file. This has been implemented in ParaUnity by setting the vtkX3DExporter to return a string of data instead of writing to a file, and copying that string to shared memory with the following logic:
  • 69. ParaUnity 57
void Unity3D::writeSceneToMemory() {
    // Create object in shared memory
    handle = CreateFileMapping(/* ... other parameters ... */ objectSize, objectName);
    // Map memory to process space
    pBuffer = (char *) MapViewOfFile(handle, /* ... other parameters ... */ objectSize);
    // Write to memory
    CopyMemory((void *) pBuffer, exporter->GetOutputString(), objectSize);
}
Both the object name and the object size are then transmitted to Unity through the TCP Socket, as explained in Section 4.3.2. Unity is then able to access the file with the following logic (in the XDocumentLoader, see Section 3.3.3.1):
  • 70. ParaUnity 58
public static XDocument Load(string objectName, uint objectSize) {
    // Attach to the object in shared memory
    handle = OpenFileMapping(/* ... other parameters ... */ objectName);
    // Map memory to process space
    pBuffer = MapViewOfFile(handle, /* ... other parameters ... */ objectSize);
    // Access the data and parse it
    XDocument doc = XDocument.Parse(pBuffer);
    Detach();
    return doc;
}
4.3.1.1 Performance Results To test the effectiveness of shared memory mapped files versus common files, a normalized performance testing platform has been designed. The tests have been run a total of 100 times, and the average values have been calculated and are displayed in Table 4.1 for datasets of 1 B, 1 KB, 1 MB and 1 GB. To measure the elapsed time of the operation, the Windows API function QueryPerformanceCounter has been used, which has a precision of a tenth of a nanosecond. The machine on which the tests have been carried out features a Samsung 850 EVO 500GB SSD and 8 Corsair DDR3 4GB 667MHz RAM modules. As the table shows, apart from very small sets of data, for which the overhead makes up the majority of the elapsed time, the access to RAM is effectively up to 20x faster than for the SSD. This means that, for a particularly large dataset of 10 GB, the I/O time can decrease from 1 minute to 3 seconds, or from 1 hour to 3 minutes.
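One remark on the Unity side of this mechanism before looking at the measurements in Table 4.1: the Windows API calls used in the Load() snippet above are not part of the .NET base class library and have to be imported through P/Invoke. The declarations below are a minimal sketch of what such imports could look like; the access flags and error handling actually used in the project may differ.
using System;
using System.Runtime.InteropServices;

// Hypothetical P/Invoke declarations for the shared-memory calls used by the loader.
internal static class SharedMemoryNative
{
    internal const uint FILE_MAP_READ = 0x0004;

    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    internal static extern IntPtr OpenFileMapping(uint dwDesiredAccess, bool bInheritHandle, string lpName);

    [DllImport("kernel32.dll", SetLastError = true)]
    internal static extern IntPtr MapViewOfFile(IntPtr hFileMappingObject, uint dwDesiredAccess,
                                                uint dwFileOffsetHigh, uint dwFileOffsetLow,
                                                UIntPtr dwNumberOfBytesToMap);

    [DllImport("kernel32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    internal static extern bool UnmapViewOfFile(IntPtr lpBaseAddress);

    [DllImport("kernel32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    internal static extern bool CloseHandle(IntPtr hObject);
}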
  • 71. ParaUnity 59 Table 4.1: Performance comparison between disk I/O and RAM I/O

Data Size    Write time (ms), SSD    Write time (ms), RAM
1 B          2.25658                 0.03998
1 KB         2.81049                 0.04102
1 MB         7.79258                 0.31661
1 GB         65582.1                 367.192

Unfortunately, such a tremendous performance improvement cannot be fully appreciated in the complete, full-fledged system. In fact, although disk I/O is indeed the first bottleneck of the import/export process, there is a second bottleneck: data encoding. In fact, as mentioned before, data exported from ParaView is encoded in the X3D format, which is a human-readable, ASCII, XML format. It is therefore unsurprising that, as shown by Visual Studio profiling tools after the RAM I/O implementation, about 85% of the total export time is spent in the calls to vtkX3DExporter, which is part of VTK's library and thus out of the scope of ParaUnity. 4.3.2 Asynchronous communication for time-dependent data Another performance limitation highlighted in Section 4.2.1.2 is due to the fact that, in the case of a time-dependent model, all frames are exported from ParaView before Unity is notified and can start importing. In considering possible improvements on this aspect, it has been decided to redesign the communication protocol in order to asynchronously update Unity about the availability of an exported frame. This way, Unity can import the model frame by frame while this is being exported, thus greatly reducing the idling time. The new improved protocol is as follows (in gray the parts that haven't been changed from the initial version): 1. Unity starts, and immediately creates a TCP Listening socket on a random port.
  • 72. ParaUnity 60 2. In a standardized location, Unity creates an empty file, the name of which is the port number. 3. ParaUnity starts, and searches the standardized location for files. For every file it finds (i.e. every registered port), it checks for the status of the connection on that port by sending a test message. 4. If ParaUnity finds an active Unity instance, it starts exporting the first frame as a named shared memory object. 5. When ParaUnity is done exporting the frame, it sends Unity a message through the TCP Socket, containing: the name of the object, the size of the first frame, the index of the first frame and the total number of frames. 6. If there are other frames, ParaUnity starts exporting the next frame. 7. When Unity receives the message, it reads the file from that location, interprets it into an object and inserts it into the scene. It then sends a confirmation to ParaUnity formatted as "<frameNumber> OK". 8. When ParaUnity receives a confirmation for the i-th frame, it frees the shared memory relative to that frame and, if there are other frames, exports the next one. 9. All steps from step 5 are repeated until there are no other frames left. It is immediately evident by looking at the improved version of the communication algorithm that, by implementing an asynchronous communication between ParaUnity and Unity, much of the idling time during the import/export process has been cut off. 4.3.2.1 Performance Results Given the fact that the protocol is intrinsically integrated in the system, the only suitable way for testing its performance is to measure the whole round-trip time (RTT)
  • 73. ParaUnity 61 of certain pre-defined sets of data. Therefore, 10 repeated tests have been performed with 3 datasets of different sizes: 10 MB, 170 MB and 2 GB. All sets had exactly 100 frames. The average results for the old and the new protocol are shown in Table 4.2. Table 4.2: Performance comparison for the improved communication protocol.

Data Size    RTT (s), old protocol    RTT (s), improved protocol    Improvement
10 MB        4.15                     3.81                          8.11%
170 MB       37.84                    32.24                         14.78%
2 GB         421.11                   352.72                        16.24%

Given the fact that the revisions to the protocol effectively parallelize it, an upper bound to the performance increase would have been 50%, meaning complete parallelization. However, in practice this result cannot be achieved due to: • The additional overhead resulting from a more complex communication between the two processes, • The fact that the load is not balanced between the two processes (in fact, importing in Unity is in general faster than exporting from ParaUnity, and therefore there is an idling time which cannot be avoided). This considered, as the data from Table 4.2 shows, the revised algorithm for the communication between ParaUnity and Unity offers an improvement of around 15% for large enough datasets.
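To make step 7 of the protocol above more concrete, the per-frame acknowledgment sent by Unity can be sketched as follows. This is an illustrative fragment only; the surrounding networking code and the actual method used in the project are assumed.
using System.Net.Sockets;
using System.Text;

// Hypothetical helper: sends the "<frameNumber> OK" confirmation back to ParaUnity.
static void AcknowledgeFrame(NetworkStream stream, int frameNumber)
{
    byte[] ack = Encoding.ASCII.GetBytes(frameNumber + " OK");
    stream.Write(ack, 0, ack.Length);
}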
  • 74. Chapter 5 Conclusions In Chapters 3 and 4 a detailed analysis of the two main parts of the final system has been provided. In particular, both the Unity Application and ParaUnity have been discussed from an architectural and a more in-depth perspective, along with an overview of the communication protocol between the two processes. This chapter will focus on analyzing the final overall system and its work-flow, and to evaluate the capabilities it provides and the objectives it achieves. 5.1 Final system architecture A UML diagram of the architecture of the resulting system is shown in Figure 5.1. The entity in the top-left corner is the ParaView and VTK software system, in which the CFD dataset is loaded. When the user clicks the appropriate button on Para- View’s GUI, the data is processed by VTK and sent to ParaUnity. In here, a TCP connection with Unity is established, while at the same time the data is exported as a Named Shared Memory Object in the system’s RAM. When ParaUnity notifies Unity about the availability of data in the RAM, the latter starts the importing process, and the model is then loaded in the running scene. It can be noticed that 62
  • 75. Conclusions 63 Unity is also responsible for back-and-forth communication with the VR kit, in or- der to collect data from sensors and user input and update the view on the HMD accordingly. From such work-flow it can be concluded that the proposed system succeeds in bridging the gap between ParaView and VR technologies, thanks to an improved version of ParaUnity and a full-fledged Unity Application. Figure 5.1: Final system architecture diagram 5.2 Objectives achieved After having described the final system in Section 5.1, it is necessary to compare its capabilities with the project objectives outlined in Section 1.1. As said before, the work described in this thesis had a rather open-ended exploratory approach
  • 76. Conclusions 64 rather than a strictly defined set of software requirements, but it is nonetheless important to validate the results against the guidelines defined at the beginning of the development, which will be analyzed one by one in the remaining part of this section. The project shall result in a working prototype of a VR application: as described in Chapter 3, one of the two outputs of this project is a fully working Unity Application, with a satisfactory set of features. The application shall allow the handling of CFD data: all the CFD mod- els imported in the Unity Application can be visualized, interacted with (grabbed, moved, rotated, ...) and manipulated (scaled, animated, ...). More forms of ma- nipulations, such as mesh slicing, are currently being investigated in MSc Thesis projects by fellow students. The application shall allow the import of data from ParaView: through ParaUnity, data loaded in ParaView and VTK can be exported to the Unity Appli- cation. The application shall run compatibly at least on Windows: one of Unity’s main strength is its cross-platform compatibility. In fact, most of a Unity project can be transparently compiled to many desktop and mobile platforms (Windows, Linux, MacOS, Android, iOS, ...). The current implementation of the Unity application is completely compatible with Windows (version 7 or greater). Additional cross- platform compatibility can be easily achieved by generalizing the access to RAM, which currently uses the Windows API. The application shall support a HTC Vive kit: since the HTC Vive kit is the one that has been used throughout the whole development process, this objective is satisfied. Moreover, thanks to the flexibility of VRTK, adding support to different VR kits would only require remapping the control buttons and their callbacks.
  • 77. Conclusions 65 The code should be designed to be maintainable, flexible and expandable: as said before, much of the development effort has been spent in designing a codebase which could then be expanded. In its current state, the codebase is designed to facilitate adding more features, increasing its cross-platform compatibility, or modifying part of its functionalities. The application should be easy to use: one of the main points being made in Chapter 3 is the focus on the intuitiveness of the interactions between the user and the application. In discussing design choices for most of the features, usability tests have been carried out to highlight possible difficulties, and to rework the interface into its most natural configuration. 5.3 Future work All considered, the project described in this thesis can be seen as an interesting starting point for further development in the field of Virtual Reality for Computational Fluid Dynamics. In fact, given that little to no software on the market currently offers this kind of capability, it is probably worth investing more development effort in enriching the application, especially by adding more CFD-related features, so that it could become a production-ready piece of software. Being open-ended by nature, the possibilities for further development are virtually endless, and everything is designed to allow and ease such a process. As stated before, the main path to follow to improve the application would be porting as many ParaView features as possible, and reimplementing them natively into the application. In fact, data manipulations such as contouring or clipping must currently be done in ParaView, and the objects have then to be exported again to Unity. Moving their scope to the latter would make the application more independent and self-contained, and would be a further step towards a complete CFD application in Virtual Reality. Moreover, another suggestion for an interesting feature would be the possibility of displaying the chosen variable with which the mesh is colored. Currently, Unity has no
  • 78. Appendices 66 knowledge about variables and their values, and only receives information about the colors of each face and vertex of the mesh. Therefore, giving the user the possibility of changing the visualization of the object in real time would foster a more productive kind of interaction with the application. As far as performance is concerned, the current bottleneck in the import/export process, as explained in Section 4.3.1, is given by the necessity to encode data in the X3D format. It could be interesting to research different ways to represent the data before transferring it, so that the whole process could be sped up. Moreover, a different representation format could also be useful to further generalize the Unity Application, by also making it compatible with other CFD visualization software besides ParaView.
  • 79. Appendix A Setup Instructions This appendix provides the instructions to set up, install and run the software system described in this thesis. They refer to a machine with VR-ready hardware running Windows 10. A.1 Building ParaView and ParaUnity This section provides the instructions for building a working copy of ParaView with the ParaUnity plug-in. It is a simplified and adapted version of the readme file of the official ParaUnity repository [23]. A.1.1 Prerequisites • CMake 3.8.1 • Visual Studio 2015 x64 Community Edition 67