The document discusses haptic feedback in virtual reality using NullSpace VR's haptic suit. It covers best practices for creating haptic content, including understanding the human body and timing, using reusable components, and creating complex effects through systems. NullSpace VR's suit has 16 haptic regions and supports Unity, with Unreal support coming soon. Their developer API allows for code-defined and file-defined haptics. The future of haptics is promising as integration improves.
14. Finer Quality Haptics
Principles of Animation
Squash & Stretch
Anticipation
Staging
Timing
Exaggeration
Principles of Haptics
Interpreted Simultaneity
15. Best Practices of Haptics
Understand the Human Body
Timing is important
Reusable components
Use systems to create complex effects
Understand normalizing haptics
Nodal Graph for emanation and gathering effects
No gifs or images here.
Pay attention!
17. NullSpace VR’s Suit
16 Haptic Regions
Built-in Tracking
Closed Alpha focusing on content & integration
Consumer & VR Arcade targeted
Audio to Haptic
VR Headset agnostic
18. NullSpace VR’s Developer API
Lightweight background engine (see Vive/Oculus)
Unity support
Unreal support coming soon
Code Defined Haptics
Create & play them in-line.
Combine effects to create compound effects.
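The "create & play in-line, then combine" idea can be sketched in code. This is a minimal illustration, not NullSpace VR's actual SDK: the `HapticEffect` fields, region names, and `compound` helper are all invented for the example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HapticEffect:
    """A single code-defined effect: one region, one pulse (illustrative)."""
    region: str       # e.g. "chest_left" -- region names are made up here
    strength: float   # 0.0 .. 1.0
    duration: float   # seconds
    delay: float = 0.0  # offset from the compound's start, in seconds

def compound(*parts: HapticEffect, gap: float = 0.0) -> List[HapticEffect]:
    """Chain effects back-to-back into one compound effect by
    accumulating each part's delay from the previous part's end."""
    t, out = 0.0, []
    for p in parts:
        out.append(HapticEffect(p.region, p.strength, p.duration, t + p.delay))
        t += p.delay + p.duration + gap
    return out

# A "heartbeat" built from two simple pulses, defined in-line.
heartbeat = compound(
    HapticEffect("chest_left", 0.8, 0.1),
    HapticEffect("chest_left", 0.5, 0.1, delay=0.15),
)
```

The point is the composition model: small effects are data, and compound effects are just functions over that data, so a library of building blocks goes a long way.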
File Defined
Multiple levels for construction
19. The Future of Haptics
NullSpace VR will be open for crowdfunding in early 2017!
Haptics as a field is moving forward
VR integration is getting easier
21. Feeldelity - /fēlˈdelədē/
Feeldelity (n) - The accuracy with which a user can understand and interpret the sensations they are experiencing.
* Not an absolute measurement
* Behaves like attention & based on context
Editor's notes
AR/VR content needs a better solution for user feedback. In this talk we'll dig into best practices for creating haptic content for your games and experiences. Find out the finer details of creating quality haptics. Learn about NullSpace VR's haptic feedback technology, upper-body suit, and developer API.
Diving right in,
I’ll use a lot of gifs to convey emotions and feelings - the core goal of haptic feedback.
The first main topic is understanding the user’s body.
What are they feeling, and, since they're letting us control it, how can we give them the best experience?
Way number 1 - Contrast of haptics is important, but also dangerous. Imagine feeling these 3 gifs. If they happen too close together or without proper prompting, you'd completely miss gif #2.
Timing is important for a host of reasons, but not in the way you’d think.
When you experience a collection of sensations, you process them at different rates. This means you want to line up some parts before, some during, and the last parts after the event.
Processing Time is the big reason why timing is important.
Have you ever experienced something and not fully comprehended it for a moment?
We want to cater to how our players and customers process information.
Touch is not a user’s top sense. It’s probably third behind sight and sound.
Time and Processing Time matter when you consider focus.
Think about your tongue, now that you’re focused on it, you’ll pay attention to the experiences it is feeding your brain.
You can create stronger haptic experiences by prompting players and users - which directs their focus to what they’re feeling.
Saying “Watch out, there's a spider on your arm!” so that it ties in with the audio and visuals will deliver the best effect.
We don’t experience with touch alone.
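The timing advice above can be made concrete with a small scheduling sketch. Both constants here are invented tuning values for illustration, not measured figures from the talk: the idea is simply to start the haptic early so its ramp-in and the user's processing lag finish right on the audiovisual event.

```python
# Made-up tuning constants, for illustration only.
PROCESSING_LAG = 0.08   # assumed seconds of tactile processing time
RAMP_IN = 0.05          # assumed seconds for the effect to reach full strength

def haptic_start_time(event_time: float) -> float:
    """Start the effect early so ramp-in plus processing lag
    complete at the moment of the audiovisual event."""
    return event_time - (RAMP_IN + PROCESSING_LAG)

# If the bass drops at t = 12.50s, trigger the haptic slightly before it.
start = haptic_start_time(12.50)
```

In practice these offsets would be tuned per effect and per user, but the structure stays the same: haptics lead the beat, they don't chase it.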
Reusable Components - Core part of games & experiences. These are superfluous examples to get the idea across.
Code created haptics let us put together building blocks.
We use a file format here.
Systems create complex effects (emanation, gathering, traversal)
Gathering effects - you want a graph to say ‘Create effects away from the point and move towards it’
Traversal - a sensation moving across the body, from point A to B.
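The nodal-graph idea can be sketched directly: model regions as graph nodes, compute hop distances from an origin, and turn hops into per-region delays. The adjacency below is illustrative, not NullSpace VR's actual pad layout.

```python
from collections import deque

# Illustrative nodal graph of haptic regions (not the real suit layout).
GRAPH = {
    "chest":          ["left_shoulder", "right_shoulder", "abdomen"],
    "left_shoulder":  ["chest", "left_arm"],
    "right_shoulder": ["chest", "right_arm"],
    "abdomen":        ["chest"],
    "left_arm":       ["left_shoulder"],
    "right_arm":      ["right_shoulder"],
}

def hop_distances(origin: str) -> dict:
    """Breadth-first search: hop count from the origin to every region."""
    dist, queue = {origin: 0}, deque([origin])
    while queue:
        node = queue.popleft()
        for nbr in GRAPH[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def emanation(origin: str, wave_delay: float = 0.05) -> dict:
    """Shockwave outward: each region's delay grows with its hop count."""
    return {r: h * wave_delay for r, h in hop_distances(origin).items()}

def gathering(target: str, wave_delay: float = 0.05) -> dict:
    """Reverse the wave: the farthest regions fire first, converging inward."""
    d = hop_distances(target)
    farthest = max(d.values())
    return {r: (farthest - h) * wave_delay for r, h in d.items()}
```

Gathering falls out of emanation almost for free once the graph exists, which is exactly why a nodal representation makes life easier than hand-placing every effect.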
Understand the body
Reaction Time
Ramp In/Ramp Out
Processing time
Focus
Emanation is the easiest to explain. It’s the shockwave of feeling that pressure creates.
However, using emanation everywhere is dangerous, because it can normalize users' sense of touch until they expect it everywhere.
There’s a fantastic book written by Disney as they were pioneering parts of animation.
Taking reference from the principles of animation, you can make better haptics.
While a slide isn’t a book, I have several proposed ‘principles of haptics’ that we’re exploring and working to provide tools and technology for.
Here are some takeaways!
Don’t worry about the gifs/images, they’ll be back next slide.
#1 - The body and how it works. Understand it or you’ll make less than stellar experiences.
#2 - Exaggerate or offset your timing so users finish processing the sensation right when the bass drops, for maximum effect.
#3 - Make components that can let you create many different effects rather than hand crafting a single complex effect.
#4 - Be careful with contrast and with acclimatizing your player or user to too much of one thing.
#5 - Nodal graphs representing the locations you have access to can make life so much easier.
So, moving on.
Who are we? We're NullSpace VR (logo is in the corner).
We make an upper body haptic feedback suit.
Located in Seattle, WA
We don’t just make the hardware of the suit.
We’re also providing the software tools to create great haptics as well as tie them into existing applications.
Unreal is on the way
We’re seeking to provide the best tools for tackling this new field, especially when it is alongside VR, another largely unexplored field!
Wireless is on the way
AR/Phone integration are on the way.
Unreal is also on our implementation list
This is a sensory homunculus, a figure with the portions of the body scaled to how touch-sensitive they are. (Note: the eyes are wrong.)
I use an internal term which is Feeldelity, or how much a user can detect and understand what is occurring.
Feeldelity's baseline is determined by a location's sensitivity, but it is further modified by context and cues. If you direct user attention and expectation, a location's feeldelity rises.
Feeldelity can also create false reports, where a person thinks they feel something that wasn't actually played.