ICAT2011@ Osaka University (2011.11.28)




Augmented Reality & DigiLog:
Toward Ubiquitous Virtual Reality 2.0



      Woontack Woo (禹 雲澤), Ph.D.
       http://twitter.com/wwoo_ct
           GIST CTI/U-VR Lab
             Gwangju, Korea
Gwangju (光州), Korea, the city of
 Science & Technology, Light, Culture & Art, Food




GIST is a Research-oriented University
U-VR Lab and CTI started in 2001 and 2005, respectively
Brief History

Personal History and Status of AR
 Estimated 180M+ users by 2012
 Major brands are taking keen interest
 Consumers are hungry for Apps


1968  HMD by Ivan Sutherland
1991  1st ICAT
1992  'AR' coined by Tom Caudell @ Boeing
1994  Continuum by Milgram @ ATR
1998  1st IWAR in SF, CA, USA
1999  1st ISMR; 9th ICAT (Waseda U); ATR MIC Lab
2001  GIST U-VR Lab
2002  1st ISMAR, Darmstadt
2004  14th ICAT in Seoul
2005  GIST CTI
2006  1st ISUVR
2007  Sony 'Eye of Judgment'
2008  Wikitude mAR Guide
2009  Sony 'EyePet'; LBS AR
2010  Qualcomm R&D Center
2011  ISO/SC24/WG9; Sony PS Vita
2012  KAIST U-VR Lab
Outline




Paradigm Shift : DigiLog with AR & Ubiquitous VR


DigiLog Applications and U-VR Core


U-VR 2.0: What’s Next?


Summary and Q&A
Media vs. A-Reality


(S-)Media creates Perception
Perception is (A-)Reality
So, (S-)Media creates (A-)Reality

What does (S-) and (A-) mean?
 S-Media : Smart, Social (CI)
 A-Reality : Altered, Augmented
Computing History and My Perspective

Computing History

60s Mainframe Computer: Text; sharing a computer
80s Personal Computer: CG/Image; individual usage
90s Networked Computers: Multimedia; sharing over the Internet
00s Ubiquitous Computing: u-Media; human-centered
10s U-VR Computing: s-Media; community-centered

Value progression: Information -> Knowledge -> Intelligence / Emotion -> Wisdom / Fun

Computing in the next 5-10 years:
 Nomadic human: Desktop-based UI -> Augmented Reality
 Smart space: Intelligence for a user -> Wisdom for community
 Smart media: Personal emotion -> Social fun
DigiLog and Ubiquitous VR

Is DigiLog-X a new Media?
 DigiLog-X : Digital (Service/Content) over Analog Life


 Media platform: Phone/TV/CE + Computer + …
 HW platform: mobile network + Cloud + …
 Service/Content platform: SNS + LBS + CAS + … over Web/App
 UI/UX platform: 3D + AR/VR/MR + …


 So, DigiLog-X is becoming a new Media !!!


How to realize Smart DigiLog?
 Ubiquitous Virtual Reality = VR in smart physical space
 Context-aware Mixed (Mirrored) Augmented Reality for smart DigiLog UI/UX
 => Mobile/wearable + Smart (context-aware) + AR + (for) Social Fun
Hype Cycle of AR 2011

Augmented Reality
• MIT's annual review: "10 Emerging Technologies 2007"
• Gartner: top 10 disruptive technologies 2008-12
• Juniper: mAR 1.4B downloads/year, revenue $1.5B/year by 2015 (11M in 2010)

[Hype cycle chart showing AR's position in 2008, 2009 and 2010]
Is AR Hype?

Google Trend (VR vs. AR)

           A: Virtual Reality Embraced by Businesses
           B: Another use for your phone: 'augmented reality'
           C: Qualcomm Opens Austria Research Center to Focus on Augmented Reality
           D: Qualcomm Launches Augmented Reality Application Developer Challenge
           E: Review: mTrip iPhone app uses augmented reality
           F: Toyota demos augmented-reality-enhanced car windows
What’s U-VR, MR & AR?




[Diagram: dual space {R, R'}, linking real environments (RE, RE') in the real space R with virtual environments (V, VE') in the mirrored space R']
What’s U-VR, MR & AR?

Woo's Definition [11]: U-VR is
 A 3D link between dual (real & virtual) spaces with additional information
 CoI augmentation, not just of sight: sound, haptics, smell, taste, etc.
 A bidirectional UI for H2H/H2S/S2H/S2S communication in dual spaces


[Diagram: how to link the real space and virtual space seamlessly; seamless augmentation between dual spaces via U-Content, a LINK, CoI, and social networks]
Outline




Paradigm Shift : DigiLog with AR & Ubiquitous VR


DigiLog Applications and U-VR Core Technology


U-VR 2.0: What’s Next?


Summary and Q&A
DigiLog Applications

DigiLog with AR for Edutainment
 DigiLog with AR: interactive, flexible, interesting, direct experience, etc.
 Edutainment
   Education: learning, training, knowledge
   Entertainment: fun, game, storytelling



Technological Challenges : It should …
 Be simple to use and robust as a tool
 Provide the user with clear and concise information
 Enable the educator/tutor to input information in a simple and effective manner
 Enable easy interaction between learners
 Make complex procedures transparent to the learner
 Be cost effective and easy to install
DigiLog @ U-VR Lab 2006

        Garden Alive: an Emotionally Intelligent Interactive Garden
         • Intuitive interaction: TUIs seamlessly bridge to the garden in a virtual world
         • Educational purpose: users can evaluate how environmental conditions affect plant growth
         • Emotional sympathy with the users: the emotional change of the virtual plants based on the user's interaction, which maximizes user interest

[Embedded paper page: Taejin Ha, Woontack Woo, "Garden Alive: An Emotionally Intelligent Interactive Garden," International Journal of Virtual Reality (IJVR), 5(4), pp. 21-30, 2006. Fig. 1 shows the overall system; Fig. 4 shows the tangible user interfaces (watering pot, user's hand, and nutrient supplier) and the hand gestures used as a natural interface to the virtual plants.]
DigiLog @ U-VR Lab 2006

     Garden Alive: an Emotionally Intelligent Interactive Garden
          Demo
                            (demo video)




            ◦ With the presented Garden Alive, users experience excitement and emotional interaction that are difficult to feel in a real garden
              • Various kinds of growing plants with different gene types according to generational evolution
              • Changes of emotion reflecting the user's interaction, where the intelligent content can provide emotional feedback to the users


Taejin Ha, Woontack Woo, "Garden Alive: An Emotionally Intelligent Interactive Garden," International Journal of Virtual Reality (IJVR), 5(4), pp. 21-30, 2006.
DigiLog @ U-VR Lab 2010

     Digilog book for temple bell tolling experience
      Digilog Book: an augmented paper book that provides additional multimedia content
       stimulating readers’ five senses using AR technologies
          • Descriptions for multisensory AR contents; multisensory feedback; and vision-based manual input




Taejin Ha, Youngho Lee, Woontack Woo, "Digilog book for temple bell tolling experience based on interactive augmented reality," Virtual Reality, 15(4), pp. 295-309, 2010.
DigiLog @ U-VR Lab 2010

     Digilog book for temple bell tolling experience
           A "temple bell experience" book




             ◦ The temple bell experience book is expected to encourage readers to explore cultural heritage for education and entertainment purposes




Taejin Ha, Youngho Lee, Woontack Woo, "Digilog book for temple bell tolling experience based on interactive augmented reality," Virtual Reality, 15(4), pp. 295-309, 2010.
Digilog Applications 2010

Enhance Experience, Engage, Educate & Entertain

Hongkil Dong                            Technologies in Chosun




Storytelling application                Storytelling application

Integrated with virtools*               Integrated with virtools*
Digilog Apps 2011

DigiLog Miniature




Storytelling application            Storytelling application

Integrated with virtools*           Integrated with virtools*
Technical Challenges

CoI Localization:
 Context of Interest (CoI): Space vs. Object
 Accurate CoI Recognition and Tracking
3D Interaction



Ubiquitous Augmentation
 LBS/SNS-based Authoring and Mash-up
Smart UI for Intuitive Visualization
 AR-Infography + Organic UI
Networking and public DB management
U-VR ecosystem with SNS, LBS, CaS


HW wish list
 Better camera/GPS/compass, CPU/GPU, I/O, battery
Tracking @ U-VR Lab 2007-8
     BilliARd (2005)




2D Picture Tracking (K.Kim)   3D Vase Tracking (Y. Park)   3D Vase Tracking (Y. Park)




    Tracking (W. Baek)         Tracking MO (W. Baek)         Multiple object Tracking
                                                                    (Y.Park)
Tracking @ U-VR Lab 2007-8




PageRecognition (K.Kim)    2D Tracking (K.Kim)       Tracking w/Modeling (K.Kim)




 Tracking App (K.Kim)     Layer Authoring (J.Park)       AR+ tale + -let
                                                         (booklet)
AR @ U-VR Lab 2008

    Multiple 3D Object Tracking for Augmented Reality
     Performance-preserving parallel detection and tracking framework
     Stabilized 3D tracking by fusing detection and frame-to-frame tracking
     Keypoint verification for occluded region removal
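
A minimal sketch of the detection-plus-tracking split above, assuming OpenCV with ORB detection and Lucas-Kanade optical flow as stand-ins for the paper's keyframe-based detector and frame-to-frame tracker; the temporal keypoint verification step is omitted.

```python
# Sketch: full-frame detection runs in a background thread while the main loop
# keeps its frame rate with cheap frame-to-frame tracking (assumption: ORB + LK
# stand in for the paper's detector and tracker).
import threading
import cv2
import numpy as np

class DetectAndTrack:
    def __init__(self):
        self.orb = cv2.ORB_create(1000)
        self.lock = threading.Lock()
        self.detected_pts = None      # latest detector output (background thread)
        self.tracked_pts = None       # points propagated frame to frame (main loop)
        self.prev_gray = None

    def _detect(self, gray):
        # Background "detection thread": full-frame keypoint detection.
        kps = self.orb.detect(gray, None)
        pts = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)
        with self.lock:
            self.detected_pts = pts

    def process(self, frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Launch detection asynchronously so the main loop is not blocked.
        threading.Thread(target=self._detect, args=(gray.copy(),), daemon=True).start()

        if self.prev_gray is not None and self.tracked_pts is not None and len(self.tracked_pts):
            # Main "tracking thread": Lucas-Kanade frame-to-frame propagation.
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(self.prev_gray, gray, self.tracked_pts, None)
            self.tracked_pts = nxt[status.ravel() == 1].reshape(-1, 1, 2)
        with self.lock:
            if self.detected_pts is not None:   # fuse in fresh detections when available
                self.tracked_pts = self.detected_pts
                self.detected_pts = None
        self.prev_gray = gray
        return self.tracked_pts
```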




Y. Park, V. Lepetit and W. Woo, "Multiple 3D Object Tracking for Augmented Reality," in Proc. ISMAR 2008, pp. 117-120, Sep. 2008.
Y. Park, V. Lepetit and W. Woo, "Extended Keyframe Detection with Stable Tracking for Multiple 3D Object Tracking," IEEE TVCG, 17(11):1728-1735, 2011.
AR @ U-VR Lab 2008

    Multiple 3D Object Tracking for Augmented Reality
        Multiple objects 3D tracking demonstration
                                                      (demo video)




        This video shows simultaneous multiple 3D object tracking that maintains the frame rate. The video also
        shows the effect of temporal keypoint verification.

Y. Park, V. Lepetit and W. Woo, "Multiple 3D Object Tracking for Augmented Reality," in Proc. ISMAR 2008, pp. 117-120, Sep. 2008.
Y. Park, V. Lepetit and W. Woo, "Extended Keyframe Detection with Stable Tracking for Multiple 3D Object Tracking," IEEE TVCG, 17(11):1728-1735, 2011.
AR @ U-VR Lab 2009

    Handling Motion-Blur in 3D Tracking and Rendering for AR
      Generalized image formation model simulating the motion-blur effect
      Derivation of the optimization using Efficient Second-order Minimization (ESM)
      Automated exposure time evaluation
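
A rough illustration of the image formation idea behind handling motion blur: a blurred view can be approximated by averaging the sharp template warped along the camera motion during the exposure time. The linear homography interpolation and OpenCV rendering below are assumptions for the sketch, not the paper's derivation.

```python
# Sketch: approximate a motion-blurred view by averaging warps of the sharp
# template between two homographies covering the exposure time (assumption:
# crude linear interpolation; not the paper's exact model).
import cv2
import numpy as np

def render_motion_blur(template, H_start, H_end, out_size, n_samples=16):
    """Average n_samples warps of `template` between H_start and H_end."""
    w, h = out_size
    acc = np.zeros((h, w, 3), np.float32)
    for t in np.linspace(0.0, 1.0, n_samples):
        H_t = (1.0 - t) * H_start + t * H_end      # interpolate the motion during exposure
        acc += cv2.warpPerspective(template, H_t, (w, h)).astype(np.float32)
    return (acc / n_samples).astype(np.uint8)

if __name__ == "__main__":
    tmpl = np.full((240, 320, 3), 255, np.uint8)
    cv2.putText(tmpl, "AR", (60, 160), cv2.FONT_HERSHEY_SIMPLEX, 4, (0, 0, 0), 8)
    H0 = np.eye(3)
    H1 = np.array([[1, 0, 25], [0, 1, 5], [0, 0, 1]], np.float64)  # small shift during exposure
    blurred = render_motion_blur(tmpl, H0, H1, (320, 240))
```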




Y. Park, V. Lepetit and W. Woo, "ESM-Blur: Handling & Rendering Blur in 3D Tracking and Augmentation," in Proc. ISMAR 2009, pp. 163-166, Oct. 2009.
Y. Park, V. Lepetit and W. Woo, "Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality," IEEE TVCG, (to appear).
AR @ U-VR Lab 2009

    Handling Motion-Blur in 3D Tracking and Rendering for AR
       Comparison with ESM and augmentation with motion blur effect
                                                      (demo video)




       This video compares the proposed ESM-Blur and ESM-Blur-SE with ESM and illustrates the
       augmentation with motion-blur effect for 3D models under general motion.

Y. Park, V. Lepetit and W. Woo, "ESM-Blur: Handling & Rendering Blur in 3D Tracking and Augmentation," in Proc. ISMAR 2009, pp. 163-166, Oct. 2009.
Y. Park, V. Lepetit and W. Woo, "Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality," IEEE TVCG, (to appear).
AR @ U-VR Lab 2010

     Scalable Tracking for Digilog Books
      Fast and reliable tracking using a multi-core programming approach
      Frame-to-frame tracking for fast performance: Bounded search
      Two-step detection for scalability: “Image searching + Feature-level matching”
[Pipeline: Tracking thread (main): re-localization of points -> track points (frame to frame) -> enough points? -> compute homography -> decompose homography -> pose (R t) with inliers and page ID. Detection thread (background): image searching -> feature-level matching -> valid page ID? -> compute homography (H) -> count inliers. Accompanying figures: 6-DOF pose in challenging viewpoints; matches visualization.]
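
A hedged sketch of the two-step detection thread above: coarse "image searching" over the page database, then feature-level matching and a RANSAC homography whose inlier count validates the page ID. ORB features and a colour-histogram retrieval step are stand-ins for the paper's actual descriptors and index.

```python
# Sketch of "image searching + feature-level matching" detection for planar pages
# (assumptions: ORB features, colour-histogram retrieval, fixed inlier thresholds).
import cv2
import numpy as np

orb = cv2.ORB_create(500)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def page_signature(img):
    hist = cv2.calcHist([img], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
    return cv2.normalize(hist, hist).flatten()

def detect_page(query, database):
    """database: list of (page_id, image, keypoints, descriptors, signature)."""
    q_sig = page_signature(query)
    # Step 1: image searching -- pick the most similar page signature.
    page_id, ref_img, ref_kp, ref_des, _ = max(
        database, key=lambda e: cv2.compareHist(q_sig, e[4], cv2.HISTCMP_CORREL))
    # Step 2: feature-level matching against the retrieved page only.
    q_kp, q_des = orb.detectAndCompute(query, None)
    if q_des is None or ref_des is None:
        return None
    matches = bf.match(q_des, ref_des)
    if len(matches) < 10:
        return None
    src = np.float32([q_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([ref_kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    inliers = int(mask.sum()) if mask is not None else 0
    return (page_id, H) if inliers > 15 else None   # valid page ID only with enough inliers
```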

K. Kim, V. Lepetit and W.Woo, “Scalable Planar Targets Tracking for Digilog Books,” The Visual Computer, 26(6-8):1145-1154, 2010.
AR @ U-VR Lab 2010

     Scalable Tracking for Digilog Books

       Tracking Performance (demo video): visualization of inliers; less than 10 ms tracking speed with 314 planar targets in a database.

       HongGilDong, a Digilog Book application: storytelling application integrated with Virtools*



K. Kim, V. Lepetit and W.Woo, “Scalable Planar Targets Tracking for Digilog Books,” The Visual Computer, 26(6-8):1145-1154, 2010.
AR @ U-VR Lab 2010

      Real-time Modeling and Tracking
       In-situ modeling of various objects and collection of tracking data via real-time structure from motion (Real-time SfM)
       Object insertion with minimal user interaction (Interactive Modeling)
       Tracking multiple objects independently in real-time
[Pipeline: Foreground thread: searching features -> feature extraction -> keyframes searching (3.2.1, 3.4.2) -> feature matching and outlier rejection (3.2.2) -> frame-to-frame matching -> pose update -> keyframe condition? -> rendering. Background thread: new points triangulation (3.3) -> bundle adjustment (3.3) -> object modeling? (3.5) -> map update (3.3); feeds multiple object tracking.]
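
A minimal sketch of the background thread's "new points triangulation" step, assuming known camera intrinsics K and two keyframe poses; bundle adjustment and map update are omitted.

```python
# Sketch: triangulate newly matched points between two keyframes (assumption:
# intrinsics K and keyframe poses (R, t) are already estimated).
import cv2
import numpy as np

def triangulate_new_points(K, pose1, pose2, pts1, pts2):
    """pose = (R, t); pts1/pts2 are Nx2 pixel correspondences between the keyframes."""
    P1 = K @ np.hstack([pose1[0], pose1[1].reshape(3, 1)])
    P2 = K @ np.hstack([pose2[0], pose2[1].reshape(3, 1)])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T.astype(np.float64), pts2.T.astype(np.float64))
    return (pts4d[:3] / pts4d[3]).T     # Nx3 points in the map coordinate frame
```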




K. Kim, V. Lepetit and W.Woo, “Keyframe-based Modeling and Tracking of Multiple 3D Objects”, International Symposium on Mixed and Augmented Reality,” ISMAR, 2010.
AR @ U-VR Lab 2010

     Real-time Modeling and Tracking
       ISMAR10                                                                       Extension
                          (demo video)




       Supporting various types of objects
       Enhanced multiple object detection



K. Kim, V. Lepetit and W.Woo, “Keyframe-based Modeling and Tracking of Multiple 3D Objects”, International Symposium on Mixed and Augmented Reality,” ISMAR, 2010.
AR @ U-VR Lab 2011

     Reconstruction, Registration, and Tracking for Digilog Miniatures
      Fast and reliable 3D tracking based on the scalable tracker for Digilog books
      Tracking data: incremental 3D reconstruction of the target objects, performed offline
      Registration: fitting a planar surface to the reconstructed keypoints
[Pipeline: Offline process: SIFT feature extraction -> incremental reconstruction -> bundle adjustment; collect keypoints -> set local coordinates -> adjust scale. Online process (target tracking): extract SIFT features from the input image -> search keyframe* via a vocabulary tree (voctree) -> outlier rejection and keypoint finding -> P-n-P + L-M minimization -> frame-by-frame matching within a search window (adding keypoints if available) -> pose update (R, t).]
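
A small sketch of the online pose stage named in the pipeline (P-n-P followed by L-M minimization), assuming OpenCV's solvePnPRansac and solvePnPRefineLM as stand-ins for the actual solver.

```python
# Sketch: 2D-3D matches against the offline-reconstructed keypoints feed a
# RANSAC P-n-P, refined with Levenberg-Marquardt (assumption: OpenCV solvers).
import cv2
import numpy as np

def estimate_pose(obj_pts, img_pts, K, dist=None):
    """obj_pts: Nx3 reconstructed keypoints, img_pts: Nx2 detections, K: intrinsics."""
    dist = np.zeros(4) if dist is None else dist
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, dist,
                                                 reprojectionError=4.0)
    if not ok or inliers is None or len(inliers) < 6:
        return None
    idx = inliers.ravel()
    # Levenberg-Marquardt refinement on the inlier set only.
    rvec, tvec = cv2.solvePnPRefineLM(obj_pts[idx], img_pts[idx], K, dist, rvec, tvec)
    return cv2.Rodrigues(rvec)[0], tvec   # (R, t)
```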

K. Kim, N. Park and W.Woo, “Vision-based All-in-One Solution for AR and its Storytelling Applications,” The Visual Computer (submitted), 2011.
AR @ U-VR Lab 2011

     Reconstruction, Registration, and Tracking for Digilog Miniatures
        Miniature I: Palace, 23 keyframes, 10,370 keypoints (demo video)
        Miniature II: Temple, 42 keyframes, 24,039 keypoints
        Miniature III: Town, 82 keyframes, 80,157 keypoints




K. Kim, N. Park and W.Woo, “Vision-based All-in-One Solution for AR and its Storytelling Applications,” The Visual Computer (submitted), 2011.
AR @ U-VR Lab 2011

      Depth-assisted Real-time 3D Object Detection for AR
       Texture-less 3D Object Detection in Real-time
       Robust Detection under varying lighting conditions
       Scale difference detection


[Pipeline: RGB image and depth image -> gradient computation -> image & depth template matching (against stored image & depth templates) -> 3D point registration -> is the registration error small? -> pose computation.
Figure 3: Overall procedure of the proposed method. The steps marked in shade run in parallel on a GPU.]

W. Lee, N. Park, W. Woo, "Depth-assisted Real-time 3D Object Detection for Augmented Reality," ICAT2011, 2011.
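
A rough sketch of scoring a candidate location with both an image term and a depth term, as in the combined image & depth template matching step. The normalized gradient correlation and depth tolerance below are assumptions; the paper's GPU-parallel matching and 3D point registration are not reproduced.

```python
# Sketch: combined image + depth template score at a given location (assumptions:
# squared-gradient correlation as the image term, a fixed depth tolerance in the
# same units as the depth map, and equal weighting of the two terms).
import cv2
import numpy as np

def combined_score(rgb, depth, tmpl_gray, tmpl_depth, top_left, depth_tol=30.0):
    """tmpl_gray: grayscale template; tmpl_depth: template depth (e.g. mm)."""
    x, y = top_left
    h, w = tmpl_gray.shape[:2]
    patch_gray = cv2.cvtColor(rgb[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    patch_d = depth[y:y+h, x:x+w].astype(np.float32)

    # Image term: correlation of squared gradient magnitudes (edges survive on
    # texture-less objects better than raw intensities).
    g1 = cv2.Sobel(patch_gray, cv2.CV_32F, 1, 0) ** 2 + cv2.Sobel(patch_gray, cv2.CV_32F, 0, 1) ** 2
    g2 = cv2.Sobel(tmpl_gray, cv2.CV_32F, 1, 0) ** 2 + cv2.Sobel(tmpl_gray, cv2.CV_32F, 0, 1) ** 2
    img_term = cv2.matchTemplate(g1, g2, cv2.TM_CCOEFF_NORMED)[0, 0]

    # Depth term: fraction of pixels whose depth difference stays within tolerance.
    valid = (patch_d > 0) & (tmpl_depth > 0)
    depth_term = np.mean(np.abs(patch_d - tmpl_depth)[valid] < depth_tol) if valid.any() else 0.0
    return 0.5 * img_term + 0.5 * depth_term
```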
AR @ U-VR Lab 2011

    Depth-assisted 3D Object Detection for AR (Nov. 30, Session 5)
                      Multiple 3D Object Detection (demo video)
        -    3D texture-less object detection & pose estimation
        -    Multiple target detection in real-time

                      Robust Detection with different lighting conditions and scales
        -    Robust detection under varying lighting conditions
        -    Detection of scale difference between two similar objects
                                                     Available at : http://youtu.be/TgnocccmS7U


W. Lee, N. Park, W. Woo, “Depth-assisted Real-time 3D Object Detection for Augmented Reality,” ICAT2011, 2011
AR @ U-VR Lab 2011

    Texture-less 3D object Tracking with RGB-D Cam
     Object training while tracking: start without known 3D model
     Stabilization using color image as well as depth map
     Depth map enhancement around noisy boundary and surface




Y. Park, V. Lepetit and W. Woo, "Texture-Less Object Tracking with Online Training using An RGB-D Camera," in Proc. ISMAR 2011, pp. 121-126, Oct. 2011.
AR @ U-VR Lab 2011

    Texture-less 3D object Tracking with RGB-D Cam
        Tracking while training of texture-less objects
                                                      (demo video)




        This video shows the tracking of texture-less objects that are difficult to track using conventional
        keypoint-based methods. The tracking begins without a known 3D object model.

Y. Park, V. Lepetit and W. Woo, "Texture-Less Object Tracking with Online Training using An RGB-D Camera," in Proc. ISMAR 2011, pp. 121-126, Oct. 2011.
AR @ U-VR Lab 2011

In situ 3D Modeling for wearable AR
Interaction @ U-VR Lab 2009-10

     Two-handed tangible interactions for augmented blocks
      Cubical user interface based tangible interactions
           Screw-driving (SD) method for free positioning
           Block-assembly (BA) method using pre-knowledge
       Augmented assembly guidance
           Preliminary and interim guidance in BA




                                         SD sequence




                                                                                                                            BA sequence

H. Lee, M. Billinghurst, and W. Woo, "Two-handed tangible interaction techniques for composing augmented blocks," in Virtual Reality, Vol. 15, No. 2-3, pp. 133-146, Jun. 2010.
Interaction @ U-VR Lab 2009-10

    Two-handed tangible interactions for augmented blocks
        AR Toy Car Making:
        Tangible Cube Interface based Screw-driving interaction




        The Screw-Driving technique is based on the real-world situation where two or more real objects are
        joined together using a screw and screwdriver. It supports axis changes with the help of an additional
        button and visual hints for 3D positioning.


Link: http://youtu.be/t0iVuNygqQw
Interaction @ U-VR Lab 2010

     An Empirical Evaluation of Virtual Hand
       Techniques for 3D Object Manipulation
       Adopt Fitts’ law-based formal evaluation process
       Extend the design parameters of the 1D scale Fitts’ law to 3D scale
       Implement and compare standard TAR manipulation techniques
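
For reference, the Shannon formulation of Fitts' law that such evaluations typically start from; extending the distance D and target width W to quantities measured in 3D space is the general idea, though the paper's exact 3D parameters are not reproduced here.

```latex
% Shannon formulation of Fitts' law: MT = movement time, D = distance to target,
% W = target width, a and b are empirically fitted constants.
MT = a + b \cdot \log_2\!\left(\frac{D}{W} + 1\right), \qquad
ID = \log_2\!\left(\frac{D}{W} + 1\right)
% In a 3D extension, D and W become the 3D movement distance and the target size
% along the movement direction (an assumption about the general approach).
```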




                                                                          CUP method                                             PADDLE method




               ≈
                                                                        CUBE method                                           Ex_PADDLE method



Taejin Ha, Woontack Woo, "An Empirical Evaluation of Virtual Hand Techniques for 3D Object Manipulation in a Tangible Augmented Reality Environment," IEEE 3D User
   Interfaces, pp. 91-98, 2010.
Interaction @ U-VR Lab 2011

     An Interactive 3D Movement Path Manipulation Method
      A control point allocation test shows that the method properly generates 3D movement paths
      The dynamic selection method effectively selects small and densely placed control points




Taejin Ha, Mark Billinghurst, Woontack Woo, "An Interactive 3D Movement Path Manipulation Method in an Augmented Reality Environment," Interacting with Computers, 2011
   (in press).
Interaction @ U-VR Lab 2010-11

     An Empirical Evaluation of Virtual Hand Techniques
       Virtual Hand                                                               3D Path Manipulation

                          (demo video)




        Virtual Hand:
         Affordance could enhance usability by promoting the user's understanding
         Instant triggering could help rapid manipulation (e.g., button input)
         Selection can be made easier by expanding the selection area

        3D Path Manipulation:
         ◦ A movement path can be constructed using only a small number of control points
         ◦ A movement path can be rapidly manipulated with relatively reduced hand and arm movements, using increased effective distance
Taejin Ha, Woontack Woo, "An Empirical Evaluation of Virtual Hand Techniques for 3D Object Manipulation in a Tangible Augmented Reality Environment," IEEE 3D User
   Interfaces, pp. 91-98, 2010.
Taejin Ha, Mark Billinghurst, Woontack Woo, "An Interactive 3D Movement Path Manipulation Method in an Augmented Reality Environment," Interacting with Computers, 2011
Interaction @ U-VR Lab 2011

     ARWand: Phone-based 3D Object Manipulation in AR
      Exploits a 2D touch screen, a 3-DOF accelerometer, and compass sensor information to manipulate 3D objects in 3D space
      Designs transfer functions to map the control space of mobile phones to an AR display space




Taejin Ha, Woontack Woo, "ARWand: Phone-based 3D Object Manipulation in Augmented Reality Environment," ISUVR, pp. 44-47, 2011.
Interaction @ U-VR Lab 2011

     ARWand: Phone-based 3D Object Manipulation in AR
         Experiment and application




           ◦ Low control-to-display gain: precise translation is possible, but it requires a significant amount of clutching
           ◦ High gain could reduce frequent clutching, but accurate manipulation could be difficult
           ◦ Therefore, we need to consider an optimal control function that satisfies both fast and accurate manipulation (see the sketch below)
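
A hypothetical velocity-dependent control-to-display gain illustrating that trade-off: low gain for slow, precise input and higher gain for fast, ballistic input. The function and its parameters are assumptions for illustration, not ARWand's actual transfer function.

```python
# Hypothetical control-to-display transfer function: gain grows smoothly with
# control-space speed so precise moves stay precise and large moves need less
# clutching (assumption; parameter values are arbitrary).
import numpy as np

def cd_gain(speed, g_min=0.5, g_max=4.0, v_mid=0.05, slope=60.0):
    """speed: control-space speed (m/s); returns a gain between g_min and g_max."""
    return g_min + (g_max - g_min) / (1.0 + np.exp(-slope * (speed - v_mid)))

def map_touch_to_ar(delta_xy_m, dt):
    """Map a touch displacement (metres on the screen) to an AR display-space translation."""
    speed = np.linalg.norm(delta_xy_m) / max(dt, 1e-6)
    return cd_gain(speed) * np.asarray(delta_xy_m)
```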


Taejin Ha, Woontack Woo, "ARWand: Phone-based 3D Object Manipulation in Augmented Reality Environment," ISUVR, pp. 44-47, 2011.
Interaction @ U-VR Lab 2011

     Graphical Menus using a Mobile Phone for Wearable AR Systems
      Classifying focusable menus via a mobile phone with stereo HMD
          Display-referenced (DR)
          Manipulator-referenced (MR)
          Target-referenced (TR)




                    DR                        MR                       TR




                                DR                                               MR                                                 TR

H.Lee, D.Kim, and W.Woo, “Graphical Menus using a Mobile Phone for Wearable AR systems,” in Proc. ISUVR 2011, pp55-58, Jul. 2011.
Interaction @ U-VR Lab 2011

    Graphical Menus using a Mobile Phone for Wearable AR Systems
          Wearable menus on three focusable elements




           Based on previous menu work, we determine display-, manipulator-, and target-referenced menu
          placement according to focusable elements within a wearable AR system. It is implemented
          using a mobile phone with a stereo head-mounted display.


Link: http://youtu.be/TVrE5ljlCYI
CAMAR 2009-10

       Mobile AR: WHERE to augment?




                              Concept                                                    Context-aware                                              Annotation (H. Kim)




              Plan Recognition (Y. Jang)                                 Multi-page Recognition (J.Park)                                       LBS + mobile AR (W. Lee)

[Paper] Y. Jang and W. Woo, “Stroke-based semi-automatic region of interest detection for in-situ painting recognition", 14th International Conference on Human-Computer Interaction (HCII 2011), Jul. 9-14,
Orlando, USA, accepted.
[Patent] W. Woo, Y. Jang, "Semi-automatic region-of-interest detection algorithm using stroke interaction for in-situ painting recognition," 2010. (application pending)
CAMAR: Context-aware mobile AR

How to make CAMAR Apps more useful?



         Impractical AR
         • 3D models placed in a webcam view with little or no interactivity
         • Layered animation with little or no feedback
         • MAR that uses solely GPS, compass, and accelerometer input
         • MAR where geo-tagging doesn't serve an everyday purpose

         Useful AR
         • Engaging, persistent experience for the user
         • [LBS + SNS + MAR] drawing from a large DB with customization features
CAMAR 2.0: Context-aware mobile AR


[Diagram: CAMAR 2.0 cycle of sharing, mashup, direct response, planned response, and reflective response]
Context Awareness @ U-VR Lab 2010

     Context-aware Microblog Browser
      Observe the properties of microblogs from large-scale data analysis
      Propose the method that retrieves user-related hot topics of microblogs




[Pipeline: User's Interests Inference: user & friends' micro-blogs retrieval -> activity inference and preference inference based on TF -> preference categorization. Local Recent Hot Topic Detection: user context acquisition -> local micro-blogs retrieval with Web contextual information -> comparison with global data and with previous local data -> local hot topics detection -> hot topic categorization. Then: similarity measurement between topic and interest -> hot topic selection with re-ranking -> hot topic visualization of real-time local hot topics.]
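
A hedged sketch of the re-ranking idea: infer a user interest vector from term frequencies in the user's (and friends') microblogs, then rank detected local hot topics by similarity to it. Plain bag-of-words TF and cosine similarity are assumptions; the categorization and global/local comparison steps are omitted.

```python
# Sketch: TF-based preference inference and re-ranking of local hot topics by
# cosine similarity to the user's interest vector (assumption: bag-of-words TF).
from collections import Counter
import math
import re

def tf_vector(texts):
    counts = Counter(w for t in texts for w in re.findall(r"[a-z]+", t.lower()))
    total = sum(counts.values()) or 1
    return {w: c / total for w, c in counts.items()}

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank_hot_topics(user_posts, hot_topics):
    """hot_topics: {topic: [related microblog texts]} -> topics sorted by user relevance."""
    interest = tf_vector(user_posts)
    scored = [(cosine(tf_vector(texts), interest), topic) for topic, texts in hot_topics.items()]
    return [topic for _, topic in sorted(scored, reverse=True)]
```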



J. Han, X. Xie, and W. Woo, “Context-based Local Hot Topic Detection for Mobile User,” in Proc. of Adjunct Pervasive 2010, pp.001-004, May. 2010.
Context Awareness @ U-VR Lab 2010

     Context-aware Microblog Browser
        Dependence of Microblogs and Context (demo video)
        User history is the most influential factor for user interest.
        Location and the user's social relationships are also important, and local social networking is more important than either.

        Microblog Mobile Browser
        Gathers user contexts from a mobile phone
        Detects real-time local hot topics from microblogs
        Selects hot topics related to user preference and activity




J. Han, X. Xie, and W. Woo, “Context-based Local Hot Topic Detection for Mobile User,” in Proc. of Adjunct Pervasive 2010, pp.001-004, May. 2010.
Context Awareness @ U-VR Lab 2011

     Adaptive Content Recommendation
      Recommend user-preferred content
      Retrieve content efficiently using a hierarchical context model




J. Han, H. Schmidtke, X. Xie, and W. Woo, “Adaptive Content Recommendation using Hierarchical Context Model with Granularity for Mobile Consumer,” in Pers. Ubiqu. Comp
    ut., pp.000-000, 2012. (Submitted)
Context Awareness @ U-VR Lab 2011

     Adaptive Content Recommendation
       Hierarchical Context Model                                                Content Recommender using Context Model
                         (demo video)




       • Collection of directed acyclic graph                                    • Retrieve tags related to retrieved photos
       • Represent partial order relation                                        • Tag cloud with DAG structure
       • Capture subtag-supertag hierarchies                                     • Collect tags and investigate frequency of the
                                                                                   tags
                                                                                 • Display with different size of fonts



J. Han, H. Schmidtke, X. Xie, and W. Woo, “Adaptive Content Recommendation using Hierarchical Context Model with Granularity for Mobile Consumer,” in Pers. Ubiqu.
    Comput., pp.000-000, 2012. (Submitted)
CAMAR Applications 2009-11
CAMAR @ U-VR Lab 2009

    CAMAR Tag Framework: Context-Aware Mobile Augmented
       Reality for Dual-reality Linkage
        A novel tag concept that adds a tag to an object as a reference point in dual reality
          for contacting and sharing information




H. Kim, W. Lee and W. Woo, “CAMAR Tag Framework: Context-Aware Mobile Augmented Reality Tag Framework for Dual-reality Linkage”, in ISUVR 2009, pp.39-42,
   July 2009.
CAMAR @ U-VR Lab 2010

     Real and Virtual Worlds Linkage through Cloud-Mobile
       Convergence
       Consider opportunities and requirements for
        dual world linkage through CMCVR
       Implement an object-based linkage module
        prototype on a mobile phone
       Evaluate results of obtained 3D points
        normalization
                                                                                      A Model of Real and Virtual Worlds Linkage through CMCVR




                 Object modeling from real to virtual world                                    Content authoring from virtual to real world

H. Kim and W.Woo, “Real and Virtual Worlds Linkage through Cloud-Mobile Convergence”, in Virtual Reality Workshop (CMCVR), pp.10-13, March. 2010.
CAMAR @ U-VR Lab 2010

     Real and Virtual Worlds Linkage through Cloud-Mobile
       Convergence
       Poster linkage from real to virtual world
                   (demo video)                                                   Dual art galleries




       Real and virtual world                                                     Real and virtual world
       - ubiHome, a smart home test bed                                           - art gallery test bed
       - virtual 3D ubiHome                                                       - virtual 3D art gallery

       Two-dimensional objects                                                    Two-dimensional objects
       - like posters                                                             - like structure shape and picture frames


H. Kim and W.Woo, “Real and Virtual Worlds Linkage through Cloud-Mobile Convergence”, in Virtual Reality Workshop (CMCVR), pp.10-13, March. 2010.
CAMAR @ U-VR Lab 2010

     Barcode-assisted Planar Object Tracking for Mobile AR
      Embeds the information related to a planar object into the barcode; this information is
       used to limit the image regions over which keypoint matching is performed between
       consecutive frames (see the sketch below)
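
A sketch of the region-limiting idea, with cv2.QRCodeDetector standing in for the barcode detector and an arbitrary expansion margin; both are assumptions rather than the paper's implementation.

```python
# Sketch: detect a code, then restrict keypoint detection/matching to a window
# around it so feature matching between consecutive frames stays cheap
# (assumptions: QR detector stand-in, arbitrary margin).
import cv2
import numpy as np

detector = cv2.QRCodeDetector()
orb = cv2.ORB_create(400)

def keypoints_near_code(frame, margin=1.5):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    data, corners, _ = detector.detectAndDecode(gray)
    if corners is None or len(corners) == 0:
        return data, orb.detectAndCompute(gray, None)   # fall back to full-frame matching
    x, y, w, h = cv2.boundingRect(corners.astype(np.float32).reshape(-1, 1, 2))
    # Expand the code's box into the region where the planar object is expected.
    cx, cy = x + w / 2, y + h / 2
    w2, h2 = int(w * margin), int(h * margin)
    mask = np.zeros(gray.shape, np.uint8)
    mask[max(0, int(cy - h2)):int(cy + h2), max(0, int(cx - w2)):int(cx + w2)] = 255
    return data, orb.detectAndCompute(gray, mask)
```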




       Tracking by Detection (Mobile)                                             Barcode Detection + Natural Feature
                                                                                  Tracking




N.Park W.Lee and W.Woo, “Barcode-assisted Planar Object Tracking Method for Mobile Augmented Reality” in Proc. ISUVR 2011, pp.40-43, July. 2011.
http://www.youtube.com/watch?feature=player_profilepage&v=nho4y2yoASo, Barcode-assisted Planar Object Tracking Method for Mobile Augmented Reality, GIST CTI.
CAMAR @ U-VR Lab 2010

   2D Detection/Recognition for mobile tagging
    Semi-automatic ROI Detection for Painting Region
        Robust to Illumination, View Direction/Distance Changes
     Fast Recognition based on Local Binary Pattern (LBP) codes
     In-Situ code enrollment for a detected new painting

[Pipeline: various sizes of paintings -> ROI* detection (rectangular shape) -> LBP* code extraction (extracted binary codes) -> code matching by Hamming distance against the DB; if the painting is new, its code is enrolled (updating code), otherwise the matched Object ID # is returned.
* ROI = Region of Interest, * LBP = Local Binary Pattern]
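
A minimal sketch of LBP code extraction over a detected ROI and Hamming-distance matching against the enrolled codes, assuming a basic 8-neighbour LBP on a fixed-size grid; the paper's exact code layout may differ.

```python
# Sketch: LBP bit-code extraction and Hamming-distance matching against enrolled
# painting codes (assumptions: 8-neighbour LBP on a 64x64 grid, fixed threshold).
import cv2
import numpy as np

def lbp_code(roi, size=(64, 64)):
    g = cv2.resize(cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY), size).astype(np.int16)
    c = g[1:-1, 1:-1]
    bits = []
    for dy, dx in [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]:
        bits.append((g[1 + dy:size[1] - 1 + dy, 1 + dx:size[0] - 1 + dx] >= c).astype(np.uint8))
    return np.stack(bits).reshape(-1)          # one bit per neighbour per pixel

def hamming(a, b):
    return int(np.count_nonzero(a != b))

def match_painting(roi, db, threshold=0.15):
    """db: {painting_id: code}; returns best id, or None (new painting to enrol)."""
    code = lbp_code(roi)
    best_id, best_d = None, len(code)
    for pid, ref in db.items():
        d = hamming(code, ref)
        if d < best_d:
            best_id, best_d = pid, d
    return best_id if best_d / len(code) < threshold else None
```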


Y. Jang and W. Woo, "A Stroke-based Semi-automatic ROI Detection Algorithm for In-Situ Painting Recognition", HCII2011,
Orlando, Florida, USA, July 9-14, 2011 (LNCS)
CAMAR @ U-VR Lab 2010

   2D Detection/Recognition for mobile tagging
    Stroke-based ROI Detection/Recognition [1] (demo video)
    Semi-automatic ROI detection for the painting region
    Robust to illumination and view-direction changes
    Fast recognition based on Local Binary Pattern (LBP)

    ROI Detection/Recognition
    Touch-triggered painting detection/recognition
    Robust to view-distance changes
    In-situ painting code generation/enrollment

[1] http://www.youtube.com/watch?feature=player_detailpage&v=pGp-L2dbcYU
CAMAR @ U-VR Lab 2010

    In Situ Video Tagging on Mobile Phones
     In situ Planar Target Learning on Mobile Phones
     Sensor-based Automatic Fronto-parallel View Generation
     Fast Vanishing Point Computation


[Pipeline: input image -> horizontal target? -> vanishing points estimation (for vertical targets) -> fronto-parallel view generation -> target learning on the mobile GPU -> real-time detection.]
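
A short sketch of the fronto-parallel view generation step: once the target's quadrilateral is known in the image (for example from the vanishing point estimation above), it is warped to a rectangle before feature learning. Treating the corners as given and the output size are assumptions for the sketch; the sensor-assisted vanishing point computation is not reproduced.

```python
# Sketch: warp a known target quadrilateral to a fronto-parallel rectangle
# (assumption: the four corners are already available, e.g. from vanishing points).
import cv2
import numpy as np

def fronto_parallel(frame, corners, out_w=256, out_h=256):
    """corners: 4x2 array of the target quadrilateral in TL, TR, BR, BL order."""
    src = np.float32(corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, H, (out_w, out_h))
```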




W. Lee, Y. Park, V. Lepetit, W. Woo, "In-Situ Video Tagging on Mobile Phones," Circuit Systems and Video Technology, IEEE Trans. on, Vol. 21, No. 10, pp. 1487-1496, 2011.
W. Lee, Y. Park, V. Lepetit, W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," ISMAR10, pp. 57-64, 2010.
CAMAR @ U-VR Lab 2010

     In Situ Video Tagging on Mobile Phones
            In situ Augmentation of Real World Objects                                          Vertical Target Learning & Detection
                           (demo video)




       -    In situ augmentation of real world objects                                 -   Learning a vertical target from an arbitrary
            without pre-trained database                                                   viewpoint
       -    Fast target learning in a few seconds                                      -   Vanishing point-based fronto-parallel view
       -    Real-time detection from novel viewpoints                                      generation
                                                                                       -   Real-time detection from unseen viewpoints
                                                       Available at : http://youtu.be/vaaFhvfwet8

W. Lee, Y. Park, V. Lepetit, W. Woo, "In-Situ Video Tagging on Mobile Phones," Circuit Systems and Video Technology, IEEE Trans. on, Vol. 21, No. 10, pp. 1487-1496, 2011.
W. Lee, Y. Park, V. Lepetit, W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," ISMAR10, pp. 57-64, 2010.
CAMAR @ U-VR Lab 2011

     Interactive Annotation on Mobile Phones for Real and Virtual
       Space Registration
       Allows the user to quickly capture the dimensions of a room
       Operates at interactive frame rates on a mobile device and provides simple touch interaction
       Serves as anchors for linking virtual information to the real space represented by the room




H. Kim, G. Reitmayr and W.Woo, “Interactive Annotation on Mobile Phones for Real and Virtual Space Registration,” in Proc. ISMAR 2011, pp.265-266, Oct. 2011.
CAMAR @ U-VR Lab 2011

     Interactive Annotation on Mobile Phones for Real and Virtual
       Space Registration
       Demo #1 (demo video)                                            Demo #2




       In office room and seminar room,                         In ART gallery,
       - Capture the dimensions of a room,                      - Load an AR zone-based room model
          approximated as a room                                - Annotate a virtual content on rectangular areas
       - Annotate a virtual content on rectangular                 on the room’s surface
          areas on the room’s surface




Youtube share link http://www.youtube.com/watch?v=I00I-phmPbI
CAMAR @ U-VR Lab 2011

    In-situ AR Mashup for AR Content Authoring
     Easily create AR contents from Web contents
     Context-based content recommendation
          User-similarity, item similarity, social relationship
       Configure AR content sharing setting
          To Whom, When, in What conditions




H.Yoon and W.Woo, “CAMAR Mashup: Empowering End-user Participation in U-VR Environment,” in Proc. ISUVR 2009, pp.33-36, July. 2009. (Best Paper Award)
H.Yoon and W.Woo, “Concept and Applications of In-situ AR Mashup Content,” in Proc. SCI 2011, pp. 25-30, Sept. 2011.
CAMAR @ U-VR Lab 2011

    In-situ AR Mashup for AR Content Authoring
           In-situ Content Mashup




           • Extract query keywords based on context of object
           • Content recommendation based on personal context and social context
           • Access related Flickr, Twitter, Picasa contents in-situ
H.Yoon and W.Woo, “CAMAR Mashup: Empowering End-user Participation in U-VR Environment,” in Proc. ISUVR 2009, pp.33-36, July. 2009. (Best Paper Award)
H.Yoon and W.Woo, “Concept and Applications of In-situ AR Mashup Content,” in Proc. SCI 2011, pp. 25-30, Sept. 2011.
Application Usage Prediction for Smartphones
 Personalized application prediction based on context
 Dynamic home screen: app recommendation and highlight



[Procedure: Data collection (sensory info, formatting, data recording) -> Pre-processing (filtering, merging, discretization) -> Feature selection (WrapperSubset selection, cfsSubClass, GTT) -> Training & prediction (MFU/MRU, Bayesian model, SVM/C4.5). Accompanying chart: frequency of applications per context (C1, C2, C3).]
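
A minimal stand-in for the training-and-prediction step, assuming scikit-learn's CategoricalNB over discretized context features and made-up example data; the actual study also evaluates MFU/MRU baselines, SVM and C4.5.

```python
# Sketch: predict the next app from discretized context with a categorical
# Naive Bayes model (assumptions: scikit-learn, toy context encoding and data).
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Discretized context: [hour_bucket, day_type, location_id, last_app_id]
X = np.array([
    [0, 0, 1, 2],   # morning, weekday, home, last app = mail
    [1, 0, 2, 0],   # midday,  weekday, work, last app = browser
    [2, 1, 1, 3],   # evening, weekend, home, last app = game
    [0, 0, 1, 2],
    [1, 0, 2, 0],
])
y = np.array([0, 1, 3, 0, 1])            # app launched next (encoded IDs)

model = CategoricalNB().fit(X, y)
context_now = np.array([[0, 0, 1, 2]])   # current context
probs = model.predict_proba(context_now)[0]
top = model.classes_[np.argsort(probs)[::-1]][:3]   # top apps to highlight on the home screen
print("recommend app ids:", top)
```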
Outline




Paradigm Shift : DigiLog with AR & Ubiquitous VR


Digilog Applications and U-VR Core


U-VR 2.0: What’s Next?


Summary and Q&A
U-VR2.0 for eco-System




[Diagram: dual space {R, R'}, linking real environments (RE, RE') in the real space R with virtual environments (V, VE') in the mirrored space R']
What’s Next?

Where is this headed?
 Computing in the next 5-10 years:
   Nomadic human: Desktop-based UI -> Augmented Reality
   Smart space: Intelligence for a user -> Wisdom for community => <STANDARD>
   Responsive content: Personal emotion -> Social fun => <Social Issues>
 Augmented content is the King, and context is the Queen consort controlling the King!
AR Standard

Interoperability (Standard)
 W3C : HTML5 (ETRI)
  http://www.w3.org/2010/06/w3car/report.html
ISO/IEC JTC1 SC24 : WG6,7,8 & WG9 (NEW on AR)
  X3D(KU), XML(GIST)
ISO/IEC JTC1 SC29 :
  X3D(ETRI)                                     <Figure by. H. Jeon @ ETRI>
Web3D :
  X3D (Fraunhofer)
OGC :
  KML & ARML
  KARML (GATECH)
Social AR?
Issues of Social AR
 Physical self along with a digital profile
 Unauthorized Augmented Advertising
 Privacy: Augmented Behavioral Targeting
 Safety: Physical danger
 Spam
What’s NEXT?

CAMAR 2.0
 <Open + Participate + Share>
 LBS + In-situ Mashup + SNS + CAMAR for sustainable AR eco-system


Wearable CAMAR 2.0
Summary




Paradigm Shift : DigiLog with AR & Ubiquitous VR


DigiLog Applications and VR Core


U-VR 2.0: What’s Next?


Summary and Q&A
Q&A

“The future is already here. It is just not uniformly distributed”
 by William Gibson (SF writer)


More Information
   Woontack Woo, Ph.D.
   Twitter: @wwoo_ct
   Mail: wwoo@gist.ac.kr
   Web: http://cti.gist.ac.kr




   ISUVR 2012 @ KAIST, Aug. 22 - 25, 2012

Más contenido relacionado

La actualidad más candente

COSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARCOSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARMark Billinghurst
 
AR - Augmented Reality
AR - Augmented RealityAR - Augmented Reality
AR - Augmented Realitydrstupid
 
UbiquitousVirtualReality2010
UbiquitousVirtualReality2010 UbiquitousVirtualReality2010
UbiquitousVirtualReality2010 Woontack Woo
 
426 lecture1: Introduction to AR
426 lecture1: Introduction to AR426 lecture1: Introduction to AR
426 lecture1: Introduction to ARMark Billinghurst
 
RBI paper, CHI 2008
RBI paper, CHI 2008RBI paper, CHI 2008
RBI paper, CHI 2008guest0dd2a1
 
Steve Feiner (Columbia University): The Future of AR
Steve Feiner (Columbia University): The Future of ARSteve Feiner (Columbia University): The Future of AR
Steve Feiner (Columbia University): The Future of ARAugmentedWorldExpo
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityMark Billinghurst
 
Augmented reality technical presentation
 Augmented reality technical presentation Augmented reality technical presentation
Augmented reality technical presentationsairamgoud16
 
Creating Immersive and Empathic Learning Experiences
Creating Immersive and Empathic Learning ExperiencesCreating Immersive and Empathic Learning Experiences
Creating Immersive and Empathic Learning ExperiencesMark Billinghurst
 
IN140703 service support technologies 6.10.2016
IN140703 service support technologies 6.10.2016IN140703 service support technologies 6.10.2016
IN140703 service support technologies 6.10.2016Pirita Ihamäki
 
Augmented Reality - 8th Mass Media
Augmented Reality - 8th Mass MediaAugmented Reality - 8th Mass Media
Augmented Reality - 8th Mass MediaQualcomm
 
Design Approaches For Immersive Experiences AR/VR/MR
Design Approaches For Immersive Experiences AR/VR/MRDesign Approaches For Immersive Experiences AR/VR/MR
Design Approaches For Immersive Experiences AR/VR/MRMark Melnykowycz
 
Augmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interactionAugmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interactionBello Abubakar
 
Augmented Reality - the next big thing in mobile
Augmented Reality - the next big thing in mobileAugmented Reality - the next big thing in mobile
Augmented Reality - the next big thing in mobileHari Gottipati
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented RealityAnkit Raj
 

La actualidad más candente (20)

COSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARCOSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to AR
 
AR - Augmented Reality
AR - Augmented RealityAR - Augmented Reality
AR - Augmented Reality
 
UbiquitousVirtualReality2010
UbiquitousVirtualReality2010 UbiquitousVirtualReality2010
UbiquitousVirtualReality2010
 
426 lecture1: Introduction to AR
426 lecture1: Introduction to AR426 lecture1: Introduction to AR
426 lecture1: Introduction to AR
 
RBI paper, CHI 2008
RBI paper, CHI 2008RBI paper, CHI 2008
RBI paper, CHI 2008
 
Steve Feiner (Columbia University): The Future of AR
Steve Feiner (Columbia University): The Future of ARSteve Feiner (Columbia University): The Future of AR
Steve Feiner (Columbia University): The Future of AR
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented Reality
 
Augmented reality technical presentation
 Augmented reality technical presentation Augmented reality technical presentation
Augmented reality technical presentation
 
Creating Immersive and Empathic Learning Experiences
Creating Immersive and Empathic Learning ExperiencesCreating Immersive and Empathic Learning Experiences
Creating Immersive and Empathic Learning Experiences
 
IN140703 service support technologies 6.10.2016
IN140703 service support technologies 6.10.2016IN140703 service support technologies 6.10.2016
IN140703 service support technologies 6.10.2016
 
Augmented Reality - 8th Mass Media
Augmented Reality - 8th Mass MediaAugmented Reality - 8th Mass Media
Augmented Reality - 8th Mass Media
 
Design Approaches For Immersive Experiences AR/VR/MR
Design Approaches For Immersive Experiences AR/VR/MRDesign Approaches For Immersive Experiences AR/VR/MR
Design Approaches For Immersive Experiences AR/VR/MR
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Augmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interactionAugmented reality the evolution of human computer interaction
Augmented reality the evolution of human computer interaction
 
Augmented Reality - the next big thing in mobile
Augmented Reality - the next big thing in mobileAugmented Reality - the next big thing in mobile
Augmented Reality - the next big thing in mobile
 
2013 Lecture 8: Mobile AR
2013 Lecture 8: Mobile AR2013 Lecture 8: Mobile AR
2013 Lecture 8: Mobile AR
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Virtual reality
Virtual realityVirtual reality
Virtual reality
 

UVR2011(icat2011)

  • 1. ICAT2011@ Osaka University (2011.11.28) Augmented Reality & DigiLog: Toward Ubiquitous Virtual Reality 2.0 Woontack Woo (禹 雲澤), Ph.D. http://twitter.com/wwoo_ct GIST CTI/U-VR Lab Gwangju, Korea
  • 2. Gwangju (光州), Korea, the city of Science & Technology, Light, Culture & Art, Food GIST is Research-oriented University U-VR Lab and CTI started in 2001 and 2005, respectively
  • 3. Brief History. Personal history and status of AR: estimated 180M+ users by 2012; major brands are taking keen interest; consumers are hungry for apps. (Timeline: 1968 HMD by Ivan Sutherland; 1991 1st ICAT; 1992 'AR' coined by Tom Caudell @ Boeing; 1994 Continuum by Milgram @ ATR; 1998 1st IWAR in SF, CA, USA; 1999 1st ISMR and 9th ICAT (Waseda U.); 1999 ATR MIC Lab; 2001 GIST U-VR Lab; 2002 1st ISMAR, Darmstadt; 2004 14th ICAT in Seoul; 2005 GIST CTI; 2006 1st ISUVR; 2007 Sony 'Eye of Judgment'; 2008 Wikitude mAR Guide; 2009 Sony 'EyePet', LBS AR; 2010 Qualcomm R&D Center; 2011 ISO/SC24/WG9, Sony PS Vita; 2012 KAIST U-VR Lab.)
  • 4. Outline Paradigm Shift : DigiLog with AR & Ubiquitous VR DigiLog Applications and U-VR Core U-VR 2.0: What’s Next? Summary and Q&A
  • 5. Media vs. A-Reality (S-)Media creates Perception Perception is (A-)Reality So, (S-)Media creates (A-)Reality What does (S-) and (A-) mean? S-Media : Smart, Social (CI) A-Reality : Altered, Augmented
  • 6. Computing History and My Perspective Computing History Mainframe Personal Networked Ubiquitous U-VR 60s 80s 90s 00s 10s Computer Computer Computers Computing Computing Text CG/Image Multimedia u-Media s-Media Sharing a Individual Sharing over Human- Community- computer usage Internet centered centered Information Knowledge Intelligence Wisdom Emotion Fun Computing in next 5-10 Years : Nomadic human: Desktop-based UI -> Augmented Reality Smart space : Intelligence for a user -> Wisdom for community Smart media: Personal emotion -> Social fun
  • 7. DigiLog and Ubiquitous VR Is DigiLog-X a new Media? DigiLog-X : Digital (Service/Content) over Analog Life Media platform: Phone/TV/CE + Computer + … HW platform: mobile network + Cloud + … Service/Content platform: SNS + LBS + CAS + … over Web/App UI/UX platform: 3D + AR/VR/MR + … So, DigiLog-X is becoming a new Media !!! How to realize Smart DigiLog? Ubiquitous Virtual Reality = VR in smart physical space Context-aware Mixed (Mirrored) Augmented Reality for smart DigiLog UI/UX => Mobile/wearable + Smart (context-aware) + AR + (for) Social Fun
  • 8. Hype Cycle of AR. (Figure: Gartner hype-cycle positions of Augmented Reality, 2008-2011.) MIT's annual review: "10 Emerging Technologies 2007"; Gartner: top-10 disruptive technologies 2008-12; Juniper: 1.4B mAR downloads/yr and $1.5B/yr revenue by 2015 (11M in 2010).
  • 9. Is AR Hype? Google Trends (VR vs. AR). A: Virtual Reality Embraced by Businesses; B: Another use for your phone: 'augmented reality'; C: Qualcomm Opens Austria Research Center to Focus on Augmented Reality; D: Qualcomm Launches Augmented Reality Application Developer Challenge; E: Review: mTrip iPhone app uses augmented reality; F: Toyota demos augmented-reality-enhanced car windows.
  • 10. What’s U-VR, MR & AR? Dual space {R, R’}. (Diagram: real environments (RE) and virtual environments (VE) linked between the real space R and its virtual counterpart R’.)
  • 11. What’s U-VR, MR & AR? Woo’s definition [11]: U-VR is a 3D link between the dual (real & virtual) spaces with additional information; CoI augmentation is not just sight but also sound, haptics, smell, taste, etc.; it is a bidirectional UI for H2H/H2S/S2H/S2S communication in the dual spaces. (Diagram: how to seamlessly link CoIs and social networks in the real space with U-Content in the virtual space.)
  • 12. Outline Paradigm Shift : DigiLog with AR & Ubiquitous VR DigiLog Applications and U-VR Core Technology U-VR 2.0: What’s Next? Summary and Q&A
  • 13. DigiLog Applications: DigiLog with AR for Edutainment. DigiLog with AR is interactive, flexible, interesting, and offers direct experience. Edutainment = Education (learning, training, knowledge) + Entertainment (fun, games, storytelling). Technological challenges: it should be simple to use and robust as a tool; provide the user with clear and concise information; enable the educator/tutor to input information in a simple and effective manner; enable easy interaction between learners; make complex procedures transparent to the learner; and be cost-effective and easy to install.
  • 14. DigiLog @ U-VR Lab 2006: Garden Alive, an Emotionally Intelligent Interactive Garden. Intuitive interaction: TUIs seamlessly bridge to the garden in a virtual world. Educational purpose: users can evaluate which environmental conditions affect plant growth. Emotional sympathy with the users: the emotional change of the virtual plants in response to user interaction maximizes user interest. (Figures: the overall system; tangible user interfaces with a watering pot, the user's hand, and a nutrient supplier in the garden.) Taejin Ha, Woontack Woo, "Garden Alive: An Emotionally Intelligent Interactive Garden," International Journal of Virtual Reality (IJVR), 5(4), pp. 21-30, 2006.
  • 15. DigiLog @ U-VR Lab 2006: Garden Alive, an Emotionally Intelligent Interactive Garden. Demo (demo video): with Garden Alive, users experience excitement and emotional interaction that is difficult to feel in a real garden, through the various kinds of growing plants with different gene types produced by generational evolution, and through changes of emotion reflecting the user's interaction, where the intelligent content provides emotional feedback to the users. Taejin Ha, Woontack Woo, "Garden Alive: An Emotionally Intelligent Interactive Garden," International Journal of Virtual Reality (IJVR), 5(4), pp. 21-30, 2006.
  • 16. DigiLog @ U-VR Lab 2010 Digilog book for temple bell tolling experience Digilog Book: an augmented paper book that provides additional multimedia content stimulating readers’ five senses using AR technologies • Descriptions for multisensory AR contents; multisensory feedback; and vision-based manual input Taejin Ha, Youngho Lee, Woontack Woo, "Digilog book for temple bell tolling experience based on interactive augmented reality," Virtual Reality, 15(4), pp. 295-309, 2010.
  • 17. DigiLog @ U-VR Lab 2010: Digilog book for temple bell tolling experience. A "temple bell experience" book: it is expected to encourage readers to explore cultural heritage for education and entertainment purposes. Taejin Ha, Youngho Lee, Woontack Woo, "Digilog book for temple bell tolling experience based on interactive augmented reality," Virtual Reality, 15(4), pp. 295-309, 2010.
  • 18. Digilog Applications 2010: Enhance Experience, Engage, Educate & Entertain. HongGilDong and Technologies in Chosun: storytelling applications integrated with Virtools*.
  • 19. Digilog Apps 2011: DigiLog Miniature, a storytelling application integrated with Virtools*.
  • 20. Technical Challenges CoI Localization: Context of Interest (CoI): Space vs. Object Accurate CoI Recognition and Tracking 3D Interaction Ubiquitous Augmentation LBS/SNS-based Authoring and Mash-up Smart UI for Intuitive Visualization AR-Infography + Organic UI Networking and public DB management U-VR ecosystem with SNS, LBS, CaS HW wish list Better camera/GPS/compass, CPU/GPU, I/O, battery
  • 21. Tracking @ U-VR Lab 2007-8: BilliARd (2005); 2D picture tracking (K. Kim); 3D vase tracking (Y. Park); tracking (W. Baek); MO tracking (W. Baek); multiple-object tracking (Y. Park).
  • 22. Tracking @ U-VR Lab 2007-8 PageRecognition (K.Kim) 2D Tracking (K.Kim) Tracking w/Modeling (K.Kim) Tracking App (K.Kim) Layer Authoring (J.Park) AR+ tale + -let (booklet)
  • 23. AR @ U-VR Lab 2008 Multiple 3D Object Tracking for Augmented Reality Performance-preserving parallel detection and tracking framework Stabilized 3D tracking by fusing detection and frame-to-frame tracking Keypoint verification for occluded region removal Y. Park, V. Lepetit and W.Woo, “Multiple 3D Object Tracking for Augmented Reality,” in Proc. ISMAR 2008, pp.117-120, Sep. 2008. Y. Park, V. Lepetit and W.Woo, “Extended Keyframe Detection with Stable Tracking for Multiple 3D Object Tracking,” IEEE TVCG, 17(11): 1728-1735, 2011
  • 24. AR @ U-VR Lab 2008: Multiple 3D Object Tracking for Augmented Reality. Multiple-object 3D tracking demonstration (demo video): the video shows simultaneous tracking of multiple 3D objects while maintaining the frame rate, as well as the effect of temporal keypoint verification. Y. Park, V. Lepetit and W. Woo, "Multiple 3D Object Tracking for Augmented Reality," in Proc. ISMAR 2008, pp.117-120, Sep. 2008. Y. Park, V. Lepetit and W. Woo, "Extended Keyframe Detection with Stable Tracking for Multiple 3D Object Tracking," IEEE TVCG, 17(11): 1728-1735, 2011.
  • 25. AR @ U-VR Lab 2009: Handling Motion-Blur in 3D Tracking and Rendering for AR. A generalized image formation model simulating the motion-blur effect; the derivation is folded into an Efficient Second-order Minimization (ESM) based optimization; automated exposure-time evaluation. Y. Park, V. Lepetit and W. Woo, "ESM-Blur: Handling & Rendering Blur in 3D Tracking and Augmentation," in Proc. ISMAR 2009, pp.163-166, Oct. 2009. Y. Park, V. Lepetit and W. Woo, "Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality," IEEE TVCG, (to appear).
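For reference, a common way to write such a formation model (a paraphrase of the general idea, not the paper's exact derivation) treats the blurred frame as the temporal average of the sharp image warped along the camera trajectory during the exposure time Δt, approximated by N discrete warps:

```latex
I_b(\mathbf{x}) \;=\; \frac{1}{\Delta t}\int_{t_0}^{t_0+\Delta t} I\big(w(\mathbf{x};\,\mathbf{p}(t))\big)\,dt
\;\approx\; \frac{1}{N}\sum_{k=1}^{N} I\big(w(\mathbf{x};\,\mathbf{p}_k)\big)
```

where w is the warp (for a planar target, a homography) with parameters p(t) along the motion during exposure.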
  • 26. AR @ U-VR Lab 2009: Handling Motion-Blur in 3D Tracking and Rendering for AR. Comparison with ESM, and augmentation with a motion-blur effect (demo video): the video compares the proposed ESM-Blur and ESM-Blur-SE with ESM and illustrates the augmentation of 3D models with a motion-blur effect under general motion. Y. Park, V. Lepetit and W. Woo, "ESM-Blur: Handling & Rendering Blur in 3D Tracking and Augmentation," in Proc. ISMAR 2009, pp.163-166, Oct. 2009. Y. Park, V. Lepetit and W. Woo, "Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality," IEEE TVCG, (to appear).
  • 27. AR @ U-VR Lab 2010: Scalable Tracking for Digilog Books. Fast and reliable tracking using a multi-core programming approach; frame-to-frame tracking for fast performance (bounded search); two-step detection for scalability ("image searching + feature-level matching"); 6-DOF pose in challenging viewpoints. (Figure: pipeline with a main tracking thread that tracks points frame to frame and computes the homography, and a background detection thread that performs image searching, feature-level matching, homography computation and decomposition, and inlier counting to recover the page ID and re-localize when tracking is lost.) K. Kim, V. Lepetit and W. Woo, "Scalable Planar Targets Tracking for Digilog Books," The Visual Computer, 26(6-8):1145-1154, 2010.
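As a concrete illustration of the detect-then-track structure sketched above, the following is a minimal Python/OpenCV sketch, not the authors' implementation: ORB features and a brute-force matcher stand in for the paper's image search and feature-level matching, and all function names, thresholds, and the page_db layout are assumptions.

```python
# Sketch: two-step detection (image search + feature matching) plus frame-to-frame tracking.
import cv2
import numpy as np

orb = cv2.ORB_create(1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def detect_page(frame_gray, page_db):
    """Detection thread: find which page is visible and its homography H (page -> frame)."""
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    for page_id, (kp_p, des_p) in page_db.items():        # simple loop stands in for the image-search index
        matches = matcher.match(des_p, des_f)
        if len(matches) < 20:
            continue
        src = np.float32([kp_p[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        if H is not None and inliers.sum() > 15:
            return page_id, H, dst[inliers.ravel() == 1]   # inlier points seed the tracker
    return None, None, None

def track_frame_to_frame(prev_gray, cur_gray, prev_pts, H_prev):
    """Tracking thread: bounded-search tracking (optical flow) and homography update."""
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, prev_pts, None,
                                                  winSize=(21, 21), maxLevel=2)
    good_prev = prev_pts[status.ravel() == 1]
    good_cur = cur_pts[status.ravel() == 1]
    if len(good_cur) < 15:
        return None, None                                  # too few points: fall back to detection
    H_delta, _ = cv2.findHomography(good_prev, good_cur, cv2.RANSAC, 3.0)
    return (H_delta @ H_prev if H_delta is not None else None), good_cur
```

A caller would run detect_page until a page is found, then call track_frame_to_frame every frame and fall back to detection whenever it returns None.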
  • 28. AR @ U-VR Lab 2010: Scalable Tracking for Digilog Books. Tracking performance and the HongGilDong Digilog Book application (demo video): visualization of inliers; less than 10 ms tracking time with 314 planar targets in the database; a storytelling application integrated with Virtools*. K. Kim, V. Lepetit and W. Woo, "Scalable Planar Targets Tracking for Digilog Books," The Visual Computer, 26(6-8):1145-1154, 2010.
  • 29. AR @ U-VR Lab 2010: Real-time Modeling and Tracking. Real-time SfM: in-situ modeling of various objects and collection of tracking data with real-time structure from motion; object insertion with minimal user interaction; interactive modeling; tracking multiple objects independently in real time. (Figure: pipeline with feature extraction and matching, keyframe search, frame-to-frame matching, outlier rejection, pose update, new-point triangulation, bundle adjustment, and map update, split across foreground and background stages.) K. Kim, V. Lepetit and W. Woo, "Keyframe-based Modeling and Tracking of Multiple 3D Objects," International Symposium on Mixed and Augmented Reality (ISMAR), 2010.
  • 30. AR @ U-VR Lab 2010: Real-time Modeling and Tracking. ISMAR10 extension (demo video): support for various types of objects; enhanced multiple-object detection. K. Kim, V. Lepetit and W. Woo, "Keyframe-based Modeling and Tracking of Multiple 3D Objects," International Symposium on Mixed and Augmented Reality (ISMAR), 2010.
  • 31. AR @ U-VR Lab 2011: Reconstruction, Registration, and Tracking for Digilog Miniatures. Fast and reliable 3D tracking based on the scalable tracker for Digilog books; tracking data from incremental offline 3D reconstruction of the target objects; registration by fitting a planar surface to the reconstructed keypoints. (Figure: offline process of SIFT feature extraction, incremental reconstruction, bundle adjustment, keypoint collection, local coordinate setting, and scale adjustment; online process of keyframe search with a vocabulary tree, SIFT extraction, keypoint matching within a search window, outlier rejection, PnP with L-M minimization, frame-by-frame matching, and pose update (R, t).) K. Kim, N. Park and W. Woo, "Vision-based All-in-One Solution for AR and its Storytelling Applications," The Visual Computer (submitted), 2011.
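The online pose step summarized above (2D-3D matching followed by PnP and L-M refinement) could look roughly like this in Python/OpenCV; this is a hedged sketch under assumed inputs (already-matched 2D/3D correspondences and intrinsics K), not the authors' code, and the thresholds are placeholders.

```python
# Sketch: 6-DOF pose from 2D-3D correspondences via PnP + RANSAC, then L-M refinement.
import cv2
import numpy as np

def estimate_pose(pts_3d, pts_2d, K, dist=None):
    """pts_3d: Nx3 reconstructed keypoints, pts_2d: Nx2 matched image points, K: 3x3 intrinsics."""
    obj = np.asarray(pts_3d, np.float32)
    img = np.asarray(pts_2d, np.float32)
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj, img, K, dist,
                                                 reprojectionError=4.0,
                                                 flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok or inliers is None or len(inliers) < 10:
        return None
    # Refinement step analogous to the L-M minimization on the slide (OpenCV >= 4.1).
    rvec, tvec = cv2.solvePnPRefineLM(obj[inliers.ravel()], img[inliers.ravel()],
                                      K, dist, rvec, tvec)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec
```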
  • 32. AR @ U-VR Lab 2011: Reconstruction, Registration, and Tracking for Digilog Miniatures (demo video). Miniature I (Palace): 23 keyframes, 10,370 keypoints. Miniature II (Temple): 42 keyframes, 24,039 keypoints. Miniature III (Town): 82 keyframes, 80,157 keypoints. K. Kim, N. Park and W. Woo, "Vision-based All-in-One Solution for AR and its Storytelling Applications," The Visual Computer (submitted), 2011.
  • 33. AR @ U-VR Lab 2011: Depth-assisted Real-time 3D Object Detection for AR. Texture-less 3D object detection in real time; robust detection under varying lighting conditions; detection of scale differences. (Figure 3: overall procedure of gradient computation on the RGB image, image & depth template matching, 3D point registration, and pose computation once the registration error is small; the shaded steps run in parallel on a GPU.) W. Lee, N. Park, W. Woo, "Depth-assisted Real-time 3D Object Detection for Augmented Reality," ICAT2011, 2011.
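To make the "image & depth template matching" idea concrete, here is a deliberately simplified Python/OpenCV sketch of my own (not the paper's method): a gradient-magnitude template match is verified with a depth-consistency check, which is one way depth can help separate similar-looking objects at different scales. All names and thresholds are assumptions; gray is an 8-bit image and the templates are float32 arrays.

```python
# Sketch: gradient-based template match verified by a simple depth-consistency check.
import cv2
import numpy as np

def detect_with_depth(gray, depth, tmpl_grad, tmpl_depth, depth_tol=0.05):
    """gray/depth: scene images; tmpl_grad/tmpl_depth: float32 gradient-magnitude and depth templates."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    grad = cv2.magnitude(gx, gy)
    scores = cv2.matchTemplate(grad, tmpl_grad, cv2.TM_CCOEFF_NORMED)
    _, best, _, loc = cv2.minMaxLoc(scores)
    x, y = loc
    h, w = tmpl_depth.shape
    roi = depth[y:y + h, x:x + w].astype(np.float32)
    valid = (roi > 0) & (tmpl_depth > 0)
    # Offset-free depth comparison: subtract each patch's median so the check measures
    # shape consistency rather than absolute distance to the camera.
    depth_err = np.mean(np.abs((roi[valid] - np.median(roi[valid]))
                               - (tmpl_depth[valid] - np.median(tmpl_depth[valid]))))
    return (x, y, best) if best > 0.5 and depth_err < depth_tol else None
```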
  • 34. AR @ U-VR Lab 2011: Depth-assisted 3D Object Detection for AR (Nov. 30, Session 5) (demo videos). Robust detection with different lighting conditions and scales: robust detection under varying lighting conditions; detection of the scale difference between two similar objects. Multiple 3D object detection: texture-less 3D object detection and pose estimation; multiple-target detection in real time. Available at: http://youtu.be/TgnocccmS7U. W. Lee, N. Park, W. Woo, "Depth-assisted Real-time 3D Object Detection for Augmented Reality," ICAT2011, 2011.
  • 35. AR @ U-VR Lab 2011 Texture-less 3D object Tracking with RGB-D Cam Object training while tracking: start without known 3D model Stabilization using color image as well as depth map Depth map enhancement around noisy boundary and surface Y. Park, V. Lepetit and W.Woo, “Texture-Less Object Tracking with Online Training using An RGB-D Camera,” in Proc. ISMAR 2011, pp. 121- 126, Oct. 2011.
  • 36. AR @ U-VR Lab 2011: Texture-less 3D Object Tracking with an RGB-D Camera. Tracking while training of texture-less objects (demo video): the video shows tracking of texture-less objects that are difficult to track with conventional keypoint-based methods; tracking begins without a known 3D object model. Y. Park, V. Lepetit and W. Woo, "Texture-Less Object Tracking with Online Training using An RGB-D Camera," in Proc. ISMAR 2011, pp. 121-126, Oct. 2011.
  • 37. AR @ U-VR Lab 2011 In situ 3D Modeling for wearable AR
  • 38. Interaction @ U-VR Lab 2009-10: Two-handed tangible interactions for augmented blocks. Cubical-user-interface-based tangible interactions: a screw-driving (SD) method for free positioning and a block-assembly (BA) method using pre-knowledge. Augmented assembly guidance: preliminary and interim guidance in BA. (Figures: SD sequence; BA sequence.) H. Lee, M. Billinghurst, and W. Woo, "Two-handed tangible interaction techniques for composing augmented blocks," Virtual Reality, Vol.15, No.2-3, pp.133-146, Jun. 2010.
  • 39. Interaction @ U-VR Lab 2009-10 Two-handed tangible interactions for augmented blocks AR Toy Car Making: Tangible Cube Interface based Screw-driving interaction Screw-Driving technique is based on the real world condition where two or more real objects are joined together using a screw and screw-driver. Supporting axis change by the help of additional button and visual hints for 3D positioning Link: http://youtu.be/t0iVuNygqQw
  • 40. Interaction @ U-VR Lab 2010 An Empirical Evaluation of Virtual Hand Techniques for 3D Object Manipulation Adopt Fitts’ law-based formal evaluation process Extend the design parameters of the 1D scale Fitts’ law to 3D scale Implement and compare standard TAR manipulation techniques CUP method PADDLE method ≈ CUBE method Ex_PADDLE method Taejin Ha, Woontack Woo, "An Empirical Evaluation of Virtual Hand Techniques for 3D Object Manipulation in a Tangible Augmented Reality Environment," IEEE 3D User Interfaces, pp. 91-98, 2010.
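For illustration, a minimal Python sketch of the Fitts'-law-style analysis this slide refers to: compute an index of difficulty (ID) from target distance and size and fit movement time MT = a + b*ID. Using Euclidean distance and target diameter for the 3D extension is my simplification; the paper's actual 3D parameterization may differ, and the numbers below are made-up trials.

```python
# Sketch: Fitts' law (Shannon formulation) fit over selection trials.
import numpy as np

def index_of_difficulty(distance, width):
    """ID = log2(D / W + 1), with D the target distance and W its size."""
    return np.log2(distance / width + 1.0)

def fit_fitts(distances, widths, times):
    """Fit MT = a + b * ID by least squares; returns (a, b)."""
    ids = index_of_difficulty(np.asarray(distances, float), np.asarray(widths, float))
    b, a = np.polyfit(ids, np.asarray(times, float), 1)
    return a, b

# Hypothetical (distance [m], width [m], movement time [s]) trials of a virtual-hand task.
a, b = fit_fitts([0.2, 0.4, 0.6], [0.05, 0.05, 0.05], [0.62, 0.81, 0.95])
print(f"MT = {a:.3f} + {b:.3f} * ID")
```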
  • 41. Interaction @ U-VR Lab 2011: An Interactive 3D Movement Path Manipulation Method. Control-point allocation test: the method properly generates a 3D movement path. Dynamic selection method: effectively selects small and densely packed control points. Taejin Ha, Mark Billinghurst, Woontack Woo, "An Interactive 3D Movement Path Manipulation Method in an Augmented Reality Environment," Interacting with Computers, 2011 (in press).
  • 42. Interaction @ U-VR Lab 2010-11: An Empirical Evaluation of Virtual Hand Techniques, and 3D Path Manipulation (demo videos). Virtual hand: affordance could enhance usability by promoting the user's understanding; instant triggering (e.g., button input) could help rapid manipulation; selection can be made easier by expanding the selection area. 3D path manipulation: a movement path can be constructed using only a small number of control points, and can be rapidly manipulated with relatively reduced hand and arm movements using an increased effective distance. Taejin Ha, Woontack Woo, "An Empirical Evaluation of Virtual Hand Techniques for 3D Object Manipulation in a Tangible Augmented Reality Environment," IEEE 3D User Interfaces, pp. 91-98, 2010. Taejin Ha, Mark Billinghurst, Woontack Woo, "An Interactive 3D Movement Path Manipulation Method in an Augmented Reality Environment," Interacting with Computers, 2011.
  • 43. Interaction @ U-VR Lab 2011: ARWand, Phone-based 3D Object Manipulation in AR. Exploits a 2D touch screen, a 3-DOF accelerometer, and compass sensor information to manipulate 3D objects in 3D space; designs transfer functions that map the control space of the mobile phone to the AR display space. Taejin Ha, Woontack Woo, "ARWand: Phone-based 3D Object Manipulation in Augmented Reality Environment," ISUVR, pp. 44-47, 2011.
  • 44. Interaction @ U-VR Lab 2011 ARWand: Phone-based 3D Object Manipulation in AR Experiment and application ◦ Low control-to-display gain: a sophisticated translation could be possible but this requires a significant amount of clutching ◦ High gain could reduce the frequent clutching, but accurate manipulation could be difficult ◦ Therefore, we need to consider an optimal control function that satisfies both fast and accurate manipulation Taejin Ha, Woontack Woo, "ARWand: Phone-based 3D Object Manipulation in Augmented Reality Environment," ISUVR, pp. 44-47, 2011.
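To illustrate the gain trade-off discussed above, here is a hedged Python sketch of one possible velocity-dependent control-to-display transfer function (low gain for slow, precise motion; higher gain for fast motion to reduce clutching). The function names, thresholds, and gain values are illustrative assumptions, not the paper's.

```python
# Sketch: velocity-dependent control-to-display gain for phone-based 3D manipulation.
def cd_gain(speed, g_min=0.5, g_max=4.0, v_lo=0.02, v_hi=0.30):
    """speed: phone motion speed (m/s); returns a control-to-display gain."""
    if speed <= v_lo:
        return g_min                       # precise mode: small, accurate displacements
    if speed >= v_hi:
        return g_max                       # fast mode: large displacements, less clutching
    t = (speed - v_lo) / (v_hi - v_lo)     # linear blend in between
    return g_min + t * (g_max - g_min)

def map_to_display(delta_phone, speed):
    """Map a phone-space displacement (dx, dy, dz) to the AR display space."""
    g = cd_gain(speed)
    return tuple(g * d for d in delta_phone)

print(map_to_display((0.01, 0.0, 0.02), speed=0.25))
```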
  • 45. Interaction @ U-VR Lab 2011: Graphical Menus using a Mobile Phone for Wearable AR Systems. Classifying focusable menus via a mobile phone with a stereo HMD: display-referenced (DR), manipulator-referenced (MR), and target-referenced (TR) placement. H. Lee, D. Kim, and W. Woo, "Graphical Menus using a Mobile Phone for Wearable AR systems," in Proc. ISUVR 2011, pp.55-58, Jul. 2011.
  • 46. Interaction @ U-VR Lab 2011: Graphical Menus using a Mobile Phone for Wearable AR Systems. Wearable menus on three focusable elements: based on previous menu work, display-, manipulator-, and target-referenced menu placement is determined according to the focusable elements within a wearable AR system; it is implemented using a mobile phone with a stereo head-mounted display. Link: http://youtu.be/TVrE5ljlCYI
  • 47. CAMAR 2009-10: Mobile AR, WHERE to augment? Concept: context-aware annotation (H. Kim); plan recognition (Y. Jang); multi-page recognition (J. Park); LBS + mobile AR (W. Lee). [Paper] Y. Jang and W. Woo, "Stroke-based semi-automatic region of interest detection for in-situ painting recognition," 14th International Conference on Human-Computer Interaction (HCII 2011), Jul. 9-14, Orlando, USA, accepted. [Patent] W. Woo, Y. Jang, "Semi-automatic region-of-interest detection algorithm using line-drawing interaction for in-situ painting recognition," 2010 (application pending).
  • 48. CAMAR: Context-aware mobile AR. How to make CAMAR apps more useful? Impractical AR: 3D models placed over a webcam feed with little or no interactivity; layered animation with little or no feedback; MAR that relies solely on GPS, compass, and accelerometer input; MAR where geo-tagging doesn't serve an everyday purpose. Useful AR: an engaging, persistent experience for the user; [LBS + SNS + MAR] drawing from a large DB with customization features.
  • 49. CAMAR 2.0: Context-aware mobile AR. (Diagram: sharing and mashup with direct, reflective, and planned responses.)
  • 50. Context Awareness @ U-VR Lab 2010: Context-aware Microblog Browser. Observes the properties of microblogs through large-scale data analysis; proposes a method that retrieves user-related hot topics from microblogs. (Figure: architecture in which user context inference and local microblog retrieval feed local hot-topic detection based on TF comparison with global data and previous local data, followed by hot-topic categorization, preference inference from the user's and friends' microblogs, selection with re-ranking, and visualization of real-time local hot topics.) J. Han, X. Xie, and W. Woo, "Context-based Local Hot Topic Detection for Mobile User," in Proc. of Adjunct Pervasive 2010, pp.001-004, May 2010.
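A toy Python sketch of the TF-comparison idea (my own simplification, not the paper's scoring): a term scores highly as a local hot topic when its frequency in recent local posts is large relative to global data and to the previous local window. All names and the scoring formula are assumptions.

```python
# Sketch: term-frequency comparison for local hot-topic detection.
from collections import Counter

def hot_topics(local_recent, local_previous, global_posts, top_k=5, eps=1e-6):
    """Each argument is a list of tokenized posts (lists of words)."""
    tf_local = Counter(w for post in local_recent for w in post)
    tf_prev  = Counter(w for post in local_previous for w in post)
    tf_glob  = Counter(w for post in global_posts for w in post)
    n_local = sum(tf_local.values()) + eps
    n_glob  = sum(tf_glob.values()) + eps
    scores = {}
    for w, c in tf_local.items():
        rel_local = c / n_local                      # how prominent locally
        rel_glob  = tf_glob[w] / n_glob              # how prominent globally
        burst     = (c + eps) / (tf_prev[w] + eps)   # growth vs. previous local window
        scores[w] = (rel_local / (rel_glob + eps)) * burst
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```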
  • 51. Context Awareness @ U-VR Lab 2010: Context-aware Microblog Browser (demo video). Dependence of microblogs and context: user history is the most influential factor for user interest; location and the user's social relationships are also important, and local social networking matters more than either. Microblog mobile browser: gathers user contexts from a mobile phone, detects real-time local hot topics from microblogs, and selects hot topics related to the user's preference and activity. J. Han, X. Xie, and W. Woo, "Context-based Local Hot Topic Detection for Mobile User," in Proc. of Adjunct Pervasive 2010, pp.001-004, May 2010.
  • 52. Context Awareness @ U-VR Lab 2011: Adaptive Content Recommendation. Recommends user-preferred content; retrieves content efficiently using a hierarchical context model. J. Han, H. Schmidtke, X. Xie, and W. Woo, "Adaptive Content Recommendation using Hierarchical Context Model with Granularity for Mobile Consumer," Pers. Ubiqu. Comput., 2012 (submitted).
  • 53. Context Awareness @ U-VR Lab 2011: Adaptive Content Recommendation (demo video). Hierarchical context model: a collection of directed acyclic graphs that represents partial-order relations and captures subtag-supertag hierarchies. Content recommender using the context model: retrieves tags related to the retrieved photos, builds a tag cloud with the DAG structure, collects tags and investigates their frequency, and displays them with different font sizes. J. Han, H. Schmidtke, X. Xie, and W. Woo, "Adaptive Content Recommendation using Hierarchical Context Model with Granularity for Mobile Consumer," Pers. Ubiqu. Comput., 2012 (submitted).
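For illustration, a small Python sketch of a subtag-to-supertag DAG that widens a query to coarser granularity when too few items match; the class, method names, and widening policy are assumptions, not the paper's model.

```python
# Sketch: hierarchical context model as a DAG, with granularity-based query widening.
class ContextDAG:
    def __init__(self):
        self.parents = {}                                  # subtag -> set of supertags

    def add(self, subtag, supertag):
        self.parents.setdefault(subtag, set()).add(supertag)

    def ancestors(self, tag):
        seen, stack = set(), [tag]
        while stack:
            for p in self.parents.get(stack.pop(), ()):
                if p not in seen:
                    seen.add(p)
                    stack.append(p)
        return seen

    def recommend(self, items, user_tags, min_hits=3):
        """items: {item: set(tags)}. Widen tags up the DAG until enough items match."""
        query = set(user_tags)
        while True:
            hits = [i for i, tags in items.items() if tags & query]
            widened = query | {a for t in query for a in self.ancestors(t)}
            if len(hits) >= min_hits or widened == query:
                return hits
            query = widened

# Hypothetical hierarchy: "gwangju" is a subtag of "korea", which is a subtag of "asia".
dag = ContextDAG(); dag.add("gwangju", "korea"); dag.add("korea", "asia")
photos = {"p1": {"gwangju"}, "p2": {"korea"}, "p3": {"asia"}}
print(dag.recommend(photos, {"gwangju"}, min_hits=2))
```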
  • 55. CAMAR @ U-VR Lab 2009: CAMAR Tag Framework, Context-Aware Mobile Augmented Reality for Dual-reality Linkage. A novel tag concept that attaches a tag to an object as a reference point in dual reality for sharing information. H. Kim, W. Lee and W. Woo, "CAMAR Tag Framework: Context-Aware Mobile Augmented Reality Tag Framework for Dual-reality Linkage," in ISUVR 2009, pp.39-42, July 2009.
  • 56. CAMAR @ U-VR Lab 2010 Real and Virtual Worlds Linkage through Cloud-Mobile Convergence Consider opportunities and requirements for dual world linkage through CMCVR Implement an object-based linkage module prototype on a mobile phone Evaluate results of obtained 3D points normalization A Model of Real and Virtual Worlds Linkage through CMCVR Object modeling from real to virtual world Content authoring from virtual to real world H. Kim and W.Woo, “Real and Virtual Worlds Linkage through Cloud-Mobile Convergence”, in Virtual Reality Workshop (CMCVR), pp.10-13, March. 2010.
  • 57. CAMAR @ U-VR Lab 2010: Real and Virtual Worlds Linkage through Cloud-Mobile Convergence (demo videos). Poster linkage from real to virtual world: ubiHome, a smart-home test bed, and its virtual 3D counterpart, with two-dimensional objects such as posters. Dual art galleries: an art-gallery test bed and a virtual 3D art gallery, with two-dimensional objects such as structure shapes and picture frames. H. Kim and W. Woo, "Real and Virtual Worlds Linkage through Cloud-Mobile Convergence," in Virtual Reality Workshop (CMCVR), pp.10-13, March 2010.
  • 58. CAMAR @ U-VR Lab 2010: Barcode-assisted Planar Object Tracking for Mobile AR. Embeds information about a planar object into a barcode; the information is used to limit the image regions in which keypoint matching is performed between consecutive frames. Tracking by detection (mobile): barcode detection + natural-feature tracking. N. Park, W. Lee and W. Woo, "Barcode-assisted Planar Object Tracking Method for Mobile Augmented Reality," in Proc. ISUVR 2011, pp.40-43, July 2011. Demo: http://www.youtube.com/watch?feature=player_profilepage&v=nho4y2yoASo, Barcode-assisted Planar Object Tracking Method for Mobile Augmented Reality, GIST CTI.
  • 59. CAMAR @ U-VR Lab 2010: 2D Detection/Recognition for Mobile Tagging. Semi-automatic ROI detection for the painting region, robust to illumination and view direction/distance changes; fast recognition based on Local Binary Pattern (LBP) codes; in-situ code enrollment for a newly detected painting, handling paintings of various sizes. (Figure: pipeline of ROI detection (rectangular shape), LBP code extraction, code matching by Hamming distance, and updating the code DB with new painting codes; ROI = Region of Interest, LBP = Local Binary Pattern.) Y. Jang and W. Woo, "A Stroke-based Semi-automatic ROI Detection Algorithm for In-Situ Painting Recognition," HCII2011, Orlando, Florida, USA, July 9-14, 2011 (LNCS).
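As a rough Python sketch of the recognition half of that pipeline (not the authors' implementation): compute 8-neighbour LBP codes over the detected painting region, binarize the LBP histogram into a fixed-length signature, and match against the database by Hamming distance, enrolling the painting as new when no match is close enough. The signature construction and the max_dist threshold are assumptions.

```python
# Sketch: LBP-code signatures matched by Hamming distance for painting recognition.
import numpy as np

def lbp_image(gray):
    """8-neighbour local binary pattern codes for an 8-bit grayscale image."""
    g = gray.astype(np.int16)
    c = g[1:-1, 1:-1]
    neigh = [g[0:-2, 0:-2], g[0:-2, 1:-1], g[0:-2, 2:], g[1:-1, 2:],
             g[2:, 2:],     g[2:, 1:-1],   g[2:, 0:-2], g[1:-1, 0:-2]]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neigh):
        code |= ((n >= c).astype(np.uint8) << bit)
    return code

def lbp_signature(gray):
    """256-bit binary signature from the thresholded LBP histogram (a simplification)."""
    hist = np.bincount(lbp_image(gray).ravel(), minlength=256).astype(float)
    hist /= hist.sum()
    return (hist > hist.mean()).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

def recognize(query_sig, db, max_dist=40):
    """db: {painting_id: signature}; return the best match, or None (then enroll as new)."""
    best = min(db.items(), key=lambda kv: hamming(query_sig, kv[1]), default=None)
    return best[0] if best and hamming(query_sig, best[1]) <= max_dist else None
```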
  • 60. CAMAR @ U-VR Lab 2010: 2D Detection/Recognition for Mobile Tagging (demo videos). Stroke-based ROI detection/recognition [1]: semi-automatic ROI detection for the painting region; robust to illumination and view-direction changes; fast recognition based on Local Binary Patterns (LBP). ROI detection/recognition: touch-triggered painting detection/recognition; robust to view-distance changes; in-situ painting code generation/enrollment. [1] http://www.youtube.com/watch?feature=player_detailpage&v=pGp-L2dbcYU
  • 61. CAMAR @ U-VR Lab 2010: In Situ Video Tagging on Mobile Phones. In-situ planar target learning on mobile phones; sensor-based automatic fronto-parallel view generation; fast vanishing-point computation. (Figure: pipeline of input image, horizontal vanishing-point estimation, fronto-parallel view generation, target learning on the mobile GPU, and real-time detection.) W. Lee, Y. Park, V. Lepetit, W. Woo, "In-Situ Video Tagging on Mobile Phones," Circuit Systems and Video Technology, IEEE Trans. on, Vol. 21, No. 10, pp. 1487-1496, 2011. W. Lee, Y. Park, V. Lepetit, W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," ISMAR10, pp. 57-64, 2010.
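The fronto-parallel view generation step amounts to rectifying the target plane with a homography. A minimal Python/OpenCV sketch is shown below; here the target's image quad is simply given, whereas the paper derives it automatically from sensor readings and vanishing points, so that input is an assumption.

```python
# Sketch: warp a planar target's image quad to a fronto-parallel patch for target learning.
import cv2
import numpy as np

def fronto_parallel_patch(frame, quad, out_size=(256, 256)):
    """quad: 4x2 image corners of the planar target in order (TL, TR, BR, BL)."""
    w, h = out_size
    dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    H = cv2.getPerspectiveTransform(np.float32(quad), dst)   # plane-to-rectified-view homography
    return cv2.warpPerspective(frame, H, out_size)
```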
  • 62. CAMAR @ U-VR Lab 2010: In Situ Video Tagging on Mobile Phones (demo videos). In-situ augmentation of real-world objects: augmentation without a pre-trained database; fast target learning in a few seconds; real-time detection from novel viewpoints. Vertical target learning & detection: learning a vertical target from an arbitrary viewpoint; vanishing-point-based fronto-parallel view generation; real-time detection from unseen viewpoints. Available at: http://youtu.be/vaaFhvfwet8. W. Lee, Y. Park, V. Lepetit, W. Woo, "In-Situ Video Tagging on Mobile Phones," Circuit Systems and Video Technology, IEEE Trans. on, Vol. 21, No. 10, pp. 1487-1496, 2011. W. Lee, Y. Park, V. Lepetit, W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," ISMAR10, pp. 57-64, 2010.
  • 63. CAMAR @ U-VR Lab 2011: Interactive Annotation on Mobile Phones for Real and Virtual Space Registration. Allows the user to quickly capture the dimensions of a room; operates at interactive frame rates on a mobile device and provides simple touch interaction; serves as anchors for linking virtual information to the real space represented by the room. H. Kim, G. Reitmayr and W. Woo, "Interactive Annotation on Mobile Phones for Real and Virtual Space Registration," in Proc. ISMAR 2011, pp.265-266, Oct. 2011.
  • 64. CAMAR @ U-VR Lab 2011: Interactive Annotation on Mobile Phones for Real and Virtual Space Registration (demo videos). Demo #1, in an office room and a seminar room: capture the dimensions of the room and annotate virtual content on rectangular areas of the room's surface. Demo #2, in an art gallery: load an AR zone-based model, approximated as a room, and annotate virtual content on rectangular areas of the room's surface. YouTube share link: http://www.youtube.com/watch?v=I00I-phmPbI
  • 65. CAMAR @ U-VR Lab 2011 In-situ AR Mashup for AR Content Authoring Easily create AR contents from Web contents Context-based content recommendation  User-similarity, item similarity, social relationship Configure AR content sharing setting  To Whom, When, in What conditions H.Yoon and W.Woo, “CAMAR Mashup: Empowering End-user Participation in U-VR Environment,” in Proc. ISUVR 2009, pp.33-36, July. 2009. (Best Paper Award) H.Yoon and W.Woo, “Concept and Applications of In-situ AR Mashup Content,” in Proc. SCI 2011, pp. 25-30, Sept. 2011.
  • 66. CAMAR @ U-VR Lab 2011 In-situ AR Mashup for AR Content Authoring In-situ Content Mashup • Extract query keywords based on context of object • Content recommendation based on personal context and social context • Access related Flickr, Twitter, Picasa contents in-situ H.Yoon and W.Woo, “CAMAR Mashup: Empowering End-user Participation in U-VR Environment,” in Proc. ISUVR 2009, pp.33-36, July. 2009. (Best Paper Award) H.Yoon and W.Woo, “Concept and Applications of In-situ AR Mashup Content,” in Proc. SCI 2011, pp. 25-30, Sept. 2011.
  • 67. Application Usage Prediction for Smartphones. Personalized application prediction based on context; a dynamic home screen with app recommendation and highlighting. (Figure: procedure driven by application usage frequency: data collection (sensory information, data recording), pre-processing (formatting, filtering, merging, discretization), feature selection (WrapperSubset, cfsSubClass, GTT, MFU/MRU), and training & prediction (Bayesian model, SVM/C4.5).)
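A toy Python sketch of the final training-and-prediction step (illustrative only; the slide's pipeline uses Bayesian, SVM, and C4.5 models over richer features): count app launches per discretized context and rank apps with a smoothed Naive-Bayes-style score. The context labels and smoothing are assumptions.

```python
# Sketch: context-conditioned app ranking with a smoothed Naive-Bayes-style score.
from collections import defaultdict

class AppPredictor:
    def __init__(self, alpha=1.0):
        self.alpha = alpha
        self.counts = defaultdict(lambda: defaultdict(int))   # app -> (context key, value) -> count
        self.app_totals = defaultdict(int)

    def record(self, app, context):
        """context: dict such as {'hour': 'morning', 'place': 'office'} (labels are assumptions)."""
        self.app_totals[app] += 1
        for k, v in context.items():
            self.counts[app][(k, v)] += 1

    def rank(self, context, top_k=4):
        total = sum(self.app_totals.values()) or 1
        scores = {}
        for app, n in self.app_totals.items():
            score = n / total                                  # prior P(app) from usage frequency
            for k, v in context.items():
                score *= (self.counts[app][(k, v)] + self.alpha) / (n + 2 * self.alpha)
            scores[app] = score
        return sorted(scores, key=scores.get, reverse=True)[:top_k]

predictor = AppPredictor()
predictor.record("maps", {"hour": "morning", "place": "street"})
predictor.record("mail", {"hour": "morning", "place": "office"})
print(predictor.rank({"hour": "morning", "place": "office"}))
```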
  • 68. Outline Paradigm Shift : DigiLog with AR & Ubiquitous VR Digilog Applications and U-VR Core U-VR 2.0: What’s Next? Summary and Q&A
  • 69. U-VR 2.0 for the eco-system. Dual space {R, R’}. (Diagram: real environments (RE) and virtual environments (VE) linked between the real space R and its virtual counterpart R’.)
  • 70. What’s Next? Where is this headed? Computing in the next 5-10 years: nomadic human: desktop-based UI -> Augmented Reality; smart space: intelligence for a user -> wisdom for the community => <STANDARD>; responsive content: personal emotion -> social fun => <Social Issues>. Augmented Content is the King, and Context is the queen consort controlling the King!
  • 71. AR Standard. Interoperability (standards): W3C: HTML5 (ETRI), http://www.w3.org/2010/06/w3car/report.html; ISO/IEC JTC1 SC24: WG6, 7, 8 & WG9 (new, on AR), X3D (KU), XML (GIST); ISO/IEC JTC1 SC29: X3D (ETRI); Web3D: X3D (Fraunhofer); OGC: KML & ARML, KARML (GATECH). <Figure by H. Jeon @ ETRI>
  • 72. Social AR? Issues of Social AR Physical self along with a digital profile Unauthorized Augmented Advertising Privacy: Augmented Behavioral Targeting Safety: Physical danger Spam
  • 73. What’s NEXT? CAMAR 2.0 <Open + Participate + Share> LBS + In-situ Mashup + SNS + CAMAR for sustainable AR eco-system Wearable CAMAR 2.0
  • 74. Summary. Paradigm shift: DigiLog with AR & Ubiquitous VR; DigiLog applications and U-VR core; U-VR 2.0: what’s next? Summary and Q&A.
  • 75. Q&A “The future is already here. It is just not uniformly distributed” by William Gibson (SF writer) More Information Woontack Woo, Ph.D. Twitter: @wwoo_ct Mail: wwoo@gist.ac.kr Web: http://cti.gist.ac.kr ISUVR 2012 @ KAIST, Aug. 22 - 25, 2012