These slides use concepts from my (Jeff Funk) course, Analyzing Hi-Tech Opportunities, to analyze how light field technology is becoming economically feasible for an increasing number of applications. Light field cameras record the entire light field of a scene instead of a single focal plane. This capability enables users to change the focus of pictures after they have been taken and to record 3D data more easily. These features are becoming economically feasible because of rapid improvements in camera chips and micro-lens arrays (an example of micro-electro-mechanical systems, MEMS). They offer alternative ways to do 3D sensing for automated vehicles and augmented reality, and can enable faster data collection with telescopes.
Light Field Technology
1. Light Field Technology
Chen Zhao, Chen Zhi, John Allen Ray, Ung Guan Wah, Zhou Xintao
For information on other technologies, please see Jeff Funk’s slide share account
(http://www.slideshare.net/Funk98/presentations) or his book with Chris Magee:
Exponential Change: What drives it? What does it tell us about the future?
http://www.amazon.com/Exponential-Change-drives-about-future-
ebook/dp/B00HPSAYEM/ref=sr_1_1?ie=UTF8&qid=1398325920&sr=8-1&keywords=exponential+change
2. What is light field technology?
• Light passes through the main lens and then hits a special lens
• The special lens incorporates a compound lens known as a
micro-lens array (MLA)
• The MLA re-groups the sensor pixels (e.g. CCD, CMOS) into small groups
• Each group captures information on the direction, color, and luminosity of
millions of individual rays of light
• The light field (3D) is then computationally reconstructed
• http://www.youtube.com/watch?v=7babcK2GH3I
Lytro camera
Generalized conceptual
diagram of light field camera
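The post-capture refocus described above is typically implemented as a shift-and-sum over the sub-aperture views extracted behind the micro-lens array. A minimal sketch, assuming the views are already extracted; the function name, the `alpha` refocus parameter, and the integer-pixel shift are illustrative simplifications (real pipelines use sub-pixel interpolation):

```python
import numpy as np

def refocus(subapertures, alpha):
    """Synthetic refocus by shift-and-sum of sub-aperture images.

    subapertures: dict mapping (u, v) lens offsets to 2D grayscale arrays.
    alpha: refocus parameter; each view is shifted by alpha * (u, v)
    before averaging, which brings a chosen depth plane into focus.
    """
    acc = None
    for (u, v), img in subapertures.items():
        # Shift the view in proportion to its angular offset on the MLA
        shifted = np.roll(np.roll(img, int(round(alpha * u)), axis=0),
                          int(round(alpha * v)), axis=1)
        acc = shifted if acc is None else acc + shifted
    return acc / len(subapertures)
```

Varying `alpha` after capture sweeps the focal plane through the scene, which is why focus can be chosen "later."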
4. Timeline of Light Field Camera
(LFC)
1992 Plenoptic camera proposed in research
2006 Refocus Imaging (later Lytro)
2008 Foundation of Raytrix
2010 Raytrix R11, first commercial light field camera, €30,000, 3 megapixels
2011 Lytro launched, first consumer LFC, US$499, 1.1 megapixels
2014 Jan: Toshiba started sample shipments of a dual camera module
with depth data, US$50, 13 megapixels
2015 first half: Pelican Imaging shipments planned
6. T. Suzuki, “Challenges of Image-Sensor Development”, ISSCC, 2010
http://www.future-fab.com/documents.asp?d_ID=4926
Impact of changes in scale on the cost per pixel of camera chips
7. Microlens Array (MLA)
Modern fabrication methods:
• Photolithography based on
standard semiconductor
processing technology
• Feature sizes less than 1 mm and
often as small as 10 µm
Other applications:
• Coupling light into optical fibres
• Increasing the light collection
efficiency of CCD arrays
• Focusing light in digital projectors
• Concentrators for high-efficiency
photovoltaics
9. Mobile Software Refocus
2014 is becoming the year of mobile software refocus
Nokia Lumia https://refocus.nokia.com/
Sony XperiaTM Z2 (waterproof)
Samsung Galaxy S5
LG G Pro 2
Meizu MX3
10. Cameras - Reduction in size
Toshiba
• 18.0 mm x 12.0 mm x 4.65 mm
• Dual camera module
• 13 Megapixels
• CMOS Light Field sensor
• 500,000 microlenses (5X more
than Lytro)
• Module shipped in Jan 2014
• Sample US$50
11. Cameras - Reduction in size
Pelican Imaging
• Developed an extremely thin and
cheap light field camera module
• Height 3mm
• Array of small cameras
• Production cost US$20
• 8 Megapixels output
• A future Nokia Lumia is confirmed
to include this feature
http://www.youtube.com/watch?v=Nleclfgqn_U
13. A light-field picture contains more information about
depth than simple correspondence, which allows us to
capture the real world in unparalleled detail (4D).
Rendering complexity is independent of scene
complexity.
Processing is fast.
No need to worry about the focus of the scanner
lenses.
Why light field is better for 3D sensing
14. 1. Stereoscopic vision:
Currently the most common 3D sensing approach.
Passive range determination via stereoscopic vision uses
the disparity in viewpoints between a pair of near-identical
cameras to measure the distance to a subject of interest.
3D sensing Technology
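The disparity-to-depth relation behind stereoscopic ranging is the standard pinhole model Z = f·B/d. A small sketch (the function and parameter names are mine, not from any particular stereo library):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from stereo disparity: Z = f * B / d.

    focal_px: focal length expressed in pixels.
    baseline_m: separation between the two cameras, in meters.
    disparity_px: horizontal pixel shift of the subject between views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. a 700 px focal length, 10 cm baseline, 35 px disparity -> 2.0 m
```

The same relation shows why stereo accuracy degrades with distance: depth error grows quadratically as disparity shrinks toward sub-pixel values.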
15. 2. Structured light:
Replaces the stereoscopic vision sensor's second imaging sensor with
a projection component.
Similar to stereoscopic vision techniques, this approach takes
advantage of the known camera-to-projector separation to locate a
specific point between them and compute the depth with
triangulation algorithms.
3D sensing Technology
16. 3. Time of flight
An indirect system that obtains travel-time information by
measuring the delay or phase shift of a modulated optical signal
for all pixels in the scene.
The ToF sensor consists of an array of pixels, each of
which is capable of determining the distance to the scene.
3D sensing Technology
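The phase-shift measurement above converts to distance as d = c·Δφ/(4π·f_mod), where the factor of 2 hidden in the 4π accounts for the round trip. A per-pixel sketch (names are assumptions for illustration):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance from the phase shift of a modulated optical signal.

    d = c * delta_phi / (4 * pi * f_mod). The measurement is
    unambiguous only while delta_phi < 2*pi, i.e. out to
    c / (2 * f_mod) meters, after which the phase wraps around.
    """
    return SPEED_OF_LIGHT * phase_shift_rad / (4 * math.pi * mod_freq_hz)
```

For a typical 20 MHz modulation, the unambiguous range is about 7.5 m, which is why ToF systems trade modulation frequency against range.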
17. Comparison of 3D sensing technologies

| | Stereoscopic vision | Structured light | Time of flight (ToF) | Light field |
|---|---|---|---|---|
| Software complexity | High | High | Low | Low |
| Material cost | Low | High/Middle | Middle | Low |
| Response time | Middle | Slow | Fast | Fast |
| Low light | Weak | Light source dependent (IR or visible) | Good (IR, laser) | Good |
| Outdoor | Good | Weak | Fair | Good |
| Depth ("z") accuracy | cm | µm ~ cm | mm ~ cm | µm ~ cm |
| Range | Mid range | Very short range (cm) to mid range (4-6 m) | Short range (<1 m) to long range (~40 m) | Very short range (cm) to long range (~100 m) |

Applications: device control, 3D movies, 3D scanning
18. A 3D-vision gesture control system is a highly precise
and reliable user interface for interacting with any display
screen from any distance. Whether on a personal
computer, set-top box, television set, mobile device, game
console, digital sign, or interactive kiosk, the depth-tracking
software enables users to control onscreen
interaction with simple hand motions instead of a remote
control, keyboard, or touch screen.
3D sensing control
26. http://www.youtube.com/watch?feature=player_embedded&v=pky822zG4hM
• Without the benefit of clear natural sight,
such light field advances in AR are
extraordinarily helpful
• Ability to see more - inside of a patient
(diagnosis and therapy)
• Limitation - these types of displays are cumbersome
• New equipment such as transparent screens
- displaying information and graphics about
the patient's condition
- combining visualization and location-tracking
technology
• Medical AR technology compared with light
field, ultrasound, and location technology
27. AR using Light Field
3D mould
• More detail
• Bring it everywhere with you
28. AR Toolkit (VRML) vs. light field rendering
Reference: https://www.academia.edu/5470506/AN_AUGMENTED_REALITY_SYSTEM_BASED_ON_LIGHT_FIELDS
29. AR Toolkit (VRML) vs. light field rendering
Detailed processing
• Constant response time
• Shorter response time
Reference: https://www.academia.edu/5470506/AN_AUGMENTED_REALITY_SYSTEM_BASED_ON_LIGHT_FIELDS
30. Higher processing requirements
Traditional AR is nearing the peak of
inflated expectations
Breaking away from the
monitor & display
Level 0 - Physical world hyperlinking
Level 1 - Marker-based AR
Level 2 - Markerless AR
Level 3 - Augmented vision
http://www.sprxmobile.com/the-augmented-reality-hype-cycle/
31. Distribution of AR applications on mobile
• Increasing demand for mobile device applications
• Installed with a light field camera for fast response
• "Shoot and focus" later
http://www.augmentedplanet.com/2010/06/the-mobile-augmented-reality-competitive-landscape/
32. Representation Hologram Vs Light Field
a) Depicts the representation of a hologram.
b) and c) show two different representations of a
light field.
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
33. Light Field Mapping
• It is possible to transform a light field into a holographic
representation and vice versa.
• The holographic data representation is similar to a light field.
• "M" transforms the light field into a holographic representation.
• Method to extract depth from the input light field.
• If accurate depth information is available for the light field, it can
optionally be added to the input.
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
34. Light Field Mapping
• Depth Reconstruction from Light Fields
• Effects of Loss of Data from Hologram
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
35. Hologram Vs Light Field
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
36. Hologram Vs Light Field
• Hologram - an illustration of direct output of holographic
content on future generation holographic displays.
• Light field - far more efficient for conventional 2D frame buffer
displays.
• Light Field - the versatility and the power of transformation on
synthetic light fields, real light fields and digitally recorded
holograms.
• The rendered images can be evaluated directly from the
holographic representation or through light field rendering.
• Light field is capable of simulating different aperture sizes as well
as focal lengths - versatile displays
Future
• Holograms can be captured using a light field camera
• Takes advantage of the realism and detail-preserving benefits of
a real light field while giving the possibility of 3D output on a
holographic screen
37. AR Light Field Possible Improvements
• See-through displays
• New tracking sensors
• Interfaces and Interactions
40. Light-Field Telescope
Source: Jonathan Wedd, Jan van der Laan, Eric Lavelle, and
David Stoker, "A High-Magnification Light-Field Telescope
for Extended Depth-of-Field Biometric Imaging"
41. Light-Field Telescope
Takes images faster and allows refocusing after
capture
Increased magnification
Achieves large depth of field and high lateral image
resolution simultaneously
Captures different wavelengths
42. Light Field Microscope
A compact light field
microscope designed at
Stanford
Consists of an ordinary
research microscope and a
cooled scientific camera
A microlens array is
inserted
Source: Marc Levoy, Ren Ng, Andrew Adams, Matthew Footer, Mark
Horowitz. “Light Field Microscopy”. Stanford University
43. Light Field Microscope
Source: Marc Levoy, Ren Ng, Andrew Adams, Matthew Footer, Mark
Horowitz. “Light Field Microscopy”. Stanford University
44. Light Field Microscope
Captures light fields of biological
specimens in a single snapshot
Offers 3D functional imaging of neuronal
activity in entire organisms at the single-cell level
Separates image acquisition from the
selection of viewpoint and focus
Captures video of high speed moving
specimens
46. Autonomous Vehicle
Applications
Consumer
Passenger Vehicles
Agricultural Vehicles
Military
Combat Vehicles
Logistics/Supplies
Search & Rescue Vehicles
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
47. LFC Vs. Laser Based Systems
Advantages:
Easier data interpretation
Significantly Lower Cost
Lower Power Requirement
Comparable performance
Disadvantages:
Smaller field of view (multiple cameras required)
D. Stavens, “LEARNING TO DRIVE: PERCEPTION FOR AUTONOMOUS CARS”, 2011
http://purl.stanford.edu/pb661px9942
48. Light Field Depth Map
M.Tao, et al, “Depth from Combining Defocus and Correspondence Using Light-Field Cameras”, 2013
http://www.cs.berkeley.edu/~ravir/lightfield_ICCV.pdf
49. Light Field Terrain Analysis
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
50. Light Field Terrain Analysis
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
51. Future of Light Field
We have analyzed the advantages of light field technology
We have shown applications of light field in cameras, mobile
phones, 3D scanning, AR, scientific research, and
autonomous vehicles
Future applications of light field will enable simpler
and faster 3D applications