LivePhantom: retrieving virtual world light data to real environments

Shadows play an important role in achieving realistic Augmented Reality (AR) by creating a 3D impression of a scene, and casting virtual shadows on both real and virtual objects is an active topic of research in this area. In this paper, we propose a new method for creating complex AR indoor scenes that uses real-time depth detection to cast virtual shadows on virtual and real environments. A Kinect camera produces a depth map of the physical scene, which is merged into a single real-time transparent tacit surface. Once this surface is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene, enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is demonstrated, and the findings are assessed with qualitative and quantitative methods, making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.
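
The abstract describes a depth-driven pipeline: a Kinect depth map is turned into a reconstructed "phantom" surface of the real scene, the camera is tracked against that reconstruction, and the phantom then occludes virtual content and receives virtual shadows. As a rough illustration only (not the paper's implementation), the Python sketch below shows how a raw depth frame could be back-projected into such a phantom mesh; the intrinsics FX, FY, CX, CY are assumed Kinect-v1-style values and the helper names depth_to_points and phantom_mesh are hypothetical.

# Illustrative sketch only, not the authors' implementation: build a "phantom"
# mesh of the real scene from a Kinect-style depth frame. Rendered depth-only,
# such a mesh lets real surfaces occlude virtual objects and receive virtual shadows.
import numpy as np

FX, FY = 575.8, 575.8   # assumed focal lengths in pixels (typical Kinect v1, not from the article)
CX, CY = 319.5, 239.5   # assumed principal point

def depth_to_points(depth_mm):
    """Back-project an HxW depth image in millimetres to camera-space points of shape (H, W, 3)."""
    h, w = depth_mm.shape
    z = depth_mm.astype(np.float32) / 1000.0               # depth in metres
    u, v = np.meshgrid(np.arange(w), np.arange(h))          # pixel coordinates
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack([x, y, z])

def phantom_mesh(depth_mm):
    """Triangulate the depth grid over pixels with valid (non-zero) depth; returns (N, 3, 3) triangles."""
    pts = depth_to_points(depth_mm)
    valid = depth_mm > 0
    h, w = depth_mm.shape
    tris = []
    for r in range(h - 1):
        for c in range(w - 1):
            if valid[r, c] and valid[r, c + 1] and valid[r + 1, c] and valid[r + 1, c + 1]:
                a, b = pts[r, c], pts[r, c + 1]
                d, e = pts[r + 1, c], pts[r + 1, c + 1]
                tris.append((a, b, d))                       # upper-left triangle of the grid cell
                tris.append((b, e, d))                       # lower-right triangle of the grid cell
    return np.asarray(tris, dtype=np.float32)

if __name__ == "__main__":
    fake_depth = np.full((48, 64), 1500, dtype=np.uint16)   # a flat surface 1.5 m from the camera
    print(phantom_mesh(fake_depth).shape)                   # (5922, 3, 3): two triangles per grid cell

In a full AR renderer, such a mesh would typically be drawn with colour writes disabled, so that depth testing and shadow mapping treat the real geometry as part of the virtual scene; this is what allows virtual shadows to fall convincingly onto real objects.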

Bibliographic Details
Main Authors: Kolivand, H., Billinghurst, M., Sunar, M. S.
Format: Article
Language: English
Published: Public Library of Science, 2016
Subjects: QA75 Electronic computers. Computer science
Online Access:http://eprints.utm.my/id/eprint/71877/1/MohdShahrizalSunar2016_LivePhantomRetrievingVirtualWorld.pdf
http://eprints.utm.my/id/eprint/71877/
https://www.scopus.com/inward/record.uri?eid=2-s2.0-85002616472&doi=10.1371%2fjournal.pone.0166424&partnerID=40&md5=1ebb435bd961cfdf299c62e3f55ce972
id my.utm.71877
record_format eprints
spelling my.utm.71877 2017-11-22T12:07:35Z http://eprints.utm.my/id/eprint/71877/
citation Kolivand, H., Billinghurst, M. and Sunar, M. S. (2016) LivePhantom: retrieving virtual world light data to real environments. PLoS ONE, 11 (12). ISSN 1932-6203 (peer-reviewed article, application/pdf)
institution Universiti Teknologi Malaysia
building UTM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Teknologi Malaysia
content_source UTM Institutional Repository
url_provider http://eprints.utm.my/