Article: Illumination direction estimation for augmented reality using a surface input real valued output regression network

Title: Illumination direction estimation for augmented reality using a surface input real valued output regression network
Authors: Chow, CK; Yuen, SY
Keywords: Illuminant direction estimation; Neural network with functions as input; Surface input pattern
Issue Date: 2010
Citation: Pattern Recognition, 2010, v. 43 n. 4, p. 1700-1716
Abstract: Due to the low cost of capturing depth information, it is worthwhile to reduce illumination ambiguity by employing scene depth information. In this article, a neural computation approach is reported that estimates the illuminant direction from the scene reflectance map. Since the reflectance map recovered from the depth map and image is a variable-sized point cloud, we propose to parameterize it as a two-dimensional polynomial function. A novel network model is then presented for mapping from a continuous function (the reflectance map) to a vectorial output (the illuminant direction). Experimental results show that the proposed model works well on both synthetic and real scenes. © 2009 Elsevier Ltd. All rights reserved. (A code sketch of these two steps follows this summary block.)
Persistent Identifier: http://hdl.handle.net/10722/196677
ISSN: 0031-3203
2021 Impact Factor: 8.518
2020 SCImago Journal Rankings: 1.492
ISI Accession Number ID: WOS:000274954100042
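
The abstract above outlines two technical steps: fitting the variable-sized reflectance-map point cloud with a two-dimensional polynomial so that every scene yields a fixed-length representation, and regressing the illuminant direction from that representation. The record does not reproduce the paper's formulation, so the sketch below is only illustrative and rests on assumed choices: an ordinary least-squares fit of a total-degree-3 bivariate polynomial, and a random (untrained) linear head standing in for the paper's surface input regression network. The function name, the toy reflectance, and the degree are all hypothetical.

```python
import numpy as np

def fit_reflectance_polynomial(points, values, degree=3):
    """Least-squares fit of a bivariate polynomial to a point cloud.

    points : (N, 2) array of 2-D coordinates of the reflectance-map samples
    values : (N,) array of reflectance values at those points
    degree : total polynomial degree (an assumed choice, not from the paper)

    Returns coefficients of all monomials x**i * y**j with i + j <= degree.
    The key property: the output length is fixed regardless of the cloud
    size N, giving the regression network a constant-sized input.
    """
    x, y = points[:, 0], points[:, 1]
    monomials = [(i, j) for i in range(degree + 1)
                 for j in range(degree + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in monomials])
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coeffs

# Two clouds of different sizes yield equal-length coefficient vectors.
rng = np.random.default_rng(0)
cloud_a = rng.uniform(-1, 1, (500, 2))
cloud_b = rng.uniform(-1, 1, (120, 2))
toy = lambda p: np.cos(p[:, 0]) * np.cos(p[:, 1])   # stand-in reflectance
ca = fit_reflectance_polynomial(cloud_a, toy(cloud_a))
cb = fit_reflectance_polynomial(cloud_b, toy(cloud_b))
assert ca.shape == cb.shape   # fixed-size input for the network

# Illustrative (untrained) regression head: coefficients -> unit direction.
W = rng.normal(size=(3, ca.size))        # weights would be learned in practice
direction = W @ ca
direction /= np.linalg.norm(direction)   # illuminant direction on the sphere
```

The property the sketch is meant to exhibit is size invariance: whatever the number of depth samples, the downstream regressor always receives a coefficient vector of the same length.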

 

DC Field: Value
dc.contributor.author: Chow, CK
dc.contributor.author: Yuen, SY
dc.date.accessioned: 2014-04-24T02:10:33Z
dc.date.available: 2014-04-24T02:10:33Z
dc.date.issued: 2010
dc.identifier.citation: Pattern Recognition, 2010, v. 43 n. 4, p. 1700-1716
dc.identifier.issn: 0031-3203
dc.identifier.uri: http://hdl.handle.net/10722/196677
dc.description.abstract: Due to the low cost of capturing depth information, it is worthwhile to reduce illumination ambiguity by employing scene depth information. In this article, a neural computation approach is reported that estimates the illuminant direction from the scene reflectance map. Since the reflectance map recovered from the depth map and image is a variable-sized point cloud, we propose to parameterize it as a two-dimensional polynomial function. A novel network model is then presented for mapping from a continuous function (the reflectance map) to a vectorial output (the illuminant direction). Experimental results show that the proposed model works well on both synthetic and real scenes. © 2009 Elsevier Ltd. All rights reserved.
dc.language: eng
dc.relation.ispartof: Pattern Recognition
dc.subject: Illuminant direction estimation
dc.subject: Neural network with functions as input
dc.subject: Surface input pattern
dc.title: Illumination direction estimation for augmented reality using a surface input real valued output regression network
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1016/j.patcog.2009.10.008
dc.identifier.scopus: eid_2-s2.0-74449091092
dc.identifier.volume: 43
dc.identifier.issue: 4
dc.identifier.spage: 1700
dc.identifier.epage: 1716
dc.identifier.isi: WOS:000274954100042
dc.identifier.issnl: 0031-3203
