3D-HoloNet: fast, unfiltered, 3D hologram generation with camera-calibrated network learning
Appears in Collections: Article
| Title | 3D-HoloNet: fast, unfiltered, 3D hologram generation with camera-calibrated network learning |
|---|---|
| Authors | Zhou, Wenbin; Qu, Feifan; Meng, Xiangyu; Li, Zhenyang; Peng, Yifan |
| Issue Date | 5-Feb-2025 |
| Publisher | Optica Publishing Group |
| Citation | Optics Letters, 2025, v. 50, n. 4, p. 1188-1191 |
| Abstract | Computational holographic displays typically rely on time-consuming iterative computer-generated holographic (CGH) algorithms and bulky physical filters to attain high-quality reconstruction images. This trade-off between inference speed and image quality becomes more pronounced when aiming to realize 3D holographic imagery. This work presents 3D-HoloNet, a deep neural network-empowered CGH algorithm for generating phase-only holograms (POHs) of 3D scenes, represented as RGB-D images, in real time. The proposed scheme incorporates a learned, camera-calibrated wave propagation model and a phase regularization prior into its optimization. This unique combination allows for accommodating practical, unfiltered holographic display setups that may be corrupted by various hardware imperfections. Results tested on an unfiltered holographic display reveal that the proposed 3D-HoloNet can achieve 30 fps at full HD for one color channel using a consumer-level GPU while maintaining image quality comparable to iterative methods across multiple focused distances. |
| Persistent Identifier | http://hdl.handle.net/10722/361858 |
| ISSN | 0146-9592 |
| Journal Metrics | 2023 Impact Factor: 3.1; 2023 SCImago Journal Rankings: 1.040 |
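The abstract contrasts 3D-HoloNet with time-consuming iterative CGH algorithms. As background only, here is a minimal sketch of one classic iterative baseline that such networks aim to replace: Gerchberg–Saxton phase retrieval combined with angular-spectrum wave propagation. This is an illustrative NumPy sketch, not the authors' method; all function names, grid sizes, and optical parameters are assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field by distance z (meters) via the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)  # spatial frequencies along x (cycles/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # longitudinal wavenumber; evanescent components (negative argument) are clipped
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)  # transfer function of free space
    return np.fft.ifft2(np.fft.fft2(field) * H)

def gs_phase_only(target_amp, wavelength, dx, z, iters=20):
    """Toy Gerchberg-Saxton loop for a phase-only hologram of a single 2D plane."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iters):
        # forward-propagate the phase-only SLM field to the image plane
        recon = angular_spectrum_propagate(np.exp(1j * phase), wavelength, dx, z)
        # enforce the target amplitude, keep the reconstructed phase
        recon = target_amp * np.exp(1j * np.angle(recon))
        # back-propagate and discard amplitude: the SLM can only modulate phase
        back = angular_spectrum_propagate(recon, wavelength, dx, -z)
        phase = np.angle(back)
    return phase

# Example: a 64x64 square target, green laser, 8 um pixel pitch, 5 cm propagation
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
poh = gs_phase_only(target, wavelength=520e-9, dx=8e-6, z=0.05, iters=10)
```

Each iteration costs two FFT pairs, which is why running many iterations per frame (and per depth plane, for 3D RGB-D content) is slow compared to a single network forward pass; a learned, camera-calibrated propagation model further replaces the idealized transfer function `H` above with one fitted to the physical display.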
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Zhou, Wenbin | - |
| dc.contributor.author | Qu, Feifan | - |
| dc.contributor.author | Meng, Xiangyu | - |
| dc.contributor.author | Li, Zhenyang | - |
| dc.contributor.author | Peng, Yifan | - |
| dc.date.accessioned | 2025-09-17T00:31:16Z | - |
| dc.date.available | 2025-09-17T00:31:16Z | - |
| dc.date.issued | 2025-02-05 | - |
| dc.identifier.citation | Optics Letters, 2025, v. 50, n. 4, p. 1188-1191 | - |
| dc.identifier.issn | 0146-9592 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/361858 | - |
| dc.description.abstract | Computational holographic displays typically rely on time-consuming iterative computer-generated holographic (CGH) algorithms and bulky physical filters to attain high-quality reconstruction images. This trade-off between inference speed and image quality becomes more pronounced when aiming to realize 3D holographic imagery. This work presents 3D-HoloNet, a deep neural network-empowered CGH algorithm for generating phase-only holograms (POHs) of 3D scenes, represented as RGB-D images, in real time. The proposed scheme incorporates a learned, camera-calibrated wave propagation model and a phase regularization prior into its optimization. This unique combination allows for accommodating practical, unfiltered holographic display setups that may be corrupted by various hardware imperfections. Results tested on an unfiltered holographic display reveal that the proposed 3D-HoloNet can achieve 30 fps at full HD for one color channel using a consumer-level GPU while maintaining image quality comparable to iterative methods across multiple focused distances. | - |
| dc.language | eng | - |
| dc.publisher | Optica Publishing Group | - |
| dc.relation.ispartof | Optics Letters | - |
| dc.title | 3D-HoloNet: fast, unfiltered, 3D hologram generation with camera-calibrated network learning | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1364/OL.544816 | - |
| dc.identifier.volume | 50 | - |
| dc.identifier.issue | 4 | - |
| dc.identifier.spage | 1188 | - |
| dc.identifier.epage | 1191 | - |
| dc.identifier.eissn | 1539-4794 | - |
| dc.identifier.issnl | 0146-9592 | - |

