Article: Semantic-Aware Vision-Assisted Integrated Sensing and Communication: Architecture and Resource Allocation

Title: Semantic-Aware Vision-Assisted Integrated Sensing and Communication: Architecture and Resource Allocation
Authors: Lu, Yang; Mao, Weihao; Du, Hongyang; Dobre, Octavia A.; Niyato, Dusit; Ding, Zhiguo
Issue Date: 2024
Citation: IEEE Wireless Communications, 2024, v. 31, n. 3, p. 302-308
Abstract: Many intelligent (mobile) applications are driven by real-time environmental information, which may be unavailable at the core network and is challenging to transmit given the limited spectrum resource. This article proposes an innovative architecture, referred to as semantic-aware, vision-assisted integrated sensing and communication (SA-VA-ISAC), to enable real-time environmental information collection and transmission by integrating emerging paradigms and key technologies, including computer vision (CV), ISAC, mobile edge computing (MEC), semantic communications, and beamforming. First, CV and ISAC are employed to capture abundant environmental information, which is further aggregated at an MEC server. Second, semantic communications enable information compression to satisfy the stringent reliability and latency requirements, and beamforming provides high-quality wireless coverage. To facilitate resource allocation in the proposed architecture, deep learning (DL) is adopted for environmental information collection and aggregation, semantic encoder/decoder design, and beamforming design. Numerical results demonstrate the advantages of the proposed architecture and the DL-based resource allocation schemes.
Persistent Identifier: http://hdl.handle.net/10722/353142
ISSN: 1536-1284
2023 Impact Factor: 10.9
2023 SCImago Journal Rankings: 5.926
ISI Accession Number: WOS:001167547000001

 

DC Field: Value
dc.contributor.author: Lu, Yang
dc.contributor.author: Mao, Weihao
dc.contributor.author: Du, Hongyang
dc.contributor.author: Dobre, Octavia A.
dc.contributor.author: Niyato, Dusit
dc.contributor.author: Ding, Zhiguo
dc.date.accessioned: 2025-01-13T03:02:17Z
dc.date.available: 2025-01-13T03:02:17Z
dc.date.issued: 2024
dc.identifier.citation: IEEE Wireless Communications, 2024, v. 31, n. 3, p. 302-308
dc.identifier.issn: 1536-1284
dc.identifier.uri: http://hdl.handle.net/10722/353142
dc.language: eng
dc.relation.ispartof: IEEE Wireless Communications
dc.title: Semantic-Aware Vision-Assisted Integrated Sensing and Communication: Architecture and Resource Allocation
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/MWC.001.2300014
dc.identifier.scopus: eid_2-s2.0-85184314050
dc.identifier.volume: 31
dc.identifier.issue: 3
dc.identifier.spage: 302
dc.identifier.epage: 308
dc.identifier.eissn: 1558-0687
dc.identifier.isi: WOS:001167547000001
