
Underwater Optical ISAC System

Updated 13 November 2025
  • O-ISAC is a multifunctional underwater system that uses LED-based optical signals to simultaneously support communication, energy transfer, and 3D sensing.
  • The system optimizes time allocation (α) and camera array geometry to balance uplink data rates and target localization accuracy.
  • Rigorous modeling of channel impairments, including ship attitude variability, informs the design trade-offs for improved performance.

An underwater Optical Integrated Sensing and Communication (O-ISAC) system is a multi-functional architecture designed to jointly enable underwater communication, energy transfer, and 3D sensing using a common lightwave (e.g., LED-based) optical signal. The paradigm exploits the dual utility of optical waves for both informational and physical-layer tasks, allowing surface vessels to power and communicate with seabed sensors, while simultaneously performing target localization via camera-based sensing. Recent research has established rigorous models and closed-form expressions quantifying the impact of practical deployment factors, notably ship attitude variation, on system performance metrics such as target localization mean squared error (MSE) and achievable uplink data rate (Palitharathna et al., 5 Nov 2025).

1. Architecture and Functional Components

The O-ISAC configuration comprises a surface-ship–mounted Access Point (AP) and two principal underwater nodes:

  • Access Point (AP):
    • LED transmitter providing both downlink power transfer and sensing illumination.
    • Photodiode (PD) to receive uplink information from the seabed sensor.
    • An array of $M$ pinhole cameras for 3D localization of the underwater target.
  • Seabed Energy-Harvesting (EH) Sensor:
    • Photovoltaic (PV) cell to harvest incident optical energy during the downlink (DL) phase.
    • LED transmitter for uplink communication during the harvest-use phase.
  • Underwater Sensing Target:
    • Passive reflector that scatters incident AP light back toward the ship's camera array.

Operations are structured in frames of duration $T$, partitioned by the time-allocation parameter $\alpha$:

  • Downlink (DL) Lightwave Power Transfer (LPT) + Sensing: $\alpha T$ seconds for simultaneous energy transfer and target illumination/scattering.
  • Uplink (UL) Data Transmission: $(1-\alpha)T$ seconds for the EH sensor to transmit its backlog using the harvested energy (a minimal timing sketch follows).
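
As a minimal timing sketch, with purely illustrative values for $T$ and $\alpha$ (not values from the paper):

```python
T, alpha = 1.0, 0.55      # frame length [s] and time-allocation parameter (illustrative)
t_dl = alpha * T          # DL phase: lightwave power transfer + sensing illumination
t_ul = (1 - alpha) * T    # UL phase: EH sensor transmits using the energy harvested in t_dl
print(f"DL/sensing phase: {t_dl:.2f} s, UL phase: {t_ul:.2f} s")
```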

2. Channel Modeling and Ship Attitude Effects

Underwater optical channels exhibit both geometric and stochastic losses:

  • Channel Gain ($h$):

$$h = h_g\, h_p\, h_t$$

  • $h_g$: deterministic geometric loss (Lambertian emission, concentrator gain).
  • $h_p = e^{-c(\lambda)\,d}$: absorption/scattering loss over path length $d$ (Beer's law), with extinction coefficient $c(\lambda)$.
  • $h_t$: log-normal turbulence-induced fading.

For a direct LoS LED–PD link (distance $d$, irradiance angle $\theta$, PD incidence angle $\phi$):

$$h_g = \begin{cases} \dfrac{(m_1+1)\,A_p}{2\pi d^2}\,\cos^{m_1}(\theta)\,\cos(\phi)\,T\,c_1(\phi), & |\theta| \le \dfrac{\pi}{2} \\ 0, & \text{otherwise} \end{cases}$$

where $m_1 = -\ln 2/\ln(\cos\theta_{1/2})$ is the Lambertian order and $\theta_{1/2}$ the LED semi-angle at half power.

  • Ship Attitude: The roll ($\theta_R$), pitch ($\theta_P$), and yaw ($\theta_Y$) are modeled as independent Gaussian random variables, $\theta_K \sim \mathcal{N}(\mu_{\theta_K}, \sigma_{\theta_K}^2)$ for $K \in \{R, P, Y\}$. All AP elements (LED, PD, cameras) share the same orientation, so attitude fluctuations produce stochastic variation in the link angles and hence in $h_g$.
  • Alignment Factor: Under a small-angle approximation, the link orientation (cosine of the irradiance angle) becomes Gaussian, allowing attitude statistics to be mapped into channel-gain statistics. A minimal numerical sketch of this composite channel model is given below.
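
The following sketch evaluates the composite gain $h = h_g h_p h_t$ under Gaussian attitude-induced angle perturbations. All parameter values (aperture area, LED semi-angle, extinction coefficient, turbulence strength, attitude spread, link distance) are illustrative placeholders rather than values from the paper, and the filter/concentrator factor $T\,c_1(\phi)$ is lumped into a single constant.

```python
import numpy as np

# Illustrative placeholders (not values from the paper).
A_P        = 1e-4             # receiver aperture area [m^2]
THETA_HALF = np.radians(30.0) # LED semi-angle at half power
C_LAMBDA   = 0.15             # extinction coefficient c(lambda) [1/m]
SIGMA_X    = 0.10             # log-amplitude std of the turbulence fading
SIGMA_ATT  = np.radians(10.0) # std of the attitude-induced angle perturbations

def h_geometric(d, theta, phi, Tc1=1.0):
    """Lambertian geometric gain h_g; Tc1 lumps the filter/concentrator factor T*c_1(phi)."""
    if np.abs(theta) > np.pi / 2:
        return 0.0
    m1 = -np.log(2.0) / np.log(np.cos(THETA_HALF))
    return (m1 + 1) * A_P / (2 * np.pi * d**2) * np.cos(theta)**m1 * np.cos(phi) * Tc1

def h_total(d, theta, phi, rng):
    """h = h_g * h_p * h_t: geometric loss, Beer's-law path loss, log-normal fading."""
    h_p = np.exp(-C_LAMBDA * d)
    h_t = rng.lognormal(mean=-2 * SIGMA_X**2, sigma=2 * SIGMA_X)  # normalized so E[h_t] = 1
    return h_geometric(d, theta, phi) * h_p * h_t

# Monte-Carlo over Gaussian roll/pitch-style perturbations of the link angles.
rng = np.random.default_rng(0)
gains = [h_total(d=20.0,
                 theta=rng.normal(0.0, SIGMA_ATT),
                 phi=rng.normal(0.0, SIGMA_ATT),
                 rng=rng)
         for _ in range(10_000)]
print(f"mean gain {np.mean(gains):.3e}, normalized std {np.std(gains)/np.mean(gains):.2f}")
```

The empirical spread of these samples is what the alignment-factor transformation above characterizes analytically.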

3. Energy Harvesting and Communication Performance

The seabed EH sensor converts incident light during DL into stored energy for UL transmission:

  • Harvested Energy ($E_h$):

$$E_h = f\, v_t\, \alpha T\, r_{PV}\, h_{A,E}\, P_{\rm DL} \ln\!\left(1 + \frac{r_{PV}\, h_{A,E}\, P_{\rm DL}}{I_0}\right)$$

with fill factor $f$, thermal voltage $v_t$, PV responsivity $r_{PV}$, downlink channel gain $h_{A,E}$, AP transmit power $P_{\rm DL}$, and dark current $I_0$.
  • Uplink Power: $P_{UL} = \dfrac{E_h}{(1-\alpha)T}$
  • Uplink Rate Lower Bound ($R_{UL}$): For optical intensity channels,

$$R_{UL} = \frac{1-\alpha}{2} \log_2\!\left(1 + \frac{e}{2\pi}\,\frac{(r\, h_{E,A}\, P_{UL})^2}{\sigma_n^2}\right)$$

with PD responsivity $r$, uplink channel gain $h_{E,A}$, and AWGN noise variance $\sigma_n^2$.

  • Average Uplink Rate: Averaging over the channel and attitude random variables using Gaussian quadrature gives

$$R_{UL}^E \approx \frac{1-\alpha}{4\pi^{5/2}\sigma_{\rm eff}^2}\sum_{i=1}^{N_1}\sum_{j=1}^{N_3}\sum_{k=1}^{N_2} w_i\, W_j\, \tilde W_k \,\ldots$$

where the summations run over the quadrature nodes used to average the nonlinear rate expression, with weights $w_i$, $W_j$, and $\tilde W_k$ (a numerical sketch follows).
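
As a sanity-check sketch, the expressions above can be chained and then averaged over a Gaussian misalignment angle with one-dimensional Gauss–Hermite quadrature; the paper's closed form nests several such quadratures (hence the triple sum). Every device and channel parameter below (fill factor, thermal voltage, responsivities, dark current, noise variance, nominal gains, Lambertian order) is an illustrative placeholder, not a value from the paper.

```python
import numpy as np

# Placeholder device/channel parameters (illustrative, not from the paper).
F, V_T, R_PV, I_0 = 0.75, 0.025, 0.5, 1e-9  # PV fill factor, thermal voltage, responsivity, dark current
R_PD, SIGMA_N2    = 0.5, 1e-18              # PD responsivity, AWGN variance
P_DL, T_FRAME     = 10.0, 1.0               # AP optical power [W], frame length [s]
H_NOM, M1         = 1e-4, 4.8               # nominal DL/UL channel gain, Lambertian order

def harvested_energy(alpha, h_AE, T=T_FRAME, P_dl=P_DL):
    """E_h accumulated by the PV cell during the alpha*T downlink phase."""
    return F * V_T * alpha * T * R_PV * h_AE * P_dl * np.log1p(R_PV * h_AE * P_dl / I_0)

def uplink_rate(alpha, h_AE, h_EA, T=T_FRAME):
    """Lower bound R_UL achievable with the harvested energy spread over (1-alpha)*T."""
    P_ul = harvested_energy(alpha, h_AE, T) / ((1 - alpha) * T)
    snr  = np.e / (2 * np.pi) * (R_PD * h_EA * P_ul) ** 2 / SIGMA_N2
    return (1 - alpha) / 2 * np.log2(1 + snr)

def gauss_hermite_mean(g, mu, sigma, n=20):
    """E[g(X)] for X ~ N(mu, sigma^2) via Gauss-Hermite quadrature."""
    x, w = np.polynomial.hermite.hermgauss(n)
    return np.sum(w * g(mu + np.sqrt(2.0) * sigma * x)) / np.sqrt(np.pi)

# Average R_UL over a Gaussian irradiance-angle perturbation theta that scales both
# link gains through the cos^{m_1}(theta) factor of the geometric loss.
avg_rate = gauss_hermite_mean(
    lambda th: uplink_rate(0.55, H_NOM * np.cos(th) ** M1, H_NOM * np.cos(th) ** M1),
    mu=0.0, sigma=np.radians(10))
print(f"R_UL(alpha=0.55) = {uplink_rate(0.55, H_NOM, H_NOM):.3f} bit/s/Hz, "
      f"attitude-averaged = {avg_rate:.3f} bit/s/Hz")
```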

4. Optical Sensing and Target Localization

The AP deploys an array of pinhole cameras to estimate the 3D position of a passive underwater reflector:

  • Camera Projection Model: for camera $m$,

$$\mathbf p_{c,m} = Q_m^T(\mathbf p_S - t_m), \qquad z_{c,m}\begin{bmatrix} x_m \\ y_m \\ 1 \end{bmatrix} = K\,\mathbf p_{c,m}$$

where $Q_m$ is composed of the yaw, pitch, and roll rotations and $K$ is the image projection (intrinsic) matrix.

  • Pixel Observation Model: each camera measures

$$\hat x_m = x_m + e_{x,m}, \qquad \hat y_m = y_m + e_{y,m}$$

with an error variance that scales inversely with the reflected intensity $I_m^{\rm ref} = \rho_s\, h_{S,A,m}\, h_{A,S}\, P_{\rm DL}\, A_{\rm cam}$ (AP-to-target gain $h_{A,S}$, target-to-camera-$m$ gain $h_{S,A,m}$).

  • Least-Squares Localization:

The noisy pixel observations form an overdetermined linear system, solved in the least-squares sense as

$$\hat{\mathbf p}_{c,1} = (\hat\Sigma^T\hat\Sigma)^{-1}\hat\Sigma^T \gamma$$

and a coordinate transform then yields the position estimate in the global frame.

  • Localization Error Analysis:

The mean-squared error is defined as

$$\mathrm{MSE}_p = \mathbb E\!\left[\|\hat{\mathbf p}_S-\mathbf p_S\|^2\right]$$

and its closed-form expression ($\overline{\mathrm{MSE}}$) accounts for attitude and noise effects, revealing the dependence on link geometry, camera layout, and received intensity levels. A minimal numerical sketch of the projection and least-squares steps is given below.
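
The sketch below illustrates the projection model and a stacked least-squares solve playing the role of $\hat\Sigma$ and $\gamma$. For brevity each camera is perturbed by a yaw rotation only (the paper composes yaw, pitch, and roll into $Q_m$), and the intrinsics, baseline, target depth, and pixel-noise level are illustrative assumptions.

```python
import numpy as np

def rot_z(yaw):
    """Yaw rotation about z; pitch/roll rotations could be composed analogously into Q_m."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def projection_matrix(K, Q, t):
    """3x4 matrix mapping homogeneous world points to pixels: z [x, y, 1]^T = P [p_S; 1]."""
    return K @ np.hstack([Q.T, (-Q.T @ t).reshape(3, 1)])

def project(P, p_S):
    """Ideal (noise-free) pixel coordinates of world point p_S."""
    u = P @ np.append(p_S, 1.0)
    return u[:2] / u[2]

def triangulate(Ps, pixels):
    """Least-squares 3D estimate from the noisy pixel observations of all M cameras."""
    A, b = [], []
    for P, (x, y) in zip(Ps, pixels):
        A.append(x * P[2, :3] - P[0, :3])
        b.append(P[0, 3] - x * P[2, 3])
        A.append(y * P[2, :3] - P[1, :3])
        b.append(P[1, 3] - y * P[2, 3])
    return np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)[0]

rng = np.random.default_rng(1)
K   = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])  # illustrative intrinsics
p_S = np.array([2.0, -1.0, 30.0])                                            # true target position [m]

# M = 5 cameras spread along the hull, each with a small Gaussian attitude perturbation.
Ps = []
for dx in (-2.0, -1.0, 0.0, 1.0, 2.0):
    Q = rot_z(rng.normal(0.0, np.radians(10)))
    Ps.append(projection_matrix(K, Q, t=np.array([dx, 0.0, 0.0])))

pixels = [project(P, p_S) + rng.normal(0.0, 1.0, size=2) for P in Ps]  # 1-pixel observation noise
p_hat  = triangulate(Ps, pixels)
print("localization error [m]:", np.linalg.norm(p_hat - p_S))
```

Widening the baseline or adding cameras improves the conditioning of the stacked system, mirroring the spatial-diversity versus per-camera-intensity trade-off discussed in Section 5.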

5. Fundamental Communication–Sensing Trade-Off

  • Time Allocation ($\alpha$):
    • Increasing $\alpha$ boosts the harvested energy (improving UL power and rate) and lengthens the sensing exposure (reducing camera pixel noise), but shrinks the available UL transmission time.
    • Both the localization error ($\mathrm{MSE}_p$) and the UL rate ($R_{UL}$) are sensitive to $\alpha$; an optimal value exists that benefits both simultaneously.
  • Array Geometry: The number and spacing of cameras ($M$, $\Delta x_i$, $\Delta y_i$) control the trade-off between spatial diversity (precision) and per-camera signal strength.

6. Design Optimization and Guidelines

  • Optimal Camera Placement:
    • MSE is minimized via inter-camera offset selection:
    • Small spacing (low $\Delta_1$) yields poor localization geometry (high geometric dilution of precision).
    • Large spacing increases the per-camera channel loss.
    • Simulations indicate an optimal spacing of $\rho_x \approx 1.2$ m under $\sigma_\theta^2 = 10^\circ$.
  • Optimal Harvest-Use Ratio ($\alpha^*$):
    • $R_{UL}(\alpha)$ is unimodal; maximization yields $\alpha^* \approx 0.55$ (see the sweep sketch after this list).
    • This value also places the sensing MSE near its global minimum, supporting joint quality-of-service optimization.
  • Empirical Performance:
    • Achieves a minimum localization MSE of $10^{-2}\,\mathrm{m}^2$ with an appropriate camera configuration under moderate attitude fluctuation ($10^\circ$).
    • Recommended practices: select $\alpha \simeq 0.55$, deploy 5–9 cameras with 1–3 m inter-camera spacing, and apply the derived closed-form expressions to refine orientation and beam width for the desired system trade-off.
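
The harvest-use ratio can be tuned with a simple one-dimensional sweep of the Section 3 rate expression, as sketched below. Because every parameter here is an illustrative placeholder, the resulting maximizer demonstrates the unimodal shape of $R_{UL}(\alpha)$ rather than reproducing the reported $\alpha^* \approx 0.55$.

```python
import numpy as np

def uplink_rate(alpha, h_AE=1e-4, h_EA=1e-4, T=1.0, P_dl=10.0,
                f=0.75, v_t=0.025, r_pv=0.5, i_0=1e-9, r_pd=0.5, sigma_n2=1e-18):
    """R_UL lower bound from Section 3; every parameter value here is a placeholder."""
    e_h  = f * v_t * alpha * T * r_pv * h_AE * P_dl * np.log1p(r_pv * h_AE * P_dl / i_0)
    p_ul = e_h / ((1 - alpha) * T)
    return (1 - alpha) / 2 * np.log2(1 + np.e / (2 * np.pi) * (r_pd * h_EA * p_ul) ** 2 / sigma_n2)

alphas = np.linspace(0.05, 0.95, 181)
rates  = np.array([uplink_rate(a) for a in alphas])
print(f"alpha* ~= {alphas[np.argmax(rates)]:.2f}  (R_UL = {rates.max():.3f} bit/s/Hz)")
# In the paper, the rate-optimal alpha also lies near the sensing-MSE minimum,
# enabling a single joint operating point.
```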

7. Context and Implications in Underwater Systems

O-ISAC advances the state of integrated underwater infrastructure enabling simultaneous power transfer, communication, and precise sensing within a single optical framework. The treatment of ship attitude as a Gaussian random process adds deployment realism; the closed-form analysis supports rapid parametric optimization in practical scenarios. A plausible implication is that these design rules and performance envelopes, derived for LED-based ship-mounted APs, generalize to other surface-vessel and sensor configurations where orientation stochasticity, frame time partitioning, and camera-array geometry dominate the QoS trade space.

The LPT-enabled underwater optical ISAC system exemplifies the integration of lightwave power transfer, optical communication, and imaging localization, guided by jointly optimized scheduling and spatial layout parameters. This suggests avenues for extension to mobile APs, multi-hop architectures, and collaborative sensor networks in varied sea conditions, constrained principally by achievable channel alignment, energy harvesting efficiency, and camera-imaging systematics.
