Log-log growth of channel capacity for nondispersive nonlinear optical fiber channel in intermediate power range


Abstract

We consider a model nondispersive nonlinear optical fiber channel with additive Gaussian noise. Using the Feynman path-integral technique, we find the optimal input signal distribution maximizing the channel's per-sample mutual information at large signal-to-noise ratio in the intermediate power range. The optimal input signal distribution allows us to improve previously known estimates for the channel capacity. We calculate the output signal entropy, conditional entropy, and per-sample mutual information for Gaussian, half-Gaussian, and modified Gaussian input signal distributions. We demonstrate that in the intermediate power range the capacity (the per-sample mutual information for the optimal input signal distribution) is greater than the per-sample mutual information for the half-Gaussian input signal distribution considered previously as the optimal one. We also show that the capacity grows as log log P in the intermediate power range, where P is the signal power.
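The log log P claim can be contrasted with the familiar linear-channel baseline, where per-sample capacity is log2(1 + P/N). The sketch below is an illustration of the two growth rates only, not the paper's path-integral derivation; the noise power N = 1 and the sample powers are assumptions chosen for display.

```python
import math

def awgn_capacity(P, N=1.0):
    """Per-sample Shannon capacity of a linear AWGN channel, in bits."""
    return math.log2(1.0 + P / N)

def loglog_scaling(P):
    """Leading-order log2(log2 P) growth claimed for the nonlinear
    nondispersive channel in the intermediate power range, in bits."""
    return math.log2(math.log2(P))

# Compare the two scalings over several decades of signal power.
for P in (1e2, 1e4, 1e6):
    print(f"P={P:>9.0f}  linear AWGN: {awgn_capacity(P):6.2f} bits  "
          f"log-log: {loglog_scaling(P):5.2f} bits")
```

The widening gap between the two columns as P grows reflects why the nonlinear result is much weaker than the linear Shannon limit: doubling the power barely moves a log log P curve.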
Publisher
The American Physical Society
Copyright
Copyright © 2017 American Physical Society
ISSN
1539-3755
eISSN
1550-2376
DOI
10.1103/PhysRevE.95.062133


Journal

Physical Review E, American Physical Society (APS)

Published: Jun 26, 2017

