Item Information


Title: Accurate quality of transmission estimation with machine learning
Authors: Sartzetakis, I.
Participants: Christodoulopoulos, K. K.; Varvarigos, E. M.
Issue Date: 2019
Publisher: IEEE Xplore
Series/Report no.: IEEE/OSA Journal of Optical Communications and Networking, 2019, Vol. 11, Issue 3, pp. 140-150
Abstract: In optical transport networks the quality of transmission (QoT) is estimated before provisioning new connections or upgrading existing ones. Traditionally, a physical layer model (PLM) is used for QoT estimation coupled with high margins to account for the model inaccuracy and the uncertainty in the evolving physical layer conditions. Reducing the margins increases network efficiency but requires accurate QoT estimation. We present two machine learning (ML) approaches to formulate such an accurate QoT estimator. We gather physical layer feedback, by monitoring the QoT of existing connections, to understand the actual physical conditions of the network. These data are used to train either the input parameters of a PLM or a machine learning model (ML-M). The proposed ML methods account for variations and uncertainties in equipment parameters, such as fiber attenuation, dispersion, and nonlinear coefficients, or amplifier noise figure per span, which are typical in deployed networks. We evaluated the accuracy of the proposed methods under various uncertainty scenarios and compared them to QoT estimators proposed in the literature. The results indicate that our estimators yield excellent accuracy with a relatively small amount of data, outperforming other prior estimators.
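The abstract describes training an ML model on QoT feedback monitored from existing connections, then using it to estimate the QoT of a candidate connection before provisioning. A minimal sketch of that idea follows, using a plain least-squares regression on synthetic "monitored" data; the feature set (spans, path length, launch power) and the linear model are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of an ML QoT estimator trained on monitored connections.
# Features and the linear model are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic telemetry: each existing connection contributes
# (number of spans, path length in km, launch power in dBm) -> measured SNR (dB).
n = 200
spans = rng.integers(1, 20, size=n)
length_km = spans * rng.uniform(60, 100, size=n)
power_dbm = rng.uniform(-2, 3, size=n)
# Assumed ground-truth relation plus monitoring noise, standing in for real data.
snr_db = (30.0 - 0.9 * spans - 0.004 * length_km + 0.5 * power_dbm
          + rng.normal(0, 0.2, size=n))

# Fit the estimator by least squares (the "ML-M" here is a linear regression).
X = np.column_stack([spans, length_km, power_dbm, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, snr_db, rcond=None)

# Estimate QoT for a candidate new connection before provisioning it:
# 8 spans, 640 km, 1 dBm launch power (bias term appended).
candidate = np.array([8.0, 640.0, 1.0, 1.0])
predicted_snr_db = float(candidate @ coef)
print(round(predicted_snr_db, 2))
```

In this toy setting the fitted coefficients recover the assumed physical-layer relation, which is the sense in which monitored feedback "learns" the actual network conditions; a deployed estimator would use richer features and a nonlinear model.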
URI: http://tailieuso.tlu.edu.vn/handle/DHTL/10485
Appears in Collections: Tài liệu hỗ trợ nghiên cứu khoa học (Scientific research support materials)
Files in This Item:
  • D10485.pdf
      Restricted Access
    • Size: 921.87 kB
    • Format: Adobe PDF
  • Readers who are staff, faculty, or students of Thuyloi University must log in to view online/download.



    Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.