LayerMapper#

class sionna.phy.nr.LayerMapper(num_layers: int = 1, verbose: bool = False, *, precision: str | None = None, device: str | None = None, **kwargs)[source]#

Bases: sionna.phy.block.Block

Performs MIMO layer mapping of modulated symbols to layers as defined in [3GPPTS38211].

The LayerMapper supports PUSCH and PDSCH channels and follows the procedure as defined in Sec. 6.3.1.3 and Sec. 7.3.1.3 in [3GPPTS38211], respectively.

As specified in Tab. 7.3.1.3-1 [3GPPTS38211], the LayerMapper expects two input streams for multiplexing if more than 4 layers are active (only relevant for PDSCH).

Parameters:
  • num_layers (int) – Number of MIMO layers. Must be between 1 and 8. If num_layers >= 5, a list of two inputs (one per codeword) is expected.

  • verbose (bool) – If True, additional parameters are printed.

  • precision (str | None) – Precision used for internal calculations and outputs. If None, the default precision is used.

  • device (str | None) – Device for computation (e.g., ‘cpu’, ‘cuda:0’). If None, the default device is used.

Inputs:

inputs – […, n] or [[…, n1], […, n2]], torch.complex. Sequence of symbols to be mapped. If num_layers >= 5, a list of two inputs is expected, and n1/n2 must be chosen as defined in Tab. 7.3.1.3-1 [3GPPTS38211].

Outputs:

x_mapped – […, num_layers, n/num_layers], torch.complex. Sequence of symbols mapped to the MIMO layers.

Examples

import torch
from sionna.phy.nr import LayerMapper

# Map a batch of 100 modulated symbols onto 2 MIMO layers
mapper = LayerMapper(num_layers=2)
symbols = torch.randn(10, 100) + 1j * torch.randn(10, 100)
mapped = mapper(symbols)
print(mapped.shape)
# torch.Size([10, 2, 50])
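For a single codeword, Sec. 6.3.1.3 of [3GPPTS38211] defines the mapping as x^(λ)(i) = d(ν·i + λ) for ν layers, i.e. consecutive symbols are distributed round-robin across the layers. A minimal NumPy sketch of this rule (illustrative only, not the Sionna implementation):

```python
import numpy as np

def map_layers(d: np.ndarray, num_layers: int) -> np.ndarray:
    """Round-robin layer mapping: x[..., l, i] = d[..., num_layers * i + l]."""
    n = d.shape[-1]
    assert n % num_layers == 0
    # Reshape [..., n] -> [..., n/num_layers, num_layers], then swap the last
    # two axes to obtain the output shape [..., num_layers, n/num_layers]
    x = d.reshape(*d.shape[:-1], n // num_layers, num_layers)
    return np.swapaxes(x, -1, -2)

d = np.arange(8)        # symbols d(0), ..., d(7)
x = map_layers(d, 2)
print(x)
# [[0 2 4 6]
#  [1 3 5 7]]
```

Note that layer 0 receives the even-indexed symbols and layer 1 the odd-indexed ones, matching the output shape […, num_layers, n/num_layers] documented above.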

Attributes

property num_codewords: int#

int : Number of input codewords for layer mapping. Can be either 1 or 2.

property num_layers: int#

int : Number of MIMO layers.

property num_layers0: int#

int : Number of layers for first codeword (only relevant for num_codewords = 2).

property num_layers1: int#

int : Number of layers for second codeword (only relevant for num_codewords = 2).
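When two codewords are active (num_layers >= 5), Tab. 7.3.1.3-1 in [3GPPTS38211] assigns floor(num_layers/2) layers to the first codeword and the remainder to the second, which is what num_layers0 and num_layers1 report. A quick sketch of the resulting split:

```python
# Layer split between two codewords for 5-8 layers, per
# Tab. 7.3.1.3-1 in 3GPP TS 38.211: the first codeword gets
# floor(num_layers / 2) layers, the second codeword gets the rest.
for num_layers in (5, 6, 7, 8):
    num_layers0 = num_layers // 2           # layers for first codeword
    num_layers1 = num_layers - num_layers0  # layers for second codeword
    print(num_layers, num_layers0, num_layers1)
# 5 2 3
# 6 3 3
# 7 3 4
# 8 4 4
```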

Methods

build(input_shapes)[source]#

Test input shapes for consistency.