LayerDemapper#
- class sionna.phy.nr.LayerDemapper(layer_mapper: sionna.phy.nr.layer_mapping.LayerMapper, num_bits_per_symbol: int = 1, *, precision: str | None = None, device: str | None = None, **kwargs)[source]#
Bases: sionna.phy.block.Block

Demaps MIMO layers to coded transport block(s) following Sec. 6.3.1.3 and Sec. 7.3.1.3 in [3GPPTS38211].

This block must be associated with a LayerMapper and performs the inverse operation. It is assumed that num_bits_per_symbol consecutive LLRs belong to the same symbol position. This allows the LayerDemapper to be applied after demapping symbols to LLR values. If the layer mapper is configured for dual codeword transmission, a list of both transport block streams is returned.
- Parameters:
  layer_mapper (sionna.phy.nr.layer_mapping.LayerMapper) – Associated LayerMapper.
  num_bits_per_symbol (int) – Modulation order. Defines how many consecutive LLRs are associated with the same symbol position.
  precision (str | None) – Precision used for internal calculations and outputs. If None, the default precision is used.
  device (str | None) – Device for computation (e.g., 'cpu', 'cuda:0'). If None, the default device is used.
- Inputs:
inputs – […, num_layers, n/num_layers], torch.float. MIMO layer data sequences.
- Outputs:
llr – […, n] or [[…, n1], […, n2]], torch.float. Sequence of bits after layer demapping. If num_codewords = 2, a list of two transport blocks is returned.
Notes
As it is more convenient to apply the layer demapper after demapping symbols to LLRs, this block groups the input sequence into groups of num_bits_per_symbol LLRs before restoring the original symbol sequence. This behavior can be deactivated by setting num_bits_per_symbol = 1.

Examples
import torch
from sionna.phy.nr import LayerMapper, LayerDemapper

mapper = LayerMapper(num_layers=2)
demapper = LayerDemapper(mapper, num_bits_per_symbol=4)

# Map 100 symbols onto 2 layers (50 symbols per layer)
symbols = torch.randn(10, 100, dtype=torch.complex64)
mapped = mapper(symbols)

# After channel + demapping to LLRs...
llrs = torch.randn(10, 2, 200)  # num_bits_per_symbol=4, so 50*4=200
demapped = demapper(llrs)
print(demapped.shape)  # torch.Size([10, 400])
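The grouping-and-interleaving step described in the Notes can be sketched in plain NumPy. This is a minimal illustration of the single-codeword index logic, not the library implementation; the function name layer_demap is hypothetical:

```python
import numpy as np

def layer_demap(llrs, num_bits_per_symbol):
    # Hypothetical sketch of single-codeword layer demapping.
    # llrs: [..., num_layers, num_symbols_per_layer * num_bits_per_symbol]
    *batch, num_layers, n = llrs.shape
    num_symbols = n // num_bits_per_symbol
    # Group each layer's LLRs into symbols of num_bits_per_symbol LLRs
    x = llrs.reshape(*batch, num_layers, num_symbols, num_bits_per_symbol)
    # Restore the round-robin symbol order: symbol i of the codeword
    # was mapped to layer (i mod num_layers)
    x = np.swapaxes(x, -3, -2)
    # Flatten back into a single codeword stream
    return x.reshape(*batch, num_layers * n)

# Tiny worked example: 8 codeword bits, 2 layers, 2 bits per symbol.
# Layer 0 carries symbols 0 and 2, layer 1 carries symbols 1 and 3.
llrs = np.array([[0, 1, 4, 5],
                 [2, 3, 6, 7]])
print(layer_demap(llrs, 2))  # [0 1 2 3 4 5 6 7]
```

Setting num_bits_per_symbol=1 makes the grouping a no-op, matching the deactivation behavior mentioned in the Notes.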
Methods