17.5. d2l API Document

class d2l.Accumulator(n)

Sum a list of numbers over time.
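
A minimal usage sketch (assuming the add, reset, and indexing interface that the book's training loops rely on; the two slots below, a running loss sum and a sample count, are illustrative):

>>> metric = d2l.Accumulator(2)      # two running sums
>>> metric.add(3.5, 10)              # e.g. summed batch loss, batch size
>>> metric.add(2.5, 10)
>>> metric[0] / metric[1]            # 6.0 / 20 = 0.3, the running average
>>> metric.reset()                   # clear both sums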

class d2l.BPRLoss(weight=None, batch_axis=0, **kwargs)
forward(positive, negative)

Defines the forward computation. Arguments can be either NDArray or Symbol.

class d2l.CTRDataset(data_path, feat_mapper=None, defaults=None, min_threshold=4, num_feat=34)

class d2l.Dataset

Abstract dataset class. All datasets should have this interface.

Subclasses need to override __getitem__, which returns the i-th element, and __len__, which returns the total number of elements.

Note: An mxnet or numpy array can be directly used as a dataset.

filter(fn)

Returns a new dataset with samples filtered by the filter function fn.

Note that if the Dataset is the result of a lazily transformed one with transform(lazy=False), the filter is eagerly applied to the transformed samples without materializing the transformed result. That is, the transformation will be applied again whenever a sample is retrieved after filter().

fn (callable)

A filter function that takes a sample as input and returns a boolean. Samples that return False are discarded.

Returns: Dataset

The filtered dataset.
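
A brief sketch (assuming d2l.Dataset is Gluon's dataset base class, so gluon.data.SimpleDataset can stand in as a concrete subclass):

>>> from mxnet import gluon
>>> ds = gluon.data.SimpleDataset([0, 1, 2, 3, 4])
>>> even = ds.filter(lambda x: x % 2 == 0)
>>> len(even)                             # 3
>>> [even[i] for i in range(len(even))]   # [0, 2, 4]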

sample(sampler)

Returns a new dataset with elements sampled by the sampler.

sampler (Sampler)

A Sampler that returns the indices of sampled elements.

Returns: Dataset

The resulting dataset.

take(count)

Returns a new dataset with at most count number of samples in it.

count (int or None)

An integer representing the number of elements of this dataset that should be taken to form the new dataset. If count is None, or if count is greater than the size of this dataset, the new dataset will contain all elements of this dataset.

Returns: Dataset

The resulting dataset.

transform(fn, lazy=True)

Returns a new dataset with each sample transformed by the transformer function fn.

fn (callable)

A transformer function that takes a sample as input and returns the transformed sample.

lazy (bool, default True)

If False, transforms all samples at once. Otherwise, transforms each sample on demand. Note that if fn is stochastic, you must set lazy to True or you will get the same result on all epochs.

Returns: Dataset

The transformed dataset.

transform_first(fn, lazy=True)

Returns a new dataset with the first element of each sample transformed by the transformer function fn.

This is useful, for example, when you only want to transform data while keeping label as is.

fn (callable)

A transformer function that takes the first element of a sample as input and returns the transformed element.

lazy (bool, default True)

If False, transforms all samples at once. Otherwise, transforms each sample on demand. Note that if fn is stochastic, you must set lazy to True or you will get the same result on all epochs.

Returns: Dataset

The transformed dataset.
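
For instance, transform_first is the usual way to apply an image transform while leaving the label untouched; a sketch assuming a Gluon vision dataset and the standard ToTensor transform (the data is downloaded on first use):

>>> from mxnet import gluon
>>> mnist = gluon.data.vision.FashionMNIST(train=True)
>>> to_tensor = gluon.data.vision.transforms.ToTensor()
>>> train = mnist.transform_first(to_tensor)   # images become float tensors, labels are unchanged
>>> X, y = train[0]
>>> X.shape                                    # (1, 28, 28)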

class d2l.Decoder(**kwargs)

The base decoder interface for the encoder-decoder architecture.

forward(X, state)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.DotProductAttention(dropout, **kwargs)
forward(query, key, value, valid_length=None)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.
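
A shape-level sketch, following the attention chapter (assumes the MXNet np interface, i.e. from mxnet import np, npx and npx.set_np()):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> attention = d2l.DotProductAttention(dropout=0.5)
>>> attention.initialize()
>>> keys = np.ones((2, 10, 2))                                  # (batch, #key-value pairs, key dim)
>>> values = np.arange(40).reshape(1, 10, 4).repeat(2, axis=0)  # (batch, #key-value pairs, value dim)
>>> query = np.ones((2, 1, 2))
>>> attention(query, keys, values, np.array([2, 6])).shape      # (2, 1, 4)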

class d2l.Encoder(**kwargs)

The base encoder interface for the encoder-decoder architecture.

forward(X)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.EncoderDecoder(encoder, decoder, **kwargs)

The base class for the encoder-decoder architecture.

forward(enc_X, dec_X, *args)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.
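
How the pieces fit together, sketched with the Seq2SeqEncoder and Seq2SeqDecoder classes documented below (shapes as in the book's sequence-to-sequence chapter; the MXNet np interface is assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> encoder = d2l.Seq2SeqEncoder(vocab_size=10, embed_size=8, num_hiddens=16, num_layers=2)
>>> decoder = d2l.Seq2SeqDecoder(vocab_size=10, embed_size=8, num_hiddens=16, num_layers=2)
>>> model = d2l.EncoderDecoder(encoder, decoder)
>>> model.initialize()
>>> X = np.zeros((4, 7))          # (batch_size, num_steps) of token indices
>>> output, state = model(X, X)   # output: (4, 7, 10), i.e. (batch, steps, vocab_size)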

class d2l.HingeLossbRec(weight=None, batch_axis=0, **kwargs)
forward(positive, negative, margin=1)

Defines the forward computation. Arguments can be either NDArray or Symbol.

class d2l.Loss(weight, batch_axis, **kwargs)

Base class for loss.

weight (float or None)

Global scalar weight for loss.

batch_axis (int, default 0)

The axis that represents mini-batch.

hybrid_forward(F, x, *args, **kwargs)

Overrides to construct symbolic graph for this Block.

x (Symbol or NDArray)

The first input tensor.

*args (list of Symbol or list of NDArray)

Additional input tensors.

class d2l.MLPAttention(units, dropout, **kwargs)
forward(query, key, value, valid_length)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.MaskedSoftmaxCELoss(axis=-1, sparse_label=True, from_logits=False, weight=None, batch_axis=0, **kwargs)
forward(pred, label, valid_length)

Defines the forward computation. Arguments can be either NDArray or Symbol.

class d2l.RNNModel(rnn_layer, vocab_size, **kwargs)
forward(inputs, state)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.RNNModelScratch(vocab_size, num_hiddens, ctx, get_params, init_state, forward)

An RNN model implemented from scratch.

class d2l.RandomGenerator(sampling_weights)

Draw a random int in [0, n] according to n sampling weights.

class d2l.Residual(num_channels, use_1x1conv=False, strides=1, **kwargs)
forward(X)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.
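
A shape check, as in the ResNet section (MXNet np interface assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> blk = d2l.Residual(3)
>>> blk.initialize()
>>> X = np.random.uniform(size=(4, 3, 6, 6))
>>> blk(X).shape                            # (4, 3, 6, 6): shape is preserved
>>> blk2 = d2l.Residual(6, use_1x1conv=True, strides=2)
>>> blk2.initialize()
>>> blk2(X).shape                           # (4, 6, 3, 3): channels doubled, resolution halved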

class d2l.Seq2SeqDecoder(vocab_size, embed_size, num_hiddens, num_layers, dropout=0, **kwargs)
forward(X, state)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.Seq2SeqEncoder(vocab_size, embed_size, num_hiddens, num_layers, dropout=0, **kwargs)
forward(X, *args)

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

*args (list of NDArray)

Input tensors.

class d2l.SeqDataLoader(batch_size, num_steps, use_random_iter, max_tokens)

An iterator to load sequence data.

class d2l.Timer

Record multiple running times.
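
A usage sketch (assuming the start/stop/avg/sum interface used in the book's benchmarking code; the timer starts itself on construction):

>>> timer = d2l.Timer()             # starts timing immediately
>>> _ = sum(range(10 ** 6))         # some work to time
>>> timer.stop()                    # seconds for this interval
>>> timer.start()
>>> _ = sum(range(10 ** 6))
>>> timer.stop()
>>> timer.avg(), timer.sum()        # mean and total over the recorded intervals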

class d2l.VOCSegDataset(is_train, crop_size, voc_dir)

A customized dataset to load the VOC dataset.

filter(imgs)

Returns a new dataset with samples filtered by the filter function fn.

Note that if the Dataset is the result of a lazily transformed one with transform(lazy=False), the filter is eagerly applied to the transformed samples without materializing the transformed result. That is, the transformation will be applied again whenever a sample is retrieved after filter().

fn (callable)

A filter function that takes a sample as input and returns a boolean. Samples that return False are discarded.

Returns: Dataset

The filtered dataset.

d2l.bbox_to_rect(bbox, color)

Convert a bounding box to matplotlib format.

d2l.build_colormap2label()

Build an RGB color to label mapping for segmentation.

d2l.corr2d(X, K)

Compute 2D cross-correlation.
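
A worked example from the convolutions chapter (MXNet np interface assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> X = np.array([[0, 1, 2], [3, 4, 5], [6, 7, 8]])
>>> K = np.array([[0, 1], [2, 3]])
>>> d2l.corr2d(X, K)                # [[19., 25.], [37., 43.]]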

class d2l.defaultdict

defaultdict(default_factory[, …]) -> dict with default factory

The default factory is called without arguments to produce a new value when a key is not present, in __getitem__ only. A defaultdict compares equal to a dict with the same items. All remaining arguments are treated the same as if they were passed to the dict constructor, including keyword arguments.

copy() → a shallow copy of D.
default_factory

Factory for default value called by __missing__().
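
This is Python's collections.defaultdict, which appears here because the d2l namespace imports it; a quick reminder of the default-factory behavior:

>>> d = defaultdict(list)
>>> d['a'].append(1)                # 'a' is first created with the default value []
>>> d['a']                          # [1]
>>> d['missing']                    # [] (the entry is created on first access via __getitem__)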

d2l.download_voc_pascal(data_dir='../data')

Download the VOC2012 segmentation dataset.

d2l.evaluate_loss(net, data_iter, loss)

Evaluate the loss of a model on the given dataset.

d2l.load_array(data_arrays, batch_size, is_train=True)

Construct a Gluon data loader.
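
A minimal sketch (MXNet np interface assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> features = np.random.normal(size=(100, 2))
>>> labels = np.random.normal(size=(100, 1))
>>> data_iter = d2l.load_array((features, labels), batch_size=10)
>>> X, y = next(iter(data_iter))    # X: (10, 2), y: (10, 1)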

d2l.load_data_fashion_mnist(batch_size, resize=None)

Download the Fashion-MNIST dataset and then load it into memory.
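
A sketch, assuming the usual return of a (train_iter, test_iter) pair as used throughout the book (the data is downloaded on first use):

>>> train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size=256)
>>> X, y = next(iter(train_iter))
>>> X.shape, y.shape                # (256, 1, 28, 28), (256,)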

d2l.load_data_pikachu(batch_size, edge_size=256)

Load the Pikachu dataset.

d2l.load_data_voc(batch_size, crop_size)

Download and load the VOC2012 semantic segmentation dataset.

d2l.plot(X, Y=None, xlabel=None, ylabel=None, legend=[], xlim=None, ylim=None, xscale='linear', yscale='linear', fmts=['-', 'm--', 'g-.', 'r:'], figsize=(3.5, 2.5), axes=None)

Plot data points.
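
A sketch of typical use (the figure is rendered by matplotlib, inline in Jupyter; the MXNet np interface is assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> x = np.arange(0, 3, 0.1)
>>> d2l.plot(x, [x ** 2, 2 * x - 3], xlabel='x', ylabel='f(x)',
...          legend=['x^2', '2x - 3'])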

d2l.read_time_machine()

Load the Time Machine book into a list of sentences.

d2l.read_voc_images(root='../data/VOCdevkit/VOC2012', is_train=True)

Read all VOC feature and label images.

d2l.resnet18(num_classes)

A slightly modified ResNet-18 model.

d2l.set_axes(axes, xlabel, ylabel, xlim, ylim, xscale, yscale, legend)

Set the axes for matplotlib.

d2l.set_figsize(figsize=(3.5, 2.5))

Set the figure size for matplotlib.

d2l.show_bboxes(axes, bboxes, labels=None, colors=None)

Show bounding boxes.

d2l.show_images(imgs, num_rows, num_cols, titles=None, scale=1.5)

Plot a list of images.

d2l.show_trace_2d(f, results)

Show the trace of 2D variables during optimization.

d2l.split_batch(X, y, ctx_list)

Split X and y across the devices specified by ctx_list.

d2l.split_data_ml100k(data, num_users, num_items, split_mode='random', test_ratio=0.1)

Split the dataset in random mode or seq-aware mode.

d2l.synthetic_data(w, b, num_examples)

Generate y = Xw + b + noise.
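
As used in the linear regression chapter; a sketch (MXNet np interface assumed):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> true_w = np.array([2, -3.4])
>>> true_b = 4.2
>>> features, labels = d2l.synthetic_data(true_w, true_b, 1000)
>>> features.shape, labels.shape    # (1000, 2), (1000, 1)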

d2l.tokenize(lines, token='word')

Split sentences into word or char tokens.
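
A quick sketch:

>>> lines = ['the time machine', 'by h g wells']
>>> d2l.tokenize(lines)                        # [['the', 'time', 'machine'], ['by', 'h', 'g', 'wells']]
>>> d2l.tokenize(lines, token='char')[0][:5]   # ['t', 'h', 'e', ' ', 't']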

d2l.train_2d(trainer, steps=20)

Optimize a 2-dim objective function with a customized trainer.
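
Used together with show_trace_2d above; a sketch with plain gradient descent on a simple quadratic (f_2d and gd_2d below are illustrative helpers, not part of d2l):

>>> def f_2d(x1, x2):               # the objective to visualize
...     return x1 ** 2 + 2 * x2 ** 2
>>> def gd_2d(x1, x2, s1, s2):      # one gradient-descent step; s1, s2 are unused state
...     eta = 0.1
...     return (x1 - 2 * eta * x1, x2 - 4 * eta * x2, 0, 0)
>>> d2l.show_trace_2d(f_2d, d2l.train_2d(gd_2d))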

d2l.try_all_gpus()

Return all available GPUs, or [cpu(),] if no GPU exists.

d2l.try_gpu(i=0)

Return gpu(i) if it exists, otherwise return cpu().
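
Typical use of try_gpu and try_all_gpus when picking a compute context (a sketch):

>>> d2l.try_gpu()                   # gpu(0) if a GPU is available, otherwise cpu()
>>> d2l.try_gpu(3)                  # gpu(3) if present, otherwise cpu()
>>> d2l.try_all_gpus()              # e.g. [gpu(0), gpu(1)], or [cpu()] on a CPU-only machine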

d2l.update_D(X, Z, net_D, net_G, loss, trainer_D)

Update the discriminator.

d2l.update_G(Z, net_D, net_G, loss, trainer_G)

Update the generator.

d2l.use_svg_display()

Use the svg format to display a plot in Jupyter.

d2l.voc_label_indices(colormap, colormap2label)

Map an RGB color to a label.

d2l.voc_rand_crop(feature, label, height, width)

Randomly crop both feature and label images.