Two uses of TDNNs (from Wikipedia)

Hirunika Karunathilaka
2 min read · Mar 23, 2020


Let’s first take a look at how a Time Delay Neural Network is described in Wikipedia.

A time delay neural network (TDNN) is a multi-layer artificial neural network architecture whose purpose is to 1) classify patterns with shift-invariance, and 2) model context at each layer of the network.

So, the first point is that a Time Delay Neural Network is a multi-layer ANN architecture. This means the network contains more than one layer of artificial neurons.

Purpose 1: classify patterns with shift-invariance

The term shift-invariance means that the classifier’s decision does not depend on where in time the pattern occurs: if we shift the input in time (or shift the entries in a vector), the classification output stays the same. A shift-invariant classifier therefore does not require explicit segmentation prior to classification. For a temporal pattern such as speech, the TDNN thus avoids having to determine the beginning and end points of sounds before classifying them.
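To make this concrete, here is a minimal sketch (in NumPy, with a made-up filter and signal, not a trained network): a TDNN layer slides the same weight window over every time position, and pooling over time then makes the final score identical whether the pattern appears early or late in the input.

```python
import numpy as np

def tdnn_score(x, w):
    """Slide a time-delay window w over signal x (a 1-D convolution),
    then max-pool over all time positions so the score ignores
    where in time the pattern occurs."""
    k = len(w)
    responses = [np.dot(x[t:t + k], w) for t in range(len(x) - k + 1)]
    return max(responses)

w = np.array([1.0, -1.0, 1.0])                 # hypothetical filter weights
pattern = np.array([0.0, 1.0, 0.0, 1.0, 0.0])  # the "sound" we want to detect

x1 = np.concatenate([pattern, np.zeros(5)])    # pattern at the start
x2 = np.concatenate([np.zeros(5), pattern])    # same pattern, shifted 5 steps

print(tdnn_score(x1, w), tdnn_score(x2, w))    # the two scores are equal
```

Because the same window weights are applied at every delay and the maximum is taken over time, no segmentation of the input is needed beforehand.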

Purpose 2: Modeling context at each layer of the network

What lets the TDNN serve this purpose is that each unit at each layer receives input from a contextual window of outputs from the layer below. A unit is connected not only to the current activations of the layer below but also to a pattern of earlier unit outputs, i.e., their context.

Normally, the activation of a unit is computed by passing the weighted sum of its inputs through an activation function such as a sigmoid or ReLU. What differs in a TDNN is that, in addition to the current inputs, delayed inputs feed the unit: the time-delayed (past) outputs of the same units in the layer below.
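A small sketch of a single TDNN unit may help. All numbers and names here are illustrative, assuming a lower layer of two units, a context window of three time steps (delays 0, 1, 2), and a sigmoid activation:

```python
import math

def tdnn_unit(lower_outputs, weights, bias, delays=(0, 1, 2)):
    """Activation of one TDNN unit at the current time step (last index).

    Besides the current outputs of the layer below (delay 0), the unit
    also receives time-delayed copies of those same outputs (delays 1, 2).
    weights[d][j] is the (hypothetical) weight for lower unit j at delay d.
    """
    t = len(lower_outputs) - 1
    s = bias
    for d in delays:
        for j, w in enumerate(weights[d]):
            s += w * lower_outputs[t - d][j]   # weighted, time-delayed input
    return 1.0 / (1.0 + math.exp(-s))          # sigmoid activation

# Outputs of the two lower-layer units over the last 3 time steps (oldest first)
history = [[0.2, 0.8], [0.5, 0.1], [0.9, 0.4]]
# One weight row per delay, one column per lower-layer unit (made-up values)
weights = [[0.3, -0.2], [0.1, 0.4], [-0.5, 0.2]]

print(tdnn_unit(history, weights, bias=0.0))
```

The weighted sum thus ranges over the whole context window rather than a single time step, which is exactly how each layer comes to model context.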
