pysteps.motion#

Implementations of optical flow methods.

pysteps.motion.interface#

Interface for the motion module. Its get_method function returns a callable optical flow routine for computing the motion field.

The methods in the motion module implement the following interface:

motion_method(precip, **keywords)

where precip is a (T, m, n) array containing a sequence of T two-dimensional input images of shape (m, n). The first dimension is the time dimension; the number of images T required depends on the method.

The output is a three-dimensional array of shape (2, m, n) containing the dense x- and y-components of the motion field, in units of pixels/timestep as given by the input array precip.
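As an illustration of this contract, here is a minimal stub that conforms to the interface. It is purely illustrative: the returned motion field is all zeros, whereas a real method estimates it from the images.

```python
def zero_motion(precip, **keywords):
    """Stub optical flow method: (T, m, n) input -> (2, m, n) zero field."""
    m, n = len(precip[0]), len(precip[0][0])
    # Two (m, n) layers: the x- and y-components of the motion field.
    return [[[0.0] * n for _ in range(m)] for _ in range(2)]

seq = [[[0.0] * 4 for _ in range(3)] for _ in range(2)]  # T=2, m=3, n=4
field = zero_motion(seq)
print(len(field), len(field[0]), len(field[0][0]))  # -> 2 3 4
```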

get_method(name)

Return a callable function for the optical flow method corresponding to the given name.
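A name-based dispatch like get_method can be sketched with a simple registry. The registry contents and the stub method below are illustrative assumptions, not pysteps internals:

```python
def _dummy_constant(precip, **keywords):
    # Placeholder standing in for a real optical flow routine.
    return "constant-field"

# Hypothetical registry mapping method names to callables.
_METHODS = {
    "constant": _dummy_constant,
}

def get_method(name):
    """Return the optical flow callable registered under `name`."""
    try:
        return _METHODS[name.lower()]
    except KeyError:
        raise ValueError(f"unknown optical flow method '{name}'")

oflow = get_method("constant")
print(oflow(None))  # -> constant-field
```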

pysteps.motion.constant#

Implementation of a constant advection field estimation by maximizing the correlation between two images.

constant(R, **kwargs)

Compute a constant advection field by finding a translation vector that maximizes the correlation between two successive images.
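The idea can be sketched as a brute-force search for the integer translation that maximizes the cross-correlation between two frames. This is a hedged, unoptimized illustration of the principle, not pysteps' actual implementation:

```python
def best_translation(img0, img1, max_shift=2):
    """Find the integer (dx, dy) maximizing the correlation of two frames."""
    rows, cols = len(img0), len(img0[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Correlate img0 with img1 shifted by (dx, dy), ignoring
            # out-of-bounds pixels.
            score = 0.0
            for i in range(rows):
                for j in range(cols):
                    ii, jj = i + dy, j + dx
                    if 0 <= ii < rows and 0 <= jj < cols:
                        score += img0[i][j] * img1[ii][jj]
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best

# A single bright pixel moved one column to the right between frames.
frame0 = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
frame1 = [[0, 0, 0], [0, 0, 1], [0, 0, 0]]
print(best_translation(frame0, frame1))  # -> (1, 0)
```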

pysteps.motion.darts#

Implementation of the DARTS algorithm.

DARTS(input_images, **kwargs)

Compute the advection field from a sequence of input images by using the DARTS method.

pysteps.motion.lucaskanade#

The Lucas-Kanade (LK) local feature tracking module.

This module implements the interface to the local Lucas-Kanade routine available in OpenCV.

The dense method additionally interpolates the sparse motion vectors onto a regular grid to return a dense motion field.

dense_lucaskanade(input_images[, lk_kwargs, ...])

Run the Lucas-Kanade optical flow routine and interpolate the motion vectors.
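The interpolation step can be illustrated with inverse-distance weighting of the sparse vectors onto a regular grid. pysteps' actual interpolator and its defaults may differ; treat this as a sketch:

```python
def idw_interpolate(points, values, grid_shape, power=2.0):
    """points: [(x, y)], values: [(u, v)] -> dense (u, v) grid."""
    rows, cols = grid_shape
    field = [[(0.0, 0.0)] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            wsum = usum = vsum = 0.0
            exact = None
            for (px, py), (u, v) in zip(points, values):
                d2 = (j - px) ** 2 + (i - py) ** 2
                if d2 == 0:
                    # Grid node coincides with a sample: copy its vector.
                    exact = (u, v)
                    break
                w = 1.0 / d2 ** (power / 2.0)
                wsum += w
                usum += w * u
                vsum += w * v
            field[i][j] = exact if exact else (usum / wsum, vsum / wsum)
    return field

dense = idw_interpolate([(0, 0), (2, 2)], [(1.0, 0.0), (0.0, 1.0)], (3, 3))
print(dense[0][0])  # -> (1.0, 0.0), an exact sample point
```

At the grid center, equidistant from both samples, the result is the average (0.5, 0.5).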

pysteps.motion.proesmans#

Implementation of the anisotropic diffusion method of Proesmans et al. (1994).

proesmans(input_images[, lam, num_iter, ...])

Implementation of the anisotropic diffusion method of Proesmans et al. (1994).

pysteps.motion.vet#

Variational Echo Tracking (VET) Module

This module implements the VET algorithm presented by Laroche and Zawadzki (1995) and used in the McGill Algorithm for Prediction by Lagrangian Extrapolation (MAPLE) described in Germann and Zawadzki (2002).

The morphing and the cost functions are implemented in Cython and parallelized for performance.

vet(input_images[, sectors, smooth_gain, ...])

Variational Echo Tracking Algorithm presented in Laroche and Zawadzki (1995) and used in the McGill Algorithm for Prediction by Lagrangian Extrapolation (MAPLE) described in Germann and Zawadzki (2002).

vet_cost_function(sector_displacement_1d, ...)

Compute the VET cost function.

vet_cost_function_gradient(*args, **kwargs)

Compute the gradient of the VET cost function.

morph(image, displacement[, gradient])

Morph image by applying a displacement field (Warping).
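Backward warping can be sketched as follows: each output pixel samples the input image at its displaced location. The nearest-neighbour rounding below is an assumption for illustration; pysteps' Cython morph uses its own interpolation and boundary handling:

```python
def morph_nn(image, disp_x, disp_y):
    """Warp `image` by a per-pixel displacement field (backward mapping)."""
    rows, cols = len(image), len(image[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            # Sample the input at the displaced source location.
            src_i = round(i - disp_y[i][j])
            src_j = round(j - disp_x[i][j])
            if 0 <= src_i < rows and 0 <= src_j < cols:
                out[i][j] = image[src_i][src_j]
    return out

img = [[0, 0, 0], [1, 0, 0], [0, 0, 0]]
ones = [[1] * 3 for _ in range(3)]
zeros = [[0] * 3 for _ in range(3)]
# A uniform x-displacement of 1 shifts the image one pixel to the right.
print(morph_nn(img, ones, zeros)[1])  # -> [0, 1, 0]
```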

round_int(scalar)

Round number to nearest integer.

ceil_int(scalar)

Round number up to the nearest integer (ceiling).

get_padding(dimension_size, sectors)

Get the padding needed on each side of one image dimension so that the padded dimension is evenly divisible by the specified number of sectors.
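One plausible way to compute such a padding, assuming the pad is split as evenly as possible between the two sides (pysteps may distribute it differently):

```python
def get_padding(dimension_size, sectors):
    """Return (pad_before, pad_after) making the dimension divisible."""
    remainder = dimension_size % sectors
    if remainder == 0:
        return (0, 0)  # already divisible, no padding needed
    pad = sectors - remainder
    # Split the total padding as evenly as possible between the two sides.
    return (pad // 2, pad - pad // 2)

print(get_padding(10, 4))  # -> (1, 1): 10 + 2 = 12 is divisible by 4
print(get_padding(9, 3))   # -> (0, 0): already divisible
```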