klus.algorithms module

klus.algorithms.dmd(X, Y, mode='exact', retain=19)[source]

Exact and standard DMD of the data matrices X and Y.

Parameters:

mode -- 'exact' for exact DMD or 'standard' for standard DMD

Returns:

eigenvalues d and modes Phi
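As an illustration, a minimal NumPy sketch of the exact-DMD branch (the helper name `dmd_exact`, the explicit rank argument, and the test system are illustrative, not the library's internals):

```python
import numpy as np

def dmd_exact(X, Y, r=2):
    """Minimal sketch of exact DMD: fit Y ~ A X via a truncated SVD of X,
    eigendecompose the reduced operator, and lift to exact DMD modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # reduced operator A_tilde = U* Y V S^{-1}
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s
    d, W = np.linalg.eig(A_tilde)
    # exact DMD modes (Tu et al. 2014): Phi = Y V S^{-1} W
    Phi = (Y @ Vh.conj().T / s) @ W
    return d, Phi

# linear test system x_{k+1} = A x_k with eigenvalues 0.9 and 0.5
A = np.array([[0.9, 0.0], [0.0, 0.5]])
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 50))
Y = A @ X
d, Phi = dmd_exact(X, Y, r=2)   # d -> approximately {0.9, 0.5}
```

Since the data come from a rank-2 linear system and the truncation keeps both directions, the recovered eigenvalues match those of A.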

klus.algorithms.dmdc(X, Y, U, svThresh=1e-10)[source]

DMD with control where the control matrix B is unknown, see https://arxiv.org/abs/1409.6358.

Parameters:
  • X -- state matrix in R^{N x (M-1)}, where N is the dimension of the state vector and M is the number of samples

  • Y -- one-step time-lagged state matrix in R^{N x (M-1)}

  • U -- control input matrix in R^{Q x (M-1)}, where Q is the dimension of the control vector

  • svThresh -- threshold below which singular values are discarded

Returns:

A_approx, B_approx, and Phi (the dynamic modes of A)
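The core regression can be sketched in a few lines of NumPy. This version replaces the thresholded SVD with a plain pseudoinverse and uses illustrative names (`dmdc_sketch`, `A_est`, `B_est`); it is a sketch of the DMDc idea from the cited paper, not the library's implementation:

```python
import numpy as np

def dmdc_sketch(X, Y, U_in):
    """Least-squares sketch of DMDc with unknown B:
    Y ~ [A B] [X; U], so regress on the stacked data matrix."""
    n = X.shape[0]
    Omega = np.vstack([X, U_in])       # stacked state and control data
    G = Y @ np.linalg.pinv(Omega)      # G = [A_approx  B_approx]
    A, B = G[:, :n], G[:, n:]
    d, Phi = np.linalg.eig(A)          # dynamic modes of A
    return A, B, Phi

A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[1.0], [0.5]])
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 50))
U_in = rng.standard_normal((1, 50))
Y = A_true @ X + B_true @ U_in
A_est, B_est, Phi = dmdc_sketch(X, Y, U_in)   # recovers A_true, B_true
```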

klus.algorithms.amuse(X, Y, evs=5)[source]

AMUSE implementation of TICA, see TICA documentation.

Returns:

eigenvalues d and corresponding eigenvectors Phi containing the coefficients for the eigenfunctions

klus.algorithms.tica(X, Y, evs=5)[source]

Time-lagged independent component analysis of the data matrices X and Y.

Parameters:

evs -- number of eigenvalues/eigenvectors

Returns:

eigenvalues d and corresponding eigenvectors V containing the coefficients for the eigenfunctions
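A hedged NumPy sketch of the TICA eigenproblem (the symmetrization step and the name `tica_sketch` are illustrative assumptions; the library may estimate the covariances differently):

```python
import numpy as np

def tica_sketch(X, Y, evs=2):
    """Sketch of TICA as the generalized eigenproblem C_tau v = d C_0 v,
    with instantaneous covariance C_0 = X X^T and a symmetrized
    time-lagged covariance C_tau = X Y^T."""
    C0 = X @ X.T
    Ctau = X @ Y.T
    Ctau = 0.5 * (Ctau + Ctau.T)               # symmetrize (reversible estimate)
    d, V = np.linalg.eig(np.linalg.solve(C0, Ctau))
    idx = np.argsort(-d.real)[:evs]
    return d[idx].real, V[:, idx]

# data whose sample covariance is a multiple of the identity
X = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
Y = np.diag([0.9, 0.5]) @ X                    # x -> diag(0.9, 0.5) x
d, V = tica_sketch(X, Y)                       # d -> [0.9, 0.5]
```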

klus.algorithms.ulam(X, Y, Omega, evs=5, operator='K')[source]

Ulam's method for the Koopman or Perron-Frobenius operator. The matrices X and Y contain the input data.

Parameters:
  • Omega -- box discretization of type d3s.domain.discretization

  • evs -- number of eigenvalues/eigenvectors

  • operator -- 'K' for Koopman or 'P' for Perron-Frobenius

Returns:

eigenvalues d and corresponding eigenvectors V containing the coefficients for the eigenfunctions

TODO: Switch to sparse matrices.

klus.algorithms.edmd(X, Y, psi, evs=5, operator='K')[source]

Conventional EDMD for the Koopman or Perron-Frobenius operator. The matrices X and Y contain the input data.

Parameters:
  • psi -- set of basis functions, see d3s.observables

  • evs -- number of eigenvalues/eigenvectors

  • operator -- 'K' for Koopman or 'P' for Perron-Frobenius

Returns:

eigenvalues d and corresponding eigenvectors V containing the coefficients for the eigenfunctions
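A minimal NumPy sketch of conventional EDMD (the function name `edmd_sketch` and the monomial basis are illustrative; d3s.observables provides its own basis objects):

```python
import numpy as np

def edmd_sketch(X, Y, psi, evs=2, operator='K'):
    """Sketch of conventional EDMD: lift the data with the basis psi,
    then estimate the matrix approximation of the chosen operator."""
    PsiX, PsiY = psi(X), psi(Y)
    G = PsiX @ PsiX.T                 # Gram matrix
    C = PsiX @ PsiY.T                 # time-lagged cross matrix
    M = np.linalg.pinv(G) @ C         # Koopman approximation
    if operator == 'P':
        M = M.T                       # Perron-Frobenius is the adjoint
    d, V = np.linalg.eig(M)
    idx = np.argsort(-np.abs(d))[:evs]
    return d[idx], V[:, idx]

# monomial basis up to degree 2 in 1D: psi(x) = [1, x, x^2]
psi = lambda X: np.vstack([np.ones_like(X[0]), X[0], X[0]**2])
rng = np.random.default_rng(1)
X = rng.standard_normal((1, 200))
Y = 0.5 * X                           # linear map x -> 0.5 x
d, V = edmd_sketch(X, Y, psi, evs=3)  # d -> {1, 0.5, 0.25}
```

For this linear map the monomial basis is Koopman-invariant, so the eigenvalues 1, 0.5, and 0.25 are recovered exactly (up to numerics).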

klus.algorithms.gedmd(X, Y, Z, psi, evs=5, operator='K')[source]

Generator EDMD for the Koopman operator. The matrices X and Y contain the input data. For stochastic systems, Z contains the diffusion term evaluated at all data points X. If the system is deterministic, set Z = None.

klus.algorithms.kedmd(X, Y, k, epsilon=0.0, evs=5, operator='P', kind='kernel')[source]

Kernel EDMD for the Koopman or Perron-Frobenius operator. The matrices X and Y contain the input data.

Parameters:
  • X (ndarray) -- Input data.

  • Y (ndarray) -- Input shifted data.

  • k (kernel object) -- Kernel object.

  • epsilon (float) -- Regularization parameter.

  • evs (int) -- Number of eigenvalues/eigenvectors.

  • operator (str) -- 'K' for Koopman or 'P' for Perron-Frobenius.

  • kind (str) -- 'kernel' for kernel EDMD or 'embedded' for embedded EDMD.

Returns:

  • A (ndarray) -- Eigenfunction coefficients.

  • d (ndarray) -- Eigenvalues.

  • V (ndarray) -- Eigenvectors.

  • C (ndarray) -- Coefficients from the eigenvalue problem.

  • G_0 (ndarray) -- Kernel matrix.

  • G_1 (ndarray) -- Kernel matrix of the shifted data.
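The Gram-matrix construction behind kernel EDMD can be sketched as follows. The helper name `kedmd_sketch` and the regularized-solve formulation are illustrative assumptions, not the library's exact internals:

```python
import numpy as np

def kedmd_sketch(X, Y, k, epsilon=1e-8, evs=3, operator='P'):
    """Sketch of kernel EDMD: Gram matrix G_0[i, j] = k(x_i, x_j) and
    time-lagged Gram matrix G_1[i, j] = k(x_i, y_j); the eigenproblem of
    (G_0 + epsilon I)^{-1} G_1 approximates the chosen operator."""
    m = X.shape[1]
    G0 = np.array([[k(X[:, i], X[:, j]) for j in range(m)] for i in range(m)])
    G1 = np.array([[k(X[:, i], Y[:, j]) for j in range(m)] for i in range(m)])
    if operator == 'K':
        G1 = G1.T
    M = np.linalg.solve(G0 + epsilon * np.eye(m), G1)
    d, V = np.linalg.eig(M)
    idx = np.argsort(-np.abs(d))[:evs]
    return d[idx], V[:, idx]

# linear kernel, linear contraction y = 0.5 x: the leading eigenvalues
# of the regularized eigenproblem approach 0.5
k = lambda a, b: a @ b
rng = np.random.default_rng(3)
X = rng.standard_normal((2, 20))
Y = 0.5 * X
d, V = kedmd_sketch(X, Y, k, evs=2)
```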

klus.algorithms.sindy(X, Y, psi, eps=0.001, iterations=10)[source]

Sparse identification of nonlinear dynamics (SINDy) for the data given by X and Y.

Parameters:
  • psi -- set of basis functions, see d3s.observables

  • eps -- cutoff threshold

  • iterations -- number of sparsification steps

Returns:

coefficient matrix Xi
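The sparsification loop is sequentially thresholded least squares. A self-contained NumPy sketch (the name `sindy_sketch` and the monomial library are illustrative):

```python
import numpy as np

def sindy_sketch(X, Y, psi, eps=1e-3, iterations=10):
    """Sketch of SINDy via sequentially thresholded least squares:
    solve Y ~ Xi psi(X), zero out coefficients below eps, and refit
    the remaining active terms row by row."""
    PsiX = psi(X)
    Xi = Y @ np.linalg.pinv(PsiX)             # initial least-squares fit
    for _ in range(iterations):
        small = np.abs(Xi) < eps
        Xi[small] = 0.0                       # enforce sparsity
        for i in range(Y.shape[0]):
            big = ~small[i]
            if big.any():                     # refit active terms only
                Xi[i, big] = Y[i] @ np.linalg.pinv(PsiX[big])
    return Xi

# recover dx/dt = 2 x - 0.5 x^3 from the monomial library [1, x, x^2, x^3]
psi = lambda X: np.vstack([np.ones_like(X[0]), X[0], X[0]**2, X[0]**3])
X = np.linspace(-2.0, 2.0, 100).reshape(1, -1)
Y = 2.0 * X - 0.5 * X**3
Xi = sindy_sketch(X, Y, psi)                  # Xi -> [[0, 2, 0, -0.5]]
```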

klus.algorithms.kpca(X, k, evs=5)[source]

Kernel PCA.

Parameters:
  • X -- data matrix, each column represents a data point

  • k -- kernel

  • evs -- number of eigenvalues/eigenvectors

Returns:

  • d -- Eigenvalues

  • V -- data X projected onto principal components

  • G -- Gram matrix
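A hedged NumPy sketch of the kernel PCA computation (the name `kpca_sketch` and the explicit centering are illustrative; with a linear kernel this reduces to ordinary PCA, which the example uses as a sanity check):

```python
import numpy as np

def kpca_sketch(X, k, evs=2):
    """Sketch of kernel PCA: eigendecompose the centered Gram matrix
    G_c = H G H, with the centering matrix H = I - (1/m) 11^T."""
    m = X.shape[1]
    G = np.array([[k(X[:, i], X[:, j]) for j in range(m)] for i in range(m)])
    H = np.eye(m) - np.ones((m, m)) / m
    d, V = np.linalg.eigh(H @ G @ H)
    idx = np.argsort(-d)[:evs]
    return d[idx], V[:, idx], G

k = lambda x, y: x @ y                         # linear kernel
rng = np.random.default_rng(2)
X = rng.standard_normal((2, 10))
d, V, G = kpca_sketch(X, k, evs=2)

# sanity check: for the linear kernel, the top Gram eigenvalues equal
# the eigenvalues of the centered scatter matrix Xc Xc^T
Xc = X - X.mean(axis=1, keepdims=True)
scatter_eigs = np.sort(np.linalg.eigvalsh(Xc @ Xc.T))[::-1]
```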

klus.algorithms.kcca(X, Y, k, option='CCA', evs=5, epsilon=1e-06)[source]

Kernel CCA.

Performs kernel CCA in a simplified form for time series; otherwise applies KCCA to two different fields.

Parameters:
  • X -- data matrix, each column represents a data point

  • Y -- time-lagged data, where each column y_i is x_i mapped forward by the dynamical system, or a different data matrix

  • option --

    • 'lagged', assume that the data matrix Y is the time lagged version of X

    • 'CCA', assume X and Y to be independent fields (default)

  • k -- kernel

  • evs -- number of eigenvalues/eigenvectors

  • epsilon -- regularization parameter

Returns:

CCA coefficients

klus.algorithms.cmd(X, Y, evs=5, epsilon=1e-06)[source]

Coherent mode decomposition.

Parameters:
  • X -- data matrix, each column represents a data point

  • Y -- time-lagged data, each column y_i is x_i mapped forward by the dynamical system

  • evs -- number of eigenvalues/eigenvectors

  • epsilon -- regularization parameter

Returns:

eigenvalues and modes xi and eta

klus.algorithms.seba(V, R0=None, maxIter=5000)[source]

Sparse eigenbasis approximation as described in

"Sparse eigenbasis approximation: Multiple feature extraction across spatiotemporal scales with application to coherent set identification" by G. Froyland, C. Rock, and K. Sakellariou.

Based on the original Matlab implementation, see https://github.com/gfroyland/SEBA.

Parameters:
  • V -- eigenvectors

  • R0 -- optional initial rotation

  • maxIter -- maximum number of iterations

Returns:

sparse basis output

TODO: perturb near-constant vectors?

klus.algorithms.kcovedmd(X, Y, k, epsilon=0, evs=5, operator='P')[source]

Kernel Covariance EDMD for the Koopman or Perron-Frobenius operator. The matrices X and Y contain the input data.

Parameters:
  • k -- kernel, see d3s.kernels

  • epsilon -- regularization parameter

  • evs -- number of eigenvalues/eigenvectors

  • operator -- 'K' for Koopman or 'P' for Perron-Frobenius (note that the default is P here)

Returns:

eigenvalues d and eigenfunctions evaluated at the data points X

klus.algorithms.sortEig(A, B=None, evs=5, which='LM')[source]

Computes eigenvalues and eigenvectors of A and sorts them in decreasing lexicographic order. Optionally solves a generalized eigenvalue problem.

Parameters:
  • B -- if not None, solve the generalized eigenvalue problem Av = wBv

  • evs -- Number of eigenvalues/eigenvectors

Returns:

Sorted eigenvalues and eigenvectors

Return type:

(w,V)
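A minimal NumPy sketch of the sorting behaviour (the name `sort_eig` and the inverse-based handling of B are illustrative; the library may use sparse or generalized solvers instead):

```python
import numpy as np

def sort_eig(A, B=None, evs=5):
    """Sketch of sortEig: standard or generalized eigenproblem A v = w B v,
    eigenpairs sorted in decreasing lexicographic order."""
    if B is None:
        w, V = np.linalg.eig(A)
    else:
        w, V = np.linalg.eig(np.linalg.solve(B, A))  # assumes B invertible
    ind = np.argsort(w)[::-1][:evs]  # numpy sorts complex values lexicographically
    return w[ind], V[:, ind]

w, V = sort_eig(np.diag([1.0, 3.0, 2.0]), evs=3)     # w -> [3, 2, 1]
```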

klus.algorithms.FeatureMatrix(x, z, k)[source]

Computes the feature matrix Phi on the support vectors x for the points z and the kernel k.
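One plausible reading, sketched in NumPy (the orientation of Phi, with support vectors indexing the rows, is an assumption):

```python
import numpy as np

def feature_matrix(x, z, k):
    """Sketch: Phi[i, j] = k(x_i, z_j) for support vectors x_i (columns
    of x) and evaluation points z_j (columns of z)."""
    return np.array([[k(x[:, i], z[:, j]) for j in range(z.shape[1])]
                     for i in range(x.shape[1])])

k = lambda a, b: a @ b           # linear kernel as an example
x = np.array([[1.0, 2.0]])       # two 1-D support vectors
z = np.array([[3.0]])            # one evaluation point
Phi = feature_matrix(x, z, k)    # Phi -> [[3], [6]]
```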

klus.algorithms.kgedmd(X, Y, k, epsilon=0, evs=5, operator='P', kind='kernel', cond=1e-15)[source]

Kernel EDMD for the generator of the Koopman or Perron-Frobenius operator. The matrix X contains the data and the matrix Y the estimated derivatives of the data at the training points.

Currently only the Gaussian kernel is supported.

Parameters:
  • X (ndarray) -- Input data.

  • Y (ndarray) -- Input derivative data.

  • k (kernel object) -- Kernel object.

  • epsilon (float) -- Regularization parameter.

  • evs (int) -- Number of eigenvalues/eigenvectors.

  • operator (str) -- 'K' for Koopman or 'P' for Perron-Frobenius.

  • kind (str) -- 'kernel' for kernel EDMD or 'embedded' for embedded EDMD.

  • cond (float) -- Condition number of the kernel matrix.

Returns:

  • A (ndarray) -- Eigenfunction coefficients.

  • d (ndarray) -- Eigenvalues.

  • V (ndarray) -- Eigenvectors.

  • G_0 (ndarray) -- Kernel matrix.

  • G_1 (ndarray) -- Kernel matrix of the derivative data.