
Changelog History

  • v4.0.0

    ????-??-??
    • Add Multiple Pole Balancing Environment (#1901, #1951).
    • Added functionality for scaling of data (#1876); see the command-line binding mlpack_preprocess_scale or Python binding preprocess_scale().
    • Add new parameter maximum_depth to decision tree and random forest bindings (#1916).
    • Fix prediction output of softmax regression when test set accuracy is calculated (#1922).
    • Action struct in continuous RL environments now stores the action as a double instead of double[1] (#1941, #1931).
    • Pendulum environment now checks for termination. All RL environments now have an option to terminate after a set number of time steps (no limit by default) (#1941).
    • Add support for probabilistic KDE (kernel density estimation) error bounds when using the Gaussian kernel (#1934).
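The probabilistic error-bound item above concerns kernel density estimation with the Gaussian kernel. As a rough illustration of the quantity a Gaussian-kernel KDE computes (a plain-Python sketch of the textbook estimator, not mlpack's implementation):

```python
import math

def gaussian_kde(x, samples, bandwidth=1.0):
    """Kernel density estimate at point x from 1-D samples,
    using the Gaussian kernel K(u) = exp(-u^2 / 2) / sqrt(2*pi)."""
    n = len(samples)
    total = sum(math.exp(-((x - s) / bandwidth) ** 2 / 2) for s in samples)
    return total / (n * bandwidth * math.sqrt(2 * math.pi))

# Density estimate at 0.0 from three sample points:
print(gaussian_kde(0.0, [-1.0, 0.0, 1.0]))
```

mlpack's KDE additionally prunes tree nodes whose contribution provably falls within a relative/absolute error tolerance; the probabilistic bounds in #1934 let that tolerance hold with a user-chosen probability rather than deterministically.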

  • v3.3.1

    April 30, 2020

    🚀 Released April 29th, 2020.

    • 📚 Minor Julia and Python documentation fixes (#2373).
    • ⚡️ Updated terminal state and fixed bugs for the Pendulum environment (#2354, #2369).
    • ➕ Added ELiSH activation function (#2323).
    • ➕ Add L1 loss function (#2203).
    • Pass CMAKE_CXX_FLAGS (compilation options) correctly to the Python build (#2367).
    • 🔦 Expose ensmallen Callbacks for sparse autoencoder (#2198).
    • 🛠 Bugfix for LARS class causing an invalid read (#2374).
    • ➕ Add serialization support from Julia; use mlpack.serialize() and mlpack.deserialize() to save and load from IOBuffers.
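The L1 loss added in #2203 is the mean of absolute errors between predictions and targets. A minimal plain-Python sketch of the quantity it measures (not mlpack's layer API):

```python
def l1_loss(predictions, targets):
    """Mean absolute error between two equal-length sequences."""
    assert len(predictions) == len(targets)
    return sum(abs(p - t) for p, t in zip(predictions, targets)) / len(predictions)

print(l1_loss([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))  # mean of 0.5, 0.0, 1.0 = 0.5
```

Compared to squared error, the L1 loss penalizes large residuals linearly, which makes it less sensitive to outliers.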

  • v3.3.0

    April 07, 2020

    🚀 Released April 7th, 2020.

    • Templated the return type of the Forward() function of loss functions (#2339).
    • ➕ Added R2 Score regression metric (#2323).
    • ➕ Added mean squared logarithmic error loss function for neural networks (#2210).
    • ➕ Added mean bias loss function for neural networks (#2210).
    • 🗄 The DecisionStump class has been marked deprecated; use the DecisionTree class with NoRecursion=true or use ID3DecisionStump instead (#2099).
    • ➕ Added probabilities_file parameter to get the probabilities matrix of the AdaBoost classifier (#2050).
    • 🛠 Fix STB header search paths (#2104).
    • ➕ Add DISABLE_DOWNLOADS CMake configuration option (#2104).
    • ➕ Add padding layer in TransposedConvolutionLayer (#2082).
    • 🛠 Fix pkgconfig generation on non-Linux systems (#2101).
    • 🌲 Use log-space to represent HMM initial state and transition probabilities (#2081).
    • ➕ Add functions to access parameters of Convolution and AtrousConvolution layers (#1985).
    • ➕ Add ComputeError() function to LARS regression and change Train() to return the computed error (#2139).
    • Add Julia bindings (#1949). Build settings can be controlled with the BUILD_JULIA_BINDINGS=(ON/OFF) and JULIA_EXECUTABLE=/path/to/julia CMake parameters.
    • CMake fix for finding the STB include directory (#2145).
    • Add bindings for loading and saving images (#2019); mlpack_image_converter from the command line, mlpack.image_converter() from Python.
    • ➕ Add normalization support for the CF binding (#2136).
    • ➕ Add Mish activation function (#2158).
    • ⚡️ Update init_rules in AMF to allow users to merge two initialization rules (#2151).
    • ➕ Add GELU activation function (#2183).
    • 👍 Better error handling of eigendecompositions and Cholesky decompositions (#2088, #1840).
    • ➕ Add LiSHT activation function (#2182).
    • ➕ Add Valid and Same Padding for Transposed Convolution layer (#2163).
    • ➕ Add CELU activation function (#2191).
    • ➕ Add Log-Hyperbolic-Cosine Loss function (#2207).
    • 🔄 Change neural network types to avoid unnecessary use of rvalue references (#2259).
    • ⬆️ Bump minimum Boost version to 1.58 (#2305).
    • ♻️ Refactor STB support so the HAS_STB macro is not needed when compiling against mlpack (#2312).
    • ➕ Add Hard Shrink activation function (#2186).
    • ➕ Add Soft Shrink activation function (#2174).
    • ➕ Add Hinge Embedding loss function (#2229).
    • ➕ Add Cosine Embedding loss function (#2209).
    • ➕ Add Margin Ranking loss function (#2264).
    • 🛠 Bugfix for incorrect parameter vector sizes in logistic regression and softmax regression (#2359).
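The log-hyperbolic-cosine loss added in #2207 behaves like squared error near zero and like absolute error for large residuals. A hedged plain-Python sketch of the formula it sums (not mlpack's implementation), using the numerically stable identity log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2):

```python
import math

def log_cosh_loss(predictions, targets):
    """Sum of log(cosh(prediction - target)) over all elements,
    computed in a form that does not overflow for large residuals."""
    total = 0.0
    for p, t in zip(predictions, targets):
        x = abs(p - t)
        total += x + math.log1p(math.exp(-2.0 * x)) - math.log(2.0)
    return total

print(log_cosh_loss([0.0, 1.0], [0.0, 0.0]))
```

The naive form math.log(math.cosh(x)) overflows once x exceeds roughly 710, which is why the rewritten identity is preferred.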

  • v3.2.2

    November 26, 2019

    🚀 Released Nov. 26th, 2019.

    • ➕ Add valid and same padding option in Convolution and Atrous Convolution layer (#1988).
    • ➕ Add Model() to the FFN class to access individual layers (#2043).
    • 📚 Update documentation for pip and conda installation packages (#2044).
    • Add bindings for linear SVM (#1935); mlpack_linear_svm from the command-line, linear_svm() from Python.
    • ➕ Add support to return the layer name as std::string (#1987).
    • Speed and memory improvements for the Transposed Convolution layer (#1493).
    • 🛠 Fix Windows Python build configuration (#1885).
    • Validate md5 of STB library after download (#2087).
    • 🔖 Add __version__ to __init__.py (#2092).
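The md5 validation added in #2087 is a standard download-integrity check: hash the downloaded bytes and compare against a known digest. A minimal sketch of the idea using Python's hashlib (the empty-string digest below is a well-known constant; STB's real checksum lives in mlpack's build scripts):

```python
import hashlib

def md5_matches(data: bytes, expected_hex: str) -> bool:
    """Return True if the MD5 digest of data equals the expected hex string."""
    return hashlib.md5(data).hexdigest() == expected_hex

# MD5 of the empty byte string:
print(md5_matches(b"", "d41d8cd98f00b204e9800998ecf8427e"))  # True
```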
  • v3.2.1

    October 01, 2019
  • v3.2.0

    September 26, 2019

    🚀 Released Sept. 25, 2019.

    • 🛠 Fix occasionally-failing RADICAL test (#1924).
    • 🛠 Fix gcc 9 OpenMP compilation issue (#1970).
    • ➕ Added support for loading and saving of images (#1903).
    • ➕ Add Multiple Pole Balancing Environment (#1901, #1951).
    • Added functionality for scaling of data (#1876); see the command-line binding mlpack_preprocess_scale or Python binding preprocess_scale().
    • ➕ Add new parameter maximum_depth to decision tree and random forest bindings (#1916).
    • 🛠 Fix prediction output of softmax regression when test set accuracy is calculated (#1922).
    • 0️⃣ Pendulum environment now checks for termination. All RL environments now have an option to terminate after a set number of time steps (no limit by default) (#1941).
    • ➕ Add support for probabilistic KDE (kernel density estimation) error bounds when using the Gaussian kernel (#1934).
    • 🛠 Fix negative distances for cover tree computation (#1979).
    • 🛠 Fix cover tree building when all pairwise distances are 0 (#1986).
    • 👌 Improve KDE pruning by reclaiming unused error tolerance (#1954, #1984).
    • 📜 Optimizations for sparse matrix accesses in z-score normalization for CF (#1989).
    • Add kmeans_max_iterations option to GMM training binding gmm_train_main.
    • ⬆️ Bump minimum Armadillo version to 8.400.0 due to ensmallen dependency requirement (#2015).
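Two items above revolve around standardization: the data scaling functionality in #1876 and z-score normalization for CF in #1989. A plain-Python sketch of the z-score transform those features apply (not the preprocess_scale API itself, which also supports other scaler methods):

```python
import math

def zscore(values):
    """Standardize values to zero mean and unit (population) variance:
    z_i = (x_i - mean) / std."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    return [(v - mean) / std for v in values]

print(zscore([1.0, 2.0, 3.0]))
```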

  • v3.1.1

    May 27, 2019

    🚀 Released May 26, 2019.

    • 🛠 Fix random forest bug for numerical-only data (#1887).
    • Significant speedups for random forest (#1887).
    • Random forest now has minimum_gain_split and subspace_dim parameters (#1887).
    • Decision tree parameter print_training_error deprecated in favor of print_training_accuracy.
    • 🗄 The output option changed to predictions for the adaboost and perceptron bindings. Old options are now deprecated and will be preserved until mlpack 4.0.0 (#1882).
    • Concatenated ReLU layer (#1843).
    • Accelerate NormalizeLabels function using hashing instead of linear search (see src/mlpack/core/data/normalize_labels_impl.hpp) (#1780).
    • ➕ Add ConfusionMatrix() function for checking performance of classifiers (#1798).
    • 🏗 Install ensmallen headers when it is downloaded during build (#1900).
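The NormalizeLabels speedup in #1780 replaces a linear scan over already-seen labels with a hash lookup. A plain-Python sketch of the mapping it builds (mlpack's C++ version lives in src/mlpack/core/data/normalize_labels_impl.hpp):

```python
def normalize_labels(labels):
    """Map arbitrary labels to contiguous integers 0..k-1.
    A dict gives O(1) expected lookup per label instead of a
    linear search through the labels seen so far."""
    mapping = {}
    normalized = []
    for label in labels:
        if label not in mapping:
            mapping[label] = len(mapping)
        normalized.append(mapping[label])
    return normalized, mapping

print(normalize_labels(["cat", "dog", "cat", "bird"]))
# ([0, 1, 0, 2], {'cat': 0, 'dog': 1, 'bird': 2})
```

Over n points with k distinct labels this is O(n) expected time rather than the O(nk) of repeated linear search.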
  • v3.1.0

    April 26, 2019

    🚀 Released April 25, 2019.
    🚀 Release email

    • ➕ Add DiagonalGaussianDistribution and DiagonalGMM classes to speed up the diagonal covariance computation and deprecate DiagonalConstraint (#1666).
    • ➕ Add kernel density estimation (KDE) implementation with bindings to other languages (#1301).
    • Where relevant, all models with a Train() method now return a double value representing the goodness of fit (i.e., final objective value, error, etc.) (#1678).
    • ➕ Add implementation for linear support vector machine (see src/mlpack/methods/linear_svm).
    • 🔄 Change DBSCAN to use PointSelectionPolicy and add OrderedPointSelection (#1625).
    • 👍 Residual block support (#1594).
    • Bidirectional RNN (#1626).
    • Dice loss layer (#1674, #1714) and hard sigmoid layer (#1776).
    • The output option changed to predictions and output_probabilities to probabilities for the Naive Bayes binding (mlpack_nbc/nbc()). Old options are now deprecated and will be preserved until mlpack 4.0.0 (#1616).
    • ➕ Add support for Diagonal GMMs to HMM code (#1658, #1666). This can provide a large speedup when a diagonal GMM is acceptable as an emission probability distribution.
    • Python binding improvements: check parameter type (#1717), avoid copying Pandas dataframes (#1711), handle Pandas Series objects (#1700).
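The DiagonalGaussianDistribution speedup in #1666 comes from the fact that with a diagonal covariance the multivariate density factors into independent per-dimension terms, so no matrix inverse or determinant is needed. A hedged plain-Python sketch of the log-density (an illustration of the math, not mlpack's class):

```python
import math

def diag_gaussian_log_density(x, mean, variances):
    """Log-density of a Gaussian with diagonal covariance: the pdf
    factors into 1-D normals, one per dimension, so the cost is
    O(d) instead of the O(d^3) of a full-covariance solve."""
    total = 0.0
    for xi, mi, vi in zip(x, mean, variances):
        total += -0.5 * (math.log(2 * math.pi * vi) + (xi - mi) ** 2 / vi)
    return total

# 1-D standard normal evaluated at its mean: log(1 / sqrt(2*pi))
print(diag_gaussian_log_density([0.0], [0.0], [1.0]))
```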

  • v3.0.4

    November 13, 2018

    🚀 Released November 13, 2018.

    • ⬆️ Bump minimum CMake version to 3.3.2.
    • 🛠 CMake fixes for Ninja generator by Marc Espie (#1550, #1537, #1523).
    • More efficient linear regression implementation (#1500).
    • 🛠 Serialization fixes for neural networks (#1508, #1535).
    • Mean shift now allows single-point clusters (#1536).
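The more efficient linear regression implementation in #1500 still solves the same ordinary least squares problem. As a reminder of what that computes, a closed-form 1-D sketch in plain Python (unrelated to mlpack's actual code, which solves the multivariate problem with Armadillo):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, closed form:
    b = cov(x, y) / var(x), a = mean(y) - b * mean(x)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

print(fit_line([0.0, 1.0, 2.0], [1.0, 3.0, 5.0]))  # (1.0, 2.0)
```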
  • v3.0.3

    July 29, 2018

    🚀 Released July 27th, 2018.

    • 🛠 Fix Visual Studio compilation issue (#1443).
    • Allow running local_coordinate_coding binding with no initial_dictionary parameter when input_model is not specified (#1457).
    • 🔧 Make use of OpenMP optional via the CMake USE_OPENMP configuration variable (#1474).
    • Accelerate FNN training by 20-30% by avoiding redundant calculations (#1467).
    • 🛠 Fix math::RandomSeed() usage in tests (#1462, #1440).
    • 📚 Generate better Python setup.py with documentation (#1460).
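The math::RandomSeed() fix in #1462/#1440 concerns making randomized tests reproducible. The underlying idea, sketched with Python's stdlib RNG rather than mlpack's C++ one: seeding the generator fixes the whole sequence of draws, so a test that seeds first always sees the same "random" data.

```python
import random

def sample_with_seed(seed, n=3):
    """Draw n pseudo-random ints; the same seed yields the same draws."""
    rng = random.Random(seed)
    return [rng.randint(0, 100) for _ in range(n)]

# Same seed, same sequence:
print(sample_with_seed(42) == sample_with_seed(42))  # True
```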