mlpack v3.3.0 Release Notes

Release Date: 2020-04-07

  • Released April 7th, 2020.

  • Templated the return type of the Forward() function of loss functions (#2339).

  • Added R2 score regression metric (#2323).
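For reference, the R2 score is the standard coefficient of determination, 1 − SSres/SStot. A minimal plain-Python sketch of the formula (illustrative only, not mlpack's implementation, which operates on Armadillo matrices):

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

A perfect fit scores 1.0; predicting the mean everywhere scores 0.0.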

  • Added mean squared logarithmic error loss function for neural networks (#2210).

  • Added mean bias loss function for neural networks (#2210).
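Both losses have simple closed forms: MSLE averages the squared differences of log(1 + y), and mean bias averages the signed error. A plain-Python sketch of the two formulas (the sign convention for the mean bias is an assumption here; check mlpack's source for the exact definition):

```python
import math

def msle(y_true, y_pred):
    # Mean squared logarithmic error: mean((log(1+t) - log(1+p))^2).
    return sum((math.log1p(t) - math.log1p(p)) ** 2
               for t, p in zip(y_true, y_pred)) / len(y_true)

def mean_bias_error(y_true, y_pred):
    # Mean bias: mean(t - p); positive when predictions run low on average.
    return sum(t - p for t, p in zip(y_true, y_pred)) / len(y_true)
```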

  • The DecisionStump class has been marked deprecated; use the DecisionTree class with NoRecursion=true or use ID3DecisionStump instead (#2099).

  • Added a probabilities_file parameter to get the probabilities matrix of the AdaBoost classifier (#2050).

  • Fix STB header search paths (#2104).

  • Add DISABLE_DOWNLOADS CMake configuration option (#2104).

  • Add padding layer in TransposedConvolutionLayer (#2082).

  • Fix pkgconfig generation on non-Linux systems (#2101).

  • Use log-space to represent HMM initial state and transition probabilities (#2081).
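Working in log space avoids the numeric underflow that hits long observation sequences when many probabilities below 1 are multiplied together. One forward-algorithm step in log space, sketched in plain Python (illustrative only, not mlpack's HMM code):

```python
import math

def logsumexp(xs):
    # Stable log(sum(exp(x))): factor out the maximum first.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_step(log_alpha, log_trans, log_emit):
    # log alpha_t(j) = log b_j(o_t)
    #                  + logsumexp_i(log alpha_{t-1}(i) + log a_ij)
    n = len(log_alpha)
    return [log_emit[j] +
            logsumexp([log_alpha[i] + log_trans[i][j] for i in range(n)])
            for j in range(n)]
```

Sums of logs stay in a comfortable floating-point range where products of raw probabilities would flush to zero.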

  • Add functions to access parameters of Convolution and AtrousConvolution layers (#1985).

  • Add a ComputeError() function to LARS regression, and change the Train() function to return the computed error (#2139).

  • Add Julia bindings (#1949). Build settings can be controlled with the BUILD_JULIA_BINDINGS=(ON/OFF) and JULIA_EXECUTABLE=/path/to/julia CMake parameters.
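Putting the CMake options from these notes together, a configure step for a build with Julia bindings might look like the following (the Julia path is a placeholder; DISABLE_DOWNLOADS is the option added in #2104 above):

```sh
# From a build directory inside the mlpack source tree:
cmake -DBUILD_JULIA_BINDINGS=ON \
      -DJULIA_EXECUTABLE=/path/to/julia \
      -DDISABLE_DOWNLOADS=OFF \
      ..
```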

  • CMake fix for finding the STB include directory (#2145).

  • Add bindings for loading and saving images (#2019): mlpack_image_converter from the command line, mlpack.image_converter() from Python.

  • Add normalization support for the CF binding (#2136).

  • Add Mish activation function (#2158).

  • Update init_rules in AMF to allow users to merge two initialization rules (#2151).

  • Add GELU activation function (#2183).

  • Better error handling of eigendecompositions and Cholesky decompositions (#2088, #1840).

  • Add LiSHT activation function (#2182).
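The three activations added above all have closed-form definitions: Mish(x) = x·tanh(softplus(x)), GELU(x) = x·Φ(x) with Φ the standard normal CDF, and LiSHT(x) = x·tanh(x). A plain-Python sketch of the math (mlpack's layers apply these element-wise to Armadillo matrices):

```python
import math

def softplus(x):
    return math.log1p(math.exp(x))

def mish(x):
    # Mish(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))

def gelu(x):
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lisht(x):
    # LiSHT(x) = x * tanh(x); non-negative and symmetric about zero.
    return x * math.tanh(x)
```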

  • Add Valid and Same padding for the Transposed Convolution layer (#2163).

  • Add CELU activation function (#2191).

  • Add log-hyperbolic-cosine loss function (#2207).
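For reference, CELU(x) = max(0, x) + min(0, α(exp(x/α) − 1)), a continuously differentiable variant of ELU, and the log-hyperbolic-cosine loss sums log(cosh(error)), behaving quadratically near zero and linearly for large errors. A plain-Python sketch of both formulas:

```python
import math

def celu(x, alpha=1.0):
    # CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

def log_cosh_loss(y_true, y_pred):
    # Smooth loss: ~ error^2 / 2 near zero, ~ |error| - log(2) far away.
    return sum(math.log(math.cosh(p - t)) for t, p in zip(y_true, y_pred))
```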

  • Change neural network types to avoid unnecessary use of rvalue references (#2259).

  • Bump minimum Boost version to 1.58 (#2305).

  • Refactor STB support so the HAS_STB macro is not needed when compiling against mlpack (#2312).

  • Add Hard Shrink activation function (#2186).

  • Add Soft Shrink activation function (#2174).
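Hard Shrink and Soft Shrink are threshold activations: Hard Shrink zeroes inputs inside [−λ, λ] and passes everything else through unchanged, while Soft Shrink additionally shrinks surviving inputs toward zero by λ. Sketched in plain Python (λ = 0.5 is just an illustrative default here):

```python
def hard_shrink(x, lam=0.5):
    # x outside [-lam, lam], zero inside.
    return x if abs(x) > lam else 0.0

def soft_shrink(x, lam=0.5):
    # Shrink toward zero by lam; zero inside [-lam, lam].
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0
```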

  • Add Hinge Embedding loss function (#2229).

  • Add Cosine Embedding loss function (#2209).

  • Add Margin Ranking loss function (#2264).
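These three losses all compare pairs of inputs under a label y ∈ {+1, −1}. The sketches below follow the common per-pair definitions (as popularized by PyTorch); mlpack's exact conventions for margins and reductions may differ, so treat this as an assumption:

```python
def hinge_embedding_loss(x, y, margin=1.0):
    # Similar pairs (y = +1) pay the distance x; dissimilar pairs
    # (y = -1) pay only the shortfall below the margin.
    return x if y == 1 else max(0.0, margin - x)

def cosine_embedding_loss(a, b, y, margin=0.0):
    # Push cosine similarity up for y = +1, below the margin for y = -1.
    dot = sum(p * q for p, q in zip(a, b))
    cos = dot / ((sum(p * p for p in a) ** 0.5) *
                 (sum(q * q for q in b) ** 0.5))
    return 1.0 - cos if y == 1 else max(0.0, cos - margin)

def margin_ranking_loss(x1, x2, y, margin=0.0):
    # y = +1 asks x1 to rank above x2 by at least the margin.
    return max(0.0, -y * (x1 - x2) + margin)
```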

  • Bugfix for incorrect parameter vector sizes in logistic regression and softmax regression (#2359).