All Versions: 26 · Avg Release Cycle: 16 days · Latest Release: 1964 days ago

Changelog History

  • v0.5.4 Changes

    January 06, 2017

    🆕 New Contributors

    💥 Breaking Changes

    • None

    🔋 Features

    • ➕ Add a new datasets module behind a datasets feature flag.
    • ➕ Add new classification scores: precision, recall, and f1.
    • ➕ Add a new Transformer::fit function to allow prefitting of a Transformer before use.

    🐛 Bug Fixes

    • None

    Minor Changes

    • LinRegressor now uses solve instead of inverse for improved accuracy and stability.
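The new classification scores boil down to a few counts over true and predicted labels. A self-contained sketch of the underlying formulas (an illustration only, not rusty-machine's actual API):

```rust
// From-scratch precision, recall, and f1 for binary labels (1 = positive).
fn scores(truth: &[u8], pred: &[u8]) -> (f64, f64, f64) {
    let (mut tp, mut fp, mut fn_) = (0.0, 0.0, 0.0);
    for (&t, &p) in truth.iter().zip(pred.iter()) {
        match (t, p) {
            (1, 1) => tp += 1.0,  // true positive
            (0, 1) => fp += 1.0,  // false positive
            (1, 0) => fn_ += 1.0, // false negative
            _ => {}               // true negatives don't enter these scores
        }
    }
    let precision = tp / (tp + fp);
    let recall = tp / (tp + fn_);
    let f1 = 2.0 * precision * recall / (precision + recall);
    (precision, recall, f1)
}

fn main() {
    let truth = [1, 0, 1, 1, 0, 1, 0, 0];
    let pred = [1, 0, 1, 0, 0, 1, 1, 0];
    let (p, r, f1) = scores(&truth, &pred);
    // 3 true positives, 1 false positive, 1 false negative
    assert!((p - 0.75).abs() < 1e-12);
    assert!((r - 0.75).abs() < 1e-12);
    assert!((f1 - 0.75).abs() < 1e-12);
    println!("precision={p}, recall={r}, f1={f1}");
}
```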
  • v0.5.3 Changes

    December 20, 2016

    💥 Breaking Changes

    • None

    🔋 Features

    • ➕ Adding a new confusion_matrix module.

    🐛 Bug Fixes

    • None

    Minor Changes

    • ⚡️ Updated rulinalg dependency to 0.3.7.
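A confusion matrix simply tallies how often each true class was predicted as each class. A minimal sketch of the idea with a hypothetical helper (the crate's confusion_matrix module may differ in signature):

```rust
// Rows index the true class, columns the predicted class.
fn confusion_matrix(truth: &[usize], pred: &[usize], classes: usize) -> Vec<Vec<u32>> {
    let mut cm = vec![vec![0u32; classes]; classes];
    for (&t, &p) in truth.iter().zip(pred.iter()) {
        cm[t][p] += 1;
    }
    cm
}

fn main() {
    let truth = [0, 1, 2, 1, 0, 2, 2];
    let pred = [0, 1, 1, 1, 0, 2, 0];
    let cm = confusion_matrix(&truth, &pred, 3);
    assert_eq!(cm[0], vec![2, 0, 0]); // both class-0 samples predicted correctly
    assert_eq!(cm[1], vec![0, 2, 0]);
    assert_eq!(cm[2], vec![1, 1, 1]); // class 2 confused once each with 0 and 1
    println!("confusion matrix: {cm:?}");
}
```

The diagonal entries are correct predictions; everything off the diagonal is a misclassification.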
  • v0.5.2 Changes

    November 17, 2016

    🆕 New Contributors

    💥 Breaking Changes

    • None

    🔋 Features

    • None

    🐛 Bug Fixes

    • Regularization constant for GMM is now only added to diagonal.

    Minor Changes

    • ➕ Added more robust Result handling to GMM.
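The bug fix above means the regularization constant touches only the diagonal of each covariance matrix, nudging it towards positive definiteness without distorting the off-diagonal correlation terms. A toy sketch of that idea (not the crate's internals):

```rust
// Add a small constant to the diagonal of a covariance matrix only.
fn regularize(cov: &mut Vec<Vec<f64>>, reg: f64) {
    for i in 0..cov.len() {
        cov[i][i] += reg;
    }
}

fn main() {
    let mut cov = vec![vec![2.0, 0.5], vec![0.5, 1.0]];
    regularize(&mut cov, 1e-3);
    assert!((cov[0][0] - 2.001).abs() < 1e-12);
    assert!((cov[0][1] - 0.5).abs() < 1e-12); // off-diagonal untouched
    println!("regularized covariance: {cov:?}");
}
```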
  • v0.5.1 Changes

    October 03, 2016

    This version includes no changes; it is a version bump due to a crates.io bug.

    👀 See the notes for 0.5.0 below.

  • v0.5.0 Changes

    🚀 This is another fairly large release. Thank you to everyone who contributed!

    🆕 New Contributors

    💥 Breaking Changes

    🔋 Features

    • ➕ Adding RMSProp gradient descent algorithm. #121
    • ➕ Adding cross validation. #125
    • ➕ Adding a new Shuffler transformer. #135

    🐛 Bug Fixes

    • None

    Minor Changes

    • ➕ Adding benchmarks.
    • Initiate GMM with sample covariance of data (instead of identity matrix).
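RMSProp scales each gradient step by a decaying average of squared gradients. A toy sketch of the update rule on a 1-D problem, minimizing f(x) = x² (hyperparameter names here are illustrative, not the crate's):

```rust
// RMSProp: cache <- decay*cache + (1-decay)*grad^2; x <- x - lr*grad/sqrt(cache).
fn rmsprop(mut x: f64, lr: f64, decay: f64, steps: usize) -> f64 {
    let eps = 1e-8; // avoids division by zero early on
    let mut cache = 0.0_f64; // running average of squared gradients
    for _ in 0..steps {
        let grad = 2.0 * x; // gradient of f(x) = x^2
        cache = decay * cache + (1.0 - decay) * grad * grad;
        // step size is normalized by the RMS of recent gradients
        x -= lr * grad / (cache.sqrt() + eps);
    }
    x
}

fn main() {
    let x = rmsprop(5.0, 0.01, 0.9, 2000);
    assert!(x.abs() < 0.1, "RMSProp should settle near the minimum at 0");
    println!("x after 2000 steps: {x:.4}");
}
```

Because the step is normalized, progress per iteration stays roughly at the learning rate regardless of the raw gradient magnitude.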
  • v0.4.4 Changes

    August 12, 2016

    💥 Breaking Changes

    • None

    🔋 Features

    • ➕ Adding a new Transformer trait for data preprocessing.
    • ➕ Adding a MinMax transformer.
    • ➕ Adding a Standardizer transformer.

    Minor Changes

    • None
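A transformer of this kind learns statistics from one dataset, then applies them to any other. A hypothetical minimal MinMax scaler showing that fit/transform split (not the crate's actual types):

```rust
// Learns the min and max from training data, then rescales values into [0, 1].
struct MinMax {
    min: f64,
    max: f64,
}

impl MinMax {
    fn fit(data: &[f64]) -> MinMax {
        let min = data.iter().cloned().fold(f64::INFINITY, f64::min);
        let max = data.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
        MinMax { min, max }
    }

    fn transform(&self, data: &[f64]) -> Vec<f64> {
        data.iter().map(|x| (x - self.min) / (self.max - self.min)).collect()
    }
}

fn main() {
    // fit once on training data...
    let scaler = MinMax::fit(&[2.0, 4.0, 6.0, 10.0]);
    // ...then reuse the learned range on any data
    let scaled = scaler.transform(&[2.0, 6.0, 10.0]);
    assert_eq!(scaled, vec![0.0, 0.5, 1.0]);
    println!("scaled: {scaled:?}");
}
```

Prefitting like this is what the later Transformer::fit function (v0.5.4 above) makes explicit.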
  • v0.4.3 Changes

    July 28, 2016

    🆕 New Contributors

    • 🚀 tafia, who is responsible for all changes in this release.

    💥 Breaking Changes

    • None

    🔋 Features

    • None

    Minor Changes

    • 👯 Made neural nets more efficient by reducing clones and some restructuring.
    • 🐎 Removing unneeded copying in favour of slicing for performance.
    • Using iter_rows in favour of manually iterating over rows by chunks.
  • v0.4.2 Changes

    July 24, 2016

    💥 Breaking Changes

    • None

    🔋 Features

    • None

    Minor Changes

    • 🛠 Fixed a significant bug in the K-Means algorithm: centroids were not updating correctly during the M-step.
  • v0.4.1 Changes

    July 21, 2016

    💥 Breaking Changes

    • None

    🔋 Features

    • ➕ Added an experimental implementation of DBSCAN clustering.

    Minor Changes

    • ➕ Added a new example for K-Means clustering to the repo.
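The experimental DBSCAN added above groups points by density: a point with at least min_pts neighbours within radius eps seeds a cluster, reachable points join it, and everything else is noise. A compact from-scratch sketch of the algorithm (an illustration, not the crate's implementation):

```rust
// Returns one label per point: -1 = noise, >= 0 = cluster id.
fn dbscan(points: &[(f64, f64)], eps: f64, min_pts: usize) -> Vec<i32> {
    let n = points.len();
    let mut labels = vec![-2i32; n]; // -2 = unvisited
    let dist = |a: (f64, f64), b: (f64, f64)| {
        ((a.0 - b.0).powi(2) + (a.1 - b.1).powi(2)).sqrt()
    };
    let neighbors = |i: usize| -> Vec<usize> {
        (0..n).filter(|&j| dist(points[i], points[j]) <= eps).collect()
    };
    let mut cluster = 0i32;
    for i in 0..n {
        if labels[i] != -2 {
            continue; // already assigned or marked as noise
        }
        let seeds = neighbors(i);
        if seeds.len() < min_pts {
            labels[i] = -1; // not a core point: provisionally noise
            continue;
        }
        labels[i] = cluster;
        let mut queue = seeds;
        let mut k = 0;
        while k < queue.len() {
            let j = queue[k];
            k += 1;
            if labels[j] == -1 {
                labels[j] = cluster; // noise reachable from a core point becomes a border point
            }
            if labels[j] != -2 {
                continue;
            }
            labels[j] = cluster;
            let jn = neighbors(j);
            if jn.len() >= min_pts {
                queue.extend(jn); // j is also a core point: keep growing the cluster
            }
        }
        cluster += 1;
    }
    labels
}

fn main() {
    let pts = [
        (0.0, 0.0), (0.1, 0.0), (0.2, 0.0),    // dense group A
        (10.0, 0.0), (10.1, 0.0), (10.2, 0.0), // dense group B
        (50.0, 50.0),                          // isolated outlier
    ];
    let labels = dbscan(&pts, 0.5, 3);
    assert_eq!(labels, vec![0, 0, 0, 1, 1, 1, -1]);
    println!("labels: {labels:?}");
}
```

Unlike K-Means, no cluster count is specified up front, and outliers are labelled as noise rather than forced into a cluster.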
  • v0.4.0 Changes

    July 12, 2016

    🚀 This is the biggest release so far, primarily because the linalg module has been pulled out into its own crate: rulinalg.

    In addition, there have been a number of improvements to the linalg and learning modules in this release.

    💥 Breaking Changes

    • The linalg module was pulled out and replaced by re-exports of rulinalg. All structs are now imported at the linalg level, i.e. linalg::matrix::Matrix -> linalg::Matrix.
    • Decomposition methods now return Result instead of panicking on failure.
    • K-Means now has an Initializer trait, which allows generic initialization algorithms.

    🔋 Features

    • 🆕 New error handling in both the linalg (now rulinalg) and learning modules.
    • 🐛 Bug fixed in eigendecomposition: it can now be used!
    • K-Means can now take a generic initialization algorithm.

    Minor Changes

    • Optimization and code cleanup in the decomposition methods.
    • Some optimization in the K-Means model.
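The shift from panics to Result in the decomposition methods follows the usual Rust error-handling pattern: singular or ill-conditioned input becomes a recoverable Err instead of a crash. A minimal illustration with a hand-rolled 2×2 inverse (a hypothetical helper, not rulinalg's API):

```rust
// Inverts a 2x2 matrix, returning Err for (near-)singular input
// rather than panicking.
fn invert2(m: [[f64; 2]; 2]) -> Result<[[f64; 2]; 2], String> {
    let det = m[0][0] * m[1][1] - m[0][1] * m[1][0];
    if det.abs() < 1e-12 {
        // the caller decides how to handle a singular matrix
        return Err("matrix is singular".to_string());
    }
    Ok([
        [m[1][1] / det, -m[0][1] / det],
        [-m[1][0] / det, m[0][0] / det],
    ])
}

fn main() {
    // singular input: an Err instead of a panic
    assert!(invert2([[1.0, 2.0], [2.0, 4.0]]).is_err());
    // well-conditioned input: Ok with the inverse
    let inv = invert2([[4.0, 0.0], [0.0, 2.0]]).expect("invertible");
    assert!((inv[0][0] - 0.25).abs() < 1e-12);
    assert!((inv[1][1] - 0.5).abs() < 1e-12);
    println!("inverse: {inv:?}");
}
```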