Transform ML models into native code with zero dependencies




License: MIT

m2cgen (Model 2 Code Generator) is a lightweight library which provides an easy way to transpile trained statistical models into a native code (Python, C, Java).

  • Installation
  • Supported Languages
  • Supported Models
  • Classification Output
  • Usage
  • CLI
  • FAQ

Installation

pip install m2cgen

Supported Languages

  • Python
  • Java
  • C

Supported Models

|               | Classification | Regression |
| ------------- | -------------- | ---------- |
| Linear | LogisticRegression, LogisticRegressionCV, RidgeClassifier, RidgeClassifierCV, SGDClassifier, PassiveAggressiveClassifier | LinearRegression, HuberRegressor, ElasticNet, ElasticNetCV, TheilSenRegressor, Lars, LarsCV, Lasso, LassoCV, LassoLars, LassoLarsIC, OrthogonalMatchingPursuit, OrthogonalMatchingPursuitCV, Ridge, RidgeCV, BayesianRidge, ARDRegression, SGDRegressor, PassiveAggressiveRegressor |
| SVM | LinearSVC | LinearSVR |
| Tree | DecisionTreeClassifier, ExtraTreeClassifier | DecisionTreeRegressor, ExtraTreeRegressor |
| Random Forest | RandomForestClassifier, ExtraTreesClassifier | RandomForestRegressor, ExtraTreesRegressor |
| Boosting | XGBClassifier (gbtree/dart booster only), LGBMClassifier (gbdt/dart booster only) | XGBRegressor (gbtree/dart booster only), LGBMRegressor (gbdt/dart booster only) |

Classification Output

|        | Binary | Multiclass | Comment |
| ------ | ------ | ---------- | ------- |
| Linear | Scalar value; signed distance of the sample to the hyperplane for the second class. | Vector value; signed distance of the sample to the hyperplane per each class. | The output is consistent with the output of `LinearClassifierMixin.decision_function`. |
| Tree/Random Forest/XGBoost/LightGBM | Vector value; class probabilities. | Vector value; class probabilities. | The output is consistent with the output of the `predict_proba` method of `DecisionTreeClassifier`/`ForestClassifier`/`XGBClassifier`/`LGBMClassifier`. |
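To make the table above concrete, here is a hand-written sketch of the shape the generated code takes for a linear classifier. The function names and coefficients below are invented for illustration; real generated code embeds the fitted model's parameters:

```python
# Invented coefficients for illustration; real generated code embeds the
# fitted model's parameters.

def score_binary(input):
    # Binary classifier: a single signed distance to the hyperplane.
    return 0.5 * input[0] - 1.2 * input[1] + 0.3

def score_multiclass(input):
    # Multiclass classifier: one signed distance per class.
    return [
        0.5 * input[0] - 1.2 * input[1] + 0.3,
        -0.7 * input[0] + 0.4 * input[1] - 0.1,
        0.2 * input[0] + 0.8 * input[1] + 0.6,
    ]

sample = [1.0, 2.0]
print(score_binary(sample))      # scalar output
print(score_multiclass(sample))  # vector output, one entry per class
```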

Usage

Here's a simple example of how a trained linear model can be represented in Java code:

```python
from sklearn.datasets import load_boston
from sklearn import linear_model
import m2cgen as m2c

boston = load_boston()
X, y = boston.data, boston.target

estimator = linear_model.LinearRegression()
estimator.fit(X, y)

code = m2c.export_to_java(estimator)
```

An example of the generated code:

```java
public class Model {

    public static double score(double[] input) {
        ...
    }
}
```

You can find more examples of generated code for different models/languages here.
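Since the generated Python has no dependencies, it can be loaded and evaluated directly. Below is a minimal sketch, using a hand-written stand-in for the string that `export_to_python` would return (the coefficients are invented for illustration):

```python
# Stand-in for generated Python source; m2c.export_to_python(estimator) would
# return a string of this shape, with the fitted coefficients embedded.
code = """
def score(input):
    return 22.5 + 0.9 * input[0] - 0.5 * input[1]
"""

namespace = {}
exec(code, namespace)  # load the generated scoring function
result = namespace["score"]([1.0, 2.0])
print(result)  # ≈ 22.4
```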

CLI

m2cgen can be used as a CLI tool to generate code using serialized model objects (pickle protocol):

$ m2cgen <pickle_model> --language <language> [--indent <indent>]
  [--class_name <class_name>] [--package_name <package_name>]
  [--recursion-limit <recursion_limit>]

Piping is also supported:

$ cat <pickle_model> | m2cgen --language <language>

FAQ

Q: Generation fails with `RuntimeError: maximum recursion depth exceeded` error.

A: If this error occurs while generating code using an ensemble model, try to reduce the number of trained estimators within that model. Alternatively you can increase the maximum recursion depth with `sys.setrecursionlimit()`.
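For the second option, the limit is raised before calling the export function; the value below is only an illustrative guess and should be tuned for your model:

```python
import sys

# Raise the interpreter's recursion limit before exporting a large ensemble;
# 10000 is an arbitrary illustrative value, tune it for your model.
sys.setrecursionlimit(10000)
print(sys.getrecursionlimit())  # 10000
```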