At BigML we continue in our endeavour to make ML services available in any language. Now it is time for node.js to join the club. BigML’s API is a good fit for node.js for two reasons:
- BigML is cloud-based, so interaction with resources is already done through asynchronous HTTPS connections.
- BigML’s resources evolve through different states when created or updated until they finally reach a finished or faulty end state. Callbacks are a natural way to handle such situations, as the sketch after this list illustrates.
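To make that concrete, here is a minimal sketch of the node-style, error-first callback that the examples below pass to the bindings; the exact contents of the resource argument are an assumption here, so check the library documentation for the full structure.
function myCallback(error, resource) {
  // node-style, error-first: error is null when the call succeeded
  if (error) {
    console.error(error);
    return;
  }
  // resource holds the created or updated BigML resource
  console.log(resource);
}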
Taking advantage of that, we present a library that will help you add all BigML’s capabilities to your node.js projects.
You can download the library from GitHub, or simply install it using npm:
npm install bigml
From then on, just by including it in your project
var bigml = require('bigml');
you’ll have full control of your sources, datasets, models, ensembles, predictions and evaluations. Each one has a corresponding class
bigml.Source, bigml.Dataset, bigml.Model, bigml.Ensemble, bigml.Prediction, bigml.Evaluation
that handles its CRUD operations and can also list the existing resources for you. Let’s see a source creation example:
var bigml = require('bigml');
var source = new bigml.Source();
source.create('./data/iris.csv', myCallback);
In these three lines you create a source in BigML with the contents of your ./data/iris.csv file, and myCallback will receive the created object for further processing. If you provide no callback, a default one that just prints the resource to stdout is used. All the aforementioned objects are accessible in a similar way for creation, retrieval, update and delete handling. You can check the library documentation for more details.
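As a hedged sketch of the rest of that lifecycle, the snippet below retrieves a source, updates it, deletes it and lists your existing sources. The method names simply mirror the CRUD naming above, and both the exact signatures and the source id are illustrative assumptions, so check the library documentation before relying on them.
var bigml = require('bigml');
var source = new bigml.Source();
// retrieve a previously created source (hypothetical id)
source.get('source/51922d0b37203f2a8c000001', function (error, resource) {
  console.log(resource);
});
// rename it and, once updated, delete it
source.update('source/51922d0b37203f2a8c000001', {name: 'my iris source'},
              function (error, resource) {
  source.delete('source/51922d0b37203f2a8c000001', function (error, result) {
    console.log(result);
  });
});
// list your existing sources
source.list(function (error, list) {
  console.log(list);
});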
Furthermore, we’ve added two classes that will help you build local predictions from your existing BigML models or ensembles:
bigml.LocalModel and bigml.LocalEnsemble
With them, you’ll be able to retrieve your models’ information and use it to predict locally on your computer, with no latency at all. For example, to predict with a previously created model, all you need to provide is its id:
var bigml = require('bigml');
var localModel = new bigml.LocalModel('model/51922d0b37203f2a8c000010');
localModel.predict({'petal length': 1},
                   function(error, prediction) {console.log(prediction)});
The LocalModel instance will handle the retrieval of the model information and will start predicting the moment it becomes available on your computer. The same procedure can be applied to ensembles:
var bigml = require('bigml');
var localEnsemble = new bigml.LocalEnsemble('ensemble/51901f4337203f3a9a000215');
localEnsemble.predict({'petal length': 1}, 0,
                      function(error, prediction) {console.log(prediction)});
where, as you can see, the first two arguments of the predict method are the input data and a numeric code that selects the combination method (0 for plurality, 1 for confidence weighted and 2 for probability weighted). Similarly, you can predict with a list of models:
var bigml = require('bigml');
var localEnsemble = new bigml.LocalEnsemble([
    'model/51bb69b437203f02b50004ce',
    'model/51bb69b437203f02b50004d0']);
localEnsemble.predict({'petal length': 1}, 0,
                      function(error, prediction) {console.log(prediction)});
As you’ve probably noticed, the predict methods in these examples take a callback as the last argument. The same methods can also be used synchronously, once the information of the models or ensembles has been downloaded to your computer. You can check the library documentation to learn more about that.
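For instance, a minimal sketch of the synchronous flavour could look like the snippet below; the 'ready' event used here to detect that the model information is already in memory is an assumption, so check the library documentation for the exact mechanism.
var bigml = require('bigml');
var localModel = new bigml.LocalModel('model/51922d0b37203f2a8c000010');
localModel.on('ready', function () {
  // no callback: the prediction is returned synchronously
  var prediction = localModel.predict({'petal length': 1});
  console.log(prediction);
});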
By now, we have quite a large collection of bindings for BigML’s API. We hope these will help the node.js community add the power of BigML’s Machine Learning service to their projects. If you feel you can contribute in any way to any of these bindings, let us know! We’re always glad to receive feedback. And if you’re getting serious about implementing something on top of BigML, use the NODEJS coupon code when signing up for any of our subscription plans. The first 25 users will get a 25% discount!