
Predicting probabilities instead of classes #4

Open
srimalj opened this issue Mar 11, 2016 · 3 comments

@srimalj commented Mar 11, 2016

Hi

I recently needed to predict the class probabilities instead of the class labels.

So I wrote a predict_proba() method, sticking to the convention used in other scikit-learn classifiers.

I added the following method, which simply considers the exponential ratios of the decision_function values, to the GenELMClassifier class in the elm.py module:

def predict_proba(self, X):
    """Predict probability values using the model.

    Considers the exponent of the decision_function values.

    Parameters
    ----------
    X : {array-like, sparse matrix} of shape [n_samples, n_features]

    Returns
    -------
    C : numpy array of shape [n_samples, n_outputs]
        Predicted values.
    """
    raw_predictions = np.exp(self.decision_function(X))
    probabilities = np.zeros(raw_predictions.shape)
    rows, cols = raw_predictions.shape
    for row in range(rows):
        total = sum(raw_predictions[row, :])
        probabilities[row, :] = raw_predictions[row, :] / total
    return probabilities


I'm not overly familiar with ELMs, but if you think the above is correct, feel free to add it. Alternatively, I would be happy to contribute the code to the project myself.
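
For reference, here is an equivalent vectorized sketch of the same idea: a plain softmax over the decision_function outputs, with the usual subtract-the-row-maximum trick for numerical stability. It assumes decision_function returns a 2-D array with one column per class.

import numpy as np

def predict_proba(self, X):
    """Softmax over the decision_function outputs (vectorized sketch)."""
    scores = self.decision_function(X)
    # Subtract each row's maximum before exponentiating to avoid overflow;
    # this does not change the resulting ratios.
    scores = scores - scores.max(axis=1, keepdims=True)
    raw_predictions = np.exp(scores)
    # Normalize each row so its entries sum to 1.
    return raw_predictions / raw_predictions.sum(axis=1, keepdims=True)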

@hahnicity commented May 6, 2017

So I was able to get around this for classification problems by hacking the elm class. If you look, there is a method called decision_function on the class. If you use it natively you'll get an array of numbers ranging from -1 to 1, so it isn't usable on its own, but you can modify and extend the class to get it to work. Here's how sklearn does it with the MLPClassifier, here's how predict is actually implemented, and here's how you would do it with the elm:


from sklearn.preprocessing import LabelBinarizer

class ELMWrapper(ELMClassifier):
    def predict_proba(self, x):
        # With a 0/1 LabelBinarizer, the decision function values
        # already lie roughly in the [0, 1] range.
        return self.decision_function(x)

# Use a 0/1 LabelBinarizer instead of the default -1/1 binarizer.
elm = ELMWrapper(binarizer=LabelBinarizer())

It's important to set up the LabelBinarizer this way; otherwise your prediction probabilities will run from -1 to 1. I imagine that if you wanted to keep the native binarizer you could rescale the outputs inside the predict_proba method instead, but I'm lazy about code and this seems to work just fine for me.
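
For completeness, here is a minimal end-to-end sketch on random toy data. The from elm import ELMClassifier path and relying on ELMClassifier's default constructor arguments are assumptions on my part, so adjust to your setup:

import numpy as np
from sklearn.preprocessing import LabelBinarizer
from elm import ELMClassifier  # assumed import path for this repo's classifier

class ELMWrapper(ELMClassifier):
    def predict_proba(self, x):
        return self.decision_function(x)

# Toy data: 100 samples, 5 features, binary labels.
X = np.random.rand(100, 5)
y = np.random.randint(0, 2, size=100)

elm = ELMWrapper(binarizer=LabelBinarizer())
elm.fit(X, y)
probs = elm.predict_proba(X)  # roughly in [0, 1] with the 0/1 binarizer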

@renatoosousa

Does it work? Please share with us :)

@gsk1692 commented Dec 16, 2017

Hello,
Has this worked? Which is the correct one? Those who have tried, please let us know...
Is there any other predict_proba method for ELM?
