I posted this originally on StackOverflow, but I now think it fits better here, since it is related to this setup.

I am using the Captum setup:
https://captum.ai/api/concept.html
https://captum.ai/tutorials/TCAV_Image

In the DefaultClassifier, the linear classification boils down to the function sgd_train_linear_model(). When I debug the code up to sklearn_model.fit(x, y, sample_weight=w, **fit_kwargs) at line 327, the variable "w" is empty.

But then at line 338:
```python
# Convert weights to pytorch
classes = (
    torch.IntTensor(sklearn_model.classes_)
    if hasattr(sklearn_model, "classes_")
    else None
)
```
It can suddenly derive the weights from the model. How so? Help me understand.

So far I have tried debugging and reading the docs.
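To make the question concrete, here is a minimal standalone sketch of the pattern I mean. It uses a plain scikit-learn SGDClassifier on made-up toy data, not Captum's actual training loop, and the variable names (x, y, w, sklearn_model) just mirror the ones in sgd_train_linear_model():

```python
import numpy as np
import torch
from sklearn.linear_model import SGDClassifier

# Toy data standing in for the concept activations that Captum collects.
x = np.random.randn(20, 4)
y = np.array([0, 1] * 10)
w = None  # corresponds to the empty sample_weight "w" I see in the debugger

sklearn_model = SGDClassifier(alpha=0.01, max_iter=1000, tol=1e-3)
sklearn_model.fit(x, y, sample_weight=w)  # same call shape as line 327

# After fit() the model exposes classes_ and coef_, which is what the
# conversion to PyTorch tensors around line 338 picks up.
classes = (
    torch.IntTensor(sklearn_model.classes_)
    if hasattr(sklearn_model, "classes_")
    else None
)
weights = torch.FloatTensor(sklearn_model.coef_)
print(classes, weights.shape)
```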