Notebook 4 - one s too many in the fit function before the random sampling section #16

Open
waterboy96 opened this issue Feb 10, 2023 · 4 comments · May be fixed by #19
Comments

@waterboy96

Hi,

I was reimplementing the course material as homework for my own 1D use case, and I hit a bug running the fit function with the PyTorch DataLoaders. The fit function implemented right before the sampling chapter uses a variable called preds in the report call, but the predictions are stored in pred during the loop. This gave me inconsistent dimensions; removing the s fixed it.

Cheers

@PiotrCzapla

Out of curiosity, do you have a minimal example that shows the issue?

@waterboy96
Author

Hi @PiotrCzapla

Right before the random sampling section we define fit as:

def fit():
    for epoch in range(epochs):
        for xb,yb in train_dl:
            pred = model(xb)
            loss = loss_func(pred, yb)
            loss.backward()
            opt.step()
            opt.zero_grad()
        report(loss, preds, yb)  # bug: the loop assigns pred, not preds

We assign pred on line 4 of the loop, but we call report on preds in the last line. Most of the other fit functions assign the prediction to preds correctly, such as the first fit function in the same notebook:

def fit():
    for epoch in range(epochs):
        for i in range(0, n, bs):
            s = slice(i, min(n,i+bs))
            xb,yb = x_train[s],y_train[s]
            preds = model(xb)
            loss = loss_func(preds, yb)
            loss.backward()
            with torch.no_grad():
                for p in model.parameters(): p -= p.grad * lr
                model.zero_grad()
        report(loss, preds, yb)
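
For completeness, here is a minimal sketch of the fix I applied locally (just renaming preds to pred in the report call; model, loss_func, opt, train_dl, epochs, and report are the notebook's existing definitions):

def fit():
    for epoch in range(epochs):
        for xb,yb in train_dl:
            pred = model(xb)
            loss = loss_func(pred, yb)
            loss.backward()
            opt.step()
            opt.zero_grad()
        report(loss, pred, yb)  # now matches the name assigned in the loop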

PiotrCzapla added a commit to PiotrCzapla/course22p2 that referenced this issue Mar 3, 2023
PiotrCzapla linked a pull request Mar 3, 2023 that will close this issue
@PiotrCzapla

I see what you mean; it is in 04_minibatch_training, just before the "Random sampling" section of the notebook. :) I made a fix in #19 with github.dev; it is super easy nowadays to make such changes directly in the browser, even in notebooks! :)

Thank you.

@waterboy96
Author

I did not know that. Thanks, I will try it next time!
