1. Good question. We use our own copy of OpenCLIP and customize the setup to match OpenAI CLIP (which had not been done before).
2. We aim for controlled experiments against OpenAI CLIP, so that all gains come from data ONLY; this enables a fair comparison on data, and on all future data algorithms.
All existing non-OpenAI works change more than data alone: bigger batch size, different model, different learning rate. These belong to the CLIP "system", which can always be combined with the latest NN techniques or hardware; they are not really specific to CLIP itself (e.g., changing the activation function applies to every Transformer).
BTW, we noticed slightly better accuracy with QuickGELU over GELU in the second half of training (not initially); so we suspect the claimed benefit of GELU is not fully verified, but rather ad hoc.
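For reference, the two activations differ only slightly; a minimal sketch of the standard definitions (QuickGELU is the sigmoid-based approximation used in OpenAI CLIP; function names here are illustrative, not tied to any particular codebase):

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def quick_gelu(x: float) -> float:
    """QuickGELU, the approximation used in OpenAI CLIP: x * sigmoid(1.702 * x)."""
    return x / (1.0 + math.exp(-1.702 * x))

# The two curves are close but not identical; this small gap is the
# only difference between the two training setups discussed above.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:+.1f}  gelu={gelu(x):+.4f}  quick_gelu={quick_gelu(x):+.4f}")
```

Note that OpenCLIP exposes both variants via separate model configs, so switching activations does not require code changes, only a different model name.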
It is not clear from the README: