Beam search w/ Flashlight Text #2017
base: main
Conversation
encoder_output = model_kwargs["encoder_outputs"][encoder_output_key]

def update_func(emissions, N, T, prev_step_token_idxs, prev_step_model_states, timestep):
    # `emissions` and `N` are unused in this current implementation
Could change this around to take `emissions` as input, but AFAIK there is no easy way to recover the actual tensor from the `data_ptr`.
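To make the comment above concrete, here is a hedged sketch of the callback shape being discussed. All names (`make_update_func`, `score_step`, `encoder_output`) are illustrative, not the PR's actual code: since the decoder passes `emissions` across the C++ boundary as a raw data pointer with no easy way back to the tensor, the callback instead captures the encoder output via a closure and leaves `emissions` and `N` unused.

```python
def make_update_func(encoder_output, score_step):
    """Build an update_func that captures encoder_output via closure.

    `score_step` is a stand-in for one decoder forward step; it takes the
    encoder output, the token history of one hypothesis, and the timestep,
    and returns per-token scores for that hypothesis.
    """
    def update_func(emissions, N, T, prev_step_token_idxs, prev_step_model_states, timestep):
        # `emissions` and `N` are unused: the raw data pointer cannot easily
        # be turned back into a tensor, so the model is queried directly.
        out_probs, out_states = [], []
        for token_idxs, state in zip(prev_step_token_idxs, prev_step_model_states):
            out_probs.append(score_step(encoder_output, token_idxs, timestep))
            out_states.append(state)
        return out_probs, out_states
    return update_func
```

The closure sidesteps the `data_ptr` problem entirely at the cost of tying each `update_func` instance to one encoder output.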
    return final_tokens_as_tensors

if num_python_workers > 1:
PyTorch has a multiprocessing module (essentially a clone of Python's multiprocessing); however, it relies on pickle, which means all of these functions would have to be defined at module (global) level. Very open to suggestions here.
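A minimal illustration of the constraint mentioned above (the helper names here are made up for the demo): pickle serializes functions by qualified name, so only module-level functions round-trip, while closures and locally defined callables fail to pickle. This is why handing work to `torch.multiprocessing` would force these functions to global scope.

```python
import pickle

def module_level_worker(x):
    # Defined at module level: pickle stores a reference by qualified name.
    return x + 1

def make_local_worker():
    def local_worker(x):
        # Defined inside another function: pickle cannot locate it by name.
        return x + 1
    return local_worker

def is_picklable(fn):
    # Depending on the Python version, an unpicklable local function raises
    # either PicklingError or AttributeError.
    try:
        pickle.dumps(fn)
        return True
    except (pickle.PicklingError, AttributeError):
        return False
```

Alternatives that work around this without moving everything to global scope include spawning workers that rebuild the callables locally, or serializers like `cloudpickle` that can handle closures.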
No description provided.