feat(multi_head_attention.py): set bias=True
huangting4201 committed Jan 19, 2024
1 parent 71543b3 commit 05fa04a
Showing 1 changed file with 2 additions and 2 deletions.
internlm/model/multi_head_attention.py (2 additions, 2 deletions)

@@ -209,7 +209,7 @@ def __init__(
             embed_dim,
             3 * embed_dim,
             process_group,
-            bias=False,
+            bias=True,
             sequence_parallel=gpc.config.parallel.sequence_parallel,
             **factory_kwargs,
         )  # according to https://spaces.ac.cn/archives/9577
@@ -232,7 +232,7 @@ def __init__(
             embed_dim,
             embed_dim,
             process_group,
-            bias=False,
+            bias=True,
             sequence_parallel=gpc.config.parallel.sequence_parallel,
             **factory_kwargs,
         )
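The change enables a learnable bias term on both the fused QKV projection (embed_dim -> 3 * embed_dim) and the output projection (embed_dim -> embed_dim); previously both were bias-free. A minimal NumPy sketch of what the flag controls (hypothetical helper and dimensions for illustration; the actual code uses a tensor-parallel linear layer taking process_group and sequence_parallel arguments):

```python
import numpy as np

def linear(x, w, b=None):
    """Affine projection: with bias enabled, a learnable offset b is added."""
    y = x @ w.T
    return y + b if b is not None else y

# hypothetical sizes for illustration
embed_dim, seq_len = 4, 3
rng = np.random.default_rng(0)
x = rng.standard_normal((seq_len, embed_dim))

# fused QKV projection: embed_dim -> 3 * embed_dim, now with bias=True
w_qkv = rng.standard_normal((3 * embed_dim, embed_dim))
b_qkv = np.zeros(3 * embed_dim)  # the extra parameter that bias=True creates
qkv = linear(x, w_qkv, b_qkv)
q, k, v = np.split(qkv, 3, axis=-1)

# output projection: embed_dim -> embed_dim, also with bias=True now
w_out = rng.standard_normal((embed_dim, embed_dim))
b_out = np.zeros(embed_dim)
out = linear(v, w_out, b_out)
```

With the bias vectors initialized to zero, the forward pass is initially identical to the bias=False version; the difference is the extra per-output-channel parameters that training can now adjust.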