
Commit fd65973

[Bugfix] layer_norm_eps in GPT2Config should be float (#2240)
1 parent 135bcf9 commit fd65973

File tree

1 file changed: 1 addition, 1 deletion


python/mlc_llm/model/gpt2/gpt2_model.py

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ class GPT2Config(ConfigBase):  # pylint: disable=too-many-instance-attributes
     n_embd: int
     n_layer: int
     n_head: int
-    layer_norm_epsilon: int
+    layer_norm_epsilon: float
     n_inner: int = -1
     context_window_size: int = 0
     prefill_chunk_size: int = 0
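The one-character annotation change matters because GPT-2 checkpoints ship `layer_norm_epsilon` as a small float (typically `1e-05` in `config.json`), so a config loader that validates values against the annotated `int` type would reject every real checkpoint. The following is a hypothetical minimal sketch of such a type-checking loader, not mlc_llm's actual `ConfigBase` implementation; the `config_from_dict` helper and the field subset are illustrative assumptions.

```python
from dataclasses import dataclass, fields


# Hypothetical sketch (not mlc_llm's real ConfigBase): a loader that
# validates raw JSON values against each dataclass field's annotation.
@dataclass
class GPT2Config:
    n_embd: int
    n_layer: int
    n_head: int
    layer_norm_epsilon: float  # the bugfix: this was annotated `int` before
    n_inner: int = -1


def config_from_dict(cls, raw):
    kwargs = {}
    for f in fields(cls):
        if f.name not in raw:
            continue  # fall back to the field's default, if any
        value = raw[f.name]
        # Reject mismatched types. With the old `int` annotation, the
        # usual value 1e-05 from GPT-2's config.json would fail here.
        if not isinstance(value, f.type):
            raise TypeError(f"{f.name}: expected {f.type.__name__}, "
                            f"got {type(value).__name__}")
        kwargs[f.name] = value
    return cls(**kwargs)


cfg = config_from_dict(GPT2Config, {
    "n_embd": 768, "n_layer": 12, "n_head": 12, "layer_norm_epsilon": 1e-05,
})
print(cfg.layer_norm_epsilon)  # 1e-05
```

With the corrected `float` annotation the standard checkpoint loads cleanly, while a string or other wrong-typed value is still caught at load time.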
