Replies: 1 comment
Hi, have you solved this question?
I am trying to run the example at https://github.com/NVIDIA/Megatron-LM/tree/main/examples/multimodal.
But I found that once I set `--pipeline-model-parallel-size` to a value larger than 1, I get the following error:

```
File "/workspace/megatron-lm/megatron/core/models/gpt/gpt_model.py", line 219, in forward
    rotary_seq_len = self.rotary_pos_emb.get_rotary_seq_len(
File "/workspace/megatron-lm/megatron/core/models/common/embeddings/rotary_pos_embedding.py", line 173, in get_rotary_seq_len
    rotary_seq_len = transformer_input.size(0)
AttributeError: 'NoneType' object has no attribute 'size'
```
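For context on the traceback: the failing line simply reads the sequence length from dimension 0 of the decoder input tensor, and the error means that tensor is `None` on the stage where the crash happens, presumably a non-first pipeline stage that receives its hidden states via pipeline communication instead. Below is a minimal, self-contained sketch of that failure mode; it is not Megatron-LM code, the function name `get_rotary_seq_len_sketch` and the tensor shapes are made up for illustration.

```python
import torch

def get_rotary_seq_len_sketch(transformer_input):
    # Mirrors the failing line in the traceback: the rotary sequence length is
    # read from dim 0 of the input tensor, so a None input raises AttributeError.
    return transformer_input.size(0)

# Stage that has a real activation tensor, shape [seq, batch, hidden] (hypothetical sizes).
print(get_rotary_seq_len_sketch(torch.zeros(8, 2, 16)))  # -> 8

# Stage that has no local decoder input tensor.
try:
    get_rotary_seq_len_sketch(None)
except AttributeError as err:
    print(err)  # 'NoneType' object has no attribute 'size'
```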