Replies: 1 comment
-
Hi @zyan1234, you should be able to do it by adding HF_API_TOKEN to the environment variables of the SageMaker Hugging Face Inference Toolkit.
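A minimal sketch of what that could look like with the SageMaker Python SDK (the token value, task, container versions, and instance type below are placeholders, not values tested against this exact model):

```python
# Sketch: deploy a Hub model with the SageMaker Hugging Face Inference Toolkit,
# passing a Hub token via HF_API_TOKEN so gated repos can be downloaded.
# All concrete values (token, versions, instance type) are placeholders.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()  # IAM role used by the endpoint

hub_env = {
    "HF_MODEL_ID": "mistralai/Mistral-7b-v0.1",
    "HF_TASK": "text-generation",
    "HF_API_TOKEN": "hf_xxx",  # placeholder: token with access to the gated repo
}

huggingface_model = HuggingFaceModel(
    env=hub_env,
    role=role,
    transformers_version="4.26",  # placeholder versions; a newer container may be
    pytorch_version="1.13",       # required for Mistral support
    py_version="py39",
)

predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
)
```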
-
Hi,
I am trying to access Mistral-7b-v0.1 from an AWS SageMaker Jupyter notebook with the proper IAM permissions. The notebook I am using for fine-tuning is
https://github.com/philschmid/sagemaker-huggingface-llama-2-samples/blob/master/training/sagemaker-notebook.ipynb
with the model id mistralai/Mistral-7b-v0.1. It was working fine: the model was pulled from the Hugging Face repository and fine-tuning completed successfully. But for the last 5 days, when I run the same code without any changes, I get an access error for the model:
ErrorMessage: "raise EnvironmentError(OSError: You are trying to access a gated repo. Make sure to request access at https://huggingface.co/mistralai/Mistral-7b-v0.1 and pass a token having permission to this repo either by logging in with 'huggingface-cli login' or by passing 'token=
Command: "/opt/conda/bin/python3.10 run_cpm.py --dataset_path /opt/ml/input/data/training --epochs 3 --lr 0.0002 --merge_weights True --model_id mistralai/Mistral-7b-v0.1 --per_device_train_size 2", exit code: 1
SageMaker has had default access to pull Hugging Face model repos. Has anything changed with respect to access?
Please help: how do I access the Mistral model from SageMaker? A simplified version of my setup is shown below.
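For reference, my estimator setup from the notebook looks roughly like this (simplified; the S3 path and instance type are placeholders from my run). I am guessing a Hub token would have to be passed somewhere in here, e.g. via the environment, but I am not sure:

```python
# Simplified version of the training setup from the linked notebook.
# S3 path, instance type, and script name reflect my run; the commented
# environment line is where I assume a Hub token would need to go.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()
training_input_path = "s3://<my-bucket>/processed/mistral/train"  # placeholder

huggingface_estimator = HuggingFace(
    entry_point="run_cpm.py",        # training script used in my job
    source_dir="./scripts",
    instance_type="ml.g5.4xlarge",   # placeholder instance type
    instance_count=1,
    role=role,
    transformers_version="4.28",
    pytorch_version="2.0",
    py_version="py310",
    hyperparameters={
        "model_id": "mistralai/Mistral-7b-v0.1",
        "dataset_path": "/opt/ml/input/data/training",
        "epochs": 3,
        "lr": 0.0002,
        "merge_weights": True,
        "per_device_train_size": 2,
    },
    # environment={"HUGGING_FACE_HUB_TOKEN": "hf_xxx"},  # guess: token for the gated repo?
)

huggingface_estimator.fit({"training": training_input_path})
```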