
Solve BUG: AttributeError: module transformers has no attribute LLaMATokenizer #64

XuyaoWang opened this issue Mar 16, 2023 · 12 comments


@XuyaoWang

I want to follow the guide below.

> Given Hugging Face hasn't officially supported the LLaMA models, we fine-tuned LLaMA with Hugging Face's transformers library by installing it from a particular fork (i.e. this PR to be merged). The hash of the specific commit we installed was `68d640f7c368bcaaaecfc678f11908ebbd3d6176`.
but when I click through to the PR and try to run its example:
```python
import transformers

tokenizer = transformers.LLaMATokenizer.from_pretrained("/output/path/tokenizer/")
model = transformers.LLaMAForCausalLM.from_pretrained("/output/path/llama-7b/")
batch = tokenizer(
    "The primary use of LLaMA is research on large language models, including",
    return_tensors="pt",
    add_special_tokens=False,
)
batch = {k: v.cuda() for k, v in batch.items()}
generated = model.generate(batch["input_ids"], max_length=100)
print(tokenizer.decode(generated[0]))
```
I got an error:
`AttributeError: module transformers has no attribute LLaMATokenizer`
If you hit the same bug, just change your code to:
```python
import transformers

tokenizer = transformers.LlamaTokenizer.from_pretrained("/output/path/tokenizer/")
model = transformers.LlamaForCausalLM.from_pretrained("/output/path/llama-7b/")
batch = tokenizer(
    "The primary use of LLaMA is research on large language models, including",
    return_tensors="pt",
    add_special_tokens=False,
)
batch = {k: v.cuda() for k, v in batch.items()}
generated = model.generate(batch["input_ids"], max_length=100)
print(tokenizer.decode(generated[0]))
```
This bug is caused by incorrect capitalization: the merged code uses `LlamaTokenizer` and `LlamaForCausalLM` rather than `LLaMATokenizer` and `LLaMAForCausalLM`.
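For code that has to run against either build (the pre-merge fork exposes the old names, the merged version the new ones), a minimal sketch that resolves whichever tokenizer class is present:

```python
import transformers

# Minimal sketch: resolve whichever spelling this transformers build exposes.
# The pre-merge fork used LLaMATokenizer; the merged code uses LlamaTokenizer.
tokenizer_cls = getattr(transformers, "LlamaTokenizer",
                        getattr(transformers, "LLaMATokenizer", None))
if tokenizer_cls is None:
    raise ImportError("this transformers build ships no LLaMA tokenizer class")

tokenizer = tokenizer_cls.from_pretrained("/output/path/tokenizer/")
```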

@teknium1

I see the same error, but fixing the capitalization didn't fix it for me:
[screenshot of the error]

@teknium1

I'm using transformers 4.27.1; is it a different version?

@kiran1501

I got the same error as well; please suggest how to fix this.

@garcesmarc

`68d640f7c368bcaaaecfc678f11908ebbd3d6176`

@bkvijay

bkvijay commented Mar 17, 2023

We are getting this error and would appreciate your help:

`ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.`
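This ValueError usually means the installed transformers build predates the merged LLaMA support, which had not yet shipped in a pip release at the time (see the comments below). A quick diagnostic sketch:

```python
import transformers

# Quick diagnostic: does the installed build ship the renamed Llama classes?
print(transformers.__version__)
print(hasattr(transformers, "LlamaTokenizer"))   # expected True on post-merge builds
print(hasattr(transformers, "LLaMATokenizer"))   # True only on the old pre-merge fork
```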

@XuyaoWang
Author

XuyaoWang commented Mar 17, 2023

> I'm using transformers 4.27.1; is it a different version?

I didn't install transformers via pip. I downloaded transformers from GitHub on the "llama_push" branch and moved the downloaded files into my conda environment.

@zywang0108

zywang0108 commented Mar 17, 2023

Similar to the previous answers, the following steps worked for me:

  1. `git clone` the repo from the `llama_push` branch.
  2. `cd` into the repo and run `git checkout 68d640f7c368bcaaaecfc678f11908ebbd3d6176`.
  3. Install the transformers package by running `python setup.py install`.
  4. Consolidate all output files from the two subfolders in the PR (LLaMA implementation, huggingface/transformers#21955) into a single folder.

@teknium1

Yeah, you have to install it from the transformers GitHub repo. I had thought that since the PR was merged it would be in an updated pip package, but it's not yet.
`pip install git+https://github.com/huggingface/transformers.git` works for me.

@ruian0

ruian0 commented Mar 21, 2023

`LlamaTokenizer` instead of `LLaMATokenizer`

@xiaoweiweixiao

@ruian0 Thanks for your idea. I fixed this bug, but then I hit another one:
`Exception: Could not find the transformer layer class to wrap in the model.`
Do you know how to correct this problem?
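For context, this follow-up exception most likely comes from the same rename: the decoder layer class is now `LlamaDecoderLayer`, so an FSDP auto-wrap policy configured with the old `LLaMADecoderLayer` spelling finds no layer to wrap. A sketch, assuming training goes through Hugging Face's Trainer with FSDP enabled:

```python
from transformers import TrainingArguments

# Sketch, assuming HF Trainer with FSDP: the wrapped layer class must use the
# renamed spelling, otherwise FSDP cannot find the transformer layer to wrap.
training_args = TrainingArguments(
    output_dir="./out",  # illustrative path
    fsdp="full_shard auto_wrap",
    fsdp_transformer_layer_cls_to_wrap="LlamaDecoderLayer",  # not "LLaMADecoderLayer"
)
```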

@Alro10

Alro10 commented Mar 30, 2023

Another nice solution:
Install the transformers lib by running `pip install -q git+https://github.com/zphang/transformers@c3dc391`. That worked well here.

@sunyoubo

`transformers.LLaMATokenizer` has been changed to `transformers.LlamaTokenizer`.
