
With the MLX framework on an M3 Mac: the Phi-2 edition

Activate the conda Python environment with conda activate mlx, then move into the phi2 directory.
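
Roughly, the commands look like this (the repository path is my assumption and depends on where you cloned mlx-examples):

conda activate mlx
cd path/to/mlx-examples/phi2   # adjust to wherever the phi2 example lives in your clone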

I'll work through it while following the walkthrough.

Or so I thought, but the download and setup steps seem to be left out of it. So, first things first:

pip install -r requirements.txt

I wasn't sure whether this also downloads the model. (As far as I can tell, this just installs the Python dependencies; the model weights are fetched by the convert script in the next step.)

python convert.py

Running that seemed to start the download, but then a PermissionError came up. It looked like a problem with the ~/.cache/huggingface/modules directory.
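
To see what the permissions actually look like there, a quick check I'd run (my own addition, not from the walkthrough) is:

ls -la ~/.cache/huggingface   # shows who owns the cache directories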

Just as in the article I was following, I tried running something like this:
sudo chown -R <your-username> /Users/<your-username>/.cache/huggingface/modules

But I still got an error, so after consulting GPT-4 I ended up creating the modules directory myself and taking ownership of it with sudo chown -R (see the commands below).
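
What I did boils down to roughly the following (reconstructed from memory, with <your-username> as a placeholder):

mkdir -p ~/.cache/huggingface/modules                        # create the missing cache directory
sudo chown -R <your-username> ~/.cache/huggingface/modules   # take ownership of it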

With that done, python convert.py ran without problems and the downloads finished.

Finally, I ran the following.

python phi2.py

(mlx) phi2 % python phi2.py
tokenizer_config.json: 100%|███████████████| 7.34k/7.34k [00:00<00:00, 5.90MB/s]
vocab.json: 100%|████████████████████████████| 798k/798k [00:00<00:00, 1.62MB/s]
merges.txt: 100%|████████████████████████████| 456k/456k [00:00<00:00, 1.31MB/s]
tokenizer.json: 100%|██████████████████████| 2.11M/2.11M [00:00<00:00, 4.23MB/s]
added_tokens.json: 100%|███████████████████| 1.08k/1.08k [00:00<00:00, 3.48MB/s]
special_tokens_map.json: 100%|████████████████| 99.0/99.0 [00:00<00:00, 341kB/s]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO] Generating with Phi-2...
Write a detailed analogy between mathematics and a lighthouse.

Answer: Mathematics is like a lighthouse that guides us through the vast ocean of information. Just as a lighthouse illuminates the way for sailors, mathematical concepts and problem-solving skills illuminate our understanding of the world. Mathematics acts as a beacon of light, helping us navigate through the complexities of life and make sense of the unknown.

In this exercise, we have explored the fascinating world of Mathematics, Logic, and Thought Producers. We have seen how these concepts are intertwined and
(mlx) phi2 %

That gave me the output above. It said something about special tokens, and I wasn't sure what that meant, but in any case it seems to have worked. (It appears to be a standard tokenizer warning that extra special tokens were added to the vocabulary; for plain inference it can be ignored.)


When I tried "Tell me a joke", it went like this:

(mlx) phi2 % python phi2.py --prompt "Tell me a joke."
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO] Generating with Phi-2...
Tell me a joke.”

“Why did the chicken go to the seance?”

“To get to the other side.”

“Why did the seance go to the chicken?”

“To get to the other side.”

“Why did the chicken go to the other side?”

“To get to the other side.”

“Why did the other side go to

I also tried asking about the meaning of life:

(mlx) phi2 % python phi2.py --prompt "what is the meaing of life?"
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO] Generating with Phi-2...
what is the meaing of life?
The question of what is the "meaning of life" is one of the most important questions that humans have ever asked. It is a question that has been asked for centuries, and it is a question that has no easy or definitiveanswer.
The question of what is the "meaning of life" is one of the most important questions that humans have ever asked. It is a question that has no easy or definitiveanswer. The question of what is the "meaning of life" is one of

It started repeating itself here.

Asking it to write a poem about winter gave the following:

(mlx) phi2 % python phi2.py --prompt "Write a poem about winter"
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[INFO] Generating with Phi-2...
Write a poem about winter using at least three different poetic devices, such as simile, metaphor, personification, alliteration, or onomatopoeia.
##OUPUT

##OUPUT
Here is a possible poem about snow using some poetic devices:

Snowflakes fall like feathers from the sky
They dance and swirl in the cold air
They cover the ground like a soft white blanket
They make everything look calm and serene

Snow is like a friend who greets

The help output is below.

(mlx) phi2 % python phi2.py --help
usage: phi2.py [-h] [--prompt PROMPT] [--max_tokens MAX_TOKENS] [--temp TEMP]
               [--seed SEED]

Phi-2 inference script

options:
  -h, --help            show this help message and exit
  --prompt PROMPT       The message to be processed by the model
  --max_tokens MAX_TOKENS, -m MAX_TOKENS
                        Maximum number of tokens to generate
  --temp TEMP           The sampling temperature.
  --seed SEED           The PRNG seed
(mlx) phi2 %
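
For reference, these options can be combined. The values below are just ones I might try (not from the original run), for example to rein in the repetition seen above:

python phi2.py --prompt "Write a poem about winter" --max_tokens 256 --temp 0.5 --seed 0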

That's how it went.


I didn't fully understand the part about the Hugging Face cache directories, but I managed to get it working somehow.

Also, when I did a Google search, I found the article below where someone is doing the same thing, so please have a look at that as well.


#AI #TriedItWithAI #TriedIt #LocalLLM #MLX #Phi-2 #LargeLanguageModels


Thank you for reading this article to the end! If you would like to support my work, I would be very happy.