
Trying out google/flan-ul2 (Google Colab)

In this post, I'd like to introduce google/flan-ul2, a language model recently released by Google. In short, google/flan-ul2 is an AI technology for generating and understanding natural language.

google/flan-ul2 is a huge model with 20 billion parameters, and its performance is among the strongest of any publicly released model to date. Actually running it requires a high-performance computer, so this time I'll run google/flan-ul2 on Google Colab, a free online service.
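As a rough back-of-the-envelope estimate (my own calculation, not a figure from the model card), the weights alone take about 2 bytes per parameter in fp16 and about 1 byte per parameter in 8-bit, which is why the code below loads the model with load_in_8bit=True:

params = 20e9  # roughly 20 billion parameters
print(f"fp16: {params * 2 / 1e9:.0f} GB")  # ~40 GB for the weights alone
print(f"int8: {params * 1 / 1e9:.0f} GB")  # ~20 GB for the weights alone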

The code is based on the following example from Hugging Face.

!pip install accelerate transformers bitsandbytes
from transformers import T5ForConditionalGeneration, AutoTokenizer
import torch

# Load the 20B-parameter model in 8-bit so it fits into limited GPU memory
model = T5ForConditionalGeneration.from_pretrained("google/flan-ul2", device_map="auto", load_in_8bit=True)
tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")

input_string = "Answer the following question by reasoning step by step. The cafeteria had 23 apples. If they used 20 for lunch, and bought 6 more, how many apple do they have?"                                               

inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))
# <pad> They have 23 - 20 = 3 apples left. They have 3 + 6 = 9 apples. Therefore, the answer is 9.</s>
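As a side note, the decoded text still contains the <pad> and </s> special tokens. If you prefer plain text, tokenizer.decode also accepts skip_special_tokens=True (a minimal tweak, not applied in the runs below):

print(tokenizer.decode(outputs[0], skip_special_tokens=True))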

The output was as follows.

<pad> They have 23 - 20 = 3 apples left. They have 3 + 6 = 9 apples. Therefore, the answer is 9.</s>

The calculation steps are shown properly. The Hugging Face page says the model is capable of reasoning, so let's try a slightly harder reasoning problem.


input_string = "Answer the following question by reasoning step by step.When it's 1, the answer is 1. When it's 2, the answer is 4. When it's 3, the answer is 27. What is the answer when it's 4?"                                               
inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))

<pad> When it's 4, the answer is 14. Therefore, the final answer is 14.</s>

I was expecting 256 here, but the model answered 14. The pattern I had in mind was f(x) = x^x: 1^1 = 1, 2^2 = 2 × 2 = 4, 3^3 = 3 × 3 × 3 = 27, and 4^4 = 4 × 4 × 4 × 4 = 256.
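Just to double-check the values I was expecting, here is a quick sketch that computes x^x for x = 1 to 4:

for x in range(1, 5):
    print(x, x ** x)  # prints 1 1, 2 4, 3 27, 4 256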


Next, I had the model read a conversation and asked it what one character thinks of another.

input_string = """Read next conversation, and answer what Shinji thinks about Misato.
Announcement: A special state of emergency has been issued for the Kanto and Chubu regions, with a focus on the Tokai area, as of 12:30 today. Residents are requested to evacuate to designated shelters immediately.

Announcement: I will repeat the message... (repeats)

Misato: Why do we have to lose sight of it at a time like this? This is ridiculous.

Phone: Due to the declaration of a special state of emergency, all normal lines are currently unavailable.

Shinji: It's no use. Maybe I shouldn't have come... Meeting up is impossible... Oh well, let's go to the shelter.

Operator: Unidentified moving object is still approaching Honsho.

Shigeru: I confirm the target on the monitor. Sending it to the main monitor.

Fuyutsuki: It's been 15 years.

Gendo: Yes, without a doubt, it's an "Angel."

BGM: 1-3

UN forces: Target hit by all bullets! ...Gwaaah!

Misato: Sorry to keep you waiting!

Shigeru: The target is still alive. It is still approaching Tokyo-3 and invading.

Operator: The air force cannot stop it!

Commander: It's an all-out war! Gather all the forces in Atsugi and Iruma!

Commander: No holding back! We must destroy the target by any means necessary!

"""                                               
inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))

<pad> Misato is a good friend</s>

Admittedly, from this passage alone you can probably only infer that Misato is something like a good friend, but as an answer it still feels a bit underwhelming.
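If I wanted to coax a longer, more detailed answer out of the model, one thing to try (I haven't verified it here) would be adjusting the standard generate() arguments, for example forcing a minimum length and enabling sampling:

outputs = model.generate(
    inputs,
    max_length=200,
    min_length=30,   # ask for a somewhat longer answer
    do_sample=True,  # sample instead of greedy decoding
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Whether this actually produces a richer answer would need its own testing.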
