Prompt Engineering / GenAI (~10 mins)

Translation in Prompt Engineering / GenAI - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to load the translation model using Hugging Face Transformers.

Prompt Engineering / GenAI
from transformers import MarianMTModel, MarianTokenizer
model_name = 'Helsinki-NLP/opus-mt-en-de'
tokenizer = MarianTokenizer.from_pretrained([1])
Drag options to blanks, or click a blank then click an option.
A) 'en-de'
B) 'Helsinki-NLP/opus-mt-en-de'
C) model_name
D) MarianTokenizer
Common Mistakes
Passing the model name as a string literal instead of using the variable.
Passing the class name instead of the model name.
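For reference, the completed loading step can be wrapped in a small helper. This is a minimal sketch: the function name load_en_de_pair is illustrative, and it assumes the transformers package is installed (the import is deferred so the definition stays readable without it).

```python
def load_en_de_pair(model_name="Helsinki-NLP/opus-mt-en-de"):
    """Load the MarianMT tokenizer and model from the same checkpoint.

    Deferred import: the transformers package is only required when
    the helper is actually called, not when it is defined.
    """
    from transformers import MarianMTModel, MarianTokenizer

    # Reuse the model_name variable for both calls so the tokenizer
    # and model always come from the same checkpoint.
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    return tokenizer, model
```

Calling load_en_de_pair() downloads the checkpoint on first use and caches it locally.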
Task 2: Fill in the blank (medium)

Complete the code to tokenize the input text for translation.

text = 'Hello, how are you?'
inputs = tokenizer([1], return_tensors='pt', padding=True)
A) text
B) 'text'
C) ['Hello']
D) ['Hello, how are you?']
Common Mistakes
Passing the string 'text' instead of the variable text.
Passing a list when a string is expected.
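A sketch of the completed tokenization step, wrapped as a helper (encode_for_translation is an illustrative name; the tokenizer argument is assumed to be a loaded MarianTokenizer):

```python
def encode_for_translation(tokenizer, text):
    """Tokenize source text into PyTorch tensors for MarianMT.

    Pass the *variable* holding the text, not the literal 'text'.
    The returned BatchEncoding contains input_ids and attention_mask;
    padding=True only matters when a list of sentences is batched.
    """
    return tokenizer(text, return_tensors="pt", padding=True)
```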
Task 3: Fill in the blank (hard)

Fix the error in generating the translated tokens from the model output.

translated_tokens = model.generate([1])
A) inputs['input_ids']
B) inputs['attention_mask']
C) inputs
D) inputs['token_type_ids']
Common Mistakes
Passing the whole inputs dictionary instead of just the input IDs.
Passing attention mask or token type IDs instead of input IDs.
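The completed generation step as a hedged sketch (generate_translation_ids is an illustrative name; model is assumed to be a loaded MarianMTModel):

```python
def generate_translation_ids(model, inputs):
    """Generate translated token IDs from tokenized inputs."""
    # The exercise's form: pass just the input_ids tensor, not the
    # whole inputs dict positionally.
    translated_tokens = model.generate(inputs["input_ids"])
    # For padded batches, forwarding the attention mask as well via
    # model.generate(**inputs) is the safer call.
    return translated_tokens
```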
Task 4: Fill in the blank (hard)

Fill both blanks to decode the translated tokens into a readable string.

translated_text = tokenizer.decode([1][0], skip_special_tokens=[2])
A) translated_tokens
B) True
C) False
D) inputs
Common Mistakes
Using the inputs variable instead of translated tokens.
Setting skip_special_tokens to False causing special tokens to appear.
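The completed decoding step as a sketch (decode_translation is an illustrative name):

```python
def decode_translation(tokenizer, translated_tokens):
    """Turn generated token IDs back into a readable string."""
    # [0] selects the first (here: only) sequence in the batch;
    # skip_special_tokens=True drops markers such as </s> and <pad>.
    return tokenizer.decode(translated_tokens[0], skip_special_tokens=True)
```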
Task 5: Fill in the blank (hard)

Fill all three blanks to create a function that translates English text to German.

def translate_en_to_de(text):
    tokenizer = MarianTokenizer.from_pretrained([1])
    model = MarianMTModel.from_pretrained([2])
    inputs = tokenizer(text, return_tensors=[3], padding=True)
    translated_tokens = model.generate(inputs['input_ids'])
    return tokenizer.decode(translated_tokens[0], skip_special_tokens=True)
A) 'Helsinki-NLP/opus-mt-en-de'
C) 'pt'
D) 'tf'
Common Mistakes
Using different model names for tokenizer and model.
Using 'tf' instead of 'pt' for return_tensors, which yields TensorFlow tensors that the PyTorch model cannot consume.
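Putting all five steps together, a minimal end-to-end sketch of the exercise's function, assuming transformers is installed (the import is deferred so the definition loads without it):

```python
def translate_en_to_de(text, model_name="Helsinki-NLP/opus-mt-en-de"):
    """Translate an English string to German with MarianMT."""
    from transformers import MarianMTModel, MarianTokenizer

    # Same checkpoint name for tokenizer and model, as in Task 1.
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt", padding=True)
    translated_tokens = model.generate(inputs["input_ids"])
    return tokenizer.decode(translated_tokens[0], skip_special_tokens=True)
```

Note that this version reloads the checkpoint on every call; moving the tokenizer and model loads outside the function is a common refinement when translating many sentences.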