NLP · ML · ~10 mins

Beam search decoding in NLP - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to initialize the beam width for beam search decoding.

beam_width = [1]
A. 5
B. 1
C. 0
D. 10
Common Mistakes
Setting beam width to 0 keeps zero sequences, so decoding produces nothing.
Using 1 makes beam search degenerate into greedy search.
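As a quick illustration of the second pitfall (the candidate sequences and scores below are made up): with a beam width of 1, keeping only the top-scoring candidate at each step is exactly what greedy search does.

```python
# Hypothetical (sequence, score) candidates after one expansion step.
candidates = [
    (["the", "cat"], -1.2),
    (["the", "dog"], -0.7),
    (["the", "car"], -2.5),
]

beam_width = 1  # a beam of width 1 degenerates to greedy search
beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
# Only the single best continuation survives, which is exactly greedy's choice.
```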
Task 2: Fill in the blank (medium)

Complete the code to select the top scoring sequences at each decoding step.

top_sequences = sorted(all_candidates, key=lambda x: x.score, reverse=[1])[:beam_width]
A. None
B. 0
C. True
D. False
Common Mistakes
Using reverse=False sorts ascending, so the slice keeps the worst sequences.
Passing None raises a TypeError, and 0 is falsy, so it silently sorts ascending.
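To see the sort direction in action, here is a minimal sketch (the Candidate type and scores are illustrative stand-ins, not part of the exercise): reverse=True puts the highest-scoring sequences first, so the slice keeps the best beam_width of them.

```python
from collections import namedtuple

# Illustrative stand-in for the quiz's Candidate objects.
Candidate = namedtuple("Candidate", ["sequence", "score"])

all_candidates = [
    Candidate(["a"], -0.5),
    Candidate(["b"], -0.1),
    Candidate(["c"], -0.9),
]
beam_width = 2

# reverse=True sorts descending by score; reverse=False would leave the
# lowest-scoring sequences at the front and the slice would keep them.
top_sequences = sorted(all_candidates, key=lambda x: x.score, reverse=True)[:beam_width]
```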
Task 3: Fill in the blank (hard)

Fix the error in the code that updates the beam with new candidates.

beam = [1]
A. top_sequences
B. all_candidates
C. beam + all_candidates
D. []
Common Mistakes
Assigning all_candidates to beam keeps every expansion instead of pruning to the beam width.
Using beam + all_candidates mixes old sequences back in with their own expansions.
Assigning [] to beam empties the beam, halting the search.
Task 4: Fill in the blank (hard)

Fill both blanks to complete the loop that expands sequences and applies beam search.

for step in range(max_length):
    all_candidates = []
    for seq in beam:
        next_tokens = model.predict(seq.sequence)
        for token, score in next_tokens.items():
            candidate = seq.sequence + [token]
            candidate_score = seq.score [1] score
            all_candidates.append(Candidate(candidate, candidate_score))
    beam = sorted(all_candidates, key=lambda x: x.score, reverse=[2])[:beam_width]
A. +
B. -
C. True
D. False
Common Mistakes
Subtracting scores instead of adding them; with log-probabilities, sequence scores accumulate by addition.
Sorting ascending (reverse=False) discards the best sequences when the list is sliced.
Task 5: Fill in the blank (hard)

Fill all three blanks to complete the beam search decoding function.

def beam_search_decode(model, start_token, beam_width, max_length):
    beam = [Candidate([start_token], 0.0)]
    for _ in range(max_length):
        all_candidates = []
        for seq in beam:
            next_tokens = model.predict(seq.sequence)
            for token, score in next_tokens.items():
                candidate = seq.sequence + [[1]]
                candidate_score = seq.score [2] score
                all_candidates.append(Candidate(candidate, candidate_score))
        beam = sorted(all_candidates, key=lambda x: x.score, reverse=[3])[:beam_width]
    return beam
A. token
B. +
C. True
D. seq
Common Mistakes
Using the wrong variable names: the blank extends the sequence with the new token, not with seq.
Subtracting scores instead of adding them.
Sorting ascending loses the best sequences when the list is sliced to the beam width.
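With all three blanks filled, the function from Task 5 can be run end to end. A minimal runnable sketch: the ToyModel and Candidate classes below are illustrative stand-ins (any model whose predict returns a token-to-log-probability mapping would do), not part of the exercise.

```python
import math
from collections import namedtuple

Candidate = namedtuple("Candidate", ["sequence", "score"])

class ToyModel:
    """Illustrative stand-in: always predicts the same two-token distribution."""
    def predict(self, sequence):
        # Log-probabilities, so scores accumulate by addition along a sequence.
        return {"a": math.log(0.6), "b": math.log(0.4)}

def beam_search_decode(model, start_token, beam_width, max_length):
    beam = [Candidate([start_token], 0.0)]
    for _ in range(max_length):
        all_candidates = []
        for seq in beam:
            next_tokens = model.predict(seq.sequence)
            for token, score in next_tokens.items():
                candidate = seq.sequence + [token]       # blank 1: extend with token
                candidate_score = seq.score + score      # blank 2: add log-probs
                all_candidates.append(Candidate(candidate, candidate_score))
        # blank 3: reverse=True keeps the highest-scoring sequences
        beam = sorted(all_candidates, key=lambda x: x.score, reverse=True)[:beam_width]
    return beam

best = beam_search_decode(ToyModel(), "<s>", beam_width=2, max_length=3)[0]
# The best hypothesis repeats the more likely token "a" at every step.
```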