GenAI · How-To · Beginner · 3 min read

How to Get JSON Output from LLM: Simple Guide

To get JSON output from a large language model (LLM), you need to design your prompt to explicitly ask for JSON format and provide a clear JSON structure example. Use instructions like "Respond only with JSON" and validate the output by parsing it with a JSON parser.
📐 Syntax

When prompting an LLM for JSON output, your prompt should include:

  • Instruction: Tell the model to respond only in JSON format.
  • JSON Template: Provide an example or template of the JSON structure you expect.
  • Clear Fields: Specify the keys and value types you want in the JSON.
Prompt example:

```text
Generate a JSON object with keys "name" (string), "age" (number), and "city" (string). Respond only with JSON.
```
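Rather than hand-writing the template into the prompt, you can build it from a Python dict with `json.dumps`, which guarantees the example structure in the prompt is itself valid JSON. The `template` dict below is a hypothetical schema for illustration:

```python
import json

# Hypothetical template describing the fields we expect back.
template = {"name": "string", "age": "number", "city": "string"}

prompt = (
    "Generate a JSON object matching this template: "
    + json.dumps(template)
    + ". Respond only with JSON."
)
print(prompt)
```

This keeps the prompt and your expected structure in one place, so changing a field name cannot leave the two out of sync.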
💻 Example

This example shows how to prompt an LLM using Python and OpenAI's API to get JSON output, then parse it safely.

```python
import json

from openai import OpenAI

# The modern OpenAI SDK (v1+) uses a client object instead of the
# deprecated module-level openai.ChatCompletion call.
client = OpenAI(api_key="YOUR_API_KEY")

prompt = (
    "Generate a JSON object with keys \"name\", \"age\", and \"city\". "
    "Respond only with JSON."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # deterministic output parses more reliably
    # Newer models also accept response_format={"type": "json_object"}
    # to force syntactically valid JSON.
)

json_text = response.choices[0].message.content.strip()

try:
    data = json.loads(json_text)
    print("Parsed JSON output:", data)
except json.JSONDecodeError:
    print("Failed to parse JSON. Raw output:", json_text)
```
Output

```text
Parsed JSON output: {'name': 'Alice', 'age': 30, 'city': 'New York'}
```
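When a parse does fail, a common pattern is to retry the request with an error hint appended to the prompt. Here is a minimal sketch of that loop; `generate` is a stand-in for whatever function calls your model API, not a real library function:

```python
import json

def ask_for_json(generate, prompt, max_retries=3):
    """Call generate(prompt) (any function returning model text) and
    retry with an error hint until the reply parses as JSON."""
    for _ in range(max_retries):
        reply = generate(prompt)
        try:
            return json.loads(reply)
        except json.JSONDecodeError:
            # Feed the failure back so the next attempt can self-correct.
            prompt += "\nYour last reply was not valid JSON. Respond only with JSON."
    raise ValueError("model never returned valid JSON")

# Fake model that fails once, then returns valid JSON.
replies = iter(["oops, not JSON", '{"name": "Alice"}'])
print(ask_for_json(lambda p: next(replies), "Give me JSON."))  # {'name': 'Alice'}
```

Because `generate` is just a callable, the same loop works with any provider's SDK.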
⚠️ Common Pitfalls

  • Not specifying JSON format: The model may respond in plain text or mixed formats.
  • Incomplete JSON: The output might be missing commas or quotes, causing parse errors.
  • Extra text: The model may add explanations or greetings outside the JSON.
  • Parsing errors: Always use try-except to handle invalid JSON.

To fix these, always instruct the model clearly and validate the output before use.

```text
Wrong prompt:
"Tell me about a person named Alice."

Right prompt:
"Respond only with a JSON object with keys \"name\", \"age\", and \"city\"."
```
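The "extra text" pitfall is the most common in practice: models often wrap the JSON in markdown fences or add a greeting. A small sketch of a cleanup helper (illustrative, not part of any library) that strips fences and extracts the first JSON object before parsing:

```python
import json
import re

def extract_json(text):
    """Strip markdown fences and surrounding prose, then parse the
    first JSON object found in the reply. Returns None on failure."""
    # Remove ```json ... ``` fences if present.
    text = re.sub(r"```(?:json)?", "", text).strip()
    # Grab the first {...} span, in case the model added prose around it.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None

# Handles a reply wrapped in fences behind a greeting.
reply = 'Sure! Here you go:\n```json\n{"name": "Alice", "age": 30}\n```'
print(extract_json(reply))  # {'name': 'Alice', 'age': 30}
```

This is a best-effort salvage step; a clear "Respond only with JSON" instruction remains the first line of defense.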
📊 Quick Reference

| Step | Tip |
| --- | --- |
| 1. Instruction | Tell the LLM to respond only with JSON. |
| 2. Template | Provide a clear JSON example or structure. |
| 3. Validate | Parse the output with a JSON parser to catch errors. |
| 4. Handle errors | Use try-except to manage invalid JSON outputs. |
| 5. Temperature | Set temperature to 0 for consistent output. |
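Step 3 can go beyond a bare `json.loads`: checking that the parsed object has the keys and types you asked for catches replies that are valid JSON but the wrong shape. A minimal sketch, assuming the name/age/city schema from the example above:

```python
import json

# Assumed schema matching the example prompt in this article.
EXPECTED = {"name": str, "age": (int, float), "city": str}

def validate_person(raw):
    """Parse and check keys and types; returns (data, error) so
    callers can retry or log on failure."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return None, f"invalid JSON: {exc}"
    for key, typ in EXPECTED.items():
        if key not in data:
            return None, f"missing key: {key}"
        if not isinstance(data[key], typ):
            return None, f"wrong type for {key}"
    return data, None

data, err = validate_person('{"name": "Alice", "age": 30, "city": "New York"}')
print(data, err)  # {'name': 'Alice', 'age': 30, 'city': 'New York'} None
```

For larger schemas, a dedicated validation library such as `jsonschema` or `pydantic` scales better than hand-written checks.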

Key Takeaways

  • Always instruct the LLM explicitly to respond only with JSON format.
  • Provide a clear JSON structure or example in your prompt.
  • Parse the LLM output with a JSON parser and handle errors gracefully.
  • Set model temperature to 0 for more predictable JSON output.
  • Avoid extra text by reinforcing the JSON-only response in your prompt.