
Self-hosted LLMs (Llama, Mistral) in Prompt Engineering / GenAI - Cheat Sheet & Quick Revision

Recall & Review
beginner
What does 'self-hosted LLM' mean?
A self-hosted LLM is a large language model that you run on your own computer or server instead of using a cloud service. It gives you full control over the model and data.
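One common way to run a model "on your own server" is Ollama, an open-source local LLM server (an assumption for illustration here, not the only option). A minimal Docker Compose sketch, assuming the official `ollama/ollama` image and its default API port 11434:

```yaml
services:
  ollama:
    image: ollama/ollama        # local LLM server; models stay on your machine
    ports:
      - "11434:11434"           # Ollama's default HTTP API port
    volumes:
      - ollama:/root/.ollama    # persist downloaded model weights
volumes:
  ollama:
```

Once it is running, prompts and responses never leave your hardware, which is exactly the "full control over the model and data" described above.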
beginner
Name two popular self-hosted LLMs.
Two popular self-hosted LLMs are LLaMA (from Meta) and Mistral (from Mistral AI). Both are openly released model families whose weights you can download and run locally or on your own servers.
intermediate
Why might someone choose to use a self-hosted LLM like Llama or Mistral?
People use self-hosted LLMs to keep data private, avoid per-token cloud fees, fine-tune or otherwise customize the model, and get predictable latency without depending on an internet connection.
intermediate
What is a key challenge when running self-hosted LLMs?
A key challenge is needing powerful hardware like GPUs and enough memory to run large models efficiently.
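The memory requirement above is driven mostly by the model weights: each parameter costs a fixed number of bits, so the need can be estimated with simple arithmetic. A rough sketch (the overhead multiplier for activations and KV cache is an illustrative assumption, not a fixed rule):

```python
def model_memory_gb(n_params: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Rough memory estimate for serving a model's weights.

    n_params: number of parameters (e.g. 7e9 for a 7B model)
    bits_per_param: 16 for fp16, 8 or 4 for quantized weights
    overhead: multiplier for activations / KV cache (illustrative)
    """
    weight_bytes = n_params * bits_per_param / 8
    return weight_bytes * overhead / 1e9

# A 7B model in fp16 needs ~14 GB for weights alone, which is why
# 4-bit quantization (~3.5 GB) is popular on consumer GPUs.
print(round(model_memory_gb(7e9, 16, overhead=1.0), 1))  # -> 14.0
print(round(model_memory_gb(7e9, 4, overhead=1.0), 2))   # -> 3.5
```

This is why quantized formats (GGUF, GPTQ, etc.) matter so much for self-hosting: they shrink the weights enough to fit on a single consumer GPU or even in CPU RAM.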
advanced
How do Llama and Mistral differ in their design or use?
LLaMA models emphasize efficiency and open availability for research and downstream use, while Mistral models aim for strong performance with fewer parameters, using techniques such as grouped-query and sliding-window attention to run faster and lighter.
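One practical difference shows up directly in prompt engineering: the two families' instruct variants expect different chat templates. A sketch below assumes the templates used by the original Llama 2 Chat and Mistral 7B Instruct releases (newer versions ship their own templates, normally applied for you by the tokenizer's chat-template support):

```python
def llama2_chat_prompt(system: str, user: str) -> str:
    # Llama 2 Chat wraps a system prompt in <<SYS>> tags inside the [INST] block.
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

def mistral_instruct_prompt(user: str) -> str:
    # Mistral 7B Instruct has no dedicated system slot; only [INST] ... [/INST].
    return f"<s>[INST] {user} [/INST]"

print(mistral_instruct_prompt("Summarize this article."))
# -> <s>[INST] Summarize this article. [/INST]
```

Sending a prompt in the wrong family's template is a common cause of degraded output when switching between self-hosted models.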
What is a main advantage of self-hosting an LLM?
A. Automatic cloud updates
B. No need for any hardware
C. Full control over data and model
D. Unlimited free usage
Which hardware is usually needed to run self-hosted LLMs efficiently?
A. Smartphone
B. Powerful GPU and enough RAM
C. Tablet
D. Basic laptop CPU
Llama and Mistral are examples of what kind of models?
A. Self-hosted LLMs
B. Image recognition models
C. Cloud-only LLMs
D. Speech-to-text models
Which is NOT a reason to use a self-hosted LLM?
A. Data privacy
B. Avoiding cloud fees
C. No need for internet
D. No hardware requirements
Mistral models are designed to be:
A. Lightweight and fast
B. Very large and slow
C. Only for image tasks
D. Closed source
Explain what a self-hosted LLM is and why someone might want to use one.
Think about running the model on your own computer instead of online.
Compare Llama and Mistral models in terms of their design goals and typical use cases.
Focus on what makes each model special.