Log Pipeline Master: Challenge (5 Problems)
Get all challenges correct to earn this badge! Test your skills under time pressure!
❓ Predict Output (intermediate)
What is the output of this Elasticsearch ingest pipeline simulation?
Given the following ingest pipeline configuration and a sample document, what will be the value of the processed_message field after the pipeline runs?
Elasticsearch
{
  "description": "Add processed_message field",
  "processors": [
    {
      "set": {
        "field": "processed_message",
        "value": "{{message}} - processed"
      }
    }
  ]
}
Sample document:
{
"message": "User login successful"
}
💡 Hint
The set processor adds or updates a field with the given value, using mustache templates for variables.
📝 Explanation
The set processor uses the mustache template {{message}} to insert the original message value, then appends ' - processed'. The resulting processed_message field is "User login successful - processed".
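The templating step can be sketched in Python. This is a hypothetical mimic of the set processor's mustache substitution for illustration only, not Elasticsearch code:

```python
import re

def set_processor(doc, field, value_template):
    """Mimic the ingest 'set' processor: resolve {{name}} templates
    against the document's fields, then write the rendered string
    to the target field."""
    def resolve(match):
        return str(doc.get(match.group(1), ""))
    doc[field] = re.sub(r"\{\{(\w+)\}\}", resolve, value_template)
    return doc

doc = {"message": "User login successful"}
set_processor(doc, "processed_message", "{{message}} - processed")
print(doc["processed_message"])  # User login successful - processed
```

Running it on the sample document yields the same value the real pipeline would produce.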
🧠 Conceptual (intermediate)
Which processor is best to parse a timestamp string into a date field?
You have logs with a timestamp field as a string like "2024-06-01T12:30:45Z". Which Elasticsearch ingest processor should you use to convert this string into a date type for better querying?
💡 Hint
Think about which processor handles date formats and converts strings to dates.
📝 Explanation
The date processor parses date strings into date objects, enabling Elasticsearch to index and query them as dates.
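The conversion the date processor performs is analogous to parsing an ISO 8601 string into a real datetime object. A Python sketch of the equivalent step (not the processor itself):

```python
from datetime import datetime

raw = "2024-06-01T12:30:45Z"

# fromisoformat accepts a trailing 'Z' only on Python 3.11+,
# so replace it with an explicit offset for wider compatibility.
ts = datetime.fromisoformat(raw.replace("Z", "+00:00"))

# Once parsed, the value supports real date arithmetic and comparisons,
# just as a date-typed field supports range queries in Elasticsearch.
print(ts.year, ts.hour)  # 2024 12
```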
❓ Predict Output (advanced)
What error does this pipeline cause when processing a document?
Consider this pipeline snippet:
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{COMMONAPACHELOG}"]
      }
    },
    {
      "remove": {
        "field": "message"
      }
    }
  ]
}
If the input document does not have a 'message' field, what error will Elasticsearch raise?
💡 Hint
The grok processor requires the field to exist to parse it.
📝 Explanation
If the grok processor tries to parse a field that does not exist, the pipeline fails with an error reporting the missing field (unless the processor's ignore_missing option is set to true). Because the failure happens in the first processor, the remove processor is never reached.
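The failure mode can be mimicked in Python with a hypothetical grok-style processor that, like Elasticsearch's default behavior, fails fast when the source field is absent:

```python
import re

def grok_processor(doc, field, pattern):
    """Hypothetical mimic of the grok processor: raise if the source
    field is missing (as Elasticsearch does by default), otherwise
    merge named capture groups into the document."""
    if field not in doc:
        raise ValueError(f"field [{field}] not present in document")
    match = re.match(pattern, doc[field])
    if match:
        doc.update(match.groupdict())
    return doc

try:
    grok_processor({}, "message", r"(?P<verb>\w+)")
except ValueError as err:
    print(err)  # field [message] not present in document
```

Because the first processor raises, a second processor in the chain would never run, matching the behavior described above.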
🚀 Application (advanced)
How many fields will the document have after this pipeline runs?
Given this pipeline:
{
  "processors": [
    {
      "grok": {
        "field": "log",
        "patterns": ["%{IP:client} %{WORD:method} %{URIPATHPARAM:request}"]
      }
    },
    {
      "remove": {
        "field": "log"
      }
    }
  ]
}
And this input document:
{
  "log": "192.168.1.1 GET /index.html"
}
How many fields will the output document have?
💡 Hint
The grok processor extracts fields, then the remove processor deletes the original log field.
📝 Explanation
The grok processor extracts the client, method, and request fields from the log string. The remove processor then deletes the original log field, leaving three fields: client, method, and request.
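The extract-then-remove flow can be sketched with plain regex named groups in Python, a mimic that assumes the three grok patterns behave like the capture groups below:

```python
import re

# Named groups standing in for %{IP:client} %{WORD:method} %{URIPATHPARAM:request}
GROK_LIKE = r"(?P<client>[\d.]+) (?P<method>\w+) (?P<request>\S+)"

doc = {"log": "192.168.1.1 GET /index.html"}

# grok step: merge the extracted named fields into the document
doc.update(re.match(GROK_LIKE, doc["log"]).groupdict())

# remove step: drop the original source field
del doc["log"]

print(sorted(doc), len(doc))  # ['client', 'method', 'request'] 3
```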
🧠 Conceptual (expert)
Which pipeline processor can conditionally execute based on a field's value?
You want to run a processor only if the field 'status' equals 'error'. Which feature or processor allows this conditional execution in an Elasticsearch ingest pipeline?
💡 Hint
Processors support an 'if' property to run conditionally.
📝 Explanation
Elasticsearch ingest processors support an 'if' option containing a Painless script, evaluated against the document (accessed via ctx), that decides whether the processor runs. For this case, the condition would be "if": "ctx.status == 'error'".
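The gating logic behind the 'if' option can be sketched in Python. This is a hypothetical mimic where a condition callable stands in for the Painless expression:

```python
def run_if(doc, condition, processor):
    """Mimic the per-processor 'if' option: run the processor only
    when the condition evaluates to true against the document."""
    if condition(doc):
        processor(doc)
    return doc

def tag_alert(doc):
    doc["alert"] = True  # hypothetical processor body

is_error = lambda d: d.get("status") == "error"

doc1 = run_if({"status": "error"}, is_error, tag_alert)
doc2 = run_if({"status": "ok"}, is_error, tag_alert)
print(doc1.get("alert"), doc2.get("alert"))  # True None
```

Only the document whose status is 'error' gets the extra field, mirroring the conditional processor execution described above.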