What will be the output of the following Jenkinsfile snippet when run on a Jenkins server with Docker installed?
pipeline {
    agent {
        docker {
            image 'alpine:3.14'
            args '-v /tmp:/tmp'
        }
    }
    stages {
        stage('Test') {
            steps {
                sh 'echo Hello from Docker container'
            }
        }
    }
}
Think about what the sh step inside a Docker agent does.
The Jenkinsfile uses a Docker agent based on the alpine:3.14 image. Jenkins pulls the image if necessary, starts a container, and runs the sh step inside it, so the pipeline prints Hello from Docker container to the build log from within the container environment.
Which of the following Jenkinsfile snippets correctly defines a Docker agent with the image node:16 and mounts the workspace directory?
Remember to use the correct environment variable syntax for mounting volumes.
Option A correctly uses ${WORKSPACE} to mount the Jenkins workspace directory inside the container. Option B uses $WORKSPACE, which may not expand correctly in Jenkinsfile Groovy syntax. Option C uses dockerfile incorrectly. Option D mounts a fixed path, which is not the workspace.
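A snippet along the lines of Option A might look like the following sketch (the stage contents are illustrative; note the double quotes, which enable Groovy string interpolation of ${WORKSPACE}):

    pipeline {
        agent {
            docker {
                image 'node:16'
                // ${WORKSPACE} is expanded by Groovy before the
                // argument is handed to docker run
                args "-v ${WORKSPACE}:${WORKSPACE}"
            }
        }
        stages {
            stage('Build') {
                steps {
                    sh 'node --version'
                }
            }
        }
    }

With single quotes, Groovy would pass the literal text $WORKSPACE through to Docker and rely on the shell environment to expand it, which is exactly the pitfall Option B describes.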
A Jenkins pipeline using a Docker agent fails with the error: Cannot connect to the Docker daemon. What is the most likely cause?
Think about what is needed to run Docker commands on the Jenkins agent.
The error indicates that Jenkins cannot communicate with the Docker daemon. The most likely cause is that the Docker daemon is not running on the agent, or that the Jenkins user lacks permission to access the Docker socket (typically /var/run/docker.sock). A misspelled image name or a syntax error would produce a different error, and a permissions problem reading the Jenkinsfile would make the pipeline fail at load time instead.
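On a typical Linux agent, a common fix is to start the daemon and grant the Jenkins service user access to the Docker socket. A sketch, assuming the service user is named jenkins and the agent uses systemd:

    # Start the Docker daemon if it is not running
    sudo systemctl start docker

    # Add the jenkins user to the docker group so it can
    # access /var/run/docker.sock
    sudo usermod -aG docker jenkins

    # Restart the Jenkins service so the new group
    # membership takes effect
    sudo systemctl restart jenkins

After this, a plain docker info run as the jenkins user should succeed, and Docker-based agents can start containers.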
Consider this Jenkinsfile snippet:
pipeline {
    agent {
        docker {
            image 'python:3.9'
            args '-e ENV=production'
        }
    }
    stages {
        stage('Print Env') {
            steps {
                sh 'echo $ENV'
            }
        }
    }
}
What will be printed when this pipeline runs?
Check how environment variables passed via args are available inside the container.
The -e ENV=production argument is passed to docker run and sets the environment variable ENV inside the container. Because the sh step uses single quotes, $ENV is not interpolated by Groovy; the shell running inside the container expands it instead, so the pipeline prints production.
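The same behaviour can be reproduced outside Jenkins. The docker run line below (commented out, since it requires Docker) mirrors what the agent does; the last line shows the key mechanism in isolation, namely that single quotes defer $ENV expansion to the shell that receives the command:

```shell
# Equivalent behaviour with plain docker run (requires Docker):
#   docker run --rm -e ENV=production python:3.9 sh -c 'echo $ENV'
# The single quotes keep $ENV from being expanded too early,
# so the inner shell resolves it from its environment:
ENV=production sh -c 'echo $ENV'
```

Either form prints production.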
You want to speed up your Jenkins pipeline builds by caching dependencies inside a Docker container used as an agent. Which approach is best?
Think about how Docker volumes can help keep data between container runs.
Mounting a volume for the dependency cache lets the container reuse packages downloaded in earlier builds, which speeds builds up significantly. Pulling a fresh image or reinstalling dependencies on every build does the opposite. Running builds directly on the master without Docker gives up the isolation and reproducibility benefits of containerized agents.
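As a sketch, assuming a Node.js build and a hypothetical agent-local cache directory at /var/cache/jenkins/npm, the volume mount could look like this:

    pipeline {
        agent {
            docker {
                image 'node:16'
                // Reuse the npm cache across builds by mounting a
                // host directory onto the container's cache location
                args '-v /var/cache/jenkins/npm:/root/.npm'
            }
        }
        stages {
            stage('Install') {
                steps {
                    // npm reads and writes /root/.npm, so repeat
                    // builds hit the cache instead of the network
                    sh 'npm ci'
                }
            }
        }
    }

The same pattern works for other ecosystems, for example mounting a host directory onto /root/.m2 for Maven or /root/.cache/pip for pip.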