Complete the code to create a Spark session that can use spot instances.
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName('SpotInstanceApp') \
    .config('spark.[1]', 'true') \
    .getOrCreate()
The configuration spark.dynamicAllocation.enabled lets Spark add and remove executors as the workload changes. This is useful with spot instances: idle executors can be released to save costs, and executors lost when spot capacity is reclaimed can be replaced.
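A fuller session-configuration sketch, assuming a cluster manager that supports dynamic allocation (e.g. YARN or Kubernetes); the executor counts are illustrative, and shuffle tracking is a Spark 3.0+ option:

```python
from pyspark.sql import SparkSession

# Illustrative settings; exact values depend on your cluster and workload.
spark = (
    SparkSession.builder
    .appName('SpotInstanceApp')
    .config('spark.dynamicAllocation.enabled', 'true')
    # Lets dynamic allocation work without an external shuffle service
    # (available in Spark 3.0+).
    .config('spark.dynamicAllocation.shuffleTracking.enabled', 'true')
    .config('spark.dynamicAllocation.minExecutors', '1')
    .getOrCreate()
)
```

This is a configuration sketch rather than a runnable snippet in isolation: it needs a Spark installation and a cluster (or local mode) to execute.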
Complete the code to set the maximum number of executors for spot instances.
spark.conf.set('spark.dynamicAllocation.maxExecutors', [1])
Setting spark.dynamicAllocation.maxExecutors to '10' caps the number of executors Spark can allocate, which keeps the worst-case spend bounded when running on spot instances.
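To see why capping executors caps cost, here is a plain-Python back-of-the-envelope calculation; the price figure is hypothetical:

```python
# Hypothetical figures: 10 executors max, each on a spot node costing $0.05/hour.
max_executors = 10
spot_price_per_hour = 0.05  # USD/hour, illustrative

# Worst-case hourly spend if Spark scales all the way up to the cap.
max_hourly_cost = max_executors * spot_price_per_hour
print(max_hourly_cost)  # 0.5
```

Without the cap, dynamic allocation could keep requesting executors as long as tasks are pending, so the spend has no fixed upper bound.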
Fix the error in the code to request spot instances with a maximum bid price.
spark.conf.set('spark.executor.instances', '5')
spark.conf.set('spark.executor.spotBidPrice', [1])
The bid price should be a string holding a plain numeric value with no currency symbol, so '0.5' is correct while '$0.50' is not. Note that spark.executor.spotBidPrice is not a built-in Spark property; in practice, spot bid prices are set through the cluster manager or cloud provider (for example, when provisioning EMR or EC2 nodes), so treat the property name here as illustrative.
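A small plain-Python check (the helper name is made up for illustration) shows why '0.5' parses cleanly while a value with a currency symbol does not:

```python
def is_valid_bid_price(bid: str) -> bool:
    """Return True if the string is a plain positive numeric value."""
    try:
        return float(bid) > 0
    except ValueError:
        # Currency symbols or other non-numeric characters land here.
        return False

print(is_valid_bid_price('0.5'))    # True
print(is_valid_bid_price('$0.50'))  # False: currency symbol breaks parsing
```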
Fill both blanks to create a dictionary that maps instance types to their spot prices.
spot_prices = { [1]: [2] for [1] in ['m4.large', 'm4.xlarge'] }
The dictionary comprehension uses each instance type as a key (blank [1] is the loop variable) and its spot price as the value (blank [2]). For example, 'm4.large' is a key and 0.05 could be its price.
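One way to fill the blanks, shown with a hypothetical lookup table of prices:

```python
# Hypothetical spot prices per instance type, in USD/hour.
known_prices = {'m4.large': 0.05, 'm4.xlarge': 0.10}

# Blank [1] is the loop variable (the instance type, used as the key);
# blank [2] is its price, here looked up from known_prices.
spot_prices = {inst: known_prices[inst] for inst in ['m4.large', 'm4.xlarge']}
print(spot_prices)  # {'m4.large': 0.05, 'm4.xlarge': 0.1}
```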
Fill all three blanks to filter a DataFrame for spot instances with price less than 0.1 and select instance type and price.
filtered_df = df.filter(df.price [1] [2]).select([3], 'price')
Blank [1] is the '<' operator and blank [2] is 0.1, so the filter keeps rows where price is less than 0.1; blank [3] is 'instance_type', so the select returns the 'instance_type' and 'price' columns.
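The same filter-then-select logic can be mirrored in plain Python over a list of dicts (the sample rows are made up), which makes the semantics easy to check without a Spark cluster:

```python
# Made-up sample rows standing in for the DataFrame.
rows = [
    {'instance_type': 'm4.large',  'price': 0.05, 'zone': 'us-east-1a'},
    {'instance_type': 'm4.xlarge', 'price': 0.12, 'zone': 'us-east-1b'},
]

# Equivalent of df.filter(df.price < 0.1).select('instance_type', 'price'):
filtered = [
    {'instance_type': r['instance_type'], 'price': r['price']}
    for r in rows
    if r['price'] < 0.1
]
print(filtered)  # [{'instance_type': 'm4.large', 'price': 0.05}]
```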