Hadoop · ~10 mins

Migration from Hadoop to cloud-native - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: fill in blank (easy)

Complete the code to list files in a Hadoop directory.

Hadoop
hadoop fs -ls [1]
A. /data/input
B. /user/hadoop
C. /tmp
D. /var/log
Common Mistakes
Using a directory path that does not exist.
Forgetting the leading slash in the path.
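As a quick reference, the completed listing command might look like this (a sketch: it assumes a running HDFS cluster, and the paths are the quiz's own placeholders):

```shell
# List the contents of an HDFS directory (HDFS cluster assumed).
hadoop fs -ls /user/hadoop

# The leading slash matters: without it, the path is resolved
# relative to the current user's HDFS home directory (/user/<name>).
hadoop fs -ls /data/input
```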
Task 2: fill in blank (medium)

Complete the code to copy a file from HDFS to the local filesystem.

Hadoop
hadoop fs -[1] /data/input/file.txt ./
A. move
B. put
C. get
D. copyFromLocal
Common Mistakes
Using put, which uploads files to HDFS instead.
Using copyFromLocal, which copies from local to HDFS.
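The download direction and its opposites, as a sketch (HDFS cluster assumed; paths are the quiz's placeholders):

```shell
# HDFS -> local: -get downloads a file into the current directory.
hadoop fs -get /data/input/file.txt ./

# local -> HDFS: -put uploads, the reverse of -get.
hadoop fs -put ./file.txt /data/input/

# -copyToLocal and -copyFromLocal behave like -get and -put,
# but are restricted to local-filesystem endpoints.
```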
Task 3: fill in blank (hard)

Fix the error in the command to remove a Hadoop directory recursively.

Hadoop
hadoop fs -rm -[1] /data/old_data
A. R
B. f
C. rf
D. r
Common Mistakes
Using -rf, which is Unix rm syntax; hadoop fs -rm requires separate flags.
Using -f alone, which does not remove directories recursively.
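The distinction between Unix rm and hadoop fs -rm can be sketched as follows (the HDFS lines are commented out because they need a running cluster; the /tmp path is a hypothetical local demo):

```shell
# HDFS: flags must be given separately; combined -rf is rejected.
# hadoop fs -rm -r /data/old_data       # recursive delete
# hadoop fs -rm -r -f /data/old_data    # also ignore "no such file" errors

# Unix rm, by contrast, accepts the combined form:
mkdir -p /tmp/demo_old_data
rm -rf /tmp/demo_old_data
```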
Task 4: fill in blank (hard)

Fill in both blanks to build the command that copies data from HDFS to AWS S3.

Hadoop
hadoop distcp [1] [2]
A. hdfs://namenode:8020/data/input
B. s3a://mybucket/data/input
C. s3n://mybucket/data/input
D. hdfs://namenode:9000/data/input
Common Mistakes
Using s3n://, which is deprecated in favor of s3a://.
Mixing up source and destination paths.
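The completed DistCp invocation, as a sketch (the namenode address and bucket are the quiz's placeholders; S3A credentials are assumed to be configured):

```shell
# Copy from HDFS (source first) to S3 (destination second).
# s3a:// is the current S3 connector; s3n:// is deprecated.
hadoop distcp \
  hdfs://namenode:8020/data/input \
  s3a://mybucket/data/input
```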
Task 5: fill in blank (hard)

Fill in all three blanks to complete the Spark command that reads data from cloud storage and shows the first 5 rows.

Spark (Python)
spark.read.format([1]).load([2]).[3]
A. "parquet"
B. "s3a://mybucket/data/input"
C. show(5)
D. show
Common Mistakes
Using show without parentheses or row count.
Using incorrect format strings like "csv" instead of "parquet".
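One way to run the completed snippet is to feed it to the PySpark shell (a sketch: it assumes PySpark is installed and S3A credentials are configured; the bucket is the quiz's placeholder):

```shell
# Read Parquet data from S3 and print the first 5 rows.
pyspark <<'EOF'
spark.read.format("parquet").load("s3a://mybucket/data/input").show(5)
EOF
```

show(5) limits the printout to five rows; a bare show without parentheses is just a method reference and prints nothing.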