Hadoop · ~10 mins

Sqoop for database imports in Hadoop - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 · Fill in the blank · Easy

Complete the code to import data from a MySQL database using Sqoop.

sqoop import --connect jdbc:mysql://localhost/db --username user --password pass --table [1]
Options:
A. users
B. employees
C. data
D. hadoop
Common Mistakes
Using the database name instead of the table name.
Confusing username with table name.
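For reference, the later tasks all import the employees table, so the intended answer here appears to be option B. A completed command under that assumption (the connection string, username, and password are the placeholder values from the prompt) would look like:

```shell
# --table names the source table to import. The database (db) is
# already part of the JDBC URL, so it does not belong here.
sqoop import \
  --connect jdbc:mysql://localhost/db \
  --username user --password pass \
  --table employees
```

With no other options, Sqoop writes the rows to a directory named after the table under the user's HDFS home directory.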
Task 2 · Fill in the blank · Medium

Complete the code to specify the target directory in HDFS for the imported data.

sqoop import --connect jdbc:mysql://localhost/db --username user --password pass --table employees --target-dir [1]
Options:
A. /user/root/data
B. /data/mysql
C. /tmp/import
D. /user/hadoop/employees
Common Mistakes
Using local file system paths instead of HDFS paths.
Choosing a directory that may not exist or is not writable.
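Judging by the command used in Task 5, the intended choice appears to be option D. A completed command under that assumption:

```shell
# --target-dir takes an HDFS path, not a local filesystem path.
# The directory must not already exist, or the import fails.
sqoop import \
  --connect jdbc:mysql://localhost/db \
  --username user --password pass \
  --table employees \
  --target-dir /user/hadoop/employees
```

Any of the listed paths is a syntactically valid HDFS path; what matters is that the running user can create the directory and that it does not already exist.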
Task 3 · Fill in the blank · Hard

Complete the Sqoop import command by supplying a valid number of mappers.

sqoop import --connect jdbc:mysql://localhost/db --username user --password pass --table employees --num-mappers [1]
Options:
A. ten
B. zero
C. 4
D. -1
Common Mistakes
Using zero or negative numbers.
Using words instead of numbers.
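Only option C is a valid value: --num-mappers must be a positive integer literal, which rules out words (ten), zero, and negative numbers. The completed command:

```shell
# --num-mappers (short form: -m) controls how many parallel map
# tasks split the import; 4 is also Sqoop's default.
sqoop import \
  --connect jdbc:mysql://localhost/db \
  --username user --password pass \
  --table employees \
  --num-mappers 4
```

With more than one mapper, Sqoop splits the table on its primary key (or a column given via --split-by), so each mapper imports a disjoint range of rows.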
Task 4 · Fill in the blank · Hard

Fill both blanks to import only rows where the salary is greater than 50000.

sqoop import --connect jdbc:mysql://localhost/db --username user --password pass --table employees --where "[1] [2] 50000"
Options:
A. salary
B. >
C. <
D. age
Common Mistakes
Using the wrong column name.
Using the wrong comparison operator.
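"Salary greater than 50000" maps to the column salary (A) and the operator > (B). The completed command:

```shell
# --where appends a SQL predicate to the generated SELECT, so the
# filtering happens in MySQL before any rows are transferred to HDFS.
sqoop import \
  --connect jdbc:mysql://localhost/db \
  --username user --password pass \
  --table employees \
  --where "salary > 50000"
```

The clause is quoted so the shell passes it to Sqoop as a single argument.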
Task 5 · Fill in the blank · Hard

Fill all three blanks to import data and save it as a text file with tab delimiter.

sqoop import --connect jdbc:mysql://localhost/db --username user --password pass --table employees --target-dir /user/hadoop/employees --as-[1]file --fields-terminated-by '[2]' --num-mappers [3]
Options:
A. text
B. \t
C. 4
D. sequence
Common Mistakes
Using wrong file format option.
Using wrong delimiter character.
Using invalid number of mappers.
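A text file with a tab delimiter calls for text (A), \t (B), and a positive mapper count, 4 (C). Assuming those answers, the completed command:

```shell
# --as-textfile writes plain-text output (the alternative option,
# --as-sequencefile, would produce binary SequenceFiles).
# --fields-terminated-by '\t' puts a tab between columns.
# --num-mappers 4 runs the import as four parallel map tasks.
sqoop import \
  --connect jdbc:mysql://localhost/db \
  --username user --password pass \
  --table employees \
  --target-dir /user/hadoop/employees \
  --as-textfile \
  --fields-terminated-by '\t' \
  --num-mappers 4
```

The delimiter is single-quoted so the shell passes the two characters \t through literally for Sqoop to interpret as a tab.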