Hadoop (~10 mins)

Why Hadoop was created for big data - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the sentence to show the main reason Hadoop was created.

Hadoop was created to handle [1] data volumes that traditional systems could not manage.
A. large
B. small
C. structured
D. simple
Common mistakes:
- Choosing 'small' because it sounds easier
- Confusing data type with data size
2. Fill in the blank (medium)

Complete the sentence to explain what Hadoop uses to store big data.

Hadoop uses [1] to store data across many computers.
A. Cloud storage
B. SQL databases
C. HDFS
D. Local files
Common mistakes:
- Choosing 'SQL databases' because of familiarity
- Choosing 'Cloud storage', which is a different technology
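To see what "storing data across many computers" means in practice, here is a minimal sketch of the HDFS idea: a large file is cut into fixed-size blocks, and the blocks are spread across machines. The function names (`split_into_blocks`, `assign_blocks`) and the tiny block size are illustrative assumptions, not real HDFS APIs.

```python
# Illustrative sketch of the HDFS storage idea; not the real HDFS API.
BLOCK_SIZE = 4  # real HDFS defaults to 128 MB blocks; tiny here for the demo

def split_into_blocks(data, block_size=BLOCK_SIZE):
    """Cut the data into equal-size blocks (the last one may be shorter)."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def assign_blocks(blocks, nodes):
    """Round-robin each block onto a node, as a simple NameNode might."""
    placement = {node: [] for node in nodes}
    for i, block in enumerate(blocks):
        placement[nodes[i % len(nodes)]].append(block)
    return placement

data = "HadoopStoresBigData!"
blocks = split_into_blocks(data)                            # 5 blocks of <= 4 chars
placement = assign_blocks(blocks, ["node1", "node2", "node3"])
```

Each node ends up holding only a slice of the file, which is why no single machine needs to fit the whole dataset.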
3. Fill in the blank (hard)

Complete the sentence about Hadoop's processing model.

Hadoop processes data using the [1] model, which breaks tasks into small parts.
A. MapReduce
B. Batch
C. Stream
D. Real-time
Common mistakes:
- Choosing 'Batch', which is a general term
- Choosing 'Stream' or 'Real-time', which Hadoop does not primarily use
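The MapReduce model is easiest to see in the classic word-count example: a map phase emits (word, 1) pairs, a shuffle step groups pairs by key, and a reduce phase sums each group. This toy version runs in one process to mimic the model only; real Hadoop runs the same phases in parallel across a cluster.

```python
# Toy word count in the MapReduce style (single process, for illustration).
from collections import defaultdict

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in the line."""
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the values for each key."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big cluster", "big data"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))   # {'big': 3, 'data': 2, 'cluster': 1}
```

Because each map call and each reduce call is independent, the small parts can run on different machines at the same time, which is the point of the model.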
4. Fill in the blank (hard)

Fill both blanks to describe Hadoop's fault tolerance and scalability.

Hadoop achieves fault tolerance by [1] data and scales by adding [2] to the cluster.
A. replicating
B. removing nodes
C. nodes
D. compressing
Common mistakes:
- Choosing 'removing nodes', which reduces capacity
- Choosing 'compressing', which is unrelated to fault tolerance
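Replication-based fault tolerance can be sketched in a few lines: each block is stored on three distinct nodes (HDFS's default replication factor), so losing any one node loses no data, and adding a node to the list grows capacity. The helper names and round-robin placement are assumptions for illustration, not Hadoop's actual placement policy.

```python
# Sketch of replication-based fault tolerance; not Hadoop's real placement policy.
REPLICATION = 3  # HDFS's default replication factor

def place_replicas(blocks, nodes, replication=REPLICATION):
    """Put each block on `replication` distinct nodes, round-robin."""
    placement = {node: set() for node in nodes}
    for i, block in enumerate(blocks):
        for r in range(replication):
            placement[nodes[(i + r) % len(nodes)]].add(block)
    return placement

def surviving_blocks(placement, failed_node):
    """Blocks still readable after one node fails."""
    return set().union(*(blocks for node, blocks in placement.items()
                         if node != failed_node))

nodes = ["n1", "n2", "n3", "n4"]          # scaling = appending more nodes here
placement = place_replicas(["b0", "b1", "b2"], nodes)
alive = surviving_blocks(placement, "n1")  # every block survives the failure
```

With three copies of every block, the cluster tolerates a node failure transparently; scaling out is just extending the `nodes` list so new blocks (and re-replicated copies) land on the extra machines.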
5. Fill in the blank (hard)

Fill all three blanks to complete the explanation of Hadoop's design principles.

Hadoop was designed to handle [1] data by distributing storage with [2] and processing with [3].
A. small
B. HDFS
C. MapReduce
D. large
Common mistakes:
- Choosing 'small' instead of 'large'
- Mixing up the roles of HDFS (storage) and MapReduce (processing)