Streams in Node.js exist to handle large data efficiently. Without streams, reading a big file means loading its entire contents into memory at once, which drives up memory use and delays processing until the whole file is available. Streams break the data into small chunks and let the program process each chunk as it arrives, keeping memory use low and allowing work to start immediately. The example code reads a file with a stream and logs each chunk's size. The execution table traces each chunk as it is received and processed, showing that memory stays low throughout. Key points cover why reading whole files at once is problematic, what chunks are, and how streaming reduces memory pressure. The quiz tests understanding of chunk sizes, stream-end events, and the memory-use difference between streaming and whole-file reads. Overall, streams let Node.js handle big data smoothly by reading and processing it piece by piece.