Downloading large files with a piped Node.js stream causes huge memory usage and an OOM error

I am using Node.js to download large files (300 MB) from a server and pipe the response to a file write stream. As far as I understand pipes in Node.js, the data flow is managed by Node and I don't have to handle draining or other backpressure events myself. The issue I face is that the memory usage of the Docker container where my application runs grows by roughly the same amount as the file being downloaded (i.e., it seems the file is being buffered in memory). This memory usage persists even after I delete the file inside the container. I am attaching the code used for creating the request and piping below for reference. The code runs fine but causes performance issues like huge memory/CPU usage and crashes with an OOM error. I am not able to understand what I am doing wrong.
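The setup is essentially a GET request whose response stream is piped into a file write stream, along the lines of the sketch below (the URL, destination path, and use of the built-in `https` module are placeholders; the actual code may use a different HTTP client):

```js
const https = require('https');
const fs = require('fs');

// Placeholder URL and destination path, for illustration only.
const url = 'https://example.com/large-file.bin';
const dest = '/tmp/large-file.bin';

https.get(url, (res) => {
  // Pipe the response stream directly into a file write stream.
  // pipe() is supposed to handle backpressure, so the response
  // should not accumulate in memory faster than the disk can absorb it.
  const file = fs.createWriteStream(dest);
  res.pipe(file);

  file.on('finish', () => {
    file.close();
    console.log('Download complete');
  });
}).on('error', (err) => {
  console.error('Download failed:', err);
});
```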