Core dump on process.stdout.write() or console.log() for large files #5131

Closed
arunvspider opened this issue Feb 7, 2016 · 4 comments
Labels
process Issues and PRs related to the process subsystem.

Comments

@arunvspider

Using a filesystem stream to read a large file (~6GB) with random contents and writing the read contents to the console using console.log() or process.stdout.write(), node hangs for about a minute and then dumps core.

No issues noted while piping the read stream to an fs writable stream; the file is written successfully.
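(Editor's sketch, not part of the original report: the working pipe-to-file variant presumably looks roughly like this, with "out.txt" as a placeholder path.)

var fs = require("fs");
// Piping lets the writable stream's back-pressure throttle the read,
// so memory use stays bounded. "out.txt" is only a placeholder name.
fs.createReadStream("randfile.txt").pipe(fs.createWriteStream("out.txt"));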

By the way, executing "cat" on the same large file completes successfully.

Node script:

var fs = require("fs");
var fstream = fs.createReadStream("randfile.txt");

fstream.on("readable", () => {
  process.stdout.write(fstream.read());
});

<--- Last few GCs --->

3022 ms: Mark-sweep 1334.3 (1458.1) -> 1334.3 (1458.1) MB, 21.2 / 0 ms [last resort gc].
3024 ms: Scavenge 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 0.8 / 0 ms [allocation failure].
3024 ms: Scavenge 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 0.5 / 0 ms [allocation failure].
3045 ms: Mark-sweep 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 20.8 / 0 ms [last resort gc].
3066 ms: Mark-sweep 1334.4 (1458.1) -> 1334.4 (1458.1) MB, 21.2 / 0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================
Security context: 0x2cd1a77e3ac1 <JS Object>
    1: toString [buffer.js:400] [pc=0x323ae154b0f6] (this=0x12e5a99444f9 <an Uint8Array with map 0x21206d5054f1>)
    2: /* anonymous */ [/home/arun/workspace/nodews/learning/stream/longfileread.js:~5] [pc=0x323ae159132f] (this=0x3a75328eb1c1 <a ReadStream with map 0x21206d518d69>)
    3: emit [events.js:~130] [pc=0x323ae159a476] (this=0x3a75328eb1c1 <a ReadStream with map 0x21206d518d69>,type=...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted (core dumped)

[bnoordhuis - fixed formatting]

@arunvspider
Author

Including Node and host version information:

$ node --version
v5.5.0
[arun@ ~]$
[arun@ ~]$ uname -a
Linux . 4.0.4-303.fc22.x86_64 #1 SMP Thu May 28 12:37:06 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux

@vkurchatkin
Contributor

This is not unexpected. You read faster than you write, and eventually run out of memory.
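(Editor's sketch, not part of the original comment: the simplest way to couple the read speed to the write speed is to pipe the file straight to stdout, which applies back-pressure automatically.)

var fs = require("fs");
// pipe() pauses the read stream whenever stdout's internal buffer is full
// and resumes it once the buffer drains, so memory use stays bounded.
fs.createReadStream("randfile.txt").pipe(process.stdout);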

@mscdex
Contributor

mscdex commented Feb 7, 2016

Writing to stdout and stderr is async, and as @vkurchatkin said, the output to the terminal is increasingly buffered in memory because those asynchronous writes take longer to complete than reading the data from the file does.
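(Editor's sketch, not from the original comment: that buffering can be kept in check by hand by honouring write()'s return value and waiting for the 'drain' event.)

var fs = require("fs");
var fstream = fs.createReadStream("randfile.txt");

fstream.on("data", (chunk) => {
  // write() returns false once stdout's internal buffer is full;
  // pause reading and resume when the buffer has drained.
  if (!process.stdout.write(chunk)) {
    fstream.pause();
    process.stdout.once("drain", () => fstream.resume());
  }
});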

@mscdex added the process label Feb 7, 2016
@bnoordhuis
Member

Let's close this. There is no back-pressure in the OP's example so there is no way for node.js to know it needs to slow down reading from that file. Caveat emptor.

I checked the documentation and it looks like I forgot to update doc/api/console.markdown in commit dac1d38; it still claims that stdio is synchronous unless it's a pipe. I'll file a pull request for that and link to this issue.
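(Editor's note, not from the original comment: whether writes to stdout block or get queued depends on what stdout is attached to; process.stdout.isTTY is one way to check for the terminal case at runtime.)

// process.stdout.isTTY is true when stdout is a terminal and
// undefined when stdout has been redirected to a file or pipe.
if (process.stdout.isTTY) {
  console.log("stdout is attached to a terminal");
} else {
  console.log("stdout is redirected (file or pipe)");
}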

bnoordhuis added a commit to bnoordhuis/io.js that referenced this issue Feb 7, 2016
Mea culpa, looks like I forgot to update console.markdown in commit
dac1d38 ("doc: stdout/stderr can block when directed to file").
This commit rectifies that.

Refs: nodejs#5131
PR-URL: nodejs#5133
Reviewed-By: Brian White <mscdex@mscdex.net>
Reviewed-By: Evan Lucas <evanlucas@me.com>
rvagg pushed a commit that referenced this issue Feb 8, 2016
rvagg pushed a commit that referenced this issue Feb 9, 2016
MylesBorins pushed a commit that referenced this issue Feb 22, 2016
MylesBorins pushed a commit that referenced this issue Feb 22, 2016
MylesBorins pushed a commit that referenced this issue Mar 2, 2016
scovetta pushed a commit to scovetta/node that referenced this issue Apr 2, 2016