For some days I have searched for a working solution to this error:

Error: EMFILE, too many open files

It seems that many people have the same problem. The usual answer involves increasing the number of file descriptors. So, I've tried this:

sysctl -w kern.maxfiles=20480

The default value is 10240. This seems a little strange to me, because the number of files I'm handling in the directory is under 10240. Even stranger, I still receive the same error after increasing the number of file descriptors.

Second question:

After a number of searches I found a workaround for the "too many open files" problem:

var fs = require('fs');

var requestBatches = {};
function batchingReadFile(filename, callback) {
    // First check to see if there is already a batch
    if (requestBatches.hasOwnProperty(filename)) {
        requestBatches[filename].push(callback);
        return;
    }

    // Otherwise start a new one and make a real request
    var batch = requestBatches[filename] = [callback];
    fs.readFile(filename, onRealRead);

    // Flush out the batch on complete
    function onRealRead() {
        delete requestBatches[filename];
        for (var i = 0, l = batch.length; i < l; i++) {
            batch[i].apply(null, arguments);
        }
    }
}

function printFile(file) {
    console.log(file);
}

var dir = "/Users/xaver/Downloads/xaver/xxx/xxx/";

var files = fs.readdirSync(dir);
for (var i in files) {
    var filename = dir + files[i];
    console.log(filename);
    batchingReadFile(filename, printFile);
}

Unfortunately I still receive the same error. What is wrong with this code?

One last question (I'm new to JavaScript and Node): I'm in the process of developing a web application with a lot of requests for about 5000 daily users. I have many years of experience in programming with other languages like Python and Java, so originally I thought to develop this application with Django or the Play framework. Then I discovered Node, and I must say that the idea of a non-blocking I/O model is really nice, seductive, and most of all very fast!

But what kind of problems should I expect with Node? Is it a production-proven web server? What are your experiences?

javascript osx node.js file-descriptor

Comment:
- "Is it a production proven web server?" May be a bit pedantic, but Node isn't a web server as such. – UpTheCreek

8 Answers

Using the graceful-fs module by Isaac Schlueter (node.js maintainer) is probably the most appropriate solution. It does incremental back-off if EMFILE is encountered. It can be used as a drop-in replacement for the built-in fs module.
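
A minimal sketch of the drop-in usage, assuming graceful-fs has been installed (npm install graceful-fs) and using a placeholder file path:

// graceful-fs mirrors the built-in fs API, so requiring it in place of fs is enough;
// it retries operations that fail with EMFILE instead of erroring out immediately.
var fs = require('graceful-fs');

fs.readFile('/some/file.txt', function (err, data) {
    if (err) throw err;
    console.log(data.toString());
});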

Comments:
- Saved me. Why is this not the Node default? Why do I need to install some 3rd-party plugin to solve the issue? – Anthony Webb
- I think, generally speaking, Node tries to expose as much to the user as possible. This gives everyone (not just Node core developers) the opportunity to solve any problems arising from the use of this relatively raw interface. At the same time, it's really easy to publish solutions, and to download those published by others through npm. Don't expect a lot of smarts from Node itself. Instead, expect to find the smarts in packages published on npm. – Braveg1rl
- That's fine if it's your own code, but plenty of npm modules don't use this. – UpTheCreek
- This module solved all my issues! I agree that Node appears to be a little raw still, but mainly because it's really hard to understand what is going wrong with so little documentation and so few accepted solutions to known issues. – sidonaldson
- How do you npm it? How do I combine this in my code instead of the regular fs? – Aviram Netanel

For when graceful-fs doesn't work, or you just want to understand where the leak is coming from, follow this process.

(e.g. graceful-fs isn't gonna fix your wagon if your issue is with sockets.)

From My Blog Article: http://www.blakerobertson.com/devlog/2014/1/11/how-to-determine-whats-causing-error-connect-emfile-nodejs.html

How To Isolate

This command will list the open handles for nodejs processes:

lsof -i -n -P | grep nodejs

COMMAND     PID    USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
...
nodejs 12211 root 1012u IPv4 151317015 0t0 TCP 10.101.42.209:40371->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1013u IPv4 151279902 0t0 TCP 10.101.42.209:43656->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1014u IPv4 151317016 0t0 TCP 10.101.42.209:34450->54.236.3.168:80 (ESTABLISHED)
nodejs 12211 root 1015u IPv4 151289728 0t0 TCP 10.101.42.209:52691->54.236.3.173:80 (ESTABLISHED)
nodejs 12211 root 1016u IPv4 151305607 0t0 TCP 10.101.42.209:47707->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1017u IPv4 151289730 0t0 TCP 10.101.42.209:45423->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1018u IPv4 151289731 0t0 TCP 10.101.42.209:36090->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1019u IPv4 151314874 0t0 TCP 10.101.42.209:49176->54.236.3.172:80 (ESTABLISHED)
nodejs 12211 root 1020u IPv4 151289768 0t0 TCP 10.101.42.209:45427->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1021u IPv4 151289769 0t0 TCP 10.101.42.209:36094->54.236.3.170:80 (ESTABLISHED)
nodejs 12211 root 1022u IPv4 151279903 0t0 TCP 10.101.42.209:43836->54.236.3.171:80 (ESTABLISHED)
nodejs 12211 root 1023u IPv4 151281403 0t0 TCP 10.101.42.209:43930->54.236.3.172:80 (ESTABLISHED)
....

Notice the 1023u (last line) - that's the 1024th file handle, which is the default maximum.

Now look at the last column; that indicates which resource is open. You'll probably see a number of lines all with the same resource name. Hopefully, that tells you where to look in your code for the leak.

If you're running multiple node processes, first look up which process has PID 12211; that'll tell you which process is responsible.

In my case above, I noticed that there were a bunch of very similar IP addresses. They were all 54.236.3.###. By doing IP address lookups, I was able to determine that they were pubnub related.

Command Reference

Use this syntax to determine how many handles a process has open...

To get a count of open files for a certain pid

I used this command to test the number of files that were opened after doing various events in my app.

lsof -i -n -P | grep "8465" | wc -l

# lsof -i -n -P | grep "nodejs.*8465" | wc -l
28
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
31
# lsof -i -n -P | grep "nodejs.*8465" | wc -l
34
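
If you're not sure which PID to grep for, the process can log its own at startup; process.pid is built into Node:

// Print the process id so it can be plugged into the lsof commands above.
console.log('node pid:', process.pid);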

What is your process limit?

ulimit -a

The line you want will look like this:  open files (-n) 1024

Comments:
- How can you change the open files limit? – 2619
- ulimit -n 2048 to allow 2048 open files. – Gael

I ran into this problem today, and finding no good solutions for it, I created a module to address it. I was inspired by @fbartho's snippet, but wanted to avoid overwriting the fs module.

The module I wrote is Filequeue, and you use it just like fs:

var Filequeue = require('filequeue');
var fq = new Filequeue(200); // max number of files to open at once

fq.readdir('/Users/xaver/Downloads/xaver/xxx/xxx/', function(err, files) {
    if (err) {
        throw err;
    }
    files.forEach(function(file) {
        fq.readFile('/Users/xaver/Downloads/xaver/xxx/xxx/' + file, function(err, data) {
            // do something here
        });
    });
});


You're reading too many files at once. Node reads files asynchronously, so you end up opening all of the files at once; you're probably trying to read all 10240 at once.

See if this works:

var fs = require('fs');
var events = require('events');
var util = require('util');
var path = require('path');

var FsPool = module.exports = function(dir) {
    events.EventEmitter.call(this);
    this.dir = dir;
    this.files = [];
    this.active = [];
    this.threads = 1;
    this.on('run', this.runQuota.bind(this));
};

// So it will act like an event emitter
util.inherits(FsPool, events.EventEmitter);

FsPool.prototype.runQuota = function() {
    if (this.files.length === 0 && this.active.length === 0) {
        return this.emit('done');
    }
    if (this.active.length < this.threads) {
        var name = this.files.shift();
        this.active.push(name);
        var fileName = path.join(this.dir, name);
        var self = this;
        fs.stat(fileName, function(err, stats) {
            if (err)
                throw err;
            if (stats.isFile()) {
                fs.readFile(fileName, function(err, data) {
                    if (err)
                        throw err;
                    self.active.splice(self.active.indexOf(name), 1);
                    self.emit('file', name, data);
                    self.emit('run');
                });
            } else {
                self.active.splice(self.active.indexOf(name), 1);
                self.emit('dir', name);
                self.emit('run');
            }
        });
    }
    return this;
};

FsPool.prototype.init = function() {
    var dir = this.dir;
    var self = this;
    fs.readdir(dir, function(err, files) {
        if (err)
            throw err;
        self.files = files;
        self.emit('run');
    });
    return this;
};

var fsPool = new FsPool(__dirname);

fsPool.on('file', function(fileName, fileData) {
    console.log('file name: ' + fileName);
    console.log('file data: ', fileData.toString('utf8'));
});
fsPool.on('dir', function(dirName) {
    console.log('dir name: ' + dirName);
});
fsPool.on('done', function() {
    console.log('done');
});
fsPool.init();


I just finished writing a little snippet of code to solve this problem myself; all of the other solutions appear way too heavyweight and require you to change your program structure.

This solution just stalls any fs.readFile or fs.writeFile calls so that there are no more than a set number in flight at any given time.

// Queuing reads and writes, so your nodejs script doesn't overwhelm system limits catastrophically
var fs = require('fs');

global.maxFilesInFlight = 100; // Set this value to some number safeish for your system
var origRead = fs.readFile;
var origWrite = fs.writeFile;

var activeCount = 0;
var pending = [];

var wrapCallback = function(cb) {
    return function() {
        activeCount--;
        cb.apply(this, Array.prototype.slice.call(arguments));
        if (activeCount < global.maxFilesInFlight && pending.length) {
            console.log("Processing Pending read/write");
            pending.shift()();
        }
    };
};

fs.readFile = function() {
    var args = Array.prototype.slice.call(arguments);
    if (activeCount < global.maxFilesInFlight) {
        if (args[1] instanceof Function) {
            args[1] = wrapCallback(args[1]);
        } else if (args[2] instanceof Function) {
            args[2] = wrapCallback(args[2]);
        }
        activeCount++;
        origRead.apply(fs, args);
    } else {
        console.log("Delaying read:", args[0]);
        pending.push(function() {
            fs.readFile.apply(fs, args);
        });
    }
};

fs.writeFile = function() {
    var args = Array.prototype.slice.call(arguments);
    if (activeCount < global.maxFilesInFlight) {
        if (args[1] instanceof Function) {
            args[1] = wrapCallback(args[1]);
        } else if (args[2] instanceof Function) {
            args[2] = wrapCallback(args[2]);
        }
        activeCount++;
        origWrite.apply(fs, args);
    } else {
        console.log("Delaying write:", args[0]);
        pending.push(function() {
            fs.writeFile.apply(fs, args);
        });
    }
};

Comments:
- You should make a repo for this on github. – Nick
- This works very well if graceful-fs is not working for you. – Ceekay

With bagpipe, you just need to change

fs.readFile(filename, onRealRead);

=>

var bagpipe = new Bagpipe(10);

bagpipe.push(fs.readFile, filename, onRealRead);

Bagpipe helps you limit the parallelism. More details: https://github.com/JacksonTian/bagpipe
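
For context, here is a minimal, self-contained sketch of how that substitution might look against the directory loop from the question; it assumes bagpipe has been installed (npm install bagpipe) and exposes the Bagpipe(limit) constructor and push(fn, ...args, callback) method described in its README:

var fs = require('fs');
var Bagpipe = require('bagpipe');

// Allow at most 10 concurrent readFile calls; the rest wait in bagpipe's queue.
var bagpipe = new Bagpipe(10);

var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';
fs.readdir(dir, function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        // push() queues the call and runs it when a concurrency slot frees up.
        bagpipe.push(fs.readFile, dir + file, function (err, data) {
            if (err) throw err;
            console.log(file, data.length);
        });
    });
});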

Comments:
- It's all in Chinese or another Asian language. Is there any documentation written in English? – Fatih Arslan
- @FatihArslan An English doc is available now. – user1837639
- Or use async.js. – Melbourne2991
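
As the last comment suggests, the async library offers the same kind of throttling. Here is a minimal sketch using async.eachLimit, assuming async has been installed (npm install async) and reusing the directory from the question:

var fs = require('fs');
var async = require('async');

var dir = '/Users/xaver/Downloads/xaver/xxx/xxx/';
var files = fs.readdirSync(dir);

// Read at most 10 files concurrently; the next read starts as soon as a slot frees up.
async.eachLimit(files, 10, function (file, done) {
    fs.readFile(dir + file, function (err, data) {
        if (err) return done(err);
        console.log(file, data.length);
        done();
    });
}, function (err) {
    if (err) throw err;
    console.log('all files read');
});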

Had the same problem when running the nodemon command, so I reduced the number of files open in Sublime Text and the error disappeared.

Comment:
- I, too, was getting EMFILE errors and through trial and error noticed that closing some Sublime windows resolved the issue. I still don't know why. I tried adding ulimit -n 2560 to my .bash_profile, but that didn't solve the issue. Does this indicate a need to change to Atom instead? – The Qodesmith

cwait is a general solution for limiting concurrent executions of any functions that return promises.

In your case the code could be something like:

var Promise = require('bluebird');
var cwait = require('cwait');

// Allow max. 10 concurrent file reads.
var queue = new cwait.TaskQueue(Promise, 10);
var read = queue.wrap(Promise.promisify(batchingReadFile));

Promise.map(files, function(filename) {
    console.log(filename);
    return(read(filename));
});


