NodeJs: slow req.pipe

2024-02-28

I found that the tus (https://tus.io) server implementation for Node.js (https://github.com/tus/tus-node-server) is really slow compared with the Go implementation (https://github.com/tus/tusd).

Here is a comparison between the different implementations (run locally, same machine, same input):

nodejs:

[2019-01-31 16:22:45,578] INFO Uploading 52428800 bytes chunk from offset: 104857600
[2019-01-31 16:22:47,329] INFO Total bytes sent: 157286400 (kb/s: 29930)

go:

[2019-01-31 16:26:31,894] INFO Uploading 52428800 bytes chunk from offset: 104857600
[2019-01-31 16:26:32,348] INFO Total bytes sent: 209715200 (kb/s: 115639)

I explored the tus-node-server codebase and then built a really simplified server implementation (I tried to reduce any possible overhead).

Here is the code:

const fs = require('fs');
const express = require('express');
const app = express();

let offset = 0;
let len = Math.pow(2,30);

app.post('/files',(req,res) => {
    console.log("post received");
    res.set({
        'Location': 'http://localhost:8888/files/test',
        'Tus-Resumable': '1.0.0',
    });
    res.status(201).end();
});

app.options('/files',(req,res) => {
    console.log("options received");
    res.set({
        'Location': 'http://localhost:8888/files/test',
        'Tus-Resumable': '1.0.0',
        'Tus-Version': '1.0.0,0.2.2,0.2.1'
    });
    res.status(200).end();
});

app.head('/files/test',(req,res) => {
    console.log("options received");
    res.set({
        'Upload-Offset': offset,
        'Upload-Length': len
    });
    res.status(200).end();
});

app.patch('/files/test',(req, res) => {
    let localOffset = parseInt(req.get('Upload-Offset'), 10);
    // the file is pre-created
    const path = `./file.tmp`;
    const options = {
        flags: 'r+',
        start: localOffset
    };

    const stream = fs.createWriteStream(path, options);

    let new_offset = 0;
    req.on('data', (buffer) => {
        new_offset += buffer.length;
    });


    return req.pipe(stream).on('finish', () => {

        localOffset += new_offset;

        offset = localOffset;

        res.set({
            'Upload-Offset': offset,
            'Upload-Length': len
        });
        res.status(204).end();
    });


});

const host = 'localhost';
const port = 8888;
app.listen(port, host, (err, resp) => {
    if(err) {
        console.error(err);
        return
    }
    console.log('listening')
});

I think the poor performance is due to the following block of code:

const stream = fs.createWriteStream(path, options);
req.pipe(stream)

I also checked a file copy using pipes, and there I got good performance (similar to the Go implementation):

const fs = require('fs');
const path = require('path');
const from = path.normalize(process.argv[2]);
const to = path.normalize(process.argv[3]);

const readOpts = {}; // {highWaterMark: Math.pow(2,16)};
const writeOpts = {}; // {highWaterMark: Math.pow(2,16)};

const startTs = Date.now();
const source = fs.createReadStream(from, readOpts);
const dest = fs.createWriteStream(to, writeOpts);
let offset = 0;

source.on('data', (buffer) => {
    offset += buffer.length;
});

dest.on('error', (e) => {
    console.log('[FileStore] write: Error', e);
});

source.pipe(dest).on('finish',() => {
    const endTs = Date.now();
    const kbs = (offset / (endTs - startTs)) / 1000;
    console.log("SPEED: ", kbs, offset);
});

So the bottleneck seems to be the handling of the request and the pipe.

Can you help me understand what is going on and why it is so slow compared with the Go version?


I think you have a highWaterMark issue here.

The difference between your tests is due to:

  • req has a 16 kb highWaterMark
  • createReadStream has a 64 kb highWaterMark (https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options)

You can see the value with:

console.log('readableHighWaterMark', req.readableHighWaterMark);
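
As a quick sanity check, here is a minimal sketch (the ./file.tmp path and port 8889 are just placeholders) that prints both defaults, so you can confirm the 16 kb request buffer versus the 64 kb file read buffer yourself:

const fs = require('fs');
const http = require('http');

// fs.createReadStream defaults to a 64 kb (65536 bytes) highWaterMark
const source = fs.createReadStream('./file.tmp');
console.log('file read stream:', source.readableHighWaterMark);

// an incoming http request defaults to a 16 kb (16384 bytes) highWaterMark
http.createServer((req, res) => {
    console.log('incoming request:', req.readableHighWaterMark);
    res.end();
}).listen(8889);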

Instead, assuming your network latency is negligible (since you are on localhost), you could try creating the writeStream with a bigger highWaterMark:

const options = {
    flags: 'w',
    start: localOffset,
    highWaterMark: 1048576
};
const stream = fs.createWriteStream(path, options);

This should speed up writes, at the cost of more RAM.
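
To make it concrete, here is a minimal sketch of how the PATCH handler from the question could be wired with that option (it assumes the rest of the server above is unchanged; the 1048576 value is just an example, not a tuned number):

app.patch('/files/test', (req, res) => {
    let localOffset = parseInt(req.get('Upload-Offset'), 10);

    // same pre-created file as before, but with a 1 MiB write buffer
    const stream = fs.createWriteStream('./file.tmp', {
        flags: 'r+',
        start: localOffset,
        highWaterMark: 1048576
    });

    // count the bytes actually received on this request
    let bytesReceived = 0;
    req.on('data', (buffer) => {
        bytesReceived += buffer.length;
    });

    req.pipe(stream).on('finish', () => {
        offset = localOffset + bytesReceived;
        res.set({
            'Upload-Offset': offset,
            'Upload-Length': len
        });
        res.status(204).end();
    });
});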
