
Archiver with custom readableStream is not working? #435

Open
ibudisteanu opened this issue Jul 29, 2020 · 3 comments
Comments

@ibudisteanu
ibudisteanu commented Jul 29, 2020

Hi! I am planning to use Archiver in an open-source protocol, but I can't get it working with a custom readable stream. Here is my code; what am I doing wrong?

var archiver = require('archiver');
var fs = require('fs')

// create a file to stream archive data to.
var output = fs.createWriteStream(__dirname + '/example.zip');
var archive = archiver('zip', {
  zlib: { level: 9 } // Sets the compression level.
});



// listen for all archive data to be written
// 'close' event is fired only when a file descriptor is involved
output.on('close', function() {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});

// This event is fired when the data source is drained no matter what was the data source.
// It is not part of this library but rather from the NodeJS Stream API.
// @see: https://nodejs.org/api/stream.html#stream_event_end
output.on('end', function() {
  console.log('Data has been drained');
});

// good practice to catch warnings (ie stat failures and other non-blocking errors)
archive.on('warning', function(err) {
  if (err.code === 'ENOENT') {
    // log warning
  } else {
    // throw error
    throw err;
  }
});

// good practice to catch this error explicitly
archive.on('error', function(err) {
  throw err;
});

// pipe archive data to the file
archive.pipe(output);


/* Implementation */
var stream = require('stream');

// Create the custom stream
function Num(options) { 
    // Inherit properties
    stream.Readable.call(this, options);

    this._start = 0;
    this._end = 100;
    this._curr = this._start;
}

// Inherit prototype
Num.prototype = Object.create(stream.Readable.prototype);
Num.prototype.constructor = Num;

// Add my own implementation of the _read method
Num.prototype._read = function() {
    var num = this._curr;
    var buf = Buffer.from(num.toString(), 'utf-8');
    
    this.push(buf);
    this._curr++;

    if (num === this._end) {
        this.push(null);
    }
};


//Usage
var num = new Num();

// Listen for the 'data' event (fires every time a chunk of data arrives)
num.on('data', function(chunk) {
    console.log(chunk.toString('utf-8'));
});

archive.append( num, { name: 'file1.txt' });

archive.finalize();

It generates a zip file of 134 bytes. When I open the archive, I see my file 'file1.txt', but it is empty: its size is 0 bytes.

@jeff-r-koyaltech

I am facing this exact issue. When I .append() a stream, it will not fully process all chunks of data.

@totszwai
totszwai commented Jun 6, 2024

Is archiver abandonware? Has anyone figured out an alternative? I tried this:

const genBlob = (sz) => {
  let array = new Float64Array(sz);
  for (let i = 0; i < sz; i++)
    array[i] = rand();
  return new Uint8Array(array.buffer);
};

const largefilestream = new Stream.Readable();
archive.append(largefilestream, { name: '12345/10gb.bin' });
archive.finalize();

let numberOfGb = 1;
for (let i = 0; i < numberOfGb * 10; i++) {
  largefilestream.push(genBlob(12800000)); // ~100 MB per chunk
}
largefilestream.push(null);
largefilestream.destroy();

It created the zip, but the archive is corrupted.

@jeff-r-koyaltech

I ended up abandoning any attempt to use this library, and punted the problem to a golang microservice since I was in a containerized environment! 😆 How's that for a light slap in the face to the Node.js ecosystem?

https://github.com/scosman/zipstreamer
