yt-dlp support

This commit is contained in:
synt-xerror
2026-03-12 01:22:29 -03:00
parent 2040382842
commit 3780936e01
5435 changed files with 991931 additions and 137 deletions

node_modules/node-webpmux/README.md generated vendored Normal file

@@ -0,0 +1,423 @@
# node-webpmux
A mostly-complete pure Javascript re-implementation of webpmux.<br />
Can load "simple" lossy/lossless images as well as animations.
### Install
```npm install node-webpmux```
### Basic usage
```javascript
const WebP = require('node-webpmux');
let img = new WebP.Image();
// Load an animation
await img.load('img.webp');
// Extract the (unprocessed) fourth frame
await img.demux('.', { frame: 3 });
// Replace the fourth frame with a new image from disk
await img.replaceFrame(3, 'different.webp'); // This preserves the existing frame settings
// Alternatively you can do
// let frame = Image.generateFrame({ path: 'different.webp' });
// img.frames[3] = frame;
// Which will completely replace the frame
// Save a new copy
await img.save({ path: 'newimg.webp' });
// Or alternatively, img.save() to save over the existing one
```
### Exports
`TYPE_LOSSY`<br />
`TYPE_LOSSLESS`<br />
`TYPE_EXTENDED`<br />
Constants for what type of image is loaded.<br />
`encodeResults`: enum of values that `set[Image/Frame]Data` returns.<br />
`Image`: The main class.
### Class definition:
#### Class properties
##### `.width` (read-only)
The width of the loaded image.
##### `.height` (read-only)
The height of the loaded image.
##### `.type` (read-only)
The type of image from the TYPE_* constants table.
##### `.hasAnim` (read-only)
A boolean flag for easily checking if the image is an animation.
##### `.hasAlpha` (read-only)
A boolean flag for easily checking if the image has transparency in any way.
##### `.frames` (read-only)
Returns the array of frames, if any, or undefined.<br />
Note that while the frames themselves are read/write, you shouldn't modify them.
##### `.frameCount` (read-only)
The number of frames in the image's animation, or 0 if it's not an animation.
##### `.anim` (read-only)
Direct access to the raw animation data (see below in the _Layout for internal Image data_ section).
##### `.iccp` (read/write)
A Buffer containing the raw ICCP data stored in the image, or undefined if there isn't any.
##### `.exif` (read/write)
A Buffer containing the raw EXIF data stored in the image, or undefined if there isn't any.
##### `.xmp` (read/write)
A Buffer containing the raw XMP data stored in the image, or undefined if there isn't any.
#### Image member functions
##### `async .initLib()`
Calls Image.initLib(). This member function is no longer particularly useful and is kept for convenience.
##### `async .load(source)`
If `source` is a string, it tries to load that as a path to a WebP image.<br />
If `source` is a buffer, it tries to load the contents of the buffer as a WebP image.
##### `.convertToAnim()`
Sets the image up for being an animation.
##### `async .demux({ path = undefined, buffers = false, frame = -1, prefix = '#FNAME#', start = 0, end = 0 })`
Dump the individual, unprocessed WebP frames to a directory.
* `path`: The directory to dump the frames to, if desired.
* `buffers`: Return the frames as an array of Buffers instead of dumping to a path.
* `prefix`: What to prefix the frame names with. Default is the file name of the original image (without .webp).
Format is \<prefix\>_\<frame number\>.webp.
* `frame`: What frame to dump. Defaults to -1, which has it dump all available frames. Overrides `start`/`end`.
* `start`: The first frame to dump. Defaults to the first frame.
* `end`: The last frame to dump. Defaults to the last frame.
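As a sketch, these options combine like so (the input file `anim.webp`, the output directory, and the frame range are assumptions for illustration, not part of the API):

```javascript
// Sketch of demuxing an animation; 'anim.webp' and the ranges below are hypothetical.
let WebP;
try { WebP = require('node-webpmux'); } catch { WebP = null; /* package not installed; sketch only */ }

// Dumped files follow the <prefix>_<frame number>.webp naming described above:
function dumpedName(prefix, frameNumber) { return `${prefix}_${frameNumber}.webp`; }

async function dumpFrames() {
  const img = new WebP.Image();
  await img.load('anim.webp');
  await img.demux({ path: '.', prefix: 'anim' });        // every frame to disk
  return img.demux({ buffers: true, start: 1, end: 3 }); // frames 2-4 as Buffers
}

if (WebP) { dumpFrames().catch(console.error); }
```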
##### `async .replaceFrame(frameIndex, source)`
Replaces a frame in the animation with another image from `source`. All other frame settings are preserved.
* `frameIndex`: Which frame to replace. Frame indexes are 0-based.
* `source`: If this is a string, the frame is loaded from disk. If this is a Buffer, the frame is loaded from there.
##### `async .save(path = this.path, options)`
Save the image to `path`. Options are described below in the _Options for saving_ section.<br />
If `path` is `null`, this will save the image to a Buffer and return it.
##### `async .getImageData()`
Get the raw RGBA pixel data for the image.<br />
Returns a Buffer in the format `[ r, g, b, a, r, g, b, a, ... ]`. Values are range 0 - 255.<br />
Use this for non-animations.<br />
On error, this returns a Buffer full of 0s.
##### `async .setImageData(buffer, { width = 0, height = 0, preset = 0, quality = 75, exact = false, lossless = 0, method = 4, advanced = undefined })`
Encode `buffer` as a new WebP using the provided settings and replace the image pixel data with it.<br />
This preserves EXIF/ICCP/XMP if present.<br />
Use this for non-animations.<br />
* `buffer`: A Buffer object with the raw pixels in RGBA order.<br />
Options:
* `width`/`height`<br />
If either are > 0, override the existing width and/or height with this value.<br />
Use this if the pixel data in `buffer` has different dimensions than the original image.
* `preset`: What image preset to use, if any.<br />
Range is 0 - 5<br />
Default is 0 (DEFAULT).<br />
An enum of constants can be found under WebP.presets
* `quality`: What quality to set.<br />
Range is 0 - 100.<br />
Default is 75.
* `exact`: Preserve data in transparent pixels.<br />
Defaults to `false`, which means transparent pixels may be modified to help with compression.
* `lossless`: Save the data as a lossy/lossless image.<br />
Range is 0 - 9.<br />
Default is 0 (lossy).<br />
Higher values will result in smaller files, but require more processing time.
* `method`: Compression method to use.<br />
Range is 0 - 6.<br />
Default is 4.<br />
Higher values will result in smaller files, but require more processing time.
* `advanced`: Access to more advanced encoding settings offered by libwebp<br />
* * `imageHint`: Hint for what type of image it is (only used for lossless encoding for now, according to libwebp spec).<br />
Range is 0 - 3.<br />
Default is 0 (DEFAULT).<br />
An enum of constants can be found under WebP.hints
* * `targetSize`: Specifies the desired target size in bytes.<br />
Default is 0 (no target).<br />
Takes precedence over the `method` parameter.
* * `targetPSNR`: Specifies the minimum distortion to try to achieve.<br />
Default is 0 (no target).<br />
Takes precedence over the `targetSize` parameter.
* * `segments`: Maximum number of segments to use.<br />
Range is 1 - 4.<br />
Default is 4.
* * `snsStrength`: Spatial Noise Shaping.<br />
Range is 0 - 100.<br />
Default is 50.
* * `filterStrength`<br />
Range is 0 - 100.<br />
Default is 0 (off).
* * `filterSharpness`<br />
Range is 0 - 7, with 7 being the least sharp.<br />
Default is 0 (off).
* * `filterType`<br />
Range is 0 - 1.<br />
Default is 1.<br />
0 is simple; 1 is strong.<br />
Only used if `filterStrength` > 0 or `autoFilter` > 0.
* * `autoFilter`: Auto-adjust the filter's strength.<br />
Range is 0 - 1.<br />
Default is 0 (off).
* * `alphaCompression`: Algorithm for encoding the alpha plane.<br />
Range is 0 - 1.<br />
Default is 1 (Lossless).<br />
0 is off; 1 is lossless.
* * `alphaFiltering`: Predictive filtering method for the alpha plane.<br />
Range is 0 - 2.<br />
Default is 1 (Fast).<br />
0 is none; 1 is fast; 2 is best.
* * `alphaQuality`<br />
Range is 0 - 100.<br />
Default is 100.
* * `pass`: Number of entropy-analysis passes.<br />
Range is 1 - 10.<br />
Default is 1.
* * `showCompressed`: Export the compressed picture back.<br />
Range is 0 - 1.<br />
Default is 0 (don't).<br />
In-loop filtering is not applied.
* * `preprocessing`: Preprocessing filter.<br />
Range is 0 - 2.<br />
Default is 0 (None).<br />
0 is none; 1 is segment-smooth; 2 is pseudo-random dithering.
* * `partitions`: log2(number of token partitions).<br />
Range is 0 - 3.<br />
Default is 0.<br />
Higher values result in harder progressive decoding.
* * `partitionLimit`: Quality degradation allowed to fit the 512k limit on prediction modes coding.<br />
Range is 0 - 100.<br />
Default is 0.
* * `emulateJpegSize`: Compression parameters are remapped to better match the expected output size from JPEG compression.<br />
Range is 0 - 1.<br />
Default is 0 (Off).<br />
Generally, the output size will be smaller but the degradation will be lower.
* * `threadLevel`: Try to use multi-threaded encoding.<br />
Default is 0 (Off).<br />
NOTE: Currently the WebAssembly is NOT compiled with support for threads, so this option does nothing.<br />
NodeJS doesn't support threads in WebAssembly without an experimental flag, and my testing with it didn't appear to use threads regardless.
* * `lowMemory`: Reduce memory usage but increase CPU use.<br />
Range is 0 - 1.<br />
Default is 0 (Off).
* * `nearLossless`: Near lossless encoding.<br />
Range is 0 - 100.<br />
Default is 100 (off).<br />
0 is max loss, 100 is off.
* * `useDeltaPalette`: Reserved for future lossless feature.<br />
Range is 0 - 0.<br />
Default is 0 (Off).<br />
Setting this will do nothing, as it's forced back to 0.
* * `useSharpYUV`: Use sharp (and slow) RGB->YUV conversion.<br />
Range is 0 - 1.<br />
Default is 0 (Off).
* * `qMin`: Minimum permissible quality factor.<br />
Range is 0 - 100.<br />
Default is 0.
* * `qMax`: Maximum permissible quality factor.<br />
Range is 0 - 100.<br />
Default is 100.
If `lossless` is set above 0, then setting `quality` or `method` is discouraged as they will override settings in the lossless preset.<br />
Return value can be checked against the values in `WebP.encodeResults`.
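A hedged sketch tying these options together (the file name and the chosen option values are illustrative; the option names come from the list above, and `Image.initLib()` is required first, as described under _Static functions_):

```javascript
// Sketch: losslessly re-encode an image's pixel data. 'in.webp' is hypothetical.
let WebP;
try { WebP = require('node-webpmux'); } catch { WebP = null; /* package not installed; sketch only */ }

// Plain options object in the shape setImageData expects (values illustrative):
const encodeOpts = {
  lossless: 9,           // slowest, smallest lossless preset
  exact: true,           // preserve data in transparent pixels
  advanced: { pass: 3 }, // extra entropy-analysis passes
};

async function reencode(path) {
  await WebP.Image.initLib(); // required before [get/set]ImageData
  const img = new WebP.Image();
  await img.load(path);
  const pixels = await img.getImageData(); // RGBA Buffer
  const result = await img.setImageData(pixels, encodeOpts);
  // result can be compared against the values in WebP.encodeResults
  await img.save();
  return result;
}

if (WebP) { reencode('in.webp').catch(console.error); }
```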
##### `async .getFrameData(frameIndex)`
Get the raw RGBA pixel data for a specific frame.<br />
Use this for animations.<br />
* `frameIndex`: Which frame to get. Frame indexes are 0-based.<br />
Otherwise identical to `.getImageData()`
##### `async .setFrameData(frameIndex, buffer, { width = 0, height = 0, preset = 0, quality = 75, exact = false, lossless = 0, method = 4, advanced = undefined })`
Encode `buffer` as a new WebP using the provided settings and replace an existing frame's pixel data with it.<br />
Use this for animations.<br />
* `frameIndex`: Which frame to set. Frame indexes are 0-based.<br />
Otherwise identical to `.setImageData()`.
#### Static functions
##### `async Image.initLib()`
Initialize the internal library used for [get/set]ImageData and [get/set]FrameData described above.<br />
There is no need to call this unless you plan to use one of those 4 functions.
##### `Image.from(webp)`
Use the contents of `webp` and return a new Image using them.<br />
Mainly useful for passing Image into a Web Worker or NodeJS Worker and converting the passed object back into an Image instance.<br />
Such an approach can be used to greatly speed up saving of animations or multiple images as libwebp is *not* multi-threaded beyond saving the alpha layer of lossy images.
##### `async Image.save(path, options)`
Creates an empty image via `Image.getEmptyImage()` and saves it with the given `options`.<br />
Works the same as `.save()` otherwise.<br />
Can be used to create an animation from scratch by passing `frames` in `options`.<br />
&ensp; Example: `Image.save('animation.webp', { frames: ... })` for saving to file
&ensp; OR
&ensp; Example: `Image.save(null, { frames: ... })` for saving to Buffer
##### `async Image.getEmptyImage(ext)`
Returns a basic, lossy 1x1 black image with no alpha or metadata.<br />
Useful if you need to create a WebP from scratch, such as when converting from PNG.<br />
`.setImageData()` would be used to change the canvas size/contents.<br />
Set `ext` to `true` to force the image to be an extended type, if desired. This is mainly for use internally.
##### `async Image.generateFrame({ path = undefined, buffer = undefined, img = undefined, x = undefined, y = undefined, delay = undefined, blend = undefined, dispose = undefined })`
Generates enough of an `anmf` structure to be placed in `.frames`.<br />
Note that, at the moment, only *static* images are supported in this function.
* `path`/`buffer`/`img`
Only one of these can be present.
`path` will load image data from file.
`buffer` will load from the buffer.
`img` will use an existing Image instance.
* `x`/`y`/`delay`/`blend`/`dispose`
Explicitly set these properties. See the _Options for saving_ section for what these do.
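For instance, an animation could be assembled from still images roughly like this (the file names and the 100ms delay are assumptions for illustration):

```javascript
// Sketch: build an animation from still WebP files; all paths are hypothetical.
let WebP;
try { WebP = require('node-webpmux'); } catch { WebP = null; /* package not installed; sketch only */ }

// Total running time given per-frame delays (100ms default, per the saving options):
function totalDelay(frames) { return frames.reduce((sum, f) => sum + (f.delay ?? 100), 0); }

async function buildAnim(paths, outPath) {
  const img = await WebP.Image.getEmptyImage();
  img.convertToAnim();
  for (const p of paths) {
    img.frames.push(await WebP.Image.generateFrame({ path: p, delay: 100 }));
  }
  await img.save(outPath);
}

if (WebP) { buildAnim(['a.webp', 'b.webp'], 'out.webp').catch(console.error); }
```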
### Options for saving
#### These options affect both static images and animations
* `exif`/`iccp`/`xmp`<br />
Save or override EXIF/ICCP/XMP chunks.<br />
Pass `true` to save the existing ones, or pass a Buffer to replace them.<br />
Note that there is no verification whatsoever that the data passed is valid.
#### The options below are only used when saving an animation:
* `width`/`height`: Width/height of the image.<br />
Range 0 - 16777216.<br />
The product of width*height must NOT exceed (2 ** 32) - 1.<br />
Passing 0 to either flags it for being set automatically.
* `bgColor`: The background color of the animation.<br />
Format is [ r, g, b, a ].<br />
Defaults to [ 255, 255, 255, 255 ].
* `loops`: Number of times the animation loops.<br />
Range is 0 - 65535, with 0 being an infinite loop.<br />
Default is 0.
* `x`/`y`/`delay`/`blend`/`dispose`: Changes the defaults used when a frame omits these settings (see below).
* * `x`/`y` defaults to 0.
* * `delay` defaults to 100.
* * `blend` defaults to `true`.
* * `dispose` defaults to `false`.
* `frames`: An array of objects defining each frame of the animation, with the following properties:
* * `x`/`y`: x, y offset to place the frame within the animation.<br />
Range 0 - 16777215.<br />
Default is 0,0 (defined above).
* * `delay`: Length of this frame in milliseconds.<br />
Range 0 - 16777215.<br />
Default is 100 (defined above).<br />
According to the documentation, delays <= 10ms are WebP implementation defined, and many tools/browsers/etc assign their own minimum-allowed delay.
* * `blend`: Boolean flag for whether or not to use alpha blending when drawing the frame.<br />
Default is `true` (defined above).
* * `dispose`: Boolean flag to control the frame disposal method.<br />
`true` causes the background color to be drawn under the frame.<br />
`false` draws the new frame directly.<br />
Default is `false` (defined above).
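Put together, a set of animation save options might look like the following (every value is illustrative; the canvas-size check mirrors the limits stated above):

```javascript
// Sketch of animation save options; all values below are illustrative.
const animOpts = {
  width: 0, height: 0,            // 0 = determine automatically
  bgColor: [255, 255, 255, 255],  // [ r, g, b, a ]
  loops: 0,                       // 0 = infinite loop
  delay: 100,                     // default for frames that omit it
};

// width/height each range 0 - 16777216, and width*height must not exceed 2**32 - 1:
function canvasOk(w, h) {
  return w <= 16777216 && h <= 16777216 && w * h <= 2 ** 32 - 1;
}

// Usage (hedged sketch): await img.save('out.webp', animOpts);
```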
### Information about the internal library
[get/set]ImageData and [get/set]FrameData are powered by Google's official libwebp library obtained from the [GitHub mirror](https://github.com/webmproject/libwebp).<br />
Commit 8151f38 was the latest at the time of compilation.<br />
This library was compiled with Emscripten with the command `emcc -O3 -s WASM=1 -s MODULARIZE -s EXPORTED_RUNTIME_METHODS='[cwrap]' -s ALLOW_MEMORY_GROWTH=1 -s EXPORT_NAME=LibWebP -DHAVE_CONFIG_H -I libwebp binding.cpp libwebp/src/{dec,dsp,demux,enc,mux,utils}/*.c libwebp/sharpyuv/*.c --bind -o libwebp.js`.<br />
binding.cpp is a shim I wrote to bridge the needed parts together and can be found in the libwebp/ directory.
libwebp.mjs, found in the root, is the Javascript interface to it.
At present, the only options for encoding are setting the lossless preset, quality, method, and exact flag.<br />
If access to other options is desired (see upstream libwebp/src/webp/encode.h, struct WebPConfig for settings), leave a feature request and I'll add it.<br />
The upstream command line tool `cwebp` can be used to play with the features and see what you find useful.
### Layout for internal Image data
```javascript
{
path, // The path loaded.
loaded, // Boolean flag for if this object has an image loaded.
data: { // The loaded data.
type, // The type of image from the constants table.
vp8: { // The lossy format image. Only if .type is TYPE_LOSSY or TYPE_EXTENDED.
raw, // The raw, compressed image data from the VP8 chunk.
width, height // The width/height, extracted from the VP8 image data.
},
vp8l: { // The lossless format image. Only if .type is TYPE_LOSSLESS or TYPE_EXTENDED.
raw, // The raw, compressed image data from the VP8L chunk.
alpha, // A flag for if this image has alpha data, extracted from the VP8L image data.
width, height // The width/height, extracted from the VP8L image data.
},
extended: { // Only if .type is TYPE_EXTENDED.
raw, // The raw data for the VP8X chunk.
hasICCP, // Flag for if there's an ICC profile chunk defined.
hasAlpha, // Flag for if any image/frame defined has alpha data.
hasEXIF, // Flag for if there's an EXIF chunk defined.
hasXMP, // Flag for if there's an XMP chunk defined.
hasAnim, // Flag for if this image has an animation defined.
width, height // Width/height of the image.
},
anim: {
raw, // A Buffer containing the raw data for the ANIM chunk. Mainly for internal use.
bgColor, // The background color in [ r, g, b, a ] format.
loops, // The loop count.
frames: [ // Array of frames
{ // The frame object definition
raw, // The raw data for the ANMF chunk. Mainly for internal use.
type, // The type of image this frame is, from the constants table.
x, y, // The frame's x, y position.
width, height, // The frame's width and height.
delay, // The duration of the frame.
blend, dispose, // The frame's blend/dispose flags.
// Additionally, one or more of the following.
vp8, // The raw, compressed WebP data for a lossy image. If present, there will be no `vp8l`.
vp8l, // The raw, compressed WebP data for a lossless image. If present, there will be no `vp8` or `alph`.
alph // The raw, compressed WebP data for an alpha map. Might be present if the image is lossy.
},
...
]
},
alph: {
raw // The raw alpha map chunk. Only likely to be here if .vp8 is also defined and .type is TYPE_EXTENDED.
},
iccp: {
raw // The raw ICCP chunk, if defined.
},
exif: {
raw // The raw EXIF chunk, if defined.
},
xmp: {
raw // The raw XMP chunk, if defined.
}
}
}
```
### Breaking changes from 1.x
Image.muxAnim and .muxAnim were merged into Image.save and .save respectively.
* Replace `Image.muxAnim({ path, frames, width, height, bgColor, loops, delay, x, y, blend, dispose, exif, iccp, xmp })`
* With `Image.save(path, { frames, width, height, bgColor, loops, delay, x, y, blend, dispose, exif, iccp, xmp })`
<br /><br />
* Replace `.muxAnim({ path, width, height, bgColor, loops, delay, x, y, blend, dispose, exif, iccp, xmp })`
* With `.save(path, { width, height, bgColor, loops, delay, x, y, blend, dispose, exif, iccp, xmp })`
`.anim.backgroundColor` renamed to `.anim.bgColor` for brevity and consistency.<br />
`.anim.loopCount` renamed to `.anim.loops` for consistency.<br />
`.anim.frameCount` and `.frameCount` were removed. Use `.anim.frames.length` and `.frames.length` respectively instead.<br />
`.demuxAnim()` was renamed to `.demux()`
## Breaking changes from 2.0.0 to 2.0.1
Image.generateFrame()'s `duration` input renamed to `delay`<br />
## Breaking changes from 2.x to 3.0.0
File and buffer codepaths have been merged.
* Replace `.loadBuffer(buffer)`
* With `.load(buffer)`
* Replace `Image.loadBuffer(buffer)`
* With `Image.load(buffer)`
<br /><br />
* Replace `.saveBuffer(settings)`
* With `.save(null, settings)`
* Replace `Image.saveBuffer(settings)`
* With `Image.save(null, settings)`
* Note that it's specifically `null` here. This is because the default behavior of .save() is still saving to the path it was loaded from.
<br /><br />
* Replace `.demuxToBuffers({ setting, setting, ... })`
* With `.demux({ buffers: true, setting, setting, ... })`
* Replace `.demux(path, settings)`
* With `.demux({ path, setting, setting, ... })`
<br /><br />
* Replace `.replaceFrameBuffer(frame, buffer)`
* With `.replaceFrame(frame, buffer)`

node_modules/node-webpmux/bin/webpmux generated vendored Normal file

@@ -0,0 +1,315 @@
#!/usr/bin/env node
const fs = require('fs');
const WebP = require('../webp.js');
const intTest = /^[0-9]+$/;
function parseDuration(d) {
let a = d.split(',');
if (a.length == 1) { return { dur: a[0], start: 0, end: 0 }; }
if (a.length == 2) { return { dur: a[0], start: a[1], end: a[1] }; }
if (a.length == 3) { return { dur: a[0], start: a[1], end: a[2] }; }
throw new Error('Failed to parse duration');
}
function parseFrame(f) {
let out = {}, a = f.split('+');
if (a.length < 2) { throw new Error('Failed to parse frame setting shorthand'); }
out.duration = a[1];
out.x = a[2];
out.y = a[3];
if (a[4] == 1) { out.dispose = true; }
else if (a[4] == 0) { out.dispose = false; }
else if (a[4] !== undefined) {
let x = a[4].split('-');
if (x[0] == 1) { out.dispose = true; }
else if (x[0] == 0) { out.dispose = false; }
if (x[1] == 'b') { out.blend = false; }
}
if (a[5] == 'b') { out.blend = true; }
return out;
}
function parseCmdLine(args) {
let state = {}, tester = /^-/;
let test = (_i) => {
let i = _i+1;
if (i >= args.length) { return false; }
else if (tester.test(args[i])) { return false; }
return true;
};
for (let i = 0, l = args.length; i < l; i++) {
switch (args[i]) {
case '-get':
if (!test(i)) { throw new Error('GET_OPTS missing argument'); }
state.get = { what: args[++i] };
switch (state.get.what) {
case 'icc': case 'iccp': state.get.what = 'iccp'; break;
case 'exif': case 'xmp': break;
case 'frame':
if (!test(i)) { throw new Error('GET_OPTS frame missing argument'); }
state.get.frame = args[++i];
break;
default: throw new Error(`Unknown GET_OPTS ${state.get.what}`);
}
break;
case '-set':
if (!test(i)) { throw new Error('SET_OPTS missing argument'); }
state.set = { what: args[++i] };
switch (state.set.what) {
case 'loop': if (!test(i)) { throw new Error('SET_OPTS loop missing argument'); } state.set.loop = args[++i]; break;
case 'iccp': case 'icc': if (!test(i)) { throw new Error(`SET_OPTS ${state.set.what} missing argument`); } state.set.what = 'iccp'; state.set.iccp = args[++i]; break;
case 'exif': if (!test(i)) { throw new Error('SET_OPTS exif missing argument'); } state.set.exif = args[++i]; break;
case 'xmp': if (!test(i)) { throw new Error('SET_OPTS xmp missing argument'); } state.set.xmp = args[++i]; break;
default: throw new Error(`Unknown SET_OPTS ${state.set.what}`);
}
break;
case '-strip': if (!test(i)) { throw new Error('STRIP_OPTS missing argument'); } state.strip = args[++i]; break;
case '-duration': if (!test(i)) { throw new Error('DUR_OPTS missing argument'); } if (!state.duration) { state.duration = []; } state.duration.push(parseDuration(args[++i])); break;
case '-frame':
{
let f = {};
if (!test(i)) { throw new Error('FRAME_OPTS missing argument'); }
if (!state.frames) { state.frames = []; }
f.path = args[++i];
if (!/\.webp$/i.test(f.path)) { throw new Error('First argument to -frame must be a webp image'); }
if (!test(i)) { throw new Error('Missing arguments in -frame'); }
if (args[i+1][0] == '+') { f.bin = parseFrame(args[++i]); }
else {
let ni;
for (let x = i+1, xl = l; x < xl; x++) {
switch (args[x]) {
case 'duration': if (!test(x)) { throw new Error('FRAME_OPTS duration missing argument'); } f.duration = args[++x]; break;
case 'x': if (!test(x)) { throw new Error('FRAME_OPTS x missing argument'); } f.x = args[++x]; break;
case 'y': if (!test(x)) { throw new Error('FRAME_OPTS y missing argument'); } f.y = args[++x]; break;
case 'dispose': if (!test(x)) { throw new Error('FRAME_OPTS dispose missing argument'); } f.dispose = args[++x]; break;
case 'blend': if (!test(x)) { throw new Error('FRAME_OPTS blend missing argument'); } f.blend = args[++x]; break;
default: ni = x-1; xl = x; break;
}
}
i = ni;
}
state.frames.push(f);
}
break;
case '-info': state.info = true; break;
case '-h': case '-help': state.help = true; break;
case '-version': state.version = true; break;
case '-o': if (!test(i)) { throw new Error('OUT missing argument'); } state.out = args[++i]; break;
case '-loop': if (!test(i)) { throw new Error('COUNT missing argument'); } state.loop = args[++i]; break;
case '-bg': if (!test(i)) { throw new Error('COLOR missing argument'); } state.bg = args[++i].split(','); break;
default: if (!state.in) { state.in = args[i]; } else { throw new Error(`Unknown flag ${args[i]}`); }
}
}
return state;
}
function printHelp() {
console.log(`Usage: webpmux -get GET_OPTS IN -o OUT
webpmux -set SET_OPTS IN -o OUT
webpmux -strip STRIP_OPTS IN -o OUT
webpmux -duration DUR_OPTS [-duration ...] IN -o OUT
webpmux -frame FRAME_OPTS [-frame ...] [-loop COUNT] [-bg COLOR] -o OUT
webpmux -info IN
webpmux [-h|-help]
webpmux -version
GET_OPTS:
Extract the relevant data:
iccp get ICC profile
icc get ICC profile (backwards support)
exif get EXIF metadata
xmp get XMP metadata
frame n get nth frame (first frame is frame 1)
SET_OPTS:
Set color profile/metadata:
loop COUNT set the loop count
iccp file.iccp set the ICC profile
icc file.icc set the ICC profile (backwards support)
exif file.exif set the EXIF metadata
xmp file.xmp set the XMP metadata
where: 'file.icc'/'file.iccp' contains the ICC profile to be set.
'file.exif' contains the EXIF metadata to be set.
'file.xmp' contains the XMP metadata to be set.
DUR_OPTS:
Set duration of selected frames
duration set duration for each frame
duration,frame set duration of a particular frame
duration,start,end set duration of frames in the
interval [start, end]
where: 'duration' is the duration in milliseconds.
'start' is the start frame index.
'end' is the inclusive end frame index.
The special 'end' value '0' means: last frame.
STRIP_OPTS:
Strip color profile/metadata:
iccp strip ICC profile
icc strip ICC profile (for backwards support)
exif strip EXIF metadata
xmp strip XMP metadata
FRAME_OPTS:
Create an animation frame:
frame.webp the animation frame
WEBPMUX_FRAMES legacy frame settings
OR
frame.webp the animation frame
duration N the pause duration before next frame
x X the x offset for this frame
y Y the y offset for this frame
dispose on/off dispose method for this frame (on: background, off: none)
blend on/off blending method for this frame
COUNT:
Number of times to repeat the animation.
Valid range is 0 to 65535 [Default: 0 (infinite)]
COLOR:
Background color of the animation canvas.
R,G,B,A ('normal' mode)
A,R,G,B ('legacy' mode)
where: 'A', 'R', 'G', and 'B' are integers in the range 0 to 255 specifying
the Alpha, Red, Green, and Blue component values respectively
[Default: 255, 255, 255, 255]
WEBPMUX_FRAMES (for drop-in support for the upstream webpmux binary, puts it into 'legacy' mode):
+d[+x+y[+m[+b]]]
where: 'd' is the pause duration before next frame
'x', 'y' specify the image offset for this frame
'm' is the dispose method for this frame (0 or 1)
'b' is the blending method for this frame (+b or -b)
IN & OUT are in WebP format.
Note: The nature of EXIF, XMP, and ICC data is not checked and is assumed to be valid.`);
}
function printInfo(img) {
let f = [];
let pad = (s, n) => { let o = `${s}`; while (o.length < n) { o = ` ${o}`; } return o; };
let fra = (fr) => { return fr.vp8 ? fr.vp8.alpha : fr.vp8l ? fr.vp8l.alpha : false; };
let bgcol = (c) => { return `0x${c.map((n) => n.toString(16).padStart(2, '0')).join('')}`.toUpperCase(); }
console.log(`Canvas size: ${img.width} x ${img.height}`);
if (img.hasAnim) { f.push('animation'); }
if (img.hasAlpha) { f.push(!img.hasAnim ? 'transparency' : 'alpha'); }
if (f.length == 0) { console.log('No features present.'); }
else { console.log(`Features present: ${f.join(' ')}`); }
if (img.hasAnim) {
console.log(`Background color : ${bgcol(img.anim.bgColor)} Loop Count : ${img.anim.loops}`);
console.log(`Number of frames: ${img.frames.length}`);
console.log('No.: width height alpha x_offset y_offset duration dispose blend image_size compression');
for (let i = 0, fr = img.frames, l = fr.length; i < l; i++) {
let out = '';
out += `${pad(i+1, 3)}: ${pad(fr[i].width, 5)} ${pad(fr[i].height, 5)} ${fra(fr[i]) ? 'yes' : ' no'} `;
out += `${pad(fr[i].x, 8)} ${pad(fr[i].y, 8)} ${pad(fr[i].delay, 8)} ${pad(fr[i].dispose ? 'background' : 'none', 10)} `;
out += `${pad(fr[i].blend ? 'yes' : 'no', 5)} ${pad(fr[i].alph ? fr[i].raw.length+14 : fr[i].raw.length-4, 10)} `;
out += `${pad(fr[i].vp8 ? 'lossy' : 'lossless', 11)}`;
console.log(out);
}
} else {
let size = (fs.statSync(img.path)).size;
if (img.hasAlpha) { console.log(`Size of the image (with alpha): ${size}`); }
else { console.log(`Size of the image: ${size}`); }
}
}
async function main() {
let state = parseCmdLine(process.argv.slice(2)), img = new WebP.Image(), d;
if (state.help) { printHelp(); }
else if (state.version) { console.log(`node-webpmux ${JSON.parse(fs.readFileSync(`${__dirname}/../package.json`)).version}`); }
else if (state.get) {
if (!state.in) { console.log('Missing input file'); return; }
if (!state.out) { console.log('Missing output file'); return; }
try { await img.load(state.in); }
catch (e) { console.log(`Error opening ${state.in}`); return; }
switch (state.get.what) {
case 'iccp': d = img.iccp; break;
case 'exif': d = img.exif; break;
case 'xmp': d = img.xmp; break;
case 'frame': d = (await img.demux({ buffers: true, frame: state.get.frame-1 }))[0]; break;
}
fs.writeFileSync(state.out, d);
}
else if (state.set) {
if (!state.in) { console.log('Missing input file'); return; }
if (!state.out) { console.log('Missing output file'); return; }
try { await img.load(state.in); }
catch (e) { console.log(`Error opening ${state.in}`); return; }
switch (state.set.what) {
case 'loop':
if (!img.hasAnim) { console.log("Image isn't an animation; cannot set loop count"); return; }
if (!intTest.test(state.set.loop)) { console.log('Loop count must be a number 0 <= n <= 65535'); return; }
if ((state.set.loop < 0) || (state.set.loop >= 65536)) { console.log('Loop count must be a number 0 <= n <= 65535'); return; }
img.anim.loops = state.set.loop;
try { await img.save(state.out); }
catch (e) { console.log(e); }
break;
case 'iccp':
case 'exif':
case 'xmp':
try { d = fs.readFileSync(state.set[state.set.what]); }
catch (e) { console.log(`Could not open/read ${state.set[state.set.what]}`); return; }
img[state.set.what] = d;
try { await img.save(state.out); }
catch (e) { console.log(e); }
break;
}
}
else if (state.strip) {
if (!state.in) { console.log('Missing input file'); return; }
if (!state.out) { console.log('Missing output file'); return; }
try { await img.load(state.in); }
catch (e) { console.log(`Error opening ${state.in}`); return; }
img[state.strip] = undefined;
try { await img.save(state.out); }
catch (e) { console.log(e); }
}
else if (state.duration) {
if (!state.in) { console.log('Missing input file'); return; }
if (!state.out) { console.log('Missing output file'); return; }
try { await img.load(state.in); }
catch (e) { console.log(`Error opening ${state.in}`); return; }
if (!img.hasAnim) { console.log("Image isn't an animation; cannot set frame durations"); return; }
for (let i = 0, durs = state.duration, l = durs.length; i < l; i++) {
let dur = durs[i];
if (!intTest.test(dur.dur)) { console.log('Duration must be a number'); return; }
if (!intTest.test(dur.start)) { console.log('Start frame must be a number'); return; }
if (!intTest.test(dur.end)) { console.log('End frame must be a number'); return; }
if (dur.end == 0) { dur.end = img.frames.length-1; }
if (dur.end >= img.frames.length) { console.log('Warning: End frame beyond frame count; clipping'); dur.end = img.frames.length-1; }
if (dur.start >= img.frames.length) { console.log('Warning: Start frame beyond frame count; clipping'); dur.start = img.frames.length-1; }
for (let x = +dur.start, xl = +dur.end; x <= xl; x++) { img.frames[x].delay = +dur.dur; }
}
try { await img.save(state.out); }
catch (e) { console.log(e); }
}
else if (state.frames) {
let bin = false;
if (!state.out) { console.log('Missing output file'); return; }
img = await WebP.Image.getEmptyImage();
img.convertToAnim();
for (let i = 0, f = state.frames, l = f.length; i < l; i++) {
if (f[i].bin) { bin = true; d = f[i].bin; }
else { d = f[i]; }
d = await WebP.Image.generateFrame({
path: f[i].path,
x: d.x,
y: d.y,
delay: d.duration,
dispose: d.dispose,
blend: d.blend
});
img.frames.push(d);
}
if (state.loop !== undefined) { img.anim.loops = state.loop; }
if (state.bg !== undefined) {
if (bin) { img.anim.bgColor = [ state.bg[3], state.bg[0], state.bg[1], state.bg[2] ]; }
else { img.anim.bgColor = [ state.bg[0], state.bg[1], state.bg[2], state.bg[3] ]; }
}
try { await img.save(state.out); }
catch (e) { console.log(e); }
}
else if (state.info) {
if (!state.in) { console.log('Missing input file'); return; }
try { await img.load(state.in); }
catch (e) { console.log(`Error opening ${state.in}`); return; }
printInfo(img);
} else { printHelp(); }
}
main().then(()=>{});

135
node_modules/node-webpmux/examples.js generated vendored Normal file

@@ -0,0 +1,135 @@
/*
This file contains examples for how to do some common/basic things.
It will *not* execute. This is on purpose.
Most lesser-used features, such as frame offsets, animation background color, loop count, etc., aren't described here.
You can find the full descriptions of function arguments in the README.
*/
process.exit(); // To make certain it cannot be executed
const WebP = require('node-webpmux');
// Creating an empty (1x1, black) image
img = await WebP.Image.getEmptyImage();
// Loading from disk
await img.load('image.webp');
// Or a Buffer
await img.loadBuffer(buffer);
// Save to a new image on disk
await img.save('path/to/wherever.webp');
// Or a Buffer
await img.saveBuffer();
// Or overwrite the original on disk
await img.save();
// Get a Buffer of size img.width * img.height * 4 containing the image's pixel data in RGBA order
pixels = await img.getImageData();
// Set the image's pixel data, lossless preset 9, while perfectly preserving alpha pixels
await img.setImageData(pixels, { lossless: 9, exact: true });
// These two are useful for modifying images, or converting to/from other formats
// An example of this, using PNGjs's sync API for brevity
png = PNG.sync.read(fs.readFileSync('example.png'));
await img.setImageData(png.data, { width: png.width, height: png.height });
// ^ from PNG, or to PNG v
pixels = await img.getImageData();
fs.writeFileSync('example.png', PNG.sync.write({ data: pixels, width: img.width, height: img.height }, { deflateLevel: 9 }));
// For animations..
pixels = await img.getFrameData(5);
frame = img.frames[5]; // in case you need frame.width and frame.height, as you would for converting to/from other formats
await img.setFrameData(5, pixels, { lossless: 9, exact: true });
// Replacing a frame from disk
await img.replaceFrame(4, 'different frame.webp');
// Or from a Buffer
await img.replaceFrameBuffer(4, buffer);
// Or, you can generate a new frame completely from scratch
width = 20; height = 50;
pixels = Buffer.alloc(width * height * 4);
/* ... populate `pixels` ... omitting it here ... */
img = await WebP.Image.getEmptyImage();
await img.setImageData(pixels, { width, height });
// To add the new frame
frame = await WebP.Image.generateFrame({ img });
anim.frames.push(frame);
// You can also pass `path` or `buffer` instead of `img` to generate a frame using one of those sources
// Or to use it to replace an existing one while preserving the original frame's settings
await anim.replaceFrameBuffer(4, await img.saveBuffer());
// Or if you want to replace the whole frame, settings and all
anim.frames.splice(4, 1, frame);
// To create an entire animation from scratch and save it to disk in one go
frames = [];
/* ... populate `frames` using generateFrame ... omitting it here ... */
await WebP.Image.save('anim.webp', { frames });
// Or to a Buffer
await WebP.Image.saveBuffer({ frames });
// If you instead want to create an animation to do more things to
anim = await WebP.Image.getEmptyImage();
anim.convertToAnim();
anim.frames = frames;
// To export a frame to disk
await anim.demux('directory/to/place/it', { frame: 4 });
// For a range of frames to disk
await anim.demux('directory/to/place/them', { start: 2, end: 5 });
// Or for all the frames to disk
await anim.demux('directory/to/place/them');
// To export to a Buffer instead. Supports the three variants described for .demux() above
await anim.demuxToBuffers({ start: 1, end: 3 });
// To add metadata (here EXIF is shown, but XMP and ICCP are also supported)
// Note that *no* validation is done on metadata. Make sure your source data is valid before adding it.
img.exif = fs.readFileSync('metadata.exif');
// For a quick-and-dirty way to set frame data via a Worker for a moderate speed-up from threaded saving. This example is using NodeJS Workers, but should also work via Web Workers in a browser.
// saver.js
const { Worker } = require('worker_threads');
const WebP = require('node-webpmux');
function spawnWorker(data) {
return new Promise((res, rej) => {
let worker = new Worker('saver.worker.js', { workerData: data });
worker.on('message', res);
worker.on('error', rej);
worker.on('exit', (code) => { if (code != 0) { rej(new Error(`Worker stopped with exit code ${code}`)); } });
});
}
async function go() {
let img = await WebP.Image.load('anim.webp'), newFrames = [], promises = [];
/* populate newFrames via getFrameData and make changes as desired */
for (let i = 0, { frames } = img.data, l = frames.length; i < l; i++) {
promises.push(spawnWorker({ webp: img, frame: i, data: newFrames[i] }).then((newdata) => { img.data.frames[i] = newdata; }));
}
await Promise.all(promises);
await img.save('newanim.webp');
}
go().then(/* ... */);
// saver.worker.js
const { parentPort, workerData } = require('worker_threads');
const WebP = require('node-webpmux');
async function saveFrame(d) {
let { data, frame, webp } = d;
let img = WebP.Image.from(webp);
await WebP.Image.initLib();
await img.setFrameData(frame, data, { lossless: 9 });
return img.data.frames[frame];
}
saveFrame(workerData).then((result) => parentPort.postMessage(result));

26
node_modules/node-webpmux/io.js generated vendored Normal file

@@ -0,0 +1,26 @@
let fs = {};
if (typeof window === 'undefined') {
const _fs = require('fs');
const { promisify } = require('util');
const { basename } = require('path');
fs = {
read: promisify(_fs.read),
write: promisify(_fs.write),
open: promisify(_fs.open),
close: promisify(_fs.close),
basename,
avail: true
};
} else {
let f = async () => { throw new Error('Running inside a browser; filesystem support is not available'); };
fs = {
read: f,
write: f,
open: f,
close: f,
basename: f,
err: f,
avail: false
};
}
module.exports = fs;
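The io.js module above gates its exports on the runtime environment: real promisified fs calls under Node, always-rejecting stubs in a browser. A minimal standalone sketch of the same pattern, parameterized so both branches can be exercised in one process (`makeIO` is an illustrative name, not part of io.js):

```javascript
// Illustrative sketch of io.js's environment gating: one factory, two shapes.
function makeIO(isBrowser) {
  if (!isBrowser) {
    // Node branch: real implementations (e.g. promisified fs calls) would go here.
    return { avail: true };
  }
  // Browser branch: every operation rejects with a clear error.
  const err = async () => { throw new Error('filesystem support is not available'); };
  return { read: err, write: err, open: err, close: err, avail: false };
}
```

Callers can then check `avail` before attempting file I/O instead of catching rejections.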

120
node_modules/node-webpmux/libwebp.js generated vendored Normal file

@@ -0,0 +1,120 @@
const libwebpF = require('./libwebp/libwebp.js');
const ranges = {
preset: { n: 0, m: 5 },
lossless: { n: 0, m: 9 },
quality: { n: 0, m: 100 },
method: { n: 0, m: 6 },
exact: { n: 0, m: 1 }
}, advranges = {
imageHint: { n: 0, m: 3 },
targetSize: undefined,
targetPSNR: undefined,
segments: { n: 1, m: 4 },
snsStrength: { n: 0, m: 100 },
filterStrength: { n: 0, m: 100 },
filterSharpness: { n: 0, m: 7 },
filterType: { n: 0, m: 1 },
autoFilter: { n: 0, m: 1 },
alphaCompression: { n: 0, m: 1 },
alphaFiltering: { n: 0, m: 2 },
alphaQuality: { n: 0, m: 100 },
pass: { n: 1, m: 10 },
showCompressed: { n: 0, m: 1 },
preprocessing: { n: 0, m: 2 },
partitions: { n: 0, m: 3 },
partitionLimit: { n: 0, m: 100 },
emulateJpegSize: { n: 0, m: 1 },
threadLevel: { n: 0, m: 5 },
lowMemory: { n: 0, m: 1 },
nearLossless: { n: 0, m: 100 },
useDeltaPalette: { n: 0, m: 1 },
useSharpYUV: { n: 0, m: 1 },
qMin: { n: 0, m: 100 },
qMax: { n: 0, m: 100 }
};
function checkOpts(o) {
for (let i = 0, keys = Object.keys(o), l = keys.length; i < l; i++) {
let key = keys[i], r = ranges[key];
if (!r) { continue; }
if ((o[key] < r.n) || (o[key] > r.m)) { throw new Error(`${key} out of range ${r.n}..${r.m}`); }
}
}
function checkAdv(adv) {
for (let i = 0, keys = Object.keys(adv), l = keys.length; i < l; i++) {
let key = keys[i], r = advranges[key];
if (!r) { continue; }
if ((adv[key] < r.n) || (adv[key] > r.m)) { throw new Error(`advanced.${key} out of range ${r.n}..${r.m}`); }
}
}
module.exports = class libWebP {
enc = 0;
async init() {
let Module = this.Module = await libwebpF();
this.api = Module.WebPEnc;
this.api.getResult = (e) => { return new Uint8Array(new Uint8Array(Module.HEAP8.buffer, e.getResult(), e.getResultSize())); };
this.api.decodeRGBA = Module.cwrap('decodeRGBA', 'number', [ 'number', 'number' ]);
this.api.decodeFree = Module.cwrap('decodeFree', '', [ 'number' ]);
this.api.allocBuffer = Module.cwrap('allocBuffer', 'number', [ 'number' ]);
this.api.destroyBuffer = Module.cwrap('destroyBuffer', '', [ 'number' ]);
}
initEnc() { if (!this.enc) { this.enc = new this.Module.WebPEnc(); } }
destroyEnc() { if (this.enc) { this.enc.delete(); delete this.enc; } }
encodeImage(data, width, height, { preset, lossless, quality, method, exact, advanced } = {}) {
let { api, Module } = this, p, ret = {}, enc;
this.initEnc();
enc = this.enc;
enc.init();
checkOpts({ preset, lossless, quality, method, exact });
if (preset != undefined) { enc.setPreset(preset); }
if (lossless != undefined) { enc.setLosslessPreset(lossless); }
if (quality != undefined) { enc.setQuality(quality); }
if (method != undefined) { enc.setMethod(method); }
if (exact != undefined) { enc.setExact(!!exact); }
if (advanced != undefined) {
checkAdv(advanced);
if (advanced.imageHint != undefined) { enc.advImageHint(advanced.imageHint); }
if (advanced.targetSize != undefined) { enc.advTargetSize(advanced.targetSize); }
if (advanced.targetPSNR != undefined) { enc.advTargetPSNR(advanced.targetPSNR); }
if (advanced.segments != undefined) { enc.advSegments(advanced.segments); }
if (advanced.snsStrength != undefined) { enc.advSnsStrength(advanced.snsStrength); }
if (advanced.filterStrength != undefined) { enc.advFilterStrength(advanced.filterStrength); }
if (advanced.filterSharpness != undefined) { enc.advFilterSharpness(advanced.filterSharpness); }
if (advanced.filterType != undefined) { enc.advFilterType(advanced.filterType); }
if (advanced.autoFilter != undefined) { enc.advAutoFilter(advanced.autoFilter); }
if (advanced.alphaCompression != undefined) { enc.advAlphaCompression(advanced.alphaCompression); }
if (advanced.alphaFiltering != undefined) { enc.advAlphaFiltering(advanced.alphaFiltering); }
if (advanced.alphaQuality != undefined) { enc.advAlphaQuality(advanced.alphaQuality); }
if (advanced.pass != undefined) { enc.advPass(advanced.pass); }
if (advanced.showCompressed != undefined) { enc.advShowCompressed(advanced.showCompressed); }
if (advanced.preprocessing != undefined) { enc.advPreprocessing(advanced.preprocessing); }
if (advanced.partitions != undefined) { enc.advPartitions(advanced.partitions); }
if (advanced.partitionLimit != undefined) { enc.advPartitionLimit(advanced.partitionLimit); }
if (advanced.emulateJpegSize != undefined) { enc.advEmulateJpegSize(advanced.emulateJpegSize); }
if (advanced.threadLevel != undefined) { enc.advThreadLevel(advanced.threadLevel); }
if (advanced.lowMemory != undefined) { enc.advLowMemory(advanced.lowMemory); }
if (advanced.nearLossless != undefined) { enc.advNearLossless(advanced.nearLossless); }
if (advanced.useDeltaPalette != undefined) { enc.advUseDeltaPalette(advanced.useDeltaPalette); }
if (advanced.useSharpYUV != undefined) { enc.advUseSharpYUV(advanced.useSharpYUV); }
if (advanced.qMin != undefined) { enc.advQMin(advanced.qMin); }
if (advanced.qMax != undefined) { enc.advQMax(advanced.qMax); }
}
p = api.allocBuffer(data.length);
Module.HEAP8.set(data, p);
enc.loadRGBA(p, width, height);
api.destroyBuffer(p);
ret.res = enc.encode();
if (ret.res == 0) { ret.buf = api.getResult(enc); }
this.destroyEnc();
return ret;
}
decodeImage(data, width, height) {
let { api, Module } = this, p, ret;
let np = api.allocBuffer(data.length);
Module.HEAP8.set(data, np);
let bp = api.decodeRGBA(np, data.length);
ret = new Uint8Array(new Uint8Array(Module.HEAP8.buffer, bp, width * height * 4));
api.decodeFree(bp);
api.destroyBuffer(np);
return ret;
}
};
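The validation in checkOpts/checkAdv above boils down to a table-driven range check: each known option is compared against an `{ n, m }` bound, and anything not in the table passes through. A standalone sketch under those assumed semantics (`checkRange` and the sample table are illustrative, not this module's API):

```javascript
// Illustrative table-driven range check, mirroring checkOpts/checkAdv:
// keys absent from the table, or undefined values, are not validated.
function checkRange(opts, table) {
  for (const key of Object.keys(opts)) {
    const r = table[key];
    if (!r || opts[key] === undefined) { continue; }
    if (opts[key] < r.n || opts[key] > r.m) {
      throw new Error(`${key} out of range ${r.n}..${r.m}`);
    }
  }
}
```

For example, `checkRange({ quality: 101 }, { quality: { n: 0, m: 100 } })` throws, while `quality: 50` passes.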

144
node_modules/node-webpmux/libwebp/binding.cpp generated vendored Normal file

@@ -0,0 +1,144 @@
#include <stdlib.h>
#include <stdio.h>
#include <emscripten.h>
#include <emscripten/bind.h>
#include "libwebp/src/webp/encode.h"
#include "libwebp/src/webp/decode.h"
using namespace emscripten;
class WebPEnc {
public:
WebPEnc() { this->ready = false; this->picAlloc = false; }
~WebPEnc() { this->reset(); }
bool init() {
if (this->ready) { return false; }
WebPPictureInit(&(this->pic));
WebPMemoryWriterInit(&(this->writer));
this->pic.writer = WebPMemoryWrite;
this->pic.custom_ptr = &(this->writer);
WebPConfigInit(&(this->config));
this->ready = true;
return true;
}
void reset() {
if (!this->ready) { return; }
if (this->picAlloc) { WebPPictureFree(&(this->pic)); }
WebPMemoryWriterClear(&(this->writer));
this->ready = false;
this->picAlloc = false;
}
// Clunky workaround for Embind not supporting pointers to primitives (first argument should be a const uint8_t *)
bool loadRGBA(const int input, int width, int height) {
if (!this->ready) { return false; }
this->pic.width = width;
this->pic.height = height;
WebPPictureImportRGBA(&(this->pic), reinterpret_cast<const uint8_t *>(input), width * 4);
this->picAlloc = true;
return true;
}
bool setPreset(int en) {
if (!this->ready) { return false; }
if (en > 0) { WebPConfigPreset(&(this->config), (WebPPreset)en, 100.0f); }
else { WebPConfigInit(&(this->config)); }
return true;
}
bool setLosslessPreset(int en) {
if (!this->ready) { return false; }
if (en > 0) { WebPConfigLosslessPreset(&(this->config), en); this->pic.use_argb = 1; }
else { WebPConfigInit(&(this->config)); this->pic.use_argb = 0; }
return true;
}
bool setQuality(float q) { if (!this->ready) { return false; } this->config.quality = q; return true; }
bool setMethod(int m) { if (!this->ready) { return false; } this->config.method = m; return true; }
bool setExact(bool ex) { if (!this->ready) { return false; } this->config.exact = ex ? 1 : 0; return true; }
int encode() {
if (!this->ready) { return -1; }
if (!WebPValidateConfig(&(this->config))) { return -2; }
if (!WebPEncode(&(this->config), &(this->pic))) { return this->pic.error_code; }
return 0;
}
// Clunky workaround for Embind not supporting pointers to primitives (this should return uint8_t*)
int getResult() { return (int)this->writer.mem; }
size_t getResultSize() { return this->writer.size; }
bool advImageHint(int en) { if (!this->ready) { return false; } this->config.image_hint = (WebPImageHint)en; return true; }
bool advTargetSize(int s) { if (!this->ready) { return false; } this->config.target_size = s; return true; }
bool advTargetPSNR(float psnr) { if (!this->ready) { return false; } this->config.target_PSNR = psnr; return true; }
bool advSegments(int seg) { if (!this->ready) { return false; } this->config.segments = seg; return true; }
bool advSnsStrength(int str) { if (!this->ready) { return false; } this->config.sns_strength = str; return true; }
bool advFilterStrength(int str) { if (!this->ready) { return false; } this->config.filter_strength = str; return true; }
bool advFilterSharpness(int shr) { if (!this->ready) { return false; } this->config.filter_sharpness = shr ? 1 : 0; return true; }
bool advFilterType(int type) { if (!this->ready) { return false; } this->config.filter_type = type ? 1 : 0; return true; }
bool advAutoFilter(int filter) { if (!this->ready) { return false; } this->config.autofilter = filter ? 1 : 0; return true; }
bool advAlphaCompression(int comp) { if (!this->ready) { return false; } this->config.alpha_compression = comp; return true; }
bool advAlphaFiltering(int filter) { if (!this->ready) { return false; } this->config.alpha_filtering = filter; return true; }
bool advAlphaQuality(int qual) { if (!this->ready) { return false; } this->config.alpha_quality = qual; return true; }
bool advPass(int pass) { if (!this->ready) { return false; } this->config.pass = pass; return true; }
bool advShowCompressed(int comp) { if (!this->ready) { return false; } this->config.show_compressed = comp ? 1 : 0; return true; }
bool advPreprocessing(int prepro) { if (!this->ready) { return false; } this->config.preprocessing = prepro; return true; }
bool advPartitions(int parts) { if (!this->ready) { return false; } this->config.partitions = parts; return true; }
bool advPartitionLimit(int limit) { if (!this->ready) { return false; } this->config.partition_limit = limit; return true; }
bool advEmulateJpegSize(int emulate) { if (!this->ready) { return false; } this->config.emulate_jpeg_size = emulate ? 1 : 0; return true; }
bool advThreadLevel(int threads) { if (!this->ready) { return false; } this->config.thread_level = threads; return true; }
bool advLowMemory(int low) { if (!this->ready) { return false; } this->config.low_memory = low ? 1 : 0; return true; }
bool advNearLossless(int near) { if (!this->ready) { return false; } this->config.near_lossless = near; return true; }
bool advUseDeltaPalette(int delta) { if (!this->ready) { return false; } this->config.use_delta_palette = 0 /*delta*/; return true; }
bool advUseSharpYUV(int sharp) { if (!this->ready) { return false; } this->config.use_sharp_yuv = sharp ? 1 : 0; return true; }
bool advQMin(int min) { if (!this->ready) { return false; } this->config.qmin = min; return true; }
bool advQMax(int max) { if (!this->ready) { return false; } this->config.qmax = max; return true; }
private:
bool ready;
bool picAlloc;
WebPConfig config;
WebPPicture pic;
WebPMemoryWriter writer;
};
// Encoder hooks
EMSCRIPTEN_BINDINGS(WebPBinding) {
class_<WebPEnc>("WebPEnc")
.constructor<>()
.function("init", &WebPEnc::init)
.function("reset", &WebPEnc::reset)
.function("loadRGBA", &WebPEnc::loadRGBA)
.function("setPreset", &WebPEnc::setPreset)
.function("setLosslessPreset", &WebPEnc::setLosslessPreset)
.function("setQuality", &WebPEnc::setQuality)
.function("setMethod", &WebPEnc::setMethod)
.function("setExact", &WebPEnc::setExact)
.function("encode", &WebPEnc::encode)
.function("getResult", &WebPEnc::getResult)
.function("getResultSize", &WebPEnc::getResultSize)
.function("advImageHint", &WebPEnc::advImageHint)
.function("advTargetSize", &WebPEnc::advTargetSize)
.function("advTargetPSNR", &WebPEnc::advTargetPSNR)
.function("advSegments", &WebPEnc::advSegments)
.function("advSnsStrength", &WebPEnc::advSnsStrength)
.function("advFilterStrength", &WebPEnc::advFilterStrength)
.function("advFilterSharpness", &WebPEnc::advFilterSharpness)
.function("advFilterType", &WebPEnc::advFilterType)
.function("advAutoFilter", &WebPEnc::advAutoFilter)
.function("advAlphaCompression", &WebPEnc::advAlphaCompression)
.function("advAlphaFiltering", &WebPEnc::advAlphaFiltering)
.function("advAlphaQuality", &WebPEnc::advAlphaQuality)
.function("advPass", &WebPEnc::advPass)
.function("advShowCompressed", &WebPEnc::advShowCompressed)
.function("advPreprocessing", &WebPEnc::advPreprocessing)
.function("advPartitions", &WebPEnc::advPartitions)
.function("advPartitionLimit", &WebPEnc::advPartitionLimit)
.function("advEmulateJpegSize", &WebPEnc::advEmulateJpegSize)
.function("advThreadLevel", &WebPEnc::advThreadLevel)
.function("advLowMemory", &WebPEnc::advLowMemory)
.function("advNearLossless", &WebPEnc::advNearLossless)
.function("advUseDeltaPalette", &WebPEnc::advUseDeltaPalette)
.function("advUseSharpYUV", &WebPEnc::advUseSharpYUV)
.function("advQMin", &WebPEnc::advQMin)
.function("advQMax", &WebPEnc::advQMax);
}
extern "C" {
// Decoder
EMSCRIPTEN_KEEPALIVE uint8_t *decodeRGBA(const uint8_t *data, size_t dataSize) { return WebPDecodeRGBA(data, dataSize, 0, 0); }
EMSCRIPTEN_KEEPALIVE void decodeFree(uint8_t *data) { WebPFree(data); }
// Utility
EMSCRIPTEN_KEEPALIVE uint8_t *allocBuffer(size_t size) { return (uint8_t*)malloc(size * sizeof(uint8_t)); }
EMSCRIPTEN_KEEPALIVE void destroyBuffer(uint8_t *p) { free(p); }
}

21
node_modules/node-webpmux/libwebp/libwebp.js generated vendored Normal file

File diff suppressed because one or more lines are too long

BIN
node_modules/node-webpmux/libwebp/libwebp.wasm generated vendored Executable file

Binary file not shown.

19
node_modules/node-webpmux/package.json generated vendored Normal file

@@ -0,0 +1,19 @@
{
"name": "node-webpmux",
"version": "3.1.7",
"description": "A pure Javascript/WebAssembly re-implementation of webpmux",
"main": "webp.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/ApeironTsuka/node-webpmux.git"
},
"author": "ApeironTsuka",
"license": "ISC",
"bugs": {
"url": "https://github.com/ApeironTsuka/node-webpmux/issues"
},
"homepage": "https://github.com/ApeironTsuka/node-webpmux#readme"
}

361
node_modules/node-webpmux/parser.js generated vendored Normal file

@@ -0,0 +1,361 @@
const IO = require('./io.js');
const nullByte = Buffer.alloc(1);
nullByte[0] = 0;
const intfTypes = {
NONE: 0,
FILE: 1,
BUFFER: 2
};
const constants = {
TYPE_LOSSY: 0,
TYPE_LOSSLESS: 1,
TYPE_EXTENDED: 2
};
function VP8Width(data) { return ((data[7] << 8) | data[6]) & 0b0011111111111111; }
function VP8Height(data) { return ((data[9] << 8) | data[8]) & 0b0011111111111111; }
function VP8LWidth(data) { return (((data[2] << 8) | data[1]) & 0b0011111111111111) + 1; }
function VP8LHeight(data) { return ((((data[4] << 16) | (data[3] << 8) | data[2]) >> 6) & 0b0011111111111111) + 1; }
function doesVP8LHaveAlpha(data) { return !!(data[4] & 0b00010000); }
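The VP8L helpers above unpack two back-to-back 14-bit fields (width-1, then height-1) that follow the 0x2f signature byte. A self-contained round-trip sketch of that bit layout; `packVP8LDimensions` is illustrative only, the parser itself never encodes:

```javascript
// Decode: same bit math as VP8LWidth/VP8LHeight above.
function vp8lDimensions(data) {
  const width = (((data[2] << 8) | data[1]) & 0x3fff) + 1;
  const height = ((((data[4] << 16) | (data[3] << 8) | data[2]) >>> 6) & 0x3fff) + 1;
  return { width, height };
}
// Encode (illustrative): pack width-1/height-1 into the same positions.
function packVP8LDimensions(width, height) {
  const w = width - 1, h = height - 1, data = Buffer.alloc(5);
  data[0] = 0x2f;                                   // VP8L signature byte
  data[1] = w & 0xff;                               // width-1, low 8 bits
  data[2] = ((w >> 8) & 0x3f) | ((h & 0x03) << 6);  // width-1 high 6 bits + height-1 low 2 bits
  data[3] = (h >> 2) & 0xff;                        // height-1, middle 8 bits
  data[4] = (h >> 10) & 0x0f;                       // height-1, high 4 bits
  return data;
}
```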
function createBasicChunk(name, data) {
let header = Buffer.alloc(8), size = data.length;
header.write(name, 0);
header.writeUInt32LE(size, 4);
if (size&1) { return { size: size + 9, chunks: [ header, data, nullByte ] }; }
else { return { size: size + 8, chunks: [ header, data ] }; }
}
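createBasicChunk above implements the RIFF padding rule: a chunk is an 8-byte header (fourCC plus uint32le payload size), the payload, and one zero pad byte when the payload length is odd, while the size field records the unpadded length. A standalone sketch (`riffChunk` is an illustrative name) that concatenates the pieces instead of returning them:

```javascript
// Illustrative RIFF chunk builder mirroring createBasicChunk's layout.
function riffChunk(fourCC, payload) {
  const header = Buffer.alloc(8);
  header.write(fourCC, 0);                // 4-byte chunk tag
  header.writeUInt32LE(payload.length, 4); // unpadded payload size
  const parts = [header, payload];
  if (payload.length & 1) { parts.push(Buffer.alloc(1)); } // pad to even length
  return Buffer.concat(parts);
}
```

Both a 3-byte and a 4-byte payload therefore produce a 12-byte chunk, but with different recorded sizes.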
class WebPReader {
constructor() { this.type = intfTypes.NONE; }
readFile(path) { this.type = intfTypes.FILE; this.path = path; }
readBuffer(buf) { this.type = intfTypes.BUFFER; this.buf = buf; this.cursor = 0; }
async readBytes(n, mod) {
let { type } = this;
if (type == intfTypes.FILE) {
let b = Buffer.alloc(n), br;
br = (await IO.read(this.fp, b, 0, n, undefined)).bytesRead;
return mod ? b : br == n ? b : undefined;
} else if (type == intfTypes.BUFFER) { let b = this.buf.slice(this.cursor, this.cursor + n); this.cursor += n; return b; }
else { throw new Error('Reader not initialized'); }
}
async readFileHeader() {
let buf = await this.readBytes(12);
if (buf === undefined) { throw new Error('Reached end while reading header'); }
if (buf.toString('utf8', 0, 4) != 'RIFF') { throw new Error('Bad header (not RIFF)'); }
if (buf.toString('utf8', 8, 12) != 'WEBP') { throw new Error('Bad header (not WEBP)'); }
return { fileSize: buf.readUInt32LE(4) };
}
async readChunkHeader() {
let buf = await this.readBytes(8, true);
if (buf.length == 0) { return { fourCC: '\x00\x00\x00\x00', size: 0 }; }
else if (buf.length < 8) { throw new Error('Reached end while reading chunk header'); }
return { fourCC: buf.toString('utf8', 0, 4), size: buf.readUInt32LE(4) };
}
async readChunkContents(size) {
let buf = await this.readBytes(size);
if (size & 1) { await this.readBytes(1); }
return buf;
}
async readChunk_raw(n, size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error(`Reached end while reading ${n} chunk`); }
return { raw: buf };
}
async readChunk_VP8(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while reading VP8 chunk'); }
return { raw: buf, width: VP8Width(buf), height: VP8Height(buf) };
}
async readChunk_VP8L(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while reading VP8L chunk'); }
return { raw: buf, alpha: doesVP8LHaveAlpha(buf), width: VP8LWidth(buf), height: VP8LHeight(buf) };
}
async readChunk_VP8X(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while reading VP8X chunk'); }
return {
raw: buf,
hasICCP: !!(buf[0] & 0b00100000),
hasAlpha: !!(buf[0] & 0b00010000),
hasEXIF: !!(buf[0] & 0b00001000),
hasXMP: !!(buf[0] & 0b00000100),
hasAnim: !!(buf[0] & 0b00000010),
width: buf.readUIntLE(4, 3) + 1,
height: buf.readUIntLE(7, 3) + 1
};
}
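readChunk_VP8X above decodes a 10-byte payload: a flags byte, three reserved bytes, then two 24-bit little-endian fields holding canvas width-1 and height-1. A standalone sketch of the same decoding (`parseVP8X` is an illustrative name; a subset of the flags is shown):

```javascript
// Illustrative VP8X payload decoder matching the offsets used above.
function parseVP8X(buf) {
  return {
    hasICCP:  !!(buf[0] & 0b00100000),
    hasAlpha: !!(buf[0] & 0b00010000),
    hasAnim:  !!(buf[0] & 0b00000010),
    width:  buf.readUIntLE(4, 3) + 1,  // canvas width stored minus one
    height: buf.readUIntLE(7, 3) + 1   // canvas height stored minus one
  };
}
```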
async readChunk_ANIM(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while reading ANIM chunk'); }
return { raw: buf, bgColor: buf.slice(0, 4), loops: buf.readUInt16LE(4) };
}
async readChunk_ANMF(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while reading ANMF chunk'); }
let out = {
raw: buf,
x: buf.readUIntLE(0, 3),
y: buf.readUIntLE(3, 3),
width: buf.readUIntLE(6, 3) + 1,
height: buf.readUIntLE(9, 3) + 1,
delay: buf.readUIntLE(12, 3),
blend: !(buf[15] & 0b00000010),
dispose: !!(buf[15] & 0b00000001)
}, keepLooping = true, anmfReader = new WebPReader();
anmfReader.readBuffer(buf);
anmfReader.cursor = 16;
while (keepLooping) {
let header = await anmfReader.readChunkHeader();
switch (header.fourCC) {
case 'VP8 ':
if (!out.vp8) {
out.type = constants.TYPE_LOSSY;
out.vp8 = await anmfReader.readChunk_VP8(header.size);
if (out.alph) { out.vp8.alpha = true; }
}
break;
case 'VP8L':
if (!out.vp8l) {
out.type = constants.TYPE_LOSSLESS;
out.vp8l = await anmfReader.readChunk_VP8L(header.size);
}
break;
case 'ALPH':
if (!out.alph) {
out.alph = await anmfReader.readChunk_ALPH(header.size);
if (out.vp8) { out.vp8.alpha = true; }
}
break;
case '\x00\x00\x00\x00':
default:
keepLooping = false;
break;
}
if (anmfReader.cursor >= buf.length) { break; }
}
return out;
}
async readChunk_ALPH(size) { return this.readChunk_raw('ALPH', size); }
async readChunk_ICCP(size) { return this.readChunk_raw('ICCP', size); }
async readChunk_EXIF(size) { return this.readChunk_raw('EXIF', size); }
async readChunk_XMP(size) { return this.readChunk_raw('XMP ', size); }
async readChunk_skip(size) {
let buf = await this.readChunkContents(size);
if (buf === undefined) { throw new Error('Reached end while skipping chunk'); }
}
async read() {
if (this.type == intfTypes.FILE) { this.fp = await IO.open(this.path, 'r'); }
let keepLooping = true, first = true, { fileSize } = await this.readFileHeader(), out = {};
while (keepLooping) {
let { fourCC, size } = await this.readChunkHeader();
switch (fourCC) {
case 'VP8 ':
if (!out.vp8) {
out.vp8 = await this.readChunk_VP8(size);
if (out.alph) { out.vp8.alpha = true; }
if (first) { out.type = constants.TYPE_LOSSY; keepLooping = false; }
} else { await this.readChunk_skip(size); }
break;
case 'VP8L':
if (!out.vp8l) {
out.vp8l = await this.readChunk_VP8L(size);
if (first) { out.type = constants.TYPE_LOSSLESS; keepLooping = false; }
} else { await this.readChunk_skip(size); }
break;
case 'VP8X':
if (!out.extended) {
out.type = constants.TYPE_EXTENDED;
out.extended = await this.readChunk_VP8X(size);
} else { await this.readChunk_skip(size); }
break;
case 'ANIM':
if (!out.anim) {
let { raw, bgColor, loops } = await this.readChunk_ANIM(size);
out.anim = {
bgColor: [ bgColor[2], bgColor[1], bgColor[0], bgColor[3] ],
loops,
frames: [],
raw
};
} else { await this.readChunk_skip(size); }
break;
case 'ANMF': out.anim.frames.push(await this.readChunk_ANMF(size)); break;
case 'ALPH':
if (!out.alph) {
out.alph = await this.readChunk_ALPH(size);
if (out.vp8) { out.vp8.alpha = true; }
} else { await this.readChunk_skip(size); }
break;
case 'ICCP':
if (!out.iccp) { out.iccp = await this.readChunk_ICCP(size); }
else { await this.readChunk_skip(size); }
break;
case 'EXIF':
if (!out.exif) { out.exif = await this.readChunk_EXIF(size); }
else { await this.readChunk_skip(size); }
break;
case 'XMP ':
if (!out.xmp) { out.xmp = await this.readChunk_XMP(size); }
else { await this.readChunk_skip(size); }
break;
case '\x00\x00\x00\x00': keepLooping = false; break;
default: await this.readChunk_skip(size); break;
}
first = false;
}
if (this.type == intfTypes.FILE) { await IO.close(this.fp); }
return out;
}
}
class WebPWriter {
constructor() { this.type = intfTypes.NONE; this.chunks = []; this.width = this.height = 0; }
reset() { this.chunks.length = 0; this.width = 0; this.height = 0; }
writeFile(path) { this.type = intfTypes.FILE; this.path = path; }
writeBuffer() { this.type = intfTypes.BUFFER; }
async commit() {
let { chunks } = this, size = 4, fp;
if (this.type == intfTypes.NONE) { throw new Error('Writer not initialized'); }
if (chunks.length == 0) { throw new Error('Nothing to write'); }
for (let i = 1, l = chunks.length; i < l; i++) { size += chunks[i].length; }
chunks[0].writeUInt32LE(size, 4);
if (this.type == intfTypes.FILE) {
fp = await IO.open(this.path, 'w');
for (let i = 0, l = chunks.length; i < l; i++) { await IO.write(fp, chunks[i], 0, undefined, undefined); }
await IO.close(fp);
} else { return Buffer.concat(chunks); }
}
writeBytes(...chunks) {
if (this.type == intfTypes.NONE) { throw new Error('Writer not initialized'); }
this.chunks.push(...chunks);
}
writeFileHeader() {
let buf = Buffer.alloc(12);
buf.write('RIFF', 0);
buf.write('WEBP', 8);
this.writeBytes(buf);
}
writeChunk_VP8(vp8) { this.writeBytes(...((createBasicChunk('VP8 ', vp8.raw)).chunks)); }
writeChunk_VP8L(vp8l) { this.writeBytes(...((createBasicChunk('VP8L', vp8l.raw)).chunks)); }
writeChunk_VP8X(vp8x) {
let buf = Buffer.alloc(18);
buf.write('VP8X', 0);
buf.writeUInt32LE(10, 4);
buf.writeUIntLE(vp8x.width - 1, 12, 3);
buf.writeUIntLE(vp8x.height - 1, 15, 3);
if (vp8x.hasICCP) { buf[8] |= 0b00100000; }
if (vp8x.hasAlpha) { buf[8] |= 0b00010000; }
if (vp8x.hasEXIF) { buf[8] |= 0b00001000; }
if (vp8x.hasXMP) { buf[8] |= 0b00000100; }
if (vp8x.hasAnim) { buf[8] |= 0b00000010; }
this.vp8x = buf;
this.writeBytes(buf);
}
updateChunk_VP8X_size(width, height) {
this.vp8x.writeUIntLE(width - 1, 12, 3);
this.vp8x.writeUIntLE(height - 1, 15, 3);
}
writeChunk_ANIM(anim) {
let buf = Buffer.alloc(14);
buf.write('ANIM', 0);
buf.writeUInt32LE(6, 4);
buf.writeUInt8(anim.bgColor[2], 8);
buf.writeUInt8(anim.bgColor[1], 9);
buf.writeUInt8(anim.bgColor[0], 10);
buf.writeUInt8(anim.bgColor[3], 11);
buf.writeUInt16LE(anim.loops, 12);
this.writeBytes(buf);
}
writeChunk_ANMF(anmf) {
let buf = Buffer.alloc(24), { img } = anmf, size = 16, alpha = false;
buf.write('ANMF', 0);
buf.writeUIntLE(anmf.x, 8, 3);
buf.writeUIntLE(anmf.y, 11, 3);
buf.writeUIntLE(anmf.delay, 20, 3);
if (!anmf.blend) { buf[23] |= 0b00000010; }
if (anmf.dispose) { buf[23] |= 0b00000001; }
switch (img.type) {
case constants.TYPE_LOSSY:
{
let b;
this.width = Math.max(this.width, img.vp8.width);
this.height = Math.max(this.height, img.vp8.height);
buf.writeUIntLE(img.vp8.width - 1, 14, 3);
buf.writeUIntLE(img.vp8.height - 1, 17, 3);
this.writeBytes(buf);
if (img.vp8.alpha) {
b = createBasicChunk('ALPH', img.alph.raw);
this.writeBytes(...b.chunks);
size += b.size;
}
b = createBasicChunk('VP8 ', img.vp8.raw);
this.writeBytes(...b.chunks);
size += b.size;
}
break;
case constants.TYPE_LOSSLESS:
{
let b = createBasicChunk('VP8L', img.vp8l.raw);
this.width = Math.max(this.width, img.vp8l.width);
this.height = Math.max(this.height, img.vp8l.height);
buf.writeUIntLE(img.vp8l.width - 1, 14, 3);
buf.writeUIntLE(img.vp8l.height - 1, 17, 3);
if (img.vp8l.alpha) { alpha = true; }
this.writeBytes(buf, ...b.chunks);
size += b.size;
}
break;
case constants.TYPE_EXTENDED:
if (img.extended.hasAnim) {
let fr = img.anim.frames;
if (img.extended.hasAlpha) { alpha = true; }
for (let i = 0, l = fr.length; i < l; i++) {
let b = Buffer.alloc(8), c = fr[i].raw;
this.width = Math.max(this.width, fr[i].width + anmf.x);
this.height = Math.max(this.height, fr[i].height + anmf.y);
b.write('ANMF', 0);
b.writeUInt32LE(c.length, 4);
c.writeUIntLE(anmf.x, 0, 3);
c.writeUIntLE(anmf.y, 3, 3);
c.writeUIntLE(anmf.delay, 12, 3);
if (!anmf.blend) { c[15] |= 0b00000010; } else { c[15] &= 0b11111101; }
if (anmf.dispose) { c[15] |= 0b00000001; } else { c[15] &= 0b11111110; }
this.writeBytes(b, c);
if (c.length & 1) { this.writeBytes(nullByte); }
}
} else {
let b;
this.width = Math.max(this.width, img.extended.width);
this.height = Math.max(this.height, img.extended.height);
if (img.vp8) {
buf.writeUIntLE(img.vp8.width - 1, 14, 3);
buf.writeUIntLE(img.vp8.height - 1, 17, 3);
this.writeBytes(buf);
if (img.alph) {
b = createBasicChunk('ALPH', img.alph.raw);
alpha = true;
this.writeBytes(...b.chunks);
size += b.size;
}
b = createBasicChunk('VP8 ', img.vp8.raw);
this.writeBytes(...b.chunks);
size += b.size;
} else if (img.vp8l) {
buf.writeUIntLE(img.vp8l.width - 1, 14, 3);
buf.writeUIntLE(img.vp8l.height - 1, 17, 3);
if (img.vp8l.alpha) { alpha = true; }
b = createBasicChunk('VP8L', img.vp8l.raw);
this.writeBytes(buf, ...b.chunks);
size += b.size;
}
}
break;
default: throw new Error('Unknown image type');
}
buf.writeUInt32LE(size, 4);
if (alpha) { this.vp8x[8] |= 0b00010000; }
}
writeChunk_ALPH(alph) { this.writeBytes(...((createBasicChunk('ALPH', alph.raw)).chunks)); }
writeChunk_ICCP(iccp) { this.writeBytes(...((createBasicChunk('ICCP', iccp.raw)).chunks)); }
writeChunk_EXIF(exif) { this.writeBytes(...((createBasicChunk('EXIF', exif.raw)).chunks)); }
writeChunk_XMP(xmp) { this.writeBytes(...((createBasicChunk('XMP ', xmp.raw)).chunks)); }
}
module.exports = { WebPReader, WebPWriter };
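The chunk writers above lean on two container conventions worth isolating: dimensions are stored minus one in 24-bit little-endian fields, and a frame's blend/dispose behavior lives in two bits of one flag byte. A minimal self-contained sketch (the field offsets follow the 10-byte VP8X payload layout from the WebP container spec; `packCanvasSize` and `setFrameFlags` are illustrative helpers, not part of this library):

```javascript
// Dimensions are written as (value - 1) in 3-byte little-endian fields.
function packCanvasSize(width, height) {
  const payload = Buffer.alloc(10);      // VP8X payload: 4 flag bytes + two 24-bit sizes
  payload.writeUIntLE(width - 1, 4, 3);  // 24-bit field, so width can be up to 2^24
  payload.writeUIntLE(height - 1, 7, 3);
  return payload;
}
function unpackCanvasSize(payload) {
  return { width: payload.readUIntLE(4, 3) + 1, height: payload.readUIntLE(7, 3) + 1 };
}

// In the ANMF frame header, bit 1 of byte 15 means "do not blend" and
// bit 0 means "dispose to background" (mirrors writeChunk_ANMF above).
function setFrameFlags(header, { blend, dispose }) {
  if (!blend) { header[15] |= 0b00000010; } else { header[15] &= 0b11111101; }
  if (dispose) { header[15] |= 0b00000001; } else { header[15] &= 0b11111110; }
  return header;
}
```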

415
node_modules/node-webpmux/webp.js generated vendored Normal file

@@ -0,0 +1,415 @@
// For more information on the WebP format, see https://developers.google.com/speed/webp/docs/riff_container
const { WebPReader, WebPWriter } = require('./parser.js');
const IO = require('./io.js');
const emptyImageBuffer = Buffer.from([
0x52, 0x49, 0x46, 0x46, 0x24, 0x00, 0x00, 0x00, 0x57, 0x45, 0x42, 0x50, 0x56, 0x50, 0x38, 0x20,
0x18, 0x00, 0x00, 0x00, 0x30, 0x01, 0x00, 0x9d, 0x01, 0x2a, 0x01, 0x00, 0x01, 0x00, 0x02, 0x00,
0x34, 0x25, 0xa4, 0x00, 0x03, 0x70, 0x00, 0xfe, 0xfb, 0xfd, 0x50, 0x00
]);
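`emptyImageBuffer` is a complete 1x1 lossy WebP file, which makes it a handy reference for the RIFF framing the reader and writer deal in. A sketch parsing its header fields (same bytes as above; variable names are illustrative):

```javascript
// A minimal 1x1 lossy WebP: RIFF header + 'WEBP' form type + one 'VP8 ' chunk.
const riff = Buffer.from([
  0x52, 0x49, 0x46, 0x46, 0x24, 0x00, 0x00, 0x00, 0x57, 0x45, 0x42, 0x50, 0x56, 0x50, 0x38, 0x20,
  0x18, 0x00, 0x00, 0x00, 0x30, 0x01, 0x00, 0x9d, 0x01, 0x2a, 0x01, 0x00, 0x01, 0x00, 0x02, 0x00,
  0x34, 0x25, 0xa4, 0x00, 0x03, 0x70, 0x00, 0xfe, 0xfb, 0xfd, 0x50, 0x00
]);
const fourCC = riff.toString('ascii', 0, 4);        // container magic: 'RIFF'
const fileSize = riff.readUInt32LE(4);              // file size minus the 8 bytes already read
const format = riff.toString('ascii', 8, 12);       // form type: 'WEBP'
const firstChunk = riff.toString('ascii', 12, 16);  // 'VP8 ' = lossy bitstream chunk
const chunkSize = riff.readUInt32LE(16);            // payload length of that chunk
console.log(fourCC, fileSize, format, firstChunk, chunkSize);
```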
const constants = {
TYPE_LOSSY: 0,
TYPE_LOSSLESS: 1,
TYPE_EXTENDED: 2
};
const encodeResults = {
// These are errors from binding.cpp
LIB_NOT_READY: -1, // <interface>.initEnc() was not called. It is called internally by <interface>.encodeImage(), so this should never happen.
LIB_INVALID_CONFIG: -2, // invalid options passed in via set[Image/Frame]Data. This should never happen.
SUCCESS: 0,
// These errors are from native code and can be found in upstream libwebp/src/encode.h, WebPEncodingError enum
VP8_ENC_ERROR_OUT_OF_MEMORY: 1, // memory error allocating objects
VP8_ENC_ERROR_BITSTREAM_OUT_OF_MEMORY: 2, // memory error while flushing bits
VP8_ENC_ERROR_NULL_PARAMETER: 3, // a pointer parameter is NULL
VP8_ENC_ERROR_INVALID_CONFIGURATION: 4, // configuration is invalid
VP8_ENC_ERROR_BAD_DIMENSION: 5, // picture has invalid width/height
VP8_ENC_ERROR_PARTITION0_OVERFLOW: 6, // partition is bigger than 512k
VP8_ENC_ERROR_PARTITION_OVERFLOW: 7, // partition is bigger than 16M
VP8_ENC_ERROR_BAD_WRITE: 8, // error while flushing bytes
VP8_ENC_ERROR_FILE_TOO_BIG: 9, // file is bigger than 4G
VP8_ENC_ERROR_USER_ABORT: 10, // abort request by user
VP8_ENC_ERROR_LAST: 11 // list terminator. always last.
};
const imageHints = {
DEFAULT: 0,
PICTURE: 1, // digital picture, such as a portrait. Indoors shot
PHOTO: 2, // outdoor photograph with natural lighting
GRAPH: 3 // discrete tone image (graph, map-tile, etc)
};
const imagePresets = {
DEFAULT: 0,
PICTURE: 1, // digital picture, such as a portrait. Indoors shot
PHOTO: 2, // outdoor photograph with natural lighting
DRAWING: 3, // hand or line drawing, with high-contrast details
ICON: 4, // small-sized, colorful images
TEXT: 5 // text-like
};
class Image {
constructor() { this.data = null; this.loaded = false; this.path = ''; }
async initLib() { return Image.initLib(); }
clear() { this.data = null; this.path = ''; this.loaded = false; }
// Convenience getters/setters
get width() { let d = this.data; return !this.loaded ? undefined : d.extended ? d.extended.width : d.vp8l ? d.vp8l.width : d.vp8 ? d.vp8.width : undefined; }
get height() { let d = this.data; return !this.loaded ? undefined : d.extended ? d.extended.height : d.vp8l ? d.vp8l.height : d.vp8 ? d.vp8.height : undefined; }
get type() { return this.loaded ? this.data.type : undefined; }
get hasAnim() { return this.loaded ? this.data.extended ? this.data.extended.hasAnim : false : false; }
get hasAlpha() { return this.loaded ? this.data.extended ? this.data.extended.hasAlpha : this.data.vp8 ? this.data.vp8.alpha : this.data.vp8l ? this.data.vp8l.alpha : false : false; }
get anim() { return this.hasAnim ? this.data.anim : undefined; }
get frames() { return this.anim ? this.anim.frames : undefined; }
get iccp() { return this.data.extended ? this.data.extended.hasICCP ? this.data.iccp.raw : undefined : undefined; }
set iccp(raw) {
if (!this.data.extended) { this._convertToExtended(); }
if (raw === undefined) { this.data.extended.hasICCP = false; delete this.data.iccp; }
else { this.data.iccp = { raw }; this.data.extended.hasICCP = true; }
}
get exif() { return this.data.extended ? this.data.extended.hasEXIF ? this.data.exif.raw : undefined : undefined; }
set exif(raw) {
if (!this.data.extended) { this._convertToExtended(); }
if (raw === undefined) { this.data.extended.hasEXIF = false; delete this.data.exif; }
else { this.data.exif = { raw }; this.data.extended.hasEXIF = true; }
}
get xmp() { return this.data.extended ? this.data.extended.hasXMP ? this.data.xmp.raw : undefined : undefined; }
set xmp(raw) {
if (!this.data.extended) { this._convertToExtended(); }
if (raw === undefined) { this.data.extended.hasXMP = false; delete this.data.xmp; }
else { this.data.xmp = { raw }; this.data.extended.hasXMP = true; }
}
// Private member functions
_convertToExtended() {
if (!this.loaded) { throw new Error('No image loaded'); }
this.data.type = constants.TYPE_EXTENDED;
this.data.extended = {
hasICCP: false,
hasAlpha: false,
hasEXIF: false,
hasXMP: false,
width: this.data.vp8 ? this.data.vp8.width : this.data.vp8l ? this.data.vp8l.width : 1,
height: this.data.vp8 ? this.data.vp8.height : this.data.vp8l ? this.data.vp8l.height : 1
};
}
async _demuxFrame(d, frame) {
let { hasICCP, hasEXIF, hasXMP } = this.data.extended ? this.data.extended : { hasICCP: false, hasEXIF: false, hasXMP: false }, hasAlpha = ((frame.vp8) && (frame.vp8.alpha)), writer = new WebPWriter();
if (typeof d === 'string') { writer.writeFile(d); }
else { writer.writeBuffer(); }
writer.writeFileHeader();
if ((hasICCP) || (hasEXIF) || (hasXMP) || (hasAlpha)) {
writer.writeChunk_VP8X({
hasICCP,
hasEXIF,
hasXMP,
hasAlpha: ((frame.vp8l) && (frame.vp8l.alpha)) || hasAlpha,
width: frame.width,
height: frame.height
});
}
if (frame.vp8l) { writer.writeChunk_VP8L(frame.vp8l); }
else if (frame.vp8) {
if (frame.vp8.alpha) { writer.writeChunk_ALPH(frame.alph); }
writer.writeChunk_VP8(frame.vp8);
} else { throw new Error('Frame has no VP8/VP8L?'); }
if ((hasICCP) || (hasEXIF) || (hasXMP) || (hasAlpha)) {
if (this.data.extended.hasICCP) { writer.writeChunk_ICCP(this.data.iccp); }
if (this.data.extended.hasEXIF) { writer.writeChunk_EXIF(this.data.exif); }
if (this.data.extended.hasXMP) { writer.writeChunk_XMP(this.data.xmp); }
}
return writer.commit();
}
async _save(writer, { width = undefined, height = undefined, frames = undefined, bgColor = [ 255, 255, 255, 255 ], loops = 0, delay = 100, x = 0, y = 0, blend = true, dispose = false, exif = false, iccp = false, xmp = false } = {}) {
let _width = width !== undefined ? width : this.width - 1, _height = height !== undefined ? height : this.height - 1, isAnim = this.hasAnim || frames !== undefined;
if ((_width < 0) || (_width > (1 << 24))) { throw new Error('Width out of range'); }
else if ((_height < 0) || (_height > (1 << 24))) { throw new Error('Height out of range'); }
else if ((_height * _width) > (Math.pow(2, 32) - 1)) { throw new Error(`Width * height too large (${_width}, ${_height})`); }
if (isAnim) {
if ((loops < 0) || (loops >= (1 << 24))) { throw new Error('Loops out of range'); }
else if ((delay < 0) || (delay >= (1 << 24))) { throw new Error('Delay out of range'); }
else if ((x < 0) || (x >= (1 << 24))) { throw new Error('X out of range'); }
else if ((y < 0) || (y >= (1 << 24))) { throw new Error('Y out of range'); }
} else { if ((_width == 0) || (_height == 0)) { throw new Error('Width/height cannot be 0'); } }
writer.writeFileHeader();
switch (this.type) {
case constants.TYPE_LOSSY: writer.writeChunk_VP8(this.data.vp8); break;
case constants.TYPE_LOSSLESS: writer.writeChunk_VP8L(this.data.vp8l); break;
case constants.TYPE_EXTENDED:
{
let hasICCP = iccp === true ? !!this.iccp : iccp,
hasEXIF = exif === true ? !!this.exif : exif,
hasXMP = xmp === true ? !!this.xmp : xmp;
writer.writeChunk_VP8X({
hasICCP, hasEXIF, hasXMP,
hasAlpha: ((this.data.alph) || ((this.data.vp8l) && (this.data.vp8l.alpha))),
hasAnim: isAnim,
width: _width,
height: _height
});
if (hasICCP) { writer.writeChunk_ICCP(iccp !== true ? iccp : this.data.iccp); }
if (isAnim) {
let _frames = frames || this.frames;
writer.writeChunk_ANIM({ bgColor, loops });
for (let i = 0, l = _frames.length; i < l; i++) {
let fr = _frames[i],
_delay = fr.delay == undefined ? delay : fr.delay,
_x = fr.x == undefined ? x : fr.x,
_y = fr.y == undefined ? y : fr.y,
_blend = fr.blend == undefined ? blend : fr.blend,
_dispose = fr.dispose == undefined ? dispose : fr.dispose, img;
if ((_delay < 0) || (_delay >= (1 << 24))) { throw new Error(`Delay out of range on frame ${i}`); }
else if ((_x < 0) || (_x >= (1 << 24))) { throw new Error(`X out of range on frame ${i}`); }
else if ((_y < 0) || (_y >= (1 << 24))) { throw new Error(`Y out of range on frame ${i}`); }
if (fr.path) { img = new Image(); await img.load(fr.path); img = img.data; }
else if (fr.buffer) { img = new Image(); await img.load(fr.buffer); img = img.data; }
else if (fr.img) { img = fr.img.data; }
else { img = fr; }
writer.writeChunk_ANMF({
x: _x,
y: _y,
delay: _delay,
blend: _blend,
dispose: _dispose,
img
});
}
if ((_width == 0) || (_height == 0)) { writer.updateChunk_VP8X_size(_width == 0 ? writer.width : _width, _height == 0 ? writer.height : _height); }
} else {
if (this.data.vp8) {
if (this.data.alph) { writer.writeChunk_ALPH(this.data.alph); }
writer.writeChunk_VP8(this.data.vp8);
} else if (this.data.vp8l) { writer.writeChunk_VP8L(this.data.vp8l); }
}
if (hasEXIF) { writer.writeChunk_EXIF(exif !== true ? exif : this.data.exif); }
if (hasXMP) { writer.writeChunk_XMP(xmp !== true ? xmp : this.data.xmp); }
}
break;
default: throw new Error('Unknown image type');
}
return writer.commit();
}
// Public member functions
async load(d) {
let reader = new WebPReader();
if (typeof d === 'string') { if (!IO.avail) { await IO.err(); } reader.readFile(d); this.path = d; }
else { reader.readBuffer(d); }
this.data = await reader.read();
this.loaded = true;
}
convertToAnim() {
if (!this.data.extended) { this._convertToExtended(); }
if (this.hasAnim) { return; }
if (this.data.vp8) { delete this.data.vp8; }
if (this.data.vp8l) { delete this.data.vp8l; }
if (this.data.alph) { delete this.data.alph; }
this.data.extended.hasAnim = true;
this.data.anim = {
bgColor: [ 255, 255, 255, 255 ],
loops: 0,
frames: []
};
}
async demux({ path = undefined, buffers = false, frame = -1, prefix = '#FNAME#', start = 0, end = 0 } = {}) {
if (!this.hasAnim) { throw new Error("This image isn't an animation"); }
let _end = end == 0 ? this.frames.length : end, bufs = [];
if (start < 0) { start = 0; }
if (_end >= this.frames.length) { _end = this.frames.length - 1; }
if (start > _end) { let n = start; start = _end; _end = n; }
if (frame != -1) { start = _end = frame; }
for (let i = start; i <= _end; i++) {
let t = await this._demuxFrame(path ? (`${path}/${prefix}_${i}.webp`).replace(/#FNAME#/g, IO.basename(this.path, '.webp')) : undefined, this.anim.frames[i]);
if (buffers) { bufs.push(t); }
}
if (buffers) { return bufs; }
}
async replaceFrame(frameIndex, d) {
if (!this.hasAnim) { throw new Error("WebP isn't animated"); }
if (typeof frameIndex !== 'number') { throw new Error('Frame index expects a number'); }
if ((frameIndex < 0) || (frameIndex >= this.frames.length)) { throw new Error(`Frame index out of bounds (0 <= index < ${this.frames.length})`); }
let r = new Image(), fr = this.frames[frameIndex];
await r.load(d);
switch (r.type) {
case constants.TYPE_LOSSY:
case constants.TYPE_LOSSLESS:
break;
case constants.TYPE_EXTENDED:
if (r.hasAnim) { throw new Error('Merging animations not currently supported'); }
break;
default: throw new Error('Unknown WebP type');
}
switch (fr.type) {
case constants.TYPE_LOSSY:
if (fr.vp8.alpha) { delete fr.alph; }
delete fr.vp8;
break;
case constants.TYPE_LOSSLESS:
delete fr.vp8l;
break;
default: throw new Error('Unknown frame type');
}
switch (r.type) {
case constants.TYPE_LOSSY:
fr.vp8 = r.data.vp8;
fr.type = constants.TYPE_LOSSY;
break;
case constants.TYPE_LOSSLESS:
fr.vp8l = r.data.vp8l;
fr.type = constants.TYPE_LOSSLESS;
break;
case constants.TYPE_EXTENDED:
if (r.data.vp8) {
fr.vp8 = r.data.vp8;
if (r.data.vp8.alpha) { fr.alph = r.data.alph; }
fr.type = constants.TYPE_LOSSY;
} else if (r.data.vp8l) { fr.vp8l = r.data.vp8l; fr.type = constants.TYPE_LOSSLESS; }
break;
}
fr.width = r.width;
fr.height = r.height;
}
async save(path = this.path, { width = this.width, height = this.height, frames = this.frames, bgColor = this.hasAnim ? this.anim.bgColor : [ 255, 255, 255, 255 ], loops = this.hasAnim ? this.anim.loops : 0, delay = 100, x = 0, y = 0, blend = true, dispose = false, exif = !!this.exif, iccp = !!this.iccp, xmp = !!this.xmp } = {}) {
let writer = new WebPWriter();
if (path !== null) { if (!IO.avail) { await IO.err(); } writer.writeFile(path); }
else { writer.writeBuffer(); }
return this._save(writer, { width, height, frames, bgColor, loops, delay, x, y, blend, dispose, exif, iccp, xmp });
}
async getImageData() {
if (!Image.libwebp) { throw new Error('Must call Image.initLib() before using getImageData'); }
if (this.hasAnim) { throw new Error('Calling getImageData on animations is not supported'); }
let buf = await this.save(null);
return Image.libwebp.decodeImage(buf, this.width, this.height);
}
async setImageData(buf, { width = 0, height = 0, preset = undefined, quality = undefined, exact = undefined, lossless = undefined, method = undefined, advanced = undefined } = {}) {
if (!Image.libwebp) { throw new Error('Must call Image.initLib() before using setImageData'); }
if (this.hasAnim) { throw new Error('Calling setImageData on animations is not supported'); }
if ((quality !== undefined) && ((quality < 0) || (quality > 100))) { throw new Error('Quality out of range'); }
if ((lossless !== undefined) && ((lossless < 0) || (lossless > 9))) { throw new Error('Lossless preset out of range'); }
if ((method !== undefined) && ((method < 0) || (method > 6))) { throw new Error('Method out of range'); }
let ret = Image.libwebp.encodeImage(buf, width > 0 ? width : this.width, height > 0 ? height : this.height, { preset, quality, exact, lossless, method, advanced }), img = new Image(), keepEx = false, ex;
if (ret.res !== encodeResults.SUCCESS) { return ret.res; }
await img.load(Buffer.from(ret.buf));
switch (this.type) {
case constants.TYPE_LOSSY: delete this.data.vp8; break;
case constants.TYPE_LOSSLESS: delete this.data.vp8l; break;
case constants.TYPE_EXTENDED:
ex = this.data.extended;
delete this.data.extended;
if ((ex.hasICCP) || (ex.hasEXIF) || (ex.hasXMP)) { keepEx = true; }
if (this.data.vp8) { delete this.data.vp8; }
if (this.data.vp8l) { delete this.data.vp8l; }
if (this.data.alph) { delete this.data.alph; }
break;
}
switch (img.type) {
case constants.TYPE_LOSSY:
if (keepEx) { this.data.type = constants.TYPE_EXTENDED; ex.hasAlpha = false; ex.width = img.width; ex.height = img.height; this.data.extended = ex; }
else { this.data.type = constants.TYPE_LOSSY; }
this.data.vp8 = img.data.vp8;
break;
case constants.TYPE_LOSSLESS:
if (keepEx) { this.data.type = constants.TYPE_EXTENDED; ex.hasAlpha = img.data.vp8l.alpha; ex.width = img.width; ex.height = img.height; this.data.extended = ex; }
else { this.data.type = constants.TYPE_LOSSLESS; }
this.data.vp8l = img.data.vp8l;
break;
case constants.TYPE_EXTENDED:
this.data.type = constants.TYPE_EXTENDED;
if (keepEx) { ex.hasAlpha = img.data.alph || ((img.data.vp8l) && (img.data.vp8l.alpha)); ex.width = img.width; ex.height = img.height; this.data.extended = ex; }
else { this.data.extended = img.data.extended; }
if (img.data.vp8) { this.data.vp8 = img.data.vp8; }
if (img.data.vp8l) { this.data.vp8l = img.data.vp8l; }
if (img.data.alph) { this.data.alph = img.data.alph; }
break;
}
return encodeResults.SUCCESS;
}
async getFrameData(frameIndex) {
if (!Image.libwebp) { throw new Error('Must call Image.initLib() before using getFrameData'); }
if (!this.hasAnim) { throw new Error('Calling getFrameData on non-animations is not supported'); }
if (typeof frameIndex !== 'number') { throw new Error('Frame index expects a number'); }
if ((frameIndex < 0) || (frameIndex >= this.frames.length)) { throw new Error('Frame index out of range'); }
let fr = this.frames[frameIndex], buf = await this._demuxFrame(null, fr);
return Image.libwebp.decodeImage(buf, fr.width, fr.height);
}
async setFrameData(frameIndex, buf, { width = 0, height = 0, preset = undefined, quality = undefined, exact = undefined, lossless = undefined, method = undefined, advanced = undefined } = {}) {
if (!Image.libwebp) { throw new Error('Must call Image.initLib() before using setFrameData'); }
if (!this.hasAnim) { throw new Error('Calling setFrameData on non-animations is not supported'); }
if (typeof frameIndex !== 'number') { throw new Error('Frame index expects a number'); }
if ((frameIndex < 0) || (frameIndex >= this.frames.length)) { throw new Error('Frame index out of range'); }
if ((quality !== undefined) && ((quality < 0) || (quality > 100))) { throw new Error('Quality out of range'); }
if ((lossless !== undefined) && ((lossless < 0) || (lossless > 9))) { throw new Error('Lossless preset out of range'); }
if ((method !== undefined) && ((method < 0) || (method > 6))) { throw new Error('Method out of range'); }
let fr = this.frames[frameIndex], ret = Image.libwebp.encodeImage(buf, width > 0 ? width : fr.width, height > 0 ? height : fr.height, { preset, quality, exact, lossless, method, advanced }), img = new Image();
if (ret.res !== encodeResults.SUCCESS) { return ret.res; }
await img.load(Buffer.from(ret.buf));
switch (fr.type) {
case constants.TYPE_LOSSY: delete fr.vp8; if (fr.alph) { delete fr.alph; } break;
case constants.TYPE_LOSSLESS: delete fr.vp8l; break;
}
fr.width = img.width;
fr.height = img.height;
switch (img.type) {
case constants.TYPE_LOSSY: fr.type = img.type; fr.vp8 = img.data.vp8; break;
case constants.TYPE_LOSSLESS: fr.type = img.type; fr.vp8l = img.data.vp8l; break;
case constants.TYPE_EXTENDED:
if (img.data.vp8) {
fr.type = constants.TYPE_LOSSY;
fr.vp8 = img.data.vp8;
if (img.data.vp8.alpha) { fr.alph = img.data.alph; }
} else if (img.data.vp8l) {
fr.type = constants.TYPE_LOSSLESS;
fr.vp8l = img.data.vp8l;
}
break;
}
return encodeResults.SUCCESS;
}
// Public static functions
static async initLib() {
if (!Image.libwebp) {
const libWebP = require('./libwebp.js');
Image.libwebp = new libWebP();
await Image.libwebp.init();
}
}
static async save(d, opts) {
if ((opts.frames) && ((opts.width === undefined) || (opts.height === undefined))) { throw new Error('Must provide both width and height when passing frames'); }
return (await Image.getEmptyImage(!!opts.frames)).save(d, opts);
}
static async getEmptyImage(ext) {
let img = new Image();
await img.load(emptyImageBuffer);
if (ext) { img.exif = undefined; } // the exif setter's side effect converts the empty lossy image to TYPE_EXTENDED
return img;
}
static async generateFrame({ path = undefined, buffer = undefined, img = undefined, x = undefined, y = undefined, delay = undefined, blend = undefined, dispose = undefined } = {}) {
let _img = img;
if (((path ? 1 : 0) + (buffer ? 1 : 0) + (img ? 1 : 0)) !== 1) { throw new Error('Must provide exactly one of `path`, `buffer`, or `img`'); }
if (!img) {
_img = new Image();
if (path) { await _img.load(path); }
else { await _img.load(buffer); }
}
if (_img.hasAnim) { throw new Error('Merging animations is not currently supported'); }
return {
img: _img,
x,
y,
delay,
blend,
dispose
};
}
static from(webp) {
let img = new Image();
img.data = webp.data;
img.loaded = webp.loaded;
img.path = webp.path;
return img;
}
}
module.exports = {
TYPE_LOSSY: constants.TYPE_LOSSY,
TYPE_LOSSLESS: constants.TYPE_LOSSLESS,
TYPE_EXTENDED: constants.TYPE_EXTENDED,
encodeResults,
hints: imageHints,
presets: imagePresets,
Image
};
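The bounds `_save` enforces on the canvas come straight from the container format: each dimension is a 24-bit field, and the total pixel count must fit in 32 bits. A simplified standalone sketch of those checks (`validateCanvas` is a hypothetical helper; the real `_save` range-checks the stored minus-one values):

```javascript
// Simplified version of the dimension validation in Image._save:
// each side must fit a 24-bit field, and width * height must fit in 32 bits.
function validateCanvas(width, height) {
  if ((width < 1) || (width > (1 << 24))) { throw new Error('Width out of range'); }
  if ((height < 1) || (height > (1 << 24))) { throw new Error('Height out of range'); }
  if ((width * height) > (2 ** 32 - 1)) { throw new Error(`Width * height too large (${width}, ${height})`); }
  return true;
}
```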