Compare commits

...

58 Commits

Author SHA1 Message Date
Lovell Fuller
844deaf480 Release v0.31.3 2022-12-21 15:57:10 +00:00
Lovell Fuller
efbb0c22fd Docs: add image with examples of resize fit property 2022-12-21 15:47:39 +00:00
Lovell Fuller
da0b594900 Docs: update benchmarks for latest versions, add ARM64 results 2022-12-20 19:49:29 +00:00
Lovell Fuller
78dada9126 Tests: skip mapnik and tensorflow for Docker-run benchmarks
Maintenance of mapnik seems to have stalled, no ARM64 support
Memory requirements of Tensorflow too high, hangs/crashes on AMD64
2022-12-20 18:20:59 +00:00
Lovell Fuller
15f5cd4671 Tests: move mapnik to optional deps
It does not currently support ARM64
2022-12-19 19:47:46 +00:00
Lovell Fuller
9eb2e94404 Tests: update benchmark dependencies 2022-12-17 14:29:11 +00:00
Lovell Fuller
e40b068628 Tests: update leak suppressions for latest dependencies 2022-12-14 21:57:42 +00:00
Lovell Fuller
2c46528269 Docs refresh 2022-12-14 16:17:42 +00:00
Lovell Fuller
584807b4f5 Add runtime detection of V8 memory cage #3384
When using the V8 memory cage, Buffers cannot be wrapped and then
later freed via a callback. When the cage is detected via a throw,
instead fall back to copying Buffer contents to V8 memory.

This approach will be used by Electron 21+ and you should expect
reduced performance and increased memory consumption/fragmentation.
2022-12-14 16:06:04 +00:00
Lovell Fuller
a7fa7014ef Add experimental support for JPEG-XL, requires libvips with libjxl
The prebuilt binaries do not include support for this format.
2022-12-13 21:55:17 +00:00
Lovell Fuller
f92e33fbff Bump devDeps 2022-12-13 10:31:06 +00:00
Lovell Fuller
0f1e7ef6f6 Install: add support for Linux with glibc patch version #3423 2022-12-09 12:03:41 +00:00
Lennart
89e204d824 Docs: clarify failOn property applies to decoding pixel values (#3481) 2022-12-08 16:13:18 +00:00
Lovell Fuller
2a71f1830f Expand range of sharpen params to match libvips #3427 2022-12-07 09:28:01 +00:00
Lovell Fuller
def99a294a Install: log proxy use, if any, to aid with debugging 2022-12-06 19:35:47 +00:00
Lovell Fuller
9d760f3958 Improve perf of ops that introduce non-opaque background #3465 2022-12-05 20:40:41 +00:00
Lovell Fuller
0265d305fe Ensure integral output of linear op #3468 2022-12-04 21:41:15 +00:00
Lovell Fuller
a472aea025 Ignore sequentialRead option for stats #3462 2022-11-20 21:30:45 +00:00
Lovell Fuller
01ffa80338 Improve extractChannel support for 16-bit output #3453 2022-11-15 15:00:32 +00:00
Lovell Fuller
789d4851ea Tests: remove flaky font assertions
Probably due to Windows CI env font discovery
2022-11-15 10:08:43 +00:00
Lovell Fuller
4490a93430 Tests: simplify beforeEach configuration
Remove legacy settings for previous CI providers/hardware
2022-11-15 09:54:29 +00:00
Ingvar Stepanyan
ac0dc10bd5 Tests: convert mocha hooks (#3450) 2022-11-15 08:58:09 +00:00
Lovell Fuller
5740f4545e Expose GIF opts: interFrameMaxError, interPaletteMaxError #3401 2022-11-14 16:09:52 +00:00
Lovell Fuller
a9d692fb43 Reduce chance of race condition in test for... race condition 2022-11-13 10:16:47 +00:00
Lovell Fuller
df971207b8 Prevent possible race condition when reading metadata #3451 2022-11-13 10:04:55 +00:00
Ingvar Stepanyan
3a64a0529a Tests: run in parallel, move settings to config file (#3449)
This makes it easy to invoke Mocha alone via `npx mocha`, or for e.g. the VSCode Test Explorer to find and run tests with the correct settings automatically.
2022-11-10 21:48:18 +00:00
Peter Whidden
76cda885fb Docs: fix minor typo in resize properties (#3444) 2022-11-09 08:44:05 +00:00
Ingvar Stepanyan
1a563360c6 Fix errors for missing OpenJPEG (#3442)
Fixes a couple of minor issues with JP2 errors:

1. The tests passed as false-positives even if the regex was changed to an arbitrary pattern, because the promise returned from `assert.rejects` was ignored and the test ended prematurely. This is fixed by removing `{ ... }` around the test function body.
2. This, in turn, hid an issue with `toFile` not throwing the expected error message, instead propagating `Error: VipsOperation: class "jp2ksave" not found` from libvips. This is now fixed by manually checking the extension before calling into libvips.
3. Pre-creating error instances, as `errJp2Save` did, is sometimes tempting but is problematic for debugging because it hides the actual stack trace of the error (the stack trace is collected at the moment of `new Error` creation). This is now a function that creates the error with the right stack.
2022-11-08 19:53:14 +00:00
Lovell Fuller
ca22af203f Docs: canvas on Windows uses MSVCRT, conflicts with UCRT 2022-11-07 21:14:51 +00:00
Lovell Fuller
9fa516e849 Release v0.31.2 2022-11-04 09:44:37 +00:00
Lovell Fuller
12f472126d CI: Only pin Python version on x64 macOS and Windows
See commit 18be09f
2022-11-03 14:49:12 +00:00
Lovell Fuller
18be09f1d7 CI: Pin Python to 3.10
Python 3.11 removes support for opening files in
'universal newline' mode (e.g. 'rU'), however older
versions of node-gyp such as v6 still use it.
2022-11-03 14:40:16 +00:00
Lovell Fuller
b3c3290f90 Upgrade to libvips v8.13.3 2022-11-03 14:09:23 +00:00
Lovell Fuller
123f95c85a Bump devDeps 2022-11-03 12:50:58 +00:00
Lovell Fuller
5b0fba4c01 Ensure auto-rotate always works without resize #3422 2022-11-02 13:59:34 +00:00
Lovell Fuller
37f7ccfff4 CI: upgrade to checkout v3 2022-10-17 16:05:30 +01:00
Lovell Fuller
51811d06e2 Bump deps 2022-10-17 16:05:03 +01:00
Lovell Fuller
181731f8f4 Tests: increase timeout to 30s
Font discovery appears to be slooow on Windows
2022-10-17 15:54:43 +01:00
Gino Emiliozzi
ae79d26ead Docs: help clarify 'fit' is option name, not value (#3410) 2022-10-17 15:26:59 +01:00
Lovell Fuller
eacb8337fa Ensure manual flip, rotate, resize op order #3391 2022-10-01 11:55:29 +01:00
Lovell Fuller
99bf279de8 Release v0.31.1 2022-09-29 14:51:45 +01:00
Lovell Fuller
1b0eb6ab53 Tests: add assertion to existing scenario #3357 2022-09-29 14:21:37 +01:00
Lovell Fuller
891cf67d0b Upgrade to libvips v8.13.2 2022-09-29 14:19:58 +01:00
Lovell Fuller
eaf8a86bf2 Tests: increase timeout to 20s
Ignore unit coverage of fn used at install time
2022-09-27 14:25:34 +01:00
Lovell Fuller
3400976d61 Tests: ignore a branch for coverage check 2022-09-27 14:10:49 +01:00
Lovell Fuller
2d49f0e93e Tests: require 100% branch coverage to pass
Remove old coverage tooling, coveralls
2022-09-27 13:49:42 +01:00
Lovell Fuller
b0c69f1ee9 CI: use clearer job names 2022-09-27 10:02:20 +01:00
Lovell Fuller
27d0c35a01 CI: use clearer job names 2022-09-27 09:23:30 +01:00
Alex
32a22b5420 CI: GitHub Workflows security hardening (#3377)
Signed-off-by: Alex <aleksandrosansan@gmail.com>
2022-09-26 11:25:49 +01:00
Lovell Fuller
d1004eed02 Ensure greyscale images can be trimmed #3386 2022-09-26 10:15:25 +01:00
Lovell Fuller
70e6bb0162 Ensure close event occurs after end event #3313 2022-09-20 08:52:40 +01:00
Lovell Fuller
32aa3b4b20 Tests: bump/pin benchmark dependencies 2022-09-19 16:21:24 +01:00
Lovell Fuller
df24b30755 Tests: add tfjs to benchmark tests 2022-09-19 16:20:23 +01:00
Lovell Fuller
4de74bea94 Tests: remove assertions from benchmark code 2022-09-19 16:09:31 +01:00
Lovell Fuller
28b87db760 Ensure AVIF output is always 8-bit #3358 2022-09-14 13:33:47 +01:00
Lovell Fuller
fbd4970b57 Ensure auto-rotation works with shrink-on-load #3352
Fixes regression in 0.31.0
2022-09-07 14:17:40 +01:00
Lovell Fuller
f5da147a58 Docs: changelog and credit for #3349 2022-09-07 13:31:26 +01:00
Marcos Casagrande
eee0dd36d9 Ensure limitInputPixels uses uint64 (#3349) 2022-09-06 09:05:51 +01:00
60 changed files with 1022 additions and 363 deletions


@@ -2,29 +2,33 @@ name: CI (MacStadium)
 on:
 - push
 - pull_request
+permissions: {}
 jobs:
   CI:
+    permissions:
+      contents: write # for npx prebuild to make release
+    name: Node.js ${{ matrix.nodejs_version }} ${{ matrix.nodejs_arch }} ${{ matrix.prebuild && '- prebuild' }}
     runs-on: macos-m1
     strategy:
       fail-fast: false
       matrix:
         include:
           - nodejs_version: 14
-            nodejs_architecture: x64
+            nodejs_arch: x64
           - nodejs_version: 18
-            nodejs_architecture: arm64
+            nodejs_arch: arm64
             prebuild: true
     defaults:
       run:
         shell: /usr/bin/arch -arch arm64e /bin/bash -l {0}
     steps:
-      - name: Dependencies
-        uses: actions/setup-node@v2
+      - name: Dependencies (Node.js)
+        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.nodejs_version }}
-          architecture: ${{ matrix.nodejs_architecture }}
+          architecture: ${{ matrix.nodejs_arch }}
      - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
      - name: Install
        run: npm install --build-from-source --unsafe-perm
      - name: Test


@@ -2,31 +2,34 @@ name: CI (GitHub)
 on:
 - push
 - pull_request
+permissions: {}
 jobs:
   CI:
+    permissions:
+      contents: write # for npx prebuild to make release
+    name: ${{ matrix.container || matrix.os }} - Node.js ${{ matrix.nodejs_version }} ${{ matrix.nodejs_arch }} ${{ matrix.prebuild && '- prebuild' }}
     runs-on: ${{ matrix.os }}
     container: ${{ matrix.container }}
     strategy:
       fail-fast: false
       matrix:
         include:
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: centos:7
             nodejs_version: 14
-            coverage: true
             prebuild: true
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: centos:7
             nodejs_version: 16
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: rockylinux:8
             nodejs_version: 18
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: node:14-alpine3.12
             prebuild: true
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: node:16-alpine3.12
-          - os: ubuntu-20.04
+          - os: ubuntu-22.04
             container: node:18-alpine3.14
           - os: macos-11
             nodejs_version: 14
@@ -75,14 +78,19 @@ jobs:
       - name: Dependencies (Linux musl)
         if: contains(matrix.container, 'alpine')
         run: apk add build-base git python3 font-noto --update-cache
-      - name: Dependencies (macOS, Windows)
+      - name: Dependencies (Python 3.10 - macOS, Windows)
+        if: contains(matrix.os, 'macos') || contains(matrix.os, 'windows')
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.10'
+      - name: Dependencies (Node.js - macOS, Windows)
         if: contains(matrix.os, 'macos') || contains(matrix.os, 'windows')
         uses: actions/setup-node@v3
         with:
           node-version: ${{ matrix.nodejs_version }}
           architecture: ${{ matrix.nodejs_arch }}
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
       - name: Fix working directory ownership
         if: matrix.container
         run: chown root.root .
@@ -90,11 +98,6 @@ jobs:
         run: npm install --build-from-source --unsafe-perm
       - name: Test
         run: npm test
-      - name: Coverage
-        if: matrix.coverage
-        uses: coverallsapp/github-action@v1.1.2
-        with:
-          github-token: ${{ secrets.GITHUB_TOKEN }}
       - name: Prebuild
         if: matrix.prebuild && startsWith(github.ref, 'refs/tags/')
         env:

.mocharc.jsonc (new file)

@@ -0,0 +1,7 @@
+{
+  "parallel": true,
+  "slow": 1000,
+  "timeout": 30000,
+  "require": "./test/beforeEach.js",
+  "spec": "./test/unit/*.js"
+}


@@ -98,7 +98,6 @@ readableStream
 A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md)
 covers reporting bugs, requesting features and submitting code changes.
-[![Test Coverage](https://coveralls.io/repos/lovell/sharp/badge.svg?branch=main)](https://coveralls.io/r/lovell/sharp?branch=main)
 [![Node-API v5](https://img.shields.io/badge/Node--API-v5-green.svg)](https://nodejs.org/dist/latest/docs/api/n-api.html#n_api_n_api_version_matrix)
 ## Licensing


@@ -20,7 +20,7 @@ Implements the [stream.Duplex][1] class.
 JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
 * `options` **[Object][13]?** if present, is an Object with optional attributes.
-* `options.failOn` **[string][12]** level of sensitivity to invalid images, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), highers level imply lower levels. (optional, default `'warning'`)
+* `options.failOn` **[string][12]** when to abort processing of invalid pixel data, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), highers level imply lower levels, invalid metadata will always abort. (optional, default `'warning'`)
 * `options.limitInputPixels` **([number][14] | [boolean][15])** Do not process input images where the number of pixels
 (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted.
 An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). (optional, default `268402689`)

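To make the effect of these two constructor options concrete, here is a minimal usage sketch based solely on the option names and values documented above (the input file name and pixel limit are illustrative, not taken from this changeset):

```javascript
const sharp = require('sharp');

// Abort only when pixel decoding finds truncated image data,
// and refuse any input larger than 0.5 megapixels (width x height).
const image = sharp('input.jpg', {
  failOn: 'truncated',
  limitInputPixels: 500000
});

image
  .metadata()
  .then(metadata => console.log(metadata.width, metadata.height))
  .catch(err => console.error('Input rejected:', err.message));
```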

@@ -151,22 +151,24 @@ Returns **Sharp**&#x20;
 ## sharpen
 Sharpen the image.
 When used without parameters, performs a fast, mild sharpen of the output image.
 When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space.
-Separate control over the level of sharpening in "flat" and "jagged" areas is available.
+Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available.
 See [libvips sharpen][8] operation.
 ### Parameters
-* `options` **([Object][2] | [number][1])?** if present, is an Object with attributes or (deprecated) a number for `options.sigma`.
-* `options.sigma` **[number][1]?** the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`.
-* `options.m1` **[number][1]** the level of sharpening to apply to "flat" areas. (optional, default `1.0`)
-* `options.m2` **[number][1]** the level of sharpening to apply to "jagged" areas. (optional, default `2.0`)
-* `options.x1` **[number][1]** threshold between "flat" and "jagged" (optional, default `2.0`)
-* `options.y2` **[number][1]** maximum amount of brightening. (optional, default `10.0`)
-* `options.y3` **[number][1]** maximum amount of darkening. (optional, default `20.0`)
+* `options` **([Object][2] | [number][1])?** if present, is an Object with attributes
+* `options.sigma` **[number][1]?** the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10000
+* `options.m1` **[number][1]** the level of sharpening to apply to "flat" areas, between 0 and 1000000 (optional, default `1.0`)
+* `options.m2` **[number][1]** the level of sharpening to apply to "jagged" areas, between 0 and 1000000 (optional, default `2.0`)
+* `options.x1` **[number][1]** threshold between "flat" and "jagged", between 0 and 1000000 (optional, default `2.0`)
+* `options.y2` **[number][1]** maximum amount of brightening, between 0 and 1000000 (optional, default `10.0`)
+* `options.y3` **[number][1]** maximum amount of darkening, between 0 and 1000000 (optional, default `20.0`)
 * `flat` **[number][1]?** (deprecated) see `options.m1`.
 * `jagged` **[number][1]?** (deprecated) see `options.m2`.

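The parameter list above maps directly onto the object form of `sharpen`. A brief sketch of a call using the newly widened ranges (all values are illustrative, not recommendations):

```javascript
const sharp = require('sharp');

// Slower, more accurate sharpen of the L channel in LAB colour space,
// with separate strength for "flat" (m1) and "jagged" (m2) areas.
await sharp('input.jpg')
  .sharpen({ sigma: 2, m1: 0, m2: 3, x1: 3, y2: 15, y3: 15 })
  .toFile('output.jpg');
```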

@@ -334,6 +334,8 @@ The palette of the input image will be re-used if possible.
 * `options.colors` **[number][12]** alternative spelling of `options.colours` (optional, default `256`)
 * `options.effort` **[number][12]** CPU effort, between 1 (fastest) and 10 (slowest) (optional, default `7`)
 * `options.dither` **[number][12]** level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most) (optional, default `1.0`)
+* `options.interFrameMaxError` **[number][12]** maximum inter-frame error for transparency, between 0 (lossless) and 32 (optional, default `0`)
+* `options.interPaletteMaxError` **[number][12]** maximum inter-palette error for palette reuse, between 0 and 256 (optional, default `3`)
 * `options.loop` **[number][12]** number of animation iterations, use 0 for infinite animation (optional, default `0`)
 * `options.delay` **([number][12] | [Array][13]<[number][12]>)?** delay(s) between animation frames (in milliseconds)
 * `options.force` **[boolean][10]** force GIF output, otherwise attempt to use input format (optional, default `true`)
@@ -361,6 +363,13 @@ const out = await sharp('in.gif', { animated: true })
   .toBuffer();
 ```
+```javascript
+// Lossy file size reduction of animated GIF
+await sharp('in.gif', { animated: true })
+  .gif({ interFrameMaxError: 8 })
+  .toFile('optim.gif');
+```
 * Throws **[Error][4]** Invalid options
 Returns **Sharp**&#x20;
@@ -527,6 +536,38 @@ Returns **Sharp**&#x20;
 * **since**: 0.23.0
+## jxl
+Use these JPEG-XL (JXL) options for output image.
+This feature is experimental, please do not use in production systems.
+Requires libvips compiled with support for libjxl.
+The prebuilt binaries do not include this - see
+[installing a custom libvips][14].
+Image metadata (EXIF, XMP) is unsupported.
+### Parameters
+* `options` **[Object][6]?** output options
+* `options.distance` **[number][12]** maximum encoding error, between 0 (highest quality) and 15 (lowest quality) (optional, default `1.0`)
+* `options.quality` **[number][12]?** calculate `distance` based on JPEG-like quality, between 1 and 100, overrides distance if specified
+* `options.decodingTier` **[number][12]** target decode speed tier, between 0 (highest quality) and 4 (lowest quality) (optional, default `0`)
+* `options.lossless` **[boolean][10]** use lossless compression (optional, default `false`)
+* `options.effort` **[number][12]** CPU effort, between 3 (fastest) and 9 (slowest) (optional, default `7`)
+<!---->
+* Throws **[Error][4]** Invalid options
+Returns **Sharp**&#x20;
+**Meta**
+* **since**: 0.31.3
 ## raw
 Force output to be raw, uncompressed pixel data.

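For orientation, a hedged usage sketch of the experimental `jxl` output documented above; it assumes a libvips build that includes libjxl, since the prebuilt binaries do not, and the file names are illustrative:

```javascript
const sharp = require('sharp');

// Experimental JPEG-XL output (requires a custom libvips with libjxl).
await sharp('input.png')
  .jxl({ distance: 1.0, effort: 7 })
  .toFile('output.jxl');

// Lossless variant.
await sharp('input.png')
  .jxl({ lossless: true })
  .toFile('lossless.jxl');
```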

@@ -14,7 +14,9 @@ When both a `width` and `height` are provided, the possible methods by which the
 Some of these values are based on the [object-fit][1] CSS property.
-When using a `fit` of `cover` or `contain`, the default **position** is `centre`. Other options are:
+<img alt="Examples of various values for the fit property when resizing" width="100%" style="aspect-ratio: 998/243" src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/api-resize-fit.png">
+
+When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are:
 * `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`.
 * `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`.
@@ -45,8 +47,8 @@ Previous calls to `resize` in the same pipeline will be ignored.
 * `height` **[number][8]?** pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width.
 * `options` **[Object][9]?**&#x20;
-* `options.width` **[String][10]?** alternative means of specifying `width`. If both are present this take priority.
-* `options.height` **[String][10]?** alternative means of specifying `height`. If both are present this take priority.
+* `options.width` **[String][10]?** alternative means of specifying `width`. If both are present this takes priority.
+* `options.height` **[String][10]?** alternative means of specifying `height`. If both are present this takes priority.
 * `options.fit` **[String][10]** how the image should be resized to fit both provided dimensions, one of `cover`, `contain`, `fill`, `inside` or `outside`. (optional, default `'cover'`)
 * `options.position` **[String][10]** position, gravity or strategy to use when `fit` is `cover` or `contain`. (optional, default `'centre'`)
 * `options.background` **([String][10] | [Object][9])** background colour when `fit` is `contain`, parsed by the [color][11] module, defaults to black without transparency. (optional, default `{r:0,g:0,b:0,alpha:1}`)


@@ -2,7 +2,69 @@
 ## v0.31 - *eagle*
-Requires libvips v8.13.1
+Requires libvips v8.13.3
+### v0.31.3 - 21st December 2022
+* Add experimental support for JPEG-XL images. Requires libvips compiled with libjxl.
+  [#2731](https://github.com/lovell/sharp/issues/2731)
+* Add runtime detection of V8 memory cage, ensures compatibility with Electron 21 onwards.
+  [#3384](https://github.com/lovell/sharp/issues/3384)
+* Expose `interFrameMaxError` and `interPaletteMaxError` GIF optimisation properties.
+  [#3401](https://github.com/lovell/sharp/issues/3401)
+* Allow installation on Linux with glibc patch versions e.g. Fedora 38.
+  [#3423](https://github.com/lovell/sharp/issues/3423)
+* Expand range of existing `sharpen` parameters to match libvips.
+  [#3427](https://github.com/lovell/sharp/issues/3427)
+* Prevent possible race condition awaiting metadata of Stream-based input.
+  [#3451](https://github.com/lovell/sharp/issues/3451)
+* Improve `extractChannel` support for 16-bit output colourspaces.
+  [#3453](https://github.com/lovell/sharp/issues/3453)
+* Ignore `sequentialRead` option when calculating image statistics.
+  [#3462](https://github.com/lovell/sharp/issues/3462)
+* Small performance improvement for operations that introduce a non-opaque background.
+  [#3465](https://github.com/lovell/sharp/issues/3465)
+* Ensure integral output of `linear` operation.
+  [#3468](https://github.com/lovell/sharp/issues/3468)
+### v0.31.2 - 4th November 2022
+* Upgrade to libvips v8.13.3 for upstream bug fixes.
+* Ensure manual flip, rotate, resize operation ordering (regression in 0.31.1)
+  [#3391](https://github.com/lovell/sharp/issues/3391)
+* Ensure auto-rotation works without resize (regression in 0.31.1)
+  [#3422](https://github.com/lovell/sharp/issues/3422)
+### v0.31.1 - 29th September 2022
+* Upgrade to libvips v8.13.2 for upstream bug fixes.
+* Ensure `close` event occurs after `end` event for Stream-based output.
+  [#3313](https://github.com/lovell/sharp/issues/3313)
+* Ensure `limitInputPixels` constructor option uses uint64.
+  [#3349](https://github.com/lovell/sharp/pull/3349)
+  [@marcosc90](https://github.com/marcosc90)
+* Ensure auto-rotation works with shrink-on-load and extract (regression in 0.31.0).
+  [#3352](https://github.com/lovell/sharp/issues/3352)
+* Ensure AVIF output is always 8-bit.
+  [#3358](https://github.com/lovell/sharp/issues/3358)
+* Ensure greyscale images can be trimmed (regression in 0.31.0).
+  [#3386](https://github.com/lovell/sharp/issues/3386)
 ### v0.31.0 - 5th September 2022


@@ -260,3 +260,6 @@ GitHub: https://github.com/brahima
 Name: Anton Marsden
 GitHub: https://github.com/antonmarsden
+
+Name: Marcos Casagrande
+GitHub: https://github.com/marcosc90

Binary file not shown (new image added, 16 KiB).


@@ -343,9 +343,12 @@ Module did not self-register
 ### Canvas and Windows
-The prebuilt binaries provided by `canvas` for Windows depend on the unmaintained GTK 2, last updated in 2011.
-These conflict with the modern, up-to-date binaries provided by sharp.
+The prebuilt binaries provided by `canvas` for Windows
+from v2.7.0 onwards depend on the Visual C++ Runtime (MSVCRT).
+These conflict with the binaries provided by sharp,
+which depend on the more modern Universal C Runtime (UCRT).
+See [Automattic/node-canvas#2155](https://github.com/Automattic/node-canvas/issues/2155).
 If both modules are used in the same Windows process, the following error will occur:
 ```


@@ -2,15 +2,19 @@
 A test to benchmark the performance of this module relative to alternatives.
+Greater libvips performance can be expected with caching enabled (default)
+and using 8+ core machines, especially those with larger L1/L2 CPU caches.
+The I/O limits of the relevant (de)compression library will generally determine maximum throughput.
 ## The contenders
-* [jimp](https://www.npmjs.com/package/jimp) v0.16.1 - Image processing in pure JavaScript. Provides bicubic interpolation.
+* [jimp](https://www.npmjs.com/package/jimp) v0.16.2 - Image processing in pure JavaScript. Provides bicubic interpolation.
-* [mapnik](https://www.npmjs.org/package/mapnik) v4.5.9 - Whilst primarily a map renderer, Mapnik contains bitmap image utilities.
 * [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "*has been unmaintained for a long time*".
-* [gm](https://www.npmjs.com/package/gm) v1.23.1 - Fully featured wrapper around GraphicsMagick's `gm` command line utility.
+* [gm](https://www.npmjs.com/package/gm) v1.25.0 - Fully featured wrapper around GraphicsMagick's `gm` command line utility.
 * [@squoosh/lib](https://www.npmjs.com/package/@squoosh/lib) v0.4.0 - Image libraries transpiled to WebAssembly, includes GPLv3 code.
 * [@squoosh/cli](https://www.npmjs.com/package/@squoosh/cli) v0.7.2 - Command line wrapper around `@squoosh/lib`, avoids GPLv3 by spawning process.
-* sharp v0.31.0 / libvips v8.13.1 - Caching within libvips disabled to ensure a fair comparison.
+* sharp v0.31.3 / libvips v8.13.3 - Caching within libvips disabled to ensure a fair comparison.
 ## The task
@@ -18,31 +22,43 @@ Decompress a 2725x2225 JPEG image,
 resize to 720x588 using Lanczos 3 resampling (where available),
 then compress to JPEG at a "quality" setting of 80.
-## Test environment
+## Results
+### AMD64
 * AWS EC2 eu-west-1 [c6a.xlarge](https://aws.amazon.com/ec2/instance-types/c6a/) (4x AMD EPYC 7R13)
-* Ubuntu 22.04 (ami-051f7c00cb18501ee)
-* Node.js 16.17.0
+* Ubuntu 22.04 (ami-026e72e4e468afa7b)
+* Node.js 16.19.0
-## Results
 | Module | Input | Output | Ops/sec | Speed-up |
 | :----------------- | :----- | :----- | ------: | -------: |
-| jimp | buffer | buffer | 0.96 | 1.0 |
-| squoosh-cli | file | file | 1.10 | 1.1 |
-| squoosh-lib | buffer | buffer | 1.87 | 1.9 |
-| mapnik | buffer | buffer | 3.48 | 3.6 |
-| gm | buffer | buffer | 8.53 | 8.9 |
-| gm | file | file | 8.60 | 9.0 |
-| imagemagick | file | file | 9.30 | 9.7 |
-| sharp | stream | stream | 32.86 | 34.2 |
-| sharp | file | file | 34.82 | 36.3 |
-| sharp | buffer | buffer | 35.41 | 36.9 |
+| jimp | buffer | buffer | 0.82 | 1.0 |
+| squoosh-cli | file | file | 1.05 | 1.3 |
+| squoosh-lib | buffer | buffer | 1.19 | 1.5 |
+| gm | buffer | buffer | 8.47 | 10.3 |
+| gm | file | file | 8.58 | 10.5 |
+| imagemagick | file | file | 9.23 | 11.3 |
+| sharp | stream | stream | 33.23 | 40.5 |
+| sharp | file | file | 35.22 | 43.0 |
+| sharp | buffer | buffer | 35.70 | 43.5 |
-Greater libvips performance can be expected with caching enabled (default)
-and using 8+ core machines, especially those with larger L1/L2 CPU caches.
-The I/O limits of the relevant (de)compression library will generally determine maximum throughput.
+### ARM64
+* AWS EC2 eu-west-1 [c7g.xlarge](https://aws.amazon.com/ec2/instance-types/c7g/) (4x ARM Graviton3)
+* Ubuntu 22.04 (ami-02142ceceb3933ff5)
+* Node.js 16.19.0
+| Module | Input | Output | Ops/sec | Speed-up |
+| :----------------- | :----- | :----- | ------: | -------: |
+| jimp | buffer | buffer | 0.84 | 1.0 |
+| squoosh-cli | file | file | 1.12 | 1.3 |
+| squoosh-lib | buffer | buffer | 2.11 | 2.5 |
+| gm | buffer | buffer | 10.39 | 12.4 |
+| gm | file | file | 10.40 | 12.4 |
+| imagemagick | file | file | 10.73 | 12.8 |
+| sharp | stream | stream | 33.63 | 40.0 |
+| sharp | file | file | 34.91 | 41.6 |
+| sharp | buffer | buffer | 35.72 | 42.5 |
 ## Running the benchmark test

File diff suppressed because one or more lines are too long


@@ -12,6 +12,7 @@ module.exports = [
   'and',
   'any',
   'are',
+  'available',
   'based',
   'been',
   'before',
@@ -74,10 +75,12 @@ module.exports = [
   'requires',
   'requiresharp',
   'returned',
+  'run',
   'same',
   'see',
   'set',
   'sets',
+  'sharp',
   'should',
   'since',
   'site',
@@ -116,5 +119,6 @@ module.exports = [
   'will',
   'with',
   'without',
-  'you'
+  'you',
+  'your'
 ];


@@ -138,7 +138,8 @@ try {
   const libcFamily = detectLibc.familySync();
   const libcVersion = detectLibc.versionSync();
   if (libcFamily === detectLibc.GLIBC && libcVersion && minimumGlibcVersionByArch[arch]) {
-    if (semverLessThan(`${libcVersion}.0`, `${minimumGlibcVersionByArch[arch]}.0`)) {
+    const libcVersionWithoutPatch = libcVersion.split('.').slice(0, 2).join('.');
+    if (semverLessThan(`${libcVersionWithoutPatch}.0`, `${minimumGlibcVersionByArch[arch]}.0`)) {
       handleError(new Error(`Use with glibc ${libcVersion} requires manual installation of libvips >= ${minimumLibvipsVersion}`));
     }
   }
@@ -167,7 +168,7 @@ try {
   } else {
     const url = distBaseUrl + tarFilename;
     libvips.log(`Downloading ${url}`);
-    simpleGet({ url: url, agent: agent() }, function (err, response) {
+    simpleGet({ url: url, agent: agent(libvips.log) }, function (err, response) {
       if (err) {
         fail(err);
       } else if (response.statusCode === 404) {

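To see why the extra `libcVersionWithoutPatch` line allows installation on distributions that report a patch-level glibc version (the changelog above cites Fedora 38), here is a standalone sketch with illustrative values:

```javascript
// A glibc version such as '2.36.9' previously produced '2.36.9.0',
// a four-part string that is not a valid semver version, so the
// minimum-version comparison could not be performed.
const libcVersion = '2.36.9';
const libcVersionWithoutPatch = libcVersion.split('.').slice(0, 2).join('.');
console.log(`${libcVersion}.0`);             // '2.36.9.0' - not valid semver
console.log(`${libcVersionWithoutPatch}.0`); // '2.36.0'   - valid for comparison
```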

@@ -18,7 +18,7 @@ function env (key) {
   return process.env[key];
 }
-module.exports = function () {
+module.exports = function (log) {
   try {
     const proxy = new url.URL(proxies.map(env).find(is.string));
     const tunnel = proxy.protocol === 'https:'
@@ -27,6 +27,7 @@ module.exports = function () {
     const proxyAuth = proxy.username && proxy.password
       ? `${decodeURIComponent(proxy.username)}:${decodeURIComponent(proxy.password)}`
      : null;
+    log(`Via proxy ${proxy.protocol}://${proxy.hostname}:${proxy.port} ${proxyAuth ? 'with' : 'no'} credentials`);
     return tunnel({
       proxy: {
         port: Number(proxy.port),


@@ -98,7 +98,7 @@ function extractChannel (channel) {
   } else {
     throw is.invalidParameterError('channel', 'integer or one of: red, green, blue, alpha', channel);
   }
-  return this.toColourspace('b-w');
+  return this;
 }
 /**

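Because `extractChannel` now returns `this` rather than forcing the 8-bit `b-w` colourspace, callers can choose the output colourspace, and therefore bit depth, themselves. A hedged sketch of the kind of 16-bit usage this enables (the `grey16` interpretation and file names are assumptions for illustration, not taken from this changeset):

```javascript
const sharp = require('sharp');

// Keep 16-bit precision when extracting a single channel,
// instead of being coerced into the 8-bit 'b-w' colourspace.
await sharp('input-16bit.tiff')
  .extractChannel(1)
  .toColourspace('grey16')
  .toFile('green-channel-16bit.tiff');
```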

@@ -119,7 +119,7 @@ const debuglog = util.debuglog('sharp');
 * a String containing the filesystem path to an JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file.
 * JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
 * @param {Object} [options] - if present, is an Object with optional attributes.
- * @param {string} [options.failOn='warning'] - level of sensitivity to invalid images, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), highers level imply lower levels.
+ * @param {string} [options.failOn='warning'] - when to abort processing of invalid pixel data, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), highers level imply lower levels, invalid metadata will always abort.
 * @param {number|boolean} [options.limitInputPixels=268402689] - Do not process input images where the number of pixels
 * (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted.
 * An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF).
@@ -289,6 +289,8 @@ const Sharp = function (input, options) {
     gifBitdepth: 8,
     gifEffort: 7,
     gifDither: 1,
+    gifInterFrameMaxError: 0,
+    gifInterPaletteMaxError: 3,
     gifReoptimise: false,
     tiffQuality: 80,
     tiffCompression: 'jpeg',
@@ -306,6 +308,10 @@ const Sharp = function (input, options) {
     heifCompression: 'av1',
     heifEffort: 4,
     heifChromaSubsampling: '4:4:4',
+    jxlDistance: 1,
+    jxlDecodingTier: 0,
+    jxlEffort: 7,
+    jxlLossless: false,
     rawDepth: 'uchar',
     tileSize: 256,
     tileOverlap: 0,


@@ -469,7 +469,7 @@ function metadata (callback) {
   } else {
     if (this._isStreamInput()) {
       return new Promise((resolve, reject) => {
-        this.on('finish', () => {
+        const finished = () => {
           this._flattenBufferIn();
           sharp.metadata(this.options, (err, metadata) => {
             if (err) {
@@ -478,7 +478,12 @@ function metadata (callback) {
               resolve(metadata);
             }
           });
-        });
+        };
+        if (this.writableFinished) {
+          finished();
+        } else {
+          this.once('finish', finished);
+        }
       });
     } else {
       return new Promise((resolve, reject) => {

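The race this removes appears when a Stream-based input has already emitted 'finish' before `metadata()` attaches its listener. A rough illustration of that call pattern, with placeholder file names:

```javascript
const fs = require('fs');
const sharp = require('sharp');

const pipeline = sharp();
fs.createReadStream('input.jpg').pipe(pipeline);

// If the writable side has already finished by the time this runs,
// the new `writableFinished` check invokes the handler immediately
// instead of waiting for a 'finish' event that has already fired.
const metadata = await pipeline.metadata();
console.log(metadata.format, metadata.width, metadata.height);
```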

@@ -89,6 +89,7 @@ const removeVendoredLibvips = function () {
   rm(vendorPath, { recursive: true, maxRetries: 3, force: true });
 };
+/* istanbul ignore next */
 const pkgConfigPath = function () {
   if (process.platform !== 'win32') {
     const brewPkgConfigPath = spawnSync(


@@ -205,9 +205,11 @@ function affine (matrix, options) {
 /**
  * Sharpen the image.
+ *
  * When used without parameters, performs a fast, mild sharpen of the output image.
+ *
  * When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space.
- * Separate control over the level of sharpening in "flat" and "jagged" areas is available.
+ * Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available.
  *
  * See {@link https://www.libvips.org/API/current/libvips-convolution.html#vips-sharpen|libvips sharpen} operation.
  *
@@ -229,13 +231,13 @@ function affine (matrix, options) {
  * })
  * .toBuffer();
  *
- * @param {Object|number} [options] - if present, is an Object with attributes or (deprecated) a number for `options.sigma`.
- * @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`.
- * @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas.
- * @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas.
- * @param {number} [options.x1=2.0] - threshold between "flat" and "jagged"
- * @param {number} [options.y2=10.0] - maximum amount of brightening.
- * @param {number} [options.y3=20.0] - maximum amount of darkening.
+ * @param {Object|number} [options] - if present, is an Object with attributes
+ * @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10000
+ * @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas, between 0 and 1000000
+ * @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas, between 0 and 1000000
+ * @param {number} [options.x1=2.0] - threshold between "flat" and "jagged", between 0 and 1000000
+ * @param {number} [options.y2=10.0] - maximum amount of brightening, between 0 and 1000000
+ * @param {number} [options.y3=20.0] - maximum amount of darkening, between 0 and 1000000
 * @param {number} [flat] - (deprecated) see `options.m1`.
 * @param {number} [jagged] - (deprecated) see `options.m2`.
 * @returns {Sharp}
@@ -268,44 +270,44 @@ function sharpen (options, flat, jagged) {
       }
     }
   } else if (is.plainObject(options)) {
-    if (is.number(options.sigma) && is.inRange(options.sigma, 0.01, 10000)) {
+    if (is.number(options.sigma) && is.inRange(options.sigma, 0.000001, 10000)) {
       this.options.sharpenSigma = options.sigma;
     } else {
-      throw is.invalidParameterError('options.sigma', 'number between 0.01 and 10000', options.sigma);
+      throw is.invalidParameterError('options.sigma', 'number between 0.000001 and 10000', options.sigma);
     }
     if (is.defined(options.m1)) {
-      if (is.number(options.m1) && is.inRange(options.m1, 0, 10000)) {
+      if (is.number(options.m1) && is.inRange(options.m1, 0, 1000000)) {
         this.options.sharpenM1 = options.m1;
       } else {
-        throw is.invalidParameterError('options.m1', 'number between 0 and 10000', options.m1);
+        throw is.invalidParameterError('options.m1', 'number between 0 and 1000000', options.m1);
       }
     }
     if (is.defined(options.m2)) {
-      if (is.number(options.m2) && is.inRange(options.m2, 0, 10000)) {
+      if (is.number(options.m2) && is.inRange(options.m2, 0, 1000000)) {
        this.options.sharpenM2 = options.m2;
       } else {
-        throw is.invalidParameterError('options.m2', 'number between 0 and 10000', options.m2);
+        throw is.invalidParameterError('options.m2', 'number between 0 and 1000000', options.m2);
      }
    }
    if (is.defined(options.x1)) {
-      if (is.number(options.x1) && is.inRange(options.x1, 0, 10000)) {
+      if (is.number(options.x1) && is.inRange(options.x1, 0, 1000000)) {
        this.options.sharpenX1 = options.x1;
      } else {
-        throw is.invalidParameterError('options.x1', 'number between 0 and 10000', options.x1);
+        throw is.invalidParameterError('options.x1', 'number between 0 and 1000000', options.x1);
      }
    }
    if (is.defined(options.y2)) {
-      if (is.number(options.y2) && is.inRange(options.y2, 0, 10000)) {
+      if (is.number(options.y2) && is.inRange(options.y2, 0, 1000000)) {
        this.options.sharpenY2 = options.y2;
      } else {
-        throw is.invalidParameterError('options.y2', 'number between 0 and 10000', options.y2);
+        throw is.invalidParameterError('options.y2', 'number between 0 and 1000000', options.y2);
      }
    }
    if (is.defined(options.y3)) {
-      if (is.number(options.y3) && is.inRange(options.y3, 0, 10000)) {
+      if (is.number(options.y3) && is.inRange(options.y3, 0, 1000000)) {
        this.options.sharpenY3 = options.y3;
      } else {
-        throw is.invalidParameterError('options.y3', 'number between 0 and 10000', options.y3);
+        throw is.invalidParameterError('options.y3', 'number between 0 and 1000000', options.y3);
      }
    }
   } else {


@@ -22,10 +22,13 @@ const formats = new Map([
   ['jp2', 'jp2'],
   ['jpx', 'jp2'],
   ['j2k', 'jp2'],
-  ['j2c', 'jp2']
+  ['j2c', 'jp2'],
+  ['jxl', 'jxl']
 ]);
-const errJp2Save = new Error('JP2 output requires libvips with support for OpenJPEG');
+const jp2Regex = /\.jp[2x]|j2[kc]$/i;
+const errJp2Save = () => new Error('JP2 output requires libvips with support for OpenJPEG');
 const bitdepthFromColourCount = (colours) => 1 << 31 - Math.clz32(Math.ceil(Math.log2(colours)));
@@ -68,6 +71,8 @@ function toFile (fileOut, callback) {
     err = new Error('Missing output file path');
   } else if (is.string(this.options.input.file) && path.resolve(this.options.input.file) === path.resolve(fileOut)) {
     err = new Error('Cannot use same file for input and output');
+  } else if (jp2Regex.test(fileOut) && !this.constructor.format.jp2k.output.file) {
+    err = errJp2Save();
   }
   if (err) {
     if (is.fn(callback)) {
@@ -547,6 +552,12 @@ function webp (options) {
 * .gif({ dither: 0 })
 * .toBuffer();
 *
+ * @example
+ * // Lossy file size reduction of animated GIF
+ * await sharp('in.gif', { animated: true })
+ *   .gif({ interFrameMaxError: 8 })
+ *   .toFile('optim.gif');
+ *
 * @param {Object} [options] - output options
 * @param {boolean} [options.reoptimise=false] - always generate new palettes (slow), re-use existing by default
 * @param {boolean} [options.reoptimize=false] - alternative spelling of `options.reoptimise`
@@ -554,6 +565,8 @@ function webp (options) {
 * @param {number} [options.colors=256] - alternative spelling of `options.colours`
 * @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 10 (slowest)
 * @param {number} [options.dither=1.0] - level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most)
+ * @param {number} [options.interFrameMaxError=0] - maximum inter-frame error for transparency, between 0 (lossless) and 32
+ * @param {number} [options.interPaletteMaxError=3] - maximum inter-palette error for palette reuse, between 0 and 256
 * @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation
 * @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds)
 * @param {boolean} [options.force=true] - force GIF output, otherwise attempt to use input format
@@ -589,6 +602,20 @@ function gif (options) {
        throw is.invalidParameterError('dither', 'number between 0.0 and 1.0', options.dither);
      }
    }
+    if (is.defined(options.interFrameMaxError)) {
+      if (is.number(options.interFrameMaxError) && is.inRange(options.interFrameMaxError, 0, 32)) {
+        this.options.gifInterFrameMaxError = options.interFrameMaxError;
+      } else {
+        throw is.invalidParameterError('interFrameMaxError', 'number between 0.0 and 32.0', options.interFrameMaxError);
+      }
+    }
+    if (is.defined(options.interPaletteMaxError)) {
+      if (is.number(options.interPaletteMaxError) && is.inRange(options.interPaletteMaxError, 0, 256)) {
+        this.options.gifInterPaletteMaxError = options.interPaletteMaxError;
+      } else {
+        throw is.invalidParameterError('interPaletteMaxError', 'number between 0.0 and 256.0', options.interPaletteMaxError);
+      }
+    }
   }
   trySetAnimationOptions(options, this.options);
   return this._updateFormatOut('gif', options);
@@ -630,7 +657,7 @@ function gif (options) {
 /* istanbul ignore next */
 function jp2 (options) {
   if (!this.constructor.format.jp2k.output.buffer) {
-    throw errJp2Save;
+    throw errJp2Save();
   }
   if (is.object(options)) {
     if (is.defined(options.quality)) {
@@ -912,6 +939,71 @@ function heif (options) {
   return this._updateFormatOut('heif', options);
 }
+/**
+ * Use these JPEG-XL (JXL) options for output image.
+ *
+ * This feature is experimental, please do not use in production systems.
+ *
+ * Requires libvips compiled with support for libjxl.
+ * The prebuilt binaries do not include this - see
+ * {@link https://sharp.pixelplumbing.com/install#custom-libvips installing a custom libvips}.
+ *
+ * Image metadata (EXIF, XMP) is unsupported.
+ *
+ * @since 0.31.3
+ *
+ * @param {Object} [options] - output options
+ * @param {number} [options.distance=1.0] - maximum encoding error, between 0 (highest quality) and 15 (lowest quality)
+ * @param {number} [options.quality] - calculate `distance` based on JPEG-like quality, between 1 and 100, overrides distance if specified
+ * @param {number} [options.decodingTier=0] - target decode speed tier, between 0 (highest quality) and 4 (lowest quality)
+ * @param {boolean} [options.lossless=false] - use lossless compression
+ * @param {number} [options.effort=7] - CPU effort, between 3 (fastest) and 9 (slowest)
+ * @returns {Sharp}
+ * @throws {Error} Invalid options
+ */
+function jxl (options) {
+  if (is.object(options)) {
+    if (is.defined(options.quality)) {
+      if (is.integer(options.quality) && is.inRange(options.quality, 1, 100)) {
+        // https://github.com/libjxl/libjxl/blob/0aeea7f180bafd6893c1db8072dcb67d2aa5b03d/tools/cjxl_main.cc#L640-L644
+        this.options.jxlDistance = options.quality >= 30
+          ? 0.1 + (100 - options.quality) * 0.09
+          : 53 / 3000 * options.quality * options.quality - 23 / 20 * options.quality + 25;
+      } else {
+        throw is.invalidParameterError('quality', 'integer between 1 and 100', options.quality);
+      }
+    } else if (is.defined(options.distance)) {
+      if (is.number(options.distance) && is.inRange(options.distance, 0, 15)) {
+        this.options.jxlDistance = options.distance;
+      } else {
+        throw is.invalidParameterError('distance', 'number between 0.0 and 15.0', options.distance);
+      }
+    }
+    if (is.defined(options.decodingTier)) {
+      if (is.integer(options.decodingTier) && is.inRange(options.decodingTier, 0, 4)) {
+        this.options.jxlDecodingTier = options.decodingTier;
+      } else {
+        throw is.invalidParameterError('decodingTier', 'integer between 0 and 4', options.decodingTier);
+      }
+    }
+    if (is.defined(options.lossless)) {
+      if (is.bool(options.lossless)) {
+        this.options.jxlLossless = options.lossless;
+      } else {
+        throw is.invalidParameterError('lossless', 'boolean', options.lossless);
+      }
+    }
+    if (is.defined(options.effort)) {
+      if (is.integer(options.effort) && is.inRange(options.effort, 3, 9)) {
+        this.options.jxlEffort = options.effort;
+      } else {
+        throw is.invalidParameterError('effort', 'integer between 3 and 9', options.effort);
+      }
+    }
+  }
+  return this._updateFormatOut('jxl', options);
+}
 /**
 * Force output to be raw, uncompressed pixel data.
 * Pixel ordering is left-to-right, top-to-bottom, without padding.
@@ -1205,7 +1297,7 @@ function _pipeline (callback) {
        this.push(data);
      }
      this.push(null);
-      this.emit('close');
+      this.on('end', () => this.emit('close'));
    });
  });
  if (this.streamInFinished) {
@@ -1221,7 +1313,7 @@ function _pipeline (callback) {
      this.push(data);
    }
    this.push(null);
-    this.emit('close');
+    this.on('end', () => this.emit('close'));
  });
 }
 return this;
@@ -1282,6 +1374,7 @@ module.exports = function (Sharp) {
    tiff,
    avif,
    heif,
+    jxl,
    gif,
    raw,
    tile,


@@ -111,7 +111,9 @@ function isResizeExpected (options) {
* *
* Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property. * Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property.
* *
* When using a `fit` of `cover` or `contain`, the default **position** is `centre`. Other options are: * <img alt="Examples of various values for the fit property when resizing" width="100%" style="aspect-ratio: 998/243" src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/api-resize-fit.png">
*
* When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are:
* - `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`. * - `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`.
* - `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`. * - `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`.
* - `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy. * - `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy.
@@ -217,8 +219,8 @@ function isResizeExpected (options) {
* @param {number} [width] - pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. * @param {number} [width] - pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height.
* @param {number} [height] - pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. * @param {number} [height] - pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width.
* @param {Object} [options] * @param {Object} [options]
* @param {String} [options.width] - alternative means of specifying `width`. If both are present this take priority. * @param {String} [options.width] - alternative means of specifying `width`. If both are present this takes priority.
* @param {String} [options.height] - alternative means of specifying `height`. If both are present this take priority. * @param {String} [options.height] - alternative means of specifying `height`. If both are present this takes priority.
* @param {String} [options.fit='cover'] - how the image should be resized to fit both provided dimensions, one of `cover`, `contain`, `fill`, `inside` or `outside`. * @param {String} [options.fit='cover'] - how the image should be resized to fit both provided dimensions, one of `cover`, `contain`, `fill`, `inside` or `outside`.
* @param {String} [options.position='centre'] - position, gravity or strategy to use when `fit` is `cover` or `contain`. * @param {String} [options.position='centre'] - position, gravity or strategy to use when `fit` is `cover` or `contain`.
* @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. * @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency.
@@ -428,7 +430,7 @@ function extend (extend) {
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
*/ */
function extract (options) { function extract (options) {
const suffix = isResizeExpected(this.options) || isRotationExpected(this.options) ? 'Post' : 'Pre'; const suffix = isResizeExpected(this.options) || this.options.widthPre !== -1 ? 'Post' : 'Pre';
if (this.options[`width${suffix}`] !== -1) { if (this.options[`width${suffix}`] !== -1) {
this.options.debuglog('ignoring previous extract options'); this.options.debuglog('ignoring previous extract options');
} }
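In sharp, calling extract before resize operates on the original image, while calling it after resize operates on the resized result, which is what the Pre/Post suffix above selects. A brief sketch (file names are placeholders):

    const sharp = require('sharp');

    // Extract a region of the original image, then resize that region
    sharp('input.jpg')
      .extract({ left: 10, top: 10, width: 400, height: 300 })
      .resize(200)
      .toFile('pre.jpg');

    // Resize first, then extract a region of the resized image
    sharp('input.jpg')
      .resize(800)
      .extract({ left: 0, top: 0, width: 200, height: 200 })
      .toFile('post.jpg');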
@@ -511,16 +513,13 @@ function trim (trim) {
} }
} else if (is.object(trim)) { } else if (is.object(trim)) {
this._setBackgroundColourOption('trimBackground', trim.background); this._setBackgroundColourOption('trimBackground', trim.background);
if (!is.defined(trim.threshold)) { if (!is.defined(trim.threshold)) {
this.options.trimThreshold = 10; this.options.trimThreshold = 10;
} else if (is.number(trim.threshold)) { } else if (is.number(trim.threshold) && trim.threshold >= 0) {
if (trim.threshold >= 0) {
this.options.trimThreshold = trim.threshold; this.options.trimThreshold = trim.threshold;
} else { } else {
throw is.invalidParameterError('threshold', 'positive number', trim); throw is.invalidParameterError('threshold', 'positive number', trim);
} }
}
} else { } else {
throw is.invalidParameterError('trim', 'string, number or object', trim); throw is.invalidParameterError('trim', 'string, number or object', trim);
} }
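A short sketch of the object form accepted by trim, using the background and threshold properties validated above (file names are placeholders):

    const sharp = require('sharp');

    // Trim edges whose pixels are within the threshold of the given background colour
    sharp('bordered.png')
      .trim({ background: '#ffffff', threshold: 10 })
      .toFile('trimmed.png');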

View File

@@ -1,7 +1,7 @@
{ {
"name": "sharp", "name": "sharp",
"description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images", "description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images",
"version": "0.31.0", "version": "0.31.3",
"author": "Lovell Fuller <npm@lovell.info>", "author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://github.com/lovell/sharp", "homepage": "https://github.com/lovell/sharp",
"contributors": [ "contributors": [
@@ -92,9 +92,8 @@
"clean": "rm -rf node_modules/ build/ vendor/ .nyc_output/ coverage/ test/fixtures/output.*", "clean": "rm -rf node_modules/ build/ vendor/ .nyc_output/ coverage/ test/fixtures/output.*",
"test": "npm run test-lint && npm run test-unit && npm run test-licensing", "test": "npm run test-lint && npm run test-unit && npm run test-licensing",
"test-lint": "semistandard && cpplint", "test-lint": "semistandard && cpplint",
"test-unit": "nyc --reporter=lcov --branches=99 mocha --slow=1000 --timeout=60000 ./test/unit/*.js", "test-unit": "nyc --reporter=lcov --reporter=text --check-coverage --branches=100 mocha",
"test-licensing": "license-checker --production --summary --onlyAllow=\"Apache-2.0;BSD;ISC;MIT\"", "test-licensing": "license-checker --production --summary --onlyAllow=\"Apache-2.0;BSD;ISC;MIT\"",
"test-coverage": "./test/coverage/report.sh",
"test-leak": "./test/leak/leak.sh", "test-leak": "./test/leak/leak.sh",
"docs-build": "documentation lint lib && node docs/build && node docs/search-index/build", "docs-build": "documentation lint lib && node docs/build && node docs/search-index/build",
"docs-serve": "cd docs && npx serve", "docs-serve": "cd docs && npx serve",
@@ -134,7 +133,7 @@
"detect-libc": "^2.0.1", "detect-libc": "^2.0.1",
"node-addon-api": "^5.0.0", "node-addon-api": "^5.0.0",
"prebuild-install": "^7.1.1", "prebuild-install": "^7.1.1",
"semver": "^7.3.7", "semver": "^7.3.8",
"simple-get": "^4.0.1", "simple-get": "^4.0.1",
"tar-fs": "^2.1.1", "tar-fs": "^2.1.1",
"tunnel-agent": "^0.6.0" "tunnel-agent": "^0.6.0"
@@ -142,13 +141,13 @@
"devDependencies": { "devDependencies": {
"async": "^3.2.4", "async": "^3.2.4",
"cc": "^3.0.1", "cc": "^3.0.1",
"documentation": "^14.0.0", "documentation": "^14.0.1",
"exif-reader": "^1.0.3", "exif-reader": "^1.0.3",
"extract-zip": "^2.0.1", "extract-zip": "^2.0.1",
"icc": "^2.0.0", "icc": "^2.0.0",
"license-checker": "^25.0.1", "license-checker": "^25.0.1",
"mocha": "^10.0.0", "mocha": "^10.2.0",
"mock-fs": "^5.1.4", "mock-fs": "^5.2.0",
"nyc": "^15.1.0", "nyc": "^15.1.0",
"prebuild": "^11.0.4", "prebuild": "^11.0.4",
"rimraf": "^3.0.2", "rimraf": "^3.0.2",
@@ -156,19 +155,19 @@
}, },
"license": "Apache-2.0", "license": "Apache-2.0",
"config": { "config": {
"libvips": "8.13.1", "libvips": "8.13.3",
"integrity": { "integrity": {
"darwin-arm64v8": "sha512-JdpGTx67RDbvRkg3ljFvTzqoq+oBXmMdDFEp0expDYXmP5HLH+GCkikmsROlGltgfKE2KqL/qwpxTEhIwMK/3A==", "darwin-arm64v8": "sha512-xFgYt7CtQSZcWoyUdzPTDNHbioZIrZSEU+gkMxzH4Cgjhi4/N49UsonnIZhKQoTBGloAqEexHeMx4rYTQ2Kgvw==",
"darwin-x64": "sha512-0Oh4/hEDnzV+X8MiiyUQ4G/Zh/MHw9rKstfuX0P1czgaxS2hX8Pxdbzdk1oqwTOEYVEGO/hMm9ItCVZ3RVPPaA==", "darwin-x64": "sha512-6SivWKzu15aUMMohe0wg7sNYMPETVnOe40BuWsnKOgzl3o5FpQqNSgs+68Mi8Za3Qti9/DaR+H/fyD0x48Af2w==",
"linux-arm64v8": "sha512-9pSlPzEojt6ue5vXfASNMhQO1YS1p4i4Wydu+bzOfMtIPSBRXbu/+y8WELbbo03Ts7pftm9KtrMHitCVdy5EXw==", "linux-arm64v8": "sha512-b+iI9V/ehgDabXYRQcvqa5CEysh+1FQsgFmYc358StCrJCDahwNmsQdsiH1GOVd5WaWh5wHUGByPwMmFOO16Aw==",
"linux-armv6": "sha512-sv2FqS/ggpQly7h5/+nh8txQDulolE5ptaE90PO7iwfTont8N42pudeqootWKsuf0fRmkW4M92184VfVVYCvGw==", "linux-armv6": "sha512-zRP2F+EiustLE4bXSH8AHCxwfemh9d+QuvmPjira/HL6uJOUuA7SyQgVV1TPwTQle2ioCNnKPm7FEB/MAiT+ug==",
"linux-armv7": "sha512-LmQIB8FDfasK6BsFhnE7ZI3LMlxh/rF5tZRNQ/uoTbF2xrtWQqqgiZgCifJByiEM+1tR7RxwNdnjxZhWvM9WmQ==", "linux-armv7": "sha512-6OCChowE5lBXXXAZrnGdA9dVktg7UdODEBpE5qTroiAJYZv4yXRMgyDFYajok7du2NTgoklhxGk8d9+4vGv5hg==",
"linux-x64": "sha512-JBRf8WBnlVw/K1jpSvmeZpnGZGjeqhG2NDEiQV/hUze3zgDGwDza4oiworaQExQmKcDrc2LJKF14Nsz1qQSNJw==", "linux-x64": "sha512-OTmlmP2r8ozGKdB96X+K5oQE1ojVZanqLqqKlwDpEnfixyIaDGYbVzcjWBNGU3ai/26bvkaCkjynnc2ecYcsuA==",
"linuxmusl-arm64v8": "sha512-yzUQO5isDwsRpEUxbMXBeWp0sKhWghebrSK46SUF5mvB/kq6hZ7JbRuJ2aZjE84K/HUTyuCc0kE+M3m8naOs+g==", "linuxmusl-arm64v8": "sha512-Qh5Wi+bkKTohFYHzSPssfjMhIkD6z6EHbVmnwmWSsgY9zsUBStFp6+mKcNTQfP5YM5Mz06vJOkLHX2OzEr5TzA==",
"linuxmusl-x64": "sha512-H3Vz1QaaZ6X5iEbfPST7TPFwDO01tI8dk1osLm6l4a17BWCaOMaBQlqxgTgYrtd09JJ9CvGoq5fo5j5TPxUc4Q==", "linuxmusl-x64": "sha512-DwB4Fs3+ISw9etaLCANkueZDdk758iOS+wNp4TKZkHdq0al6B/3Pk7OHLR8a9E3H6wYDD328u++dcJzip5tacA==",
"win32-arm64v8": "sha512-b5Ver+uwOJhdOGqvZVM+qF2KLKcowcac/wKK5Fg0czqlSMqP/KxDF2kxw2eKXUJNgfqe4eDH1QG/yTg2pQSetQ==", "win32-arm64v8": "sha512-96r3W+O4BtX602B1MtxU5Ru4lKzRRTZqM4OQEBJ//TNL3fiCZdd9agD+RQBjaeR4KFIyBSt3F7IE425ZWmxz+w==",
"win32-ia32": "sha512-h/SJ/Yfn0ce9H70vt1wS8rZ4PfHnguCCTsOGik7e6O/e2AlBQOM0mKsPIB9jSOquoCP8rP0qF6AOPOjXKnCk+w==", "win32-ia32": "sha512-qfN1MsfQGek1QQd1UNW7JT+5K5Ne1suFQ2GpgpYm3JLSpIve/tz2vOGEGzvTVssOBADJvAkTDFt+yIi3PgU9pA==",
"win32-x64": "sha512-p9qpdWdhZooPteib92Kk+qF1vvzcScxvOwdIP8muhgo/A8uDI4/mqXCpEbMBw6vjETKlS3qo2JUbVF6+0/lyWQ==" "win32-x64": "sha512-eb3aAmjbVVBVRbiYgebQwoxkAt69WI8nwmKlilSQ3kWqoc0pXfIe322rF2UR8ebbISCGvYRUfzD2r1k92RXISQ=="
}, },
"runtime": "napi", "runtime": "napi",
"target": 7 "target": 7

View File

@@ -76,6 +76,14 @@ namespace sharp {
} }
return vector; return vector;
} }
Napi::Buffer<char> NewOrCopyBuffer(Napi::Env env, char* data, size_t len) {
try {
return Napi::Buffer<char>::New(env, data, len, FreeCallback);
} catch (Napi::Error const &err) {}
Napi::Buffer<char> buf = Napi::Buffer<char>::Copy(env, data, len);
FreeCallback(nullptr, data);
return buf;
}
// Create an InputDescriptor instance from a Napi::Object describing an input image // Create an InputDescriptor instance from a Napi::Object describing an input image
InputDescriptor* CreateInputDescriptor(Napi::Object input) { InputDescriptor* CreateInputDescriptor(Napi::Object input) {
@@ -207,6 +215,9 @@ namespace sharp {
bool IsAvif(std::string const &str) { bool IsAvif(std::string const &str) {
return EndsWith(str, ".avif") || EndsWith(str, ".AVIF"); return EndsWith(str, ".avif") || EndsWith(str, ".AVIF");
} }
bool IsJxl(std::string const &str) {
return EndsWith(str, ".jxl") || EndsWith(str, ".JXL");
}
bool IsDz(std::string const &str) { bool IsDz(std::string const &str) {
return EndsWith(str, ".dzi") || EndsWith(str, ".DZI"); return EndsWith(str, ".dzi") || EndsWith(str, ".DZI");
} }
@@ -237,6 +248,7 @@ namespace sharp {
case ImageType::PPM: id = "ppm"; break; case ImageType::PPM: id = "ppm"; break;
case ImageType::FITS: id = "fits"; break; case ImageType::FITS: id = "fits"; break;
case ImageType::EXR: id = "exr"; break; case ImageType::EXR: id = "exr"; break;
case ImageType::JXL: id = "jxl"; break;
case ImageType::VIPS: id = "vips"; break; case ImageType::VIPS: id = "vips"; break;
case ImageType::RAW: id = "raw"; break; case ImageType::RAW: id = "raw"; break;
case ImageType::UNKNOWN: id = "unknown"; break; case ImageType::UNKNOWN: id = "unknown"; break;
@@ -281,6 +293,8 @@ namespace sharp {
{ "VipsForeignLoadPpmFile", ImageType::PPM }, { "VipsForeignLoadPpmFile", ImageType::PPM },
{ "VipsForeignLoadFitsFile", ImageType::FITS }, { "VipsForeignLoadFitsFile", ImageType::FITS },
{ "VipsForeignLoadOpenexr", ImageType::EXR }, { "VipsForeignLoadOpenexr", ImageType::EXR },
{ "VipsForeignLoadJxlFile", ImageType::JXL },
{ "VipsForeignLoadJxlBuffer", ImageType::JXL },
{ "VipsForeignLoadVips", ImageType::VIPS }, { "VipsForeignLoadVips", ImageType::VIPS },
{ "VipsForeignLoadVipsFile", ImageType::VIPS }, { "VipsForeignLoadVipsFile", ImageType::VIPS },
{ "VipsForeignLoadRaw", ImageType::RAW } { "VipsForeignLoadRaw", ImageType::RAW }
@@ -507,9 +521,10 @@ namespace sharp {
} }
} }
} }
// Limit input images to a given number of pixels, where pixels = width * height // Limit input images to a given number of pixels, where pixels = width * height
if (descriptor->limitInputPixels > 0 && if (descriptor->limitInputPixels > 0 &&
static_cast<uint64_t>(image.width() * image.height()) > descriptor->limitInputPixels) { static_cast<uint64_t>(image.width()) * image.height() > descriptor->limitInputPixels) {
throw vips::VError("Input image exceeds pixel limit"); throw vips::VError("Input image exceeds pixel limit");
} }
return std::make_tuple(image, imageType); return std::make_tuple(image, imageType);
@@ -556,6 +571,7 @@ namespace sharp {
VImage RemoveExifOrientation(VImage image) { VImage RemoveExifOrientation(VImage image) {
VImage copy = image.copy(); VImage copy = image.copy();
copy.remove(VIPS_META_ORIENTATION); copy.remove(VIPS_META_ORIENTATION);
copy.remove("exif-ifd0-Orientation");
return copy; return copy;
} }
@@ -911,7 +927,7 @@ namespace sharp {
// Add non-transparent alpha channel, if required // Add non-transparent alpha channel, if required
if (colour[3] < 255.0 && !HasAlpha(image)) { if (colour[3] < 255.0 && !HasAlpha(image)) {
image = image.bandjoin( image = image.bandjoin(
VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier)); VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier).cast(image.format()));
} }
return std::make_tuple(image, alphaColour); return std::make_tuple(image, alphaColour);
} }

View File

@@ -26,8 +26,8 @@
#if (VIPS_MAJOR_VERSION < 8) || \ #if (VIPS_MAJOR_VERSION < 8) || \
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 13) || \ (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 13) || \
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 13 && VIPS_MICRO_VERSION < 1) (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 13 && VIPS_MICRO_VERSION < 3)
#error "libvips version 8.13.1+ is required - please see https://sharp.pixelplumbing.com/install" #error "libvips version 8.13.3+ is required - please see https://sharp.pixelplumbing.com/install"
#endif #endif
#if ((!defined(__clang__)) && defined(__GNUC__) && (__GNUC__ < 4 || (__GNUC__ == 4 && __GNUC_MINOR__ < 6))) #if ((!defined(__clang__)) && defined(__GNUC__) && (__GNUC__ < 4 || (__GNUC__ == 4 && __GNUC_MINOR__ < 6)))
@@ -133,6 +133,7 @@ namespace sharp {
return static_cast<T>( return static_cast<T>(
vips_enum_from_nick(nullptr, type, AttrAsStr(obj, attr).data())); vips_enum_from_nick(nullptr, type, AttrAsStr(obj, attr).data()));
} }
Napi::Buffer<char> NewOrCopyBuffer(Napi::Env env, char* data, size_t len);
// Create an InputDescriptor instance from a Napi::Object describing an input image // Create an InputDescriptor instance from a Napi::Object describing an input image
InputDescriptor* CreateInputDescriptor(Napi::Object input); InputDescriptor* CreateInputDescriptor(Napi::Object input);
@@ -152,6 +153,7 @@ namespace sharp {
PPM, PPM,
FITS, FITS,
EXR, EXR,
JXL,
VIPS, VIPS,
RAW, RAW,
UNKNOWN, UNKNOWN,
@@ -182,6 +184,7 @@ namespace sharp {
bool IsHeic(std::string const &str); bool IsHeic(std::string const &str);
bool IsHeif(std::string const &str); bool IsHeif(std::string const &str);
bool IsAvif(std::string const &str); bool IsAvif(std::string const &str);
bool IsJxl(std::string const &str);
bool IsDz(std::string const &str); bool IsDz(std::string const &str);
bool IsDzZip(std::string const &str); bool IsDzZip(std::string const &str);
bool IsV(std::string const &str); bool IsV(std::string const &str);

View File

@@ -235,20 +235,20 @@ class MetadataWorker : public Napi::AsyncWorker {
info.Set("orientation", baton->orientation); info.Set("orientation", baton->orientation);
} }
if (baton->exifLength > 0) { if (baton->exifLength > 0) {
info.Set("exif", Napi::Buffer<char>::New(env, baton->exif, baton->exifLength, sharp::FreeCallback)); info.Set("exif", sharp::NewOrCopyBuffer(env, baton->exif, baton->exifLength));
} }
if (baton->iccLength > 0) { if (baton->iccLength > 0) {
info.Set("icc", Napi::Buffer<char>::New(env, baton->icc, baton->iccLength, sharp::FreeCallback)); info.Set("icc", sharp::NewOrCopyBuffer(env, baton->icc, baton->iccLength));
} }
if (baton->iptcLength > 0) { if (baton->iptcLength > 0) {
info.Set("iptc", Napi::Buffer<char>::New(env, baton->iptc, baton->iptcLength, sharp::FreeCallback)); info.Set("iptc", sharp::NewOrCopyBuffer(env, baton->iptc, baton->iptcLength));
} }
if (baton->xmpLength > 0) { if (baton->xmpLength > 0) {
info.Set("xmp", Napi::Buffer<char>::New(env, baton->xmp, baton->xmpLength, sharp::FreeCallback)); info.Set("xmp", sharp::NewOrCopyBuffer(env, baton->xmp, baton->xmpLength));
} }
if (baton->tifftagPhotoshopLength > 0) { if (baton->tifftagPhotoshopLength > 0) {
info.Set("tifftagPhotoshop", info.Set("tifftagPhotoshop",
Napi::Buffer<char>::New(env, baton->tifftagPhotoshop, baton->tifftagPhotoshopLength, sharp::FreeCallback)); sharp::NewOrCopyBuffer(env, baton->tifftagPhotoshop, baton->tifftagPhotoshopLength));
} }
Callback().MakeCallback(Receiver().Value(), { env.Null(), info }); Callback().MakeCallback(Receiver().Value(), { env.Null(), info });
} else { } else {
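On the JavaScript side these fields surface as Node.js Buffers on the metadata result; a small sketch of consuming them (exif-reader is the devDependency listed earlier, the file name is a placeholder):

    const sharp = require('sharp');
    const exifReader = require('exif-reader');

    sharp('photo.jpg').metadata().then(({ exif, icc }) => {
      if (exif) {
        console.log(exifReader(exif)); // parsed EXIF tags
      }
      console.log(icc ? icc.length : 0); // raw ICC profile bytes, if present
    });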

View File

@@ -289,17 +289,20 @@ namespace sharp {
background = image.extract_area(0, 0, 1, 1)(0, 0); background = image.extract_area(0, 0, 1, 1)(0, 0);
multiplier = 1.0; multiplier = 1.0;
} }
if (background.size() == 4) { if (HasAlpha(image) && background.size() == 4) {
// Just discard the alpha because flattening the background colour with // Just discard the alpha because flattening the background colour with
// itself (effectively what find_trim() does) gives the same result // itself (effectively what find_trim() does) gives the same result
backgroundAlpha[0] = background[3] * multiplier; backgroundAlpha[0] = background[3] * multiplier;
} }
if (image.bands() > 2) {
background = { background = {
background[0] * multiplier, background[0] * multiplier,
background[1] * multiplier, background[1] * multiplier,
background[2] * multiplier background[2] * multiplier
}; };
} else {
background[0] = background[0] * multiplier;
}
int left, top, width, height; int left, top, width, height;
left = image.find_trim(&top, &width, &height, VImage::option() left = image.find_trim(&top, &width, &height, VImage::option()
->set("background", background) ->set("background", background)
@@ -342,9 +345,9 @@ namespace sharp {
if (HasAlpha(image) && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) { if (HasAlpha(image) && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
// Separate alpha channel // Separate alpha channel
VImage alpha = image[bands - 1]; VImage alpha = image[bands - 1];
return RemoveAlpha(image).linear(a, b).bandjoin(alpha); return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", TRUE)).bandjoin(alpha);
} else { } else {
return image.linear(a, b); return image.linear(a, b, VImage::option()->set("uchar", TRUE));
} }
} }
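With the uchar cast above, a linear brightness/contrast adjustment on 8-bit input now yields integral pixel values; a brief sketch (file names are placeholders):

    const sharp = require('sharp');

    // output = input * 1.1 - 20, cast back to 8-bit unsigned values
    sharp('input.jpg')
      .linear(1.1, -20)
      .toFile('adjusted.jpg');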

View File

@@ -81,32 +81,47 @@ class PipelineWorker : public Napi::AsyncWorker {
int pageHeight = sharp::GetPageHeight(image); int pageHeight = sharp::GetPageHeight(image);
// Calculate angle of rotation // Calculate angle of rotation
VipsAngle rotation; VipsAngle rotation = VIPS_ANGLE_D0;
bool flip = FALSE; VipsAngle autoRotation = VIPS_ANGLE_D0;
bool flop = FALSE; bool autoFlip = FALSE;
bool autoFlop = FALSE;
if (baton->useExifOrientation) { if (baton->useExifOrientation) {
// Rotate and flip image according to Exif orientation // Rotate and flip image according to Exif orientation
std::tie(rotation, flip, flop) = CalculateExifRotationAndFlip(sharp::ExifOrientation(image)); std::tie(autoRotation, autoFlip, autoFlop) = CalculateExifRotationAndFlip(sharp::ExifOrientation(image));
image = sharp::RemoveExifOrientation(image);
} else { } else {
rotation = CalculateAngleRotation(baton->angle); rotation = CalculateAngleRotation(baton->angle);
} }
// Rotate pre-extract // Rotate pre-extract
if (baton->rotateBeforePreExtract) { bool const shouldRotateBefore = baton->rotateBeforePreExtract &&
(rotation != VIPS_ANGLE_D0 || autoRotation != VIPS_ANGLE_D0 ||
autoFlip || baton->flip || autoFlop || baton->flop ||
baton->rotationAngle != 0.0);
if (shouldRotateBefore) {
if (autoRotation != VIPS_ANGLE_D0) {
image = image.rot(autoRotation);
autoRotation = VIPS_ANGLE_D0;
}
if (autoFlip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
autoFlip = FALSE;
} else if (baton->flip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
baton->flip = FALSE;
}
if (autoFlop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
autoFlop = FALSE;
} else if (baton->flop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
baton->flop = FALSE;
}
if (rotation != VIPS_ANGLE_D0) { if (rotation != VIPS_ANGLE_D0) {
image = image.rot(rotation); image = image.rot(rotation);
rotation = VIPS_ANGLE_D0;
} }
if (flip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
}
if (flop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
}
if (rotation != VIPS_ANGLE_D0 || flip || flop) {
image = sharp::RemoveExifOrientation(image);
}
flop = FALSE;
flip = FALSE;
if (baton->rotationAngle != 0.0) { if (baton->rotationAngle != 0.0) {
MultiPageUnsupported(nPages, "Rotate"); MultiPageUnsupported(nPages, "Rotate");
std::vector<double> background; std::vector<double> background;
@@ -147,7 +162,9 @@ class PipelineWorker : public Napi::AsyncWorker {
int targetResizeHeight = baton->height; int targetResizeHeight = baton->height;
// Swap input output width and height when rotating by 90 or 270 degrees // Swap input output width and height when rotating by 90 or 270 degrees
bool swap = !baton->rotateBeforePreExtract && (rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270); bool swap = !baton->rotateBeforePreExtract &&
(rotation == VIPS_ANGLE_D90 || rotation == VIPS_ANGLE_D270 ||
autoRotation == VIPS_ANGLE_D90 || autoRotation == VIPS_ANGLE_D270);
// Shrink to pageHeight, so we work for multi-page images // Shrink to pageHeight, so we work for multi-page images
std::tie(hshrink, vshrink) = sharp::ResolveShrink( std::tie(hshrink, vshrink) = sharp::ResolveShrink(
@@ -167,7 +184,7 @@ class PipelineWorker : public Napi::AsyncWorker {
// - input colourspace is not specified; // - input colourspace is not specified;
bool const shouldPreShrink = (targetResizeWidth > 0 || targetResizeHeight > 0) && bool const shouldPreShrink = (targetResizeWidth > 0 || targetResizeHeight > 0) &&
baton->gamma == 0 && baton->topOffsetPre == -1 && baton->trimThreshold == 0.0 && baton->gamma == 0 && baton->topOffsetPre == -1 && baton->trimThreshold == 0.0 &&
baton->colourspaceInput == VIPS_INTERPRETATION_LAST; baton->colourspaceInput == VIPS_INTERPRETATION_LAST && !shouldRotateBefore;
if (shouldPreShrink) { if (shouldPreShrink) {
// The common part of the shrink: the bit by which both axes must be shrunk // The common part of the shrink: the bit by which both axes must be shrunk
@@ -364,30 +381,21 @@ class PipelineWorker : public Napi::AsyncWorker {
->set("kernel", baton->kernel)); ->set("kernel", baton->kernel));
} }
// Auto-rotate post-extract
if (autoRotation != VIPS_ANGLE_D0) {
image = image.rot(autoRotation);
}
// Flip (mirror about Y axis) // Flip (mirror about Y axis)
if (baton->flip || flip) { if (baton->flip || autoFlip) {
image = image.flip(VIPS_DIRECTION_VERTICAL); image = image.flip(VIPS_DIRECTION_VERTICAL);
image = sharp::RemoveExifOrientation(image);
} }
// Flop (mirror about X axis) // Flop (mirror about X axis)
if (baton->flop || flop) { if (baton->flop || autoFlop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL); image = image.flip(VIPS_DIRECTION_HORIZONTAL);
image = sharp::RemoveExifOrientation(image);
} }
// Rotate post-extract 90-angle // Rotate post-extract 90-angle
if (!baton->rotateBeforePreExtract && rotation != VIPS_ANGLE_D0) { if (rotation != VIPS_ANGLE_D0) {
image = image.rot(rotation); image = image.rot(rotation);
if (flip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
flip = FALSE;
}
if (flop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
flop = FALSE;
}
image = sharp::RemoveExifOrientation(image);
} }
// Join additional color channels to the image // Join additional color channels to the image
@@ -693,24 +701,6 @@ class PipelineWorker : public Napi::AsyncWorker {
image = sharp::Tint(image, baton->tintA, baton->tintB); image = sharp::Tint(image, baton->tintA, baton->tintB);
} }
// Extract an image channel (aka vips band)
if (baton->extractChannel > -1) {
if (baton->extractChannel >= image.bands()) {
if (baton->extractChannel == 3 && sharp::HasAlpha(image)) {
baton->extractChannel = image.bands() - 1;
} else {
(baton->err).append("Cannot extract channel from image. Too few channels in image.");
return Error();
}
}
VipsInterpretation const interpretation = sharp::Is16Bit(image.interpretation())
? VIPS_INTERPRETATION_GREY16
: VIPS_INTERPRETATION_B_W;
image = image
.extract_band(baton->extractChannel)
.copy(VImage::option()->set("interpretation", interpretation));
}
// Remove alpha channel, if any // Remove alpha channel, if any
if (baton->removeAlpha) { if (baton->removeAlpha) {
image = sharp::RemoveAlpha(image); image = sharp::RemoveAlpha(image);
@@ -736,6 +726,26 @@ class PipelineWorker : public Napi::AsyncWorker {
} }
} }
// Extract channel
if (baton->extractChannel > -1) {
if (baton->extractChannel >= image.bands()) {
if (baton->extractChannel == 3 && sharp::HasAlpha(image)) {
baton->extractChannel = image.bands() - 1;
} else {
(baton->err)
.append("Cannot extract channel ").append(std::to_string(baton->extractChannel))
.append(" from image with channels 0-").append(std::to_string(image.bands() - 1));
return Error();
}
}
VipsInterpretation colourspace = sharp::Is16Bit(image.interpretation())
? VIPS_INTERPRETATION_GREY16
: VIPS_INTERPRETATION_B_W;
image = image
.extract_band(baton->extractChannel)
.copy(VImage::option()->set("interpretation", colourspace));
}
// Apply output ICC profile // Apply output ICC profile
if (!baton->withMetadataIcc.empty()) { if (!baton->withMetadataIcc.empty()) {
image = image.icc_transform( image = image.icc_transform(
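Channel extraction now runs later in the pipeline and keeps 16-bit depth where present; a usage sketch (file names are placeholders):

    const sharp = require('sharp');

    // Extract the green channel as a single-band image;
    // 16-bit input stays grey16, 8-bit input becomes b-w
    sharp('input.png')
      .extractChannel('green')
      .toFile('green.png');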
@@ -861,6 +871,8 @@ class PipelineWorker : public Napi::AsyncWorker {
->set("bitdepth", baton->gifBitdepth) ->set("bitdepth", baton->gifBitdepth)
->set("effort", baton->gifEffort) ->set("effort", baton->gifEffort)
->set("reoptimise", baton->gifReoptimise) ->set("reoptimise", baton->gifReoptimise)
->set("interframe_maxerror", baton->gifInterFrameMaxError)
->set("interpalette_maxerror", baton->gifInterPaletteMaxError)
->set("dither", baton->gifDither))); ->set("dither", baton->gifDither)));
baton->bufferOut = static_cast<char*>(area->data); baton->bufferOut = static_cast<char*>(area->data);
baton->bufferOutLength = area->length; baton->bufferOutLength = area->length;
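These libvips gifsave parameters are exposed through the GIF output options using the names carried in the baton; a hedged sketch for animated output (file name and values are illustrative):

    const sharp = require('sharp');

    sharp('animated.gif', { animated: true })
      .gif({
        interFrameMaxError: 8,   // treat near-identical inter-frame pixels as transparent
        interPaletteMaxError: 3  // reuse an earlier palette when within this error
      })
      .toFile('optimised.gif');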
@@ -899,12 +911,13 @@ class PipelineWorker : public Napi::AsyncWorker {
} else if (baton->formatOut == "heif" || } else if (baton->formatOut == "heif" ||
(baton->formatOut == "input" && inputImageType == sharp::ImageType::HEIF)) { (baton->formatOut == "input" && inputImageType == sharp::ImageType::HEIF)) {
// Write HEIF to buffer // Write HEIF to buffer
image = sharp::RemoveAnimationProperties(image); image = sharp::RemoveAnimationProperties(image).cast(VIPS_FORMAT_UCHAR);
VipsArea *area = reinterpret_cast<VipsArea*>(image.heifsave_buffer(VImage::option() VipsArea *area = reinterpret_cast<VipsArea*>(image.heifsave_buffer(VImage::option()
->set("strip", !baton->withMetadata) ->set("strip", !baton->withMetadata)
->set("Q", baton->heifQuality) ->set("Q", baton->heifQuality)
->set("compression", baton->heifCompression) ->set("compression", baton->heifCompression)
->set("effort", baton->heifEffort) ->set("effort", baton->heifEffort)
->set("bitdepth", 8)
->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4" ->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4"
? VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) ? VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON)
->set("lossless", baton->heifLossless))); ->set("lossless", baton->heifLossless)));
@@ -926,6 +939,21 @@ class PipelineWorker : public Napi::AsyncWorker {
area->free_fn = nullptr; area->free_fn = nullptr;
vips_area_unref(area); vips_area_unref(area);
baton->formatOut = "dz"; baton->formatOut = "dz";
} else if (baton->formatOut == "jxl" ||
(baton->formatOut == "input" && inputImageType == sharp::ImageType::JXL)) {
// Write JXL to buffer
image = sharp::RemoveAnimationProperties(image);
VipsArea *area = reinterpret_cast<VipsArea*>(image.jxlsave_buffer(VImage::option()
->set("strip", !baton->withMetadata)
->set("distance", baton->jxlDistance)
->set("tier", baton->jxlDecodingTier)
->set("effort", baton->jxlEffort)
->set("lossless", baton->jxlLossless)));
baton->bufferOut = static_cast<char*>(area->data);
baton->bufferOutLength = area->length;
area->free_fn = nullptr;
vips_area_unref(area);
baton->formatOut = "jxl";
} else if (baton->formatOut == "raw" || } else if (baton->formatOut == "raw" ||
(baton->formatOut == "input" && inputImageType == sharp::ImageType::RAW)) { (baton->formatOut == "input" && inputImageType == sharp::ImageType::RAW)) {
// Write raw, uncompressed image data to buffer // Write raw, uncompressed image data to buffer
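JPEG-XL output is experimental and requires a libvips built with libjxl (the prebuilt binaries omit it). Assuming the corresponding .jxl() output method maps onto the baton fields above, usage would look roughly like this (file name is a placeholder):

    const sharp = require('sharp');

    sharp('input.jpg')
      .jxl({ distance: 1.0, effort: 7, lossless: false })
      .toBuffer()
      .then((data) => {
        // data contains the JPEG-XL encoded bytes
      });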
@@ -964,6 +992,7 @@ class PipelineWorker : public Napi::AsyncWorker {
bool const isTiff = sharp::IsTiff(baton->fileOut); bool const isTiff = sharp::IsTiff(baton->fileOut);
bool const isJp2 = sharp::IsJp2(baton->fileOut); bool const isJp2 = sharp::IsJp2(baton->fileOut);
bool const isHeif = sharp::IsHeif(baton->fileOut); bool const isHeif = sharp::IsHeif(baton->fileOut);
bool const isJxl = sharp::IsJxl(baton->fileOut);
bool const isDz = sharp::IsDz(baton->fileOut); bool const isDz = sharp::IsDz(baton->fileOut);
bool const isDzZip = sharp::IsDzZip(baton->fileOut); bool const isDzZip = sharp::IsDzZip(baton->fileOut);
bool const isV = sharp::IsV(baton->fileOut); bool const isV = sharp::IsV(baton->fileOut);
@@ -1070,16 +1099,28 @@ class PipelineWorker : public Napi::AsyncWorker {
} else if (baton->formatOut == "heif" || (mightMatchInput && isHeif) || } else if (baton->formatOut == "heif" || (mightMatchInput && isHeif) ||
(willMatchInput && inputImageType == sharp::ImageType::HEIF)) { (willMatchInput && inputImageType == sharp::ImageType::HEIF)) {
// Write HEIF to file // Write HEIF to file
image = sharp::RemoveAnimationProperties(image); image = sharp::RemoveAnimationProperties(image).cast(VIPS_FORMAT_UCHAR);
image.heifsave(const_cast<char*>(baton->fileOut.data()), VImage::option() image.heifsave(const_cast<char*>(baton->fileOut.data()), VImage::option()
->set("strip", !baton->withMetadata) ->set("strip", !baton->withMetadata)
->set("Q", baton->heifQuality) ->set("Q", baton->heifQuality)
->set("compression", baton->heifCompression) ->set("compression", baton->heifCompression)
->set("effort", baton->heifEffort) ->set("effort", baton->heifEffort)
->set("bitdepth", 8)
->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4" ->set("subsample_mode", baton->heifChromaSubsampling == "4:4:4"
? VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON) ? VIPS_FOREIGN_SUBSAMPLE_OFF : VIPS_FOREIGN_SUBSAMPLE_ON)
->set("lossless", baton->heifLossless)); ->set("lossless", baton->heifLossless));
baton->formatOut = "heif"; baton->formatOut = "heif";
} else if (baton->formatOut == "jxl" || (mightMatchInput && isJxl) ||
(willMatchInput && inputImageType == sharp::ImageType::JXL)) {
// Write JXL to file
image = sharp::RemoveAnimationProperties(image);
image.jxlsave(const_cast<char*>(baton->fileOut.data()), VImage::option()
->set("strip", !baton->withMetadata)
->set("distance", baton->jxlDistance)
->set("tier", baton->jxlDecodingTier)
->set("effort", baton->jxlEffort)
->set("lossless", baton->jxlLossless));
baton->formatOut = "jxl";
} else if (baton->formatOut == "dz" || isDz || isDzZip) { } else if (baton->formatOut == "dz" || isDz || isDzZip) {
// Write DZ to file // Write DZ to file
if (isDzZip) { if (isDzZip) {
@@ -1165,8 +1206,8 @@ class PipelineWorker : public Napi::AsyncWorker {
// Add buffer size to info // Add buffer size to info
info.Set("size", static_cast<uint32_t>(baton->bufferOutLength)); info.Set("size", static_cast<uint32_t>(baton->bufferOutLength));
// Pass ownership of output data to Buffer instance // Pass ownership of output data to Buffer instance
Napi::Buffer<char> data = Napi::Buffer<char>::New(env, static_cast<char*>(baton->bufferOut), Napi::Buffer<char> data = sharp::NewOrCopyBuffer(env, static_cast<char*>(baton->bufferOut),
baton->bufferOutLength, sharp::FreeCallback); baton->bufferOutLength);
Callback().MakeCallback(Receiver().Value(), { env.Null(), data, info }); Callback().MakeCallback(Receiver().Value(), { env.Null(), data, info });
} else { } else {
// Add file size to info // Add file size to info
@@ -1539,6 +1580,8 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
baton->gifBitdepth = sharp::AttrAsUint32(options, "gifBitdepth"); baton->gifBitdepth = sharp::AttrAsUint32(options, "gifBitdepth");
baton->gifEffort = sharp::AttrAsUint32(options, "gifEffort"); baton->gifEffort = sharp::AttrAsUint32(options, "gifEffort");
baton->gifDither = sharp::AttrAsDouble(options, "gifDither"); baton->gifDither = sharp::AttrAsDouble(options, "gifDither");
baton->gifInterFrameMaxError = sharp::AttrAsDouble(options, "gifInterFrameMaxError");
baton->gifInterPaletteMaxError = sharp::AttrAsDouble(options, "gifInterPaletteMaxError");
baton->gifReoptimise = sharp::AttrAsBool(options, "gifReoptimise"); baton->gifReoptimise = sharp::AttrAsBool(options, "gifReoptimise");
baton->tiffQuality = sharp::AttrAsUint32(options, "tiffQuality"); baton->tiffQuality = sharp::AttrAsUint32(options, "tiffQuality");
baton->tiffPyramid = sharp::AttrAsBool(options, "tiffPyramid"); baton->tiffPyramid = sharp::AttrAsBool(options, "tiffPyramid");
@@ -1563,6 +1606,10 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
options, "heifCompression", VIPS_TYPE_FOREIGN_HEIF_COMPRESSION); options, "heifCompression", VIPS_TYPE_FOREIGN_HEIF_COMPRESSION);
baton->heifEffort = sharp::AttrAsUint32(options, "heifEffort"); baton->heifEffort = sharp::AttrAsUint32(options, "heifEffort");
baton->heifChromaSubsampling = sharp::AttrAsStr(options, "heifChromaSubsampling"); baton->heifChromaSubsampling = sharp::AttrAsStr(options, "heifChromaSubsampling");
baton->jxlDistance = sharp::AttrAsDouble(options, "jxlDistance");
baton->jxlDecodingTier = sharp::AttrAsUint32(options, "jxlDecodingTier");
baton->jxlEffort = sharp::AttrAsUint32(options, "jxlEffort");
baton->jxlLossless = sharp::AttrAsBool(options, "jxlLossless");
baton->rawDepth = sharp::AttrAsEnum<VipsBandFormat>(options, "rawDepth", VIPS_TYPE_BAND_FORMAT); baton->rawDepth = sharp::AttrAsEnum<VipsBandFormat>(options, "rawDepth", VIPS_TYPE_BAND_FORMAT);
// Animated output properties // Animated output properties
if (sharp::HasAttr(options, "loop")) { if (sharp::HasAttr(options, "loop")) {

View File

@@ -163,6 +163,8 @@ struct PipelineBaton {
int gifBitdepth; int gifBitdepth;
int gifEffort; int gifEffort;
double gifDither; double gifDither;
double gifInterFrameMaxError;
double gifInterPaletteMaxError;
bool gifReoptimise; bool gifReoptimise;
int tiffQuality; int tiffQuality;
VipsForeignTiffCompression tiffCompression; VipsForeignTiffCompression tiffCompression;
@@ -180,6 +182,10 @@ struct PipelineBaton {
int heifEffort; int heifEffort;
std::string heifChromaSubsampling; std::string heifChromaSubsampling;
bool heifLossless; bool heifLossless;
double jxlDistance;
int jxlDecodingTier;
int jxlEffort;
bool jxlLossless;
VipsBandFormat rawDepth; VipsBandFormat rawDepth;
std::string err; std::string err;
bool withMetadata; bool withMetadata;
@@ -314,6 +320,8 @@ struct PipelineBaton {
gifBitdepth(8), gifBitdepth(8),
gifEffort(7), gifEffort(7),
gifDither(1.0), gifDither(1.0),
gifInterFrameMaxError(0.0),
gifInterPaletteMaxError(3.0),
gifReoptimise(false), gifReoptimise(false),
tiffQuality(80), tiffQuality(80),
tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG), tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG),
@@ -331,6 +339,10 @@ struct PipelineBaton {
heifEffort(4), heifEffort(4),
heifChromaSubsampling("4:4:4"), heifChromaSubsampling("4:4:4"),
heifLossless(false), heifLossless(false),
jxlDistance(1.0),
jxlDecodingTier(0),
jxlEffort(7),
jxlLossless(false),
rawDepth(VIPS_FORMAT_UCHAR), rawDepth(VIPS_FORMAT_UCHAR),
withMetadata(false), withMetadata(false),
withMetadataOrientation(-1), withMetadataOrientation(-1),

View File

@@ -176,6 +176,7 @@ Napi::Value stats(const Napi::CallbackInfo& info) {
// Input // Input
baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>()); baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
baton->input->access = VIPS_ACCESS_RANDOM;
// Function to notify of libvips warnings // Function to notify of libvips warnings
Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>(); Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
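Because stats needs random access to the input, the sequentialRead input option is now overridden here; a short sketch (file name is a placeholder):

    const sharp = require('sharp');

    // sequentialRead is ignored for stats; access is forced to random internally
    sharp('input.jpg', { sequentialRead: true })
      .stats()
      .then(({ channels, isOpaque }) => console.log(channels.length, isOpaque));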

View File

@@ -115,7 +115,7 @@ Napi::Value format(const Napi::CallbackInfo& info) {
Napi::Object format = Napi::Object::New(env); Napi::Object format = Napi::Object::New(env);
for (std::string const f : { for (std::string const f : {
"jpeg", "png", "webp", "tiff", "magick", "openslide", "dz", "jpeg", "png", "webp", "tiff", "magick", "openslide", "dz",
"ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k" "ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k", "jxl"
}) { }) {
// Input // Input
const VipsObjectClass *oc = vips_class_find("VipsOperation", (f + "load").c_str()); const VipsObjectClass *oc = vips_class_find("VipsOperation", (f + "load").c_str());

test/beforeEach.js (new file, 21 lines)
View File

@@ -0,0 +1,21 @@
'use strict';
const sharp = require('../');
const usingCache = !process.env.G_DEBUG;
const usingSimd = !process.env.VIPS_NOVECTOR;
const concurrency = Number(process.env.VIPS_CONCURRENCY) || 0;
exports.mochaHooks = {
beforeEach () {
sharp.cache(usingCache);
sharp.simd(usingSimd);
sharp.concurrency(concurrency);
},
afterEach () {
if (global.gc) {
global.gc();
}
}
};
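These root hooks take effect when Mocha is told to require this file, either on the command line (as the leak script below does with --require test/beforeEach.js) or via the project's Mocha configuration; the file name and settings in this sketch are assumptions:

    // .mocharc.js (assumed) wiring in the shared root hooks
    module.exports = {
      require: 'test/beforeEach.js',
      parallel: true
    };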

View File

@@ -9,7 +9,7 @@ RUN curl -fsSL https://deb.nodesource.com/setup_16.x | bash -
RUN apt-get install -y nodejs RUN apt-get install -y nodejs
# Install benchmark dependencies # Install benchmark dependencies
RUN apt-get install -y imagemagick libmagick++-dev graphicsmagick libmapnik-dev RUN apt-get install -y imagemagick libmagick++-dev graphicsmagick
# Install sharp # Install sharp
RUN mkdir /tmp/sharp RUN mkdir /tmp/sharp
@@ -17,7 +17,7 @@ RUN cd /tmp && git clone --single-branch --branch $BRANCH https://github.com/lov
RUN cd /tmp/sharp && npm install --build-from-source RUN cd /tmp/sharp && npm install --build-from-source
# Install benchmark test # Install benchmark test
RUN cd /tmp/sharp/test/bench && npm install RUN cd /tmp/sharp/test/bench && npm install --omit optional
RUN cat /etc/os-release | grep VERSION= RUN cat /etc/os-release | grep VERSION=
RUN node -v RUN node -v

View File

@@ -7,16 +7,19 @@
"scripts": { "scripts": {
"test": "node perf && node random && node parallel" "test": "node perf && node random && node parallel"
}, },
"devDependencies": { "dependencies": {
"@squoosh/cli": "0.7.2", "@squoosh/cli": "0.7.2",
"@squoosh/lib": "0.4.0", "@squoosh/lib": "0.4.0",
"async": "3.2.4", "async": "3.2.4",
"benchmark": "2.1.4", "benchmark": "2.1.4",
"gm": "1.23.1", "gm": "1.25.0",
"imagemagick": "0.1.3", "imagemagick": "0.1.3",
"jimp": "0.16.1", "jimp": "0.16.2",
"mapnik": "4.5.9", "semver": "7.3.8"
"semver": "7.3.7" },
"optionalDependencies": {
"@tensorflow/tfjs-node": "4.1.0",
"mapnik": "4.5.9"
}, },
"license": "Apache-2.0", "license": "Apache-2.0",
"engines": { "engines": {

View File

@@ -5,16 +5,24 @@ const fs = require('fs');
const { exec } = require('child_process'); const { exec } = require('child_process');
const async = require('async'); const async = require('async');
const assert = require('assert');
const Benchmark = require('benchmark'); const Benchmark = require('benchmark');
const safeRequire = (name) => {
try {
return require(name);
} catch (err) {}
return null;
};
// Contenders // Contenders
const sharp = require('../../'); const sharp = require('../../');
const gm = require('gm'); const gm = require('gm');
const imagemagick = require('imagemagick'); const imagemagick = require('imagemagick');
const mapnik = require('mapnik'); const mapnik = safeRequire('mapnik');
const jimp = require('jimp'); const jimp = require('jimp');
const squoosh = require('@squoosh/lib'); const squoosh = require('@squoosh/lib');
process.env.TF_CPP_MIN_LOG_LEVEL = 1;
const tfjs = safeRequire('@tensorflow/tfjs-node');
const fixtures = require('../fixtures'); const fixtures = require('../fixtures');
@@ -137,7 +145,7 @@ async.series({
} }
}); });
// mapnik // mapnik
jpegSuite.add('mapnik-file-file', { mapnik && jpegSuite.add('mapnik-file-file', {
defer: true, defer: true,
fn: function (deferred) { fn: function (deferred) {
mapnik.Image.open(fixtures.inputJpg, function (err, img) { mapnik.Image.open(fixtures.inputJpg, function (err, img) {
@@ -212,11 +220,10 @@ async.series({
.filter('Lanczos') .filter('Lanczos')
.resize(width, height) .resize(width, height)
.quality(80) .quality(80)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -243,16 +250,33 @@ async.series({
.filter('Lanczos') .filter('Lanczos')
.resize(width, height) .resize(width, height)
.quality(80) .quality(80)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
} }
}); });
// tfjs
tfjs && jpegSuite.add('tfjs-node-buffer-buffer', {
defer: true,
fn: function (deferred) {
const decoded = tfjs.node.decodeJpeg(inputJpgBuffer);
const resized = tfjs.image.resizeBilinear(decoded, [height, width]);
tfjs
.node
.encodeJpeg(resized, 'rgb', 80)
.then(function () {
deferred.resolve();
tfjs.disposeVariables();
})
.catch(function (err) {
throw err;
});
}
});
// sharp // sharp
jpegSuite.add('sharp-buffer-file', { jpegSuite.add('sharp-buffer-file', {
defer: true, defer: true,
@@ -272,11 +296,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -311,11 +334,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(fixtures.inputJpg) sharp(fixtures.inputJpg)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -326,8 +348,7 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.toBuffer() .toBuffer()
.then(function (buffer) { .then(function () {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
}) })
.catch(function (err) { .catch(function (err) {
@@ -350,11 +371,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.sharpen() .sharpen()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -365,11 +385,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.sharpen(3, 1, 3) .sharpen(3, 1, 3)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -380,11 +399,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.blur() .blur()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -395,11 +413,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.blur(3) .blur(3)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -410,11 +427,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.gamma() .gamma()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -425,11 +441,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.normalise() .normalise()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -440,11 +455,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.greyscale() .greyscale()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -456,11 +470,10 @@ async.series({
.resize(width, height) .resize(width, height)
.gamma() .gamma()
.greyscale() .greyscale()
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -471,11 +484,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.jpeg({ progressive: true }) .jpeg({ progressive: true })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -486,11 +498,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.jpeg({ chromaSubsampling: '4:4:4' }) .jpeg({ chromaSubsampling: '4:4:4' })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -501,11 +512,10 @@ async.series({
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.rotate(90) .rotate(90)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -516,12 +526,11 @@ async.series({
sharp.simd(false); sharp.simd(false);
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
sharp.simd(true); sharp.simd(true);
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -531,11 +540,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputJpgBuffer, { sequentialRead: true }) sharp(inputJpgBuffer, { sequentialRead: true })
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -548,11 +556,10 @@ async.series({
fit: 'cover', fit: 'cover',
position: sharp.strategy.entropy position: sharp.strategy.entropy
}) })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -565,11 +572,10 @@ async.series({
fit: 'cover', fit: 'cover',
position: sharp.strategy.attention position: sharp.strategy.attention
}) })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -588,11 +594,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height, { kernel: 'cubic' }) .resize(width, height, { kernel: 'cubic' })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -602,11 +607,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height, { kernel: 'lanczos2' }) .resize(width, height, { kernel: 'lanczos2' })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -616,11 +620,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputJpgBuffer) sharp(inputJpgBuffer)
.resize(width, height, { kernel: 'lanczos3' }) .resize(width, height, { kernel: 'lanczos3' })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -681,7 +684,7 @@ async.series({
} }
}); });
// mapnik // mapnik
pngSuite.add('mapnik-file-file', { mapnik && pngSuite.add('mapnik-file-file', {
defer: true, defer: true,
fn: function (deferred) { fn: function (deferred) {
mapnik.Image.open(fixtures.inputPngAlphaPremultiplicationLarge, function (err, img) { mapnik.Image.open(fixtures.inputPngAlphaPremultiplicationLarge, function (err, img) {
@@ -774,11 +777,10 @@ async.series({
.resize(width, height) .resize(width, height)
.define('PNG:compression-level=6') .define('PNG:compression-level=6')
.define('PNG:compression-filter=0') .define('PNG:compression-filter=0')
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -807,11 +809,10 @@ async.series({
sharp(inputPngBuffer) sharp(inputPngBuffer)
.resize(width, height) .resize(width, height)
.png({ compressionLevel: 6 }) .png({ compressionLevel: 6 })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -838,11 +839,10 @@ async.series({
sharp(fixtures.inputPngAlphaPremultiplicationLarge) sharp(fixtures.inputPngAlphaPremultiplicationLarge)
.resize(width, height) .resize(width, height)
.png({ compressionLevel: 6 }) .png({ compressionLevel: 6 })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -854,11 +854,10 @@ async.series({
sharp(inputPngBuffer) sharp(inputPngBuffer)
.resize(width, height) .resize(width, height)
.png({ compressionLevel: 6, progressive: true }) .png({ compressionLevel: 6, progressive: true })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -870,11 +869,10 @@ async.series({
sharp(inputPngBuffer) sharp(inputPngBuffer)
.resize(width, height) .resize(width, height)
.png({ adaptiveFiltering: true, compressionLevel: 6 }) .png({ adaptiveFiltering: true, compressionLevel: 6 })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -886,11 +884,10 @@ async.series({
sharp(inputPngBuffer) sharp(inputPngBuffer)
.resize(width, height) .resize(width, height)
.png({ compressionLevel: 9 }) .png({ compressionLevel: 9 })
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -923,11 +920,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(inputWebPBuffer) sharp(inputWebPBuffer)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -950,11 +946,10 @@ async.series({
fn: function (deferred) { fn: function (deferred) {
sharp(fixtures.inputWebP) sharp(fixtures.inputWebP)
.resize(width, height) .resize(width, height)
.toBuffer(function (err, buffer) { .toBuffer(function (err) {
if (err) { if (err) {
throw err; throw err;
} else { } else {
assert.notStrictEqual(null, buffer);
deferred.resolve(); deferred.resolve();
} }
}); });
@@ -966,7 +961,9 @@ async.series({
}).run(); }).run();
} }
}, function (err, results) { }, function (err, results) {
assert(!err, err); if (err) {
throw err;
}
Object.keys(results).forEach(function (format) { Object.keys(results).forEach(function (format) {
if (results[format].toString().substr(0, 5) !== 'sharp') { if (results[format].toString().substr(0, 5) !== 'sharp') {
console.log('sharp was slower than ' + results[format] + ' for ' + format); console.log('sharp was slower than ' + results[format] + ' for ' + format);

View File

@@ -1,6 +0,0 @@
#!/bin/sh
CPPFLAGS="--coverage" LDFLAGS="--coverage" npm rebuild
npm test
geninfo --no-external --base-directory src --output-file coverage/sharp.info build/Release/obj.target/sharp/src
genhtml --title sharp --demangle-cpp --output-directory coverage/sharp coverage/*.info

test/fixtures/65536-uint32-limit.png: vendored, new binary file, 510 KiB (binary content not shown).

A second binary fixture changed (13 KiB before); binary content not shown.

View File

@@ -98,6 +98,7 @@ module.exports = {
inputPngTrimSpecificColour: getPath('Flag_of_the_Netherlands.png'), // https://commons.wikimedia.org/wiki/File:Flag_of_the_Netherlands.svg inputPngTrimSpecificColour: getPath('Flag_of_the_Netherlands.png'), // https://commons.wikimedia.org/wiki/File:Flag_of_the_Netherlands.svg
inputPngTrimSpecificColour16bit: getPath('Flag_of_the_Netherlands-16bit.png'), // convert Flag_of_the_Netherlands.png -depth 16 Flag_of_the_Netherlands-16bit.png inputPngTrimSpecificColour16bit: getPath('Flag_of_the_Netherlands-16bit.png'), // convert Flag_of_the_Netherlands.png -depth 16 Flag_of_the_Netherlands-16bit.png
inputPngTrimSpecificColourIncludeAlpha: getPath('Flag_of_the_Netherlands-alpha.png'), // convert Flag_of_the_Netherlands.png -alpha set -background none -channel A -evaluate multiply 0.5 +channel Flag_of_the_Netherlands-alpha.png inputPngTrimSpecificColourIncludeAlpha: getPath('Flag_of_the_Netherlands-alpha.png'), // convert Flag_of_the_Netherlands.png -alpha set -background none -channel A -evaluate multiply 0.5 +channel Flag_of_the_Netherlands-alpha.png
inputPngUint32Limit: getPath('65536-uint32-limit.png'), // https://alexandre.alapetite.fr/doc-alex/large-image/
inputWebP: getPath('4.webp'), // http://www.gstatic.com/webp/gallery/4.webp inputWebP: getPath('4.webp'), // http://www.gstatic.com/webp/gallery/4.webp
inputWebPWithTransparency: getPath('5_webp_a.webp'), // http://www.gstatic.com/webp/gallery3/5_webp_a.webp inputWebPWithTransparency: getPath('5_webp_a.webp'), // http://www.gstatic.com/webp/gallery3/5_webp_a.webp

View File

@@ -8,7 +8,7 @@ fi
curl -s -o ./test/leak/libvips.supp https://raw.githubusercontent.com/libvips/libvips/master/suppressions/valgrind.supp curl -s -o ./test/leak/libvips.supp https://raw.githubusercontent.com/libvips/libvips/master/suppressions/valgrind.supp
for test in ./test/unit/*.js; do for test in ./test/unit/*.js; do
G_SLICE=always-malloc G_DEBUG=gc-friendly VIPS_LEAK=1 valgrind \ G_SLICE=always-malloc G_DEBUG=gc-friendly VIPS_LEAK=1 VIPS_NOVECTOR=1 valgrind \
--suppressions=test/leak/libvips.supp \ --suppressions=test/leak/libvips.supp \
--suppressions=test/leak/sharp.supp \ --suppressions=test/leak/sharp.supp \
--gen-suppressions=yes \ --gen-suppressions=yes \
@@ -16,5 +16,5 @@ for test in ./test/unit/*.js; do
--show-leak-kinds=definite,indirect,possible \ --show-leak-kinds=definite,indirect,possible \
--num-callers=20 \ --num-callers=20 \
--trace-children=yes \ --trace-children=yes \
node --expose-gc node_modules/.bin/mocha --slow=60000 --timeout=120000 --file test/unit/beforeEach.js "$test"; node --expose-gc node_modules/.bin/mocha --no-config --slow=60000 --timeout=120000 --require test/beforeEach.js "$test";
done done

View File

@@ -349,6 +349,13 @@
fun:heif_context_read_from_reader fun:heif_context_read_from_reader
} }
# orc
{
addr_orcexec
Memcheck:Addr1
obj:*/orcexec.*
}
# libvips # libvips
{ {
cond_libvips_interpolate_lbb cond_libvips_interpolate_lbb
@@ -945,3 +952,13 @@
fun:_ZN4node7binding6DLOpenERKN2v820FunctionCallbackInfoINS1_5ValueEEE fun:_ZN4node7binding6DLOpenERKN2v820FunctionCallbackInfoINS1_5ValueEEE
fun:_ZN2v88internal12_GLOBAL__N_119HandleApiCallHelperILb0EEENS0_11MaybeHandleINS0_6ObjectEEEPNS0_7IsolateENS0_6HandleINS0_10HeapObjectEEESA_NS8_INS0_20FunctionTemplateInfoEEENS8_IS4_EENS0_16BuiltinArgumentsE fun:_ZN2v88internal12_GLOBAL__N_119HandleApiCallHelperILb0EEENS0_11MaybeHandleINS0_6ObjectEEEPNS0_7IsolateENS0_6HandleINS0_10HeapObjectEEESA_NS8_INS0_20FunctionTemplateInfoEEENS8_IS4_EENS0_16BuiltinArgumentsE
} }
{
addr_node_binding_dlopen_strncmp
Memcheck:Addr8
fun:strncmp
fun:is_dst
...
fun:dlopen_implementation
...
fun:_ZNSt17_Function_handlerIFbPN4node7binding4DLibEEZNS1_6DLOpenERKN2v820FunctionCallbackInfoINS5_5ValueEEEEUlS3_E_E9_M_invokeERKSt9_Any_dataOS3_
}

View File

@@ -10,34 +10,40 @@ describe('HTTP agent', function () {
it('HTTPS proxy with auth from HTTPS_PROXY', function () { it('HTTPS proxy with auth from HTTPS_PROXY', function () {
process.env.HTTPS_PROXY = 'https://user:pass@secure:123'; process.env.HTTPS_PROXY = 'https://user:pass@secure:123';
const proxy = agent(); let logMsg = '';
const proxy = agent(msg => { logMsg = msg; });
delete process.env.HTTPS_PROXY;
assert.strictEqual('object', typeof proxy);
assert.strictEqual('secure', proxy.options.proxy.host);
assert.strictEqual(123, proxy.options.proxy.port);
assert.strictEqual('user:pass', proxy.options.proxy.proxyAuth);
assert.strictEqual(443, proxy.defaultPort);
assert.strictEqual(logMsg, 'Via proxy https:://secure:123 with credentials');
});
it('HTTPS proxy with auth from HTTPS_PROXY using credentials containing special characters', function () {
-process.env.HTTPS_PROXY = 'https://user,:pass=@secure:123';
-const proxy = agent();
+process.env.HTTPS_PROXY = 'https://user,:pass=@secure:789';
+let logMsg = '';
const proxy = agent(msg => { logMsg = msg; });
delete process.env.HTTPS_PROXY;
assert.strictEqual('object', typeof proxy);
assert.strictEqual('secure', proxy.options.proxy.host);
-assert.strictEqual(123, proxy.options.proxy.port);
+assert.strictEqual(789, proxy.options.proxy.port);
assert.strictEqual('user,:pass=', proxy.options.proxy.proxyAuth);
assert.strictEqual(443, proxy.defaultPort);
assert.strictEqual(logMsg, 'Via proxy https:://secure:789 with credentials');
});
it('HTTP proxy without auth from npm_config_proxy', function () {
process.env.npm_config_proxy = 'http://plaintext:456';
-const proxy = agent();
+let logMsg = '';
const proxy = agent(msg => { logMsg = msg; });
delete process.env.npm_config_proxy;
assert.strictEqual('object', typeof proxy);
assert.strictEqual('plaintext', proxy.options.proxy.host);
assert.strictEqual(456, proxy.options.proxy.port);
assert.strictEqual(null, proxy.options.proxy.proxyAuth);
assert.strictEqual(443, proxy.defaultPort);
assert.strictEqual(logMsg, 'Via proxy http:://plaintext:456 no credentials');
});
});

View File

@@ -103,4 +103,28 @@ describe('AVIF', () => {
width: 10
});
});
it('should cast to uchar', async () => {
const data = await sharp(inputJpg)
.resize(32)
.sharpen()
.avif({ effort: 0 })
.toBuffer();
const { size, ...metadata } = await sharp(data)
.metadata();
assert.deepStrictEqual(metadata, {
channels: 3,
compression: 'av1',
depth: 'uchar',
format: 'heif',
hasAlpha: false,
hasProfile: false,
height: 26,
isProgressive: false,
pagePrimary: 0,
pages: 1,
space: 'srgb',
width: 32
});
});
});

View File

@@ -1,22 +0,0 @@
'use strict';
const detectLibc = require('detect-libc');
const sharp = require('../../');
const libcFamily = detectLibc.familySync();
const usingCache = !(process.env.G_DEBUG || libcFamily === detectLibc.MUSL);
const usingSimd = !(process.env.G_DEBUG || process.env.VIPS_NOVECTOR);
const concurrency = process.env.VIPS_CONCURRENCY ||
(libcFamily === detectLibc.MUSL || process.arch === 'arm' ? 1 : undefined);
beforeEach(function () {
sharp.cache(usingCache);
sharp.simd(usingSimd);
sharp.concurrency(concurrency);
});
afterEach(function () {
if (global.gc) {
global.gc();
}
});

View File

@@ -300,7 +300,7 @@ describe('Partial image extraction', function () {
const s = sharp();
s.on('warning', function (msg) { warningMessage = msg; });
const options = { top: 0, left: 0, width: 1, height: 1 };
-s.extract(options);
+s.extract(options).extract(options);
assert.strictEqual(warningMessage, '');
s.extract(options);
assert.strictEqual(warningMessage, 'ignoring previous extract options');
@@ -311,7 +311,7 @@ describe('Partial image extraction', function () {
const s = sharp().rotate();
s.on('warning', function (msg) { warningMessage = msg; });
const options = { top: 0, left: 0, width: 1, height: 1 };
-s.extract(options);
+s.extract(options).extract(options);
assert.strictEqual(warningMessage, '');
s.extract(options);
assert.strictEqual(warningMessage, 'ignoring previous extract options');

View File

@@ -54,19 +54,13 @@ describe('Image channel extraction', function () {
});
});
-it('With colorspace conversion', function (done) {
-const output = fixtures.path('output.extract-lch.jpg');
+it('With colorspace conversion', async () => {
+const [chroma] = await sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } })
sharp(fixtures.inputJpg)
.extractChannel(1)
.toColourspace('lch')
-.resize(320, 240, { fastShrinkOnLoad: false })
-.toFile(output, function (err, info) {
+.extractChannel(1)
+.toBuffer();
if (err) throw err;
-assert.strictEqual(320, info.width);
+assert.strictEqual(chroma, 104);
assert.strictEqual(240, info.height);
fixtures.assertMaxColourDistance(output, fixtures.expected('extract-lch.jpg'), 9);
done();
});
});
it('Alpha from 16-bit PNG', function (done) {
@@ -108,12 +102,12 @@ describe('Image channel extraction', function () {
});
});
-it('Non-existent channel', function (done) {
-sharp(fixtures.inputPng)
-.extractChannel(1)
-.toBuffer(function (err) {
-assert(err instanceof Error);
-done();
-});
-});
+it('Non-existent channel', async () =>
+await assert.rejects(
+() => sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } })
+.extractChannel(3)
+.toBuffer(),
+/Cannot extract channel 3 from image with channels 0-2/
+)
+);
});

View File

@@ -141,6 +141,28 @@ describe('GIF input', () => {
});
});
it('invalid interFrameMaxError throws', () => {
assert.throws(
() => sharp().gif({ interFrameMaxError: 33 }),
/Expected number between 0.0 and 32.0 for interFrameMaxError but received 33 of type number/
);
assert.throws(
() => sharp().gif({ interFrameMaxError: 'fail' }),
/Expected number between 0.0 and 32.0 for interFrameMaxError but received fail of type string/
);
});
it('invalid interPaletteMaxError throws', () => {
assert.throws(
() => sharp().gif({ interPaletteMaxError: 257 }),
/Expected number between 0.0 and 256.0 for interPaletteMaxError but received 257 of type number/
);
assert.throws(
() => sharp().gif({ interPaletteMaxError: 'fail' }),
/Expected number between 0.0 and 256.0 for interPaletteMaxError but received fail of type string/
);
});
it('should work with streams when only animated is set', function (done) {
fs.createReadStream(fixtures.inputGifAnimated)
.pipe(sharp({ animated: true }))
@@ -164,4 +186,18 @@ describe('GIF input', () => {
fixtures.assertSimilar(fixtures.inputGifAnimated, data, done);
});
});
it('should optimise file size via interFrameMaxError', async () => {
const input = sharp(fixtures.inputGifAnimated, { animated: true });
const before = await input.gif({ interFrameMaxError: 0 }).toBuffer();
const after = await input.gif({ interFrameMaxError: 10 }).toBuffer();
assert.strict(before.length > after.length);
});
it('should optimise file size via interPaletteMaxError', async () => {
const input = sharp(fixtures.inputGifAnimated, { animated: true });
const before = await input.gif({ interPaletteMaxError: 0 }).toBuffer();
const after = await input.gif({ interPaletteMaxError: 100 }).toBuffer();
assert.strict(before.length > after.length);
});
});

View File

@@ -745,6 +745,17 @@ describe('Input/output', function () {
})
);
it('Enabling default limit works and fails for an image with resolution higher than uint32 limit', () =>
sharp(fixtures.inputPngUint32Limit, { limitInputPixels: true })
.toBuffer()
.then(() => {
assert.fail('Expected to fail');
})
.catch(err => {
assert.strictEqual(err.message, 'Input image exceeds pixel limit');
})
);
it('Smaller than input fails', () =>
sharp(fixtures.inputJpg)
.metadata()

View File

@@ -8,20 +8,20 @@ const fixtures = require('../fixtures');
describe('JP2 output', () => {
if (!sharp.format.jp2k.input.buffer) {
-it('JP2 output should fail due to missing OpenJPEG', () => {
-assert.rejects(() =>
+it('JP2 output should fail due to missing OpenJPEG', () =>
+assert.rejects(async () =>
sharp(fixtures.inputJpg)
.jp2()
.toBuffer(),
/JP2 output requires libvips with support for OpenJPEG/
)
);
});
-it('JP2 file output should fail due to missing OpenJPEG', () => {
-assert.rejects(async () => await sharp().toFile('test.jp2'),
+it('JP2 file output should fail due to missing OpenJPEG', () =>
+assert.rejects(async () => sharp(fixtures.inputJpg).toFile('test.jp2'),
/JP2 output requires libvips with support for OpenJPEG/
)
);
});
} else {
it('JP2 Buffer to PNG Buffer', () => {
sharp(fs.readFileSync(fixtures.inputJp2))

test/unit/jxl.js Normal file
View File

@@ -0,0 +1,97 @@
'use strict';
const assert = require('assert');
const sharp = require('../../');
describe('JXL', () => {
it('called without options does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl();
});
});
it('valid distance does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl({ distance: 2.3 });
});
});
it('invalid distance should throw an error', () => {
assert.throws(() => {
sharp().jxl({ distance: 15.1 });
});
});
it('non-numeric distance should throw an error', () => {
assert.throws(() => {
sharp().jxl({ distance: 'fail' });
});
});
it('valid quality > 30 does not throw an error', () => {
const s = sharp();
assert.doesNotThrow(() => {
s.jxl({ quality: 80 });
});
assert.strictEqual(s.options.jxlDistance, 1.9);
});
it('valid quality < 30 does not throw an error', () => {
const s = sharp();
assert.doesNotThrow(() => {
s.jxl({ quality: 20 });
});
assert.strictEqual(s.options.jxlDistance, 9.066666666666666);
});
it('valid quality does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl({ quality: 80 });
});
});
it('invalid quality should throw an error', () => {
assert.throws(() => {
sharp().jxl({ quality: 101 });
});
});
it('non-numeric quality should throw an error', () => {
assert.throws(() => {
sharp().jxl({ quality: 'fail' });
});
});
it('valid decodingTier does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl({ decodingTier: 2 });
});
});
it('invalid decodingTier should throw an error', () => {
assert.throws(() => {
sharp().jxl({ decodingTier: 5 });
});
});
it('non-numeric decodingTier should throw an error', () => {
assert.throws(() => {
sharp().jxl({ decodingTier: 'fail' });
});
});
it('valid lossless does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl({ lossless: true });
});
});
it('non-boolean lossless should throw an error', () => {
assert.throws(() => {
sharp().jxl({ lossless: 'fail' });
});
});
it('valid effort does not throw an error', () => {
assert.doesNotThrow(() => {
sharp().jxl({ effort: 6 });
});
});
it('out of range effort should throw an error', () => {
assert.throws(() => {
sharp().jxl({ effort: 10 });
});
});
it('invalid effort should throw an error', () => {
assert.throws(() => {
sharp().jxl({ effort: 'fail' });
});
});
});

View File

@@ -73,6 +73,28 @@ describe('Linear adjustment', function () {
});
});
it('output is integer, not float, RGB', async () => {
const data = await sharp({ create: { width: 1, height: 1, channels: 3, background: 'red' } })
.linear(1, 0)
.tiff({ compression: 'none' })
.toBuffer();
const { channels, depth } = await sharp(data).metadata();
assert.strictEqual(channels, 3);
assert.strictEqual(depth, 'uchar');
});
it('output is integer, not float, RGBA', async () => {
const data = await sharp({ create: { width: 1, height: 1, channels: 4, background: '#ff000077' } })
.linear(1, 0)
.tiff({ compression: 'none' })
.toBuffer();
const { channels, depth } = await sharp(data).metadata();
assert.strictEqual(channels, 4);
assert.strictEqual(depth, 'uchar');
});
it('Invalid linear arguments', function () {
assert.throws(
() => sharp().linear('foo'),

View File

@@ -439,6 +439,19 @@ describe('Image metadata', function () {
);
});
it('Stream in, finish event fires before metadata is requested', (done) => {
const create = { width: 1, height: 1, channels: 3, background: 'red' };
const image1 = sharp({ create }).png().pipe(sharp());
const image2 = sharp({ create }).png().pipe(sharp());
setTimeout(async () => {
const data1 = await image1.metadata();
assert.strictEqual('png', data1.format);
const data2 = await image2.metadata();
assert.strictEqual('png', data2.format);
done();
}, 500);
});
it('Stream', function (done) {
const readable = fs.createReadStream(fixtures.inputJpg);
const pipeline = sharp().metadata(function (err, metadata) {

View File

@@ -128,11 +128,26 @@ describe('PNG', function () {
assert.strictEqual(alphaMeanAfter, alphaMeanBefore);
});
-it('palette decode/encode roundtrip', () =>
-sharp(fixtures.inputPngPalette)
+it('palette decode/encode roundtrip', async () => {
+const data = await sharp(fixtures.inputPngPalette)
.png({ effort: 1, palette: true })
-.toBuffer()
+.toBuffer();
);
const { size, ...metadata } = await sharp(data).metadata();
assert.deepStrictEqual(metadata, {
format: 'png',
width: 68,
height: 68,
space: 'srgb',
channels: 3,
density: 72,
depth: 'uchar',
isProgressive: false,
paletteBitDepth: 8,
hasProfile: false,
hasAlpha: false
});
});
it('Valid PNG libimagequant palette value does not throw error', function () {
assert.doesNotThrow(function () {

View File

@@ -8,16 +8,43 @@ const fixtures = require('../fixtures');
describe('Rotation', function () {
['Landscape', 'Portrait'].forEach(function (orientation) {
[1, 2, 3, 4, 5, 6, 7, 8].forEach(function (exifTag) {
-it('Input image has Orientation EXIF tag value of (' + exifTag + '), auto-rotate', function (done) {
-sharp(fixtures['inputJpgWith' + orientation + 'Exif' + exifTag])
+const input = fixtures[`inputJpgWith${orientation}Exif${exifTag}`];
+const expectedOutput = fixtures.expected(`${orientation}_${exifTag}-out.jpg`);
it(`Auto-rotate ${orientation} image with EXIF Orientation ${exifTag}`, function (done) {
const [expectedWidth, expectedHeight] = orientation === 'Landscape' ? [600, 450] : [450, 600];
sharp(input)
.rotate()
.resize(320)
.toBuffer(function (err, data, info) {
if (err) throw err;
-assert.strictEqual('jpeg', info.format);
-assert.strictEqual(320, info.width);
-assert.strictEqual(orientation === 'Landscape' ? 240 : 427, info.height);
-fixtures.assertSimilar(fixtures.expected(orientation + '_' + exifTag + '-out.jpg'), data, done);
+assert.strictEqual(info.width, expectedWidth);
+assert.strictEqual(info.height, expectedHeight);
+fixtures.assertSimilar(expectedOutput, data, done);
+});
});
it(`Auto-rotate then resize ${orientation} image with EXIF Orientation ${exifTag}`, function (done) {
const [expectedWidth, expectedHeight] = orientation === 'Landscape' ? [320, 240] : [320, 427];
sharp(input)
.rotate()
.resize({ width: 320 })
.toBuffer(function (err, data, info) {
if (err) throw err;
assert.strictEqual(info.width, expectedWidth);
assert.strictEqual(info.height, expectedHeight);
fixtures.assertSimilar(expectedOutput, data, done);
});
});
it(`Resize then auto-rotate ${orientation} image with EXIF Orientation ${exifTag}`, function (done) {
const [expectedWidth, expectedHeight] = orientation === 'Landscape'
? (exifTag < 5) ? [320, 240] : [320, 240]
: [320, 427];
sharp(input)
.resize({ width: 320 })
.rotate()
.toBuffer(function (err, data, info) {
if (err) throw err;
assert.strictEqual(info.width, expectedWidth);
assert.strictEqual(info.height, expectedHeight);
fixtures.assertSimilar(expectedOutput, data, done);
});
});
});
@@ -40,7 +67,7 @@ describe('Rotation', function () {
it('Rotate by 30 degrees with solid background', function (done) {
sharp(fixtures.inputJpg)
.resize(320)
-.rotate(30, { background: { r: 255, g: 0, b: 0, alpha: 0.5 } })
+.rotate(30, { background: { r: 255, g: 0, b: 0 } })
.toBuffer(function (err, data, info) {
if (err) throw err;
assert.strictEqual('jpeg', info.format);
@@ -406,4 +433,41 @@ describe('Rotation', function () {
fixtures.assertSimilar(fixtures.expected('rotate-and-flop.jpg'), data, done);
});
});
it('Auto-rotate and shrink-on-load', async () => {
const [r, g, b] = await sharp(fixtures.inputJpgWithLandscapeExif3)
.rotate()
.resize(8)
.raw()
.toBuffer();
assert.strictEqual(r, 60);
assert.strictEqual(g, 73);
assert.strictEqual(b, 52);
});
it('Flip and rotate ordering', async () => {
const [r, g, b] = await sharp(fixtures.inputJpgWithPortraitExif5)
.flip()
.rotate(90)
.raw()
.toBuffer();
assert.strictEqual(r, 55);
assert.strictEqual(g, 65);
assert.strictEqual(b, 31);
});
it('Flip, rotate and resize ordering', async () => {
const [r, g, b] = await sharp(fixtures.inputJpgWithPortraitExif5)
.flip()
.rotate(90)
.resize(449)
.raw()
.toBuffer();
assert.strictEqual(r, 54);
assert.strictEqual(g, 64);
assert.strictEqual(b, 30);
});
});

View File

@@ -110,32 +110,32 @@ describe('Sharpen', function () {
it('invalid options.sigma', () => assert.throws(
() => sharp().sharpen({ sigma: -1 }),
-/Expected number between 0\.01 and 10000 for options\.sigma but received -1 of type number/
+/Expected number between 0\.000001 and 10000 for options\.sigma but received -1 of type number/
));
it('invalid options.m1', () => assert.throws(
() => sharp().sharpen({ sigma: 1, m1: -1 }),
-/Expected number between 0 and 10000 for options\.m1 but received -1 of type number/
+/Expected number between 0 and 1000000 for options\.m1 but received -1 of type number/
));
it('invalid options.m2', () => assert.throws(
() => sharp().sharpen({ sigma: 1, m2: -1 }),
-/Expected number between 0 and 10000 for options\.m2 but received -1 of type number/
+/Expected number between 0 and 1000000 for options\.m2 but received -1 of type number/
));
it('invalid options.x1', () => assert.throws(
() => sharp().sharpen({ sigma: 1, x1: -1 }),
-/Expected number between 0 and 10000 for options\.x1 but received -1 of type number/
+/Expected number between 0 and 1000000 for options\.x1 but received -1 of type number/
));
it('invalid options.y2', () => assert.throws(
() => sharp().sharpen({ sigma: 1, y2: -1 }),
-/Expected number between 0 and 10000 for options\.y2 but received -1 of type number/
+/Expected number between 0 and 1000000 for options\.y2 but received -1 of type number/
));
it('invalid options.y3', () => assert.throws(
() => sharp().sharpen({ sigma: 1, y3: -1 }),
-/Expected number between 0 and 10000 for options\.y3 but received -1 of type number/
+/Expected number between 0 and 1000000 for options\.y3 but received -1 of type number/
));
it('sharpened image is larger than non-sharpened', function (done) {

View File

@@ -730,4 +730,9 @@ describe('Image Stats', function () {
done();
});
});
it('Sequential read option is ignored', async () => {
const { isOpaque } = await sharp(fixtures.inputJpg, { sequentialRead: true }).stats();
assert.strictEqual(isOpaque, true);
});
});

View File

@@ -48,9 +48,8 @@ describe('Text to image', () => {
if (err) throw err;
assert.strictEqual('png', info.format);
assert.strictEqual(3, info.channels);
-assert.ok(info.width > 10 && info.width <= maxWidth);
-assert.ok(info.height > 10 && info.height <= maxHeight);
-assert.ok(Math.abs(info.width - maxWidth) < 50);
+assert.ok(info.width <= maxWidth);
+assert.ok(info.height <= maxHeight);
assert.ok(info.textAutofitDpi > 0);
done();
@@ -175,7 +174,7 @@ describe('Text to image', () => {
});
it('fontfile input', function () {
-// Added for code coverage
+assert.doesNotThrow(function () {
sharp({
text: {
text: 'text',
@@ -183,6 +182,7 @@ describe('Text to image', () => {
}
});
});
});
it('bad font input', function () {
assert.throws(function () {

View File

@@ -124,6 +124,32 @@ describe('Trim borders', function () {
assert.strictEqual(trimOffsetLeft, -13);
});
it('Ensure greyscale image can be trimmed', async () => {
const greyscale = await sharp({
create: {
width: 16,
height: 8,
channels: 3,
background: 'silver'
}
})
.extend({ left: 12, right: 24, background: 'gray' })
.toColourspace('b-w')
.png({ compressionLevel: 0 })
.toBuffer();
const { info } = await sharp(greyscale)
.trim()
.raw()
.toBuffer({ resolveWithObject: true });
const { width, height, trimOffsetTop, trimOffsetLeft } = info;
assert.strictEqual(width, 16);
assert.strictEqual(height, 8);
assert.strictEqual(trimOffsetTop, 0);
assert.strictEqual(trimOffsetLeft, -12);
});
it('Ensure trim of image with all pixels same is no-op', async () => {
const { info } = await sharp({
create: {