Compare commits

115 Commits

Author SHA1 Message Date
Lovell Fuller
aea368a3a0 Release v0.32.4 2023-07-21 11:41:08 +01:00
Lovell Fuller
7ecbc20d3d Upgrade to libvips v8.14.3 2023-07-21 11:10:21 +01:00
Lovell Fuller
cb0e2a91c4 Bump dep 2023-07-19 16:55:12 +01:00
Lovell Fuller
739b317a6f Expose ability to (un)block libvips ops by name 2023-07-19 16:53:52 +01:00
Lovell Fuller
a0e1c39785 Release v0.32.3 2023-07-14 11:03:39 +01:00
Lovell Fuller
85b26dab68 Expose preset option for WebP output #3639 2023-07-12 19:12:04 +01:00
Lovell Fuller
66f7cef253 Docs: fix a few typos 2023-07-12 14:22:29 +01:00
Lovell Fuller
863174f201 CI: FreeBSD: Use 13.2 stable, upgrade to Node.js 20 2023-07-12 12:10:31 +01:00
Lovell Fuller
bcd865cc96 Ensure decoding remains sequential for all ops #3725 2023-07-12 11:35:59 +01:00
Lovell Fuller
16ea04fe80 Release v0.32.2 2023-07-11 11:47:37 +01:00
Lovell Fuller
9c547dc321 Use copy rather than cache to prevent affine overcompute
More predictable behaviour, see commit 14c3346 for context
2023-07-10 13:56:42 +01:00
Lovell Fuller
5522060e9e Limit HEIF output dimensions to 16384x16384
This is a slightly breaking change to sync with the behaviour of a
forthcoming libvips patch release. It also matches the libavif
limit. Having an arbitrary limit is safer than no limit.
2023-07-10 10:24:14 +01:00
Lovell Fuller
d2f0fa855b Tests: loosen threshold for affine rotate then extract
Ignores rounding errors under Rosetta emulation
2023-07-10 08:12:13 +01:00
Lovell Fuller
2bb3ea8170 Bump deps 2023-07-09 11:57:05 +01:00
Lovell Fuller
3434eef5b9 Guard use of smartcrop premultiplied option #3710 2023-07-09 09:57:20 +01:00
Lovell Fuller
2f67823c3d Allow seq read for EXIF-based auto-orient #3725 2023-07-09 09:26:58 +01:00
Lovell Fuller
38c760cdd7 Update to latest (temporary) prebuild patch 2023-07-09 09:10:24 +01:00
Lovell Fuller
14c3346800 Prevent over-compute in affine rotate #3722 2023-07-09 09:04:07 +01:00
Lovell Fuller
0da55bab7e Tests: remove unused dependency 2023-06-29 09:11:25 +01:00
Lovell Fuller
cfb659f576 Bump deps 2023-06-23 08:19:41 +01:00
Lovell Fuller
cc5ac5385f Docs: clarify use of extract before composite 2023-06-23 08:08:39 +01:00
Lovell Fuller
93fafb0c18 CI: Upgrade to latest git v2 within centos 7 containers 2023-06-05 12:32:47 +01:00
Lovell Fuller
41e3c8ca09 Temporarily use patched prebuild with node-gyp v9 2023-06-05 09:45:12 +01:00
Lovell Fuller
da61ea0199 Docs: changelog and credit for #3674 2023-06-05 09:35:07 +01:00
BJJ
7e6a70af44 Improve detection of jp2 filename extensions #3674 2023-06-05 09:31:25 +01:00
Lovell Fuller
f5845c7e61 Ensure exceptions are not thrown when terminating #3569 2023-06-03 11:51:44 +01:00
Lovell Fuller
eb1e53db83 Bump deps 2023-06-03 11:51:12 +01:00
Lovell Fuller
3340120aea Types: include base input options for composite #3669 2023-05-16 13:55:28 +01:00
Lovell Fuller
de0fc07092 Ensure same access method for all inputs #3669 2023-05-16 13:53:31 +01:00
Lovell Fuller
dc4b39f73f Docs: multi-page images cannot be flipped 2023-05-13 08:54:21 +01:00
Lovell Fuller
e873978e53 Docs: clarify which axis is used when mirroring 2023-05-11 10:24:24 +01:00
Lovell Fuller
5255964c79 Docs: ensure headings with digits appear 2023-04-27 10:23:28 +01:00
Lovell Fuller
dea319daf6 Release v0.32.1 2023-04-27 09:58:40 +01:00
Lovell Fuller
a2ca678854 Docs: clarify text align applies to multi-line 2023-04-27 09:00:11 +01:00
Lovell Fuller
e98993a6e2 Bump node-addon-api for Buffer::NewOrCopy 2023-04-23 15:43:54 +01:00
Lovell Fuller
90abd927c9 Install: coerce libc version to semver (refactor) 2023-04-23 11:54:41 +01:00
Lovell Fuller
4d7957a043 Install: coerce libc version to semver #3641 2023-04-23 11:37:43 +01:00
Lovell Fuller
bf9bb56367 Docs: fix affine interpolator example 2023-04-22 13:56:33 +01:00
Lovell Fuller
8408e99aa3 Ensure trim op works with CMYK input #3636 2023-04-20 10:49:39 +01:00
Lovell Fuller
a39f959dcc Docs: add security policy
- Latest version is supported
- Report vulnerabilities via e-mail
2023-04-20 10:46:04 +01:00
Lovell Fuller
d08baa20e6 Install: log possible error when removing vendor dir 2023-04-19 11:06:16 +01:00
Lovell Fuller
391018ad3d Bump semver dep 2023-04-19 11:04:03 +01:00
Lovell Fuller
afed876f90 Docs: ensure inclusion of jp2 function
A misplaced code coverage comment was preventing this.
See ef849fd for the original commit where this broke.
2023-04-17 20:55:12 +01:00
Lovell Fuller
d6b60a60c6 Docs: add example of how to set EXIF GPS metadata 2023-04-17 20:35:47 +01:00
Lovell Fuller
5f8646d937 Support modulate op with non-sRGB pipeline colourspace #3620 2023-04-17 19:53:48 +01:00
Lovell Fuller
b763801d68 Ensure profile-less CMYK roundtrip skips space conv #3620 2023-04-11 20:31:57 +01:00
Lovell Fuller
2e0f789c9b Tests: add retries to text test suite
as font discovery is occasionally slow
in Windows CI environment.
2023-04-09 21:42:09 +01:00
Lovell Fuller
a8645f0f38 Smartcrop performance can take future advantage of
https://github.com/libvips/libvips/commit/de43eea
2023-04-09 21:17:08 +01:00
Lovell Fuller
7b58ad9360 Docs: changelog entry for #3615 2023-04-07 12:23:21 +01:00
TomWis97
9ebbcc3701 Logging: fix notation of proxy URL (#3615) 2023-04-07 12:19:04 +01:00
Lovell Fuller
e87204b92c Doc update and changelog entry for #3461 2023-04-07 11:21:15 +01:00
Anton Marsden
a4c6eba7d4 Add unflatten operation to create an alpha channel (#3461) 2023-04-07 11:01:29 +01:00
Lovell Fuller
b9c3851515 Ensure linear op works with 16-bit input #3605 2023-04-01 12:08:14 +01:00
Lovell Fuller
97cf69c26a Ensure use of flip op forces random access read #3600 2023-03-31 09:04:22 +01:00
Lovell Fuller
d5be024bfd Bump devDeps 2023-03-28 14:39:25 +01:00
Lovell Fuller
de01fc44e7 Docs: ensure API fn name linking is consistent 2023-03-28 14:00:52 +01:00
Lovell Fuller
ca102ebd6c Docs: fix perf result copypasta
An ARM64 value was incorrectly using an AMD64 result
2023-03-28 12:08:07 +01:00
Lovell Fuller
b9d4c30a9f Release v0.32.0 2023-03-24 17:05:59 +00:00
Lovell Fuller
148760fe55 Docs: clarify resize reduction/enlargement options refer to scaling
Types: options can be passed as first resize parameter
2023-03-24 15:19:21 +00:00
Lovell Fuller
98ed237734 Docs: use only first year of copyright to match code 2023-03-24 09:59:36 +00:00
Lovell Fuller
b55e58f31e Trim space from end of libvips error messages 2023-03-24 09:58:21 +00:00
Lovell Fuller
0af070ed93 Docs: update performance results, include PNG-based task 2023-03-23 18:57:43 +00:00
Lovell Fuller
9fbb4fcaef Tests: bump benchmark deps 2023-03-22 11:03:16 +00:00
Lovell Fuller
6008ff8a08 Docs: tile-based output requires libgsf 2023-03-22 09:17:07 +00:00
Lovell Fuller
cd5e11bd50 Docs: ensure parameters are indexed as they now appear in a table 2023-03-22 09:04:54 +00:00
Lovell Fuller
08d6822265 Upgrade to libvips v8.14.2 2023-03-21 21:21:24 +00:00
Lovell Fuller
8b8a815fbb Tests: tile-based output optional, will require custom libvips
The prebuilt binaries provided by v0.32.0 will not support
tile-based output, which is (hopefully) a temporary situation
until upstream licensing issues are resolved.
2023-03-21 21:19:56 +00:00
Lovell Fuller
a44da850c1 Docs: add open graph title and image 2023-03-21 12:56:44 +00:00
Lovell Fuller
c5ef4677b1 Bump tsd dep to pick up TypeScript 5 improvements 2023-03-21 12:43:29 +00:00
Lovell Fuller
f8a430bdd3 Tests: reduce CPU cost of RGBA linear test, ~2s faster 2023-03-21 12:42:50 +00:00
Lovell Fuller
cd419a261b Docs: changelog and refresh for #3583 2023-03-21 10:16:31 +00:00
LachlanNewman
d7776e3b98 Add support to normalise for lower and upper percentiles (#3583) 2023-03-21 10:13:12 +00:00
Lovell Fuller
1eefd4e562 Docs: how to provide new integrity values for custom binaries 2023-03-17 09:25:24 +00:00
cychub
0a16d26ec7 Docs: fix sharp_binary_host example by adding version (#3568) 2023-03-12 12:58:08 +00:00
Lovell Fuller
fc03fba602 Docs: clarify metadata ignores chained ops 2023-03-10 13:35:06 +00:00
Lovell Fuller
c87fe512b4 Bump devDeps 2023-03-08 16:59:20 +00:00
Lovell Fuller
2eaab59c48 Docs: add note about API Gateway integration 2023-03-08 16:54:49 +00:00
Lovell Fuller
4ec883eaa0 Wrap all async JS callbacks, help avoid possible race #3569 2023-03-01 12:41:11 +00:00
Lovell Fuller
0063df4d4f Ensure clahe op uses random read, simplify validation 2023-02-28 21:59:31 +00:00
Lovell Fuller
6c61ad256f Ensure all source code files contain SPDX licence 2023-02-28 17:01:58 +00:00
Lovell Fuller
b90474affa Docs: clarify formats that support multi-page/anim 2023-02-28 14:39:49 +00:00
Lovell Fuller
34cbc6dec3 Docs: clarify that paths are relative to process working dir 2023-02-23 10:33:13 +00:00
Lovell Fuller
bb8de0cc26 Docs: refresh search index 2023-02-18 12:51:42 +00:00
Lovell Fuller
863e37455a Docs: changelog and credit for #3556 2023-02-18 12:50:58 +00:00
Tomasz Janowski
6f0e6f2e65 Add support to extend for extendWith, allows copy/mirror/repeat (#3556) 2023-02-17 14:01:24 +00:00
Lovell Fuller
ebf4ccd124 Bump deps 2023-02-12 19:49:32 +00:00
Lovell Fuller
b96c8e8ba4 Tests: use native fs.rm instead of rimraf 2023-02-12 19:32:00 +00:00
Lovell Fuller
42d2f07e44 Add ignoreIcc input option to ignore embedded ICC profile 2023-02-12 17:51:24 +00:00
Lovell Fuller
a2988c9edc macOS: use 10.13 as minimum to match prebuilt libvips
This allows clang to use SSE4 intrinsics
2023-02-12 16:10:21 +00:00
Lovell Fuller
24b3344937 Docs: changelog for #3548 2023-02-05 09:49:06 +00:00
Jérémy Lal
9608f219bd Add support for ArrayBuffer input (#3548) 2023-02-05 09:45:17 +00:00
Pascal Jufer
4798d9da64 Docs: clarify supported bit depth for AVIF images (#3541) 2023-02-02 17:47:07 +00:00
Lovell Fuller
8d8c6b70eb Prefer integer (un)premultiply for faster RGBA resize
Add changelog, loosen modulate test thresholds
2023-01-24 15:44:39 +00:00
Lovell Fuller
9e2207f376 Prefer integer (un)premultiply for faster RGBA resize 2023-01-24 15:24:58 +00:00
Lovell Fuller
802f560b9b Test: update benchmark dependencies 2023-01-24 15:21:31 +00:00
Lovell Fuller
a532659b0f Types: changes/additions relating to new v0.32.0 features
A separate commit is required as these were not part of the
initial definitions in the v0.31.3 snapshot.

From now on, new features and updates can include the relevant
TypeScript definition changes as part of the same
code/docs/tests commits.
2023-01-17 16:11:04 +00:00
Lovell Fuller
25c6da2bcd Docs: add a couple of missing params/props 2023-01-17 15:01:52 +00:00
Lovell Fuller
02f855d57a Expose own version as sharp.versions.sharp #3471 2023-01-17 09:56:58 +00:00
Lovell Fuller
c150263ef1 Respect fastShrinkOnLoad option for WebP input #3516 2023-01-17 09:39:23 +00:00
Lovell Fuller
9f79f80a93 Docs: fastShrinkOnLoad can round-down when auto-scaling 2023-01-16 12:06:50 +00:00
Lovell Fuller
069803b83d Docs: remove Heroku install section 2023-01-16 12:06:08 +00:00
Lovell Fuller
f79760b4f2 Docs: changelog and help for TypeScript defs #3369 #3370 2023-01-16 11:12:00 +00:00
Espen Hovlandsdal
aa5f0f4e40 Include and publish TypeScript definitions (#3370)
Definitions are a snapshot taken from `@types/sharp`,
which remain under the terms of MIT licensing.
2023-01-16 10:48:37 +00:00
Lovell Fuller
286a322622 Docs: changelog and doc refresh for #3470 2023-01-16 09:27:31 +00:00
Emanuel Jöbstl
6d404f4d2c Add coords to output when using attention based crop (#3470) 2023-01-16 09:20:42 +00:00
Lovell Fuller
bdc50e1d6e Unpin node-addon-api, cast CallbackInfo access to size_t
See https://github.com/nodejs/node-addon-api/pull/1253
2023-01-16 09:00:42 +00:00
Lovell Fuller
a9bd0e79f8 Pin node-addon-api to workaround possible bug in 5.1.0 2023-01-15 19:35:27 +00:00
Lovell Fuller
a1e464cc5e Switch to sequential read as default where possible 2023-01-15 18:43:50 +00:00
Lovell Fuller
081debd055 Reduce sharpen op max sigma from 10000 to 10 #3521 2023-01-10 16:29:40 +00:00
Lovell Fuller
ef849fd639 Docs: switch to well-maintained jsdoc2md for JSDoc parsing 2023-01-08 10:15:38 +00:00
Lovell Fuller
a42a975c46 Bump devDeps 2023-01-06 19:25:27 +00:00
Lovell Fuller
e8273580af Docs: add note about use of fastShrinkOnLoad with resize kernel 2023-01-06 19:24:32 +00:00
Lovell Fuller
5be36c2deb Install: log Rosetta detection, improve related docs 2023-01-04 21:11:21 +00:00
Kleis Auke Wolthuizen
6cda090ce2 Tests: remove ICC profile from CIELAB fixture (#3510)
This ICC profile is considered incompatible with this image.

See: https://github.com/libvips/libvips/issues/730
2023-01-01 21:12:29 +00:00
Lovell Fuller
eac6e8b261 Upgrade to libvips v8.14.0-rc1
- Replace GIF 'optimise' option with 'reuse'
- Add 'progressive' option to GIF
- Add 'wrap' option to text creation
- Add 'formatMagick' property to *magick input metadata
2022-12-29 15:53:50 +00:00
171 changed files with 5801 additions and 2530 deletions

@@ -1,5 +1,5 @@
 freebsd_instance:
-  image_family: freebsd-14-0-snap
+  image_family: freebsd-13-2
 
 task:
   name: FreeBSD
@@ -9,7 +9,7 @@ task:
   prerequisites_script:
     - pkg update -f
    - pkg upgrade -y
-    - pkg install -y devel/pkgconf graphics/vips www/node16 www/npm
+    - pkg install -y devel/git devel/pkgconf graphics/vips www/node20 www/npm
  install_script:
    - npm install --build-from-source --unsafe-perm
  test_script:

@@ -33,6 +33,7 @@ To test C++ changes, you can compile the module using `npm install --build-from-
 Please add JavaScript [unit tests](https://github.com/lovell/sharp/tree/main/test/unit) to cover your new feature.
 A test coverage report for the JavaScript code is generated in the `coverage/lcov-report` directory.
+Please also update the [TypeScript definitions](https://github.com/lovell/sharp/tree/main/lib/index.d.ts), along with the [type definition tests](https://github.com/lovell/sharp/tree/main/test/types/sharp.test-d.ts).
 Where possible, the functional tests use gradient-based perceptual hashes
 based on [dHash](http://www.hackerfactor.com/blog/index.php?/archives/529-Kind-of-Like-That.html)

.github/SECURITY.md (new file, 18 lines)

@@ -0,0 +1,18 @@
+# Security Policy
+
+## Supported Versions
+
+The latest version of `sharp` as published to npm
+and reported by `npm view sharp dist-tags.latest`
+is supported with security updates.
+
+## Reporting a Vulnerability
+
+Please use
+[e-mail](https://github.com/lovell/sharp/blob/main/package.json#L5)
+to report a vulnerability.
+
+You can expect a response within 48 hours
+if you are a human reporting a genuine issue.
+
+Thank you in advance.

@@ -66,6 +66,7 @@ jobs:
       if: contains(matrix.container, 'centos')
       run: |
         curl -sL https://rpm.nodesource.com/setup_${{ matrix.nodejs_version }}.x | bash -
+        yum install -y https://packages.endpointdev.com/rhel/7/os/x86_64/endpoint-repo.x86_64.rpm
         yum install -y centos-release-scl
         yum install -y devtoolset-11-gcc-c++ make git python3 nodejs fontconfig google-noto-sans-fonts
         echo "/opt/rh/devtoolset-11/root/usr/bin" >> $GITHUB_PATH

.gitignore (1 line added)

@@ -3,6 +3,7 @@ node_modules
 /coverage
 test/bench/node_modules
 test/fixtures/output*
+test/fixtures/vips-properties.xml
 test/leak/libvips.supp
 test/saliency/report.json
 test/saliency/Image*

@@ -98,11 +98,9 @@ readableStream
 A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md)
 covers reporting bugs, requesting features and submitting code changes.
 
-[![Node-API v5](https://img.shields.io/badge/Node--API-v5-green.svg)](https://nodejs.org/dist/latest/docs/api/n-api.html#n_api_n_api_version_matrix)
-
 ## Licensing
 
-Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Lovell Fuller and contributors.
+Copyright 2013 Lovell Fuller and others.
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.

@@ -70,7 +70,9 @@
     }, {
       'target_name': 'sharp-<(platform_and_arch)',
       'defines': [
-        'NAPI_VERSION=7'
+        'NAPI_VERSION=7',
+        'NODE_ADDON_API_DISABLE_DEPRECATED',
+        'NODE_API_SWALLOW_UNTHROWABLE_EXCEPTIONS'
       ],
       'dependencies': [
         '<!(node -p "require(\'node-addon-api\').gyp")',
@@ -179,7 +181,7 @@
       ],
       'xcode_settings': {
         'CLANG_CXX_LANGUAGE_STANDARD': 'c++11',
-        'MACOSX_DEPLOYMENT_TARGET': '10.9',
+        'MACOSX_DEPLOYMENT_TARGET': '10.13',
         'GCC_ENABLE_CPP_EXCEPTIONS': 'YES',
         'GCC_ENABLE_CPP_RTTI': 'YES',
         'OTHER_CPLUSPLUSFLAGS': [

@@ -74,7 +74,7 @@ covers reporting bugs, requesting features and submitting code changes.
 
 ### Licensing
 
-Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Lovell Fuller and contributors.
+Copyright 2013 Lovell Fuller and others.
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.

@@ -1,14 +1,11 @@
-<!-- Generated by documentation.js. Update this documentation by updating the source code. -->
 ## removeAlpha
 Remove alpha channel, if any. This is a no-op if the image does not have an alpha channel.
-See also [flatten][1].
+See also [flatten](/api-operation#flatten).
-### Examples
-```javascript
+**Example**
+```js
 sharp('rgba.png')
   .removeAlpha()
   .toFile('rgb.png', function(err, info) {
@@ -16,61 +13,62 @@ sharp('rgba.png')
   });
 ```
-Returns **Sharp**&#x20;
 ## ensureAlpha
 Ensure the output image has an alpha transparency channel.
 If missing, the added alpha channel will have the specified
 transparency level, defaulting to fully-opaque (1).
 This is a no-op if the image already has an alpha channel.
-### Parameters
-* `alpha` **[number][2]** alpha transparency level (0=fully-transparent, 1=fully-opaque) (optional, default `1`)
-### Examples
-```javascript
+**Throws**:
+- <code>Error</code> Invalid alpha transparency level
+**Since**: 0.21.2
+| Param | Type | Default | Description |
+| --- | --- | --- | --- |
+| [alpha] | <code>number</code> | <code>1</code> | alpha transparency level (0=fully-transparent, 1=fully-opaque) |
+**Example**
+```js
 // rgba.png will be a 4 channel image with a fully-opaque alpha channel
 await sharp('rgb.jpg')
   .ensureAlpha()
   .toFile('rgba.png')
 ```
+**Example**
-```javascript
+```js
 // rgba is a 4 channel image with a fully-transparent alpha channel
 const rgba = await sharp(rgb)
   .ensureAlpha(0)
   .toBuffer();
 ```
-* Throws **[Error][3]** Invalid alpha transparency level
-Returns **Sharp**&#x20;
-**Meta**
-* **since**: 0.21.2
 ## extractChannel
 Extract a single channel from a multi-channel image.
-### Parameters
-* `channel` **([number][2] | [string][4])** zero-indexed channel/band number to extract, or `red`, `green`, `blue` or `alpha`.
-### Examples
-```javascript
+**Throws**:
+- <code>Error</code> Invalid channel
+| Param | Type | Description |
+| --- | --- | --- |
+| channel | <code>number</code> \| <code>string</code> | zero-indexed channel/band number to extract, or `red`, `green`, `blue` or `alpha`. |
+**Example**
+```js
 // green.jpg is a greyscale image containing the green channel of the input
 await sharp(input)
   .extractChannel('green')
   .toFile('green.jpg');
 ```
+**Example**
-```javascript
+```js
 // red1 is the red value of the first pixel, red2 the second pixel etc.
 const [red1, red2, ...] = await sharp(input)
   .extractChannel(0)
@@ -78,45 +76,46 @@ const [red1, red2, ...] = await sharp(input)
   .toBuffer();
 ```
-* Throws **[Error][3]** Invalid channel
-Returns **Sharp**&#x20;
 ## joinChannel
 Join one or more channels to the image.
 The meaning of the added channels depends on the output colourspace, set with `toColourspace()`.
 By default the output image will be web-friendly sRGB, with additional channels interpreted as alpha channels.
 Channel ordering follows vips convention:
-* sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha.
-* CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha.
+- sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha.
+- CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha.
 Buffers may be any of the image formats supported by sharp.
 For raw pixel input, the `options` object should contain a `raw` attribute, which follows the format of the attribute of the same name in the `sharp()` constructor.
-### Parameters
-* `images` **([Array][5]<([string][4] | [Buffer][6])> | [string][4] | [Buffer][6])** one or more images (file paths, Buffers).
-* `options` **[Object][7]** image options, see `sharp()` constructor.
-<!---->
+**Throws**:
+- <code>Error</code> Invalid parameters
+| Param | Type | Description |
+| --- | --- | --- |
+| images | <code>Array.&lt;(string\|Buffer)&gt;</code> \| <code>string</code> \| <code>Buffer</code> | one or more images (file paths, Buffers). |
+| options | <code>Object</code> | image options, see `sharp()` constructor. |
-* Throws **[Error][3]** Invalid parameters
-Returns **Sharp**&#x20;
 ## bandbool
 Perform a bitwise boolean operation on all input image channels (bands) to produce a single channel output image.
-### Parameters
-* `boolOp` **[string][4]** one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively.
-### Examples
-```javascript
+**Throws**:
+- <code>Error</code> Invalid parameters
+| Param | Type | Description |
+| --- | --- | --- |
+| boolOp | <code>string</code> | one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. |
+**Example**
+```js
 sharp('3-channel-rgb-input.png')
   .bandbool(sharp.bool.and)
   .toFile('1-channel-output.png', function (err, info) {
@@ -124,22 +123,4 @@ sharp('3-channel-rgb-input.png')
     // If `I(1,1) = [247, 170, 14] = [0b11110111, 0b10101010, 0b00001111]`
     // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`.
   });
 ```
-* Throws **[Error][3]** Invalid parameters
-Returns **Sharp**&#x20;
-[1]: /api-operation#flatten
-[2]: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/Number
-[3]: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/Error
-[4]: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/String
-[5]: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/Array
-[6]: https://nodejs.org/api/buffer.html
-[7]: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/Object

View File

@@ -1,28 +1,26 @@
<!-- Generated by documentation.js. Update this documentation by updating the source code. -->
## tint ## tint
Tint the image using the provided chroma while preserving the image luminance. Tint the image using the provided chroma while preserving the image luminance.
An alpha channel may be present and will be unchanged by the operation. An alpha channel may be present and will be unchanged by the operation.
### Parameters
* `rgb` **([string][1] | [Object][2])** parsed by the [color][3] module to extract chroma values. **Throws**:
### Examples - <code>Error</code> Invalid parameter
```javascript
| Param | Type | Description |
| --- | --- | --- |
| rgb | <code>string</code> \| <code>Object</code> | parsed by the [color](https://www.npmjs.org/package/color) module to extract chroma values. |
**Example**
```js
const output = await sharp(input) const output = await sharp(input)
.tint({ r: 255, g: 240, b: 16 }) .tint({ r: 255, g: 240, b: 16 })
.toBuffer(); .toBuffer();
``` ```
* Throws **[Error][4]** Invalid parameter
Returns **Sharp**&#x20;
## greyscale ## greyscale
Convert to 8-bit greyscale; 256 shades of grey. Convert to 8-bit greyscale; 256 shades of grey.
This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results. This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results.
By default the output image will be web-friendly sRGB and contain three (identical) color channels. By default the output image will be web-friendly sRGB and contain three (identical) color channels.
@@ -30,44 +28,51 @@ This may be overridden by other sharp operations such as `toColourspace('b-w')`,
which will produce an output image containing one color channel. which will produce an output image containing one color channel.
An alpha channel may be present, and will be unchanged by the operation. An alpha channel may be present, and will be unchanged by the operation.
### Parameters
* `greyscale` **[Boolean][5]** (optional, default `true`)
### Examples | Param | Type | Default |
| --- | --- | --- |
| [greyscale] | <code>Boolean</code> | <code>true</code> |
```javascript **Example**
```js
const output = await sharp(input).greyscale().toBuffer(); const output = await sharp(input).greyscale().toBuffer();
``` ```
Returns **Sharp**&#x20;
## grayscale ## grayscale
Alternative spelling of `greyscale`. Alternative spelling of `greyscale`.
### Parameters
* `grayscale` **[Boolean][5]** (optional, default `true`)
Returns **Sharp**&#x20; | Param | Type | Default |
| --- | --- | --- |
| [grayscale] | <code>Boolean</code> | <code>true</code> |
## pipelineColourspace ## pipelineColourspace
Set the pipeline colourspace. Set the pipeline colourspace.
The input image will be converted to the provided colourspace at the start of the pipeline. The input image will be converted to the provided colourspace at the start of the pipeline.
All operations will use this colourspace before converting to the output colourspace, as defined by [toColourspace][6]. All operations will use this colourspace before converting to the output colourspace,
as defined by [toColourspace](#tocolourspace).
This feature is experimental and has not yet been fully-tested with all operations. This feature is experimental and has not yet been fully-tested with all operations.
### Parameters
* `colourspace` **[string][1]?** pipeline colourspace e.g. `rgb16`, `scrgb`, `lab`, `grey16` [...][7] **Throws**:
### Examples - <code>Error</code> Invalid parameters
```javascript **Since**: 0.29.0
| Param | Type | Description |
| --- | --- | --- |
| [colourspace] | <code>string</code> | pipeline colourspace e.g. `rgb16`, `scrgb`, `lab`, `grey16` [...](https://github.com/libvips/libvips/blob/41cff4e9d0838498487a00623462204eb10ee5b8/libvips/iofuncs/enumtypes.c#L774) |
**Example**
```js
// Run pipeline in 16 bits per channel RGB while converting final result to 8 bits per channel sRGB. // Run pipeline in 16 bits per channel RGB while converting final result to 8 bits per channel sRGB.
await sharp(input) await sharp(input)
.pipelineColourspace('rgb16') .pipelineColourspace('rgb16')
@@ -75,76 +80,54 @@ await sharp(input)
.toFile('16bpc-pipeline-to-8bpc-output.png') .toFile('16bpc-pipeline-to-8bpc-output.png')
``` ```
* Throws **[Error][4]** Invalid parameters
Returns **Sharp**&#x20;
**Meta**
* **since**: 0.29.0
## pipelineColorspace ## pipelineColorspace
Alternative spelling of `pipelineColourspace`. Alternative spelling of `pipelineColourspace`.
### Parameters
* `colorspace` **[string][1]?** pipeline colorspace. **Throws**:
<!----> - <code>Error</code> Invalid parameters
| Param | Type | Description |
| --- | --- | --- |
| [colorspace] | <code>string</code> | pipeline colorspace. |
* Throws **[Error][4]** Invalid parameters
Returns **Sharp**&#x20;
## toColourspace
Set the output colourspace.
By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Description |
| --- | --- | --- |
| [colourspace] | <code>string</code> | output colourspace e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/libvips/libvips/blob/3c0bfdf74ce1dc37a6429bed47fa76f16e2cd70a/libvips/iofuncs/enumtypes.c#L777-L794) |

**Example**
```js
// Output 16 bits per pixel RGB
await sharp(input)
  .toColourspace('rgb16')
  .toFile('16-bpp.png')
```

## toColorspace
Alternative spelling of `toColourspace`.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Description |
| --- | --- | --- |
| [colorspace] | <code>string</code> | output colorspace. |
## composite
Composite image(s) over the processed (resized, extracted etc.) image.
The images to composite must be the same size or smaller than the processed image.
If both `top` and `left` options are provided, they take precedence over `gravity`.

Any resize, rotate or extract operations in the same processing pipeline
will always be applied to the input image before composition.

The `blend` option can be one of `clear`, `source`, `over`, `in`, `out`, `atop`,
`hard-light`, `soft-light`, `difference`, `exclusion`.

More information about blend modes can be found at
https://www.libvips.org/API/current/libvips-conversion.html#VipsBlendMode
and https://www.cairographics.org/operators/
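The default `over` mode follows Porter-Duff alpha compositing. As an illustrative sketch only (sharp performs the blend inside libvips, not in JavaScript), `over` on premultiplied pixels works like this:

```javascript
// Porter-Duff "over": composite a premultiplied source pixel over a
// destination pixel. Channels are premultiplied colour values in 0..1.
// Illustrative only - not sharp's implementation.
function over(src, dst) {
  const a = src.a + dst.a * (1 - src.a);
  return {
    r: src.r + dst.r * (1 - src.a),
    g: src.g + dst.g * (1 - src.a),
    b: src.b + dst.b * (1 - src.a),
    a
  };
}

// An opaque source completely hides the destination
console.log(over({ r: 1, g: 0, b: 0, a: 1 }, { r: 0, g: 0, b: 1, a: 1 }));
```

A fully transparent source leaves the destination unchanged, which is why `over` is a safe default for watermark-style overlays.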
**Throws**:

- <code>Error</code> Invalid parameters

**Since**: 0.22.0

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| images | <code>Array.&lt;Object&gt;</code> | | Ordered list of images to composite |
| [images[].input] | <code>Buffer</code> \| <code>String</code> | | Buffer containing image data, String containing the path to an image file, or Create object (see below) |
| [images[].input.create] | <code>Object</code> | | describes a blank overlay to be created. |
| [images[].input.create.width] | <code>Number</code> | | |
| [images[].input.create.height] | <code>Number</code> | | |
| [images[].input.create.channels] | <code>Number</code> | | 3-4 |
| [images[].input.create.background] | <code>String</code> \| <code>Object</code> | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
| [images[].input.text] | <code>Object</code> | | describes a new text image to be created. |
| [images[].input.text.text] | <code>string</code> | | text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. |
| [images[].input.text.font] | <code>string</code> | | font name to render with. |
| [images[].input.text.fontfile] | <code>string</code> | | absolute filesystem path to a font file that can be used by `font`. |
| [images[].input.text.width] | <code>number</code> | <code>0</code> | integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. |
| [images[].input.text.height] | <code>number</code> | <code>0</code> | integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. |
| [images[].input.text.align] | <code>string</code> | <code>&quot;&#x27;left&#x27;&quot;</code> | text alignment (`'left'`, `'centre'`, `'center'`, `'right'`). |
| [images[].input.text.justify] | <code>boolean</code> | <code>false</code> | set this to true to apply justification to the text. |
| [images[].input.text.dpi] | <code>number</code> | <code>72</code> | the resolution (size) at which to render the text. Does not take effect if `height` is specified. |
| [images[].input.text.rgba] | <code>boolean</code> | <code>false</code> | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`. |
| [images[].input.text.spacing] | <code>number</code> | <code>0</code> | text line height in points. Will use the font line height if none is specified. |
| [images[].blend] | <code>String</code> | <code>&#x27;over&#x27;</code> | how to blend this image with the image below. |
| [images[].gravity] | <code>String</code> | <code>&#x27;centre&#x27;</code> | gravity at which to place the overlay. |
| [images[].top] | <code>Number</code> | | the pixel offset from the top edge. |
| [images[].left] | <code>Number</code> | | the pixel offset from the left edge. |
| [images[].tile] | <code>Boolean</code> | <code>false</code> | set to true to repeat the overlay image across the entire image with the given `gravity`. |
| [images[].premultiplied] | <code>Boolean</code> | <code>false</code> | set to true to avoid premultiplying the image below. Equivalent to the `--premultiplied` vips option. |
| [images[].density] | <code>Number</code> | <code>72</code> | number representing the DPI for vector overlay image. |
| [images[].raw] | <code>Object</code> | | describes overlay when using raw pixel data. |
| [images[].raw.width] | <code>Number</code> | | |
| [images[].raw.height] | <code>Number</code> | | |
| [images[].raw.channels] | <code>Number</code> | | |
| [images[].animated] | <code>boolean</code> | <code>false</code> | Set to `true` to read all frames/pages of an animated image. |
| [images[].failOn] | <code>string</code> | <code>&quot;&#x27;warning&#x27;&quot;</code> | @see [constructor parameters](/api-constructor#parameters) |
| [images[].limitInputPixels] | <code>number</code> \| <code>boolean</code> | <code>268402689</code> | @see [constructor parameters](/api-constructor#parameters) |
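The precedence rule above (explicit `top`/`left` win over `gravity`) can be sketched with a small hypothetical helper that computes overlay offsets; this is not part of sharp's API, just an illustration of the placement logic:

```javascript
// Hypothetical placement helper mirroring the documented rule:
// explicit top/left take precedence, otherwise gravity decides.
function placeOverlay(base, overlay, { gravity = 'centre', top, left } = {}) {
  if (top !== undefined && left !== undefined) {
    return { top, left };
  }
  const x = gravity.includes('west') ? 0
    : gravity.includes('east') ? base.width - overlay.width
    : Math.floor((base.width - overlay.width) / 2);
  const y = gravity.includes('north') ? 0
    : gravity.includes('south') ? base.height - overlay.height
    : Math.floor((base.height - overlay.height) / 2);
  return { top: y, left: x };
}

// 'southeast' pins the overlay to the bottom-right corner
console.log(placeOverlay({ width: 300, height: 200 }, { width: 100, height: 50 }, { gravity: 'southeast' }));
```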
**Example**
```js
await sharp(background)
  .composite([
    { input: layer1, gravity: 'northwest' },
    { input: layer2, gravity: 'southeast' },
  ])
  .toFile('combined.png');
```
**Example**
```js
const output = await sharp('input.gif', { animated: true })
  .composite([
    { input: 'overlay.png', tile: true, blend: 'saturate' }
  ])
  .toBuffer();
```
**Example**
```js
sharp('input.png')
  .rotate(180)
  .resize(300)
  .flatten({ background: '#ff6600' })
  .composite([{ input: 'overlay.png', gravity: 'southeast' }])
  .sharpen()
  .withMetadata()
  .webp({ quality: 90 })
  .toBuffer()
  .then(function(outputBuffer) {
    // outputBuffer contains upside down, 300px wide, alpha channel flattened
    // onto orange background, composited with overlay.png with SE gravity,
    // sharpened, with metadata, 90% quality WebP image data. Phew!
  });
```
<!-- Generated by documentation.js. Update this documentation by updating the source code. -->
## Sharp ## Sharp
**Emits**: <code>Sharp#event:info</code>, <code>Sharp#event:warning</code>
<a name="new_Sharp_new"></a>
### new
Constructor factory to create an instance of `sharp`, to which further methods are chained.
JPEG, PNG, WebP, GIF, AVIF or TIFF format image data can be streamed out from this object.
When using Stream based output, derived attributes are available from the `info` event.

Non-critical problems encountered during processing are emitted as `warning` events.

Implements the [stream.Duplex](http://nodejs.org/api/stream.html#stream_class_stream_duplex) class.
**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [input] | <code>Buffer</code> \| <code>ArrayBuffer</code> \| <code>Uint8Array</code> \| <code>Uint8ClampedArray</code> \| <code>Int8Array</code> \| <code>Uint16Array</code> \| <code>Int16Array</code> \| <code>Uint32Array</code> \| <code>Int32Array</code> \| <code>Float32Array</code> \| <code>Float64Array</code> \| <code>string</code> | | if present, can be a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or a TypedArray containing raw pixel image data, or a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. |
| [options] | <code>Object</code> | | if present, is an Object with optional attributes. |
| [options.failOn] | <code>string</code> | <code>&quot;&#x27;warning&#x27;&quot;</code> | when to abort processing of invalid pixel data, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), higher levels imply lower levels; invalid metadata will always abort. |
| [options.limitInputPixels] | <code>number</code> \| <code>boolean</code> | <code>268402689</code> | Do not process input images where the number of pixels (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted. An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). |
| [options.unlimited] | <code>boolean</code> | <code>false</code> | Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF). |
| [options.sequentialRead] | <code>boolean</code> | <code>true</code> | Set this to `false` to use random access rather than sequential read. Some operations will do this automatically. |
| [options.density] | <code>number</code> | <code>72</code> | number representing the DPI for vector images in the range 1 to 100000. |
| [options.ignoreIcc] | <code>number</code> | <code>false</code> | should the embedded ICC profile, if any, be ignored. |
| [options.pages] | <code>number</code> | <code>1</code> | Number of pages to extract for multi-page input (GIF, WebP, TIFF), use -1 for all pages. |
| [options.page] | <code>number</code> | <code>0</code> | Page number to start extracting from for multi-page input (GIF, WebP, TIFF), zero based. |
| [options.subifd] | <code>number</code> | <code>-1</code> | subIFD (Sub Image File Directory) to extract for OME-TIFF, defaults to main image. |
| [options.level] | <code>number</code> | <code>0</code> | level to extract from a multi-level input (OpenSlide), zero based. |
| [options.animated] | <code>boolean</code> | <code>false</code> | Set to `true` to read all frames/pages of an animated image (GIF, WebP, TIFF), equivalent of setting `pages` to `-1`. |
| [options.raw] | <code>Object</code> | | describes raw pixel input image data. See `raw()` for pixel ordering. |
| [options.raw.width] | <code>number</code> | | integral number of pixels wide. |
| [options.raw.height] | <code>number</code> | | integral number of pixels high. |
| [options.raw.channels] | <code>number</code> | | integral number of channels, between 1 and 4. |
| [options.raw.premultiplied] | <code>boolean</code> | | specifies that the raw input has already been premultiplied, set to `true` to avoid sharp premultiplying the image. (optional, default `false`) |
| [options.create] | <code>Object</code> | | describes a new image to be created. |
| [options.create.width] | <code>number</code> | | integral number of pixels wide. |
| [options.create.height] | <code>number</code> | | integral number of pixels high. |
| [options.create.channels] | <code>number</code> | | integral number of channels, either 3 (RGB) or 4 (RGBA). |
| [options.create.background] | <code>string</code> \| <code>Object</code> | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
| [options.create.noise] | <code>Object</code> | | describes a noise to be created. |
| [options.create.noise.type] | <code>string</code> | | type of generated noise, currently only `gaussian` is supported. |
| [options.create.noise.mean] | <code>number</code> | | mean of pixels in generated noise. |
| [options.create.noise.sigma] | <code>number</code> | | standard deviation of pixels in generated noise. |
| [options.text] | <code>Object</code> | | describes a new text image to be created. |
| [options.text.text] | <code>string</code> | | text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. |
| [options.text.font] | <code>string</code> | | font name to render with. |
| [options.text.fontfile] | <code>string</code> | | absolute filesystem path to a font file that can be used by `font`. |
| [options.text.width] | <code>number</code> | <code>0</code> | Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. |
| [options.text.height] | <code>number</code> | <code>0</code> | Maximum integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. |
| [options.text.align] | <code>string</code> | <code>&quot;&#x27;left&#x27;&quot;</code> | Alignment style for multi-line text (`'left'`, `'centre'`, `'center'`, `'right'`). |
| [options.text.justify] | <code>boolean</code> | <code>false</code> | set this to true to apply justification to the text. |
| [options.text.dpi] | <code>number</code> | <code>72</code> | the resolution (size) at which to render the text. Does not take effect if `height` is specified. |
| [options.text.rgba] | <code>boolean</code> | <code>false</code> | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`. |
| [options.text.spacing] | <code>number</code> | <code>0</code> | text line height in points. Will use the font line height if none is specified. |
| [options.text.wrap] | <code>string</code> | <code>&quot;&#x27;word&#x27;&quot;</code> | word wrapping style when width is provided, one of: 'word', 'char', 'charWord' (prefer char, fallback to word) or 'none'. |
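The `create.noise` options above generate gaussian noise inside libvips. As a rough illustration of what `mean` and `sigma` control (an assumption for teaching purposes only, not sharp's actual generator), a pure-JS sketch using the Box-Muller transform:

```javascript
// Illustrative only: gaussian-noise pixels with a given mean/sigma via
// the Box-Muller transform. sharp delegates the real work to libvips.
function gaussianSample(mean, sigma) {
  const u1 = Math.random() || Number.MIN_VALUE; // avoid log(0)
  const u2 = Math.random();
  return mean + sigma * Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

function noisePixels(width, height, channels, mean, sigma) {
  // Uint8ClampedArray clamps out-of-range samples to the 0..255 byte range
  const out = new Uint8ClampedArray(width * height * channels);
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.round(gaussianSample(mean, sigma));
  }
  return out;
}

const px = noisePixels(4, 4, 3, 128, 30);
console.log(px.length); // 48 bytes: 4 x 4 pixels x 3 channels
```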
**Example**
```js
sharp('input.jpg')
  .resize(300, 200)
  .toFile('output.jpg', function(err) {
    // output.jpg is a 300 pixels wide and 200 pixels high image
    // containing a scaled and cropped version of input.jpg
  });
```
**Example**
```js
// Read image data from readableStream,
// resize to 300 pixels wide,
// emit an 'info' event with calculated dimensions
// and finally write image data to writableStream
var transformer = sharp()
  .resize(300)
  .on('info', function(info) {
    console.log('Image height is ' + info.height);
  });
readableStream.pipe(transformer).pipe(writableStream);
```
**Example**
```js
// Create a blank 300x200 PNG image of semi-translucent red pixels
sharp({
  create: {
    width: 300,
    height: 200,
    channels: 4,
    background: { r: 255, g: 0, b: 0, alpha: 0.5 }
  }
})
.png()
.toBuffer()
.then( ... );
```
**Example**
```js
// Convert an animated GIF to an animated WebP
await sharp('in.gif', { animated: true }).toFile('out.webp');
```
**Example**
```js
// Read a raw array of pixels and save it to a png
const input = Uint8Array.from([255, 255, 255, 0, 0, 0]); // or Uint8ClampedArray
const image = sharp(input, {
  // because the input does not contain its dimensions or how many channels it has
  // we need to specify it in the constructor options
  raw: {
    width: 2,
    height: 1,
    channels: 3
  }
});
await image.toFile('my-two-pixels.png');
```
**Example**
```js
// Generate RGB Gaussian noise
await sharp({
  create: {
    width: 300,
    height: 200,
    channels: 3,
    noise: {
      type: 'gaussian',
      mean: 128,
      sigma: 30
    }
  }
}).toFile('noise.png');
```
**Example**
```js
// Generate an image from text
await sharp({
  text: {
    text: 'Hello, world!',
    width: 400, // max width
    height: 300 // max height
  }
}).toFile('text_bw.png');
```
**Example**
```js
// Generate an rgba image from text using pango markup and font
await sharp({
  text: {
    text: '<span foreground="red">Red!</span><span background="cyan">blue</span>',
    font: 'sans',
    rgba: true,
    dpi: 300
  }
}).toFile('text_rgba.png');
```
## clone
Take a "snapshot" of the Sharp instance, returning a new instance.
Cloned instances inherit the input of their parent instance.
This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream.
**Example**
```js
const pipeline = sharp().rotate();
pipeline.clone().resize(800, 600).pipe(firstWritableStream);
pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream);
readableStream.pipe(pipeline);
// firstWritableStream receives auto-rotated, resized readableStream
// secondWritableStream receives auto-rotated, extracted region of readableStream
```
**Example**
```js
// Create a pipeline that will download an image, resize it and format it to different files
// Using Promises to know when the pipeline is complete
const fs = require("fs");
    fs.unlinkSync("optimized-500.webp");
  } catch (e) {}
});
```
<!-- Generated by documentation.js. Update this documentation by updating the source code. -->
## metadata
Fast access to (uncached) image metadata without decoding any compressed pixel data.
This is read from the header of the input image.
It does not take into consideration any operations to be applied to the output image,
such as resize or rotate.

Dimensions in the response will respect the `page` and `pages` properties of the
[constructor parameters](/api-constructor#parameters).

A `Promise` is returned when `callback` is not provided.
* `format`: Name of decoder used to decompress image data e.g. `jpeg`, `png`, `webp`, `gif`, `svg` - `format`: Name of decoder used to decompress image data e.g. `jpeg`, `png`, `webp`, `gif`, `svg`
* `size`: Total size of image in bytes, for Stream and Buffer input only - `size`: Total size of image in bytes, for Stream and Buffer input only
* `width`: Number of pixels wide (EXIF orientation is not taken into consideration, see example below) - `width`: Number of pixels wide (EXIF orientation is not taken into consideration, see example below)
* `height`: Number of pixels high (EXIF orientation is not taken into consideration, see example below) - `height`: Number of pixels high (EXIF orientation is not taken into consideration, see example below)
* `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...][2] - `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/VipsImage.html#VipsInterpretation)
* `channels`: Number of bands e.g. `3` for sRGB, `4` for CMYK - `channels`: Number of bands e.g. `3` for sRGB, `4` for CMYK
* `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...][3] - `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...](https://www.libvips.org/API/current/VipsImage.html#VipsBandFormat)
* `density`: Number of pixels per inch (DPI), if present - `density`: Number of pixels per inch (DPI), if present
* `chromaSubsampling`: String containing JPEG chroma subsampling, `4:2:0` or `4:4:4` for RGB, `4:2:0:4` or `4:4:4:4` for CMYK - `chromaSubsampling`: String containing JPEG chroma subsampling, `4:2:0` or `4:4:4` for RGB, `4:2:0:4` or `4:4:4:4` for CMYK
* `isProgressive`: Boolean indicating whether the image is interlaced using a progressive scan - `isProgressive`: Boolean indicating whether the image is interlaced using a progressive scan
* `pages`: Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP - `pages`: Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP
* `pageHeight`: Number of pixels high each page in a multi-page image will be. - `pageHeight`: Number of pixels high each page in a multi-page image will be.
* `loop`: Number of times to loop an animated image, zero refers to a continuous loop. - `loop`: Number of times to loop an animated image, zero refers to a continuous loop.
* `delay`: Delay in ms between each page in an animated image, provided as an array of integers. - `delay`: Delay in ms between each page in an animated image, provided as an array of integers.
* `pagePrimary`: Number of the primary page in a HEIF image - `pagePrimary`: Number of the primary page in a HEIF image
* `levels`: Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide - `levels`: Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide
* `subifds`: Number of Sub Image File Directories in an OME-TIFF image - `subifds`: Number of Sub Image File Directories in an OME-TIFF image
* `background`: Default background colour, if present, for PNG (bKGD) and GIF images, either an RGB Object or a single greyscale value - `background`: Default background colour, if present, for PNG (bKGD) and GIF images, either an RGB Object or a single greyscale value
* `compression`: The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC) - `compression`: The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC)
* `resolutionUnit`: The unit of resolution (density), either `inch` or `cm`, if present - `resolutionUnit`: The unit of resolution (density), either `inch` or `cm`, if present
* `hasProfile`: Boolean indicating the presence of an embedded ICC profile - `hasProfile`: Boolean indicating the presence of an embedded ICC profile
* `hasAlpha`: Boolean indicating the presence of an alpha transparency channel - `hasAlpha`: Boolean indicating the presence of an alpha transparency channel
* `orientation`: Number value of the EXIF Orientation header, if present - `orientation`: Number value of the EXIF Orientation header, if present
* `exif`: Buffer containing raw EXIF data, if present - `exif`: Buffer containing raw EXIF data, if present
* `icc`: Buffer containing raw [ICC][4] profile data, if present - `icc`: Buffer containing raw [ICC](https://www.npmjs.com/package/icc) profile data, if present
* `iptc`: Buffer containing raw IPTC data, if present - `iptc`: Buffer containing raw IPTC data, if present
* `xmp`: Buffer containing raw XMP data, if present - `xmp`: Buffer containing raw XMP data, if present
* `tifftagPhotoshop`: Buffer containing raw TIFFTAG\_PHOTOSHOP data, if present - `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present
- `formatMagick`: String containing format for images loaded via *magick
| Param | Type | Description |
| --- | --- | --- |
| [callback] | <code>function</code> | called with the arguments `(err, metadata)` |

**Example**
```js
const metadata = await sharp(input).metadata();
```

**Example**
```js
const image = sharp(inputJpg);
image
  .metadata()
  // …
    // data contains a WebP image half the width and height of the original JPEG
  });
```
**Example**
```js
// Based on EXIF rotation metadata, get the right-side-up width and height:
const size = getNormalSize(await sharp(input).metadata());
function getNormalSize({ width, height, orientation }) {
  // …
}
```
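The body of `getNormalSize` is truncated above. A minimal plain-JavaScript sketch, assuming EXIF `orientation` values of 5-8 imply a 90/270 degree rotation (this is a hypothetical reconstruction, not necessarily the exact body from the docs):

```js
// EXIF orientations 1-4 are upright or mirrored; 5-8 encode a 90/270 degree
// rotation, so the stored width/height swap to give the right-side-up size.
function getNormalSize({ width, height, orientation }) {
  return (orientation || 0) >= 5
    ? { width: height, height: width }
    : { width, height };
}
```

For example, a 200x100 JPEG with `orientation: 6` has a right-side-up size of 100x200.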
## stats
Access to pixel-derived image statistics for every channel in the image.

A `Promise` is returned when `callback` is not provided.

- `channels`: Array of channel statistics for each channel in the image. Each channel statistic contains
  - `min` (minimum value in the channel)
  - `max` (maximum value in the channel)
  - `sum` (sum of all values in a channel)
  - `squaresSum` (sum of squared values in a channel)
  - `mean` (mean of the values in a channel)
  - `stdev` (standard deviation for the values in a channel)
  - `minX` (x-coordinate of one of the pixels where the minimum lies)
  - `minY` (y-coordinate of one of the pixels where the minimum lies)
  - `maxX` (x-coordinate of one of the pixels where the maximum lies)
  - `maxY` (y-coordinate of one of the pixels where the maximum lies)
- `isOpaque`: Is the image fully opaque? Will be `true` if the image has no alpha channel or if every pixel is fully opaque.
- `entropy`: Histogram-based estimation of greyscale entropy, discarding alpha channel if any.
- `sharpness`: Estimation of greyscale sharpness based on the standard deviation of a Laplacian convolution, discarding alpha channel if any.
- `dominant`: Object containing most dominant sRGB colour based on a 4096-bin 3D histogram.

**Note**: Statistics are derived from the original input image. Any operations performed on the image must first be
written to a buffer in order to run `stats` on the result (see third example).
| Param | Type | Description |
| --- | --- | --- |
| [callback] | <code>function</code> | called with the arguments `(err, stats)` |

**Example**
```js
const image = sharp(inputJpg);
image
  .stats()
  // …
    // stats contains the channel-wise statistics array and the isOpaque value
  });
```
**Example**
```js
const { entropy, sharpness, dominant } = await sharp(input).stats();
const { r, g, b } = dominant;
```

**Example**
```js
const image = sharp(input);
// store intermediate result
const part = await image.extract(region).toBuffer();
// create new instance to obtain statistics of extracted region
const stats = await sharp(part).stats();
```
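The `dominant` property above is based on a 4096-bin 3D histogram: 4096 = 16 × 16 × 16, i.e. each 8-bit channel is reduced to 16 levels. A rough sketch of that binning on raw RGB data (illustrative only, not sharp's internal implementation):

```js
// Illustrative sketch of a 4096-bin (16x16x16) dominant-colour histogram.
// Takes a flat array of interleaved 8-bit r,g,b values.
function dominantColour(rgbPixels) {
  const bins = new Uint32Array(4096);
  for (let i = 0; i < rgbPixels.length; i += 3) {
    // Reduce each 8-bit channel to 4 bits: 16 levels per channel.
    const r = rgbPixels[i] >> 4;
    const g = rgbPixels[i + 1] >> 4;
    const b = rgbPixels[i + 2] >> 4;
    bins[(r << 8) | (g << 4) | b]++;
  }
  // Find the most populated bin.
  let best = 0;
  for (let i = 1; i < 4096; i++) if (bins[i] > bins[best]) best = i;
  // Return the centre of the winning bin, scaled back to the 0-255 range.
  return {
    r: ((best >> 8) & 15) * 16 + 8,
    g: ((best >> 4) & 15) * 16 + 8,
    b: (best & 15) * 16 + 8
  };
}
```

Nearby shades such as `(255, 0, 0)` and `(250, 10, 5)` land in the same bin, which is what makes the result a robust "dominant" colour rather than a single most-frequent pixel value.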
## rotate
Rotate the output image by either an explicit angle
or auto-orient based on the EXIF `Orientation` tag.

If an angle is provided, it is converted to a valid positive degree rotation.
For example, `-450` will produce a 270 degree rotation.

When rotating by an angle other than a multiple of 90,
the background colour can be provided with the `background` option.

Previous calls to `rotate` in the same pipeline will be ignored.

Method order is important when rotating, resizing and/or extracting regions,
for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`.
**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [angle] | <code>number</code> | <code>auto</code> | angle of rotation. |
| [options] | <code>Object</code> | | if present, is an Object with optional attributes. |
| [options.background] | <code>string</code> \| <code>Object</code> | <code>&quot;#000000&quot;</code> | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |

**Example**
```js
const pipeline = sharp()
  .rotate()
  .resize(null, 200)
  // …
  });
readableStream.pipe(pipeline);
```
**Example**
```js
const rotateThenResize = await sharp(input)
  .rotate(90)
  .resize({ width: 16, height: 8, fit: 'fill' })
  .toBuffer();
const resizeThenRotate = await sharp(input)
  .resize({ width: 16, height: 8, fit: 'fill' })
  .rotate(90)
  .toBuffer();
```
## flip
Mirror the image vertically (up-down) about the x-axis.
This always occurs before rotation, if any.

The use of `flip` implies the removal of the EXIF `Orientation` tag, if any.

This operation does not work correctly with multi-page images.

| Param | Type | Default |
| --- | --- | --- |
| [flip] | <code>Boolean</code> | <code>true</code> |

**Example**
```js
const output = await sharp(input).flip().toBuffer();
```
## flop
Mirror the image horizontally (left-right) about the y-axis.
This always occurs before rotation, if any.

The use of `flop` implies the removal of the EXIF `Orientation` tag, if any.

| Param | Type | Default |
| --- | --- | --- |
| [flop] | <code>Boolean</code> | <code>true</code> |

**Example**
```js
const output = await sharp(input).flop().toBuffer();
```
## affine
Perform an affine transform on an image. This operation will always occur after resizing, extraction and rotation, if any.

You must provide an array of length 4 or a 2x2 affine transformation matrix.
By default, new pixels are filled with a black background. You can provide a background color with the `background` option.
A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`.

In the case of a 2x2 matrix, the transform is:
- X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx`
- Y = `matrix[1, 0]` \* (x + `idx`) + `matrix[1, 1]` \* (y + `idy`) + `ody`

where:
- x and y are the coordinates in the input image.
- X and Y are the coordinates in the output image.
- (0,0) is the upper left corner.
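The two formulas above translate directly into code. A plain-JavaScript sketch of the forward mapping (illustrative only, not part of the sharp API):

```js
// X = m[0][0] * (x + idx) + m[0][1] * (y + idy) + odx
// Y = m[1][0] * (x + idx) + m[1][1] * (y + idy) + ody
function applyAffine(matrix, x, y, { idx = 0, idy = 0, odx = 0, ody = 0 } = {}) {
  const X = matrix[0][0] * (x + idx) + matrix[0][1] * (y + idy) + odx;
  const Y = matrix[1][0] * (x + idx) + matrix[1][1] * (y + idy) + ody;
  return { X, Y };
}
```

With the identity matrix `[[1, 0], [0, 1]]` and zero offsets, every coordinate maps to itself; `[[2, 0], [0, 2]]` doubles both coordinates.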
**Throws**:

- <code>Error</code> Invalid parameters

**Since**: 0.27.0

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| matrix | <code>Array.&lt;Array.&lt;number&gt;&gt;</code> \| <code>Array.&lt;number&gt;</code> | | affine transformation matrix |
| [options] | <code>Object</code> | | if present, is an Object with optional attributes. |
| [options.background] | <code>String</code> \| <code>Object</code> | <code>&quot;#000000&quot;</code> | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
| [options.idx] | <code>Number</code> | <code>0</code> | input horizontal offset |
| [options.idy] | <code>Number</code> | <code>0</code> | input vertical offset |
| [options.odx] | <code>Number</code> | <code>0</code> | output horizontal offset |
| [options.ody] | <code>Number</code> | <code>0</code> | output vertical offset |
| [options.interpolator] | <code>String</code> | <code>sharp.interpolators.bicubic</code> | interpolator |
**Example**
```js
const pipeline = sharp()
  .affine([[1, 0.3], [0.1, 0.7]], {
    background: 'white',
    interpolator: sharp.interpolators.nohalo
  })
  .toBuffer((err, outputBuffer, info) => {
    // outputBuffer contains the transformed image
  });
inputStream
  .pipe(pipeline);
```
## sharpen
Sharpen the image.

When used without parameters, performs a fast, mild sharpen of the output image.

When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space.
Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available.

See the [libvips sharpen](https://www.libvips.org/API/current/libvips-convolution.html#vips-sharpen) operation.
**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> \| <code>number</code> | | if present, is an Object with attributes |
| [options.sigma] | <code>number</code> | | the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10 |
| [options.m1] | <code>number</code> | <code>1.0</code> | the level of sharpening to apply to "flat" areas, between 0 and 1000000 |
| [options.m2] | <code>number</code> | <code>2.0</code> | the level of sharpening to apply to "jagged" areas, between 0 and 1000000 |
| [options.x1] | <code>number</code> | <code>2.0</code> | threshold between "flat" and "jagged", between 0 and 1000000 |
| [options.y2] | <code>number</code> | <code>10.0</code> | maximum amount of brightening, between 0 and 1000000 |
| [options.y3] | <code>number</code> | <code>20.0</code> | maximum amount of darkening, between 0 and 1000000 |
| [flat] | <code>number</code> | | (deprecated) see `options.m1`. |
| [jagged] | <code>number</code> | | (deprecated) see `options.m2`. |
**Example**
```js
const data = await sharp(input).sharpen().toBuffer();
```

**Example**
```js
const data = await sharp(input).sharpen({ sigma: 2 }).toBuffer();
```

**Example**
```js
const data = await sharp(input)
  .sharpen({
    sigma: 2,
    // …
  .toBuffer();
```
## median
Apply median filter.
When used without parameters the default window is 3x3.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [size] | <code>number</code> | <code>3</code> | square mask size: size x size |

**Example**
```js
const output = await sharp(input).median().toBuffer();
```

**Example**
```js
const output = await sharp(input).median(5).toBuffer();
```
## blur
Blur the image.

When used without parameters, performs a fast 3x3 box blur (equivalent to a box linear filter).

When a `sigma` is provided, performs a slower, more accurate Gaussian blur.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Description |
| --- | --- | --- |
| [sigma] | <code>number</code> | a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`. |

**Example**
```js
const boxBlurred = await sharp(input)
  .blur()
  .toBuffer();
```

**Example**
```js
const gaussianBlurred = await sharp(input)
  .blur(5)
  .toBuffer();
```
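Given `sigma = 1 + radius / 2`, a desired blur radius can be converted to the `sigma` argument with a small helper. This is a hypothetical convenience, not part of the sharp API, clamping to the accepted 0.3 to 1000 range:

```js
// Convert a desired blur radius to the sigma value sharp's blur() expects,
// using the documented relationship sigma = 1 + radius / 2.
function radiusToSigma(radius) {
  const sigma = 1 + radius / 2;
  // Clamp to the documented 0.3..1000 range of valid sigma values.
  return Math.min(Math.max(sigma, 0.3), 1000);
}
```

For example, `sharp(input).blur(radiusToSigma(8))` would request a Gaussian blur with a sigma of 5.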
## flatten
Merge alpha transparency channel, if any, with a background, then remove the alpha channel.

See also [removeAlpha](/api-channel#removealpha).

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.background] | <code>string</code> \| <code>Object</code> | <code>&quot;{r: 0, g: 0, b: 0}&quot;</code> | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black. |

**Example**
```js
await sharp(rgbaInput)
  .flatten({ background: '#F0A703' })
  .toBuffer();
```
## unflatten
Ensure the image has an alpha channel
with all white pixel values made fully transparent.
Existing alpha channel values for non-white pixels remain unchanged.
This feature is experimental and the API may change.
**Since**: 0.32.1
**Example**
```js
await sharp(rgbInput)
.unflatten()
.toBuffer();
```
**Example**
```js
await sharp(rgbInput)
  .threshold(128, { grayscale: false }) // convert bright pixels to white
.unflatten()
.toBuffer();
```
## gamma
Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma`
then increasing the encoding (brighten) post-resize at a factor of `gamma`.

This can improve the perceived brightness of a resized image in non-linear colour spaces.
Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases.
**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [gamma] | <code>number</code> | <code>2.2</code> | value between 1.0 and 3.0. |
| [gammaOut] | <code>number</code> | | value between 1.0 and 3.0. (optional, defaults to same as `gamma`) |
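The darken-then-brighten round trip can be sketched for a single channel value normalised to [0, 1]. This is illustrative maths only, assuming the two steps map to exponents of `gamma` and `1/gamma`; the exact exponent convention used internally is libvips's:

```js
const gamma = 2.2;
const darken = (v) => Math.pow(v, gamma);       // pre-resize: values move towards 0
const brighten = (v) => Math.pow(v, 1 / gamma); // post-resize: values move back up
// The two steps cancel, so overall brightness is preserved while the
// resize itself operates on the darkened (roughly linear-light) values.
const v = 0.5;
const roundTrip = brighten(darken(v)); // ≈ 0.5
```

The point is that interpolation during the resize happens between the two steps, which is what improves perceived brightness in non-linear colour spaces.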
## negate
Produce the "negative" of the image.

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.alpha] | <code>Boolean</code> | <code>true</code> | Whether or not to negate any alpha channel |

**Example**
```js
const output = await sharp(input)
  .negate()
  .toBuffer();
```

**Example**
```js
const output = await sharp(input)
  .negate({ alpha: false })
  .toBuffer();
```
## normalise
Enhance output image contrast by stretching its luminance to cover a full dynamic range.

Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes.

Luminance values below the `lower` percentile will be underexposed by clipping to zero.
Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value.

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.lower] | <code>number</code> | <code>1</code> | Percentile below which luminance values will be underexposed. |
| [options.upper] | <code>number</code> | <code>99</code> | Percentile above which luminance values will be overexposed. |

**Example**
```js
const output = await sharp(input)
  .normalise()
  .toBuffer();
```

**Example**
```js
const output = await sharp(input)
  .normalise({ lower: 0, upper: 100 })
  .toBuffer();
```
## normalize
Alternative spelling of normalise.

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.lower] | <code>number</code> | <code>1</code> | Percentile below which luminance values will be underexposed. |
| [options.upper] | <code>number</code> | <code>99</code> | Percentile above which luminance values will be overexposed. |

**Example**
```js
const output = await sharp(input)
  .normalize()
  .toBuffer();
```
## clahe ## clahe
Perform contrast limiting adaptive histogram equalization Perform contrast limiting adaptive histogram equalization
[CLAHE][10]. [CLAHE](https://en.wikipedia.org/wiki/Adaptive_histogram_equalization#Contrast_Limited_AHE).
This will, in general, enhance the clarity of the image by bringing out darker details. This will, in general, enhance the clarity of the image by bringing out darker details.
### Parameters
* `options` **[Object][2]**&#x20; **Throws**:
* `options.width` **[number][1]** integer width of the region in pixels. - <code>Error</code> Invalid parameters
* `options.height` **[number][1]** integer height of the region in pixels.
* `options.maxSlope` **[number][1]** maximum value for the slope of the
cumulative histogram. A value of 0 disables contrast limiting. Valid values
are integers in the range 0-100 (inclusive) (optional, default `3`)
### Examples **Since**: 0.28.3
```javascript | Param | Type | Default | Description |
| --- | --- | --- | --- |
| options | <code>Object</code> | | |
| options.width | <code>number</code> | | Integral width of the search window, in pixels. |
| options.height | <code>number</code> | | Integral height of the search window, in pixels. |
| [options.maxSlope] | <code>number</code> | <code>3</code> | Integral level of brightening, between 0 and 100, where 0 disables contrast limiting. |
**Example**
```js
const output = await sharp(input)
  .clahe({
    width: 3,
    height: 3,
  })
  .toBuffer();
```
## convolve
Convolve the image with the specified kernel.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| kernel | <code>Object</code> | | |
| kernel.width | <code>number</code> | | width of the kernel in pixels. |
| kernel.height | <code>number</code> | | height of the kernel in pixels. |
| kernel.kernel | <code>Array.&lt;number&gt;</code> | | Array of length `width*height` containing the kernel values. |
| [kernel.scale] | <code>number</code> | <code>sum</code> | the scale of the kernel in pixels. |
| [kernel.offset] | <code>number</code> | <code>0</code> | the offset of the kernel in pixels. |
**Example**
```js
sharp(input)
  .convolve({
    width: 3,
    height: 3,
    kernel: [-1, 0, 1, -2, 0, 2, -1, 0, 1]
  })
  .raw()
  .toBuffer(function(err, data, info) {
    // data contains the raw pixel data representing the convolution
    // of the input image with the horizontal Sobel operator
  });
```
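Conceptually, each output pixel is the kernel-weighted sum of its neighbourhood, divided by `scale` (which defaults to the kernel sum) plus `offset`. A minimal plain-JavaScript sketch of that arithmetic (illustrative only, not sharp's implementation):

```js
// Weighted sum of a flattened neighbourhood against a flattened kernel,
// divided by scale, plus offset - the arithmetic behind one output pixel.
const convolvePixel = (neighbourhood, kernel, scale, offset = 0) =>
  neighbourhood.reduce((sum, value, i) => sum + value * kernel[i], 0) / scale + offset;

// 3x3 box blur: every kernel entry is 1, scale is the kernel sum (9)
const kernel = [1, 1, 1, 1, 1, 1, 1, 1, 1];
const neighbourhood = [10, 10, 10, 10, 100, 10, 10, 10, 10];
console.log(convolvePixel(neighbourhood, kernel, 9));
// → 20
```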
## threshold
Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [threshold] | <code>number</code> | <code>128</code> | a value in the range 0-255 representing the level at which the threshold will be applied. |
| [options] | <code>Object</code> | | |
| [options.greyscale] | <code>Boolean</code> | <code>true</code> | convert to single channel greyscale. |
| [options.grayscale] | <code>Boolean</code> | <code>true</code> | alternative spelling for greyscale. |
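The per-pixel rule can be sketched in plain JavaScript (illustrative only, not sharp's implementation):

```js
// Values >= the threshold level become 255, everything else becomes 0
const applyThreshold = (pixels, level = 128) =>
  pixels.map((value) => (value >= level ? 255 : 0));

console.log(applyThreshold([0, 127, 128, 255]));
// → [0, 0, 255, 255]
```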
## boolean
Perform a bitwise boolean operation with operand image.
This operation creates an output image where each pixel is the result of
the selected bitwise boolean `operation` between the corresponding pixels of the input images.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Description |
| --- | --- | --- |
| operand | <code>Buffer</code> \| <code>string</code> | Buffer containing image data or string containing the path to an image file. |
| operator | <code>string</code> | one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively. |
| [options] | <code>Object</code> | |
| [options.raw] | <code>Object</code> | describes operand when using raw pixel data. |
| [options.raw.width] | <code>number</code> | |
| [options.raw.height] | <code>number</code> | |
| [options.raw.channels] | <code>number</code> | |
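The three operators map directly onto the C bitwise operators applied to each pair of corresponding pixel values; a plain-JavaScript sketch (illustrative only, not sharp's implementation):

```js
// Per-pixel bitwise operations, mirroring C's &, | and ^
const booleanOps = {
  and: (a, b) => a & b,
  or: (a, b) => a | b,
  eor: (a, b) => a ^ b
};

console.log(booleanOps.and(0b1100, 0b1010)); // → 8  (0b1000)
console.log(booleanOps.or(0b1100, 0b1010));  // → 14 (0b1110)
console.log(booleanOps.eor(0b1100, 0b1010)); // → 6  (0b0110)
```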
## linear
Apply the linear formula `a` * input + `b` to the image to adjust image levels.
When a single number is provided, it will be used for all image channels.
When an array of numbers is provided, the array length must match the number of channels.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [a] | <code>number</code> \| <code>Array.&lt;number&gt;</code> | <code>[]</code> | multiplier |
| [b] | <code>number</code> \| <code>Array.&lt;number&gt;</code> | <code>[]</code> | offset |
**Example**
```js
await sharp(input)
  .linear(0.5, 2)
  .toBuffer();
```
**Example**
```js
await sharp(rgbInput)
  .linear(
    [0.25, 0.5, 0.75],
    [150, 100, 50]
  )
  .toBuffer();
```
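Numerically, each channel value is transformed as `a * input + b`; a minimal sketch (illustrative only, ignoring the clamping and rounding applied to integer output):

```js
// Apply a * value + b to every channel value
const applyLinear = (pixels, a, b) => pixels.map((value) => a * value + b);

console.log(applyLinear([100, 200], 0.5, 2));
// → [52, 102]
```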
## recomb
Recombine the image with the specified matrix.

**Throws**:

- <code>Error</code> Invalid parameters

**Since**: 0.21.1

| Param | Type | Description |
| --- | --- | --- |
| inputMatrix | <code>Array.&lt;Array.&lt;number&gt;&gt;</code> | 3x3 Recombination matrix |
**Example**
```js
sharp(input)
  .recomb([
    [0.3588, 0.7044, 0.1368],
    [0.2990, 0.5870, 0.1140],
    [0.2392, 0.4696, 0.0912]
  ])
  .raw()
  .toBuffer(function(err, data, info) {
    // data contains the raw pixel data after applying the matrix
    // With this example input, a sepia filter has been applied
  });
```
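Each output channel of a pixel is a weighted sum of its input channels, one matrix row per output channel. A plain-JavaScript sketch of the matrix arithmetic (illustrative only; real output is also clamped to the 0-255 range):

```js
// Apply a 3x3 recombination matrix to a single [r, g, b] pixel
const recombPixel = (matrix, rgb) =>
  matrix.map((row) => row.reduce((sum, weight, i) => sum + weight * rgb[i], 0));

// Rec. 601 luma weights in every row produce a greyscale pixel
const toGrey = [
  [0.299, 0.587, 0.114],
  [0.299, 0.587, 0.114],
  [0.299, 0.587, 0.114]
];
console.log(recombPixel(toGrey, [255, 0, 0]).map(Math.round));
// → [76, 76, 76]
```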
## modulate
Transforms the image using brightness, saturation, hue rotation, and lightness.
Brightness and lightness both operate on luminance, with the difference being that
brightness is multiplicative whereas lightness is additive.

**Since**: 0.22.1

| Param | Type | Description |
| --- | --- | --- |
| [options] | <code>Object</code> | |
| [options.brightness] | <code>number</code> | Brightness multiplier |
| [options.saturation] | <code>number</code> | Saturation multiplier |
| [options.hue] | <code>number</code> | Degrees for hue rotation |
| [options.lightness] | <code>number</code> | Lightness addend |
**Example**
```js
// increase brightness by a factor of 2
const output = await sharp(input)
  .modulate({
    brightness: 2
  })
  .toBuffer();
```
**Example**
```js
// hue-rotate by 180 degrees
const output = await sharp(input)
  .modulate({
    hue: 180
  })
  .toBuffer();
```
**Example**
```js
// increase lightness by +50
const output = await sharp(input)
  .modulate({
    lightness: 50
  })
  .toBuffer();
```
**Example**
```js
// decrease brightness and saturation while also hue-rotating by 90 degrees
const output = await sharp(input)
  .modulate({
    brightness: 0.5,
    saturation: 0.5,
    hue: 90,
  })
  .toBuffer();
```
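The multiplicative/additive distinction above can be sketched on a single luminance value (a deliberate simplification, not sharp's implementation):

```js
// brightness multiplies luminance; lightness adds to it
const modulateLuminance = (l, { brightness = 1, lightness = 0 } = {}) =>
  l * brightness + lightness;

console.log(modulateLuminance(50, { brightness: 2 })); // → 100
console.log(modulateLuminance(50, { lightness: 50 })); // → 100
```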

## toFile
Write output image data to a file.
If an explicit output format is not selected, it will be inferred from the extension,
with JPEG, PNG, WebP, AVIF, TIFF, GIF, DZI, and libvips' V format supported.
Note that raw pixel data is only supported for buffer output.
By default all metadata will be removed, which includes EXIF-based orientation.
See [withMetadata](#withmetadata) for control over this.
The caller is responsible for ensuring directory structures and permissions exist.
A `Promise` is returned when `callback` is not provided.

**Returns**: <code>Promise.&lt;Object&gt;</code> - when no callback is provided

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Description |
| --- | --- | --- |
| fileOut | <code>string</code> | the path to write the image data to. |
| [callback] | <code>function</code> | called on completion with two arguments `(err, info)`. `info` contains the output image `format`, `size` (bytes), `width`, `height`, `channels` and `premultiplied` (indicating if premultiplication was used). When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. When using the attention crop strategy also contains `attentionX` and `attentionY`, the focal point of the cropped region. May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. |
**Example**
```js
sharp(input)
  .toFile('output.png', (err, info) => { ... });
```
**Example**
```js
sharp(input)
  .toFile('output.png')
  .then(info => { ... })
  .catch(err => { ... });
```
## toBuffer
Write output to a Buffer.
JPEG, PNG, WebP, AVIF, TIFF, GIF and raw pixel data output are supported.
Use [toFormat](#toformat) or one of the format-specific functions such as [jpeg](#jpeg), [png](#png) etc. to set the output format.
If no explicit format is set, the output format will match the input image, except SVG input which becomes PNG output.
By default all metadata will be removed, which includes EXIF-based orientation.
See [withMetadata](#withmetadata) for control over this.

`callback`, if present, gets three arguments `(err, data, info)` where:
- `err` is an error, if any.
- `data` is the output image data.
- `info` contains the output image `format`, `size` (bytes), `width`, `height`,
`channels` and `premultiplied` (indicating if premultiplication was used).
When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`.
May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text.

A `Promise` is returned when `callback` is not provided.

**Returns**: <code>Promise.&lt;Buffer&gt;</code> - when no callback is provided

| Param | Type | Description |
| --- | --- | --- |
| [options] | <code>Object</code> | |
| [options.resolveWithObject] | <code>boolean</code> | Resolve the Promise with an Object containing `data` and `info` properties instead of resolving only with `data`. |
| [callback] | <code>function</code> | |
**Example**
```js
sharp(input)
  .toBuffer((err, data, info) => { ... });
```
**Example**
```js
sharp(input)
  .toBuffer()
  .then(data => { ... })
  .catch(err => { ... });
```
**Example**
```js
sharp(input)
  .png()
  .toBuffer({ resolveWithObject: true })
  .then(({ data, info }) => { ... })
  .catch(err => { ... });
```
**Example**
```js
const { data, info } = await sharp('my-image.jpg')
  // output the raw pixels
  .raw()
  .toBuffer({ resolveWithObject: true });

// create a typed array view over the raw pixel data
const pixelArray = new Uint8ClampedArray(data.buffer);

// when finished changing pixelArray, use it as input again
const { width, height, channels } = info;
await sharp(pixelArray, { raw: { width, height, channels } })
  .toFile('my-changed-image.jpg');
```
## withMetadata
Include all metadata (EXIF, XMP, IPTC) from the input image in the output image.
This will also convert to and add a web-friendly sRGB ICC profile unless a custom
output profile is provided.
The default behaviour, when `withMetadata` is not used, is to convert to the device independent
sRGB colour space and strip all metadata, including the removal of any ICC profile.
EXIF metadata is unsupported for TIFF output.

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.orientation] | <code>number</code> | | value between 1 and 8, used to update the EXIF `Orientation` tag. |
| [options.icc] | <code>string</code> | <code>&quot;&#x27;srgb&#x27;&quot;</code> | Filesystem path to output ICC profile, relative to `process.cwd()`, defaults to built-in sRGB. |
| [options.exif] | <code>Object.&lt;Object&gt;</code> | <code>{}</code> | Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. |
| [options.density] | <code>number</code> | | Number of pixels per inch (DPI). |
**Example**
```js
sharp('input.jpg')
  .withMetadata()
  .toFile('output-with-metadata.jpg')
  .then(info => { ... });
```
**Example**
```js
// Set output EXIF metadata
const data = await sharp(input)
  .withMetadata({
    exif: {
      IFD0: {
        Copyright: 'The National Gallery'
      },
      IFD3: {
        GPSLatitudeRef: 'N',
        GPSLatitude: '51/1 30/1 3230/100',
        GPSLongitudeRef: 'W',
        GPSLongitude: '0/1 7/1 4366/100'
      }
    }
  })
  .toBuffer();
```
**Example**
```js
// Set output metadata to 96 DPI
const data = await sharp(input)
  .withMetadata({ density: 96 })
  .toBuffer();
```
## toFormat
Force output to a given format.

**Throws**:

- <code>Error</code> unsupported format or options

| Param | Type | Description |
| --- | --- | --- |
| format | <code>string</code> \| <code>Object</code> | as a string or an Object with an 'id' attribute |
| options | <code>Object</code> | output options |
**Example**
```js
// Convert any input to PNG output
const data = await sharp(input)
  .toFormat('png')
  .toBuffer();
```
## jpeg
Use these JPEG options for output image.

**Throws**:

- <code>Error</code> Invalid options

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | output options |
| [options.quality] | <code>number</code> | <code>80</code> | quality, integer 1-100 |
| [options.progressive] | <code>boolean</code> | <code>false</code> | use progressive (interlace) scan |
| [options.chromaSubsampling] | <code>string</code> | <code>&quot;&#x27;4:2:0&#x27;&quot;</code> | set to '4:4:4' to prevent chroma subsampling otherwise defaults to '4:2:0' chroma subsampling |
| [options.optimiseCoding] | <code>boolean</code> | <code>true</code> | optimise Huffman coding tables |
| [options.optimizeCoding] | <code>boolean</code> | <code>true</code> | alternative spelling of optimiseCoding |
| [options.mozjpeg] | <code>boolean</code> | <code>false</code> | use mozjpeg defaults, equivalent to `{ trellisQuantisation: true, overshootDeringing: true, optimiseScans: true, quantisationTable: 3 }` |
| [options.trellisQuantisation] | <code>boolean</code> | <code>false</code> | apply trellis quantisation |
| [options.overshootDeringing] | <code>boolean</code> | <code>false</code> | apply overshoot deringing |
| [options.optimiseScans] | <code>boolean</code> | <code>false</code> | optimise progressive scans, forces progressive |
| [options.optimizeScans] | <code>boolean</code> | <code>false</code> | alternative spelling of optimiseScans |
| [options.quantisationTable] | <code>number</code> | <code>0</code> | quantization table to use, integer 0-8 |
| [options.quantizationTable] | <code>number</code> | <code>0</code> | alternative spelling of quantisationTable |
| [options.force] | <code>boolean</code> | <code>true</code> | force JPEG output, otherwise attempt to use input format |
**Example**
```js
// Convert any input to very high quality JPEG output
const data = await sharp(input)
  .jpeg({
    quality: 100,
    chromaSubsampling: '4:4:4'
  })
  .toBuffer();
```
**Example**
```js
// Use mozjpeg to reduce output JPEG file size (slower)
const data = await sharp(input)
  .jpeg({ mozjpeg: true })
  .toBuffer();
```
## png
Use these PNG options for output image.
By default, PNG output is full colour at 8 or 16 bits per pixel.
Indexed PNG input at 1, 2 or 4 bits per pixel is converted to 8 bits per pixel.
Set `palette` to `true` for slower, indexed PNG output.

**Throws**:

- <code>Error</code> Invalid options

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | |
| [options.progressive] | <code>boolean</code> | <code>false</code> | use progressive (interlace) scan |
| [options.compressionLevel] | <code>number</code> | <code>6</code> | zlib compression level, 0 (fastest, largest) to 9 (slowest, smallest) |
| [options.adaptiveFiltering] | <code>boolean</code> | <code>false</code> | use adaptive row filtering |
| [options.palette] | <code>boolean</code> | <code>false</code> | quantise to a palette-based image with alpha transparency support |
| [options.quality] | <code>number</code> | <code>100</code> | use the lowest number of colours needed to achieve given quality, sets `palette` to `true` |
| [options.effort] | <code>number</code> | <code>7</code> | CPU effort, between 1 (fastest) and 10 (slowest), sets `palette` to `true` |
| [options.colours] | <code>number</code> | <code>256</code> | maximum number of palette entries, sets `palette` to `true` |
| [options.colors] | <code>number</code> | <code>256</code> | alternative spelling of `options.colours`, sets `palette` to `true` |
| [options.dither] | <code>number</code> | <code>1.0</code> | level of Floyd-Steinberg error diffusion, sets `palette` to `true` |
| [options.force] | <code>boolean</code> | <code>true</code> | force PNG output, otherwise attempt to use input format |
**Example**
```js
// Convert any input to full colour PNG output
const data = await sharp(input)
  .png()
  .toBuffer();
```
**Example**
```js
// Convert any input to indexed PNG output (slower)
const data = await sharp(input)
  .png({ palette: true })
  .toBuffer();
```
## webp
Use these WebP options for output image.

**Throws**:

- <code>Error</code> Invalid options

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> | | output options |
| [options.quality] | <code>number</code> | <code>80</code> | quality, integer 1-100 |
| [options.alphaQuality] | <code>number</code> | <code>100</code> | quality of alpha layer, integer 0-100 |
| [options.lossless] | <code>boolean</code> | <code>false</code> | use lossless compression mode |
| [options.nearLossless] | <code>boolean</code> | <code>false</code> | use near_lossless compression mode |
| [options.smartSubsample] | <code>boolean</code> | <code>false</code> | use high quality chroma subsampling |
| [options.preset] | <code>string</code> | <code>&quot;&#x27;default&#x27;&quot;</code> | named preset for preprocessing/filtering, one of: default, photo, picture, drawing, icon, text |
| [options.effort] | <code>number</code> | <code>4</code> | CPU effort, between 0 (fastest) and 6 (slowest) |
| [options.loop] | <code>number</code> | <code>0</code> | number of animation iterations, use 0 for infinite animation |
| [options.delay] | <code>number</code> \| <code>Array.&lt;number&gt;</code> | | delay(s) between animation frames (in milliseconds) |
| [options.minSize] | <code>boolean</code> | <code>false</code> | prevent use of animation key frames to minimise file size (slow) |
| [options.mixed] | <code>boolean</code> | <code>false</code> | allow mixture of lossy and lossless animation frames (slow) |
| [options.force] | <code>boolean</code> | <code>true</code> | force WebP output, otherwise attempt to use input format |
**Example**
```js
// Convert any input to lossless WebP output // Convert any input to lossless WebP output
const data = await sharp(input) const data = await sharp(input)
.webp({ lossless: true }) .webp({ lossless: true })
.toBuffer(); .toBuffer();
``` ```
**Example**
```javascript ```js
// Optimise the file size of an animated WebP // Optimise the file size of an animated WebP
const outputWebp = await sharp(inputWebp, { animated: true }) const outputWebp = await sharp(inputWebp, { animated: true })
.webp({ effort: 6 }) .webp({ effort: 6 })
.toBuffer(); .toBuffer();
``` ```
* Throws **[Error][4]** Invalid options
Returns **Sharp**&#x20;
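The `delay` option accepts either a single value or one delay per frame. As a sketch of how a per-frame delay array might be derived from a target frame rate — `fpsToDelays` is a hypothetical helper, not part of sharp's API:

```js
// Build a per-frame delay array (in milliseconds) for animated WebP output.
// sharp only sees the resulting array via the `delay` option.
function fpsToDelays (fps, frameCount) {
  if (fps <= 0 || frameCount <= 0) {
    throw new Error('fps and frameCount must be positive');
  }
  const delay = Math.round(1000 / fps);
  return new Array(frameCount).fill(delay);
}

// Usage sketch (assumes an animated input and sharp installed):
// await sharp(inputWebp, { animated: true })
//   .webp({ delay: fpsToDelays(10, 30), loop: 0 })
//   .toFile('out.webp');
```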
## gif
Use these GIF options for the output image.

The first entry in the palette is reserved for transparency.

The palette of the input image will be re-used if possible.

**Throws**:

- <code>Error</code> Invalid options

**Since**: 0.30.0

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.reuse] | <code>boolean</code> | <code>true</code> | re-use existing palette, otherwise generate new (slow) |
| [options.progressive] | <code>boolean</code> | <code>false</code> | use progressive (interlace) scan |
| [options.colours] | <code>number</code> | <code>256</code> | maximum number of palette entries, including transparency, between 2 and 256 |
| [options.colors] | <code>number</code> | <code>256</code> | alternative spelling of `options.colours` |
| [options.effort] | <code>number</code> | <code>7</code> | CPU effort, between 1 (fastest) and 10 (slowest) |
| [options.dither] | <code>number</code> | <code>1.0</code> | level of Floyd-Steinberg error diffusion, between 0 (least) and 1 (most) |
| [options.interFrameMaxError] | <code>number</code> | <code>0</code> | maximum inter-frame error for transparency, between 0 (lossless) and 32 |
| [options.interPaletteMaxError] | <code>number</code> | <code>3</code> | maximum inter-palette error for palette reuse, between 0 and 256 |
| [options.loop] | <code>number</code> | <code>0</code> | number of animation iterations, use 0 for infinite animation |
| [options.delay] | <code>number</code> \| <code>Array.&lt;number&gt;</code> |  | delay(s) between animation frames (in milliseconds) |
| [options.force] | <code>boolean</code> | <code>true</code> | force GIF output, otherwise attempt to use input format |

**Example**
```js
// Convert PNG to GIF
await sharp(pngBuffer)
  .gif()
  .toBuffer();
```

**Example**
```js
// Convert animated WebP to animated GIF
await sharp('animated.webp', { animated: true })
  .toFile('animated.gif');
```

**Example**
```js
// Create a 128x128, cropped, non-dithered, animated thumbnail of an animated GIF
const out = await sharp('in.gif', { animated: true })
  .resize({ width: 128, height: 128 })
  .gif({ dither: 0 })
  .toBuffer();
```

**Example**
```js
// Lossy file size reduction of animated GIF
await sharp('in.gif', { animated: true })
  .gif({ interFrameMaxError: 8 })
  .toFile('optim.gif');
```
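Applications that accept user-supplied settings may want to pre-clamp them to the documented ranges before calling `gif()`. The helper below is a hypothetical sketch; sharp itself throws an `Error` on invalid options rather than clamping:

```js
// Clamp GIF palette/dither settings to the documented ranges:
// colours 2-256, dither 0-1, effort 1-10. Illustrative only, not part of sharp.
function clampGifOptions ({ colours = 256, dither = 1.0, effort = 7 } = {}) {
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));
  return {
    colours: Math.round(clamp(colours, 2, 256)),
    dither: clamp(dither, 0, 1),
    effort: Math.round(clamp(effort, 1, 10))
  };
}
```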
## jp2
Use these JP2 options for output image.

Requires libvips compiled with support for OpenJPEG.
The prebuilt binaries do not include this - see
[installing a custom libvips](https://sharp.pixelplumbing.com/install#custom-libvips).

**Throws**:

- <code>Error</code> Invalid options

**Since**: 0.29.1

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.quality] | <code>number</code> | <code>80</code> | quality, integer 1-100 |
| [options.lossless] | <code>boolean</code> | <code>false</code> | use lossless compression mode |
| [options.tileWidth] | <code>number</code> | <code>512</code> | horizontal tile size |
| [options.tileHeight] | <code>number</code> | <code>512</code> | vertical tile size |
| [options.chromaSubsampling] | <code>string</code> | <code>&quot;&#x27;4:4:4&#x27;&quot;</code> | set to '4:2:0' to use chroma subsampling |

**Example**
```js
// Convert any input to lossless JP2 output
const data = await sharp(input)
  .jp2({ lossless: true })
  .toBuffer();
```

**Example**
```js
// Convert any input to very high quality JP2 output
const data = await sharp(input)
  .jp2({
    quality: 100,
    chromaSubsampling: '4:4:4'
  })
  .toBuffer();
```
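The default 512x512 tile size determines how many tiles the encoder produces for a given image. A quick arithmetic sketch, independent of sharp:

```js
// Number of JP2 tiles for a given image size, using the documented
// default tile size of 512x512. Partial tiles at the edges count as tiles.
function jp2TileCount (width, height, tileWidth = 512, tileHeight = 512) {
  return Math.ceil(width / tileWidth) * Math.ceil(height / tileHeight);
}
```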
## tiff
Use these TIFF options for output image.

The `density` can be set in pixels/inch via [withMetadata](#withmetadata)
instead of providing `xres` and `yres` in pixels/mm.

**Throws**:

- <code>Error</code> Invalid options

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.quality] | <code>number</code> | <code>80</code> | quality, integer 1-100 |
| [options.force] | <code>boolean</code> | <code>true</code> | force TIFF output, otherwise attempt to use input format |
| [options.compression] | <code>string</code> | <code>&quot;&#x27;jpeg&#x27;&quot;</code> | compression options: none, jpeg, deflate, packbits, ccittfax4, lzw, webp, zstd, jp2k |
| [options.predictor] | <code>string</code> | <code>&quot;&#x27;horizontal&#x27;&quot;</code> | compression predictor options: none, horizontal, float |
| [options.pyramid] | <code>boolean</code> | <code>false</code> | write an image pyramid |
| [options.tile] | <code>boolean</code> | <code>false</code> | write a tiled tiff |
| [options.tileWidth] | <code>number</code> | <code>256</code> | horizontal tile size |
| [options.tileHeight] | <code>number</code> | <code>256</code> | vertical tile size |
| [options.xres] | <code>number</code> | <code>1.0</code> | horizontal resolution in pixels/mm |
| [options.yres] | <code>number</code> | <code>1.0</code> | vertical resolution in pixels/mm |
| [options.resolutionUnit] | <code>string</code> | <code>&quot;&#x27;inch&#x27;&quot;</code> | resolution unit options: inch, cm |
| [options.bitdepth] | <code>number</code> | <code>8</code> | reduce bitdepth to 1, 2 or 4 bit |

**Example**
```js
// Convert SVG input to LZW-compressed, 1 bit per pixel TIFF output
sharp('input.svg')
  .tiff({
    compression: 'lzw',
    bitdepth: 1
  })
  .toFile('1-bpp-output.tiff')
  .then(info => { ... });
```
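To see why `bitdepth: 1` helps for scanned documents, consider the packed row size of a single-channel image. This is illustrative arithmetic only — real TIFF writers may pad or align rows differently:

```js
// Approximate uncompressed bytes per row for a single-channel image at a
// given bitdepth (1, 2, 4 or 8), with rows packed to whole bytes.
function bytesPerRow (widthPx, bitdepth) {
  if (![1, 2, 4, 8].includes(bitdepth)) {
    throw new Error('bitdepth must be 1, 2, 4 or 8');
  }
  return Math.ceil((widthPx * bitdepth) / 8);
}

// A 1000px-wide row shrinks from 1000 bytes at 8-bit to 125 bytes at 1-bit,
// before any compression is applied.
```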
## avif
Use these AVIF options for output image.

Whilst it is possible to create AVIF images smaller than 16x16 pixels,
most web browsers do not display these properly.

AVIF image sequences are not supported.

**Throws**:

- <code>Error</code> Invalid options

**Since**: 0.27.0

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.quality] | <code>number</code> | <code>50</code> | quality, integer 1-100 |
| [options.lossless] | <code>boolean</code> | <code>false</code> | use lossless compression |
| [options.effort] | <code>number</code> | <code>4</code> | CPU effort, between 0 (fastest) and 9 (slowest) |
| [options.chromaSubsampling] | <code>string</code> | <code>&quot;&#x27;4:4:4&#x27;&quot;</code> | set to '4:2:0' to use chroma subsampling |

**Example**
```js
const data = await sharp(input)
  .avif({ effort: 2 })
  .toBuffer();
```

**Example**
```js
const data = await sharp(input)
  .avif({ lossless: true })
  .toBuffer();
```
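The effect of `chromaSubsampling` can be summarised per 2x2 block of pixels: '4:4:4' keeps a chroma sample for every pixel, while '4:2:0' keeps one per block. A minimal sketch of that ratio:

```js
// Chroma samples stored per 2x2 pixel block for the two subsampling modes
// accepted here: '4:4:4' keeps all four, '4:2:0' keeps one (a 4x reduction
// in chroma data before encoding).
function chromaSamplesPerBlock (subsampling) {
  switch (subsampling) {
    case '4:4:4': return 4;
    case '4:2:0': return 1;
    default: throw new Error("expected '4:4:4' or '4:2:0'");
  }
}
```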
## heif
Use these HEIF options for output image.

Support for patent-encumbered HEIC images using `hevc` compression requires the use of a
globally-installed libvips compiled with support for libheif, libde265 and x265.

**Throws**:

- <code>Error</code> Invalid options

**Since**: 0.23.0

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.quality] | <code>number</code> | <code>50</code> | quality, integer 1-100 |
| [options.compression] | <code>string</code> | <code>&quot;&#x27;av1&#x27;&quot;</code> | compression format: av1, hevc |
| [options.lossless] | <code>boolean</code> | <code>false</code> | use lossless compression |
| [options.effort] | <code>number</code> | <code>4</code> | CPU effort, between 0 (fastest) and 9 (slowest) |
| [options.chromaSubsampling] | <code>string</code> | <code>&quot;&#x27;4:4:4&#x27;&quot;</code> | set to '4:2:0' to use chroma subsampling |

**Example**
```js
const data = await sharp(input)
  .heif({ compression: 'hevc' })
  .toBuffer();
```
## jxl
Use these JPEG-XL (JXL) options for output image.

This feature is experimental, please do not use in production systems.

Requires libvips compiled with support for libjxl.
The prebuilt binaries do not include this - see
[installing a custom libvips](https://sharp.pixelplumbing.com/install#custom-libvips).

Image metadata (EXIF, XMP) is unsupported.

**Throws**:

- <code>Error</code> Invalid options

**Since**: 0.31.3

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.distance] | <code>number</code> | <code>1.0</code> | maximum encoding error, between 0 (highest quality) and 15 (lowest quality) |
| [options.quality] | <code>number</code> |  | calculate `distance` based on JPEG-like quality, between 1 and 100, overrides distance if specified |
| [options.decodingTier] | <code>number</code> | <code>0</code> | target decode speed tier, between 0 (highest quality) and 4 (lowest quality) |
| [options.lossless] | <code>boolean</code> | <code>false</code> | use lossless compression |
| [options.effort] | <code>number</code> | <code>7</code> | CPU effort, between 3 (fastest) and 9 (slowest) |
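Since this feature is experimental, it can be useful to fail fast on out-of-range values in application code. A hypothetical pre-flight check against the documented ranges (sharp performs its own validation and throws on invalid options):

```js
// Validate JXL options against the ranges documented above.
// Illustrative helper only, not part of sharp's API.
function checkJxlOptions ({ distance = 1.0, effort = 7, decodingTier = 0 } = {}) {
  if (distance < 0 || distance > 15) {
    throw new Error('distance must be between 0 and 15');
  }
  if (effort < 3 || effort > 9) {
    throw new Error('effort must be between 3 and 9');
  }
  if (decodingTier < 0 || decodingTier > 4) {
    throw new Error('decodingTier must be between 0 and 4');
  }
  return { distance, effort, decodingTier };
}
```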
## raw
Force output to be raw, uncompressed pixel data.
Pixel ordering is left-to-right, top-to-bottom, without padding.
Channel ordering will be RGB or RGBA for non-greyscale colourspaces.

**Throws**:

- <code>Error</code> Invalid options

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  | output options |
| [options.depth] | <code>string</code> | <code>&quot;&#x27;uchar&#x27;&quot;</code> | bit depth, one of: char, uchar (default), short, ushort, int, uint, float, complex, double, dpcomplex |

**Example**
```js
// Extract raw, unsigned 8-bit RGB pixel data from JPEG input
const { data, info } = await sharp('input.jpg')
  .raw()
  .toBuffer({ resolveWithObject: true });
```

**Example**
```js
// Extract alpha channel as raw, unsigned 16-bit pixel data from PNG input
const data = await sharp('input.png')
  .ensureAlpha()
  .extractChannel(3)
  .toColourspace('b-w')
  .raw({ depth: 'ushort' })
  .toBuffer();
```
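Because raw output has no padding, the expected buffer length is a simple product, which is handy for sanity-checking the `info` returned alongside the data:

```js
// Expected byte length of raw output: left-to-right, top-to-bottom, no
// padding, so width * height * channels * bytesPerSample. bytesPerSample
// is 1 for the default 'uchar' depth, 2 for 'ushort', and so on.
function rawByteLength (width, height, channels, bytesPerSample = 1) {
  return width * height * channels * bytesPerSample;
}

// e.g. data.length for a 640x480 RGB uchar buffer should equal
// rawByteLength(640, 480, 3)
```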
## tile
Use tile-based deep zoom (image pyramid) output.

Set the format and options for tile images via the `toFormat`, `jpeg`, `png` or `webp` functions.
Use a `.zip` or `.szi` file extension with `toFile` to write to a compressed archive file format.

The container will be set to `zip` when the output is a Buffer or Stream, otherwise it will default to `fs`.

Requires libvips compiled with support for libgsf.
The prebuilt binaries do not include this - see
[installing a custom libvips](https://sharp.pixelplumbing.com/install#custom-libvips).

**Throws**:

- <code>Error</code> Invalid parameters

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> |  |  |
| [options.size] | <code>number</code> | <code>256</code> | tile size in pixels, a value between 1 and 8192. |
| [options.overlap] | <code>number</code> | <code>0</code> | tile overlap in pixels, a value between 0 and 8192. |
| [options.angle] | <code>number</code> | <code>0</code> | tile angle of rotation, must be a multiple of 90. |
| [options.background] | <code>string</code> \| <code>Object</code> | <code>&quot;{r: 255, g: 255, b: 255, alpha: 1}&quot;</code> | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to white without transparency. |
| [options.depth] | <code>string</code> |  | how deep to make the pyramid, possible values are `onepixel`, `onetile` or `one`, default based on layout. |
| [options.skipBlanks] | <code>number</code> | <code>-1</code> | threshold to skip tile generation, a value 0 - 255 for 8-bit images or 0 - 65535 for 16-bit images |
| [options.container] | <code>string</code> | <code>&quot;&#x27;fs&#x27;&quot;</code> | tile container, with value `fs` (filesystem) or `zip` (compressed file). |
| [options.layout] | <code>string</code> | <code>&quot;&#x27;dz&#x27;&quot;</code> | filesystem layout, possible values are `dz`, `iiif`, `iiif3`, `zoomify` or `google`. |
| [options.centre] | <code>boolean</code> | <code>false</code> | centre image in tile. |
| [options.center] | <code>boolean</code> | <code>false</code> | alternative spelling of centre. |
| [options.id] | <code>string</code> | <code>&quot;&#x27;https://example.com/iiif&#x27;&quot;</code> | when `layout` is `iiif`/`iiif3`, sets the `@id`/`id` attribute of `info.json` |
| [options.basename] | <code>string</code> |  | the name of the directory within the zip file when container is `zip`. |

**Example**
```js
sharp('input.tiff')
  .png()
  .tile({
    size: 512
  })
  .toFile('output.dz', (err, info) => {
    // output_files contains 512x512 tiles grouped by zoom level
  });
```

**Example**
```js
const zipFileWithTiles = await sharp(input)
  .tile({ basename: "tiles" })
  .toBuffer();
```

**Example**
```js
const iiififier = sharp().tile({ layout: "iiif" });
readableStream
  .pipe(iiififier)
  .pipe(writeableStream);
```
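A deep zoom pyramid halves the larger dimension at each level until it reaches a single pixel, so the number of zoom levels can be estimated without running sharp. A sketch, assuming the default `dz` layout with a full (`onepixel`-style) pyramid:

```js
// Estimated number of zoom levels in a deep zoom pyramid: the larger
// dimension is halved per level down to 1 pixel, giving
// ceil(log2(max(w, h))) + 1 levels. Arithmetic sketch only.
function deepZoomLevels (width, height) {
  return Math.ceil(Math.log2(Math.max(width, height))) + 1;
}
```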
## timeout
Set a timeout for processing, in seconds.
Use a value of zero to continue processing indefinitely, the default behaviour.

The clock starts when libvips opens an input image for processing.
Time spent waiting for a libuv thread to become available is not included.

**Since**: 0.29.2

| Param | Type | Description |
| --- | --- | --- |
| options | <code>Object</code> |  |
| options.seconds | <code>number</code> | Number of seconds after which processing will be stopped |

**Example**
```js
// Ensure processing takes no longer than 3 seconds
try {
  const data = await sharp(input)
    .blur(1000)
    .timeout({ seconds: 3 })
    .toBuffer();
} catch (err) {
  if (err.message.includes('timeout')) { ... }
}
```
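For deadlines that cover more than a single pipeline, an application-level guard can complement sharp's own `timeout`. The sketch below races a promise against a deadline; unlike sharp's `timeout`, it does not stop the underlying libvips processing, it only rejects the waiting promise:

```js
// Reject a pending promise after `seconds`, using Promise.race.
// Generic helper, not part of sharp's API; the timer is always cleared
// so the process is not kept alive after the promise settles.
function withTimeout (promise, seconds) {
  let timer;
  const deadline = new Promise((resolve, reject) => {
    timer = setTimeout(() => reject(new Error('timeout')), seconds * 1000);
  });
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}

// Usage sketch:
// const data = await withTimeout(sharp(input).toBuffer(), 3);
```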
## resize ## resize
Resize image to `width`, `height` or `width x height`. Resize image to `width`, `height` or `width x height`.
When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are: When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are:
- `cover`: (default) Preserving aspect ratio, attempt to ensure the image covers both provided dimensions by cropping/clipping to fit.
- `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary.
- `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions.
- `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified.
- `outside`: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified.
* `cover`: (default) Preserving aspect ratio, ensure the image covers both provided dimensions by cropping/clipping to fit. Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property.
* `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary.
* `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions.
* `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified.
* `outside`: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified.
Some of these values are based on the [object-fit][1] CSS property.
<img alt="Examples of various values for the fit property when resizing" width="100%" style="aspect-ratio: 998/243" src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/api-resize-fit.png"> <img alt="Examples of various values for the fit property when resizing" width="100%" style="aspect-ratio: 998/243" src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/api-resize-fit.png">
When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are: When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are:
- `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`.
- `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`.
- `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy.
* `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`. Some of these values are based on the [object-position](https://developer.mozilla.org/en-US/docs/Web/CSS/object-position) CSS property.
* `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`.
* `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy.
Some of these values are based on the [object-position][2] CSS property.
The experimental strategy-based approach resizes so one dimension is at its target length The experimental strategy-based approach resizes so one dimension is at its target length
then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy. then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy.
- `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29).
* `entropy`: focus on the region with the highest [Shannon entropy][3]. - `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones.
* `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones.
Possible interpolation kernels are: Possible interpolation kernels are:
- `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation).
* `nearest`: Use [nearest neighbour interpolation][4]. - `cubic`: Use a [Catmull-Rom spline](https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline).
* `cubic`: Use a [Catmull-Rom spline][5]. - `mitchell`: Use a [Mitchell-Netravali spline](https://www.cs.utexas.edu/~fussell/courses/cs384g-fall2013/lectures/mitchell/Mitchell.pdf).
* `mitchell`: Use a [Mitchell-Netravali spline][6]. - `lanczos2`: Use a [Lanczos kernel](https://en.wikipedia.org/wiki/Lanczos_resampling#Lanczos_kernel) with `a=2`.
* `lanczos2`: Use a [Lanczos kernel][7] with `a=2`. - `lanczos3`: Use a Lanczos kernel with `a=3` (the default).
* `lanczos3`: Use a Lanczos kernel with `a=3` (the default).
Only one resize can occur per pipeline. Only one resize can occur per pipeline.
Previous calls to `resize` in the same pipeline will be ignored. Previous calls to `resize` in the same pipeline will be ignored.
**Throws**:

- <code>Error</code> Invalid parameters


| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [width] | <code>number</code> | | How many pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. |
| [height] | <code>number</code> | | How many pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. |
| [options] | <code>Object</code> | | |
| [options.width] | <code>number</code> | | An alternative means of specifying `width`. If both are present this takes priority. |
| [options.height] | <code>number</code> | | An alternative means of specifying `height`. If both are present this takes priority. |
| [options.fit] | <code>String</code> | <code>&#x27;cover&#x27;</code> | How the image should be resized/cropped to fit the target dimension(s), one of `cover`, `contain`, `fill`, `inside` or `outside`. |
| [options.position] | <code>String</code> | <code>&#x27;centre&#x27;</code> | A position, gravity or strategy to use when `fit` is `cover` or `contain`. |
| [options.background] | <code>String</code> \| <code>Object</code> | <code>{r: 0, g: 0, b: 0, alpha: 1}</code> | background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. |
| [options.kernel] | <code>String</code> | <code>&#x27;lanczos3&#x27;</code> | The kernel to use for image reduction. Use the `fastShrinkOnLoad` option to control kernel vs shrink-on-load. |
| [options.withoutEnlargement] | <code>Boolean</code> | <code>false</code> | Do not scale up if the width *or* height are already less than the target dimensions, equivalent to GraphicsMagick's `>` geometry option. This may result in output dimensions smaller than the target dimensions. |
| [options.withoutReduction] | <code>Boolean</code> | <code>false</code> | Do not scale down if the width *or* height are already greater than the target dimensions, equivalent to GraphicsMagick's `<` geometry option. This may still result in a crop to reach the target dimensions. |
| [options.fastShrinkOnLoad] | <code>Boolean</code> | <code>true</code> | Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern or round-down of an auto-scaled dimension. |
**Example**
```js
sharp(input)
  .resize({ width: 100 })
  .toBuffer()
  .then(data => {
    // 100 pixels wide, auto-scaled height
  });
```
**Example**
```js
sharp(input)
  .resize({ height: 100 })
  .toBuffer()
  .then(data => {
    // 100 pixels high, auto-scaled width
  });
```
**Example**
```js
sharp(input)
  .resize(200, 300, {
    kernel: sharp.kernel.nearest,
    fit: 'contain',
    position: 'right top',
    background: { r: 255, g: 255, b: 255, alpha: 0.5 }
  })
  .toFile(output)
  .then(() => {
    // output.png is a 200 pixels wide and 300 pixels high image
    // containing a nearest-neighbour scaled version
    // contained within the north-east corner of a semi-transparent white canvas
  });
```
**Example**
```js
const transformer = sharp()
  .resize({
    width: 200,
    height: 200,
    fit: sharp.fit.cover,
    position: sharp.strategy.entropy
  });
readableStream
  .pipe(transformer)
  .pipe(writableStream);
```
**Example**
```js
sharp(input)
  .resize(200, 200, {
    fit: sharp.fit.inside,
    withoutEnlargement: true
  })
  .toFormat('jpeg')
  .toBuffer()
  .then(function(outputBuffer) {
    // outputBuffer contains JPEG image data
    // no wider and no higher than 200 pixels
    // and no larger than the input image
  });
```
**Example**
```js
sharp(input)
  .resize(200, 200, {
    fit: sharp.fit.outside,
    withoutReduction: true
  })
  .toFormat('jpeg')
  .toBuffer()
  .then(function(outputBuffer) {
    // outputBuffer contains JPEG image data
    // of at least 200 pixels wide and 200 pixels high while maintaining aspect ratio
    // and no smaller than the input image
  });
```
**Example**
```js
const scaleByHalf = await sharp(input)
  .metadata()
  .then(({ width }) => sharp(input)
    .resize(Math.round(width * 0.5))
    .toBuffer()
  );
```
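The aspect-ratio arithmetic behind `fit: 'inside'` can be sketched as follows (an illustration only; the `fitInside` helper is hypothetical, sharp performs this internally via libvips):

```javascript
// Hypothetical helper: scale so both edges fit within the target box,
// preserving aspect ratio, as 'inside' does.
function fitInside (srcWidth, srcHeight, targetWidth, targetHeight) {
  const scale = Math.min(targetWidth / srcWidth, targetHeight / srcHeight);
  return {
    width: Math.round(srcWidth * scale),
    height: Math.round(srcHeight * scale)
  };
}
// fitInside(4000, 3000, 200, 200) → { width: 200, height: 150 }
```

Swapping `Math.min` for `Math.max` gives the equivalent sketch for `fit: 'outside'`.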
## extend
Extend / pad / extrude one or more edges of the image with either
the provided background colour or pixels derived from the image.
This operation will always occur after resizing and extraction, if any.
**Throws**:

- <code>Error</code> Invalid parameters


| Param | Type | Default | Description |
| --- | --- | --- | --- |
| extend | <code>number</code> \| <code>Object</code> | | single pixel count to add to all edges or an Object with per-edge counts |
| [extend.top] | <code>number</code> | <code>0</code> | |
| [extend.left] | <code>number</code> | <code>0</code> | |
| [extend.bottom] | <code>number</code> | <code>0</code> | |
| [extend.right] | <code>number</code> | <code>0</code> | |
| [extend.extendWith] | <code>String</code> | <code>&#x27;background&#x27;</code> | populate new pixels using this method, one of: background, copy, repeat, mirror. |
| [extend.background] | <code>String</code> \| <code>Object</code> | <code>{r: 0, g: 0, b: 0, alpha: 1}</code> | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. |
**Example**
```js
// Resize to 140 pixels wide, then add 10 transparent pixels
// to the top, left and right edges and 20 to the bottom edge
sharp(input)
  .resize(140)
  .extend({
    top: 10,
    bottom: 20,
    left: 10,
    right: 10,
    background: { r: 0, g: 0, b: 0, alpha: 0 }
  })
  ...
```
**Example**
```js
// Add a row of 10 red pixels to the bottom
sharp(input)
  .extend({
    bottom: 10,
    background: 'red'
  })
  ...
```
**Example**
```js
// Extrude image by 8 pixels to the right, mirroring existing right hand edge
sharp(input)
  .extend({
    right: 8,
    extendWith: 'mirror'
  })
  ...
```
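The resulting canvas size is simply the input size plus the per-edge counts; as a sketch (the `extendedSize` helper is hypothetical, not part of sharp's API):

```javascript
// Hypothetical helper: output dimensions after extend().
function extendedSize (width, height, { top = 0, left = 0, bottom = 0, right = 0 } = {}) {
  return {
    width: width + left + right,
    height: height + top + bottom
  };
}
// extendedSize(140, 100, { top: 10, bottom: 20, left: 10, right: 10 })
//   → { width: 160, height: 130 }
```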
## extract
Extract/crop a region of the image.
- Use `extract` before `resize` for pre-resize extraction.
- Use `extract` after `resize` for post-resize extraction.
- Use `extract` before and after for both.
**Throws**:

- <code>Error</code> Invalid parameters


| Param | Type | Description |
| --- | --- | --- |
| options | <code>Object</code> | describes the region to extract using integral pixel values |
| options.left | <code>number</code> | zero-indexed offset from left edge |
| options.top | <code>number</code> | zero-indexed offset from top edge |
| options.width | <code>number</code> | width of region to extract |
| options.height | <code>number</code> | height of region to extract |
**Example**
```js
sharp(input)
  .extract({ left: left, top: top, width: width, height: height })
  .toFile(output, function(err) {
    // Extract a region of the input image, saving in the same format.
  });
```
**Example**
```js
sharp(input)
  .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre })
  .resize(width, height)
  .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost })
  .toFile(output, function(err) {
    // Extract a region, resize, then extract from the resized image
  });
```
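Because the region is specified with integral pixel values, it must lie fully within the image; a sketch of that bounds check (hypothetical helper, for illustration only):

```javascript
// Hypothetical sketch of the validation implied by "integral pixel values":
// offsets are non-negative integers and the region stays inside the image.
function isValidRegion (imageWidth, imageHeight, { left, top, width, height }) {
  return [left, top, width, height].every(Number.isInteger) &&
    left >= 0 && top >= 0 && width > 0 && height > 0 &&
    left + width <= imageWidth &&
    top + height <= imageHeight;
}
```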
## trim
Trim pixels from all edges that contain values similar to the given background colour, which defaults to that of the top-left pixel.
Images with an alpha channel will use the combined bounding box of alpha and non-alpha channels.
If the result of this operation would trim an image to nothing then no change is made.
The `info` response Object, obtained from callback of `.toFile()` or `.toBuffer()`,
will contain `trimOffsetLeft` and `trimOffsetTop` properties.
**Throws**:

- <code>Error</code> Invalid parameters


| Param | Type | Default | Description |
| --- | --- | --- | --- |
| trim | <code>string</code> \| <code>number</code> \| <code>Object</code> | | the specific background colour to trim, the threshold for doing so or an Object with both. |
| [trim.background] | <code>string</code> \| <code>Object</code> | <code>&quot;&#x27;top-left pixel&#x27;&quot;</code> | background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to that of the top-left pixel. |
| [trim.threshold] | <code>number</code> | <code>10</code> | the allowed difference from the above colour, a positive number. |
**Example**
```js
// Trim pixels with a colour similar to that of the top-left pixel.
sharp(input)
  .trim()
  .toFile(output, function(err, info) {
    ...
  });
```
**Example**
```js
// Trim pixels with the exact same colour as that of the top-left pixel.
sharp(input)
  .trim(0)
  .toFile(output, function(err, info) {
    ...
  });
```
**Example**
```js
// Trim only pixels with a similar colour to red.
sharp(input)
  .trim("#FF0000")
  .toFile(output, function(err, info) {
    ...
  });
```
**Example**
```js
// Trim all "yellow-ish" pixels, being more lenient with the higher threshold.
sharp(input)
  .trim({
    background: "yellow",
    threshold: 42,
  })
  .toFile(output, function(err, info) {
    ...
  });
```
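The per-pixel "similar colour" test can be pictured as a per-channel difference bounded by `threshold` (a hypothetical sketch; libvips' actual metric may differ):

```javascript
// Hypothetical sketch of the similarity test behind trim():
// the largest per-channel absolute difference must stay within the threshold.
function isSimilar (pixel, background, threshold = 10) {
  return Math.max(
    Math.abs(pixel.r - background.r),
    Math.abs(pixel.g - background.g),
    Math.abs(pixel.b - background.b)
  ) <= threshold;
}
// With the default threshold of 10, near-white counts as similar to white:
// isSimilar({ r: 250, g: 248, b: 252 }, { r: 255, g: 255, b: 255 }) === true
```

A threshold of `0`, as in the `.trim(0)` example above, demands an exact colour match.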
## versions
An Object containing the version numbers of sharp, libvips and its dependencies.
**Example**
```js
console.log(sharp.versions);
```
## interpolators
An Object containing the available interpolators and their proper values
**Read only**: true
**Properties**
| Name | Type | Default | Description |
| --- | --- | --- | --- |
| nearest | <code>string</code> | <code>&quot;nearest&quot;</code> | [Nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). Suitable for image enlargement only. |
| bilinear | <code>string</code> | <code>&quot;bilinear&quot;</code> | [Bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation). Faster than bicubic but with less smooth results. |
| bicubic | <code>string</code> | <code>&quot;bicubic&quot;</code> | [Bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). |
| locallyBoundedBicubic | <code>string</code> | <code>&quot;lbb&quot;</code> | [LBB interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/lbb.cpp#L100). Prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. |
| nohalo | <code>string</code> | <code>&quot;nohalo&quot;</code> | [Nohalo interpolation](http://eprints.soton.ac.uk/268086/). Prevents acutance but typically reduces performance by a factor of 3. |
| vertexSplitQuadraticBasisSpline | <code>string</code> | <code>&quot;vsqbs&quot;</code> | [VSQBS interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/vsqbs.cpp#L48). Prevents "staircasing" when enlarging. |
## format
An Object containing nested boolean values representing the available input and output formats/methods.
**Example**
```js
console.log(sharp.format);
```
## vendor
An Object containing the platform and architecture
of the current and installed vendored binaries.
**Example**
```js
console.log(sharp.vendor);
```
## queue
An EventEmitter that emits a `change` event when a task is either:
- queued, waiting for _libuv_ to provide a worker thread
- complete
**Example**
```js
sharp.queue.on('change', function(queueLength) {
  console.log('Queue contains ' + queueLength + ' task(s)');
});
```
## cache
Gets or, when options are provided, sets the limits of _libvips'_ operation cache.
Existing entries in the cache will be trimmed after any change in limits.
This method always returns cache statistics,
useful for determining how much working memory is required for a particular task.
| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [options] | <code>Object</code> \| <code>boolean</code> | <code>true</code> | Object with the following attributes, or boolean where true uses default cache settings and false removes all caching |
| [options.memory] | <code>number</code> | <code>50</code> | is the maximum memory in MB to use for this cache |
| [options.files] | <code>number</code> | <code>20</code> | is the maximum number of files to hold open |
| [options.items] | <code>number</code> | <code>100</code> | is the maximum number of operations to cache |
**Example**
```js
const stats = sharp.cache();
```
**Example**
```js
sharp.cache( { items: 200 } );
sharp.cache( { files: 0 } );
sharp.cache(false);
```
## concurrency
Gets or, when a concurrency is provided, sets
the maximum number of threads _libvips_ should use to process _each image_.
These are from a thread pool managed by glib,
which helps avoid the overhead of creating new threads.
The maximum number of images that sharp can process in parallel
is controlled by libuv's `UV_THREADPOOL_SIZE` environment variable,
which defaults to 4.
https://nodejs.org/api/cli.html#uv_threadpool_sizesize
For example, by default, a machine with 8 CPU cores will process
4 images in parallel and use up to 8 threads per image,
so there will be up to 32 concurrent threads.
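That worked example is just multiplication; as a sketch (the `worstCaseThreads` helper is hypothetical, not part of sharp's API):

```javascript
// Illustrative arithmetic: worst-case concurrent threads =
// images processed in parallel (libuv pool) × libvips threads per image.
function worstCaseThreads (cpuCores, uvThreadpoolSize = 4) {
  return uvThreadpoolSize * cpuCores;
}
// worstCaseThreads(8) === 32, matching the 8-core example above
```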
**Returns**: <code>number</code> - concurrency

| Param | Type |
| --- | --- |
| [concurrency] | <code>number</code> |
**Example**
```js
const threads = sharp.concurrency(); // 4
sharp.concurrency(2); // 2
sharp.concurrency(0); // 4
```
## counters
Provides access to internal task counters.
- queue is the number of tasks this module has queued waiting for _libuv_ to provide a worker thread from its pool.
- process is the number of resize tasks currently being processed.
**Example**
```js
const counters = sharp.counters(); // { queue: 2, process: 4 }
```
## simd
Get and set use of SIMD vector unit instructions.
Requires libvips to have been compiled with liborc support.
Improves the performance of `resize`, `blur` and `sharpen` operations
by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON.
| Param | Type | Default |
| --- | --- | --- |
| [simd] | <code>boolean</code> | <code>true</code> |
**Example**
```js
const simd = sharp.simd();
// simd is `true` if the runtime use of liborc is currently enabled
```
**Example**
```js
const simd = sharp.simd(false);
// prevent libvips from using liborc at runtime
```
## block
Block libvips operations at runtime.
This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable,
which when set will block all "untrusted" operations.
**Since**: 0.32.4

| Param | Type | Description |
| --- | --- | --- |
| options | <code>Object</code> | |
| options.operation | <code>Array.&lt;string&gt;</code> | List of libvips low-level operation names to block. |
**Example** *(Block all TIFF input.)*
```js
sharp.block({
  operation: ['VipsForeignLoadTiff']
});
```
## unblock
Unblock libvips operations at runtime.
This is useful for defining a list of allowed operations.
**Since**: 0.32.4

| Param | Type | Description |
| --- | --- | --- |
| options | <code>Object</code> | |
| options.operation | <code>Array.&lt;string&gt;</code> | List of libvips low-level operation names to unblock. |
**Example** *(Block all input except WebP from the filesystem.)*
```js
sharp.block({
  operation: ['VipsForeignLoad']
});
sharp.unblock({
  operation: ['VipsForeignLoadWebpFile']
});
```
**Example** *(Block all input except JPEG and PNG from a Buffer or Stream.)*
```js
sharp.block({
  operation: ['VipsForeignLoad']
});
sharp.unblock({
  operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer']
});
```
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';

const fs = require('fs').promises;
const path = require('path');
const jsdoc2md = require('jsdoc-to-markdown');
[
  'constructor',
  'input',
  'resize',
  'composite',
  'operation',
  'channel',
  'colour',
  'output',
  'utility'
].forEach(async (m) => {
  const input = path.join('lib', `${m}.js`);
  const output = path.join('docs', `api-${m}.md`);
  const ast = await jsdoc2md.getTemplateData({ files: input });
  const markdown = await jsdoc2md.render({
    data: ast,
    'global-index-format': 'none',
    'module-index-format': 'none'
  });
  const cleanMarkdown = markdown
    .replace(/(## [A-Za-z0-9]+)[^\n]*/g, '$1') // simplify headings to match those of documentationjs, ensures existing URLs work
    .replace(/<a name="[A-Za-z0-9+]+"><\/a>/g, '') // remove anchors, let docute add these (at markdown to HTML render time)
    .replace(/\*\*Kind\*\*: global[^\n]+/g, '') // remove all "global" Kind labels (requires JSDoc refactoring)
    .trim();
  await fs.writeFile(output, cleanMarkdown);
});
# Changelog
## v0.32 - *flow*
Requires libvips v8.14.3
### v0.32.4 - 21st July 2023
* Upgrade to libvips v8.14.3 for upstream bug fixes.
* Expose ability to (un)block low-level libvips operations by name.
* Prebuilt binaries: restore support for tile-based output.
[#3581](https://github.com/lovell/sharp/issues/3581)
### v0.32.3 - 14th July 2023
* Expose `preset` option for WebP output.
[#3639](https://github.com/lovell/sharp/issues/3639)
* Ensure decoding remains sequential for all operations (regression in 0.32.2).
[#3725](https://github.com/lovell/sharp/issues/3725)
### v0.32.2 - 11th July 2023
* Limit HEIF output dimensions to 16384x16384, matches libvips.
* Ensure exceptions are not thrown when terminating.
[#3569](https://github.com/lovell/sharp/issues/3569)
* Ensure the same access method is used for all inputs (regression in 0.32.0).
[#3669](https://github.com/lovell/sharp/issues/3669)
* Improve detection of jp2 filename extensions.
[#3674](https://github.com/lovell/sharp/pull/3674)
[@bianjunjie1981](https://github.com/bianjunjie1981)
* Guard use of smartcrop premultiplied option to prevent warning (regression in 0.32.1).
[#3710](https://github.com/lovell/sharp/issues/3710)
* Prevent over-compute in affine-based rotate before resize.
[#3722](https://github.com/lovell/sharp/issues/3722)
* Allow sequential read for EXIF-based auto-orientation.
[#3725](https://github.com/lovell/sharp/issues/3725)
### v0.32.1 - 27th April 2023
* Add experimental `unflatten` operation.
[#3461](https://github.com/lovell/sharp/pull/3461)
[@antonmarsden](https://github.com/antonmarsden)
* Ensure use of `flip` operation forces random access read (regression in 0.32.0).
[#3600](https://github.com/lovell/sharp/issues/3600)
* Ensure `linear` operation works with 16-bit input (regression in 0.31.3).
[#3605](https://github.com/lovell/sharp/issues/3605)
* Install: ensure proxy URLs are logged correctly.
[#3615](https://github.com/lovell/sharp/pull/3615)
[@TomWis97](https://github.com/TomWis97)
* Ensure profile-less CMYK to CMYK roundtrip skips colourspace conversion.
[#3620](https://github.com/lovell/sharp/issues/3620)
* Add support for `modulate` operation when using non-sRGB pipeline colourspace.
[#3620](https://github.com/lovell/sharp/issues/3620)
* Ensure `trim` operation works with CMYK images (regression in 0.31.0).
[#3636](https://github.com/lovell/sharp/issues/3636)
* Install: coerce libc version to semver.
[#3641](https://github.com/lovell/sharp/issues/3641)
### v0.32.0 - 24th March 2023
* Default to using sequential rather than random access read where possible.
* Replace GIF output `optimise` / `optimize` option with `reuse`.
* Add `progressive` option to GIF output for interlacing.
* Add `wrap` option to text image creation.
* Add `formatMagick` property to metadata of images loaded via *magick.
* Prefer integer (un)premultiply for faster resizing of RGBA images.
* Add `ignoreIcc` input option to ignore embedded ICC profile.
* Allow use of GPS (IFD3) EXIF metadata.
[#2767](https://github.com/lovell/sharp/issues/2767)
* TypeScript definitions are now maintained and published directly, deprecating the `@types/sharp` package.
[#3369](https://github.com/lovell/sharp/issues/3369)
* Prebuilt binaries: ensure macOS 10.13+ support, as documented.
[#3438](https://github.com/lovell/sharp/issues/3438)
* Prebuilt binaries: prevent use of glib slice allocator, improves QEMU support.
[#3448](https://github.com/lovell/sharp/issues/3448)
* Add focus point coordinates to output when using attention based crop.
[#3470](https://github.com/lovell/sharp/pull/3470)
[@ejoebstl](https://github.com/ejoebstl)
* Expose sharp version as `sharp.versions.sharp`.
[#3471](https://github.com/lovell/sharp/issues/3471)
* Respect `fastShrinkOnLoad` resize option for WebP input.
[#3516](https://github.com/lovell/sharp/issues/3516)
* Reduce sharpen `sigma` maximum from 10000 to 10.
[#3521](https://github.com/lovell/sharp/issues/3521)
* Add support for `ArrayBuffer` input.
[#3548](https://github.com/lovell/sharp/pull/3548)
[@kapouer](https://github.com/kapouer)
* Add support to `extend` operation for `extendWith` to allow copy/mirror/repeat.
[#3556](https://github.com/lovell/sharp/pull/3556)
[@janaz](https://github.com/janaz)
* Ensure all async JS callbacks are wrapped to help avoid possible race condition.
[#3569](https://github.com/lovell/sharp/issues/3569)
* Prebuilt binaries: support for tile-based output temporarily removed due to licensing issue.
[#3581](https://github.com/lovell/sharp/issues/3581)
* Add support to `normalise` for `lower` and `upper` percentiles.
[#3583](https://github.com/lovell/sharp/pull/3583)
[@LachlanNewman](https://github.com/LachlanNewman)
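The percentile-driven `normalise` behaviour added above can be pictured with a small plain-JavaScript sketch. This is only an illustration of the idea (a linear contrast stretch between two percentiles), not sharp's libvips-based implementation; the `percentile` and `normalise` helpers here are hypothetical.

```javascript
// Illustrative only: a linear contrast stretch between two percentiles,
// the idea behind sharp's normalise({ lower, upper }) option.

// Nearest-rank percentile on a pre-sorted array of channel values
function percentile (sorted, p) {
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[Math.min(index, sorted.length - 1)];
}

// Map values so the lower percentile becomes 0 and the upper becomes 255,
// clamping anything outside that range
function normalise (values, lower = 1, upper = 99) {
  const sorted = [...values].sort((a, b) => a - b);
  const lo = percentile(sorted, lower);
  const hi = percentile(sorted, upper);
  const range = (hi - lo) || 1;
  return values.map((v) =>
    Math.round(Math.min(255, Math.max(0, ((v - lo) / range) * 255)))
  );
}
```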
## v0.31 - *eagle*
Requires libvips v8.13.3


@@ -263,3 +263,15 @@ GitHub: https://github.com/antonmarsden
Name: Marcos Casagrande
GitHub: https://github.com/marcosc90
Name: Emanuel Jöbstl
GitHub: https://github.com/ejoebstl
Name: Tomasz Janowski
GitHub: https://github.com/janaz
Name: Lachlan Newman
GitHub: https://github.com/LachlanNewman
Name: BJJ
GitHub: https://github.com/bianjunjie1981


@@ -5,13 +5,15 @@
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="description" content="Resize large images in common formats to smaller, web-friendly JPEG, PNG, WebP, GIF and AVIF images of varying dimensions">
<meta property="og:title" content="sharp - High performance Node.js image processing">
<meta property="og:image" content="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/sharp-logo-600.png">
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; object-src 'none'; style-src 'unsafe-inline';
img-src 'unsafe-inline' data: https://cdn.jsdelivr.net/gh/lovell/ https://www.google-analytics.com;
connect-src 'self' https://www.google-analytics.com;
script-src 'self' 'unsafe-inline' 'unsafe-eval'
https://www.google-analytics.com/analytics.js;">
<link rel="icon" type="image/svg+xml" href="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/sharp-logo.svg">
<link rel="icon" type="image/png" sizes="32x32" href="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/sharp-logo-32.png">
<link rel="author" href="/humans.txt" type="text/plain">
<link rel="dns-prefetch" href="https://www.google-analytics.com">
<script type="application/ld+json">
@@ -29,7 +31,7 @@
"@type": "Person",
"name": "Lovell Fuller"
},
"copyrightYear": 2013,
"license": "https://www.apache.org/licenses/LICENSE-2.0"
}
</script>
@@ -77,9 +79,7 @@
.map(function (sidebarLink) {
return sidebarLink.title;
})[0];
return title ? `# ${title}\n${md}` : md;
});
}
};


@@ -28,7 +28,7 @@ is downloaded via HTTPS, verified via Subresource Integrity
and decompressed into `node_modules/sharp/vendor` during `npm install`.
This provides support for the
JPEG, PNG, WebP, AVIF (limited to 8-bit depth), TIFF, GIF and SVG (input) image formats.
The following platforms have prebuilt libvips but not sharp:
@@ -115,7 +115,8 @@ and that it can be located using `pkg-config --modversion vips-cpp`.
For help compiling libvips and its dependencies, please see
[building libvips from source](https://www.libvips.org/install.html#building-libvips-from-source).
The use of a globally-installed libvips is unsupported on Windows
and on macOS when running Node.js under Rosetta.
## Building from source
@@ -147,7 +148,7 @@ or the `npm_config_sharp_local_prebuilds` environment variable.
URL example:
if `sharp_binary_host` is set to `https://hostname/path`
and the sharp version is `1.2.3` then the resultant URL will be
`https://hostname/path/v1.2.3/sharp-v1.2.3-napi-v5-platform-arch.tar.gz`.
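That URL shape can be expressed as a one-line helper. This is only a sketch of the documented pattern; the function name and parameters are made up and not part of sharp's install scripts:

```javascript
// Assemble the prebuilt download URL described above.
// Purely illustrative; sharp's install script builds this internally.
function prebuiltSharpUrl (host, version, napiVersion, platformAndArch) {
  return `${host}/v${version}/sharp-v${version}-napi-v${napiVersion}-${platformAndArch}.tar.gz`;
}
```

For example, `prebuiltSharpUrl('https://hostname/path', '1.2.3', 5, 'platform-arch')` yields the URL shown above.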
Filename example:
if `sharp_local_prebuilds` is set to `/path`
@@ -178,6 +179,16 @@ and the libvips version is `4.5.6` then the resultant filename will be
See the Chinese mirror below for a further example.
If these binaries are modified, new integrity hashes can be provided
at install time via `npm_package_config_integrity_platform_arch`
environment variables, for example set
`npm_package_config_integrity_linux_x64` to `sha512-abc...`.
The integrity hash of a file can be generated via:
```sh
sha512sum libvips-x.y.z-platform-arch.tar.br | cut -f1 -d' ' | xxd -r -p | base64 -w 0
```
## Chinese mirror
A mirror site based in China, provided by Alibaba, contains binaries for both sharp and libvips.
@@ -227,16 +238,6 @@ the use of an alternative memory allocator such as
Those using musl-based Linux (e.g. Alpine) and non-Linux systems are
unaffected.
## Heroku
Add the
[jemalloc buildpack](https://github.com/gaffneyc/heroku-buildpack-jemalloc)
to reduce the effects of memory fragmentation.
Set
[NODE_MODULES_CACHE](https://devcenter.heroku.com/articles/nodejs-support#cache-behavior)
to `false` when using the `yarn` package manager.
## AWS Lambda
The `node_modules` directory of the
@@ -255,6 +256,9 @@ SHARP_IGNORE_GLOBAL_LIBVIPS=1 npm install --arch=x64 --platform=linux --libc=gli
To get the best performance select the largest memory available.
A 1536 MB function provides ~12x more CPU time than a 128 MB function.
When integrating with AWS API Gateway, ensure it is configured with the relevant
[binary media types](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings.html).
## Bundlers
### webpack
@@ -301,6 +305,17 @@ custom:
- npm install --arch=x64 --platform=linux sharp
```
## TypeScript
TypeScript definitions are published as part of
the `sharp` package from v0.32.0.
Previously these were available via the `@types/sharp` package,
which is now deprecated.
When using TypeScript, please ensure `devDependencies` includes
the `@types/node` package.
## Fonts
When creating text images or rendering SVG images that contain text elements,


@@ -7,58 +7,98 @@ and using 8+ core machines, especially those with larger L1/L2 CPU caches.
The I/O limits of the relevant (de)compression library will generally determine maximum throughput.
## Contenders
* [jimp](https://www.npmjs.com/package/jimp) v0.22.7 - Image processing in pure JavaScript.
* [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "*has been unmaintained for a long time*".
* [gm](https://www.npmjs.com/package/gm) v1.25.0 - Fully featured wrapper around GraphicsMagick's `gm` command line utility.
* [@squoosh/lib](https://www.npmjs.com/package/@squoosh/lib) v0.4.0 - Image libraries transpiled to WebAssembly, includes GPLv3 code, but "*Project no longer maintained*".
* [@squoosh/cli](https://www.npmjs.com/package/@squoosh/cli) v0.7.3 - Command line wrapper around `@squoosh/lib`, avoids GPLv3 by spawning process, but "*Project no longer maintained*".
* sharp v0.32.0 / libvips v8.14.2 - Caching within libvips disabled to ensure a fair comparison.
## Environment
### AMD64
* AWS EC2 us-east-2 [c6a.xlarge](https://aws.amazon.com/ec2/instance-types/c6a/) (4x AMD EPYC 7R13)
* Ubuntu 22.04 20230303 (ami-0122295b0eb922138)
* Node.js 16.19.1
### ARM64
* AWS EC2 us-east-2 [c7g.xlarge](https://aws.amazon.com/ec2/instance-types/c7g/) (4x ARM Graviton3)
* Ubuntu 22.04 20230303 (ami-0af198159897e7a29)
* Node.js 16.19.1
## Task: JPEG
Decompress a 2725x2225 JPEG image,
resize to 720x588 using Lanczos 3 resampling (where available),
then compress to JPEG at a "quality" setting of 80.
Note: jimp does not support Lanczos 3, bicubic resampling used instead.
#### Results: JPEG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 0.84 | 1.0 |
| squoosh-cli | file | file | 1.07 | 1.3 |
| squoosh-lib | buffer | buffer | 1.82 | 2.2 |
| gm | buffer | buffer | 8.41 | 10.0 |
| gm | file | file | 8.45 | 10.0 |
| imagemagick | file | file | 8.77 | 10.4 |
| sharp | stream | stream | 36.36 | 43.3 |
| sharp | file | file | 38.67 | 46.0 |
| sharp | buffer | buffer | 39.44 | 47.0 |
#### Results: JPEG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 1.02 | 1.0 |
| squoosh-cli | file | file | 1.11 | 1.1 |
| squoosh-lib | buffer | buffer | 2.08 | 2.0 |
| gm | buffer | buffer | 8.80 | 8.6 |
| gm | file | file | 10.05 | 9.9 |
| imagemagick | file | file | 10.28 | 10.1 |
| sharp | stream | stream | 26.87 | 26.3 |
| sharp | file | file | 27.88 | 27.3 |
| sharp | buffer | buffer | 28.40 | 27.8 |
## Task: PNG
Decompress a 2048x1536 RGBA PNG image,
premultiply the alpha channel,
resize to 720x540 using Lanczos 3 resampling (where available),
unpremultiply then compress as PNG with a "default" zlib compression level of 6
and without adaptive filtering.
Note: jimp does not support premultiply/unpremultiply.
### Results: PNG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| squoosh-cli | file | file | 0.40 | 1.0 |
| squoosh-lib | buffer | buffer | 0.47 | 1.2 |
| gm | file | file | 6.47 | 16.2 |
| jimp | buffer | buffer | 6.60 | 16.5 |
| imagemagick | file | file | 7.08 | 17.7 |
| sharp | file | file | 17.80 | 44.5 |
| sharp | buffer | buffer | 18.02 | 45.0 |
### Results: PNG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| squoosh-cli | file | file | 0.40 | 1.0 |
| squoosh-lib | buffer | buffer | 0.48 | 1.2 |
| gm | file | file | 7.20 | 18.0 |
| jimp | buffer | buffer | 7.62 | 19.1 |
| imagemagick | file | file | 7.96 | 19.9 |
| sharp | file | file | 12.97 | 32.4 |
| sharp | buffer | buffer | 13.12 | 32.8 |
## Running the benchmark test


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const fs = require('fs');
@@ -37,7 +40,7 @@ for (const match of matches) {
].forEach((section) => {
const contents = fs.readFileSync(path.join(__dirname, '..', `api-${section}.md`), 'utf8');
const matches = contents.matchAll(
/## (?<title>[A-Za-z]+)\n(?<firstparagraph>.+?)\n\n.+?(?<parameters>\| Param .+?\n\n)?\*\*Example/gs
);
for (const match of matches) {
const { title, firstparagraph, parameters } = match.groups;


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const stopWords = require('./stop-words');
@@ -13,8 +16,9 @@ const extractDescription = (str) =>
.trim();
const extractParameters = (str) =>
[...str.matchAll(/options\.(?<name>[^.`\] ]+)/gs)]
.map((match) => match.groups.name)
.map((name) => name.replace(/([A-Z])/g, ' $1').toLowerCase())
.join(' ');
const extractKeywords = (str) =>


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
module.exports = [
@@ -16,6 +19,7 @@ module.exports = [
'based',
'been',
'before',
'best',
'both',
'call',
'callback',
@@ -60,14 +64,20 @@ module.exports = [
'must',
'non',
'not',
'now',
'occur',
'occurs',
'one',
'options',
'other',
'out',
'over',
'part',
'perform',
'performs',
'please',
'pre',
'previously',
'produce',
'provide',
'provided',
@@ -112,6 +122,7 @@ module.exports = [
'using',
'value',
'values',
'via',
'were',
'when',
'which',


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const libvips = require('../lib/libvips');


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const fs = require('fs');


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const fs = require('fs');
@@ -8,6 +11,7 @@ const zlib = require('zlib');
const { createHash } = require('crypto');
const detectLibc = require('detect-libc');
const semverCoerce = require('semver/functions/coerce');
const semverLessThan = require('semver/functions/lt');
const semverSatisfies = require('semver/functions/satisfies');
const simpleGet = require('simple-get');
@@ -74,7 +78,11 @@ const verifyIntegrity = function (platformAndArch) {
flush: function (done) {
const digest = `sha512-${hash.digest('base64')}`;
if (expected !== digest) {
try {
libvips.removeVendoredLibvips();
} catch (err) {
libvips.log(err.message);
}
libvips.log(`Integrity expected: ${expected}`);
libvips.log(`Integrity received: ${digest}`);
done(new Error(`Integrity check failed for ${platformAndArch}`));
@@ -128,24 +136,23 @@ try {
if (arch === 'ia32' && !platformAndArch.startsWith('win32')) {
throw new Error(`Intel Architecture 32-bit systems require manual installation of libvips >= ${minimumLibvipsVersion}`);
}
if (platformAndArch === 'darwin-arm64') {
throw new Error("Please run 'brew install vips' to install libvips on Apple M1 (ARM64) systems");
}
if (platformAndArch === 'freebsd-x64' || platformAndArch === 'openbsd-x64' || platformAndArch === 'sunos-x64') {
throw new Error(`BSD/SunOS systems require manual installation of libvips >= ${minimumLibvipsVersion}`);
}
// Linux libc version check
const libcVersionRaw = detectLibc.versionSync();
if (libcVersionRaw) {
const libcFamily = detectLibc.familySync();
const libcVersion = semverCoerce(libcVersionRaw).version;
if (libcFamily === detectLibc.GLIBC && minimumGlibcVersionByArch[arch]) {
if (semverLessThan(libcVersion, semverCoerce(minimumGlibcVersionByArch[arch]).version)) {
handleError(new Error(`Use with glibc ${libcVersionRaw} requires manual installation of libvips >= ${minimumLibvipsVersion}`));
}
}
if (libcFamily === detectLibc.MUSL) {
if (semverLessThan(libcVersion, '1.1.24')) {
handleError(new Error(`Use with musl ${libcVersionRaw} requires manual installation of libvips >= ${minimumLibvipsVersion}`));
}
}
}
// Node.js minimum version check
@@ -153,7 +160,6 @@ try {
if (!semverSatisfies(process.versions.node, supportedNodeVersion)) {
handleError(new Error(`Expected Node.js version ${supportedNodeVersion} but found ${process.versions.node}`));
}
// Download to per-process temporary file
const tarFilename = ['libvips', minimumLibvipsVersionLabelled, platformAndArch].join('-') + '.tar.br';
const tarPathCache = path.join(libvips.cachePath(), tarFilename);
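The libc version gate above coerces raw versions such as `2.26` or `1.1.24-r2` to semver before comparing. A dependency-free sketch of that comparison (the real script uses the `semver` package; `coerce` and `lessThan` here are illustrative):

```javascript
// Reduce a raw libc version string to [major, minor, patch] numbers,
// ignoring any distro suffix such as "-r2"
function coerce (version) {
  const m = /(\d+)(?:\.(\d+))?(?:\.(\d+))?/.exec(version);
  return m ? [Number(m[1]), Number(m[2] || 0), Number(m[3] || 0)] : null;
}

// Piecewise numeric comparison, equivalent to a semver "less than" check
function lessThan (a, b) {
  const [x, y] = [coerce(a), coerce(b)];
  for (let i = 0; i < 3; i++) {
    if (x[i] !== y[i]) return x[i] < y[i];
  }
  return false;
}
```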


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const url = require('url');
@@ -27,7 +30,7 @@ module.exports = function (log) {
const proxyAuth = proxy.username && proxy.password
? `${decodeURIComponent(proxy.username)}:${decodeURIComponent(proxy.password)}`
: null;
log(`Via proxy ${proxy.protocol}//${proxy.hostname}:${proxy.port} ${proxyAuth ? 'with' : 'no'} credentials`);
return tunnel({
proxy: {
port: Number(proxy.port),


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const is = require('./is');


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const color = require('color');
@@ -67,7 +70,8 @@ function grayscale (grayscale) {
* Set the pipeline colourspace.
*
* The input image will be converted to the provided colourspace at the start of the pipeline.
* All operations will use this colourspace before converting to the output colourspace,
* as defined by {@link #tocolourspace|toColourspace}.
*
* This feature is experimental and has not yet been fully-tested with all operations.
*


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const is = require('./is');
@@ -43,7 +46,7 @@ const blend = {
* The images to composite must be the same size or smaller than the processed image.
* If both `top` and `left` options are provided, they take precedence over `gravity`.
*
* Any resize, rotate or extract operations in the same processing pipeline
* will always be applied to the input image before composition.
*
* The `blend` option can be one of `clear`, `source`, `over`, `in`, `out`, `atop`,
@@ -105,14 +108,14 @@ const blend = {
* @param {string} [images[].input.text.align='left'] - text alignment (`'left'`, `'centre'`, `'center'`, `'right'`).
* @param {boolean} [images[].input.text.justify=false] - set this to true to apply justification to the text.
* @param {number} [images[].input.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified.
* @param {boolean} [images[].input.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`.
* @param {number} [images[].input.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
* @param {String} [images[].blend='over'] - how to blend this image with the image below.
* @param {String} [images[].gravity='centre'] - gravity at which to place the overlay.
* @param {Number} [images[].top] - the pixel offset from the top edge.
* @param {Number} [images[].left] - the pixel offset from the left edge.
* @param {Boolean} [images[].tile=false] - set to true to repeat the overlay image across the entire image with the given `gravity`.
* @param {Boolean} [images[].premultiplied=false] - set to true to avoid premultiplying the image below. Equivalent to the `--premultiplied` vips option.
* @param {Number} [images[].density=72] - number representing the DPI for vector overlay image.
* @param {Object} [images[].raw] - describes overlay when using raw pixel data.
* @param {Number} [images[].raw.width]


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const util = require('util');
@@ -46,7 +49,7 @@ const debuglog = util.debuglog('sharp');
* readableStream.pipe(transformer).pipe(writableStream);
*
* @example
* // Create a blank 300x200 PNG image of semi-translucent red pixels
* sharp({
* create: {
* width: 300,
@@ -113,25 +116,25 @@ const debuglog = util.debuglog('sharp');
* } * }
* }).toFile('text_rgba.png'); * }).toFile('text_rgba.png');
* *
* @param {(Buffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string)} [input] - if present, can be * @param {(Buffer|ArrayBuffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string)} [input] - if present, can be
* a Buffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or * a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or
* a TypedArray containing raw pixel image data, or * a TypedArray containing raw pixel image data, or
* a String containing the filesystem path to an JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. * a String containing the filesystem path to an JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file.
* JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. * JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
* @param {Object} [options] - if present, is an Object with optional attributes. * @param {Object} [options] - if present, is an Object with optional attributes.
* @param {string} [options.failOn='warning'] - when to abort processing of invalid pixel data, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), highers level imply lower levels, invalid metadata will always abort. * @param {string} [options.failOn='warning'] - when to abort processing of invalid pixel data, one of (in order of sensitivity): 'none' (least), 'truncated', 'error' or 'warning' (most), higher levels imply lower levels, invalid metadata will always abort.
* @param {number|boolean} [options.limitInputPixels=268402689] - Do not process input images where the number of pixels * @param {number|boolean} [options.limitInputPixels=268402689] - Do not process input images where the number of pixels
* (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted. * (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted.
* An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). * An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF).
* @param {boolean} [options.unlimited=false] - Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF). * @param {boolean} [options.unlimited=false] - Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF).
* @param {boolean} [options.sequentialRead=false] - Set this to `true` to use sequential rather than random access where possible. * @param {boolean} [options.sequentialRead=true] - Set this to `false` to use random access rather than sequential read. Some operations will do this automatically.
* This can reduce memory usage and might improve performance on some systems.
* @param {number} [options.density=72] - number representing the DPI for vector images in the range 1 to 100000. * @param {number} [options.density=72] - number representing the DPI for vector images in the range 1 to 100000.
* @param {number} [options.pages=1] - number of pages to extract for multi-page input (GIF, WebP, AVIF, TIFF, PDF), use -1 for all pages. * @param {boolean} [options.ignoreIcc=false] - should the embedded ICC profile, if any, be ignored.
* @param {number} [options.page=0] - page number to start extracting from for multi-page input (GIF, WebP, AVIF, TIFF, PDF), zero based. * @param {number} [options.pages=1] - Number of pages to extract for multi-page input (GIF, WebP, TIFF), use -1 for all pages.
* @param {number} [options.page=0] - Page number to start extracting from for multi-page input (GIF, WebP, TIFF), zero based.
* @param {number} [options.subifd=-1] - subIFD (Sub Image File Directory) to extract for OME-TIFF, defaults to main image. * @param {number} [options.subifd=-1] - subIFD (Sub Image File Directory) to extract for OME-TIFF, defaults to main image.
* @param {number} [options.level=0] - level to extract from a multi-level input (OpenSlide), zero based. * @param {number} [options.level=0] - level to extract from a multi-level input (OpenSlide), zero based.
* @param {boolean} [options.animated=false] - Set to `true` to read all frames/pages of an animated image (equivalent of setting `pages` to `-1`). * @param {boolean} [options.animated=false] - Set to `true` to read all frames/pages of an animated image (GIF, WebP, TIFF), equivalent of setting `pages` to `-1`.
* @param {Object} [options.raw] - describes raw pixel input image data. See `raw()` for pixel ordering. * @param {Object} [options.raw] - describes raw pixel input image data. See `raw()` for pixel ordering.
* @param {number} [options.raw.width] - integral number of pixels wide. * @param {number} [options.raw.width] - integral number of pixels wide.
* @param {number} [options.raw.height] - integral number of pixels high. * @param {number} [options.raw.height] - integral number of pixels high.
@@ -151,13 +154,14 @@ const debuglog = util.debuglog('sharp');
* @param {string} [options.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`. * @param {string} [options.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`.
* @param {string} [options.text.font] - font name to render with. * @param {string} [options.text.font] - font name to render with.
* @param {string} [options.text.fontfile] - absolute filesystem path to a font file that can be used by `font`. * @param {string} [options.text.fontfile] - absolute filesystem path to a font file that can be used by `font`.
* @param {number} [options.text.width=0] - integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries. * @param {number} [options.text.width=0] - Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries.
* @param {number} [options.text.height=0] - integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0. * @param {number} [options.text.height=0] - Maximum integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0.
* @param {string} [options.text.align='left'] - text alignment (`'left'`, `'centre'`, `'center'`, `'right'`). * @param {string} [options.text.align='left'] - Alignment style for multi-line text (`'left'`, `'centre'`, `'center'`, `'right'`).
* @param {boolean} [options.text.justify=false] - set this to true to apply justification to the text. * @param {boolean} [options.text.justify=false] - set this to true to apply justification to the text.
* @param {number} [options.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified. * @param {number} [options.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified.
* @param {boolean} [options.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`. * @param {boolean} [options.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`.
* @param {number} [options.text.spacing=0] - text line height in points. Will use the font line height if none is specified. * @param {number} [options.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
* @param {string} [options.text.wrap='word'] - word wrapping style when width is provided, one of: 'word', 'char', 'wordChar' (prefer word, fallback to char) or 'none'.
* @returns {Sharp} * @returns {Sharp}
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
*/ */
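The `limitInputPixels` option documented above accepts an integral pixel count, `true` for the default limit, or `false`/`0` for no limit. Not sharp's actual internal code — a minimal sketch of that resolution logic, with hypothetical helper names:

```javascript
// Default limit matches the documented 268402689 (0x3FFF x 0x3FFF) pixels.
const DEFAULT_LIMIT = Math.pow(0x3FFF, 2);

// Resolve the limitInputPixels option: true -> default, false/0 -> unlimited.
function resolvePixelLimit (limitInputPixels) {
  if (limitInputPixels === true) return DEFAULT_LIMIT;
  if (limitInputPixels === false || limitInputPixels === 0) return Infinity;
  return limitInputPixels;
}

// Would an image of the given dimensions exceed the configured limit?
function exceedsLimit (width, height, limitInputPixels = true) {
  return width * height > resolvePixelLimit(limitInputPixels);
}
```

Note that an image exactly at the limit is accepted; only strictly larger inputs are rejected.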
@@ -196,6 +200,7 @@ const Sharp = function (input, options) {
extendLeft: 0, extendLeft: 0,
extendRight: 0, extendRight: 0,
extendBackground: [0, 0, 0, 255], extendBackground: [0, 0, 0, 255],
extendWith: 'background',
withoutEnlargement: false, withoutEnlargement: false,
withoutReduction: false, withoutReduction: false,
affineMatrix: [], affineMatrix: [],
@@ -212,6 +217,7 @@ const Sharp = function (input, options) {
tintB: 128, tintB: 128,
flatten: false, flatten: false,
flattenBackground: [0, 0, 0], flattenBackground: [0, 0, 0],
unflatten: false,
negate: false, negate: false,
negateAlpha: true, negateAlpha: true,
medianSize: 0, medianSize: 0,
@@ -230,6 +236,8 @@ const Sharp = function (input, options) {
gammaOut: 0, gammaOut: 0,
greyscale: false, greyscale: false,
normalise: false, normalise: false,
normaliseLower: 1,
normaliseUpper: 99,
claheWidth: 0, claheWidth: 0,
claheHeight: 0, claheHeight: 0,
claheMaxSlope: 3, claheMaxSlope: 3,
@@ -283,6 +291,7 @@ const Sharp = function (input, options) {
webpLossless: false, webpLossless: false,
webpNearLossless: false, webpNearLossless: false,
webpSmartSubsample: false, webpSmartSubsample: false,
webpPreset: 'default',
webpEffort: 4, webpEffort: 4,
webpMinSize: false, webpMinSize: false,
webpMixed: false, webpMixed: false,
@@ -291,7 +300,8 @@ const Sharp = function (input, options) {
gifDither: 1, gifDither: 1,
gifInterFrameMaxError: 0, gifInterFrameMaxError: 0,
gifInterPaletteMaxError: 3, gifInterPaletteMaxError: 3,
gifReoptimise: false, gifReuse: true,
gifProgressive: false,
tiffQuality: 80, tiffQuality: 80,
tiffCompression: 'jpeg', tiffCompression: 'jpeg',
tiffPredictor: 'horizontal', tiffPredictor: 'horizontal',

lib/index.d.ts (vendored, new file, 1604 lines)

File diff suppressed because it is too large

@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const Sharp = require('./constructor'); const Sharp = require('./constructor');


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const color = require('color'); const color = require('color');
@@ -21,9 +24,9 @@ const align = {
* @private * @private
*/ */
function _inputOptionsFromObject (obj) { function _inputOptionsFromObject (obj) {
const { raw, density, limitInputPixels, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd } = obj; const { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd } = obj;
return [raw, density, limitInputPixels, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd].some(is.defined) return [raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd].some(is.defined)
? { raw, density, limitInputPixels, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd } ? { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd }
: undefined; : undefined;
} }
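The `_inputOptionsFromObject` change above extends the pick-or-undefined pattern with the new `ignoreIcc` key. A simplified, standalone variant of that pattern (it drops undefined keys rather than carrying them, and the name `pickDefined` is hypothetical):

```javascript
// Return an object containing only the listed keys, or undefined when
// none of them is set on the source object.
function pickDefined (obj, keys) {
  const picked = {};
  let any = false;
  for (const key of keys) {
    if (obj[key] !== undefined) {
      picked[key] = obj[key];
      any = true;
    }
  }
  return any ? picked : undefined;
}
```

Returning `undefined` (rather than an empty object) lets the caller distinguish "no input options given" from "input options given".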
@@ -35,8 +38,9 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
const inputDescriptor = { const inputDescriptor = {
failOn: 'warning', failOn: 'warning',
limitInputPixels: Math.pow(0x3FFF, 2), limitInputPixels: Math.pow(0x3FFF, 2),
ignoreIcc: false,
unlimited: false, unlimited: false,
sequentialRead: false sequentialRead: true
}; };
if (is.string(input)) { if (is.string(input)) {
// filesystem // filesystem
@@ -47,6 +51,11 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
throw Error('Input Buffer is empty'); throw Error('Input Buffer is empty');
} }
inputDescriptor.buffer = input; inputDescriptor.buffer = input;
} else if (is.arrayBuffer(input)) {
if (input.byteLength === 0) {
throw Error('Input bit Array is empty');
}
inputDescriptor.buffer = Buffer.from(input, 0, input.byteLength);
} else if (is.typedArray(input)) { } else if (is.typedArray(input)) {
if (input.length === 0) { if (input.length === 0) {
throw Error('Input Bit Array is empty'); throw Error('Input Bit Array is empty');
@@ -92,6 +101,14 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
throw is.invalidParameterError('density', 'number between 1 and 100000', inputOptions.density); throw is.invalidParameterError('density', 'number between 1 and 100000', inputOptions.density);
} }
} }
// Ignore embedded ICC profile
if (is.defined(inputOptions.ignoreIcc)) {
if (is.bool(inputOptions.ignoreIcc)) {
inputDescriptor.ignoreIcc = inputOptions.ignoreIcc;
} else {
throw is.invalidParameterError('ignoreIcc', 'boolean', inputOptions.ignoreIcc);
}
}
// limitInputPixels // limitInputPixels
if (is.defined(inputOptions.limitInputPixels)) { if (is.defined(inputOptions.limitInputPixels)) {
if (is.bool(inputOptions.limitInputPixels)) { if (is.bool(inputOptions.limitInputPixels)) {
@@ -327,6 +344,13 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
throw is.invalidParameterError('text.spacing', 'number', inputOptions.text.spacing); throw is.invalidParameterError('text.spacing', 'number', inputOptions.text.spacing);
} }
} }
if (is.defined(inputOptions.text.wrap)) {
if (is.string(inputOptions.text.wrap) && is.inArray(inputOptions.text.wrap, ['word', 'char', 'wordChar', 'none'])) {
inputDescriptor.textWrap = inputOptions.text.wrap;
} else {
throw is.invalidParameterError('text.wrap', 'one of: word, char, wordChar, none', inputOptions.text.wrap);
}
}
delete inputDescriptor.buffer; delete inputDescriptor.buffer;
} else { } else {
throw new Error('Expected a valid string to create an image with text.'); throw new Error('Expected a valid string to create an image with text.');
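The `text.wrap` validation added above accepts one of four styles, matching the option list in the hunk. A minimal sketch of that check (helper name hypothetical):

```javascript
// Allowed word-wrapping styles, as validated in the hunk above.
const TEXT_WRAP_STYLES = ['word', 'char', 'wordChar', 'none'];

// Return the value when valid, otherwise throw with the allowed list.
function validateTextWrap (wrap) {
  if (typeof wrap === 'string' && TEXT_WRAP_STYLES.includes(wrap)) {
    return wrap;
  }
  throw new Error(`Expected one of: ${TEXT_WRAP_STYLES.join(', ')} but received ${wrap}`);
}
```

Note the accepted spelling is `'wordChar'`, not `'charWord'`.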
@@ -387,8 +411,9 @@ function _isStreamInput () {
/** /**
* Fast access to (uncached) image metadata without decoding any compressed pixel data. * Fast access to (uncached) image metadata without decoding any compressed pixel data.
* *
* This is taken from the header of the input image. * This is read from the header of the input image.
* It does not include operations, such as resize, to be applied to the output image. * It does not take into consideration any operations to be applied to the output image,
* such as resize or rotate.
* *
* Dimensions in the response will respect the `page` and `pages` properties of the * Dimensions in the response will respect the `page` and `pages` properties of the
* {@link /api-constructor#parameters|constructor parameters}. * {@link /api-constructor#parameters|constructor parameters}.
@@ -423,6 +448,7 @@ function _isStreamInput () {
* - `iptc`: Buffer containing raw IPTC data, if present * - `iptc`: Buffer containing raw IPTC data, if present
* - `xmp`: Buffer containing raw XMP data, if present * - `xmp`: Buffer containing raw XMP data, if present
* - `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present * - `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present
* - `formatMagick`: String containing format for images loaded via *magick
* *
* @example * @example
* const metadata = await sharp(input).metadata(); * const metadata = await sharp(input).metadata();


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
/** /**
@@ -71,6 +74,14 @@ const typedArray = function (val) {
return false; return false;
}; };
/**
* Is this value an ArrayBuffer object?
* @private
*/
const arrayBuffer = function (val) {
return val instanceof ArrayBuffer;
};
/** /**
* Is this value a non-empty string? * Is this value a non-empty string?
* @private * @private
@@ -134,6 +145,7 @@ module.exports = {
bool: bool, bool: bool,
buffer: buffer, buffer: buffer,
typedArray: typedArray, typedArray: typedArray,
arrayBuffer: arrayBuffer,
string: string, string: string,
number: number, number: number,
integer: integer, integer: integer,


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const fs = require('fs'); const fs = require('fs');
@@ -85,8 +88,7 @@ const hasVendoredLibvips = function () {
/* istanbul ignore next */ /* istanbul ignore next */
const removeVendoredLibvips = function () { const removeVendoredLibvips = function () {
const rm = fs.rmSync ? fs.rmSync : fs.rmdirSync; fs.rmSync(vendorPath, { recursive: true, maxRetries: 3, force: true });
rm(vendorPath, { recursive: true, maxRetries: 3, force: true });
}; };
/* istanbul ignore next */ /* istanbul ignore next */
@@ -115,6 +117,7 @@ const useGlobalLibvips = function () {
} }
/* istanbul ignore next */ /* istanbul ignore next */
if (isRosetta()) { if (isRosetta()) {
log('Detected Rosetta, skipping search for globally-installed libvips');
return false; return false;
} }
const globalVipsVersion = globalLibvipsVersion(); const globalVipsVersion = globalLibvipsVersion();


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const color = require('color'); const color = require('color');
@@ -8,7 +11,7 @@ const is = require('./is');
* or auto-orient based on the EXIF `Orientation` tag. * or auto-orient based on the EXIF `Orientation` tag.
* *
* If an angle is provided, it is converted to a valid positive degree rotation. * If an angle is provided, it is converted to a valid positive degree rotation.
* For example, `-450` will produce a 270deg rotation. * For example, `-450` will produce a 270 degree rotation.
* *
* When rotating by an angle other than a multiple of 90, * When rotating by an angle other than a multiple of 90,
* the background colour can be provided with the `background` option. * the background colour can be provided with the `background` option.
@@ -77,9 +80,13 @@ function rotate (angle, options) {
} }
/** /**
* Flip the image about the vertical Y axis. This always occurs before rotation, if any. * Mirror the image vertically (up-down) about the x-axis.
* This always occurs before rotation, if any.
*
* The use of `flip` implies the removal of the EXIF `Orientation` tag, if any. * The use of `flip` implies the removal of the EXIF `Orientation` tag, if any.
* *
* This operation does not work correctly with multi-page images.
*
* @example * @example
* const output = await sharp(input).flip().toBuffer(); * const output = await sharp(input).flip().toBuffer();
* *
@@ -92,7 +99,9 @@ function flip (flip) {
} }
/** /**
* Flop the image about the horizontal X axis. This always occurs before rotation, if any. * Mirror the image horizontally (left-right) about the y-axis.
* This always occurs before rotation, if any.
*
* The use of `flop` implies the removal of the EXIF `Orientation` tag, if any. * The use of `flop` implies the removal of the EXIF `Orientation` tag, if any.
* *
* @example * @example
@@ -111,7 +120,7 @@ function flop (flop) {
* *
* You must provide an array of length 4 or a 2x2 affine transformation matrix. * You must provide an array of length 4 or a 2x2 affine transformation matrix.
* By default, new pixels are filled with a black background. You can provide a background color with the `background` option. * By default, new pixels are filled with a black background. You can provide a background color with the `background` option.
* A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolator` Object e.g. `sharp.interpolator.nohalo`. * A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`.
* *
* In the case of a 2x2 matrix, the transform is: * In the case of a 2x2 matrix, the transform is:
* - X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx` * - X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx`
@@ -128,7 +137,7 @@ function flop (flop) {
* const pipeline = sharp() * const pipeline = sharp()
* .affine([[1, 0.3], [0.1, 0.7]], { * .affine([[1, 0.3], [0.1, 0.7]], {
* background: 'white', * background: 'white',
* interpolate: sharp.interpolators.nohalo * interpolator: sharp.interpolators.nohalo
* }) * })
* .toBuffer((err, outputBuffer, info) => { * .toBuffer((err, outputBuffer, info) => {
* // outputBuffer contains the transformed image * // outputBuffer contains the transformed image
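The 2x2 affine transform documented above maps each coordinate as X = `matrix[0][0]` * (x + `idx`) + `matrix[0][1]` * (y + `idy`) + `odx`, and symmetrically for Y. A pure-math sketch of a single point mapping (assuming the symmetric Y form; helper name hypothetical):

```javascript
// Map one (x, y) coordinate through a 2x2 affine transform with optional
// input offsets (idx, idy) and output offsets (odx, ody).
function affinePoint ([x, y], matrix, { idx = 0, idy = 0, odx = 0, ody = 0 } = {}) {
  const [[m00, m01], [m10, m11]] = matrix;
  return [
    m00 * (x + idx) + m01 * (y + idy) + odx,
    m10 * (x + idx) + m11 * (y + idy) + ody
  ];
}
```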
@@ -232,7 +241,7 @@ function affine (matrix, options) {
* .toBuffer(); * .toBuffer();
* *
* @param {Object|number} [options] - if present, is an Object with attributes * @param {Object|number} [options] - if present, is an Object with attributes
* @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10000 * @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10
* @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas, between 0 and 1000000 * @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas, between 0 and 1000000
* @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas, between 0 and 1000000 * @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas, between 0 and 1000000
* @param {number} [options.x1=2.0] - threshold between "flat" and "jagged", between 0 and 1000000 * @param {number} [options.x1=2.0] - threshold between "flat" and "jagged", between 0 and 1000000
@@ -270,10 +279,10 @@ function sharpen (options, flat, jagged) {
} }
} }
} else if (is.plainObject(options)) { } else if (is.plainObject(options)) {
if (is.number(options.sigma) && is.inRange(options.sigma, 0.000001, 10000)) { if (is.number(options.sigma) && is.inRange(options.sigma, 0.000001, 10)) {
this.options.sharpenSigma = options.sigma; this.options.sharpenSigma = options.sigma;
} else { } else {
throw is.invalidParameterError('options.sigma', 'number between 0.000001 and 10000', options.sigma); throw is.invalidParameterError('options.sigma', 'number between 0.000001 and 10', options.sigma);
} }
if (is.defined(options.m1)) { if (is.defined(options.m1)) {
if (is.number(options.m1) && is.inRange(options.m1, 0, 1000000)) { if (is.number(options.m1) && is.inRange(options.m1, 0, 1000000)) {
@@ -402,6 +411,32 @@ function flatten (options) {
return this; return this;
} }
/**
* Ensure the image has an alpha channel
* with all white pixel values made fully transparent.
*
* Existing alpha channel values for non-white pixels remain unchanged.
*
* This feature is experimental and the API may change.
*
* @since 0.32.1
*
* @example
* await sharp(rgbInput)
* .unflatten()
* .toBuffer();
*
* @example
* await sharp(rgbInput)
* .threshold(128, { grayscale: false }) // convert bright pixels to white
* .unflatten()
* .toBuffer();
*/
function unflatten () {
this.options.unflatten = true;
return this;
}
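Conceptually, `unflatten` makes fully white pixels fully transparent while leaving other pixels' alpha untouched. This is not libvips's implementation — just a per-pixel sketch of the effect on interleaved RGBA data:

```javascript
// Per-pixel sketch of unflatten: fully white pixels (255,255,255) get
// alpha 0; all other pixels keep their existing alpha.
function unflattenPixels (rgba) {
  const out = Uint8ClampedArray.from(rgba);
  for (let i = 0; i < out.length; i += 4) {
    if (out[i] === 255 && out[i + 1] === 255 && out[i + 2] === 255) {
      out[i + 3] = 0;
    }
  }
  return out;
}
```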
/** /**
* Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma` * Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma`
* then increasing the encoding (brighten) post-resize at a factor of `gamma`. * then increasing the encoding (brighten) post-resize at a factor of `gamma`.
@@ -466,16 +501,50 @@ function negate (options) {
} }
/** /**
* Enhance output image contrast by stretching its luminance to cover the full dynamic range. * Enhance output image contrast by stretching its luminance to cover a full dynamic range.
*
* Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes.
*
* Luminance values below the `lower` percentile will be underexposed by clipping to zero.
* Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value.
* *
* @example * @example
* const output = await sharp(input).normalise().toBuffer(); * const output = await sharp(input)
* .normalise()
* .toBuffer();
* *
* @param {Boolean} [normalise=true] * @example
* const output = await sharp(input)
* .normalise({ lower: 0, upper: 100 })
* .toBuffer();
*
* @param {Object} [options]
* @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed.
* @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed.
* @returns {Sharp} * @returns {Sharp}
*/ */
function normalise (normalise) { function normalise (options) {
this.options.normalise = is.bool(normalise) ? normalise : true; if (is.plainObject(options)) {
if (is.defined(options.lower)) {
if (is.number(options.lower) && is.inRange(options.lower, 0, 99)) {
this.options.normaliseLower = options.lower;
} else {
throw is.invalidParameterError('lower', 'number between 0 and 99', options.lower);
}
}
if (is.defined(options.upper)) {
if (is.number(options.upper) && is.inRange(options.upper, 1, 100)) {
this.options.normaliseUpper = options.upper;
} else {
throw is.invalidParameterError('upper', 'number between 1 and 100', options.upper);
}
}
}
if (this.options.normaliseLower >= this.options.normaliseUpper) {
throw is.invalidParameterError('range', 'lower to be less than upper',
`${this.options.normaliseLower} >= ${this.options.normaliseUpper}`);
}
this.options.normalise = true;
return this; return this;
} }
@@ -483,13 +552,17 @@ function normalise (normalise) {
* Alternative spelling of normalise. * Alternative spelling of normalise.
* *
* @example * @example
* const output = await sharp(input).normalize().toBuffer(); * const output = await sharp(input)
* .normalize()
* .toBuffer();
* *
* @param {Boolean} [normalize=true] * @param {Object} [options]
* @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed.
* @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed.
* @returns {Sharp} * @returns {Sharp}
*/ */
function normalize (normalize) { function normalize (options) {
return this.normalise(normalize); return this.normalise(options);
} }
/** /**
@@ -509,35 +582,34 @@ function normalize (normalize) {
* .toBuffer(); * .toBuffer();
* *
* @param {Object} options * @param {Object} options
* @param {number} options.width - integer width of the region in pixels. * @param {number} options.width - Integral width of the search window, in pixels.
* @param {number} options.height - integer height of the region in pixels. * @param {number} options.height - Integral height of the search window, in pixels.
* @param {number} [options.maxSlope=3] - maximum value for the slope of the * @param {number} [options.maxSlope=3] - Integral level of brightening, between 0 and 100, where 0 disables contrast limiting.
* cumulative histogram. A value of 0 disables contrast limiting. Valid values
* are integers in the range 0-100 (inclusive)
* @returns {Sharp} * @returns {Sharp}
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
*/ */
function clahe (options) { function clahe (options) {
if (!is.plainObject(options)) { if (is.plainObject(options)) {
if (is.integer(options.width) && options.width > 0) {
this.options.claheWidth = options.width;
} else {
throw is.invalidParameterError('width', 'integer greater than zero', options.width);
}
if (is.integer(options.height) && options.height > 0) {
this.options.claheHeight = options.height;
} else {
throw is.invalidParameterError('height', 'integer greater than zero', options.height);
}
if (is.defined(options.maxSlope)) {
if (is.integer(options.maxSlope) && is.inRange(options.maxSlope, 0, 100)) {
this.options.claheMaxSlope = options.maxSlope;
} else {
throw is.invalidParameterError('maxSlope', 'integer between 0 and 100', options.maxSlope);
}
}
} else {
throw is.invalidParameterError('options', 'plain object', options); throw is.invalidParameterError('options', 'plain object', options);
} }
if (!('width' in options) || !is.integer(options.width) || options.width <= 0) {
throw is.invalidParameterError('width', 'integer above zero', options.width);
} else {
this.options.claheWidth = options.width;
}
if (!('height' in options) || !is.integer(options.height) || options.height <= 0) {
throw is.invalidParameterError('height', 'integer above zero', options.height);
} else {
this.options.claheHeight = options.height;
}
if (!is.defined(options.maxSlope)) {
this.options.claheMaxSlope = 3;
} else if (!is.integer(options.maxSlope) || options.maxSlope < 0 || options.maxSlope > 100) {
throw is.invalidParameterError('maxSlope', 'integer 0-100', options.maxSlope);
} else {
this.options.claheMaxSlope = options.maxSlope;
}
return this; return this;
} }
@@ -700,7 +772,7 @@ function linear (a, b) {
} }
/** /**
* Recomb the image with the specified matrix. * Recombine the image with the specified matrix.
* *
* @since 0.21.1 * @since 0.21.1
* *
@@ -713,7 +785,7 @@ function linear (a, b) {
* ]) * ])
* .raw() * .raw()
* .toBuffer(function(err, data, info) { * .toBuffer(function(err, data, info) {
* // data contains the raw pixel data after applying the recomb * // data contains the raw pixel data after applying the matrix
* // With this example input, a sepia filter has been applied * // With this example input, a sepia filter has been applied
* }); * });
* *
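A recomb computes each output channel as a weighted sum of the input channels using one matrix row per output channel. A single-pixel sketch of that arithmetic (not sharp's vectorised implementation):

```javascript
// Apply a 3x3 recomb matrix to one RGB pixel: each output channel is a
// weighted sum of the input channels, clipped to 0..255.
function recombPixel ([r, g, b], matrix) {
  return matrix.map(([mr, mg, mb]) =>
    Math.max(0, Math.min(255, Math.round(mr * r + mg * g + mb * b)))
  );
}
```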
@@ -770,7 +842,7 @@ function recomb (inputMatrix) {
* .toBuffer(); * .toBuffer();
* *
* @example * @example
* // decreate brightness and saturation while also hue-rotating by 90 degrees * // decrease brightness and saturation while also hue-rotating by 90 degrees
* const output = await sharp(input) * const output = await sharp(input)
* .modulate({ * .modulate({
* brightness: 0.5, * brightness: 0.5,
@@ -835,6 +907,7 @@ module.exports = function (Sharp) {
median, median,
blur, blur,
flatten, flatten,
unflatten,
gamma, gamma,
negate, negate,
normalise, normalise,


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const path = require('path'); const path = require('path');
@@ -26,7 +29,7 @@ const formats = new Map([
['jxl', 'jxl'] ['jxl', 'jxl']
]); ]);
const jp2Regex = /\.jp[2x]|j2[kc]$/i; const jp2Regex = /\.(jp[2x]|j2[kc])$/i;
const errJp2Save = () => new Error('JP2 output requires libvips with support for OpenJPEG'); const errJp2Save = () => new Error('JP2 output requires libvips with support for OpenJPEG');
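The `jp2Regex` fix above is a precedence bug: alternation binds looser than the `$` anchor, so in the old pattern the left branch (`.jp2`/`.jpx`) matched anywhere in the path and the right branch (`j2k`/`j2c`) required no leading dot. Grouping applies the anchor (and the dot) to every alternative:

```javascript
// Old: the left branch is unanchored, the right branch needs no dot.
const oldJp2Regex = /\.jp[2x]|j2[kc]$/i;
// New: grouping applies both the leading dot and the trailing $ to all branches.
const jp2Regex = /\.(jp[2x]|j2[kc])$/i;
```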
@@ -40,7 +43,7 @@ const bitdepthFromColourCount = (colours) => 1 << 31 - Math.clz32(Math.ceil(Math
* Note that raw pixel data is only supported for buffer output. * Note that raw pixel data is only supported for buffer output.
* *
* By default all metadata will be removed, which includes EXIF-based orientation. * By default all metadata will be removed, which includes EXIF-based orientation.
* See {@link withMetadata} for control over this. * See {@link #withmetadata|withMetadata} for control over this.
* *
* The caller is responsible for ensuring directory structures and permissions exist. * The caller is responsible for ensuring directory structures and permissions exist.
* *
@@ -61,6 +64,7 @@ const bitdepthFromColourCount = (colours) => 1 << 31 - Math.clz32(Math.ceil(Math
* `info` contains the output image `format`, `size` (bytes), `width`, `height`, * `info` contains the output image `format`, `size` (bytes), `width`, `height`,
* `channels` and `premultiplied` (indicating if premultiplication was used). * `channels` and `premultiplied` (indicating if premultiplication was used).
* When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`. * When using a crop strategy also contains `cropOffsetLeft` and `cropOffsetTop`.
* When using the attention crop strategy also contains `attentionX` and `attentionY`, the focal point of the cropped region.
* May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text. * May also contain `textAutofitDpi` (dpi the font was rendered at) if image was created from text.
* @returns {Promise<Object>} - when no callback is provided * @returns {Promise<Object>} - when no callback is provided
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
@@ -71,7 +75,7 @@ function toFile (fileOut, callback) {
err = new Error('Missing output file path'); err = new Error('Missing output file path');
} else if (is.string(this.options.input.file) && path.resolve(this.options.input.file) === path.resolve(fileOut)) { } else if (is.string(this.options.input.file) && path.resolve(this.options.input.file) === path.resolve(fileOut)) {
err = new Error('Cannot use same file for input and output'); err = new Error('Cannot use same file for input and output');
} else if (jp2Regex.test(fileOut) && !this.constructor.format.jp2k.output.file) { } else if (jp2Regex.test(path.extname(fileOut)) && !this.constructor.format.jp2k.output.file) {
err = errJp2Save(); err = errJp2Save();
} }
if (err) { if (err) {
@@ -91,12 +95,12 @@ function toFile (fileOut, callback) {
* Write output to a Buffer. * Write output to a Buffer.
* JPEG, PNG, WebP, AVIF, TIFF, GIF and raw pixel data output are supported. * JPEG, PNG, WebP, AVIF, TIFF, GIF and raw pixel data output are supported.
* *
* Use {@link toFormat} or one of the format-specific functions such as {@link jpeg}, {@link png} etc. to set the output format. * Use {@link #toformat|toFormat} or one of the format-specific functions such as {@link jpeg}, {@link png} etc. to set the output format.
* *
* If no explicit format is set, the output format will match the input image, except SVG input which becomes PNG output. * If no explicit format is set, the output format will match the input image, except SVG input which becomes PNG output.
* *
* By default all metadata will be removed, which includes EXIF-based orientation. * By default all metadata will be removed, which includes EXIF-based orientation.
* See {@link withMetadata} for control over this. * See {@link #withmetadata|withMetadata} for control over this.
* *
* `callback`, if present, gets three arguments `(err, data, info)` where: * `callback`, if present, gets three arguments `(err, data, info)` where:
* - `err` is an error, if any. * - `err` is an error, if any.
@@ -173,12 +177,18 @@ function toBuffer (options, callback) {
* .then(info => { ... }); * .then(info => { ... });
* *
* @example * @example
* // Set "IFD0-Copyright" in output EXIF metadata * // Set output EXIF metadata
* const data = await sharp(input) * const data = await sharp(input)
* .withMetadata({ * .withMetadata({
* exif: { * exif: {
* IFD0: { * IFD0: {
* Copyright: 'Wernham Hogg' * Copyright: 'The National Gallery'
* },
* IFD3: {
* GPSLatitudeRef: 'N',
* GPSLatitude: '51/1 30/1 3230/100',
* GPSLongitudeRef: 'W',
* GPSLongitude: '0/1 7/1 4366/100'
* } * }
* } * }
* }) * })
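EXIF stores each GPS coordinate as three rationals: whole degrees, whole minutes, and (here) hundredths of arc-seconds. A small helper showing how a decimal-degrees value maps onto the string form used in the example above (the `toExifGps` name is illustrative, not a sharp API; carry-over when seconds round up to 60 is ignored in this sketch):

```javascript
// Convert decimal degrees to the EXIF rational triple "D/1 M/1 S*100/100".
function toExifGps (decimal) {
  const abs = Math.abs(decimal);
  const degrees = Math.floor(abs);
  const minutes = Math.floor((abs - degrees) * 60);
  const hundredths = Math.round(((abs - degrees) * 60 - minutes) * 60 * 100);
  return `${degrees}/1 ${minutes}/1 ${hundredths}/100`;
}

console.log(toExifGps(51.508972)); // '51/1 30/1 3230/100'
```

The sign of the decimal value is carried separately via the `GPSLatitudeRef`/`GPSLongitudeRef` tags (`N`/`S`, `E`/`W`).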
@@ -192,7 +202,7 @@ function toBuffer (options, callback) {
* *
* @param {Object} [options] * @param {Object} [options]
* @param {number} [options.orientation] value between 1 and 8, used to update the EXIF `Orientation` tag. * @param {number} [options.orientation] value between 1 and 8, used to update the EXIF `Orientation` tag.
* @param {string} [options.icc] filesystem path to output ICC profile, defaults to sRGB. * @param {string} [options.icc='srgb'] Filesystem path to output ICC profile, relative to `process.cwd()`, defaults to built-in sRGB.
* @param {Object<Object>} [options.exif={}] Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data. * @param {Object<Object>} [options.exif={}] Object keyed by IFD0, IFD1 etc. of key/value string pairs to write as EXIF data.
* @param {number} [options.density] Number of pixels per inch (DPI). * @param {number} [options.density] Number of pixels per inch (DPI).
* @returns {Sharp} * @returns {Sharp}
@@ -473,6 +483,7 @@ function png (options) {
* @param {boolean} [options.lossless=false] - use lossless compression mode * @param {boolean} [options.lossless=false] - use lossless compression mode
* @param {boolean} [options.nearLossless=false] - use near_lossless compression mode * @param {boolean} [options.nearLossless=false] - use near_lossless compression mode
* @param {boolean} [options.smartSubsample=false] - use high quality chroma subsampling * @param {boolean} [options.smartSubsample=false] - use high quality chroma subsampling
* @param {string} [options.preset='default'] - named preset for preprocessing/filtering, one of: default, photo, picture, drawing, icon, text
* @param {number} [options.effort=4] - CPU effort, between 0 (fastest) and 6 (slowest) * @param {number} [options.effort=4] - CPU effort, between 0 (fastest) and 6 (slowest)
* @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation * @param {number} [options.loop=0] - number of animation iterations, use 0 for infinite animation
* @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds) * @param {number|number[]} [options.delay] - delay(s) between animation frames (in milliseconds)
@@ -507,6 +518,13 @@ function webp (options) {
if (is.defined(options.smartSubsample)) { if (is.defined(options.smartSubsample)) {
this._setBooleanOption('webpSmartSubsample', options.smartSubsample); this._setBooleanOption('webpSmartSubsample', options.smartSubsample);
} }
if (is.defined(options.preset)) {
if (is.string(options.preset) && is.inArray(options.preset, ['default', 'photo', 'picture', 'drawing', 'icon', 'text'])) {
this.options.webpPreset = options.preset;
} else {
throw is.invalidParameterError('preset', 'one of: default, photo, picture, drawing, icon, text', options.preset);
}
}
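The preset check above follows the same validate-or-throw pattern used by the other option setters in this file. The pattern in isolation, as a standalone sketch (not sharp internals — `validateEnum` is a hypothetical helper):

```javascript
// Generic enum-option validator mirroring the is.inArray /
// is.invalidParameterError pattern used throughout lib/output.js.
function validateEnum (name, allowed, value) {
  if (typeof value === 'string' && allowed.includes(value)) {
    return value;
  }
  throw new Error(`Expected ${name} to be one of: ${allowed.join(', ')}, received ${value}`);
}

const preset = validateEnum('preset',
  ['default', 'photo', 'picture', 'drawing', 'icon', 'text'], 'photo');
console.log(preset); // 'photo'
```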
if (is.defined(options.effort)) { if (is.defined(options.effort)) {
if (is.integer(options.effort) && is.inRange(options.effort, 0, 6)) { if (is.integer(options.effort) && is.inRange(options.effort, 0, 6)) {
this.options.webpEffort = options.effort; this.options.webpEffort = options.effort;
@@ -559,8 +577,8 @@ function webp (options) {
* .toFile('optim.gif'); * .toFile('optim.gif');
* *
* @param {Object} [options] - output options * @param {Object} [options] - output options
* @param {boolean} [options.reoptimise=false] - always generate new palettes (slow), re-use existing by default * @param {boolean} [options.reuse=true] - re-use existing palette, otherwise generate new (slow)
* @param {boolean} [options.reoptimize=false] - alternative spelling of `options.reoptimise` * @param {boolean} [options.progressive=false] - use progressive (interlace) scan
* @param {number} [options.colours=256] - maximum number of palette entries, including transparency, between 2 and 256 * @param {number} [options.colours=256] - maximum number of palette entries, including transparency, between 2 and 256
* @param {number} [options.colors=256] - alternative spelling of `options.colours` * @param {number} [options.colors=256] - alternative spelling of `options.colours`
* @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 10 (slowest) * @param {number} [options.effort=7] - CPU effort, between 1 (fastest) and 10 (slowest)
@@ -575,10 +593,11 @@ function webp (options) {
*/ */
function gif (options) { function gif (options) {
if (is.object(options)) { if (is.object(options)) {
if (is.defined(options.reoptimise)) { if (is.defined(options.reuse)) {
this._setBooleanOption('gifReoptimise', options.reoptimise); this._setBooleanOption('gifReuse', options.reuse);
} else if (is.defined(options.reoptimize)) { }
this._setBooleanOption('gifReoptimise', options.reoptimize); if (is.defined(options.progressive)) {
this._setBooleanOption('gifProgressive', options.progressive);
} }
const colours = options.colours || options.colors; const colours = options.colours || options.colors;
if (is.defined(colours)) { if (is.defined(colours)) {
@@ -621,6 +640,7 @@ function gif (options) {
return this._updateFormatOut('gif', options); return this._updateFormatOut('gif', options);
} }
/* istanbul ignore next */
/** /**
* Use these JP2 options for output image. * Use these JP2 options for output image.
* *
@@ -654,7 +674,6 @@ function gif (options) {
* @returns {Sharp} * @returns {Sharp}
* @throws {Error} Invalid options * @throws {Error} Invalid options
*/ */
/* istanbul ignore next */
function jp2 (options) { function jp2 (options) {
if (!this.constructor.format.jp2k.output.buffer) { if (!this.constructor.format.jp2k.output.buffer) {
throw errJp2Save(); throw errJp2Save();
@@ -690,7 +709,7 @@ function jp2 (options) {
} }
if (is.defined(options.chromaSubsampling)) { if (is.defined(options.chromaSubsampling)) {
if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) { if (is.string(options.chromaSubsampling) && is.inArray(options.chromaSubsampling, ['4:2:0', '4:4:4'])) {
this.options.heifChromaSubsampling = options.chromaSubsampling; this.options.jp2ChromaSubsampling = options.chromaSubsampling;
} else { } else {
throw is.invalidParameterError('chromaSubsampling', 'one of: 4:2:0, 4:4:4', options.chromaSubsampling); throw is.invalidParameterError('chromaSubsampling', 'one of: 4:2:0, 4:4:4', options.chromaSubsampling);
} }
@@ -735,7 +754,8 @@ function trySetAnimationOptions (source, target) {
/** /**
* Use these TIFF options for output image. * Use these TIFF options for output image.
* *
* The `density` can be set in pixels/inch via {@link withMetadata} instead of providing `xres` and `yres` in pixels/mm. * The `density` can be set in pixels/inch via {@link #withmetadata|withMetadata}
* instead of providing `xres` and `yres` in pixels/mm.
* *
* @example * @example
* // Convert SVG input to LZW-compressed, 1 bit per pixel TIFF output * // Convert SVG input to LZW-compressed, 1 bit per pixel TIFF output
@@ -1051,6 +1071,10 @@ function raw (options) {
* *
* The container will be set to `zip` when the output is a Buffer or Stream, otherwise it will default to `fs`. * The container will be set to `zip` when the output is a Buffer or Stream, otherwise it will default to `fs`.
* *
* Requires libvips compiled with support for libgsf.
* The prebuilt binaries do not include this - see
* {@link https://sharp.pixelplumbing.com/install#custom-libvips installing a custom libvips}.
*
* @example * @example
* sharp('input.tiff') * sharp('input.tiff')
* .png() * .png()


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const detectLibc = require('detect-libc'); const detectLibc = require('detect-libc');


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const is = require('./is'); const is = require('./is');
@@ -36,6 +39,18 @@ const position = {
'left top': 8 'left top': 8
}; };
/**
* How to extend the image.
* @member
* @private
*/
const extendWith = {
background: 'background',
copy: 'copy',
repeat: 'repeat',
mirror: 'mirror'
};
/** /**
* Strategies for automagic cover behaviour. * Strategies for automagic cover behaviour.
* @member * @member
@@ -103,7 +118,7 @@ function isResizeExpected (options) {
* Resize image to `width`, `height` or `width x height`. * Resize image to `width`, `height` or `width x height`.
* *
* When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are: * When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are:
* - `cover`: (default) Preserving aspect ratio, ensure the image covers both provided dimensions by cropping/clipping to fit. * - `cover`: (default) Preserving aspect ratio, attempt to ensure the image covers both provided dimensions by cropping/clipping to fit.
* - `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary. * - `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary.
* - `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions. * - `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions.
* - `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified. * - `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified.
@@ -216,32 +231,32 @@ function isResizeExpected (options) {
* .toBuffer() * .toBuffer()
* ); * );
* *
* @param {number} [width] - pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height. * @param {number} [width] - How many pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height.
* @param {number} [height] - pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width. * @param {number} [height] - How many pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width.
* @param {Object} [options] * @param {Object} [options]
* @param {String} [options.width] - alternative means of specifying `width`. If both are present this takes priority. * @param {number} [options.width] - An alternative means of specifying `width`. If both are present this takes priority.
* @param {String} [options.height] - alternative means of specifying `height`. If both are present this takes priority. * @param {number} [options.height] - An alternative means of specifying `height`. If both are present this takes priority.
* @param {String} [options.fit='cover'] - how the image should be resized to fit both provided dimensions, one of `cover`, `contain`, `fill`, `inside` or `outside`. * @param {String} [options.fit='cover'] - How the image should be resized/cropped to fit the target dimension(s), one of `cover`, `contain`, `fill`, `inside` or `outside`.
* @param {String} [options.position='centre'] - position, gravity or strategy to use when `fit` is `cover` or `contain`. * @param {String} [options.position='centre'] - A position, gravity or strategy to use when `fit` is `cover` or `contain`.
* @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. * @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency.
* @param {String} [options.kernel='lanczos3'] - the kernel to use for image reduction. * @param {String} [options.kernel='lanczos3'] - The kernel to use for image reduction. Use the `fastShrinkOnLoad` option to control kernel vs shrink-on-load.
* @param {Boolean} [options.withoutEnlargement=false] - do not enlarge if the width *or* height are already less than the specified dimensions, equivalent to GraphicsMagick's `>` geometry option. * @param {Boolean} [options.withoutEnlargement=false] - Do not scale up if the width *or* height are already less than the target dimensions, equivalent to GraphicsMagick's `>` geometry option. This may result in output dimensions smaller than the target dimensions.
* @param {Boolean} [options.withoutReduction=false] - do not reduce if the width *or* height are already greater than the specified dimensions, equivalent to GraphicsMagick's `<` geometry option. * @param {Boolean} [options.withoutReduction=false] - Do not scale down if the width *or* height are already greater than the target dimensions, equivalent to GraphicsMagick's `<` geometry option. This may still result in a crop to reach the target dimensions.
* @param {Boolean} [options.fastShrinkOnLoad=true] - take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern on some images. * @param {Boolean} [options.fastShrinkOnLoad=true] - Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern or round-down of an auto-scaled dimension.
* @returns {Sharp} * @returns {Sharp}
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
*/ */
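Each fit mode reduces to a scale-factor calculation. A standalone sketch of how `inside` interacts with `withoutEnlargement` (illustrative only; sharp's actual rounding and shrink-on-load logic live in libvips):

```javascript
// Compute output dimensions for fit: 'inside', optionally refusing to enlarge.
function fitInside (srcW, srcH, targetW, targetH, withoutEnlargement = false) {
  // 'inside' keeps the aspect ratio and fits within both targets.
  let scale = Math.min(targetW / srcW, targetH / srcH);
  if (withoutEnlargement) {
    scale = Math.min(scale, 1); // never scale up
  }
  return {
    width: Math.max(1, Math.round(srcW * scale)),
    height: Math.max(1, Math.round(srcH * scale))
  };
}

console.log(fitInside(4000, 3000, 200, 200)); // { width: 200, height: 150 }
console.log(fitInside(100, 80, 200, 200, true)); // { width: 100, height: 80 }
```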
function resize (width, height, options) { function resize (widthOrOptions, height, options) {
if (isResizeExpected(this.options)) { if (isResizeExpected(this.options)) {
this.options.debuglog('ignoring previous resize options'); this.options.debuglog('ignoring previous resize options');
} }
if (is.defined(width)) { if (is.defined(widthOrOptions)) {
if (is.object(width) && !is.defined(options)) { if (is.object(widthOrOptions) && !is.defined(options)) {
options = width; options = widthOrOptions;
} else if (is.integer(width) && width > 0) { } else if (is.integer(widthOrOptions) && widthOrOptions > 0) {
this.options.width = width; this.options.width = widthOrOptions;
} else { } else {
throw is.invalidParameterError('width', 'positive integer', width); throw is.invalidParameterError('width', 'positive integer', widthOrOptions);
} }
} else { } else {
this.options.width = -1; this.options.width = -1;
@@ -322,7 +337,8 @@ function resize (width, height, options) {
} }
/** /**
* Extends/pads the edges of the image with the provided background colour. * Extend / pad / extrude one or more edges of the image with either
* the provided background colour or pixels derived from the image.
* This operation will always occur after resizing and extraction, if any. * This operation will always occur after resizing and extraction, if any.
* *
* @example * @example
@@ -348,11 +364,21 @@ function resize (width, height, options) {
* }) * })
* ... * ...
* *
* @example
* // Extrude image by 8 pixels to the right, mirroring existing right hand edge
* sharp(input)
* .extend({
* right: 8,
* background: 'mirror'
* })
* ...
*
* @param {(number|Object)} extend - single pixel count to add to all edges or an Object with per-edge counts * @param {(number|Object)} extend - single pixel count to add to all edges or an Object with per-edge counts
* @param {number} [extend.top=0] * @param {number} [extend.top=0]
* @param {number} [extend.left=0] * @param {number} [extend.left=0]
* @param {number} [extend.bottom=0] * @param {number} [extend.bottom=0]
* @param {number} [extend.right=0] * @param {number} [extend.right=0]
* @param {String} [extend.extendWith='background'] - populate new pixels using this method, one of: background, copy, repeat, mirror.
* @param {String|Object} [extend.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency. * @param {String|Object} [extend.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency.
* @returns {Sharp} * @returns {Sharp}
* @throws {Error} Invalid parameters * @throws {Error} Invalid parameters
@@ -393,6 +419,13 @@ function extend (extend) {
} }
} }
this._setBackgroundColourOption('extendBackground', extend.background); this._setBackgroundColourOption('extendBackground', extend.background);
if (is.defined(extend.extendWith)) {
if (is.string(extendWith[extend.extendWith])) {
this.options.extendWith = extendWith[extend.extendWith];
} else {
throw is.invalidParameterError('extendWith', 'one of: background, copy, repeat, mirror', extend.extendWith);
}
}
} else { } else {
throw is.invalidParameterError('extend', 'integer or object', extend); throw is.invalidParameterError('extend', 'integer or object', extend);
} }


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const platformAndArch = require('./platform')(); const platformAndArch = require('./platform')();


@@ -1,3 +1,6 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict'; 'use strict';
const fs = require('fs'); const fs = require('fs');
@@ -43,7 +46,7 @@ const interpolators = {
}; };
/** /**
* An Object containing the version numbers of libvips and its dependencies. * An Object containing the version numbers of sharp, libvips and its dependencies.
* @member * @member
* @example * @example
* console.log(sharp.versions); * console.log(sharp.versions);
@@ -54,6 +57,7 @@ let versions = {
try { try {
versions = require(`../vendor/${versions.vips}/${platformAndArch}/versions.json`); versions = require(`../vendor/${versions.vips}/${platformAndArch}/versions.json`);
} catch (_err) { /* ignore */ } } catch (_err) { /* ignore */ }
versions.sharp = require('../package.json').version;
/** /**
* An Object containing the platform and architecture * An Object containing the platform and architecture
@@ -198,6 +202,72 @@ function simd (simd) {
} }
simd(true); simd(true);
/**
* Block libvips operations at runtime.
*
* This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable,
* which when set will block all "untrusted" operations.
*
* @since 0.32.4
*
* @example <caption>Block all TIFF input.</caption>
* sharp.block({
* operation: ['VipsForeignLoadTiff']
* });
*
* @param {Object} options
* @param {Array<string>} options.operation - List of libvips low-level operation names to block.
*/
function block (options) {
if (is.object(options)) {
if (Array.isArray(options.operation) && options.operation.every(is.string)) {
sharp.block(options.operation, true);
} else {
throw is.invalidParameterError('operation', 'Array<string>', options.operation);
}
} else {
throw is.invalidParameterError('options', 'object', options);
}
}
/**
* Unblock libvips operations at runtime.
*
* This is useful for defining a list of allowed operations.
*
* @since 0.32.4
*
* @example <caption>Block all input except WebP from the filesystem.</caption>
* sharp.block({
* operation: ['VipsForeignLoad']
* });
* sharp.unblock({
* operation: ['VipsForeignLoadWebpFile']
* });
*
* @example <caption>Block all input except JPEG and PNG from a Buffer or Stream.</caption>
* sharp.block({
* operation: ['VipsForeignLoad']
* });
* sharp.unblock({
* operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer']
* });
*
* @param {Object} options
* @param {Array<string>} options.operation - List of libvips low-level operation names to unblock.
*/
function unblock (options) {
if (is.object(options)) {
if (Array.isArray(options.operation) && options.operation.every(is.string)) {
sharp.block(options.operation, false);
} else {
throw is.invalidParameterError('operation', 'Array<string>', options.operation);
}
} else {
throw is.invalidParameterError('options', 'object', options);
}
}
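Because libvips operation names nest by class (e.g. `VipsForeignLoadWebpFile` is a subclass of `VipsForeignLoad`), blocking a base name and then unblocking a specific one yields an allow-list. A toy model of that resolution using longest-matching-prefix-wins (an illustrative sketch of the semantics, not how libvips implements it):

```javascript
// Most-specific (longest) matching rule decides whether an operation runs.
class OperationPolicy {
  constructor () { this.rules = new Map(); } // name -> blocked?
  block (names) { for (const n of names) this.rules.set(n, true); }
  unblock (names) { for (const n of names) this.rules.set(n, false); }
  isBlocked (operation) {
    let match = '';
    let blocked = false; // everything allowed by default
    for (const [name, isBlocked] of this.rules) {
      if (operation.startsWith(name) && name.length > match.length) {
        match = name;
        blocked = isBlocked;
      }
    }
    return blocked;
  }
}

const policy = new OperationPolicy();
policy.block(['VipsForeignLoad']);
policy.unblock(['VipsForeignLoadWebpFile']);
console.log(policy.isBlocked('VipsForeignLoadTiff')); // true
console.log(policy.isBlocked('VipsForeignLoadWebpFile')); // false
```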
/** /**
* Decorate the Sharp class with utility-related functions. * Decorate the Sharp class with utility-related functions.
* @private * @private
@@ -212,4 +282,6 @@ module.exports = function (Sharp) {
Sharp.versions = versions; Sharp.versions = versions;
Sharp.vendor = vendor; Sharp.vendor = vendor;
Sharp.queue = queue; Sharp.queue = queue;
Sharp.block = block;
Sharp.unblock = unblock;
}; };


@@ -1,7 +1,7 @@
{ {
"name": "sharp", "name": "sharp",
"description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images", "description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images",
"version": "0.31.3", "version": "0.32.4",
"author": "Lovell Fuller <npm@lovell.info>", "author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://github.com/lovell/sharp", "homepage": "https://github.com/lovell/sharp",
"contributors": [ "contributors": [
@@ -85,21 +85,24 @@
"Brodan <christopher.hranj@gmail.com>", "Brodan <christopher.hranj@gmail.com>",
"Ankur Parihar <ankur.github@gmail.com>", "Ankur Parihar <ankur.github@gmail.com>",
"Brahim Ait elhaj <brahima@gmail.com>", "Brahim Ait elhaj <brahima@gmail.com>",
"Mart Jansink <m.jansink@gmail.com>" "Mart Jansink <m.jansink@gmail.com>",
"Lachlan Newman <lachnewman007@gmail.com>"
], ],
"scripts": { "scripts": {
"install": "(node install/libvips && node install/dll-copy && prebuild-install) || (node install/can-compile && node-gyp rebuild && node install/dll-copy)", "install": "(node install/libvips && node install/dll-copy && prebuild-install) || (node install/can-compile && node-gyp rebuild && node install/dll-copy)",
"clean": "rm -rf node_modules/ build/ vendor/ .nyc_output/ coverage/ test/fixtures/output.*", "clean": "rm -rf node_modules/ build/ vendor/ .nyc_output/ coverage/ test/fixtures/output.*",
"test": "npm run test-lint && npm run test-unit && npm run test-licensing", "test": "npm run test-lint && npm run test-unit && npm run test-licensing && npm run test-types",
"test-lint": "semistandard && cpplint", "test-lint": "semistandard && cpplint",
"test-unit": "nyc --reporter=lcov --reporter=text --check-coverage --branches=100 mocha", "test-unit": "nyc --reporter=lcov --reporter=text --check-coverage --branches=100 mocha",
"test-licensing": "license-checker --production --summary --onlyAllow=\"Apache-2.0;BSD;ISC;MIT\"", "test-licensing": "license-checker --production --summary --onlyAllow=\"Apache-2.0;BSD;ISC;MIT\"",
"test-leak": "./test/leak/leak.sh", "test-leak": "./test/leak/leak.sh",
"docs-build": "documentation lint lib && node docs/build && node docs/search-index/build", "test-types": "tsd",
"docs-build": "node docs/build && node docs/search-index/build",
"docs-serve": "cd docs && npx serve", "docs-serve": "cd docs && npx serve",
"docs-publish": "cd docs && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp" "docs-publish": "cd docs && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp"
}, },
"main": "lib/index.js", "main": "lib/index.js",
"types": "lib/index.d.ts",
"files": [ "files": [
"binding.gyp", "binding.gyp",
"install/**", "install/**",
@@ -130,44 +133,45 @@
], ],
"dependencies": { "dependencies": {
"color": "^4.2.3", "color": "^4.2.3",
"detect-libc": "^2.0.1", "detect-libc": "^2.0.2",
"node-addon-api": "^5.0.0", "node-addon-api": "^6.1.0",
"prebuild-install": "^7.1.1", "prebuild-install": "^7.1.1",
"semver": "^7.3.8", "semver": "^7.5.4",
"simple-get": "^4.0.1", "simple-get": "^4.0.1",
"tar-fs": "^2.1.1", "tar-fs": "^3.0.4",
"tunnel-agent": "^0.6.0" "tunnel-agent": "^0.6.0"
}, },
"devDependencies": { "devDependencies": {
"@types/node": "*",
"async": "^3.2.4", "async": "^3.2.4",
"cc": "^3.0.1", "cc": "^3.0.1",
"documentation": "^14.0.1", "exif-reader": "^1.2.0",
"exif-reader": "^1.0.3",
"extract-zip": "^2.0.1", "extract-zip": "^2.0.1",
"icc": "^2.0.0", "icc": "^3.0.0",
"jsdoc-to-markdown": "^8.0.0",
"license-checker": "^25.0.1", "license-checker": "^25.0.1",
"mocha": "^10.2.0", "mocha": "^10.2.0",
"mock-fs": "^5.2.0", "mock-fs": "^5.2.0",
"nyc": "^15.1.0", "nyc": "^15.1.0",
"prebuild": "^11.0.4", "prebuild": "lovell/prebuild#add-nodejs-20-drop-nodejs-10-and-12",
"rimraf": "^3.0.2", "semistandard": "^16.0.1",
"semistandard": "^16.0.1" "tsd": "^0.28.1"
}, },
"license": "Apache-2.0", "license": "Apache-2.0",
"config": { "config": {
"libvips": "8.13.3", "libvips": "8.14.3",
"integrity": { "integrity": {
"darwin-arm64v8": "sha512-xFgYt7CtQSZcWoyUdzPTDNHbioZIrZSEU+gkMxzH4Cgjhi4/N49UsonnIZhKQoTBGloAqEexHeMx4rYTQ2Kgvw==", "darwin-arm64v8": "sha512-zeb7jQ/5ARZfBH9Uy5wlpN05bFpiIN0qN3gIIpfJhpN0rhGDnjJZQgK0W+pOmG1YiLL42BMCS0SHldb0xE33rA==",
"darwin-x64": "sha512-6SivWKzu15aUMMohe0wg7sNYMPETVnOe40BuWsnKOgzl3o5FpQqNSgs+68Mi8Za3Qti9/DaR+H/fyD0x48Af2w==", "darwin-x64": "sha512-C3N6smxdfprfz58cjojv0aekYXDl6+f9SwpGpxPG5RrZnrDMn5NOXtUQOEQ8PZ3Hd9VzfkJTnW/s36EvcMPfYg==",
"linux-arm64v8": "sha512-b+iI9V/ehgDabXYRQcvqa5CEysh+1FQsgFmYc358StCrJCDahwNmsQdsiH1GOVd5WaWh5wHUGByPwMmFOO16Aw==", "linux-arm64v8": "sha512-hT6B+OswqVQH10Fggq3jpOdn+GhxNx+5bk+EMr3lY3RZy72PZ+n4ZHJDfYSxAymdiz5rCdzGxsRLMb9GgD4OSw==",
"linux-armv6": "sha512-zRP2F+EiustLE4bXSH8AHCxwfemh9d+QuvmPjira/HL6uJOUuA7SyQgVV1TPwTQle2ioCNnKPm7FEB/MAiT+ug==", "linux-armv6": "sha512-cW9giVrBssHXFt07l+PgqGu7P7XRDv7oW8jC6iXGBcjG75N7rXz2CK0DyPclfnyoWH4IQ78dh5SkQWmb6X4tig==",
"linux-armv7": "sha512-6OCChowE5lBXXXAZrnGdA9dVktg7UdODEBpE5qTroiAJYZv4yXRMgyDFYajok7du2NTgoklhxGk8d9+4vGv5hg==", "linux-armv7": "sha512-hgqFt3UkZHK6D91JtYrYmT1niznh+N93Zxj2EWXgTLAdcS1D3QqaDPEg2EhInHbXqYvfOuQYAAXPxt7zVtKqcw==",
"linux-x64": "sha512-OTmlmP2r8ozGKdB96X+K5oQE1ojVZanqLqqKlwDpEnfixyIaDGYbVzcjWBNGU3ai/26bvkaCkjynnc2ecYcsuA==", "linux-x64": "sha512-FKbMBbCcFcSugRtuiTsA6Cov+M2WQ8nzvmmJ5xYYpRg/rsrWvObFT+6x/YBpblur9uXGjGIehjXVZtB3VXc+pg==",
"linuxmusl-arm64v8": "sha512-Qh5Wi+bkKTohFYHzSPssfjMhIkD6z6EHbVmnwmWSsgY9zsUBStFp6+mKcNTQfP5YM5Mz06vJOkLHX2OzEr5TzA==", "linuxmusl-arm64v8": "sha512-RTf6mrFyLGWnyt0DH4nHeXv5oSZMSJWxTdTt4cjvJsgp2Husz3mNJLQJGeehCuqPCYj/liJ9NIczw8u71eHFng==",
"linuxmusl-x64": "sha512-DwB4Fs3+ISw9etaLCANkueZDdk758iOS+wNp4TKZkHdq0al6B/3Pk7OHLR8a9E3H6wYDD328u++dcJzip5tacA==", "linuxmusl-x64": "sha512-y/8UOkHzKhi/5UM1/ONyPvpuhO11nPQmuJWfzqUKj8kSKnDasmxv3FN46yI0XY3xA2oFC8lQNFBnLudQsi3Nvw==",
"win32-arm64v8": "sha512-96r3W+O4BtX602B1MtxU5Ru4lKzRRTZqM4OQEBJ//TNL3fiCZdd9agD+RQBjaeR4KFIyBSt3F7IE425ZWmxz+w==", "win32-arm64v8": "sha512-D3PiVL981S7V0bSUwW3OqDS48H9QRw2vqQhYIY3JcIEssOnjWxmJGaz0Y9Zb8TYF5DHnnD6g5kEhob5Y2PIVEw==",
"win32-ia32": "sha512-qfN1MsfQGek1QQd1UNW7JT+5K5Ne1suFQ2GpgpYm3JLSpIve/tz2vOGEGzvTVssOBADJvAkTDFt+yIi3PgU9pA==", "win32-ia32": "sha512-FuLIaSIYJGJAcxyKkG/3/uuTzputekKSCcRCpRHkQS9J8IwM+yHzQeJ5W2PyAvNdeGIEwlYq3wnCNcXe1UGXWA==",
"win32-x64": "sha512-eb3aAmjbVVBVRbiYgebQwoxkAt69WI8nwmKlilSQ3kWqoc0pXfIe322rF2UR8ebbISCGvYRUfzD2r1k92RXISQ==" "win32-x64": "sha512-VQg4aBqpEfybgV8bjnrjfvnosxQDII/23mouFUfKHCsH5kvvHV5tTuPsxm6qbl+SCVploDK/zK1qpjop8YEvtg=="
}, },
"runtime": "napi", "runtime": "napi",
"target": 7 "target": 7
@@ -193,5 +197,8 @@
"filter": [ "filter": [
"build/include" "build/include"
] ]
},
"tsd": {
"directory": "test/types/"
} }
} }


@@ -1,16 +1,5 @@
// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors. // Copyright 2013 Lovell Fuller and others.
// // SPDX-License-Identifier: Apache-2.0
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <cstdlib> #include <cstdlib>
#include <string> #include <string>
@@ -76,14 +65,6 @@ namespace sharp {
} }
return vector; return vector;
} }
Napi::Buffer<char> NewOrCopyBuffer(Napi::Env env, char* data, size_t len) {
try {
return Napi::Buffer<char>::New(env, data, len, FreeCallback);
} catch (Napi::Error const &err) {}
Napi::Buffer<char> buf = Napi::Buffer<char>::Copy(env, data, len);
FreeCallback(nullptr, data);
return buf;
}
// Create an InputDescriptor instance from a Napi::Object describing an input image // Create an InputDescriptor instance from a Napi::Object describing an input image
InputDescriptor* CreateInputDescriptor(Napi::Object input) { InputDescriptor* CreateInputDescriptor(Napi::Object input) {
@@ -101,6 +82,10 @@ namespace sharp {
if (HasAttr(input, "density")) { if (HasAttr(input, "density")) {
descriptor->density = AttrAsDouble(input, "density"); descriptor->density = AttrAsDouble(input, "density");
} }
// Should we ignore any embedded ICC profile
if (HasAttr(input, "ignoreIcc")) {
descriptor->ignoreIcc = AttrAsBool(input, "ignoreIcc");
}
// Raw pixel input // Raw pixel input
if (HasAttr(input, "rawChannels")) { if (HasAttr(input, "rawChannels")) {
descriptor->rawDepth = AttrAsEnum<VipsBandFormat>(input, "rawDepth", VIPS_TYPE_BAND_FORMAT); descriptor->rawDepth = AttrAsEnum<VipsBandFormat>(input, "rawDepth", VIPS_TYPE_BAND_FORMAT);
@@ -167,6 +152,9 @@ namespace sharp {
if (HasAttr(input, "textSpacing")) { if (HasAttr(input, "textSpacing")) {
descriptor->textSpacing = AttrAsUint32(input, "textSpacing"); descriptor->textSpacing = AttrAsUint32(input, "textSpacing");
} }
if (HasAttr(input, "textWrap")) {
descriptor->textWrap = AttrAsEnum<VipsTextWrap>(input, "textWrap", VIPS_TYPE_TEXT_WRAP);
}
} }
// Limit input images to a given number of pixels, where pixels = width * height // Limit input images to a given number of pixels, where pixels = width * height
descriptor->limitInputPixels = static_cast<uint64_t>(AttrAsInt64(input, "limitInputPixels")); descriptor->limitInputPixels = static_cast<uint64_t>(AttrAsInt64(input, "limitInputPixels"));
@@ -228,6 +216,13 @@ namespace sharp {
return EndsWith(str, ".v") || EndsWith(str, ".V") || EndsWith(str, ".vips") || EndsWith(str, ".VIPS"); return EndsWith(str, ".v") || EndsWith(str, ".V") || EndsWith(str, ".vips") || EndsWith(str, ".VIPS");
} }
/*
Trim space from end of string.
*/
std::string TrimEnd(std::string const &str) {
return str.substr(0, str.find_last_not_of(" \n\r\f") + 1);
}
   /*
     Provide a string identifier for the given image type.
   */
@@ -454,6 +449,7 @@ namespace sharp {
       ->set("justify", descriptor->textJustify)
       ->set("rgba", descriptor->textRgba)
       ->set("spacing", descriptor->textSpacing)
+      ->set("wrap", descriptor->textWrap)
       ->set("autofit_dpi", &descriptor->textAutofitDpi);
     if (descriptor->textWidth > 0) {
       textOptions->set("width", descriptor->textWidth);
@@ -612,6 +608,15 @@ namespace sharp {
     return copy;
   }
+  /*
+    Remove GIF palette from image.
+  */
+  VImage RemoveGifPalette(VImage image) {
+    VImage copy = image.copy();
+    copy.remove("gif-palette");
+    return copy;
+  }
   /*
     Does this image have a non-default density?
   */
@@ -664,6 +669,10 @@ namespace sharp {
       if (image.width() > 65535 || height > 65535) {
         throw vips::VError("Processed image is too large for the GIF format");
       }
+    } else if (imageType == ImageType::HEIF) {
+      if (image.width() > 16384 || height > 16384) {
+        throw vips::VError("Processed image is too large for the HEIF format");
+      }
     }
   }
@@ -1026,4 +1035,13 @@ namespace sharp {
     return std::make_pair(hshrink, vshrink);
   }
+  /*
+    Ensure decoding remains sequential.
+  */
+  VImage StaySequential(VImage image, VipsAccess access, bool condition) {
+    if (access == VIPS_ACCESS_SEQUENTIAL && condition) {
+      return image.copy_memory();
+    }
+    return image;
+  }
 } // namespace sharp

View File

@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_COMMON_H_
 #define SRC_COMMON_H_
@@ -25,9 +14,9 @@
 // Verify platform and compiler compatibility
 #if (VIPS_MAJOR_VERSION < 8) || \
-  (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 13) || \
-  (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 13 && VIPS_MICRO_VERSION < 3)
-#error "libvips version 8.13.3+ is required - please see https://sharp.pixelplumbing.com/install"
+  (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 14) || \
+  (VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 14 && VIPS_MICRO_VERSION < 3)
+#error "libvips version 8.14.3+ is required - please see https://sharp.pixelplumbing.com/install"
 #endif
 #if ((!defined(__clang__)) && defined(__GNUC__) && (__GNUC__ < 4 || (__GNUC__ == 4 && __GNUC_MINOR__ < 6)))
@@ -55,6 +44,7 @@ namespace sharp {
     size_t bufferLength;
     bool isBuffer;
     double density;
+    bool ignoreIcc;
     VipsBandFormat rawDepth;
     int rawChannels;
     int rawWidth;
@@ -81,6 +71,7 @@ namespace sharp {
     int textDpi;
     bool textRgba;
     int textSpacing;
+    VipsTextWrap textWrap;
     int textAutofitDpi;
     InputDescriptor():
@@ -92,6 +83,7 @@ namespace sharp {
       bufferLength(0),
       isBuffer(FALSE),
       density(72.0),
+      ignoreIcc(FALSE),
       rawDepth(VIPS_FORMAT_UCHAR),
       rawChannels(0),
       rawWidth(0),
@@ -114,6 +106,7 @@ namespace sharp {
       textDpi(72),
       textRgba(FALSE),
       textSpacing(0),
+      textWrap(VIPS_TEXT_WRAP_WORD),
       textAutofitDpi(0) {}
   };
@@ -133,7 +126,6 @@ namespace sharp {
     return static_cast<T>(
       vips_enum_from_nick(nullptr, type, AttrAsStr(obj, attr).data()));
   }
-  Napi::Buffer<char> NewOrCopyBuffer(Napi::Env env, char* data, size_t len);
   // Create an InputDescriptor instance from a Napi::Object describing an input image
   InputDescriptor* CreateInputDescriptor(Napi::Object input);
@@ -189,6 +181,11 @@ namespace sharp {
   bool IsDzZip(std::string const &str);
   bool IsV(std::string const &str);
+  /*
+    Trim space from end of string.
+  */
+  std::string TrimEnd(std::string const &str);
   /*
     Provide a string identifier for the given image type.
   */
@@ -255,6 +252,11 @@ namespace sharp {
   */
   VImage RemoveAnimationProperties(VImage image);
+  /*
+    Remove GIF palette from image.
+  */
+  VImage RemoveGifPalette(VImage image);
   /*
     Does this image have a non-default density?
   */
@@ -368,6 +370,11 @@ namespace sharp {
   std::pair<double, double> ResolveShrink(int width, int height, int targetWidth, int targetHeight,
     Canvas canvas, bool swap, bool withoutEnlargement, bool withoutReduction);
+  /*
+    Ensure decoding remains sequential.
+  */
+  VImage StaySequential(VImage image, VipsAccess access, bool condition = TRUE);
 } // namespace sharp
 #endif // SRC_COMMON_H_

View File

@@ -3679,6 +3679,13 @@ VipsBlob *VImage::webpsave_buffer( VOption *options ) const
 	return( buffer );
 }
+
+void VImage::webpsave_mime( VOption *options ) const
+{
+	call( "webpsave_mime",
+		(options ? options : VImage::option())->
+			set( "in", *this ) );
+}
 void VImage::webpsave_target( VTarget target, VOption *options ) const
 {
 	call( "webpsave_target",

View File

@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <numeric>
 #include <vector>
@@ -80,6 +69,9 @@ class MetadataWorker : public Napi::AsyncWorker {
     if (image.get_typeof(VIPS_META_RESOLUTION_UNIT) == VIPS_TYPE_REF_STRING) {
       baton->resolutionUnit = image.get_string(VIPS_META_RESOLUTION_UNIT);
     }
+    if (image.get_typeof("magick-format") == VIPS_TYPE_REF_STRING) {
+      baton->formatMagick = image.get_string("magick-format");
+    }
     if (image.get_typeof("openslide.level-count") == VIPS_TYPE_REF_STRING) {
       int const levels = std::stoi(image.get_string("openslide.level-count"));
       for (int l = 0; l < levels; l++) {
@@ -153,7 +145,7 @@ class MetadataWorker : public Napi::AsyncWorker {
     // Handle warnings
     std::string warning = sharp::VipsWarningPop();
     while (!warning.empty()) {
-      debuglog.Call({ Napi::String::New(env, warning) });
+      debuglog.MakeCallback(Receiver().Value(), { Napi::String::New(env, warning) });
       warning = sharp::VipsWarningPop();
     }
@@ -204,6 +196,9 @@ class MetadataWorker : public Napi::AsyncWorker {
     if (!baton->resolutionUnit.empty()) {
       info.Set("resolutionUnit", baton->resolutionUnit == "in" ? "inch" : baton->resolutionUnit);
     }
+    if (!baton->formatMagick.empty()) {
+      info.Set("formatMagick", baton->formatMagick);
+    }
     if (!baton->levels.empty()) {
       int i = 0;
       Napi::Array levels = Napi::Array::New(env, static_cast<size_t>(baton->levels.size()));
@@ -235,24 +230,25 @@ class MetadataWorker : public Napi::AsyncWorker {
       info.Set("orientation", baton->orientation);
     }
     if (baton->exifLength > 0) {
-      info.Set("exif", sharp::NewOrCopyBuffer(env, baton->exif, baton->exifLength));
+      info.Set("exif", Napi::Buffer<char>::NewOrCopy(env, baton->exif, baton->exifLength, sharp::FreeCallback));
     }
     if (baton->iccLength > 0) {
-      info.Set("icc", sharp::NewOrCopyBuffer(env, baton->icc, baton->iccLength));
+      info.Set("icc", Napi::Buffer<char>::NewOrCopy(env, baton->icc, baton->iccLength, sharp::FreeCallback));
     }
     if (baton->iptcLength > 0) {
-      info.Set("iptc", sharp::NewOrCopyBuffer(env, baton->iptc, baton->iptcLength));
+      info.Set("iptc", Napi::Buffer<char>::NewOrCopy(env, baton->iptc, baton->iptcLength, sharp::FreeCallback));
     }
     if (baton->xmpLength > 0) {
-      info.Set("xmp", sharp::NewOrCopyBuffer(env, baton->xmp, baton->xmpLength));
+      info.Set("xmp", Napi::Buffer<char>::NewOrCopy(env, baton->xmp, baton->xmpLength, sharp::FreeCallback));
     }
     if (baton->tifftagPhotoshopLength > 0) {
       info.Set("tifftagPhotoshop",
-        sharp::NewOrCopyBuffer(env, baton->tifftagPhotoshop, baton->tifftagPhotoshopLength));
+        Napi::Buffer<char>::NewOrCopy(env, baton->tifftagPhotoshop,
+          baton->tifftagPhotoshopLength, sharp::FreeCallback));
     }
     Callback().MakeCallback(Receiver().Value(), { env.Null(), info });
   } else {
-    Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, baton->err).Value() });
+    Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
   }
   delete baton->input;
@@ -270,7 +266,7 @@ class MetadataWorker : public Napi::AsyncWorker {
 Napi::Value metadata(const Napi::CallbackInfo& info) {
   // V8 objects are converted to non-V8 types held in the baton struct
   MetadataBaton *baton = new MetadataBaton;
-  Napi::Object options = info[0].As<Napi::Object>();
+  Napi::Object options = info[size_t(0)].As<Napi::Object>();
   // Input
   baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
@@ -279,7 +275,7 @@ Napi::Value metadata(const Napi::CallbackInfo& info) {
   Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
   // Join queue for worker thread
-  Napi::Function callback = info[1].As<Napi::Function>();
+  Napi::Function callback = info[size_t(1)].As<Napi::Function>();
   MetadataWorker *worker = new MetadataWorker(callback, baton, debuglog);
   worker->Receiver().Set("options", options);
   worker->Queue();
View File

@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_METADATA_H_
 #define SRC_METADATA_H_
@@ -41,6 +30,7 @@ struct MetadataBaton {
   int pagePrimary;
   std::string compression;
   std::string resolutionUnit;
+  std::string formatMagick;
   std::vector<std::pair<int, int>> levels;
   int subifds;
   std::vector<double> background;

View File

@@ -1,23 +1,11 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <algorithm>
 #include <functional>
 #include <memory>
 #include <tuple>
 #include <vector>
 #include <vips/vips8>
 #include "common.h"
@@ -57,7 +45,7 @@ namespace sharp {
   /*
    * Stretch luminance to cover full dynamic range.
    */
-  VImage Normalise(VImage image) {
+  VImage Normalise(VImage image, int const lower, int const upper) {
     // Get original colourspace
     VipsInterpretation typeBeforeNormalize = image.interpretation();
     if (typeBeforeNormalize == VIPS_INTERPRETATION_RGB) {
@@ -67,9 +55,11 @@ namespace sharp {
     VImage lab = image.colourspace(VIPS_INTERPRETATION_LAB);
     // Extract luminance
     VImage luminance = lab[0];
     // Find luminance range
-    int const min = luminance.percent(1);
-    int const max = luminance.percent(99);
+    int const min = lower == 0 ? luminance.min() : luminance.percent(lower);
+    int const max = upper == 100 ? luminance.max() : luminance.percent(upper);
     if (std::abs(max - min) > 1) {
       // Extract chroma
       VImage chroma = lab.extract_band(1, VImage::option()->set("n", 2));
@@ -196,6 +186,7 @@ namespace sharp {
   VImage Modulate(VImage image, double const brightness, double const saturation,
     int const hue, double const lightness) {
+    VipsInterpretation colourspaceBeforeModulate = image.interpretation();
     if (HasAlpha(image)) {
       // Separate alpha channel
       VImage alpha = image[image.bands() - 1];
@@ -205,7 +196,7 @@ namespace sharp {
         { brightness, saturation, 1},
         { lightness, 0.0, static_cast<double>(hue) }
       )
-      .colourspace(VIPS_INTERPRETATION_sRGB)
+      .colourspace(colourspaceBeforeModulate)
       .bandjoin(alpha);
     } else {
       return image
@@ -214,7 +205,7 @@ namespace sharp {
         { brightness, saturation, 1 },
         { lightness, 0.0, static_cast<double>(hue) }
       )
-      .colourspace(VIPS_INTERPRETATION_sRGB);
+      .colourspace(colourspaceBeforeModulate);
     }
   }
@@ -278,30 +269,20 @@ namespace sharp {
     if (image.width() < 3 && image.height() < 3) {
       throw VError("Image to trim must be at least 3x3 pixels");
     }
-    // Scale up 8-bit values to match 16-bit input image
-    double multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0;
-    threshold *= multiplier;
-    std::vector<double> backgroundAlpha(1);
     if (background.size() == 0) {
       // Top-left pixel provides the default background colour if none is given
       background = image.extract_area(0, 0, 1, 1)(0, 0);
-      multiplier = 1.0;
+    } else if (sharp::Is16Bit(image.interpretation())) {
+      for (size_t i = 0; i < background.size(); i++) {
+        background[i] *= 256.0;
+      }
+      threshold *= 256.0;
     }
-    if (HasAlpha(image) && background.size() == 4) {
-      // Just discard the alpha because flattening the background colour with
-      // itself (effectively what find_trim() does) gives the same result
-      backgroundAlpha[0] = background[3] * multiplier;
-    }
-    if (image.bands() > 2) {
-      background = {
-        background[0] * multiplier,
-        background[1] * multiplier,
-        background[2] * multiplier
-      };
+    std::vector<double> backgroundAlpha({ background.back() });
+    if (HasAlpha(image)) {
+      background.pop_back();
     } else {
-      background[0] = background[0] * multiplier;
+      background.resize(image.bands());
     }
     int left, top, width, height;
     left = image.find_trim(&top, &width, &height, VImage::option()
@@ -342,12 +323,26 @@ namespace sharp {
     if (a.size() > bands) {
       throw VError("Band expansion using linear is unsupported");
     }
+    bool const uchar = !Is16Bit(image.interpretation());
     if (HasAlpha(image) && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
       // Separate alpha channel
       VImage alpha = image[bands - 1];
-      return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", TRUE)).bandjoin(alpha);
+      return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", uchar)).bandjoin(alpha);
     } else {
-      return image.linear(a, b, VImage::option()->set("uchar", TRUE));
+      return image.linear(a, b, VImage::option()->set("uchar", uchar));
     }
   }
+  /*
+   * Unflatten
+   */
+  VImage Unflatten(VImage image) {
+    if (HasAlpha(image)) {
+      VImage alpha = image[image.bands() - 1];
+      VImage noAlpha = RemoveAlpha(image);
+      return noAlpha.bandjoin(alpha & (noAlpha.colourspace(VIPS_INTERPRETATION_B_W) < 255));
+    } else {
+      return image.bandjoin(image.colourspace(VIPS_INTERPRETATION_B_W) < 255);
+    }
+  }
@@ -395,11 +390,11 @@ namespace sharp {
    * Split into frames, embed each frame, reassemble, and update pageHeight.
    */
   VImage EmbedMultiPage(VImage image, int left, int top, int width, int height,
-    std::vector<double> background, int nPages, int *pageHeight) {
+    VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight) {
     if (top == 0 && height == *pageHeight) {
       // Fast path; no need to adjust the height of the multi-page image
       return image.embed(left, 0, width, image.height(), VImage::option()
-        ->set("extend", VIPS_EXTEND_BACKGROUND)
+        ->set("extend", extendWith)
         ->set("background", background));
     } else if (left == 0 && width == image.width()) {
       // Fast path; no need to adjust the width of the multi-page image
@@ -411,7 +406,7 @@ namespace sharp {
       // Do the embed on the wide image
       image = image.embed(0, top, image.width(), height, VImage::option()
-        ->set("extend", VIPS_EXTEND_BACKGROUND)
+        ->set("extend", extendWith)
         ->set("background", background));
       // Split the wide image into frames
@@ -441,7 +436,7 @@ namespace sharp {
       // Embed each frame in the target size
       for (int i = 0; i < nPages; i++) {
         pages[i] = pages[i].embed(left, top, width, height, VImage::option()
-          ->set("extend", VIPS_EXTEND_BACKGROUND)
+          ->set("extend", extendWith)
           ->set("background", background));
       }

View File

@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_OPERATIONS_H_
 #define SRC_OPERATIONS_H_
@@ -33,7 +22,7 @@ namespace sharp {
   /*
    * Stretch luminance to cover full dynamic range.
    */
-  VImage Normalise(VImage image);
+  VImage Normalise(VImage image, int const lower, int const upper);
   /*
    * Contrast limiting adapative histogram equalization (CLAHE)
@@ -97,6 +86,11 @@ namespace sharp {
   */
   VImage Linear(VImage image, std::vector<double> const a, std::vector<double> const b);
+  /*
+   * Unflatten
+   */
+  VImage Unflatten(VImage image);
   /*
    * Recomb with a Matrix of the given bands/channel size.
    * Eg. RGB will be a 3x3 matrix.
@@ -124,7 +118,7 @@ namespace sharp {
   * Split into frames, embed each frame, reassemble, and update pageHeight.
   */
   VImage EmbedMultiPage(VImage image, int left, int top, int width, int height,
-    std::vector<double> background, int nPages, int *pageHeight);
+    VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight);
 } // namespace sharp

View File

@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <algorithm>
 #include <cmath>
@@ -67,6 +56,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       vips::VImage image;
       sharp::ImageType inputImageType;
       std::tie(image, inputImageType) = sharp::OpenInput(baton->input);
+      VipsAccess access = baton->input->access;
       image = sharp::EnsureColourspace(image, baton->colourspaceInput);
       int nPages = baton->input->pages;
@@ -85,6 +75,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       VipsAngle autoRotation = VIPS_ANGLE_D0;
       bool autoFlip = FALSE;
       bool autoFlop = FALSE;
+
       if (baton->useExifOrientation) {
         // Rotate and flip image according to Exif orientation
         std::tie(autoRotation, autoFlip, autoFlop) = CalculateExifRotationAndFlip(sharp::ExifOrientation(image));
@@ -100,6 +91,13 @@ class PipelineWorker : public Napi::AsyncWorker {
         baton->rotationAngle != 0.0);
       if (shouldRotateBefore) {
+        image = sharp::StaySequential(image, access,
+          rotation != VIPS_ANGLE_D0 ||
+          autoRotation != VIPS_ANGLE_D0 ||
+          autoFlip ||
+          baton->flip ||
+          baton->rotationAngle != 0.0);
         if (autoRotation != VIPS_ANGLE_D0) {
           image = image.rot(autoRotation);
           autoRotation = VIPS_ANGLE_D0;
@@ -126,13 +124,14 @@ class PipelineWorker : public Napi::AsyncWorker {
           MultiPageUnsupported(nPages, "Rotate");
           std::vector<double> background;
           std::tie(image, background) = sharp::ApplyAlpha(image, baton->rotationBackground, FALSE);
-          image = image.rotate(baton->rotationAngle, VImage::option()->set("background", background));
+          image = image.rotate(baton->rotationAngle, VImage::option()->set("background", background)).copy_memory();
         }
       }
       // Trim
       if (baton->trimThreshold > 0.0) {
         MultiPageUnsupported(nPages, "Trim");
+        image = sharp::StaySequential(image, access);
         image = sharp::Trim(image, baton->trimBackground, baton->trimThreshold);
         baton->trimOffsetLeft = image.xoffset();
         baton->trimOffsetTop = image.yoffset();
@@ -205,7 +204,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       if (jpegShrinkOnLoad > 1 && static_cast<int>(shrink) == jpegShrinkOnLoad) {
         jpegShrinkOnLoad /= 2;
       }
-    } else if (inputImageType == sharp::ImageType::WEBP && shrink > 1.0) {
+    } else if (inputImageType == sharp::ImageType::WEBP && baton->fastShrinkOnLoad && shrink > 1.0) {
       // Avoid upscaling via webp
       scale = 1.0 / shrink;
     } else if (inputImageType == sharp::ImageType::SVG ||
@@ -219,7 +218,7 @@ class PipelineWorker : public Napi::AsyncWorker {
// pdfload* and svgload* // pdfload* and svgload*
if (jpegShrinkOnLoad > 1) { if (jpegShrinkOnLoad > 1) {
vips::VOption *option = VImage::option() vips::VOption *option = VImage::option()
->set("access", baton->input->access) ->set("access", access)
->set("shrink", jpegShrinkOnLoad) ->set("shrink", jpegShrinkOnLoad)
->set("unlimited", baton->input->unlimited) ->set("unlimited", baton->input->unlimited)
->set("fail_on", baton->input->failOn); ->set("fail_on", baton->input->failOn);
@@ -234,7 +233,7 @@ class PipelineWorker : public Napi::AsyncWorker {
} }
} else if (scale != 1.0) { } else if (scale != 1.0) {
vips::VOption *option = VImage::option() vips::VOption *option = VImage::option()
->set("access", baton->input->access) ->set("access", access)
->set("scale", scale) ->set("scale", scale)
->set("fail_on", baton->input->failOn); ->set("fail_on", baton->input->failOn);
if (inputImageType == sharp::ImageType::WEBP) { if (inputImageType == sharp::ImageType::WEBP) {
@@ -320,7 +319,8 @@ class PipelineWorker : public Napi::AsyncWorker {
if ( if (
sharp::HasProfile(image) && sharp::HasProfile(image) &&
image.interpretation() != VIPS_INTERPRETATION_LABS && image.interpretation() != VIPS_INTERPRETATION_LABS &&
image.interpretation() != VIPS_INTERPRETATION_GREY16 image.interpretation() != VIPS_INTERPRETATION_GREY16 &&
!baton->input->ignoreIcc
) { ) {
// Convert to sRGB/P3 using embedded profile // Convert to sRGB/P3 using embedded profile
try { try {
@@ -329,9 +329,12 @@ class PipelineWorker : public Napi::AsyncWorker {
->set("depth", image.interpretation() == VIPS_INTERPRETATION_RGB16 ? 16 : 8) ->set("depth", image.interpretation() == VIPS_INTERPRETATION_RGB16 ? 16 : 8)
->set("intent", VIPS_INTENT_PERCEPTUAL)); ->set("intent", VIPS_INTENT_PERCEPTUAL));
} catch(...) { } catch(...) {
// Ignore failure of embedded profile sharp::VipsWarningCallback(nullptr, G_LOG_LEVEL_WARNING, "Invalid embedded profile", nullptr);
} }
} else if (image.interpretation() == VIPS_INTERPRETATION_CMYK) { } else if (
image.interpretation() == VIPS_INTERPRETATION_CMYK &&
baton->colourspaceInput != VIPS_INTERPRETATION_CMYK
) {
image = image.icc_transform(processingProfile, VImage::option() image = image.icc_transform(processingProfile, VImage::option()
->set("input_profile", "cmyk") ->set("input_profile", "cmyk")
->set("intent", VIPS_INTENT_PERCEPTUAL)); ->set("intent", VIPS_INTENT_PERCEPTUAL));
@@ -367,11 +370,12 @@ class PipelineWorker : public Napi::AsyncWorker {
image = sharp::EnsureAlpha(image, 1); image = sharp::EnsureAlpha(image, 1);
} }
VipsBandFormat premultiplyFormat = image.format();
bool const shouldPremultiplyAlpha = sharp::HasAlpha(image) && bool const shouldPremultiplyAlpha = sharp::HasAlpha(image) &&
(shouldResize || shouldBlur || shouldConv || shouldSharpen); (shouldResize || shouldBlur || shouldConv || shouldSharpen);
if (shouldPremultiplyAlpha) { if (shouldPremultiplyAlpha) {
image = image.premultiply(); image = image.premultiply().cast(premultiplyFormat);
} }
     // Resize
@@ -381,15 +385,20 @@ class PipelineWorker : public Napi::AsyncWorker {
         ->set("kernel", baton->kernel));
     }
+    image = sharp::StaySequential(image, access,
+      autoRotation != VIPS_ANGLE_D0 ||
+      baton->flip ||
+      autoFlip ||
+      rotation != VIPS_ANGLE_D0);
     // Auto-rotate post-extract
     if (autoRotation != VIPS_ANGLE_D0) {
       image = image.rot(autoRotation);
     }
-    // Flip (mirror about Y axis)
+    // Mirror vertically (up-down) about the x-axis
     if (baton->flip || autoFlip) {
       image = image.flip(VIPS_DIRECTION_VERTICAL);
     }
-    // Flop (mirror about X axis)
+    // Mirror horizontally (left-right) about the y-axis
     if (baton->flop || autoFlop) {
       image = image.flip(VIPS_DIRECTION_HORIZONTAL);
     }
@@ -399,16 +408,18 @@ class PipelineWorker : public Napi::AsyncWorker {
     }
     // Join additional color channels to the image
-    if (baton->joinChannelIn.size() > 0) {
+    if (!baton->joinChannelIn.empty()) {
       VImage joinImage;
       sharp::ImageType joinImageType = sharp::ImageType::UNKNOWN;
       for (unsigned int i = 0; i < baton->joinChannelIn.size(); i++) {
+        baton->joinChannelIn[i]->access = access;
         std::tie(joinImage, joinImageType) = sharp::OpenInput(baton->joinChannelIn[i]);
         joinImage = sharp::EnsureColourspace(joinImage, baton->colourspaceInput);
         image = image.bandjoin(joinImage);
       }
       image = image.copy(VImage::option()->set("interpretation", baton->colourspace));
+      image = sharp::RemoveGifPalette(image);
     }
     inputWidth = image.width();
@@ -438,7 +449,7 @@ class PipelineWorker : public Napi::AsyncWorker {
         image = nPages > 1
           ? sharp::EmbedMultiPage(image,
-              left, top, width, height, background, nPages, &targetPageHeight)
+              left, top, width, height, VIPS_EXTEND_BACKGROUND, background, nPages, &targetPageHeight)
           : image.embed(left, top, width, height, VImage::option()
             ->set("extend", VIPS_EXTEND_BACKGROUND)
             ->set("background", background));
@@ -455,6 +466,7 @@ class PipelineWorker : public Napi::AsyncWorker {
         // Gravity-based crop
         int left;
         int top;
         std::tie(left, top) = sharp::CalculateCrop(
           inputWidth, inputHeight, baton->width, baton->height, baton->position);
         int width = std::min(inputWidth, baton->width);
@@ -465,16 +477,25 @@ class PipelineWorker : public Napi::AsyncWorker {
               left, top, width, height, nPages, &targetPageHeight)
           : image.extract_area(left, top, width, height);
       } else {
+        int attention_x;
+        int attention_y;
         // Attention-based or Entropy-based crop
         MultiPageUnsupported(nPages, "Resize strategy");
-        image = image.tilecache(VImage::option()
-          ->set("access", VIPS_ACCESS_RANDOM)
-          ->set("threaded", TRUE));
+        image = sharp::StaySequential(image, access);
         image = image.smartcrop(baton->width, baton->height, VImage::option()
-          ->set("interesting", baton->position == 16 ? VIPS_INTERESTING_ENTROPY : VIPS_INTERESTING_ATTENTION));
+          ->set("interesting", baton->position == 16 ? VIPS_INTERESTING_ENTROPY : VIPS_INTERESTING_ATTENTION)
+#if (VIPS_MAJOR_VERSION >= 8 && VIPS_MINOR_VERSION >= 15)
+          ->set("premultiplied", shouldPremultiplyAlpha)
+#endif
+          ->set("attention_x", &attention_x)
+          ->set("attention_y", &attention_y));
         baton->hasCropOffset = true;
         baton->cropOffsetLeft = static_cast<int>(image.xoffset());
         baton->cropOffsetTop = static_cast<int>(image.yoffset());
+        baton->hasAttentionCenter = true;
+        baton->attentionX = static_cast<int>(attention_x * jpegShrinkOnLoad / scale);
+        baton->attentionY = static_cast<int>(attention_y * jpegShrinkOnLoad / scale);
       }
     }
   }
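The hunk above reports the smartcrop attention centre in the coordinate space of the image *after* shrink-on-load, then rescales it back to the original input's coordinates before storing it on the baton. A minimal sketch of that rescaling, with a hypothetical helper name (`attentionToInputSpace` is not part of sharp; `jpegShrinkOnLoad` and `scale` follow their meaning in the diff):

```cpp
// Sketch: map an attention coordinate reported post-shrink-on-load back to
// the original input image's coordinate space. jpegShrinkOnLoad is the
// integer libjpeg shrink factor (1, 2, 4 or 8); scale is any additional
// load-time scaling that was applied.
inline int attentionToInputSpace(int attention, int jpegShrinkOnLoad, double scale) {
  // Undo the shrink factor, then undo the scale, truncating as the
  // static_cast<int> in the diff does.
  return static_cast<int>(attention * jpegShrinkOnLoad / scale);
}
```

This is why the reported `attentionX`/`attentionY` refer to the input image rather than the resized output.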
@@ -482,6 +503,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Rotate post-extract non-90 angle
     if (!baton->rotateBeforePreExtract && baton->rotationAngle != 0.0) {
       MultiPageUnsupported(nPages, "Rotate");
+      image = sharp::StaySequential(image, access);
       std::vector<double> background;
       std::tie(image, background) = sharp::ApplyAlpha(image, baton->rotationBackground, shouldPremultiplyAlpha);
       image = image.rotate(baton->rotationAngle, VImage::option()->set("background", background));
@@ -503,8 +525,9 @@ class PipelineWorker : public Napi::AsyncWorker {
     }
     // Affine transform
-    if (baton->affineMatrix.size() > 0) {
+    if (!baton->affineMatrix.empty()) {
       MultiPageUnsupported(nPages, "Affine");
+      image = sharp::StaySequential(image, access);
       std::vector<double> background;
       std::tie(image, background) = sharp::ApplyAlpha(image, baton->affineBackground, shouldPremultiplyAlpha);
       vips::VInterpolate interp = vips::VInterpolate::new_from_name(
@@ -519,24 +542,37 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Extend edges
     if (baton->extendTop > 0 || baton->extendBottom > 0 || baton->extendLeft > 0 || baton->extendRight > 0) {
-      std::vector<double> background;
-      std::tie(image, background) = sharp::ApplyAlpha(image, baton->extendBackground, shouldPremultiplyAlpha);
       // Embed
       baton->width = image.width() + baton->extendLeft + baton->extendRight;
       baton->height = (nPages > 1 ? targetPageHeight : image.height()) + baton->extendTop + baton->extendBottom;
-      image = nPages > 1
-        ? sharp::EmbedMultiPage(image,
-            baton->extendLeft, baton->extendTop, baton->width, baton->height, background, nPages, &targetPageHeight)
-        : image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height,
-            VImage::option()->set("extend", VIPS_EXTEND_BACKGROUND)->set("background", background));
+      if (baton->extendWith == VIPS_EXTEND_BACKGROUND) {
+        std::vector<double> background;
+        std::tie(image, background) = sharp::ApplyAlpha(image, baton->extendBackground, shouldPremultiplyAlpha);
+        image = nPages > 1
+          ? sharp::EmbedMultiPage(image,
+              baton->extendLeft, baton->extendTop, baton->width, baton->height,
+              baton->extendWith, background, nPages, &targetPageHeight)
+          : image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height,
+              VImage::option()->set("extend", baton->extendWith)->set("background", background));
+      } else {
+        std::vector<double> ignoredBackground(1);
+        image = nPages > 1
+          ? sharp::EmbedMultiPage(image,
+              baton->extendLeft, baton->extendTop, baton->width, baton->height,
+              baton->extendWith, ignoredBackground, nPages, &targetPageHeight)
+          : image.embed(baton->extendLeft, baton->extendTop, baton->width, baton->height,
+              VImage::option()->set("extend", baton->extendWith));
+      }
     }
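The extend hunk above computes the target canvas from the current dimensions plus the requested margins, using the per-page height for multi-page (animated) input. A minimal sketch of that arithmetic, with hypothetical names (`Extended`/`extendedSize` are illustrative, not sharp API):

```cpp
// Sketch: extended output dimensions, mirroring the width/height assignments
// in the diff. For multi-page input, height is per-page (targetPageHeight),
// not the full concatenated image height.
struct Extended { int width; int height; };

inline Extended extendedSize(int width, int height, int pageHeight, int nPages,
                             int top, int bottom, int left, int right) {
  return {
    width + left + right,
    (nPages > 1 ? pageHeight : height) + top + bottom
  };
}
```

Note the branch on `extendWith`: only `VIPS_EXTEND_BACKGROUND` needs a real background colour (and possible alpha premultiplication); the other extend modes (copy, mirror, repeat) ignore it, hence the placeholder `ignoredBackground`.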
     // Median - must happen before blurring, due to the utility of blurring after thresholding
     if (baton->medianSize > 0) {
       image = image.median(baton->medianSize);
     }
     // Threshold - must happen before blurring, due to the utility of blurring after thresholding
+    // Threshold - must happen before unflatten to enable non-white unflattening
     if (baton->threshold != 0) {
       image = sharp::Threshold(image, baton->threshold, baton->thresholdGrayscale);
     }
@@ -546,6 +582,11 @@ class PipelineWorker : public Napi::AsyncWorker {
       image = sharp::Blur(image, baton->blurSigma);
     }
+    // Unflatten the image
+    if (baton->unflatten) {
+      image = sharp::Unflatten(image);
+    }
     // Convolve
     if (shouldConv) {
       image = sharp::Convolve(image,
@@ -572,13 +613,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Reverse premultiplication after all transformations
     if (shouldPremultiplyAlpha) {
-      image = image.unpremultiply();
-      // Cast pixel values to integer
-      if (sharp::Is16Bit(image.interpretation())) {
-        image = image.cast(VIPS_FORMAT_USHORT);
-      } else {
-        image = image.cast(VIPS_FORMAT_UCHAR);
-      }
+      image = image.unpremultiply().cast(premultiplyFormat);
     }
     baton->premultiplied = shouldPremultiplyAlpha;
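The change above replaces the interpretation-based cast with a cast back to the band format captured before premultiplication, so the round trip preserves the original format exactly. A simplified sketch of the underlying 8-bit arithmetic, assuming integer truncation (this is not the libvips implementation, which works in higher precision; `premultiply8`/`unpremultiply8` are illustrative names):

```cpp
#include <cstdint>

// Sketch: premultiplying scales each channel by alpha/255; unpremultiplying
// reverses it. Both truncate toward zero, which is why a lossless round trip
// needs the band format preserved rather than re-derived afterwards.
inline uint8_t premultiply8(uint8_t value, uint8_t alpha) {
  return static_cast<uint8_t>(value * alpha / 255);
}

inline uint8_t unpremultiply8(uint8_t value, uint8_t alpha) {
  // Fully transparent pixels carry no recoverable colour.
  return alpha == 0 ? 0 : static_cast<uint8_t>(value * 255 / alpha);
}
```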
@@ -589,6 +624,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       for (Composite *composite : baton->composite) {
         VImage compositeImage;
         sharp::ImageType compositeImageType = sharp::ImageType::UNKNOWN;
+        composite->input->access = access;
         std::tie(compositeImage, compositeImageType) = sharp::OpenInput(composite->input);
         compositeImage = sharp::EnsureColourspace(compositeImage, baton->colourspaceInput);
         // Verify within current dimensions
@@ -660,6 +696,7 @@ class PipelineWorker : public Napi::AsyncWorker {
         ys.push_back(top);
       }
       image = VImage::composite(images, modes, VImage::option()->set("x", xs)->set("y", ys));
+      image = sharp::RemoveGifPalette(image);
     }
     // Gamma decoding (brighten)
@@ -674,11 +711,13 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Apply normalisation - stretch luminance to cover full dynamic range
     if (baton->normalise) {
-      image = sharp::Normalise(image);
+      image = sharp::StaySequential(image, access);
+      image = sharp::Normalise(image, baton->normaliseLower, baton->normaliseUpper);
     }
     // Apply contrast limiting adaptive histogram equalization (CLAHE)
     if (baton->claheWidth != 0 && baton->claheHeight != 0) {
+      image = sharp::StaySequential(image, access);
       image = sharp::Clahe(image, baton->claheWidth, baton->claheHeight, baton->claheMaxSlope);
     }
@@ -686,9 +725,11 @@ class PipelineWorker : public Napi::AsyncWorker {
     if (baton->boolean != nullptr) {
       VImage booleanImage;
       sharp::ImageType booleanImageType = sharp::ImageType::UNKNOWN;
+      baton->boolean->access = access;
       std::tie(booleanImage, booleanImageType) = sharp::OpenInput(baton->boolean);
       booleanImage = sharp::EnsureColourspace(booleanImage, baton->colourspaceInput);
       image = sharp::Boolean(image, booleanImage, baton->booleanOp);
+      image = sharp::RemoveGifPalette(image);
     }
     // Apply per-channel Bandbool bitwise operations after all other operations
@@ -853,6 +894,7 @@ class PipelineWorker : public Napi::AsyncWorker {
         ->set("lossless", baton->webpLossless)
         ->set("near_lossless", baton->webpNearLossless)
         ->set("smart_subsample", baton->webpSmartSubsample)
+        ->set("preset", baton->webpPreset)
         ->set("effort", baton->webpEffort)
         ->set("min_size", baton->webpMinSize)
         ->set("mixed", baton->webpMixed)
@@ -870,7 +912,8 @@ class PipelineWorker : public Napi::AsyncWorker {
         ->set("strip", !baton->withMetadata)
         ->set("bitdepth", baton->gifBitdepth)
         ->set("effort", baton->gifEffort)
-        ->set("reoptimise", baton->gifReoptimise)
+        ->set("reuse", baton->gifReuse)
+        ->set("interlace", baton->gifProgressive)
         ->set("interframe_maxerror", baton->gifInterFrameMaxError)
         ->set("interpalette_maxerror", baton->gifInterPaletteMaxError)
         ->set("dither", baton->gifDither)));
@@ -911,6 +954,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     } else if (baton->formatOut == "heif" ||
       (baton->formatOut == "input" && inputImageType == sharp::ImageType::HEIF)) {
       // Write HEIF to buffer
+      sharp::AssertImageTypeDimensions(image, sharp::ImageType::HEIF);
       image = sharp::RemoveAnimationProperties(image).cast(VIPS_FORMAT_UCHAR);
       VipsArea *area = reinterpret_cast<VipsArea*>(image.heifsave_buffer(VImage::option()
         ->set("strip", !baton->withMetadata)
@@ -932,6 +976,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       if (!sharp::HasAlpha(image)) {
         baton->tileBackground.pop_back();
       }
+      image = sharp::StaySequential(image, access, baton->tileAngle != 0);
       vips::VOption *options = BuildOptionsDZ(baton);
       VipsArea *area = reinterpret_cast<VipsArea*>(image.dzsave_buffer(options));
       baton->bufferOut = static_cast<char*>(area->data);
@@ -1055,6 +1100,7 @@ class PipelineWorker : public Napi::AsyncWorker {
         ->set("lossless", baton->webpLossless)
         ->set("near_lossless", baton->webpNearLossless)
         ->set("smart_subsample", baton->webpSmartSubsample)
+        ->set("preset", baton->webpPreset)
         ->set("effort", baton->webpEffort)
         ->set("min_size", baton->webpMinSize)
         ->set("mixed", baton->webpMixed)
@@ -1068,7 +1114,8 @@ class PipelineWorker : public Napi::AsyncWorker {
         ->set("strip", !baton->withMetadata)
         ->set("bitdepth", baton->gifBitdepth)
         ->set("effort", baton->gifEffort)
-        ->set("reoptimise", baton->gifReoptimise)
+        ->set("reuse", baton->gifReuse)
+        ->set("interlace", baton->gifProgressive)
         ->set("dither", baton->gifDither));
       baton->formatOut = "gif";
     } else if (baton->formatOut == "tiff" || (mightMatchInput && isTiff) ||
@@ -1099,6 +1146,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     } else if (baton->formatOut == "heif" || (mightMatchInput && isHeif) ||
       (willMatchInput && inputImageType == sharp::ImageType::HEIF)) {
       // Write HEIF to file
+      sharp::AssertImageTypeDimensions(image, sharp::ImageType::HEIF);
       image = sharp::RemoveAnimationProperties(image).cast(VIPS_FORMAT_UCHAR);
       image.heifsave(const_cast<char*>(baton->fileOut.data()), VImage::option()
         ->set("strip", !baton->withMetadata)
@@ -1129,6 +1177,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       if (!sharp::HasAlpha(image)) {
         baton->tileBackground.pop_back();
       }
+      image = sharp::StaySequential(image, access, baton->tileAngle != 0);
       vips::VOption *options = BuildOptionsDZ(baton);
       image.dzsave(const_cast<char*>(baton->fileOut.data()), options);
       baton->formatOut = "dz";
@@ -1164,7 +1213,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Handle warnings
     std::string warning = sharp::VipsWarningPop();
     while (!warning.empty()) {
-      debuglog.Call({ Napi::String::New(env, warning) });
+      debuglog.MakeCallback(Receiver().Value(), { Napi::String::New(env, warning) });
       warning = sharp::VipsWarningPop();
     }
@@ -1193,6 +1242,10 @@ class PipelineWorker : public Napi::AsyncWorker {
       info.Set("cropOffsetLeft", static_cast<int32_t>(baton->cropOffsetLeft));
       info.Set("cropOffsetTop", static_cast<int32_t>(baton->cropOffsetTop));
     }
+    if (baton->hasAttentionCenter) {
+      info.Set("attentionX", static_cast<int32_t>(baton->attentionX));
+      info.Set("attentionY", static_cast<int32_t>(baton->attentionY));
+    }
     if (baton->trimThreshold > 0.0) {
       info.Set("trimOffsetLeft", static_cast<int32_t>(baton->trimOffsetLeft));
       info.Set("trimOffsetTop", static_cast<int32_t>(baton->trimOffsetTop));
@@ -1206,8 +1259,8 @@ class PipelineWorker : public Napi::AsyncWorker {
       // Add buffer size to info
       info.Set("size", static_cast<uint32_t>(baton->bufferOutLength));
       // Pass ownership of output data to Buffer instance
-      Napi::Buffer<char> data = sharp::NewOrCopyBuffer(env, static_cast<char*>(baton->bufferOut),
-        baton->bufferOutLength);
+      Napi::Buffer<char> data = Napi::Buffer<char>::NewOrCopy(env, static_cast<char*>(baton->bufferOut),
+        baton->bufferOutLength, sharp::FreeCallback);
       Callback().MakeCallback(Receiver().Value(), { env.Null(), data, info });
     } else {
       // Add file size to info
@@ -1218,7 +1271,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       Callback().MakeCallback(Receiver().Value(), { env.Null(), info });
     }
   } else {
-    Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, baton->err).Value() });
+    Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
   }
   // Delete baton
@@ -1236,7 +1289,7 @@ class PipelineWorker : public Napi::AsyncWorker {
     // Decrement processing task counter
     g_atomic_int_dec_and_test(&sharp::counterProcess);
     Napi::Number queueLength = Napi::Number::New(env, static_cast<double>(sharp::counterQueue));
-    queueListener.Call(Receiver().Value(), { queueLength });
+    queueListener.MakeCallback(Receiver().Value(), { queueLength });
   }
  private:
@@ -1326,6 +1379,7 @@ class PipelineWorker : public Napi::AsyncWorker {
       {"lossless", baton->webpLossless ? "TRUE" : "FALSE"},
       {"near_lossless", baton->webpNearLossless ? "TRUE" : "FALSE"},
       {"smart_subsample", baton->webpSmartSubsample ? "TRUE" : "FALSE"},
+      {"preset", vips_enum_nick(VIPS_TYPE_FOREIGN_WEBP_PRESET, baton->webpPreset)},
      {"min_size", baton->webpMinSize ? "TRUE" : "FALSE"},
       {"mixed", baton->webpMixed ? "TRUE" : "FALSE"},
       {"effort", std::to_string(baton->webpEffort)}
@@ -1382,7 +1436,7 @@ class PipelineWorker : public Napi::AsyncWorker {
 Napi::Value pipeline(const Napi::CallbackInfo& info) {
   // V8 objects are converted to non-V8 types held in the baton struct
   PipelineBaton *baton = new PipelineBaton;
-  Napi::Object options = info[0].As<Napi::Object>();
+  Napi::Object options = info[size_t(0)].As<Napi::Object>();
   // Input
   baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
@@ -1444,6 +1498,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   // Operators
   baton->flatten = sharp::AttrAsBool(options, "flatten");
   baton->flattenBackground = sharp::AttrAsVectorOfDouble(options, "flattenBackground");
+  baton->unflatten = sharp::AttrAsBool(options, "unflatten");
   baton->negate = sharp::AttrAsBool(options, "negate");
   baton->negateAlpha = sharp::AttrAsBool(options, "negateAlpha");
   baton->blurSigma = sharp::AttrAsDouble(options, "blurSigma");
@@ -1468,6 +1523,8 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   baton->linearB = sharp::AttrAsVectorOfDouble(options, "linearB");
   baton->greyscale = sharp::AttrAsBool(options, "greyscale");
   baton->normalise = sharp::AttrAsBool(options, "normalise");
+  baton->normaliseLower = sharp::AttrAsUint32(options, "normaliseLower");
+  baton->normaliseUpper = sharp::AttrAsUint32(options, "normaliseUpper");
   baton->tintA = sharp::AttrAsDouble(options, "tintA");
   baton->tintB = sharp::AttrAsDouble(options, "tintB");
   baton->claheWidth = sharp::AttrAsUint32(options, "claheWidth");
@@ -1485,6 +1542,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   baton->extendLeft = sharp::AttrAsInt32(options, "extendLeft");
   baton->extendRight = sharp::AttrAsInt32(options, "extendRight");
   baton->extendBackground = sharp::AttrAsVectorOfDouble(options, "extendBackground");
+  baton->extendWith = sharp::AttrAsEnum<VipsExtend>(options, "extendWith", VIPS_TYPE_EXTEND);
   baton->extractChannel = sharp::AttrAsInt32(options, "extractChannel");
   baton->affineMatrix = sharp::AttrAsVectorOfDouble(options, "affineMatrix");
   baton->affineBackground = sharp::AttrAsVectorOfDouble(options, "affineBackground");
@@ -1574,6 +1632,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   baton->webpLossless = sharp::AttrAsBool(options, "webpLossless");
   baton->webpNearLossless = sharp::AttrAsBool(options, "webpNearLossless");
   baton->webpSmartSubsample = sharp::AttrAsBool(options, "webpSmartSubsample");
+  baton->webpPreset = sharp::AttrAsEnum<VipsForeignWebpPreset>(options, "webpPreset", VIPS_TYPE_FOREIGN_WEBP_PRESET);
   baton->webpEffort = sharp::AttrAsUint32(options, "webpEffort");
   baton->webpMinSize = sharp::AttrAsBool(options, "webpMinSize");
   baton->webpMixed = sharp::AttrAsBool(options, "webpMixed");
@@ -1582,7 +1641,8 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   baton->gifDither = sharp::AttrAsDouble(options, "gifDither");
   baton->gifInterFrameMaxError = sharp::AttrAsDouble(options, "gifInterFrameMaxError");
   baton->gifInterPaletteMaxError = sharp::AttrAsDouble(options, "gifInterPaletteMaxError");
-  baton->gifReoptimise = sharp::AttrAsBool(options, "gifReoptimise");
+  baton->gifReuse = sharp::AttrAsBool(options, "gifReuse");
+  baton->gifProgressive = sharp::AttrAsBool(options, "gifProgressive");
   baton->tiffQuality = sharp::AttrAsUint32(options, "tiffQuality");
   baton->tiffPyramid = sharp::AttrAsBool(options, "tiffPyramid");
   baton->tiffBitdepth = sharp::AttrAsUint32(options, "tiffBitdepth");
@@ -1632,20 +1692,6 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   baton->tileId = sharp::AttrAsStr(options, "tileId");
   baton->tileBasename = sharp::AttrAsStr(options, "tileBasename");
-  // Force random access for certain operations
-  if (baton->input->access == VIPS_ACCESS_SEQUENTIAL) {
-    if (
-      baton->trimThreshold > 0.0 ||
-      baton->normalise ||
-      baton->position == 16 || baton->position == 17 ||
-      baton->angle % 360 != 0 ||
-      fmod(baton->rotationAngle, 360.0) != 0.0 ||
-      baton->useExifOrientation
-    ) {
-      baton->input->access = VIPS_ACCESS_RANDOM;
-    }
-  }
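The deleted block above forced random access up front whenever any requested operation could not stream. This changeset removes that global switch in favour of per-operation `sharp::StaySequential` guards placed immediately before each non-streamable step. The decision those guards make can be sketched as a predicate (illustrative only; the real helper also performs the decode-to-memory copy):

```cpp
// Sketch: a sequential-input pipeline only needs to be fully decoded when it
// reaches an operation that genuinely requires random access. Deciding this
// per operation, rather than up front, keeps streamable pipelines sequential.
inline bool needsFullDecode(bool sequentialAccess, bool opNeedsRandomAccess) {
  return sequentialAccess && opNeedsRandomAccess;
}
```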
   // Function to notify of libvips warnings
   Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
@@ -1653,7 +1699,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   Napi::Function queueListener = options.Get("queueListener").As<Napi::Function>();
   // Join queue for worker thread
-  Napi::Function callback = info[1].As<Napi::Function>();
+  Napi::Function callback = info[size_t(1)].As<Napi::Function>();
   PipelineWorker *worker = new PipelineWorker(callback, baton, debuglog, queueListener);
   worker->Receiver().Set("options", options);
   worker->Queue();
@@ -1661,7 +1707,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
   // Increment queued task counter
   g_atomic_int_inc(&sharp::counterQueue);
   Napi::Number queueLength = Napi::Number::New(info.Env(), static_cast<double>(sharp::counterQueue));
-  queueListener.Call(info.This(), { queueLength });
+  queueListener.MakeCallback(info.This(), { queueLength });
   return info.Env().Undefined();
 }


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_PIPELINE_H_
 #define SRC_PIPELINE_H_
@@ -74,6 +63,9 @@ struct PipelineBaton {
   bool hasCropOffset;
   int cropOffsetLeft;
   int cropOffsetTop;
+  bool hasAttentionCenter;
+  int attentionX;
+  int attentionY;
   bool premultiplied;
   bool tileCentre;
   bool fastShrinkOnLoad;
@@ -81,6 +73,7 @@ struct PipelineBaton {
   double tintB;
   bool flatten;
   std::vector<double> flattenBackground;
+  bool unflatten;
   bool negate;
   bool negateAlpha;
   double blurSigma;
@@ -107,6 +100,8 @@ struct PipelineBaton {
   double gammaOut;
   bool greyscale;
   bool normalise;
+  int normaliseLower;
+  int normaliseUpper;
   int claheWidth;
   int claheHeight;
   int claheMaxSlope;
@@ -122,6 +117,7 @@ struct PipelineBaton {
   int extendLeft;
   int extendRight;
   std::vector<double> extendBackground;
+  VipsExtend extendWith;
   bool withoutEnlargement;
   bool withoutReduction;
   std::vector<double> affineMatrix;
@@ -157,6 +153,7 @@ struct PipelineBaton {
   bool webpNearLossless;
   bool webpLossless;
   bool webpSmartSubsample;
+  VipsForeignWebpPreset webpPreset;
   int webpEffort;
   bool webpMinSize;
   bool webpMixed;
@@ -165,7 +162,8 @@ struct PipelineBaton {
   double gifDither;
   double gifInterFrameMaxError;
   double gifInterPaletteMaxError;
-  bool gifReoptimise;
+  bool gifReuse;
+  bool gifProgressive;
   int tiffQuality;
   VipsForeignTiffCompression tiffCompression;
   VipsForeignTiffPredictor tiffPredictor;
@@ -235,11 +233,15 @@ struct PipelineBaton {
     hasCropOffset(false),
     cropOffsetLeft(0),
     cropOffsetTop(0),
hasAttentionCenter(false),
attentionX(0),
attentionY(0),
premultiplied(false), premultiplied(false),
tintA(128.0), tintA(128.0),
tintB(128.0), tintB(128.0),
flatten(false), flatten(false),
flattenBackground{ 0.0, 0.0, 0.0 }, flattenBackground{ 0.0, 0.0, 0.0 },
unflatten(false),
negate(false), negate(false),
negateAlpha(true), negateAlpha(true),
blurSigma(0.0), blurSigma(0.0),
@@ -265,6 +267,8 @@ struct PipelineBaton {
gamma(0.0), gamma(0.0),
greyscale(false), greyscale(false),
normalise(false), normalise(false),
normaliseLower(1),
normaliseUpper(99),
claheWidth(0), claheWidth(0),
claheHeight(0), claheHeight(0),
claheMaxSlope(3), claheMaxSlope(3),
@@ -279,6 +283,7 @@ struct PipelineBaton {
extendLeft(0), extendLeft(0),
extendRight(0), extendRight(0),
extendBackground{ 0.0, 0.0, 0.0, 255.0 }, extendBackground{ 0.0, 0.0, 0.0, 255.0 },
extendWith(VIPS_EXTEND_BACKGROUND),
withoutEnlargement(false), withoutEnlargement(false),
withoutReduction(false), withoutReduction(false),
affineMatrix{ 1.0, 0.0, 0.0, 1.0 }, affineMatrix{ 1.0, 0.0, 0.0, 1.0 },
@@ -314,6 +319,7 @@ struct PipelineBaton {
webpNearLossless(false), webpNearLossless(false),
webpLossless(false), webpLossless(false),
webpSmartSubsample(false), webpSmartSubsample(false),
webpPreset(VIPS_FOREIGN_WEBP_PRESET_DEFAULT),
webpEffort(4), webpEffort(4),
webpMinSize(false), webpMinSize(false),
webpMixed(false), webpMixed(false),
@@ -322,7 +328,8 @@ struct PipelineBaton {
gifDither(1.0), gifDither(1.0),
gifInterFrameMaxError(0.0), gifInterFrameMaxError(0.0),
gifInterPaletteMaxError(3.0), gifInterPaletteMaxError(3.0),
gifReoptimise(false), gifReuse(true),
gifProgressive(false),
tiffQuality(80), tiffQuality(80),
tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG), tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG),
tiffPredictor(VIPS_FOREIGN_TIFF_PREDICTOR_HORIZONTAL), tiffPredictor(VIPS_FOREIGN_TIFF_PREDICTOR_HORIZONTAL),
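The new `normaliseLower`/`normaliseUpper` baton fields default to 1 and 99, i.e. a contrast stretch between the 1st and 99th luminance percentiles. A minimal sketch of that idea in plain JavaScript (an approximation for illustration only; libvips' actual `normalise` works on the image's luminance histogram):

```javascript
// Hedged sketch of what the normaliseLower/normaliseUpper fields imply:
// stretch values so the chosen lower percentile maps to 0 and the upper
// percentile maps to 255, clamping anything outside that range.
function percentile (sorted, p) {
  const idx = Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length));
  return sorted[idx];
}

function normalise (pixels, lower = 1, upper = 99) {
  const sorted = [...pixels].sort((a, b) => a - b);
  const lo = percentile(sorted, lower);
  const hi = percentile(sorted, upper);
  const range = Math.max(1, hi - lo); // avoid division by zero
  return pixels.map(v =>
    Math.max(0, Math.min(255, Math.round(((v - lo) * 255) / range))));
}

const stretched = normalise([10, 12, 100, 200, 210]);
console.log(stretched.join(',')); // → 0,3,115,242,255
```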


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <napi.h>
 #include <vips/vips8>
@@ -42,6 +31,7 @@ Napi::Object init(Napi::Env env, Napi::Object exports) {
 exports.Set("simd", Napi::Function::New(env, simd));
 exports.Set("libvipsVersion", Napi::Function::New(env, libvipsVersion));
 exports.Set("format", Napi::Function::New(env, format));
+exports.Set("block", Napi::Function::New(env, block));
 exports.Set("_maxColourDistance", Napi::Function::New(env, _maxColourDistance));
 exports.Set("_isUsingJemalloc", Napi::Function::New(env, _isUsingJemalloc));
 exports.Set("stats", Napi::Function::New(env, stats));


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <numeric>
 #include <vector>
@@ -117,7 +106,7 @@ class StatsWorker : public Napi::AsyncWorker {
 // Handle warnings
 std::string warning = sharp::VipsWarningPop();
 while (!warning.empty()) {
-debuglog.Call({ Napi::String::New(env, warning) });
+debuglog.MakeCallback(Receiver().Value(), { Napi::String::New(env, warning) });
 warning = sharp::VipsWarningPop();
 }
@@ -154,7 +143,7 @@ class StatsWorker : public Napi::AsyncWorker {
 info.Set("dominant", dominant);
 Callback().MakeCallback(Receiver().Value(), { env.Null(), info });
 } else {
-Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, baton->err).Value() });
+Callback().MakeCallback(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
 }
 delete baton->input;
@@ -172,7 +161,7 @@ Napi::Value stats(const Napi::CallbackInfo& info) {
 // V8 objects are converted to non-V8 types held in the baton struct
 StatsBaton *baton = new StatsBaton;
-Napi::Object options = info[0].As<Napi::Object>();
+Napi::Object options = info[size_t(0)].As<Napi::Object>();
 // Input
 baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
@@ -182,7 +171,7 @@ Napi::Value stats(const Napi::CallbackInfo& info) {
 Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
 // Join queue for worker thread
-Napi::Function callback = info[1].As<Napi::Function>();
+Napi::Function callback = info[size_t(1)].As<Napi::Function>();
 StatsWorker *worker = new StatsWorker(callback, baton, debuglog);
 worker->Receiver().Set("options", options);
 worker->Queue();


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_STATS_H_
 #define SRC_STATS_H_


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #include <cmath>
 #include <string>
@@ -30,16 +19,16 @@ Napi::Value cache(const Napi::CallbackInfo& info) {
 Napi::Env env = info.Env();
 // Set memory limit
-if (info[0].IsNumber()) {
-vips_cache_set_max_mem(info[0].As<Napi::Number>().Int32Value() * 1048576);
+if (info[size_t(0)].IsNumber()) {
+vips_cache_set_max_mem(info[size_t(0)].As<Napi::Number>().Int32Value() * 1048576);
 }
 // Set file limit
-if (info[1].IsNumber()) {
-vips_cache_set_max_files(info[1].As<Napi::Number>().Int32Value());
+if (info[size_t(1)].IsNumber()) {
+vips_cache_set_max_files(info[size_t(1)].As<Napi::Number>().Int32Value());
 }
 // Set items limit
-if (info[2].IsNumber()) {
-vips_cache_set_max(info[2].As<Napi::Number>().Int32Value());
+if (info[size_t(2)].IsNumber()) {
+vips_cache_set_max(info[size_t(2)].As<Napi::Number>().Int32Value());
 }
 // Get memory stats
@@ -69,8 +58,8 @@ Napi::Value cache(const Napi::CallbackInfo& info) {
 */
 Napi::Value concurrency(const Napi::CallbackInfo& info) {
 // Set concurrency
-if (info[0].IsNumber()) {
-vips_concurrency_set(info[0].As<Napi::Number>().Int32Value());
+if (info[size_t(0)].IsNumber()) {
+vips_concurrency_set(info[size_t(0)].As<Napi::Number>().Int32Value());
 }
 // Get concurrency
 return Napi::Number::New(info.Env(), vips_concurrency_get());
@@ -91,8 +80,8 @@ Napi::Value counters(const Napi::CallbackInfo& info) {
 */
 Napi::Value simd(const Napi::CallbackInfo& info) {
 // Set state
-if (info[0].IsBoolean()) {
-vips_vector_set_enabled(info[0].As<Napi::Boolean>().Value());
+if (info[size_t(0)].IsBoolean()) {
+vips_vector_set_enabled(info[size_t(0)].As<Napi::Boolean>().Value());
 }
 // Get state
 return Napi::Boolean::New(info.Env(), vips_vector_isenabled());
@@ -175,6 +164,17 @@ Napi::Value format(const Napi::CallbackInfo& info) {
 return format;
 }
+/*
+  (Un)block libvips operations at runtime.
+*/
+void block(const Napi::CallbackInfo& info) {
+  Napi::Array ops = info[size_t(0)].As<Napi::Array>();
+  bool const state = info[size_t(1)].As<Napi::Boolean>().Value();
+  for (unsigned int i = 0; i < ops.Length(); i++) {
+    vips_operation_block_set(ops.Get(i).As<Napi::String>().Utf8Value().c_str(), state);
+  }
+}
 /*
 Synchronous, internal-only method used by some of the functional tests.
 Calculates the maximum colour distance using the DE2000 algorithm
@@ -185,10 +185,10 @@ Napi::Value _maxColourDistance(const Napi::CallbackInfo& info) {
 // Open input files
 VImage image1;
-sharp::ImageType imageType1 = sharp::DetermineImageType(info[0].As<Napi::String>().Utf8Value().data());
+sharp::ImageType imageType1 = sharp::DetermineImageType(info[size_t(0)].As<Napi::String>().Utf8Value().data());
 if (imageType1 != sharp::ImageType::UNKNOWN) {
 try {
-image1 = VImage::new_from_file(info[0].As<Napi::String>().Utf8Value().c_str());
+image1 = VImage::new_from_file(info[size_t(0)].As<Napi::String>().Utf8Value().c_str());
 } catch (...) {
 throw Napi::Error::New(env, "Input file 1 has corrupt header");
 }
@@ -196,10 +196,10 @@ Napi::Value _maxColourDistance(const Napi::CallbackInfo& info) {
 throw Napi::Error::New(env, "Input file 1 is of an unsupported image format");
 }
 VImage image2;
-sharp::ImageType imageType2 = sharp::DetermineImageType(info[1].As<Napi::String>().Utf8Value().data());
+sharp::ImageType imageType2 = sharp::DetermineImageType(info[size_t(1)].As<Napi::String>().Utf8Value().data());
 if (imageType2 != sharp::ImageType::UNKNOWN) {
 try {
-image2 = VImage::new_from_file(info[1].As<Napi::String>().Utf8Value().c_str());
+image2 = VImage::new_from_file(info[size_t(1)].As<Napi::String>().Utf8Value().c_str());
 } catch (...) {
 throw Napi::Error::New(env, "Input file 2 has corrupt header");
 }
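The new `block()` binding above loops over an array of operation names and calls `vips_operation_block_set()` for each; per the commit message, this exposes the ability to (un)block libvips operations by name at runtime. A toy model of the matching semantics (a hypothetical helper, not sharp's or libvips' API: libvips matches against the GObject class hierarchy, approximated here by name prefixes, with the most specific rule winning):

```javascript
// Toy model of vips_operation_block_set() semantics: each rule sets the
// blocked state for an operation class and its subclasses; the most
// specific (longest-prefix) matching rule wins, so a broad block can be
// selectively undone for one subclass.
const rules = [];

function blockSet (name, state) {
  rules.push({ name, state }); // mirrors vips_operation_block_set(name, state)
}

function isBlocked (operation) {
  let best = '';
  let blocked = false;
  for (const { name, state } of rules) {
    if (operation.startsWith(name) && name.length >= best.length) {
      best = name;
      blocked = state;
    }
  }
  return blocked;
}

blockSet('VipsForeignLoad', true);      // block every loader...
blockSet('VipsForeignLoadJpeg', false); // ...except the JPEG loader
console.log(isBlocked('VipsForeignLoadPng'));  // → true
console.log(isBlocked('VipsForeignLoadJpeg')); // → false
```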


@@ -1,16 +1,5 @@
-// Copyright 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 Lovell Fuller and contributors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-// http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 #ifndef SRC_UTILITIES_H_
 #define SRC_UTILITIES_H_
@@ -23,6 +12,7 @@ Napi::Value counters(const Napi::CallbackInfo& info);
 Napi::Value simd(const Napi::CallbackInfo& info);
 Napi::Value libvipsVersion(const Napi::CallbackInfo& info);
 Napi::Value format(const Napi::CallbackInfo& info);
+void block(const Napi::CallbackInfo& info);
 Napi::Value _maxColourDistance(const Napi::CallbackInfo& info);
 Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info);


@@ -1,3 +1,6 @@
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 'use strict';
 const sharp = require('../');


@@ -8,17 +8,16 @@
"test": "node perf && node random && node parallel" "test": "node perf && node random && node parallel"
}, },
"dependencies": { "dependencies": {
"@squoosh/cli": "0.7.2", "@squoosh/cli": "0.7.3",
"@squoosh/lib": "0.4.0", "@squoosh/lib": "0.4.0",
"async": "3.2.4", "async": "3.2.4",
"benchmark": "2.1.4", "benchmark": "2.1.4",
"gm": "1.25.0", "gm": "1.25.0",
"imagemagick": "0.1.3", "imagemagick": "0.1.3",
"jimp": "0.16.2", "jimp": "0.22.7"
"semver": "7.3.8"
}, },
"optionalDependencies": { "optionalDependencies": {
"@tensorflow/tfjs-node": "4.1.0", "@tensorflow/tfjs-node": "4.2.0",
"mapnik": "4.5.9" "mapnik": "4.5.9"
}, },
"license": "Apache-2.0", "license": "Apache-2.0",


@@ -1,3 +1,6 @@
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 'use strict';
 process.env.UV_THREADPOOL_SIZE = 64;


@@ -1,3 +1,6 @@
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 'use strict';
 const os = require('os');
@@ -32,6 +35,7 @@ const outputWebP = fixtures.path('output.webp');
 const width = 720;
 const height = 588;
+const heightPng = 540;
 // Disable libvips cache to ensure tests are as fair as they can be
 sharp.cache(false);
@@ -535,10 +539,10 @@ async.series({
 }
 });
 }
-}).add('sharp-sequentialRead', {
+}).add('sharp-random-access-read', {
 defer: true,
 fn: function (deferred) {
-sharp(inputJpgBuffer, { sequentialRead: true })
+sharp(inputJpgBuffer, { sequentialRead: false })
 .resize(width, height)
 .toBuffer(function (err) {
 if (err) {
@@ -648,7 +652,7 @@ async.series({
 throw err;
 } else {
 image
-.resize(width, height)
+.resize(width, heightPng)
 .deflateLevel(6)
 .filterType(0)
 .getBuffer(jimp.MIME_PNG, function (err) {
@@ -669,7 +673,7 @@ async.series({
 throw err;
 } else {
 image
-.resize(width, height)
+.resize(width, heightPng)
 .deflateLevel(6)
 .filterType(0)
 .write(outputPng, function (err) {
@@ -683,6 +687,59 @@ async.series({
 });
 }
 });
+// squoosh-cli
+pngSuite.add('squoosh-cli-file-file', {
+defer: true,
+fn: function (deferred) {
+exec(`./node_modules/.bin/squoosh-cli \
+--output-dir ${os.tmpdir()} \
+--resize '{"enabled":true,"width":${width},"height":${heightPng},"method":"lanczos3","premultiply":true,"linearRGB":false}' \
+--oxipng '{"level":1}' \
+"${fixtures.inputPngAlphaPremultiplicationLarge}"`, function (err) {
+if (err) {
+throw err;
+}
+deferred.resolve();
+});
+}
+});
+// squoosh-lib (GPLv3)
+pngSuite.add('squoosh-lib-buffer-buffer', {
+defer: true,
+fn: function (deferred) {
+const pool = new squoosh.ImagePool();
+const image = pool.ingestImage(inputPngBuffer);
+image.decoded
+.then(function () {
+return image.preprocess({
+resize: {
+enabled: true,
+width,
+height: heightPng,
+method: 'lanczos3',
+premultiply: true,
+linearRGB: false
+}
+});
+})
+.then(function () {
+return image.encode({
+oxipng: {
+level: 1
+}
+});
+})
+.then(function () {
+return pool.close();
+})
+.then(function () {
+return image.encodedWith.oxipng;
+})
+.then(function () {
+deferred.resolve();
+});
+}
+});
 // mapnik
 mapnik && pngSuite.add('mapnik-file-file', {
 defer: true,
@@ -691,13 +748,13 @@ async.series({
 if (err) throw err;
 img.premultiply(function (err, img) {
 if (err) throw err;
-img.resize(width, height, {
+img.resize(width, heightPng, {
 scaling_method: mapnik.imageScaling.lanczos
 }, function (err, img) {
 if (err) throw err;
 img.demultiply(function (err, img) {
 if (err) throw err;
-img.save(outputPng, 'png', function (err) {
+img.save(outputPng, 'png32:f=no:z=6', function (err) {
 if (err) throw err;
 deferred.resolve();
 });
@@ -713,13 +770,13 @@ async.series({
 if (err) throw err;
 img.premultiply(function (err, img) {
 if (err) throw err;
-img.resize(width, height, {
+img.resize(width, heightPng, {
 scaling_method: mapnik.imageScaling.lanczos
 }, function (err, img) {
 if (err) throw err;
 img.demultiply(function (err, img) {
 if (err) throw err;
-img.encode('png', function (err) {
+img.encode('png32:f=no:z=6', function (err) {
 if (err) throw err;
 deferred.resolve();
 });
@@ -737,7 +794,7 @@ async.series({
 srcPath: fixtures.inputPngAlphaPremultiplicationLarge,
 dstPath: outputPng,
 width: width,
-height: height,
+height: heightPng,
 filter: 'Lanczos',
 customArgs: [
 '-define', 'PNG:compression-level=6',
@@ -758,7 +815,7 @@ async.series({
 fn: function (deferred) {
 gm(fixtures.inputPngAlphaPremultiplicationLarge)
 .filter('Lanczos')
-.resize(width, height)
+.resize(width, heightPng)
 .define('PNG:compression-level=6')
 .define('PNG:compression-filter=0')
 .write(outputPng, function (err) {
@@ -774,7 +831,7 @@ async.series({
 fn: function (deferred) {
 gm(fixtures.inputPngAlphaPremultiplicationLarge)
 .filter('Lanczos')
-.resize(width, height)
+.resize(width, heightPng)
 .define('PNG:compression-level=6')
 .define('PNG:compression-filter=0')
 .toBuffer(function (err) {
@@ -792,7 +849,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(inputPngBuffer)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 6 })
 .toFile(outputPng, function (err) {
 if (err) {
@@ -807,9 +864,9 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(inputPngBuffer)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 6 })
-.toBuffer(function (err) {
+.toBuffer(function (err, data) {
 if (err) {
 throw err;
 } else {
@@ -822,7 +879,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(fixtures.inputPngAlphaPremultiplicationLarge)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 6 })
 .toFile(outputPng, function (err) {
 if (err) {
@@ -837,7 +894,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(fixtures.inputPngAlphaPremultiplicationLarge)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 6 })
 .toBuffer(function (err) {
 if (err) {
@@ -852,7 +909,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(inputPngBuffer)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 6, progressive: true })
 .toBuffer(function (err) {
 if (err) {
@@ -867,7 +924,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(inputPngBuffer)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ adaptiveFiltering: true, compressionLevel: 6 })
 .toBuffer(function (err) {
 if (err) {
@@ -882,7 +939,7 @@ async.series({
 minSamples,
 fn: function (deferred) {
 sharp(inputPngBuffer)
-.resize(width, height)
+.resize(width, heightPng)
 .png({ compressionLevel: 9 })
 .toBuffer(function (err) {
 if (err) {
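The benchmark suites in this file use Benchmark.js deferred mode: each `fn` receives a `deferred` handle and calls `deferred.resolve()` once the asynchronous image work completes. A stripped-down stand-in for that contract (illustrative only, not Benchmark.js):

```javascript
// Minimal stand-in for Benchmark.js `defer: true` mode: the harness hands
// the benchmarked function a `deferred` object and measures elapsed time
// until deferred.resolve() is invoked.
function runDeferred (name, fn) {
  const started = process.hrtime.bigint();
  let elapsedNs = null;
  const deferred = {
    resolve () { elapsedNs = process.hrtime.bigint() - started; }
  };
  fn(deferred);
  return { name, elapsedNs };
}

const result = runDeferred('noop', (deferred) => {
  for (let i = 0; i < 1e5; i++); // stand-in for the image processing work
  deferred.resolve();
});
console.log(result.name, result.elapsedNs >= 0n); // → noop true
```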


@@ -1,3 +1,6 @@
+// Copyright 2013 Lovell Fuller and others.
+// SPDX-License-Identifier: Apache-2.0
 'use strict';
 const imagemagick = require('imagemagick');

BIN test/fixtures/expected/linear-16bit.png (new file, 1.1 KiB, not shown)

Some files were not shown because too many files have changed in this diff.