Compare commits


No commits in common. "5ee83d13e20b2ab78ef148fc58ef18a5cef02f77" and "b7ff2645c4ff5283e47b840a3e6d338fb3ea097b" have entirely different histories.

194 changed files with 1085 additions and 1370 deletions


@ -183,7 +183,7 @@ jobs:
nodejs_version_major: 18
steps:
- uses: actions/checkout@v4
- uses: uraimo/run-on-arch-action@v3
- uses: uraimo/run-on-arch-action@v2
with:
arch: ${{ matrix.run_on_arch }}
distro: ${{ matrix.distro }}
@ -208,7 +208,7 @@ jobs:
contents: write
name: wasm32 - prebuild
runs-on: ubuntu-24.04
container: "emscripten/emsdk:4.0.5"
container: "emscripten/emsdk:3.1.70"
steps:
- name: Checkout
uses: actions/checkout@v4

.gitignore vendored

@ -14,5 +14,3 @@ test/leak/libvips.supp
package-lock.json
.idea
.firebase
.astro
docs/dist


@ -1,6 +1,4 @@
---
title: "High performance Node.js image processing"
---
# sharp
<img src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/sharp-logo.svg" width="160" height="160" alt="sharp logo" align="right">
@ -25,11 +23,7 @@ rotation, extraction, compositing and gamma correction are available.
Most modern macOS, Windows and Linux systems
do not require any additional install or runtime dependencies.
```sh
npm install sharp
```
## Formats
### Formats
This module supports reading JPEG, PNG, WebP, GIF, AVIF, TIFF and SVG images.
@ -43,7 +37,7 @@ Deep Zoom image pyramids can be generated,
suitable for use with "slippy map" tile viewers like
[OpenSeadragon](https://github.com/openseadragon/openseadragon).
## Fast
### Fast
This module is powered by the blazingly fast
[libvips](https://github.com/libvips/libvips) image processing library,
@ -58,7 +52,7 @@ taking full advantage of multiple CPU cores and L1/L2/L3 cache.
Everything remains non-blocking thanks to _libuv_,
no child processes are spawned and Promises/async/await are supported.
## Optimal
### Optimal
The features of `mozjpeg` and `pngquant` can be used
to optimise the file size of JPEG and PNG images respectively,
@ -77,12 +71,12 @@ The file size of animated GIF output is optimised
without having to use separate command line tools such as
[gifsicle](https://www.lcdf.org/gifsicle/).
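For illustration only (not part of this diff), a minimal sketch of how the `mozjpeg` and palette-quantisation features described above are typically enabled through sharp's output options; the option values and file paths are assumptions rather than content from the compared commits:

```js
// Sketch: re-encode with mozjpeg optimisations (JPEG) and palette-based
// quantisation (PNG). Assumes `sharp` is already imported, as in the other
// examples in these docs; 'in.jpg' and 'in.png' are placeholder paths.
await sharp('in.jpg')
  .jpeg({ mozjpeg: true, quality: 80 })
  .toFile('optimised.jpg');

await sharp('in.png')
  .png({ palette: true, quality: 80 })
  .toFile('optimised.png');
```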
## Contributing
### Contributing
A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md)
covers reporting bugs, requesting features and submitting code changes.
## Licensing
### Licensing
Copyright 2013 Lovell Fuller and others.


@ -1,12 +1,7 @@
---
# This file was auto-generated from JSDoc in lib/channel.js
title: Channel manipulation
---
## removeAlpha
> removeAlpha() ⇒ <code>Sharp</code>
Remove alpha channels, if any. This is a no-op if the image does not have an alpha channel.
Remove alpha channel, if any. This is a no-op if the image does not have an alpha channel.
See also [flatten](/api-operation#flatten).
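As a usage sketch (not taken from the diff), with `input` as a placeholder for any image that may carry an alpha channel:

```js
// Drop the alpha channel (a no-op if there is none) before encoding to JPEG,
// which cannot store alpha. `input` is a placeholder Buffer or file path.
const opaque = await sharp(input)
  .removeAlpha()
  .jpeg()
  .toBuffer();
```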


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/colour.js
title: Colour manipulation
---
## tint
> tint(tint) ⇒ <code>Sharp</code>


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/composite.js
title: Compositing images
---
## composite
> composite(images) ⇒ <code>Sharp</code>
@ -51,7 +46,6 @@ and https://www.cairographics.org/operators/
| [images[].input.text.dpi] | <code>number</code> | <code>72</code> | the resolution (size) at which to render the text. Does not take effect if `height` is specified. |
| [images[].input.text.rgba] | <code>boolean</code> | <code>false</code> | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`. |
| [images[].input.text.spacing] | <code>number</code> | <code>0</code> | text line height in points. Will use the font line height if none is specified. |
| [images[].autoOrient] | <code>Boolean</code> | <code>false</code> | set to true to use EXIF orientation data, if present, to orient the image. |
| [images[].blend] | <code>String</code> | <code>&#x27;over&#x27;</code> | how to blend this image with the image below. |
| [images[].gravity] | <code>String</code> | <code>&#x27;centre&#x27;</code> | gravity at which to place the overlay. |
| [images[].top] | <code>Number</code> | | the pixel offset from the top edge. |


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/constructor.js
title: Constructor
---
## Sharp
> Sharp
@ -33,12 +28,11 @@ where the overall height is the `pageHeight` multiplied by the number of `pages`
| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [input] | <code>Buffer</code> \| <code>ArrayBuffer</code> \| <code>Uint8Array</code> \| <code>Uint8ClampedArray</code> \| <code>Int8Array</code> \| <code>Uint16Array</code> \| <code>Int16Array</code> \| <code>Uint32Array</code> \| <code>Int32Array</code> \| <code>Float32Array</code> \| <code>Float64Array</code> \| <code>string</code> \| <code>Array</code> | | if present, can be a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or a TypedArray containing raw pixel image data, or a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. An array of inputs can be provided, and these will be joined together. JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. |
| [input] | <code>Buffer</code> \| <code>ArrayBuffer</code> \| <code>Uint8Array</code> \| <code>Uint8ClampedArray</code> \| <code>Int8Array</code> \| <code>Uint16Array</code> \| <code>Int16Array</code> \| <code>Uint32Array</code> \| <code>Int32Array</code> \| <code>Float32Array</code> \| <code>Float64Array</code> \| <code>string</code> | | if present, can be a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or a TypedArray containing raw pixel image data, or a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file. JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present. |
| [options] | <code>Object</code> | | if present, is an Object with optional attributes. |
| [options.failOn] | <code>string</code> | <code>&quot;&#x27;warning&#x27;&quot;</code> | When to abort processing of invalid pixel data, one of (in order of sensitivity, least to most): 'none', 'truncated', 'error', 'warning'. Higher levels imply lower levels. Invalid metadata will always abort. |
| [options.limitInputPixels] | <code>number</code> \| <code>boolean</code> | <code>268402689</code> | Do not process input images where the number of pixels (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted. An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF). |
| [options.unlimited] | <code>boolean</code> | <code>false</code> | Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF). |
| [options.autoOrient] | <code>boolean</code> | <code>false</code> | Set this to `true` to rotate/flip the image to match EXIF `Orientation`, if any. |
| [options.sequentialRead] | <code>boolean</code> | <code>true</code> | Set this to `false` to use random access rather than sequential read. Some operations will do this automatically. |
| [options.density] | <code>number</code> | <code>72</code> | number representing the DPI for vector images in the range 1 to 100000. |
| [options.ignoreIcc] | <code>number</code> | <code>false</code> | should the embedded ICC profile, if any, be ignored. |
@ -74,13 +68,6 @@ where the overall height is the `pageHeight` multiplied by the number of `pages`
| [options.text.rgba] | <code>boolean</code> | <code>false</code> | set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`. |
| [options.text.spacing] | <code>number</code> | <code>0</code> | text line height in points. Will use the font line height if none is specified. |
| [options.text.wrap] | <code>string</code> | <code>&quot;&#x27;word&#x27;&quot;</code> | word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none'. |
| [options.join] | <code>Object</code> | | describes how an array of input images should be joined. |
| [options.join.across] | <code>number</code> | <code>1</code> | number of images to join horizontally. |
| [options.join.animated] | <code>boolean</code> | <code>false</code> | set this to `true` to join the images as an animated image. |
| [options.join.shim] | <code>number</code> | <code>0</code> | number of pixels to insert between joined images. |
| [options.join.background] | <code>string</code> \| <code>Object</code> | | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
| [options.join.halign] | <code>string</code> | <code>&quot;&#x27;left&#x27;&quot;</code> | horizontal alignment style for images joined horizontally (`'left'`, `'centre'`, `'center'`, `'right'`). |
| [options.join.valign] | <code>string</code> | <code>&quot;&#x27;top&#x27;&quot;</code> | vertical alignment style for images joined vertically (`'top'`, `'centre'`, `'center'`, `'bottom'`). |
**Example**
```js
@ -180,22 +167,6 @@ await sharp({
}
}).toFile('text_rgba.png');
```
**Example**
```js
// Join four input images as a 2x2 grid with a 4 pixel gutter
const data = await sharp(
[image1, image2, image3, image4],
{ join: { across: 2, shim: 4 } }
).toBuffer();
```
**Example**
```js
// Generate a two-frame animated image from emoji
const images = ['😀', '😛'].map(text => ({
text: { text, width: 64, height: 64, channels: 4, rgba: true }
}));
await sharp(images, { join: { animated: true } }).toFile('out.gif');
```
## clone


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/input.js
title: Input metadata
---
## metadata
> metadata([callback]) ⇒ <code>Promise.&lt;Object&gt;</code> \| <code>Sharp</code>
@ -77,9 +72,15 @@ image
```
**Example**
```js
// Get dimensions taking EXIF Orientation into account.
const { autoOrient } = await sharp(input).metadata();
const { width, height } = autoOrient;
// Based on EXIF rotation metadata, get the right-side-up width and height:
const size = getNormalSize(await sharp(input).metadata());
function getNormalSize({ width, height, orientation }) {
return (orientation || 0) >= 5
? { width: height, height: width }
: { width, height };
}
```


@ -1,24 +1,22 @@
---
# This file was auto-generated from JSDoc in lib/operation.js
title: Image operations
---
## rotate
> rotate([angle], [options]) ⇒ <code>Sharp</code>
Rotate the output image.
Rotate the output image by either an explicit angle
or auto-orient based on the EXIF `Orientation` tag.
The provided angle is converted to a valid positive degree rotation.
If an angle is provided, it is converted to a valid positive degree rotation.
For example, `-450` will produce a 270 degree rotation.
When rotating by an angle other than a multiple of 90,
the background colour can be provided with the `background` option.
For backwards compatibility, if no angle is provided, `.autoOrient()` will be called.
If no angle is provided, it is determined from the EXIF data.
Mirroring is supported and may infer the use of a flip operation.
Only one rotation can occur per pipeline (aside from an initial call without
arguments to orient via EXIF data). Previous calls to `rotate` in the same
pipeline will be ignored.
The use of `rotate` without an angle will remove the EXIF `Orientation` tag, if any.
Only one rotation can occur per pipeline.
Previous calls to `rotate` in the same pipeline will be ignored.
Multi-page images can only be rotated by 180 degrees.
@ -37,6 +35,18 @@ for example `.rotate(x).extract(y)` will produce a different result to `.extract
| [options] | <code>Object</code> | | if present, is an Object with optional attributes. |
| [options.background] | <code>string</code> \| <code>Object</code> | <code>&quot;\&quot;#000000\&quot;&quot;</code> | parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha. |
**Example**
```js
const pipeline = sharp()
.rotate()
.resize(null, 200)
.toBuffer(function (err, outputBuffer, info) {
// outputBuffer contains 200px high JPEG image data,
// auto-rotated using EXIF Orientation tag
// info.width and info.height contain the dimensions of the resized image
});
readableStream.pipe(pipeline);
```
**Example**
```js
const rotateThenResize = await sharp(input)
@ -50,34 +60,6 @@ const resizeThenRotate = await sharp(input)
```
## autoOrient
> autoOrient() ⇒ <code>Sharp</code>
Auto-orient based on the EXIF `Orientation` tag, then remove the tag.
Mirroring is supported and may infer the use of a flip operation.
Previous or subsequent use of `rotate(angle)` and either `flip()` or `flop()`
will logically occur after auto-orientation, regardless of call order.
**Example**
```js
const output = await sharp(input).autoOrient().toBuffer();
```
**Example**
```js
const pipeline = sharp()
.autoOrient()
.resize(null, 200)
.toBuffer(function (err, outputBuffer, info) {
// outputBuffer contains 200px high JPEG image data,
// auto-oriented using EXIF Orientation tag
// info.width and info.height contain the dimensions of the resized image
});
readableStream.pipe(pipeline);
```
## flip
> flip([flip]) ⇒ <code>Sharp</code>


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/output.js
title: Output options
---
## toFile
> toFile(fileOut, [callback]) ⇒ <code>Promise.&lt;Object&gt;</code>


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/resize.js
title: Resizing images
---
## resize
> resize([width], [height], [options]) ⇒ <code>Sharp</code>


@ -1,8 +1,3 @@
---
# This file was auto-generated from JSDoc in lib/utility.js
title: Global properties
---
## versions
> versions


@ -1,79 +0,0 @@
// @ts-check
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';
export default defineConfig({
site: 'https://sharp.pixelplumbing.com',
integrations: [
starlight({
title: 'sharp',
description:
'High performance Node.js image processing. The fastest module to resize JPEG, PNG, WebP and TIFF images.',
logo: {
src: './src/assets/sharp-logo.svg',
alt: '#'
},
customCss: ['./src/styles/custom.css'],
head: [{
tag: 'meta',
attrs: {
'http-equiv': 'Content-Security-Policy',
content: "default-src 'self'; connect-src 'self'; object-src 'none'; style-src 'self' 'unsafe-inline'; img-src 'self' data: https://cdn.jsdelivr.net/gh/lovell/; script-src 'self' 'unsafe-inline' 'unsafe-eval' https://static.cloudflareinsights.com/beacon.min.js/;"
}
}, {
tag: 'link',
attrs: {
rel: 'author',
href: '/humans.txt',
type: 'text/plain'
}
}, {
tag: 'script',
attrs: {
type: 'application/ld+json'
},
content: JSON.stringify({
'@context': 'https://schema.org',
'@type': 'SoftwareSourceCode',
name: 'sharp',
description: 'High performance Node.js image processing',
url: 'https://sharp.pixelplumbing.com',
codeRepository: 'https://github.com/lovell/sharp',
programmingLanguage: ['JavaScript', 'C++'],
runtimePlatform: 'Node.js',
copyrightHolder: {
'@context': 'https://schema.org',
'@type': 'Person',
name: 'Lovell Fuller'
},
copyrightYear: 2013,
license: 'https://www.apache.org/licenses/LICENSE-2.0'
})
}],
sidebar: [
{ label: 'Home', link: '/' },
{ label: 'Installation', slug: 'install' },
{
label: 'API',
items: [
{ label: 'Constructor', slug: 'api-constructor' },
{ label: 'Input metadata', slug: 'api-input' },
{ label: 'Output options', slug: 'api-output' },
{ label: 'Resizing images', slug: 'api-resize' },
{ label: 'Compositing images', slug: 'api-composite' },
{ label: 'Image operations', slug: 'api-operation' },
{ label: 'Colour manipulation', slug: 'api-colour' },
{ label: 'Channel manipulation', slug: 'api-channel' },
{ label: 'Global properties', slug: 'api-utility' }
]
},
{ label: 'Performance', slug: 'performance' },
{ label: 'Changelog', slug: 'changelog' }
],
social: {
openCollective: 'https://opencollective.com/libvips',
github: 'https://github.com/lovell/sharp'
}
})
]
});

docs/build.js Normal file

@ -0,0 +1,38 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const fs = require('fs').promises;
const path = require('path');
const jsdoc2md = require('jsdoc-to-markdown');
[
'constructor',
'input',
'resize',
'composite',
'operation',
'colour',
'channel',
'output',
'utility'
].forEach(async (m) => {
const input = path.join('lib', `${m}.js`);
const output = path.join('docs', `api-${m}.md`);
const ast = await jsdoc2md.getTemplateData({ files: input });
const markdown = await jsdoc2md.render({
data: ast,
'global-index-format': 'none',
'module-index-format': 'none'
});
const cleanMarkdown = markdown
.replace(/(## )([A-Za-z0-9]+)([^\n]*)/g, '$1$2\n> $2$3\n') // simplify headings to match those of documentationjs, ensures existing URLs work
.replace(/<a name="[A-Za-z0-9+]+"><\/a>/g, '') // remove anchors, let docute add these (at markdown to HTML render time)
.replace(/\*\*Kind\*\*: global[^\n]+/g, '') // remove all "global" Kind labels (requires JSDoc refactoring)
.trim();
await fs.writeFile(output, cleanMarkdown);
});


@ -1,42 +0,0 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
import fs from 'node:fs/promises';
import path from 'node:path';
import jsdoc2md from 'jsdoc-to-markdown';
const pages = {
constructor: 'Constructor',
input: 'Input metadata',
resize: 'Resizing images',
composite: 'Compositing images',
operation: 'Image operations',
colour: 'Colour manipulation',
channel: 'Channel manipulation',
output: 'Output options',
utility: 'Global properties'
};
Object.keys(pages).forEach(async (m) => {
const input = path.join('lib', `${m}.js`);
const output = path.join('docs', 'src', 'content', 'docs', `api-${m}.md`);
const ast = await jsdoc2md.getTemplateData({ files: input });
const markdown = await jsdoc2md.render({
data: ast,
'global-index-format': 'none',
'module-index-format': 'none'
});
const cleanMarkdown =
`---\n# This file was auto-generated from JSDoc in lib/${m}.js\ntitle: ${pages[m]}\n---\n\n` +
markdown
.replace(/(## )([A-Za-z0-9]+)([^\n]*)/g, '$1$2\n> $2$3\n') // simplify headings
.replace(/<a name="[A-Za-z0-9+]+"><\/a>/g, '') // remove anchors
.replace(/\*\*Kind\*\*: global[^\n]+/g, '') // remove all "global" Kind labels (requires JSDoc refactoring)
.trim();
await fs.writeFile(output, cleanMarkdown);
});


@ -1,19 +1,11 @@
---
title: Changelog
---
# Changelog
## v0.34 - *hat*
Requires libvips v8.16.1
Requires libvips v8.16.0
### v0.34.0 - TBD
* Breaking: Support array of input images to be joined or animated.
[#1580](https://github.com/lovell/sharp/issues/1580)
* Breaking: Ensure `removeAlpha` removes all alpha channels.
[#2266](https://github.com/lovell/sharp/issues/2266)
* Breaking: Support `info.size` on wide-character systems via upgrade to C++17.
[#3943](https://github.com/lovell/sharp/issues/3943)
@ -24,10 +16,6 @@ Requires libvips v8.16.1
* Expose WebP `smartDeblock` output option.
* Add `autoOrient` operation and constructor option.
[#4151](https://github.com/lovell/sharp/pull/4151)
[@happycollision](https://github.com/happycollision)
* TypeScript: Ensure channel counts use the correct range.
[#4197](https://github.com/lovell/sharp/pull/4197)
[@DavidVaness](https://github.com/DavidVaness)
@ -40,10 +28,6 @@ Requires libvips v8.16.1
[#4207](https://github.com/lovell/sharp/pull/4207)
[@calebmer](https://github.com/calebmer)
* Add support for RGBE images. Requires libvips compiled with radiance support.
[#4316](https://github.com/lovell/sharp/pull/4316)
[@florentzabera](https://github.com/florentzabera)
## v0.33 - *gauge*
Requires libvips v8.15.3

docs/docute.min.js vendored Normal file

File diff suppressed because one or more lines are too long

@ -1,7 +1,14 @@
{
"hosting": {
"site": "pixelplumbing-sharp",
"public": "dist",
"public": ".",
"ignore": [
".*",
"build.js",
"firebase.json",
"image/**",
"search-index/**"
],
"headers": [
{
"source": "**",


@ -308,9 +308,3 @@ GitHub: https://github.com/sumitd2
Name: Caleb Meredith
GitHub: https://github.com/calebmer
Name: Don Denton
GitHub: https://github.com/happycollision
Name: Florent Zabera
GitHub: https://github.com/florentzabera

Binary image files changed: one modified (4.0 KiB before and after), two added (652 B and 2.8 KiB), and two further images listed at 929 B and 508 B before and after; image content not shown.

docs/index.html Normal file

File diff suppressed because one or more lines are too long

@ -1,12 +1,8 @@
---
title: Installation
---
# Installation
Works with your choice of JavaScript package manager.
:::caution
Please ensure your package manager is configured to install optional dependencies
:::
> ⚠️ **Please ensure your package manager is configured to install optional dependencies**
If a package manager lockfile must support multiple platforms,
please see the [cross-platform](#cross-platform) section
@ -20,10 +16,6 @@ npm install sharp
pnpm add sharp
```
When using `pnpm`, you may need to add `sharp` to
[ignoredBuiltDependencies](https://pnpm.io/package_json#pnpmignoredbuiltdependencies)
to silence warnings.
```sh
yarn add sharp
```
@ -67,9 +59,7 @@ within the same installation tree and/or using the same lockfile.
### npm v10+
:::caution
npm `package-lock.json` files shared by multiple platforms can cause installation problems due to [npm bug #4828](https://github.com/npm/cli/issues/4828)
:::
> ⚠️ **npm `package-lock.json` files can cause installation problems due to [npm bug #4828](https://github.com/npm/cli/issues/4828)**
Provides limited support via `--os`, `--cpu` and `--libc` flags.
@ -132,10 +122,6 @@ If `node-addon-api` or `node-gyp` cannot be found, try adding them via:
npm install --save node-addon-api node-gyp
```
When using `pnpm`, you may need to add `sharp` to
[onlyBuiltDependencies](https://pnpm.io/package_json#pnpmonlybuiltdependencies)
to ensure the installation script can be run.
For cross-compiling, the `--platform`, `--arch` and `--libc` npm flags
(or the `npm_config_platform`, `npm_config_arch` and `npm_config_libc` environment variables)
can be used to configure the target environment.


@ -1,17 +0,0 @@
{
"name": "sharp-docs",
"type": "module",
"version": "0.0.1",
"private": true,
"scripts": {
"dev": "astro dev",
"start": "astro dev",
"build": "astro build",
"preview": "astro preview",
"astro": "astro"
},
"dependencies": {
"@astrojs/starlight": "^0.31.0",
"astro": "^5.1.7"
}
}

docs/performance.md Normal file

@ -0,0 +1,111 @@
# Performance
A test to benchmark the performance of this module relative to alternatives.
Greater libvips performance can be expected with caching enabled (default)
and using 8+ core machines, especially those with larger L1/L2 CPU caches.
The I/O limits of the relevant (de)compression library will generally determine maximum throughput.
## Contenders
* [jimp](https://www.npmjs.com/package/jimp) v0.22.10 - Image processing in pure JavaScript.
* [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "*has been unmaintained for a long time*".
* [gm](https://www.npmjs.com/package/gm) v1.25.0 - Fully featured wrapper around GraphicsMagick's `gm` command line utility.
* [@squoosh/lib](https://www.npmjs.com/package/@squoosh/lib) v0.5.3 - Image libraries transpiled to WebAssembly, includes GPLv3 code, but "*Project no longer maintained*".
* [@squoosh/cli](https://www.npmjs.com/package/@squoosh/cli) v0.7.3 - Command line wrapper around `@squoosh/lib`, avoids GPLv3 by spawning process, but "*Project no longer maintained*".
* sharp v0.33.0 / libvips v8.15.0 - Caching within libvips disabled to ensure a fair comparison.
## Environment
### AMD64
* AWS EC2 us-east-2 [c7a.xlarge](https://aws.amazon.com/ec2/instance-types/c7a/) (4x AMD EPYC 9R14)
* Ubuntu 23.10 [13f233a16be2](https://hub.docker.com/layers/library/ubuntu/23.10/images/sha256-13f233a16be210b57907b98b0d927ceff7571df390701e14fe1f3901b2c4a4d7)
* Node.js 20.10.0
### ARM64
* AWS EC2 us-east-2 [c7g.xlarge](https://aws.amazon.com/ec2/instance-types/c7g/) (4x ARM Graviton3)
* Ubuntu 23.10 [7708743264cb](https://hub.docker.com/layers/library/ubuntu/23.10/images/sha256-7708743264cbb7f6cf7fc13e915faece45a6cdda455748bc55e58e8de3d27b63)
* Node.js 20.10.0
## Task: JPEG
Decompress a 2725x2225 JPEG image,
resize to 720x588 using Lanczos 3 resampling (where available),
then compress to JPEG at a "quality" setting of 80.
Note: jimp does not support Lanczos 3, bicubic resampling used instead.
#### Results: JPEG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 0.84 | 1.0 |
| squoosh-cli | file | file | 1.54 | 1.8 |
| squoosh-lib | buffer | buffer | 2.24 | 2.7 |
| imagemagick | file | file | 11.75 | 14.0 |
| gm | buffer | buffer | 12.66 | 15.1 |
| gm | file | file | 12.72 | 15.1 |
| sharp | stream | stream | 48.31 | 57.5 |
| sharp | file | file | 51.42 | 61.2 |
| sharp | buffer | buffer | 52.41 | 62.4 |
#### Results: JPEG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 0.88 | 1.0 |
| squoosh-cli | file | file | 1.18 | 1.3 |
| squoosh-lib | buffer | buffer | 1.99 | 2.3 |
| gm | buffer | buffer | 6.06 | 6.9 |
| gm | file | file | 10.81 | 12.3 |
| imagemagick | file | file | 10.95 | 12.4 |
| sharp | stream | stream | 33.15 | 37.7 |
| sharp | file | file | 34.99 | 39.8 |
| sharp | buffer | buffer | 36.05 | 41.0 |
## Task: PNG
Decompress a 2048x1536 RGBA PNG image,
premultiply the alpha channel,
resize to 720x540 using Lanczos 3 resampling (where available),
unpremultiply then compress as PNG with a "default" zlib compression level of 6
and without adaptive filtering.
Note: jimp does not support premultiply/unpremultiply.
### Results: PNG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| squoosh-cli | file | file | 0.34 | 1.0 |
| squoosh-lib | buffer | buffer | 0.51 | 1.5 |
| jimp | buffer | buffer | 3.59 | 10.6 |
| gm | file | file | 8.54 | 25.1 |
| imagemagick | file | file | 9.23 | 27.1 |
| sharp | file | file | 25.43 | 74.8 |
| sharp | buffer | buffer | 25.70 | 75.6 |
### Results: PNG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| squoosh-cli | file | file | 0.33 | 1.0 |
| squoosh-lib | buffer | buffer | 0.46 | 1.4 |
| jimp | buffer | buffer | 3.51 | 10.6 |
| gm | file | file | 7.47 | 22.6 |
| imagemagick | file | file | 8.06 | 24.4 |
| sharp | file | file | 17.31 | 52.5 |
| sharp | buffer | buffer | 17.66 | 53.5 |
## Running the benchmark test
Requires Docker.
```sh
git clone https://github.com/lovell/sharp.git
cd sharp/test/bench
./run-with-docker.sh
```


@ -1,5 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="86 86 550 550">
<!-- Creative Commons CC0 1.0 Universal Public Domain Dedication -->
<path fill="none" stroke="#9c0" stroke-width="80" d="M258.411 285.777l200.176-26.8M244.113 466.413L451.44 438.66M451.441 438.66V238.484M451.441 88.363v171.572l178.725-23.917M270.323 255.602V477.22M272.71 634.17V462.591L93.984 486.515"/>
<path fill="none" stroke="#090" stroke-width="80" d="M451.441 610.246V438.66l178.725-23.91M269.688 112.59v171.58L90.964 308.093"/>
</svg>


@ -1,4 +0,0 @@
User-agent: *
Disallow:
Sitemap: https://sharp.pixelplumbing.com/sitemap-index.xml

docs/robots.txt Normal file

@ -0,0 +1,2 @@
User-agent: *
Disallow:

docs/search-index.json Normal file

File diff suppressed because one or more lines are too long

@ -0,0 +1,64 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const fs = require('fs');
const path = require('path');
const { extractDescription, extractKeywords, extractParameters } = require('./extract');
const searchIndex = [];
// Install
const contents = fs.readFileSync(path.join(__dirname, '..', 'install.md'), 'utf8');
const matches = contents.matchAll(
/## (?<title>[A-Za-z0-9 ]+)\n\n(?<body>[^#]+)/gs
);
for (const match of matches) {
const { title, body } = match.groups;
const description = extractDescription(body);
searchIndex.push({
t: title,
d: description,
k: extractKeywords(`${title} ${description}`),
l: `/install#${title.toLowerCase().replace(/ /g, '-')}`
});
}
// API
[
'constructor',
'input',
'output',
'resize',
'composite',
'operation',
'channel',
'colour',
'utility'
].forEach((section) => {
const contents = fs.readFileSync(path.join(__dirname, '..', `api-${section}.md`), 'utf8');
const matches = contents.matchAll(
/## (?<title>[A-Za-z]+)\n[^\n]+\n(?<firstparagraph>.+?)\n\n.+?(?<parameters>\| Param .+?\n\n)?\*\*Example/gs
);
for (const match of matches) {
const { title, firstparagraph, parameters } = match.groups;
const description = firstparagraph.startsWith('###')
? 'Constructor'
: extractDescription(firstparagraph);
const parameterNames = parameters ? extractParameters(parameters) : '';
searchIndex.push({
t: title,
d: description,
k: extractKeywords(`${title} ${description} ${parameterNames}`),
l: `/api-${section}#${title.toLowerCase()}`
});
}
});
fs.writeFileSync(
path.join(__dirname, '..', 'search-index.json'),
JSON.stringify(searchIndex)
);


@ -0,0 +1,34 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
const stopWords = require('./stop-words');
const extractDescription = (str) =>
str
.replace(/### Examples.*/sg, '')
.replace(/\(http[^)]+/g, '')
.replace(/\s+/g, ' ')
.replace(/[^A-Za-z0-9_/\-,. ]/g, '')
.replace(/\s+/g, ' ')
.substring(0, 200)
.trim();
const extractParameters = (str) =>
[...str.matchAll(/options\.(?<name>[^.`\] ]+)/gs)]
.map((match) => match.groups.name)
.map((name) => name.replace(/([A-Z])/g, ' $1').toLowerCase())
.join(' ');
const extractKeywords = (str) =>
[
...new Set(
str
.split(/[ -/]/)
.map((word) => word.toLowerCase().replace(/[^a-z]/g, ''))
.filter((word) => word.length > 2 && word.length < 15 && !stopWords.includes(word))
)
].join(' ');
module.exports = { extractDescription, extractKeywords, extractParameters };
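A hypothetical check of the helpers above (not part of the diff), showing how `extractKeywords` collapses a sentence into the compact keyword string stored in the search index, given the stop-word list in the next file:

```js
// Assumes this helper file is saved alongside the build script as ./extract.js.
const { extractKeywords } = require('./extract');

// Splits on spaces, hyphens and slashes, lowercases, strips non-letters, then
// drops short words and stop words such as 'the', 'image' and 'given'.
console.log(extractKeywords('Resize the image to width and height, using the given fit.'));
// => 'resize width height fit'
```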


@ -0,0 +1,140 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0
'use strict';
module.exports = [
'about',
'after',
'all',
'allows',
'already',
'also',
'alternative',
'always',
'and',
'any',
'are',
'available',
'based',
'been',
'before',
'best',
'both',
'call',
'callback',
'can',
'containing',
'contains',
'created',
'current',
'date',
'default',
'deprecated',
'does',
'each',
'either',
'ensure',
'entirely',
'etc',
'every',
'except',
'following',
'for',
'from',
'get',
'gets',
'given',
'has',
'have',
'helps',
'how',
'image',
'implies',
'include',
'including',
'involve',
'its',
'last',
'least',
'lots',
'make',
'may',
'meaning',
'more',
'most',
'much',
'must',
'non',
'not',
'now',
'occur',
'occurs',
'one',
'options',
'other',
'out',
'over',
'part',
'perform',
'performs',
'please',
'pre',
'previously',
'produce',
'proper',
'provide',
'provided',
'ready',
'requires',
'requiresharp',
'returned',
'run',
'same',
'see',
'set',
'sets',
'sharp',
'should',
'since',
'site',
'some',
'specified',
'spelling',
'such',
'support',
'supported',
'sure',
'take',
'task',
'than',
'that',
'the',
'their',
'then',
'there',
'therefore',
'these',
'this',
'under',
'unless',
'unmaintained',
'unsuitable',
'unsupported',
'until',
'use',
'used',
'using',
'value',
'values',
'via',
'were',
'when',
'which',
'while',
'will',
'with',
'without',
'you',
'your'
];


@ -1,5 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="86 86 550 550">
<!-- Creative Commons CC0 1.0 Universal Public Domain Dedication -->
<path fill="none" stroke="#9c0" stroke-width="80" d="M258.411 285.777l200.176-26.8M244.113 466.413L451.44 438.66M451.441 438.66V238.484M451.441 88.363v171.572l178.725-23.917M270.323 255.602V477.22M272.71 634.17V462.591L93.984 486.515"/>
<path fill="none" stroke="#090" stroke-width="80" d="M451.441 610.246V438.66l178.725-23.91M269.688 112.59v171.58L90.964 308.093"/>
</svg>


@ -1,7 +0,0 @@
import { defineCollection } from 'astro:content';
import { docsLoader } from '@astrojs/starlight/loaders';
import { docsSchema } from '@astrojs/starlight/schema';
export const collections = {
docs: defineCollection({ loader: docsLoader(), schema: docsSchema() }),
};


@ -1,103 +0,0 @@
---
title: Performance
---
A test to benchmark the performance of this module relative to alternatives.
Greater libvips performance can be expected with caching enabled (default)
and using 8+ core machines, especially those with larger L1/L2 CPU caches.
The I/O limits of the relevant (de)compression library will generally determine maximum throughput.
## Contenders
* [jimp](https://www.npmjs.com/package/jimp) v1.6.0 - Image processing in pure JavaScript.
* [imagemagick](https://www.npmjs.com/package/imagemagick) v0.1.3 - Supports filesystem only and "*has been unmaintained for a long time*".
* [gm](https://www.npmjs.com/package/gm) v1.25.1 - Fully featured wrapper around GraphicsMagick's `gm` command line utility, but "*has been sunset*".
* sharp v0.34.0 / libvips v8.16.1 - Caching within libvips disabled to ensure a fair comparison.
## Environment
### AMD64
* AWS EC2 us-west-2 [c7a.xlarge](https://aws.amazon.com/ec2/instance-types/c7a/) (4x AMD EPYC 9R14)
* Ubuntu 24.10 [fad5ba7223f8](https://hub.docker.com/layers/library/ubuntu/24.10/images/sha256-fad5ba7223f8d87179dfa23211d31845d47e07a474ac31ad5258afb606523c0d)
* Node.js 22.14.0
### ARM64
* AWS EC2 us-west-2 [c8g.xlarge](https://aws.amazon.com/ec2/instance-types/c8g/) (4x ARM Graviton4)
* Ubuntu 24.10 [133f2e05cb69](https://hub.docker.com/layers/library/ubuntu/24.10/images/sha256-133f2e05cb6958c3ce7ec870fd5a864558ba780fb7062315b51a23670bff7e76)
* Node.js 22.14.0
## Task: JPEG
Decompress a 2725x2225 JPEG image,
resize to 720x588 using Lanczos 3 resampling (where available),
then compress to JPEG at a "quality" setting of 80.
Note: jimp does not support Lanczos 3, bicubic resampling used instead.
#### Results: JPEG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 2.35 | 1.0 |
| imagemagick | file | file | 10.51 | 4.5 |
| gm | buffer | buffer | 11.67 | 5.0 |
| gm | file | file | 11.75 | 5.1 |
| sharp | stream | stream | 60.72 | 25.8 |
| sharp | file | file | 62.37 | 26.5 |
| sharp | buffer | buffer | 65.15 | 27.7 |
#### Results: JPEG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| jimp | buffer | buffer | 2.13 | 1.0 |
| imagemagick | file | file | 12.95 | 6.1 |
| gm | buffer | buffer | 13.53 | 6.4 |
| gm | file | file | 13.52 | 6.4 |
| sharp | stream | stream | 46.58 | 21.9 |
| sharp | file | file | 48.42 | 22.7 |
| sharp | buffer | buffer | 50.16 | 23.6 |
## Task: PNG
Decompress a 2048x1536 RGBA PNG image,
premultiply the alpha channel,
resize to 720x540 using Lanczos 3 resampling (where available),
unpremultiply then compress as PNG with a "default" zlib compression level of 6
and without adaptive filtering.
Note: jimp does not support premultiply/unpremultiply.
### Results: PNG (AMD64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| gm | file | file | 8.66 | 1.0 |
| imagemagick | file | file | 8.79 | 1.0 |
| jimp | buffer | buffer | 11.26 | 1.3 |
| sharp | file | file | 27.93 | 3.2 |
| sharp | buffer | buffer | 28.69 | 3.3 |
### Results: PNG (ARM64)
| Module | Input | Output | Ops/sec | Speed-up |
| :----------------- | :----- | :----- | ------: | -------: |
| gm | file | file | 9.65 | 1.0 |
| imagemagick | file | file | 9.72 | 1.0 |
| jimp | buffer | buffer | 10.68 | 1.1 |
| sharp | file | file | 23.90 | 2.5 |
| sharp | buffer | buffer | 24.48 | 2.5 |
## Running the benchmark test
Requires Docker.
```sh
git clone https://github.com/lovell/sharp.git
cd sharp/test/bench
./run-with-docker.sh
```


@ -1,45 +0,0 @@
@view-transition {
navigation: auto;
}
:root {
--sl-content-width: 60rem;
--sl-color-accent-low: #072d00;
--sl-color-accent: #247f00;
--sl-color-accent-high: #aad7a0;
--sl-color-white: #ffffff;
--sl-color-gray-1: #eaf0e8;
--sl-color-gray-2: #c5cdc3;
--sl-color-gray-3: #99a796;
--sl-color-gray-4: #4f5c4d;
--sl-color-gray-5: #303c2d;
--sl-color-gray-6: #1f2a1c;
--sl-color-black: #151a13;
}
:root[data-theme="light"] {
--sl-color-accent-low: #c0e2b8;
--sl-color-accent: #165800;
--sl-color-accent-high: #0d3e00;
--sl-color-white: #151a13;
--sl-color-gray-1: #1f2a1c;
--sl-color-gray-2: #303c2d;
--sl-color-gray-3: #4f5c4d;
--sl-color-gray-4: #82907f;
--sl-color-gray-5: #bdc4bb;
--sl-color-gray-6: #eaf0e8;
--sl-color-gray-7: #f4f7f3;
--sl-color-black: #ffffff;
}
blockquote {
background-color: var(--sl-color-gray-6);
padding: 1rem;
}
.site-title::after {
content: "High performance Node.js image processing";
color: var(--sl-color-text);
font-size: var(--sl-text-sm);
padding-top: 0.3rem;
}


@ -1,5 +0,0 @@
{
"extends": "astro/tsconfigs/strict",
"include": [".astro/types.d.ts", "**/*"],
"exclude": ["dist"]
}


@ -16,7 +16,7 @@ const bool = {
};
/**
* Remove alpha channels, if any. This is a no-op if the image does not have an alpha channel.
* Remove alpha channel, if any. This is a no-op if the image does not have an alpha channel.
*
* See also {@link /api-operation#flatten|flatten}.
*


@ -110,7 +110,6 @@ const blend = {
* @param {number} [images[].input.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified.
* @param {boolean} [images[].input.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`.
* @param {number} [images[].input.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
* @param {Boolean} [images[].autoOrient=false] - set to true to use EXIF orientation data, if present, to orient the image.
* @param {String} [images[].blend='over'] - how to blend this image with the image below.
* @param {String} [images[].gravity='centre'] - gravity at which to place the overlay.
* @param {Number} [images[].top] - the pixel offset from the top edge.


@ -121,25 +121,10 @@ const debuglog = util.debuglog('sharp');
* }
* }).toFile('text_rgba.png');
*
* @example
* // Join four input images as a 2x2 grid with a 4 pixel gutter
* const data = await sharp(
* [image1, image2, image3, image4],
* { join: { across: 2, shim: 4 } }
* ).toBuffer();
*
* @example
* // Generate a two-frame animated image from emoji
* const images = ['😀', '😛'].map(text => ({
* text: { text, width: 64, height: 64, channels: 4, rgba: true }
* }));
* await sharp(images, { join: { animated: true } }).toFile('out.gif');
*
* @param {(Buffer|ArrayBuffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string|Array)} [input] - if present, can be
* @param {(Buffer|ArrayBuffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string)} [input] - if present, can be
* a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or
* a TypedArray containing raw pixel image data, or
* a String containing the filesystem path to a JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file.
* An array of inputs can be provided, and these will be joined together.
* JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
* @param {Object} [options] - if present, is an Object with optional attributes.
* @param {string} [options.failOn='warning'] - When to abort processing of invalid pixel data, one of (in order of sensitivity, least to most): 'none', 'truncated', 'error', 'warning'. Higher levels imply lower levels. Invalid metadata will always abort.
@ -147,7 +132,6 @@ const debuglog = util.debuglog('sharp');
* (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted.
* An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF).
* @param {boolean} [options.unlimited=false] - Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF).
* @param {boolean} [options.autoOrient=false] - Set this to `true` to rotate/flip the image to match EXIF `Orientation`, if any.
* @param {boolean} [options.sequentialRead=true] - Set this to `false` to use random access rather than sequential read. Some operations will do this automatically.
* @param {number} [options.density=72] - number representing the DPI for vector images in the range 1 to 100000.
* @param {number} [options.ignoreIcc=false] - should the embedded ICC profile, if any, be ignored.
@ -184,14 +168,6 @@ const debuglog = util.debuglog('sharp');
* @param {boolean} [options.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`.
* @param {number} [options.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
* @param {string} [options.text.wrap='word'] - word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none'.
* @param {Object} [options.join] - describes how an array of input images should be joined.
* @param {number} [options.join.across=1] - number of images to join horizontally.
* @param {boolean} [options.join.animated=false] - set this to `true` to join the images as an animated image.
* @param {number} [options.join.shim=0] - number of pixels to insert between joined images.
* @param {string|Object} [options.join.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
* @param {string} [options.join.halign='left'] - horizontal alignment style for images joined horizontally (`'left'`, `'centre'`, `'center'`, `'right'`).
* @param {string} [options.join.valign='top'] - vertical alignment style for images joined vertically (`'top'`, `'centre'`, `'center'`, `'bottom'`).
*
* @returns {Sharp}
* @throws {Error} Invalid parameters
*/
@ -218,6 +194,7 @@ const Sharp = function (input, options) {
canvas: 'crop',
position: 0,
resizeBackground: [0, 0, 0, 255],
useExifOrientation: false,
angle: 0,
rotationAngle: 0,
rotationBackground: [0, 0, 0, 255],

lib/index.d.ts vendored

@ -40,7 +40,19 @@ import { Duplex } from 'stream';
*/
declare function sharp(options?: sharp.SharpOptions): sharp.Sharp;
declare function sharp(
input?: sharp.SharpInput | Array<sharp.SharpInput>,
input?:
| Buffer
| ArrayBuffer
| Uint8Array
| Uint8ClampedArray
| Int8Array
| Uint16Array
| Int16Array
| Uint32Array
| Int32Array
| Float32Array
| Float64Array
| string,
options?: sharp.SharpOptions,
): sharp.Sharp;
@ -50,35 +62,33 @@ declare namespace sharp {
/** An Object containing the version numbers of sharp, libvips and its dependencies. */
const versions: {
aom?: string | undefined;
archive?: string | undefined;
vips: string;
cairo?: string | undefined;
cgif?: string | undefined;
croco?: string | undefined;
exif?: string | undefined;
expat?: string | undefined;
ffi?: string | undefined;
fontconfig?: string | undefined;
freetype?: string | undefined;
fribidi?: string | undefined;
gdkpixbuf?: string | undefined;
gif?: string | undefined;
glib?: string | undefined;
gsf?: string | undefined;
harfbuzz?: string | undefined;
heif?: string | undefined;
highway?: string | undefined;
imagequant?: string | undefined;
jpeg?: string | undefined;
lcms?: string | undefined;
mozjpeg?: string | undefined;
orc?: string | undefined;
pango?: string | undefined;
pixman?: string | undefined;
png?: string | undefined;
"proxy-libintl"?: string | undefined;
rsvg?: string | undefined;
sharp: string;
spng?: string | undefined;
sharp?: string | undefined;
svg?: string | undefined;
tiff?: string | undefined;
vips: string;
webp?: string | undefined;
avif?: string | undefined;
heif?: string | undefined;
xml?: string | undefined;
"zlib-ng"?: string | undefined;
zlib?: string | undefined;
};
/** An Object containing the available interpolators and their proper values */
@ -354,72 +364,24 @@ declare namespace sharp {
//#region Operation functions
/**
* Rotate the output image by either an explicit angle
* or auto-orient based on the EXIF `Orientation` tag.
* Rotate the output image by either an explicit angle or auto-orient based on the EXIF Orientation tag.
*
* If an angle is provided, it is converted to a valid positive degree rotation.
* For example, `-450` will produce a 270 degree rotation.
* If an angle is provided, it is converted to a valid positive degree rotation. For example, -450 will produce a 270deg rotation.
*
* When rotating by an angle other than a multiple of 90,
* the background colour can be provided with the `background` option.
* When rotating by an angle other than a multiple of 90, the background colour can be provided with the background option.
*
* If no angle is provided, it is determined from the EXIF data.
* Mirroring is supported and may infer the use of a flip operation.
* If no angle is provided, it is determined from the EXIF data. Mirroring is supported and may infer the use of a flip operation.
*
* The use of `rotate` without an angle will remove the EXIF `Orientation` tag, if any.
* The use of rotate implies the removal of the EXIF Orientation tag, if any.
*
* Only one rotation can occur per pipeline (aside from an initial call without
* arguments to orient via EXIF data). Previous calls to `rotate` in the same
* pipeline will be ignored.
*
* Multi-page images can only be rotated by 180 degrees.
*
* Method order is important when rotating, resizing and/or extracting regions,
* for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`.
*
* @example
* const pipeline = sharp()
* .rotate()
* .resize(null, 200)
* .toBuffer(function (err, outputBuffer, info) {
* // outputBuffer contains 200px high JPEG image data,
* // auto-rotated using EXIF Orientation tag
* // info.width and info.height contain the dimensions of the resized image
* });
* readableStream.pipe(pipeline);
*
* @example
* const rotateThenResize = await sharp(input)
* .rotate(90)
* .resize({ width: 16, height: 8, fit: 'fill' })
* .toBuffer();
* const resizeThenRotate = await sharp(input)
* .resize({ width: 16, height: 8, fit: 'fill' })
* .rotate(90)
* .toBuffer();
*
* @param {number} [angle=auto] angle of rotation.
* @param {Object} [options] - if present, is an Object with optional attributes.
* @param {string|Object} [options.background="#000000"] parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
* @returns {Sharp}
* Method order is important when both rotating and extracting regions, for example rotate(x).extract(y) will produce a different result to extract(y).rotate(x).
* @param angle angle of rotation. (optional, default auto)
* @param options if present, is an Object with optional attributes.
* @throws {Error} Invalid parameters
* @returns A sharp instance that can be used to chain operations
*/
rotate(angle?: number, options?: RotateOptions): Sharp;
/**
* Alias for calling `rotate()` with no arguments, which orients the image based
* on EXIF orientation.
*
* This operation is aliased to emphasize its purpose, helping to remove any
* confusion between rotation and orientation.
*
* @example
* const output = await sharp(input).autoOrient().toBuffer();
*
* @returns {Sharp}
*/
autoOrient(): Sharp
/**
* Flip the image about the vertical Y axis. This always occurs after rotation, if any.
* The use of flip implies the removal of the EXIF Orientation tag, if any.
@ -935,27 +897,7 @@ declare namespace sharp {
//#endregion
}
type SharpInput = Buffer
| ArrayBuffer
| Uint8Array
| Uint8ClampedArray
| Int8Array
| Uint16Array
| Int16Array
| Uint32Array
| Int32Array
| Float32Array
| Float64Array
| string;
interface SharpOptions {
/**
* Auto-orient based on the EXIF `Orientation` tag, if present.
* Mirroring is supported and may infer the use of a flip operation.
*
* Using this option will remove the EXIF `Orientation` tag, if any.
*/
autoOrient?: boolean;
/**
* When to abort processing of invalid pixel data, one of (in order of sensitivity):
* 'none' (least), 'truncated', 'error' or 'warning' (most), higher levels imply lower levels, invalid metadata will always abort. (optional, default 'warning')
@ -1001,8 +943,6 @@ declare namespace sharp {
create?: Create | undefined;
/** Describes a new text image to be created. */
text?: CreateText | undefined;
/** Describes how array of input images should be joined. */
join?: Join | undefined;
}
interface CacheOptions {
@ -1083,21 +1023,6 @@ declare namespace sharp {
wrap?: TextWrap;
}
interface Join {
/** Number of images per row. */
across?: number | undefined;
/** Treat input as frames of an animated image. */
animated?: boolean | undefined;
/** Space between images, in pixels. */
shim?: number | undefined;
/** Background colour. */
background?: Colour | Color | undefined;
/** Horizontal alignment. */
halign?: HorizontalAlignment | undefined;
/** Vertical alignment. */
valign?: VerticalAlignment | undefined;
}
interface ExifDir {
[k: string]: string;
}
@ -1137,13 +1062,6 @@ declare namespace sharp {
width?: number | undefined;
/** Number of pixels high (EXIF orientation is not taken into consideration) */
height?: number | undefined;
/** Any changed metadata after the image orientation is applied. */
autoOrient: {
/** Number of pixels wide (EXIF orientation is taken into consideration) */
width: number;
/** Number of pixels high (EXIF orientation is taken into consideration) */
height: number;
};
/** Name of colour space interpretation */
space?: keyof ColourspaceEnum | undefined;
/** Number of bands e.g. 3 for sRGB, 4 for CMYK */
@ -1594,8 +1512,6 @@ declare namespace sharp {
failOn?: FailOnOptions | undefined;
/** see sharp() constructor, (optional, default 268402689) */
limitInputPixels?: number | boolean | undefined;
/** see sharp() constructor, (optional, default false) */
autoOrient?: boolean | undefined;
}
interface TileOptions {
@ -1736,10 +1652,6 @@ declare namespace sharp {
type TextWrap = 'word' | 'char' | 'word-char' | 'none';
type HorizontalAlignment = 'left' | 'centre' | 'center' | 'right';
type VerticalAlignment = 'top' | 'centre' | 'center' | 'bottom';
type TileContainer = 'fs' | 'zip';
type TileLayout = 'dz' | 'iiif' | 'iiif3' | 'zoomify' | 'google';


@ -14,13 +14,9 @@ const sharp = require('./sharp');
*/
const align = {
left: 'low',
top: 'low',
low: 'low',
center: 'centre',
centre: 'centre',
right: 'high',
bottom: 'high',
high: 'high'
right: 'high'
};
/**
@ -28,9 +24,9 @@ const align = {
* @private
*/
function _inputOptionsFromObject (obj) {
const { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground, autoOrient } = obj;
return [raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground, autoOrient].some(is.defined)
? { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground, autoOrient }
const { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground } = obj;
return [raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground].some(is.defined)
? { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd, pdfBackground }
: undefined;
}
@ -40,7 +36,6 @@ function _inputOptionsFromObject (obj) {
*/
function _createInputDescriptor (input, inputOptions, containerOptions) {
const inputDescriptor = {
autoOrient: false,
failOn: 'warning',
limitInputPixels: Math.pow(0x3FFF, 2),
ignoreIcc: false,
@ -76,18 +71,6 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
} else if (!is.defined(input) && !is.defined(inputOptions) && is.object(containerOptions) && containerOptions.allowStream) {
// Stream without options
inputDescriptor.buffer = [];
} else if (Array.isArray(input)) {
if (input.length > 1) {
// Join images together
if (!this.options.joining) {
this.options.joining = true;
this.options.join = input.map(i => this._createInputDescriptor(i));
} else {
throw new Error('Recursive join is unsupported');
}
} else {
throw new Error('Expected at least two images to join');
}
} else {
throw new Error(`Unsupported input '${input}' of type ${typeof input}${
is.defined(inputOptions) ? ` when also providing options of type ${typeof inputOptions}` : ''
@ -110,14 +93,6 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
throw is.invalidParameterError('failOn', 'one of: none, truncated, error, warning', inputOptions.failOn);
}
}
// autoOrient
if (is.defined(inputOptions.autoOrient)) {
if (is.bool(inputOptions.autoOrient)) {
inputDescriptor.autoOrient = inputOptions.autoOrient;
} else {
throw is.invalidParameterError('autoOrient', 'boolean', inputOptions.autoOrient);
}
}
// Density
if (is.defined(inputOptions.density)) {
if (is.inRange(inputOptions.density, 1, 100000)) {
@ -385,57 +360,6 @@ function _createInputDescriptor (input, inputOptions, containerOptions) {
throw new Error('Expected a valid string to create an image with text.');
}
}
// Join images together
if (is.defined(inputOptions.join)) {
if (is.defined(this.options.join)) {
if (is.defined(inputOptions.join.animated)) {
if (is.bool(inputOptions.join.animated)) {
inputDescriptor.joinAnimated = inputOptions.join.animated;
} else {
throw is.invalidParameterError('join.animated', 'boolean', inputOptions.join.animated);
}
}
if (is.defined(inputOptions.join.across)) {
if (is.integer(inputOptions.join.across) && is.inRange(inputOptions.join.across, 1, 1000000)) {
inputDescriptor.joinAcross = inputOptions.join.across;
} else {
throw is.invalidParameterError('join.across', 'integer between 1 and 100000', inputOptions.join.across);
}
}
if (is.defined(inputOptions.join.shim)) {
if (is.integer(inputOptions.join.shim) && is.inRange(inputOptions.join.shim, 0, 1000000)) {
inputDescriptor.joinShim = inputOptions.join.shim;
} else {
throw is.invalidParameterError('join.shim', 'integer between 0 and 100000', inputOptions.join.shim);
}
}
if (is.defined(inputOptions.join.background)) {
const background = color(inputOptions.join.background);
inputDescriptor.joinBackground = [
background.red(),
background.green(),
background.blue(),
Math.round(background.alpha() * 255)
];
}
if (is.defined(inputOptions.join.halign)) {
if (is.string(inputOptions.join.halign) && is.string(this.constructor.align[inputOptions.join.halign])) {
inputDescriptor.joinHalign = this.constructor.align[inputOptions.join.halign];
} else {
throw is.invalidParameterError('join.halign', 'valid alignment', inputOptions.join.halign);
}
}
if (is.defined(inputOptions.join.valign)) {
if (is.string(inputOptions.join.valign) && is.string(this.constructor.align[inputOptions.join.valign])) {
inputDescriptor.joinValign = this.constructor.align[inputOptions.join.valign];
} else {
throw is.invalidParameterError('join.valign', 'valid alignment', inputOptions.join.valign);
}
}
} else {
throw new Error('Expected input to be an array of images to join');
}
}
} else if (is.defined(inputOptions)) {
throw new Error('Invalid input options ' + inputOptions);
}
@ -551,9 +475,15 @@ function _isStreamInput () {
* });
*
* @example
* // Get dimensions taking EXIF Orientation into account.
* const { autoOrient } = await sharp(input).metadata();
* const { width, height } = autoOrient;
* // Based on EXIF rotation metadata, get the right-side-up width and height:
*
* const size = getNormalSize(await sharp(input).metadata());
*
* function getNormalSize({ width, height, orientation }) {
* return (orientation || 0) >= 5
* ? { width: height, height: width }
* : { width, height };
* }
*
* @param {Function} [callback] - called with the arguments `(err, metadata)`
* @returns {Promise<Object>|Sharp}

View File

@ -18,19 +18,22 @@ const vipsPrecision = {
};
/**
* Rotate the output image.
* Rotate the output image by either an explicit angle
* or auto-orient based on the EXIF `Orientation` tag.
*
* The provided angle is converted to a valid positive degree rotation.
* If an angle is provided, it is converted to a valid positive degree rotation.
* For example, `-450` will produce a 270 degree rotation.
*
* When rotating by an angle other than a multiple of 90,
* the background colour can be provided with the `background` option.
*
* For backwards compatibility, if no angle is provided, `.autoOrient()` will be called.
* If no angle is provided, it is determined from the EXIF data.
* Mirroring is supported and may infer the use of a flip operation.
*
* Only one rotation can occur per pipeline (aside from an initial call without
* arguments to orient via EXIF data). Previous calls to `rotate` in the same
* pipeline will be ignored.
* The use of `rotate` without an angle will remove the EXIF `Orientation` tag, if any.
*
* Only one rotation can occur per pipeline.
* Previous calls to `rotate` in the same pipeline will be ignored.
*
* Multi-page images can only be rotated by 180 degrees.
*
@ -38,6 +41,17 @@ const vipsPrecision = {
* for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`.
*
* @example
* const pipeline = sharp()
* .rotate()
* .resize(null, 200)
* .toBuffer(function (err, outputBuffer, info) {
* // outputBuffer contains 200px high JPEG image data,
* // auto-rotated using EXIF Orientation tag
* // info.width and info.height contain the dimensions of the resized image
* });
* readableStream.pipe(pipeline);
*
* @example
* const rotateThenResize = await sharp(input)
* .rotate(90)
* .resize({ width: 16, height: 8, fit: 'fill' })
@ -54,15 +68,12 @@ const vipsPrecision = {
* @throws {Error} Invalid parameters
*/
function rotate (angle, options) {
if (!is.defined(angle)) {
return this.autoOrient();
}
if (this.options.angle || this.options.rotationAngle) {
if (this.options.useExifOrientation || this.options.angle || this.options.rotationAngle) {
this.options.debuglog('ignoring previous rotate options');
this.options.angle = 0;
this.options.rotationAngle = 0;
}
if (is.integer(angle) && !(angle % 90)) {
if (!is.defined(angle)) {
this.options.useExifOrientation = true;
} else if (is.integer(angle) && !(angle % 90)) {
this.options.angle = angle;
} else if (is.number(angle)) {
this.options.rotationAngle = angle;
@ -81,34 +92,6 @@ function rotate (angle, options) {
return this;
}
/**
* Auto-orient based on the EXIF `Orientation` tag, then remove the tag.
* Mirroring is supported and may infer the use of a flip operation.
*
* Previous or subsequent use of `rotate(angle)` and either `flip()` or `flop()`
* will logically occur after auto-orientation, regardless of call order.
*
* @example
* const output = await sharp(input).autoOrient().toBuffer();
*
* @example
* const pipeline = sharp()
* .autoOrient()
* .resize(null, 200)
* .toBuffer(function (err, outputBuffer, info) {
* // outputBuffer contains 200px high JPEG image data,
* // auto-oriented using EXIF Orientation tag
* // info.width and info.height contain the dimensions of the resized image
* });
* readableStream.pipe(pipeline);
*
* @returns {Sharp}
*/
function autoOrient () {
this.options.input.autoOrient = true;
return this;
}
/**
* Mirror the image vertically (up-down) about the x-axis.
* This always occurs before rotation, if any.
@ -952,7 +935,6 @@ function modulate (options) {
*/
module.exports = function (Sharp) {
Object.assign(Sharp.prototype, {
autoOrient,
rotate,
flip,
flop,

View File

@ -107,7 +107,7 @@ const mapFitToCanvas = {
* @private
*/
function isRotationExpected (options) {
return (options.angle % 360) !== 0 || options.input.autoOrient === true || options.rotationAngle !== 0;
return (options.angle % 360) !== 0 || options.useExifOrientation === true || options.rotationAngle !== 0;
}
/**

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-darwin-arm64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with macOS 64-bit ARM",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-darwin-arm64": "1.1.0-rc4"
"@img/sharp-libvips-darwin-arm64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-darwin-x64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with macOS x64",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-darwin-x64": "1.1.0-rc4"
"@img/sharp-libvips-darwin-x64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linux-arm",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (glibc) ARM (32-bit)",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linux-arm": "1.1.0-rc4.1"
"@img/sharp-libvips-linux-arm": "1.1.0-rc3.1"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linux-arm64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (glibc) 64-bit ARM",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linux-arm64": "1.1.0-rc4"
"@img/sharp-libvips-linux-arm64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linux-ppc64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (glibc) ppc64",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linux-ppc64": "1.1.0-rc4"
"@img/sharp-libvips-linux-ppc64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linux-s390x",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (glibc) s390x",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linux-s390x": "1.1.0-rc4"
"@img/sharp-libvips-linux-s390x": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linux-x64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (glibc) x64",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linux-x64": "1.1.0-rc4"
"@img/sharp-libvips-linux-x64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linuxmusl-arm64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (musl) 64-bit ARM",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linuxmusl-arm64": "1.1.0-rc4"
"@img/sharp-libvips-linuxmusl-arm64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-linuxmusl-x64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Linux (musl) x64",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
@ -15,7 +15,7 @@
},
"preferUnplugged": true,
"optionalDependencies": {
"@img/sharp-libvips-linuxmusl-x64": "1.1.0-rc4"
"@img/sharp-libvips-linuxmusl-x64": "1.1.0-rc3"
},
"files": [
"lib"

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"private": "true",
"workspaces": [
"darwin-arm64",

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-wasm32",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with wasm32",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-win32-ia32",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Windows x86 (32-bit)",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",

View File

@ -1,6 +1,6 @@
{
"name": "@img/sharp-win32-x64",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"description": "Prebuilt sharp for use with Windows x64",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",

View File

@ -1,7 +1,7 @@
{
"name": "sharp",
"description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images",
"version": "0.34.0-rc.0",
"version": "0.33.5",
"author": "Lovell Fuller <npm@lovell.info>",
"homepage": "https://sharp.pixelplumbing.com",
"contributors": [
@ -102,9 +102,9 @@
"test-types": "tsd",
"package-from-local-build": "node npm/from-local-build",
"package-from-github-release": "node npm/from-github-release",
"docs-build": "node docs/build.mjs",
"docs-serve": "cd docs && npm start",
"docs-publish": "cd docs && npm run build && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp"
"docs-build": "node docs/build && node docs/search-index/build",
"docs-serve": "cd docs && npx serve",
"docs-publish": "cd docs && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp"
},
"type": "commonjs",
"main": "lib/index.js",
@ -139,50 +139,50 @@
"dependencies": {
"color": "^4.2.3",
"detect-libc": "^2.0.3",
"semver": "^7.7.1"
"semver": "^7.6.3"
},
"optionalDependencies": {
"@img/sharp-darwin-arm64": "0.34.0-rc.0",
"@img/sharp-darwin-x64": "0.34.0-rc.0",
"@img/sharp-libvips-darwin-arm64": "1.1.0-rc4",
"@img/sharp-libvips-darwin-x64": "1.1.0-rc4",
"@img/sharp-libvips-linux-arm": "1.1.0-rc4.1",
"@img/sharp-libvips-linux-arm64": "1.1.0-rc4",
"@img/sharp-libvips-linux-ppc64": "1.1.0-rc4",
"@img/sharp-libvips-linux-s390x": "1.1.0-rc4",
"@img/sharp-libvips-linux-x64": "1.1.0-rc4",
"@img/sharp-libvips-linuxmusl-arm64": "1.1.0-rc4",
"@img/sharp-libvips-linuxmusl-x64": "1.1.0-rc4",
"@img/sharp-linux-arm": "0.34.0-rc.0",
"@img/sharp-linux-arm64": "0.34.0-rc.0",
"@img/sharp-linux-s390x": "0.34.0-rc.0",
"@img/sharp-linux-x64": "0.34.0-rc.0",
"@img/sharp-linuxmusl-arm64": "0.34.0-rc.0",
"@img/sharp-linuxmusl-x64": "0.34.0-rc.0",
"@img/sharp-wasm32": "0.34.0-rc.0",
"@img/sharp-win32-ia32": "0.34.0-rc.0",
"@img/sharp-win32-x64": "0.34.0-rc.0"
"@img/sharp-darwin-arm64": "0.33.5",
"@img/sharp-darwin-x64": "0.33.5",
"@img/sharp-libvips-darwin-arm64": "1.1.0-rc3",
"@img/sharp-libvips-darwin-x64": "1.1.0-rc3",
"@img/sharp-libvips-linux-arm": "1.1.0-rc3.1",
"@img/sharp-libvips-linux-arm64": "1.1.0-rc3",
"@img/sharp-libvips-linux-ppc64": "1.1.0-rc3",
"@img/sharp-libvips-linux-s390x": "1.1.0-rc3",
"@img/sharp-libvips-linux-x64": "1.1.0-rc3",
"@img/sharp-libvips-linuxmusl-arm64": "1.1.0-rc3",
"@img/sharp-libvips-linuxmusl-x64": "1.1.0-rc3",
"@img/sharp-linux-arm": "0.33.5",
"@img/sharp-linux-arm64": "0.33.5",
"@img/sharp-linux-s390x": "0.33.5",
"@img/sharp-linux-x64": "0.33.5",
"@img/sharp-linuxmusl-arm64": "0.33.5",
"@img/sharp-linuxmusl-x64": "0.33.5",
"@img/sharp-wasm32": "0.33.5",
"@img/sharp-win32-ia32": "0.33.5",
"@img/sharp-win32-x64": "0.33.5"
},
"devDependencies": {
"@emnapi/runtime": "^1.3.1",
"@img/sharp-libvips-dev": "1.1.0-rc4",
"@img/sharp-libvips-dev-wasm32": "1.1.0-rc4",
"@img/sharp-libvips-win32-ia32": "1.1.0-rc4",
"@img/sharp-libvips-win32-x64": "1.1.0-rc4",
"@img/sharp-libvips-dev": "1.1.0-rc3",
"@img/sharp-libvips-dev-wasm32": "1.1.0-rc3",
"@img/sharp-libvips-win32-ia32": "1.1.0-rc3",
"@img/sharp-libvips-win32-x64": "1.1.0-rc3",
"@types/node": "*",
"cc": "^3.0.1",
"emnapi": "^1.3.1",
"exif-reader": "^2.0.2",
"exif-reader": "^2.0.1",
"extract-zip": "^2.0.1",
"icc": "^3.0.0",
"jsdoc-to-markdown": "^9.1.1",
"license-checker": "^25.0.1",
"mocha": "^11.1.0",
"node-addon-api": "^8.3.1",
"mocha": "^11.0.1",
"node-addon-api": "^8.3.0",
"nyc": "^17.1.0",
"prebuild": "^13.0.1",
"semistandard": "^17.0.0",
"tar-fs": "^3.0.8",
"tar-fs": "^3.0.6",
"tsd": "^0.31.2"
},
"license": "Apache-2.0",
@ -190,7 +190,7 @@
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
},
"config": {
"libvips": ">=8.16.1"
"libvips": ">=8.16.0"
},
"funding": {
"url": "https://opencollective.com/libvips"

View File

@ -160,34 +160,12 @@ namespace sharp {
descriptor->textWrap = AttrAsEnum<VipsTextWrap>(input, "textWrap", VIPS_TYPE_TEXT_WRAP);
}
}
// Join images together
if (HasAttr(input, "joinAnimated")) {
descriptor->joinAnimated = AttrAsBool(input, "joinAnimated");
}
if (HasAttr(input, "joinAcross")) {
descriptor->joinAcross = AttrAsUint32(input, "joinAcross");
}
if (HasAttr(input, "joinShim")) {
descriptor->joinShim = AttrAsUint32(input, "joinShim");
}
if (HasAttr(input, "joinBackground")) {
descriptor->joinBackground = AttrAsVectorOfDouble(input, "joinBackground");
}
if (HasAttr(input, "joinHalign")) {
descriptor->joinHalign = AttrAsEnum<VipsAlign>(input, "joinHalign", VIPS_TYPE_ALIGN);
}
if (HasAttr(input, "joinValign")) {
descriptor->joinValign = AttrAsEnum<VipsAlign>(input, "joinValign", VIPS_TYPE_ALIGN);
}
// Limit input images to a given number of pixels, where pixels = width * height
descriptor->limitInputPixels = static_cast<uint64_t>(AttrAsInt64(input, "limitInputPixels"));
if (HasAttr(input, "access")) {
// Allow switch from random to sequential access
descriptor->access = AttrAsBool(input, "sequentialRead") ? VIPS_ACCESS_SEQUENTIAL : VIPS_ACCESS_RANDOM;
}
// Remove safety features and allow unlimited input
descriptor->unlimited = AttrAsBool(input, "unlimited");
// Use the EXIF orientation to auto orient the image
descriptor->autoOrient = AttrAsBool(input, "autoOrient");
return descriptor;
}
@ -270,7 +248,6 @@ namespace sharp {
case ImageType::FITS: id = "fits"; break;
case ImageType::EXR: id = "exr"; break;
case ImageType::JXL: id = "jxl"; break;
case ImageType::RAD: id = "rad"; break;
case ImageType::VIPS: id = "vips"; break;
case ImageType::RAW: id = "raw"; break;
case ImageType::UNKNOWN: id = "unknown"; break;
@ -317,8 +294,6 @@ namespace sharp {
{ "VipsForeignLoadOpenexr", ImageType::EXR },
{ "VipsForeignLoadJxlFile", ImageType::JXL },
{ "VipsForeignLoadJxlBuffer", ImageType::JXL },
{ "VipsForeignLoadRadFile", ImageType::RAD },
{ "VipsForeignLoadRadBuffer", ImageType::RAD },
{ "VipsForeignLoadVips", ImageType::VIPS },
{ "VipsForeignLoadVipsFile", ImageType::VIPS },
{ "VipsForeignLoadRaw", ImageType::RAW }
@ -595,6 +570,14 @@ namespace sharp {
return image;
}
/*
Does this image have an alpha channel?
Uses colour space interpretation with number of channels to guess this.
*/
bool HasAlpha(VImage image) {
return image.has_alpha();
}
static void* RemoveExifCallback(VipsImage *image, char const *field, GValue *value, void *data) {
std::vector<std::string> *fieldNames = static_cast<std::vector<std::string> *>(data);
std::string fieldName(field);
@ -1003,13 +986,13 @@ namespace sharp {
};
}
// Add alpha channel to alphaColour colour
if (colour[3] < 255.0 || image.has_alpha()) {
if (colour[3] < 255.0 || HasAlpha(image)) {
alphaColour.push_back(colour[3] * multiplier);
}
// Ensure alphaColour colour uses correct colourspace
alphaColour = sharp::GetRgbaAsColourspace(alphaColour, image.interpretation(), premultiply);
// Add non-transparent alpha channel, if required
if (colour[3] < 255.0 && !image.has_alpha()) {
if (colour[3] < 255.0 && !HasAlpha(image)) {
image = image.bandjoin(
VImage::new_matrix(image.width(), image.height()).new_from_image(255 * multiplier).cast(image.format()));
}
@ -1017,10 +1000,10 @@ namespace sharp {
}
/*
Removes alpha channels, if any.
Removes alpha channel, if any.
*/
VImage RemoveAlpha(VImage image) {
while (image.bands() > 1 && image.has_alpha()) {
if (HasAlpha(image)) {
image = image.extract_band(0, VImage::option()->set("n", image.bands() - 1));
}
return image;
@ -1030,7 +1013,7 @@ namespace sharp {
Ensures alpha channel, if missing.
*/
VImage EnsureAlpha(VImage image, double const value) {
if (!image.has_alpha()) {
if (!HasAlpha(image)) {
std::vector<double> alpha;
alpha.push_back(value * sharp::MaximumImageAlpha(image.interpretation()));
image = image.bandjoin_const(alpha);

View File

@ -16,8 +16,8 @@
#if (VIPS_MAJOR_VERSION < 8) || \
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 16) || \
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 16 && VIPS_MICRO_VERSION < 1)
#error "libvips version 8.16.1+ is required - please see https://sharp.pixelplumbing.com/install"
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 16 && VIPS_MICRO_VERSION < 0)
#error "libvips version 8.16.0+ is required - please see https://sharp.pixelplumbing.com/install"
#endif
#if defined(__has_include)
@ -33,7 +33,6 @@ namespace sharp {
struct InputDescriptor { // NOLINT(runtime/indentation_namespace)
std::string name;
std::string file;
bool autoOrient;
char *buffer;
VipsFailOn failOn;
uint64_t limitInputPixels;
@ -71,21 +70,14 @@ namespace sharp {
int textSpacing;
VipsTextWrap textWrap;
int textAutofitDpi;
bool joinAnimated;
int joinAcross;
int joinShim;
std::vector<double> joinBackground;
VipsAlign joinHalign;
VipsAlign joinValign;
std::vector<double> pdfBackground;
InputDescriptor():
autoOrient(false),
buffer(nullptr),
failOn(VIPS_FAIL_ON_WARNING),
limitInputPixels(0x3FFF * 0x3FFF),
unlimited(false),
access(VIPS_ACCESS_SEQUENTIAL),
access(VIPS_ACCESS_RANDOM),
bufferLength(0),
isBuffer(false),
density(72.0),
@ -114,12 +106,6 @@ namespace sharp {
textSpacing(0),
textWrap(VIPS_TEXT_WRAP_WORD),
textAutofitDpi(0),
joinAnimated(false),
joinAcross(1),
joinShim(0),
joinBackground{ 0.0, 0.0, 0.0, 255.0 },
joinHalign(VIPS_ALIGN_LOW),
joinValign(VIPS_ALIGN_LOW),
pdfBackground{ 255.0, 255.0, 255.0, 255.0 } {}
};
@ -159,7 +145,6 @@ namespace sharp {
FITS,
EXR,
JXL,
RAD,
VIPS,
RAW,
UNKNOWN,
@ -245,6 +230,12 @@ namespace sharp {
*/
VImage SetProfile(VImage image, std::pair<char*, size_t> icc);
/*
Does this image have an alpha channel?
Uses colour space interpretation with number of channels to guess this.
*/
bool HasAlpha(VImage image);
/*
Remove all EXIF-related image fields.
*/
@ -375,7 +366,7 @@ namespace sharp {
std::tuple<VImage, std::vector<double>> ApplyAlpha(VImage image, std::vector<double> colour, bool premultiply);
/*
Removes alpha channels, if any.
Removes alpha channel, if any.
*/
VImage RemoveAlpha(VImage image);

View File

@ -95,7 +95,7 @@ class MetadataWorker : public Napi::AsyncWorker {
baton->background = image.get_array_double("background");
}
// Derived attributes
baton->hasAlpha = image.has_alpha();
baton->hasAlpha = sharp::HasAlpha(image);
baton->orientation = sharp::ExifOrientation(image);
// EXIF
if (image.get_typeof(VIPS_META_EXIF_NAME) == VIPS_TYPE_BLOB) {
@ -242,15 +242,6 @@ class MetadataWorker : public Napi::AsyncWorker {
if (baton->orientation > 0) {
info.Set("orientation", baton->orientation);
}
Napi::Object autoOrient = Napi::Object::New(env);
info.Set("autoOrient", autoOrient);
if (baton->orientation >= 5) {
autoOrient.Set("width", baton->height);
autoOrient.Set("height", baton->width);
} else {
autoOrient.Set("width", baton->width);
autoOrient.Set("height", baton->height);
}
if (baton->exifLength > 0) {
info.Set("exif", Napi::Buffer<char>::NewOrCopy(env, baton->exif, baton->exifLength, sharp::FreeCallback));
}

View File

@ -40,7 +40,7 @@ namespace sharp {
typeBeforeTint = VIPS_INTERPRETATION_sRGB;
}
// Apply lookup table
if (image.has_alpha()) {
if (HasAlpha(image)) {
VImage alpha = image[image.bands() - 1];
image = RemoveAlpha(image)
.colourspace(VIPS_INTERPRETATION_B_W)
@ -83,7 +83,7 @@ namespace sharp {
// Scale luminance, join to chroma, convert back to original colourspace
VImage normalized = luminance.linear(f, a).bandjoin(chroma).colourspace(typeBeforeNormalize);
// Attach original alpha channel, if any
if (image.has_alpha()) {
if (HasAlpha(image)) {
// Extract original alpha channel
VImage alpha = image[image.bands() - 1];
// Join alpha channel to normalised image
@ -106,7 +106,7 @@ namespace sharp {
* Gamma encoding/decoding
*/
VImage Gamma(VImage image, double const exponent) {
if (image.has_alpha()) {
if (HasAlpha(image)) {
// Separate alpha channel
VImage alpha = image[image.bands() - 1];
return RemoveAlpha(image).gamma(VImage::option()->set("exponent", exponent)).bandjoin(alpha);
@ -132,7 +132,7 @@ namespace sharp {
* Produce the "negative" of the image.
*/
VImage Negate(VImage image, bool const negateAlpha) {
if (image.has_alpha() && !negateAlpha) {
if (HasAlpha(image) && !negateAlpha) {
// Separate alpha channel
VImage alpha = image[image.bands() - 1];
return RemoveAlpha(image).invert().bandjoin(alpha);
@ -205,7 +205,7 @@ namespace sharp {
VImage Modulate(VImage image, double const brightness, double const saturation,
int const hue, double const lightness) {
VipsInterpretation colourspaceBeforeModulate = image.interpretation();
if (image.has_alpha()) {
if (HasAlpha(image)) {
// Separate alpha channel
VImage alpha = image[image.bands() - 1];
return RemoveAlpha(image)
@ -297,7 +297,7 @@ namespace sharp {
threshold *= 256.0;
}
std::vector<double> backgroundAlpha({ background.back() });
if (image.has_alpha()) {
if (HasAlpha(image)) {
background.pop_back();
} else {
background.resize(image.bands());
@ -307,7 +307,7 @@ namespace sharp {
->set("background", background)
->set("line_art", lineArt)
->set("threshold", threshold));
if (image.has_alpha()) {
if (HasAlpha(image)) {
// Search alpha channel (A)
int leftA, topA, widthA, heightA;
VImage alpha = image[image.bands() - 1];
@ -344,7 +344,7 @@ namespace sharp {
throw VError("Band expansion using linear is unsupported");
}
bool const uchar = !Is16Bit(image.interpretation());
if (image.has_alpha() && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
if (HasAlpha(image) && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
// Separate alpha channel
VImage alpha = image[bands - 1];
return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", uchar)).bandjoin(alpha);
@ -357,7 +357,7 @@ namespace sharp {
* Unflatten
*/
VImage Unflatten(VImage image) {
if (image.has_alpha()) {
if (HasAlpha(image)) {
VImage alpha = image[image.bands() - 1];
VImage noAlpha = RemoveAlpha(image);
return noAlpha.bandjoin(alpha & (noAlpha.colourspace(VIPS_INTERPRETATION_B_W) < 255));

View File

@ -42,39 +42,7 @@ class PipelineWorker : public Napi::AsyncWorker {
// Open input
vips::VImage image;
sharp::ImageType inputImageType;
if (baton->join.empty()) {
std::tie(image, inputImageType) = sharp::OpenInput(baton->input);
} else {
std::vector<VImage> images;
bool hasAlpha = false;
for (auto &join : baton->join) {
std::tie(image, inputImageType) = sharp::OpenInput(join);
image = sharp::EnsureColourspace(image, baton->colourspacePipeline);
images.push_back(image);
hasAlpha |= image.has_alpha();
}
if (hasAlpha) {
for (auto &image : images) {
if (!image.has_alpha()) {
image = sharp::EnsureAlpha(image, 1);
}
}
} else {
baton->input->joinBackground.pop_back();
}
inputImageType = sharp::ImageType::PNG;
image = VImage::arrayjoin(images, VImage::option()
->set("across", baton->input->joinAcross)
->set("shim", baton->input->joinShim)
->set("background", baton->input->joinBackground)
->set("halign", baton->input->joinHalign)
->set("valign", baton->input->joinValign));
if (baton->input->joinAnimated) {
image = image.copy();
image.set(VIPS_META_N_PAGES, static_cast<int>(images.size()));
image.set(VIPS_META_PAGE_HEIGHT, static_cast<int>(image.height() / images.size()));
}
}
VipsAccess access = baton->input->access;
image = sharp::EnsureColourspace(image, baton->colourspacePipeline);
@ -95,13 +63,13 @@ class PipelineWorker : public Napi::AsyncWorker {
bool autoFlip = false;
bool autoFlop = false;
if (baton->input->autoOrient) {
if (baton->useExifOrientation) {
// Rotate and flip image according to Exif orientation
std::tie(autoRotation, autoFlip, autoFlop) = CalculateExifRotationAndFlip(sharp::ExifOrientation(image));
image = sharp::RemoveExifOrientation(image);
}
} else {
rotation = CalculateAngleRotation(baton->angle);
}
// Rotate pre-extract
bool const shouldRotateBefore = baton->rotateBeforePreExtract &&
@ -124,14 +92,18 @@ class PipelineWorker : public Napi::AsyncWorker {
image = image.rot(autoRotation);
autoRotation = VIPS_ANGLE_D0;
}
if (autoFlip != baton->flip) {
if (autoFlip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
autoFlip = false;
} else if (baton->flip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
baton->flip = false;
}
if (autoFlop != baton->flop) {
if (autoFlop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
autoFlop = false;
} else if (baton->flop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
baton->flop = false;
}
if (rotation != VIPS_ANGLE_D0) {
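A note on the flip handling removed in the hunk above: the removed side combines the EXIF-implied flip with the user-requested flip using inequality, because two vertical flips cancel out. The snippet below is an illustration of that truth table only, not library code:

```js
// Flip exactly once when precisely one of the two flags is set (XOR);
// if both the EXIF orientation and the user request a flip, they cancel.
const needsFlip = (autoFlip, userFlip) => autoFlip !== userFlip;

console.log(needsFlip(true, true));   // false: the two flips cancel out
console.log(needsFlip(true, false));  // true
console.log(needsFlip(false, true));  // true
```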
@ -372,7 +344,7 @@ class PipelineWorker : public Napi::AsyncWorker {
}
// Flatten image to remove alpha channel
if (baton->flatten && image.has_alpha()) {
if (baton->flatten && sharp::HasAlpha(image)) {
image = sharp::Flatten(image, baton->flattenBackground);
}
@ -392,12 +364,12 @@ class PipelineWorker : public Napi::AsyncWorker {
bool const shouldSharpen = baton->sharpenSigma != 0.0;
bool const shouldComposite = !baton->composite.empty();
if (shouldComposite && !image.has_alpha()) {
if (shouldComposite && !sharp::HasAlpha(image)) {
image = sharp::EnsureAlpha(image, 1);
}
VipsBandFormat premultiplyFormat = image.format();
bool const shouldPremultiplyAlpha = image.has_alpha() &&
bool const shouldPremultiplyAlpha = sharp::HasAlpha(image) &&
(shouldResize || shouldBlur || shouldConv || shouldSharpen);
if (shouldPremultiplyAlpha) {
@ -424,11 +396,11 @@ class PipelineWorker : public Napi::AsyncWorker {
image = image.rot(autoRotation);
}
// Mirror vertically (up-down) about the x-axis
if (baton->flip != autoFlip) {
if (baton->flip || autoFlip) {
image = image.flip(VIPS_DIRECTION_VERTICAL);
}
// Mirror horizontally (left-right) about the y-axis
if (baton->flop != autoFlop) {
if (baton->flop || autoFlop) {
image = image.flip(VIPS_DIRECTION_HORIZONTAL);
}
// Rotate post-extract 90-angle
@ -659,30 +631,6 @@ class PipelineWorker : public Napi::AsyncWorker {
composite->input->access = access;
std::tie(compositeImage, compositeImageType) = sharp::OpenInput(composite->input);
compositeImage = sharp::EnsureColourspace(compositeImage, baton->colourspacePipeline);
if (composite->input->autoOrient) {
// Respect EXIF Orientation
VipsAngle compositeAutoRotation = VIPS_ANGLE_D0;
bool compositeAutoFlip = false;
bool compositeAutoFlop = false;
std::tie(compositeAutoRotation, compositeAutoFlip, compositeAutoFlop) =
CalculateExifRotationAndFlip(sharp::ExifOrientation(compositeImage));
compositeImage = sharp::RemoveExifOrientation(compositeImage);
compositeImage = sharp::StaySequential(compositeImage,
compositeAutoRotation != VIPS_ANGLE_D0 || compositeAutoFlip);
if (compositeAutoRotation != VIPS_ANGLE_D0) {
compositeImage = compositeImage.rot(compositeAutoRotation);
}
if (compositeAutoFlip) {
compositeImage = compositeImage.flip(VIPS_DIRECTION_VERTICAL);
}
if (compositeAutoFlop) {
compositeImage = compositeImage.flip(VIPS_DIRECTION_HORIZONTAL);
}
}
// Verify within current dimensions
if (compositeImage.width() > image.width() || compositeImage.height() > image.height()) {
throw vips::VError("Image to composite must have same dimensions or smaller");
@ -725,7 +673,9 @@ class PipelineWorker : public Napi::AsyncWorker {
}
// Ensure image to composite is sRGB with unpremultiplied alpha
compositeImage = compositeImage.colourspace(VIPS_INTERPRETATION_sRGB);
if (!sharp::HasAlpha(compositeImage)) {
compositeImage = sharp::EnsureAlpha(compositeImage, 1);
}
if (composite->premultiplied) compositeImage = compositeImage.unpremultiply();
// Calculate position
int left;
@ -826,7 +776,7 @@ class PipelineWorker : public Napi::AsyncWorker {
// Extract channel
if (baton->extractChannel > -1) {
if (baton->extractChannel >= image.bands()) {
if (baton->extractChannel == 3 && image.has_alpha()) {
if (baton->extractChannel == 3 && sharp::HasAlpha(image)) {
baton->extractChannel = image.bands() - 1;
} else {
(baton->err)
@ -1050,7 +1000,7 @@ class PipelineWorker : public Napi::AsyncWorker {
} else if (baton->formatOut == "dz") {
// Write DZ to buffer
baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_ZIP;
if (!image.has_alpha()) {
if (!sharp::HasAlpha(image)) {
baton->tileBackground.pop_back();
}
image = sharp::StaySequential(image, baton->tileAngle != 0);
@ -1099,7 +1049,6 @@ class PipelineWorker : public Napi::AsyncWorker {
// Unsupported output format
(baton->err).append("Unsupported output format ");
if (baton->formatOut == "input") {
(baton->err).append("when trying to match input format of ");
(baton->err).append(ImageTypeId(inputImageType));
} else {
(baton->err).append(baton->formatOut);
@ -1254,7 +1203,7 @@ class PipelineWorker : public Napi::AsyncWorker {
if (isDzZip) {
baton->tileContainer = VIPS_FOREIGN_DZ_CONTAINER_ZIP;
}
if (!image.has_alpha()) {
if (!sharp::HasAlpha(image)) {
baton->tileBackground.pop_back();
}
image = sharp::StaySequential(image, baton->tileAngle != 0);
@ -1369,9 +1318,6 @@ class PipelineWorker : public Napi::AsyncWorker {
for (sharp::InputDescriptor *input : baton->joinChannelIn) {
delete input;
}
for (sharp::InputDescriptor *input : baton->join) {
delete input;
}
delete baton;
// Decrement processing task counter
@ -1529,14 +1475,6 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
// Input
baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
// Join images together
if (sharp::HasAttr(options, "join")) {
Napi::Array join = options.Get("join").As<Napi::Array>();
for (unsigned int i = 0; i < join.Length(); i++) {
baton->join.push_back(
sharp::CreateInputDescriptor(join.Get(i).As<Napi::Object>()));
}
}
// Extract image options
baton->topOffsetPre = sharp::AttrAsInt32(options, "topOffsetPre");
baton->leftOffsetPre = sharp::AttrAsInt32(options, "leftOffsetPre");
@ -1629,6 +1567,7 @@ Napi::Value pipeline(const Napi::CallbackInfo& info) {
baton->claheWidth = sharp::AttrAsUint32(options, "claheWidth");
baton->claheHeight = sharp::AttrAsUint32(options, "claheHeight");
baton->claheMaxSlope = sharp::AttrAsUint32(options, "claheMaxSlope");
baton->useExifOrientation = sharp::AttrAsBool(options, "useExifOrientation");
baton->angle = sharp::AttrAsInt32(options, "angle");
baton->rotationAngle = sharp::AttrAsDouble(options, "rotationAngle");
baton->rotationBackground = sharp::AttrAsVectorOfDouble(options, "rotationBackground");

View File

@ -39,7 +39,6 @@ struct Composite {
struct PipelineBaton {
sharp::InputDescriptor *input;
std::vector<sharp::InputDescriptor *> join;
std::string formatOut;
std::string fileOut;
void *bufferOut;
@ -110,6 +109,7 @@ struct PipelineBaton {
int claheWidth;
int claheHeight;
int claheMaxSlope;
bool useExifOrientation;
int angle;
double rotationAngle;
std::vector<double> rotationBackground;
@ -282,6 +282,7 @@ struct PipelineBaton {
claheWidth(0),
claheHeight(0),
claheMaxSlope(3),
useExifOrientation(false),
angle(0),
rotationAngle(0.0),
rotationBackground{ 0.0, 0.0, 0.0, 255.0 },

View File

@ -58,7 +58,7 @@ class StatsWorker : public Napi::AsyncWorker {
baton->channelStats.push_back(cStats);
}
// Image is not opaque when an alpha layer is present and contains a non-maximal value
if (image.has_alpha()) {
if (sharp::HasAlpha(image)) {
double const minAlpha = static_cast<double>(stats.getpoint(STAT_MIN_INDEX, bands).front());
if (minAlpha != sharp::MaximumImageAlpha(image.interpretation())) {
baton->isOpaque = false;

View File

@ -119,7 +119,7 @@ Napi::Value format(const Napi::CallbackInfo& info) {
Napi::Object format = Napi::Object::New(env);
for (std::string const f : {
"jpeg", "png", "webp", "tiff", "magick", "openslide", "dz",
"ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k", "jxl", "rad"
"ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k", "jxl"
}) {
// Input
const VipsObjectClass *oc = vips_class_find("VipsOperation", (f + "load").c_str());
@ -233,10 +233,10 @@ Napi::Value _maxColourDistance(const Napi::CallbackInfo& info) {
double maxColourDistance;
try {
// Premultiply and remove alpha
if (image1.has_alpha()) {
if (sharp::HasAlpha(image1)) {
image1 = image1.premultiply().extract_band(1, VImage::option()->set("n", image1.bands() - 1));
}
if (image2.has_alpha()) {
if (sharp::HasAlpha(image2)) {
image2 = image2.premultiply().extract_band(1, VImage::option()->set("n", image2.bands() - 1));
}
// Calculate colour distance

View File

@ -1,13 +1,14 @@
FROM ubuntu:24.10
FROM ubuntu:23.10
ARG BRANCH=main
# Install basic dependencies
RUN apt-get -y update && apt-get install -y build-essential curl git ca-certificates gnupg
# Install latest Node.js LTS
RUN curl -fsSL https://deb.nodesource.com/setup_22.x -o nodesource_setup.sh
RUN bash nodesource_setup.sh
RUN apt-get install -y nodejs
RUN mkdir -p /etc/apt/keyrings
RUN curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg
RUN echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_20.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list
RUN apt-get -y update && apt-get install -y nodejs
# Install benchmark dependencies
RUN apt-get install -y imagemagick libmagick++-dev graphicsmagick
@ -25,4 +26,8 @@ RUN node -v
WORKDIR /tmp/sharp/test/bench
# Workaround for: https://github.com/emscripten-core/emscripten/pull/16917
# This could be removed once Squoosh is an optional dependency.
ENV NODE_OPTIONS="--no-experimental-fetch"
CMD [ "node", "perf" ]

View File

@ -8,14 +8,16 @@
"test": "node perf && node random && node parallel"
},
"dependencies": {
"async": "3.2.6",
"@squoosh/cli": "0.7.3",
"@squoosh/lib": "0.5.3",
"async": "3.2.5",
"benchmark": "2.1.4",
"gm": "1.25.1",
"gm": "1.25.0",
"imagemagick": "0.1.3",
"jimp": "1.6.0"
"jimp": "0.22.10"
},
"optionalDependencies": {
"@tensorflow/tfjs-node": "4.22.0",
"@tensorflow/tfjs-node": "4.13.0",
"mapnik": "4.5.9"
},
"license": "Apache-2.0"

View File

@ -3,8 +3,9 @@
'use strict';
const os = require('os');
const fs = require('fs');
const { execSync } = require('child_process');
const { exec, execSync } = require('child_process');
const async = require('async');
const Benchmark = require('benchmark');
@ -21,8 +22,8 @@ const sharp = require('../../');
const gm = require('gm');
const imagemagick = require('imagemagick');
const mapnik = safeRequire('mapnik');
const { Jimp, JimpMime } = require('jimp');
const jimp = require('jimp');
const squoosh = require('@squoosh/lib');
process.env.TF_CPP_MIN_LOG_LEVEL = 1;
const tfjs = safeRequire('@tensorflow/tfjs-node');
@ -51,23 +52,104 @@ async.series({
// jimp
jpegSuite.add('jimp-buffer-buffer', {
defer: true,
fn: async function (deferred) {
const image = await Jimp.read(inputJpgBuffer);
await image
.resize({ w: width, h: height, mode: Jimp.RESIZE_BICUBIC })
.getBuffer(JimpMime.jpeg, { quality: 80 });
fn: function (deferred) {
jimp.read(inputJpgBuffer, function (err, image) {
if (err) {
throw err;
} else {
image
.resize(width, height, jimp.RESIZE_BICUBIC)
.quality(80)
.getBuffer(jimp.MIME_JPEG, function (err) {
if (err) {
throw err;
} else {
deferred.resolve();
}
});
}
});
}
}).add('jimp-file-file', {
defer: true,
fn: async function (deferred) {
const image = await Jimp.read(fixtures.inputJpg);
await image
.resize({ w: width, h: height, mode: Jimp.RESIZE_BICUBIC })
.getBuffer(JimpMime.jpeg, { quality: 80 });
fn: function (deferred) {
jimp.read(fixtures.inputJpg, function (err, image) {
if (err) {
throw err;
} else {
image
.resize(width, height, jimp.RESIZE_BICUBIC)
.quality(80)
.write(outputJpg, function (err) {
if (err) {
throw err;
} else {
deferred.resolve();
}
});
}
});
}
});
// squoosh-cli
jpegSuite.add('squoosh-cli-file-file', {
defer: true,
fn: function (deferred) {
exec(`./node_modules/.bin/squoosh-cli \
--output-dir ${os.tmpdir()} \
--resize '{"enabled":true,"width":${width},"height":${height},"method":"lanczos3","premultiply":false,"linearRGB":false}' \
--mozjpeg '{"quality":80,"progressive":false,"optimize_coding":true,"quant_table":0,"trellis_multipass":false,"chroma_subsample":2,"separate_chroma_quality":false}' \
"${fixtures.inputJpg}"`, function (err) {
if (err) {
throw err;
}
deferred.resolve();
});
}
});
// squoosh-lib (GPLv3)
jpegSuite.add('squoosh-lib-buffer-buffer', {
defer: true,
fn: function (deferred) {
const pool = new squoosh.ImagePool(os.cpus().length);
const image = pool.ingestImage(inputJpgBuffer);
image.decoded
.then(function () {
return image.preprocess({
resize: {
enabled: true,
width,
height,
method: 'lanczos3',
premultiply: false,
linearRGB: false
}
});
})
.then(function () {
return image.encode({
mozjpeg: {
quality: 80,
progressive: false,
optimize_coding: true,
quant_table: 0,
trellis_multipass: false,
chroma_subsample: 2,
separate_chroma_quality: false
}
});
})
.then(function () {
return pool.close();
})
.then(function () {
return image.encodedWith.mozjpeg;
})
.then(function () {
deferred.resolve();
});
}
});
// mapnik
mapnik && jpegSuite.add('mapnik-file-file', {
defer: true,
@ -566,23 +648,100 @@ async.series({
// jimp
pngSuite.add('jimp-buffer-buffer', {
defer: true,
fn: async function (deferred) {
const image = await Jimp.read(inputPngBuffer);
await image
.resize({ w: width, h: heightPng, mode: Jimp.RESIZE_BICUBIC })
.getBuffer(JimpMime.png, { deflateLevel: 6, filterType: 0 });
fn: function (deferred) {
jimp.read(inputPngBuffer, function (err, image) {
if (err) {
throw err;
} else {
image
.resize(width, heightPng, jimp.RESIZE_BICUBIC)
.deflateLevel(6)
.filterType(0)
.getBuffer(jimp.MIME_PNG, function (err) {
if (err) {
throw err;
} else {
deferred.resolve();
}
});
}
});
}
}).add('jimp-file-file', {
defer: true,
fn: async function (deferred) {
const image = await Jimp.read(fixtures.inputPngAlphaPremultiplicationLarge);
await image
.resize({ w: width, h: heightPng, mode: Jimp.RESIZE_BICUBIC })
.write(outputPng, { deflateLevel: 6, filterType: 0 });
fn: function (deferred) {
jimp.read(fixtures.inputPngAlphaPremultiplicationLarge, function (err, image) {
if (err) {
throw err;
} else {
image
.resize(width, heightPng, jimp.RESIZE_BICUBIC)
.deflateLevel(6)
.filterType(0)
.write(outputPng, function (err) {
if (err) {
throw err;
} else {
deferred.resolve();
}
});
}
});
}
});
// squoosh-cli
pngSuite.add('squoosh-cli-file-file', {
defer: true,
fn: function (deferred) {
exec(`./node_modules/.bin/squoosh-cli \
--output-dir ${os.tmpdir()} \
--resize '{"enabled":true,"width":${width},"height":${heightPng},"method":"lanczos3","premultiply":true,"linearRGB":false}' \
--oxipng '{"level":1}' \
"${fixtures.inputPngAlphaPremultiplicationLarge}"`, function (err) {
if (err) {
throw err;
}
deferred.resolve();
});
}
});
// squoosh-lib (GPLv3)
pngSuite.add('squoosh-lib-buffer-buffer', {
defer: true,
fn: function (deferred) {
const pool = new squoosh.ImagePool(os.cpus().length);
const image = pool.ingestImage(inputPngBuffer);
image.decoded
.then(function () {
return image.preprocess({
resize: {
enabled: true,
width,
height: heightPng,
method: 'lanczos3',
premultiply: true,
linearRGB: false
}
});
})
.then(function () {
return image.encode({
oxipng: {
level: 1
}
});
})
.then(function () {
return pool.close();
})
.then(function () {
return image.encodedWith.oxipng;
})
.then(function () {
deferred.resolve();
});
}
});
// mapnik
mapnik && pngSuite.add('mapnik-file-file', {
defer: true,

[28 binary image fixtures changed (each roughly 76–82 KiB); file previews are not shown in this diff.]

Some files were not shown because too many files have changed in this diff.