feat: Create a mini Node.js server with Newman for testing without the Postman GUI.

This mimics a run in a CI/CD environment or Docker container.
This commit is contained in:
Simon Priet
2021-09-08 14:01:19 +02:00
parent 5fbd7c88fa
commit e69a613a37
5610 changed files with 740417 additions and 3 deletions
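The intended CI entry point can be sketched with Newman's Node.js API. This is a sketch under assumptions: `newman` is installed, a `collection.json` exists alongside the script, and the helper name `buildRunOptions` is illustrative rather than taken from this commit; `collection`, `reporters`, and `timeoutRequest` are standard Newman run options.

```javascript
// Hypothetical CI entry point; buildRunOptions and ./collection.json are
// illustrative names, not part of this commit.
function buildRunOptions(collectionPath) {
    return {
        collection: collectionPath,   // path to (or parsed object of) the collection
        reporters: ['cli'],           // print results to stdout, no GUI needed
        timeoutRequest: 30000         // per-request timeout in ms
    };
}

// In the container this would drive the run and set the process exit code:
// require('newman').run(buildRunOptions('./collection.json'), function (err, summary) {
//     process.exit(err || summary.run.failures.length ? 1 : 0);
// });
```

A non-zero exit code is what makes the pipeline step fail, which is the whole point of running collections headlessly.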

1341
node_modules/postman-runtime/CHANGELOG.yaml generated vendored Normal file

File diff suppressed because it is too large

201
node_modules/postman-runtime/LICENSE.md generated vendored Normal file

@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2016, Postdot Technologies, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

403
node_modules/postman-runtime/README.md generated vendored Normal file

@@ -0,0 +1,403 @@
# Postman Runtime [![Build Status](https://travis-ci.com/postmanlabs/postman-runtime.svg?branch=develop)](https://travis-ci.com/postmanlabs/postman-runtime) [![codecov](https://codecov.io/gh/postmanlabs/postman-runtime/branch/develop/graph/badge.svg)](https://codecov.io/gh/postmanlabs/postman-runtime)
> This is a low-level library used as the backbone for all Collection running & Request sending functionality in
> the Postman App and allied systems ([Postman Monitoring](https://www.getpostman.com/docs/schedule_cloud_runs),
> [Newman](https://github.com/postmanlabs/newman)).
> If you are looking to execute collections, you should use Newman instead; this library is very low-level.
## Development Notes
- `npm run test`: Runs lint, system, unit, and integration tests of runtime
- `npm run test-integration-newman`: Runs Newman's tests against the in-development variant of runtime
- `npm run test-coverage`: Runs the `postman-runtime` tests and generates an overall coverage report
- `npm/memory-check.sh`: This bash script performs first-level memory-usage analysis and plots a graph of the results
## Options
Postman Runtime supports a wide range of options to customize its behavior for different environments and use cases.
```javascript
var runner = new runtime.Runner(); // runtime = require('postman-runtime');

// a collection object constructed using the Postman Collection SDK
var collection = new sdk.Collection();

runner.run(collection, {
    // Iteration Data
    data: [],

    // Timeouts (in ms)
    timeout: {
        request: 30000,
        script: 5000
    },

    // Number of iterations
    iterationCount: 1,

    // Control flags (you can only specify one of these):
    // - gracefully halts on errors (errors in request scripts or while sending the request),
    //   calls the `item` and `iteration` callbacks, and does not run any further items (requests)
    stopOnError: true,

    // - abruptly halts the run on errors, and directly calls the `done` callback
    abortOnError: true,

    // - gracefully halts on errors _or_ test failures,
    //   calls the `item` and `iteration` callbacks, and does not run any further items (requests)
    stopOnFailure: true,

    // - abruptly halts the run on errors or test failures, and directly calls the `done` callback
    abortOnFailure: true,

    // Environment (a "VariableScope" from the SDK)
    environment: new sdk.VariableScope(),

    // Globals (a "VariableScope" from the SDK)
    globals: new sdk.VariableScope(),

    // Execute a folder/request using id/name or path
    entrypoint: {
        // execute a folder/request using id or name
        execute: 'folderName',

        // selects which lookup is used: 'idOrName' uses `execute` above,
        // 'path' uses `path` below
        lookupStrategy: 'path',

        // execute a folder/request using a path
        path: ['grand_parent_folder_idOrName', 'parent_folder_idOrName']
    },

    // Configure delays (in ms)
    delay: {
        // between each request
        item: 1000,

        // between iterations
        iteration: 1000
    },

    // Used to fetch contents of files and certificates wherever needed
    fileResolver: require('fs'),

    // Options specific to the requester
    requester: {
        // An object compatible with the cookieJar provided by the 'postman-request' module.
        // To limit programmatic cookie access to only whitelisted domains, add an
        // `allowProgrammaticAccess` method to the jar. Example:
        // jar.allowProgrammaticAccess = function (domain) { return domain === 'postman-echo.com'; };
        cookieJar: jar,

        // Controls redirect behavior (only supported on Node, ignored in the browser)
        followRedirects: true,

        // Redirect with the original HTTP method (only supported on Node, ignored in the browser)
        followOriginalHttpMethod: false,

        // Maximum number of redirects to follow (only supported on Node, ignored in the browser)
        maxRedirects: 10,

        // Maximum allowed response size in bytes (only supported on Node, ignored in the browser)
        maxResponseSize: 1000000,

        // Enable to use the WHATWG URL parser and encoder
        useWhatWGUrlParser: true,

        // Removes the `referer` header when a redirect happens (only supported on Node, ignored in the browser)
        removeRefererHeaderOnRedirect: false,

        // Enable or disable certificate verification (only supported on Node, ignored in the browser)
        strictSSL: false,

        // Enable or disable detailed request-response timings (only supported on Node, ignored in the browser)
        timings: true,

        // Enable or disable verbose-level history (only supported on Node, ignored in the browser)
        verbose: false,

        // Implicitly add a `Cache-Control` system header to requests (only supported on Node, ignored in the browser)
        implicitCacheControl: true,

        // Implicitly add a `Postman-Token` system header to requests (only supported on Node, ignored in the browser)
        implicitTraceHeader: true,

        // Add system headers to all requests, which cannot be overridden or disabled
        systemHeaders: { 'User-Agent': 'PostmanRuntime' },

        // Extend well-known "root" CAs with the extra certificates in this file. The file should consist of
        // one or more trusted certificates in PEM format. (only supported on Node, ignored in the browser)
        extendedRootCA: 'path/to/extra/CA/certs.pem',

        // Network-related options
        network: {
            // hosts-file-style configuration for DNS lookup
            hostLookup: {
                type: 'hostIpMap',
                hostIpMap: {
                    'domain.com': '127.0.0.1',
                    'ipv6-domain.com': '::1'
                }
            },

            // Allows restricting IPs/hosts in requests
            restrictedAddresses: { '192.168.1.1': true }
        },

        // Custom requesting agents (only supported on Node, ignored in the browser)
        agents: {
            http: {
                agentClass: http.Agent,
                agentOptions: { keepAlive: true, timeout: 399 }
            },
            https: new https.Agent({ keepAlive: true })
        }
    },

    // Options specific to script execution
    script: {
        // Whether to send console logs in a serialized format which can be parsed
        // using the `teleport-javascript` serialization library
        serializeLogs: false
    },

    // A ProxyConfigList, from the SDK
    proxies: new sdk.ProxyConfigList(),

    // A function that fetches the system proxy for a given URL
    systemProxy: function (url, callback) { return callback(null, { /* ProxyConfig object */ }); },

    // Opt out of [proxy configuration via environment variables](https://github.com/postmanlabs/postman-request#controlling-proxy-behaviour-using-environment-variables) (only supported on Node, ignored in the browser)
    ignoreProxyEnvironmentVariables: false,

    // A CertificateList from the SDK
    certificates: new sdk.CertificateList(),

    // *note* Not implemented yet.
    // In the future, this will be used to read certificates from the OS keychain.
    systemCertificate: function () {}
}, function (err, run) {
    console.log('Created a new Run!');

    // Check the section below for detailed documentation on what the callbacks should be.
    run.start(callbacks);
});
```
## Callbacks
You can pass a series of callbacks for runtime to execute as a collection is being executed.
```javascript
runner.run(collection, { /* options */ }, function (err, run) {
    run.start({
        // Called any time we see a new assertion in the test scripts
        assertion: function (cursor, assertions) {
            // cursor = {
            //     position: Number,
            //     iteration: Number,
            //     length: Number,
            //     cycles: Number,
            //     eof: Boolean,
            //     empty: Boolean,
            //     bof: Boolean,
            //     cr: Boolean,
            //     ref: String,
            //     scriptId: String,
            //     eventId: String
            // }

            // assertions: array of assertion objects
            // assertion = {
            //     error: Error,
            //     index: Number,
            //     name: String,
            //     skipped: Number,
            //     passed: Number
            // }
        },

        // Called when the run begins
        start: function (err, cursor) {
            // err: null or Error
            // cursor = {
            //     position: Number,
            //     iteration: Number,
            //     length: Number,
            //     cycles: Number,
            //     eof: Boolean,
            //     empty: Boolean,
            //     bof: Boolean,
            //     cr: Boolean,
            //     ref: String
            // }
        },

        // Called before starting a new iteration
        beforeIteration: function (err, cursor) {
            /* Same as arguments for "start" */
        },

        // Called when an iteration is completed
        iteration: function (err, cursor) {
            /* Same as arguments for "start" */
        },

        // Called before running a new Item (check the Postman Collection v2 format for what an Item means)
        beforeItem: function (err, cursor, item) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item
        },

        // Called after completion of an Item
        item: function (err, cursor, item, visualizer) {
            // err, cursor, item: Same as arguments for "beforeItem"

            // visualizer: null, or an object containing the visualizer result that looks like this:
            // {
            //     -- Template processing error
            //     error: <Error>,
            //
            //     -- Data used for template processing
            //     data: <Object>,
            //
            //     -- Processed template
            //     processedTemplate: <String>
            // }
        },

        // Called before running pre-request script(s) (yes, Runtime supports multiple pre-request scripts!)
        beforePrerequest: function (err, cursor, events, item) {
            // err, cursor: Same as arguments for "start"
            // events: Array of sdk.Event objects
            // item: sdk.Item
        },

        // Called after running pre-request script(s)
        prerequest: function (err, cursor, results, item) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item

            // results: Array of objects. Each object looks like this:
            // {
            //     error: Error,
            //     event: sdk.Event,
            //     script: sdk.Script,
            //     result: {
            //         target: 'prerequest',
            //
            //         -- Updated environment
            //         environment: <VariableScope>,
            //
            //         -- Updated globals
            //         globals: <VariableScope>,
            //
            //         data: <Object of data variables>,
            //         return: <Object, contains set-next-request params, etc.>
            //     }
            // }
        },

        // Called before running test script(s)
        beforeTest: function (err, cursor, events, item) {
            // err, cursor: Same as arguments for "start"
            // events: Array of sdk.Event objects
            // item: sdk.Item
        },

        // Called just after running test script(s)
        test: function (err, cursor, results, item) {
            // results: Array of objects. Each object looks like this:
            // {
            //     error: Error,
            //     event: sdk.Event,
            //     script: sdk.Script,
            //     result: {
            //         target: 'test',
            //
            //         -- Updated environment
            //         environment: <VariableScope>,
            //
            //         -- Updated globals
            //         globals: <VariableScope>,
            //
            //         response: <sdk.Response>,
            //         request: <sdk.Request>,
            //         data: <Object of data variables>,
            //         cookies: <Array of "sdk.Cookie" objects>,
            //         tests: <Object>,
            //         return: <Object, contains set-next-request params, etc.>
            //     }
            // }
        },

        // Called just before sending a request
        beforeRequest: function (err, cursor, request, item) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item
            // request: sdk.Request
        },

        // Called just after sending a request; may include request replays
        request: function (err, cursor, response, request, item, cookies, history) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item
            // response: sdk.Response
            // request: sdk.Request
        },

        // Called just after receiving the response, without waiting for the
        // response body or for the request to end.
        // Called once with the response for each request in a collection.
        responseStart: function (err, cursor, response, request, item, cookies, history) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item
            // response: sdk.Response
            // request: sdk.Request
        },

        // Called once with the response for each request in a collection
        response: function (err, cursor, response, request, item, cookies, history) {
            // err, cursor: Same as arguments for "start"
            // item: sdk.Item
            // response: sdk.Response
            // request: sdk.Request
        },

        // Called when an exception occurs
        exception: function (cursor, err) {
            // @param {Object} cursor - A representation of the current run state
            // @param {Error} err - An Error instance with name, message, and type properties
        },

        // Called at the end of a run
        done: function (err) {
            // err: null or Error
            console.log('done');
        },

        // Called any time a console.* function is called in test/pre-request scripts
        console: function (cursor, level, ...logs) {},

        io: function (err, cursor, trace, ...otherArgs) {
            // err, cursor: Same as arguments for "start"

            // trace: An object which looks like this:
            // {
            //     -- Indicates the type of IO event; may be HTTP, File, etc. Any requests sent out as part of
            //     -- auth flows, replays, etc. will show up here.
            //     type: 'http',
            //
            //     -- Indicates what this IO event originated from (collection, auth flows, etc.)
            //     source: 'collection'
            // }

            // otherArgs: Variable number of arguments, specific to the type of the IO event.
            // For the http type, the otherArgs are:
            //     response: sdk.Response()
            //     request: sdk.Request()
            //     cookies: Array of sdk.Cookie()
        }
    });
});
```
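In a headless CI setup, a reporter typically folds the documented `assertions` shape into a summary inside its `assertion` callback. A minimal sketch (the `tallyAssertions` helper is hypothetical, not part of the runtime API):

```javascript
// Fold an `assertions` array (shape documented above: each entry has an
// `error` property that is null on success) into pass/fail counts.
function tallyAssertions(assertions) {
    return assertions.reduce(function (acc, a) {
        a.error ? acc.failed += 1 : acc.passed += 1;

        return acc;
    }, { passed: 0, failed: 0 });
}
```

A CI runner would accumulate these tallies across items and exit non-zero when `failed > 0`.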

1
node_modules/postman-runtime/dist/index.js generated vendored Normal file

File diff suppressed because one or more lines are too long

18
node_modules/postman-runtime/index.js generated vendored Normal file

@@ -0,0 +1,18 @@
/**!
 * @license Copyright 2016 Postdot Technologies, Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
 * an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and limitations under the License.
 */

/**
 * @fileOverview This is the entry point to PostmanCollectionRunner modules.
 * The public structure of the module is defined here.
 */
module.exports = require('./lib');

119
node_modules/postman-runtime/lib/authorizer/apikey.js generated vendored Normal file

@@ -0,0 +1,119 @@
var _ = require('lodash'),

    TARGETS = {
        header: 'header',
        query: 'query'
    };

/**
 * This module negotiates the following:
 *
 * auth: {
 *     key: 'string',
 *     value: 'string',
 *     in: 'string~enum header, query',
 *
 *     // @todo implement:
 *     privateKey: 'string',
 *     privateValue: 'string'
 * }
 *
 * @implements {AuthHandlerInterface}
 */
module.exports = {
    /**
     * @property {AuthHandlerInterface~AuthManifest}
     */
    manifest: {
        info: {
            name: 'apikey',
            version: '0.0.1'
        },
        updates: [
            {
                property: '*',
                type: 'header'
            },
            {
                property: '*',
                type: 'url.param'
            }
        ]
    },

    /**
     * Initializes an item (extracts parameters from intermediate requests if any, etc.)
     * before the actual authorization step.
     *
     * @param {AuthInterface} auth
     * @param {Response} response
     * @param {AuthHandlerInterface~authInitHookCallback} done
     */
    init: function (auth, response, done) {
        done();
    },

    /**
     * Verifies whether the request has the required parameters.
     *
     * @param {AuthInterface} auth
     * @param {AuthHandlerInterface~authPreHookCallback} done
     */
    pre: function (auth, done) {
        return done(null, Boolean(auth.get('key') || auth.get('value')));
    },

    /**
     * Verifies whether the auth succeeded.
     *
     * @param {AuthInterface} auth
     * @param {Response} response
     * @param {AuthHandlerInterface~authPostHookCallback} done
     */
    post: function (auth, response, done) {
        done(null, true);
    },

    /**
     * Signs the request.
     *
     * @param {AuthInterface} auth
     * @param {Request} request
     * @param {AuthHandlerInterface~authSignHookCallback} done
     */
    sign: function (auth, request, done) {
        var target = TARGETS[auth.get('in')] || TARGETS.header,
            key = auth.get('key'),
            value = auth.get('value'),
            lkey = _.lowerCase(key); // needed for case-insensitive header matches

        // either key or value should be present
        if (!(key || value)) {
            return done();
        }

        if (target === TARGETS.header) {
            request.headers.remove(function (header) {
                return header && (_.lowerCase(header.key) === lkey);
            });

            request.headers.add({
                key: key,
                value: value,
                system: true
            });
        }
        else if (target === TARGETS.query) {
            request.url.query.remove(function (query) {
                return query && (query.key === key);
            });

            request.url.query.add({
                key: key,
                value: value,
                system: true
            });
        }

        return done();
    }
};
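The header branch of `sign()` above boils down to remove-then-add. A plain-JS sketch with an array standing in for the SDK's header list (the function name is illustrative; note the real code matches keys with lodash's `lowerCase`, which also strips punctuation, while this sketch uses plain `toLowerCase`):

```javascript
// Remove any existing header with the same (case-insensitively matched)
// key, then append the API-key header flagged as system-added.
function signWithApiKeyHeader(headers, key, value) {
    var lkey = String(key).toLowerCase();

    return headers
        .filter(function (h) { return h.key.toLowerCase() !== lkey; })
        .concat([{ key: key, value: value, system: true }]);
}
```

The `system: true` flag is how runtime distinguishes headers it injected from ones the user authored.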


@@ -0,0 +1,95 @@
var _ = require('lodash'),
    EMPTY = '',

    createAuthInterface;

/**
 * Creates a wrapper around RequestAuth and provides getter and setter helper functions.
 *
 * @constructs AuthInterface
 * @param {RequestAuth} auth
 * @param {Object} protocolProfileBehavior - Protocol profile behaviors
 * @return {AuthInterface}
 * @throws {Error}
 */
createAuthInterface = function (auth, protocolProfileBehavior) {
    if (!(auth && auth.parameters && auth.parameters())) {
        throw new Error('runtime~createAuthInterface: invalid auth');
    }

    return /** @lends AuthInterface.prototype **/ {
        /**
         * @private
         * @property {protocolProfileBehavior} - Protocol profile behaviors
         */
        _protocolProfileBehavior: protocolProfileBehavior || {},

        /**
         * @param {String|Array<String>} keys
         * @return {*} Returns a value for a key, or an object with all keys & values, depending on the input
         * @example
         * get('foo') // bar
         * get(['foo', 'alpha']) // {foo: 'bar', 'alpha': 'beta'}
         */
        get: function (keys) {
            var paramVariable;

            if (_.isString(keys)) {
                paramVariable = auth.parameters().one(keys);

                return paramVariable && paramVariable.get();
            }

            if (_.isArray(keys)) {
                return _.transform(keys, function (paramObject, key) {
                    paramVariable = auth.parameters().one(key);
                    paramVariable && (paramObject[key] = paramVariable.get());

                    return paramObject;
                }, {});
            }

            return undefined;
        },

        /**
         * @param {String|Object} key
         * @param {*} [value]
         * @return {AuthInterface}
         * @example
         * set('foo', 'bar')
         * set({foo: 'bar', 'alpha': 'beta'})
         * @throws {Error}
         */
        set: function (key, value) {
            var modifiedParams = {},
                parameters;

            if (_.isObject(key)) {
                modifiedParams = key;
            }
            else if (_.isString(key)) {
                modifiedParams[key] = value;
            }
            else {
                throw new Error('runtime~AuthInterface: set should be called with `key` as a string or object');
            }

            parameters = auth.parameters();

            _.forEach(modifiedParams, function (value, key) {
                var param = parameters.one(key);

                if (!param) {
                    return parameters.add({ key: key, value: value, system: true });
                }

                // Update if the param is a system property or an empty user property (null, undefined or empty string)
                if (param.system || param.value === EMPTY || _.isNil(param.value) || _.isNaN(param.value)) {
                    return param.update({ key: key, value: value, system: true });
                }
            });

            return this;
        }
    };
};

module.exports = createAuthInterface;
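The precedence rule in `set()` above (system-set or empty values may be overwritten; non-empty user values win) can be sketched with a plain object map. Names here are illustrative, and the `NaN` case from the real code is omitted for brevity:

```javascript
// Apply one key/value to a param map, honoring AuthInterface.set()'s rule:
// add if missing; overwrite only system-set or empty user values.
function setAuthParam(params, key, value) {
    var param = params[key];

    if (!param || param.system || param.value === '' || param.value == null) {
        params[key] = { value: value, system: true };
    }

    return params;
}
```

This is what lets auth handlers fill in defaults without clobbering credentials the user typed in.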

303
node_modules/postman-runtime/lib/authorizer/aws4.js generated vendored Normal file

@@ -0,0 +1,303 @@
var _ = require('lodash'),
aws4 = require('aws4'),
crypto = require('crypto'),
sdk = require('postman-collection'),
urlEncoder = require('postman-url-encoder'),
bodyBuilder = require('../requester/core-body-builder'),
RequestBody = sdk.RequestBody,
X_AMZ_PREFIX = 'X-Amz-',
BODY_HASH_HEADER = 'X-Amz-Content-Sha256',
/**
* Calculates body hash with given algorithm and digestEncoding.
*
* @todo This function can also be used in Digest auth so that it works correctly for urlencoded and file body types
*
* @param {RequestBody} body
* @param {String} algorithm
* @param {String} digestEncoding
* @param {Function} callback
*/
computeBodyHash = function (body, algorithm, digestEncoding, callback) {
if (!(body && algorithm && digestEncoding) || body.isEmpty()) { return callback(); }
var hash = crypto.createHash(algorithm),
originalReadStream,
rawBody,
urlencodedBody,
graphqlBody;
if (body.mode === RequestBody.MODES.raw) {
rawBody = bodyBuilder.raw(body.raw).body;
hash.update(rawBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.urlencoded) {
urlencodedBody = bodyBuilder.urlencoded(body.urlencoded).form;
urlencodedBody = urlEncoder.encodeQueryString(urlencodedBody);
hash.update(urlencodedBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.file) {
originalReadStream = _.get(body, 'file.content');
if (!originalReadStream) {
return callback();
}
return originalReadStream.cloneReadStream(function (err, clonedStream) {
if (err) { return callback(); }
clonedStream.on('data', function (chunk) {
hash.update(chunk);
});
clonedStream.on('end', function () {
callback(hash.digest(digestEncoding));
});
});
}
if (body.mode === RequestBody.MODES.graphql) {
graphqlBody = bodyBuilder.graphql(body.graphql).body;
hash.update(graphqlBody);
return callback(hash.digest(digestEncoding));
}
// @todo: formdata body type requires adding new data to form instead of setting headers for AWS auth.
// Figure out how to do that. See below link:
// AWS auth with formdata: https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html
// ensure that callback is called if body.mode doesn't match with any of the above modes
return callback();
};
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'awsv4',
version: '1.0.0'
},
updates: [
{
property: 'Host',
type: 'header'
},
{
property: 'Authorization',
type: 'header'
},
{
property: 'X-Amz-Date',
type: 'header'
},
{
property: 'X-Amz-Security-Token',
type: 'header'
},
{
property: 'X-Amz-Content-Sha256',
type: 'header'
},
{
property: 'X-Amz-Security-Token',
type: 'url.param'
},
{
property: 'X-Amz-Expires',
type: 'url.param'
},
{
property: 'X-Amz-Date',
type: 'url.param'
},
{
property: 'X-Amz-Algorithm',
type: 'url.param'
},
{
property: 'X-Amz-Credential',
type: 'url.param'
},
{
property: 'X-Amz-SignedHeaders',
type: 'url.param'
},
{
property: 'X-Amz-Signature',
type: 'url.param'
}
]
},
/**
* Initializes a item (fetches all required parameters, etc) before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Checks the item, and fetches any parameters that are not already provided.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
done(null, true);
},
/**
* Verifies whether the request was successful after being sent.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Generates the signature and adds auth data to the request as additional headers/query params.
* AWS v4 auth mandates that a content type header be present in each request.
*
* @param {Request} request request to add auth data
* @param {Object} params data required for auth
* @param {Object} params.credentials Should contain the AWS credentials, "accessKeyId" and "secretAccessKey"
* @param {String} params.host Contains the host name for the request
* @param {String} params.path Contains the complete path, with query string as well, e.g: /something/kane?hi=ho
* @param {String} params.service The name of the AWS service
* @param {String} params.region AWS region
* @param {String} params.method Request method
* @param {String} params.body Stringified request body
* @param {Object} params.headers Each key should be a header key, and the value should be a header value
* @param {Boolean} params.signQuery Add auth data to query params if true, otherwise add it to headers
*/
addAuthDataToRequest: function (request, params) {
var signedData = aws4.sign(params, params.credentials);
if (params.signQuery) {
_.forEach(sdk.Url.parse(signedData.path).query, function (param) {
// only add additional AWS specific params to request
if (_.startsWith(param.key, X_AMZ_PREFIX) && !request.url.query.has(param.key)) {
param.system = true;
request.url.query.add(param);
}
});
}
_.forEach(signedData.headers, function (value, key) {
request.upsertHeader({
key: key,
value: value,
system: true
});
});
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var self = this,
params = auth.get([
'accessKey',
'secretKey',
'sessionToken',
'service',
'region',
'addAuthDataToQuery'
]),
url = urlEncoder.toNodeUrl(request.url),
dataToSign;
// Clean up the request (if needed)
request.removeHeader('Authorization', {ignoreCase: true});
request.removeHeader('X-Amz-Date', {ignoreCase: true});
request.removeHeader('X-Amz-Security-Token', {ignoreCase: true});
request.removeHeader('X-Amz-Content-Sha256', {ignoreCase: true});
// Not removing `X-Amz-Expires` from params here allowing user to override
// default value
request.removeQueryParams([
'X-Amz-Security-Token',
'X-Amz-Date',
'X-Amz-Algorithm',
'X-Amz-Credential',
'X-Amz-SignedHeaders',
'X-Amz-Signature'
]);
dataToSign = {
credentials: {
accessKeyId: params.accessKey,
secretAccessKey: params.secretKey,
sessionToken: params.sessionToken || undefined
},
host: url.host,
path: url.path, // path = pathname + query
service: params.service || 'execute-api', // AWS API Gateway is the default service.
region: params.region || 'us-east-1',
method: request.method,
body: undefined, // no need to give body since we are setting 'X-Amz-Content-Sha256' header
headers: _.transform(request.getHeaders({enabled: true}), function (accumulator, value, key) {
accumulator[key] = value;
}, {}),
signQuery: params.addAuthDataToQuery
};
// Removed the code which was adding content-type header if it is not there in the request. Because
// aws4 does not require content-type header. It is only mandatory to include content-type header in signature
// calculation if it is there in the request.
// Refer: https://docs.aws.amazon.com/AmazonS3/latest/API/sig-v4-header-based-auth.html#canonical-request
// body hash is not required when adding auth data to query params
// @see: https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-query-string-auth.html
if (params.addAuthDataToQuery) {
self.addAuthDataToRequest(request, dataToSign);
return done();
}
// aws4 module can't calculate body hash for body with ReadStream.
// So we calculate it our self and set 'X-Amz-Content-Sha256' header which will be used by aws4 module
// to calculate the signature.
computeBodyHash(request.body, 'sha256', 'hex', function (bodyHash) {
if (bodyHash) {
request.upsertHeader({
key: BODY_HASH_HEADER,
value: bodyHash,
system: true
});
dataToSign.headers[BODY_HASH_HEADER] = bodyHash;
}
self.addAuthDataToRequest(request, dataToSign);
return done();
});
}
};

77
node_modules/postman-runtime/lib/authorizer/basic.js generated vendored Normal file

@@ -0,0 +1,77 @@
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'basic',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Verifies whether the request has valid basic auth credentials (which is always).
* Sanitizes the auth parameters if needed.
*
* @todo - add support for prompting a user for basic auth credentials if not already provided
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
done(null, true);
},
/**
* Verifies whether the basic auth succeeded.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var username = auth.get('username') || '',
password = auth.get('password') || '';
request.removeHeader('Authorization', {ignoreCase: true});
request.addHeader({
key: 'Authorization',
value: 'Basic ' + Buffer.from(`${username}:${password}`, 'utf8').toString('base64'),
system: true
});
return done();
}
};
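The `sign` hook above constructs the header value per RFC 7617: base64 of `username:password`, prefixed with `Basic `. A standalone sketch of the same encoding (the function name `basicAuthHeader` is hypothetical):

```javascript
// Builds a Basic Authorization header value the same way the sign() hook does:
// base64-encode "username:password" and prefix with "Basic ".
function basicAuthHeader(username, password) {
    var token = Buffer.from(`${username}:${password}`, 'utf8').toString('base64');
    return 'Basic ' + token;
}

console.log(basicAuthHeader('user', 'pass')); // → Basic dXNlcjpwYXNz
```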

83
node_modules/postman-runtime/lib/authorizer/bearer.js generated vendored Normal file

@@ -0,0 +1,83 @@
var BEARER_AUTH_PREFIX = 'Bearer ';
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'bearer',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done();
},
/**
* Verifies whether the request has required parameters
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
return done(null, Boolean(auth.get('token')));
},
/**
* Verifies whether the auth succeeded
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Signs the request
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var token = auth.get('token');
if (!token) {
return done(); // Nothing to do if required parameters are not present.
}
// @TODO Should we support adding to query params and/or body also?
// According to the RFC#6750 they are supported but not recommended!
request.removeHeader('Authorization', {ignoreCase: true});
request.addHeader({
key: 'Authorization',
value: BEARER_AUTH_PREFIX + token,
system: true
});
return done();
}
};

498
node_modules/postman-runtime/lib/authorizer/digest.js generated vendored Normal file

@@ -0,0 +1,498 @@
var _ = require('lodash'),
crypto = require('crypto'),
urlEncoder = require('postman-url-encoder'),
RequestBody = require('postman-collection').RequestBody,
bodyBuilder = require('../requester/core-body-builder'),
EMPTY = '',
ONE = '00000001',
DISABLE_RETRY_REQUEST = 'disableRetryRequest',
WWW_AUTHENTICATE = 'www-authenticate',
DIGEST_PREFIX = 'Digest ',
QOP = 'qop',
AUTH = 'auth',
COLON = ':',
QUOTE = '"',
SESS = '-sess',
AUTH_INT = 'auth-int',
AUTHORIZATION = 'Authorization',
MD5_SESS = 'MD5-sess',
ASCII_SOURCE = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789',
ASCII_SOURCE_LENGTH = ASCII_SOURCE.length,
USERNAME_EQUALS_QUOTE = 'username="',
REALM_EQUALS_QUOTE = 'realm="',
NONCE_EQUALS_QUOTE = 'nonce="',
URI_EQUALS_QUOTE = 'uri="',
ALGORITHM_EQUALS_QUOTE = 'algorithm="',
CNONCE_EQUALS_QUOTE = 'cnonce="',
RESPONSE_EQUALS_QUOTE = 'response="',
OPAQUE_EQUALS_QUOTE = 'opaque="',
QOP_EQUALS = 'qop=',
NC_EQUALS = 'nc=',
ALGO = {
MD5: 'MD5',
MD5_SESS: 'MD5-sess',
SHA_256: 'SHA-256',
SHA_256_SESS: 'SHA-256-sess',
SHA_512_256: 'SHA-512-256',
SHA_512_256_SESS: 'SHA-512-256-sess'
},
AUTH_PARAMETERS = [
'algorithm',
'username',
'realm',
'password',
'method',
'nonce',
'nonceCount',
'clientNonce',
'opaque',
'qop',
'uri'
],
nonceRegex = /nonce="([^"]*)"/,
realmRegex = /realm="([^"]*)"/,
qopRegex = /qop="([^"]*)"/,
opaqueRegex = /opaque="([^"]*)"/,
_extractField,
SHA512_256,
nodeCrypto;
// Current Electron version(7.2.3) in Postman app uses OpenSSL 1.1.0
// which don't support `SHA-512-256`. Use external `js-sha512` module
// to handle this case.
if (!_.includes(crypto.getHashes(), 'sha512-256')) {
SHA512_256 = require('js-sha512').sha512_256;
nodeCrypto = crypto;
// create a wrapper class with similar interface to Node's crypto and use jsSHA
// to support SHA512-256 algorithm
crypto = function () {
this._hash = SHA512_256.create();
};
_.assign(crypto.prototype, {
update: function (data) {
this._hash.update(data);
return this;
},
digest: function () {
// we only need 'hex' digest for this auth
return this._hash.hex();
}
});
_.assign(crypto, {
createHash: function (hashAlgo) {
// return hash from js-sha for SHA512-256
if (hashAlgo === 'sha512-256') {
return new crypto();
}
// return Node's hash otherwise
return nodeCrypto.createHash(hashAlgo);
}
});
}
/**
* Generates a random string of given length
*
* @todo Move this to util.js. After moving use that for hawk auth too
* @param {Number} length
*/
function randomString (length) {
length = length || 6;
var result = [],
i;
for (i = 0; i < length; i++) {
result[i] = ASCII_SOURCE[(Math.random() * ASCII_SOURCE_LENGTH) | 0];
}
return result.join(EMPTY);
}
/**
* Extracts a Digest Auth field from a WWW-Authenticate header value using a given regexp.
*
* @param {String} string
* @param {RegExp} regexp
* @private
*/
_extractField = function (string, regexp) {
var match = string.match(regexp);
return match ? match[1] : EMPTY;
};
/**
 * Returns the 'www-authenticate' header for Digest auth. Since a server can support more than one auth scheme,
* there can be more than one header with the same key. So need to loop over and check each one.
*
* @param {VariableList} headers
* @private
*/
function _getDigestAuthHeader (headers) {
return headers.find(function (property) {
return (property.key.toLowerCase() === WWW_AUTHENTICATE) && (_.startsWith(property.value, DIGEST_PREFIX));
});
}
/**
* Returns hex encoded hash of given data using given algorithm.
*
* @param {String} data string to calculate hash
* @param {String} algorithm hash algorithm
* @returns {String} hex encoded hash of given data
*/
function getHash (data, algorithm) {
return crypto.createHash(algorithm).update(data || EMPTY).digest('hex');
}
/**
* Calculates body hash with given algorithm and digestEncoding.
*
* @param {RequestBody} body Request body
* @param {String} algorithm Hash algorithm to use
* @param {String} digestEncoding Encoding of the hash
* @param {Function} callback Callback function that will be called with body hash
*/
function computeBodyHash (body, algorithm, digestEncoding, callback) {
if (!(algorithm && digestEncoding)) { return callback(); }
var hash = crypto.createHash(algorithm),
originalReadStream,
rawBody,
graphqlBody,
urlencodedBody;
// if body is not available, return hash of empty string
if (!body || body.isEmpty()) {
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.raw) {
rawBody = bodyBuilder.raw(body.raw).body;
hash.update(rawBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.urlencoded) {
urlencodedBody = bodyBuilder.urlencoded(body.urlencoded).form;
urlencodedBody = urlEncoder.encodeQueryString(urlencodedBody);
hash.update(urlencodedBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.file) {
originalReadStream = _.get(body, 'file.content');
if (!originalReadStream) {
return callback();
}
return originalReadStream.cloneReadStream(function (err, clonedStream) {
if (err) { return callback(); }
clonedStream.on('data', function (chunk) {
hash.update(chunk);
});
clonedStream.on('end', function () {
callback(hash.digest(digestEncoding));
});
});
}
if (body.mode === RequestBody.MODES.graphql) {
graphqlBody = bodyBuilder.graphql(body.graphql).body;
hash.update(graphqlBody);
return callback(hash.digest(digestEncoding));
}
// @todo: Figure out a way to calculate hash for formdata body type.
// ensure that callback is called if body.mode doesn't match with any of the above modes
return callback();
}
/**
 * All the auth definition parameters excluding username and password should be stored and reused.
* @todo The current implementation would fail for the case when two requests to two different hosts inherits the same
* auth. In that case a retry would not be attempted for the second request (since all the parameters would be present
* in the auth definition though invalid).
*
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'digest',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
},
{
property: 'nonce',
type: 'auth'
},
{
property: 'realm',
type: 'auth'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Checks whether the given item has all the required parameters in its request.
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
// ensure that all dynamic parameter values are present in the parameters
// if even one is absent, we return false.
done(null, Boolean(auth.get('nonce') && auth.get('realm')));
},
/**
* Verifies whether the request was successfully authorized after being sent.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
if (auth.get(DISABLE_RETRY_REQUEST) || !response) {
return done(null, true);
}
var code,
realm,
nonce,
qop,
opaque,
authHeader,
authParams = {};
code = response.code;
authHeader = _getDigestAuthHeader(response.headers);
// If code is forbidden or unauthorized, and an auth header exists,
// we can extract the realm & the nonce, and replay the request.
// todo: add response.is4XX, response.is5XX, etc in the SDK.
if ((code === 401 || code === 403) && authHeader) {
nonce = _extractField(authHeader.value, nonceRegex);
realm = _extractField(authHeader.value, realmRegex);
qop = _extractField(authHeader.value, qopRegex);
opaque = _extractField(authHeader.value, opaqueRegex);
authParams.nonce = nonce;
authParams.realm = realm;
opaque && (authParams.opaque = opaque);
qop && (authParams.qop = qop);
if (authParams.qop || auth.get(QOP)) {
authParams.clientNonce = randomString(8);
authParams.nonceCount = ONE;
}
// if all the auth parameters sent by server were already present in auth definition then we do not retry
if (_.every(authParams, function (value, key) { return auth.get(key); })) {
return done(null, true);
}
auth.set(authParams);
return done(null, false);
}
done(null, true);
},
/**
* Computes the Digest Authentication header from the given parameters.
*
* @param {Object} params
* @param {String} params.algorithm
* @param {String} params.username
* @param {String} params.realm
* @param {String} params.password
* @param {String} params.method
* @param {String} params.nonce
* @param {String} params.nonceCount
* @param {String} params.clientNonce
* @param {String} params.opaque
* @param {String} params.qop
* @param {String} params.uri
* @returns {String}
*/
computeHeader: function (params) {
var algorithm = params.algorithm,
hashAlgo = params.hashAlgo,
username = params.username,
realm = params.realm,
password = params.password,
method = params.method,
nonce = params.nonce,
nonceCount = params.nonceCount,
clientNonce = params.clientNonce,
opaque = params.opaque,
qop = params.qop,
uri = params.uri,
// RFC defined terms, http://tools.ietf.org/html/rfc2617#section-3
A0,
A1,
A2,
hashA1,
hashA2,
reqDigest,
headerParams;
if (_.endsWith(algorithm, SESS)) {
A0 = getHash(username + COLON + realm + COLON + password, hashAlgo);
A1 = A0 + COLON + nonce + COLON + clientNonce;
}
else {
A1 = username + COLON + realm + COLON + password;
}
if (qop === AUTH_INT) {
A2 = method + COLON + uri + COLON + params.bodyhash;
}
else {
A2 = method + COLON + uri;
}
hashA1 = getHash(A1, hashAlgo);
hashA2 = getHash(A2, hashAlgo);
if (qop === AUTH || qop === AUTH_INT) {
reqDigest = getHash([hashA1, nonce, nonceCount, clientNonce, qop, hashA2].join(COLON), hashAlgo);
}
else {
reqDigest = getHash([hashA1, nonce, hashA2].join(COLON), hashAlgo);
}
headerParams = [USERNAME_EQUALS_QUOTE + username + QUOTE,
REALM_EQUALS_QUOTE + realm + QUOTE,
NONCE_EQUALS_QUOTE + nonce + QUOTE,
URI_EQUALS_QUOTE + uri + QUOTE
];
algorithm && headerParams.push(ALGORITHM_EQUALS_QUOTE + algorithm + QUOTE);
if (qop === AUTH || qop === AUTH_INT) {
headerParams.push(QOP_EQUALS + qop);
}
if (qop === AUTH || qop === AUTH_INT || algorithm === MD5_SESS) {
nonceCount && headerParams.push(NC_EQUALS + nonceCount);
headerParams.push(CNONCE_EQUALS_QUOTE + clientNonce + QUOTE);
}
headerParams.push(RESPONSE_EQUALS_QUOTE + reqDigest + QUOTE);
opaque && headerParams.push(OPAQUE_EQUALS_QUOTE + opaque + QUOTE);
return DIGEST_PREFIX + headerParams.join(', ');
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var self = this,
params = auth.get(AUTH_PARAMETERS),
url = urlEncoder.toNodeUrl(request.url),
header;
if (!params.username || !params.realm) {
return done(); // Nothing to do if required parameters are not present.
}
request.removeHeader(AUTHORIZATION, {ignoreCase: true});
params.method = request.method;
params.uri = url.path;
switch (params.algorithm) {
case ALGO.SHA_256:
case ALGO.SHA_256_SESS:
params.hashAlgo = 'sha256';
break;
case ALGO.MD5:
case ALGO.MD5_SESS:
case EMPTY:
case undefined:
case null:
params.algorithm = params.algorithm || ALGO.MD5;
params.hashAlgo = 'md5';
break;
case ALGO.SHA_512_256:
case ALGO.SHA_512_256_SESS:
params.hashAlgo = 'sha512-256';
break;
default:
return done(new Error(`Unsupported digest algorithm: ${params.algorithm}`));
}
// calculate body hash for qop='auth-int'
if (params.qop === AUTH_INT) {
return computeBodyHash(request.body, params.hashAlgo, 'hex', function (bodyhash) {
params.bodyhash = bodyhash;
header = self.computeHeader(params);
request.addHeader({
key: AUTHORIZATION,
value: header,
system: true
});
return done();
});
}
header = self.computeHeader(params);
request.addHeader({
key: AUTHORIZATION,
value: header,
system: true
});
return done();
}
};

316
node_modules/postman-runtime/lib/authorizer/edgegrid.js generated vendored Normal file

@@ -0,0 +1,316 @@
/**
* @fileOverview
*
* Implements the EdgeGrid authentication method.
* Specification document: https://developer.akamai.com/legacy/introduction/Client_Auth.html
 * Sample implementation by Akamai: https://github.com/akamai/AkamaiOPEN-edgegrid-node
*/
var _ = require('lodash'),
uuid = require('uuid/v4'),
crypto = require('crypto'),
sdk = require('postman-collection'),
RequestBody = sdk.RequestBody,
urlEncoder = require('postman-url-encoder'),
bodyBuilder = require('../requester/core-body-builder'),
EMPTY = '',
COLON = ':',
UTC_OFFSET = '+0000',
ZERO = '0',
DATE_TIME_SEPARATOR = 'T',
TAB = '\t',
SPACE = ' ',
SLASH = '/',
STRING = 'string',
SIGNING_ALGORITHM = 'EG1-HMAC-SHA256 ',
AUTHORIZATION = 'Authorization',
/**
* Returns current timestamp in the format described in EdgeGrid specification (yyyyMMddTHH:mm:ss+0000)
*
* @returns {String} UTC timestamp in format yyyyMMddTHH:mm:ss+0000
*/
getTimestamp = function () {
var date = new Date();
return date.getUTCFullYear() +
_.padStart(date.getUTCMonth() + 1, 2, ZERO) +
_.padStart(date.getUTCDate(), 2, ZERO) +
DATE_TIME_SEPARATOR +
_.padStart(date.getUTCHours(), 2, ZERO) +
COLON +
_.padStart(date.getUTCMinutes(), 2, ZERO) +
COLON +
_.padStart(date.getUTCSeconds(), 2, ZERO) +
UTC_OFFSET;
},
/**
* Creates a String containing a tab delimited set of headers.
*
* @param {String[]} headersToSign Headers to include in signature
* @param {Object} headers Request headers
* @returns {String} Canonicalized headers
*/
canonicalizeHeaders = function (headersToSign, headers) {
var formattedHeaders = [],
headerValue;
headersToSign.forEach(function (headerName) {
if (typeof headerName !== STRING) { return; }
// trim the header name to remove extra spaces from user input
headerName = headerName.trim().toLowerCase();
headerValue = headers[headerName];
// should not include empty headers as per the specification
if (typeof headerValue !== STRING || headerValue === EMPTY) { return; }
formattedHeaders.push(`${headerName}:${headerValue.trim().replace(/\s+/g, SPACE)}`);
});
return formattedHeaders.join(TAB);
},
/**
* Returns base64 encoding of the SHA256 HMAC of given data signed with given key
*
* @param {String} data Data to sign
* @param {String} key Key to use while signing the data
* @returns {String} Base64 encoded signature
*/
base64HmacSha256 = function (data, key) {
var encrypt = crypto.createHmac('sha256', key);
encrypt.update(data);
return encrypt.digest('base64');
},
/**
* Calculates body hash with given algorithm and digestEncoding.
*
* @param {RequestBody} body Request body
* @param {String} algorithm Hash algorithm to use
* @param {String} digestEncoding Encoding of the hash
* @param {Function} callback Callback function that will be called with body hash
*/
computeBodyHash = function (body, algorithm, digestEncoding, callback) {
if (!(body && algorithm && digestEncoding) || body.isEmpty()) { return callback(); }
var hash = crypto.createHash(algorithm),
originalReadStream,
rawBody,
urlencodedBody,
graphqlBody;
if (body.mode === RequestBody.MODES.raw) {
rawBody = bodyBuilder.raw(body.raw).body;
hash.update(rawBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.urlencoded) {
urlencodedBody = bodyBuilder.urlencoded(body.urlencoded).form;
urlencodedBody = urlEncoder.encodeQueryString(urlencodedBody);
hash.update(urlencodedBody);
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.file) {
originalReadStream = _.get(body, 'file.content');
if (!originalReadStream) {
return callback();
}
return originalReadStream.cloneReadStream(function (err, clonedStream) {
if (err) { return callback(); }
clonedStream.on('data', function (chunk) {
hash.update(chunk);
});
clonedStream.on('end', function () {
callback(hash.digest(digestEncoding));
});
});
}
if (body.mode === RequestBody.MODES.graphql) {
graphqlBody = bodyBuilder.graphql(body.graphql).body;
hash.update(graphqlBody);
return callback(hash.digest(digestEncoding));
}
// @todo: Figure out a way to calculate hash for formdata body type.
// ensure that callback is called if body.mode doesn't match with any of the above modes
return callback();
};
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'edgegrid',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
}
]
},
/**
* Initializes an item (fetches all required parameters, etc.) before the actual authorization step.
*
* @param {AuthInterface} auth AuthInterface instance created with request auth
* @param {Response} response Response of intermediate request (if any)
* @param {AuthHandlerInterface~authInitHookCallback} done Callback function called with error as first argument
*/
init: function (auth, response, done) {
done(null);
},
/**
* Checks the item, and fetches any parameters that are not already provided.
*
* @param {AuthInterface} auth AuthInterface instance created with request auth
* @param {AuthHandlerInterface~authPreHookCallback} done Callback function called with error, success and request
*/
pre: function (auth, done) {
// only check required auth params here
done(null, Boolean(auth.get('accessToken') && auth.get('clientToken') && auth.get('clientSecret')));
},
/**
* Verifies whether the request was successful after being sent.
*
* @param {AuthInterface} auth AuthInterface instance created with request auth
* @param {Response} response Response of the request
* @param {AuthHandlerInterface~authPostHookCallback} done Callback function called with error and success
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Generates the signature, and returns the Authorization header.
*
* @param {Object} params Auth parameters to use in header calculation
* @param {String} params.accessToken Access token provided by service provider
* @param {String} params.clientToken Client token provided by service provider
* @param {String} params.clientSecret Client secret provided by service provider
* @param {String} params.nonce Nonce to include in authorization header
* @param {String} params.timestamp Timestamp as defined in protocol specification
* @param {String} [params.bodyHash] Base64-encoded SHA256 hash of request body for POST request
* @param {Object[]} params.headers Request headers
* @param {String[]} params.headersToSign Ordered list of headers to include in signature
* @param {String} params.method Request method
* @param {Url} params.url Node's URL object
* @returns {String} Authorization header
*/
computeHeader: function (params) {
var authHeader = SIGNING_ALGORITHM,
signingKey = base64HmacSha256(params.timestamp, params.clientSecret),
dataToSign;
authHeader += `client_token=${params.clientToken};`;
authHeader += `access_token=${params.accessToken};`;
authHeader += `timestamp=${params.timestamp};`;
authHeader += `nonce=${params.nonce};`;
dataToSign = [
params.method,
// trim to convert 'http:' from Node's URL object to 'http'
_.trimEnd(params.url.protocol, COLON),
params.baseURL || params.url.host,
params.url.path || SLASH,
canonicalizeHeaders(params.headersToSign, params.headers),
params.bodyHash || EMPTY,
authHeader
].join(TAB);
return authHeader + 'signature=' + base64HmacSha256(dataToSign, signingKey);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth AuthInterface instance created with request auth
* @param {Request} request Request to be sent
* @param {AuthHandlerInterface~authSignHookCallback} done Callback function
*/
sign: function (auth, request, done) {
var params = auth.get([
'accessToken',
'clientToken',
'clientSecret',
'baseURL',
'nonce',
'timestamp',
'headersToSign'
]),
url = urlEncoder.toNodeUrl(request.url),
self = this;
if (!(params.accessToken && params.clientToken && params.clientSecret)) {
return done(); // Nothing to do if required parameters are not present.
}
request.removeHeader(AUTHORIZATION, {ignoreCase: true});
// Extract host from provided baseURL.
params.baseURL = params.baseURL && urlEncoder.toNodeUrl(params.baseURL).host;
params.nonce = params.nonce || uuid();
params.timestamp = params.timestamp || getTimestamp();
params.url = url;
params.method = request.method;
// ensure that headers are case-insensitive as specified in the documentation
params.headers = request.getHeaders({enabled: true, ignoreCase: true});
if (typeof params.headersToSign === STRING) {
params.headersToSign = params.headersToSign.split(',');
}
else if (!_.isArray(params.headersToSign)) {
params.headersToSign = [];
}
// only calculate body hash for POST requests according to specification
if (request.method === 'POST') {
return computeBodyHash(request.body, 'sha256', 'base64', function (bodyHash) {
params.bodyHash = bodyHash;
request.addHeader({
key: AUTHORIZATION,
value: self.computeHeader(params),
system: true
});
return done();
});
}
request.addHeader({
key: AUTHORIZATION,
value: self.computeHeader(params),
system: true
});
return done();
}
};

264
node_modules/postman-runtime/lib/authorizer/hawk.js generated vendored Normal file

@@ -0,0 +1,264 @@
var url = require('url'),
_ = require('lodash'),
crypto = require('crypto'),
Hawk = require('postman-request/lib/hawk'),
RequestBody = require('postman-collection').RequestBody,
bodyBuilder = require('../requester/core-body-builder'),
urlEncoder = require('postman-url-encoder'),
ASCII_SOURCE = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789',
ASCII_SOURCE_LENGTH = ASCII_SOURCE.length,
AUTHORIZATION = 'Authorization',
EMPTY = '';
/**
* Generates a random string of given length (useful for nonce generation, etc).
*
* @param {Number} length
*/
function randomString (length) {
length = length || 6;
var result = [],
i;
for (i = 0; i < length; i++) {
result[i] = ASCII_SOURCE[(Math.random() * ASCII_SOURCE_LENGTH) | 0];
}
return result.join(EMPTY);
}
/**
* Calculates body hash with given algorithm and digestEncoding.
* REFER: https://github.com/postmanlabs/postman-request/blob/master/lib/hawk.js#L12
*
* @param {RequestBody} body
* @param {String} algorithm
* @param {String} digestEncoding
* @param {String} contentType
* @param {Function} callback
*/
function computeBodyHash (body, algorithm, digestEncoding, contentType, callback) {
if (!(body && algorithm && digestEncoding) || body.isEmpty()) { return callback(); }
var hash = crypto.createHash(algorithm),
originalReadStream,
rawBody,
urlencodedBody,
graphqlBody;
hash.update('hawk.1.payload\n');
hash.update((contentType ? contentType.split(';')[0].trim().toLowerCase() : '') + '\n');
if (body.mode === RequestBody.MODES.raw) {
rawBody = bodyBuilder.raw(body.raw).body;
hash.update(rawBody);
hash.update('\n');
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.urlencoded) {
urlencodedBody = bodyBuilder.urlencoded(body.urlencoded).form;
urlencodedBody = urlEncoder.encodeQueryString(urlencodedBody);
hash.update(urlencodedBody);
hash.update('\n');
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.file) {
originalReadStream = _.get(body, 'file.content');
if (!originalReadStream) {
return callback();
}
return originalReadStream.cloneReadStream(function (err, clonedStream) {
if (err) { return callback(); }
clonedStream.on('data', function (chunk) {
hash.update(chunk);
});
clonedStream.on('end', function () {
hash.update('\n');
callback(hash.digest(digestEncoding));
});
});
}
if (body.mode === RequestBody.MODES.graphql) {
graphqlBody = bodyBuilder.graphql(body.graphql).body;
hash.update(graphqlBody);
hash.update('\n');
return callback(hash.digest(digestEncoding));
}
// @todo: Figure out a way to calculate hash for formdata body type.
// ensure that callback is called if body.mode doesn't match with any of the above modes
return callback();
}
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'hawk',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
},
{
property: 'nonce',
type: 'auth'
},
{
property: 'timestamp',
type: 'auth'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Checks the item, and fetches any parameters that are not already provided.
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
!auth.get('nonce') && auth.set('nonce', randomString(6));
!_.parseInt(auth.get('timestamp')) && auth.set('timestamp', Math.floor(Date.now() / 1e3));
done(null, true);
},
/**
* Verifies whether the request was successfully authorized after being sent.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Computes signature and Auth header for a request.
*
* @param {Object} params
* @param {Object} params.credentials Contains hawk auth credentials, "id", "key" and "algorithm"
* @param {String} params.nonce
* @param {String} params.ext Extra data that may be associated with the request.
* @param {String} params.app Application ID used in Oz authorization protocol
* @param {String} params.dlg Delegation information (used in the Oz protocol)
* @param {String} params.user User id
* @param {String} params.url Complete request URL
* @param {String} params.method Request method
*
* @returns {*}
*/
computeHeader: function (params) {
return Hawk.header(url.parse(params.url), params.method, params);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var params = auth.get([
'authId',
'authKey',
'algorithm',
'nonce',
'timestamp',
'extraData',
'app',
'delegation',
'user',
'includePayloadHash'
]),
contentType = request.headers.get('content-type'),
self = this,
signRequest = function (bodyHash) {
// force toString to add a protocol to the URL.
var url = urlEncoder.toNodeUrl(request.url),
result = self.computeHeader({
credentials: {
id: params.authId,
key: params.authKey,
algorithm: params.algorithm
},
nonce: params.nonce,
timestamp: params.timestamp,
ext: params.extraData,
app: params.app,
dlg: params.delegation,
user: params.user,
url: url.href,
method: request.method,
hash: bodyHash
});
request.addHeader({
key: AUTHORIZATION,
value: result,
system: true
});
return done();
};
if (!params.authId || !params.authKey) {
return done(); // Nothing to do if required parameters are not present.
}
request.removeHeader(AUTHORIZATION, {ignoreCase: true});
        // @note: Payload verification is optional in hawk auth according to the specification (see link below).
        // If the user opts in to payload verification, the `Content-Type` header must be specified explicitly,
        // otherwise authentication might fail: we automatically add a `Content-Type` header after the auth
        // handlers run, which is not accounted for while calculating the payload hash for hawk auth.
// documentation: https://github.com/hapijs/hawk#payload-validation
// issue: https://github.com/postmanlabs/postman-app-support/issues/6550
//
// @todo: Change flow of auto adding `Content-Type` header to happen before auth handlers
if (!params.includePayloadHash) {
return signRequest(); // sign request without calculating payload hash
}
computeBodyHash(request.body, params.algorithm, 'base64', contentType, signRequest);
}
};

node_modules/postman-runtime/lib/authorizer/index.js generated vendored Normal file
@@ -0,0 +1,239 @@
var _ = require('lodash'),
sdk = require('postman-collection'),
createAuthInterface = require('./auth-interface'),
AUTH_TYPE_PROP = '__auth_type',
AuthLoader,
authorizeRequest;
/**
* This object manages loading and finding Handlers for auth.
*
* @type AuthLoader
*/
AuthLoader = {
/**
* Houses list of available Authentication handlers.
*
* @property {Object}
*/
handlers: {},
/**
* Finds the Handler for an Auth type.
*
* @param name
*
* @returns {AuthHandler}
*/
getHandler: function (name) {
return AuthLoader.handlers[name];
},
/**
* Adds a Handler for use with given Auth type.
*
* @param Handler
* @param name
*/
addHandler: function (Handler, name) {
if (!_.isFunction(Handler.init)) {
throw new Error('The handler for "' + name + '" does not have an "init" function, which is necessary');
}
if (!_.isFunction(Handler.pre)) {
throw new Error('The handler for "' + name + '" does not have a "pre" function, which is necessary');
}
if (!_.isFunction(Handler.post)) {
throw new Error('The handler for "' + name + '" does not have a "post" function, which is necessary');
}
if (!_.isFunction(Handler.sign)) {
throw new Error('The handler for "' + name + '" does not have a "sign" function, which is necessary');
}
Object.defineProperty(Handler, AUTH_TYPE_PROP, {
value: name,
configurable: false,
enumerable: false,
writable: false
});
AuthLoader.handlers[name] = Handler;
},
/**
* Removes the Handler for the Auth type.
*
* @param name
*/
removeHandler: function (name) {
AuthLoader.handlers[name] && (delete AuthLoader.handlers[name]);
}
};
// Create a Handler from each Signer that the SDK provides. Basically, we augment the signers with extra
// helper functions which take over the job of preparing a request for signing.
_.forEach({
noauth: require('./noauth'),
awsv4: require('./aws4'),
basic: require('./basic'),
bearer: require('./bearer'),
digest: require('./digest'),
hawk: require('./hawk'),
oauth1: require('./oauth1'),
oauth2: require('./oauth2'),
ntlm: require('./ntlm'),
apikey: require('./apikey'),
edgegrid: require('./edgegrid')
}, AuthLoader.addHandler);
/**
* Creates a copy of request, with the appropriate auth headers or parameters added.
*
* @note This function does not take care of resolving variables.
*
* @param {Request} request
* @param done
*
* @returns {Request}
*/
authorizeRequest = function (request, done) {
if (!request.auth) {
return done();
}
var clonedReq = new sdk.Request(request.toJSON()),
auth = clonedReq.auth,
authInterface = createAuthInterface(auth),
handler = AuthLoader.getHandler(auth.type);
if (handler) {
handler.sign(authInterface, clonedReq, function () { return done(null, clonedReq); });
}
else {
return done(new Error('runtime~authorizeRequest: could not find handler for auth type ' + auth.type));
}
};
module.exports = {
AuthLoader: AuthLoader,
authorizeRequest: authorizeRequest
};
// Interface
/**
* Interface for implementing auth handlers
*
* @interface AuthHandlerInterface
*/
// Interface functions
/**
 * Defines the behaviour of an Auth Handler. This allows any changes the Handler
 * will make to be statically analysed ahead of time.
*
* @member {AuthHandlerInterface~AuthManifest} AuthHandlerInterface#manifest
*/
/**
* This hook decides whether all the required parameters are present in the auth or not.
* What happens next is dependent upon how the `done` callback is called.
* Check {@link AuthHandlerInterface~authPreHookCallback} for all the possible ways the callback can be called.
*
* @function
* @name AuthHandlerInterface#pre
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
* Callback function which takes error, success, and request as arguments
*/
/**
* This hook is called with the response from the intermediate request, which was requested from the
* [pre]{@link AuthHandlerInterface#pre} hook.
* Here the `auth` can be modified using the response. After this [pre]{@link AuthHandlerInterface#pre} hook will be
* called again to verify the required parameters.
*
* @function
* @name AuthHandlerInterface#init
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done Callback function which takes error as the only argument
*/
/**
* This hook signs the `request` using the `auth`.
*
* @function
* @name AuthHandlerInterface#sign
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done Callback function which takes error as the only argument
*/
/**
* This hook is called after the request is made. It receives the response using which it can determine whether
* it was a failure or success. It can also modify the `auth` and ask to replay the `request`.
* For this it has to call the [done]{@link AuthHandlerInterface~authPostHookCallback} callback with `success` as false.
*
* @function
* @name AuthHandlerInterface#post
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done Callback function which takes error and success as arguments
*/
// Callbacks
/**
* This callback is called in the `pre` hook of the auth handler
* Depending on what parameters are passed in this callback, one of the following flows will be executed:
* 1. return (err): The request will be stopped and the error will be bubbled up
* 2. return (null, true): The request will be signed and sent
* 3. return (null, false): The request will be sent without being signed
* 4. return (null, false, `request`):
* - send the intermediate request
* - invoke the auth's [init]{@link AuthHandlerInterface#init} hook with the response of the intermediate request
* - invoke the auth's [pre]{@link AuthHandlerInterface#pre} hook
* @callback AuthHandlerInterface~authPreHookCallback
* @param {?Error} err
* @param {Boolean} success Defines whether the [pre]{@link AuthHandlerInterface#pre} hook was successful.
* @param {Request~definition|String} [request] It can be either request definition or request URL
*/
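The four flows enumerated above can be made concrete with a small dispatcher over the `(err, success, request)` callback signature; `interpretPreHook` is purely illustrative and not part of runtime.

```javascript
// Illustrative mapping of the pre-hook callback arguments to the four
// documented flows.
function interpretPreHook (err, success, request) {
    if (err) { return 'abort'; }              // 1. stop; bubble the error up
    if (success) { return 'sign-and-send'; }  // 2. sign the request, then send
    if (!request) { return 'send-unsigned'; } // 3. send without signing
    return 'send-intermediate';               // 4. intermediate request, then init + pre
}
```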
/**
* This callback is called in the `init` hook of the auth handler
* @callback AuthHandlerInterface~authInitHookCallback
* @param {?Error} err
*/
/**
* This callback is called in the `sign` hook of the auth handler
* @callback AuthHandlerInterface~authSignHookCallback
* @param {?Error} err
*/
/**
* This callback is called in the `post` hook of the auth handler
* @callback AuthHandlerInterface~authPostHookCallback
* @param {?Error} err
* @param {Boolean} success Defines whether the request was successful or not. If not, it will be replayed.
*/
/**
* Structure of an Auth Manifest. See {@link AuthHandlerInterface#manifest} for description.
*
* @typedef {Object} AuthHandlerInterface~AuthManifest
*
* @property {Object} info
* @property {String} info.name
* @property {String} info.version
* @property {Array<Object>} updates
*/
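Putting the interface together, a minimal handler satisfying the four required hooks can be sketched as below; `demoLoader` is a hypothetical stand-in that mirrors `AuthLoader.addHandler`'s validation, not the real module.

```javascript
// Hedged sketch: a no-op handler implementing the four required hooks.
var demoHandler = {
    manifest: { info: { name: 'demo', version: '1.0.0' }, updates: [] },
    init: function (auth, response, done) { done(null); },
    pre: function (auth, done) { done(null, true); },            // flow 2: sign and send
    post: function (auth, response, done) { done(null, true); }, // success, no replay
    sign: function (auth, request, done) { return done(); }
};

// Hypothetical loader mirroring the hook validation performed by AuthLoader.
var demoLoader = {
    handlers: {},
    addHandler: function (Handler, name) {
        ['init', 'pre', 'post', 'sign'].forEach(function (fn) {
            if (typeof Handler[fn] !== 'function') {
                throw new Error('The handler for "' + name + '" does not have a "' + fn + '" function');
            }
        });
        this.handlers[name] = Handler;
    },
    getHandler: function (name) { return this.handlers[name]; }
};

demoLoader.addHandler(demoHandler, 'demo');
```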

node_modules/postman-runtime/lib/authorizer/noauth.js generated vendored Normal file
@@ -0,0 +1,62 @@
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
* @todo runtime needs to make sure AuthHandler
* cannot mutate any property on Request that it has not declared on the manifest.
*/
manifest: {
info: {
name: 'noauth',
version: '1.0.0'
},
updates: []
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Checks whether the given item has all the required parameters in its request.
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
done(null, true);
},
/**
* Verifies whether the request was successfully authorized after being sent.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
return done();
}
};

node_modules/postman-runtime/lib/authorizer/ntlm.js generated vendored Normal file
@@ -0,0 +1,276 @@
/**
* @fileOverview
*
* Implements the NTLM over HTTP specification: [MS-NTHT] https://msdn.microsoft.com/en-us/library/cc237488.aspx
* Also see [MS-NLMP]: https://msdn.microsoft.com/en-us/library/cc236621.aspx
*
* @note NTLM supports a number of different variations, where an actual TCP connection is signed etc. This file
* does _not_ implement those cases.
*/
var ntlmUtil = require('httpntlm').ntlm,
_ = require('lodash'),
EMPTY = '',
NTLM = 'NTLM',
STATE = 'state',
NEGOTIATE = 'negotiate',
NTLM_HEADER = 'ntlmHeader',
AUTHORIZATION = 'Authorization',
WWW_AUTHENTICATE = 'www-authenticate',
DISABLE_RETRY_REQUEST = 'disableRetryRequest',
NTLM_PARAMETERS = {
DOMAIN: 'domain',
WORKSTATION: 'workstation',
USERNAME: 'username',
PASSWORD: 'password'
},
STATES = {
INITIALIZED: 'INITIALIZED',
T1_MSG_CREATED: 'T1_MSG_CREATED',
T3_MSG_CREATED: 'T3_MSG_CREATED'
};
/**
* Parses the username to separate username and domain. It can handle two formats:
* - Down-Level Logon name format `DOMAIN\USERNAME`
* - User Principal Name format `USERNAME@DOMAIN`
*
* @private
* @param {String} username - Username string to parse from
* @return {Object} - An object with `username` and `domain` fields, which are `strings`.
*/
function parseParametersFromUsername (username) {
var dllParams,
upnParams;
if (!(username && typeof username === 'string')) {
return {
username: EMPTY,
domain: EMPTY
};
}
dllParams = username.split('\\');
upnParams = username.split('@');
// username should be either of the two formats, not both
if (dllParams.length > 1 && upnParams.length > 1) {
return {
username,
domain: EMPTY
};
}
// try to parse from "down level logon" format
if (dllParams.length === 2 && dllParams[0] && dllParams[1]) {
return {
username: dllParams[1],
domain: dllParams[0]
};
}
// try to parse from "user principal name" format
if (upnParams.length === 2 && upnParams[0] && upnParams[1]) {
return {
username: upnParams[0],
domain: upnParams[1]
};
}
return {
username,
domain: EMPTY
};
}
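The two formats handled above can be exercised with a standalone re-implementation; `parseUser` below is an illustrative copy of the logic, not the function exported by this file.

```javascript
// Illustrative re-implementation of the username parsing described above.
function parseUser (username) {
    if (!username || typeof username !== 'string') { return { username: '', domain: '' }; }

    var dll = username.split('\\'), // "Down-Level Logon" format: DOMAIN\USERNAME
        upn = username.split('@');  // "User Principal Name" format: USERNAME@DOMAIN

    // username should be either of the two formats, not both
    if (dll.length > 1 && upn.length > 1) { return { username: username, domain: '' }; }
    if (dll.length === 2 && dll[0] && dll[1]) { return { username: dll[1], domain: dll[0] }; }
    if (upn.length === 2 && upn[0] && upn[1]) { return { username: upn[0], domain: upn[1] }; }

    return { username: username, domain: '' };
}
```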
/**
* Check if `WWW-Authenticate` header has NTLM challenge.
*
* @private
* @param {*} headers - Postman headers instance
* @returns {Boolean}
*/
function hasNTLMChallenge (headers) {
// Case 1: multiple headers
// - WWW-Authenticate: NTLM
// - WWW-Authenticate: Negotiate
if (headers.has(WWW_AUTHENTICATE, NTLM) || headers.has(WWW_AUTHENTICATE, NEGOTIATE)) {
return true;
}
// Case 2: single header
// - WWW-Authenticate: Negotiate, NTLM
return String(headers.get(WWW_AUTHENTICATE)).includes(NTLM);
}
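The header check above can be sketched over a plain array of `WWW-Authenticate` values (a stand-in for the Postman headers instance used by the real function):

```javascript
// Illustrative challenge detection, covering both the multi-header case
// ("NTLM" or "Negotiate" as separate values) and the comma-separated
// single-header case ("Negotiate, NTLM").
function hasNtlmChallenge (values) {
    return values.some(function (v) {
        v = String(v);
        return v === 'NTLM' || v === 'Negotiate' || v.includes('NTLM');
    });
}
```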
/**
 * While authenticating, NTLM auth requires the negotiateMessage (type 1) and authenticateMessage (type 3) to be
 * stored. It also needs to know which stage it is in (INITIALIZED, T1_MSG_CREATED or T3_MSG_CREATED).
 * After the first successful authentication, it just relies on the TCP connection; no other state is needed.
 * @todo Currently we don't close the connection, so there is no way to de-authenticate.
*
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'ntlm',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
     * Checks the NTLM auth parameters and initializes the handshake state before the request is sent.
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
!auth.get(STATE) && auth.set(STATE, STATES.INITIALIZED);
done(null, true);
},
/**
     * Verifies whether NTLM authentication succeeded, driving the type 1 / type 3
     * handshake and asking runtime to replay the request as needed.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
if (auth.get(DISABLE_RETRY_REQUEST)) {
return done(null, true);
}
var state = auth.get(STATE),
domain = auth.get(NTLM_PARAMETERS.DOMAIN) || EMPTY,
workstation = auth.get(NTLM_PARAMETERS.WORKSTATION) || EMPTY,
username = auth.get(NTLM_PARAMETERS.USERNAME) || EMPTY,
password = auth.get(NTLM_PARAMETERS.PASSWORD) || EMPTY,
negotiateMessage, // type 1
challengeMessage, // type 2
authenticateMessage, // type 3
ntlmType2Header,
parsedParameters;
if (response.code !== 401 && response.code !== 403) {
return done(null, true);
}
// we try to extract domain from username if not specified.
if (!domain) {
parsedParameters = parseParametersFromUsername(username) || {};
username = parsedParameters.username;
domain = parsedParameters.domain;
}
if (state === STATES.INITIALIZED) {
// Nothing to do if the server does not ask us for auth in the first place.
if (!hasNTLMChallenge(response.headers)) {
return done(null, true);
}
// Create a type 1 message to send to the server
negotiateMessage = ntlmUtil.createType1Message({
domain: domain,
workstation: workstation
});
// Add the type 1 message as the auth header
auth.set(NTLM_HEADER, negotiateMessage);
// Update the state
auth.set(STATE, STATES.T1_MSG_CREATED);
// ask runtime to replay the request
return done(null, false);
}
else if (state === STATES.T1_MSG_CREATED) {
// At this point, we can assume that the type 1 message was sent to the server
// there can be multiple headers present with key `www-authenticate`.
// iterate to get the one which has the NTLM hash. if multiple
// headers have the NTLM hash, use the first one.
ntlmType2Header = response.headers.find(function (header) {
return String(header.key).toLowerCase() === WWW_AUTHENTICATE &&
header.valueOf().startsWith('NTLM ');
});
if (!ntlmType2Header) {
return done(new Error('ntlm: server did not send NTLM type 2 message'));
}
challengeMessage = ntlmUtil.parseType2Message(ntlmType2Header.valueOf(), _.noop);
if (!challengeMessage) {
return done(new Error('ntlm: server did not correctly process authentication request'));
}
authenticateMessage = ntlmUtil.createType3Message(challengeMessage, {
domain: domain,
workstation: workstation,
username: username,
password: password
});
// Now create the type 3 message, and add it to the request
auth.set(NTLM_HEADER, authenticateMessage);
auth.set(STATE, STATES.T3_MSG_CREATED);
// ask runtime to replay the request
return done(null, false);
}
else if (state === STATES.T3_MSG_CREATED) {
// Means we have tried to authenticate, so we should stop here without worrying about anything
return done(null, true);
}
// We are in an undefined state
return done(null, true);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var ntlmHeader = auth.get(NTLM_HEADER);
request.removeHeader(AUTHORIZATION, {ignoreCase: true});
ntlmHeader && request.addHeader({
key: AUTHORIZATION,
value: ntlmHeader,
system: true
});
return done();
}
};
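The `post` hook above walks a three-state handshake (INITIALIZED, then T1_MSG_CREATED, then T3_MSG_CREATED). It can be sketched as a tiny state machine; the transition function below is purely illustrative and returns whether runtime should replay the request, as `done(null, false)` does above.

```javascript
// Illustrative NTLM handshake state machine. "challenged" stands in for
// hasNTLMChallenge(response.headers).
function ntlmStep (state, responseCode, challenged) {
    // server did not ask for auth: nothing to do
    if (responseCode !== 401 && responseCode !== 403) { return { state: state, replay: false }; }

    if (state === 'INITIALIZED') {
        if (!challenged) { return { state: state, replay: false }; }
        return { state: 'T1_MSG_CREATED', replay: true };  // send type 1 message
    }
    if (state === 'T1_MSG_CREATED') {
        return { state: 'T3_MSG_CREATED', replay: true };  // answer type 2 with type 3
    }
    return { state: state, replay: false };                // type 3 sent: stop retrying
}
```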

node_modules/postman-runtime/lib/authorizer/oauth1.js generated vendored Normal file
@@ -0,0 +1,494 @@
var _ = require('lodash'),
crypto = require('crypto'),
oAuth1 = require('node-oauth1'),
urlEncoder = require('postman-url-encoder'),
RequestBody = require('postman-collection').RequestBody,
bodyBuilder = require('../requester/core-body-builder'),
EMPTY = '',
RSA = 'RSA',
HYPHEN = '-',
PROTOCOL_HTTP = 'http',
PROTOCOL_SEPARATOR = '://',
HTTPS_PORT = '443',
HTTP_PORT = '80',
OAUTH1_PARAMS = {
oauthConsumerKey: 'oauth_consumer_key',
oauthToken: 'oauth_token',
oauthSignatureMethod: 'oauth_signature_method',
oauthTimestamp: 'oauth_timestamp',
oauthNonce: 'oauth_nonce',
oauthVersion: 'oauth_version',
oauthSignature: 'oauth_signature',
oauthCallback: 'oauth_callback',
oauthVerifier: 'oauth_verifier',
oauthBodyHash: 'oauth_body_hash'
};
/**
 * Returns an OAuth 1.0a compatible representation of the request URL, also called the "Base URL".
 * For details, see http://oauth.net/core/1.0a/#anchor13
 *
 * todo: should we ignore the auth parameters of the URL or not? (the standard does not mention them)
 * we currently do.
*
* @private
* @param {Url} url - Node's URL object
* @returns {String}
*/
function getOAuth1BaseUrl (url) {
var port = url.port ? url.port : undefined,
host = ((port === HTTP_PORT ||
port === HTTPS_PORT ||
port === undefined) && url.hostname) || url.host,
path = url.path,
// trim to convert 'http:' from Node's URL object to 'http'
protocol = _.trimEnd(url.protocol || PROTOCOL_HTTP, PROTOCOL_SEPARATOR);
protocol = (_.endsWith(protocol, PROTOCOL_SEPARATOR) ? protocol : protocol + PROTOCOL_SEPARATOR);
return protocol.toLowerCase() + host.toLowerCase() + path;
}
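The normalization above (lowercase scheme and host, default ports dropped) can be sketched standalone; `baseUrl` is an illustrative simplification taking a plain object of Node-`url`-style fields, not the helper defined in this file.

```javascript
// Illustrative OAuth 1.0a "base URL" normalization: lowercase the scheme and
// host, drop default ports (80/443), keep the path, and drop the query string.
function baseUrl (parts) {
    var port = parts.port,
        host = (port === '80' || port === '443' || port === undefined) ?
            parts.hostname : parts.hostname + ':' + port,
        pathname = (parts.path || '/').split('?')[0];

    return parts.protocol.toLowerCase() + '://' + host.toLowerCase() + pathname;
}
```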
/**
* Query parameters are encoded with WHATWG encoding in the request. OAuth1.0
* requires the query params to be encoded with the RFC-3986 standard. This
* function decodes the query parameters and encodes them to the required RFC-3986
* standard. For details: https://oauth.net/core/1.0a/#encoding_parameters
*
* @param {Request} request - request to update query parameters
* @param {Object} url - Node.js like url object
*/
function updateQueryParamEncoding (request, url) {
// early bailout if no query is set.
if (!url.query) {
return;
}
const queryParams = oAuth1.decodeForm(url.query);
// clear all query parameters
request.url.query.clear();
_.forEach(queryParams, function (param) {
request.url.query.add({
key: param[0] && oAuth1.percentEncode(param[0]),
value: param[1] && oAuth1.percentEncode(param[1])
});
});
}
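The re-encoding above depends on RFC 3986 percent-encoding, which is stricter than WHATWG encoding: `encodeURIComponent` leaves `! ' ( ) *` bare, while RFC 3986 reserves them. A common sketch of such an encoder (assumed here to behave like node-oauth1's `percentEncode`):

```javascript
// RFC 3986 percent-encoding: like encodeURIComponent, but also encodes the
// characters ! ' ( ) * that encodeURIComponent leaves unescaped.
function percentEncodeRfc3986 (str) {
    return encodeURIComponent(str).replace(/[!'()*]/g, function (c) {
        return '%' + c.charCodeAt(0).toString(16).toUpperCase();
    });
}
```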
/**
* Calculates body hash with given algorithm and digestEncoding.
*
* @param {RequestBody} body Request body
* @param {String} algorithm Hash algorithm to use
* @param {String} digestEncoding Encoding of the hash
* @param {Function} callback Callback function that will be called with body hash
*/
function computeBodyHash (body, algorithm, digestEncoding, callback) {
if (!(algorithm && digestEncoding)) { return callback(); }
var hash = crypto.createHash(algorithm),
originalReadStream,
rawBody,
graphqlBody;
// if body is not available, return hash of empty string
if (!body || body.isEmpty()) {
return callback(hash.digest(digestEncoding));
}
if (body.mode === RequestBody.MODES.raw) {
rawBody = bodyBuilder.raw(body.raw).body;
hash.update(rawBody);
return callback(hash.digest(digestEncoding));
}
    // calculations for url-encoded body are not done here, unlike other
    // auths (e.g. AWS/Hawk), because it is not required for OAuth 1.0
if (body.mode === RequestBody.MODES.file) {
originalReadStream = _.get(body, 'file.content');
if (!originalReadStream) {
return callback();
}
return originalReadStream.cloneReadStream(function (err, clonedStream) {
if (err) { return callback(); }
clonedStream.on('data', function (chunk) {
hash.update(chunk);
});
clonedStream.on('end', function () {
callback(hash.digest(digestEncoding));
});
});
}
if (body.mode === RequestBody.MODES.graphql) {
graphqlBody = bodyBuilder.graphql(body.graphql).body;
hash.update(graphqlBody);
return callback(hash.digest(digestEncoding));
}
// @todo: Figure out a way to calculate hash for formdata body type.
// ensure that callback is called if body.mode doesn't match with any of the above modes
return callback();
}
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'oauth1',
version: '1.0.0'
},
updates: [
{
property: 'Authorization',
type: 'header'
},
{
property: OAUTH1_PARAMS.oauthConsumerKey,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthToken,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthCallback,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthVerifier,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthBodyHash,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthSignatureMethod,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthTimestamp,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthNonce,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthVersion,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthSignature,
type: 'url.param'
},
{
property: OAUTH1_PARAMS.oauthConsumerKey,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthToken,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthCallback,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthVerifier,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthSignatureMethod,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthTimestamp,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthNonce,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthVersion,
type: 'body.urlencoded'
},
{
property: OAUTH1_PARAMS.oauthSignature,
type: 'body.urlencoded'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
     * Checks the OAuth1 auth parameters before signing (this always succeeds).
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
done(null, true);
},
/**
     * Verifies whether the request was successfully authorized after being sent.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Generates and adds oAuth1 data to the request. This function modifies the
* request passed in the argument.
*
* @param {Request} request - request to add oauth1 parameters
* @param {Object} params - oauth data to generate signature
* @param {Object} protocolProfileBehavior - Protocol profile behaviors
* @param {Function} done - callback function
*/
addAuthDataToRequest: function (request, params, protocolProfileBehavior, done) {
var url = urlEncoder.toNodeUrl(request.url),
signatureParams,
urlencodedBody,
bodyParams,
allParams,
signature,
message,
header,
accessor = {
consumerSecret: params.consumerSecret || EMPTY,
tokenSecret: params.tokenSecret || EMPTY,
privateKey: params.privateKey || EMPTY
},
disableUrlEncoding = protocolProfileBehavior && protocolProfileBehavior.disableUrlEncoding;
signatureParams = [
{system: true, key: OAUTH1_PARAMS.oauthConsumerKey, value: params.consumerKey},
{system: true, key: OAUTH1_PARAMS.oauthToken, value: params.token},
{system: true, key: OAUTH1_PARAMS.oauthSignatureMethod, value: params.signatureMethod},
{system: true, key: OAUTH1_PARAMS.oauthTimestamp, value: params.timestamp},
{system: true, key: OAUTH1_PARAMS.oauthNonce, value: params.nonce},
{system: true, key: OAUTH1_PARAMS.oauthVersion, value: params.version}
];
// bodyHash, callback and verifier parameters are part of extensions of the original OAuth1 spec.
// So we only include those in signature if they are non-empty, ignoring the addEmptyParamsToSign setting.
        // Otherwise it causes problems for servers that don't support the respective OAuth1 extensions.
// Issue: https://github.com/postmanlabs/postman-app-support/issues/8737
if (params.bodyHash) {
signatureParams.push({system: true, key: OAUTH1_PARAMS.oauthBodyHash, value: params.bodyHash});
}
if (params.callback) {
signatureParams.push({system: true, key: OAUTH1_PARAMS.oauthCallback, value: params.callback});
}
if (params.verifier) {
signatureParams.push({system: true, key: OAUTH1_PARAMS.oauthVerifier, value: params.verifier});
}
// filter empty signature parameters
signatureParams = _.filter(signatureParams, function (param) {
return params.addEmptyParamsToSign || param.value;
});
urlencodedBody = request.body &&
request.body.mode === RequestBody.MODES.urlencoded &&
request.body.urlencoded;
// Body params only need to be included if they are URL encoded.
// http://oauth.net/core/1.0a/#anchor13
bodyParams = urlencodedBody ? urlencodedBody.filter(function (param) {
return !param.disabled;
}) : [];
allParams = [].concat(signatureParams, bodyParams);
message = {
action: getOAuth1BaseUrl(url),
method: request.method,
parameters: _.map(allParams, function (param) {
return [param.key, param.value];
})
};
try {
signature = oAuth1.SignatureMethod.sign(message, accessor);
}
catch (err) {
// handle invalid private key errors for RSA signatures
return done(err);
}
// Update the encoding for query parameters to RFC-3986 in accordance with the
// OAuth1.0a specification: https://oauth.net/core/1.0a/#encoding_parameters
// disableUrlEncoding option should be respected in authorization flow as well
if (disableUrlEncoding !== true) {
updateQueryParamEncoding(request, url);
}
signatureParams.push({system: true, key: OAUTH1_PARAMS.oauthSignature, value: signature});
// Add signature params to the request. The OAuth specification says
// that we should add parameters in the following order of preference:
// 1. Auth Header
// 2. Body parameters
// 3. Query parameters
//
// http://oauth.net/core/1.0/#consumer_req_param
if (params.addParamsToHeader) {
header = oAuth1.getAuthorizationHeader(params.realm, _.map(signatureParams, function (param) {
return [param.key, param.value];
}), params.disableHeaderEncoding);
request.addHeader({
key: 'Authorization',
value: header,
system: true
});
}
else if ((/PUT|POST/).test(request.method) && urlencodedBody) {
_.forEach(signatureParams, function (param) {
urlencodedBody.add(param);
});
}
else if (disableUrlEncoding === true) {
// disableUrlEncoding option should be respected in authorization flow as well
request.addQueryParams(signatureParams);
}
else {
_.forEach(signatureParams, function (param) {
request.url.query.add({
key: param.key && oAuth1.percentEncode(param.key),
value: param.value && oAuth1.percentEncode(param.value),
system: true
});
});
}
done();
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var self = this,
params = auth.get([
'consumerKey',
'consumerSecret',
'token',
'tokenSecret',
'privateKey',
'signatureMethod',
'callback',
'verifier',
'timestamp',
'nonce',
'version',
'realm',
'includeBodyHash',
'addParamsToHeader',
'addEmptyParamsToSign',
'disableHeaderEncoding'
]),
urlencodedBody = request.body,
signatureAlgo,
hashAlgo,
protocolProfileBehavior = auth._protocolProfileBehavior;
        // extract hash and signature algorithm from signatureMethod
// signature methods are in this format: '<signatureAlgo>-<hashAlgo>' e.g. RSA-SHA1
hashAlgo = _.split(params.signatureMethod, HYPHEN);
signatureAlgo = _.upperCase(hashAlgo[0]);
hashAlgo = hashAlgo[1];
if (!params.consumerKey ||
(signatureAlgo !== RSA && !params.consumerSecret) ||
(signatureAlgo === RSA && !params.privateKey)) {
return done(); // Nothing to do if required parameters are not present.
}
// before this: urlencodedBody = request.body
// after this: urlencodedBody = request.body.urlencoded or undefined
urlencodedBody = (urlencodedBody &&
urlencodedBody.mode === RequestBody.MODES.urlencoded
) ? urlencodedBody.urlencoded : undefined;
// Remove existing headers and params (if any)
request.removeHeader('Authorization');
request.removeQueryParams(_.values(OAUTH1_PARAMS));
urlencodedBody && urlencodedBody.remove(function (param) {
return _.includes(_.values(OAUTH1_PARAMS), param.key);
});
// Generate a new nonce and timestamp
params.nonce = params.nonce || oAuth1.nonce(11).toString();
params.timestamp = params.timestamp || oAuth1.timestamp().toString();
// Ensure that empty parameters are not added to the signature
if (!params.addEmptyParamsToSign) {
params = _.reduce(params, function (accumulator, value, key) {
if (_.isString(value) && (value.trim() === EMPTY)) {
return accumulator;
}
accumulator[key] = value;
return accumulator;
}, {});
}
// Don't include body hash as defined in specification
// @see: https://tools.ietf.org/id/draft-eaton-oauth-bodyhash-00.html#when_to_include
if (urlencodedBody || !(params.includeBodyHash && hashAlgo)) {
return self.addAuthDataToRequest(request, params, protocolProfileBehavior, done);
}
computeBodyHash(request.body, hashAlgo, 'base64', function (bodyHash) {
params.bodyHash = bodyHash;
return self.addAuthDataToRequest(request, params, protocolProfileBehavior, done);
});
}
};
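The `addEmptyParamsToSign` filtering step in `sign` above can be sketched as a plain reducer (a minimal sketch without lodash; `dropEmptyParams` is an illustrative name, not part of the runtime):

```javascript
// Sketch of the addEmptyParamsToSign filter above: drop params whose
// value is a blank string so they stay out of the signature base.
function dropEmptyParams(params) {
    return Object.keys(params).reduce(function (accumulator, key) {
        var value = params[key];
        if (typeof value === 'string' && value.trim() === '') {
            return accumulator; // skip blank strings
        }
        accumulator[key] = value;
        return accumulator;
    }, {});
}

var filtered = dropEmptyParams({ consumerKey: 'abc', realm: '   ', nonce: 'x1' });
// filtered → { consumerKey: 'abc', nonce: 'x1' }
```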

133
node_modules/postman-runtime/lib/authorizer/oauth2.js generated vendored Normal file

@@ -0,0 +1,133 @@
var _ = require('lodash'),
HEADER = 'header',
QUERY_PARAMS = 'queryParams',
BEARER = 'bearer',
MAC = 'mac',
AUTHORIZATION = 'Authorization',
ACCESS_TOKEN = 'access_token',
AUTHORIZATION_PREFIX = 'Bearer',
OAUTH2_PARAMETERS = [
'accessToken',
'addTokenTo',
'tokenType',
'headerPrefix'
];
/**
* @implements {AuthHandlerInterface}
*/
module.exports = {
/**
* @property {AuthHandlerInterface~AuthManifest}
*/
manifest: {
info: {
name: 'oauth2',
version: '1.0.0'
},
updates: [
{
property: AUTHORIZATION,
type: 'header'
},
{
property: ACCESS_TOKEN,
type: 'url.param'
}
]
},
/**
* Initializes an item (extracts parameters from intermediate requests if any, etc)
* before the actual authorization step.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authInitHookCallback} done
*/
init: function (auth, response, done) {
done(null);
},
/**
* Verifies whether the request has the required OAuth2 parameters (i.e. an access token).
* Sanitizes the auth parameters if needed.
*
* @param {AuthInterface} auth
* @param {AuthHandlerInterface~authPreHookCallback} done
*/
pre: function (auth, done) {
done(null, Boolean(auth.get('accessToken')));
},
/**
* Verifies whether the OAuth2 auth succeeded.
*
* @param {AuthInterface} auth
* @param {Response} response
* @param {AuthHandlerInterface~authPostHookCallback} done
*/
post: function (auth, response, done) {
done(null, true);
},
/**
* Signs a request.
*
* @param {AuthInterface} auth
* @param {Request} request
* @param {AuthHandlerInterface~authSignHookCallback} done
*/
sign: function (auth, request, done) {
var params = auth.get(OAUTH2_PARAMETERS),
tokenType;
// Validation
if (!params.accessToken) {
return done(); // Nothing to do if required parameters are not present.
}
// Defaults
params.addTokenTo = params.addTokenTo || HEADER; // Add token to header by default
params.tokenType = params.tokenType || BEARER; // Use `Bearer` token type by default
params.headerPrefix = _.isNil(params.headerPrefix) ?
AUTHORIZATION_PREFIX : _.trim(String(params.headerPrefix));
// add a space after prefix only if there is any prefix
params.headerPrefix && (params.headerPrefix += ' ');
// Some servers send 'Bearer' while others send 'bearer'
tokenType = _.toLower(params.tokenType);
// @TODO Add support for HMAC
if (tokenType === MAC) {
return done();
}
// treat every token type (other than MAC) as a bearer token
// clean conflicting headers and query params
// @todo: we should be able to get conflicting params from auth manifest
// and clear them before the sign step for any auth
request.removeHeader(AUTHORIZATION, {ignoreCase: true});
request.removeQueryParams([ACCESS_TOKEN]);
if (params.addTokenTo === QUERY_PARAMS) {
request.addQueryParams({
key: ACCESS_TOKEN,
value: params.accessToken,
system: true
});
}
else if (params.addTokenTo === HEADER) {
request.addHeader({
key: AUTHORIZATION,
value: params.headerPrefix + params.accessToken,
system: true
});
}
return done();
}
};
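The token-placement decision in `sign` above (header by default, query parameter when requested) can be sketched standalone (hypothetical `placeToken` helper, not the actual handler):

```javascript
// Minimal sketch of where the OAuth2 signer above puts the access token.
function placeToken(params) {
    // default prefix is 'Bearer'; an explicit empty prefix is honored as-is
    var prefix = params.headerPrefix == null ? 'Bearer' : String(params.headerPrefix).trim();
    prefix && (prefix += ' '); // add a space only if there is any prefix
    if (params.addTokenTo === 'queryParams') {
        return { query: { access_token: params.accessToken } };
    }
    // default: Authorization header
    return { header: { Authorization: prefix + params.accessToken } };
}

var placed = placeToken({ accessToken: 'abc123' });
// placed → { header: { Authorization: 'Bearer abc123' } }
```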

166
node_modules/postman-runtime/lib/backpack/index.js generated vendored Normal file

@@ -0,0 +1,166 @@
var _ = require('lodash'),
meetExpectations,
backpack;
/**
* ensure the specified keys are functions in subject
*
* @param {Object} subject
* @param {Array} expectations
* @param {Array=} [defaults]
* @returns {Object}
*/
meetExpectations = function (subject, expectations, defaults) {
// provided that the subject is an object, ensure that each key named in the expectations array is a function
// eslint-disable-next-line lodash/prefer-lodash-chain
_.isObject(subject) && _.union(defaults, expectations).forEach(function (expect) {
!_.isFunction(subject[expect]) && (subject[expect] = _.noop);
});
return subject;
};
module.exports = backpack = {
/**
* Ensures that the given argument is a callable.
*
* @param {*} arg
* @param {Object=} ctx
* @returns {boolean|*}
*/
ensure: function (arg, ctx) {
return (typeof arg === 'function') && (ctx ? arg.bind(ctx) : arg) || undefined;
},
/**
* accept the callback parameter and convert it into a consistent object interface
*
* @param {Function|Object} cb
* @param {Array} [expect=]
* @returns {Object}
*
* @todo - write tests
*/
normalise: function (cb, expect) {
if (_.isFunction(cb) && cb.__normalised) {
return meetExpectations(cb, expect);
}
var userback, // this var will be populated and returned
// keep a reference of all initial callbacks sent by user
callback = (_.isFunction(cb) && cb) || (_.isFunction(cb && cb.done) && cb.done),
callbackError = _.isFunction(cb && cb.error) && cb.error,
callbackSuccess = _.isFunction(cb && cb.success) && cb.success;
// create master callback that calls these user provided callbacks
userback = _.assign(function (err) {
// if common callback is defined, call that
callback && callback.apply(this, arguments);
// for special error and success, call them if they are user defined
if (err) {
callbackError && callbackError.apply(this, arguments);
}
else {
// remove the extra error param before calling success
callbackSuccess && callbackSuccess.apply(this, (Array.prototype.shift.call(arguments), arguments));
}
}, _.isPlainObject(cb) && cb, { // override error, success and done
error: function () {
return userback.apply(this, arguments);
},
success: function () {
// inject null to arguments and call the main callback
userback.apply(this, (Array.prototype.unshift.call(arguments, null), arguments));
},
done: function () {
return userback.apply(this, arguments);
},
__normalised: true
});
return meetExpectations(userback, expect);
},
/**
* Converts a callback into a function that can be called multiple times; the underlying callback actually
* fires only once every flag in the given set has received a value
*
* @param {Array} flags
* @param {Function} callback
* @param {Array} args
* @param {Number} ms
* @returns {Function}
*/
multiback: function (flags, callback, args, ms) {
var status = {},
sealed;
// ensure that the callback times out after a while
callback = backpack.timeback(callback, ms, null, function () {
sealed = true;
});
return function (err, flag, value) {
if (sealed) { return; } // do not proceed if it is sealed
status[flag] = value;
if (err) { // on error we directly call the callback and seal subsequent calls
sealed = true;
status = null;
callback.call(status, err);
return;
}
// if any flag is not defined, we exit. when all flags hold a value, we know that the end callback has to be
// executed.
for (var i = 0, ii = flags.length; i < ii; i++) {
if (!status.hasOwnProperty(flags[i])) { return; }
}
sealed = true;
status = null;
callback.apply(status, args);
};
},
/**
* Ensures that a callback is executed within a specific time.
*
* @param {Function} callback
* @param {Number=} [ms]
* @param {Object=} [scope]
* @param {Function=} [when] - function executed right before callback is called with timeout. one can do cleanup
* stuff here
* @returns {Function}
*/
timeback: function (callback, ms, scope, when) {
ms = Number(ms);
// if no callback time is specified, just return the callback function and exit. this is because we do not
// need to track a timeout of 0ms
if (!ms) {
return callback;
}
var sealed = false,
irq = setTimeout(function () { // irq = interrupt request
sealed = true;
irq = null;
when && when.call(scope || this);
callback.call(scope || this, new Error('callback timed out'));
}, ms);
return function () {
// if sealed, it means that timeout has elapsed and we accept no future callback
if (sealed) { return undefined; }
// otherwise we clear timeout and allow the callback to be executed. note that we do not seal the function
// since we should allow multiple callback calls.
irq && (irq = clearTimeout(irq));
return callback.apply(scope || this, arguments);
};
}
};
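The `normalise` contract above — one master callback that fans out to user-supplied `done`/`success`/`error` handlers based on the error-first argument — can be sketched minimally (illustrative only; the expectations handling and `__normalised` memoization are omitted):

```javascript
// Minimal sketch of the error-first fan-out performed by backpack.normalise.
function normalise(cb) {
    var noop = function () {},
        onDone = typeof cb === 'function' ? cb : (cb && cb.done) || noop,
        onError = (cb && cb.error) || noop,
        onSuccess = (cb && cb.success) || noop;
    return function (err) {
        onDone.apply(this, arguments); // common callback always runs
        if (err) { return onError.apply(this, arguments); }
        // drop the leading error argument before calling success
        onSuccess.apply(this, Array.prototype.slice.call(arguments, 1));
    };
}

var log = [];
var cb = normalise({
    success: function (value) { log.push('ok:' + value); },
    error: function (err) { log.push('err:' + err.message); }
});

cb(null, 42);          // routes to success
cb(new Error('boom')); // routes to error
// log → ['ok:42', 'err:boom']
```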

5
node_modules/postman-runtime/lib/index.js generated vendored Normal file

@@ -0,0 +1,5 @@
module.exports = {
Runner: require('./runner'),
Requester: require('./requester').Requester,
version: require('./version')
};


@@ -0,0 +1,786 @@
// Browser Request
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
/* eslint-disable */
var _ = require('lodash');
var parseHeadersString = require('postman-collection').Header.parse;
request.log = {
'trace': noop, 'debug': noop, 'info': noop, 'warn': noop, 'error': noop
}
var CORS_ERROR_CODE = 'ERR_PM_CORS'; // Custom error code for CORS errors
var MIXED_CONTENT_ERROR_CODE = 'ERR_PM_MIXED_CONTENT'; // Custom error code for mixed content error
var DEFAULT_TIMEOUT = 3 * 60 * 1000 // 3 minutes
// The body is ignored if the request method is GET or HEAD.
// Refer: https://xhr.spec.whatwg.org/#the-send()-method
var METHODS_WITHOUT_BODY = {
'GET': true,
'HEAD': true
};
// Refer: https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_header_name
var FORBIDDEN_HEADERS = {
'accept-charset': true,
'accept-encoding': true,
'access-control-request-headers': true,
'access-control-request-method': true,
connection: true,
'content-length': true,
cookie: true,
cookie2: true,
date: true,
dnt: true,
expect: true,
'feature-policy': true,
host: true,
'keep-alive': true,
origin: true,
referer: true,
te: true,
trailer: true,
'transfer-encoding': true,
upgrade: true,
via: true
};
var IS_LOCALHOST = {
'localhost': true,
'127.0.0.1': true,
'127.1': true,
'[::1]': true
};
function forEachAsync (items, fn, cb) {
!cb && (cb = function () { /* (ಠ_ಠ) */ })
if (!(Array.isArray(items) && fn)) { return cb() }
var index = 0
var totalItems = items.length
function next (err) {
if (err || index >= totalItems) {
return cb(err)
}
try {
fn.call(items, items[index++], next)
} catch (error) {
return cb(error)
}
}
if (!totalItems) { return cb() }
next()
}
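A usage sketch of `forEachAsync` above (the helper is reproduced in compact form so the snippet runs standalone): items are processed strictly one at a time, and the final callback fires once, after the last `next()`.

```javascript
// Compact re-statement of the forEachAsync helper above, for illustration.
function forEachAsync(items, fn, cb) {
    cb = cb || function () {};
    if (!(Array.isArray(items) && fn) || !items.length) { return cb(); }
    var index = 0;
    (function next(err) {
        if (err || index >= items.length) { return cb(err); }
        try { fn.call(items, items[index++], next); }
        catch (error) { cb(error); }
    }());
}

var visited = [];
forEachAsync(['a', 'b', 'c'], function (item, next) {
    visited.push(item);
    next(); // signal completion of this item
}, function (err) {
    // called once at the end: visited → ['a', 'b', 'c']
});
```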
//
// request
//
function request(originalRequest, options, onStart, callback) {
var options_onResponse = options.onResponse; // Save this for later.
var XHR = _.get(options, ['agents', options.url && options.url.protocol.slice(0, -1), 'agentClass']) || XMLHttpRequest;
if(typeof options === 'string')
options = {'uri':options};
else
options = _.clone(options); // Use a duplicate for mutating.
options.onResponse = options_onResponse // And put it back.
if (options.verbose) request.log = getLogger();
if(options.url) {
options.uri = options.url && options.url.href || options.url;
delete options.url;
}
if(!options.uri && options.uri !== "")
return callback(new Error("options.uri is a required argument"));
if(typeof options.uri != "string")
return callback(new Error("options.uri must be a string"));
options.onStart = onStart
options.callback = callback
options.method = options.method || 'GET';
options.headers = _.reduce(options.headers || {}, function (accumulator, value, key) {
if (!XHR._allowForbiddenHeaders && isForbiddenHeader(key)) {
// mutate original request and options as these will be passed in the
// request and response callbacks.
originalRequest.headers.remove(key);
delete options.headers[key];
return accumulator;
}
accumulator[key] = value;
return accumulator;
}, {});
options.body = options.body || null
options.timeout = options.timeout || request.DEFAULT_TIMEOUT
if(options.headers.host)
console.warn("Request: Options.headers.host is not supported");
if(options.json) {
options.headers.accept = options.headers.accept || 'application/json'
if(options.method !== 'GET')
options.headers['content-type'] = 'application/json'
if(typeof options.json !== 'boolean')
options.body = JSON.stringify(options.json)
else if(typeof options.body !== 'string')
options.body = JSON.stringify(options.body)
}
//BEGIN QS Hack
var serialize = function(obj) {
var str = [];
for(var p in obj)
if (obj.hasOwnProperty(p)) {
if (_.isArray(obj[p])) {
_.forEach(obj[p], function (value) {
str.push(encodeURIComponent(p) + "=" + encodeURIComponent(value));
});
}
else {
str.push(encodeURIComponent(p) + "=" + encodeURIComponent(obj[p]));
}
}
return str.join("&");
}
if(options.qs){
var qs = (typeof options.qs == 'string')? options.qs : serialize(options.qs);
if(options.uri.indexOf('?') !== -1){ // existing query params
options.uri = options.uri+'&'+qs;
}else{ // no query params yet
options.uri = options.uri+'?'+qs;
}
}
//END QS Hack
//BEGIN FORM Hack
var multipart = function (data) {
if (!Array.isArray(data)) { return; }
var i,
ii,
formParam,
formData = new FormData();
for (i = 0, ii = data.length; i < ii; i++) {
if (!(formParam = data[i])) { continue; }
if (Array.isArray(formParam.value)) {
formParam.value.forEach(function (value) {
formData.append(formParam.key, value);
});
}
else {
formData.append(formParam.key, formParam.value);
}
}
return {
body: formData
};
};
if(options.form){
if(typeof options.form == 'string') {
console.warn('form name unsupported');
}
if(XHR._allowBodyInGET || !METHODS_WITHOUT_BODY[options.method]) {
var encoding = (options.encoding || 'application/x-www-form-urlencoded').toLowerCase();
if (!options.headers['content-type'] && !options.headers['Content-Type']) {
options.headers['content-type'] = encoding;
}
switch(encoding){
case 'application/x-www-form-urlencoded':
options.body = serialize(options.form).replace(/%20/g, "+");
break;
case 'multipart/form-data':
var multi = multipart(options.form);
//options.headers['content-length'] = multi.length;
options.body = multi.body;
options.headers['content-type'] = multi.type;
break;
default : console.warn('unsupported encoding:'+encoding);
}
}
}
if (options.formData && (XHR._allowBodyInGET || !METHODS_WITHOUT_BODY[options.method])) {
var multipartBody = multipart(options.formData);
//options.headers['content-length'] = multipartBody.length;
options.body = multipartBody.body;
multipartBody.type && (options.headers['content-type'] = multipartBody.type);
}
//END FORM Hack
// If onResponse is boolean true, call back immediately when the response is known,
// not when the full request is complete.
options.onResponse = options.onResponse || noop
if(options.onResponse === true) {
options.onResponse = callback
options.callback = noop
}
// XXX Browsers do not like this.
//if(options.body)
// options.headers['content-length'] = options.body.length;
// HTTP basic authentication
if(!options.headers.authorization && options.auth)
options.headers.authorization = 'Basic ' + b64_enc(options.auth.username + ':' + options.auth.password);
// Query cookie jar if available
if ((typeof (options.jar && options.jar.getCookieString) === 'function')) {
options.jar.getCookieString(options.uri, function (_, cookies) {
if (cookies && cookies.length) {
options.cookiesFromJar = cookies;
}
run_xhr(XHR, originalRequest, options)
})
}
else {
return run_xhr(XHR, originalRequest, options)
}
}
var req_seq = 0
function run_xhr(XHR, originalRequest, options) {
var xhr = new XHR(options)
, timed_out = false
, is_cors = is_crossDomain(options.uri)
, supports_cors = ('withCredentials' in xhr)
req_seq += 1
xhr.seq_id = req_seq
xhr.id = req_seq + ': ' + options.method + ' ' + options.uri
xhr._id = xhr.id // I know I will type "_id" from habit all the time.
if(is_cors && !supports_cors) {
// This should never happen in our app
var cors_err = new Error('Browser does not support cross-origin request: ' + options.uri);
cors_err.code = CORS_ERROR_CODE;
cors_err.cors = 'unsupported';
options.callback(cors_err, xhr);
return xhr;
}
xhr.timeoutTimer = setTimeout(too_late, options.timeout)
function too_late() {
timed_out = true
var er = new Error('ETIMEDOUT')
er.code = 'ETIMEDOUT'
er.duration = options.timeout
request.log.error('Timeout', { 'id':xhr._id, 'milliseconds':options.timeout })
return options.callback(er, xhr)
}
// Some states can be skipped over, so remember what is still incomplete.
var did = {'response':false, 'loading':false, 'end':false, 'onStart': false}
xhr.onreadystatechange = on_state_change
xhr.open(options.method, options.uri, true) // asynchronous
if (is_cors) {
xhr.withCredentials = !! options.withCredentials
}
(options.encoding === null) && (xhr.responseType = "arraybuffer");
xhr.send(options.body)
return xhr
function on_state_change(event) {
if(timed_out)
return request.log.debug('Ignoring timed out state change', {'state':xhr.readyState, 'id':xhr.id})
request.log.debug('State change', {'state':xhr.readyState, 'id':xhr.id, 'timed_out':timed_out})
if(xhr.readyState === XHR.OPENED) {
request.log.debug('Request started', { 'id': xhr.id });
var cookies = [],
onInvalidHeader = function (key, error) {
error = new Error(`Header "${key}" contains invalid characters`);
// Do not process this request further.
did.response = true
did.loading = true
did.end = true
options.callback(error, xhr)
};
for (var key in options.headers) {
if (!options.headers.hasOwnProperty(key)) {
continue;
}
// Save all the cookies and add them at the end, because multiple Cookie values must be combined into a single header
if (String(key).toLowerCase() === 'cookie') {
cookies.push(options.headers[key]);
continue;
}
try {
if (Array.isArray(options.headers[key])) {
_.forEach(options.headers[key], function (eachValue) {
xhr.setRequestHeader(key, eachValue);
});
}
else {
xhr.setRequestHeader(key, options.headers[key]);
}
} catch (error) {
onInvalidHeader(key, error)
}
}
// Add `Cookie` header if cookies are present
if (cookies.length || options.cookiesFromJar) {
try {
var cookieString = cookies.join('; ') + (options.cookiesFromJar || '');
xhr.setRequestHeader('Cookie', cookieString);
// Also update the original request header for console logs
originalRequest.headers.upsert({
key: 'Cookie',
value: cookieString
});
} catch (error) {
onInvalidHeader('Cookie', error)
}
}
}
else if(xhr.readyState === XHR.HEADERS_RECEIVED)
on_response()
else if(xhr.readyState === XHR.LOADING) {
on_response()
on_loading()
}
else if(xhr.readyState === XHR.DONE) {
on_response()
on_loading()
on_end()
}
}
function on_response() {
if(did.response)
return
did.response = true
request.log.debug('Got response', {'id':xhr.id, 'status':xhr.status})
clearTimeout(xhr.timeoutTimer)
xhr.statusCode = xhr.status // Node request compatibility
// Construct postman-request compatible debug object
!xhr.request && (xhr.request = {});
xhr.request._debug = xhr._debugData || [{
request: {
method: options.method,
href: options.uri,
headers: originalRequest.headers.toJSON(),
httpVersion: '1.1'
},
response: {
statusCode: xhr.statusCode,
headers: parseHeadersString(xhr.getAllResponseHeaders()),
httpVersion: '1.1'
}
}];
if (xhr.statusCode === 0 && xhr._error) {
// Do not process this request further.
did.loading = true
did.end = true
return options.callback(xhr._error, xhr);
}
// Detect mixed content failure
if (xhr.statusCode === 0 && is_mixedContent(options.uri)) {
var mixedContent_err = new Error('Mixed Content request rejected: ' + options.uri);
mixedContent_err.code = MIXED_CONTENT_ERROR_CODE;
// Do not process this request further.
did.loading = true
did.end = true
return options.callback(mixedContent_err, xhr)
}
// Detect failed CORS requests.
if(is_cors && xhr.statusCode == 0) {
var cors_err = new Error('CORS request rejected: ' + options.uri);
cors_err.code = CORS_ERROR_CODE;
cors_err.cors = 'rejected';
// Do not process this request further.
did.loading = true
did.end = true
return options.callback(cors_err, xhr)
}
function done () {
// Trigger onStart before callback
did.onStart = true
options.onStart(xhr)
options.onResponse(null, xhr)
// Due to the weird dependency of `onStart` and `callback` order,
// we ensure that callback is not called before onStart.
// This happens only if we are waiting for cookies to be added into the cookie jar.
typeof did.callback === 'function' && did.callback();
}
// We are all done here if the cookie jar is not available
if (!(typeof (options.jar && options.jar.setCookie) === 'function')) {
return done();
}
// Add cookies into the jar
var addCookie = function (cookie, cb) {
options.jar.setCookie(cookie, options.uri, {ignoreError: true}, function () {
cb()
})
},
getSetCookieHeaders = function (headersString) {
var cookies = [];
(parseHeadersString(headersString) || []).filter(function (header) {
if (String(header && header.key).toLowerCase() === 'set-cookie') {
cookies.push(header.value);
}
});
return cookies;
},
cookies = getSetCookieHeaders(xhr.getAllResponseHeaders());
if (!(cookies && cookies.length)) {
return done();
}
forEachAsync(cookies, addCookie, function () {
done()
})
}
function on_loading() {
if(did.loading)
return
did.loading = true
request.log.debug('Response body loading', {'id':xhr.id})
// TODO: Maybe simulate "data" events by watching xhr.responseText
}
function on_end() {
if(did.end)
return
did.end = true
request.log.debug('Request done', {'id':xhr.id})
xhr.body = (options.encoding === null) ? xhr.response : xhr.responseText;
if(options.json) {
try {
xhr.body = (xhr.responseText) ? JSON.parse(xhr.responseText) : xhr.responseText;
}
catch (er) {
return options.callback(er, xhr)
}
}
// Call the final callback if `onStart` is already called
if (did.onStart) {
options.callback(null, xhr, xhr.body, xhr.request && xhr.request._debug)
}
// otherwise, save the callback which will be triggered later in the `done` function
else {
did.callback = options.callback.bind(this, null, xhr, xhr.body, xhr.request && xhr.request._debug)
}
}
} // request
request.withCredentials = false;
request.DEFAULT_TIMEOUT = DEFAULT_TIMEOUT;
var shortcuts = [
'get',
'post',
'put',
'head',
'del',
'options',
'trace',
'copy',
'lock',
'mkcol',
'move',
'purge',
'propfind',
'proppatch',
'unlock',
'report',
'mkactivity',
'checkout',
'merge',
'm-search',
'notify',
'subscribe',
'unsubscribe',
'patch',
'search'
];
var shortcutsToMethods = {
'del': 'delete'
};
//
// defaults
//
request.defaults = function(options, requester) {
var def = function (method) {
var d = function (params, callback) {
if(typeof params === 'string')
params = {'uri': params};
else {
params = JSON.parse(JSON.stringify(params));
}
for (var i in options) {
if (params[i] === undefined) params[i] = options[i]
}
return method(params, callback)
}
return d
}
var de = def(request)
shortcuts.forEach(function (method) {
de[method] = def(request[method])
})
return de
}
//
// HTTP method shortcuts
//
shortcuts.forEach(function(shortcut) {
var method = shortcutsToMethods[shortcut] || shortcut;
method = method.toUpperCase();
var func = shortcut.toLowerCase();
request[func] = function(opts) {
if(typeof opts === 'string')
opts = {'method':method, 'uri':opts};
else {
opts = JSON.parse(JSON.stringify(opts));
opts.method = method;
}
var args = [opts].concat(Array.prototype.slice.apply(arguments, [1]));
return request.apply(this, args);
}
})
//
// CouchDB shortcut
//
request.couch = function(options, callback) {
if(typeof options === 'string')
options = {'uri':options}
// Just use the request API to do JSON.
options.json = true
if(options.body)
options.json = options.body
delete options.body
callback = callback || noop
var xhr = request(options, couch_handler)
return xhr
function couch_handler(er, resp, body) {
if(er)
return callback(er, resp, body)
if((resp.statusCode < 200 || resp.statusCode > 299) && body.error) {
// The body is a Couch JSON object indicating the error.
er = new Error('CouchDB error: ' + (body.error.reason || body.error.error))
for (var key in body)
er[key] = body[key]
return callback(er, resp, body);
}
return callback(er, resp, body);
}
}
//
// Utility
//
function noop() {}
function getLogger() {
var logger = {}
, levels = ['trace', 'debug', 'info', 'warn', 'error']
, level, i
for(i = 0; i < levels.length; i++) {
level = levels[i]
logger[level] = noop
if(typeof console !== 'undefined' && console && console[level])
logger[level] = formatted(console, level)
}
return logger
}
function formatted(obj, method) {
return formatted_logger
function formatted_logger(str, context) {
if(typeof context === 'object')
str += ' ' + JSON.stringify(context)
return obj[method].call(obj, str)
}
}
function window_location () {
// jQuery #8138, IE may throw an exception when accessing
// a field from window.location if document.domain has been set
var ajaxLocation
try { ajaxLocation = location.href }
catch (e) {
// Use the href attribute of an A element since IE will modify it given document.location
ajaxLocation = document.createElement( "a" );
ajaxLocation.href = "";
ajaxLocation = ajaxLocation.href;
}
return ajaxLocation
}
// Return whether a URL is a cross-domain request.
function is_crossDomain(url) {
var rurl = /^([\w\+\.\-]+:)(?:\/\/([^\/?#:]*)(?::(\d+))?)?/
, ajaxLocation = window_location()
, ajaxLocParts = rurl.exec(ajaxLocation.toLowerCase()) || []
, parts = rurl.exec(url.toLowerCase() )
var result = !!(
parts &&
( parts[1] != ajaxLocParts[1]
|| parts[2] != ajaxLocParts[2]
|| (parts[3] || (parts[1] === "http:" ? 80 : 443)) != (ajaxLocParts[3] || (ajaxLocParts[1] === "http:" ? 80 : 443))
)
)
//console.debug('is_crossDomain('+url+') -> ' + result)
return result
}
function is_mixedContent (url) {
var rurl = /^([\w\+\.\-]+:)(?:\/\/([^\/?#:]*)(?::(\d+))?)?/
, ajaxLocation = window_location()
, ajaxLocParts = rurl.exec(ajaxLocation.toLowerCase()) || []
, parts = rurl.exec(url.toLowerCase() )
return parts[1] != ajaxLocParts[1] && !IS_LOCALHOST[parts[2]]
}
// MIT License from http://phpjs.org/functions/base64_encode:358
function b64_enc (data) {
// Encodes string using MIME base64 algorithm
var b64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
var o1, o2, o3, h1, h2, h3, h4, bits, i = 0, ac = 0, enc="", tmp_arr = [];
if (!data) {
return data;
}
// assume utf8 data
// data = this.utf8_encode(data+'');
do { // pack three octets into four hexets
o1 = data.charCodeAt(i++);
o2 = data.charCodeAt(i++);
o3 = data.charCodeAt(i++);
bits = o1<<16 | o2<<8 | o3;
h1 = bits>>18 & 0x3f;
h2 = bits>>12 & 0x3f;
h3 = bits>>6 & 0x3f;
h4 = bits & 0x3f;
// use hexets to index into b64, and append result to encoded string
tmp_arr[ac++] = b64.charAt(h1) + b64.charAt(h2) + b64.charAt(h3) + b64.charAt(h4);
} while (i < data.length);
enc = tmp_arr.join('');
switch (data.length % 3) {
case 1:
enc = enc.slice(0, -2) + '==';
break;
case 2:
enc = enc.slice(0, -1) + '=';
break;
}
return enc;
}
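For reference, the Basic auth value that `request()` builds with `b64_enc` above matches Node's built-in Buffer encoding (a sanity-check sketch, assuming a Node environment):

```javascript
// HTTP basic authentication value: 'Basic ' + base64("username:password").
var credentials = 'user:pass';
var header = 'Basic ' + Buffer.from(credentials).toString('base64');
// header → 'Basic dXNlcjpwYXNz'
```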
// Check if given header name is forbidden i.e, cannot be modified programmatically.
// Refer: https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_header_name
// @note The User-Agent header is no longer forbidden. However,
// Chrome will silently drop the header: https://bugs.chromium.org/p/chromium/issues/detail?id=571722
function isForbiddenHeader (headerName) {
headerName = String(headerName).toLowerCase();
return FORBIDDEN_HEADERS[headerName] ||
headerName.startsWith('proxy-') ||
headerName.startsWith('sec-');
}
// ensure that the .jar() function is available
request.jar = _.noop;
module.exports = request;
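The query-string `serialize` helper used inside `request()` above can be sketched standalone; arrays expand into repeated `key=value` pairs (an illustrative re-implementation, not the exported function):

```javascript
// Sketch of the QS serialization above: arrays become repeated pairs.
function serialize(obj) {
    var str = [];
    Object.keys(obj).forEach(function (key) {
        [].concat(obj[key]).forEach(function (value) {
            str.push(encodeURIComponent(key) + '=' + encodeURIComponent(value));
        });
    });
    return str.join('&');
}

var qs = serialize({ q: 'a b', tag: ['x', 'y'] });
// qs → 'q=a%20b&tag=x&tag=y'
```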


@@ -0,0 +1,303 @@
/**
* @fileOverview
*
* This module consists of all the request body transformer functions, based on the supported request body modes.
* Ideally, this should one day move to a function in the SDK, something like request.getNodeRequestOptions()
*
*
* _
* ( ) ,,,,,
* \\ . . ,
* \\ | - D ,
* (._) \__- | ,
* | |..
* \\|_ , ,---- _ |----.
* \__ ( ( / ) _
* | \/ \. ' _.| \ ( )
* | \ /( / /\_ \ //
* \ / ( / / ) //
* ( , / / , (_.)
* |......\ | \,
* / / ) \---
* /___/___^//
*/
var _ = require('lodash'),
CONTENT_TYPE_HEADER_KEY = 'Content-Type',
/**
* Map content-type to respective body language.
*
* @private
* @type {Object}
*/
CONTENT_TYPE_LANGUAGE = {
'html': 'text/html',
'text': 'text/plain',
'json': 'application/json',
'javascript': 'application/javascript',
'xml': 'application/xml'
},
STRING = 'string',
E = '',
oneNormalizedHeader,
// the following two are reducer functions. we keep it defined here to avoid redefinition upon each parse
urlEncodedBodyReducer,
formDataBodyReducer;
/**
* Find the enabled header with the given name.
*
* @todo Add this helper in Collection SDK.
*
* @private
* @param {HeaderList} headers
* @param {String} name
* @returns {Header|undefined}
*/
oneNormalizedHeader = function oneNormalizedHeader (headers, name) {
var i,
header;
// get all headers with `name`
headers = headers.reference[name.toLowerCase()];
if (Array.isArray(headers)) {
// traverse the headers list in reverse direction in order to find the last enabled
for (i = headers.length - 1; i >= 0; i--) {
header = headers[i];
if (header && !header.disabled) {
return header;
}
}
// bail out if no enabled header was found
return;
}
// return the single enabled header
if (headers && !headers.disabled) {
return headers;
}
};
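The reverse traversal in `oneNormalizedHeader` above means that, among duplicate headers, the last enabled one wins. A minimal standalone sketch (hypothetical `lastEnabled` helper operating on a plain array rather than a HeaderList):

```javascript
// Sketch of the last-enabled-wins lookup above.
function lastEnabled(headers) {
    // traverse in reverse to find the last header that is not disabled
    for (var i = headers.length - 1; i >= 0; i--) {
        if (headers[i] && !headers[i].disabled) { return headers[i]; }
    }
}

var winner = lastEnabled([
    { key: 'Content-Type', value: 'text/plain' },
    { key: 'Content-Type', value: 'application/json' },
    { key: 'Content-Type', value: 'application/xml', disabled: true }
]);
// winner.value → 'application/json'
```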
/**
* Reduces postman SDK url encoded form definition (flattened to array) into Node compatible body options
*
* @param {Object} form - url encoded form params accumulator
* @param {Object} param - url encoded form param
*
* @returns {Object}
*/
urlEncodedBodyReducer = function (form, param) {
if (!param || param.disabled) {
return form;
}
var key = param.key,
value = param.value;
// add the parameter to the form while accounting for duplicate values
if (!form.hasOwnProperty(key)) {
form[key] = value;
return form;
}
// at this point, we know that form has duplicate, so we need to accumulate it in an array
if (!Array.isArray(form[key])) {
form[key] = [form[key]];
}
form[key].push(value); // finally push the duplicate and return
return form;
};
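The duplicate-key behavior of `urlEncodedBodyReducer` above can be demonstrated without lodash (hypothetical `reduceUrlencoded` wrapper over a plain array of params):

```javascript
// Sketch of the reducer above: duplicate keys accumulate into an array,
// disabled params are skipped.
function reduceUrlencoded(params) {
    return params.reduce(function (form, param) {
        if (!param || param.disabled) { return form; }
        if (!Object.prototype.hasOwnProperty.call(form, param.key)) {
            form[param.key] = param.value;
            return form;
        }
        // duplicate key: promote to array and push
        if (!Array.isArray(form[param.key])) { form[param.key] = [form[param.key]]; }
        form[param.key].push(param.value);
        return form;
    }, {});
}

var form = reduceUrlencoded([
    { key: 'id', value: '1' },
    { key: 'id', value: '2' },
    { key: 'name', value: 'x', disabled: true }
]);
// form → { id: ['1', '2'] }
```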
/**
* Reduces postman SDK multi-part form definition (flattened to array) into Node compatible body options
*
* @param {Array} data - multi-part form params accumulator
* @param {Object} param - multi-part form param
*
* @returns {Array}
*/
formDataBodyReducer = function (data, param) {
if (!param || param.disabled) {
return data;
}
var formParam = {
key: param.key,
value: param.value
},
options; // kept undefined by default and set to an object only where needed; saves an object key-length check
// make sure that value is either string or read stream otherwise it'll cause error in postman-request
if (param.type !== 'file' && typeof formParam.value !== STRING) {
try {
formParam.value = JSON.stringify(formParam.value);
}
catch (err) {
formParam.value = E;
}
}
// make sure `filename` param is sent for every file without `value`
// so that `filename=""` is added to content-disposition header in form data
if (param.type === 'file' && !formParam.value && typeof param.fileName !== 'string') {
param.fileName = E;
formParam.value = E; // make sure value is not null/undefined ever
}
// if data has a truthy content type, we mutate the value to take the options. we are assuming that
// blank string will not be considered as an accepted content type.
if (param.contentType && typeof param.contentType === STRING) {
(options || (options = {})).contentType = param.contentType;
}
// additionally parse the file name and length if sent
// @note: Add support for fileName & fileLength option in Schema & SDK.
// The filepath property overrides filename and may contain a relative path.
if (typeof param.fileName === STRING) { (options || (options = {})).filename = param.fileName; }
if (typeof param.fileLength === 'number') { (options || (options = {})).knownLength = param.fileLength; }
// if options were set, add them to formParam
options && (formParam.options = options);
data.push(formParam);
return data;
};
/**
* This module exposes functions named after the Postman collection body modes. Each accepts the body
* definition, usually like `request.body.raw` where mode is `raw`, and returns the equivalent structure that needs
* to be sent to the node request module
*/
module.exports = {
/**
* @param {Object} content - request body content
* @param {Request} [request] - request object
* @returns {Object}
*/
raw: function (content, request) {
var contentLanguage = _.get(request, 'body.options.raw.language', 'text');
// Add `Content-Type` header from body options if not set already
if (request && !oneNormalizedHeader(request.headers, CONTENT_TYPE_HEADER_KEY)) {
request.headers.add({
key: CONTENT_TYPE_HEADER_KEY,
value: CONTENT_TYPE_LANGUAGE[contentLanguage] || CONTENT_TYPE_LANGUAGE.text,
system: true
});
}
if (typeof content !== STRING) {
content = JSON.stringify(content);
}
return {
body: content
};
},
/**
* @param {Object} content - request body content
* @returns {Object}
*/
urlencoded: function (content) {
if (content && _.isFunction(content.all)) { content = content.all(); } // flatten the body content
return {
form: _.reduce(content, urlEncodedBodyReducer, {})
};
},
/**
* @param {Object} content - request body content
* @returns {Object}
*/
formdata: function (content) {
if (content && _.isFunction(content.all)) { content = content.all(); } // flatten the body content
return {
formData: _.reduce(content, formDataBodyReducer, [])
};
},
/**
* @param {Object} content - request body content
* @returns {Object}
*/
file: function (content) {
return {
body: content && content.content
};
},
/**
* @param {Object} content - request body content
* @param {Request} [request] - Request object
* @returns {Object}
*/
graphql: function (content, request) {
var body;
// implicitly add `Content-Type` header if not set already
if (request && !oneNormalizedHeader(request.headers, CONTENT_TYPE_HEADER_KEY)) {
request.headers.add({
key: CONTENT_TYPE_HEADER_KEY,
value: CONTENT_TYPE_LANGUAGE.json,
system: true
});
}
// if `variables` is an object, just stringify the entire content
if (content && typeof content.variables !== STRING) {
// if any property of graphql is undefined, it will not get stringified
// as a result, if no content object's properties are present then the
// result will be a blank object being sent.
// note that this behavior has to be imitated later when we are
// receiving variables as string
return {
body: JSON.stringify({
query: content.query,
operationName: content.operationName,
variables: content.variables
})
};
}
// otherwise, traverse the graphql properties and generate the
// stringified content. This avoids parsing the variables.
body = [];
if (content.hasOwnProperty('query') && (typeof content.query === STRING)) {
body.push('"query":' + JSON.stringify(content.query));
}
if (content.hasOwnProperty('operationName') && (typeof content.operationName === STRING)) {
body.push('"operationName":' + JSON.stringify(content.operationName));
}
if (content.hasOwnProperty('variables') && (typeof content.variables === STRING) &&
// even though users are free to send malformed JSON strings, the empty string has to be specially
// disallowed: in most default cases, a text editor used to accept this data will send a blank string
// for an empty editor state, making the majority default use case an error flow whose handling would
// then have to be coded everywhere runtime is used.
(content.variables !== E)) {
body.push('"variables":' + content.variables); // already a stringified JSON
}
return {
body: '{' + body.join(',') + '}' // note that [] body = {} ¯\_(ツ)_/¯
};
}
};
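The graphql branch above hand-assembles the JSON body when `variables` arrives pre-stringified, splicing the string in as-is to avoid parsing it. A self-contained sketch of that assembly (local helper for illustration, not the exported API):

```javascript
// Illustrative re-creation of the string-variables path of the graphql
// builder above: each present property is serialized individually and the
// pre-stringified variables are concatenated in without being parsed.
function buildGraphqlBody (content) {
    var body = [];

    if (typeof content.query === 'string') {
        body.push('"query":' + JSON.stringify(content.query));
    }

    if (typeof content.operationName === 'string') {
        body.push('"operationName":' + JSON.stringify(content.operationName));
    }

    // variables is already stringified JSON; the empty string is disallowed
    if (typeof content.variables === 'string' && content.variables !== '') {
        body.push('"variables":' + content.variables);
    }

    return '{' + body.join(',') + '}';
}

var body = buildGraphqlBody({
    query: 'query ($id: ID!) { user(id: $id) { name } }',
    variables: '{"id":"42"}'
});

console.log(body);
// {"query":"query ($id: ID!) { user(id: $id) { name } }","variables":{"id":"42"}}
```

Because the pieces are joined manually, a malformed `variables` string passes through untouched, which mirrors the "users are free to send even malformed json string" note above.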

769
node_modules/postman-runtime/lib/requester/core.js generated vendored Normal file

@@ -0,0 +1,769 @@
var dns = require('dns'),
constants = require('constants'),
_ = require('lodash'),
uuid = require('uuid/v4'),
sdk = require('postman-collection'),
urlEncoder = require('postman-url-encoder'),
Socket = require('net').Socket,
requestBodyBuilders = require('./core-body-builder'),
version = require('../../package.json').version,
LOCAL_IPV6 = '::1',
LOCAL_IPV4 = '127.0.0.1',
LOCALHOST = 'localhost',
SOCKET_TIMEOUT = 500,
COLON = ':',
STRING = 'string',
HOSTS_TYPE = {
HOST_IP_MAP: 'hostIpMap'
},
HTTPS = 'https',
HTTPS_DEFAULT_PORT = 443,
HTTP_DEFAULT_PORT = 80,
S_CONNECT = 'connect',
S_ERROR = 'error',
S_TIMEOUT = 'timeout',
SSL_OP_NO = 'SSL_OP_NO_',
ERROR_ADDRESS_RESOLVE = 'NETERR: getaddrinfo ENOTFOUND ',
/**
* List of request methods without body.
*
* @private
* @type {Object}
*
* @note hash is used to reduce the lookup cost
* these methods are picked from the app, which don't support body.
* @todo move this list to SDK for parity.
*/
METHODS_WITHOUT_BODY = {
get: true,
copy: true,
head: true,
purge: true,
unlock: true
},
/**
* List of request options with their corresponding protocol profile behavior property name;
*
* @private
* @type {Object}
*/
PPB_OPTS = {
// enable or disable certificate verification
strictSSL: 'strictSSL',
// maximum number of redirects to follow (default: 10)
maxRedirects: 'maxRedirects',
// controls redirect behavior
// keeping the same convention as Newman
followRedirect: 'followRedirects',
followAllRedirects: 'followRedirects',
// retain `authorization` header when a redirect happens to a different hostname
followAuthorizationHeader: 'followAuthorizationHeader',
// redirect with the original HTTP method (default: redirects with GET)
followOriginalHttpMethod: 'followOriginalHttpMethod',
// removes the `referer` header when a redirect happens (default: false)
// @note `referer` header set in the initial request will be preserved during redirect chain
removeRefererHeader: 'removeRefererHeaderOnRedirect'
},
/**
* System headers which can be removed before sending the request if set
* in disabledSystemHeaders protocol profile behavior.
*
*
* @private
* @type {Array}
*/
ALLOWED_BLACKLIST_HEADERS = ['content-type', 'content-length', 'accept-encoding', 'connection'],
/**
* Find the enabled header with the given name.
*
* @todo Add this helper in Collection SDK.
*
* @private
* @param {HeaderList} headers
* @param {String} name
* @returns {Header|undefined}
*/
oneNormalizedHeader = function oneNormalizedHeader (headers, name) {
var i,
header;
// get all headers with `name`
headers = headers.reference[name.toLowerCase()];
if (Array.isArray(headers)) {
// traverse the headers list in reverse direction in order to find the last enabled
for (i = headers.length - 1; i >= 0; i--) {
header = headers[i];
if (header && !header.disabled) {
return header;
}
}
// bail out if no enabled header was found
return;
}
// return the single enabled header
if (headers && !headers.disabled) {
return headers;
}
},
/**
* Add static system headers if they are not disabled using `disabledSystemHeaders`
* protocol profile behavior.
* Add the system headers provided as requester configuration.
*
* @note Don't traverse the user provided `disabledSystemHeaders` object
* to ensure runtime allowed headers and also for security reasons.
*
* @private
* @param {Request} request
* @param {Object} options
* @param {Object} disabledHeaders
* @param {Object} systemHeaders
*/
addSystemHeaders = function (request, options, disabledHeaders, systemHeaders) {
var key,
headers = request.headers;
[
{key: 'User-Agent', value: `PostmanRuntime/${version}`},
{key: 'Accept', value: '*/*'},
{key: 'Cache-Control', value: 'no-cache'},
{key: 'Postman-Token', value: uuid()},
{key: 'Host', value: options.url && options.url.host},
{key: 'Accept-Encoding', value: 'gzip, deflate, br'},
{key: 'Connection', value: 'keep-alive'}
].forEach(function (header) {
key = header.key.toLowerCase();
// add system header only if,
// 1. there's no user added header
// 2. not disabled using disabledSystemHeaders
!disabledHeaders[key] && !oneNormalizedHeader(headers, key) &&
headers.add({
key: header.key,
value: header.value,
system: true
});
});
for (key in systemHeaders) {
if (systemHeaders.hasOwnProperty(key)) {
// upsert instead of add to replace user-defined headers also
headers.upsert({
key: key,
value: systemHeaders[key],
system: true
});
}
}
},
/**
* Helper function to extract top level domain for the given hostname
*
* @private
*
* @param {String} hostname
* @returns {String}
*/
getTLD = function (hostname) {
if (!hostname) {
return '';
}
hostname = String(hostname);
return hostname.substring(hostname.lastIndexOf('.') + 1);
},
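A quick sketch of the `getTLD` helper above, which drives the `*.localhost` special-casing (RFC 6761) used in the lookup override and in `getRequestOptions`:

```javascript
// Local copy of the getTLD helper above: it returns the substring after the
// last dot, so every *.localhost hostname reports "localhost" as its TLD.
function getTLD (hostname) {
    if (!hostname) { return ''; }

    hostname = String(hostname);

    return hostname.substring(hostname.lastIndexOf('.') + 1);
}

console.log(getTLD('api.dev.localhost')); // localhost
console.log(getTLD('localhost'));         // localhost (no dot: lastIndexOf returns -1)
console.log(getTLD('example.com'));       // com
```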
/**
* Abstracts out the logic for domain resolution
*
* @param options
* @param hostLookup
* @param hostLookup.type
* @param hostLookup.hostIpMap
* @param hostname
* @param callback
*/
_lookup = function (options, hostLookup, hostname, callback) {
var hostIpMap,
resolvedFamily = 4,
resolvedAddr;
// first we try to resolve the hostname using hosts file configuration
hostLookup && hostLookup.type === HOSTS_TYPE.HOST_IP_MAP &&
(hostIpMap = hostLookup[HOSTS_TYPE.HOST_IP_MAP]) && (resolvedAddr = hostIpMap[hostname]);
if (resolvedAddr) {
// since we only get a string for the resolved ip address, we manually find its family (4 or 6)
// there will be at least one `:` in an IPv6 address (https://en.wikipedia.org/wiki/IPv6_address#Representation)
resolvedAddr.indexOf(COLON) !== -1 && (resolvedFamily = 6); // eslint-disable-line lodash/prefer-includes
// returning error synchronously causes uncaught error because listeners are not attached to error events
// on socket yet
return setImmediate(function () {
callback(null, resolvedAddr, resolvedFamily);
});
}
// no hosts file configuration provided or no match found. Falling back to normal dns lookup
return dns.lookup(hostname, options, callback);
},
/**
* Tries to make a TCP connection to the given host and port. If successful, the connection is immediately
* destroyed.
*
* @param host
* @param port
* @param callback
*/
connect = function (host, port, callback) {
var socket = new Socket(),
called,
done = function (type) {
if (!called) {
callback(type === S_CONNECT ? null : true); // eslint-disable-line callback-return
called = true;
this.destroy();
}
};
socket.setTimeout(SOCKET_TIMEOUT, done.bind(socket, S_TIMEOUT));
socket.once('connect', done.bind(socket, S_CONNECT));
socket.once('error', done.bind(socket, S_ERROR));
socket.connect(port, host);
socket = null;
},
/**
* Override DNS lookups in Node, to handle localhost as a special case.
* Chrome tries connecting to IPv6 by default, so we try the same thing.
*
* @param lookupOptions
* @param lookupOptions.port
* @param lookupOptions.network
* @param lookupOptions.network.restrictedAddresses
* @param lookupOptions.network.hostLookup
* @param lookupOptions.network.hostLookup.type
* @param lookupOptions.network.hostLookup.hostIpMap
* @param hostname
* @param options
* @param callback
*/
lookup = function (lookupOptions, hostname, options, callback) {
var self = this,
lowercaseHost = hostname && hostname.toLowerCase(),
networkOpts = lookupOptions.network || {},
hostLookup = networkOpts.hostLookup;
// do dns.lookup if hostname is not one of:
// - localhost
// - *.localhost
if (getTLD(lowercaseHost) !== LOCALHOST) {
return _lookup(options, hostLookup, lowercaseHost, function (err, addr, family) {
if (err) { return callback(err); }
return callback(self.isAddressRestricted(addr, networkOpts) ?
new Error(ERROR_ADDRESS_RESOLVE + hostname) : null, addr, family);
});
}
// Try checking if we can connect to IPv6 localhost ('::1')
connect(LOCAL_IPV6, lookupOptions.port, function (err) {
// use IPv4 if we cannot connect to IPv6
if (err) { return callback(null, LOCAL_IPV4, 4); }
callback(null, LOCAL_IPV6, 6);
});
},
/**
* Helper function to return postman-request compatible URL parser which
* respects the `disableUrlEncoding` protocol profile behavior.
*
* @private
* @param {Boolean} disableUrlEncoding
* @returns {Object}
*/
urlParser = function (disableUrlEncoding) {
return {
parse: function (urlToParse) {
return urlEncoder.toNodeUrl(urlToParse, disableUrlEncoding);
},
resolve: function (base, relative) {
if (typeof base === STRING) {
// @note we parse base URL here to respect `disableUrlEncoding`
// option even though resolveNodeUrl() accepts it as a string
base = urlEncoder.toNodeUrl(base, disableUrlEncoding);
}
return urlEncoder.resolveNodeUrl(base, relative);
}
};
},
/**
* Resolves given property with protocol profile behavior.
* Returns protocolProfileBehavior value if the given property is present.
* Else, returns value defined in default options.
*
* @param {String} property - Property name to look for
* @param {Object} defaultOpts - Default request options
* @param {Object} protocolProfileBehavior - Protocol profile behaviors
* @returns {*} - Resolved request option value
*/
resolveWithProtocolProfileBehavior = function (property, defaultOpts, protocolProfileBehavior) {
// bail out if property or defaultOpts is not defined
if (!(property && defaultOpts)) { return; }
if (protocolProfileBehavior && protocolProfileBehavior.hasOwnProperty(property)) {
return protocolProfileBehavior[property];
}
return defaultOpts[property];
};
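The precedence implemented by `resolveWithProtocolProfileBehavior` above is simply: the per-request protocol profile behavior wins when the property is present on it, otherwise the requester default applies. A standalone sketch (local function name, not the module's export):

```javascript
// Illustrative re-implementation of the precedence rule used by
// resolveWithProtocolProfileBehavior above.
function resolveWithPPB (property, defaultOpts, protocolProfileBehavior) {
    // bail out if property or defaultOpts is not defined
    if (!(property && defaultOpts)) { return; }

    // the per-request behavior takes precedence when the key is present
    if (protocolProfileBehavior &&
        Object.prototype.hasOwnProperty.call(protocolProfileBehavior, property)) {
        return protocolProfileBehavior[property];
    }

    // otherwise fall back to the requester default
    return defaultOpts[property];
}

console.log(resolveWithPPB('maxRedirects', {maxRedirects: 10}, {maxRedirects: 3})); // 3
console.log(resolveWithPPB('strictSSL', {strictSSL: true}, {}));                    // true
```

Note that `hasOwnProperty` (rather than a truthiness check) lets a behavior explicitly set to `false` or `0` still override the default.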
module.exports = {
/**
* Creates a node request compatible options object from a request.
*
* @param request
* @param defaultOpts
* @param defaultOpts.agents
* @param defaultOpts.network
* @param defaultOpts.keepAlive
* @param defaultOpts.timeout
* @param defaultOpts.strictSSL
* @param defaultOpts.cookieJar The cookie jar to use (if any).
* @param defaultOpts.followRedirects
* @param defaultOpts.followOriginalHttpMethod
* @param defaultOpts.maxRedirects
* @param defaultOpts.maxResponseSize
* @param defaultOpts.implicitCacheControl
* @param defaultOpts.implicitTraceHeader
* @param defaultOpts.removeRefererHeaderOnRedirect
* @param defaultOpts.timings
* @param protocolProfileBehavior
* @returns {{}}
*/
getRequestOptions: function (request, defaultOpts, protocolProfileBehavior) {
!defaultOpts && (defaultOpts = {});
!protocolProfileBehavior && (protocolProfileBehavior = {});
var options = {},
networkOptions = defaultOpts.network || {},
self = this,
bodyParams,
useWhatWGUrlParser = defaultOpts.useWhatWGUrlParser,
disableUrlEncoding = protocolProfileBehavior.disableUrlEncoding,
disabledSystemHeaders = protocolProfileBehavior.disabledSystemHeaders || {},
// the system headers provided in requester configuration
systemHeaders = defaultOpts.systemHeaders || {},
url = useWhatWGUrlParser ? urlEncoder.toNodeUrl(request.url, disableUrlEncoding) :
urlEncoder.toLegacyNodeUrl(request.url.toString(true)),
isSSL = _.startsWith(url.protocol, HTTPS),
isTunnelingProxy = request.proxy && (request.proxy.tunnel || isSSL),
header,
reqOption,
portNumber,
behaviorName,
port = url && url.port,
hostname = url && url.hostname && url.hostname.toLowerCase(),
proxyHostname = request.proxy && request.proxy.host;
// resolve all *.localhost to localhost itself
// RFC: 6761 section 6.3 (https://tools.ietf.org/html/rfc6761#section-6.3)
if (getTLD(hostname) === LOCALHOST) {
// @note setting hostname to localhost ensures that we override lookup function
hostname = LOCALHOST;
}
if (getTLD(proxyHostname) === LOCALHOST) {
proxyHostname = LOCALHOST;
}
options.url = url;
options.method = request.method;
options.timeout = defaultOpts.timeout;
options.gzip = true;
options.brotli = true;
options.time = defaultOpts.timings;
options.verbose = defaultOpts.verbose;
options.agents = defaultOpts.agents;
options.extraCA = defaultOpts.extendedRootCA;
options.ignoreProxyEnvironmentVariables = defaultOpts.ignoreProxyEnvironmentVariables;
// Disable encoding of URL in postman-request in order to use pre-encoded URL object returned from
// toNodeUrl() function of postman-url-encoder
options.disableUrlEncoding = true;
// Ensures that "request" creates URL encoded formdata or querystring as
// foo=bar&foo=baz instead of foo[0]=bar&foo[1]=baz
options.useQuerystring = true;
// set encoding to null so that the response is a stream
options.encoding = null;
// Re-encode status message using `utf8` character encoding in postman-request.
// This is done to correctly represent status messages with characters that lie outside
// the range of `latin1` encoding (which is the default encoding in which status message is returned)
options.statusMessageEncoding = 'utf8';
// eslint-disable-next-line guard-for-in
for (reqOption in PPB_OPTS) {
behaviorName = PPB_OPTS[reqOption];
options[reqOption] = resolveWithProtocolProfileBehavior(behaviorName, defaultOpts, protocolProfileBehavior);
}
// set cookie jar if not disabled
if (!protocolProfileBehavior.disableCookies) {
options.jar = defaultOpts.cookieJar || true;
}
// use the server's cipher suite order instead of the client's during negotiation
if (protocolProfileBehavior.tlsPreferServerCiphers) {
options.honorCipherOrder = true;
}
// the SSL and TLS protocol versions to disable during negotiation
if (Array.isArray(protocolProfileBehavior.tlsDisabledProtocols)) {
protocolProfileBehavior.tlsDisabledProtocols.forEach(function (protocol) {
// since secureOptions doesn't support TLSv1.3 before Node 14
// @todo remove the if condition when we drop support for Node 12
if (protocol === 'TLSv1_3' && !constants[SSL_OP_NO + protocol]) {
options.maxVersion = 'TLSv1.2';
}
else {
options.secureOptions |= constants[SSL_OP_NO + protocol];
}
});
}
// order of cipher suites that the SSL server profile uses to establish a secure connection
if (Array.isArray(protocolProfileBehavior.tlsCipherSelection)) {
options.ciphers = protocolProfileBehavior.tlsCipherSelection.join(':');
}
if (typeof defaultOpts.maxResponseSize === 'number') {
options.maxResponseSize = defaultOpts.maxResponseSize;
}
// Request body may return different options depending on the type of the body.
// @note getRequestBody may add system headers based on intent
bodyParams = self.getRequestBody(request, protocolProfileBehavior);
// Disable 'Cache-Control' and 'Postman-Token' based on global options
// @note this also makes 'cache-control' and 'postman-token' part of `disabledSystemHeaders`
!defaultOpts.implicitCacheControl && (disabledSystemHeaders['cache-control'] = true);
!defaultOpts.implicitTraceHeader && (disabledSystemHeaders['postman-token'] = true);
// Add additional system headers to the request instance
addSystemHeaders(request, options, disabledSystemHeaders, systemHeaders);
// Don't add `Host` header if disabled using disabledSystemHeaders
// @note This can't be part of the `blacklistHeaders` option as `setHost` is
// a Node.js http.request option that specifies whether or not to
// automatically add the Host header.
if (disabledSystemHeaders.host) {
header = oneNormalizedHeader(request.headers, 'host');
// only possible with AWS auth
header && header.system && (header.disabled = true);
// set `setHost` to false if there's no host header defined by the user
// or, the present host is added by the system.
(!header || header.system) && (options.setHost = false);
}
// Set `allowContentTypeOverride` if content-type header is disabled,
// this allows overriding (if invalid) the content-type for form-data
// and urlencoded request body.
if (disabledSystemHeaders['content-type']) {
options.allowContentTypeOverride = true;
}
options.blacklistHeaders = [];
ALLOWED_BLACKLIST_HEADERS.forEach(function (headerKey) {
if (!disabledSystemHeaders[headerKey]) { return; } // not disabled
header = oneNormalizedHeader(request.headers, headerKey);
// content-type added by body helper
header && header.system && (header.disabled = true);
// blacklist only if it's missing or part of system added headers
(!header || header.system) && options.blacklistHeaders.push(headerKey);
// @note for non-GET requests, if no 'content-length' is set, the body
// is assumed to be chunked and a 'transfer-encoding' header is added.
// here, we ensure that blacklisting 'content-length' will also blacklist
// the 'transfer-encoding' header.
if (headerKey === 'content-length') {
header = oneNormalizedHeader(request.headers, 'transfer-encoding');
(!header || header.system) && options.blacklistHeaders.push('transfer-encoding');
}
});
// Finally, get headers object
options.headers = request.getHeaders({enabled: true, sanitizeKeys: true});
// override URL parser to WhatWG URL parser
if (useWhatWGUrlParser) {
options.urlParser = urlParser(disableUrlEncoding);
}
// override DNS lookup
if (networkOptions.restrictedAddresses || hostname === LOCALHOST ||
(!isTunnelingProxy && proxyHostname === LOCALHOST) || networkOptions.hostLookup) {
// Use proxy port for localhost resolution in case of non-tunneling proxy
// because the request will be sent to proxy server by postman-request
if (request.proxy && !isTunnelingProxy) {
portNumber = Number(request.proxy.port);
}
// Otherwise, use request's port
else {
portNumber = Number(port) || (isSSL ? HTTPS_DEFAULT_PORT : HTTP_DEFAULT_PORT);
}
_.isFinite(portNumber) && (options.lookup = lookup.bind(this, {
port: portNumber,
network: networkOptions
}));
}
_.assign(options, bodyParams, {
// @note these common agent options can be overridden by specifying
// custom http/https agents using requester option `agents`
agentOptions: {
keepAlive: defaultOpts.keepAlive
}
});
return options;
},
/**
* Processes a request body and puts it in a format compatible with
* the "request" library.
*
* @todo: Move this to the SDK.
* @param request - Request object
* @param protocolProfileBehavior - Protocol profile behaviors
*
* @returns {Object}
*/
getRequestBody: function (request, protocolProfileBehavior) {
if (!(request && request.body)) {
return;
}
var i,
property,
requestBody = request.body,
requestBodyType = requestBody.mode,
requestMethod = (typeof request.method === STRING) ? request.method.toLowerCase() : undefined,
bodyIsEmpty = requestBody.isEmpty(),
bodyIsDisabled = requestBody.disabled,
bodyContent = requestBody[requestBodyType],
// flag to decide body pruning for METHODS_WITHOUT_BODY
// @note this will be `true` even if protocolProfileBehavior is undefined
pruneBody = protocolProfileBehavior ? !protocolProfileBehavior.disableBodyPruning : true;
// early bailout for empty or disabled body (this area has some legacy shenanigans)
if (bodyIsEmpty || bodyIsDisabled) {
return;
}
// body is empty if all the params in urlencoded and formdata body are disabled
// @todo update Collection SDK isEmpty method to account for this
if (sdk.PropertyList.isPropertyList(bodyContent)) {
bodyIsEmpty = true;
for (i = bodyContent.members.length - 1; i >= 0; i--) {
property = bodyContent.members[i];
// bail out if a single enabled property is present
if (property && !property.disabled) {
bodyIsEmpty = false;
break;
}
}
// bail out if body is empty
if (bodyIsEmpty) {
return;
}
}
// bail out if request method doesn't support body and pruneBody is true.
if (METHODS_WITHOUT_BODY[requestMethod] && pruneBody) {
return;
}
// even if the body is not empty, bail out when the body type is not known, since we do not know how to parse it
//
// @note if you'd like to support additional body types beyond formdata, url-encoding, etc, add the same to
// the builder module
if (!requestBodyBuilders.hasOwnProperty(requestBodyType)) {
return;
}
return requestBodyBuilders[requestBodyType](bodyContent, request);
},
/**
* Returns a JSON compatible with the Node's request library. (Also contains the original request)
*
* @param rawResponse Can be an XHR response or a Node request compatible response; it also contains information
* about the actual request that was sent.
* @param requestOptions Options that were used to send the request.
* @param responseBody Body as a string.
*/
jsonifyResponse: function (rawResponse, requestOptions, responseBody) {
if (!rawResponse) {
return;
}
var responseJSON;
if (rawResponse.toJSON) {
responseJSON = rawResponse.toJSON();
responseJSON.request && _.assign(responseJSON.request, {
data: requestOptions.form || requestOptions.formData || requestOptions.body || {},
uri: { // @todo remove this
href: requestOptions.url && requestOptions.url.href || requestOptions.url
},
url: requestOptions.url && requestOptions.url.href || requestOptions.url
});
rawResponse.rawHeaders &&
(responseJSON.headers = this.arrayPairsToObject(rawResponse.rawHeaders) || responseJSON.headers);
return responseJSON;
}
responseBody = responseBody || '';
// @todo drop support or isolate XHR requester in v8
// XHR :/
return {
statusCode: rawResponse.status,
body: responseBody,
headers: _.transform(sdk.Header.parse(rawResponse.getAllResponseHeaders()), function (acc, header) {
if (acc[header.key]) {
!Array.isArray(acc[header.key]) && (acc[header.key] = [acc[header.key]]);
acc[header.key].push(header.value);
}
else {
acc[header.key] = header.value;
}
}, {}),
request: {
method: requestOptions.method || 'GET',
headers: requestOptions.headers,
uri: { // @todo remove this
href: requestOptions.url && requestOptions.url.href || requestOptions.url
},
url: requestOptions.url && requestOptions.url.href || requestOptions.url,
data: requestOptions.form || requestOptions.formData || requestOptions.body || {}
}
};
},
/**
* ArrayBuffer to String
*
* @param {ArrayBuffer} buffer
* @returns {String}
*/
arrayBufferToString: function (buffer) {
var str = '',
uArrayVal = new Uint8Array(buffer),
i,
ii;
for (i = 0, ii = uArrayVal.length; i < ii; i++) {
str += String.fromCharCode(uArrayVal[i]);
}
return str;
},
/**
* Converts an array of sequential pairs to an object.
*
* @param arr
* @returns {{}}
*
* @example
* ['a', 'b', 'c', 'd'] ====> {a: 'b', c: 'd' }
*/
arrayPairsToObject: function (arr) {
if (!_.isArray(arr)) {
return;
}
var obj = {},
key,
val,
i,
ii;
for (i = 0, ii = arr.length; i < ii; i += 2) {
key = arr[i];
val = arr[i + 1];
if (_.has(obj, key)) {
!_.isArray(obj[key]) && (obj[key] = [obj[key]]);
obj[key].push(val);
}
else {
obj[key] = val;
}
}
return obj;
},
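The `arrayPairsToObject` helper above folds a flat key/value sequence (the shape of Node's `response.rawHeaders`) into an object, accumulating repeated keys into arrays. A standalone sketch of the same folding, as a local re-implementation:

```javascript
// Illustrative re-implementation of arrayPairsToObject above, fed the kind
// of flat pair list Node exposes as response.rawHeaders.
function arrayPairsToObject (arr) {
    if (!Array.isArray(arr)) { return; }

    var obj = {},
        key,
        val,
        i;

    for (i = 0; i < arr.length; i += 2) {
        key = arr[i];
        val = arr[i + 1];

        if (Object.prototype.hasOwnProperty.call(obj, key)) {
            // repeated key: accumulate values in an array
            if (!Array.isArray(obj[key])) { obj[key] = [obj[key]]; }
            obj[key].push(val);
        }
        else {
            obj[key] = val;
        }
    }

    return obj;
}

console.log(arrayPairsToObject(['Set-Cookie', 'a=1', 'Host', 'example.com', 'Set-Cookie', 'b=2']));
// { 'Set-Cookie': [ 'a=1', 'b=2' ], Host: 'example.com' }
```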
/**
* Checks if a given host or IP has been restricted in the options.
*
* @param {String} host
* @param {Object} networkOptions
* @param {Array<String>} networkOptions.restrictedAddresses
*
* @returns {Boolean}
*/
isAddressRestricted: function (host, networkOptions) {
return networkOptions.restrictedAddresses &&
networkOptions.restrictedAddresses[(host && host.toLowerCase())];
}
};

334
node_modules/postman-runtime/lib/requester/dry-run.js generated vendored Normal file

@@ -0,0 +1,334 @@
/* istanbul ignore file */
// @todo
// 1. Return with annotations like (overridden headers, auth headers etc.)
// 2. Utilize requester (core.js) methods for dryRun
// 3. Add tests
const _ = require('lodash'),
async = require('async'),
mime = require('mime-types'),
urlEncoder = require('postman-url-encoder'),
Request = require('postman-collection').Request,
authorizeRequest = require('../authorizer').authorizeRequest,
authHandlers = require('../authorizer').AuthLoader.handlers,
version = require('../../package.json').version,
CALCULATED_AT_RUNTIME = '<calculated when request is sent>',
COOKIE = 'Cookie',
FUNCTION = 'function',
CONTENT_TYPE = 'Content-Type',
DEFAULT_MIME_TYPE = 'application/octet-stream',
CONTENT_TYPE_URLENCODED = 'application/x-www-form-urlencoded',
CONTENT_TYPE_FORMDATA = 'multipart/form-data; boundary=' + CALCULATED_AT_RUNTIME,
CONTENT_TYPE_LANGUAGE = {
'html': 'text/html',
'text': 'text/plain',
'json': 'application/json',
'javascript': 'application/javascript',
'xml': 'application/xml'
},
BODY_MODE = {
raw: 'raw',
file: 'file',
graphql: 'graphql',
formdata: 'formdata',
urlencoded: 'urlencoded'
};
/**
* Check if request body is empty and also handles disabled params for urlencoded
* and formdata bodies.
*
* @todo Update Collection SDK isEmpty method to account for this.
*
* @private
* @param {RequestBody} body
* @returns {Boolean}
*/
function bodyIsEmpty (body) {
if (!body || body.disabled || body.isEmpty()) {
return true;
}
var i,
param,
mode = body.mode;
if (!(mode === BODY_MODE.formdata || mode === BODY_MODE.urlencoded)) {
return false;
}
for (i = body[mode].members.length - 1; i >= 0; i--) {
param = body[mode].members[i];
// bail out if a single enabled param is present
if (param && !param.disabled) {
return false;
}
}
return true;
}
/**
* Add new System header.
*
* @param {object} headers
* @param {String} key
* @param {String} value
*/
function addSystemHeader (headers, key, value) {
headers.add({
key: key,
value: value,
system: true
});
}
/**
* Authorize the given request.
*
* @private
* @param {Request} request
* @param {Function} callback
*/
function setAuthorization (request, callback) {
authorizeRequest(request, function (err, clonedRequest) {
// @note authorizeRequest returns a cloned request.
!clonedRequest && (clonedRequest = new Request(request.toJSON()));
if (err) {
return callback(null, clonedRequest);
}
var auth = request.auth,
authType = auth && auth.type,
manifest = _.get(authHandlers, [authType, 'manifest']),
headers = _.get(clonedRequest, 'headers.reference') || {},
queryParams = _.get(clonedRequest, 'url.query.reference') || {},
bodyParams = _.get(clonedRequest, 'body.urlencoded.reference') || {},
propertyList,
propertyKey,
property;
if (authType === 'apikey' && (auth = auth.apikey)) {
propertyKey = String(auth.get('key')).toLowerCase();
propertyList = auth.get('in') === 'query' ? queryParams : headers;
if ((property = propertyList[propertyKey])) {
Array.isArray(property) && (property = property[property.length - 1]);
property.auth = true;
}
return callback(null, clonedRequest);
}
if (!(manifest && manifest.updates)) {
return callback(null, clonedRequest);
}
manifest.updates.forEach(function (update) {
propertyKey = update.property;
switch (update.type) {
case 'header':
propertyKey = propertyKey.toLowerCase();
propertyList = headers;
break;
case 'url.param':
propertyList = queryParams;
break;
case 'body.urlencoded':
propertyList = bodyParams;
break;
default: return;
}
if ((property = propertyList[propertyKey])) {
Array.isArray(property) && (property = property[property.length - 1]);
property.auth = true;
}
});
callback(null, clonedRequest);
});
}
/**
* Adds Content-Type header based on selected request body.
*
* @private
* @param {Request} request
* @param {Function} callback
*/
function setContentType (request, callback) {
// bail out if body is empty
if (bodyIsEmpty(request.body)) {
return callback(null, request);
}
var headers = request.headers,
contentLanguage;
switch (request.body.mode) {
case BODY_MODE.raw:
contentLanguage = _.get(request, 'body.options.raw.language', 'text');
addSystemHeader(headers, CONTENT_TYPE, CONTENT_TYPE_LANGUAGE[contentLanguage] ||
CONTENT_TYPE_LANGUAGE.text);
break;
case BODY_MODE.urlencoded:
addSystemHeader(headers, CONTENT_TYPE, CONTENT_TYPE_URLENCODED);
break;
case BODY_MODE.formdata:
addSystemHeader(headers, CONTENT_TYPE, CONTENT_TYPE_FORMDATA);
break;
case BODY_MODE.graphql:
addSystemHeader(headers, CONTENT_TYPE, CONTENT_TYPE_LANGUAGE.json);
break;
case BODY_MODE.file:
addSystemHeader(headers, CONTENT_TYPE,
mime.lookup(request.body.file && request.body.file.src) || DEFAULT_MIME_TYPE);
break;
default: break;
}
addSystemHeader(headers, 'Content-Length', CALCULATED_AT_RUNTIME);
callback(null, request);
}
/**
* Adds Cookie header for the given request url.
*
* @private
* @param {Request} request
* @param {Object} cookieJar
* @param {Function} callback
*/
function setCookie (request, cookieJar, callback) {
// bail out if not a valid instance of CookieJar
if (!(cookieJar && cookieJar.getCookieString)) {
return callback(null, request);
}
// @note don't pass request.url instance to force re-parsing of the URL
cookieJar.getCookieString(urlEncoder.toNodeUrl(request.url.toString()), function (err, cookies) {
if (err) {
return callback(null, request);
}
if (cookies && cookies.length) {
addSystemHeader(request.headers, COOKIE, cookies);
}
callback(null, request);
});
}
/**
* A helper method to dry run the given request instance.
* It returns the cloned request instance with the system added properties.
*
* @param {Request} request
* @param {Object} options
* @param {Object} options.cookieJar
* @param {Object} options.protocolProfileBehavior
* @param {Function} done
*/
function dryRun (request, options, done) {
if (!done && typeof options === FUNCTION) {
done = options;
options = {};
}
if (!Request.isRequest(request)) {
return done(new Error('Invalid Request instance'));
}
!options && (options = {});
var cookieJar = options.cookieJar,
implicitCacheControl = options.implicitCacheControl,
implicitTraceHeader = options.implicitTraceHeader,
disabledSystemHeaders = _.get(options.protocolProfileBehavior, 'disabledSystemHeaders') || {},
disableCookies = _.get(options.protocolProfileBehavior, 'disableCookies');
async.waterfall([
function setAuthorizationHeaders (next) {
setAuthorization(request, next);
},
function setContentTypeHeader (request, next) {
setContentType(request, next);
},
function setContentLength (request, next) {
var headers = request.headers,
header = headers.one('content-length');
// bail out if header added by body helper
if (header && header.system) {
return next(null, request);
}
switch (String(request.method).toUpperCase()) {
case 'GET':
case 'HEAD':
case 'TRACE':
case 'DELETE':
case 'CONNECT':
case 'OPTIONS':
break;
default:
addSystemHeader(headers, 'Content-Length', '0');
break;
}
next(null, request);
},
function setCookieHeader (request, next) {
if (disableCookies || !cookieJar) {
return next(null, request);
}
setCookie(request, cookieJar, next);
},
function setStaticHeaders (request, next) {
var headers = request.headers;
// remove header added by auth helpers
headers.remove(function (header) {
return header.system && header.key.toLowerCase() === 'host';
});
addSystemHeader(headers, 'User-Agent', 'PostmanRuntime/' + version);
addSystemHeader(headers, 'Accept', '*/*');
addSystemHeader(headers, 'Accept-Encoding', 'gzip, deflate, br');
addSystemHeader(headers, 'Host', CALCULATED_AT_RUNTIME);
addSystemHeader(headers, 'Connection', 'keep-alive');
implicitCacheControl && addSystemHeader(headers, 'Cache-Control', 'no-cache');
implicitTraceHeader && addSystemHeader(headers, 'Postman-Token', CALCULATED_AT_RUNTIME);
next(null, request);
},
function disableSystemHeaders (request, next) {
var headersReference = request.headers.reference,
header;
_.forEach(disabledSystemHeaders, function (disabled, headerKey) {
if (!disabled) { return; }
if ((header = headersReference[headerKey.toLowerCase()])) {
Array.isArray(header) && (header = header[header.length - 1]);
header.system && (header.disabled = true);
}
});
next(null, request);
}
], function (err, request) {
if (err) { return done(err); }
done(null, request);
});
}
module.exports = dryRun;
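The dry-run pipeline above composes its steps with `async.waterfall`; the control flow can be sketched in plain Node without the dependency (a simplified single-value waterfall, not the real `async` API, and the header values here are illustrative):

```javascript
// Minimal waterfall: each step receives the previous step's value and a
// next(err, value) callback; the first error short-circuits to done.
function waterfall (steps, done) {
    var i = 0;
    (function next (err, value) {
        if (err) { return done(err); }
        if (i >= steps.length) { return done(null, value); }
        steps[i++](value, next);
    })(null, undefined);
}

var result;

// thread a request stand-in through header-adding steps, dry-run style
waterfall([
    function start (ignored, next) { next(null, { headers: [] }); },
    function setContentTypeHeader (req, next) {
        req.headers.push({ key: 'Content-Type', value: 'text/plain', system: true });
        next(null, req);
    },
    function setStaticHeaders (req, next) {
        req.headers.push({ key: 'User-Agent', value: 'PostmanRuntime/x.y.z', system: true });
        next(null, req);
    }
], function (err, req) {
    result = err || req.headers.map(function (h) { return h.key; });
});

console.log(result); // all steps are synchronous here, so result is already set
```

The real `async.waterfall` also supports multiple arguments between steps; this sketch threads a single value, which is all the dry-run steps above need.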

node_modules/postman-runtime/lib/requester/index.js generated vendored Normal file

@@ -0,0 +1,4 @@
module.exports = {
Requester: require('./requester').Requester,
RequesterPool: require('./requester-pool').RequesterPool
};


@@ -0,0 +1,94 @@
var _ = require('lodash'),
async = require('async'),
requests = require('postman-request'),
/**
* Sets the proxy and tunnel values on the request options
*
* @param request
* @param options
* @param cb
*/
setProxy = function (request, options, cb) {
var proxyConfig;
if ((proxyConfig = _.get(request, 'proxy'))) {
options.proxy = proxyConfig.getProxyUrl();
// TODO: Use tri-state var for tunnel in SDK and update here
// for now determine the tunnel value from the URL unless explicitly set to true
options.tunnel = proxyConfig.tunnel ? true : request.url.protocol === 'https';
}
// if proxy is not set, postman-request implicitly falls back to the proxy
// environment variables. To opt out of this, set the `ignoreProxyEnvironmentVariables`
// requester option.
// Setting proxy to `false` opts out of the implicit proxy configuration
// from those environment variables.
if (!options.proxy && options.ignoreProxyEnvironmentVariables) {
options.proxy = false;
}
cb(null, request, options);
},
/**
* Gets the certificate attached to the request
* and merges its values into the provided options
*
* @param request
* @param options
* @param cb
*/
setCertificate = function (request, options, cb) {
var certificate = request.certificate,
isSSL = request.url.protocol === 'https';
// exit if the protocol is not https
// or the request has no certificate attached
if (!isSSL || !certificate) {
return cb(null, options);
}
_.assign(options, {
pfx: _.get(certificate, 'pfx.value'),
key: _.get(certificate, 'key.value'),
cert: _.get(certificate, 'cert.value'),
passphrase: certificate.passphrase
});
cb(null, options);
};
// Enable support for extending root CAs.
// Refer: https://github.com/postmanlabs/postman-request/pull/35
// @todo trigger console warning (using callback) if not enabled.
requests.enableNodeExtraCACerts();
module.exports = function (request, options, onStart, callback) {
var req = {};
async.waterfall([
function (next) {
setProxy(request, options, next);
},
function (request, options, next) {
setCertificate(request, options, next);
}
], function (err, options) {
if (err) { return callback(err); }
var request = requests(options, callback);
// todo: this is a hack to ensure that we can abort requests from the app before they're complete.
req.abort = request.abort.bind(request);
// emit responseStart event
request.on('response', onStart);
});
return req;
};
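The proxy handling in `setProxy` can be sketched standalone (plain-object stand-ins for the SDK request and proxy config; the names are illustrative, and `request.proxy.url` stands in for `proxyConfig.getProxyUrl()`):

```javascript
// Standalone sketch of the proxy/tunnel decision and the env-var opt-out.
function applyProxyOptions (request, options) {
    var proxyUrl = request.proxy && request.proxy.url; // stand-in for proxyConfig.getProxyUrl()
    if (proxyUrl) {
        options.proxy = proxyUrl;
        // tunnel (HTTP CONNECT) when explicitly requested, otherwise only for https
        options.tunnel = request.proxy.tunnel ? true : request.url.protocol === 'https';
    }
    // opt out of postman-request's implicit proxy environment variable fallback
    if (!options.proxy && options.ignoreProxyEnvironmentVariables) {
        options.proxy = false;
    }
    return options;
}

var withProxy = applyProxyOptions(
    { proxy: { url: 'http://proxy.local:8080', tunnel: false }, url: { protocol: 'https' } },
    {}
);
var optedOut = applyProxyOptions(
    { url: { protocol: 'http' } },
    { ignoreProxyEnvironmentVariables: true }
);
console.log(withProxy.tunnel, optedOut.proxy);
```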


@@ -0,0 +1,69 @@
var _ = require('lodash'),
Requester = require('./requester').Requester,
RequestCookieJar = require('postman-request').jar,
STRING = 'string',
FUNCTION = 'function',
RequesterPool; // fn
RequesterPool = function (options, callback) {
var self = this,
extendedRootCA,
fileResolver = options && options.fileResolver;
_.assign((self.options = {}), {
timeout: _.min([
_.get(options, 'timeout.request'),
_.get(options, 'timeout.global')
]), // validated later inside requester
timings: _.get(options, 'requester.timings', true),
verbose: _.get(options, 'requester.verbose', false),
keepAlive: _.get(options, 'requester.keepAlive', true),
agents: _.get(options, 'requester.agents'), // http(s).Agent instances
cookieJar: _.get(options, 'requester.cookieJar'), // default set later in this constructor
strictSSL: _.get(options, 'requester.strictSSL'),
maxResponseSize: _.get(options, 'requester.maxResponseSize'),
// @todo drop support in v8
useWhatWGUrlParser: _.get(options, 'requester.useWhatWGUrlParser', false),
followRedirects: _.get(options, 'requester.followRedirects', true),
followOriginalHttpMethod: _.get(options, 'requester.followOriginalHttpMethod'),
maxRedirects: _.get(options, 'requester.maxRedirects'),
implicitCacheControl: _.get(options, 'requester.implicitCacheControl', true),
implicitTraceHeader: _.get(options, 'requester.implicitTraceHeader', true),
systemHeaders: _.get(options, 'requester.systemHeaders', {}),
removeRefererHeaderOnRedirect: _.get(options, 'requester.removeRefererHeaderOnRedirect'),
ignoreProxyEnvironmentVariables: _.get(options, 'ignoreProxyEnvironmentVariables'),
network: _.get(options, 'network', {})
});
// create a cookie jar if one is not provided
if (!self.options.cookieJar) {
self.options.cookieJar = RequestCookieJar();
}
if (fileResolver && typeof fileResolver.readFile === FUNCTION &&
typeof (extendedRootCA = _.get(options, 'requester.extendedRootCA')) === STRING) {
// eslint-disable-next-line security/detect-non-literal-fs-filename
fileResolver.readFile(extendedRootCA, function (err, caCerts) {
if (err) {
// @todo trigger console error
}
else {
// set extendedRootCA option
self.options.extendedRootCA = caCerts;
}
return callback();
});
}
else {
return callback();
}
};
RequesterPool.prototype.create = function (trace, callback) {
return Requester.create(trace, this.options, callback);
};
module.exports.RequesterPool = RequesterPool;
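The effective timeout above is the smaller of the per-request and global timeouts; that selection can be sketched without lodash (`_.min` ignores non-numeric entries, which the filter below reproduces):

```javascript
// Pick the effective timeout: the smaller of the per-request and global
// timeouts, ignoring whichever ones are not set.
function effectiveTimeout (requestTimeout, globalTimeout) {
    var candidates = [requestTimeout, globalTimeout].filter(function (t) {
        return typeof t === 'number' && !isNaN(t);
    });
    return candidates.length ? Math.min.apply(null, candidates) : undefined;
}

console.log(effectiveTimeout(5000, 60000)); // per-request wins
console.log(effectiveTimeout(undefined, 60000)); // global is the only candidate
console.log(effectiveTimeout(undefined, undefined)); // nothing set
```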

node_modules/postman-runtime/lib/requester/requester.js generated vendored Normal file

@@ -0,0 +1,494 @@
var _ = require('lodash'),
core = require('./core'),
Emitter = require('events'),
inherits = require('inherits'),
now = require('performance-now'),
sdk = require('postman-collection'),
requests = require('./request-wrapper'),
dryRun = require('./dry-run'),
RESPONSE_START_EVENT_BASE = 'response.start.',
RESPONSE_END_EVENT_BASE = 'response.end.',
RESPONSE_START = 'responseStart',
RESPONSE_END = 'response',
ERROR_RESTRICTED_ADDRESS = 'NETERR: getaddrinfo ENOTFOUND ',
/**
* Headers which get overwritten by the requester.
*
* @private
* @const
* @type {Object}
*/
OVERWRITTEN_HEADERS = {
cookie: true, // cookies get appended with `;`
'content-length': true
},
/**
* Creates an SDK-compatible cookie from a tough-cookie compatible cookie.
*
* @param cookie
* @returns {Object}
*/
toPostmanCookie = function (cookie) {
var expires = cookie.expiryTime();
cookie.toJSON && (cookie = cookie.toJSON());
return new sdk.Cookie({
name: cookie.key,
value: cookie.value,
expires: Number.isFinite(expires) ? new Date(expires) : null,
maxAge: cookie.maxAge,
domain: cookie.domain,
path: cookie.path,
secure: cookie.secure,
httpOnly: cookie.httpOnly,
hostOnly: cookie.hostOnly,
extensions: cookie.extensions
});
},
/**
* This method is used in conjunction with the `_.transform` method to convert multi-value headers
* into multiple single-value headers.
*
* @param {Array} acc
* @param {Array|String} val
* @param {String} key
* @return {Object}
*/
transformMultiValueHeaders = function (acc, val, key) {
var i, ii;
if (Array.isArray(val)) {
for (i = 0, ii = val.length; i < ii; i++) {
acc.push({
key: key,
value: val[i]
});
}
}
else {
acc.push({
key: key,
value: val
});
}
},
/**
* Calculate the request timings offset by adding the runtime overhead, which
* helps determine the time taken to prepare and process the request.
*
* @param {Number} runtimeTimer - Runtime request start HR time
* @param {Number} requestTimer - Request start HR time
* @param {Object} timings - Request timings offset
* @returns {Object}
*/
calcTimingsOffset = function (runtimeTimer, requestTimer, timings) {
if (!(runtimeTimer && requestTimer && timings)) { return; }
// runtime + postman-request initialization time
var initTime = requestTimer - runtimeTimer,
offset = {
request: initTime
};
// add initialization overhead to request offsets
_.forOwn(timings, function (value, key) {
offset[key] = value + initTime;
});
// total time taken by runtime to get the response
// @note if offset.end is missing, that means request is not complete.
// this is used to calculate timings on responseStart.
if (offset.end) {
offset.done = now() - runtimeTimer;
}
return offset;
},
Requester;
/**
* Creates a new Requester, which is used to make HTTP(s) requests.
*
* @param trace
* @param options
* @param {Boolean} [options.keepAlive=true] Optimizes HTTP connections by keeping them alive, so that new requests
* to the same host are made over the same underlying TCP connection.
* @param {CookieJar} [options.cookieJar] A cookie jar to use with Node requests.
* @param {Boolean} [options.strictSSL]
* @param {Boolean} [options.followRedirects=true] If false, returns a 301/302 as the response code
* instead of following the redirect
* @note `options.keepAlive` is only supported in Node.
* @note `options.cookieJar` is only supported in Node.
*
* @constructor
*/
inherits(Requester = function (trace, options) {
this.options = options || {};
// protect the timeout value from being non-numeric or infinite
if (!_.isFinite(this.options.timeout)) {
this.options.timeout = undefined;
}
this.trace = trace;
Requester.super_.call(this);
}, Emitter);
_.assign(Requester.prototype, /** @lends Requester.prototype */ {
/**
* Perform an HTTP request.
*
* @param {String} id
* @param {Request} request
* @param {Object} protocolProfileBehavior
* @param {Function} callback
*/
request: function (id, request, protocolProfileBehavior, callback) {
var self = this,
hostname,
cookieJar,
requestOptions,
networkOptions = self.options.network || {},
startTime = Date.now(),
startTimer = now(), // high-resolution time
cookies = [],
responseHeaders = [],
responseJSON = {},
// keep track of `responseStart` and `response` triggers
_responseStarted = false,
_responseEnded = false,
_responseData = {},
// Refer: https://github.com/postmanlabs/postman-runtime/blob/v7.14.0/docs/history.md
getExecutionHistory = function (debugInfo) {
var history = {
execution: {
verbose: Boolean(requestOptions.verbose),
sessions: {},
data: []
}
},
executionData = [],
requestSessions = {};
if (!Array.isArray(debugInfo)) {
return history;
}
// prepare history from request debug data
debugInfo.forEach(function (debugData) {
if (!debugData) { return; }
// @todo cache connection sessions and fetch reused session
// from the requester pool.
if (debugData.session && !requestSessions[debugData.session.id]) {
requestSessions[debugData.session.id] = debugData.session.data;
}
executionData.push({
request: debugData.request,
response: debugData.response,
timings: debugData.timings && {
// runtime start time
start: startTime,
// request start time
requestStart: debugData.timingStart,
// offsets calculated are relative to runtime start time
offset: calcTimingsOffset(startTimer, debugData.timingStartTimer, debugData.timings)
},
session: debugData.session && {
id: debugData.session.id,
// is connection socket reused
reused: debugData.session.reused
}
});
});
// update history object
history.execution.data = executionData;
history.execution.sessions = requestSessions;
return history;
},
/**
* Add the missing/system headers in the request object
*
* @param {Object[]} headers
*/
addMissingRequestHeaders = function (headers) {
_.forEach(headers, function (header) {
var lowerCasedKey = header.key.toLowerCase();
// update headers which gets overwritten by the requester
if (OVERWRITTEN_HEADERS[lowerCasedKey]) {
if (Array.isArray(_.get(request.headers, ['reference', lowerCasedKey]))) {
request.headers.remove(header.key);
}
request.headers.upsert({
key: header.key,
value: header.value,
system: true
});
}
});
},
/**
* Helper function to trigger `callback` and complete the request function
*
* @param {Error} error - error while requesting
* @param {Response} response - SDK Response instance
* @param {Object} history - Request-Response History
*/
onEnd = function (error, response, history) {
self.emit(RESPONSE_END_EVENT_BASE + id, error, self.trace.cursor,
self.trace, response, request, cookies, history);
return callback(error, response, request, cookies, history);
},
/**
* Helper function to keep track of `responseStart` and `response`
* triggers to make sure they are emitted in the correct order.
*
* @todo fix requester control flow to remove this hack!
* this is required because CookieJar.getCookies is an async method,
* and by the time it resolves postman-request may have already ended
* the request. This affects the request post-send helpers because the
* `response.start` event is not emitted on time and the shared variables
* `cookies`, `responseJSON`, and `responseHeaders` are initialized in
* the onStart function.
*
* @param {String} trigger - trigger name
* @param {Response} response - SDK Response instance
* @param {Object} history - Request-Response History
*/
onComplete = function (trigger, response, history) {
if (trigger === RESPONSE_START) {
// set flag for responseStart callback
_responseStarted = true;
// if response is ended, end the response using cached data
if (_responseEnded) {
onEnd(null, _responseData.response, _responseData.history);
}
// bail out and wait for response end if not ended already
return;
}
// if response started, don't wait and end the response
if (_responseStarted) {
onEnd(null, response, history);
return;
}
// wait for responseStart and cache response callback data
_responseEnded = true;
_responseData = {
response: response,
history: history
};
},
/**
* Helper function to trigger `responseStart` callback and
* - transform postman-request response instance to SDK Response
* - filter cookies
* - filter response headers
* - add missing request headers
*
* @param {Object} response - Postman-Request response instance
*/
onStart = function (response) {
var responseStartEventName = RESPONSE_START_EVENT_BASE + id,
executionData,
initialRequest,
finalRequest,
sdkResponse,
history,
done = function () {
// emit the response.start event which eventually
// triggers responseStart callback
self.emit(responseStartEventName, null, sdkResponse, request, cookies, history);
// trigger completion of responseStart
onComplete(RESPONSE_START);
};
// @todo get rid of jsonifyResponse
responseJSON = core.jsonifyResponse(response, requestOptions);
// transform response headers to SDK compatible HeaderList
responseHeaders = _.transform(responseJSON.headers, transformMultiValueHeaders, []);
// initialize SDK Response instance
sdkResponse = new sdk.Response({
status: response && response.statusMessage,
code: responseJSON.statusCode,
header: responseHeaders
});
// prepare history from request debug data
history = getExecutionHistory(_.get(response, 'request._debug'));
// get the initial and final (on redirect) request from history
executionData = _.get(history, 'execution.data') || [];
initialRequest = _.get(executionData, '[0].request') || {};
finalRequest = executionData.length > 1 ?
// get final redirect
_.get(executionData, [executionData.length - 1, 'request']) :
// no redirects
initialRequest;
// add missing request headers so that they get bubbled up into the UI
addMissingRequestHeaders(initialRequest.headers);
// pull out cookies from the cookie jar, and make them chrome compatible.
if (cookieJar && _.isFunction(cookieJar.getCookies)) {
// get cookies set for the final request URL
cookieJar.getCookies(finalRequest.href, function (err, cookiesFromJar) {
if (err) {
return done();
}
cookies = _.transform(cookiesFromJar, function (acc, cookie) {
acc.push(toPostmanCookie(cookie));
}, []);
cookies = new sdk.CookieList(null, cookies);
done();
});
}
else {
cookies = new sdk.CookieList(null, []);
done();
}
};
// at this point the request could have come from collection, auth or sandbox
// we can't trust the integrity of this request
// bail out if request url is empty
if (!(request && request.url && request.url.toString && request.url.toString())) {
return onEnd(new Error('runtime:extensions~request: request url is empty'));
}
cookieJar = self.options.cookieJar;
requestOptions = core.getRequestOptions(request, self.options, protocolProfileBehavior);
// update url with the final encoded url
// @note this mutates the request object which will be passed in request
// and response callbacks
request.url.update(requestOptions.url.href);
hostname = request.url.getHost();
// check if host is on the `restrictedAddresses`
if (networkOptions.restrictedAddresses && core.isAddressRestricted(hostname, networkOptions)) {
return onEnd(new Error(ERROR_RESTRICTED_ADDRESS + hostname));
}
return requests(request, requestOptions, onStart, function (err, res, resBody, debug) {
// prepare history from request debug data
var history = getExecutionHistory(debug),
responseTime,
response;
if (err) {
// bubble up http errors
// @todo - Should we send an empty sdk Response here?
//
// Sending `history` object even in case of error
return onEnd(err, undefined, history);
}
// Calculate the time taken for us to get the response.
responseTime = Date.now() - startTime;
if (res && res.timings) {
// update response time to actual response end time
// of the final request in the redirect chain.
responseTime = Math.ceil(res.timings.end);
}
if (resBody && resBody instanceof ArrayBuffer) {
resBody = Buffer.from(resBody);
}
// Response in the SDK format
// @todo reuse same response instance used for responseStart callback
response = new sdk.Response({
code: responseJSON.statusCode,
status: res && res.statusMessage,
header: responseHeaders,
stream: resBody,
responseTime: responseTime
});
onComplete(RESPONSE_END, response, history);
});
},
/**
* Removes all current event listeners on the requester, and makes it ready for garbage collection :).
*
* @param {Function=} cb - Optional callback to be called on disposal
*
* @todo - In the future, when the requester manages its own connections etc, close them all here.
*/
dispose: function (cb) {
// This is safe for us, because we do not wait on events. (i.e., no part of Runtime ever waits on
// any event to occur). We rely on callbacks for that, only choosing to use events as a way of streaming
// information outside runtime.
this.removeAllListeners();
_.isFunction(cb) && cb();
}
});
_.assign(Requester, /** @lends Requester */ {
/**
* Asynchronously create a new requester.
*
* @param trace
* @param trace.type - type of requester to return (for now, just http)
* @param trace.source - information about who needs this requester, e.g Auth, etc.
* @param trace.cursor - the cursor
* @param options
* @param callback
* @returns {*}
*/
create: function (trace, options, callback) {
return callback(null, new Requester(trace, options));
},
/**
* A helper method to dry run the given request instance.
* It returns the cloned request instance with the system-added properties.
*
* @param {Request} request
* @param {Object} options
* @param {Object} options.cookieJar
* @param {Object} options.protocolProfileBehavior
* @param {Object} options.implicitCacheControl
* @param {Object} options.implicitTraceHeader
* @param {Function} done
*/
dryRun
});
module.exports.Requester = Requester;
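`transformMultiValueHeaders` above is written for use with `_.transform`; the same flattening can be sketched without lodash:

```javascript
// Flatten a header map into single-value {key, value} pairs, expanding
// array-valued headers (e.g. multiple Set-Cookie lines) into one entry each.
function flattenHeaders (headers) {
    var acc = [];
    Object.keys(headers).forEach(function (key) {
        var val = headers[key];
        (Array.isArray(val) ? val : [val]).forEach(function (v) {
            acc.push({ key: key, value: v });
        });
    });
    return acc;
}

var flat = flattenHeaders({ 'set-cookie': ['a=1', 'b=2'], host: 'example.test' });
console.log(flat);
```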


@@ -0,0 +1,65 @@
var _ = require('lodash'),
sdk = require('postman-collection'),
/**
* @const
* @type {string}
*/
FUNCTION = 'function',
SAFE_CONTEXT_PROPERTIES = ['replayState', 'coords'];
/**
* Creates a context object to be used with `http-request.command` extension.
*
* @function createItemContext
*
* @param {Object} payload
* @param {Item} payload.item
* @param {Object} [payload.coords]
* @param {Object} [defaults]
* @param {Object} [defaults.replayState]
* @param {Object} [defaults.coords]
*
* @returns {ItemContext}
*/
module.exports = function (payload, defaults) {
// extract properties from defaults that can/should be reused in new context
var context = defaults ? _.pick(defaults, SAFE_CONTEXT_PROPERTIES) : {};
// set cursor to context
!context.coords && (context.coords = payload.coords);
// save original item for reference
context.originalItem = payload.item;
// we clone item from the payload, so that we can make any changes we need there, without mutating the
// collection
context.item = new sdk.Item(payload.item.toJSON());
// get a reference to the Auth instance from the item, so changes are synced back
context.auth = context.originalItem.getAuth();
// Make sure run is not errored out if older version of collection SDK is used.
// @todo remove this safety check in the next release
if (typeof context.originalItem.getProtocolProfileBehaviorResolved === FUNCTION) {
// get protocolProfileBehavior for the item, also inherited from parent
context.protocolProfileBehavior = context.originalItem.getProtocolProfileBehaviorResolved();
}
else {
// get protocolProfileBehavior for the item
context.protocolProfileBehavior = context.originalItem.protocolProfileBehavior;
}
/**
* @typedef {Object} ItemContext
* @property {Object} coords - current cursor
* @property {Item} originalItem - reference to the item in the collection
* @property {Item} item - Holds a copy of the item given in the payload, so that it can be manipulated
* as necessary
* @property {RequestAuthBase|undefined} auth - If present, is the instance of Auth in the collection, which
* is changed as necessary using intermediate requests, etc.
* @property {ReplayState} replayState - has context on number of replays(if any) for this request
*/
return context;
};

node_modules/postman-runtime/lib/runner/cursor.js generated vendored Normal file

@@ -0,0 +1,376 @@
var _ = require('lodash'),
uuid = require('uuid'),
Cursor;
/**
* @param {Number} [length=0]
* @param {Number} [cycles=1]
* @param {Number} [position=0]
* @param {Number} [iteration=0]
* @param {String} [ref]
* @constructor
*/
Cursor = function RunCursor (length, cycles, position, iteration, ref) { // eslint-disable-line func-name-matching
this.length = Cursor.validate(length, 0);
this.position = Cursor.validate(position, 0, this.length);
this.cycles = Cursor.validate(cycles, 1, 1);
this.iteration = Cursor.validate(iteration, 0, this.cycles);
this.ref = ref || uuid.v4();
};
_.assign(Cursor.prototype, {
/**
* Load the cursor state from a plain state object or another Cursor instance.
* @param {Object} state
* @param {Number} [state.length=0]
* @param {Number} [state.cycles=1]
* @param {Number} [state.position=0]
* @param {Number} [state.iteration=0]
* @param {String} [state.ref]
* @param {Function} [callback] - receives `(err:Error, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
load: function (state, callback, scope) {
!state && (state = {});
(state instanceof Cursor) && (state = state.current());
this.reset(state.length, state.cycles, state.position, state.iteration, state.ref, callback, scope);
},
/**
* Reset the cursor to a new state
*
* @param {Number} [length=0]
* @param {Number} [cycles=1]
* @param {Number} [position=0]
* @param {Number} [iteration=0]
* @param {String} [ref]
* @param {Function} [callback] - receives `(err:Error, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
reset: function (length, cycles, position, iteration, ref, callback, scope) {
var coords = _.isFunction(callback) && this.current();
// validate parameter defaults
_.isNil(length) && (length = this.length);
_.isNil(cycles) && (cycles = this.cycles);
_.isNil(position) && (position = this.position);
_.isNil(iteration) && (iteration = this.iteration);
_.isNil(ref) && (ref = this.ref);
// use the constructor to set the values
Cursor.call(this, length, cycles, position, iteration, ref);
// send before and after values to the callback
return coords && callback.call(scope || this, null, this.current(), coords);
},
/**
* Update length and cycle bounds
*
* @param {Number} [length=0]
* @param {Number} [cycles=1]
* @param {Function} [callback] - receives `(err:Error, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
bounds: function (length, cycles, callback, scope) {
var coords = _.isFunction(callback) && this.current();
// validate parameter defaults
_.isNil(length) && (length = this.length);
_.isNil(cycles) && (cycles = this.cycles);
// use the constructor to set the values
Cursor.call(this, length, cycles, this.position, this.iteration);
return coords && callback.call(scope || this, null, this.current(), coords);
},
/**
* Set everything to minimum dimension
*
* @param {Function} [callback] - receives `(err:Error, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
zero: function (callback, scope) {
var coords = _.isFunction(callback) && this.current();
this.position = 0;
this.iteration = 0;
// send before and after values to the callback
return coords && callback.call(scope || this, null, this.current(), coords);
},
/**
* Reset all dimensions, including the length and cycle bounds
*
* @param {Function} [callback] - receives `(err:Error, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
clear: function (callback, scope) {
var coords = _.isFunction(callback) && this.current();
this.position = 0;
this.iteration = 0;
this.cycles = 1;
this.length = 0;
return coords && callback.call(scope || this, null, this.current(), coords);
},
/**
* Seek to a specified Cursor
*
* @param {Number} [position]
* @param {Number} [iteration]
* @param {Function} [callback] - receives `(err:Error, changed:Boolean, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
seek: function (position, iteration, callback, scope) {
var coords = _.isFunction(callback) && this.current();
// if null or undefined implies use existing seek position
_.isNil(position) && (position = this.position);
_.isNil(iteration) && (iteration = this.iteration);
// make the pointers stay within boundary
if ((position >= this.length) || (iteration >= this.cycles) || (position < 0) || (iteration < 0) ||
isNaN(position) || isNaN(iteration)) {
return coords &&
callback.call(scope || this, new Error('runcursor: seeking out of bounds: ' + [position, iteration]));
}
// floor the numbers
position = ~~position;
iteration = ~~iteration;
// set the new positions
this.position = Cursor.validate(position, 0, this.length);
this.iteration = Cursor.validate(iteration, 0, this.cycles);
// finally execute the callback with the seek position
return coords && callback.call(scope || this, null, this.hasChanged(coords), this.current(), coords);
},
/**
* Seek one forward
*
* @param {Function} [callback] - receives `(err:Error, changed:Boolean, coords:Object, previous:Object)`
* @param {Object} [scope]
*/
next: function (callback, scope) {
var position = this.position,
iteration = this.iteration,
coords;
// increment position
position += 1;
// check if we need to increment cycle
if (position >= this.length) {
// set position to 0 and increment iteration
position = 0;
iteration += 1;
if (iteration >= this.cycles) {
coords = this.current();
coords.eof = true;
// guard: only invoke the callback when one was provided
return _.isFunction(callback) && callback.call(scope || this, null, false, coords, coords);
}
}
// finally handover the new coordinates to seek function
return this.seek(position, iteration, callback, scope);
},
/**
* Tentative Cursor status, if we do `.next()`
*
* @param {Object} coords
*
* @returns {Object}
*/
whatnext: function (coords) {
var base = {
ref: this.ref,
length: this.length,
cycles: this.cycles
},
position,
iteration;
if (!_.isObject(coords)) {
return _.assign(base, {eof: true, bof: true, empty: this.empty()});
}
if (!this.length) {
return _.assign(base, {eof: true, bof: true, empty: true});
}
position = coords.position;
iteration = coords.iteration;
// increment position
position += 1;
// check if we need to increment cycle
if (position >= this.length) {
// set position to 0 and increment iteration
position = 0;
iteration += 1;
if (iteration >= this.cycles) {
return _.assign(base, {
position: this.length - 1,
iteration: iteration - 1,
eof: true
});
}
return _.assign(base, {
position: position,
iteration: iteration,
cr: true
});
}
return _.assign(base, {position: position, iteration: iteration});
},
/**
* Check whether the current position and iteration differ from the specified coordinates
*
* @param {Object} coords
* @returns {Boolean}
*/
hasChanged: function (coords) {
return _.isObject(coords) && !((this.position === coords.position) && (this.iteration === coords.iteration));
},
/**
* Current Cursor state
*
* @returns {Object}
*/
current: function () {
return {
position: this.position,
iteration: this.iteration,
length: this.length,
cycles: this.cycles,
empty: this.empty(),
eof: this.eof(),
bof: this.bof(),
cr: this.cr(),
ref: this.ref
};
},
/**
* Is the current position going to trigger a new iteration on `.next`?
*
* @returns {Boolean}
*/
cr: function () {
return !this.length || (this.position >= this.length);
},
/**
* @returns {Boolean}
*/
eof: function () {
return !this.length || (this.position >= this.length) && (this.iteration >= this.cycles);
},
/**
* @returns {Boolean}
*/
bof: function () {
return !this.length || ((this.position === 0) && (this.iteration === 0));
},
/**
* @returns {Boolean}
*/
empty: function () {
return !this.length;
},
/**
* @returns {Object}
*/
valueOf: function () {
return this.current();
},
clone: function () {
return new Cursor(this.length, this.cycles, this.position, this.iteration);
}
});
_.assign(Cursor, {
/**
* @param {Number} [length=0]
* @param {Number} [cycles=1]
* @param {Number} [position=0]
* @param {Number} [iteration=0]
* @param {String} [ref]
*
* @returns {Number}
*/
create: function (length, cycles, position, iteration, ref) {
return new Cursor(length, cycles, position, iteration, ref);
},
/**
* @param {Object|Cursor} obj
* @param {Object} [bounds]
* @param {Number} [bounds.length]
* @param {Number} [bounds.cycles]
*
* @returns {Cursor}
*/
box: function (obj, bounds) {
// already a Cursor, do nothing
if (obj instanceof Cursor) {
bounds && obj.bounds(bounds.length, bounds.cycles);
return obj;
}
// nothing to box, create a blank Cursor
if (!_.isObject(obj)) { return new Cursor(bounds && bounds.length, bounds && bounds.cycles); }
// load Cursor values from object
return new Cursor((bounds || obj).length, (bounds || obj).cycles, obj.position, obj.iteration, obj.ref);
},
/**
* @private
*
* @param {Number} num
* @param {Number} min - lower bound, returned when `num` is not a valid number or is below it
* @param {Number} [max]
*
* @returns {Number}
*/
validate: function (num, min, max) {
if (typeof num !== 'number' || num < min) {
return min;
}
if (num === Infinity) {
return _.isNil(max) ? min : max;
}
return num;
}
});
module.exports = Cursor;
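The boundary predicates above (`cr`, `eof`, `empty`) can be illustrated with a minimal standalone sketch. This is not the library's API, just the same predicate logic lifted out for clarity; `makeCursor` is a hypothetical helper:

```javascript
// Minimal sketch of the cursor boundary predicates above; `length` is the
// number of items per iteration and `cycles` is the number of iterations.
function makeCursor(length, cycles) {
    return {
        position: 0,
        iteration: 0,
        // about to roll over into a new iteration on `.next`
        cr: function () { return !length || (this.position >= length); },
        // completely exhausted: past the last item of the last iteration
        eof: function () { return !length || ((this.position >= length) && (this.iteration >= cycles)); },
        empty: function () { return !length; }
    };
}

var cursor = makeCursor(2, 1);
console.log(cursor.cr(), cursor.eof()); // false false — still inside the run
cursor.position = 2; // stepped past the last item
console.log(cursor.cr(), cursor.eof()); // true false — `eof` also needs iteration >= cycles
```

Note that `eof` requires both conditions, so a cursor at the end of a non-final iteration reports `cr` without `eof`.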


@@ -0,0 +1,105 @@
var _ = require('lodash'),
util = require('../util'),
backpack = require('../../backpack');
module.exports = {
/**
* All the events that this extension triggers
* @type {Array}
*/
triggers: ['pause', 'resume', 'abort'],
prototype: /** @lends Run.prototype */ {
/**
* Pause a run
*
* @param {Function} callback
*/
pause: function (callback) {
callback = backpack.ensure(callback, this);
if (this.paused) { return callback && callback(new Error('run: already paused')); }
// schedule the pause command as an interrupt and flag that the run is pausing
this.paused = true;
this.interrupt('pause', null, callback);
},
/**
* Resume a paused run
*
* @param {Function} callback
*/
resume: function (callback) {
callback = backpack.ensure(callback, this);
if (!this.paused) { return callback && callback(new Error('run: not paused')); }
// set flag that it is no longer paused and fire the stored callback for the command when it was paused
this.paused = false;
setTimeout(function () {
this.__resume();
delete this.__resume;
this.triggers.resume(null, this.state.cursor.current());
}.bind(this), 0);
callback && callback();
},
/**
* Aborts a run
*
* @param {Boolean} [summarise=true]
* @param {Function} callback
*/
abort: function (summarise, callback) {
if (_.isFunction(summarise) && !callback) {
callback = summarise;
summarise = true;
}
this.interrupt('abort', {
summarise: summarise
}, callback);
_.isFunction(this.__resume) && this.resume();
}
},
process: /** @lends Run.commands */ {
pause: function (userback, payload, next) {
// trigger the secondary callbacks
this.triggers.pause(null, this.state.cursor.current());
// tuck away the command completion callback in the run object so that it can be used during resume
this.__resume = next;
// execute the userback sent as part of the command and do so in a try block to ensure it does not hamper
// the process tick
var error = util.safeCall(userback, this);
// if there is an error executing the userback, then and only then raise the error (which stops the run)
if (error) {
return next(error);
}
},
/**
* @param {Function} userback
* @param {Object} payload
* @param {Boolean} payload.summarise
* @param {Function} next
*/
abort: function (userback, payload, next) {
// clear instruction pool and as such there will be nothing next to execute
this.pool.clear();
this.triggers.abort(null, this.state.cursor.current());
// execute the userback sent as part of the command and do so in a try block to ensure it does not hamper
// the process tick
backpack.ensure(userback, this) && userback();
next(null);
}
}
};
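The argument juggling at the top of `abort` above, where a lone function argument is treated as the callback and `summarise` defaults to `true`, is a common Node idiom for optional leading parameters. A standalone sketch, using a hypothetical helper name:

```javascript
// Hypothetical helper mirroring the argument normalisation in `abort` above:
// abort(cb) is treated as abort(true, cb); abort(false, cb) is left as-is.
function normalizeAbortArgs(summarise, callback) {
    if (typeof summarise === 'function' && !callback) {
        callback = summarise;
        summarise = true;
    }
    return { summarise: summarise, callback: callback };
}

console.log(normalizeAbortArgs(function () {}).summarise); // true
console.log(normalizeAbortArgs(false, function () {}).summarise); // false
```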


@@ -0,0 +1,62 @@
var _ = require('lodash');
module.exports = {
init: function (done) {
done();
},
triggers: ['waitStateChange'],
prototype: {
/**
* @param {Function} fn - function to execute
* @param {Object} options
* @param {String} options.source
* @param {Number} options.time
* @param {Object} options.cursor
* @param {Function} next
* @private
*/
queueDelay: function (fn, options, next) {
var time = _.isFinite(options.time) ? parseInt(options.time, 10) : 0;
// if the time is a valid and finite time, we queue the delay command
if (time > 0) {
this.queue('delay', {
cursor: options.cursor,
source: options.source,
time: time
}).done(fn);
}
// otherwise, we do not delay and simply execute the function that was supposed to be called post delay
else {
fn();
}
next();
}
},
process: {
/**
* @param {Object} payload
* @param {Number} payload.time
* @param {Object} payload.cursor
* @param {String} payload.source
* @param {Function} next
*/
delay: function (payload, next) {
var cursor = payload.cursor || this.state.cursor.current();
this.waiting = true; // set flag
// trigger the waiting state change event
this.triggers.waitStateChange(null, cursor, true, payload.time, payload.source);
setTimeout((function () {
this.waiting = false; // unset flag
this.triggers.waitStateChange(null, cursor, false, payload.time, payload.source);
next();
}).bind(this), payload.time || 0);
}
}
};
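The time validation in `queueDelay` above reduces to one branch: only a finite, positive `time` queues a delay, while anything else (missing, `Infinity`, negative) runs the function immediately. A hypothetical standalone version, where `schedule` stands in for `this.queue('delay', …).done(fn)`:

```javascript
// Hypothetical standalone version of the branch in `queueDelay` above.
function queueDelay(fn, time, schedule) {
    var ms = (typeof time === 'number' && isFinite(time)) ? parseInt(time, 10) : 0;
    if (ms > 0) {
        schedule(ms, fn); // a real delay gets queued
    }
    else {
        fn(); // invalid or zero delay: run immediately
    }
}

var calls = [];
queueDelay(function () { calls.push('immediate'); }, Infinity, function () {});
queueDelay(function () {}, 250, function (ms) { calls.push('queued:' + ms); });
console.log(calls); // [ 'immediate', 'queued:250' ]
```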


@@ -0,0 +1,530 @@
var _ = require('lodash'),
uuid = require('uuid'),
async = require('async'),
util = require('../util'),
sdk = require('postman-collection'),
sandbox = require('postman-sandbox'),
serialisedError = require('serialised-error'),
ToughCookie = require('tough-cookie').Cookie,
createItemContext = require('../create-item-context'),
ASSERTION_FAILURE = 'AssertionFailure',
SAFE_CONTEXT_VARIABLES = ['_variables', 'environment', 'globals', 'collectionVariables', 'cookies', 'data',
'request', 'response'],
EXECUTION_REQUEST_EVENT_BASE = 'execution.request.',
EXECUTION_RESPONSE_EVENT_BASE = 'execution.response.',
EXECUTION_ASSERTION_EVENT_BASE = 'execution.assertion.',
EXECUTION_ERROR_EVENT_BASE = 'execution.error.',
EXECUTION_COOKIES_EVENT_BASE = 'execution.cookies.',
COOKIES_EVENT_STORE_ACTION = 'store',
COOKIE_STORE_PUT_METHOD = 'putCookie',
COOKIE_STORE_UPDATE_METHOD = 'updateCookie',
FILE = 'file',
REQUEST_BODY_MODE_FILE = 'file',
REQUEST_BODY_MODE_FORMDATA = 'formdata',
getCookieDomain, // fn
postProcessContext, // fn
sanitizeFiles; // fn
postProcessContext = function (execution, failures) { // function determines whether the event needs to abort
var error;
if (failures && failures.length) {
error = new Error(failures.join(', '));
error.name = ASSERTION_FAILURE;
}
return error ? serialisedError(error, true) : undefined;
};
/**
* Removes files in Request body if any.
*
* @private
*
* @param {Request~definition} request Request JSON representation to be sanitized
* @param {Function} callback function invoked with error, request and sanitisedFiles.
* sanitisedFiles is the list of files removed from request.
*
* @note this function mutates the request
* @todo remove files path from request.certificate
*/
sanitizeFiles = function (request, callback) {
if (!request) {
return callback(new Error('Could not complete pm.sendRequest. Request is empty.'));
}
var sanitisedFiles = [];
// do nothing if request body is empty
if (!request.body) {
// send request as such
return callback(null, request, sanitisedFiles);
}
// in case of request body mode is file, we strip it out
if (request.body.mode === REQUEST_BODY_MODE_FILE) {
sanitisedFiles.push(_.get(request, 'body.file.src'));
request.body = null; // mutate the request for body
}
// if body is form-data then we deep dive into the data items and remove the entries that have file data
else if (request.body.mode === REQUEST_BODY_MODE_FORMDATA) {
// eslint-disable-next-line lodash/prefer-immutable-method
_.remove(request.body.formdata, function (param) {
// blank param and non-file param is removed
if (!param || param.type !== FILE) { return false; }
// at this point the param needs to be removed
sanitisedFiles.push(param.src);
return true;
});
}
return callback(null, request, sanitisedFiles);
};
/**
* Fetch domain name from CookieStore event arguments.
*
* @private
* @param {String} fnName - CookieStore method name
* @param {Array} args - CookieStore method arguments
* @returns {String|Undefined} - Domain name
*/
getCookieDomain = function (fnName, args) {
if (!(fnName && args)) {
return;
}
var domain;
switch (fnName) {
case 'findCookie':
case 'findCookies':
case 'removeCookie':
case 'removeCookies':
domain = args[0];
break;
case 'putCookie':
case 'updateCookie':
domain = args[0] && args[0].domain;
break;
default:
return;
}
return domain;
};
/**
* Script execution extension of the runner.
* This module exposes processors for executing scripts before and after requests. Essentially, the processors are
* themselves not aware of other processors and simply allow running a script and then queueing a processor as
* defined in the payload.
*
* Adds options
* - stopOnScriptError:Boolean [false]
* - host:Object [undefined]
*/
module.exports = {
init: function (done) {
var run = this;
// if this run object already has a host, we do not need to create one.
if (run.host) {
return done();
}
// @todo - remove this when chrome app and electron host creation is offloaded to runner
// @todo - can this be removed now in runtime v4?
if (run.options.host && run.options.host.external === true) {
run.host = run.options.host.instance;
return done();
}
sandbox.createContext(_.merge({
timeout: _(run.options.timeout).pick(['script', 'global']).values().min()
// debug: true
}, run.options.host), function (err, context) {
if (err) { return done(err); }
// store the host in run object for future use and move on
run.host = context;
context.on('console', function () {
run.triggers.console.apply(run.triggers, arguments);
});
context.on('error', function () {
run.triggers.error.apply(run.triggers, arguments);
});
context.on('execution.error', function () {
run.triggers.exception.apply(run.triggers, arguments);
});
context.on('execution.assertion', function () {
run.triggers.assertion.apply(run.triggers, arguments);
});
done();
});
},
/**
* This lists the names of the events that the script processors are likely to trigger
*
* @type {Array}
*/
triggers: ['beforeScript', 'script', 'assertion', 'exception', 'console'],
process: {
/**
* This processor's job is to do the following:
* - trigger event by its name
* - execute all scripts that the event listens to and return execution results
*
* @param {Object} payload
* @param {String} payload.name
* @param {Item} payload.item
* @param {Object} [payload.context]
* @param {Cursor} [payload.coords]
* @param {Number} [payload.scriptTimeout] - The millisecond timeout for the current running script.
* @param {Array.<String>} [payload.trackContext]
* @param {Boolean} [payload.stopOnScriptError] - if set to true, then a synchronous error encountered during
* execution of a script will stop executing any further scripts
* @param {Boolean} [payload.abortOnFailure]
* @param {Boolean} [payload.stopOnFailure]
* @param {Function} next
*
* @note - in order to raise the trigger for the entire event, ensure your extension has registered the triggers
*/
event: function (payload, next) {
var item = payload.item,
eventName = payload.name,
cursor = payload.coords,
// the payload can have a list of variables to track from the context post execution, ensure that
// those are accurately set
track = _.isArray(payload.trackContext) && _.isObject(payload.context) &&
// ensure that only those variables that are defined in the context are synced
payload.trackContext.filter(function (variable) {
return _.isObject(payload.context[variable]);
}),
stopOnScriptError = (_.has(payload, 'stopOnScriptError') ? payload.stopOnScriptError :
this.options.stopOnScriptError),
abortOnError = (_.has(payload, 'abortOnError') ? payload.abortOnError : this.options.abortOnError),
// @todo: find a better home for this option processing
abortOnFailure = payload.abortOnFailure,
stopOnFailure = payload.stopOnFailure,
events;
// @todo: find a better place to code this so that event is not aware of such options
if (abortOnFailure) {
abortOnError = true;
}
// validate the payload
if (!eventName) {
return next(new Error('runner.extension~events: event payload is missing the event name.'));
}
if (!item) {
return next(new Error('runner.extension~events: event payload is missing the triggered item.'));
}
// get the list of events to be executed
// includes events in parent as well
events = item.events.listeners(eventName, {excludeDisabled: true});
// call the "before" event trigger by its event name.
// at this point, whoever queued this event must ensure that the trigger for it is defined in its
// 'triggers' interface
this.triggers[_.camelCase('before-' + eventName)](null, cursor, events, item);
// with all the event listeners in place, we now iterate on them and execute its scripts. post execution,
// we accumulate the results in order to be passed on to the event callback trigger.
async.mapSeries(events, function (event, next) {
// in case the event has no script we bail out early
if (!event.script) {
return next(null, {event: event});
}
// get access to the script from the event.
var script = event.script,
executionId = uuid(),
assertionFailed = [],
asyncScriptError,
// create copy of cursor so we don't leak script ids outside `event.command`
// and across scripts
scriptCursor = _.clone(cursor);
// store the execution id in script
script._lastExecutionId = executionId; // please don't use it anywhere else!
// if we can find an id on script or event we add them to the cursor
// so logs and errors can be traced back to the script they came from
event.id && (scriptCursor.eventId = event.id);
event.script.id && (scriptCursor.scriptId = event.script.id);
// trigger the "beforeScript" callback
this.triggers.beforeScript(null, scriptCursor, script, event, item);
// add an event listener to trap all assertion events, but only if needed, to avoid needlessly
// accumulating stuff in memory.
(abortOnFailure || stopOnFailure) &&
this.host.on(EXECUTION_ASSERTION_EVENT_BASE + executionId, function (scriptCursor, assertions) {
_.forEach(assertions, function (assertion) {
assertion && !assertion.passed && assertionFailed.push(assertion.name);
});
});
// Store the error event, but only if needed, because the callback of host.execute()
// does not surface execution errors for async scripts
(abortOnError || stopOnScriptError) &&
// only store first async error in case of multiple errors
this.host.once(EXECUTION_ERROR_EVENT_BASE + executionId, function (scriptCursor, error) {
if (error && !(error instanceof Error)) {
error = new Error(error.message || error);
}
asyncScriptError = error;
// @todo: Figure out a way to abort the script execution here as soon as we get an error.
// We can send `execution.abort.` event to sandbox for this, but currently it silently
// terminates the script execution without triggering the callback.
});
this.host.on(EXECUTION_COOKIES_EVENT_BASE + executionId,
function (eventId, action, fnName, args) {
// only the 'store' action is supported; might need to support
// more cookie actions in next 2 years ¯\_(ツ)_/¯
if (action !== COOKIES_EVENT_STORE_ACTION) { return; }
var self = this,
dispatchEvent = EXECUTION_COOKIES_EVENT_BASE + executionId,
cookieJar = _.get(self, 'requester.options.cookieJar'),
cookieStore = cookieJar && cookieJar.store,
cookieDomain;
if (!cookieStore) {
return self.host.dispatch(dispatchEvent, eventId, 'CookieStore: no store found');
}
if (typeof cookieStore[fnName] !== 'function') {
return self.host.dispatch(dispatchEvent, eventId,
`CookieStore: invalid method name '${fnName}'`);
}
!Array.isArray(args) && (args = []);
// set expected args length to make sure callback is always called
args.length = cookieStore[fnName].length - 1;
// there's no way cookie store can identify the difference
// between regular and programmatic access. So, for now
// we check for programmatic access using the cookieJar
// helper method and emit the default empty value for that
// method.
// @note we don't emit an access denied error here because
// that might block users' use-cases when accessing
// cookies for a sub-domain.
cookieDomain = getCookieDomain(fnName, args);
if (cookieJar && typeof cookieJar.allowProgrammaticAccess === 'function' &&
!cookieJar.allowProgrammaticAccess(cookieDomain)) {
return self.host.dispatch(dispatchEvent, eventId,
`CookieStore: programmatic access to "${cookieDomain}" is denied`);
}
// serialize cookie object
if (fnName === COOKIE_STORE_PUT_METHOD && args[0]) {
args[0] = ToughCookie.fromJSON(args[0]);
}
if (fnName === COOKIE_STORE_UPDATE_METHOD && args[0] && args[1]) {
args[0] = ToughCookie.fromJSON(args[0]);
args[1] = ToughCookie.fromJSON(args[1]);
}
// add store method's callback argument
args.push(function (err, res) {
// serialize error message
if (err && err instanceof Error) {
err = err.message || String(err);
}
self.host.dispatch(dispatchEvent, eventId, err, res);
});
try {
cookieStore[fnName].apply(cookieStore, args);
}
catch (error) {
self.host.dispatch(dispatchEvent, eventId,
`runtime~CookieStore: error executing "${fnName}"`);
}
}.bind(this));
this.host.on(EXECUTION_REQUEST_EVENT_BASE + executionId,
function (scriptCursor, id, requestId, request) {
// remove files in request body if any
sanitizeFiles(request, function (err, request, sanitisedFiles) {
if (err) {
return this.host.dispatch(EXECUTION_RESPONSE_EVENT_BASE + id, requestId, err);
}
var nextPayload;
// if request is sanitized send a warning
if (!_.isEmpty(sanitisedFiles)) {
this.triggers.console(scriptCursor, 'warn',
'uploading files from scripts is not allowed');
}
nextPayload = {
item: new sdk.Item({request: request}),
coords: scriptCursor,
// @todo - get script type from the sandbox
source: 'script',
// abortOnError makes sure request command bubbles errors
// so we can pass it on to the callback
abortOnError: true
};
// create context for executing this request
nextPayload.context = createItemContext(nextPayload);
this.immediate('httprequest', nextPayload).done(function (result) {
this.host.dispatch(
EXECUTION_RESPONSE_EVENT_BASE + id,
requestId,
null,
result && result.response,
// @todo get cookies from result.history or pass PostmanHistory
// instance once it is fully supported
result && {cookies: result.cookies}
);
}).catch(function (err) {
this.host.dispatch(EXECUTION_RESPONSE_EVENT_BASE + id, requestId, err);
});
}.bind(this));
}.bind(this));
// finally execute the script
this.host.execute(event, {
id: executionId,
// debug: true,
timeout: payload.scriptTimeout, // @todo: Expose this as a property in Collection SDK's Script
cursor: scriptCursor,
context: _.pick(payload.context, SAFE_CONTEXT_VARIABLES),
serializeLogs: _.get(this, 'options.script.serializeLogs'),
// legacy options
legacy: {
_itemId: item.id,
_itemName: item.name
}
}, function (err, result) {
this.host.removeAllListeners(EXECUTION_REQUEST_EVENT_BASE + executionId);
this.host.removeAllListeners(EXECUTION_ASSERTION_EVENT_BASE + executionId);
this.host.removeAllListeners(EXECUTION_RESPONSE_EVENT_BASE + executionId);
this.host.removeAllListeners(EXECUTION_COOKIES_EVENT_BASE + executionId);
this.host.removeAllListeners(EXECUTION_ERROR_EVENT_BASE + executionId);
// Handle async errors as well.
// If there was an error running the script itself, that takes precedence
if (!err && asyncScriptError) {
err = asyncScriptError;
}
// electron IPC does not bubble errors to the browser process, so we serialize it here.
err && (err = serialisedError(err, true));
// if it is defined that certain variables are to be synced back to result, we do the same
track && result && track.forEach(function (variable) {
if (!(_.isObject(result[variable]) && payload.context[variable])) { return; }
var contextVariable = payload.context[variable],
mutations = result[variable].mutations;
// bail out if there are no mutations
if (!mutations) {
return;
}
// ensure that variable scope is treated accordingly
if (_.isFunction(contextVariable.applyMutation)) {
mutations = new sdk.MutationTracker(result[variable].mutations);
mutations.applyOn(contextVariable);
}
// @todo: unify the non variable scope flows and consume diff always
// and drop sending the full variable scope from sandbox
else {
util.syncObject(contextVariable, result[variable]);
}
});
// Get the failures. If there was an error running the script itself, that takes precedence
if (!err && (abortOnFailure || stopOnFailure)) {
err = postProcessContext(result, assertionFailed); // also use async assertions
}
// Ensure that we have SDK instances, not serialized plain objects.
// @todo - should this be handled by the sandbox?
result && result._variables && (result._variables = new sdk.VariableScope(result._variables));
result && result.environment && (result.environment = new sdk.VariableScope(result.environment));
result && result.globals && (result.globals = new sdk.VariableScope(result.globals));
result && result.collectionVariables &&
(result.collectionVariables = new sdk.VariableScope(result.collectionVariables));
result && result.request && (result.request = new sdk.Request(result.request));
// @note Since postman-sandbox@3.5.2, response object is not included in the execution result.
// Refer: https://github.com/postmanlabs/postman-sandbox/pull/512
// Adding back here to avoid breaking change in `script` callback.
// @todo revisit script callback args in runtime v8.
result && payload.context && payload.context.response &&
(result.response = new sdk.Response(payload.context.response));
// persist the pm.variables for the next script
result && result._variables &&
(payload.context._variables = new sdk.VariableScope(result._variables));
// persist the pm.variables for the next request
result && result._variables && (this.state._variables = new sdk.VariableScope(result._variables));
// persist the mutated request in payload context,
// @note this will be used for the next prerequest script or
// upcoming commands(request, httprequest).
result && result.request && (payload.context.request = result.request);
// now that this script is done executing, we trigger the event and move to the next script
this.triggers.script(err || null, scriptCursor, result, script, event, item);
// move to next script and pass on the results for accumulation
next(((stopOnScriptError || abortOnError || stopOnFailure) && err) ? err : null, _.assign({
event: event,
script: script,
result: result
}, err && {error: err})); // we use assign here to avoid needless error property
}.bind(this));
}.bind(this), function (err, results) {
// trigger the event completion callback
this.triggers[eventName](null, cursor, results, item);
next((abortOnError && err) ? err : null, results, err);
}.bind(this));
}
}
};
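The assertion-failure aggregation in `postProcessContext` at the top of this file reduces to a small pure function; a sketch of just that logic, without the `serialised-error` wrapping, using a hypothetical name:

```javascript
// Sketch of the failure aggregation in `postProcessContext` above: failure
// names are joined into a single AssertionFailure error, or undefined when
// there are none.
function aggregateFailures(failures) {
    if (!(failures && failures.length)) { return undefined; }
    var error = new Error(failures.join(', '));
    error.name = 'AssertionFailure';
    return error;
}

console.log(aggregateFailures([])); // undefined
console.log(aggregateFailures(['status is 200', 'body has id']).message); // status is 200, body has id
```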


@@ -0,0 +1,211 @@
var _ = require('lodash'),
async = require('async'),
uuid = require('uuid'),
// These are functions which a request passes through _before_ being sent. They take care of stuff such as
// variable resolution, loading of files, etc.
prehelpers = require('../request-helpers-presend'),
// Similarly, these run after the request, and have the power to dictate whether a request should be re-queued
posthelpers = require('../request-helpers-postsend'),
ReplayController = require('../replay-controller'),
RequesterPool = require('../../requester').RequesterPool,
RESPONSE_START_EVENT_BASE = 'response.start.',
RESPONSE_END_EVENT_BASE = 'response.end.';
module.exports = {
init: function (done) {
// Request timeouts are applied by the requester, so add them to requester options (if any).
// create a requester pool
this.requester = new RequesterPool(this.options, done);
},
// the http trigger is actually directly triggered by the requester
// @todo - figure out whether we should trigger it from here rather than the requester.
triggers: ['beforeRequest', 'request', 'responseStart', 'io'],
process: {
/**
* @param {Object} payload
* @param {Item} payload.item
* @param {Object} payload.data
* @param {Object} payload.context
* @param {VariableScope} payload.globals
* @param {VariableScope} payload.environment
* @param {Cursor} payload.coords
* @param {Boolean} payload.abortOnError
* @param {String} payload.source
* @param {Function} next
*
* @todo validate payload
*/
httprequest: function (payload, next) {
var abortOnError = _.has(payload, 'abortOnError') ? payload.abortOnError : this.options.abortOnError,
self = this,
context;
context = payload.context;
// generates a unique id for each http request
// a collection request can have multiple http requests
_.set(context, 'coords.httpRequestId', payload.httpRequestId || uuid());
// Run the helper functions
async.applyEachSeries(prehelpers, context, self, function (err) {
var xhr,
aborted,
item = context.item,
beforeRequest,
afterRequest,
safeNext;
// finish up current command
safeNext = function (error, finalPayload) {
// the error is passed twice to allow control between aborting the error vs just
// bubbling it up
return next((error && abortOnError) ? error : null, finalPayload, error);
};
// Helper function which calls the beforeRequest trigger.
beforeRequest = function (err) {
self.triggers.beforeRequest(err, context.coords, item.request, payload.item, {
httpRequestId: context.coords && context.coords.httpRequestId,
abort: function () {
!aborted && xhr && xhr.abort();
aborted = true;
}
});
};
// Helper function to call the afterRequest trigger.
afterRequest = function (err, response, request, cookies, history) {
self.triggers.request(err, context.coords, response, request, payload.item, cookies, history);
};
// Ensure that this is called.
beforeRequest(null);
if (err) {
// Since we encountered an error before even attempting to send the request, we bubble it up
// here.
afterRequest(err, undefined, item.request);
return safeNext(
err,
{request: item.request, coords: context.coords, item: context.originalItem}
);
}
if (aborted) {
return next(new Error('runtime: request aborted'));
}
self.requester.create({
type: 'http',
source: payload.source,
cursor: context.coords
}, function (err, requester) {
if (err) { return next(err); } // this should never happen
var requestId = uuid(),
replayOptions;
// eslint-disable-next-line max-len
requester.on(RESPONSE_START_EVENT_BASE + requestId, function (err, response, request, cookies, history) {
// we could have also added the response to the set of responses in the cloned item,
// but then, we would have to iterate over all of them, which seems unnecessary
context.response = response;
// run the post request helpers, which need to use the response, assigned above
async.applyEachSeries(posthelpers, context, self, function (error, options) {
if (error) {
return;
}
// find the first helper that requested a replay
replayOptions = _.find(options, {replay: true});
// bail out if we know that request will be replayed.
if (replayOptions) {
return;
}
// bail out if it's a pm.sendRequest
// @todo find a better way of identifying scripts
// @note don't use source='script'. Script requests
// can trigger `*.auth` source requests as well.
if (context.coords && context.coords.scriptId) {
return;
}
// trigger responseStart only for collection request.
// if there are replays, this will be triggered for the last request in the replay chain.
self.triggers.responseStart(err, context.coords, response, request, payload.item, cookies,
history);
});
});
requester.on(RESPONSE_END_EVENT_BASE + requestId, self.triggers.io.bind(self.triggers));
// eslint-disable-next-line max-len
xhr = requester.request(requestId, item.request, context.protocolProfileBehavior, function (err, res, req, cookies, history) {
err = err || null;
var nextPayload = {
response: res,
request: req,
item: context.originalItem,
cookies: cookies,
coords: context.coords,
history: history
},
replayController;
// trigger the request event.
// @note - we give the _original_ item in this trigger, so someone can do reference
// checking. Not sure if we should do that or not, but that's how it is.
// Don't break it.
afterRequest(err, res, req, cookies, history);
// Dispose off the requester, we don't need it anymore.
requester.dispose();
// do not process replays if there was an error
if (err) {
return safeNext(err, nextPayload);
}
// request replay logic
if (replayOptions) {
// prepare for replay
replayController = new ReplayController(context.replayState, self);
// replay controller invokes callback no. 1 when replaying the request
// invokes callback no. 2 when replay count has exceeded maximum limit
// @note: errors in replayed requests are passed to callback no. 1
return replayController.requestReplay(context,
context.item,
{source: replayOptions.helper},
// new payload with response from replay is sent to `next`
function (err, payloadFromReplay) { safeNext(err, payloadFromReplay); },
// replay was stopped, move on with older payload
function (err) {
// warn users that maximum retries have exceeded
// but don't bubble up the error with the request
self.triggers.console(context.coords, 'warn', (err.message || err));
safeNext(null, nextPayload);
}
);
}
// finish up for any other request
return safeNext(err, nextPayload);
});
});
});
}
}
};
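The `safeNext` wrapper above passes the error twice so the caller can distinguish "abort the command on this error" from "just observe it". A minimal sketch of that contract, with a hypothetical factory name:

```javascript
// Minimal sketch of the `safeNext` contract above: the first argument aborts
// the command only when abortOnError is set; the third always carries the error.
function makeSafeNext(next, abortOnError) {
    return function (error, finalPayload) {
        return next((error && abortOnError) ? error : null, finalPayload, error);
    };
}

var seen;
makeSafeNext(function (err, payload, rawErr) { seen = [err, rawErr instanceof Error]; }, false)(new Error('boom'), {});
console.log(seen); // [ null, true ] — error observed but not bubbled
```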


@@ -0,0 +1,277 @@
var _ = require('lodash'),
uuid = require('uuid'),
Response = require('postman-collection').Response,
visualizer = require('../../visualizer'),
/**
* List of request properties which can be mutated via pre-request
*
* @private
* @const
* @type {String[]}
*/
ALLOWED_REQUEST_MUTATIONS = ['url', 'method', 'headers', 'body'],
extractVisualizerData,
getResponseJSON;
/**
* Returns visualizer data from the latest execution result.
*
* @param {Array} prereqExecutions - pre-script executions results
* @param {Array} testExecutions - test-script executions results
* @returns {Object|undefined} - visualizer data
*/
extractVisualizerData = function (prereqExecutions, testExecutions) {
var visualizerData,
i;
if (_.isArray(testExecutions)) {
// loop through the test executions in reverse order to return data from latest execution
for (i = testExecutions.length - 1; i >= 0; i--) {
visualizerData = _.get(testExecutions[i], 'result.return.visualizer');
if (visualizerData) {
return visualizerData;
}
}
}
if (_.isArray(prereqExecutions)) {
// extract visualizer data from pre-request script results if it is not found earlier
for (i = prereqExecutions.length - 1; i >= 0; i--) {
visualizerData = _.get(prereqExecutions[i], 'result.return.visualizer');
if (visualizerData) {
return visualizerData;
}
}
}
};
/**
* Convert response into a JSON serializable object.
* The stream property is converted to base64 string for performance reasons.
*
* @param {Object} response - SDK Response instance
* @returns {Object}
*/
getResponseJSON = function (response) {
if (!Response.isResponse(response)) {
return;
}
return {
id: response.id,
code: response.code,
status: response.status,
header: response.headers && response.headers.toJSON(),
stream: response.stream && {
type: 'Base64',
data: response.stream.toString('base64')
},
responseTime: response.responseTime
};
};
/**
* Add options
* stopOnError:Boolean
* @type {Object}
*/
module.exports = {
init: function (done) {
// @todo - code item global timeout and delay here
done();
},
triggers: ['beforeItem', 'item', 'beforePrerequest', 'prerequest', 'beforeTest', 'test'],
process: {
/**
* @param {Function=} callback
* @param {Object} payload
* @param {Function} next
* @todo validate payload
*/
item: function (callback, payload, next) {
// adjust for polymorphic instructions
if (!next && _.isFunction(payload) && !_.isFunction(callback)) {
next = payload;
payload = callback;
callback = null;
}
var item = payload.item,
originalRequest = item.request.clone(),
coords = payload.coords,
data = _.isObject(payload.data) ? payload.data : {},
environment = _.isObject(payload.environment) ? payload.environment : {},
globals = _.isObject(payload.globals) ? payload.globals : {},
collectionVariables = _.isObject(payload.collectionVariables) ? payload.collectionVariables : {},
_variables = _.isObject(payload._variables) ? payload._variables : {},
stopOnError = _.has(payload, 'stopOnError') ? payload.stopOnError : this.options.stopOnError,
// @todo: this is mostly coded in event extension and we are
// still not sure whether that is the right place for it to be.
abortOnFailure = this.options.abortOnFailure,
stopOnFailure = this.options.stopOnFailure,
delay = _.get(this.options, 'delay.item'),
ctxTemplate;
// validate minimum parameters required for the command to work
if (!(item && coords)) {
return next(new Error('runtime: item execution is missing required parameters'));
}
// store a common uuid in the coords
coords.ref = uuid.v4();
// here we code to queue prerequest script, then make a request and then execute test script
this.triggers.beforeItem(null, coords, item);
this.queueDelay(function () {
// create the context object for scripts to run
ctxTemplate = {
collectionVariables: collectionVariables,
_variables: _variables,
globals: globals,
environment: environment,
data: data,
request: item.request
};
// @todo make it less nested by coding Instruction.thenQueue
this.queue('event', {
name: 'prerequest',
item: item,
coords: coords,
context: ctxTemplate,
trackContext: ['globals', 'environment', 'collectionVariables'],
stopOnScriptError: stopOnError,
stopOnFailure: stopOnFailure
}).done(function (prereqExecutions, prereqExecutionError) {
// if stop on error is marked and script executions had an error,
// do not proceed with more commands, instead we bail out
if ((stopOnError || stopOnFailure) && prereqExecutionError) {
this.triggers.item(null, coords, item); // @todo - should this trigger receive error?
return callback && callback.call(this, prereqExecutionError, {
prerequest: prereqExecutions
});
}
// update allowed request mutation properties with the mutated context
// @note from this point forward, make sure this mutated
// request instance is used for upcoming commands.
ALLOWED_REQUEST_MUTATIONS.forEach(function (property) {
if (_.has(ctxTemplate, ['request', property])) {
item.request[property] = ctxTemplate.request[property];
}
// update property's parent reference
if (item.request[property] && typeof item.request[property].setParent === 'function') {
item.request[property].setParent(item.request);
}
});
this.queue('request', {
item: item,
globals: ctxTemplate.globals,
environment: ctxTemplate.environment,
collectionVariables: ctxTemplate.collectionVariables,
_variables: ctxTemplate._variables,
data: ctxTemplate.data,
coords: coords,
source: 'collection'
}).done(function (result, requestError) {
!result && (result = {});
var request = result.request,
response = result.response,
cookies = result.cookies;
if ((stopOnError || stopOnFailure) && requestError) {
this.triggers.item(null, coords, item); // @todo - should this trigger receive error?
return callback && callback.call(this, requestError, {
request: request
});
}
// also the test object requires the updated request object (since auth helpers may modify it)
request && (ctxTemplate.request = request);
// @note convert response instance to plain object.
// we want to avoid calling Response.toJSON(), which triggers toJSON on the Response.stream buffer
// and roughly triples the size of the stringified object. It also increases the total number of
// tokens (buffer.data), whereas Buffer.toString generates a single string that is easier to
// stringify and send over the UVM bridge.
response && (ctxTemplate.response = getResponseJSON(response));
// set cookies for this transaction
cookies && (ctxTemplate.cookies = cookies);
// the context template also has a test object to store assertions
ctxTemplate.tests = {}; // @todo remove
this.queue('event', {
name: 'test',
item: item,
coords: coords,
context: ctxTemplate,
trackContext: ['tests', 'globals', 'environment', 'collectionVariables'],
stopOnScriptError: stopOnError,
abortOnFailure: abortOnFailure,
stopOnFailure: stopOnFailure
}).done(function (testExecutions, testExecutionError) {
var visualizerData = extractVisualizerData(prereqExecutions, testExecutions),
visualizerResult;
if (visualizerData) {
visualizer.processTemplate(visualizerData.template,
visualizerData.data,
visualizerData.options,
function (err, processedTemplate) {
visualizerResult = {
// bubble up the errors while processing template through visualizer result
error: err,
// add processed template and data to visualizer result
processedTemplate: processedTemplate,
data: visualizerData.data
};
// trigger an event saying that item has been processed
this.triggers.item(null, coords, item, visualizerResult);
}.bind(this));
}
else {
// trigger an event saying that item has been processed
// @todo - should this trigger receive error?
this.triggers.item(null, coords, item, null);
}
// reset mutated request with original request instance
// @note request mutations are not persisted across iterations
item.request = originalRequest;
callback && callback.call(this, ((stopOnError || stopOnFailure) && testExecutionError) ?
testExecutionError : null, {
prerequest: prereqExecutions,
request: request,
response: response,
test: testExecutions
});
});
});
});
}.bind(this), {
time: delay,
source: 'item',
cursor: coords
}, next);
}
}
};
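
The `item` processor above accepts both `(callback, payload, next)` and `(payload, next)` call signatures. Below is a minimal, self-contained sketch of that argument-shifting pattern; the function and property names are illustrative only, not part of the postman-runtime API:

```javascript
// Sketch of the polymorphic-arguments adjustment used by the `item` processor:
// when called as (payload, next), the arguments are shifted one slot to the right.
function adjustArgs (callback, payload, next) {
    if (!next && typeof payload === 'function' && typeof callback !== 'function') {
        next = payload;      // second argument was actually the continuation
        payload = callback;  // first argument was actually the payload
        callback = null;     // no per-item callback was provided
    }

    return { callback: callback, payload: payload, next: next };
}
```

The real processor expresses the same check with lodash (`_.isFunction`); the sketch just makes the two accepted signatures explicit.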


@@ -0,0 +1,100 @@
var _ = require('lodash'),
sdk = require('postman-collection'),
createItemContext = require('../create-item-context'),
/**
* Resolve variables in item and auth in context.
*
* @param {ItemContext} context
* @param {Item} [context.item]
* @param {RequestAuth} [context.auth]
* @param {Object} payload
* @param {VariableScope} payload._variables
* @param {Object} payload.data
* @param {VariableScope} payload.environment
* @param {VariableScope} payload.collectionVariables
* @param {VariableScope} payload.globals
*/
resolveVariables = function (context, payload) {
if (!(context.item && context.item.request)) { return; }
// @todo - resolve variables in a more graceful way
var variableDefinitions = [
// extract the variable list from variable scopes
// @note: this is the order of precedence for variable resolution - don't change it
payload._variables.values,
payload.data,
payload.environment.values,
payload.collectionVariables.values,
payload.globals.values
],
urlString = context.item.request.url.toString(),
item,
auth;
// @todo - no need to sync variables when SDK starts supporting resolution from scope directly
// @todo - avoid resolving the entire item as this unnecessarily resolves URL
item = context.item = new sdk.Item(context.item.toObjectResolved(null,
variableDefinitions, {ignoreOwnVariables: true}));
auth = context.auth;
// resolve variables in URL string
if (urlString) {
// @note this adds support for resolving nested variables, as the URL parser doesn't support them well.
urlString = sdk.Property.replaceSubstitutions(urlString, variableDefinitions);
// Re-parse the URL from the resolved string
item.request.url = new sdk.Url(urlString);
}
// resolve variables in auth
auth && (context.auth = new sdk.RequestAuth(auth.toObjectResolved(null,
variableDefinitions, {ignoreOwnVariables: true})));
};
module.exports = {
init: function (done) {
done();
},
triggers: ['response'],
process: {
request: function (payload, next) {
var abortOnError = _.has(payload, 'abortOnError') ? payload.abortOnError : this.options.abortOnError,
// helper function to trigger `response` callback and complete the command
complete = function (err, nextPayload) {
// nextPayload will be empty for unhandled errors
// trigger `response` callback
// nextPayload.response will be empty for error flows
// the `item` argument is resolved and mutated here
nextPayload && this.triggers.response(err, nextPayload.coords, nextPayload.response,
nextPayload.request, nextPayload.item, nextPayload.cookies, nextPayload.history);
// the error is passed twice to allow control between aborting the error vs just
// bubbling it up
return next(err && abortOnError ? err : null, nextPayload, err);
}.bind(this),
context = createItemContext(payload);
// resolve variables in item and auth
resolveVariables(context, payload);
// add context for use, after resolution
payload.context = context;
// we do not queue the `httprequest` instruction here because queueing would unblock the
// item command and let it prepare for the next `event` instruction; at this moment the
// request is not fulfilled yet, and we want to keep the command blocked
this.immediate('httprequest', payload)
.done(function (nextPayload, err) {
// change signature to error first
complete(err, nextPayload);
})
.catch(complete);
}
}
};
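
`resolveVariables` above documents a fixed precedence order across variable scopes. The following sketch restates that lookup order using plain objects in place of postman-collection `VariableScope` values; the keys (`host`, `port`, etc.) are made up for illustration:

```javascript
// First scope that defines the key wins; the order mirrors the
// variableDefinitions array: _variables, data, environment,
// collectionVariables, globals.
function resolveFirst (key, scopes) {
    for (var i = 0; i < scopes.length; i++) {
        if (Object.prototype.hasOwnProperty.call(scopes[i], key)) {
            return scopes[i][key];
        }
    }
}

var scopes = [
    { host: 'local.example' },              // payload._variables.values
    {},                                     // payload.data (iteration data)
    { host: 'env.example', port: '8080' },  // payload.environment.values
    { path: '/v1' },                        // payload.collectionVariables.values
    { port: '80', scheme: 'https' }         // payload.globals.values
];
```

A local `host` shadows the environment's, while `port` falls through to the environment before globals, which is exactly why the comment in `resolveVariables` warns not to reorder the array.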


@@ -0,0 +1,239 @@
var _ = require('lodash'),
Cursor = require('../cursor'),
VariableScope = require('postman-collection').VariableScope,
prepareLookupHash,
extractSNR,
getIterationData;
/**
* Returns a hash of IDs and Names of items in an array
*
* @param {Array} items
* @returns {Object}
*/
prepareLookupHash = function (items) {
var hash = {
ids: {},
names: {},
obj: {}
};
_.forEach(items, function (item, index) {
if (item) {
item.id && (hash.ids[item.id] = index);
item.name && (hash.names[item.name] = index);
}
});
return hash;
};
extractSNR = function (executions, previous) {
var snr = previous || {};
_.isArray(executions) && executions.forEach(function (execution) {
_.has(_.get(execution, 'result.return'), 'nextRequest') && (
(snr.defined = true),
(snr.value = execution.result.return.nextRequest)
);
});
return snr;
};
/**
* Returns the data for the given iteration
*
* @function getIterationData
* @param {Array} data - The data array containing all iterations' data
* @param {Number} iteration - The iteration to get data for
* @return {Any} - The data for the iteration
*/
getIterationData = function (data, iteration) {
// if iteration has a corresponding data element use that
if (iteration < data.length) {
return data[iteration];
}
// otherwise use the last data element
return data[data.length - 1];
};
/**
* Adds options
* disableSNR:Boolean
*
* @type {Object}
*/
module.exports = {
init: function (done) {
var state = this.state;
// ensure that the environment, globals and collectionVariables are in VariableScope instance format
state.environment = VariableScope.isVariableScope(state.environment) ? state.environment :
new VariableScope(state.environment);
state.globals = VariableScope.isVariableScope(state.globals) ? state.globals :
new VariableScope(state.globals);
state.collectionVariables = VariableScope.isVariableScope(state.collectionVariables) ?
state.collectionVariables : new VariableScope(state.collectionVariables);
state._variables = new VariableScope();
// ensure that the items and iteration data set is in place
!_.isArray(state.items) && (state.items = []);
!_.isArray(state.data) && (state.data = []);
!_.isObject(state.data[0]) && (state.data[0] = {});
// if the location in state is already normalised then go ahead and queue iteration, else normalise the
// location
state.cursor = Cursor.box(state.cursor, { // we pass bounds to ensure there is no stale state
cycles: this.options.iterationCount,
length: state.items.length
});
this.waterfall = state.cursor; // copy the location object to instance for quick access
// queue the iteration command on start
this.queue('waterfall', {
coords: this.waterfall.current(),
static: true,
start: true
});
// clear the variable that is supposed to store item name and id lookup hash for easy setNextRequest
this.snrHash = null; // we populate it in the first SNR call
done();
},
triggers: ['beforeIteration', 'iteration'],
process: {
/**
* This processor simply queues scripts and requests in a linear chain.
*
* @param {Object} payload
* @param {Object} payload.coords
* @param {Boolean} [payload.static=false]
* @param {Function} next
*/
waterfall: function (payload, next) {
// we procure the coordinates that we have to pick the item and data from; the data is
// picked based on the current iteration number
var coords = payload.static ? payload.coords : this.waterfall.whatnext(payload.coords),
item = this.state.items[coords.position],
delay;
// if there is nothing to process, we bail out from here, even before we enter the iteration cycle
if (coords.empty) {
return next();
}
if (payload.stopRunNow) {
this.triggers.iteration(null, payload.coords);
return next();
}
// if it is the beginning of a run, we need to raise events for iteration start
if (payload.start) {
this.triggers.beforeIteration(null, coords);
}
// if this is a new iteration, we close the previous one and start new
if (coords.cr) {
// getting the iteration delay here ensures that delay is only called between two iterations
delay = _.get(this.options, 'delay.iteration', 0);
this.triggers.iteration(null, payload.coords);
this.triggers.beforeIteration(null, coords);
}
// if this is end of waterfall, it is an end of iteration and also end of run
if (coords.eof) {
this.triggers.iteration(null, coords);
return next();
}
this.queueDelay(function () {
this.queue('item', {
item: item,
coords: coords,
data: getIterationData(this.state.data, coords.iteration),
environment: this.state.environment,
globals: this.state.globals,
collectionVariables: this.state.collectionVariables,
_variables: this.state._variables
}, function (executionError, executions) {
var snr = {},
nextCoords,
seekingToStart,
stopRunNow,
stopOnFailure = this.options.stopOnFailure;
if (!executionError) {
// extract set next request
snr = extractSNR(executions.prerequest);
snr = extractSNR(executions.test, snr);
}
if (!this.options.disableSNR && snr.defined) {
// prepare the snr lookup hash if it is not already provided
// @todo - figure out a way to reset this post run complete
!this.snrHash && (this.snrHash = prepareLookupHash(this.state.items));
// if it is null, we do not proceed further and move on
// see if a request is found in the hash and then reset the coords position to the lookup
// value.
(snr.value !== null) && (snr.position = // eslint-disable-next-line no-nested-ternary
this.snrHash[_.has(this.snrHash.ids, snr.value) ? 'ids' :
(_.has(this.snrHash.names, snr.value) ? 'names' : 'obj')][snr.value]);
snr.valid = _.isNumber(snr.position);
}
nextCoords = _.clone(coords);
if (snr.valid) {
// if the position was detected, we set the position to the one previous to the desired location
// this ensures that the next call to .whatnext() will return the desired position.
nextCoords.position = snr.position - 1;
}
else {
// if snr was requested, but not valid, we stop this iteration.
// stopping an iteration is equivalent to seeking the last position of the current
// iteration, so that the next call to .whatnext() will automatically move to the next
// iteration.
(snr.defined || executionError) && (nextCoords.position = nextCoords.length - 1);
// If we need to stop the run, we set the stop flag to true.
(stopOnFailure && executionError) && (stopRunNow = true);
}
// @todo - do this in unhacky way
if (nextCoords.position === -1) {
nextCoords.position = 0;
seekingToStart = true;
}
this.waterfall.seek(nextCoords.position, nextCoords.iteration, function (err, chngd, coords) {
// this condition should never arise, so better throw error when this happens
if (err) {
throw err;
}
this.queue('waterfall', {
coords: coords,
static: seekingToStart,
stopRunNow: stopRunNow
});
}, this);
});
}.bind(this), {
time: delay,
source: 'iteration',
cursor: coords
}, next);
}
}
};
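
`getIterationData` above implements the rule that iterations beyond the end of the data array reuse the last row. Restated standalone for quick verification:

```javascript
// Each iteration uses its own data row; once the rows run out,
// the last row repeats for all remaining iterations.
function getIterationData (data, iteration) {
    if (iteration < data.length) {
        return data[iteration];
    }

    return data[data.length - 1];
}
```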


@@ -0,0 +1,296 @@
var sdk = require('postman-collection'),
ItemGroup = sdk.ItemGroup,
Item = sdk.Item,
DEFAULT_LOOKUP_STRATEGY = 'idOrName',
INVALID_LOOKUP_STRATEGY_ERROR = 'runtime~extractRunnableItems: Invalid entrypoint lookupStrategy',
/**
* Accumulates all items in order if the entry point is a collection/folder.
* If an item is passed, returns an array containing that item.
*
* @param {ItemGroup|Item} node
*
* @returns {Array<Item>}
*
* @todo: Possibly add mapItem to sdk.ItemGroup?
*/
flattenNode = function (node) {
var items = [];
// bail out
if (!node) { return items; }
if (ItemGroup.isItemGroup(node)) {
node.forEachItem(function (item) { items.push(item); });
}
else if (Item.isItem(node)) {
items.push(node);
}
return items;
},
/**
* Finds an item or item group based on id or name.
*
* @param {ItemGroup} itemGroup
* @param {?String} match
*
* @returns {Item|ItemGroup|undefined}
*/
findItemOrGroup = function (itemGroup, match) {
if (!itemGroup || !itemGroup.items) { return; }
var matched;
// lookup match on own children
itemGroup.items.each(function (itemOrGroup) {
if (itemOrGroup.id === match || itemOrGroup.name === match) {
matched = itemOrGroup;
return false; // exit the loop
}
});
// if there is no match on own children, start lookup on grand children
!matched && itemGroup.items.each(function (itemOrGroup) {
matched = findItemOrGroup(itemOrGroup, match);
if (matched) { return false; } // exit the loop
});
return matched;
},
/**
* Finds items based on multiple ids or names provided.
*
* @param {ItemGroup} itemGroup - Composite list of Item or ItemGroup.
* @param {Object} entrypointSubset - Entry-points reference passed across multiple recursive calls.
* @param {Boolean} _continueAccumulation - Flag used to decide whether to accumulate items or not.
* @param {Object} _accumulatedItems - Found Items or ItemGroups.
* @returns {Object} Found Items or ItemGroups.
*/
findItemsOrGroups = function (itemGroup, entrypointSubset, _continueAccumulation, _accumulatedItems) {
!_accumulatedItems && (_accumulatedItems = {members: [], reference: {}});
if (!itemGroup || !itemGroup.items) { return _accumulatedItems; }
var match;
itemGroup.items.each(function (item) {
// bail out if all entry-points are found.
if (!Object.keys(entrypointSubset).length) { return false; }
// lookup for item.id in entrypointSubset and if not found, lookup by item.name.
if (!(match = entrypointSubset[item.id] && item.id)) {
match = entrypointSubset[item.name] && item.name;
}
if (match) {
// only accumulate items that were not already tracked via their parent entrypoint.
if (_continueAccumulation) {
_accumulatedItems.members.push(item);
_accumulatedItems.reference[match] = item;
}
// delete looked-up entrypoint.
delete entrypointSubset[match];
}
// recursive call to find nested entry-points, to make sure all provided entry-points are tracked.
// _continueAccumulation flag will be `false` for children if their parent entrypoint is found.
return findItemsOrGroups(item, entrypointSubset, !match, _accumulatedItems);
});
return _accumulatedItems;
},
/**
* Finds an item or group from a path. The path should be an array of ids from the parent chain.
*
* @param {Collection} collection
* @param {Object} options
* @param {String} options.execute
* @param {?Array<String>} [options.path]
* @param {Function} callback
*/
lookupByPath = function (collection, options, callback) {
var lookupPath,
lastMatch = collection,
lookupOptions = options || {},
i,
ii;
// path can be empty, if item/group is at the top level
lookupPath = lookupOptions.path || [];
// push execute id to the path
options.execute && (lookupPath.push(options.execute));
// go down the lookup path
for (i = 0, ii = lookupPath.length; (i < ii) && lastMatch; i++) {
lastMatch = lastMatch.items && lastMatch.items.one(lookupPath[i]);
}
callback && callback(null, flattenNode(lastMatch), lastMatch);
},
/**
* Finds an item or group on a collection with a matching id or name.
*
* @param {Collection} collection
* @param {Object} options
* @param {String} [options.execute]
* @param {Function} callback
*/
lookupByIdOrName = function (collection, options, callback) {
var match = options.execute,
matched;
if (!match) { return callback(null, []); }
// do a recursive lookup
matched = findItemOrGroup(collection, match);
callback(null, flattenNode(matched), matched);
},
/**
* Finds items or item groups in a collection with matching list of ids or names.
*
* @note runnable items follow the order in which the items are defined in the collection
*
* @param {Collection} collection
* @param {Object} options
* @param {Array<String>} [options.execute]
* @param {Function} callback
*/
lookupByMultipleIdOrName = function (collection, options, callback) {
var entrypoints = options.execute,
preserveOrder = options.preserveOrder,
entrypointLookup = {},
runnableItems = [],
items,
i,
ii;
if (!(Array.isArray(entrypoints) && entrypoints.length)) {
return callback(null, []);
}
// add temp reference for faster lookup of entry-point name/id.
// entry-points with same name/id will be ignored.
for (i = 0, ii = entrypoints.length; i < ii; i++) {
entrypointLookup[entrypoints[i]] = true;
}
items = findItemsOrGroups(collection, entrypointLookup, true);
// extract the items and folders in the order in which they appear as folder/request arguments,
// but only if entrypoint.preserveOrder is specified
if (preserveOrder) {
items.members = entrypoints.map(function (ref) {
return items.reference[ref];
});
}
// at this point, we should have traversed all items mentioned in the entrypoint and created a linear
// subset of items. However, if entries still remain in the lookup object after that, it implies that
// the user input referenced items that do not exist in the collection. As such, we need to bail out
// if any of the given entry-points is not found.
if (Object.keys(entrypointLookup).length) {
return callback(null, []);
}
// extract runnable items from the searched items.
for (i = 0, ii = items.members.length; i < ii; i++) {
runnableItems = runnableItems.concat(flattenNode(items.members[i]));
}
callback(null, runnableItems, collection);
},
/**
* Finds items or item groups in a collection with matching list of ids or names.
*
* @note runnable items follow the order of entrypoints
*
* @param {Collection} collection
* @param {Object} options
* @param {Array<String>} [options.execute]
* @param {Function} callback
*/
lookupByOrder = function (collection, options, callback) {
var entrypoints = options.execute,
entrypointLookup = {},
runnableItems = [],
items,
i,
ii;
if (!(Array.isArray(entrypoints) && entrypoints.length)) {
return callback(null, []);
}
// add temp reference for faster lookup of entry-point name/id.
// entry-points with same name/id will be ignored.
for (i = 0, ii = entrypoints.length; i < ii; i++) {
entrypointLookup[entrypoints[i]] = true;
}
items = findItemsOrGroups(collection, entrypointLookup, true);
// at this point, we should have traversed all items mentioned in the entrypoint and created a linear
// subset of items. However, if entries still remain in the lookup object after that, it implies that
// the user input referenced items that do not exist in the collection. As such, we need to bail out
// if any of the given entry-points is not found.
if (Object.keys(entrypointLookup).length) {
return callback(null, []);
}
// extract runnable items from the searched items.
entrypoints.forEach(function (entrypoint) {
runnableItems = runnableItems.concat(flattenNode(items.reference[entrypoint]));
});
callback(null, runnableItems, collection);
},
lookupStrategyMap = {
path: lookupByPath,
idOrName: lookupByIdOrName,
followOrder: lookupByOrder,
multipleIdOrName: lookupByMultipleIdOrName
},
/**
* Extracts all the items on a collection starting from the entrypoint.
*
* @param {Collection} collection
* @param {?Object} [entrypoint]
* @param {String} [entrypoint.execute] id of item or group to execute (can be name when used with `idOrName`)
* @param {Array<String>} [entrypoint.path] path leading to the item or group selected (only for `path` strategy)
* @param {String} [entrypoint.lookupStrategy=idOrName] strategy to use for entrypoint lookup [idOrName, path]
* @param {Boolean} [entrypoint.preserveOrder] option to preserve the order of folder/items when specified.
* @param {Function} callback
*/
extractRunnableItems = function (collection, entrypoint, callback) {
var lookupFunction,
lookupStrategy;
// if no entrypoint is specified, flatten the entire collection
if (!entrypoint) { return callback(null, flattenNode(collection), collection); }
lookupStrategy = entrypoint.lookupStrategy || DEFAULT_LOOKUP_STRATEGY;
// lookup entry using given strategy
// eslint-disable-next-line no-cond-assign
(lookupFunction = lookupStrategyMap[lookupStrategy]) ?
lookupFunction(collection, entrypoint, callback) :
callback(new Error(INVALID_LOOKUP_STRATEGY_ERROR)); // eslint-disable-line callback-return
};
module.exports = {
extractRunnableItems: extractRunnableItems
};
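
`findItemOrGroup` above prefers matches among an item group's own children before descending into grandchildren. A sketch of that two-pass lookup over plain nested objects (the real code walks postman-collection `ItemGroup` instances via `.items.each`):

```javascript
// Two-pass lookup by id or name: check own children first, then recurse
// into each child only if no direct match was found.
function find (group, match) {
    var matched;
    var items = (group && group.items) || [];

    items.some(function (node) {
        if (node.id === match || node.name === match) {
            matched = node;

            return true; // stop scanning siblings
        }

        return false;
    });

    if (!matched) {
        items.some(function (node) {
            matched = find(node, match);

            return Boolean(matched);
        });
    }

    return matched;
}
```

The two passes matter: a shallow match named the same as a deeper one always wins, regardless of how the children are ordered.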

node_modules/postman-runtime/lib/runner/index.js

@@ -0,0 +1,136 @@
var _ = require('lodash'),
backpack = require('../backpack'),
Run = require('./run'),
extractRunnableItems = require('./extract-runnable-items').extractRunnableItems,
Runner,
defaultTimeouts = {
global: 3 * 60 * 1000, // 3 minutes
request: Infinity,
script: Infinity
};
/**
* @typedef {runCallback}
* @property {Function} [done]
* @property {Function} [error]
* @property {Function} [success]
*/
/**
* @constructor
*
* @param {Object} [options]
*/
Runner = function PostmanCollectionRunner (options) { // eslint-disable-line func-name-matching
this.options = _.assign({}, options);
};
_.assign(Runner.prototype, {
/**
* Prepares `run` config by combining `runner` config with given run options.
*
* @param {Object} [options]
* @param {Object} [options.timeout]
* @param {Object} [options.timeout.global]
* @param {Object} [options.timeout.request]
* @param {Object} [options.timeout.script]
*/
prepareRunConfig: function (options) {
// combine runner config and make a copy
var runOptions = _.merge(_.omit(options, ['environment', 'globals', 'data']), this.options.run) || {};
// start timeout sanitization
!runOptions.timeout && (runOptions.timeout = {});
_.mergeWith(runOptions.timeout, defaultTimeouts, function (userTimeout, defaultTimeout) {
// non numbers, Infinity and missing values are set to default
if (!_.isFinite(userTimeout)) { return defaultTimeout; }
// 0 and negative numbers are set to Infinity, which only leaves positive numbers
return userTimeout > 0 ? userTimeout : Infinity;
});
return runOptions;
},
/**
* Runs a collection or a folder.
*
* @param {Collection} collection
* @param {Object} [options]
* @param {Array.<Item>} options.items
* @param {Array.<Object>} [options.data]
* @param {Object} [options.globals]
* @param {Object} [options.environment]
* @param {Number} [options.iterationCount]
* @param {CertificateList} [options.certificates]
* @param {ProxyConfigList} [options.proxies]
* @param {Array} [options.data]
* @param {Object} [options.entrypoint]
* @param {String} [options.entrypoint.execute] ID of the item-group to be run.
* Can be Name if `entrypoint.lookupStrategy` is `idOrName`
* @param {String} [options.entrypoint.lookupStrategy=idOrName] strategy to lookup the entrypoint [idOrName, path]
* @param {Array<String>} [options.entrypoint.path] path to lookup
* @param {Object} [options.run] Run-specific options, such as options related to the host
*
* @param {Function} callback
*/
run: function (collection, options, callback) {
var self = this,
runOptions = this.prepareRunConfig(options);
callback = backpack.normalise(callback);
!_.isObject(options) && (options = {});
// @todo make the extract runnables interface better defined and documented
// - give the ownership of error to each strategy lookup functions
// - think about moving these codes into an extension command prior to waterfall
// - the third argument in callback that returns control, is ambiguous and can be removed if error is controlled
// by each lookup function.
// - the interface can be further broken down to have the "flattenNode" action be made common and not be
// required to be coded in each lookup strategy
//
// serialise the items into a linear array based on the lookup strategy provided as input
extractRunnableItems(collection, options.entrypoint, function (err, runnableItems, entrypoint) {
if (err || !runnableItems) { return callback(new Error('Error fetching run items')); }
// Bail out only if: abortOnError is set and the returned entrypoint is invalid
if (options.abortOnError && !entrypoint) {
// eslint-disable-next-line max-len
return callback(new Error(`Unable to find a folder or request: ${_.get(options, 'entrypoint.execute')}`));
}
// ensure data is an array
!_.isArray(options.data) && (options.data = [{}]);
// get iterationCount from data if not set
if (!runOptions.iterationCount) {
runOptions.iterationCount = options.data.length;
}
return callback(null, (new Run({
items: runnableItems,
data: options.data,
environment: options.environment,
globals: _.has(options, 'globals') ? options.globals : self.options.globals,
// @todo Move to item level to support Item and ItemGroup variables
collectionVariables: collection.variables,
certificates: options.certificates,
proxies: options.proxies
}, runOptions)));
});
}
});
_.assign(Runner, {
/**
* Expose Run instance for testability
*
* @type {Run}
*/
Run: Run
});
module.exports = Runner;
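
The `_.mergeWith` customizer in `prepareRunConfig` encodes three cases for each timeout value. A standalone restatement of that rule, assuming the same defaults as `defaultTimeouts` above:

```javascript
// Non-finite or missing values fall back to the default; zero and negative
// values mean "no timeout" (Infinity); positive finite numbers are kept as-is.
function sanitizeTimeout (userTimeout, defaultTimeout) {
    if (!Number.isFinite(userTimeout)) {
        return defaultTimeout;
    }

    return userTimeout > 0 ? userTimeout : Infinity;
}
```

Note that `Infinity` itself is non-finite, so an explicit `Infinity` also resolves to the default; only the `request` and `script` defaults being `Infinity` keep those effectively unlimited.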

node_modules/postman-runtime/lib/runner/instruction.js

@@ -0,0 +1,205 @@
/**
* An instruction is a self-contained piece of information that can be created and later executed. {@link Run}
* instance uses this as the values of the `Run.next` queue.
*
* @module Run~Instructions
*/
var _ = require('lodash'),
Timings = require('./timings'),
arrayProtoSlice = Array.prototype.slice,
arrayProtoUnshift = Array.prototype.unshift,
pool; // function
/**
* Create a new instruction pool
*
* @param {Object.<Function>} processors - hash of all command processor functions
* @returns {InstructionPool}
*/
pool = function (processors) {
!_.isObject(processors) && (processors = {});
/**
* Create a new instruction to be executed later
*
* @constructor
*
* @param {String} name - name of the instruction. this is useful for later lookup of the `processor` function when
* deserialising this object
* @param {Object} [payload] - a **JSON compatible** object that will be forwarded as the 2nd last parameter to the
* processor.
* @param {Array} [args] - all the arguments that needs to be passed to the processor is in this array
* @private
* @example
* // assuming the pool was created with a 'sample-instruction' processor:
* // pool({ 'sample-instruction': function (arg1, payload, next) {
* //     console.log(payload);
* //     next(null, 'hello-on-execute with ' + arg1);
* // } });
* var inst = Instruction.create('sample-instruction', {
*     payloadData1: 'value'
* }, ['one-arg']);
*
* // now, when we execute, the payload is logged and the message is as expected
* inst.execute(function (err, message) {
*     console.log(message);
* });
*
*/
var Instruction = function (name, payload, args) {
var processor = processors[name];
if (!_.isString(name) || !_.isFunction(processor)) {
throw new Error('run-instruction: invalid construction');
}
// ensure that payload is an object so that data storage can be done. also ensure arguments is an array
!_.isObject(payload) && (payload = {});
!_.isArray(args) && (args = []);
_.assign(this, /** @lends Instruction.prototype */ {
/**
* @type {String}
*/
action: name,
/**
* @type {Object}
*/
payload: payload,
/**
* @type {Array}
*/
in: args,
/**
* @type {Timings}
*/
timings: Timings.create(),
/**
* @private
* @type {Function}
*/
_processor: processor
});
// record the timing when this instruction was created
this.timings.record('created');
};
/**
* Shortcut to `new Instruction(...);`
*
* @param {String} name
* @param {Object} [payload]
* @param {Array} [args]
*
* @returns {Instruction}
*/
Instruction.create = function (name, payload, args) {
return new Instruction(name, payload, args);
};
/**
* Store all thenable items
*
* @type {Array}
*/
Instruction._queue = [];
/**
* Executes an instruction with previously saved payload and arguments
*
* @param {Function} callback
* @param {*} [scope]
*
* @todo: use timeback and control it via options sent during pool creation as an option
*/
Instruction.prototype.execute = function (callback, scope) {
!scope && (scope = this);
var params = _.clone(this.in),
sealed = false,
doneAndSpread = function (err) {
if (sealed) {
console.error('__postmanruntime_fatal_debug: instruction.execute callback called twice');
if (err) {
console.error(err);
}
return;
}
sealed = true;
this.timings.record('end');
var args = arrayProtoSlice.call(arguments);
arrayProtoUnshift.call(args, scope);
if (err) { // in case it errored, we do not process any thenables
_.isArray(this._catch) && _.invokeMap(this._catch, _.apply, scope, arguments);
}
else {
// call all the `then` stuff and then the main callback
_.isArray(this._done) && _.invokeMap(this._done, _.apply, scope, _.tail(arguments));
}
setTimeout(callback.bind.apply(callback, args), 0);
}.bind(this);
// add two additional arguments at the end of the arguments saved - i.e. the payload and a function to call the
// callback asynchronously
params.push(this.payload, doneAndSpread);
this.timings.record('start');
// run the processor in a try block to avoid causing stalled runs
try {
this._processor.apply(scope, params);
}
catch (e) {
doneAndSpread(e);
}
};
Instruction.prototype.done = function (callback) {
(this._done || (this._done = [])).push(callback);
return this;
};
Instruction.prototype.catch = function (callback) {
(this._catch || (this._catch = [])).push(callback);
return this;
};
Instruction.clear = function () {
_.forEach(Instruction._queue, function (instruction) {
delete instruction._done;
});
Instruction._queue.length = 0;
};
Instruction.shift = function () {
return Instruction._queue.shift.apply(Instruction._queue, arguments);
};
Instruction.unshift = function () {
return Instruction._queue.unshift.apply(Instruction._queue, arguments);
};
Instruction.push = function () {
return Instruction._queue.push.apply(Instruction._queue, arguments);
};
return Instruction;
};
module.exports = {
pool: pool
};
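The `Instruction` machinery above is a small deferred: `done`/`catch` callbacks collect on the instruction and flush when `execute` settles, a `sealed` flag guards against double-invocation, and a `try`/`catch` turns a throwing processor into an error. A standalone sketch of that pattern (the `makeInstruction` name is illustrative, not part of postman-runtime's API):

```javascript
// Minimal sketch of the done/catch "thenable" pattern used by Instruction above.
function makeInstruction (processor, payload) {
    var instruction = {
        _done: [],
        _catch: [],
        done: function (cb) { this._done.push(cb); return this; },
        catch: function (cb) { this._catch.push(cb); return this; },
        execute: function (callback) {
            var sealed = false,
                finish = function (err, result) {
                    if (sealed) { return; } // guard against double-invocation
                    sealed = true;
                    if (err) { instruction._catch.forEach(function (cb) { cb(err); }); }
                    else { instruction._done.forEach(function (cb) { cb(result); }); }
                    // defer the main callback, as the real execute does via setTimeout
                    setTimeout(function () { callback(err, result); }, 0);
                };

            // run the processor in a try block so a throwing processor
            // surfaces as an error instead of stalling the queue
            try { processor(payload, finish); }
            catch (e) { finish(e); }
        }
    };

    return instruction;
}
```

Note how `done`/`catch` subscribers fire before the deferred main callback, mirroring the ordering in `doneAndSpread` above.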


@@ -0,0 +1,88 @@
var _ = require('lodash'),
createItemContext = require('./create-item-context'),
// total number of replays allowed
MAX_REPLAY_COUNT = 3,
ReplayController;
/**
* Handles replay logic with replayState from context.
* Makes sure request replays do not go into an infinite loop.
*
* @param {ReplayState} replayState
* @param {Run} run
*
* @constructor
*/
ReplayController = function ReplayController (replayState, run) {
// store state
this.count = replayState ? replayState.count : 0;
this.run = run;
};
_.assign(ReplayController.prototype, /** @lends ReplayController.prototype */{
/**
* Sends a request in the item. This takes care of limiting the total number of replays for a request.
*
* @param {Object} context
* @param {Request} item
* @param {Object} desiredPayload a partial payload to use for the replay request
 * @param {Function} success this callback is invoked when the replay controller has sent the request
 * @param {Function} failure this callback is invoked when the replay controller decides not to send the request
*/
requestReplay: function (context, item, desiredPayload, success, failure) {
// max retries exceeded
if (this.count >= MAX_REPLAY_COUNT) {
return failure(new Error('runtime: maximum intermediate request limit exceeded'));
}
// update replay count state
this.count++;
// update replay state to context
context.replayState = this.getReplayState();
// construct payload for request
var payload = _.defaults({
item: item,
// abortOnError makes sure request command bubbles errors
// so we can pass it on to the callback
abortOnError: true
}, desiredPayload);
// create item context from the new item
payload.context = createItemContext(payload, context);
this.run.immediate('httprequest', payload)
.done(function (response) {
success(null, response);
})
.catch(success);
},
/**
* Returns a serialized version of current ReplayController
*
* @returns {ReplayState}
*/
getReplayState: function () {
/**
* Defines the current replay state of a request.
*
 * By replay state, we mean the number of requests sent
 * as part of one collection request. These can be intermediate requests,
 * or replays of the same collection request.
*
* @typedef {Object} ReplayState
*
* @property {Number} count total number of requests, including Collection requests and replays
*/
return {
count: this.count
};
}
});
module.exports = ReplayController;
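The controller above threads a `count` through `replayState` so replays of replays cannot loop forever. A compact, runnable sketch of that guard, where the `send` callback stands in for `run.immediate('httprequest', …)`:

```javascript
// Illustrative sketch of the replay-count guard: each replay carries its count
// forward through replayState, and the controller refuses once the cap is hit.
var MAX_REPLAY_COUNT = 3; // mirrors the constant in the module above

function ReplayController (replayState) {
    this.count = replayState ? replayState.count : 0;
}

ReplayController.prototype.requestReplay = function (send, success, failure) {
    if (this.count >= MAX_REPLAY_COUNT) {
        return failure(new Error('runtime: maximum intermediate request limit exceeded'));
    }
    this.count++;
    // pass the serialized state forward so the next attempt resumes the count
    send(this.getReplayState(), success);
};

ReplayController.prototype.getReplayState = function () {
    return { count: this.count };
};
```

Because the state is serialized and rebuilt per attempt, the cap holds even when each replay constructs a fresh controller.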


@@ -0,0 +1,61 @@
var _ = require('lodash'),
AuthLoader = require('../authorizer/index').AuthLoader,
createAuthInterface = require('../authorizer/auth-interface'),
DOT_AUTH = '.auth';
module.exports = [
// Post authorization.
function (context, run, done) {
// if no response is provided, there's nothing to do, and probably means that the request errored out
// let the actual request command handle whatever needs to be done.
if (!context.response) { return done(); }
// bail out if there is no auth
if (!(context.auth && context.auth.type)) { return done(); }
var auth = context.auth,
originalAuth = context.originalItem.getAuth(),
originalAuthParams = originalAuth && originalAuth.parameters(),
authHandler = AuthLoader.getHandler(auth.type),
authInterface = createAuthInterface(auth);
// bail out if there is no matching auth handler for the type
if (!authHandler) {
run.triggers.console(context.coords, 'warn', 'runtime: could not find a handler for auth: ' + auth.type);
return done();
}
// invoke `post` on the Auth
authHandler.post(authInterface, context.response, function (err, success) {
// sync all auth system parameters to the original auth
originalAuthParams && auth.parameters().each(function (param) {
param && param.system && originalAuthParams.upsert({key: param.key, value: param.value, system: true});
});
// sync auth state back to item request
_.set(context, 'item.request.auth', auth);
// there was an error in auth post hook
// warn the user but don't bubble it up
if (err) {
run.triggers.console(
context.coords,
'warn',
'runtime~' + auth.type + '.auth: there was an error validating auth: ' + (err.message || err),
err
);
return done();
}
// auth was verified
if (success) { return done(); }
// request a replay of request
done(null, {replay: true, helper: auth.type + DOT_AUTH});
});
}
];
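The post-verification middleware above boils down to a three-way contract: an error is swallowed after a warning, `success` ends the cycle, and anything else requests a replay tagged with the auth helper name. A hedged sketch of that contract (`runPostHook` and `fakeHandler` are illustrative names, not postman-runtime API):

```javascript
// Sketch of the post-auth decision: err -> continue, success -> done,
// otherwise -> ask for a replay routed back to this auth helper.
function runPostHook (handler, response, done) {
    handler.post(response, function (err, success) {
        if (err) { return done(); }     // warn-and-continue in the real code
        if (success) { return done(); } // auth verified, nothing more to do
        done(null, { replay: true, helper: handler.type + '.auth' });
    });
}
```

A handler that treats a 401 as "not yet verified" would therefore trigger exactly one replay request per failed verification.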


@@ -0,0 +1,383 @@
var _ = require('lodash'),
async = require('async'),
util = require('./util'),
sdk = require('postman-collection'),
createAuthInterface = require('../authorizer/auth-interface'),
AuthLoader = require('../authorizer/index').AuthLoader,
ReplayController = require('./replay-controller'),
DOT_AUTH = '.auth';
module.exports = [
// File loading
function (context, run, done) {
if (!context.item) { return done(new Error('Nothing to resolve files for.')); }
var triggers = run.triggers,
cursor = context.coords,
resolver = run.options.fileResolver,
request = context.item && context.item.request,
mode,
data;
if (!request) { return done(new Error('No request to send.')); }
// if body is disabled, then skip loading files.
// @todo this may cause problems if the body is enabled/disabled programmatically from a pre-request script.
if (request.body && request.body.disabled) { return done(); }
// todo: add helper functions in the sdk to do this cleanly for us
mode = _.get(request, 'body.mode');
data = _.get(request, ['body', mode]);
// if there is no mode specified, or no data for the specified mode we cannot resolve anything!
// @note that if source is not readable, there is no point reading anything, yet we need to warn that file
// upload was not done. hence we will have to proceed even without a readable source
if (!data) { // we do not need to check `mode` here since false mode returns no `data`
return done();
}
// in this block, we simply use async.waterfall to ensure that all form of file reading is async. essentially,
// we first determine the data mode and based on it pass the waterfall functions.
async.waterfall([async.constant(data), {
// form data parsing simply "enriches" all form parameters having file data type by replacing / setting the
// value as a read stream
formdata: function (formdata, next) {
// ensure that we only process the file type
async.eachSeries(_.filter(formdata.all(), {type: 'file'}), function (formparam, callback) {
if (!formparam || formparam.disabled) {
return callback(); // disabled params will be filtered in body-builder.
}
var paramIsComposite = Array.isArray(formparam.src),
onLoadError = function (err, disableParam) {
// triggering a warning message for the user
triggers.console(cursor, 'warn',
`Form param \`${formparam.key}\`, file load error: ${err.message || err}`);
// set disabled, it will be filtered in body-builder
disableParam && (formparam.disabled = true);
};
// handle missing file src
if (!formparam.src || (paramIsComposite && !formparam.src.length)) {
onLoadError(new Error('missing file source'), false);
return callback();
}
// handle form param with a single file
// @note we are handling single file first so that we do not need to hit additional complexity of
// handling multiple files while the majority use-case would be to handle single file.
if (!paramIsComposite) {
// eslint-disable-next-line security/detect-non-literal-fs-filename
util.createReadStream(resolver, formparam.src, function (err, stream) {
if (err) {
onLoadError(err, true);
}
else {
formparam.value = stream;
}
callback();
});
return;
}
// handle form param with multiple files
// @note we use map-limit here instead of free-form map in order to avoid choking the file system
// with many parallel descriptor access.
async.mapLimit(formparam.src, 10, function (src, next) {
// eslint-disable-next-line security/detect-non-literal-fs-filename
util.createReadStream(resolver, src, function (err, stream) {
if (err) {
// @note don't throw error or disable param if one of the src fails to load
onLoadError(err);
return next(); // swallow the error
}
next(null, {src: src, value: stream});
});
}, function (err, results) {
if (err) {
onLoadError(err, true);
return callback();
}
_.forEach(results, function (result) {
// Insert individual param above the current formparam
result && formdata.insert(new sdk.FormParam(_.assign(formparam.toJSON(), result)),
formparam);
});
// remove the current formparam after exploding src
formdata.remove(formparam);
callback();
});
}, next);
},
// file data
file: function (filedata, next) {
// eslint-disable-next-line security/detect-non-literal-fs-filename
util.createReadStream(resolver, filedata.src, function (err, stream) {
if (err) {
triggers.console(cursor, 'warn', 'Binary file load error: ' + (err.message || err));
filedata.value = null; // ensure this does not mess with requester
delete filedata.content; // @todo - why content?
}
else {
filedata.content = stream;
}
next();
});
}
}[mode] || async.constant()], function (err) {
// just as a precaution, show the error in console. each resolver should anyway handle its own console
// warnings.
// @todo - get cursor here.
err && triggers.console(cursor, 'warn', 'file data resolution error: ' + (err.message || err));
done(null); // absorb the error since a console warning has been triggered
});
},
// Authorization
function (context, run, done) {
// bail out if there is nothing to authorize
if (!context.item) { return done(new Error('runtime: nothing to authorize.')); }
// bail out if there is no auth
if (!(context.auth && context.auth.type)) { return done(null); }
// get auth handler
var auth = context.auth,
authType = auth.type,
originalAuth = context.originalItem.getAuth(),
originalAuthParams = originalAuth && originalAuth.parameters(),
authHandler = AuthLoader.getHandler(authType),
authPreHook,
authInterface,
authSignHook = function () {
try {
authHandler.sign(authInterface, context.item.request, function (err) {
// handle all types of errors in one place, see catch block
if (err) { throw err; }
done();
});
}
catch (err) {
// handles synchronous and asynchronous errors in auth.sign
run.triggers.console(context.coords,
'warn',
'runtime~' + authType + '.auth: could not sign the request: ' + (err.message || err),
err
);
// swallow the error, we've warned the user
done();
}
};
// bail out if there is no matching auth handler for the type
if (!authHandler) {
run.triggers.console(context.coords, 'warn', 'runtime: could not find a handler for auth: ' + auth.type);
return done();
}
authInterface = createAuthInterface(auth, context.protocolProfileBehavior);
/**
* We go through the `pre` request send validation for the auth. In this step one of the three things can happen
*
* If the Auth `pre` hook
* 1. gives a go, we sign the request and proceed to send the request.
* 2. gives a no go, we don't sign the request, but proceed to send the request.
 * 3. gives a no go, with an intermediate request,
* a. we suspend current request, send the intermediate request
* b. invoke Auth `init` hook with the response of the intermediate request
* c. invoke Auth `pre` hook, and repeat from 1
*/
authPreHook = function () {
authHandler.pre(authInterface, function (err, success, request) {
// there was an error in pre hook of auth
if (err) {
// warn the user
run.triggers.console(context.coords,
'warn',
'runtime~' + authType + '.auth: could not validate the request: ' + (err.message || err),
err
);
// swallow the error, we've warned the user
return done();
}
// sync all auth system parameters to the original auth
originalAuthParams && auth.parameters().each(function (param) {
param && param.system &&
originalAuthParams.upsert({key: param.key, value: param.value, system: true});
});
// authHandler gave a go, sign the request
if (success) { return authSignHook(); }
// auth gave a no go, but no intermediate request
if (!request) { return done(); }
// prepare for sending intermediate request
var replayController = new ReplayController(context.replayState, run),
item = new sdk.Item({request: request});
// auth handler gave a no go, and an intermediate request.
// send the intermediate request; its response is passed to the `init` hook
replayController.requestReplay(context,
item,
// marks the auth as source for intermediate request
{source: auth.type + DOT_AUTH},
function (err, response) {
// errors for intermediate requests are passed to request callback
// passing it here will add it to original request as well, so don't do it
if (err) { return done(); }
// pass the response to Auth `init` hook
authHandler.init(authInterface, response, function (error) {
if (error) {
// warn about the err
run.triggers.console(context.coords, 'warn', 'runtime~' + authType + '.auth: ' +
'could not initialize auth: ' + (error.message || error), error);
// swallow the error, we've warned the user
return done();
}
// schedule back to pre hook
authPreHook();
});
},
function (err) {
// warn users that maximum retries have exceeded
if (err) {
run.triggers.console(
context.coords, 'warn', 'runtime~' + authType + '.auth: ' + (err.message || err)
);
}
// but don't bubble up the error with the request
done();
}
);
});
};
// start by calling the pre hook of the auth
authPreHook();
},
// Proxy lookup
function (context, run, done) {
var proxies = run.options.proxies,
request = context.item.request,
url;
if (!request) { return done(new Error('No request to resolve proxy for.')); }
url = request.url && request.url.toString();
async.waterfall([
// try resolving custom proxies before falling-back to system proxy
function (cb) {
if (_.isFunction(_.get(proxies, 'resolve'))) {
return cb(null, proxies.resolve(url));
}
return cb(null, undefined);
},
// fallback to system proxy
function (config, cb) {
if (config) {
return cb(null, config);
}
return _.isFunction(run.options.systemProxy) ? run.options.systemProxy(url, cb) : cb(null, undefined);
}
], function (err, config) {
if (err) {
run.triggers.console(context.coords, 'warn', 'proxy lookup error: ' + (err.message || err));
}
config && (request.proxy = sdk.ProxyConfig.isProxyConfig(config) ? config : new sdk.ProxyConfig(config));
return done();
});
},
// Certificate lookup + reading from whichever file resolver is provided
function (context, run, done) {
var request,
pfxPath,
keyPath,
certPath,
fileResolver,
certificate;
// A. Check if we have the file resolver
fileResolver = run.options.fileResolver;
if (!fileResolver) { return done(); } // No point going ahead
// B. Ensure we have the request
request = _.get(context.item, 'request');
if (!request) { return done(new Error('No request to resolve certificates for.')); }
// C. See if any cert should be sent, by performing a URL matching
certificate = run.options.certificates && run.options.certificates.resolveOne(request.url);
if (!certificate) { return done(); }
// D. Fetch the paths
// @todo: check why we aren't reading the ca file (why are we not supporting a ca file)
pfxPath = _.get(certificate, 'pfx.src');
keyPath = _.get(certificate, 'key.src');
certPath = _.get(certificate, 'cert.src');
// E. Read from the path, and add the values to the certificate, also associate
// the certificate with the current request.
async.mapValues({
pfx: pfxPath,
key: keyPath,
cert: certPath
}, function (value, key, next) {
// bail out if value is not defined
// @todo add test with server which only accepts cert file
if (!value) { return next(); }
// eslint-disable-next-line security/detect-non-literal-fs-filename
fileResolver.readFile(value, function (err, data) {
// Swallow the error after triggering a warning message for the user.
err && run.triggers.console(context.coords, 'warn',
`certificate "${key}" load error: ${(err.message || err)}`);
next(null, data);
});
}, function (err, fileContents) {
if (err) {
// Swallow the error after triggering a warning message for the user.
run.triggers.console(context.coords, 'warn', 'certificate load error: ' + (err.message || err));
return done();
}
if (fileContents) {
!_.isNil(fileContents.pfx) && _.set(certificate, 'pfx.value', fileContents.pfx);
!_.isNil(fileContents.key) && _.set(certificate, 'key.value', fileContents.key);
!_.isNil(fileContents.cert) && _.set(certificate, 'cert.value', fileContents.cert);
(fileContents.cert || fileContents.key || fileContents.pfx) && (request.certificate = certificate);
}
done();
});
}
];
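The `pre` hook comment block above describes three outcomes: sign and send, send unsigned, or suspend for an intermediate request followed by `init` and a retry of `pre`. That cycle can be sketched as a small loop; the handler and request shapes below are simplified stand-ins for the real sdk types:

```javascript
// Sketch of the authPreHook cycle: keep calling pre() until it either
// approves signing, declines without an intermediate request, or errors.
function authorize (handler, sendIntermediate, done) {
    (function preHook () {
        handler.pre(function (err, success, intermediateRequest) {
            if (err) { return done(); }                  // outcome 0: warn, proceed unsigned
            if (success) { return handler.sign(done); }  // outcome 1: sign and go
            if (!intermediateRequest) { return done(); } // outcome 2: proceed unsigned
            // outcome 3: suspend, send intermediate request, init, retry pre
            sendIntermediate(intermediateRequest, function (response) {
                handler.init(response, preHook);
            });
        });
    }());
}
```

A token-fetching handler, for example, declines once, initializes from the intermediate response, then approves on the second pass.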

node_modules/postman-runtime/lib/runner/run.js generated vendored Normal file

@@ -0,0 +1,236 @@
var _ = require('lodash'),
async = require('async'),
backpack = require('../backpack'),
Instruction = require('./instruction'),
Run; // constructor
/**
* The run object is the primary way to interact with a run in progress. It allows controlling the run (pausing,
* starting, etc) and holds references to the helpers, such as requesters and authorizer.
*
* @param state
* @param options
*
* @property {Requester} requester
* @constructor
*/
Run = function PostmanCollectionRun (state, options) { // eslint-disable-line func-name-matching
_.assign(this, /** @lends Run.prototype */ {
/**
* @private
* @type {Object}
* @todo: state also holds the host for now (if any).
*/
state: _.assign({}, state),
/**
* @private
* @type {InstructionPool}
*/
pool: Instruction.pool(Run.commands),
/**
* @private
* @type {Object}
*/
stack: {},
/**
* @private
* @type {Object}
*/
options: options || {}
});
};
_.assign(Run.prototype, {
// eslint-disable-next-line jsdoc/check-param-names
/**
* @param {String} action
* @param {Object} [payload]
* @param {*} [args...]
*/
queue: function (action, payload) {
// extract the arguments that are to be forwarded to the processor
return this._schedule(action, payload, _.slice(arguments, 2), false);
},
// eslint-disable-next-line jsdoc/check-param-names
/**
* @param {String} action
* @param {Object} [payload]
* @param {*} [args...]
*/
interrupt: function (action, payload) {
// extract the arguments that are to be forwarded to the processor
return this._schedule(action, payload, _.slice(arguments, 2), true);
},
// eslint-disable-next-line jsdoc/check-param-names
/**
* Suspends current instruction and executes the given instruction.
*
* This method explicitly chooses not to handle errors, to allow the caller to catch errors and continue execution
* without terminating the instruction queue. However, it is up to the caller to make sure errors are handled,
* or it will go unhandled.
*
* @param {String} action
* @param {Object} payload
* @param {*} [args...]
*/
immediate: function (action, payload) {
var scope = this,
instruction = this.pool.create(action, payload, _.slice(arguments, 2));
// we directly execute this instruction instead of queueing it.
setTimeout(function () {
// we do not have a callback, hence we send _.noop. we could have made the callback in .execute optional, but
// that would suppress design-time bugs in the majority use-case, hence we avoided it.
instruction.execute(_.noop, scope);
}, 0);
return instruction;
},
/**
* @param {Function|Object} callback
*/
start: function (callback) {
// @todo add `when` parameter to backpack.normalise
callback = backpack.normalise(callback, Object.keys(Run.triggers));
// cannot start run if it is already running
if (this.triggers) {
return callback(new Error('run: already running'));
}
var timeback = callback;
if (_.isFinite(_.get(this.options, 'timeout.global'))) {
timeback = backpack.timeback(callback, this.options.timeout.global, this, function () {
this.pool.clear();
});
}
// invoke all the initialiser functions one after another and if it has any error then abort with callback.
async.series(_.map(Run.initialisers, function (initializer) {
return initializer.bind(this);
}.bind(this)), function (err) {
if (err) { return callback(err); }
// save the normalised callbacks as triggers
this.triggers = callback;
this.triggers.start(null, this.state.cursor.current()); // @todo may throw error if cursor absent
this._process(timeback);
}.bind(this));
},
/**
* @private
* @param {Object|Cursor} cursor
* @return {Item}
*/
resolveCursor: function (cursor) {
if (!cursor || !Array.isArray(this.state.items)) { return; }
return this.state.items[cursor.position];
},
/**
* @private
*
* @param {String} action
* @param {Object} [payload]
* @param {Array} [args]
* @param {Boolean} [immediate]
*/
_schedule: function (action, payload, args, immediate) {
var instruction = this.pool.create(action, payload, args);
// based on whether the immediate flag is set, add to the top or bottom of the instruction queue.
(immediate ? this.pool.unshift : this.pool.push)(instruction);
return instruction;
},
_process: function (callback) {
// extract the command from the queue
var instruction = this.pool.shift();
// if there is nothing to process, exit
if (!instruction) {
callback(null, this.state.cursor.current());
return;
}
instruction.execute(function (err) {
return err ? callback(err, this.state.cursor.current()) : this._process(callback); // process recursively
}, this);
}
});
_.assign(Run, {
/**
* Stores all events that runner triggers
*
* @type {Object}
*/
triggers: {
start: true
},
/**
* stores all execution commands
* @enum {Function}
*
* @note commands are loaded by flattening the modules in the `./commands` directory
*/
commands: {},
/**
* Functions executed with commands on start
* @type {Array}
*/
initialisers: []
});
// commands are loaded by flattening the modules in the `./commands` directory
Run.commands = _.transform({
'control.command': require('./extensions/control.command'),
'event.command': require('./extensions/event.command'),
'httprequest.command': require('./extensions/http-request.command'),
'request.command': require('./extensions/request.command'),
'waterfall.command': require('./extensions/waterfall.command'),
'item.command': require('./extensions/item.command'),
'delay.command': require('./extensions/delay.command')
}, function (all, extension) {
// extract the prototype from the command interface
_.has(extension, 'prototype') && _.forOwn(extension.prototype, function (value, prop) {
if (Run.prototype.hasOwnProperty(prop)) {
throw new Error('run: duplicate command prototype extension ' + prop);
}
Run.prototype[prop] = value;
});
// put the triggers in a box
_.has(extension, 'triggers') && _.isArray(extension.triggers) && _.forEach(extension.triggers, function (name) {
name && (Run.triggers[name] = true);
});
// we add the processors to the processor list
_.has(extension, 'process') && _.forOwn(extension.process, function (command, name) {
if (!_.isFunction(command)) { return; }
if (all.hasOwnProperty(name)) {
throw new Error('run: duplicate command processor ' + name);
}
// finally add the command function to the accumulator
all[name] = command;
});
// add the initialisation functions
_.has(extension, 'init') && _.isFunction(extension.init) && Run.initialisers.push(extension.init);
});
module.exports = Run;
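`queue`, `interrupt`, and `_process` above amount to a deque drained recursively: `interrupt` jumps the line by unshifting, and processing stops at the first error. A minimal sketch under those assumptions (`MiniRun` is an illustrative name, not runtime API):

```javascript
// Sketch of the Run scheduling model: a pool of instruction functions,
// drained one at a time, with interrupts inserted at the front.
function MiniRun () {
    this.pool = [];
}

MiniRun.prototype.queue = function (fn) { this.pool.push(fn); };      // back of the line
MiniRun.prototype.interrupt = function (fn) { this.pool.unshift(fn); }; // front of the line

MiniRun.prototype.process = function (callback) {
    var instruction = this.pool.shift(),
        self = this;

    if (!instruction) { return callback(null); } // nothing left to process

    instruction(function (err) {
        // on error stop the run; otherwise process the next instruction
        return err ? callback(err) : self.process(callback);
    });
};
```

An instruction that interrupts mid-run sees its interrupt executed before anything previously queued, which is how intermediate requests preempt the main request flow.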

node_modules/postman-runtime/lib/runner/timings.js generated vendored Normal file

@@ -0,0 +1,69 @@
/**
* All timing related functions within the runner is maintained in this module. Things like recording time with label,
* computing elapsed time between two labels, etc all go in here.
* @module Run~Timer
*/
var /**
* @const
* @type {string}
*/
NUMBER = 'number',
Timings; // constructor
/**
* An instance of a timer can record times with a label associated with it.
*
* @constructor
* @private
* @param {Object.<Number>} records create the timer instance with one or more labels and their timestamp.
*/
Timings = function Timings (records) {
for (var prop in records) {
this[prop] = parseInt(records[prop], 10);
}
};
/**
 * Create a new instance of timer. Equivalent to doing new {@link Timings}(records);
*
* @param {Object.<Number>} records
* @returns {Timings}
*/
Timings.create = function (records) {
return new Timings(records);
};
/**
* Record the current time with the label specified.
*
* @param {String} label
* @returns {Number}
*
* @example
* var t = new Timings();
* t.record('start');
*
* console.log(t.toObject()); // logs {start: 1246333 }
*/
Timings.prototype.record = function (label) {
return (this[label] = Date.now());
};
/**
* Serialise a timing instance to an Object that can then be later used as a source to recreate another timing instance.
*
* @returns {Object.<Number>}
*/
Timings.prototype.toObject = function () {
var obj = {},
prop;
for (prop in this) {
(typeof this[prop] === NUMBER) && (obj[prop] = this[prop]);
}
return obj;
};
module.exports = Timings;
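The Timings module above is small enough to recreate inline. The sketch below mirrors `record` and `toObject`, and adds an `elapsed` helper that is not part of the real module, only illustrating the label-to-label computation the module header alludes to:

```javascript
// Inline recreation of Timings (record/toObject), plus an illustrative
// elapsed() helper for label-to-label differences.
function Timings (records) {
    for (var prop in records) {
        this[prop] = parseInt(records[prop], 10); // coerce serialized values
    }
}

Timings.prototype.record = function (label) {
    return (this[label] = Date.now());
};

Timings.prototype.toObject = function () {
    var obj = {},
        prop;

    for (prop in this) {
        // only numeric values are timing labels; methods are skipped
        (typeof this[prop] === 'number') && (obj[prop] = this[prop]);
    }

    return obj;
};

// NOT part of the real module; shown only to demonstrate elapsed-time math.
Timings.prototype.elapsed = function (from, to) {
    return this[to] - this[from];
};
```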

node_modules/postman-runtime/lib/runner/util.js generated vendored Normal file

@@ -0,0 +1,165 @@
var /**
* @const
* @type {string}
*/
FUNCTION = 'function',
/**
* @const
* @type {string}
*/
STRING = 'string',
createReadStream; // function
/**
* Create readable stream for given file as well as detect possible file
* read issues.
*
* @param {Object} resolver - External file resolver module
* @param {String} fileSrc - File path
* @param {Function} callback - Final callback
*
 * @note This function is defined at the file's root because it needs to be
 * trapped within a closure in order to attach the stream clone functionality.
 * This ensures a smaller footprint in case we have a memory leak.
*/
createReadStream = function (resolver, fileSrc, callback) {
var readStream;
// check for the existence of the file before creating read stream.
// eslint-disable-next-line security/detect-non-literal-fs-filename
resolver.stat(fileSrc, function (err, stats) {
if (err) {
// overwrite `ENOENT: no such file or directory` error message. Most likely the case.
err.code === 'ENOENT' && (err.message = `"${fileSrc}", no such file`);
return callback(err);
}
// check for a valid file.
if (stats && typeof stats.isFile === FUNCTION && !stats.isFile()) {
return callback(new Error(`"${fileSrc}", is not a file`));
}
// check read permissions for user.
// octal `400` signifies 'user permissions'. [4 0 0] -> [u g o]
// `4` signifies 'read permission'. [4] -> [1 0 0] -> [r w x]
if (stats && !(stats.mode & 0o400)) {
return callback(new Error(`"${fileSrc}", read permission denied`));
}
// @note Handle all the errors before `createReadStream` to avoid listening on stream error event.
// listening on error requires listening on end event as well. which will make this sync.
// @note In form-data mode stream error will be handled in postman-request but bails out ongoing request.
// eslint-disable-next-line security/detect-non-literal-fs-filename
readStream = resolver.createReadStream(fileSrc);
// We might have to read the file before making the actual request
// e.g, while calculating body hash during AWS auth or redirecting form-data params
// So, this method wraps the `createReadStream` function with fixed arguments.
// This makes sure that we don't have to pass `fileResolver` to
// internal modules (like auth plugins) for security reasons.
readStream.cloneReadStream = function (callback) {
// eslint-disable-next-line security/detect-non-literal-fs-filename
return createReadStream(resolver, fileSrc, callback);
};
callback(null, readStream);
});
};
/**
* Utility functions that are required to be re-used throughout the runner
* @module Runner~util
* @private
*
* @note Do not put module logic or business logic related functions here.
* The functions here are purely decoupled and low-level functions.
*/
module.exports = {
/**
* This function allows one to call another function by wrapping it within a try-catch block.
* The first parameter is the function itself, followed by the scope in which this function is to be executed.
* The third parameter onwards are blindly forwarded to the function being called
*
* @param {Function} fn
* @param {*} ctx
*
* @returns {Error} If there was an error executing the function, the error is returned.
 * Note that if the function called here is asynchronous, its errors will not be returned (for obvious reasons!)
*/
safeCall: function (fn, ctx) {
// extract the arguments that are to be forwarded to the function to be called
var args = Array.prototype.slice.call(arguments, 2);
try {
(typeof fn === FUNCTION) && fn.apply(ctx || global, args);
}
catch (err) {
return err;
}
},
/**
* Copies attributes from source object to destination object.
*
* @param dest
* @param src
*
* @return {Object}
*/
syncObject: function (dest, src) {
var prop;
// update or add values from src
for (prop in src) {
if (src.hasOwnProperty(prop)) {
dest[prop] = src[prop];
}
}
// remove values that no longer exist
for (prop in dest) {
if (dest.hasOwnProperty(prop) && !src.hasOwnProperty(prop)) {
delete dest[prop];
}
}
return dest;
},
/**
* Create readable stream for given file as well as detect possible file
* read issues. The resolver also attaches a clone function to the stream
* so that the stream can be restarted any time.
*
* @param {Object} resolver - External file resolver module
* @param {Function} resolver.stat - Resolver method to check for existence and permissions of file
* @param {Function} resolver.createReadStream - Resolver method for creating read stream
* @param {String} fileSrc - File path
 * @param {Function} callback - Final callback
*
*/
createReadStream: function (resolver, fileSrc, callback) {
// bail out if resolver not found.
if (!resolver) {
return callback(new Error('file resolver not supported'));
}
// bail out if resolver is not supported.
if (typeof resolver.stat !== FUNCTION || typeof resolver.createReadStream !== FUNCTION) {
return callback(new Error('file resolver interface mismatch'));
}
// bail out if file source is invalid or empty string.
if (!fileSrc || typeof fileSrc !== STRING) {
return callback(new Error('invalid or missing file source'));
}
// now that things are sanitized and validated, we transfer it to the
// stream reading utility function that does the heavy lifting of
// calling the resolver to return the stream
return createReadStream(resolver, fileSrc, callback);
}
};
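`syncObject` and `safeCall` above are dependency-free, so their contracts can be checked directly with inline copies:

```javascript
// Inline copies of the two util helpers above, for a quick contract check.
// syncObject mirrors src onto dest: adds, updates, and deletes keys.
function syncObject (dest, src) {
    var prop;

    // update or add values from src
    for (prop in src) {
        if (src.hasOwnProperty(prop)) { dest[prop] = src[prop]; }
    }
    // remove values that no longer exist in src
    for (prop in dest) {
        if (dest.hasOwnProperty(prop) && !src.hasOwnProperty(prop)) { delete dest[prop]; }
    }

    return dest;
}

// safeCall returns the error a throwing function raised instead of propagating it.
function safeCall (fn, ctx) {
    var args = Array.prototype.slice.call(arguments, 2);

    try { (typeof fn === 'function') && fn.apply(ctx || global, args); }
    catch (err) { return err; }
}
```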

node_modules/postman-runtime/lib/version.js generated vendored Normal file

@@ -0,0 +1,121 @@
var _ = require('lodash'),
RUNTIME_PACKAGE = require('../package.json'),
/**
 * Set of postman packages to include while generating the dependency object
 *
 * @const
 * @type {Object.<Boolean>}
*/
PACKAGES_TO_CONSIDER = {
'chai-postman': true,
'postman-collection': true,
'postman-request': true,
'postman-sandbox': true,
'uniscope': true,
'uvm': true
},
/**
* Generate object containing version and dependencies of Runtime or given module.
*
* @param {String} [moduleName] - Module name
* @param {Boolean} [deepDependencies=true] - Do depth dependencies traversal or stop at root
* @param {Object} [moduleData={}] - Object to store module data
* @returns {Object}
*
* @example <caption>Returned Object Structure</caption>
* var runtime = require('postman-runtime');
* runtime.version();
* {
* version: '7.6.1',
* dependencies: {
* 'postman-collection' : {
* version: '3.4.2',
* dependencies: {
* 'postman-request': {
* version: '2.88.1-postman.5'
* }
* }
* },
* 'postman-request': {
* version: '2.88.1-postman.5'
* }
* }
* }
*/
getVersionData = function (moduleName, deepDependencies, moduleData) {
// Set default values of function arguments if not given
// @note Argument moduleData is undefined if the function is called directly.
// Otherwise moduleData will contain an object when called recursively.
!moduleData && (moduleData = {});
// Include nested dependencies in moduleData object only if deepDependencies=true.
// Return only direct dependencies of module if deepDependencies=false.
(deepDependencies === undefined) && (deepDependencies = true);
var version,
// Runtime's package.json is considered by default
packageJson = RUNTIME_PACKAGE;
// bail out if either dependency not in PACKAGES_TO_CONSIDER
// or not Runtime's dependency
if (
moduleName &&
!(
PACKAGES_TO_CONSIDER[moduleName] &&
(
_.has(packageJson, ['dependencies', moduleName]) ||
_.has(packageJson, ['devDependencies', moduleName])
)
)
) {
return;
}
// if module name is given in function argument, consider that module's package.json instead of default
if (moduleName) {
// eslint-disable-next-line security/detect-non-literal-require
packageJson = require(require.resolve(`${moduleName}/package.json`));
}
// set version of dependency given as function argument or Runtime
moduleData.version = packageJson.version;
moduleData.dependencies = {};
_.forEach(PACKAGES_TO_CONSIDER, function (value, key) {
// if key is normal dependency
if (_.has(packageJson, ['dependencies', key])) {
version = packageJson.dependencies[key];
}
// else if key is devDependency
else if (_.has(packageJson, ['devDependencies', key])) {
version = packageJson.devDependencies[key];
}
// skip if key is not listed as dependency in packageJson
else {
return;
}
// include dependency in module data
moduleData.dependencies[key] = {
version: version,
dependencies: {}
};
// recursive call to include deep-dependency
if (!moduleName && deepDependencies) {
getVersionData(key, deepDependencies, moduleData.dependencies[key]);
}
// delete if no deep-dependency found
if (!_.size(moduleData.dependencies[key].dependencies)) {
delete moduleData.dependencies[key].dependencies;
}
});
return moduleData;
};
module.exports = getVersionData;
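The recursive traversal above can be sketched with an in-memory package map — a hypothetical stand-in for the `require.resolve` lookups, so it runs standalone and only illustrates the shape of the data `getVersionData` builds:

```javascript
// Hypothetical in-memory packages standing in for the package.json files
// that getVersionData resolves via require.resolve().
const PACKAGES = {
    'postman-runtime': { version: '7.6.1', dependencies: { 'postman-collection': '3.4.2' } },
    'postman-collection': { version: '3.4.2', dependencies: { 'postman-request': '2.88.1-postman.5' } },
    'postman-request': { version: '2.88.1-postman.5', dependencies: {} }
};

// Simplified sketch of the recursive structure getVersionData returns.
function versionData (name) {
    var pkg = PACKAGES[name],
        data = { version: pkg.version, dependencies: {} };

    Object.keys(pkg.dependencies).forEach(function (dep) {
        data.dependencies[dep] = versionData(dep);
    });

    // mirror the "delete if no deep-dependency found" step above
    if (!Object.keys(data.dependencies).length) { delete data.dependencies; }

    return data;
}
```

Calling `versionData('postman-runtime')` yields the nested structure shown in the JSDoc example above.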

node_modules/postman-runtime/lib/visualizer/index.js generated vendored Normal file

@@ -0,0 +1,31 @@
const Handlebars = require('handlebars');
module.exports = {
/**
* Hydrate the given template with given data and produce final HTML to render in visualizer
*
* @param {String} template - handlebars template as a string
* @param {Object} userData - data provided by user
* @param {Object} options - options for processing the template
* @param {Function} callback - callback called with errors and processed template
*/
processTemplate: function (template, userData, options, callback) {
// bail out if there is no valid template to process
if (typeof template !== 'string') {
return callback(new Error(`Invalid template. Template must be of type string, found ${typeof template}`));
}
var compiledTemplate = Handlebars.compile(template, options),
processedTemplate;
try {
// hydrate the template with provided data
processedTemplate = compiledTemplate(userData);
}
catch (err) {
return callback(err);
}
return callback(null, processedTemplate);
}
};
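The error-first contract of `processTemplate` can be exercised without Handlebars by swapping in a trivial `{{key}}` substitution — a self-contained stand-in, not the real compiler:

```javascript
// Stand-in for the module above: same validation and callback contract,
// but with a trivial {{key}} substitution instead of Handlebars.compile.
function processTemplate (template, userData, options, callback) {
    // bail out if there is no valid template to process
    if (typeof template !== 'string') {
        return callback(new Error(`Invalid template. Template must be of type string, found ${typeof template}`));
    }

    var processed;

    try {
        // hydrate the template with provided data
        processed = template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
            return String(userData[key]);
        });
    }
    catch (err) {
        return callback(err);
    }

    return callback(null, processed);
}
```

`processTemplate('<b>{{name}}</b>', { name: 'Ada' }, {}, cb)` calls back with `'<b>Ada</b>'`; a non-string template yields an `Error` as the first callback argument instead.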

node_modules/postman-runtime/node_modules/.bin/uuid generated vendored Normal file

@@ -0,0 +1,12 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../uuid/bin/uuid" "$@"
else
exec node "$basedir/../uuid/bin/uuid" "$@"
fi


@@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0
IF EXIST "%dp0%\node.exe" (
SET "_prog=%dp0%\node.exe"
) ELSE (
SET "_prog=node"
SET PATHEXT=%PATHEXT:;.JS;=;%
)
endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\uuid\bin\uuid" %*


@@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent
$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
# Fix case when both the Windows and Linux builds of Node
# are installed in the same directory
$exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
# Support pipeline input
if ($MyInvocation.ExpectingInput) {
$input | & "$basedir/node$exe" "$basedir/../uuid/bin/uuid" $args
} else {
& "$basedir/node$exe" "$basedir/../uuid/bin/uuid" $args
}
$ret=$LASTEXITCODE
} else {
# Support pipeline input
if ($MyInvocation.ExpectingInput) {
$input | & "node$exe" "$basedir/../uuid/bin/uuid" $args
} else {
& "node$exe" "$basedir/../uuid/bin/uuid" $args
}
$ret=$LASTEXITCODE
}
exit $ret


@@ -0,0 +1,275 @@
# v2.6.3
- Updated lodash to squelch a security warning (#1675)
# v2.6.2
- Updated lodash to squelch a security warning (#1620)
# v2.6.1
- Updated lodash to prevent `npm audit` warnings. (#1532, #1533)
- Made `async-es` more optimized for webpack users (#1517)
- Fixed a stack overflow with large collections and a synchronous iterator (#1514)
- Various small fixes/chores (#1505, #1511, #1527, #1530)
# v2.6.0
- Added missing aliases for many methods. Previously, you could not (e.g.) `require('async/find')` or use `async.anyLimit`. (#1483)
- Improved `queue` performance. (#1448, #1454)
- Add missing sourcemap (#1452, #1453)
- Various doc updates (#1448, #1471, #1483)
# v2.5.0
- Added `concatLimit`, the `Limit` equivalent of [`concat`](https://caolan.github.io/async/docs.html#concat) ([#1426](https://github.com/caolan/async/issues/1426), [#1430](https://github.com/caolan/async/pull/1430))
- `concat` improvements: it now preserves order, handles falsy values and the `iteratee` callback takes a variable number of arguments ([#1437](https://github.com/caolan/async/issues/1437), [#1436](https://github.com/caolan/async/pull/1436))
- Fixed an issue in `queue` where there was a size discrepancy between `workersList().length` and `running()` ([#1428](https://github.com/caolan/async/issues/1428), [#1429](https://github.com/caolan/async/pull/1429))
- Various doc fixes ([#1422](https://github.com/caolan/async/issues/1422), [#1424](https://github.com/caolan/async/pull/1424))
# v2.4.1
- Fixed a bug preventing functions wrapped with `timeout()` from being re-used. ([#1418](https://github.com/caolan/async/issues/1418), [#1419](https://github.com/caolan/async/issues/1419))
# v2.4.0
- Added `tryEach`, for running async functions in parallel, where you only expect one to succeed. ([#1365](https://github.com/caolan/async/issues/1365), [#687](https://github.com/caolan/async/issues/687))
- Improved performance, most notably in `parallel` and `waterfall` ([#1395](https://github.com/caolan/async/issues/1395))
- Added `queue.remove()`, for removing items in a `queue` ([#1397](https://github.com/caolan/async/issues/1397), [#1391](https://github.com/caolan/async/issues/1391))
- Fixed using `eval`, preventing Async from running in pages with Content Security Policy ([#1404](https://github.com/caolan/async/issues/1404), [#1403](https://github.com/caolan/async/issues/1403))
- Fixed errors thrown in an `asyncify`ed function's callback being caught by the underlying Promise ([#1408](https://github.com/caolan/async/issues/1408))
- Fixed timing of `queue.empty()` ([#1367](https://github.com/caolan/async/issues/1367))
- Various doc fixes ([#1314](https://github.com/caolan/async/issues/1314), [#1394](https://github.com/caolan/async/issues/1394), [#1412](https://github.com/caolan/async/issues/1412))
# v2.3.0
- Added support for ES2017 `async` functions. Wherever you can pass a Node-style/CPS function that uses a callback, you can also pass an `async` function. Previously, you had to wrap `async` functions with `asyncify`. The caveat is that it will only work if `async` functions are supported natively in your environment, transpiled implementations can't be detected. ([#1386](https://github.com/caolan/async/issues/1386), [#1390](https://github.com/caolan/async/issues/1390))
- Small doc fix ([#1392](https://github.com/caolan/async/issues/1392))
# v2.2.0
- Added `groupBy`, and the `Series`/`Limit` equivalents, analogous to [`_.groupBy`](http://lodash.com/docs#groupBy) ([#1364](https://github.com/caolan/async/issues/1364))
- Fixed `transform` bug when `callback` was not passed ([#1381](https://github.com/caolan/async/issues/1381))
- Added note about `reflect` to `parallel` docs ([#1385](https://github.com/caolan/async/issues/1385))
# v2.1.5
- Fix `auto` bug when function names collided with Array.prototype ([#1358](https://github.com/caolan/async/issues/1358))
- Improve some error messages ([#1349](https://github.com/caolan/async/issues/1349))
- Avoid stack overflow case in queue
- Fixed an issue in `some`, `every` and `find` where processing would continue after the result was determined.
- Cleanup implementations of `some`, `every` and `find`
# v2.1.3
- Make bundle size smaller
- Create optimized hotpath for `filter` in array case.
# v2.1.2
- Fixed a stackoverflow bug with `detect`, `some`, `every` on large inputs ([#1293](https://github.com/caolan/async/issues/1293)).
# v2.1.0
- `retry` and `retryable` now support an optional `errorFilter` function that determines if the `task` should retry on the error ([#1256](https://github.com/caolan/async/issues/1256), [#1261](https://github.com/caolan/async/issues/1261))
- Optimized array iteration in `race`, `cargo`, `queue`, and `priorityQueue` ([#1253](https://github.com/caolan/async/issues/1253))
- Added alias documentation to doc site ([#1251](https://github.com/caolan/async/issues/1251), [#1254](https://github.com/caolan/async/issues/1254))
- Added [Bootstrap scrollspy](http://getbootstrap.com/javascript/#scrollspy) to docs to highlight in the sidebar the current method being viewed ([#1289](https://github.com/caolan/async/issues/1289), [#1300](https://github.com/caolan/async/issues/1300))
- Various minor doc fixes ([#1263](https://github.com/caolan/async/issues/1263), [#1264](https://github.com/caolan/async/issues/1264), [#1271](https://github.com/caolan/async/issues/1271), [#1278](https://github.com/caolan/async/issues/1278), [#1280](https://github.com/caolan/async/issues/1280), [#1282](https://github.com/caolan/async/issues/1282), [#1302](https://github.com/caolan/async/issues/1302))
# v2.0.1
- Significantly optimized all iteration based collection methods such as `each`, `map`, `filter`, etc ([#1245](https://github.com/caolan/async/issues/1245), [#1246](https://github.com/caolan/async/issues/1246), [#1247](https://github.com/caolan/async/issues/1247)).
# v2.0.0
Lots of changes here!
First and foremost, we have a slick new [site for docs](https://caolan.github.io/async/). Special thanks to [**@hargasinski**](https://github.com/hargasinski) for his work converting our old docs to `jsdoc` format and implementing the new website. Also huge ups to [**@ivanseidel**](https://github.com/ivanseidel) for designing our new logo. It was a long process for both of these tasks, but I think these changes turned out extraordinarily well.
The biggest feature is modularization. You can now `require("async/series")` to only require the `series` function. Every Async library function is available this way. You still can `require("async")` to require the entire library, like you could do before.
We also provide Async as a collection of ES2015 modules. You can now `import {each} from 'async-es'` or `import waterfall from 'async-es/waterfall'`. If you are using only a few Async functions, and are using a ES bundler such as Rollup, this can significantly lower your build size.
Major thanks to [**@Kikobeats**](https://github.com/Kikobeats), [**@aearly**](https://github.com/aearly) and [**@megawac**](https://github.com/megawac) for doing the majority of the modularization work, as well as [**@jdalton**](https://github.com/jdalton) and [**@Rich-Harris**](https://github.com/Rich-Harris) for advisory work on the general modularization strategy.
Another one of the general themes of the 2.0 release is standardization of what an "async" function is. We are now more strictly following the node-style continuation passing style. That is, an async function is a function that:
1. Takes a variable number of arguments
2. The last argument is always a callback
3. The callback can accept any number of arguments
4. The first argument passed to the callback will be treated as an error result, if the argument is truthy
5. Any number of result arguments can be passed after the "error" argument
6. The callback is called once and exactly once, either on the same tick or later tick of the JavaScript event loop.
There were several cases where Async accepted some functions that did not strictly have these properties, most notably `auto`, `every`, `some`, `filter`, `reject` and `detect`.
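A minimal function obeying all six rules might look like the following — a hypothetical example, not code from the library:

```javascript
// A node-style async function per the six rules above: variable arguments,
// callback last, error-first, callback called exactly once.
function parseNumber (str, callback) {
    var n = Number(str);

    if (typeof str !== 'string' || str.trim() === '' || isNaN(n)) {
        // rule 4: a truthy first argument is treated as an error
        return callback(new Error('not a number: ' + str));
    }

    // rule 5: results follow the (null) error argument
    return callback(null, n);
}
```

Note that rule 6 permits the callback to fire on the same tick, which this example does; see the `ensureAsync` discussion below in the original changelog for why mixed sync/async behavior can matter.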
Another theme is performance. We have eliminated internal deferrals in all cases where they make sense. For example, in `waterfall` and `auto`, there was a `setImmediate` between each task -- these deferrals have been removed. A `setImmediate` call can add up to 1ms of delay. This might not seem like a lot, but it can add up if you are using many Async functions in the course of processing a HTTP request, for example. Nearly all asynchronous functions that do I/O already have some sort of deferral built in, so the extra deferral is unnecessary. The trade-off of this change is removing our built-in stack-overflow defense. Many synchronous callback calls in series can quickly overflow the JS call stack. If you do have a function that is sometimes synchronous (calling its callback on the same tick), and are running into stack overflows, wrap it with `async.ensureAsync()`.
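The `ensureAsync()` guard mentioned above can be sketched as follows — a simplified version for illustration, not the library's actual implementation:

```javascript
// Simplified sketch of async.ensureAsync: if the wrapped function calls its
// callback on the same tick, defer the callback with setImmediate so long
// synchronous chains cannot overflow the call stack.
function ensureAsync (fn) {
    return function () {
        var args = Array.prototype.slice.call(arguments),
            callback = args.pop(),
            sync = true;

        args.push(function () {
            var cbArgs = arguments;

            if (sync) {
                // wrapped function answered synchronously: defer one tick
                setImmediate(function () { callback.apply(null, cbArgs); });
            }
            else {
                callback.apply(null, cbArgs);
            }
        });

        fn.apply(null, args);
        sync = false;
    };
}
```

Wrapping a function whose callback fires synchronously makes the callback observably run on a later tick, breaking the recursive chain.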
Another big performance win has been re-implementing `queue`, `cargo`, and `priorityQueue` with [doubly linked lists](https://en.wikipedia.org/wiki/Doubly_linked_list) instead of arrays. This has led to queues being an order of [magnitude faster on large sets of tasks](https://github.com/caolan/async/pull/1205).
## New Features
- Async is now modularized. Individual functions can be `require()`d from the main package. (`require('async/auto')`) ([#984](https://github.com/caolan/async/issues/984), [#996](https://github.com/caolan/async/issues/996))
- Async is also available as a collection of ES2015 modules in the new `async-es` package. (`import {forEachSeries} from 'async-es'`) ([#984](https://github.com/caolan/async/issues/984), [#996](https://github.com/caolan/async/issues/996))
- Added `race`, analogous to `Promise.race()`. It will run an array of async tasks in parallel and will call its callback with the result of the first task to respond. ([#568](https://github.com/caolan/async/issues/568), [#1038](https://github.com/caolan/async/issues/1038))
- Collection methods now accept ES2015 iterators. Maps, Sets, and anything that implements the iterator spec can now be passed directly to `each`, `map`, `parallel`, etc.. ([#579](https://github.com/caolan/async/issues/579), [#839](https://github.com/caolan/async/issues/839), [#1074](https://github.com/caolan/async/issues/1074))
- Added `mapValues`, for mapping over the properties of an object and returning an object with the same keys. ([#1157](https://github.com/caolan/async/issues/1157), [#1177](https://github.com/caolan/async/issues/1177))
- Added `timeout`, a wrapper for an async function that will make the task time-out after the specified time. ([#1007](https://github.com/caolan/async/issues/1007), [#1027](https://github.com/caolan/async/issues/1027))
- Added `reflect` and `reflectAll`, analogous to [`Promise.reflect()`](http://bluebirdjs.com/docs/api/reflect.html), a wrapper for async tasks that always succeeds, by gathering results and errors into an object. ([#942](https://github.com/caolan/async/issues/942), [#1012](https://github.com/caolan/async/issues/1012), [#1095](https://github.com/caolan/async/issues/1095))
- `constant` supports dynamic arguments -- it will now always use its last argument as the callback. ([#1016](https://github.com/caolan/async/issues/1016), [#1052](https://github.com/caolan/async/issues/1052))
- `setImmediate` and `nextTick` now support arguments to partially apply to the deferred function, like the node-native versions do. ([#940](https://github.com/caolan/async/issues/940), [#1053](https://github.com/caolan/async/issues/1053))
- `auto` now supports resolving cyclic dependencies using [Kahn's algorithm](https://en.wikipedia.org/wiki/Topological_sorting#Kahn.27s_algorithm) ([#1140](https://github.com/caolan/async/issues/1140)).
- Added `autoInject`, a relative of `auto` that automatically spreads a task's dependencies as arguments to the task function. ([#608](https://github.com/caolan/async/issues/608), [#1055](https://github.com/caolan/async/issues/1055), [#1099](https://github.com/caolan/async/issues/1099), [#1100](https://github.com/caolan/async/issues/1100))
- You can now limit the concurrency of `auto` tasks. ([#635](https://github.com/caolan/async/issues/635), [#637](https://github.com/caolan/async/issues/637))
- Added `retryable`, a relative of `retry` that wraps an async function, making it retry when called. ([#1058](https://github.com/caolan/async/issues/1058))
- `retry` now supports specifying a function that determines the next time interval, useful for exponential backoff, logging and other retry strategies. ([#1161](https://github.com/caolan/async/issues/1161))
- `retry` will now pass all of the arguments the task function was resolved with to the callback ([#1231](https://github.com/caolan/async/issues/1231)).
- Added `q.unsaturated` -- callback called when a `queue`'s number of running workers falls below a threshold. ([#868](https://github.com/caolan/async/issues/868), [#1030](https://github.com/caolan/async/issues/1030), [#1033](https://github.com/caolan/async/issues/1033), [#1034](https://github.com/caolan/async/issues/1034))
- Added `q.error` -- a callback called whenever a `queue` task calls its callback with an error. ([#1170](https://github.com/caolan/async/issues/1170))
- `applyEach` and `applyEachSeries` now pass results to the final callback. ([#1088](https://github.com/caolan/async/issues/1088))
## Breaking changes
- Calling a callback more than once is considered an error, and an error will be thrown. This had an explicit breaking change in `waterfall`. If you were relying on this behavior, you should more accurately represent your control flow as an event emitter or stream. ([#814](https://github.com/caolan/async/issues/814), [#815](https://github.com/caolan/async/issues/815), [#1048](https://github.com/caolan/async/issues/1048), [#1050](https://github.com/caolan/async/issues/1050))
- `auto` task functions now always take the callback as the last argument. If a task has dependencies, the `results` object will be passed as the first argument. To migrate old task functions, wrap them with [`_.flip`](https://lodash.com/docs#flip) ([#1036](https://github.com/caolan/async/issues/1036), [#1042](https://github.com/caolan/async/issues/1042))
- Internal `setImmediate` calls have been refactored away. This may make existing flows vulnerable to stack overflows if you use many synchronous functions in series. Use `ensureAsync` to work around this. ([#696](https://github.com/caolan/async/issues/696), [#704](https://github.com/caolan/async/issues/704), [#1049](https://github.com/caolan/async/issues/1049), [#1050](https://github.com/caolan/async/issues/1050))
- `map` used to return an object when iterating over an object. `map` now always returns an array, like in other libraries. The previous object behavior has been split out into `mapValues`. ([#1157](https://github.com/caolan/async/issues/1157), [#1177](https://github.com/caolan/async/issues/1177))
- `filter`, `reject`, `some`, `every`, `detect` and their families like `{METHOD}Series` and `{METHOD}Limit` now expect an error as the first callback argument, rather than just a simple boolean. Pass `null` as the first argument, or use `fs.access` instead of `fs.exists`. ([#118](https://github.com/caolan/async/issues/118), [#774](https://github.com/caolan/async/issues/774), [#1028](https://github.com/caolan/async/issues/1028), [#1041](https://github.com/caolan/async/issues/1041))
- `{METHOD}` and `{METHOD}Series` are now implemented in terms of `{METHOD}Limit`. This is a major internal simplification, and is not expected to cause many problems, but it does subtly affect how functions execute internally. ([#778](https://github.com/caolan/async/issues/778), [#847](https://github.com/caolan/async/issues/847))
- `retry`'s callback is now optional. Previously, omitting the callback would partially apply the function, meaning it could be passed directly as a task to `series` or `auto`. The partially applied "control-flow" behavior has been separated out into `retryable`. ([#1054](https://github.com/caolan/async/issues/1054), [#1058](https://github.com/caolan/async/issues/1058))
- The test function for `whilst`, `until`, and `during` used to be passed non-error args from the iteratee function's callback, but this led to weirdness where the first call of the test function would be passed no args. We have made it so the test function is never passed extra arguments, and only the `doWhilst`, `doUntil`, and `doDuring` functions pass iteratee callback arguments to the test function ([#1217](https://github.com/caolan/async/issues/1217), [#1224](https://github.com/caolan/async/issues/1224))
- The `q.tasks` array has been renamed `q._tasks` and is now implemented as a doubly linked list (DLL). Any code that used to interact with this array will need to be updated to either use the provided helpers or support DLLs ([#1205](https://github.com/caolan/async/issues/1205)).
- The timing of the `q.saturated()` callback in a `queue` has been modified to better reflect when tasks pushed to the queue will start queueing. ([#724](https://github.com/caolan/async/issues/724), [#1078](https://github.com/caolan/async/issues/1078))
- Removed `iterator` method in favour of [ES2015 iterator protocol](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Iterators_and_Generators ) which natively supports arrays ([#1237](https://github.com/caolan/async/issues/1237))
- Dropped support for Component, Jam, SPM, and Volo ([#1175](https://github.com/caolan/async/issues/1175), [#176](https://github.com/caolan/async/issues/176))
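The error-first truth-test contract from the breaking changes above can be sketched by hand — a toy parallel `filter`, not the library's implementation:

```javascript
// Hand-rolled parallel filter using the post-2.0 contract: the truth test
// calls back with (err, keep) instead of a bare boolean.
function filter (arr, test, callback) {
    var results = [],
        pending = arr.length,
        failed = false;

    if (!pending) { return callback(null, results); }

    arr.forEach(function (item, i) {
        test(item, function (err, keep) {
            if (failed) { return; }
            if (err) { failed = true; return callback(err); }
            if (keep) { results[i] = item; }
            if (--pending === 0) {
                // compact the sparse array while preserving input order
                callback(null, results.filter(function () { return true; }));
            }
        });
    });
}
```

A truth test that calls back with an error short-circuits the whole operation, which the pre-2.0 boolean-only contract could not express.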
## Bug Fixes
- Improved handling of no dependency cases in `auto` & `autoInject` ([#1147](https://github.com/caolan/async/issues/1147)).
- Fixed a bug where the callback generated by `asyncify` with `Promises` could resolve twice ([#1197](https://github.com/caolan/async/issues/1197)).
- Fixed several documented optional callbacks not actually being optional ([#1223](https://github.com/caolan/async/issues/1223)).
## Other
- Added `someSeries` and `everySeries` for symmetry, as well as a complete set of `any`/`anyLimit`/`anySeries` and `all`/`allLimit`/`allSeries` aliases.
- Added `find` as an alias for `detect` (as well as `findLimit` and `findSeries`).
- Various doc fixes ([#1005](https://github.com/caolan/async/issues/1005), [#1008](https://github.com/caolan/async/issues/1008), [#1010](https://github.com/caolan/async/issues/1010), [#1015](https://github.com/caolan/async/issues/1015), [#1021](https://github.com/caolan/async/issues/1021), [#1037](https://github.com/caolan/async/issues/1037), [#1039](https://github.com/caolan/async/issues/1039), [#1051](https://github.com/caolan/async/issues/1051), [#1102](https://github.com/caolan/async/issues/1102), [#1107](https://github.com/caolan/async/issues/1107), [#1121](https://github.com/caolan/async/issues/1121), [#1123](https://github.com/caolan/async/issues/1123), [#1129](https://github.com/caolan/async/issues/1129), [#1135](https://github.com/caolan/async/issues/1135), [#1138](https://github.com/caolan/async/issues/1138), [#1141](https://github.com/caolan/async/issues/1141), [#1153](https://github.com/caolan/async/issues/1153), [#1216](https://github.com/caolan/async/issues/1216), [#1217](https://github.com/caolan/async/issues/1217), [#1232](https://github.com/caolan/async/issues/1232), [#1233](https://github.com/caolan/async/issues/1233), [#1236](https://github.com/caolan/async/issues/1236), [#1238](https://github.com/caolan/async/issues/1238))
Thank you [**@aearly**](https://github.com/aearly) and [**@megawac**](https://github.com/megawac) for taking the lead on version 2 of async.
------------------------------------------
# v1.5.2
- Allow using `"constructor"` as an argument in `memoize` ([#998](https://github.com/caolan/async/issues/998))
- Give a better error message when `auto` dependency checking fails ([#994](https://github.com/caolan/async/issues/994))
- Various doc updates ([#936](https://github.com/caolan/async/issues/936), [#956](https://github.com/caolan/async/issues/956), [#979](https://github.com/caolan/async/issues/979), [#1002](https://github.com/caolan/async/issues/1002))
# v1.5.1
- Fix issue with `pause` in `queue` with concurrency enabled ([#946](https://github.com/caolan/async/issues/946))
- `while` and `until` now pass the final result to callback ([#963](https://github.com/caolan/async/issues/963))
- `auto` will properly handle concurrency when there is no callback ([#966](https://github.com/caolan/async/issues/966))
- `auto` will now properly stop execution when an error occurs ([#988](https://github.com/caolan/async/issues/988), [#993](https://github.com/caolan/async/issues/993))
- Various doc fixes ([#971](https://github.com/caolan/async/issues/971), [#980](https://github.com/caolan/async/issues/980))
# v1.5.0
- Added `transform`, analogous to [`_.transform`](http://lodash.com/docs#transform) ([#892](https://github.com/caolan/async/issues/892))
- `map` now returns an object when an object is passed in, rather than array with non-numeric keys. `map` will begin always returning an array with numeric indexes in the next major release. ([#873](https://github.com/caolan/async/issues/873))
- `auto` now accepts an optional `concurrency` argument to limit the number of running tasks ([#637](https://github.com/caolan/async/issues/637))
- Added `queue#workersList()`, to retrieve the list of currently running tasks. ([#891](https://github.com/caolan/async/issues/891))
- Various code simplifications ([#896](https://github.com/caolan/async/issues/896), [#904](https://github.com/caolan/async/issues/904))
- Various doc fixes :scroll: ([#890](https://github.com/caolan/async/issues/890), [#894](https://github.com/caolan/async/issues/894), [#903](https://github.com/caolan/async/issues/903), [#905](https://github.com/caolan/async/issues/905), [#912](https://github.com/caolan/async/issues/912))
# v1.4.2
- Ensure coverage files don't get published on npm ([#879](https://github.com/caolan/async/issues/879))
# v1.4.1
- Add in overlooked `detectLimit` method ([#866](https://github.com/caolan/async/issues/866))
- Removed unnecessary files from npm releases ([#861](https://github.com/caolan/async/issues/861))
- Removed usage of a reserved word to prevent :boom: in older environments ([#870](https://github.com/caolan/async/issues/870))
# v1.4.0
- `asyncify` now supports promises ([#840](https://github.com/caolan/async/issues/840))
- Added `Limit` versions of `filter` and `reject` ([#836](https://github.com/caolan/async/issues/836))
- Add `Limit` versions of `detect`, `some` and `every` ([#828](https://github.com/caolan/async/issues/828), [#829](https://github.com/caolan/async/issues/829))
- `some`, `every` and `detect` now short circuit early ([#828](https://github.com/caolan/async/issues/828), [#829](https://github.com/caolan/async/issues/829))
- Improve detection of the global object ([#804](https://github.com/caolan/async/issues/804)), enabling use in WebWorkers
- `whilst` now called with arguments from iterator ([#823](https://github.com/caolan/async/issues/823))
- `during` now gets called with arguments from iterator ([#824](https://github.com/caolan/async/issues/824))
- Code simplifications and optimizations aplenty ([diff](https://github.com/caolan/async/compare/v1.3.0...v1.4.0))
# v1.3.0
New Features:
- Added `constant`
- Added `asyncify`/`wrapSync` for making sync functions work with callbacks. ([#671](https://github.com/caolan/async/issues/671), [#806](https://github.com/caolan/async/issues/806))
- Added `during` and `doDuring`, which are like `whilst` with an async truth test. ([#800](https://github.com/caolan/async/issues/800))
- `retry` now accepts an `interval` parameter to specify a delay between retries. ([#793](https://github.com/caolan/async/issues/793))
- `async` should work better in Web Workers due to better `root` detection ([#804](https://github.com/caolan/async/issues/804))
- Callbacks are now optional in `whilst`, `doWhilst`, `until`, and `doUntil` ([#642](https://github.com/caolan/async/issues/642))
- Various internal updates ([#786](https://github.com/caolan/async/issues/786), [#801](https://github.com/caolan/async/issues/801), [#802](https://github.com/caolan/async/issues/802), [#803](https://github.com/caolan/async/issues/803))
- Various doc fixes ([#790](https://github.com/caolan/async/issues/790), [#794](https://github.com/caolan/async/issues/794))
Bug Fixes:
- `cargo` now exposes the `payload` size, and `cargo.payload` can be changed on the fly after the `cargo` is created. ([#740](https://github.com/caolan/async/issues/740), [#744](https://github.com/caolan/async/issues/744), [#783](https://github.com/caolan/async/issues/783))
# v1.2.1
Bug Fix:
- Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. ([#782](https://github.com/caolan/async/issues/782))
# v1.2.0
New Features:
- Added `timesLimit` ([#743](https://github.com/caolan/async/issues/743))
- `concurrency` can be changed after initialization in `queue` by setting `q.concurrency`. The new concurrency will be reflected the next time a task is processed. ([#747](https://github.com/caolan/async/issues/747), [#772](https://github.com/caolan/async/issues/772))
Bug Fixes:
- Fixed a regression in `each` and family with empty arrays that have additional properties. ([#775](https://github.com/caolan/async/issues/775), [#777](https://github.com/caolan/async/issues/777))
# v1.1.1
Bug Fix:
- Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. ([#782](https://github.com/caolan/async/issues/782))
# v1.1.0
New Features:
- `cargo` now supports all of the same methods and event callbacks as `queue`.
- Added `ensureAsync` - A wrapper that ensures an async function calls its callback on a later tick. ([#769](https://github.com/caolan/async/issues/769))
- Optimized `map`, `eachOf`, and `waterfall` families of functions
- Passing a `null` or `undefined` array to `map`, `each`, `parallel` and families will be treated as an empty array ([#667](https://github.com/caolan/async/issues/667)).
- The callback is now optional for the composed results of `compose` and `seq`. ([#618](https://github.com/caolan/async/issues/618))
- Reduced file size by 4kb, (minified version by 1kb)
- Added code coverage through `nyc` and `coveralls` ([#768](https://github.com/caolan/async/issues/768))
Bug Fixes:
- `forever` will no longer stack overflow with a synchronous iterator ([#622](https://github.com/caolan/async/issues/622))
- `eachLimit` and other limit functions will stop iterating once an error occurs ([#754](https://github.com/caolan/async/issues/754))
- Always pass `null` in callbacks when there is no error ([#439](https://github.com/caolan/async/issues/439))
- Ensure proper conditions when calling `drain()` after pushing an empty data set to a queue ([#668](https://github.com/caolan/async/issues/668))
- `each` and family will properly handle an empty array ([#578](https://github.com/caolan/async/issues/578))
- `eachSeries` and family will finish if the underlying array is modified during execution ([#557](https://github.com/caolan/async/issues/557))
- `queue` will throw if a non-function is passed to `q.push()` ([#593](https://github.com/caolan/async/issues/593))
- Doc fixes ([#629](https://github.com/caolan/async/issues/629), [#766](https://github.com/caolan/async/issues/766))
# v1.0.0
No known breaking changes, we are simply complying with semver from here on out.
Changes:
- Start using a changelog!
- Add `forEachOf` for iterating over Objects (or to iterate Arrays with indexes available) ([#168](https://github.com/caolan/async/issues/168) [#704](https://github.com/caolan/async/issues/704) [#321](https://github.com/caolan/async/issues/321))
- Detect deadlocks in `auto` ([#663](https://github.com/caolan/async/issues/663))
- Better support for require.js ([#527](https://github.com/caolan/async/issues/527))
- Throw if queue created with concurrency `0` ([#714](https://github.com/caolan/async/issues/714))
- Fix unneeded iteration in `queue.resume()` ([#758](https://github.com/caolan/async/issues/758))
- Guard against timer mocking overriding `setImmediate` ([#609](https://github.com/caolan/async/issues/609) [#611](https://github.com/caolan/async/issues/611))
- Miscellaneous doc fixes ([#542](https://github.com/caolan/async/issues/542) [#596](https://github.com/caolan/async/issues/596) [#615](https://github.com/caolan/async/issues/615) [#628](https://github.com/caolan/async/issues/628) [#631](https://github.com/caolan/async/issues/631) [#690](https://github.com/caolan/async/issues/690) [#729](https://github.com/caolan/async/issues/729))
- Use single noop function internally ([#546](https://github.com/caolan/async/issues/546))
- Optimize internal `_each`, `_map` and `_keys` functions.
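The `ensureAsync` wrapper added in v1.1.0 can be sketched in a few lines. This is a minimal illustration of the documented behavior, not the library's actual implementation: if the wrapped function calls its callback on the same tick, the call is deferred with `setImmediate`; otherwise it passes through untouched.

```javascript
// Minimal sketch of the ensureAsync idea: defer synchronous callback
// invocations to a later tick so callers never re-enter their own stack.
function ensureAsyncSketch(fn) {
    return function (...args) {
        const callback = args.pop();
        let sync = true;
        args.push(function (...cbArgs) {
            if (sync) {
                // fn called back on the same tick; defer to the next one
                setImmediate(() => callback(...cbArgs));
            } else {
                callback(...cbArgs);
            }
        });
        fn.apply(this, args);
        sync = false;
    };
}
```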


@@ -0,0 +1,19 @@
Copyright (c) 2010-2018 Caolan McMahon
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.


@@ -0,0 +1,56 @@
![Async Logo](https://raw.githubusercontent.com/caolan/async/master/logo/async-logo_readme.jpg)
[![Build Status via Travis CI](https://travis-ci.org/caolan/async.svg?branch=master)](https://travis-ci.org/caolan/async)
[![NPM version](https://img.shields.io/npm/v/async.svg)](https://www.npmjs.com/package/async)
[![Coverage Status](https://coveralls.io/repos/caolan/async/badge.svg?branch=master)](https://coveralls.io/r/caolan/async?branch=master)
[![Join the chat at https://gitter.im/caolan/async](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/caolan/async?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![libhive - Open source examples](https://www.libhive.com/providers/npm/packages/async/examples/badge.svg)](https://www.libhive.com/providers/npm/packages/async)
[![jsDelivr Hits](https://data.jsdelivr.com/v1/package/npm/async/badge?style=rounded)](https://www.jsdelivr.com/package/npm/async)
Async is a utility module which provides straightforward, powerful functions for working with [asynchronous JavaScript](http://caolan.github.io/async/global.html). Although originally designed for use with [Node.js](https://nodejs.org/) and installable via `npm install --save async`, it can also be used directly in the browser.
This version of the package is optimized for the Node.js environment. If you use Async with webpack, install [`async-es`](https://www.npmjs.com/package/async-es) instead.
For Documentation, visit <https://caolan.github.io/async/>
*For Async v1.5.x documentation, go [HERE](https://github.com/caolan/async/blob/v1.5.2/README.md)*
```javascript
// for use with Node-style callbacks...
var async = require("async");
var obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"};
var configs = {};
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err);
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e);
}
callback();
});
}, err => {
if (err) console.error(err.message);
// configs is now a map of JSON data
doSomethingWith(configs);
});
```
```javascript
var async = require("async");
// ...or ES2017 async functions
async.mapLimit(urls, 5, async function(url) {
const response = await fetch(url)
return response.body
}, (err, results) => {
if (err) throw err
// results is now an array of the response bodies
console.log(results)
})
```

50
node_modules/postman-runtime/node_modules/async/all.js generated vendored Normal file

@@ -0,0 +1,50 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
var _notId = require('./internal/notId');
var _notId2 = _interopRequireDefault(_notId);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns `true` if every element in `coll` satisfies an async test. If any
* iteratee call returns `false`, the main `callback` is immediately called.
*
* @name every
* @static
* @memberOf module:Collections
* @method
* @alias all
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in parallel.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
* @example
*
* async.every(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, result) {
* // if result is true then every file exists
* });
*/
exports.default = (0, _doParallel2.default)((0, _createTester2.default)(_notId2.default, _notId2.default));
module.exports = exports['default'];


@@ -0,0 +1,42 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
var _notId = require('./internal/notId');
var _notId2 = _interopRequireDefault(_notId);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`every`]{@link module:Collections.every} but runs a maximum of `limit` async operations at a time.
*
* @name everyLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.every]{@link module:Collections.every}
* @alias allLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in parallel.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
*/
exports.default = (0, _doParallelLimit2.default)((0, _createTester2.default)(_notId2.default, _notId2.default));
module.exports = exports['default'];
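The `everyLimit` contract documented above can be illustrated with a small hand-rolled sketch. This is a hypothetical helper, not the library's implementation: keep at most `limit` truth tests in flight and short-circuit to `false` as soon as one fails.

```javascript
// Illustrative sketch of everyLimit semantics: bounded concurrency plus
// early exit on the first failed truth test.
function everyLimitSketch(items, limit, iteratee, callback) {
    let index = 0;
    let running = 0;
    let done = false;
    function next() {
        if (done) return;
        if (index === items.length && running === 0) {
            done = true;
            return callback(null, true);
        }
        while (running < limit && index < items.length) {
            running += 1;
            iteratee(items[index++], (err, ok) => {
                running -= 1;
                if (done) return;
                if (err || !ok) {
                    done = true;
                    return callback(err || null, false);
                }
                next();
            });
        }
    }
    next();
}
```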


@@ -0,0 +1,37 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _everyLimit = require('./everyLimit');
var _everyLimit2 = _interopRequireDefault(_everyLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`every`]{@link module:Collections.every} but runs only a single async operation at a time.
*
* @name everySeries
* @static
* @memberOf module:Collections
* @method
* @see [async.every]{@link module:Collections.every}
* @alias allSeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in series.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
*/
exports.default = (0, _doLimit2.default)(_everyLimit2.default, 1);
module.exports = exports['default'];

52
node_modules/postman-runtime/node_modules/async/any.js generated vendored Normal file

@@ -0,0 +1,52 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns `true` if at least one element in the `coll` satisfies an async test.
* If any iteratee call returns `true`, the main `callback` is immediately
* called.
*
* @name some
* @static
* @memberOf module:Collections
* @method
* @alias any
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
 * in the collection in parallel.
* The iteratee should complete with a boolean `result` value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the iteratee functions have finished.
* Result will be either `true` or `false` depending on the values of the async
* tests. Invoked with (err, result).
* @example
*
* async.some(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, result) {
* // if result is true then at least one of the files exists
* });
*/
exports.default = (0, _doParallel2.default)((0, _createTester2.default)(Boolean, _identity2.default));
module.exports = exports['default'];


@@ -0,0 +1,43 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`some`]{@link module:Collections.some} but runs a maximum of `limit` async operations at a time.
*
* @name someLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.some]{@link module:Collections.some}
* @alias anyLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
 * in the collection in parallel.
* The iteratee should complete with a boolean `result` value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the iteratee functions have finished.
* Result will be either `true` or `false` depending on the values of the async
* tests. Invoked with (err, result).
*/
exports.default = (0, _doParallelLimit2.default)((0, _createTester2.default)(Boolean, _identity2.default));
module.exports = exports['default'];


@@ -0,0 +1,38 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _someLimit = require('./someLimit');
var _someLimit2 = _interopRequireDefault(_someLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`some`]{@link module:Collections.some} but runs only a single async operation at a time.
*
* @name someSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.some]{@link module:Collections.some}
* @alias anySeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
 * in the collection in series.
* The iteratee should complete with a boolean `result` value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the iteratee functions have finished.
* Result will be either `true` or `false` depending on the values of the async
* tests. Invoked with (err, result).
*/
exports.default = (0, _doLimit2.default)(_someLimit2.default, 1);
module.exports = exports['default'];
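The series-mode short-circuit described above can be sketched compactly. Again, this is a hypothetical helper rather than the library code: run one truth test at a time and return `true` the moment an item passes.

```javascript
// Illustrative sketch of someSeries semantics: sequential truth tests with
// early exit on the first passing item.
function someSeriesSketch(items, iteratee, callback) {
    let i = 0;
    (function next() {
        if (i === items.length) return callback(null, false);
        iteratee(items[i++], (err, ok) => {
            if (err) return callback(err);
            if (ok) return callback(null, true);
            next();
        });
    })();
}
```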


@@ -0,0 +1,68 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function (fn /*, ...args*/) {
var args = (0, _slice2.default)(arguments, 1);
return function () /*callArgs*/{
var callArgs = (0, _slice2.default)(arguments);
return fn.apply(null, args.concat(callArgs));
};
};
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
;
/**
* Creates a continuation function with some arguments already applied.
*
* Useful as a shorthand when combined with other control flow functions. Any
* arguments passed to the returned function are added to the arguments
* originally passed to apply.
*
* @name apply
* @static
* @memberOf module:Utils
* @method
* @category Util
* @param {Function} fn - The function you want to eventually apply all
* arguments to. Invokes with (arguments...).
* @param {...*} arguments... - Any number of arguments to automatically apply
* when the continuation is called.
* @returns {Function} the partially-applied function
* @example
*
* // using apply
* async.parallel([
* async.apply(fs.writeFile, 'testfile1', 'test1'),
* async.apply(fs.writeFile, 'testfile2', 'test2')
* ]);
*
*
* // the same process without using apply
* async.parallel([
* function(callback) {
* fs.writeFile('testfile1', 'test1', callback);
* },
* function(callback) {
* fs.writeFile('testfile2', 'test2', callback);
* }
* ]);
*
* // It's possible to pass any number of additional arguments when calling the
* // continuation:
*
* node> var fn = async.apply(sys.puts, 'one');
* node> fn('two', 'three');
* one
* two
* three
*/
module.exports = exports['default'];
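The partial application performed by `apply` boils down to argument concatenation, which a self-contained restatement makes concrete (the names here are illustrative, not part of the library):

```javascript
// Restatement of apply's core: pre-bind some arguments, append the rest later.
function applySketch(fn, ...args) {
    return (...callArgs) => fn(...args, ...callArgs);
}

const logged = [];
const log3 = applySketch((a, b, c) => logged.push(a, b, c), 'one');
log3('two', 'three');
// logged is now ['one', 'two', 'three']
```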


@@ -0,0 +1,51 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _applyEach = require('./internal/applyEach');
var _applyEach2 = _interopRequireDefault(_applyEach);
var _map = require('./map');
var _map2 = _interopRequireDefault(_map);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Applies the provided arguments to each function in the array, calling
* `callback` after all functions have completed. If you only provide the first
* argument, `fns`, then it will return a function which lets you pass in the
* arguments as if it were a single function call. If more arguments are
* provided, `callback` is required while `args` is still optional.
*
* @name applyEach
* @static
* @memberOf module:ControlFlow
* @method
* @category Control Flow
* @param {Array|Iterable|Object} fns - A collection of {@link AsyncFunction}s
* to all call with the same arguments
* @param {...*} [args] - any number of separate arguments to pass to the
* function.
* @param {Function} [callback] - the final argument should be the callback,
* called when all functions have completed processing.
* @returns {Function} - If only the first argument, `fns`, is provided, it will
* return a function which lets you pass in the arguments as if it were a single
 * function call. The signature is `(...args, callback)`. If invoked with any
* arguments, `callback` is required.
* @example
*
* async.applyEach([enableSearch, updateSchema], 'bucket', callback);
*
* // partial application example:
* async.each(
* buckets,
* async.applyEach([enableSearch, updateSchema]),
* callback
* );
*/
exports.default = (0, _applyEach2.default)(_map2.default);
module.exports = exports['default'];


@@ -0,0 +1,37 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _applyEach = require('./internal/applyEach');
var _applyEach2 = _interopRequireDefault(_applyEach);
var _mapSeries = require('./mapSeries');
var _mapSeries2 = _interopRequireDefault(_mapSeries);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`applyEach`]{@link module:ControlFlow.applyEach} but runs only a single async operation at a time.
*
* @name applyEachSeries
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.applyEach]{@link module:ControlFlow.applyEach}
* @category Control Flow
* @param {Array|Iterable|Object} fns - A collection of {@link AsyncFunction}s to all
* call with the same arguments
* @param {...*} [args] - any number of separate arguments to pass to the
* function.
* @param {Function} [callback] - the final argument should be the callback,
* called when all functions have completed processing.
* @returns {Function} - If only the first argument is provided, it will return
* a function which lets you pass in the arguments as if it were a single
* function call.
*/
exports.default = (0, _applyEach2.default)(_mapSeries2.default);
module.exports = exports['default'];


@@ -0,0 +1,110 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = asyncify;
var _isObject = require('lodash/isObject');
var _isObject2 = _interopRequireDefault(_isObject);
var _initialParams = require('./internal/initialParams');
var _initialParams2 = _interopRequireDefault(_initialParams);
var _setImmediate = require('./internal/setImmediate');
var _setImmediate2 = _interopRequireDefault(_setImmediate);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Take a sync function and make it async, passing its return value to a
* callback. This is useful for plugging sync functions into a waterfall,
* series, or other async functions. Any arguments passed to the generated
* function will be passed to the wrapped function (except for the final
* callback argument). Errors thrown will be passed to the callback.
*
 * If the function passed to `asyncify` returns a Promise, that promise's
* resolved/rejected state will be used to call the callback, rather than simply
* the synchronous return value.
*
* This also means you can asyncify ES2017 `async` functions.
*
* @name asyncify
* @static
* @memberOf module:Utils
* @method
* @alias wrapSync
* @category Util
* @param {Function} func - The synchronous function, or Promise-returning
* function to convert to an {@link AsyncFunction}.
* @returns {AsyncFunction} An asynchronous wrapper of the `func`. To be
* invoked with `(args..., callback)`.
* @example
*
* // passing a regular synchronous function
* async.waterfall([
* async.apply(fs.readFile, filename, "utf8"),
* async.asyncify(JSON.parse),
* function (data, next) {
* // data is the result of parsing the text.
* // If there was a parsing error, it would have been caught.
* }
* ], callback);
*
* // passing a function returning a promise
* async.waterfall([
* async.apply(fs.readFile, filename, "utf8"),
* async.asyncify(function (contents) {
* return db.model.create(contents);
* }),
* function (model, next) {
* // `model` is the instantiated model object.
* // If there was an error, this function would be skipped.
* }
* ], callback);
*
* // es2017 example, though `asyncify` is not needed if your JS environment
* // supports async functions out of the box
* var q = async.queue(async.asyncify(async function(file) {
* var intermediateStep = await processFile(file);
* return await somePromise(intermediateStep)
* }));
*
* q.push(files);
*/
function asyncify(func) {
return (0, _initialParams2.default)(function (args, callback) {
var result;
try {
result = func.apply(this, args);
} catch (e) {
return callback(e);
}
// if result is Promise object
if ((0, _isObject2.default)(result) && typeof result.then === 'function') {
result.then(function (value) {
invokeCallback(callback, null, value);
}, function (err) {
invokeCallback(callback, err.message ? err : new Error(err));
});
} else {
callback(null, result);
}
});
}
function invokeCallback(callback, error, value) {
try {
callback(error, value);
} catch (e) {
(0, _setImmediate2.default)(rethrow, e);
}
}
function rethrow(error) {
throw error;
}
module.exports = exports['default'];

289
node_modules/postman-runtime/node_modules/async/auto.js generated vendored Normal file

@@ -0,0 +1,289 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function (tasks, concurrency, callback) {
if (typeof concurrency === 'function') {
// concurrency is optional, shift the args.
callback = concurrency;
concurrency = null;
}
callback = (0, _once2.default)(callback || _noop2.default);
var keys = (0, _keys2.default)(tasks);
var numTasks = keys.length;
if (!numTasks) {
return callback(null);
}
if (!concurrency) {
concurrency = numTasks;
}
var results = {};
var runningTasks = 0;
var hasError = false;
var listeners = Object.create(null);
var readyTasks = [];
// for cycle detection:
var readyToCheck = []; // tasks that have been identified as reachable
// without the possibility of returning to an ancestor task
var uncheckedDependencies = {};
(0, _baseForOwn2.default)(tasks, function (task, key) {
if (!(0, _isArray2.default)(task)) {
// no dependencies
enqueueTask(key, [task]);
readyToCheck.push(key);
return;
}
var dependencies = task.slice(0, task.length - 1);
var remainingDependencies = dependencies.length;
if (remainingDependencies === 0) {
enqueueTask(key, task);
readyToCheck.push(key);
return;
}
uncheckedDependencies[key] = remainingDependencies;
(0, _arrayEach2.default)(dependencies, function (dependencyName) {
if (!tasks[dependencyName]) {
throw new Error('async.auto task `' + key + '` has a non-existent dependency `' + dependencyName + '` in ' + dependencies.join(', '));
}
addListener(dependencyName, function () {
remainingDependencies--;
if (remainingDependencies === 0) {
enqueueTask(key, task);
}
});
});
});
checkForDeadlocks();
processQueue();
function enqueueTask(key, task) {
readyTasks.push(function () {
runTask(key, task);
});
}
function processQueue() {
if (readyTasks.length === 0 && runningTasks === 0) {
return callback(null, results);
}
while (readyTasks.length && runningTasks < concurrency) {
var run = readyTasks.shift();
run();
}
}
function addListener(taskName, fn) {
var taskListeners = listeners[taskName];
if (!taskListeners) {
taskListeners = listeners[taskName] = [];
}
taskListeners.push(fn);
}
function taskComplete(taskName) {
var taskListeners = listeners[taskName] || [];
(0, _arrayEach2.default)(taskListeners, function (fn) {
fn();
});
processQueue();
}
function runTask(key, task) {
if (hasError) return;
var taskCallback = (0, _onlyOnce2.default)(function (err, result) {
runningTasks--;
if (arguments.length > 2) {
result = (0, _slice2.default)(arguments, 1);
}
if (err) {
var safeResults = {};
(0, _baseForOwn2.default)(results, function (val, rkey) {
safeResults[rkey] = val;
});
safeResults[key] = result;
hasError = true;
listeners = Object.create(null);
callback(err, safeResults);
} else {
results[key] = result;
taskComplete(key);
}
});
runningTasks++;
var taskFn = (0, _wrapAsync2.default)(task[task.length - 1]);
if (task.length > 1) {
taskFn(results, taskCallback);
} else {
taskFn(taskCallback);
}
}
function checkForDeadlocks() {
// Kahn's algorithm
// https://en.wikipedia.org/wiki/Topological_sorting#Kahn.27s_algorithm
// http://connalle.blogspot.com/2013/10/topological-sortingkahn-algorithm.html
var currentTask;
var counter = 0;
while (readyToCheck.length) {
currentTask = readyToCheck.pop();
counter++;
(0, _arrayEach2.default)(getDependents(currentTask), function (dependent) {
if (--uncheckedDependencies[dependent] === 0) {
readyToCheck.push(dependent);
}
});
}
if (counter !== numTasks) {
throw new Error('async.auto cannot execute tasks due to a recursive dependency');
}
}
function getDependents(taskName) {
var result = [];
(0, _baseForOwn2.default)(tasks, function (task, key) {
if ((0, _isArray2.default)(task) && (0, _baseIndexOf2.default)(task, taskName, 0) >= 0) {
result.push(key);
}
});
return result;
}
};
var _arrayEach = require('lodash/_arrayEach');
var _arrayEach2 = _interopRequireDefault(_arrayEach);
var _baseForOwn = require('lodash/_baseForOwn');
var _baseForOwn2 = _interopRequireDefault(_baseForOwn);
var _baseIndexOf = require('lodash/_baseIndexOf');
var _baseIndexOf2 = _interopRequireDefault(_baseIndexOf);
var _isArray = require('lodash/isArray');
var _isArray2 = _interopRequireDefault(_isArray);
var _keys = require('lodash/keys');
var _keys2 = _interopRequireDefault(_keys);
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
var _once = require('./internal/once');
var _once2 = _interopRequireDefault(_once);
var _onlyOnce = require('./internal/onlyOnce');
var _onlyOnce2 = _interopRequireDefault(_onlyOnce);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
module.exports = exports['default'];
/**
* Determines the best order for running the {@link AsyncFunction}s in `tasks`, based on
* their requirements. Each function can optionally depend on other functions
* being completed first, and each function is run as soon as its requirements
* are satisfied.
*
* If any of the {@link AsyncFunction}s pass an error to their callback, the `auto` sequence
* will stop. Further tasks will not execute (so any other functions depending
* on it will not run), and the main `callback` is immediately called with the
* error.
*
* {@link AsyncFunction}s also receive an object containing the results of functions which
* have completed so far as the first argument, if they have dependencies. If a
* task function has no dependencies, it will only be passed a callback.
*
* @name auto
* @static
* @memberOf module:ControlFlow
* @method
* @category Control Flow
* @param {Object} tasks - An object. Each of its properties is either a
* function or an array of requirements, with the {@link AsyncFunction} itself the last item
 * in the array. The key of each property serves as the name of the task
 * defined by that property, i.e. it can be used when specifying requirements
 * for other tasks. The function receives one or two arguments:
* * a `results` object, containing the results of the previously executed
* functions, only passed if the task has any dependencies,
* * a `callback(err, result)` function, which must be called when finished,
* passing an `error` (which can be `null`) and the result of the function's
* execution.
* @param {number} [concurrency=Infinity] - An optional `integer` for
* determining the maximum number of tasks that can be run in parallel. By
* default, as many as possible.
* @param {Function} [callback] - An optional callback which is called when all
* the tasks have been completed. It receives the `err` argument if any `tasks`
* pass an error to their callback. Results are always returned; however, if an
* error occurs, no further `tasks` will be performed, and the results object
* will only contain partial results. Invoked with (err, results).
* @returns undefined
* @example
*
* async.auto({
* // this function will just be passed a callback
* readData: async.apply(fs.readFile, 'data.txt', 'utf-8'),
* showData: ['readData', function(results, cb) {
* // results.readData is the file's contents
* // ...
* }]
* }, callback);
*
* async.auto({
* get_data: function(callback) {
* console.log('in get_data');
* // async code to get some data
* callback(null, 'data', 'converted to array');
* },
* make_folder: function(callback) {
* console.log('in make_folder');
* // async code to create a directory to store a file in
* // this is run at the same time as getting the data
* callback(null, 'folder');
* },
* write_file: ['get_data', 'make_folder', function(results, callback) {
* console.log('in write_file', JSON.stringify(results));
* // once there is some data and the directory exists,
* // write the data to a file in the directory
* callback(null, 'filename');
* }],
* email_link: ['write_file', function(results, callback) {
* console.log('in email_link', JSON.stringify(results));
* // once the file is written let's email a link to it...
* // results.write_file contains the filename returned by write_file.
* callback(null, {'file':results.write_file, 'email':'user@example.com'});
* }]
* }, function(err, results) {
* console.log('err = ', err);
* console.log('results = ', results);
* });
*/
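The deadlock check inside `auto` (Kahn's algorithm, per the comments in `checkForDeadlocks`) can be demonstrated with a standalone sketch: count how many tasks can be topologically ordered; if any remain uncounted, the dependency graph contains a cycle. The `hasCycle` helper below is illustrative only, not the library's internal code.

```javascript
// Standalone sketch of the cycle check: a task graph maps each task name to
// the names it depends on. Tasks with zero unmet dependencies are "ready";
// draining them should eventually count every task unless a cycle exists.
function hasCycle(graph) {
    const indegree = {};
    const dependents = {};
    for (const key of Object.keys(graph)) {
        indegree[key] = graph[key].length;
        dependents[key] = dependents[key] || [];
        for (const dep of graph[key]) {
            (dependents[dep] = dependents[dep] || []).push(key);
        }
    }
    const ready = Object.keys(graph).filter(k => indegree[k] === 0);
    let counted = 0;
    while (ready.length) {
        const task = ready.pop();
        counted++;
        for (const dep of dependents[task] || []) {
            if (--indegree[dep] === 0) ready.push(dep);
        }
    }
    return counted !== Object.keys(graph).length;
}
```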


@@ -0,0 +1,170 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = autoInject;
var _auto = require('./auto');
var _auto2 = _interopRequireDefault(_auto);
var _baseForOwn = require('lodash/_baseForOwn');
var _baseForOwn2 = _interopRequireDefault(_baseForOwn);
var _arrayMap = require('lodash/_arrayMap');
var _arrayMap2 = _interopRequireDefault(_arrayMap);
var _isArray = require('lodash/isArray');
var _isArray2 = _interopRequireDefault(_isArray);
var _trim = require('lodash/trim');
var _trim2 = _interopRequireDefault(_trim);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
var FN_ARGS = /^(?:async\s+)?(function)?\s*[^\(]*\(\s*([^\)]*)\)/m;
var FN_ARG_SPLIT = /,/;
var FN_ARG = /(=.+)?(\s*)$/;
var STRIP_COMMENTS = /((\/\/.*$)|(\/\*[\s\S]*?\*\/))/mg;
function parseParams(func) {
func = func.toString().replace(STRIP_COMMENTS, '');
func = func.match(FN_ARGS)[2].replace(' ', '');
func = func ? func.split(FN_ARG_SPLIT) : [];
func = func.map(function (arg) {
return (0, _trim2.default)(arg.replace(FN_ARG, ''));
});
return func;
}
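Since `parseParams` drives the dependency injection, it helps to see what it actually extracts. The self-contained rerun below copies the regexes above and swaps lodash's `trim` for the native `String.prototype.trim`; the function being parsed is a made-up example.

```javascript
// Same regexes as autoInject's parseParams, rerun standalone.
const FN_ARGS = /^(?:async\s+)?(function)?\s*[^\(]*\(\s*([^\)]*)\)/m;
const FN_ARG_SPLIT = /,/;
const FN_ARG = /(=.+)?(\s*)$/;
const STRIP_COMMENTS = /((\/\/.*$)|(\/\*[\s\S]*?\*\/))/mg;

function parseParamsDemo(func) {
    let src = func.toString().replace(STRIP_COMMENTS, '');
    src = src.match(FN_ARGS)[2].replace(' ', '');
    const parts = src ? src.split(FN_ARG_SPLIT) : [];
    // Strip default values, then trim whitespace from each parameter name.
    return parts.map(arg => arg.replace(FN_ARG, '').trim());
}

// parseParamsDemo(function (get_data, make_folder, callback) {}) yields
// ['get_data', 'make_folder', 'callback'], which is how autoInject discovers
// task dependencies from parameter names.
```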
/**
* A dependency-injected version of the [async.auto]{@link module:ControlFlow.auto} function. Dependent
* tasks are specified as parameters to the function, after the usual callback
* parameter, with the parameter names matching the names of the tasks it
* depends on. This can provide even more readable task graphs which can be
* easier to maintain.
*
* If a final callback is specified, the task results are similarly injected,
* specified as named parameters after the initial error parameter.
*
* The autoInject function is purely syntactic sugar and its semantics are
* otherwise equivalent to [async.auto]{@link module:ControlFlow.auto}.
*
* @name autoInject
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.auto]{@link module:ControlFlow.auto}
* @category Control Flow
* @param {Object} tasks - An object, each of whose properties is an {@link AsyncFunction} of
 * the form 'func([dependencies...], callback)'. The key of each property
 * serves as the name of the task defined by that property, i.e. it can be used
 * when specifying requirements for other tasks.
* * The `callback` parameter is a `callback(err, result)` which must be called
* when finished, passing an `error` (which can be `null`) and the result of
* the function's execution. The remaining parameters name other tasks on
* which the task is dependent, and the results from those tasks are the
* arguments of those parameters.
* @param {Function} [callback] - An optional callback which is called when all
* the tasks have been completed. It receives the `err` argument if any `tasks`
* pass an error to their callback, and a `results` object with any completed
* task results, similar to `auto`.
* @example
*
* // The example from `auto` can be rewritten as follows:
* async.autoInject({
* get_data: function(callback) {
* // async code to get some data
* callback(null, 'data', 'converted to array');
* },
* make_folder: function(callback) {
* // async code to create a directory to store a file in
* // this is run at the same time as getting the data
* callback(null, 'folder');
* },
* write_file: function(get_data, make_folder, callback) {
* // once there is some data and the directory exists,
* // write the data to a file in the directory
* callback(null, 'filename');
* },
* email_link: function(write_file, callback) {
* // once the file is written let's email a link to it...
* // write_file contains the filename returned by write_file.
* callback(null, {'file':write_file, 'email':'user@example.com'});
* }
* }, function(err, results) {
* console.log('err = ', err);
* console.log('email_link = ', results.email_link);
* });
*
* // If you are using a JS minifier that mangles parameter names, `autoInject`
* // will not work with plain functions, since the parameter names will be
* // collapsed to a single letter identifier. To work around this, you can
* // explicitly specify the names of the parameters your task function needs
* // in an array, similar to Angular.js dependency injection.
*
* // This still has an advantage over plain `auto`, since the results a task
* // depends on are still spread into arguments.
* async.autoInject({
* //...
* write_file: ['get_data', 'make_folder', function(get_data, make_folder, callback) {
* callback(null, 'filename');
* }],
* email_link: ['write_file', function(write_file, callback) {
* callback(null, {'file':write_file, 'email':'user@example.com'});
* }]
* //...
* }, function(err, results) {
* console.log('err = ', err);
* console.log('email_link = ', results.email_link);
* });
*/
function autoInject(tasks, callback) {
var newTasks = {};
(0, _baseForOwn2.default)(tasks, function (taskFn, key) {
var params;
var fnIsAsync = (0, _wrapAsync.isAsync)(taskFn);
var hasNoDeps = !fnIsAsync && taskFn.length === 1 || fnIsAsync && taskFn.length === 0;
if ((0, _isArray2.default)(taskFn)) {
params = taskFn.slice(0, -1);
taskFn = taskFn[taskFn.length - 1];
newTasks[key] = params.concat(params.length > 0 ? newTask : taskFn);
} else if (hasNoDeps) {
// no dependencies, use the function as-is
newTasks[key] = taskFn;
} else {
params = parseParams(taskFn);
if (taskFn.length === 0 && !fnIsAsync && params.length === 0) {
throw new Error("autoInject task functions require explicit parameters.");
}
// remove callback param
if (!fnIsAsync) params.pop();
newTasks[key] = params.concat(newTask);
}
function newTask(results, taskCb) {
var newArgs = (0, _arrayMap2.default)(params, function (name) {
return results[name];
});
newArgs.push(taskCb);
(0, _wrapAsync2.default)(taskFn).apply(null, newArgs);
}
});
(0, _auto2.default)(newTasks, callback);
}
module.exports = exports['default'];
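The `parseParams` helper above is what lets `autoInject` read dependency names straight out of a task function's signature. A self-contained copy can be exercised directly (this sketch substitutes `String.prototype.trim` for the lodash `trim` import):

```javascript
// Self-contained copy of parseParams: turns a task function's parameter names
// into the list of dependency names that autoInject resolves against.
var FN_ARGS = /^(?:async\s+)?(function)?\s*[^\(]*\(\s*([^\)]*)\)/m;
var FN_ARG_SPLIT = /,/;
var FN_ARG = /(=.+)?(\s*)$/;
var STRIP_COMMENTS = /((\/\/.*$)|(\/\*[\s\S]*?\*\/))/mg;

function parseParams(func) {
    func = func.toString().replace(STRIP_COMMENTS, '');
    func = func.match(FN_ARGS)[2].replace(' ', '');
    func = func ? func.split(FN_ARG_SPLIT) : [];
    return func.map(function (arg) {
        // strip default values and trailing whitespace, then trim
        return arg.replace(FN_ARG, '').trim();
    });
}

var params = parseParams(function (get_data, make_folder, callback) {});
console.log(params); // → [ 'get_data', 'make_folder', 'callback' ]
```

Note how the callback parameter comes back too; the main `autoInject` body pops it off before wiring up the task graph.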


@@ -0,0 +1,17 @@
{
"name": "async",
"main": "dist/async.js",
"ignore": [
"bower_components",
"lib",
"mocha_test",
"node_modules",
"perf",
"support",
"**/.*",
"*.config.js",
"*.json",
"index.js",
"Makefile"
]
}


@@ -0,0 +1,94 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = cargo;
var _queue = require('./internal/queue');
var _queue2 = _interopRequireDefault(_queue);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* A cargo of tasks for the worker function to complete. Cargo inherits all of
* the same methods and event callbacks as [`queue`]{@link module:ControlFlow.queue}.
* @typedef {Object} CargoObject
* @memberOf module:ControlFlow
* @property {Function} length - A function returning the number of items
* waiting to be processed. Invoke like `cargo.length()`.
 * @property {number} payload - An `integer` for determining how many tasks
 * should be processed per round. This property can be changed after a `cargo` is
 * created to alter the payload on-the-fly.
* @property {Function} push - Adds `task` to the `queue`. The callback is
* called once the `worker` has finished processing the task. Instead of a
* single task, an array of `tasks` can be submitted. The respective callback is
* used for every task in the list. Invoke like `cargo.push(task, [callback])`.
* @property {Function} saturated - A callback that is called when the
* `queue.length()` hits the concurrency and further tasks will be queued.
* @property {Function} empty - A callback that is called when the last item
* from the `queue` is given to a `worker`.
* @property {Function} drain - A callback that is called when the last item
* from the `queue` has returned from the `worker`.
* @property {Function} idle - a function returning false if there are items
* waiting or being processed, or true if not. Invoke like `cargo.idle()`.
* @property {Function} pause - a function that pauses the processing of tasks
* until `resume()` is called. Invoke like `cargo.pause()`.
* @property {Function} resume - a function that resumes the processing of
* queued tasks when the queue is paused. Invoke like `cargo.resume()`.
* @property {Function} kill - a function that removes the `drain` callback and
* empties remaining tasks from the queue forcing it to go idle. Invoke like `cargo.kill()`.
*/
/**
* Creates a `cargo` object with the specified payload. Tasks added to the
* cargo will be processed altogether (up to the `payload` limit). If the
* `worker` is in progress, the task is queued until it becomes available. Once
* the `worker` has completed some tasks, each callback of those tasks is
* called. Check out [these](https://camo.githubusercontent.com/6bbd36f4cf5b35a0f11a96dcd2e97711ffc2fb37/68747470733a2f2f662e636c6f75642e6769746875622e636f6d2f6173736574732f313637363837312f36383130382f62626330636662302d356632392d313165322d393734662d3333393763363464633835382e676966) [animations](https://camo.githubusercontent.com/f4810e00e1c5f5f8addbe3e9f49064fd5d102699/68747470733a2f2f662e636c6f75642e6769746875622e636f6d2f6173736574732f313637363837312f36383130312f38346339323036362d356632392d313165322d383134662d3964336430323431336266642e676966)
* for how `cargo` and `queue` work.
*
* While [`queue`]{@link module:ControlFlow.queue} passes only one task to one of a group of workers
* at a time, cargo passes an array of tasks to a single worker, repeating
* when the worker is finished.
*
* @name cargo
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.queue]{@link module:ControlFlow.queue}
* @category Control Flow
* @param {AsyncFunction} worker - An asynchronous function for processing an array
* of queued tasks. Invoked with `(tasks, callback)`.
* @param {number} [payload=Infinity] - An optional `integer` for determining
* how many tasks should be processed per round; if omitted, the default is
* unlimited.
 * @returns {module:ControlFlow.CargoObject} A cargo object to manage the tasks. Callbacks can
 * be attached as certain properties to listen for specific events during the
 * lifecycle of the cargo and inner queue.
* @example
*
* // create a cargo object with payload 2
* var cargo = async.cargo(function(tasks, callback) {
* for (var i=0; i<tasks.length; i++) {
* console.log('hello ' + tasks[i].name);
* }
* callback();
* }, 2);
*
* // add some items
* cargo.push({name: 'foo'}, function(err) {
* console.log('finished processing foo');
* });
* cargo.push({name: 'bar'}, function(err) {
* console.log('finished processing bar');
* });
* cargo.push({name: 'baz'}, function(err) {
* console.log('finished processing baz');
* });
*/
function cargo(worker, payload) {
return (0, _queue2.default)(worker, 1, payload);
}
module.exports = exports['default'];


@@ -0,0 +1,58 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function () /*...args*/{
return _seq2.default.apply(null, (0, _slice2.default)(arguments).reverse());
};
var _seq = require('./seq');
var _seq2 = _interopRequireDefault(_seq);
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
;
/**
* Creates a function which is a composition of the passed asynchronous
* functions. Each function consumes the return value of the function that
* follows. Composing functions `f()`, `g()`, and `h()` would produce the result
* of `f(g(h()))`, only this version uses callbacks to obtain the return values.
*
* Each function is executed with the `this` binding of the composed function.
*
* @name compose
* @static
* @memberOf module:ControlFlow
* @method
* @category Control Flow
* @param {...AsyncFunction} functions - the asynchronous functions to compose
* @returns {Function} an asynchronous function that is the composed
* asynchronous `functions`
* @example
*
* function add1(n, callback) {
* setTimeout(function () {
* callback(null, n + 1);
* }, 10);
* }
*
* function mul3(n, callback) {
* setTimeout(function () {
* callback(null, n * 3);
* }, 10);
* }
*
* var add1mul3 = async.compose(mul3, add1);
* add1mul3(4, function (err, result) {
* // result now equals 15
* });
*/
module.exports = exports['default'];
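Internally `compose` just reverses its arguments and delegates to `seq`. A minimal dependency-free sketch of the same right-to-left semantics (a hypothetical helper, not the library's implementation, and without its error-guard wrappers):

```javascript
// composeSketch(f, g)(x, cb) runs g first, then f on g's result, mirroring
// the mathematical composition f(g(x)) with Node-style callbacks.
function composeSketch(...fns) {
    return function (arg, callback) {
        var chain = fns.slice().reverse(); // rightmost function runs first
        function step(err, value) {
            if (err) return callback(err);
            var fn = chain.shift();
            if (!fn) return callback(null, value); // chain exhausted
            fn(value, step);
        }
        step(null, arg);
    };
}

function add1(n, cb) { cb(null, n + 1); }
function mul3(n, cb) { cb(null, n * 3); }

var result;
composeSketch(mul3, add1)(4, function (err, value) { result = value; });
console.log(result); // → 15, i.e. mul3(add1(4))
```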


@@ -0,0 +1,43 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
var _concatLimit = require('./concatLimit');
var _concatLimit2 = _interopRequireDefault(_concatLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Applies `iteratee` to each item in `coll`, concatenating the results. Returns
* the concatenated list. The `iteratee`s are called in parallel, and the
* results are concatenated as they return. There is no guarantee that the
* results array will be returned in the original order of `coll` passed to the
* `iteratee` function.
*
* @name concat
* @static
* @memberOf module:Collections
* @method
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A function to apply to each item in `coll`,
* which should use an array as its result. Invoked with (item, callback).
* @param {Function} [callback(err)] - A callback which is called after all the
* `iteratee` functions have finished, or an error occurs. Results is an array
* containing the concatenated results of the `iteratee` function. Invoked with
* (err, results).
* @example
*
* async.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files) {
* // files is now a list of filenames that exist in the 3 directories
* });
*/
exports.default = (0, _doLimit2.default)(_concatLimit2.default, Infinity);
module.exports = exports['default'];


@@ -0,0 +1,65 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function (coll, limit, iteratee, callback) {
callback = callback || _noop2.default;
var _iteratee = (0, _wrapAsync2.default)(iteratee);
(0, _mapLimit2.default)(coll, limit, function (val, callback) {
_iteratee(val, function (err /*, ...args*/) {
if (err) return callback(err);
return callback(null, (0, _slice2.default)(arguments, 1));
});
}, function (err, mapResults) {
var result = [];
for (var i = 0; i < mapResults.length; i++) {
if (mapResults[i]) {
result = _concat.apply(result, mapResults[i]);
}
}
return callback(err, result);
});
};
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
var _mapLimit = require('./mapLimit');
var _mapLimit2 = _interopRequireDefault(_mapLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
var _concat = Array.prototype.concat;
/**
* The same as [`concat`]{@link module:Collections.concat} but runs a maximum of `limit` async operations at a time.
*
* @name concatLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.concat]{@link module:Collections.concat}
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - A function to apply to each item in `coll`,
* which should use an array as its result. Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished, or an error occurs. Results is an array
* containing the concatenated results of the `iteratee` function. Invoked with
* (err, results).
*/
module.exports = exports['default'];
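The flattening loop above relies on `Array.prototype.concat` via `.apply`, which spreads each per-item result array into the accumulator while the `if` guard skips falsy entries. In isolation:

```javascript
// Mirror of the result-flattening step in concatLimit's final callback:
// each truthy per-item result array is spread into the accumulator.
var _concat = Array.prototype.concat;
var mapResults = [['a', 'b'], undefined, ['c']]; // e.g. one iteratee yielded nothing
var result = [];
for (var i = 0; i < mapResults.length; i++) {
    if (mapResults[i]) {
        // equivalent to result.concat('a', 'b'), not result.concat(['a', 'b'])
        result = _concat.apply(result, mapResults[i]);
    }
}
console.log(result); // → [ 'a', 'b', 'c' ]
```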


@@ -0,0 +1,36 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
var _concatLimit = require('./concatLimit');
var _concatLimit2 = _interopRequireDefault(_concatLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`concat`]{@link module:Collections.concat} but runs only a single async operation at a time.
*
* @name concatSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.concat]{@link module:Collections.concat}
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A function to apply to each item in `coll`.
 * The iteratee should complete with an array of results.
* Invoked with (item, callback).
* @param {Function} [callback(err)] - A callback which is called after all the
* `iteratee` functions have finished, or an error occurs. Results is an array
* containing the concatenated results of the `iteratee` function. Invoked with
* (err, results).
*/
exports.default = (0, _doLimit2.default)(_concatLimit2.default, 1);
module.exports = exports['default'];


@@ -0,0 +1,66 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function () /*...values*/{
var values = (0, _slice2.default)(arguments);
var args = [null].concat(values);
return function () /*...ignoredArgs, callback*/{
var callback = arguments[arguments.length - 1];
return callback.apply(this, args);
};
};
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
;
/**
* Returns a function that when called, calls-back with the values provided.
* Useful as the first function in a [`waterfall`]{@link module:ControlFlow.waterfall}, or for plugging values in to
* [`auto`]{@link module:ControlFlow.auto}.
*
* @name constant
* @static
* @memberOf module:Utils
* @method
* @category Util
* @param {...*} arguments... - Any number of arguments to automatically invoke
* callback with.
* @returns {AsyncFunction} Returns a function that when invoked, automatically
* invokes the callback with the previous given arguments.
* @example
*
* async.waterfall([
* async.constant(42),
* function (value, next) {
* // value === 42
* },
* //...
* ], callback);
*
* async.waterfall([
* async.constant(filename, "utf8"),
* fs.readFile,
* function (fileData, next) {
* //...
* }
* //...
* ], callback);
*
* async.auto({
* hostname: async.constant("https://server.net/"),
* port: findFreePort,
* launchServer: ["hostname", "port", function (options, cb) {
* startServer(options, cb);
* }],
* //...
* }, callback);
*/
module.exports = exports['default'];
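The exported function above is short enough to restate as a standalone sketch: it closes over the given values and returns a function that ignores everything except its final argument, which it treats as a Node-style callback.

```javascript
// Standalone sketch of constant: the returned function always calls back
// with (null, ...values), regardless of what it is invoked with.
function constantSketch(...values) {
    var args = [null].concat(values);
    return function (...rest) {
        var callback = rest[rest.length - 1];
        return callback.apply(this, args);
    };
}

var answer;
constantSketch(42)(function (err, value) { answer = value; });
console.log(answer); // → 42
```

This is why it slots into `waterfall` and `auto`: any upstream arguments are simply discarded in favor of the captured values.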

View File

@@ -0,0 +1,61 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
var _findGetResult = require('./internal/findGetResult');
var _findGetResult2 = _interopRequireDefault(_findGetResult);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns the first value in `coll` that passes an async truth test. The
* `iteratee` is applied in parallel, meaning the first iteratee to return
* `true` will fire the detect `callback` with that result. That means the
* result might not be the first item in the original `coll` (in terms of order)
* that passes the test.
* If order within the original `coll` is important, then look at
* [`detectSeries`]{@link module:Collections.detectSeries}.
*
* @name detect
* @static
* @memberOf module:Collections
* @method
* @alias find
* @category Collections
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
* @example
*
* async.detect(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, result) {
* // result now equals the first file in the list that exists
* });
*/
exports.default = (0, _doParallel2.default)((0, _createTester2.default)(_identity2.default, _findGetResult2.default));
module.exports = exports['default'];


@@ -0,0 +1,48 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
var _findGetResult = require('./internal/findGetResult');
var _findGetResult2 = _interopRequireDefault(_findGetResult);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`detect`]{@link module:Collections.detect} but runs a maximum of `limit` async operations at a
* time.
*
* @name detectLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.detect]{@link module:Collections.detect}
* @alias findLimit
* @category Collections
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
*/
exports.default = (0, _doParallelLimit2.default)((0, _createTester2.default)(_identity2.default, _findGetResult2.default));
module.exports = exports['default'];


@@ -0,0 +1,38 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _detectLimit = require('./detectLimit');
var _detectLimit2 = _interopRequireDefault(_detectLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`detect`]{@link module:Collections.detect} but runs only a single async operation at a time.
*
* @name detectSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.detect]{@link module:Collections.detect}
* @alias findSeries
* @category Collections
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
*/
exports.default = (0, _doLimit2.default)(_detectLimit2.default, 1);
module.exports = exports['default'];
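Because `detectSeries` is just `detectLimit` with a limit of 1, its contract reduces to testing items one at a time and stopping at the first hit. A dependency-free sketch of that behavior (a hypothetical helper, not the library's implementation):

```javascript
// Test items sequentially; stop at the first item whose async truth test
// passes, or finish with undefined when none does.
function detectSeriesSketch(coll, iteratee, callback) {
    var i = 0;
    (function next() {
        if (i >= coll.length) return callback(null, undefined);
        var item = coll[i++];
        iteratee(item, function (err, truth) {
            if (err) return callback(err);
            if (truth) return callback(null, item); // first match wins
            next();
        });
    })();
}

var found;
detectSeriesSketch([1, 3, 4, 5],
    function (n, cb) { cb(null, n % 2 === 0); },
    function (err, result) { found = result; });
console.log(found); // → 4 (first even number, in input order)
```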

43
node_modules/postman-runtime/node_modules/async/dir.js generated vendored Normal file

@@ -0,0 +1,43 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _consoleFunc = require('./internal/consoleFunc');
var _consoleFunc2 = _interopRequireDefault(_consoleFunc);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Logs the result of an [`async` function]{@link AsyncFunction} to the
* `console` using `console.dir` to display the properties of the resulting object.
 * Only works in Node.js or in browsers that support `console.dir` and
 * `console.error` (such as Firefox and Chrome).
* If multiple arguments are returned from the async function,
* `console.dir` is called on each argument in order.
*
* @name dir
* @static
* @memberOf module:Utils
* @method
* @category Util
* @param {AsyncFunction} function - The function you want to eventually apply
* all arguments to.
* @param {...*} arguments... - Any number of arguments to apply to the function.
* @example
*
* // in a module
* var hello = function(name, callback) {
* setTimeout(function() {
* callback(null, {hello: name});
* }, 1000);
* };
*
* // in the node repl
* node> async.dir(hello, 'world');
* {hello: 'world'}
*/
exports.default = (0, _consoleFunc2.default)('dir');
module.exports = exports['default'];

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -0,0 +1,66 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = doDuring;
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
var _onlyOnce = require('./internal/onlyOnce');
var _onlyOnce2 = _interopRequireDefault(_onlyOnce);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The post-check version of [`during`]{@link module:ControlFlow.during}. To reflect the difference in
* the order of operations, the arguments `test` and `fn` are switched.
*
* Also a version of [`doWhilst`]{@link module:ControlFlow.doWhilst} with asynchronous `test` function.
* @name doDuring
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.during]{@link module:ControlFlow.during}
* @category Control Flow
* @param {AsyncFunction} fn - An async function which is called each time
* `test` passes. Invoked with (callback).
* @param {AsyncFunction} test - asynchronous truth test to perform before each
* execution of `fn`. Invoked with (...args, callback), where `...args` are the
* non-error args from the previous callback of `fn`.
* @param {Function} [callback] - A callback which is called after the test
* function has failed and repeated execution of `fn` has stopped. `callback`
* will be passed an error if one occurred, otherwise `null`.
*/
function doDuring(fn, test, callback) {
callback = (0, _onlyOnce2.default)(callback || _noop2.default);
var _fn = (0, _wrapAsync2.default)(fn);
var _test = (0, _wrapAsync2.default)(test);
function next(err /*, ...args*/) {
if (err) return callback(err);
var args = (0, _slice2.default)(arguments, 1);
args.push(check);
_test.apply(this, args);
};
function check(err, truth) {
if (err) return callback(err);
if (!truth) return callback(null);
_fn(next);
}
check(null, true);
}
module.exports = exports['default'];
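The kick-off line `check(null, true)` above is the essence of the post-check design: `fn` always runs at least once, and only afterwards are its non-error results fed to the async test. A standalone sketch of the same loop, without the `onlyOnce`/`wrapAsync` guards:

```javascript
// Post-check loop: check() kicks things off by pretending the test passed
// once; fn's non-error results are then passed to the async test each round.
function doDuringSketch(fn, test, callback) {
    function next(err, ...args) {
        if (err) return callback(err);
        test(...args, check);
    }
    function check(err, truth) {
        if (err) return callback(err);
        if (!truth) return callback(null); // test failed: stop
        fn(next);
    }
    check(null, true);
}

var runs = 0;
doDuringSketch(
    function (cb) { runs++; cb(null, runs); },
    function (n, cb) { cb(null, n < 3); },
    function (err) { console.log(runs); } // → 3
);
```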


@@ -0,0 +1,39 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = doUntil;
var _doWhilst = require('./doWhilst');
var _doWhilst2 = _interopRequireDefault(_doWhilst);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
 * Like [`doWhilst`]{@link module:ControlFlow.doWhilst}, except the `test` is inverted. Note the
 * argument ordering differs from `until`.
*
* @name doUntil
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.doWhilst]{@link module:ControlFlow.doWhilst}
* @category Control Flow
* @param {AsyncFunction} iteratee - An async function which is called each time
* `test` fails. Invoked with (callback).
* @param {Function} test - synchronous truth test to perform after each
* execution of `iteratee`. Invoked with any non-error callback results of
* `iteratee`.
* @param {Function} [callback] - A callback which is called after the test
* function has passed and repeated execution of `iteratee` has stopped. `callback`
* will be passed an error and any arguments passed to the final `iteratee`'s
* callback. Invoked with (err, [results]);
*/
function doUntil(iteratee, test, callback) {
(0, _doWhilst2.default)(iteratee, function () {
return !test.apply(this, arguments);
}, callback);
}
module.exports = exports['default'];


@@ -0,0 +1,59 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = doWhilst;
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _slice = require('./internal/slice');
var _slice2 = _interopRequireDefault(_slice);
var _onlyOnce = require('./internal/onlyOnce');
var _onlyOnce2 = _interopRequireDefault(_onlyOnce);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The post-check version of [`whilst`]{@link module:ControlFlow.whilst}. To reflect the difference in
* the order of operations, the arguments `test` and `iteratee` are switched.
*
* `doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript.
*
* @name doWhilst
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.whilst]{@link module:ControlFlow.whilst}
* @category Control Flow
* @param {AsyncFunction} iteratee - A function which is called each time `test`
* passes. Invoked with (callback).
* @param {Function} test - synchronous truth test to perform after each
* execution of `iteratee`. Invoked with any non-error callback results of
* `iteratee`.
* @param {Function} [callback] - A callback which is called after the test
* function has failed and repeated execution of `iteratee` has stopped.
* `callback` will be passed an error and any arguments passed to the final
* `iteratee`'s callback. Invoked with (err, [results]);
*/
function doWhilst(iteratee, test, callback) {
callback = (0, _onlyOnce2.default)(callback || _noop2.default);
var _iteratee = (0, _wrapAsync2.default)(iteratee);
var next = function (err /*, ...args*/) {
if (err) return callback(err);
var args = (0, _slice2.default)(arguments, 1);
if (test.apply(this, args)) return _iteratee(next);
callback.apply(null, [null].concat(args));
};
_iteratee(next);
}
module.exports = exports['default'];
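The do-while contract above (iteratee first, test after) can be sketched without the library's wrappers; this hypothetical `doWhilstSketch` mirrors the `next` logic of `doWhilst` minus the `onlyOnce`/`wrapAsync` guards:

```javascript
// Run iteratee once, then keep repeating while the synchronous test (called
// with the iteratee's non-error callback results) returns true.
function doWhilstSketch(iteratee, test, callback) {
    function next(err, ...args) {
        if (err) return callback(err);
        if (test(...args)) return iteratee(next);
        callback(null, ...args); // final iteratee results reach the callback
    }
    iteratee(next);
}

var count = 0;
doWhilstSketch(
    function (cb) { count++; cb(null, count); },
    function (n) { return n < 5; },
    function (err, n) { console.log(n); } // → 5
);
```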


@@ -0,0 +1,76 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = during;
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _onlyOnce = require('./internal/onlyOnce');
var _onlyOnce2 = _interopRequireDefault(_onlyOnce);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Like [`whilst`]{@link module:ControlFlow.whilst}, except the `test` is an asynchronous function that
 * is passed a callback in the form of `function (err, truth)`. If an error is
 * passed to `test` or `fn`, the main callback is immediately called with the
 * value of the error.
*
* @name during
* @static
* @memberOf module:ControlFlow
* @method
* @see [async.whilst]{@link module:ControlFlow.whilst}
* @category Control Flow
* @param {AsyncFunction} test - asynchronous truth test to perform before each
* execution of `fn`. Invoked with (callback).
* @param {AsyncFunction} fn - An async function which is called each time
* `test` passes. Invoked with (callback).
* @param {Function} [callback] - A callback which is called after the test
* function has failed and repeated execution of `fn` has stopped. `callback`
* will be passed an error, if one occurred, otherwise `null`.
* @example
*
* var count = 0;
*
* async.during(
* function (callback) {
* return callback(null, count < 5);
* },
* function (callback) {
* count++;
* setTimeout(callback, 1000);
* },
* function (err) {
* // 5 seconds have passed
* }
* );
*/
function during(test, fn, callback) {
callback = (0, _onlyOnce2.default)(callback || _noop2.default);
var _fn = (0, _wrapAsync2.default)(fn);
var _test = (0, _wrapAsync2.default)(test);
function next(err) {
if (err) return callback(err);
_test(check);
}
function check(err, truth) {
if (err) return callback(err);
if (!truth) return callback(null);
_fn(next);
}
_test(check);
}
module.exports = exports['default'];


@@ -0,0 +1,82 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = eachLimit;
var _eachOf = require('./eachOf');
var _eachOf2 = _interopRequireDefault(_eachOf);
var _withoutIndex = require('./internal/withoutIndex');
var _withoutIndex2 = _interopRequireDefault(_withoutIndex);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Applies the function `iteratee` to each item in `coll`, in parallel.
* The `iteratee` is called with an item from the list, and a callback for when
* it has finished. If the `iteratee` passes an error to its `callback`, the
* main `callback` (for the `each` function) is immediately called with the
* error.
*
 * Note that since this function applies `iteratee` to each item in parallel,
* there is no guarantee that the iteratee functions will complete in order.
*
* @name each
* @static
* @memberOf module:Collections
* @method
* @alias forEach
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async function to apply to
* each item in `coll`. Invoked with (item, callback).
* The array index is not passed to the iteratee.
* If you need the index, use `eachOf`.
* @param {Function} [callback] - A callback which is called when all
* `iteratee` functions have finished, or an error occurs. Invoked with (err).
* @example
*
* // assuming openFiles is an array of file names and saveFile is a function
* // to save the modified contents of that file:
*
* async.each(openFiles, saveFile, function(err){
* // if any of the saves produced an error, err would equal that error
* });
*
* // assuming openFiles is an array of file names
* async.each(openFiles, function(file, callback) {
*
* // Perform operation on file here.
* console.log('Processing file ' + file);
*
* if( file.length > 32 ) {
* console.log('This file name is too long');
* callback('File name too long');
* } else {
* // Do work to process file here
* console.log('File processed');
* callback();
* }
* }, function(err) {
* // if any of the file processing produced an error, err would equal that error
* if( err ) {
* // One of the iterations produced an error.
* // All processing will now stop.
* console.log('A file failed to process');
* } else {
* console.log('All files have been processed successfully');
* }
* });
*/
function eachLimit(coll, iteratee, callback) {
(0, _eachOf2.default)(coll, (0, _withoutIndex2.default)((0, _wrapAsync2.default)(iteratee)), callback);
}
module.exports = exports['default'];


@@ -0,0 +1,45 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = eachLimit;
var _eachOfLimit = require('./internal/eachOfLimit');
var _eachOfLimit2 = _interopRequireDefault(_eachOfLimit);
var _withoutIndex = require('./internal/withoutIndex');
var _withoutIndex2 = _interopRequireDefault(_withoutIndex);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`each`]{@link module:Collections.each} but runs a maximum of `limit` async operations at a time.
*
* @name eachLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.each]{@link module:Collections.each}
* @alias forEachLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - An async function to apply to each item in
* `coll`.
* The array index is not passed to the iteratee.
* If you need the index, use `eachOfLimit`.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called when all
* `iteratee` functions have finished, or an error occurs. Invoked with (err).
*/
function eachLimit(coll, limit, iteratee, callback) {
(0, _eachOfLimit2.default)(limit)(coll, (0, _withoutIndex2.default)((0, _wrapAsync2.default)(iteratee)), callback);
}
module.exports = exports['default'];
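The `eachLimit` JSDoc above carries no `@example`. The following is a minimal, hypothetical sketch of the contract it describes: at most `limit` iteratee calls in flight at once, and the final callback fired exactly once with the first error or `null`. `eachLimitSketch` is illustrative only, not the library's implementation.

```javascript
// Hypothetical stand-in for the eachLimit contract (NOT the library's
// implementation): keep at most `limit` iteratee calls in flight at once,
// and call the final callback exactly once, with the first error or null.
function eachLimitSketch(coll, limit, iteratee, callback) {
    var index = 0, running = 0, done = false;
    function next() {
        if (done) return;
        if (index >= coll.length && running === 0) {
            done = true;
            return callback(null);
        }
        while (running < limit && index < coll.length) {
            running++;
            iteratee(coll[index++], function (err) {
                running--;
                if (err && !done) {
                    done = true;
                    return callback(err);
                }
                next();
            });
        }
    }
    next();
}

// Track peak concurrency to show the cap is respected.
var peak = 0, active = 0;
eachLimitSketch([1, 2, 3, 4, 5], 2, function (item, cb) {
    active++;
    if (active > peak) peak = active;
    setImmediate(function () { active--; cb(null); });
}, function (err) {
    console.log(peak); // prints 2: only two tasks ever ran at once
});
```

The first two iteratees start synchronously and every later one starts only after a completion, so `peak` never exceeds the limit.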


@@ -0,0 +1,111 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = function (coll, iteratee, callback) {
var eachOfImplementation = (0, _isArrayLike2.default)(coll) ? eachOfArrayLike : eachOfGeneric;
eachOfImplementation(coll, (0, _wrapAsync2.default)(iteratee), callback);
};
var _isArrayLike = require('lodash/isArrayLike');
var _isArrayLike2 = _interopRequireDefault(_isArrayLike);
var _breakLoop = require('./internal/breakLoop');
var _breakLoop2 = _interopRequireDefault(_breakLoop);
var _eachOfLimit = require('./eachOfLimit');
var _eachOfLimit2 = _interopRequireDefault(_eachOfLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
var _noop = require('lodash/noop');
var _noop2 = _interopRequireDefault(_noop);
var _once = require('./internal/once');
var _once2 = _interopRequireDefault(_once);
var _onlyOnce = require('./internal/onlyOnce');
var _onlyOnce2 = _interopRequireDefault(_onlyOnce);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
// eachOf implementation optimized for array-likes
function eachOfArrayLike(coll, iteratee, callback) {
callback = (0, _once2.default)(callback || _noop2.default);
var index = 0,
completed = 0,
length = coll.length;
if (length === 0) {
callback(null);
}
function iteratorCallback(err, value) {
if (err) {
callback(err);
} else if (++completed === length || value === _breakLoop2.default) {
callback(null);
}
}
for (; index < length; index++) {
iteratee(coll[index], index, (0, _onlyOnce2.default)(iteratorCallback));
}
}
// a generic version of eachOf which can handle array, object, and iterator cases.
var eachOfGeneric = (0, _doLimit2.default)(_eachOfLimit2.default, Infinity);
/**
* Like [`each`]{@link module:Collections.each}, except that it passes the key (or index) as the second argument
* to the iteratee.
*
* @name eachOf
* @static
* @memberOf module:Collections
* @method
* @alias forEachOf
* @category Collection
* @see [async.each]{@link module:Collections.each}
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A function to apply to each
* item in `coll`.
* The `key` is the item's key, or index in the case of an array.
* Invoked with (item, key, callback).
* @param {Function} [callback] - A callback which is called when all
* `iteratee` functions have finished, or an error occurs. Invoked with (err).
* @example
*
* var obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"};
* var configs = {};
*
* async.forEachOf(obj, function (value, key, callback) {
* fs.readFile(__dirname + value, "utf8", function (err, data) {
* if (err) return callback(err);
* try {
* configs[key] = JSON.parse(data);
* } catch (e) {
* return callback(e);
* }
* callback();
* });
* }, function (err) {
* if (err) console.error(err.message);
* // configs is now a map of JSON data
* doSomethingWith(configs);
* });
*/
module.exports = exports['default'];


@@ -0,0 +1,41 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = eachOfLimit;
var _eachOfLimit2 = require('./internal/eachOfLimit');
var _eachOfLimit3 = _interopRequireDefault(_eachOfLimit2);
var _wrapAsync = require('./internal/wrapAsync');
var _wrapAsync2 = _interopRequireDefault(_wrapAsync);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`eachOf`]{@link module:Collections.eachOf} but runs a maximum of `limit` async operations at a
* time.
*
* @name eachOfLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.eachOf]{@link module:Collections.eachOf}
* @alias forEachOfLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - An async function to apply to each
* item in `coll`. The `key` is the item's key, or index in the case of an
* array.
* Invoked with (item, key, callback).
* @param {Function} [callback] - A callback which is called when all
* `iteratee` functions have finished, or an error occurs. Invoked with (err).
*/
function eachOfLimit(coll, limit, iteratee, callback) {
(0, _eachOfLimit3.default)(limit)(coll, (0, _wrapAsync2.default)(iteratee), callback);
}
module.exports = exports['default'];
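`eachOfLimit` above is documented without an example. A hypothetical stand-in showing the `(value, key, callback)` signature over a plain object follows; `eachOfLimitSketch` is illustrative, not the library's code, and the synchronous callbacks here never exercise the concurrency cap.

```javascript
// Illustrative stand-in for eachOfLimit over a plain object (NOT the
// library's code): the iteratee receives (value, key, callback), and at
// most `limit` calls run at a time.
function eachOfLimitSketch(obj, limit, iteratee, callback) {
    var keys = Object.keys(obj), index = 0, running = 0, finished = false;
    function next() {
        if (finished) return;
        if (index >= keys.length && running === 0) {
            finished = true;
            return callback(null);
        }
        while (running < limit && index < keys.length) {
            var key = keys[index++];
            running++;
            iteratee(obj[key], key, function (err) {
                running--;
                if (err && !finished) {
                    finished = true;
                    return callback(err);
                }
                next();
            });
        }
    }
    next();
}

var seen = [];
eachOfLimitSketch({a: 1, b: 2, c: 3}, 2, function (value, key, cb) {
    seen.push(key + '=' + value);
    cb(null);
}, function (err) {
    console.log(seen.join(', ')); // prints "a=1, b=2, c=3"
});
```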


@@ -0,0 +1,35 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _eachOfLimit = require('./eachOfLimit');
var _eachOfLimit2 = _interopRequireDefault(_eachOfLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`eachOf`]{@link module:Collections.eachOf} but runs only a single async operation at a time.
*
* @name eachOfSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.eachOf]{@link module:Collections.eachOf}
* @alias forEachOfSeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async function to apply to each item in
* `coll`.
* Invoked with (item, key, callback).
* @param {Function} [callback] - A callback which is called when all `iteratee`
* functions have finished, or an error occurs. Invoked with (err).
*/
exports.default = (0, _doLimit2.default)(_eachOfLimit2.default, 1);
module.exports = exports['default'];


@@ -0,0 +1,37 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _eachLimit = require('./eachLimit');
var _eachLimit2 = _interopRequireDefault(_eachLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`each`]{@link module:Collections.each} but runs only a single async operation at a time.
*
* @name eachSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.each]{@link module:Collections.each}
* @alias forEachSeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async function to apply to each
* item in `coll`.
* The array index is not passed to the iteratee.
* If you need the index, use `eachOfSeries`.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called when all
* `iteratee` functions have finished, or an error occurs. Invoked with (err).
*/
exports.default = (0, _doLimit2.default)(_eachLimit2.default, 1);
module.exports = exports['default'];


@@ -0,0 +1,73 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.default = ensureAsync;
var _setImmediate = require('./internal/setImmediate');
var _setImmediate2 = _interopRequireDefault(_setImmediate);
var _initialParams = require('./internal/initialParams');
var _initialParams2 = _interopRequireDefault(_initialParams);
var _wrapAsync = require('./internal/wrapAsync');
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Wrap an async function and ensure it calls its callback on a later tick of
* the event loop. If the function already calls its callback on a next tick,
* no extra deferral is added. This is useful for preventing stack overflows
* (`RangeError: Maximum call stack size exceeded`) and generally keeping
* [Zalgo](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony)
* contained. ES2017 `async` functions are returned as-is -- they are immune
* to Zalgo's corrupting influences, as they always resolve on a later tick.
*
* @name ensureAsync
* @static
* @memberOf module:Utils
* @method
* @category Util
* @param {AsyncFunction} fn - an async function, one that expects a node-style
* callback as its last argument.
* @returns {AsyncFunction} Returns a wrapped function with the exact same call
* signature as the function passed in.
* @example
*
* function sometimesAsync(arg, callback) {
* if (cache[arg]) {
* return callback(null, cache[arg]); // this would be synchronous!!
* } else {
* doSomeIO(arg, callback); // this IO would be asynchronous
* }
* }
*
* // this has a risk of stack overflows if many results are cached in a row
* async.mapSeries(args, sometimesAsync, done);
*
* // this will defer sometimesAsync's callback if necessary,
* // preventing stack overflows
* async.mapSeries(args, async.ensureAsync(sometimesAsync), done);
*/
function ensureAsync(fn) {
if ((0, _wrapAsync.isAsync)(fn)) return fn;
return (0, _initialParams2.default)(function (args, callback) {
var sync = true;
args.push(function () {
var innerArgs = arguments;
if (sync) {
(0, _setImmediate2.default)(function () {
callback.apply(null, innerArgs);
});
} else {
callback.apply(null, innerArgs);
}
});
fn.apply(this, args);
sync = false;
});
}
module.exports = exports['default'];


@@ -0,0 +1,50 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
var _notId = require('./internal/notId');
var _notId2 = _interopRequireDefault(_notId);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns `true` if every element in `coll` satisfies an async test. If any
* iteratee call returns `false`, the main `callback` is immediately called.
*
* @name every
* @static
* @memberOf module:Collections
* @method
* @alias all
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in parallel.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
* @example
*
* async.every(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, result) {
* // if result is true then every file exists
* });
*/
exports.default = (0, _doParallel2.default)((0, _createTester2.default)(_notId2.default, _notId2.default));
module.exports = exports['default'];


@@ -0,0 +1,42 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
var _notId = require('./internal/notId');
var _notId2 = _interopRequireDefault(_notId);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`every`]{@link module:Collections.every} but runs a maximum of `limit` async operations at a time.
*
* @name everyLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.every]{@link module:Collections.every}
* @alias allLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in parallel.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
*/
exports.default = (0, _doParallelLimit2.default)((0, _createTester2.default)(_notId2.default, _notId2.default));
module.exports = exports['default'];
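The short-circuit shared by `every`, `everyLimit` and `everySeries` — the main callback fires as soon as any iteratee reports `false` — can be sketched with a hypothetical series-based stand-in (`everySeriesSketch` is illustrative, not the library's code):

```javascript
// Hypothetical series-based sketch of the every/everyLimit/everySeries
// short-circuit (NOT the library's code): stop as soon as any iteratee
// reports false.
function everySeriesSketch(coll, iteratee, callback) {
    var index = 0;
    function next(err, truth) {
        if (err) return callback(err);
        if (truth === false) return callback(null, false);
        if (index >= coll.length) return callback(null, true);
        iteratee(coll[index++], next);
    }
    next(null, true);
}

var tested = 0, answer;
everySeriesSketch([2, 4, 5, 6], function (n, cb) {
    tested++;
    cb(null, n % 2 === 0); // truth test: is n even?
}, function (err, result) {
    answer = result;
});
console.log(tested, answer); // prints "3 false"; 6 is never tested
```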


@@ -0,0 +1,37 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _everyLimit = require('./everyLimit');
var _everyLimit2 = _interopRequireDefault(_everyLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`every`]{@link module:Collections.every} but runs only a single async operation at a time.
*
* @name everySeries
* @static
* @memberOf module:Collections
* @method
* @see [async.every]{@link module:Collections.every}
* @alias allSeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - An async truth test to apply to each item
* in the collection in series.
* The iteratee must complete with a boolean result value.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Result will be either `true` or `false`
* depending on the values of the async tests. Invoked with (err, result).
*/
exports.default = (0, _doLimit2.default)(_everyLimit2.default, 1);
module.exports = exports['default'];


@@ -0,0 +1,45 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _filter = require('./internal/filter');
var _filter2 = _interopRequireDefault(_filter);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns a new array of all the values in `coll` which pass an async truth
* test. This operation is performed in parallel, but the results array will be
* in the same order as the original.
*
* @name filter
* @static
* @memberOf module:Collections
* @method
* @alias select
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {Function} iteratee - A truth test to apply to each item in `coll`.
* The `iteratee` is passed a `callback(err, truthValue)`, which must be called
* with a boolean argument once it has completed. Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Invoked with (err, results).
* @example
*
* async.filter(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, results) {
* // results now equals an array of the existing files
* });
*/
exports.default = (0, _doParallel2.default)(_filter2.default);
module.exports = exports['default'];


@@ -0,0 +1,37 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _filter = require('./internal/filter');
var _filter2 = _interopRequireDefault(_filter);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`filter`]{@link module:Collections.filter} but runs a maximum of `limit` async operations at a
* time.
*
* @name filterLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.filter]{@link module:Collections.filter}
* @alias selectLimit
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {Function} iteratee - A truth test to apply to each item in `coll`.
* The `iteratee` is passed a `callback(err, truthValue)`, which must be called
* with a boolean argument once it has completed. Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
* `iteratee` functions have finished. Invoked with (err, results).
*/
exports.default = (0, _doParallelLimit2.default)(_filter2.default);
module.exports = exports['default'];


@@ -0,0 +1,35 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _filterLimit = require('./filterLimit');
var _filterLimit2 = _interopRequireDefault(_filterLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`filter`]{@link module:Collections.filter} but runs only a single async operation at a time.
*
* @name filterSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.filter]{@link module:Collections.filter}
* @alias selectSeries
* @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {Function} iteratee - A truth test to apply to each item in `coll`.
* The `iteratee` is passed a `callback(err, truthValue)`, which must be called
* with a boolean argument once it has completed. Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called after all the
 * `iteratee` functions have finished. Invoked with (err, results).
*/
exports.default = (0, _doLimit2.default)(_filterLimit2.default, 1);
module.exports = exports['default'];
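`filterSeries` above has no usage example. A minimal, hypothetical sketch of its semantics — items tested one at a time, results kept in the original order (`filterSeriesSketch` is illustrative, not the library's implementation):

```javascript
// Illustrative sketch of filterSeries semantics (NOT the library's code):
// items are tested one at a time and kept in their original order.
function filterSeriesSketch(coll, iteratee, callback) {
    var results = [];
    (function next(index) {
        if (index >= coll.length) return callback(null, results);
        iteratee(coll[index], function (err, keep) {
            if (err) return callback(err);
            if (keep) results.push(coll[index]);
            next(index + 1);
        });
    })(0);
}

var evens;
filterSeriesSketch([1, 2, 3, 4, 5, 6], function (n, cb) {
    cb(null, n % 2 === 0); // keep even numbers
}, function (err, results) {
    evens = results;
});
console.log(evens); // prints [ 2, 4, 6 ]
```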


@@ -0,0 +1,61 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallel = require('./internal/doParallel');
var _doParallel2 = _interopRequireDefault(_doParallel);
var _findGetResult = require('./internal/findGetResult');
var _findGetResult2 = _interopRequireDefault(_findGetResult);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* Returns the first value in `coll` that passes an async truth test. The
* `iteratee` is applied in parallel, meaning the first iteratee to return
* `true` will fire the detect `callback` with that result. That means the
* result might not be the first item in the original `coll` (in terms of order)
* that passes the test.
* If order within the original `coll` is important, then look at
* [`detectSeries`]{@link module:Collections.detectSeries}.
*
* @name detect
* @static
* @memberOf module:Collections
* @method
* @alias find
 * @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
* @example
*
* async.detect(['file1','file2','file3'], function(filePath, callback) {
* fs.access(filePath, function(err) {
* callback(null, !err)
* });
* }, function(err, result) {
* // result now equals the first file in the list that exists
* });
*/
exports.default = (0, _doParallel2.default)((0, _createTester2.default)(_identity2.default, _findGetResult2.default));
module.exports = exports['default'];


@@ -0,0 +1,48 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _identity = require('lodash/identity');
var _identity2 = _interopRequireDefault(_identity);
var _createTester = require('./internal/createTester');
var _createTester2 = _interopRequireDefault(_createTester);
var _doParallelLimit = require('./internal/doParallelLimit');
var _doParallelLimit2 = _interopRequireDefault(_doParallelLimit);
var _findGetResult = require('./internal/findGetResult');
var _findGetResult2 = _interopRequireDefault(_findGetResult);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`detect`]{@link module:Collections.detect} but runs a maximum of `limit` async operations at a
* time.
*
* @name detectLimit
* @static
* @memberOf module:Collections
* @method
* @see [async.detect]{@link module:Collections.detect}
* @alias findLimit
 * @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {number} limit - The maximum number of async operations at a time.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
*/
exports.default = (0, _doParallelLimit2.default)((0, _createTester2.default)(_identity2.default, _findGetResult2.default));
module.exports = exports['default'];


@@ -0,0 +1,38 @@
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _detectLimit = require('./detectLimit');
var _detectLimit2 = _interopRequireDefault(_detectLimit);
var _doLimit = require('./internal/doLimit');
var _doLimit2 = _interopRequireDefault(_doLimit);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
/**
* The same as [`detect`]{@link module:Collections.detect} but runs only a single async operation at a time.
*
* @name detectSeries
* @static
* @memberOf module:Collections
* @method
* @see [async.detect]{@link module:Collections.detect}
* @alias findSeries
 * @category Collection
* @param {Array|Iterable|Object} coll - A collection to iterate over.
* @param {AsyncFunction} iteratee - A truth test to apply to each item in `coll`.
* The iteratee must complete with a boolean value as its result.
* Invoked with (item, callback).
* @param {Function} [callback] - A callback which is called as soon as any
* iteratee returns `true`, or after all the `iteratee` functions have finished.
* Result will be the first item in the array that passes the truth test
* (iteratee) or the value `undefined` if none passed. Invoked with
* (err, result).
*/
exports.default = (0, _doLimit2.default)(_detectLimit2.default, 1);
module.exports = exports['default'];

Some files were not shown because too many files have changed in this diff.