mirror of https://github.com/sablierapp/sablier.git (synced 2025-12-25 14:59:16 +01:00)

docs(release): update doc version from 1.2.0 to 1.3.0-beta.1 [skip ci]

node_modules/git-log-parser/README.md (generated, vendored, new file, 72 lines)
@@ -0,0 +1,72 @@
git-log-parser [](https://travis-ci.org/bendrucker/git-log-parser)
==============

Run `git log` and return a stream of commit objects.

## Setup

```bash
$ npm install git-log-parser
```

## API

#### `log.parse(config, options)` -> `Stream(commits)`

Accepts a `config` object mapping to the [options accepted by `git log`](http://git-scm.com/docs/git-log). `config` will be automatically converted to command line options and flags by [argv-formatter](https://github.com/bendrucker/argv-formatter). Returns a stream of commit objects.

`options` is passed directly to [`child_process.spawn`](https://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options).
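
As a minimal sketch, assuming argv-formatter's convention that the `_` key holds positional arguments, `config` keys map directly to `git log` flags:

```js
var log = require('git-log-parser');

// Roughly equivalent to: git log --author=Ben --max-count=10 HEAD
// (the `_` key for positional arguments is an assumption based on argv-formatter)
log.parse({
  author: 'Ben',
  'max-count': 10,
  _: 'HEAD'
}).on('data', function (commit) {
  console.log(commit.commit.short, commit.subject);
});
```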

A commit is structured as follows:

```js
{
  commit: {
    'long': '4bba6092ecb2571301ca0daa2c55336ea2c74ea2',
    'short': '4bba609'
  },
  tree: {
    'long': 'b4ef3379e639f8c0034831deae8f6ce63dd41566',
    'short': 'b4ef337'
  },
  author: {
    'name': 'Ben Drucker',
    'email': 'bvdrucker@gmail.com',
    'date': new Date('2014-11-20T14:39:01.000Z')
  },
  committer: {
    'name': 'Ben Drucker',
    'email': 'bvdrucker@gmail.com',
    'date': new Date('2014-11-20T14:39:01.000Z')
  },
  subject: 'Initial commit',
  body: 'The commit body'
}
```

`author.date` and `committer.date` are `Date` objects, while all other values are strings.

If you just want an array of commits, use [stream-to-array](https://www.npmjs.com/package/stream-to-array) to wrap the returned stream.
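
For instance, a minimal sketch that collects the parsed commits into an array, assuming stream-to-array's promise-returning form (the `max-count` value is only illustrative):

```js
var log = require('git-log-parser');
var toArray = require('stream-to-array');

// Collect every parsed commit object into a single array
toArray(log.parse({ 'max-count': 20 })).then(function (commits) {
  console.log('parsed %d commits', commits.length);
});
```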

#### `log.fields` -> `Object`

Commit objects contain the most frequently used commit information. However, the [field mappings](https://github.com/bendrucker/git-log-parser/blob/master/src/fields.js) used to format and then parse log output can be amended before calling the parser. Consult the [full range of formatting placeholders](http://opensource.apple.com/source/Git/Git-19/src/git-htmldocs/pretty-formats.txt) and add the placeholder to the object tree if you wish to add extra fields.
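
A hypothetical sketch of adding an extra field before parsing; the `refs` key and the placeholder spelling are assumptions, so check `src/fields.js` for the exact mapping convention:

```js
var log = require('git-log-parser');

// Hypothetical extra field: map `refs` to git's %d placeholder (ref names).
// Whether the mapping value includes the leading '%' must match src/fields.js.
log.fields.refs = 'd';

log.parse({ 'max-count': 5 }).on('data', function (commit) {
  console.log(commit.subject, commit.refs);
});
```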

## Example

Get all commits from earlier than an hour ago and stream them to `stdout` as pretty-printed JSON:

```js
var log = require('git-log-parser');
var through2 = require('through2');

log.parse({
  before: new Date(Date.now() - 60 * 60 * 1000)
})
.pipe(through2.obj(function (chunk, enc, callback) {
  callback(null, JSON.stringify(chunk, undefined, 2));
}))
.pipe(process.stdout);
```

Note that `before` is stringified and passed directly as an argument to `git log`. No special handling is required for any standard `git log` option. You can filter by committer, time, or any other field supported by [`git log`](http://git-scm.com/docs/git-log).

node_modules/git-log-parser/node_modules/split2/.travis.yml (generated, vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
language: node_js
node_js:
  - "0.10"

node_modules/git-log-parser/node_modules/split2/README.md (generated, vendored, new file, 74 lines)
@@ -0,0 +1,74 @@
# Split2(matcher, mapper, options)

[](http://travis-ci.org/mcollina/split2)

Break up a stream and reassemble it so that each line is a chunk.
`split2` is inspired by [@dominictarr](https://github.com/dominictarr)'s [`split`](https://github.com/dominictarr) module and is fully API compatible with it.
However, it is built on [`through2`](https://github.com/rvagg/through2) by [@rvagg](https://github.com/rvagg) and is based entirely on Streams3.

`matcher` may be a `String` or a `RegExp`. For example, to read every line in a file:

``` js
fs.createReadStream(file)
  .pipe(split2())
  .on('data', function (line) {
    // each chunk is now a separate line!
  })
```

`split2` takes the same arguments as [`String#split`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/String/split), except that the matcher defaults to `/\r?\n/` instead of `','`, and the optional `limit` parameter is ignored.

`split2` takes an optional options object as its third argument, which is passed directly as a [Transform](http://nodejs.org/api/stream.html#stream_class_stream_transform_1) option.
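
As a minimal sketch, splitting on a custom matcher (a comma here; the file name is hypothetical):

``` js
var fs = require('fs')
var split2 = require('split2')

// Split on ',' instead of the /\r?\n/ default
fs.createReadStream('values.csv')
  .pipe(split2(','))
  .on('data', function (field) {
    console.log('field:', field)
  })
```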

Calling `.destroy` will make the stream emit `close`. Use this to perform cleanup logic:

``` js
var splitFile = function (filename) {
  var file = fs.createReadStream(filename)

  return file
    .pipe(split2())
    .on('close', function () {
      // destroy the file stream in case the split stream was destroyed
      file.destroy()
    })
}

var stream = splitFile('my-file.txt')

stream.destroy() // will destroy the input file stream
```

# NDJ - Newline Delimited JSON

`split2` accepts a function which transforms each line:

``` js
fs.createReadStream(file)
  .pipe(split2(JSON.parse))
  .on('data', function (obj) {
    // each chunk is now a JavaScript object
  })
```

# License

Copyright (c) 2014, Matteo Collina <hello@matteocollina.com>

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

node_modules/git-log-parser/node_modules/through2/LICENSE.md (generated, vendored, new file, 9 lines)
@@ -0,0 +1,9 @@
# The MIT License (MIT)

**Copyright (c) Rod Vagg (the "Original Author") and additional contributors**

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

node_modules/git-log-parser/node_modules/through2/README.md (generated, vendored, new file, 134 lines)
@@ -0,0 +1,134 @@
# through2

[](https://nodei.co/npm/through2/)

**A tiny wrapper around Node streams.Transform (Streams2/3) to avoid explicit subclassing noise**

Inspired by [Dominic Tarr](https://github.com/dominictarr)'s [through](https://github.com/dominictarr/through) in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: `through(function (chunk) { ... })`.

Note: as of 2.x.x this module uses **Streams3** instead of Streams2. To continue using a Streams2 version, use `npm install through2@0` to fetch the latest 0.x.x release. For more information about Streams2 vs Streams3 and related recommendations, see the article **[Why I don't use Node's core 'stream' module](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html)**.

```js
fs.createReadStream('ex.txt')
  .pipe(through2(function (chunk, enc, callback) {
    for (var i = 0; i < chunk.length; i++)
      if (chunk[i] == 97)
        chunk[i] = 122 // swap 'a' for 'z'

    this.push(chunk)

    callback()
  }))
  .pipe(fs.createWriteStream('out.txt'))
  .on('finish', () => doSomethingSpecial())
```

Or object streams:

```js
var all = []

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2.obj(function (chunk, enc, callback) {
    var data = {
        name    : chunk[0]
      , address : chunk[3]
      , phone   : chunk[10]
    }
    this.push(data)

    callback()
  }))
  .on('data', (data) => {
    all.push(data)
  })
  .on('end', () => {
    doSomethingSpecial(all)
  })
```

Note that `through2.obj(fn)` is a convenience wrapper around `through2({ objectMode: true }, fn)`.

## API

<b><code>through2([ options, ] [ transformFunction ] [, flushFunction ])</code></b>

Consult the **[stream.Transform](http://nodejs.org/docs/latest/api/stream.html#stream_class_stream_transform)** documentation for the exact rules of the `transformFunction` (i.e. `this._transform`) and the optional `flushFunction` (i.e. `this._flush`).

### options

The options argument is optional and is passed straight through to `stream.Transform`. So you can use `objectMode:true` if you are processing non-binary streams (or just use `through2.obj()`).

The `options` argument is first, unlike standard convention, because if I'm passing in an anonymous function then I'd prefer for the options argument to not get lost at the end of the call:

```js
fs.createReadStream('/tmp/important.dat')
  .pipe(through2({ objectMode: true, allowHalfOpen: false },
    (chunk, enc, cb) => {
      cb(null, 'wut?') // note we can use the second argument on the callback
                       // to provide data as an alternative to this.push('wut?')
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'))
```

### transformFunction

The `transformFunction` must have the following signature: `function (chunk, encoding, callback) {}`. A minimal implementation should call the `callback` function to indicate that the transformation is done, even if that transformation means discarding the chunk.

To queue a new chunk, call `this.push(chunk)`; this can be called as many times as required before the `callback()` if you have multiple pieces to send on.

Alternatively, you may use `callback(err, chunk)` as shorthand for emitting a single chunk or an error.

If you **do not provide a `transformFunction`** then you will get a simple pass-through stream.
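
As a minimal sketch, the `callback(err, chunk)` shorthand next to a bare pass-through stream:

```js
var through2 = require('through2')

// Shorthand: hand the transformed chunk to the callback instead of this.push()
var upper = through2(function (chunk, enc, callback) {
  callback(null, chunk.toString().toUpperCase())
})

// No transformFunction supplied: a simple pass-through stream
var passthrough = through2()

process.stdin.pipe(upper).pipe(passthrough).pipe(process.stdout)
```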

### flushFunction

The optional `flushFunction`, provided as the last argument (2nd or 3rd, depending on whether you've supplied options), is called just prior to the stream ending. It can be used to finish up any processing that may be in progress.

```js
fs.createReadStream('/tmp/important.dat')
  .pipe(through2(
    (chunk, enc, cb) => cb(null, chunk), // transform is a noop
    function (cb) { // flush function
      this.push('tacking on an extra buffer to the end');
      cb();
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'));
```

<b><code>through2.ctor([ options, ] transformFunction[, flushFunction ])</code></b>

Instead of returning a `stream.Transform` instance, `through2.ctor()` returns a **constructor** for a custom Transform. This is useful when you want to use the same transform logic in multiple instances.

```js
var FToC = through2.ctor({objectMode: true}, function (record, encoding, callback) {
  if (record.temp != null && record.unit == "F") {
    record.temp = ( ( record.temp - 32 ) * 5 ) / 9
    record.unit = "C"
  }
  this.push(record)
  callback()
})

// Create instances of FToC like so:
var converter = new FToC()
// Or:
var converter = FToC()
// Or specify/override options when you instantiate, if you prefer:
var converter = FToC({objectMode: true})
```

## See Also

- [through2-map](https://github.com/brycebaril/through2-map) - Array.prototype.map analog for streams.
- [through2-filter](https://github.com/brycebaril/through2-filter) - Array.prototype.filter analog for streams.
- [through2-reduce](https://github.com/brycebaril/through2-reduce) - Array.prototype.reduce analog for streams.
- [through2-spy](https://github.com/brycebaril/through2-spy) - Wrapper for simple stream.PassThrough spies.
- the [mississippi stream utility collection](https://github.com/maxogden/mississippi) includes `through2` as well as many more useful stream modules similar to this one.

## License

**through2** is Copyright (c) Rod Vagg [@rvagg](https://twitter.com/rvagg) and additional contributors and licensed under the MIT license. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE file for more details.