@Aria I try to keep myself informed of new modules and regularly search 
npm for modules like these. I did write some streamy stuff that already 
existed, except mine is kept up to date with the streams3 API.
The discoverability problem doesn't come so much from having many tiny 
modules, IMO. It's rather that people make very little effort to name things 
accurately. You'll notice that the only one in my list with an exotic name is 
ka-ching, which is intended to be an opinionated caching framework. The 
rest are named after what they do. And since they do little, what 
they do can be summed up pretty easily in a single sentence.

@Serapath
yes, I like problems to be solved my way =) but I usually love what the 
community offers. I use express most of the time, and a lot of visionmedia's 
stuff as well as some of substack's. It's funny how the community 
crystallized around module authors rather than frameworks. I still don't 
know whether that's a cool thing or not, but it definitely helps the 
reusability of small components. I'll admit that this might hurt 
discoverability in some way.

Also, I forgot to add links to my list.
All of it is at https://github.com/Floby

On Monday, 22 December 2014 18:05:36 UTC+1, Floby wrote:
>
> Hello everyone,
>
> I have been writing and publishing tools and modules this year without 
> advertising any of them much. Most of them are related in some way to what 
> I do at work, and all of them have been published so as to be as 
> general-purpose as possible.
> Since publishing is always more fun with real-life feedback, I decided to 
> make a list of the stuff I've been working on.
>
> Here is a list of things I've published that might be useful to you:
>
> *stream-stream*
>
> > A stream of streams in order to concatenate the contents of several 
> streams
>
> var ss = require('stream-stream');
> var fs = require('fs');
> var files = ['a.txt', 'b.txt', 'c.txt'];
> var stream = ss();
>
> files.forEach(function(f) {
>     stream.write(fs.createReadStream(f));
> });
> stream.end();
>
> stream.pipe(process.stdout);
>
>
> *stream-write-read*
>
> > Write to a file, read when it's done
>
> var WriteRead = require('stream-write-read');
> var cache = WriteRead('/my/cache/folder/file');
> // `source` is any readable stream, `destination` any writable stream
> cache.createReadable().pipe(destination);
>
> source.pipe(cache);
>
>
> *stream-sink*
>
> > Collect all data piped to this stream when it ends
> Useful for testing purposes
>
> var sink = require('stream-sink');
> readable.pipe(sink()).on('data', function(data) {
>     // YAY!
> });
>
>
> *stream-blackhole*
>
> > A silly writable stream eating all data
> Useful when you need to consume a readable stream but don't care about its 
> data
>
> var blackhole = require('stream-blackhole');
> process.stdin.pipe(blackhole());
>
>
> *duplex-maker*
>
> > Create a duplex stream from a writable and a readable
>
> var DuplexMaker = require('duplex-maker');
> var writable = fs.createWriteStream('/to/write');
> var readable = fs.createReadStream('/to/read');
> var duplex = DuplexMaker(writable, readable);
>
>
> *ka-ching*
>
> > Caching framework for streams
> ka-ching is one of those larger projects from which all of the others come. 
> It can do many things and is mostly functional. It still needs some 
> polishing and battle-testing, though.
>
> var kaChing = require('ka-ching')('/path/to/cache/dir');
> var request = require('request');
>
> kaChing('my-cached-resource-id', function () {
>   return request('http://google.com/');
> }).pipe(destination);
>
>
>
> *cache-depend*
> > Utility functions to detect when you should invalidate your cached data
>
> var onDate = require('cache-depend')
>                .date('2015-06-23 12:36:00')
>
> onDate.on('change', function (changeinfo) {
>   changeinfo.changeId
>   changeinfo.startedAt
>   changeinfo.endedAt
> })
>
> // onEtag is another cache-depend dependency, defined elsewhere
> var onOthers = require('cache-depend')
>                 .others(onEtag, onDate) // any number of arguments
>
> onOthers.on('change', function (changeinfo) {
>   changeinfo.changeId // is the same as the one emitted by the first to change
>   changeinfo.changed // reference to the changing dependency
> })
>
>
>
> *stream-json-stringify*
>
> > JSON.stringify, streaming, non-blocking
> It still has some inconsistencies with the behaviour of JSON.stringify in 
> some edge cases, but this is being worked on.
>
> var stringify = require('stream-json-stringify');
> stringify(myBigObject).pipe(process.stdout);
>
>
>
> *object-iterator*
>
> > a module to walk through an object with an iterator
>
> var oi = require('object-iterator');
> var source = [8, {one: 1, yes: true}, null];
> var next = oi(source);
> var v;
> while (v = next()) {
>     console.log(v.type);
> }
> // array
> // number
> // object
> // number
> // boolean
> // end-object
> // null
> // end-array
>
>
> *url-assembler*
>
> > assemble URLs from route-like templates (/path/:param)
>
> var UrlAssembler = require('url-assembler');
>
> UrlAssembler('https://api.site.com/')
>   .prefix('/v2')
>   .segment('/users/:user')
>   .segment('/projects/:project_id')
>   .segment('/summary')
>   .param({
>     user: 'floby',
>     project_id: 'node-url-assembler'
>   })
>   .toString()
>
> // => 'https://api.site.com/v2/users/floby/projects/node-url-assembler/summary'
>
>
> *http-measuring-client*
>
> > Like the http module, except with stats
> Drop-in replacement for http/https modules. Can also monkey-patch the 
> native modules if necessary
>
> var http = require('http-measuring-client').create();
> http.get('http://google.com', function (response) {
>   // `response` is your plain old response object
> });
>
> http.on('stat', function (parsedUri, stats) {
>   // `parsedUri` is parsed with url.parse()
>   stats.totalTime; // -> total time taken by the request
> })
>
>
> *disect*
>
> > Bisection helper for JavaScript
>
> var disect = require('disect');
>
> disect([10, 20, 30], function(element, index) {
>   return element > 11;
> });
> // returns 20
>
>
> *crossroad*
>
> > Semantically-versioned service discovery
> This one is my latest work in progress and is therefore not yet 
> functional, but it is evolving very quickly.
>
> The main design principles are:
>
>    - one sidekick process running per host (agent)
>    - gossip between agents to synchronise running services
>    - HTTP is the communication protocol
>    - The client is your regular HTTP client
>    - Proactive and Reactive consumption from clients
>
> GET /services/my-web-service/~1.0.1
> Host: localhost:5555
> Accept: application/json
>
> -> 200 OK
> -> Content-Type: application/json
> -> 
> -> {
> ->   "type": "my-web-service",
> ->   "uuid": "my-web-service-bd80ddff76e8ae5",
> ->   "version": "1.0.3",
> ->   "location": {
> ->     "host": "172.50.60.22",
> ->     "port": 8080
> ->   }
> -> }
>
>
>
