Haraka/master 34fecb2 Steve Freegard: Tidy-up graceful shutdown and fix for non-cluster mode (#1639)
DoubleMalt joined the channel
baudehlo
_smf_: app.addRewriteHook(filter, handler);
filter is a function that is called for every found mime node. If the function returns true, then this node will be processed, otherwise it is skipped. The function gets two arguments: envelope and node.
So basically the same way we do things with attachments, only better.
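The filter/handler shape described in that quote can be sketched roughly as below; `makeRewriter`, `processNode`, and the node fields here are illustrative assumptions, not a confirmed Haraka or mailsplit API.

```javascript
// Hypothetical sketch of a rewrite-hook registry: a filter decides
// which MIME nodes are processed, a handler does the processing.
function makeRewriter() {
  const hooks = [];

  // filter(envelope, node) -> boolean; handler(envelope, node) does the work
  function addRewriteHook(filter, handler) {
    hooks.push({ filter, handler });
  }

  // Called once per discovered MIME node; nodes whose filter
  // returns false are skipped, matching the quoted semantics.
  function processNode(envelope, node) {
    for (const { filter, handler } of hooks) {
      if (filter(envelope, node)) handler(envelope, node);
    }
    return node;
  }

  return { addRewriteHook, processNode };
}
```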
[b__b] joined the channel
basically it's a filter that gets a read stream and a write stream as params.
_smf_
baudehlo: yeah; that's basically what I was suggesting the other day that we do.
Ideally we'd have an Input stream -> Decode stream (QP/Base64) -> (optional) Modifier stream -> Re-encode stream -> Output stream.
The input -> decode streams would automatically be connected directly to the output stream *unless* we have modifier filters that need to change something.
It's simply a question of how to do it efficiently whilst making the API as simple to use as possible.
baudehlo
exactly.
I mean we basically already do all this stuff, just without streams.
_smf_
*nod*
The problem I can see is that our current plugin infrastructure doesn't lend itself to streams.
e.g. all the hook_data_post plugins that require access to the data would need to sit in the stream pipeline.
And that implies they'd run concurrently, instead of sequentially as we do now.
baudehlo
well, bannering and attachment processing both require setting up in that manner, so it's not all bad.