this is different because terraform isn't a filter.
not even in the semantics.
which is what we are talking about.
a filter is a function
Spark
you can pipe into things that have side-effects
like netcat
^7heo
terraform is a procedure, returning void.
Spark
the main reason this is even remotely difficult is that terraform can't take the tf on stdin
that i know of
so you have to write to a temporary file
but in the scheme of difficult problems to solve, that is ranked just under "make myself a nice cup of tea"
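the workaround is roughly this (a sketch, not anything terraform ships; assumes a POSIX shell, and the terraform call is guarded since it may not be installed where this runs):

```shell
#!/bin/sh
# Sketch: use terraform as a filter by staging stdin in a throwaway
# directory, since terraform reads every *.tf file in a directory.
tmpdir=$(mktemp -d) || exit 1
trap 'rm -rf "$tmpdir"' EXIT

cat > "$tmpdir/main.tf"             # slurp the config from stdin

# Guarded: terraform may not be installed where this sketch runs.
if command -v terraform >/dev/null 2>&1; then
    (cd "$tmpdir" && terraform plan) || true
else
    echo "config staged in $tmpdir/main.tf"
fi
```

then `cat main.tf | ./tf-filter.sh` (the script name is made up) behaves like a filter, modulo the temp file.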
nulleric joined the channel
^7heo
Spark: if terraform was taking stdin as data I would stfu
Spark
tbh it is annoying that it just slurps up all the tf files in the current dir and fuses them together
ajw0100 joined the channel
it'd be good to have a commandline arg to specify them explicitly
^7heo
nah
it'd be cool to take stdin.
And let the tool be a filter.
and to be honest
I'd like to have a tool that takes a file as input, and generates API calls as output.
then you just netcat that to the API.
done.
that would be UNIX.
Spark
well it does more than that
^7heo
yeah
more is less.
(works both ways)
Spark
it does API calls, interprets the results, diffs with state, builds a plan
^7heo
yeah, systemd also does "more"
</troll>
Spark
executing that plan could be a separate tool, but there's not much point because it involves calling pretty much the same APIs as were needed to build the plan
^7heo
well, in IT, there are two different schools
1. the UNIX people
2. the rest
Spark
there would be little value in terraform if it just converted a large json file into lots of 'create' api calls for each part of that json file
^7heo
the UNIX people know the value of filters, because they can plug things like lego
the non-UNIX people don't like filters because they require you to plug things like lego.
and they prefer to use one tool for everything.
Spark
hashicorp tools are pretty modular though
^7heo
not in the UNIX sense.
I'm not saying you do things wrong.
Spark
well, unix isn't distributed
^7heo
wat?
wait, wat?!
how?!
Spark
pipe is for IPC, not communicating with highly available services like consul
^7heo
high availability is a scam.
marketing scam
UNIX *IS* highly available from the ground up, from the beginning.
just because some things have been done wrong doesn't mean the whole idea is.
otherwise you can say that 100% of the software is done wrong.
rmenn joined the channel
Spark
don't think unix responds very well to a power cut
^7heo
okay
to keep things at the same scale
Spark
no, that's the point
^7heo
I don't think terraform responds very well to a cloud-provider cut.
and yes, that's the point.
Spark
it needs to handle a zone failure
^7heo
DNS zone?
Spark
i'm not sure it has the required features for that right now
no, availability zone
^7heo
"it"?
UNIX?
Spark
terraform
^7heo
ah
yeah, thanks for clearing that up :)
Spark
cloud provider SLAs are defined in terms of availability zones
^7heo
yeah
nah but I'm not saying that the marketing bs is your fault
not at all
you're merely making something shitty a bit more unix.
Spark
I don't think you can easily tell Terraform "temporarily ignore all resources in a particular zone"
^7heo
but anyway
I don't have the experience to convince you
Spark
but if you could do that, you could absolutely continue to manage infrastructure during a zone outage
_jac joined the channel
KenDawg2 has quit
^7heo
look, I totally like what terraform is doing, but not how.
so yeah, in a nutshell, I miss the UNIX aspect.
and yes, I think UNIX is highly available.
It's missing stuff
old sometimes
like sysvinit.
but that's no reason to replace it by broken shit like systemd (I'm not saying that terraform is broken shit - don't get me wrong)
I'm just saying that when you replace software, you have much higher responsibility than when you modify software.
and terraform is replacing software: diff is a/dev/null b/terraform
catsby
I appreciate the passion being displayed here, but please try to convey it without profanity
^7heo
so you define the first thing to be there.
catsby: yeah, sorry v_v
at least I'm interested in the matter.
I'll keep the swearing down.
and just to answer < Spark> pipe is for IPC
yeah. And when you use an API, what do you do? IPC.
ryanuber joined the channel
the whole cloud thing can just be summarized in one sentence: add 1% of features, rename it.
like docker.
Spark
you are ranting now :)
^7heo
yeah.
bad day, I told ya.
sorry, I grow older than expected.
not wiser, just older.
delianides joined the channel
blackjid
hi! where can I view the terraform generated graphs? can I view beautifully rendered graphs with some app?
phinze
blackjid: hi! terraform generates graphviz DOT format, which can be piped into the graphviz command line tool to generate images in various formats - there are details on the doc page for `terraform graph` http://www.terraform.io/docs/commands/graph.html
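for example (assuming graphviz's `dot` is on PATH; the fallback branch is only there so the sketch runs where the tools are missing, and just shows the DOT format):

```shell
# Sketch of the pipeline: terraform emits DOT on stdout, graphviz renders it.
if command -v terraform >/dev/null 2>&1 && command -v dot >/dev/null 2>&1; then
    terraform graph | dot -Tsvg > graph.svg
else
    # This is the shape of what `terraform graph` emits (hypothetical nodes):
    printf 'digraph G { "aws_instance.web" -> "aws_vpc.main" }\n' > graph.dot
fi
```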
blackjid
sorry about the question, I did remember that it was in the docs, but I couldn't find it....
Is there any javascript/svg library you know where I can view the graphs, besides graphviz?
Spark
sounds like somebody is making a web frontend to terraform :)