#terraform-tool


      • ^7heo
        without jsonnet.
      • Spark
        just use another tool for the language part of the problem
      • and do deployment with terraform
      • ^7heo
        I can do that with my tfhlp too
      • but the point of "doing it right ™" is not to require hacks.
      • and that's the whole point I'm bringing.
      • Spark
        i think it's nice modularity to keep the two things separate
      • they have nothing in common really
      • the language aspect is reusable
      • ^7heo
        the GCE API has metadata.
      • Spark
        and terraform can be driven by automatic config generation, e.g. CD tools, in which case the language part is not necessary
      • ^7heo
        this metadata is a data store.
      • it is a totally different type of data than all the rest that you define via the API.
      • because *you* can use it to store your data.
      • and access it from the instance.
      • but if you use terraform
      • you lose that flexibility.
      • Then the metadata becomes static, and you're not free to define anything yourself.
      • you have to declare it first in the tf file; and then you MUST define it.
      • Spark
        only if you use terraform in that rigid way
      • ^7heo
        so, that is an anti-feature.
      • how can you use it in a different way?
      • Spark
        generate the json for it
      • ^7heo
        with jsonnet?
      • Spark
        with whatever you want
      • ^7heo
        yeah okay
      • with !terraform?
      • Spark
        there are about half a dozen general purpose config languages out there
      • ^7heo
        yeah
      • Spark
        use jinja if you want
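(For what it's worth, Terraform also accepts JSON-format configuration in `*.tf.json` files, so the "generate the json" step Spark describes can be done by any tool that emits JSON. A minimal sketch in Python — the resource name, machine type, and metadata keys here are invented for illustration, not taken from the conversation:)

```python
import json

# Arbitrary user data to store in the GCE instance metadata.
# Keys and values are illustrative only.
metadata = {"role": "web", "build_id": "1234"}

# Terraform's JSON configuration syntax mirrors the HCL structure:
# resource -> type -> name -> arguments.
config = {
    "resource": {
        "google_compute_instance": {
            "example": {
                "name": "example-vm",
                "machine_type": "n1-standard-1",
                "zone": "us-central1-a",
                "metadata": metadata,
            }
        }
    }
}

# Write it where terraform will pick it up alongside any *.tf files.
with open("generated.tf.json", "w") as f:
    json.dump(config, f, indent=2)
```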
      • ^7heo
        piling things up is not the solution, imho.
      • I'm only bringing the issue up because I truly think it's the right way to do it.
      • Spark
        that's how devops works
      • ^7heo
        piling things up?
      • Spark
        text editor, compiler, build tool, testing framework, continuous integration, deployment, monitoring
      • ^7heo
        yeah
      • Spark
        it's all an assembly of discrete tools
      • ^7heo
        but that's not piling.
      • it's UNIX.
      • Spark
        how is this any different
      • ^7heo
        piping tools is UNIX.
      • this is different because terraform isn't a filter.
      • not even in the semantics.
      • which is what we are talking about.
      • a filter is a function
      • Spark
        you can pipe into things that have side-effects
      • like netcat
      • ^7heo
        terraform is a procedure, returning void.
      • Spark
        the main reason this is even remotely difficult is that terraform can't take the tf on stdin
      • that i know of
      • so you have to write to a temporary file
      • but in the scheme of difficult problems to solve, that is ranked just under "make myself a nice cup of tea"
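(The temporary-file workaround Spark mentions really is small. A hedged sketch, assuming `terraform` is on `PATH` and skipping the call if it isn't; the generated config here is a throwaway example:)

```python
import json
import shutil
import subprocess
import tempfile

# Hypothetical generated configuration; in practice this would come
# from jsonnet, jinja, or whatever tool produced the JSON.
config = {"variable": {"greeting": {"default": "hello"}}}

# Terraform reads every *.tf / *.tf.json file in its working directory,
# so write the generated config into a fresh temporary directory.
workdir = tempfile.mkdtemp(prefix="tf-")
with open(f"{workdir}/main.tf.json", "w") as f:
    json.dump(config, f)

# Only invoke terraform if it is actually installed.
if shutil.which("terraform"):
    subprocess.run(["terraform", "init"], cwd=workdir, check=True)
    subprocess.run(["terraform", "plan"], cwd=workdir, check=True)
```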
      • nulleric joined the channel
      • ^7heo
        Spark: if terraform was taking stdin as data I would stfu
      • Spark
        tbh it is annoying that it just slurps up all the tf files in the current dir and fuses them together
      • ajw0100 joined the channel
      • it'd be good to have a commandline arg to specify them explicitly
      • ^7heo
        nah
      • it'd be cool to take stdin.
      • And let the tool be a filter.
      • and to be honest
      • I'd like to have a tool that takes a file as input, and generates API calls as output.
      • then you just netcat that to the API.
      • done.
      • that would be UNIX.
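(The filter ^7heo imagines — a spec in, API calls out, pipeable into netcat — could be sketched like this. This is a toy, not a real tool: the spec shape, endpoint paths, and request format are all invented for illustration; a real version would read the spec from stdin and write requests to stdout:)

```python
import json

def spec_to_requests(spec):
    """Turn a list of resource specs into HTTP request lines.

    A filter in the UNIX sense: structured data in, one request line
    per resource out, ready to be piped into something like netcat.
    """
    for res in spec.get("resources", []):
        body = json.dumps(res.get("properties", {}))
        yield f"POST /v1/{res['type']} {body}"

# Example: one resource in, one request line out.
example = {"resources": [{"type": "instances", "properties": {"name": "vm1"}}]}
for request in spec_to_requests(example):
    print(request)
```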
      • Spark
        well it does more than that
      • ^7heo
        yeah
      • more is less.
      • (works both ways)
      • Spark
        it does API calls, interprets the results, diffs with state, builds a plan
      • ^7heo
        yeah, systemd also does "more"
      • </troll>
      • Spark
        executing that plan could be a separate tool, but there's not much point because it involves calling pretty much the same APIs as were needed to build the plan
      • ^7heo
        well, in IT, there are two different schools
      • 1. the UNIX people
      • 2. the rest
      • Spark
        there would be little value in terraform if it just converted a large json file into lots of 'create' api calls for each part of that json file
      • ^7heo
        the UNIX people know the value of filters, because they can plug things like lego
      • the non-UNIX people don't like filters because they require you to plug things like lego.
      • and they prefer to use one tool for everything.
      • Spark
        hashicorp tools are pretty modular though
      • ^7heo
        not in the UNIX sense.
      • I'm not saying you do things wrong.
      • Spark
        well, unix isn't distributed
      • ^7heo
        wat?
      • wait, wat?!
      • how?!
      • Spark
        pipe is for IPC, not communicating with highly available services like consul
      • ^7heo
        high availability is a scam.
      • marketing scam
      • UNIX *IS* highly available from the ground up, from the beginning.
      • it's not because some things have been done wrong that the whole idea is.
      • otherwise you can say that 100% of the software is done wrong.
      • rmenn joined the channel
      • Spark
        don't think unix responds very well to a power cut
      • ^7heo
        okay
      • to keep things at the same scale
      • Spark
        no, that's the point
      • ^7heo
        I don't think terraform responds very well to a cloud-provider cut.
      • and yes, that's the point.
      • Spark
        it needs to handle a zone failure
      • ^7heo
        DNS zone?
      • Spark
        i'm not sure it has the required features for that right now
      • no, availability zone
      • ^7heo
        "it"?
      • UNIX?
      • Spark
        terraform
      • ^7heo
        ah
      • yeah, thanks for clearing that up :)
      • Spark
        cloud provider SLAs are defined in terms of availability zones
      • ^7heo
        yeah
      • nah but I'm not saying that the marketing bs is your fault
      • not at all
      • you're merely making something shitty a bit more unix.
      • Spark
        I don't think you can easily tell Terraform "temporarily ignore all resources in a particular zone"
      • ^7heo
        but anyway
      • I don't have the experience to convince you
      • Spark
        but if you could do that, you could absolutely continue to manage infrastructure during a zone outage
      • _jac joined the channel
      • KenDawg2 has quit
      • ^7heo
        look, I totally like what terraform is doing, but not how.
      • so yeah, in a nutshell, I miss the UNIX aspect.
      • and yes, I think UNIX is highly available.
      • It's missing stuff
      • old sometimes
      • like sysvinit.
      • but that's no reason to replace it by broken shit like systemd (I'm not saying that terraform is broken shit - don't get me wrong)
      • I'm just saying that when you replace software, you have much higher responsibility than when you modify software.
      • and terraform is replacing software: diff is a/dev/null b/terraform
      • catsby
        I appreciate the passion being displayed here, but please try to convey it without profanity
      • ^7heo
        so you define the first thing to be there.
      • catsby: yeah, sorry v_v
      • at least I'm interested in the matter.
      • I'll keep the swearing down.
      • and just to answer < Spark> pipe is for IPC
      • yeah. And when you use an API, what do you do? IPC.
      • ryanuber joined the channel
      • the whole cloud thing can just be summarized in one sentence: add 1% of features, rename it.
      • like docker.
      • Spark
        you are ranting now :)
      • ^7heo
        yeah.
      • bad day, I told ya.
      • sorry, I grow older than expected.
      • not wiser, just older.
      • delianides joined the channel
      • blackjid
        hi! where can I view the terraform generated graphs??? can I view beautifully rendered graphs with some app?
      • phinze
        blackjid: hi! terraform generates graphviz DOT format, which can be piped into the graphviz command line tool to generate images in various formats - there are details on the doc page for `terraform graph` http://www.terraform.io/docs/commands/graph.html
      • blackjid
        sorry about the question, I did remember that it was in the docs, but I couldn't find it....
      • Is there any javascript/svg library you know where I can view the graphs, besides graphviz?
      • Spark
        sounds like somebody is making a web frontend to terraform :)