#influxdb


      • sandeen joined the channel
      • sandeen
        I'm confused by a query result
      • > SELECT mean("value") FROM "mqtt_consumer" WHERE ("topic" = 'openevse/amp') AND time >= 1512540000000ms GROUP BY time(24h) tz('America/Chicago')
      • the time restriction is midnight my time (i.e the start of the day)
      • I'm only 12h into the day, and I'm asking it to "group by time(24h)"
      • ... what does that do? :)
      • or is that invalid?
      • ok this looks like a bug
      • if I give it GROUP BY time(1h) it duly gives me information starting at my requested timestamp, by the hour, up until now
      • if I give it GROUP BY time(21h) it gives me one for 6am yesterday morning and 3am this morning (!)
      • so IOWS, if I do "WHERE time >= (12 hours ago) GROUP BY time(24h)" it /seems/ to include data from /prior/ to 12 hours ago. is that expected?
      • sandeen RTFMs
      • hm yeah still can't explain this
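One plausible explanation for what sandeen is seeing, per InfluxQL's documented behaviour: GROUP BY time() buckets are snapped to preset boundaries derived from the Unix epoch (UTC), not to the WHERE clause's start time. A minimal sketch of that alignment, using the epoch value from the query above (the timestamp and interval come from the conversation; the arithmetic is illustrative):

```shell
# InfluxDB aligns GROUP BY time() buckets to epoch-derived boundaries (UTC),
# so a 24h bucket can begin before the WHERE clause's start time.
ts=1512540000                           # query start: midnight America/Chicago = 06:00 UTC
interval=$((24 * 3600))                 # GROUP BY time(24h) in seconds
bucket_start=$(( ts - ts % interval ))  # epoch-aligned bucket containing ts
echo "$bucket_start"                    # 1512518400 = 2017-12-06 00:00:00 UTC, 6h earlier
```

That 6-hour gap between UTC midnight and America/Chicago midnight would pull data from before the requested start into the first bucket, matching what sandeen observes.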
      • blizzkid joined the channel
      • actually more accurately it's:
      • blizzkid
        Lo all. Any way to do influx -database 'Energylogger' -execute "select first(value) from SDM530A_Total_kWh where time > '$today' and time < '$tomorrow'" directly in influxdb?
      • sandeen
        so IOWS, if I do "WHERE time >= (start of today) GROUP BY time(24h)" it /seems/ to include data from /prior/ to today's start, even though I'd expect the GROUP BY boundary to be at 0:00
      • influx
      • > use Energylogger
      • > select first(value) from SDM530A_Total_kWh where time > '$today' and time < '$tomorrow'
      • with explicit values for those 2 variables I think, right?
      • blizzkid, ^
      • blizzkid
        sandeen: explicit values? Not really, it needs to be today and tomorrow
      • Basically I want to extract today's energy usage from my total_kWh series
      • I could do it in a bash script (the way I'm going now) and re-insert that in to a new series
      • But I'd like to avoid that if possible
      • sandeen
        well "today" isn't a time :)
      • blizzkid
        sandeen: $today is in bash ;-)
      • sandeen
        (I'm also using influxdb and fiddling with finding daily kwh, but with different issues ...)
      • try
      • select first(value) from SDM530A_Total_kWh where time > now() - 1w GROUP BY time(1d)
      • that should give you the starting kwh number for the start of each of the last 7 days. I think.
      • but what i'm fighting with is where the boundary actually lands from that GROUP BY
      • but you can see what the timestamps say :)
      • blizzkid
        That seems to be quite ok
      • sandeen
        (that looks like a cool meter, btw)
      • blizzkid
        the SDM530-modbus is pretty nifty for a very reasonable price
      • sandeen
        SELECT DIFFERNECE(first(value)) from SDM530A_Total_kWh where time > now() - 1w GROUP BY time(1d)
      • might even give you actual kwh per day
      • sorry DIFFERENCE (typo)
      • blizzkid
        I saw the typo and indeed it gives me seemingly correct values
      • sandeen
        cool
      • what I'm struggling with is things like "time > now() - 1d GROUP BY time(1d)"
      • blizzkid
        and indeed they use 00:00:00 for the different days
      • sandeen
        where the boundaries don't line up, I'm not quite sure where values come from
      • blizzkid
        good question
      • but the results are close enough to my own way of calculating it
      • first_value_of_today=$(influx -database 'Energylogger' -execute "select first(value) from SDM530A_Total_kWh where time > '$today' and time < '$tomorrow'" -format csv | cut -d',' -f3 | tail -n 1)
      • last_value_of_today=$(influx -database 'Energylogger' -execute "select last(value) from SDM530A_Total_kWh where time > '$today' and time < '$tomorrow'" -format csv | cut -d ',' -f3 | tail -n 1)
      • in fact, I get the same value for yesterday
      • for today my calculation is slightly lower
      • (mine is 12.4, your select gives me 13.9)
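The final subtraction in blizzkid's bash approach isn't shown; it can be sketched like this (the two readings are made-up placeholders standing in for the influx CLI output of the first()/last() queries above):

```shell
# Hypothetical meter readings standing in for the influx CLI output
first_value_of_today=12345.6
last_value_of_today=12358.0
# kWh used today = last reading of the day minus first reading of the day
kwh_today=$(awk -v f="$first_value_of_today" -v l="$last_value_of_today" \
    'BEGIN { printf "%.1f", l - f }')
echo "$kwh_today"   # 12.4
```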
      • sandeen
        my database contains amps, so I have to convert to watts and integrate to get Wh ... but anyway, what I seem to see is that my last result includes part of the previous day
      • i.e. inflating today's value - I wonder if you see the same thing
      • blizzkid
        Might indeed be the case
      • That's why in bash I do today=$(date +%Y-%m-%d)
      • tomorrow=$(date +%Y-%m-%d -d tomorrow)
      • yesterday=$(date +%Y-%m-%d -d yesterday)
      • twodaysago=$(date +%Y-%m-%d -d "-2 days")
      • And I guess now() - 1day is not equal to today() - 1 day
      • as in you get from 24 hours ago to now instead of start of day until now
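The distinction blizzkid is drawing can be seen with GNU date (assuming GNU coreutils; InfluxQL's now() is documented as the current instant, so now() - 1d is a rolling offset):

```shell
# InfluxQL's now() - 1d: a rolling window starting exactly 24 hours ago
date -u -d '24 hours ago' '+%F %T'
# whereas today=$(date +%Y-%m-%d) pins the window to local midnight
date -d 'today 00:00' '+%F %T'
```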
      • sandeen
        right, so I'm not sure how the boundaries fall. there are a lot of words about this in the documentation, I guess I need to read it more carefully
      • the GROUP BY clause gives built-in boundaries
      • which I think should be at day start for time(1d)
      • you can also set a time zone, in case your times are in UTC, for what it's worth
      • just append i.e. tz('America/Chicago') to the whole thing
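One quick way to see what a stored epoch timestamp means in UTC versus local time, using the epoch value from sandeen's earlier query (assumes GNU date and tzdata):

```shell
# 1512540000000ms from the query above = 1512540000 seconds since the epoch
date -u -d @1512540000 '+%F %T %Z'                    # 2017-12-06 06:00:00 UTC
TZ='America/Chicago' date -d @1512540000 '+%F %T %Z'  # 2017-12-06 00:00:00 CST
```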
      • blizzkid
        hmm, good question. No idea if it's in UTC actually
      • let me check the last points
      • sandeen
        in my case, every day prior to today comes out properly on good boundaries, but the last partial day seems to span the last 24h, rather than the hours so far today
      • if that makes any sense
      • blizzkid
        yep, that's what I'm seemingly experiencing as well
      • sandeen
        (or something) - I'm confused, and had better get back to my real job now ;) have fun :)
      • blizzkid
        oh boy, a select(*) on a series that has been logging every second for a week was not a smart thing to do
      • :)
      • yeah, you too!
      • Lydia_K
        I just did that same thing earlier today, thankfully it only had two days of data in it at the time, but still.
      • blizzkid
        lol. Guess it happens to all of us at some point.
      • sandeen fixes his problem by using INTEGRAL()
      • sandeen now really goes back to real job
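sandeen doesn't paste the final query, but an INTEGRAL()-based version might look like the sketch below. The amps series is the one from his earlier query; the 240 V figure is an assumption used purely for illustration:

```sql
-- Hypothetical sketch: INTEGRAL("value", 1h) over an amps series yields
-- amp-hours; multiplying by an assumed 240 V supply converts to watt-hours.
SELECT INTEGRAL("value", 1h) * 240 FROM "mqtt_consumer"
  WHERE "topic" = 'openevse/amp' AND time >= now() - 1w
  GROUP BY time(1d) tz('America/Chicago')
```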