mafintosh: hey can you point me at your server that streams movies from dat archives?
(the code)
iml_ joined the channel
rhodey_orbits joined the channel
rhodey_orbits
hey all, I just received a 4TB NAS in the mail today and would like to use dat to share the .WAV files and metadata database from https://radiowitness.io
I'm going to mirror the Amazon S3 bucket to some folder structure, seems that dat handles this simply enough
but I'm not sure the best way to share the metadata databases
SQL dump, or weekly .csv, are there any existing best practices for this?
karissa: I'm thinking of the adaptive streaming example he did
karissa
pfraze: ah
flyingzumwalt joined the channel
pfraze
karissa: do you think dat could handle 4tb right now?
that's a heavy heavy load
rhodey_orbits
the .wav files total about 600GB right now, the metadata is just a couple GB
@karissa, my NAS just happens to be 4TB right now :) don't have that much data yet
karissa
rhodey_orbits: dat should be able to handle 4tb of data it will just take a bit to scan it
rhodey_orbits: a few gigs of csv isn't too bad
rhodey_orbits: but if you're comfortable with SQLite it's nice
Nicer
rhodey_orbits
karissa: cool :) yeah I'm happy with SQLite, I'm just not sure how it stores its DB on disk and whether diffs to the tables would play nicely with dat
karissa
pfraze: since we stopped storing block data in the leveldb directly it handles larger repo sizes much better
pfraze: now it's just using the fs rw
rhodey_orbits: it just stores binary data so think of it as closer to bit torrent sync right now
rhodey_orbits: it's on the roadmap to have better hooks into the history w human readable commits and diffing
pfraze
karissa: well if dat handles a 4TB load, I'll be impressed! That's such a large workload; if you don't manage your RAM usage carefully, it'd be easy to exhaust the system
karissa
rhodey_orbits: but none of that is really available in the cli, desktop app, or browser right now
rhodey_orbits
karissa: ok, great, I'll probably go with sqlite then. any magic number of files per folder to boost performance?
karissa
rhodey_orbits: that's a mafintosh question :)
pfraze: I haven't tried it personally but mafintosh designed hyperdrive with large datasets in mind.
rhodey_orbits
mafintosh: I wanna store 600+GB of .WAV files in dat, there is no inherent folder hierarchy for these files, can you recommend any magic number of files per folder to boost dat performance?
files are very small on average, I got ~8.6M of them right now
mafintosh
pfraze: afk right now. will link you later
pfraze
mafintosh: thanks
karissa
rhodey_orbits: 8.6M files??
rhodey_orbits: do they ever change?
rhodey_orbits: I wonder if you can concatenate them somehow and denote the start/stop time
rhodey_orbits
karissa: lol yeah 8.6M, no they don't ever change, append only for the database too
karissa: ohhhh, 0.o yeah maybe
karissa
rhodey_orbits: dat is cool because you can stream blocks from the middle of a file
rhodey_orbits: so you could in theory have a few larger wav files and skip to the middle if you want to listen to a particular start time
rhodey_orbits
karissa: so I get the sense I'll get a large performance boost if I trade file size for file count?
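[The concatenation idea above works because uncompressed PCM audio has a fixed byte rate, so a start time maps directly to a byte offset. A minimal sketch of that arithmetic, assuming 16-bit mono at 8000 Hz (typical for scanner radio captures, but an assumption, not stated in the chat) and a canonical 44-byte WAV header:]

```javascript
// Sketch: byte offset of a given time within one concatenated PCM WAV file.
// SAMPLE_RATE and BYTES_PER_SAMPLE are assumptions for illustration.
const WAV_HEADER_BYTES = 44  // canonical PCM WAV header size
const SAMPLE_RATE = 8000     // Hz (assumed)
const BYTES_PER_SAMPLE = 2   // 16-bit mono (assumed)

function byteOffset (seconds) {
  return WAV_HEADER_BYTES + Math.floor(seconds * SAMPLE_RATE) * BYTES_PER_SAMPLE
}

console.log(byteOffset(60)) // offset of the 60-second mark
```

[With offsets like these you could request just that byte range from the archive rather than the whole file; the exact ranged-read API differs across hyperdrive versions, so treat any specific call as illustrative.]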