[1:51:51][@Miblo][Amazing. I think I followed it, but definitely want to re-watch (during and after annotating)]
[1:52:04][@garryjohanson][Forgive me if this information was somehow implicit in today's lecture and I missed it, but is compression ever used for locality wins?]
[1:52:43][@graeme7][Do you know off-hand what kind of compression ratio and decompression speed are needed before it's faster to load compressed data and decompress it, over loading uncompressed data?]
[1:54:01][@nxsy][inttypes.h and PRIu64 are probably what you're supposed to use instead of %llu for 64-bit printfs]
[1:54:38][@ieee754][You still have to uncompress your images for display, right? So do you have plans for using block compression?]
[1:55:16][@garryjohanson][Make data fit in cache]
[1:57:08][@mtsmox][Why is 255 not a good look-back limit for images? You wanted to elaborate on that]
[1:57:47][Blackboard: Choosing your look-back window for image compression]
[1:59:22][@bluespide][Have you ever heard of random seed data compression?]
[1:59:46][@kknewkles][Probably more of Jeff and Fabian's territory, but is there generally such a thing as a compressor that compresses a lot and fast? Would some RAD compressor leave 10% of initial size and be considerably faster than LZ? I guess BINK is hyperfast, isn't it like live decompression or something?]
[2:00:59][@ieee754][I meant block compression that's GPU-supported (BC1-BC7). You upload the compressed image to the GPU, and hardware decodes it when sampled]