I added precompressed files to my website to get both compression and a Content-Length header. It works everywhere except the one fucking place I actually need the header: the root index. That makes no sense, because on-the-fly encoding is off, so Caddy should have no reason to strip the Content-Length header
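For reference, a minimal sketch of the kind of setup I mean (the directive names are real Caddy v2 ones; the site address and root path are placeholders, not my actual config):

```caddyfile
example.com {
	root * /srv/www
	file_server {
		# serve index.html.br / index.html.gz when the client
		# accepts them; the file size on disk is known up front,
		# so Caddy can send a Content-Length header
		precompressed br gzip
	}
	# note: no `encode` directive here, i.e. nothing should be
	# compressed on the fly
}
```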
I have yet to find a use case where AI would be a legitimate help for me. The only thing I could think of is creating unit test cases, but at work we just YOLO our projects, so no dice there
Now I know why my website won't verify on Iceshrimp.NET: Caddy won't send a Content-Length header if the response is compressed on the fly, but Iceshrimp.NET uses that header to check whether the response is too large. Super weird behavior on Caddy's end
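The underlying mechanics, as I understand them, sketched in Node with `node:zlib` (the body here is made up for illustration): a server compressing on the fly only knows the final size after compression finishes, so it streams chunks instead of promising a length up front.

```typescript
import { gzipSync } from "node:zlib";

// Hypothetical response body; a server encoding on the fly would be
// compressing this chunk by chunk while already sending headers.
const body = Buffer.from("hello world ".repeat(200));

// Only after the whole body has been compressed is the length known,
// which is why on-the-fly encoding usually means chunked transfer
// encoding instead of a Content-Length header.
const compressed = gzipSync(body);

console.log(`raw: ${body.length} bytes, gzipped: ${compressed.length} bytes`);
```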
It really only needs to get that under control, and it would be perfect. In one project I have to type-cast the recursion away, or the whole VS Code extension server gets stuck after every single change to my code
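As an illustration of what I mean (the type and function names are hypothetical, not from my actual project): a recursive mapped type like this is the kind of thing that can bog the TypeScript language server down, and casting through `unknown` short-circuits the recursive inference so the checker doesn't have to re-expand it on every edit.

```typescript
// A recursive mapped type; the checker expands it lazily, and deep
// inference over types like this can make tsserver grind to a halt.
type DeepReadonly<T> = { readonly [K in keyof T]: DeepReadonly<T[K]> };

// Casting through `unknown` tells the checker the result type outright
// instead of letting it infer through the recursion.
// (Object.freeze is shallow at runtime; the point here is purely the
// type-level cast.)
function freeze<T>(value: T): DeepReadonly<T> {
  return Object.freeze(value) as unknown as DeepReadonly<T>;
}

const cfg = freeze({ a: { b: 1 } });
console.log(cfg.a.b);
```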