I’m currently working on a build tool that does caching based on the last-modified timestamps of files. And yeah, man, I was prepared for a world of pain, where I’d have to store a list of all files so I could tell when one of them disappeared.
I probably would’ve also had to make up some non-existent last-modified timestamp to pretend I knew when that file got deleted. I figured there’s no way to ask the deleted file when it got deleted, because it doesn’t exist anymore.
Thank you to whoever had the smart idea to design it like that: a directory’s last-modified timestamp updates when an entry inside it gets deleted. I can just take the directory’s last-modified timestamp now, if it’s the highest value.
In fact, my implementation accidentally does this correctly already. That’s how I found out. 🫠
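For the record, here’s a tiny throwaway Python demo of that behaviour (not the tool’s actual code): deleting a file bumps the parent directory’s mtime, even though the deleted file itself can’t be asked anymore.

```python
import os
import tempfile
import time

# Throwaway demo: deleting a file bumps the parent directory's mtime,
# so the deletion is visible even though the deleted file itself can
# no longer be stat'ed.
d = tempfile.mkdtemp()
f = os.path.join(d, "a.txt")
open(f, "w").close()

before = os.stat(d).st_mtime
time.sleep(0.05)  # filesystems with coarse timestamp resolution may need a longer pause
os.remove(f)
after = os.stat(d).st_mtime

print(after > before)  # True on common filesystems (ext4, APFS, NTFS)
```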
Yeah, good point. The directory’s last-modified timestamp also doesn’t update when the content of a file inside it changes. So, in order to detect a change in a directory, you have to walk all the files and sub-directories, plus the directory itself, and get the last-modified timestamp for each of them. Then determine the highest last-modified timestamp and compare it to what you measured in a previous run. If they differ, a change happened.
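Roughly what that looks like in Python — a sketch with my own names (`tree_mtime`, the cached value), not the build tool’s actual code:

```python
import os

def tree_mtime(root: str) -> float:
    """Highest last-modified timestamp across a directory tree.

    Stats every file, every sub-directory, and the root itself, because a
    directory's mtime only changes when entries appear or disappear, not
    when a file's content is edited in place.
    """
    newest = os.stat(root).st_mtime
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            newest = max(newest, os.stat(os.path.join(dirpath, name)).st_mtime)
    return newest

# Hypothetical usage: compare against the value recorded in a previous run.
# if tree_mtime("src") > cached_mtime:
#     rebuild()
```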