Name: Onanymous 2014-12-10 10:27
I was writing some text file processing in Python using generators, you know, the usual stuff: lazily read lines, parse each line, lazily perform some grouping, and so on (roughly the pipeline sketched after this list). Then I realized that I have a problem, two requirements that don't play well with each other:
1. On one hand, I want to process lines lazily, so that at any point I use just as much memory as I need for one line or one group of lines. I don't want to intentionally or accidentally pull the entire file (or any intermediate dataset) into memory.
2. On the other hand, I want to deterministically close the file after I'm done reading it.
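Concretely, the pipeline I mean looks roughly like this (just a sketch; parse_line and the tab-separated format are made up for illustration):

import itertools

def parse_line(line):
    # hypothetical format: one "key<TAB>value" record per line
    key, _, value = line.rstrip('\n').partition('\t')
    return key, value

def records(lines):
    # lazy: pulls one line at a time from the underlying iterator
    return (parse_line(line) for line in lines)

def grouped(recs):
    # also lazy: itertools.groupby yields (key, group_iterator) pairs
    # for runs of consecutive records, never materializing a whole group
    return itertools.groupby(recs, key=lambda rec: rec[0])

So the whole thing is grouped(records(f)), and nothing is actually read until someone iterates over it.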
As a result, I should be careful to force all my lazy stuff before I return from
with open(fname, 'r') as f: ...
And it's really not obvious that something is wrong with code that doesn't do that (apart from getting "I/O operation on closed file" at runtime, of course).
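To make the failure mode concrete, here's a minimal sketch (reusing the made-up parse_line from above; 'data.tsv' is just a placeholder):

def read_records(fname):
    with open(fname, 'r') as f:
        # BUG: this returns a lazy generator; leaving the 'with'
        # block closes f before a single line has been parsed
        return (parse_line(line) for line in f)

# the error shows up only when the consumer pulls the first record:
# ValueError: I/O operation on closed file
for key, value in read_records('data.tsv'):
    print(key, value)

And the obvious alternative just trades one requirement for the other:

def read_records_lazy(fname):
    # turning the function itself into a generator keeps f open
    # while the consumer is still pulling lines...
    with open(fname, 'r') as f:
        for line in f:
            yield parse_line(line)
    # ...but now f closes only when the generator is exhausted,
    # explicitly .close()d, or garbage-collected; if the consumer
    # stops halfway and forgets about it, the close is no longer
    # deterministic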
How does the HASCAL master race solve this problem that looms even more menacingly for them, with laziness being the default and pervasive? Or, as quick googling makes it seem, do they just not stoop to such pleb things as reading files efficiently and closing them at the right time?