Thanks Alexei. Now, my two custom indicators are built on HourlyBars, which I define in OnStrategyStart() as HourlyBars = GetBars(BarType.Time, 3600);
That said, here is the OpenQuant.exe footprint when I start OQ and then run a backtest on 1h + 1min data, with BarFilter enabled and BarFilter.Items[0] = Bars Time(60).
CPU on top (4 cores), memory consumption in the middle, I/O at the bottom.

Backtesting on 1min data for one instrument takes far longer than on 1h bars, but the way OQ's backtesting engine works explains why.
However, memory consumption is clearly huge and almost always increasing, either gradually or in sudden jumps.
I have only two custom indicators, instantiated once on hourly bars, and they have no significant memory footprint of their own: I can see that when I backtest using hourly bars only.
Also, memory is not released after the backtest (I can understand that; it's a design choice).
What I don't get is why OQ performs so much I/O and why it consumes so much memory.
Even with ALL 1min bars (count = 3 million) for this instrument fetched into memory (either all at once at the beginning or progressively), this data is 152 MB as a CSV file and 18 MB zipped. I assume the compression algorithm used in your binary DB brings the on-disk size closer to 18 MB than to 152 MB, but even if the in-memory representation were 500 MB, that doesn't explain why OQ uses only ~100 MB of RAM at the start of the backtest and 1.5 GB by the end.
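For reference, here is a rough back-of-envelope estimate of what 3 million bars should occupy in raw form. The per-bar layout is my assumption (one 64-bit timestamp plus five doubles for OHLCV), not OQ's actual internal format, but it shows the order of magnitude:

```python
# Rough estimate of raw memory for 3 million OHLCV bars.
# Assumed per-bar layout (NOT OQ's actual binary format):
#   1 x 64-bit timestamp + 5 x 64-bit doubles (open, high, low, close, volume)
BYTES_PER_BAR = 8 + 5 * 8  # 48 bytes
bar_count = 3_000_000

raw_mb = bar_count * BYTES_PER_BAR / 1024**2
print(f"~{raw_mb:.0f} MB raw")  # ~137 MB, same order as the 152 MB CSV

# Even allowing a generous 3x factor for .NET object headers and
# collection overhead, the total stays well below 1.5 GB:
overhead_gb = 3 * raw_mb / 1024
print(f"~{overhead_gb:.2f} GB with 3x overhead")
```

So even a pessimistic in-memory estimate lands around 0.4 GB, roughly a quarter of the 1.5 GB observed at the end of the run.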