For most traders, minute data provides the best balance of granularity, data manageability, and ability to execute trades. As noted above, storage size is a challenge with second-level data (although the US stock minute bundle is 50-60GB, not 190GB). More data can also mean slower query and computation speeds.
In addition, QuantRocket’s backtesters pair better with minute data: minute bars are the finest granularity Zipline supports, and Moonshot relies on cron for scheduling, which only resolves to the minute.
Second-level data is also harder to execute on. Many stocks are illiquid at that granularity, and there’s less margin for error when it comes to getting timely order execution from your broker.
A realistic use case for more granular data might be receiving real-time data over a websocket for a small number of symbols in a custom script and sending orders to the blotter; that could be done today (see the first sketch below). For backtesting, since QuantRocket saves real-time data to a database, you could run real-time data collection for a few weeks or months and use the accumulated data for testing and analysis (see the second sketch below). That won’t give you deep backtest history, but usually, the higher frequency your trading strategy, the less useful backtesting is anyway, and the more you have to try it in live execution to see if it works.
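Here’s a minimal sketch of the custom-script approach. The feed URL, message format, symbol-to-sid mapping, account, and the toy price trigger are all placeholders to adapt to your vendor and accounts; the order itself goes through the blotter’s standard place_orders function:

```python
# Hypothetical sketch: stream ticks for a few symbols over a websocket
# and send orders to the QuantRocket blotter. The feed URL and message
# format are placeholders for whatever data vendor you use.
import json
import websocket  # pip install websocket-client
from quantrocket.blotter import place_orders

FEED_URL = "wss://example-vendor.com/stream"    # placeholder feed
SYMBOLS = {"AAPL": "FIBBG000B9XRY4"}            # symbol -> sid (example)

def on_message(ws, message):
    # assumed message format: {"symbol": ..., "price": ...}
    tick = json.loads(message)
    symbol, price = tick["symbol"], tick["price"]
    # toy signal for illustration; a real script would track state
    # to avoid sending duplicate orders on every tick
    if symbol in SYMBOLS and price < 170:
        place_orders(orders=[{
            "Sid": SYMBOLS[symbol],
            "Account": "DU12345",   # your brokerage account
            "Action": "BUY",
            "OrderType": "MKT",
            "TotalQuantity": 100,
            "Tif": "DAY",
            "Exchange": "SMART",
        }])

ws = websocket.WebSocketApp(FEED_URL, on_message=on_message)
ws.run_forever()
```

The point is that your script, not the backtester, owns the event loop, while the blotter handles order routing and tracking as usual.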
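And a sketch of the data-collection side, following QuantRocket’s tick database + aggregate database pattern. The database codes, sids, bar size, and field names below are examples; verify the specifics against the docs for your version:

```python
# Hedged sketch: collect real-time ticks into a database, aggregate
# them to 1-second bars, then query the bars later for research.
from quantrocket.realtime import (
    create_ibkr_tick_db, create_agg_db, collect_market_data)
from quantrocket import get_prices

# one-time setup: a tick db for a handful of symbols (example sids),
# plus an aggregate db of 1-second bars derived from it
create_ibkr_tick_db(
    "us-tech-tick",
    sids=["FIBBG000B9XRY4", "FIBBG000BVPV84"],
    fields=["LastPrice", "LastSize"])
create_agg_db(
    "us-tech-tick-1s",
    tick_db_code="us-tech-tick",
    bar_size="1s",
    fields={"LastPrice": ["Open", "High", "Low", "Close"],
            "LastSize": ["Sum"]})

# run this during market hours (e.g. scheduled from your crontab)
collect_market_data("us-tech-tick", until="16:00:00 America/New_York")

# ...weeks later, pull the accumulated bars for testing and analysis
# (aggregate fields are assumed to be named like "LastPriceClose")
prices = get_prices(
    "us-tech-tick-1s",
    start_date="2024-01-02",
    fields=["LastPriceClose", "LastSizeSum"])
```

A few weeks of 1-second bars collected this way is enough for exploratory analysis of a handful of symbols, even if it falls short of a full backtest history.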