Implementing an API Request Limit for Reuters estimates and actuals and Reuters financial statements

Hello everyone,

Over the last couple of days I have been trying to collect fundamentals for my algorithm, and I constantly run into the same error in Flightlog: WARNING ibg1 client 6000 got IB error code 2105: HMDS data farm connection is broken: fundfarm.

I got in touch with the IB support team and uploaded my log files, and they said that this error comes from too many requests per minute. The API request limit is 6.
I guess I might not be the only one running into this problem, so I wanted to ask whether you could implement an API request limit in the collect_reuters_estimates and collect_reuters_financials functions.
A simple argument where someone can enter the number of requests per minute might be enough; in the background, the function would then be executed no more than that many times in one minute. In my case something like this would work:

collect_reuters_estimates(universes='nyse', request_amounts=6)
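
To illustrate what I mean by limiting execution, here is a minimal sketch of the kind of throttling I have in mind. The throttle_calls helper and the max_per_minute value are only my illustration of the proposed behavior, not anything that exists in QuantRocket today:

    import time

    def throttle_calls(func, batches, max_per_minute=6):
        # Call func once per batch, never more than max_per_minute times
        # per minute. This only illustrates the behavior proposed for
        # request_amounts; it is not part of the QuantRocket API.
        min_interval = 60.0 / max_per_minute  # seconds between calls
        for batch in batches:
            start = time.time()
            func(batch)
            elapsed = time.time() - start
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)

Internally, collect_reuters_estimates could do something similar with the requests it sends to IB.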

This might be a great addition to the QuantRocket API.

Best
JD

You can safely ignore that message. It's coming from the blotter, which doesn't do any data collection, so it's just noise. The message indicates a temporary disconnect which recovers by itself. See the related thread: Market data farm connection is broken error

Separately, it's true that IB advertises a stricter rate limit for collecting fundamentals than what QuantRocket observes, but we've never found that rate limit to be enforced. With the IB API, seeing is believing; you can't always go by what's advertised.

Thanks for the intel.

But I didn't receive any fundamental data in my environment. I just kept getting this error message and no data was collected, even after restarting the IB Gateway and the instance that runs QuantRocket.

What am I doing wrong?

The logs from quantrocket.fundamental should contain the clues. Check both the standard logs and the detailed logs.

According to the detailed log, collecting Reuters estimates works fine at first, but after a couple of tickers this happens:

  • license profile for my key gets fetched
  • account balance gets requested
  • gateway status is checked

and after the gateway status check I get, for the first time, the message "WARNING ibg1 client 6000 got IB error code 2105: HMDS data farm connection is broken:fundfarm" from the blotter.
Thereafter no fundamental data is collected, and the only thing I get is the error message already described.

Is it possible to upload the log file to QuantRocket so that you can have a look?

This is what the non-detailed version of the logfile looks like around the time the error occurs:

And this is the detailed version of the log file at the moment the error occurs:

Is it always the same conids that time out? Can you get data for these conids if you request them alone (using the conids param)? If you split your universe in half, do you get the same issue with both halves? What if you split the universe into small parts and manually run the parts to go at the rate IB suggested?

Running those types of scenarios could help pinpoint what triggers the "No response received" errors. As it stands, I can't reproduce the issue directly.
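
If it helps, here is a rough sketch of running the universe in small batches against the conids param. It assumes collect_reuters_estimates can be imported from quantrocket.fundamental and that you already have your universe's conids in a Python list; check the API docs for the exact signature before relying on it:

    import time
    from quantrocket.fundamental import collect_reuters_estimates  # assumed import path

    # conids: the conids in your universe as a Python list
    # (the values below are placeholders)
    conids = [12345, 23456, 34567]

    batch_size = 100  # tune as needed
    for i in range(0, len(conids), batch_size):
        batch = conids[i:i + batch_size]
        collect_reuters_estimates(conids=batch)  # request just this batch
        time.sleep(60)  # pause between batches, per the rate IB suggested

Requesting single conids (batch_size = 1) or halves of the universe is just a matter of changing batch_size or the slicing.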

One thing to note: the "fundfarm is broken" messages are usually followed by "fundfarm is ok" messages, but the latter are suppressed from the standard logs, so you have to look at the detailed logs to see the full picture.

I don't know what happened, but over the weekend the requests for Reuters estimates worked fine. It looks like splitting up the universe might be a solution to my problem: over the weekend there were only 1,200 stocks in the universe I requested estimates for. After updating my universe this morning I'm back at 2,400 stocks, and now nothing happens when I run the request command.
The only two processes that show up in the IB Gateway are Interactive Brokers API Server and API Client. The other processes don't show up at all.
How do I go about splitting my universe into parts of, for example, 1,000 stocks each? Do I have to create different databases using sharding, or are there other ways to split a universe?

You simply need to define the universes in the master database however you like and use them in the fundamental data collection command:

https://www.quantrocket.com/docs/#universe-define-universes
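
For example, here is a rough sketch of splitting one large universe into sub-universes of about 1,000 stocks each. It assumes download_master_file and create_universe work as described in the linked docs (the universe code "nyse" is borrowed from your earlier example), so check the docs for the exact parameters:

    import io
    import pandas as pd
    from quantrocket.master import download_master_file, create_universe  # see linked docs

    # Download the listings for the large universe to an in-memory CSV
    f = io.StringIO()
    download_master_file(f, universes=["nyse"])
    f.seek(0)
    securities = pd.read_csv(f)

    # Create one sub-universe per chunk of 1,000 listings
    chunk_size = 1000
    for i in range(0, len(securities), chunk_size):
        chunk = securities.iloc[i:i + chunk_size]
        outfile = io.StringIO()
        chunk.to_csv(outfile, index=False)
        outfile.seek(0)
        create_universe("nyse-part{0}".format(i // chunk_size + 1), infilepath_or_buffer=outfile)

You can then pass the sub-universe codes to collect_reuters_estimates one at a time.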