Apple M1 fix status

Hi Brian,

I wanted to know the status of the Apple M1 fix. I bought a nice machine and hope that I can use its power soon.

Peter

I managed to install it on my M1. The main error I get in the log is:

quantrocket_master_1|Mon Dec 20 00:49:53 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (40055088/64) ***
quantrocket_blotter_1|Mon Dec 20 00:49:53 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (42496304/64) ***
quantrocket_realtime_1|Mon Dec 20 00:49:53 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (46182704/64) ***
quantrocket_countdown_1|Mon Dec 20 00:49:53 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (38986032/64) ***
quantrocket_realtime_1|Mon Dec 20 00:49:53 2021 - *** uWSGI listen queue of socket "0.0.0.0:81" (fd: 3) full !!! (39289136/64) ***
quantrocket_zipline_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 6) full !!! (39276848/64) ***
quantrocket_fundamental_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (39153968/64) ***
quantrocket_ibg1_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (39919920/64) ***
quantrocket_moonshot_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 6) full !!! (39121200/64) ***
quantrocket_db_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (38998320/64) ***
quantrocket_flightlog_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (38990128/64) ***
quantrocket_history_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 6) full !!! (39362864/64) ***
quantrocket_license-service_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (39821616/64) ***
quantrocket_account_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (39047472/64) ***
quantrocket_flightlog_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "127.0.0.1:44587" (fd: 3) full !!! (38990128/64) ***
quantrocket_ibgrouter_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (39936304/64) ***
quantrocket_codeload_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 3) full !!! (38973744/64) ***
quantrocket_blotter_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (42496304/64) ***
quantrocket_master_1|Mon Dec 20 00:49:54 2021 - *** uWSGI listen queue of socket "0.0.0.0:80" (fd: 4) full !!! (40055088/64) ***

on all QuantRocket containers.

  1. I managed to ingest the data.
  2. When opening a notebook, I get "Invalid response: 502 Bad Gateway" when assigning the kernel to the notebook.

Happy to help, Brian!

Unfortunately there are roadblocks at present because certain necessary packages for building uWSGI, the web server QuantRocket uses, are not available for arm64.

I have checked, and I can install uWSGI with Homebrew with no issues.
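For reference, here's roughly what I ran to check (assuming Homebrew is already installed on the M1):

# Install uWSGI from Homebrew and confirm the binary runs
brew install uwsgi
uwsgi --version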

I have seen an uptick in these types of errors.

uWSGI listen queue full

Has this been resolved? My current workaround is to completely restart everything (Docker + VS Code).

This issue was resolved via support for arm64 containers in version 2.8.0. If you're seeing an "uptick" in "listen queue full" messages, it doesn't sound like the same issue: the original issue was a constant flood of these messages when running on Apple Silicon (see the timestamps in the originally pasted logs).

That said, this message isn't expected. Can you paste more of the logs to show which container(s) the messages are coming from and how frequently you're seeing them? Are you seeing any adverse consequences from the messages, or just the messages themselves?
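If it helps, something along these lines should capture enough context to paste (a sketch, assuming the standard docker-compose deployment and run from the directory containing your docker-compose.yml):

# Show recent output from all services, prefixed with the container name
docker-compose logs --timestamps --tail=200 | grep "listen queue"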

As far as workarounds go, restarting everything is fine, but it's simpler to just restart the container that has the issue.
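For example (a sketch, with "flightlog" standing in for whichever service is affected):

# Restart only the affected service
docker-compose restart flightlog

# Equivalent, addressing the container by name
docker restart quantrocket_flightlog_1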

Understood. I'll post more detail from the log if this happens again. It hasn't happened since I restarted.

The direct impact to my workflow was an error message saying that I needed a license key to run the IPython cell I was working in. Subsequent calls to get the license profile and set the license were timing out. Only after checking "quantrocket flightlog stream -d" did I notice an error message similar to the one discussed in this thread.

In all fairness, I cannot confidently say they are exactly the same. I apologize for the lack of detail.
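Next time I'll capture the detailed stream to a file so I can paste the exact messages; something like this should do it (just a sketch):

# Stream detailed logs and keep a copy to paste here
quantrocket flightlog stream -d | tee flightlog_detail.txt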