I'd be interested to know of applications where RxInfer (or similar approximate variational inference approaches) has been demonstrated to perform much better than competing Bayesian inference approaches -- in the sense of a combined performance, accuracy & maintainability/stability engineering tradeoff -- and also of applications where the approximations used by RxInfer introduce too much approximation error to be useful and other methods are preferred. Examples of commercialised / "industrial scale" applications that are a great fit for the approximations used by RxInfer (and also those applications that are likely to be a poor fit) would be especially convincing!
I'm also curious whether, once a reasonable way to model a problem with RxInfer is found, better results (either faster evaluation or lower approximation error) can be obtained by throwing more hardware at it (CPU, RAM, GPU, etc.). Or, in practice, does inference speed tend not to be a bottleneck, so that when the real issue is too much approximation error the remedy is reformulating the problem / switching to another method (i.e. it requires R&D) / gathering more data to better constrain model fits?
RxInfer is specifically designed for real-time Bayesian inference, which sets it apart from many other approaches. A direct comparison with “competing Bayesian inference methods” is difficult because most of them rely on sampling, which is simply too computationally expensive to run in real time. RxInfer may introduce some approximation error, but sampling-based methods are often too slow to produce any result within the required time frame.
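To make the per-observation cost point concrete, here is a deliberately simple sketch (plain Python, not RxInfer code, with made-up model and noise values) of a streaming conjugate Gaussian update. This constant-time work per incoming observation is the kind of update a message-passing / variational scheme performs, whereas a sampler would have to be rerun over all data seen so far.

```python
import numpy as np

# Toy random-walk state-space model: x_t = x_{t-1} + process noise, y_t = x_t + obs noise.
# Each new observation is absorbed with an O(1) closed-form Gaussian update,
# instead of rerunning a full sampler on the growing dataset.

process_var, obs_var = 0.01, 0.5   # assumed noise variances (illustrative values only)
mean, var = 0.0, 1.0               # prior belief over the hidden state

rng = np.random.default_rng(0)
true_state = 0.0
for t in range(1000):
    true_state += rng.normal(scale=process_var ** 0.5)
    y = true_state + rng.normal(scale=obs_var ** 0.5)

    # Predict step: the hidden state drifts, so uncertainty grows.
    var += process_var
    # Update step: conjugate Gaussian update, constant time per observation.
    gain = var / (var + obs_var)
    mean += gain * (y - mean)
    var *= (1.0 - gain)

print(f"posterior after 1000 observations: mean={mean:.3f}, sd={var ** 0.5:.3f}, true={true_state:.3f}")
```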
For example, in our portfolio we have a comparison with NumPyro on Bayesian linear regression: https://examples.rxinfer.com/categories/basic_examples/bayes.... While this isn’t a real-time signal-processing use case, it still shows that RxInfer can match the quality of NumPyro’s results at a fraction of the computational cost.
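For readers unfamiliar with the sampling-based side of that comparison, a minimal NumPyro Bayesian linear regression looks roughly like the sketch below (assumed priors and synthetic data; not the exact model from the linked notebook). NUTS resamples over the whole dataset every time it is run, which is the main source of the computational cost relative to a message-passing solution.

```python
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(x, y=None):
    # Weakly informative priors on intercept, slope, and noise scale (illustrative choices).
    alpha = numpyro.sample("alpha", dist.Normal(0.0, 10.0))
    beta = numpyro.sample("beta", dist.Normal(0.0, 10.0))
    sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))
    mu = alpha + beta * x
    numpyro.sample("y", dist.Normal(mu, sigma), obs=y)

# Synthetic data: y = 0.5 + 2x + noise.
x = jnp.linspace(0.0, 1.0, 200)
y = 0.5 + 2.0 * x + 0.1 * random.normal(random.PRNGKey(1), (200,))

# Run NUTS over the full dataset.
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), x, y=y)
mcmc.print_summary()
```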