In many applications, especially those involving scientific instrumentation data with large experimental errors, linear regression must be carried out in the presence of severe outliers that can adversely affect the results. Robust regression methods exist, but they are far more computationally intensive, which makes them difficult to apply in real-time scenarios. In this work, we resort to graphics processing unit (GPU) computing to perform robust regression in a time-sensitive application. We illustrate the results and the performance gains obtained by parallelizing one of the most common robust regression methods, namely, least median of squares. Although the method has a complexity of O(n³ log n), GPU computing can accelerate it to the point that it becomes usable within the required time frame. In our experiments, the input data come from a plasma diagnostic system installed at the Joint European Torus, the largest fusion experiment in Europe, but the approach can easily be transferred to other applications.
- DENSITY PROFILE MEASUREMENTS
- MICROWAVE REFLECTOMETRY
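To make the least median of squares (LMS) idea concrete, the following is a minimal sequential sketch (not the authors' GPU implementation, and the function name `lms_fit` is our own) of brute-force LMS for simple linear regression: every pair of points defines a candidate line, each candidate is scored by the median of the squared residuals over all points, and the line with the smallest such median wins. With n points there are O(n²) candidate lines and each is scored in O(n log n) via sorting, which yields the O(n³ log n) complexity mentioned in the abstract.

```python
# Hedged illustrative sketch of least median of squares (LMS) regression.
# Brute force: fit a line through every pair of points, score it by the
# median of squared residuals over all points, keep the best line.
import statistics
from itertools import combinations

def lms_fit(xs, ys):
    best = None  # (median squared residual, slope, intercept)
    for i, j in combinations(range(len(xs)), 2):
        if xs[i] == xs[j]:
            continue  # skip vertical candidate lines
        slope = (ys[j] - ys[i]) / (xs[j] - xs[i])
        intercept = ys[i] - slope * xs[i]
        med = statistics.median(
            (y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys)
        )
        if best is None or med < best[0]:
            best = (med, slope, intercept)
    return best[1], best[2]

# Synthetic example: y = 2x + 1 with one severe outlier. Ordinary least
# squares would be dragged toward the outlier; LMS ignores it, since the
# median residual of the true line is unaffected by a single bad point.
xs = [0, 1, 2, 3, 4, 5, 6]
ys = [1, 3, 5, 7, 9, 11, 100]  # last point is a gross outlier
slope, intercept = lms_fit(xs, ys)
print(slope, intercept)  # recovers slope 2, intercept 1
```

The pairwise loop is embarrassingly parallel, each candidate line can be scored independently, which is what makes the method a natural fit for GPU acceleration.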