Early detection of transformer faults with high accuracy helps guarantee the continuous operation of power system networks. Dissolved gas analysis (DGA) is a technique used to diagnose transformer faults from the gases dissolved in the insulating oil as a result of the electrical and thermal stresses acting on it. Many attempts have been made to find a technique that correctly diagnoses transformer fault types, such as the Duval triangle method, Rogers' ratio method, and the IEC 60599 standard. In addition, several artificial intelligence, classification, and optimization techniques have been combined with these methods to improve their diagnostic accuracy. In this article, a novel approach is proposed to enhance the diagnostic accuracy of transformer faults by introducing new gas concentration percentage limits and gas ratios that help resolve the overlap between the different fault types. To this end, an optimization model is established that simultaneously optimizes both the gas concentration percentages and the gas ratios so as to maximize the agreement between the diagnosed faults and the actual ones, thereby achieving high diagnostic accuracy. Accordingly, an efficient teaching-learning-based optimization (TLBO) algorithm is developed to solve the optimization model accurately over training datasets (from an Egyptian chemical laboratory and the literature). The proposed TLBO-based approach improves diagnostic accuracy significantly, surpassing several DGA techniques reported in the literature. The robustness of the proposed optimization-based approach is confirmed against measurement uncertainty, where its accuracy is not degraded by the considered uncertainty rates. To prove its efficacy, the proposed approach is compared with five existing approaches on an out-of-sample dataset, where a superior agreement rate is reached for the different fault types.
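As a minimal illustration of the TLBO metaheuristic named above, the sketch below implements its two standard phases (teacher and learner) on a stand-in objective, the sphere function. The function names, population size, and bounds are illustrative assumptions; the paper's actual objective maximizes the agreement between diagnosed and actual transformer faults over the optimized gas concentration limits and ratios, which is not reproduced here.

```python
# Hedged sketch of teaching-learning-based optimization (TLBO).
# The sphere function is a placeholder objective; in the paper's setting
# the decision variables would be the gas concentration limits/ratios and
# the objective would be diagnostic agreement (maximized).
import random

def tlbo(objective, dim, bounds, pop_size=20, iters=100, seed=0):
    """Minimize `objective` over `dim` variables within `bounds` (lo, hi)."""
    rng = random.Random(seed)
    lo, hi = bounds
    clip = lambda v: max(lo, min(hi, v))
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    for _ in range(iters):
        # Teacher phase: shift each learner toward the best solution
        # (the teacher) and away from the population mean.
        teacher = pop[min(range(pop_size), key=fit.__getitem__)]
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i in range(pop_size):
            tf = rng.choice((1, 2))  # teaching factor
            cand = [clip(pop[i][d] + rng.random() * (teacher[d] - tf * mean[d]))
                    for d in range(dim)]
            f = objective(cand)
            if f < fit[i]:          # greedy acceptance
                pop[i], fit[i] = cand, f
        # Learner phase: each learner moves toward a randomly chosen
        # better peer, or away from a worse one.
        for i in range(pop_size):
            j = rng.randrange(pop_size)
            if j == i:
                continue
            sign = 1 if fit[j] < fit[i] else -1
            cand = [clip(pop[i][d] + sign * rng.random() * (pop[j][d] - pop[i][d]))
                    for d in range(dim)]
            f = objective(cand)
            if f < fit[i]:
                pop[i], fit[i] = cand, f
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
best_x, best_f = tlbo(sphere, dim=5, bounds=(-10.0, 10.0))
```

Note that TLBO requires no algorithm-specific tuning parameters beyond population size and iteration count, which is one reason it is attractive for threshold-optimization problems such as the one described in the abstract.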