Uncertainty is a recurring concern in artificial intelligence (AI) based fault diagnosis due to limited data, measurement noise, and unseen fault scenarios. This digest therefore adopts a probabilistic approach, using Bayesian Neural Networks (BNNs) as a candidate solution that quantifies the uncertainty of its predictions as an additional decision variable alongside prediction accuracy. The approach provides a comprehensive framework for quantifying the uncertainties arising from both inconsistent data and insufficient model knowledge. As a result, it assists system operators in deciding whether to collect data under specific conditions or to re-train the AI model accordingly. Preliminary case studies are demonstrated with data collected from a Gearbox Dynamic Simulator; more extensive validations will be included in the final paper.
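As a minimal sketch (not the paper's implementation), the separation of data-driven (aleatoric) and model-knowledge (epistemic) uncertainty can be illustrated with Monte Carlo dropout, a common approximation to a BNN. The toy network below is untrained, and all layer sizes, names, and the dropout rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: one hidden layer, two outputs per input
# (a predicted mean and a log-variance). Dropout is kept active at
# inference time to approximate sampling from a Bayesian posterior.
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 2))
b2 = np.zeros(2)

def stochastic_forward(x, p_drop=0.5):
    """One stochastic pass: dropout mask ~ Bernoulli(1 - p_drop)."""
    h = np.tanh(x @ W1 + b1)
    mask = rng.random(h.shape) > p_drop          # MC dropout at inference
    h = h * mask / (1.0 - p_drop)                # inverted-dropout scaling
    out = h @ W2 + b2
    mean, log_var = out[:, 0], out[:, 1]
    return mean, np.exp(log_var)

def predict_with_uncertainty(x, n_samples=200):
    """Decompose total uncertainty into epistemic and aleatoric parts."""
    means, variances = zip(*(stochastic_forward(x) for _ in range(n_samples)))
    means, variances = np.stack(means), np.stack(variances)
    epistemic = means.var(axis=0)       # spread across model samples
    aleatoric = variances.mean(axis=0)  # average predicted data noise
    return means.mean(axis=0), epistemic, aleatoric

x = np.array([[0.3]])
pred, epi, ale = predict_with_uncertainty(x)
print(pred, epi, ale)
```

In a diagnosis setting, a large epistemic term would suggest re-training or collecting data for the unfamiliar operating condition, while a large aleatoric term points to noise in the measurements themselves, mirroring the decision support described above.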