In computer science, average-case complexity is a measure of the computational resources an algorithm uses, averaged over a distribution of inputs. It often gives a more realistic estimate of an algorithm's practical performance than worst-case complexity, which considers only the maximum amount of resources the algorithm requires on any input and can therefore be overly pessimistic when worst-case inputs occur rarely.
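Formally, if inputs of size n are drawn from a distribution D_n, the average-case running time can be written as the expected running time under that distribution (the notation below is a common convention, assumed here rather than taken from the text):

$$
T_{\text{avg}}(n) \;=\; \mathbb{E}_{x \sim D_n}\!\big[T(x)\big] \;=\; \sum_{x \,:\, |x| = n} \Pr[x]\; T(x),
$$

whereas the worst-case complexity is $T_{\text{worst}}(n) = \max_{|x| = n} T(x)$.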
Calculating the average-case complexity involves weighting the resource usage on each input by the probability of that input occurring. In practice, this means assuming a specific probability distribution over the inputs (often the uniform distribution) and then computing the expected resource usage, using tools from probability theory or empirical statistical analysis. The result is usually expressed in big O notation, as with worst-case complexity.
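As a minimal sketch of such a calculation, the following Python example counts comparisons made by linear search when the target is assumed to be uniformly distributed over the positions of the array; the function and distribution are illustrative assumptions, not part of the text above. Under this assumption the analytic average is (n + 1) / 2 comparisons, compared with n in the worst case, and the empirical estimate should agree.

```python
import random

def linear_search(xs, target):
    """Return the number of comparisons made while scanning for target."""
    comparisons = 0
    for x in xs:
        comparisons += 1
        if x == target:
            break
    return comparisons

# Assumed input distribution: the target is equally likely to be at any
# of the n positions.  The exact average-case comparison count is then
#   sum_{i=1}^{n} (1/n) * i = (n + 1) / 2,
# while the worst case is n comparisons.
n = 1_000
xs = list(range(n))

analytic_average = (n + 1) / 2

# Estimate the same quantity empirically by sampling targets uniformly.
trials = 10_000
empirical_average = sum(
    linear_search(xs, random.randrange(n)) for _ in range(trials)
) / trials

print(f"worst case: {n} comparisons")
print(f"analytic average case: {analytic_average}")
print(f"empirical average case: {empirical_average:.1f}")
```

Both the analytic and the empirical figures grow linearly in n, so under this distribution the average-case complexity of linear search is still O(n), but with roughly half the constant factor of the worst case.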