nelsibrun

  1. I'm a little stuck understanding time complexity for the best, worst, and average cases. Say I have an algorithm that runs on a data structure in O(1) time most of the time, but every log(k) iterations it has to run an O(n) operation.

     For the best case, do I assume it doesn't run the expensive operation, so we get O(1)? Then for the worst case we'd assume it does, and we get O(n)?

     What about the average case? If we run the algorithm k times, it performs the O(n) operation log(k) times, so the total extra work is something like log(k) * O(n). Averaging this over the k runs gives (log(k) * O(n)) / k; as k tends to infinity this goes to 0, so I guess the average is constant time then? I'm not really sure.

     Also, when we talk about complexity in terms of n, I assume n is the size of the data structure the algorithm operates on, but then how does that relate to running the algorithm k times?

     Sorry if that's a bit of a mess; I'm just trying to wrap my head around it.
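
     For concreteness, here is a small sketch of the kind of situation I mean, using the classic dynamic-array doubling as a stand-in (the name simulate, the doubling rule, and the "work unit" counting are just assumptions for illustration, not my actual structure). Most operations cost O(1), but an occasional rebuild copies all n existing elements, and the measured average cost per operation still seems to settle toward a constant:

     ```python
     # Illustrative sketch: count "work units" for a dynamic-array-style structure.
     # Most appends cost 1 unit (O(1)); when capacity runs out, a rebuild copies
     # all existing elements (O(n)) and the capacity doubles.

     def simulate(total_ops):
         capacity = 1
         size = 0
         work = 0
         for _ in range(total_ops):
             if size == capacity:      # occasional expensive step
                 work += size          # O(n) rebuild: copy every element
                 capacity *= 2
             work += 1                 # the ordinary O(1) step
             size += 1
         return work / total_ops       # average (amortized) cost per operation

     for k in (10, 100, 1_000, 10_000, 100_000):
         print(k, simulate(k))
     ```

     Running this, the average per-operation cost stays bounded by a small constant even as k grows, which is what I understand "amortized O(1)" to mean, even though the worst single operation is O(n).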