I'd prefer as little formal definition as possible and simple mathematics.

Summary: The upper bound of the complexity of an algorithm. See also the similar question Big O, how do you calculate/approximate it? for a good explanation.


The other answers are quite good; just one detail to understand: O(log n) or similar means that it depends on the "length" or "size" of the input, not on the value itself. This can be hard to understand, but is very important. For example, this happens when your algorithm is splitting things in two in each iteration.
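A classic instance of that "split in two each iteration" behavior is binary search: the remaining range halves every step, so the loop runs at most about log2(n) times. A minimal sketch:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the remaining search range, so for n items
    the loop runs at most about log2(n) times -- O(log n).
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # discard the lower half
        else:
            hi = mid - 1   # discard the upper half
    return -1
```

Note that doubling the list length adds only one more iteration, which is exactly the "depends on the size, not the value" point.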


There is a lecture dedicated to the complexity of algorithms in Lecture 8 of the MIT "Introduction to Computer Science and Programming" course: youtube.com/watch?v=ewd7Lf2dr5Q It is not completely plain English, but it gives a nice explanation with examples that are easily understandable.


Big O is an estimate of the worst case performance of a function assuming the algorithm will perform the maximum number of iterations.


Big-O notation explained by a self-taught programmer


While the other answers focus on explaining the differences between O(1), O(n^2) et al., yours is the one which details how algorithms can get classified into n^2, n log(n), etc. +1 for a good answer that helped me understand Big O notation as well.


One might want to add that big-O represents an upper bound (given by an algorithm), big-Omega gives a lower bound (usually given as a proof independent of a specific algorithm), and big-Theta means that an "optimal" algorithm reaching that lower bound is known.
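Those three bounds can also be written symbolically; this is one standard formulation (wording and conventions vary slightly between textbooks):

```latex
f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \le c \cdot g(n)
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \ge c \cdot g(n)
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \ \text{and} \ f(n) \in \Omega(g(n))
```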


This is good if you're looking for the longest answer, but not for the answer that best explains Big-O in a simple manner.


-1: This is blatantly wrong: "BigOh is relative representation of complexity of algorithm". No. BigOh is an asymptotic upper bound and exists quite well independently of computer science. "O(n) is linear"? No, you are confusing BigOh with Theta: log n is O(n), and 1 is O(n). The number of upvotes to this answer (and the comments), which makes the basic mistake of confusing Theta with BigOh, is quite embarrassing...


"By the time you get to 200 towns there isn't enough time left in the universe to solve the problem with traditional computers." When is the universe going to end?


What does this definition mean exactly? ("The number of items is still increasing by a factor of 10, but the scaling factor of O(1) is always 1.")


Not seconds, operations. Also, you missed out on factorial and logarithmic time.
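To see how the operation counts (not seconds) diverge, one can tabulate a few common complexity classes, including the logarithmic and factorial cases mentioned here. The counts below are idealized step counts for illustration, not measured times:

```python
import math

def op_counts(n):
    # Idealized step counts for common complexity classes at input size n.
    return {
        "O(1)": 1,
        "O(log n)": math.log2(n),
        "O(n)": n,
        "O(n log n)": n * math.log2(n),
        "O(n^2)": n ** 2,
        "O(n!)": math.factorial(n),
    }

for n in (4, 8, 16):
    print(n, {name: round(cost) for name, cost in op_counts(n).items()})
```

Even at n = 16, the factorial row is already in the trillions while the logarithmic row is 4, which is the whole point of comparing classes rather than timings.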


This doesn't explain very well that O(n^2) could be describing an algorithm that runs in precisely 0.01*n^2 + 999999*n + 999999. It's important to know that algorithms are compared using this scale, and that the comparison works when n is "sufficiently large". Python's timsort actually uses insertion sort (worst/average case O(n^2)) for small arrays, due to the fact that insertion sort has a small overhead.
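The point about constants can be made concrete: an "O(n^2)" algorithm with tiny constants can beat an "O(n)" algorithm with a huge constant until n passes a crossover point. The cost functions below are hypothetical, chosen purely for illustration:

```python
def quadratic_cost(n):
    # Hypothetical O(n^2) algorithm with a very small constant factor.
    return 0.01 * n ** 2

def linear_cost(n):
    # Hypothetical O(n) algorithm with a large per-item constant.
    return 1000 * n

# For small n, the quadratic algorithm is cheaper...
assert quadratic_cost(100) < linear_cost(100)

# ...but beyond the crossover (n = 100000 here), the linear one wins,
# and the gap only grows. This is what "sufficiently large n" means.
assert quadratic_cost(1_000_000) > linear_cost(1_000_000)
```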


This answer also confuses big O notation and Theta notation. The function of n that returns 1 for all its inputs (usually simply written as 1) is actually in O(n^2) (even though it is also in O(1)). Similarly, an algorithm that only has to do one step which takes a constant amount of time is also considered to be an O(1) algorithm, but also to be an O(n) and an O(n^2) algorithm. But maybe mathematicians and computer scientists don't agree on the definition :-/.


The O(log n) logarithmic complexity considered in this answer is base 10. Generally the standard is to calculate with base 2. One should keep this fact in mind and not get confused. Also, as mentioned by ChrisCharabaruk, the complexity denotes the number of operations, not seconds.
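The base actually makes no difference to the Big-O class, because logarithms of different bases differ only by a constant factor (log2 n = log10 n / log10 2), and Big-O discards constant factors. A quick numeric check:

```python
import math

# The ratio log2(n) / log10(n) is the constant 1 / log10(2) ~= 3.3219
# for every n, so O(log2 n) and O(log10 n) are the same class.
for n in (10, 1000, 10 ** 9):
    ratio = math.log2(n) / math.log10(n)
    assert abs(ratio - 1 / math.log10(2)) < 1e-9
```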


An excellent mathematical answer, but the OP asked for a plain English answer. This level of mathematical description isn't required to understand the answer, though for people particularly mathematically minded it may be a lot simpler to understand than "plain English". However the OP asked for the latter.


Presumably people other than the OP might have an interest in the answers to this question. Isn't that the guiding principle of the site?


While I can maybe see why people might skim my answer and think it is too mathy (especially the "math is the new plain english" snide remarks, since removed), the original question asks about big-O, which is about functions, so I attempt to be explicit and talk about functions in a way that complements the plain-English intuition. The math here can often be glossed over, or understood with a high school math background. I do feel that people may look at the Math Addenda at the end, though, and assume that is part of the answer, when it is merely there to see what the real math looks like.


This is a fantastic answer; much better IMO than the one with the most votes. The "math" required doesn't go beyond what's needed to understand the expressions in the parentheses after the "O," which no reasonable explanation that uses any examples can avoid.


"f(x) ∈ O(upperbound) means f 'grows no faster than' upperbound": these three simply worded but mathematically proper explanations of big Oh, Theta, and Omega are golden. He described to me in plain English the point that 5 different sources couldn't seem to translate to me without writing complex mathematical expressions. Thanks man! :)
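That "grows no faster than" phrasing can even be checked numerically: f ∈ O(g) means some constant multiple of g eventually dominates f. Taking the hypothetical f(n) = 5n + 3 with witness constants c = 6 and n0 = 3:

```python
def f(n):
    return 5 * n + 3   # a linear function; f is in O(n)

def g(n):
    return n           # the claimed upper bound

# Witness constants c = 6, n0 = 3: for every n >= n0, f(n) <= c * g(n).
# That is exactly what "f grows no faster than g", i.e. f in O(g), means.
assert all(f(n) <= 6 * g(n) for n in range(3, 10_000))
```

Any other pair (c, n0) that makes the inequality hold for all larger n would do just as well; Big-O only asks that at least one exists.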


A hashtable requires an algorithm to run to calculate the index of the actual array (depending on the implementation), whereas an array lookup is just O(1) because it's just an address. But this has nothing to do with the question, just an observation :)


Jon's explanation has very much to do with the question, I think. It's exactly how one could explain it to someone's mum, and she would eventually understand it, I think :) I like the clothes example (in particular the last one, which explains the exponential growth of complexity).


Filip: I'm not talking about addressing an array by index, I'm talking about finding a matching entry in an array. Could you reread the answer and see if that's still unclear?


Filip Ekberg I think you're thinking of a direct-address table, where each index maps to a key directly and hence is O(1); however, I believe Jon is talking about an unsorted array of key/val pairs which you have to search through linearly.


RBT: No, it's not a binary look-up. It can get to the right hash bucket immediately, just based on a transformation from hash code to bucket index. After that, finding the right hash code in the bucket may be linear or it may be a binary search... but by that time you're down to a very small proportion of the total size of the dictionary.
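That two-step lookup can be sketched as a minimal separate-chaining hash table (a toy for illustration only; real implementations differ in resizing, collision strategy, and so on):

```python
class TinyHashTable:
    """Minimal separate-chaining hash table, for illustration only."""

    def __init__(self, num_buckets=16):
        self.buckets = [[] for _ in range(num_buckets)]

    def put(self, key, value):
        # Transform the hash code into a bucket index -- O(1).
        bucket = self.buckets[hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)   # replace existing entry
                return
        bucket.append((key, value))

    def get(self, key):
        # Step 1: jump straight to the right bucket -- O(1).
        bucket = self.buckets[hash(key) % len(self.buckets)]
        # Step 2: scan within the bucket. This is linear, but only over
        # a tiny fraction of the whole table, as described above.
        for k, v in bucket:
            if k == key:
                return v
        raise KeyError(key)
```

Usage: `t = TinyHashTable(); t.put("a", 1); t.get("a")` returns 1.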


Incorrect! For example O(n): if I double the input size, the runtime will be multiplied by a finite non-zero constant. I mean O(n) = O(n + n).


I'm talking about the f in f(n) = O(g(n)), not the g as you seem to understand.


I upvoted, but the last sentence doesn't contribute much I feel. We don't often talk about "bits" when discussing or measuring Big(O).


You should add an example for O(n log n).


That's not so clear; essentially it behaves a little worse than O(n). So if n doubles, the runtime is multiplied by a factor somewhat larger than 2.
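That "somewhat larger than 2" falls straight out of the cost function: if T(n) = n·log2 n, then T(2n)/T(n) = 2·(log2 n + 1)/log2 n, which is slightly above 2 and approaches 2 as n grows. A quick check:

```python
import math

def nlogn(n):
    # Idealized O(n log n) cost function.
    return n * math.log2(n)

# Doubling n multiplies an n*log(n) cost by a factor slightly above 2,
# and the factor shrinks toward 2 as n gets larger.
r_small = nlogn(2_000) / nlogn(1_000)          # ~2.20
r_large = nlogn(2_000_000) / nlogn(1_000_000)  # ~2.10
assert 2 < r_large < r_small < 2.5
```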


cdiggins, what if I have O(N/2) complexity? Should it be O(N) or O(N/2)? For example, what is the complexity if I loop over half a string?


Melad This is an example of a constant (0.5) being multiplied by the function. This is ignored, as it is not considered to have a meaningful effect for very large values of N.
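Looping over half a string is a good concrete case: the loop does N/2 iterations, but the 1/2 is a constant factor that Big-O drops, so the class is still O(N). A sketch (the vowel-counting task is just a placeholder):

```python
def count_vowels_in_first_half(s):
    # Runs len(s) // 2 iterations: the work is N/2, but since 1/2 is a
    # constant factor, the complexity class is still O(N). Doubling the
    # string still doubles the work.
    return sum(1 for ch in s[: len(s) // 2] if ch in "aeiou")
```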


For log N we consider a for loop running from 0 to N/2, but what about O(log log N)? I mean, what does such a program look like? Pardon me for my poor math skills.
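One common shape for an O(log log N) loop (a textbook example, not the only one) is a counter that squares itself each iteration: squaring doubles the exponent, so reaching N takes about log2(log2 N) steps.

```python
def loglog_steps(n):
    """Count iterations of a loop whose counter squares each time.

    i goes 2 -> 4 -> 16 -> 256 -> 65536 -> ...: the exponent doubles
    per step, so the loop runs about log2(log2(n)) times -- O(log log n).
    """
    steps = 0
    i = 2
    while i < n:
        i = i * i
        steps += 1
    return steps
```

For example, reaching 2**32 takes only 5 iterations, and reaching 2**64 takes 6.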


Could you please consider a different analogy for O(1)? The engineer in me wants to pull out a discussion about RF impedance due to obstructions.


I did use the word "somewhat" for good reason.


It's not about space. It's about complexity, which means time.


I have always believed it can be about time OR space, but not about both at the same time.


Complexity most definitely can be about space. Have a look at this: en.wikipedia.org/wiki/PSPACE


This answer is the most "plain" one here. The previous ones actually assume readers know enough to understand them, but the writers are not aware of it. They think theirs are simple and plain, which they absolutely are not. Writing a lot of text with pretty formatting and making fancy artificial examples that are hard for non-CS people is not plain and simple; it is just attractive to stackoverflowers, who are mostly CS people, to upvote. Explaining a CS term in plain English requires no code or math at all. +1 for this answer, though it is still not good enough.


This answer makes the (common) error of assuming that f=O(g) means that f and g are proportional.


I've also heard the term linearithmic for O(n log n), which would be considered good.


while (size-->0) I hope this won't be asked again.


We can use BigO notation to measure the worst case and average case as well. en.wikipedia.org/wiki/Big_O_notation


Being asked to explain something mathematical without mathematics is always a personal challenge to me, as a bona fide Ph.D. mathematician and teacher who believes that such a thing is actually possible. And being a programmer as well, I hope that no one minds that I found answering this particular question, without mathematics, to be a challenge that was completely irresistible.


Where does your constant 4 come from?


Rod iterator init, iterator comparison, iterator read, key comparison.. I think C would be better


You are stating "Big O" and "small o" without explaining what they are, introducing lots of mathematical concepts without telling why they are important, and the link to Wikipedia may in this case be too obvious for this kind of question.


AditSaxena What do you mean, "without explaining what they are"? I explained exactly what they are. That is, "big O" and "small o" are nothing by themselves; only a formula like "f(x) = O(g(x))" has a meaning, which I explained (in plain English, but without defining, of course, all the necessary things from a calculus course). Sometimes "O(f(x))" is viewed as the class (actually the set) of all the functions "g(x)" such that "g(x) = O(f(x))", but this is an extra step, which is not necessary for understanding the basics.


Well, OK, there are words that are not plain English, but that is inevitable, unless I were to include all the necessary definitions from mathematical analysis.


Hi #Alexey, please have a look at the accepted answer: it is long, but it is well constructed and well formatted. It starts with a simple definition with no mathematical background needed. While doing so, he introduces three "technical" words which he explains immediately (relative, representation, complexity). This goes on step by step while digging into this field.


Big O is used for understanding the asymptotic behavior of algorithms for the same reason it is used for understanding the asymptotic behavior of functions (asymptotic behavior is the behavior near infinity). It is a convenient notation for comparing a complicated function (the actual time or space the algorithm takes) to simple ones (anything simple, usually a power function) near infinity, or near anything else. I only explained what it is (gave the definition). How to compute with big O is a different story; maybe I'll add some examples, since you are interested.


William ...and people tend to die of old age, species go extinct, planets turn barren etc.


- Big-O for Eight Year Olds?
- What is Big O notation? Do you use it?
- what is the meaning of O(1), O(n), O(n*n) memory?
- Time Complexity of two for loops
- calculating time complexity of a algorithm
- Why is sorting a string O(n log n)?
- What is Big O Notation?
- Confused about big O of n^2 vs 2^n
- Tutorial on space complexity of algorithms
- time, space complexity and O notation problem
- Big O, how do you calculate/approximate it?
- What is tail recursion?
- What is the best algorithm for an overridden System.Object.GetHashCode?
- Difference between Big-O and Little-O Notation
- What are the differences between NP, NP-Complete and NP-Hard?
- What does O(log n) mean exactly?
- Ukkonen's suffix tree algorithm in plain English
- What exactly does big Ө notation represent?
- Plain English explanation of Theta notation?
- What is the optimal algorithm for the game 2048?
