Sorting functions by asymptotic growth


Suppose I have a list of functions, for example


How can I sort them asymptotically, i.e. according to the relation defined by

$f \leq_O g \iff f \in O(g)$,

assuming they are indeed pairwise comparable (see also here)? Using the definition of $O$ seems awkward, and it is often hard to prove the existence of suitable constants $c$ and $n_0$.

These are complexity measures, so we are interested in the asymptotic behaviour as $n \to +\infty$, and we assume that all functions take only non-negative values ($\forall n.\ f(n) \geq 0$).

Since the OP never returned, I am removing the localised parts and turning this into a reference question.



If you want a rigorous proof, the following lemma is often useful, or rather more useful than the bare definitions.

If $c = \lim_{n\to\infty} \frac{f(n)}{g(n)}$ exists, then

  • $c = 0 \implies f \in o(g)$,
  • $c \in (0, \infty) \implies f \in \Theta(g)$, and
  • $c = \infty \implies f \in \omega(g)$.

With this, you should be able to order most of the functions coming up in algorithm analysis¹. As an exercise, prove it!
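As mechanical intuition for the lemma (not a proof), here is a small pure-Python sketch; the helper `classify`, its sampling points and its thresholds are all my own arbitrary choices, so close calls can be misjudged:

```python
import math

def classify(f, g, ns=(10**3, 10**4, 10**5, 10**6)):
    """Guess c = lim f(n)/g(n) from a few samples and apply the
    three cases of the lemma. Thresholds are arbitrary assumptions."""
    ratios = [f(n) / g(n) for n in ns]
    if ratios[-1] < 1e-3 and ratios[-1] < ratios[0]:
        return "f in o(g)"        # the quotient seems to vanish
    if ratios[-1] > 1e3 and ratios[-1] > ratios[0]:
        return "f in omega(g)"    # the quotient seems to explode
    return "f in Theta(g)"        # c apparently in (0, infinity)

print(classify(lambda n: math.log(n), lambda n: n))  # f in o(g)
print(classify(lambda n: 3 * n + 7, lambda n: n))    # f in Theta(g)
```

This only automates the case distinction; proving the limit itself is still on you.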

Of course you have to be able to calculate the limits accordingly. Some useful tricks to break complicated functions down to basic ones are:

  • Express both functions as $e^{\dots}$ and compare the exponents; if their ratio tends to $0$ or $\infty$, so does the original quotient.
  • More generally: if you have a convex, continuously differentiable and strictly increasing function $h$ so that you can re-write your quotient as

    $\frac{f(n)}{g(n)} = \frac{h(f^*(n))}{h(g^*(n))}$

    with $g^* \in \Omega(1)$ and

    $f^*(n) - g^*(n) \to \infty$ (for $n \to \infty$),

    then $\frac{f(n)}{g(n)} \to \infty$.

    See here for a rigorous proof of this rule (in German).

  • Consider continuations of your functions over the reals. You can now use L'Hôpital's rule; be mindful of its conditions²!

  • Have a look at the discrete equivalent, Stolz–Cesàro.
  • When factorials pop up, use Stirling's formula:

    $n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n$.
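A quick numeric sanity check of Stirling's formula (a sketch for intuition only; the ratio is known to behave like $1 + \frac{1}{12n}$):

```python
import math

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n,
# so the ratio n! / stirling(n) should tend to 1.
def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    print(n, math.factorial(n) / stirling(n))
```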
It is also useful to keep a pool of basic relations you prove once and use often, such as:

  • logarithms grow more slowly than polynomials, i.e.

    $\log^\alpha n \in o(n^\beta)$ for all $\alpha, \beta > 0$.
  • order of polynomials:

    $n^\alpha \in o(n^\beta)$ for all $\alpha < \beta$.

  • polynomials grow slower than exponentials:

    $n^\alpha \in o(c^n)$ for all $\alpha$ and $c > 1$.
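The relations in this pool can also be eyeballed numerically; a small illustration (my own choice of exponents and sample points, and no substitute for proving them once):

```python
import math

# log^3(n) / n^0.5 -> 0, but notice how slowly it shrinks.
log_vs_poly = [math.log(n) ** 3 / n ** 0.5 for n in (10**2, 10**6, 10**12)]

# n^10 / 1.1^n -> 0 much faster, even for a base close to 1.
poly_vs_exp = [n ** 10 / 1.1 ** n for n in (100, 500, 1000)]

print(log_vs_poly)
print(poly_vs_exp)
```

The first sequence is a good reminder that "eventually smaller" can kick in very late, which is exactly why plugging in small numbers is misleading.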

It can happen that the above lemma is not applicable because the limit does not exist (e.g. when the functions oscillate). In this case, consider the following characterisation of the Landau classes using limes superior/inferior:

With $c_s := \limsup_{n\to\infty} \frac{f(n)}{g(n)}$ we have

  • $0 \leq c_s < \infty \implies f \in O(g)$ and
  • $c_s = 0 \implies f \in o(g)$.

With $c_i := \liminf_{n\to\infty} \frac{f(n)}{g(n)}$ we have

  • $0 < c_i \leq \infty \implies f \in \Omega(g)$ and
  • $c_i = \infty \implies f \in \omega(g)$.

Furthermore,

  • $0 < c_i, c_s < \infty \implies f \in \Theta(g)$ and $g \in \Theta(f)$, and
  • $c_i = c_s = 1 \implies f \sim g$.
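A concrete oscillating example, estimated numerically over a sample window (a sketch; the exact functions are my own choice):

```python
import math

# f(n) = n * (2 + sin n) against g(n) = n: the plain limit of
# f/g does not exist, but limsup = 3 and liminf = 1, so the
# characterisation above still yields f in Theta(g).
def f(n): return n * (2 + math.sin(n))
def g(n): return n

ratios = [f(n) / g(n) for n in range(1000, 2000)]
print(max(ratios), min(ratios))  # close to 3 and close to 1
```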

Check here and here if you are confused by my notation.

¹ Nota bene: My colleague wrote a Mathematica function that does this successfully for many functions, so the lemma really reduces the task to mechanical computation.

² See also here.

@Juho Not publicly, afaik, but it's elementary to write yourself; compute Limit[f[n]/g[n], n -> Infinity] and perform a case distinction.


Another tip: sometimes applying a monotone function (like log or exp) to the functions makes things clearer.

This should be done carefully: $2n \in O(n)$, but $2^{2n} \notin O(2^n)$.

Seconded. The "apply monotone function" thing seems to be some kind of folklore which does not work in general. We have been working on sufficient criteria and have been come up with what I posted in the latest revision of my answer.


Skiena provides a sorted list of the dominance relations between the most common functions in his book, The Algorithm Design Manual:


Here α(n) denotes the inverse Ackermann function.

That's an oddly specific list. Many of the relations (whatever $\gg$ means exactly) can be summarised into a handful of more general lemmata.

It's his notation for a dominance relation.
Robert S. Barnes


Tip: draw graphs of these functions using something like Wolfram Alpha to get a feeling for how they grow. Note that this is not very precise, but if you try it for sufficiently large numbers, you should see the comparative patterns of growth. This of course is no substitute for a proof.

E.g., try: plot log(log(n)) from 1 to 10000 to see an individual graph or plot log(log(n)) and plot log(n) from 1 to 10000 to see a comparison.
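If you have no plotting tool at hand, a plain table of values gives the same (equally proof-free) intuition; a stdlib-only sketch with my own choice of functions and sample points:

```python
import math

# Tabulate a few functions at growing n to eyeball comparative growth.
funcs = {
    "log log n": lambda n: math.log(math.log(n)),
    "log n": lambda n: math.log(n),
    "sqrt n": lambda n: math.sqrt(n),
    "n": lambda n: n,
}
for n in (10**2, 10**4, 10**8):
    row = ", ".join(f"{name}={f(n):.3g}" for name, f in funcs.items())
    print(f"n={n}: {row}")
```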

Should we really recommend voodoo?

+1 for suggesting to draw graphs of the functions, although the linked graphs are rather confusing unless you know what they mean.
Tsuyoshi Ito

Take a graph as a hint what you might want to prove. That hint may be wrong of course.


I suggest proceeding according to the definitions of the various notations. Start with some arbitrary pair of expressions, and determine their order as outlined below. Then, for each additional element, find its position in the sorted list using binary search, comparing as below. So, for example, let's sort $n^{\log\log n}$ and $2^n$, the first two functions of $n$, to get the list started.

We use the property that $n = 2^{\log n}$ to rewrite the first expression as $n^{\log\log n} = (2^{\log n})^{\log\log n} = 2^{\log n \log\log n}$. We could then proceed to use the definition to show that $n^{\log\log n} = 2^{\log n \log\log n} \in o(2^n)$, since for any constant $c > 0$, there is an $n_0$ such that for $n \geq n_0$, $c \cdot n^{\log\log n} = c \cdot 2^{\log n \log\log n} < 2^n$.

Next, we try $3^n$. We compare it to $2^n$, the largest element we have placed so far. Since $3^n = (2^{\log 3})^n = 2^{n \log 3}$, we similarly show that $2^n \in o(3^n) = o(2^{n \log 3})$.
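The binary-search insertion described above can be sketched in a few lines of Python. The comparator here merely samples the quotient at one large $n$ as a stand-in for a real proof of $f \in o(g)$, and the threshold is an arbitrary assumption, so treat it as a toy:

```python
import math

def grows_slower(f, g, n=10**6):
    # Heuristic stand-in for proving f in o(g): sample f/g once.
    # The threshold 0.5 is arbitrary and can misjudge close calls.
    return f(n) / g(n) < 0.5

def insert_sorted(ordered, item):
    # Binary search for the position of `item` among the
    # already-placed functions, as the answer suggests.
    lo, hi = 0, len(ordered)
    while lo < hi:
        mid = (lo + hi) // 2
        if grows_slower(item[1], ordered[mid][1]):
            hi = mid
        else:
            lo = mid + 1
    ordered.insert(lo, item)

# Hypothetical input list; the names are just labels.
funcs = [
    ("n^2", lambda n: n ** 2),
    ("log n", lambda n: math.log(n)),
    ("n", lambda n: n),
    ("n log n", lambda n: n * math.log(n)),
]
ordered = []
for item in funcs:
    insert_sorted(ordered, item)
print([name for name, _ in ordered])  # ['log n', 'n', 'n log n', 'n^2']
```

Each new function costs $O(\log k)$ comparisons against the $k$ functions already placed, which is the point of the binary-search approach.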



Here is a list from Wikipedia; the lower an entry appears in the table, the bigger its complexity class:

  • Constant time: $O(1)$
  • Inverse Ackermann time: $O(\alpha(n))$
  • Iterated logarithmic time: $O(\log^* n)$
  • Log-logarithmic time: $O(\log\log n)$
  • Logarithmic time: $O(\log n)$
  • Polylogarithmic time: $\mathrm{poly}(\log n)$
  • Fractional power: $O(n^c)$, where $0 < c < 1$
  • Linear time: $O(n)$
  • "n log star n" time: $O(n \log^* n)$
  • Quasilinear time: $O(n \log n)$
  • Quadratic time: $O(n^2)$
  • Cubic time: $O(n^3)$
  • Polynomial time: $\mathrm{poly}(n) = 2^{O(\log n)}$
  • Quasi-polynomial time: $2^{\mathrm{poly}(\log n)}$
  • Sub-exponential time (first definition): $O(2^{n^\epsilon})$ for all $\epsilon > 0$
  • Sub-exponential time (second definition): $2^{o(n)}$
  • Exponential time (with linear exponent): $2^{O(n)}$
  • Exponential time: $2^{\mathrm{poly}(n)}$
  • Factorial time: $O(n!)$

Note: $\mathrm{poly}(x) = x^{O(1)}$.
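Evaluating a representative of each class at a single sample point makes the ordering in the list concrete (an illustration with my own choice of representatives and $n$, not a proof):

```python
import math

n = 50
values = {
    "O(1)": 1,
    "O(log n)": math.log(n),
    "O(n)": n,
    "O(n log n)": n * math.log(n),
    "O(n^2)": n ** 2,
    "2^O(n)": 2 ** n,
    "O(n!)": math.factorial(n),
}
# Printed top to bottom, the values grow strictly, mirroring the table.
for name, v in values.items():
    print(f"{name:10s} {v:.3g}")
```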

Also, interesting how the table suggests that $2^{n \log n} \in o(n!)$. While the table you link to is somewhat accurate, the one linked there (and which you copied) is about complexity classes, which is not a helpful thing to mix in here. Landau notation is not about "time".

I put this here so the names of the complexity classes can be referred to directly. Yes, Landau is more about a specific type of algorithm in Cryptography.

I object to some of @Raphael's views. I have been a mathematician and an instructor for many years. I believe, apart from proving those things, a big table like this increases people's intuition easily and greatly. And the names of the asymptotic classes help people remember and communicate a lot.

@Apass.Jack In my teaching experience, when given a table many students will learn it by heart and fail to order w.r.t. any function not in the table. Note how that effect seems to account for many of the questions regarding asymptotic growth that land on this site. That said, of course we'll use lemmata implied by the table if it makes proofs easier, but that comes after learning how to prove the table. (To emphasize that point, people who come here don't need help reading stuff off a table. They need help proving relations.)

@kelalaka "Yes, Landau is more about a specific type of algorithm in Cryptography." -- that doesn't even make sense. Landau notation is a shorthand to describe properties of mathematical functions. It has nothing to do with algorithms let alone cryptography, per se.
Licensed under cc by-sa 3.0 with attribution required.