0 like 0 dislike
14 views
I'm studying algorithms and came across complexity analysis. How should I read the space complexity of an algorithm?
Is it the dependence of the amount of memory used on the size of the input, that is, the total memory consumption of the program relative to the input data, or have I misunderstood something?
A simple example would be great. Thank you.

0 like 0 dislike
Same as with time complexity, where you count the number of operations: here you count the number of bytes used.

Roughly speaking, look at all the variables created in the worst case (don't forget that local variables and function arguments live on the stack and are counted separately for each function call, down the full depth of the call chain).

You get some formula like 10*n + 16*m + n*m/2 + 100500*log n. Then apply asymptotic analysis to it.
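To see why asymptotic analysis lets you drop the smaller terms, here is a rough numeric check (the coefficients are the made-up ones from the formula above, nothing more):

```python
import math

# Made-up formula from the text: 10*n + 16*m + n*m/2 + 100500*log n.
def exact_memory(n, m):
    return 10 * n + 16 * m + n * m // 2 + 100500 * math.log2(n)

# For large n and m the n*m/2 term dominates everything else,
# so the whole formula is O(n*m).
n, m = 10**6, 10**6
print(exact_memory(n, m) / (n * m // 2))  # very close to 1
```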

For example, take the DFS algorithm. There is a graph with n vertices and m edges, given in the form of adjacency lists.

Then you have n lists that together contain m elements. That is, your input data take n*8 + m*(4+8) bytes: n pointers to the heads of the lists, and m elements each containing a value and a pointer. But there is no need to count every byte so thoroughly; you can simply estimate it as n + m.
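As a sketch, building such adjacency lists in Python (the function name is mine, not from the answer; a directed graph is assumed so that the lists hold exactly m elements):

```python
# Build n adjacency lists from m directed edges.
# Input memory: n list heads plus m stored elements, i.e. O(n + m).
def build_adjacency(n, edges):
    adj = [[] for _ in range(n)]  # n empty lists, one per vertex
    for u, v in edges:
        adj[u].append(v)          # one stored element per edge
    return adj
```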

The algorithm requires an array to mark visited vertices: +n to the memory consumption.

The function needs several local variables and parameters. There are only a few of them, and their number does not depend on n or m. With recursion the function may be called n times nested, if you walk through all the vertices in depth. In total the stack for these calls will eat n*C, where C is some constant.

The total space complexity is n + m + n*C, or O(n + m).
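Putting it together, a minimal recursive DFS sketch (assuming the graph is given as adjacency lists over vertices 0..n-1; the names are illustrative):

```python
import sys

# Recursive DFS, illustrating the memory accounting above:
# the adjacency lists are O(n + m), the visited array adds +n,
# and the call stack can grow n frames deep, each of constant size C.
def dfs(adj, start):
    n = len(adj)
    sys.setrecursionlimit(n + 1000)  # recursion may go n levels deep
    visited = [False] * n            # the +n array of marks

    def visit(v):
        visited[v] = True
        for u in adj[v]:
            if not visited[u]:
                visit(u)

    visit(start)
    return visited
```

On a chain graph (0 -> 1 -> 2 -> ...) the recursion reaches the full depth n, which is exactly where the n*C stack term comes from.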

Local variables, unless they are arrays, usually need not be counted at all: they either get multiplied into the number of function calls or simply add a constant increment, which asymptotic analysis ignores.
