Technological trends and Asymptotic notations

Varad Vinayak Godse
4 min read · Mar 21, 2021

The computing environment began taking shape in the 1940s and has witnessed revolutionary changes since then. Early computers were large and complex machines, but today we hold electronic devices in our hands. The journey to the present hasn’t been drastic but gradual. People like Steve Jobs, who saw compatibility as a necessity, or Bill Gates, who delivered a cost-effective, quality OS, played a crucial role in the growing stage. Countless contributors laid a significant foundation aimed at tighter integration of software and hardware. The basic functioning of a computer rests on mathematics and its properties. Operations on binary digits drove the earliest computers, and the introduction of programming languages improved the connection between user and system. The drive towards better solutions was possible only by creating better mathematical algorithms. The basic resources are time and space, and contributors have focused on reducing both at every stage of development. The progress of software, kernels, compilers, and hardware components depends on the efficiency and structure of the underlying algorithms.
The algorithm that defines a system can be analyzed in its algebraic form.
The rate of growth of the function (defining the algorithm) is a significant factor in tracking its behavior over time. Consider a simple example with two quadratic functions: 17n²+n+2 and 5n²+2. The two functions look different, as they return different values for finite, limited inputs, and their graphs differ as well. But as the input grows towards infinity, the results are similar: the n² term dominates the other parts, and the coefficients stop mattering. The two functions that seemed different behave in the same manner for large values, and this similar behavior makes them fall under the same class or category.
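A quick numeric sketch (my own illustration, not from the article) shows why the two quadratics land in the same class: divide each by n², and both ratios settle to their leading coefficients as n grows, so each function grows as n².

```python
# Both quadratics from the text, divided by n²: the lower-order terms
# fade away, leaving only the leading coefficient. Each grows as n².

def f1(n):
    return 17 * n**2 + n + 2

def f2(n):
    return 5 * n**2 + 2

for n in (10, 1_000, 1_000_000):
    print(n, f1(n) / n**2, f2(n) / n**2)
```

For n = 1,000,000 the ratios are essentially 17 and 5: constant multiples of n², hence the same class.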
The running time varies algebraically and categorizes algorithms based on the rate of growth.
To understand the divisions, it is very important to understand Asymptotic notations.
The asymptotic notations describe the running time of an algorithm as the input tends towards a particular value or limit. A small example to understand them is bubble sort. If we give an already sorted array to a bubble sort function, it takes minimum time to produce the output; the running time grows linearly, providing the best case. If a reverse-ordered array is given to the bubble sort function, the growth is quadratic; the time required is maximum in this case, hence the worst case. The average case occurs when neither of the above two situations takes place.
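The best and worst cases above can be sketched with a small bubble sort that counts comparisons (a minimal illustration of my own; the early-exit flag is what makes the sorted input genuinely linear):

```python
def bubble_sort(arr):
    """Bubble sort with an early-exit flag; returns (sorted list, comparison count)."""
    a = list(arr)
    comparisons = 0
    for i in range(len(a) - 1):
        swapped = False
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps on a full pass: already sorted, stop early
            break
    return a, comparisons

n = 100
_, best = bubble_sort(list(range(n)))           # sorted input: n-1 comparisons (linear)
_, worst = bubble_sort(list(range(n, 0, -1)))   # reversed input: n(n-1)/2 comparisons (quadratic)
print(best, worst)  # 99 4950
```

For n = 100 the sorted input needs one pass (99 comparisons) while the reversed input needs every pass (4950 comparisons), matching the linear and quadratic growth described above.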
The asymptotic notations denote these differences. There are three types:
Big Oh notation
Omega notation
Theta notation
Each type analyses a function through different parameters by applying a different set of rules. The result of every notation is independent of coefficients and constants; it relies only on the changing input variable. The three types provide distinct bounds on a function, and this information helps to analyze and categorize functions accordingly.

The “Big Oh” notation-
Consider two non-negative functions g(n) and f(n), where the input variable n is non-negative as well. The big oh of g(n), written O(g(n)), is defined as the set of all functions f(n) such that f(n) is always smaller than or equal to g(n) multiplied by a constant, for every n greater than or equal to a fixed constant n1.
O(g(n)) = {
f(n) : there exist constants C and n1 such that f(n) <= Cg(n), for n >= n1
}
The condition f(n) <= Cg(n) holds for every n >= n1 and hence provides an upper bound on the rate of growth.
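The definition can be turned into a finite numeric check. In this sketch (the helper name and the constants are my own choices, and checking a finite range is only evidence, not a proof), f(n) = 3n + 5 is O(n) with C = 4 and n1 = 5, since 3n + 5 <= 4n exactly when n >= 5:

```python
def is_upper_bounded(f, g, C, n1, up_to=10_000):
    """Check f(n) <= C*g(n) for every n in [n1, up_to) -- finite evidence, not a proof."""
    return all(f(n) <= C * g(n) for n in range(n1, up_to))

# 3n + 5 <= 4n holds for all n >= 5, so f(n) = 3n + 5 is O(n).
print(is_upper_bounded(lambda n: 3 * n + 5, lambda n: n, C=4, n1=5))  # True

# With C = 3 the inequality 3n + 5 <= 3n never holds.
print(is_upper_bounded(lambda n: 3 * n + 5, lambda n: n, C=3, n1=5))  # False
```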

The “Omega” notation-
The Omega of g(n), written Ω(g(n)), is defined as the set of all functions f(n) such that f(n) is always greater than or equal to g(n) multiplied by a constant, for every n greater than or equal to a fixed constant n1.
Ω(g(n)) = {
f(n) : there exist constants C and n1 such that f(n) >= Cg(n), for n >= n1
}
The condition f(n) >= Cg(n) holds for every n >= n1 and hence provides a lower bound on the rate of growth.

The “Theta” notation-
The Theta of g(n), written Θ(g(n)), is defined as the set of all functions f(n) such that f(n) is always greater than or equal to g(n) multiplied by one constant and smaller than or equal to g(n) multiplied by another constant, for every n greater than or equal to a fixed constant n1.
Θ(g(n)) = {
f(n) : there exist constants C1, C2 and n1 such that C1g(n) <= f(n) <= C2g(n), for n >= n1
}
The condition C1g(n) <= f(n) <= C2g(n) holds for every n >= n1 and hence provides a tight bound on the rate of growth.

Consider an example: f(n) = 5n²+2n+1 and g(n) = n². The constant C = 8 satisfies the condition f(n) <= Cg(n) for n >= 1, giving an upper bound of O(n²). The constant C = 5 satisfies f(n) >= Cg(n) for n >= 1, giving a lower bound of Ω(n²). Taken together, 5g(n) <= f(n) <= 8g(n) for n >= 1, so the tight bound Θ(n²) holds with C1 = 5 and C2 = 8.
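These bounds can be verified numerically over a range (again a sketch of my own, and finite checking is evidence rather than proof; the range limit is arbitrary):

```python
def f(n):
    return 5 * n**2 + 2 * n + 1

def g(n):
    return n**2

# Upper bound:  f(n) <= 8*g(n) for n >= 1  ->  f is O(n²)
# Lower bound:  f(n) >= 5*g(n) for n >= 1  ->  f is Ω(n²)
# Together:     5*g(n) <= f(n) <= 8*g(n)   ->  f is Θ(n²)
assert all(f(n) <= 8 * g(n) for n in range(1, 10_000))
assert all(f(n) >= 5 * g(n) for n in range(1, 10_000))
print("5*g(n) <= f(n) <= 8*g(n) holds for 1 <= n < 10_000")
```

Note that the upper-bound constant must absorb the lower-order terms: at n = 1, f(1) = 8 = 8·g(1), which is why C = 8 (and not, say, 7) works from n1 = 1.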
There are thus three ways to bound this quadratic function, and the three bounds help to understand the behavior needed in many real-life applications.
The enhancements in various technological fields have been immense. The areas of AI, ML, hardware security, encryption, IoT, and web development are seeing extensive integration, which has led to many new real-life needs. Recent technologies aim at creating systems that are simpler and more efficient than previous ones. Even simple processes require complex technical solutions and hence a strong algorithmic background. IT corporations all over the world have worked to find optimal and efficient algorithms, which helps to create the needed, enhanced systems. In such a scenario, analyzing algorithms through well-defined parameters is essential, and the asymptotic notations play a crucial role in satisfying this need. They track the rate of growth through bounded patterns, giving insight into time and space complexities. The mathematical requirements will only grow in the future, as more researchers, mathematicians, and experts create better algorithms to meet the growing needs. The asymptotic notations will hence remain a never-ending set of principles governing algorithms of all types. They have played a role in the past, are playing one in the present, and will play one in the future.
