Algorithm

    In short, an algorithm is a set of steps that defines a sequence of actions. It may also be described as a set of commands designed to achieve a specific goal or solve a particular problem. Algorithms are mainly used and studied in mathematics and computer science, but they also appear in other contexts, such as biological neural networks and electronic devices.

    In computer science, an algorithm is a sequence of unambiguous instructions that tells a computer program how to perform a task. Algorithms can be designed to execute a simple action, like subtracting two numbers, or more complex operations, like finding the best route between two or more geographic locations. As such, computer algorithms are extremely useful for all sorts of tasks, from calculations and data processing to decision-making.
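    As a minimal sketch of such a sequence of unambiguous instructions (the function name and example values here are our own, chosen for illustration), a short Python function can spell out each step of a simple algorithm that finds the largest number in a list:

```python
def largest(numbers):
    # Step 1: assume the first item is the largest seen so far.
    result = numbers[0]
    # Step 2: compare every remaining item against the current best.
    for n in numbers[1:]:
        if n > result:
            result = n
    # Step 3: return the final answer.
    return result

print(largest([3, 41, 7, 19]))  # prints 41
```

    Each step is precise enough that a computer can follow it without interpretation, which is what distinguishes an algorithm from an informal description of a task.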

    Every algorithm has a defined starting and ending point and produces outputs according to its inputs and predefined steps. Multiple algorithms can be combined to perform more elaborate tasks, but higher complexity also demands more computational resources.
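    The idea of combining algorithms can be sketched as follows (a hypothetical example with names of our own choosing): one algorithm removes duplicates from a list, and a second algorithm reuses it to produce a sorted list of unique values, turning a fixed input into a fixed output:

```python
def deduplicate(items):
    # First algorithm: remove duplicates while preserving order.
    seen = set()
    unique = []
    for item in items:
        if item not in seen:
            seen.add(item)
            unique.append(item)
    return unique

def sort_unique(items):
    # Combined algorithm: deduplicate first, then sort the result.
    return sorted(deduplicate(items))

print(sort_unique([3, 1, 3, 2, 1]))  # prints [1, 2, 3]
```

    The combined algorithm does more work than either piece alone, illustrating how composing steps raises both capability and resource cost.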

    Algorithms can be measured by their correctness and efficiency. Correctness refers to whether the algorithm is accurate and actually solves the problem it was designed for. Efficiency relates to the amount of resources and time an algorithm needs to perform a particular task. Many computer scientists use a mathematical technique known as asymptotic analysis to compare algorithms independently of the programming language or hardware they run on.
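    To make the efficiency comparison concrete, here is a sketch (function names and test data are our own) of two correct search algorithms that differ asymptotically: linear search inspects items one by one, growing in proportion to the input size (O(n)), while binary search halves a sorted list at each step (O(log n)):

```python
def linear_search(items, target):
    # O(n): in the worst case, inspects every element.
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    # O(log n): halves the search range each step; requires sorted input.
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1000))
print(linear_search(data, 737), binary_search(data, 737))  # prints 737 737
```

    Both algorithms are correct, but for a list of a thousand items linear search may make up to a thousand comparisons while binary search needs at most about ten, which is exactly the kind of difference asymptotic analysis captures regardless of the machine running the code.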