An algorithm is a sequence of instructions for completing a task. The order of the sequence is significant. In computing, algorithms tell processors what to do.
Algorithms perform calculations or solve other problems by operating on variables. The variables an algorithm operates on are its inputs, and the result of the operation is its output.
Used in many different fields

Recipes
When we follow a recipe to bake a cake, we are in effect executing an algorithm. The inputs are the ingredients. By obeying the algorithm – “do such and such to the inputs” – we create the output, the cake.

Mathematics
Mathematicians write algorithms as formulas, such as x + y = z. In this example, inputs are x and y and the output is z.

Computing
In computing, the variables that algorithms manipulate are in storage locations.
A simple algorithm in a computer might go like this: take the quantity in location A1 and add it to the quantity in location A2. Store the result in location A3.
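The two-step algorithm above can be sketched in Python, modeling the storage locations as entries in a dictionary. The location names A1, A2, and A3 come from the text; the starting values are made up for illustration.

```python
# Storage locations modeled as dictionary entries.
# The values 3 and 4 are arbitrary example inputs.
memory = {"A1": 3, "A2": 4, "A3": 0}

# Take the quantity in location A1, add it to the quantity
# in location A2, and store the result in location A3.
memory["A3"] = memory["A1"] + memory["A2"]

print(memory["A3"])  # 7
```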
The Federal Reserve Bank of St. Louis’ Glossary of Economics and Personal Finance Terms has the following definition of the term ‘algorithm’:
“A process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.”

Finance
In finance, algorithms are used to analyze market trends, manage portfolios, and execute trades automatically.
For instance, algorithmic trading involves using preprogrammed instructions to buy and sell stocks at optimal times. The inputs could be market data, and the output would be the decision to trade based on that data.
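As a toy illustration of market data in, trade decision out, here is a sketch of one common rule-based strategy, a moving-average crossover. The strategy choice, price list, and window sizes are all our own assumptions, not something from the text, and real trading systems are far more involved.

```python
def trade_decision(prices, short_window=3, long_window=5):
    """Return 'buy', 'sell', or 'hold' from a list of recent prices.

    Buys when the short-term average price rises above the
    long-term average, sells when it falls below, holds otherwise.
    The window sizes are illustrative, not recommendations.
    """
    short_avg = sum(prices[-short_window:]) / short_window
    long_avg = sum(prices[-long_window:]) / long_window
    if short_avg > long_avg:
        return "buy"
    if short_avg < long_avg:
        return "sell"
    return "hold"

# Rising prices: the short-term average pulls ahead.
print(trade_decision([10, 10, 11, 12, 13]))  # buy
```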
Origin of the term algorithm
The father of the algorithm was the Persian mathematician and astronomer Muḥammad ibn Mūsā al-Khwārizmī. He was born around the end of the 8th century and died around the middle of the 9th.
Historians believe that al-Khwārizmī’s major achievement was bringing Hindu-Arabic numerals and algebra into European mathematics.
His name in Latin was Algorithmi, which is where the term algorithm comes from. The term algebra comes from al-jabr, part of the title of his treatise on solving equations.
Algorithms and computers
The authors of the popular university textbook Introduction to Algorithms sum up the relationship between algorithms and computers.
“Before there were computers, there were algorithms,” write Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein, in the preface to the third edition.
“But now that there are computers, there are even more algorithms,” they add, “and algorithms lie at the heart of computing.”
Another example of a simple computing algorithm is “find the maximum.” The input is a list of positive numbers, each stored in its own location. The task is to find the biggest number in the list and put it in the output location (Max).
As a set of instructions for the computer, the algorithm might look like this:
• Set Max to zero.
• Compare the first number in the list to Max.
• If that number is bigger than Max, make Max equal to it.
• Repeat the comparison for each subsequent number in the list.
When the algorithm has finished, the location Max contains the maximum in the list.
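The steps above translate directly into Python. Note that starting Max at zero only works because the inputs are assumed to be positive numbers; the function name is ours.

```python
def find_max(numbers):
    """Return the largest value in a list of positive numbers."""
    max_value = 0               # Set Max to zero.
    for number in numbers:      # Repeat for each number in the list.
        if number > max_value:  # If the number is bigger than Max...
            max_value = number  # ...make Max equal to that number.
    return max_value            # Max now holds the list's maximum.

print(find_max([7, 2, 19, 4]))  # 19
```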
In his book The Master Algorithm, Pedro Domingos simply states that an algorithm is “a sequence of instructions telling a computer what to do.”
Domingos is a professor of computer science and engineering at the University of Washington in the United States. He says that all algorithms can be broken down into three simple logical operations: AND, OR, and NOT.
Sophisticated algorithms
By combining simple algorithms into more complex ones, we can get computers to perform sophisticated tasks. From sending text messages to searching the Internet, we rely on algorithms that, at heart, are just performing simple logical operations.
A notable example of the use of advanced algorithms is the Human Genome Project (HGP).
HGP scientists used sophisticated algorithms to map human DNA. Using these, and other computational tools, they identified roughly 20,000 to 25,000 genes, and sequenced all 3.2 billion chemical base pairs in the genome.
Machine learning
More recently, algorithms have progressed to a new level: machine learning. With machine learning algorithms, computers can get progressively better at a task the more they perform it.
At one time, the only way to get a computer to complete a task – whether to fly a plane or add up numbers – was to write algorithms one instruction at a time. And they remained fixed, until a human updated them.
But with machine learning, computers can update their own instructions. They run algorithms that learn every time they complete a task, feeding what they have learned – their inferences – into the next cycle. The more data they have, the better they get.
When you join Amazon or Netflix as a new customer, machine learning algorithms get to work straight away. They gather information about your searches, together with your buying and viewing history. The more you use the services, the better they get at offering you options that fit what they infer to be your preferences.
“Society is changing, one learning algorithm at a time. Machine learning is remaking science, technology, business, politics, and war,” says Domingos, who is also a prominent machine learning researcher.
“Someday,” he adds, “there’ll be a robot in every house, doing the dishes, making the beds, even looking after the children while the parents work.”
Interesting related articles:
• University of Tokyo researchers create algorithm that predicts consumer purchases
• Deep learning algorithm can solve a Rubik’s Cube faster than most humans
• Machine learning algorithm locates nearly all US solar panels
• Study: Algorithms based on AI can make very profitable investment decisions
• Google Chrome receiving significant speed boost thanks to new compression algorithm
• Netflix algorithm update cuts data consumption by 20 percent