The first programmer
Our first programming fact is about the first programmer in history: the mathematician Ada Lovelace. Ada was born Augusta Ada Byron in 1815 in London and was the only legitimate daughter of Lord Byron. She wrote the world's first program for Charles Babbage's Analytical Engine.
The Analytical Engine was designed to be a mechanical computer for general analytical operations. It had an arithmetic logic unit and control flow, and it even supported modern concepts such as loops and conditionals.
Today we know it was most likely the first design for a computer that could be described as Turing-complete. Unfortunately, Babbage was never able to finish his machine due to funding issues. Where would we be now if he had finished it back then? Perhaps we would already have quantum home computers.
Why is a bug called a bug?
The term bug really comes from an insect: a moth that Grace Hopper found trapped between two relays in the Harvard Mark II.
Grace Hopper is another shining example of women in computer science. She was born Grace Brewster Murray in 1906 in New York City and received her doctorate in mathematics before joining the Navy. She later wrote a program she called a "linker" that, for the first time, translated English-like instructions into machine-readable code. Most of today's technology wouldn't exist if Hopper's concepts hadn't influenced coding.
Computer viruses
When it comes to programming facts, not many people instantly think of viruses… However, in 1984 Fred Cohen, at the time a student at the University of Southern California School of Engineering (today the Viterbi School of Engineering), wrote a paper called “Computer Viruses – Theory and Experiments”.
“This paper defines a major computer security problem called a virus.”
– Fred Cohen, 1984, in the introduction of his paper “Computer Viruses – Theory and Experiments”
In his experiment, he wrote a short program that infected computers by making copies of itself and could spread to other machines. The program itself was hidden inside another, larger and legitimate program. His virus wasn't designed to be harmful.
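Self-replication itself doesn't have to be malicious. To illustrate the core trick, here is a classic, harmless Python quine – a program whose output is its own source code. This is only a sketch of self-copying code, not Cohen's actual experiment, which additionally copied itself into host programs:

```python
# A quine: a harmless self-reproducing program. The two code lines
# below print themselves exactly (the comments are just annotation
# and are not part of what gets reproduced).
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running this prints the two code lines verbatim; a virus automates the same idea of copying its own code, but into other programs instead of to the screen.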
However, there have been several computer virus outbreaks since then, and each new one proved more costly to the economy. And a virus doesn't necessarily have to be attached to a program at all. For example, in 2004 every fourth e-mail was infected with MyDoom, which caused over 38 billion USD in damage.
High-level programming languages
The first programming language that is today considered a high-level language was Fortran, created by John Backus, an American computer scientist at IBM. In 1953, Backus proposed to his superiors a method to make programming their IBM 704 mainframe computer easier.
A programming language is considered high-level if it strongly abstracts away the details of the computer, meaning your code doesn't look like binary or low-level code such as assembly. In short: the terms high-level and low-level describe the degree of abstraction.
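To make the difference in abstraction concrete, here is a hypothetical side-by-side sketch in Python: the same task written once in a "low-level style" with explicit index bookkeeping (roughly what assembly would do with registers and a loop) and once in a "high-level style" where the language hides the loop entirely. The function names are illustrative:

```python
def sum_low_level(values):
    # Low-level style: manual accumulator and index arithmetic,
    # close to how a loop looks in assembly.
    total = 0
    i = 0
    while i < len(values):
        total = total + values[i]
        i = i + 1
    return total

def sum_high_level(values):
    # High-level style: the loop is abstracted away by the language.
    return sum(values)

numbers = [1, 2, 3, 4, 5]
print(sum_low_level(numbers))   # 15
print(sum_high_level(numbers))  # 15
```

Both compute the same result; the difference is only in how much of the machine's bookkeeping the programmer has to spell out.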
It may even use elements of natural language, as the most commonly used coding languages do today. Even though Fortran is still ranked #38 among the most popular programming languages (IEEE, 2019), not many people in computer science, students included, ever get in touch with Fortran. However, almost any computer science student knows John Backus due to the Backus-Naur form (BNF), named after him and Peter Naur.
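To give a flavor of what BNF looks like, here is a hypothetical toy grammar for unsigned binary numbers, together with a minimal Python checker that accepts exactly the strings the grammar generates. Both the grammar and the function name are illustrative, not from any particular standard:

```python
# A toy BNF grammar for unsigned binary numbers:
#
#   <binary> ::= <digit> | <digit> <binary>
#   <digit>  ::= "0" | "1"
#
# The recursive checker below mirrors the two grammar rules directly.

def is_binary(s: str) -> bool:
    # <binary> is a single <digit>, or a <digit> followed by a <binary>.
    if len(s) == 1:
        return s in ("0", "1")
    return len(s) > 1 and s[0] in ("0", "1") and is_binary(s[1:])

print(is_binary("1011"))  # True
print(is_binary("10a1"))  # False
```

Real language grammars work the same way, just with many more rules; BNF is still the standard notation for writing them down.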
Fortran hugely influenced coding as we know it today, and the idea of high-level programming gained a lot of momentum from it.
Artificial intelligence
Even though it feels like artificial intelligence belongs to the age of robots, it isn't a very new concept. In fact, the idea of artificial intelligence goes back to antiquity.
The foundations of the concept go back to Alan Turing and his theory of computation (1936). Turing postulated that a machine could simulate any mathematical deduction by shuffling simple symbols such as 0 and 1. The realization that machines can simulate any process of formal reasoning is today known as the Church-Turing thesis, named after Alonzo Church and Alan Mathison Turing.
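To make Turing's idea of "shuffling simple symbols" concrete, here is a minimal, hypothetical Turing machine simulator in Python. The rule table and the toy machine (which just flips every bit on its tape) are illustrative sketches, not taken from Turing's paper:

```python
# A minimal Turing machine: a table of rules
# (state, symbol) -> (new symbol, move, new state) applied to a tape.
# This toy machine walks right, flipping every bit, and halts when it
# reaches the blank at the end of the tape.

def run(tape, rules, state="start", blank="_"):
    tape = list(tape)
    pos = 0
    while state != "halt":
        symbol = tape[pos] if pos < len(tape) else blank
        new_symbol, move, state = rules[(state, symbol)]
        if pos < len(tape):
            tape[pos] = new_symbol
        else:
            tape.append(new_symbol)
        pos += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Rules for the bit-flipping machine.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip_rules))  # 0100
```

Everything the machine "knows" lives in that small rule table – which is exactly Turing's point: simple symbol manipulation, scaled up, is enough to simulate any formal reasoning.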
Artificial intelligence and computers are more interrelated than most would guess. And computers as we know them wouldn't exist if Turing hadn't had these groundbreaking thoughts.
Since 1966, the Turing Award has been given to people who have made outstanding contributions to computer science. It is considered the Nobel Prize of computer science.
No matter if you are a programmer or just an interested reader, by now you know more about the history of notable programming languages, coding, and computer science in general than 99% of your fellow human beings. If you have any interesting programming facts for us, please let us know in the comments 🙂
I am a developer and entrepreneur from Germany. I chose to study computer science because I love building things. I am of the opinion that there isn't one single truth (especially for computer science concepts), and whoever claims so isn't that trustworthy in my eyes. I constantly try to improve things and myself. In my spare time, I support businesses by offering them most of my services for free. Besides that, I am kind of addicted to e-learning courses.