Programming and coding are the same thing. The best way to define programming is as the art of teaching a computer how to perform tasks. These tasks can be as simple as adding two numbers together or as complicated as plotting the trajectory of a rocket, but they all start with a programmer teaching the computer how to do them.
What Exactly Is Programming?
Programmers use programming languages to 'talk' to computers. Early computer languages like assembly language were little more than direct translations of machine language, which is made up of the 1s and 0s the computer actually understands. As programmers taught computers more tasks, they gathered those instructions into more advanced languages. Fortran is an example of an early high-level language. A high-level language can express hundreds of machine instructions in a single line of code.
The first program many people learn is "Hello World," which simply displays those words on the screen. In some languages, this takes a single line of code. However, the computer must still carry out hundreds of instructions. Not only does it have to print every single letter, but each letter has to be drawn pixel by pixel. So printing "Hello World" is actually quite complicated from the computer's standpoint.
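In Python, for example, the whole program really is a single line:

```python
# A complete "Hello World" program in Python: one line for the
# programmer, but hundreds of machine instructions for the computer.
print("Hello World")
```

That one line hides all of the work of rendering each character on the screen.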
While all programming may seem the same, there are many different types of programming, such as object-oriented programming and database programming. Modern programming languages include Swift, Java, C++, Ruby, Objective-C, Python, SQL, BASIC and PHP, to name a few. In recent years, new variations of programming languages have sprung up with the intent of teaching kids how to program.
What is Coding? Are Coding and Programming the Same Thing?
The words "programming" and "coding" are often used interchangeably, but the word "coder" has a history of being used to describe inexperienced or inefficient programmers. Of course, just because someone uses the word doesn't mean they are trying to be derogatory. Many people don't realize "coder" is sometimes used in that sense and the word has become more normalized over time.
To put it simply, there is no real difference between a "programmer" and a "coder." If you search a job database, you will see many variations on the programmer title, from programmer analyst to software engineer to software architect, but you will see very few (if any) listings for a "coder" of any kind.
Programmers sometimes refer to 'coding' as the actual act of writing computer code as opposed to other programming tasks such as planning, debugging, etc.
What Does a Programmer Do?
In movies, programmers sit furiously typing on their computers. In reality, programming begins well before any computer code is actually generated.
Programming starts with a goal or a set of requirements. This can be a boss wanting a report or the programmer wanting to build a game for the App Store.
The next step is to break those requirements down into the basic logic needed to perform the task. This is like a football team going out onto the field with a play. If they trot out there without one, everyone will do something different and the result will be chaos.
A programmer may use algorithms and flow charts to design the logic for a program. An algorithm is essentially a step-by-step recipe for the program written in plain English, while a flow chart looks more like a map that traces the program's logic.
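As a sketch of what that looks like in practice (the shopping-cart task and the names here are made up for illustration), an algorithm written in English can often be translated almost line by line into code:

```python
# Algorithm in English:
#   1. Start with a total of zero.
#   2. For each price in the shopping cart, add it to the total.
#   3. If the total is over 100, apply a 10% discount.
#   4. Return the total.
# The same logic, translated into Python:
def cart_total(prices):
    total = 0
    for price in prices:
        total += price
    if total > 100:
        total *= 0.9  # apply the 10% discount
    return total
```

Each English step became one or two lines of code, which is why writing the logic out first makes the programming itself much easier.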
Skipping the endless meetings that are bound to happen to accomplish those first steps, we get to the actual programming. Programming involves a lot of creative problem solving, but it also involves looking for patterns in the code that can be isolated and turned into tasks, which become reusable "objects" or "functions" within the program. A computer program is just a set of tasks gathered together to perform a more complicated task, which is then combined with other complicated tasks to perform an even more complicated task, and so on.
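As a small illustration (the greeting task and names are invented for this example), a repeated pattern can be isolated into a function, and larger tasks are then built by combining such small ones:

```python
# A repeated pattern -- greeting someone by name -- isolated into
# a reusable function, a small "task" the program can call anywhere.
def greet(name):
    return "Hello, " + name + "!"

# A bigger task built out of the smaller one: greet a whole team.
def greet_team(names):
    return [greet(name) for name in names]
```

The bigger task never repeats the greeting logic; it simply reuses the smaller task, which is the pattern the paragraph above describes.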
After programming comes debugging, which essentially means running the program over and over to find bugs, going back to the previous step to fix them, and then debugging again. This cycle is called an "iterative process." A programmer repeats these steps until they (hopefully) find all of the bugs. A computer bug is any piece of code that delivers an unwanted error message or doesn't produce the correct result.
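A classic example of a bug that produces the wrong result rather than an error message is an off-by-one mistake (the function here is a made-up illustration):

```python
# Buggy version: range(1, n) stops *before* n, so the last number is
# never added. The program runs without any error message, but the
# answer is wrong -- which still counts as a bug.
def sum_up_to_buggy(n):
    total = 0
    for i in range(1, n):
        total += i
    return total

# Debugged version: range(1, n + 1) includes n, giving the correct sum.
def sum_up_to(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total
```

Finding this kind of bug usually means running the program, noticing the result is off, going back to the code, and trying again, exactly the iterative process described above.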
If there are "unwanted" error messages, are there wanted error messages? Absolutely. Think about directing your web browser to a page that doesn't exist. You will get a 404 error message that will inform you the page doesn't exist. Some 404 messages are funny, most are practical, but they all serve a purpose: to inform the user that something went wrong. This is better than the page remaining blank, which would leave the user wondering what happened.