Before electronic computers were developed, logicians and mathematicians worked out the foundations of computation. The idea was, and is, that computations are the result of following a set of precise directions (the program) which apply to a formal system of symbols (syntax).
Alan Turing was one of these logician-mathematicians. We'll look at one of his "machines" in a moment...
...but first we need to consider a preliminary matter.
Recall the difference between syntax and semantics. Syntax is a matter of form or grammar for the language. The syntactical definition of a sentence (like 'Pb&Ib') depends only on how that complex object is built up out of its parts, that is, on how it is formed. So, syntax is purely formal.
But semantics gives meaning to our syntactical objects. It defines them by relating them to the world. For example, 'b' can stand for "George W. Bush" and 'P' for the property of being president of the US. So, 'Pb' is true in 2004. That's semantics.
Much of symbolic logic can be accomplished by formal manipulation on the syntactic side which nonetheless respects the semantics.
For example, we've seen that Modus Tollens expresses a form of inference which is always valid:

P > Q
~Q
Therefore, ~P
We can understand meaning (that this argument is valid) just by understanding syntax (that this argument has a correct form).
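This point can be made concrete in code. The sketch below (a minimal illustration, using a hypothetical tuple encoding of sentences that is not part of the original text) checks the same argument in two ways: a purely syntactic test that just pattern-matches the shape of Modus Tollens, and a semantic test that runs through every truth-value assignment. The two agree, which is the sense in which syntax can stand in for semantics.

```python
from itertools import product

# Hypothetical encoding (an assumption for this sketch):
# atoms are strings like 'P'; ('~', S) is a negation; ('>', S, T) a conditional.

def matches_modus_tollens(premises, conclusion):
    """Purely syntactic check: do the premises have the shape
    P > Q and ~Q, with conclusion ~P? No meanings consulted."""
    if len(premises) != 2:
        return False
    cond, neg = premises
    return (isinstance(cond, tuple) and cond[0] == '>'
            and neg == ('~', cond[2])
            and conclusion == ('~', cond[1]))

def evaluate(sentence, valuation):
    """Semantic side: compute a truth value under a valuation."""
    if isinstance(sentence, str):
        return valuation[sentence]
    if sentence[0] == '~':
        return not evaluate(sentence[1], valuation)
    if sentence[0] == '>':
        return (not evaluate(sentence[1], valuation)) or evaluate(sentence[2], valuation)

def atoms(sentence):
    """Collect the sentence letters occurring in a sentence."""
    if isinstance(sentence, str):
        return {sentence}
    return set().union(*(atoms(part) for part in sentence[1:]))

def semantically_valid(premises, conclusion):
    """Check every row of the truth table: valid iff no valuation
    makes all premises true and the conclusion false."""
    letters = sorted(set().union(*(atoms(s) for s in premises + [conclusion])))
    for values in product([True, False], repeat=len(letters)):
        v = dict(zip(letters, values))
        if all(evaluate(p, v) for p in premises) and not evaluate(conclusion, v):
            return False
    return True

# The argument: P > Q, ~Q, therefore ~P
premises = [('>', 'P', 'Q'), ('~', 'Q')]
conclusion = ('~', 'P')
print(matches_modus_tollens(premises, conclusion))  # True  (correct form)
print(semantically_valid(premises, conclusion))     # True  (valid argument)
```

The syntactic check never looks at truth values at all, yet it certifies exactly what the exhaustive semantic check certifies. That is formal manipulation respecting the semantics.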
Turing machines and computation in general give us another example of syntax standing in for semantics. Let's see how this works.