Unlocking Programming: Types
© 14 Jul 2011 Luther Tychonievich
Licensed under Creative Commons: CC BY-NC-ND 3.0

Unlocking Programming

Part of a series of posts explaining programming for the lay-person.

This is part of a series of posts; see the introduction to this series.

One of the things everyone seems to know about computers is their use of “ones and zeros” or binary. This is a bit of a misnomer, since what we actually have are wires that carry a high or low voltage, not a one or zero, but the comparison is not a bad one. Every value is stored as a series of bits (a bit is the binary version of a digit: each 1 or 0, or each wire with its high or low voltage, is called a bit; three bits make an “octal digit”, four make a “nibble” or a “hexadecimal digit”, and eight make a “byte”), and any given sequence of bits may have many different meanings. Which meaning to use is dictated by the type of the value.
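
To make those groupings concrete, here is a tiny sketch in Python (my choice; the post itself isn’t tied to any particular language) that prints one value’s bits in each of the groupings just named:

    value = 97  # fits in one byte: the bits 01100001

    print(format(value, '08b'))  # 01100001 -- all eight bits, one byte
    print(format(value, 'o'))    # 141      -- three bits per octal digit
    print(format(value, 'x'))    # 61       -- four bits per hexadecimal digit (a nibble)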

The type of a value is the kind of thing that value represents. A few examples: “‍3 is a number‍”, “‍Rufus is a dog‍”, “‍‘‍Rufus‍’ is a name‍”, “‍I am a human‍”. Note that a value can have multiple types: three is a number, and a real number, and an integer, and a natural number.
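
Programming languages make that stacking of types visible. Sketched in Python (again my choice of language), the single value three answers “yes” to several type questions at once:

    import numbers

    x = 3
    print(isinstance(x, int))               # True: three is an integer
    print(isinstance(x, numbers.Rational))  # True: ...and a rational number
    print(isinstance(x, numbers.Real))      # True: ...and a real number
    print(isinstance(x, numbers.Number))    # True: ...and, most generally, a number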

Types are important in programming partly because they can be used as a sanity-check. In most cases, you can figure out the type of an expression without knowing what value it has. For example, “‍x + 3‍” generally implies that x is a number, as is the expression as a whole. If x’s type is “‍vegetable‍” instead of “‍number‍” then “‍x + 3‍” doesn’t make any sense. How can you add 3 to a broccoli? In this case we’d say that the expression “‍doesn’t type-check‍”, meaning our program doesn’t make sense.
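
Here is roughly what that looks like in practice, sketched in Python, where the complaint comes while the program runs (many other languages would reject the program before it runs at all):

    x = "broccoli"  # x's type is text, not number

    y = x + 3       # TypeError: can only concatenate str (not "int") to str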

Typing may seem obvious or trivial, but it turns out that even experienced programmers write programs that don’t type-check from time to time. These are almost always simple mistakes, like calling someone by the wrong name or tripping while walking, but they happen often enough that programmers sometimes get quite passionate about little details of their preferred language’s way of handling types.

Types. They started as a way of letting the same set of bits (e.g., 01100001) represent different things in different contexts (e.g., both 97 and the letter “a”); they became a sanity check on programmers’ work and have evolved into a near-religion.
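
That double meaning is easy to see directly. In a Python sketch, the same eight bits are a number or a letter depending on which meaning we ask for:

    bits = 0b01100001  # the bit pattern from this paragraph

    print(bits)        # 97 -- the bits read as a number
    print(chr(bits))   # a  -- the very same bits read as a letter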

Books can be (and have been) written on types. But if you remember that a type is the “kind of thing” a value represents, and that type-checking is “sanity-checking” expressions, you’ll do fine.



