Luther's Fourth Postulate
© 20 Feb 2013 Luther Tychonievich
Licensed under Creative Commons: CC BY-NC-ND 3.0


File formats control your mind via data’s influence on tools, with consequent thoughts on bearing testimony.

 

I have often heard it said that grad students solve “‍impossible‍” problems because they are too new to know they are impossible. Not knowing the set of approaches their advisors know, they create new ones that solve problems the old ones couldn’t. The reverse of this principle is the well-known law of the hammer: “‍if all you have is a hammer, everything looks like a nail‍”.

I have spent much of my spare time for many years thinking about a derivation of this concept in computing. I now pose it as my Fourth Postulate (see also the First; I realize I have not posted much on the Second and Third, but hopefully I will post more soon):

The data model a tool uses to store information controls the design of the tool itself and consequently the way users perceive the world.

The more experience I gain with more fields, the more firmly I am convinced of this principle’s potency. I’ll give three examples that have been much on my mind.

Programming Languages

The de facto standard in computing—indeed, I know of only a handful of corner-case exceptions—is to have the file format be the user interface. Little if anything is stored beyond what the user creates.

The longer I spend in computing, the longer my list grows of things this data model prevents people from thinking. The computer can’t make guesses about your intent because it can’t verify they are correct. You can’t let the computer fill in details because you don’t know if it did them right. And so on.

Try asking a programmer sometime why their development environment doesn’t engage them in requirements elicitation. In my experience, they are initially confused and, if I persist, offer off-topic objections to the idea. The source-code data model has conditioned their minds not to think that way.
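
To make that concrete, here is a minimal sketch, purely hypothetical and not a description of any existing tool, of what a development environment’s storage might look like if its data model kept intent and requirements alongside the typed text; every class and field name below is my own invention.

    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        """Something the user wants the program to do, recorded before any code exists."""
        text: str
        satisfied_by: list[str] = field(default_factory=list)  # ids of code regions

    @dataclass
    class CodeRegion:
        """A span of source text plus the intent behind it, so guesses can be checked."""
        region_id: str
        source: str
        intent: str

    @dataclass
    class Project:
        """Stores requirements and intents, not just the text the user typed."""
        requirements: list[Requirement] = field(default_factory=list)
        regions: list[CodeRegion] = field(default_factory=list)

        def unresolved(self) -> list[Requirement]:
            # With this model the tool has something to ask about:
            # which requirements have no code claiming to satisfy them?
            return [r for r in self.requirements if not r.satisfied_by]

A tool built over a store like this could at least raise the requirements-elicitation question; a tool built over a flat text file has nothing to ask it with.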

Genealogy


There are lots of genealogy data models (see, e.g., BetterGEDCOM’s list), but all seem to be some combination of two basic kinds of storage: information and conclusions. Several threads on RootsDev have used the phrases “‍conclusion person‍” and “‍evidence person‍” to distinguish between these, and a fair amount of energy goes into arguments about whether to store them distinctly or identically. The more I read these conversations, the more I am convinced that the way the users of each see genealogy is limited by the data model they use.

I’ve written before (here and here) about my ideas for genealogical data models. I want to store the process the researchers use, not the conclusions they reach. It’s a different kind of storage, and it lets you think in different ways. Collaboration is no longer a shared tree; it’s shared findings and insights. There’s no longer a need to pick a single perspective to believe, and contradictions are now permitted in the world view.
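
A minimal sketch of the contrast, in made-up types that correspond to no real genealogy format (not GEDCOM, not anything on BetterGEDCOM’s list): the first stores a settled conclusion; the second stores one step of the research process, so contradictory findings from different researchers can sit side by side.

    from dataclasses import dataclass, field

    @dataclass
    class ConclusionPerson:
        """Conclusion-style storage: one agreed-upon answer per fact."""
        name: str
        birth_year: int

    @dataclass
    class Finding:
        """Process-style storage: what was consulted, what was inferred, and why.
        Nothing here forces collaborators to agree on a single world view."""
        researcher: str
        evidence: str    # citation of the record consulted
        inference: str   # what the researcher took the record to mean
        reasoning: str   # the step connecting evidence to inference

    @dataclass
    class SharedResearch:
        """Collaboration as shared findings and insights rather than a shared tree."""
        findings: list[Finding] = field(default_factory=list)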

But genealogists don’t think this way. Even the no-computers-involved Genealogical Proof Standard defines the data model to be a defense of a conclusion. Ask most genealogists what they could share with each other besides conclusions and all they can think of are sources and research logs. It’s as if the rest of the process doesn’t even register in their minds.

The Spirit and Testimony

I wish to draw one more example, this one about the data model of the brain. It’s a bit tricky to discuss holes in the data model of the brain while trying to communicate with your brain, but I’ll do my best.

Grant me the premise, for this section (I believe this premise, or something close thereto, and would be pleased if you accepted it for more than this section, but this section will suffice for rhetorical purposes), that each person has a spirit and a body, where the spirit “‍drives‍” the body. Grant also that, just as a driver of a car can converse with another driver without the car being any the wiser, so too the spirit can converse with other spirits. How would the brain remember that experience? Such things are outside its data model.

Boyd K. Packer famously observed “‍A testimony is to be found in the bearing of it.‍” I suspect this is because my own words are part of my mind’s data model. I can remember and store my describing the influence of the Holy Ghost where I cannot fit the influence itself into my data.

Mormons have made some strides to work more of the workings of the spirit into the conversational data model. Words like “‍testimony,‍” “‍know,‍” “‍feel,‍” “‍impression,‍” and “‍prompting‍” have been repurposed to serve this end. But the spirit remains essentially outside mortality’s data model, and thus the thought process that allows it to be recognized must be developed.

Speculate…

What would happen if the data model beneath the tools you use changed? What questions are you not asking because you only know about hammers?

Since most people use computers to perform only one non-consumption task (writing text), let’s look at a few random data models that might change how you think about that task. What if the computer stored our individual keystrokes instead of the text those keystrokes produced? Or if, inside the computer, your text were a semantic model of each idea you wrote and a set of “‍word it this way‍” decisions? What tools might these designs create, and how might those tools change how you think about writing?
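
As a sketch of the first of those speculations, with every name invented purely for illustration, a keystroke-level store might look like the following; the familiar “‍text‍” becomes just one view computed from it, while hesitations and deletions remain available to other tools.

    from dataclasses import dataclass

    @dataclass
    class Keystroke:
        """One event as the writer produced it."""
        char: str         # the character typed, or "\b" for a deletion
        timestamp: float  # when it happened
        position: int     # index where the insert lands, or of the char removed

    def render_text(strokes: list[Keystroke]) -> str:
        """The document's text is merely a projection of the stored keystrokes."""
        buffer: list[str] = []
        for s in strokes:
            if s.char == "\b":
                if 0 <= s.position < len(buffer):
                    del buffer[s.position]       # the deletion is recorded, not lost
            else:
                buffer.insert(s.position, s.char)
        return "".join(buffer)

    strokes = [Keystroke("H", 0.0, 0), Keystroke("i", 0.1, 1),
               Keystroke("\b", 0.2, 1), Keystroke("e", 0.3, 1), Keystroke("y", 0.4, 2)]
    print(render_text(strokes))  # prints "Hey", yet the false start "Hi" is still on record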

If you are one of the people who designs data models, might I posit that agile development models are dangerous in part because they lead to poorly considered data models? Lots of incremental changes lead to tools that are only incrementally different from hammers. Software architects aren’t just engineers; they are making tools for mental processes and, by the law of the hammer, are thus controlling their clients’ minds.



