Doing Good Science



Building knowledge, training new scientists, sharing a world.

Ada Lovelace and the Luddites.

The views expressed are those of the author and are not necessarily those of Scientific American.





Today is Ada Lovelace Day.

If you are not a regular reader of my other blog, you may not know that I am a tremendous Luddite. I prefer hand-drawn histograms and flowcharts to anything I can make with a graphics program. I prefer LPs to CDs. (What’s an LP? Ask your grandparents.) I find it soothing to use log tables (and I know how to interpolate). I’d rather use a spiral-bound book of street maps than Google to find my way around.
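For readers who have never interpolated in a log table: you find the two table rows that bracket your number and assume the logarithm is a straight line between them. Here is a toy Python sketch of that hand technique; the printed table is simulated with math.log10, since the point is the interpolation step, not the table itself:

```python
import math

def interpolate_log10(x, step=0.01):
    """Look up log10(x) the way you would with a printed log table:
    find the two table entries bracketing x and interpolate linearly
    between them.  The 'table' is simulated with math.log10; a real
    table just lists these values at fixed increments."""
    lo = math.floor(x / step) * step   # table entry at or below x
    hi = lo + step                     # next table entry
    frac = (x - lo) / step             # proportional part, between 0 and 1
    return math.log10(lo) + frac * (math.log10(hi) - math.log10(lo))
```

For a value like 2.345 this interpolates between the rows for 2.34 and 2.35, and because log is so nearly linear over such a small interval, the error is far below what four-figure tables could display.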

Obviously, my status as a Luddite should not be taken to mean I am against all technological advances across the board (as here I am, typing on a computer, preparing a post that will be published using blogging software on the internet). Rather, I am suspicious of technological advances that seem to arise without much thought about how they influence the experience of the humans interacting with them, and of “improvements” that would require me to sink a bunch of time into learning new commands or operating instructions while producing at best a marginal improvement over the outcome I get from the technology I already know.

That is to say, my own inclination is to view technologies not as ends in themselves but as tools which, depending on how they are deployed, can enhance our lives or can make them harder.

The original Luddites were part of a workers’ movement in England in the early 19th century. The technologies these Luddites were against included the mechanical knitting machines and looms that shifted textile production from the hands of skilled knitters and weavers to a relatively unskilled labor force tending to the machines. In the current economic climate, it’s not too hard to see what the Luddites were worried about: even if the Industrial Revolution technologies didn’t result in an overall decrease in jobs (since you’d need workers to tend the machines), there would be no reason to assume that the owners of textile factories would be interested in retraining the skilled knitters and weavers already in existence to be the machine-tenders. And net stability (even increase) in the number of jobs can be cold comfort when your job goes away.


So, the Luddites smashed the looms, committed other acts of industrial sabotage, were harassed by government troops, saw some of their number executed for machine breaking (and others exiled to Australia) after the passage of the Frame Breaking Act, assassinated a mill owner, and then pretty much faded as a movement.

Before the passage of the Frame Breaking Act, a member of the House of Lords argued vehemently against imposing capital punishment on frame breakers, delivering to the Lords a theatrical speech “loaded with sarcastic references to the ‘benefits’ of automation, which he saw as producing inferior material as well as putting people out of work.” That member of Parliament was George Gordon, Lord Byron, who, as it happens, was also the father of Ada Lovelace.

Ada Lovelace is often heralded as the world’s first computer programmer. The computer in question was Charles Babbage’s Analytical Engine — a machine Babbage proposed but never actually built. So Lovelace’s accomplishment requires a bit of explanation:

During a nine-month period in 1842-43, Lovelace translated Italian mathematician Luigi Menabrea’s memoir [written in French] on Babbage’s newest proposed machine, the Analytical Engine. To the article she appended a set of notes. The notes are longer than the memoir itself and include (Note G) in complete detail a method for calculating Bernoulli numbers with the Engine, recognized by historians as the world’s first computer program. Biographers debate the extent of her original contributions, with some holding that the programs were written by Babbage himself. Babbage wrote the following on the subject, in his Passages from the Life of a Philosopher (1864):

I then suggested that she add some notes to Menabrea’s memoir, an idea which was immediately adopted. We discussed together the various illustrations that might be introduced: I suggested several but the selection was entirely her own. So also was the algebraic working out of the different problems, except, indeed, that relating to the numbers of Bernoulli, which I had offered to do to save Lady Lovelace the trouble. This she sent back to me for an amendment, having detected a grave mistake which I had made in the process.

The level of impact of Lovelace on Babbage’s engines is the subject of debate. The debate is difficult to resolve due to Charles Babbage’s tendency not to acknowledge (either verbally or in writing) the influence of other people in his work. Lovelace was certainly one of the few people who fully understood Babbage’s ideas and created a program for the Analytical Engine. Had the Analytical Engine ever actually been built, her program would have been able to calculate a sequence of Bernoulli numbers. Based on this work, Lovelace is now widely credited with being the first computer programmer. Lovelace’s prose also acknowledged some possibilities of the machine which Babbage never published, such as speculating that “the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent”.

For those inclined to credit Babbage, rather than Lovelace, with the computer program in Note G, Lovelace is still indisputably the world’s first debugger.
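Note G laid out, operation by operation, how the Engine would generate the Bernoulli numbers. As a modern illustration only — this is not Lovelace’s procedure, which used the Engine’s own table of operations and a different numbering of the B’s — here is a short Python sketch that computes them from the standard recurrence sum over k from 0 to m of C(m+1, k)·B_k = 0 for m ≥ 1, with B_0 = 1:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1
    (this is the B_1 = -1/2 convention)."""
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))                          # solve the recurrence for B_m
    return B
```

For n = 8 this yields 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30; the odd-index numbers beyond B_1 all vanish, which is why nineteenth-century tables listed only the even-index ones.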

Had the Analytical Engine been built, the plan was to deliver its input on punched cards of the same sort used to run mechanical looms. Even though Lovelace never knew her father (who was otherwise occupied being mad, bad, and dangerous to know), it’s curious to try to reconcile the father who defended the Luddites with the daughter who laid important groundwork for our computer age. Arguably, computers have ushered in socioeconomic changes that eclipse those brought by the mechanization of the mills.

Still, I think what Lovelace did was pretty cool.

On the one hand, the procedures she developed for using a computing machine (one that had been described but not built) to carry out complex calculations were a beautifully concrete demonstration of what was possible. Her programs mapped out how Babbage’s proposed machine could be put to practical use. In other words, Lovelace’s work showed that a mechanical computer was not just a cool thing in itself, but a technology that could “click” into the work and the world of a certain sort of intellectual laborer. In the meantime, until a mechanical computer of the sort Babbage proposed was a material reality, people actually had to do their calculations by hand.

And while Lovelace and Babbage are the intellectual godmother and godfather of a significant technological leap, there’s a way in which the shift their work put in motion feels different than the industrialization of textile production that the original Luddites decried. Knitting machines, on the face of it, can render skilled human knitters obsolete. Mechanical computers, on the other hand, don’t so obviously make the existing workforce of human computers obsolete. While mechanizing complex mathematical calculations removes some level of intellectual grunt-work from the humans who might otherwise be doing them with pencil and paper, those intellectual energies might be directed to other, related problems (including formulating new ideas that lead to new calculations, or drawing conclusions from the results of the calculations). Moreover, the very mechanization of the calculations generates demand for a new type of intellectual work — devising the instructions to the computing machine to perform the calculations.

Writing computer code may now strike many as intellectual drudgery. This is a little amazing to me, as my childhood experiences learning programming (using BASIC and a Commodore “Teacher’s Pet”) opened up an intriguing world of problem solving that felt like play. I have to believe that the drudgery of the code monkey’s lot has more to do with the nature of the workplace than the nature of computer programming.

Even so, there’s something about a world in which drudgery can engage our intellects as well as our bodies that feels like it gives us a bit more wiggle-room to be humans, not just cogs in the industrial machine.

And my mother’s willingness to undertake such drudgery in a programming job is what kept us clothed and fed when I was but a wee Luddite. As it happens, my mother was also the one who first taught me how to program, who conveyed the fun and sense of possibility in this activity. That this activity got its start with the technically minded daughter of the romantic poet who defended the Luddites somehow strikes me as exactly right.

Janet D. Stemwedel About the Author: Janet D. Stemwedel is an Associate Professor of Philosophy at San José State University. Her explorations of ethics, scientific knowledge-building, and how they are intertwined are informed by her misspent scientific youth as a physical chemist. Follow on Twitter @docfreeride.







Comments (3)

  1. Postulator 7:04 pm 10/7/2011

    So did you ever learn programming in Ada? It’s apparently quite a useful language.

  2. Janet D. Stemwedel 8:33 pm 10/7/2011

    I never did learn Ada. The programming languages I used to program computers where the programming was an end in itself were BASIC and Pascal. The languages I used to do science (which is to say, to drive micropumps, control data collection, and run simulations) were FORTRAN and C+. I have what feel like fond memories of the NAG Library, but it might just be nostalgia for my youth.

  3. mcgeejw 10:18 am 10/9/2011

    Janet, I have worked in technology for my entire 25-year career (computer science). Like you I have become somewhat of a Luddite. One of the benefits of slow development and introduction of new ideas is that a certain “discipline of use” tends to evolve with it. Usually there is consideration given for its ethical and practical implications. Humanity has not always been at its best at following that discipline but it does exist. I am particularly concerned about social media (and I preceded this comment by tweeting you). Twitter and Facebook have allowed unprecedented global communicative access but, depending on the expert, communication may consist of 75 percent body language and facial expression cues. Not a problem for our generation; we grew up without social media. My concern is for subsequent generations who may spend more time online chatting with friends than actually in person. With the available alternatives such as television, computers, and gaming, families don’t spend as much time together. I wonder if children will get enough contact time to develop the ability to effectively read body language and facial cues? Maybe a little off subject for this topic but it is a thought.

    Wes

