Split infinitives and smash the patriarchy

I just posted a little rant (in a Facebook group about linguistics) about the supremacist attitudes inherent in certain grammar “rules.” My rant ended with “In short: Split infinitives and smash the patriarchy.” This message seems to be resonating with a lot of folks, and I have a friend working on a graphic design for stickers/t-shirts/etc.

I’ll put the whole rant here soon; I just wanted to get this post up right away for intellectual property purposes.

Blogging about Commas

My site description says “knowledge management, good Web content, duck confit, odd bits of beauty, general nerdliness, and the Oxford comma.”

While I *use* the Oxford comma on this blog, I am not sure I have really blogged about it, per se. My brother-in-law Seth called me on this the other day–and then a lot of people read this news story and told me it made them think of me. I’m quite proud.

I am a staunch, steadfast proponent and defender of the Oxford comma. None of the arguments against it make sense to me, when weighed against the arguments for it. I’m not going to try to convince you, though. You can do that for yourself. (Just Google “Oxford Comma” and be amazed at the nerdery and vitriol.)

Lynne Truss’s lovely Eats, Shoots & Leaves calls the comma a “grammatical sheepdog” that “tears about on the hillside of language, endlessly organising [sic] words into sensible groups and making them stay put.” Ms. Truss acknowledges the pro-vs.-con argument and advises, “There are people who embrace the Oxford comma and those who don’t, and I’ll just say this: never get between these people when drink has been taken.”

Funny examples:

Times when a comma (not always Oxford) made a difference in the outcome of a court case:

Data: Singular or Plural?

Over the years as a writer and editor (and interrupted linguist), I’ve mellowed quite a bit. From a young age through my late 20s, I was a strict prescriptivist/pedant: “These are the rules; I am going to follow them, and I am going to get an A!” Gradually I’ve shifted toward descriptivism: “All usage is in some stage of flux; I just want to write clearly for my audience, so I can convey ideas as accurately as possible.”

But there’s one usage about which I am adamant: “Data.” Is it singular, or plural? The answer I accept is “Ask your audience.”

Here’s my argument:

(1) Are you speaking or writing Latin? “Data” is plural.

(2) Are you speaking or writing English? Ask yourself: How does my audience expect me to treat “data”?

(a) “Data” is neither singular nor plural in essence, but a mass/uncountable noun (like “furniture” or “traffic”–or “audience”). Nevertheless,

(b) If you are writing or speaking to an audience of scientists (especially social scientists, but not computer scientists), you should use plural verbs and markers with “data”–otherwise, they will consider you unsophisticated, and possibly think less of your expertise. (I don’t think that’s a fair leap to make, but it’s a fact of life.) 

(c) If you are writing or speaking to a general audience and/or computer scientists, use “data” with singular verbs and markers. Otherwise, your audience is quite likely to think you are being pretentious. (If you *want* them to think you are pretentious, have at it. Just be aware of the effect this choice can have.)

(d) If you don’t know enough about your audience to make an informed choice, rewrite the sentence to avoid having to use “data” with a marker of grammatical number.
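For example, instead of agonizing over “The data is incomplete” vs. “The data are incomplete,” you can often write “The data set is incomplete” and sidestep the question entirely.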

The argument that “data” is the plural of “datum” holds no weight with me, because:

(i) I can’t remember the last time I heard “datum” (rather than “data point”) in common parlance; and, more importantly,

(ii) English is not Latin. Once English has accepted a word from another language, the grammatical rules of the root language no longer control that word. “Opera” in Latin is the plural of “opus,” but in English “opera” is most frequently used as a singular noun. Most people use “agenda” in English as a singular noun as well–“Do we have an agenda?” “Hold on, I’ll send it to you.” 

Come at me.

Economics is just modern fortune-telling.

Minor rant: If I were Queen of the Universe, #12 on my list of proclamations would be this: We stop saying/writing/reporting things like “Market fails to meet analysts’ projections” or “The 3rd quarter figures were lower than predicted.” All such utterances should place the blame where it belongs: On the economists, not on the figures.

“Analysts fail to predict market. Again. So far this year, they’re doing only slightly better than chance. Could you remind me why we’re paying them?”

“For the 23rd quarter in a row, the economists are wrong. This time they only missed the answer by 3%, which is pretty good, for them.”

I used to think that economics wasn’t a science, but I’m broadening my definitions. I think macroeconomics is an interesting way of looking at the world. I find the Freakonomics podcast fascinating, for example. But that doesn’t make economics a good way of predicting the likelihood of a specific event–certainly not to the degree you can rely on in chemistry or physics.

It’s kind of like weather forecasting for my neighborhood vs. meteorology for the planet. You can still call it science, if you’re using “science” to mean a “way of knowing”. It just falls apart a little when you get to the “replicability” standard for scientific merit. I’m OK with that–I don’t require that level of rigor from everything I believe. Love isn’t predictably replicable. Nor is poetry, or faith. But economics is pretending to be chemistry, when it’s arguably more like astrology, and that pretense bothers me.

I want to put an image here from Demotivators.com, because it would be funny. However, I’m pretty wary of image-searching-lawyer-bots, so I’ll just link to it instead: http://demotivators.despair.com/demotivational/economicsdemotivator.jpg. Enjoy.

Grit in the Dish

As a detail-oriented writer, webmaster, and knowledge worker, I notice errors. When I point them out, sometimes people are grateful. Other times, I get pushback along the lines of “Nobody cares about stuff like that except you,” or “I don’t see why you’re worried about that detail.”

Have you ever eaten steamed mussels with sand in them, or salad with grit on it? The food might be perfectly cooked, creatively seasoned, and beautifully plated. But as soon as there’s grit on your palate, you notice. If there’s one piece of grit, some people might overlook it. The more grit, the less edible the dish—no matter how good everything else is, that grit makes it less enjoyable, or even inedible. If you went to a restaurant twice and there was grit in your food both times, would you go back a third time? I wouldn’t. I would think their prep work was sloppy, and that would make me worry about their hygiene practices and respect for product.

“Nobody cares except you” is a coward’s defense, and it dismisses the experience of at least part of your audience. Some people will notice the details. For example, I’m passionate about words, grammar, and usage. I notice when people use a word or phrase imprecisely or inappropriately. I notice if someone is using serial commas, or not using them, or using them inconsistently. I’m passionate, though not expert, about design; I have a strong aesthetic, and I notice when people haven’t learned the same design basics as I have—when they break a grid, or choose colors or fonts haphazardly, or don’t have real people use a product before release.

Caring about these things has earned me some ridicule, but it also has made me a valuable team member—I’ll notice the things that others on the team don’t, so when we put a product out in the world people will notice the content or usefulness of what we made, and (I hope) not be distracted by awkward design choices or grammatical errors. If we don’t pay attention to those kinds of details, some people will think less of us—and think less of our expertise.

Typos, grammatical errors, awkward usage, and inelegant or untested design are the grit in any work. Having a grit-detector on your team makes the work better, and builds your audience’s faith and trust in your expertise.

On the other hand, if the salad with grit in it is made of tough or spoiled greens, that’s a different kind of problem. Don’t bother removing the grit if the greens themselves aren’t good. Spend your energy on starting fresh.

Knowledge Management and Beauty

A conversation I’d like to hear more of in the knowledge management and exchange (KME)[1] space is this: It is worthwhile to spend time on design of knowledge products. People will more readily absorb knowledge that is presented in a pleasing way. You aren’t going to share your knowledge effectively if looking at your newsletter makes people’s eyeballs hurt.[2] “Look and feel” isn’t about being pretty or cool; I see it as a genuine make-or-break issue for successful knowledge management.

Look and feel—which I think of as shorthand for the Venn diagram overlap of usability, user experience, and design—is important. In my experience, most people who are vocal about the importance of look and feel are designers, so I think non-designers take that opinion with a grain of salt. “Sure, *you* think it’s important to avoid flashing-spinning-screaming things and Comic Sans, but I love my ideas, and I’m the client.” (There are many excellent summaries of this client/designer tension, including The Oatmeal’s How a Web Design Goes Straight to Hell, so I’ll spend no more time on it.)

I think more people in the KM(+/-)E sphere should be concerned about look and feel—and not just about websites. Anything I produce—from an email to a print piece to a website to a conference presentation—has a look and feel. Considering look and feel, finding out what people think about it, and improving it where possible is critical to effective knowledge management and exchange.

Someone looking at a website for the first time decides in 1/20th of a second whether it looks good. That snap judgment carries over into assessments of the quality, usefulness, and reputation of a site and its content. So creating a positive first impression is a crucial first step to improving knowledge use and exchange.

If a website design makes me feel overwhelmed, I’m going to leave (that’s why I picked Google over Yahoo in the search engine wars back in the day—Google gave me a clean search box; Yahoo garbaged the search up with news and entertainment and travel options and and and…). If a brochure has jarring or out-of-context art choices (e.g., a combination of stock photography and clip art), I probably won’t read it. If a presenter is reading words from her own slides, she loses my attention. I don’t get the benefit of the attempt at knowledge exchange. In everyday life, that’s as much my fault as yours—but if you call yourself a knowledge manager, invested in knowledge exchange and uptake, it’s your responsibility to think about whether that initial 1/20th of a second will make your audience think that your website/brochure/presentation is worth more seconds, or even minutes or hours, of their attention.

[1] At some point circa 2011, I started seeing the abbreviation “KM” for “Knowledge Management” being replaced with “KME,” “KM&E,” or “KM/E”—meaning “Knowledge Management and Exchange.” The change didn’t fully take hold—”knowledge management” gets over 13 million results on Google, vs. just under 6 million for “knowledge management and exchange.” I had always read the “…and exchange” as implied: To me, there’s no purpose in managing knowledge unless people use and exchange the knowledge I’m managing.

[2] I realize this whole topic marginalizes people with visual impairments. I don’t know the accessible equivalent of “look and feel”. I should probably educate myself much more on that front.

Global North vs Global South: Haves and Have Whats?

I have worked in what I consider the nonprofit sector for almost fifteen years. My current employer is a research and communication center within a university, so some might argue I’m in the academic sector now. However, my program (and my work) is funded by USAID and operates in the global public health sphere, which makes me feel like I work for a non-governmental organization. But that’s not the nomenclature problem I want to talk about. The problem is with the terminology surrounding the global distribution of wealth, power, and certain kinds of economic development.

I’m not denying that there are inequalities in play—countries that give or receive aid, export more than they import, have or don’t have certain kinds of industry and infrastructure, or are above or below the global gross domestic product per capita average. But I think it’s a false dichotomy, and the nomenclature around it is deeply unsatisfactory.

Right now the in-vogue term for countries that (for lack of a better term) I shall call the “economic-industrial-have-nots” is “the Global South”, or just the South. These countries, and the people who live there, are called Southern. Communication and cooperation between them is called “South-South”. This makes my teeth hurt, because of geography. Here’s a map from Wikipedia of the countries above and below the average GDP per capita line.

[Image: Wikipedia map of countries above and below the average GDP per capita line]

Yep, a lot of the blue (more-money-than-average) countries are in the northern hemisphere–which, by the way, includes nearly all of Asia and about half of Africa (I’m not sure, because my brain has been warped by the Mercator projection). There are a lot of blue countries in the southern hemisphere, too. Imprecision bothers me.

I don’t object to having gotten rid of the term “third world countries”–I don’t hear it anymore from people in my professional space. “Developing countries” was in vogue for a while, which seemed better, but then, as the director of my project noted the other day, “It’s not like a country crosses some magical line and doesn’t have any more progress to make.” Some people were using “emerging markets” for a while (and might still be), but I find that pretty insulting–as though people in the international development sector are there solely for the purpose of selling people things. (I’m not saying that isn’t *a* reason. But it’s not the only reason. And it’s certainly not my primary reason for doing the work I do.)

I think I’m also irritated because the “rich/poor”, “industrial/agrarian”, “democracy/dictatorship” dichotomies deal in such a narrow sphere of human value. They all split the countries up and attempt to name them as two groups by reducing people to dollars, or voters, or oppressed masses. I think it’s too simple.

And yes, I recognize that my discomfort with the nomenclature is a First World Problem, and I’m having all kinds of guilt about my carbon footprint and disproportionate consumption of all sorts of resources. But here’s a totally different map–a scale, not a dichotomy:

[Image: Happy Planet Index world map]

This is the Happy Planet Index map. It’s about ecological footprint.

Surprisingly, I’m having trouble finding a map of happiness, or fulfillment, or peace, or connection, or time with family, or any of the other things that count toward my quality of life as a person.

So, I’m on the lookout for an evolution in the nomenclature. I’ll keep you posted.

Update! Sept. 25, 2013: No evolution in nomenclature, but a new report on global happiness from the Sustainable Development Solutions Network (SDSN), a nice post about it on Columbia University’s Earth Institute website, and a digital publication version. Sadly, still no map.

Update 2! Oct. 15, 2021: Nomenclature!! I just heard the term “Majority World” for the first time. It’s a great phrase, although I’m struggling with some imprecise nuances. I’m also annoyed that I haven’t heard it before, despite it having been coined no later than 2009 (the publication date on the article I link to above). Diffusion of innovations is an interesting thing.

Dubious Milestone: 1,000 spam comments

I would like to thank all the marketers of Cialis, knockoff designer handbags (especially Louis Vuitton), and SEO “services” for their interest in my blog. I’m touched. But since I have better things to do than moderate spam comments (for example, anything else I ever do), I have closed comments.

While I’m strongly in favor of a participatory atmosphere and an open exchange of ideas, this blog has received over 1,000 spam comments, and zero real comments. If you’re a real human who actually wants to talk to me (without me buying anything from you or accepting membership in your malware-enslaved spam-spewing botnet), you’ll find ways to contact me via the About page.

Jargon du Jour

Some days, I live in a world of hurt. Some meetings, conference calls, and academic papers are too much for me to take. My name is Simone, and I am jargon-sensitive.

Firecrackers. Bio-break. Grasstops. Fireballs[1]. Realtime. Synergy. Deep Dive. Gamechangers, and their ancestral Paradigm Shifts. Leverage. Sustainable. Animert.[2]

Words like these hurt me. When I say them, I feel dirty. (I say them anyway, sometimes, because other people in my professional space expect to hear them.) When I hear them, it’s worse than nails-on-a-chalkboard; it’s like stepping barefoot on a tiny piece of glass.

A few months ago I was exposed to the word “exnovation”, and it literally made my palms sweat with linguistic consternation. It was used in the sense of “to improve something by removing outdated or superfluous features”—a process I applaud. It’s the word I have a problem with. Let’s take a look at the pieces (with some help from the Online Etymology Dictionary):

innovate (v.) 1540s, “introduce as new,” from L. innovatus, pp. of innovare “to renew, restore; to change,” from in- “into” (see in- (2)) + novus “new” (see new). Meaning “make changes in something established” is from 1590s. Related: Innovated; innovating.

in- (2) Element meaning “into, in, on, upon” (also im-, il-, ir- by assimilation of -n- with following consonant), from L. in- “in” (see in). In O.Fr. this often became en-, which usually was respelled in English to conform with Latin, but not always, which accounts for pairs like enquire/inquire. There was a native form, which in W.Saxon usually appeared as on- (cf. O.E. onliehtan “to enlighten”), and some verbs survived into M.E. (cf. inwrite “to inscribe”), but all now seem to be extinct. Not related to in- (1) “not,” which also was a common prefix in Latin: to the Romans impressus could mean “pressed” or “unpressed.”

ex- Prefix, in English meaning mainly “out of, from,” but also “upwards, completely, deprive of, without,” and “former;” from L. ex “out of, from within,” from PIE *eghs “out” (cf. Gaul. ex-, O.Ir. ess-, O.C.S. izu, Rus. iz). In some cases also from Greek cognate ex, ek. PIE *eghs had comparative form *eks-tero and superlative *eks-t(e)r-emo-.

So, following these pieces, we see that “innovation” means “the process of imbuing something with newness”, and thus we can parse “exnovation” as “to remove or expel newness”. I don’t think that is what the coiner (apparently A Sandeep, or so his blog states) meant. I think he meant “edit” or “improve” or “iterate”, all of which are words that cause me no central nervous system distress.

I don’t mind people coining new words for new ideas (like “meme”, which will probably get a post of its own eventually). In the case of “exnovate”, my objection is to the disregard for venerable prefixes.

[1] “Fireball” hits my jargon-nerve only when used to mean “fans” or “champions”—that is, “People who are on our side, and are vocal about it, and may attract more people to our cause”. I like the word “fireball” when it denotes big globs of flaming pitch, dragon-breath, explosions, meteors, or cinnamon-flavored jawbreakers.

[2] Honorable Mention: After years of it being common parlance, “webinar” is now just below my pain threshold, thanks in part to the comparative horror of “eSeminar”.

How to Write a Useful Bug Report

I work in the space between front-end web people (who write content and talk to web users) and back-end web people (who write code and build web applications). I help wonks[1] communicate with geeks[2]. A fundamental element of success in this pursuit is the Useful Bug Report. (There’s a bit of a rant coming up, so if you want to skip straight to the how-to, here’s your link.)

The tracking system we use for bugs/feature requests/etc. works well. It isn’t perfect.[3] It leaves room for gaps in wonk/geek communication. For example, the instructions for the “Description” field say:

Put as many details as possible in the Description, including links to the pages involved, usernames/emails, and your browser version.

These instructions aren’t very helpful to a wonk who is trying to write a report that will be useful to a geek. Minute detail does not necessarily produce good bug reports. On the other hand, neither does an “I was on our website and something is broken—I clicked something, and it looked weird” approach.

I just finished writing up a more practical “How To” sheet for the wonk side of my team. I would have thought this kind of information would already be widely available, but a quick Google search for “how to write a good bug report” led me to:

  • Specific bug reporting systems which require you to log in before giving you any information (boo), and
  • Some “how to write clear bug reports” tips from bloggers who don’t write clearly (by my standards) and/or are condescending.

Not terribly helpful. I shall now rush in to fill this information gap.

~~~

How To Write a Useful Bug Report

The way you describe a bug to developers can make a big difference to how quickly they can resolve an issue. The better the bug report, the less initial troubleshooting/diagnosis the developers need to go through.

Your “Description” field [in our internal system] should contain a little narrative that includes the answers to the following:

1. Where were you? (Website / product or application / URL / section of page)

2. Were you logged in, and if so, with what permission levels? (e.g., not logged in, logged in as user or admin)

3. What operating system and browser were you running? Include version numbers if you can (e.g., Windows XP 2002 SP 3 + Firefox 10.2.0). It can be illuminating to check a bug in multiple operating systems and browsers, and then add something like this to your description: “First encountered this on my work desktop setup, where I’m running Firefox 10.2.0 over Windows XP 2002 Service Pack 3. Tested in Chrome 17.0.963.56 m and Internet Explorer 8.0.6001.18702; same results. I also tested it from home where I’m running a (really old) Mac, OS 10.5, and Firefox 9ish; same results, only slower.”

4. Describe the bug:
a. What did you click, or what action did you take right before the bug made itself known (e.g., added something new to cart, clicked “Submit”)?
b. What did you expect to happen when you took that action?
c. What actually happened?
d. What does “fixed” look like? That is, do you want the fix to reflect the “what did you expect” state above, or something else—like hiding the broken thing, or reverting to an earlier build?

This should give [our in-house development team] enough to go on to make a good start at diagnosing the issue, and save a bunch of back-and-forth emails or ticket comments.

~~~

Remember, dear reader: The above is a basic cheat sheet I wrote for my own team. Your mileage may vary. Other things like bandwidth/connection type, other programs you were running (like adblockers or portals like Blackboard), whether you were streaming Pandora at the time, etc. can also be helpful diagnostics. This isn’t the One True Way. It’s my way, today.
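To make the cheat sheet concrete, here’s a hypothetical filled-in Description—every site, URL, and version number below is invented for illustration, not pulled from a real ticket:

“I was on our public site’s checkout page (example.org/cart), not logged in. I’m running Firefox 10.2.0 on Windows XP 2002 Service Pack 3, and I saw the same behavior in Chrome 17. I clicked ‘Submit Order’ and expected to land on an order-confirmation page. Instead I got a blank white page, and when I hit the back button, my cart still showed all my items. ‘Fixed,’ for me, means the confirmation page loads and the cart empties.”

A developer reading that can already rule out a browser-specific glitch and knows exactly what “done” looks like—all before opening a debugger.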

[1] Wonk: I use this term to mean “Expert; having a deep and detailed understanding of a particular content area which is little known by the general public.” Etymological sources differ; some suggest it is drawn from “know” spelled backwards, as in “to know your material backwards and forwards.”

[2] Geek: I use this term to mean “A person who is proficient in digital technologies; who knows how to write at least one type of computer code and/or is comfortable installing, troubleshooting, or repairing computer and networking hardware.” As a person who identifies as a wonk and a geek (as well as a nerd and occasionally a dork), I use these terms with love and respect.

[3] None of the bug/feature tracking systems I have used are perfect. They all have quirks, or leave out something you wish they’d included, or have more features than you’ll ever use.