Another

Another death. Another killing.

Another explosion.

Another shooting.

Another outrage perpetrated by those in power on those with none.

Another set of babble and bile from the privileged talking heads.

Another day of confused thoughts and feelings no one knows what to do with.

I dread hearing the word "another."

An improv principle and event planning

In the winter of aught-six, I took a beginning improv class at DSI Comedy Theatre and learned a lesson that I pass along almost as holy writ to others.


As an actor who always preferred having a script in hand, lines memorized, and my character's blocking pencilled in, improv struck me as being as insane as Frost's tennis without a net. It scared me to death, yet I always loved watching it. It was a mystery. So I decided to face up to it and took the class, which was excellently led by Ross White (who now edits Inch).

Ross taught us that there are -- invisible to the audience -- many little rules and guidelines to improv, and, weird as it is to say, I actually did feel my mind change its shape and stretch to make new neural connections about halfway through the six-week class. Many of my assumptions about how to act onstage were challenged.

Of the many things Ross taught us, there were two principles that stuck with me. The first was counter-intuitive. In a two-person scene, when given a random prompt ("You're a chocolate chip in a cookie -- go!" "You're superheroes who work as janitors -- go!"), the pressure is on and the temptation is to force the scene to a conclusion, to lift it and drag it -- and your partner -- to where you think the scene should go.

As Ross explained, though, that's working way too hard. Beginners think that, in a two-person scene, one actor needs to give 50% and the other actor needs to give 50%. In fact, Ross said, the breakdown is this: one actor gives 25%, the other actor gives 25%, the situation gives 25%, and the audience gives 25%.

When you look at the scene in that way, you can relax and make more impact with less effort. It's not all up to you; you're simply one of many people ensuring that the scene will succeed. Everyone is pulling together.

Which leads to the second principle: trust. You simply lie back, let the water support you, breathe, and just float. Relax and trust that your partner will contribute ideas to the scene (though you have no clue what they'll say till they say it, and they probably won't know until then either), trust that the scene will take wing and lift off, and trust the audience, which really does want to see you succeed. When I learned to stop forcing the punchline or the moment, I found myself enjoying improv more and being delighted and surprised at where our improvised scenes wound up.

I explained the 25% principle to fellow neighborhood association board members last night after they congratulated me for working so hard to organize the NNO event. As I told them, it wasn't that difficult. I only had to give 25%, the other board members gave 25%, the potluck event on its own gave 25%, and all the neighbors who came out to eat and meet gave 25%.

After people started trickling in to put their dishes on the tables, it really was like an improv scene come to life. No one could control what was going to happen next; you could only go with the flow, work with what your partner gave you, and trust that it would all turn out just right. Which, I'm pleased to say, it did.


A student or a scholar

One of the things I discovered about myself during the past year is that I'm a student, not a scholar. I've always thought of myself as a "lifelong student," but I'm not sure I really understood what that meant till recently.

In my view, a master's candidate is a student; a PhD candidate is a scholar. The differences are many: the difference between an amateur (student) and a professional (scholar), between the minor leagues and the major leagues, between levels of commitment in time, energy, passion, and dedication.

For me, a lifelong student retains the joy of learning new things and loves sampling the buffet. That's been me, and that will probably always be me. The scholar, I think, takes a deeper interest and is best served (at least in their early years) by not flitting from flower to flower. Also, given the way academe is structured, scholars are professionally groomed for a tough job market; the decisions they make today about the research they publish will have repercussions years down the line. The student, I think, lives more in the moment, or at least has a shorter time horizon for satisfying their desires.

As I'm sure I've said in other posts, I like taking classes. In my brief experience, this seemed to separate the student from the scholar. I think I'm one of the "Scanners" that Barbara Sher describes in her book Refuse to Choose: someone who loves the novelty and variety of learning and resists being constrained to a single specialty.

Reminds me of these quotes by Bill Moyers on the fun of being a journalist:

A journalist is a professional beachcomber on the shores of other people's wisdom ... A journalist is basically a chronicler, not an interpreter of events. Where else in society do you have the license to eavesdrop on so many different conversations as you have in journalism? Where else can you delve into the life of our times? I consider myself a fortunate man to have a forum for my curiosity.

Had I stuck it out in the PhD realm, my chosen research style would have been that of a journalist. The challenge for my life now, I think, is to elevate that curiosity and focus from a hobby done in my spare time to a respected place of prominence at the center of my life and how I choose to spend the rest of my years on the planet.


Research Journal for my 780 class


Since our 780 Research Methods class doesn’t have a Blackboard site, I’ll post my various links and thoughts to the blog, tagged with “780.”


I wonder if Michael Apted’s wonderful Up series of documentary interviews would be an example of a kinda sorta longitudinal study or panel study? When a new film comes out every 7 years with updates on these people, it’s always fascinating to see where life has -- or hasn’t -- taken them. Instead of gathering statistics about a large group of people, there's something very satisfying about getting to know a small group of people very well.


We’ve been talking about experiments, planning a study, theories, types of studies, etc. One of our last readings was about where one gets ideas for theories. This reminded me of Seth Roberts, a Berkeley researcher in psychology, who frequently touts self-experimentation as a way to generate research ideas. This is one of his more famous papers. He maintains an active and entertaining blog.

What I admire about Seth Roberts is his abundant idea-generation and his zeal for measurement and record-keeping. His goal is to experiment on himself first; if his data indicate possibly interesting results, he proceeds with more methodical testing and inquiry, possibly leading to more formalized studies (or not).

When I think about possible studies I might like to try, I remember this quote from one of his blog posts:

SR: Tell me something you've learned about research design.

BW: When I was a graduate student [at the Stanford Business School], I would jog on the school track. One day on the track I met a professor who had recently gotten tenure. He had only published three articles (maybe he had 700 in the pipeline), so his getting tenure surprised me. I asked him: What's the secret? What was so great about those three papers? His answer was two words: "Cool data." Ever since then I've tried to collect cool data. Not attitude surveys, which are really common in my area. Cool data is not always the easiest data to collect but it is data that gets buzz, that people talk about.

Thinking about what “cool data” might mean in a digital curation, archival, or info-science context can be tough. The social networks are certainly perceived as cool, and you can do cool stuff with them, but I’m not that curious about them. I feel like, were I to study one of them, I’d just be chasing a parade that’s got a five-mile head start. Better to find my own parade. :)

Curiosity is probably what drives me. Certainly, one of the itches that a researcher must scratch is his or her own personal obsession with some nagging question or detail that no one has really addressed or answered to their satisfaction. (The same way most writers have to write their own poems, stories, and plays, because no one else is publishing what they want to read.)

Check out his numerous posts tagged scientific method (though he’s usually more critical of scientists’ behavior than of the method itself) and self-experimentation for more.


Another great Seth Roberts post that got my attention was this one on appreciative thinking, especially as it relates to reading journal articles. I see what he describes in the classes I attend, where we read a paper that’s 1 year, 5 years, or 10 years old, and it’s rather thoroughly shredded during the ensuing discussion for any number of reasons (and I've been guilty of trashing articles myself).

Instead of this negative critical thinking, I like the questions he suggests asking, especially the simplicity of his fifth: “What’s interesting or enjoyable about it?” Even if I find the writing of an article stilted or atrocious, I think it should be possible to at least admire a piece’s energy, its intent, its point of view, its ability to stir thoughts in me, etc. Saying something constructive is not about becoming a positive-thinking ninny; it's about seeing more sides of the issue than only one.

Even for a piece (Mabry's "Reference Interview as Partnership") that didn’t really touch me, I appreciated that this was the author's distillation of a career’s worth of lessons that she wanted to impart. In my summary of the piece, I said I could see it being used to start a conversation about one’s own personal manifesto for serving at a reference desk. We’re not often asked to reflect on our larger purpose or philosophy when it comes to our jobs, or even our career, so I saw the Mabry piece as a terrific starting point for such a conversation.


Speaking of writing up experiments so they’re repeatable -- how often does repeating an earlier experiment really happen?


On specifying your terms

One of the books I read over the Christmas vacation was Writing the Mind Alive, which one Amazon reviewer tags as the book to go to after freewriting has taken you as far as it can. I used to write morning pages and still enjoy journaling, but I'm always open to new approaches and methods (the Topics du Jour approach being one that has most impressed me recently). The method was created by two ex-academics, who lead workshops on it. Their web site is here.

The book is an easy and breezy read, and I appreciated the inclusion of students' "Writes" (as the authors call them). The method is straightforward and, as some of the Amazon reviewers note, not all of the ritual surrounding the Write--which includes lighting a candle and playing Baroque music--is really necessary. Also, the book (as one would expect) sings the praises of its "proprioceptive method," crediting it with real and affecting emotional benefits for its practitioners.

What most separates the method, for me, is its direction to use what the authors call the “proprioceptive question” or PQ. As one transcribes one’s inner monologue and writes, “I hate it when my mother does that,” the method directs one to listen to the voice and then ask, “What do I mean by ___?” In this case, “what do I mean by hate?” or “what do I mean by that?” And then write out what you mean.

The goal, as the authors explain, is to dig out those details that are glossed over by the wallpaper words we use to not look too closely at the things that bother us. What do I mean by things? Remembered events, memories, assumptions, images, long-buried hurts, and the like.

I usually count a technical book a success if I can get at least one good idea out of it. I consider the PQ a good idea and one I’m going to start using in my own journaling. As I’ve found in my coaching and in monitoring my own self-talk, I will often make a blanket statement as if to say, “Of course, what I've said is true and inviolable and not to be questioned.” But as I’ve gotten rid of various blocks, taken risks, and experienced successes, I’m seeing more and more the value of exercising some healthy skepticism by making my assumptions explicit and bringing them out into the open where they can be dealt with.

The whole idea of defining one’s terms first hove into view thanks to an email newsletter from Laurie Taylor, host of BBC Radio 4’s Thinking Allowed. I admire the way Taylor always attempts to connect that week’s program to a personal anecdote, and his lighthearted style is welcome. Here’s his 21-Dec-2008 newsletter:

I had a university tutor in psychology who was popularly known as Doctor Dit.  For a couple of terms I assumed along with my fellow students that this was an innocent nickname.  But then one day I was told by a postgraduate that it was really an acronym.  It was not DIT but DYT and the letters stood for Define Your Terms.

It was a very appropriate designation.  Whereas other tutors would positively encourage some debate in their seminars, the man known as DYT would immediately bring any such discussion to a halt by a demand for definitions.  It was not unlike being repeatedly hit over the head: ‘Right.  Taylor, what is the value of optical illusions in the study of perception?’ ‘Well,’ one would begin, ‘When your eyes are deceived it could be that the deception is the inappropriate application…’ ‘Not so fast, Taylor.  You said ‘deception?’ ‘That’s right’ ‘Define your terms.  Define your terms.’

Over coffee in the basement canteen we’d wonder about the nature of Dyt’s home life.  We’d construct scenarios in which Mrs Dyt turned to him over breakfast coffee one morning and announced her dissatisfaction with the sexual side of their marriage.  ‘We don’t make love any more.’  That would really get Dyt going.  ‘Make love?  Make love?  Define your terms.  Define your terms.’

Now that I look back on my time with Doctor Dyt, I feel more sympathetic to his intellectual crusade.  What he wanted to do was purge the world of all ambiguity and ambivalences.  He envisaged a time when people only used terms with precise definitions, a time when every flower in his intellectual garden would be precisely labelled.

Only when we reached that happy state, when the undergrowth of uncertainty had been cleared away, would we be able to arrive at hard and fast truths about the world.

But, of course, Dr Dyt’s enterprise was doomed to failure.  Words simply won’t sit still and have precise definitions hung around their necks.  Their meaning slips and slides: it is determined as Wittgenstein maintained by their many uses:

“Think of the tools in a toolbox: there is a hammer, pliers, a saw, a screwdriver, a rule, a glue-pot, glue, nails and screws.  The functions of words are as diverse as the functions of these objects.  (And in both cases there are similarities.)  Of course, what confuses us is the uniform appearance of words when we hear them spoken or meet them in script or print.  For their application is not presented to us so clearly.”

I’ve plucked that quotation from the introduction to Key Concepts in Education, a new book by Fred Inglis and Lesley Aers which doesn’t so much offer clear-cut definitions of such familiar educational terms as Assessment, Citizenship, Curriculum, Literacy and Pedagogy, as show how such terms have been variously used by people with different material and philosophical interests.  Dr Dyt would not have approved.

I think Taylor heightens his professor’s point of view and his own reaction for comic effect and to make a better point for the newsletter. But as I read about Dr. Dyt’s approach, I was thinking, “Yep, yep, good for you. That’s the way to do it. Don't lecture. Let the student teach themselves.” When working with my coach or talking with one of my mentors at school, I find I do often have to say my assumptions out loud. I frequently find that the other person has a different assumption or interpretation of the term or concept; because I'm open about what I think I mean, they’re able to either set me straight or give me new information I wasn’t aware of.

A very simple tool, and easily dismissed because of its simplicity. But I think it has great potential, particularly in school, where I’m asking questions to associate new knowledge with old mental structures so as to create new ones. Defining one’s terms also helps thicken those endless essays and papers. And as I prepare to move into a potential role as a teacher, I’m wondering how best to deploy this tool for good educational effect.

The key is to not become Dr. Dyt (if he was ever like that). Don't ask the question endlessly of every word or idea that comes your way. Set boundaries so you don't distract yourself. In the proprioceptive writing method, the Write is limited to 25 minutes, with specific follow-up questions to help the writer link the new information from the Write into a larger mental frame.

Michael Lewis, in a remarkable article on the end of the Wall Street boom he documented in his book Liar's Poker, offered up this choice anecdote on the value of getting specific:

Both Daniel and Moses enjoyed, immensely, working with Steve Eisman. He put a fine point on the absurdity they saw everywhere around them. “Steve’s fun to take to any Wall Street meeting,” Daniel says. “Because he’ll say ‘Explain that to me’ 30 different times. Or ‘Could you explain that more, in English?’ Because once you do that, there’s a few things you learn. For a start, you figure out if they even know what they’re talking about. And a lot of times, they don’t!”

This also reminds me of the five whys method, used to find the root causes of problems--ask "why" up to five times, then stop. No need to burrow further down the rabbit hole (or your navel) to find the ultimate cause; after a certain point, you have to stop and put that information into action. Otherwise, the questioning becomes an exercise in itself, rather than a means to an end.

There’s a famous story of Confucius from the Analects:

Chi Wen Tzu always thought three times before acting. When Confucius heard of this, he said: "Twice is enough."


Overreactions and decisions

The SILS MSIS curriculum requires a master's paper or project, and the professors of even the core required classes encourage students to begin thinking early about likely topics. Fortunately, you can review a database of previous master's papers from SILS graduates to gauge the scope and treatment of the topic areas. As a result, I'm always on the prowl for good topics, for others if not for myself (I may have my own gem of a topic, but it's too early to talk about it now). Earlier this year, I ran across the following Schneier on Security blog posting on the public overreaction to rare risks, written in response to the Virginia Tech shootings. It's a sobering testament to how human we are--which is a mixed blessing, in this case.

I was especially struck by the following comment on the post:

As a student of behavioral decision making, I see irrational decisions made on a regular (and unfortunately, in many cases, predictable) basis. And as you alluded to, the reactions to these can often lead to ridiculous policies and unproductive debate over preventing the effects, not the causes. However, there is something so human about these errors that seems to be impossible to overcome. The real next frontier, in my opinion, is to understand these biases better, and to use them (perhaps through policy) to aid in productive, positive decision making.

The world of economics has its own problems with this, since so many of its models assume rational consumers. Define "rational." (Today, I spent a half hour in Circuit City looking at stuff so I could spend a $25 gift certificate, only to find at the counter it was a Best Buy certificate.)

So, in relation to research for a master's paper: how much information does a user need to absorb before making a decision? That topic has surely been done to death. But even if you take in just enough information, not too much, when would information overrule emotion in the decision-making process? Can it ever? How can you measure the before and after of an emotional (i.e., unconscious or reactive) decision? Or could you build an interface or algorithm that either allowed for users' unique mixes of rational/irrational, naive/experienced, emotional/logical, etc., or confronted them with the results of their choices? How do you build in bias when the user wants it but leave it out when the user needs it to be left out?