Monday, November 10, 2014

Feminist Confessions


This was originally posted on the PostCalvin Blog, but since their site has been having trouble lately and I wanted to reference this particular post in another, I felt a repost here would not go amiss.

When I think about feminism I often remember Sarah, a girl I played soccer with at Calvin. We were a tough team, known in the league (a friend from Albion once told me) for hard knocks. On picture days, girls who wore their hair down for a prettier photo were mocked. It’s soccer, we said. We’re here to win, not to look pretty. As if that were a self-evident dichotomy.

Sarah did both. Some days I would meet her on the pathway down to the lower practice fields. My ankles were taped up and I was banishing poetry class from my thoughts, trying to focus on drills and sweat and the upcoming game. Sarah would find a dandelion and stick it in her hair. She had wide blue eyes, and her hair was always flying out of any ponytail she put it in. I’m not sure she ever pointed out the shapes of clouds to me as we walked together, but it feels true when I remember her.

Back in my first year at Calvin I was introduced to feminism by Simona Goi, my political science professor. I adored that class—the readings, the lectures—but the only moment from it I actually remember is the day Professor Goi gave us her policy on pronouns. No more “their” or generic “his”; use “his or her” or switch between them.

We protested. It sounds awkward. “His” means the same thing; we know it stands for both. When Professor Goi, visibly disgusted and frustrated, asked the women in the class whether we minded being left out, I remember not caring, thinking she was blowing the importance of pronouns out of proportion. I shrugged along with the others: exiled from the text, a small sacrifice for the sake of fluid prose.

I don’t remember who changed that for me. Maybe it was the Bechdel test. Maybe it was Joss Whedon. Maybe it was how ridiculous people sounded talking about Hillary Clinton’s hair instead of her politics. It might have been my self-defense instructor, her short, silver hair a curly halo as she led us in meditation one minute and then explained every way to gouge out an attacker’s eyes the next. It might also have been Pastor Mary, whose relatable and eloquent sermons reminded me—I don’t confess this lightly—of the disdain a younger me felt when I attended a wedding presided over by a female pastor.

Sometime around my sophomore year, Professor Vander Lie challenged my linguistics class to observe, in our other classes, how many times female students spoke versus how many times male students spoke. Most viscerally I remember the men in my theology class, though outnumbered there, volunteering information twice as often as the women. Five years after that little experiment I’m still horrified by the ratios I see in my own classroom.

Somehow, years before, I’d put myself in a box. I could either be pretty or a bad-ass soccer player, not both, and it was obvious which was the superior choice. I began to recognize that my submission and silence for the greater societal good—clean pronouns in prose, for instance; less contention in class—harmed both myself (a loss of confidence) and others (a loss of diversity). We are never one thing, nor should we be. My teammate Sarah probably still comes to mind because I try to emulate the freedom she had. She was, like we all are, vulnerable, intelligent, whimsical, hard-working, clever—a million different things all at once.

Self-Selection


Just last month I wrote a short piece about my journey to feminism, a gradual awakening, much the way America’s love for Taylor Swift has slowly become acceptable to admit. This past week I was encouraged by a similar story from Eileen Pollack, a physicist-turned-writer, who came to Purdue University to discuss why women self-select out of STEM disciplines.

I found her story compelling. Pollack grew up in a rural, underfunded public school system that offered no advanced courses. Some of the parents in the district complained, and a new system was set up: students who did well on a test would be advanced a year in science and math courses starting in middle school, and could then attend classes at the local college during their final years of high school. Pollack was disappointed when she was not advanced along with many of her male classmates, and livid when she found out the principal had held her back because, “everyone knows girls don’t go on in math and science and it would have been a waste of a seat.”

So she taught herself calculus and got herself into Yale, where she majored in physics—publishing a couple of papers along the way. At the end of her four years, she quit the sciences and turned to creative writing (she has since published two books and several collections of short stories and currently teaches at the University of Michigan). Why? Why would such a gifted person leave the STEM disciplines when she clearly had the potential for a stunning career? By all accounts she was a brilliant physicist, doing all the work required of her with little to no institutional support. Why would she fail to even consider a continued life in the STEM disciplines? Why would she fulfill her middle school principal’s prognostication that she, like other women, would not go on in the field?

These are important questions with sad answers. Pollack didn’t get “distracted” by a desire to make a family. Nor did she lack the innate ability to do math or science because of her gender. No one overtly bullied or harassed her for being one of only three women in the entire physics department. They ignored her and called it equality.

Pollack spoke of the time her pantyhose caught fire when she spilled a chemical on them during a college lab; as the hose went up in smoke, she screamed in surprise at her bloody, burned leg. She was wearing pantyhose that day because she’d come straight from temple on a religious holiday, but other women have recounted the challenge of dressing for the sciences: too feminine and you don’t get taken seriously; too nondescript and you’re no longer “womanly.”

Pollack described the demographics of her classroom and the physics department—white men and a few white women—the pictures of past physicists (white men), the study groups conducted only in the boys’ dorms, the feeling of always being behind, outside, a token woman—representing women everywhere if she failed in her quest and an outlier if she succeeded. And so, without even considering it, Pollack threw away her career in physics for something less isolating, where her work could simply be work and not something she had to prove.

Wednesday, October 29, 2014

Top 5 Ways to Divorce Work from Life

A new meme called “Old Economy Steve” will, according to Buzzfeed, “enrage millennials everywhere.” But considering a millennial probably made it in the first place, the meme seems designed more for millennials to commiserate with one another and to tell their parents on Facebook about horrific job prospects and conditions.

Born and raised on a Protestant work ethic and on their baby-boomer parents’ evidence that hard work produces success, many millennials struggle with self-imposed overwork, chasing an undefined success. A culture of excess. A culture of overwork. For some of us that means grad school. Others are in their first few years of teaching, wading through hours of unpaid lesson planning and grading. Others have landed that great job Old Economy Steve always told them about, but with technological advances and job scarcity, the 40-hour workweek is a distant dream.

The assumption for many is that work should be the highest priority of a person’s life (Bourne & Foreman, 2014). For those of us who overwork, it’s sometimes hard to recognize when we’re not working (Beatty & Torbert, 2003). Thus, without further ado: the top 5 ways to divorce your work from your life.

     1.      Don’t Drink Wine while you work. Or watch Gilmore Girls or 24. Or eat your favorite food (cookies!). Combining work and leisure is called “working lite,” and it’s the easiest way to extend your workweek. Work bleeds into leisure until you can’t tell whether grading tests with a glass of wine counts as taking the night off. Save the luxuries for leisure time; you’ll enjoy them more.

     2.      Be Old-Fashioned about your phone. For some people this will mean avoiding email after work hours or (for those who can’t be so strict) at least during meal times and the hour before bed. Studies show that using electronics before bed makes it harder to fall asleep. At the very least, avoid the internet while watching television or reading books: other studies show that switching from task to task (even between tabs in a browser) exhausts your brain.

     3.      Protect Your Living Space from work. This is an old creative writing technique: you have your workspace and your living space. Don’t confuse the two, or your work will consume your life and your life will bleed into your work. In some ways, that’s okay (my co-author Virginia will inform you why later). But making life and work synonymous also increases less-productive working hours and makes leisure hours less enjoyable.

     4.      Pick Your Moments of productivity—and exploit them. Corollary: Know Thyself. I visited home a couple weekends ago and when I arrived, dead tired, at around 9:30 p.m., my mother was already asleep on the couch. By 10:00 p.m., the rest of us were similarly unconscious. Like my mother, I’m freshest in the morning. By around 8 p.m. I’m useless. Everyone has a different schedule for productivity: know when you can work hard and work then. Know when you can’t and use those times to unabashedly relax.

     5.      Look At Your Feet. Frederick Buechner writes, “If you want to know who you are, if you are more than academically interested in that particular mystery . . . watch your feet. Because where your feet take you, that is who you are.” Where you go and where your feet spend their time reveals your deepest values. If you want to value life, spend some time living outside of work.

Caveat: I have broken most of these rules in the last week. Good luck to you!

Monday, October 27, 2014

Cybervetting: Why I delete people on Facebook

    Right before my high school graduation, an instructor commented on the promises seniors make to stay in touch with one another. Drawing on his own experience, he told us that no matter how committed we felt to each other, we would eventually lose touch with most of the people we were close to in high school. I heard him out and then calmly informed him of how wrong he was: when he graduated from high school, social media didn’t exist, which made it far harder for people to stay in touch.

    A couple of years later I became interested in exploring what others could learn about me just by clicking on my profile. Facebook allows you to view your profile as another person would see it; when I did so, I was pleased to find that a lot of my recent content was blocked from public view. But when Facebook introduced its new timeline feature, I noticed how easy it was for anyone to scroll through my earliest posts, which were not as well protected from public view. This presented a new threat: people viewing my profile could now access a portrayal of me as a high school student.

    My concerns were justified earlier this semester when I read a piece by Berkelaar and Buzzanell (forthcoming) on the topic of cybervetting, which they define as “the process whereby employers seek information about job candidates online” (p. 6). Their paper describes a study in which 45 employers discuss their practices for hiring new employees. Online media (especially social media) afford them the ability to find more information about potential applicants. Reading through the article, you quickly realize the ethical implications of this practice. Berkelaar and Buzzanell point out that employers often stop at the first “red flag,” without confirming its validity or examining its context. The information they uncover includes visual and relational information.

    Visual information can be especially detrimental because it gives the employer insight into aspects of a person’s life that are illegal to ask about during an interview. For example, photos can reveal a person’s sexuality, their family plans, even their drinking habits. Relational information is based on a judgment of a person’s network, or who someone has on a friends list. My biggest concern was being found guilty by association. To me that came in the form of being tagged (relational information) in photos (visual information) with comments that went against my beliefs and values.

    Within my own profile I often filter, block, and change the viewing settings of a lot of the content posted on my page. It is a bit more trouble, but I find it worth the effort so that I am not associated with someone who makes unsolicited derogatory comments on content I post or am linked to. To relate this back to my high school friends: I found myself deleting many of them, not because I had a personal vendetta against anyone, but because I realized how little personal value having a person listed on a website actually holds, especially when it is someone I haven’t spoken to in years. In the end I would rather be associated with the persona of “MA student, family-oriented” than be written off as “high school yearbook editor, has a lot of friends.”


Berkelaar, B., & Buzzanell, P. M. (forthcoming). Online employment screening and digital career capital: Exploring employers’ use of online information for personnel selection. Management Communication Quarterly.

Saturday, October 18, 2014

The Confidence Gap

Feedback for girls and boys differs: “Boys’ mistakes are attributed to a lack of effort,” while “girls come to see mistakes as a reflection of their deeper qualities.”  
(Carol Dweck, Mindset – quoted in The Atlantic)

Last week, when I asked my aunt about mentors in big companies and the difference between men’s and women’s success (glass ceiling, etc., etc.), she said it had less to do with a lack of opportunities for women and more to do with the “confidence gap.”

This is the confidence gap: people who are confident in themselves get promoted past those who aren’t. They are hired more often, and they usually do better in interviews than people who lack confidence. The gap usually runs between women and men: men tend to be more confident in themselves, women less so. A May article in The Atlantic reported that even high-powered women—top CEOs, basketball stars, and famous news correspondents with every right to feel confident—doubt themselves. “Imposter syndrome” strikes women more regularly than men.

My aunt identified one possible reason the gap exists. While she was given plenty of opportunities to advance in the companies she’s worked for, she was usually given a trial period to prove herself before she got a promotion. Once she proved herself, she got the job. This wasn’t a hoop her male counterparts were usually forced to jump through. Moreover, she told me,

Men don't really like working with ‘girls.’ The boys club still exists—and since we may not do business the same way (golf course, football games)—they tend to gravitate to where they feel more comfortable.

It irks me that even my own fields—communication, literature, writing—fields that are in many ways dominated by women, continue to be somewhat “ruled” by men. The confidence gap exists in academic writing—one writing instructor reports that her female graduate students’ papers are speckled with “maybes”—and it is probably fed by consistently lower course evaluations for female professors. Even hiring is skewed in men’s favor: a resume with a male name at the top (“John”) inspires more confidence than an identical resume bearing a female name (“Jennifer”).

And it’s not just a “confidence” issue. One reason women get hired less (and are sometimes nearly fired!) is concern about their baby-making timelines. More on this later, but I’ll end with some advice from my aunt, a successful businesswoman and mother of two:

. . . don't quit to raise your children. Five years out of the workforce is like re-entering at entry level. Find a way to get some help. Your kids will be fine. They will grow up with a positive view of women as equals. Nothing wrong with that.

Saturday, October 11, 2014

Are You My Mentor?

“Here’s the problem, in short: The assertive, authoritative, dominant behaviors that people associate with leadership are frequently deemed less attractive in women.”

(Harvard Business Review, September 2010)

The article above is about mentoring in the business world and the difference between sponsorship (advancing your protégé’s career) and mentorship (growing your mentee into a better employee or leader). Women tend to be “mentored,” while men are more often “sponsored.” It’s a problem, for certain, as is the persistent belief that assertiveness is good for male leaders but not for female ones.

But what about for Christians? We don’t really need a sponsor, since our “advocate” is Jesus Christ (“If God is for us, who can be against us?” Paul exults in Romans). Mentors, however, should be integral to the Christian walk: someone to call us out when we plateau, someone to pray for us (and for us to pray for), someone who has climbed the mountains before us and can offer, if not advice, at the very least comfort.

I suspect, however, that many people let this get away from them. For one, it’s awkward. Graduate students like to call asking a professor to be their adviser “proposing” (as in a wedding proposal), except that unlike in a real proposal you barely know the person, it’s a professional relationship, and there’s a serious lack of equality in it. The process of gaining a mentor is similar: how can you ask someone to become your [best] friend without the benefits (equality of experience, mutual interests)?

Awkward from the mentor’s side, too: at what age is someone qualified—with all due humility—to be a mentor? How would one go about proclaiming himself to be mentor material without (in so doing) disqualifying himself?

Outside of the awkwardness, there are plenty of logistical issues on both sides. From the mentor: How will I have time to mentor someone? How will I know what to say? Who would want to learn from me? From the mentee: Who can I find to mentor me, given my personal background? Given my career aspirations? Given my location?

“[C]onversations with Christian leaders reveal that the number one reason they don't take on a disciple or facilitate a mentoring program is that they simply have no time.” (Christianity Today, 2006)


The entire New Testament is arguably a treatise on why Christians should be seeking mentors and, in time, becoming them. The gospels tell of Jesus’ mentoring the disciples. The epistles are steeped in Paul’s mentoring, and Peter calls for everyone to “Be examples to the flock.” Perhaps that realization—along with relevant experience—is what qualifies someone to be a mentor: if he or she recognizes the importance of mentoring and is willing to suffer the awkwardness and sacrifice the time. Perhaps someone bold enough to ask deserves a mentor. Perhaps the logistics are not nearly as important as the relationship.

Thursday, October 9, 2014

On Retirement

by Virginia Sanchez

Chapala, Jalisco, MX
    Chapala is a small city situated along one of Mexico’s largest lakes. When I visited this past summer, my uncle, who acted as our tour guide, pointed out the large number of US citizens who find it a great retirement location. Due to a strict time constraint I was only there for about half a day, but I quickly started imagining waking up in Chapala as a retiree. I could picture myself eating at the local markets, buying from artisans at the flea market, and taking pictures by the inactive volcano (a retirement feature I didn’t know I needed). Retirement is not a concept I think about frequently, and upon reflection I realized how odd it was that I had formed such a specific idea of what I wanted my retirement to look like.

    After interviewing 84 people about their perceptions of retirement, Smith and Dougherty (2012) concluded that their participants’ stories of retirement (or future retirement) lined up, forming a ‘master narrative.’ They also note the similarities between this narrative and the American Dream, pointing out how both define success as an individual pursuit. Defining success as solely individualistic has several negative implications, including placing blame on those who either don’t have the means to retire early or are unhappy in their retirement. The article asserts that for many older workers this creates additional stress, as those workers are “forced to live with the stigma of having brought this state of life on themselves” (p. 470).

    Given this heavily individualistic focus, I wonder what the master narrative would tell us about retirement if we defined success based on collectivistic, rather than individualistic, pursuits. At a micro level, our individual narratives would likely change. Rather than daydreaming about future travels or my daily schedule in Chapala, I might imagine devoting my suddenly flexible schedule to local community development.

    I would argue that, at the very least, we would change our perception of those individuals who are forced to continue working well into old age. The master narrative would also point to a larger societal problem rather than placing the blame on financially unstable individuals who must continue living by an employer-mandated routine. My questions to anyone reading this: What do you think a collectivist master narrative would look like? Are there any downsides to one?

Smith, F. L., & Dougherty, D. S. (2012). Revealing a master narrative: Discourses of retirement throughout the working life cycle. Management Communication Quarterly, 26(3), 453-478.

Tuesday, September 30, 2014

The Secret Skill for the Job Hunt

**This post brought to you by the folks at Webucator and their 2014 Most Marketable Skills Campaign! They have some great resources, including free, self-paced Microsoft training courses.**

Three years ago my aunt, a vice president of a well-known corporation in the U.S., laughed at the idea that an applicant for any job wouldn’t be vetted according to their social media use. “We can get in there,” she said when my sister and I brought up Facebook’s privacy settings. Since that moment I’ve come to realize what most millennials already know: employers are checking all job candidates’ social media accounts, not just LinkedIn.

In their forthcoming paper, Berkelaar and Buzzanell interviewed 45 employers in a variety of fields (IT, law, media/communication, etc.) about their hiring practices, looking at what kinds of information employers access and how they use it to evaluate potential hires.

Summarized from their findings:

The vast majority of employers acknowledged the importance of cybervetting in their hiring process. Employers are tired of clean cover letters, worked-over resumes, and recommendations that all say the same thing. To get a feel for the real personality of the candidates applying for their job postings, employers “do a quick Google search” or run a Facebook check on any and all applicants, from entry-level to executive. Most look first and foremost at pictures, vetting for unprofessional party photos or PDA. Fair or not, photographic evidence of “unprofessional behavior” in a candidate’s social life marks him or her as the kind of person an employer doesn’t want representing her company.

After that they’ll look at textual information: what, how, and how often a person is posting. Racist language, jargon such as “YOLO,” bad grammar, misspellings, or improper punctuation: any of these can disqualify a candidate immediately. Employers will even look for hobbies “incompatible” with the proffered job (such as “that Farmville game”), and they trust their initial impressions of candidates based on a quick social media scan. Candidates who spend “too much” time on Facebook or Twitter are judged as time-wasters. On the other hand, job candidates with no social media presence at all are also disqualified. As Berkelaar and Buzzanell (2014) write, “Just as lacking a credit history lowers credit scores, these data suggest that lack of visible online information negatively impacts employability assessments” (p. 25).
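
Reading those findings, the screening almost sounds like a checklist you could write down. Purely as an illustration (the paper describes human judgment calls, not an algorithm), here is a minimal, hypothetical sketch in Python of what such a rule-based screen might look like. Every profile field, keyword list, and threshold in it is invented for this example:

```python
# A purely hypothetical sketch of the "first red flag" screening described
# above. The profile fields, keyword lists, and threshold are all invented
# for illustration; the study reports human judgments, not an algorithm.
import string
from typing import Optional

UNPROFESSIONAL_TAGS = {"party", "pda"}   # assumed photo red flags
SLANG = {"yolo", "swag"}                 # assumed "unprofessional" jargon
POSTS_PER_DAY_LIMIT = 10                 # assumed "time-waster" threshold

def first_red_flag(profile: dict) -> Optional[str]:
    """Mimic an employer who stops at the first red flag found."""
    # No visible online presence is itself disqualifying; the authors
    # compare it to having no credit history.
    if not profile.get("accounts"):
        return "no online presence"

    # Pictures are checked first and foremost.
    if UNPROFESSIONAL_TAGS & set(profile.get("photo_tags", [])):
        return "unprofessional photos"

    # Then textual information: slang and posting frequency.
    words = {w.strip(string.punctuation)
             for w in profile.get("posts_text", "").lower().split()}
    if SLANG & words:
        return "unprofessional language"
    if profile.get("posts_per_day", 0) > POSTS_PER_DAY_LIMIT:
        return "time-waster"

    return None  # survived the scan

candidate = {
    "accounts": ["facebook"],
    "photo_tags": ["graduation", "party"],
    "posts_text": "YOLO, skipped class again",
    "posts_per_day": 3,
}
print(first_red_flag(candidate))  # -> unprofessional photos
```

Note how the sketch mirrors the habit, reported in the study, of stopping at the first red flag: one tagged party photo ends the evaluation before the candidate’s text is ever read.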

While the implications of these practices for privacy and work/life balance are, frankly, horrifying, knowing what employers are looking for gives job applicants a crucial leg up in the job hunt. One of the top marketable skills for recent or upcoming grads is the ability to craft an online presence that is attractive to employers. Employers are looking for millennials who are professional in their social media lives, adept at social media use, and high achievers with volunteer experience and a dense social network.

Source:
Berkelaar, B., & Buzzanell, P. M. (2014). Online employment screening and digital career capital: Exploring employers’ use of online information for personnel selection. Management Communication Quarterly.