Tuesday, June 21, 2016

60 is the New...60



I turned 60 on Sunday. In the days leading up to it, I said that on my birthday I’d be officially “old.” On my birthday I said that I was now officially “old.” And now that it’s past, I still say that I’m now officially “old.”

Many of my friends, however, are in denial…serious denial. They keep telling me that 60 isn’t old. Some of them have given me the line that “60 is the new 50”…which I guess supposes that 50 was the new 40. Of course, all these friends who tell me that 60 isn’t old have either already crossed the threshold themselves or are coming up right behind me.

But really people, there comes a point where you have to admit that you’re not young anymore, and 60 seems about it.

At 30, 60 definitely seems a long way away…and old. Let’s face it…it’s grandparent age, and by definition, grandparents are old. At 60, 30 seems like it was just yesterday…and young. In fact, I was joking to one of my co-workers that at 59, a 45-year-old woman seemed like “some young thing” to me.

Let’s face it, and face it honestly…people under 20 are kids, 20-29 are emerging adults [a term I got from Aziz Ansari’s book, Modern Romance], 30-39 are adults, 40-59 are middle-aged, and from 60 on up is old. You gotta draw the line in the sand somewhere. When you look at colors, there may be some debate as to whether a certain color is blue, green, or aqua; but there’s no confusing any shade of blue with any shade of yellow. That’s a line in the sand that everyone agrees on.

Now let’s be clear about something…old is not necessarily the same as decrepit. There’s an AARP video where “young people” are asked what age they think “old” is, and then to demonstrate how an “old person” would do certain things. It was sort of amusing to hear one of them say that 40 was old (remember that “young thing” I mentioned earlier?), and their perceptions of what “old people” in the 50 to 65 range were told me that they probably hadn’t spent a lot of time around their parents…who were probably in that age range. And yet…while many people are physically and mentally active up into their 90s, I have to admit that I work with a lot of people my age and older who do fit the stereotype of the frail, decrepit, not quite with it, old person. For every person who’s taking weekly East Coast Swing lessons and blogging about it, there’s another one using a walker who has a hard time understanding how to make simple phone calls on their cell phone.

So in case I wasn’t clear about it, my saying that now I’m officially old isn’t about saying that I’m falling apart (although I have to admit that parts of my body remind me daily that I ain’t no spring chicken). Rather, it’s recognizing that I’ve reached a significant milestone, a milestone that says that statistically, with luck, I’ve got about 20 years left.

And there’s nothing wrong with being realistic about it.

Next stop…70!

Tuesday, June 14, 2016

Good Advice vs "Blaming the Victim"

I get soooo tired whenever what would generally be considered useful advice on how to keep yourself safe gets tagged as “victim blaming.”

After Trayvon Martin was killed, I reflected on how my parents taught me that if you think someone’s following you, you shouldn’t turn around and confront them, and you shouldn’t run; but quickly walk to the nearest safe place and get help (see also: call the police). I suggested that had Trayvon done any of those things...or used his cell phone to call 911 instead of his friend...he might still be alive.

The response from people? I was blaming the victim. Eff no. I still blame Zimmerman. But by being a little smarter about how to handle the situation, things might not have escalated to the point where Trayvon’s dead and a lot of us would like to see Zimmerman follow him.

Similarly, I learned early on that if I didn’t want to be jumped and robbed, I didn’t walk through certain neighborhoods at certain times of day, alone...and definitely not with a lot of money on me. If I got jumped and robbed...IT WAS STILL THE FAULT OF THE ROBBERS, but there were things I could do to keep myself relatively safe.

The simple fact of the matter is that THERE BE PREDATORS OUT THERE. I should be able to walk through any part of town at 11.00 at night with $100 bills pinned to my clothes, without anyone coming up to me and trying to take some. A woman should be able to walk through any neighborhood at any time of day, butt naked, without someone going up to her and trying to sexually assault her. These are all shoulds. But the fact of the matter is, as I’ve already said, THERE BE PREDATORS OUT THERE.

And it’s not enough to say that predators shouldn’t prey. You need to be cognizant of the fact that these people are out there and learn how to keep yourself from falling victim to them. You can’t just say “Well, I shouldn’t have to do this; they should just learn how to behave.”

That. Doesn’t. Work. Denial of the reality doesn’t work.

And giving advice on how to avoid becoming the next person one of those predators preys upon isn’t “blaming the victim.”

It's trying to prevent one more person from becoming a victim.

I’ve heard this chorus of “blaming the victim” so many times that I’m beginning to wonder if it’s an unconscious way of trying to avoid any sense of personal responsibility for one’s own well-being, and of saying “I’ll do what I want. You’re not the boss of me.” But let me ask you this question: is it “blaming the victim” when we suggest that wearing seatbelts might prevent people from being thrown from the vehicle to their deaths in auto accidents…even those accidents caused by someone else?

Is it “blaming the victim” when we suggest that not being falling down drunk is a good way to avoid being jumped and robbed…or raped…or dying in a house fire that you’re too incapacitated to escape from?

Is it “blaming the victim” any time we suggest some simple ways that people might keep themselves safe?

To be sure, there are situations where people do seem to blame the victim. But let’s not confuse true victim blaming with good common sense advice.

I now await the hordes coming to uncritically accuse me of “blaming the victim.”

But remember...THERE BE PREDATORS OUT THERE.

And your opinion on whether or not there should be isn't going to change that.

Tuesday, May 17, 2016

Ebonics, Nu?

Lately I’ve been contemplating the language…dialect…whatever…that has been the bane of many “well-spoken”, “articulate”, and well-educated African-Americans for years. I’m talking about Ebonics.

I, like many of my African-American classmates when I was a kid, grew up in a house where we were not allowed to talk like that, and where “proper” grammar and pronunciation were insisted upon; because we didn’t want to sound like “those people.” In fact, I knew blessed few other African-Americans personally who spoke like that. Everyone I knew…in my extended family and elsewhere…pretty much spoke something very close to the Queen’s English.

So to us, what has been called either Ebonics or African American Vernacular English (AAVE) was simply poor grammar and poor pronunciation…both signs that you were “one of them.” As a result, I looked upon Ebonics/AAVE with contempt; and so when, a number of years ago, someone suggested actually teaching in this dialect, in order to reach more kids for whom that was their “first language”, I thought this was the most incredibly stupid thing I’d ever heard.

And then my daughter, the linguist, pointed out something that I hadn’t considered. She said that while it might sound like bad English, what we call AAVE actually has its own internally consistent grammar and syntax. The problem is that what makes perfect sense grammatically in AAVE sounds a whole lot like bad grammar in standard English.

Take for example the verb form “to be.” When you hear someone say “He be staying at his grandma’s house”, you’re probably thinking that they meant to say “He is staying at his grandma’s house”, but chose the wrong verb form. However, “He be staying” means something totally different from “He is staying.” The “be” implies a state of constant happening…as in “He is always staying at his grandma’s house” or “He is staying at his grandma’s house for the foreseeable future.”

And this got me thinking…suppose there was another language whose proper grammar and syntax sounded like a corrupted version of a closely related one? I didn’t have to go very far to come up with an example…I knew two of them personally: German and Yiddish.

Depending on who you talk to, Yiddish is either a dialect of German or a language in its own right. Yiddish has slightly different grammar and syntax than German, and uses some of the same words differently. As a result, it’s entirely possible that many speakers of standard German look upon Yiddish with the same disdain that many of us hold Ebonics in.

And yet, I love Yiddish. And while I love Yiddish, Ebonics don’t get no respect from me. As they might’ve said on the Lower East Side 100 years ago, “Farvosh ist das?” or “Why is that?”

I think there are a number of reasons for this. The first is that for centuries Yiddish was spoken all over the Jewish world, and not just in Germany. As a result, there was a thriving Yiddish literary tradition. And the fact that there was a thriving Yiddish literary tradition, with books and newspapers being written in it, means that this was the language of the educated as well as the uneducated. A second important reason is that Yiddish has its own alphabet. Well, not quite…it uses Hebrew characters, but still…the fact that it’s written with a different character set makes it read as a separate language that happens to be similar, and not simply as a case of bad grammar in the mainstream one. Perhaps if Ebonics had a similar literary tradition and used a different alphabet, it would get a little more respect?

So where does this leave me? It leaves me having to cut Ebonics a little slack. After all, if I can appreciate Yiddish for what it is, I should be able to appreciate Ebonics for what it is…grudgingly.

And this brings me to my other daughter…my daughter the smart-ass. She understands from her older sister that Ebonics has its own internally consistent grammar, and even understands how that grammar works. One day, at school, she heard one of her African-American friends speak a poorly-formed sentence, and said to him, “That’s not how you say it! You can’t even speak wrong right!”

The girl has chutzpah, that’s for sure!

Tuesday, May 10, 2016

Whose Business is It Anyway?

A few weeks ago my daughter and I were talking about one of her favorite movies…and musicals. We were talking about Legally Blonde. I’ve seen both, and as the grandson of a hairdresser, my favorite scene was in the courtroom when…oh wait…some of you haven’t seen it yet, and far be it from me to spoil it for you by telling you that Rosebud is the sled.

Anyway, we were talking about Legally Blonde, and my thoughts turned to the relationship between the recently murdered billionaire Hayworth Windham, his much younger wife Brooke, and his adult daughter Chutney (from a previous marriage).

There was a lot of cynicism about the marriage between Brooke and Hayworth, with the usual snarky comments about the gold-digging younger woman only going after the rich guy for one thing. And yet, we find out in a private conversation that his money wasn’t the attribute she found most impressive about him.

Still, though, many people who weren’t privy to what actually went on in the relationship…and couldn’t possibly be inside either of their heads…ascribed the worst motives to the behaviors of each of them. She couldn’t possibly love him; after all he was so old. And him…what could he possibly see in someone that young…besides a hot body? Shame on him! She was obviously using him for his money and/or he was obviously only using her for sex.

There was also no love lost between Chutney and Brooke. Chutney felt the same about Brooke as most of the cynics, and hated seeing her father with someone who was roughly her own age.

But here’s the important question: Was it really anyone’s business but Brooke and Hayworth’s?

Really…was it?

Say what you will about what it may have looked like to you; if Brooke made Hayworth happy, was it any of Chutney’s business? Is it even any of our business?

Some people might argue that Brooke didn’t really love him, and was just pretending, in order to get at his money. But folks, if she acted like she loved him, and if that pretense made him happy, shouldn’t that be enough? Shouldn’t it be enough that she made him happy?

Oh, but some people might argue that if he was happy, he was happy for the “wrong reasons”, and that he should know “the truth.” But despite what the Bible says, the truth doesn’t always set you free. Sometimes it makes you miserable.

Besides…maybe what you think is “the truth” isn’t quite so true after all.

But getting back to the point, even if Brooke was using him, and putting up a pretty good act; if that “act” made him happy, shouldn’t that be enough?

Even if Chutney found their relationship embarrassing and distasteful; if Brooke made her father happy, shouldn’t that be enough? Shouldn’t she want him to be happy? Aside from possibly having to split the inheritance, if Brooke wasn’t mistreating him, what business of it was hers, really?

Yes, the story of Hayworth, Brooke, and Chutney has me wondering why so many of us tend to stick our noses into other people’s relationships because we don’t like them, or because we find them distasteful. And let me say right now that relationships where there’s abuse, or other serious problems, are totally different situations. But if the other person is simply the “wrong” age, ethnic group, religion, sex, height, hair color, whatever…as long as he or she makes our friend or family member happy, shouldn’t that be enough? Shouldn’t we be glad that our friend or family member has this person in their lives?

Or are we so selfish and closed-minded that we’d rather have our friend or family member be lonely?

Even when there’s no billion-dollar inheritance involved.


Tuesday, February 9, 2016

The Most Segregated Hour?

I’ve heard it over and over again…that the hour between 11.00 Sunday morning and 12.00 noon is the most segregated hour in America. Why? Because it’s when we all go to our separate, racially-divided, churches.

Except that I don’t buy it.

Why?

Because sometimes something that clearly looks like one thing on the surface is actually something else.

Imagine walking into a college dining hall and seeing three tables: one has nine black students and one white; one has eight white students, one black, and one Asian; and the other has eight Asians and two Hispanics. It would be easy to assume that for the most part, these tables are intentionally divided up by race or ethnicity.

Until you did a little digging…and found that the 10 students at the first table are all from Buffalo, the 10 at the second table are all in the choir; and the 10 at the third table all live on the same floor of the same dorm. People do like to hang out with people who they have some sort of connection with, whether or not it’s one that’s obvious to others.

Now let’s take another look at that “most segregated hour.”

I’ll grant you that some denominations were created from rifts over racial issues. Many denominations split in the run-up to the Civil War over the issue of slavery. For example, the Southern Baptist Convention was founded in 1845 as the result of a dispute within the greater Baptist church over whether or not slave owners could serve as missionaries. The Presbyterians split in 1861 over the issue of slavery, but for the most part rejoined in 1983. The African Methodist Episcopal and African Methodist Episcopal Zion churches were created when African-Americans left certain Methodist congregations in the 1800s to form their own, as a result of discrimination against them.

But they don’t tell the whole story. Some denominations exist because of the traditions that different ethnic groups brought with them when they came to this country. Despite its goal of being a more inclusive and representative denomination, my own Evangelical Lutheran Church in America (often referred to as the ELCA) is still a largely ethnic church, having large numbers of people of German and Scandinavian descent; those people having brought the Lutheran tradition with them from their home countries. And as Scandinavians and Germans are to the Lutheran Church, Italians, Irish, and Hispanics are to Roman Catholicism.

I grew up in the Episcopal Church, in a congregation with a growing number of African-American members; and I had a fair number of friends who were African-American and Roman Catholic.

Of course there are theological differences; while both are Christian, the largely white Roman Catholic Church and the largely black AME Church have slightly different slants on Christianity, and that’s a very important thing to consider. In addition, there are just plain stylistic differences. Many people wouldn’t know a theological difference if it bit them on the nose, but change the music, and you’ve got issues. I’ve often said that given a choice between a black church that did Bach and a white church that did Gospel, many African-Americans would choose the white church…not even thinking about the theology.

And what is Gospel music anyway? To a black audience it means one thing, and to a white audience it means something else. So when a large enough number of African-Americans join a certain congregation that the musical style starts to change, are the white people who leave doing so because they’re racist or simply because the style has changed? And if it’s the latter, haven’t a lot of us done that, no matter who we are or what church we’re in?

So…is the hour from 11.00 Sunday morning until noon the most segregated one or the most diverse one? I say it depends on how you define diversity. If you look at the cereal aisle in the supermarket, the question is “are we looking at the aisle or at the box?” If we’re looking at what’s inside of individual boxes, then a box of Corn Flakes is all Corn Flakes, a box of Wheaties is all Wheaties and a box of Cheerios is all Cheerios. Seems pretty segregated to me. But if you look at the aisle, and all the choices you have, from Corn Flakes to Wheaties, to Trix, to Apple Jacks, to Cinnamon Harvest, and who knows what all else; that’s a pretty diverse selection. And it’s a selection that gives everyone something to choose from.

Calling it the most segregated hour implies that people have no choice as to where they worship, and are forced to worship along ethnic lines. But the fact that I grew up in the Episcopal Church and am now a Lutheran, the fact that I knew black kids who were Catholics, and the fact that I know Italians who belong to traditionally black churches, prove that that’s not the case.

Sometimes you just have to dig a little deeper than what you see on the surface, and find out more about the kids at those tables in the dining hall.


And what kind of cereal they eat.

Tuesday, February 2, 2016

Primary Elections, General Elections, and Heart and Head Votes

As we officially enter the 2016 Presidential election season (and it can’t be over soon enough for many of us), there are a few things that need to be addressed.

Someone once said…and it might even have been me…I can’t remember, that what we really need are two elections: a heart election and a head election. Either that or the ballot should have spots for your heart choice and your head choice. This way people could still vote for their heart choice without feeling that they wasted a vote on someone who wouldn’t win anyway. Perhaps if people were able to make a heart choice without feeling that they’d wasted a vote, more people would vote for their heart choice, and the heart choice would actually have a chance of winning (or maybe that person still wouldn’t have a chance). Because, you see, in the heart/head system, whichever choice got the most votes…heart or head…would be the overall winner.

And then I realized that we already have heart and head elections, but don’t realize it. They’re called the primary and general elections.

Let me explain.

The primary elections are all about voting with your heart. They’re all about voting for your ideals. Which candidate in the crowded field of your party best fits your beliefs? It doesn’t matter whether or not that person has a snowball’s chance of winning; you vote for them in the hope that they’ll win your party’s nomination for the big one in November.

Let me say this again: A vote cast for your ideal candidate in the primaries is not a vote wasted…not even if your candidate gets trounced. And that’s because all that was at stake was who got to run in the big one.

The national general election in November, on the other hand, is most definitely about voting with your head. You are given two, and only two, viable choices. And the key word here is viable. There are only two people who have any chance of winning. No third party has ever won a presidential election, and no write-in candidate has ever won one either. The best (or worst) they can do is siphon off votes from the viable candidate that you actually would’ve preferred to win in a close election.

Case in point, Ralph Nader in 2000. Had the people who voted with their hearts for Nader then, voted for Gore, Bush II wouldn’t have been president.

And this is the mistake that many people make when it comes to the general election. They still think that it’s about voting with their heart. They still think it’s about voting for what they believe in. They still think it’s about making a “principled stand.”

Let me break it down for you. In a theoretical election there are three candidates. Candidates A and B each have a 47% chance of winning, while Candidate C has only a 6% chance. On the other hand, you agree with 95% of what Candidate C stands for, you only agree with 60% of what Candidate B stands for, and you totally disagree with a whopping 95% of what Candidate A stands for.

What do you do?

Believe it or not, there are people who still believe that they should vote for Candidate C, because Candidate C most perfectly represents what they believe in.

Even though they would absolutely hate to see Candidate A win.

They can’t let go of their ideals enough to realize that throwing their votes to Candidate B would give them most of what they wanted. And so, instead Candidate A wins by a hair, and the “Six Percent Club” starts immediately complaining about the election results and the “broken system”, instead of admitting that by being pigheadedly idealistic, they handed the election over to Candidate A.

So let me repeat this, because I have friends who voted for Nader in 2000: You get to vote with your heart in the primaries, but when it comes to the general election, you only have two viable choices. Don’t waste your vote and end up giving the advantage to the candidate you absolutely hate, because you couldn’t bring yourself to vote for the one who wasn’t perfect. Don’t waste your vote and give the advantage to the candidate you hate by writing in the name of your “perfect” candidate who can’t win. Look at your two viable choices, and vote for the one that you’d prefer.

Because if you write in the name of, or otherwise vote for, a candidate who absolutely cannot win, and the person you totally despise wins…

You have no one to blame but yourself.

Tuesday, January 26, 2016

Pointing Out the Speck in Their Eye

Back in September I wrote a little piece called The Girder in Our Own Eye, which I had intended as a preemptive strike against “ourselves” for the piece that was to follow in the next week or two.

And then a few other things came up, I wrote about other things, and I never got around to it.

Well, now is the acceptable time…even though I have a stack of other things to write about; and having taken a look at the girder in our own eyes, it’s now time to take a look at the speck in theirs.

While I accept the fact that the goals of many in the anti-abortion camp come from the best of intentions, I absolutely hate how they seem to “cook the books” in order to try to reach their goal; and I hate how, while using their own religious arguments to demonstrate why abortion is wrong, they don’t take into account other religious arguments suggesting that it’s not quite an open-and-shut case.

I hate their scare tactics. There’s a billboard that crops up on a regular basis that says that “abortion increases breast cancer risk.” Now I’m an open-minded person. When faced with information that I’d never heard before, I don’t immediately dismiss it out of hand…I do a little research. And where better to go for information about breast cancer than the website of the American Cancer Society? What did I find there? I found that in the huge majority of studies, they’ve found neither causality nor correlation between abortion and breast cancer. However, in a very small minority of studies, a correlation was found (not the same thing as causality), and this is the “fact” that this billboard, and others like it, are based on.

Then there’s that famous Planned Parenthood video about them selling fetal body parts. I haven’t seen it, but I’ve heard from people who have, that it’s a really terrible editing job, spliced and cobbled together to make it look like people are saying things that they’re probably not. The only real way to tell for sure would be to have the original “frame codes” showing at the bottom of the screen. Then we’d know when a few important seconds from the conversation were left out, or moved around, in order to change what was said.

But wait, there’s more. When this whole controversy first hit, a friend of mine said, “The same thing happens to fertilized eggs left over from in-vitro fertilization, so where’s the outrage over that?”

Good question.

And a week or two ago, a friend of mine posted a pie chart that purported to compare all the abortion deaths since Roe vs Wade to all the American war deaths since 1776. That includes The American Revolution, the War of 1812, the Civil War, the Spanish-American War, World Wars I and II, Korea, Vietnam, and our various conflicts in the Middle East. According to this chart, abortion deaths were something like 90% of the pie.

But while this may be true, I believe that the case was severely overstated by comparing abortions over a 40-year period to wars that each lasted a limited amount of time. Compare apples to apples. Compare, say, abortions during any four-year period from 1973 to now to the total American war deaths during World War II, and then we can talk. It may still be more, but at least the overstatement of the case wouldn’t be stretching the credibility of the chart.

I wrote four years ago about how what may be a good cause suffers in my eyes when they stoop to tactics that either lie outright or distort the truth. I was talking about anti-smoking campaigns at the time, but I think the same can be said about the anti-abortion movement. This is the speck in their eye.

And if we could all stop treating this as a zero-sum game, and instead agree that we’re all going to have to live with a half-loaf, we could work together to reduce the number of abortions without infringing on anyone’s rights.

Or lying.