Monday, October 27, 2014

Taylor Swift's 1989 not the evolution I had anticipated

Taylor Swift just keeps getting better.

Or at least that's what I was going to write in a lamentation about the fact that every time she puts out a new album, I lose the ability to listen to anything she released prior. It was a condition that had me likening Swift to Pokemon.

Yes, Pokemon.

Each record was a new evolution. When Fearless came out, Swift's self-titled debut couldn't compete anymore. It was the same thing with Speak Now - which would turn out to be the Squirtle of TSwift albums. But Red... Red was something else. Red was Wartortle. And when you have a Wartortle, who wants to go back to Squirtle? Swift was evolving constantly with every new record, and so it was no surprise that with all the hype surrounding 1989, I was expecting Blastoise.

I was anticipating a level of perfection that would make me toss Red into the pile with its predecessors. And knowing my history with Swift's albums, how unlistenable all others become when she gifts the world a new collection of music, I almost wanted to hold off on 1989. I wanted to hang on to Red a little longer.

But come on, this is Blastoise.

I bid farewell to Red and thanked it for All Too Well on repeat, Begin Again in the dark with my headphones on, and I Almost Do pumping into my ears during walks around my neighborhood. Then I loaded 1989 on my iPod and waited to join the masses of those salivating over each electronically produced note.

And then I kept waiting. And I waited some more. And I got there eventually, but not until the three bonus songs on the deluxe version.

Let me be clear: I don't dislike this album. I'll probably listen to it regularly, and there are a few tracks that'll likely end up on repeat at some point. That is, if I can remember which ones they are.

When I finished my first complete listen of all the tracks, I was trying to pick out my favorites. Trying to decide which songs I wanted to go back and listen to. And that's when I realized the problem: they just weren't distinct enough for me to even remember what I liked and what I liked a little bit less. And that was disappointing to me given how much love surrounded this release.

And now, on my second or third listen, I'm having those ah-ha moments, those "oh yeah, this was the song that I kind of liked, let me check the name of it so I don't forget again" thoughts. The song in question was I Wish You Would, if you were interested. Which I know now, and will likely only remember because I'm writing about it. (Edit: While I was re-reading this before publishing, I kind of laughed at this paragraph when I realized that I had in fact forgotten the song.)

I just didn't experience this with Red. Red was distinguishable from one track to the next, whereas 1989 kind of all seems to flow into one really long song for the most part.

Or maybe I'm just bitter because I was expecting Bad Blood to be the Mew of Swift songs (if we're still on the whole Pokemon thing here), knowing that she's a self-proclaimed expert at revenge, and it lyrically and musically fell flat to me.

But because I really wanted to love 1989, let me shift my focus to the deluxe songs I mentioned earlier, the area where I think she shines. Wonderland, You Are in Love, and New Romantics are my three favorite songs on the album by far - both musically and lyrically. I can't deny my love for the whole subtle Alice in Wonderland theme in the first song, and the last one is just as fun, but it's You Are in Love that redeems the entire album for me.

Maybe it's because there aren't many of those soft songs on an album that sounds like its entire purpose was to be radio-friendly (which I'm not saying is a terrible thing, by the way), maybe it's because it's just beautiful, I don't know, but I'm loving it. Everything about it.

These are just immediate reactions, and I can almost guarantee that more songs, if not the entire album, will grow on me in time. It's already started happening as I continue to listen. I'm a fan of Swift's music, but I'm not one of those people who can hear her mutter any sound and think it's automatically perfection. I guess the bottom line is that I like this album, I really do, but it didn't blow me away the way I expected it to.

1989 is a new-ish area for Swift, a completely non-country area. So in a way, maybe it's a little bit unfair to be expecting another evolution. Maybe I'm not going to be battling my enemies with Blastoise, but I'm content to grab a Charmander and start leveling up again. And I'll look on the bright side - we've got a new side of Swift and some good music to go along with it, and I've finally broken that routine I was in: so if you'll excuse me, Red is calling.

Tuesday, October 21, 2014

Why my journalism degree made me skeptical toward journalism

Shoved inside of my desk drawer, under random pieces of paper and old magazines that I haven’t been able to part with yet, is my college diploma. It’s probably bent, it’s probably collecting dust, I don’t really know - I don’t even use that desk anymore.

You’d think that something that took four long years and countless sleep-deprived nights to finally come into my hands would be on display somewhere. Or at least carefully stored. I mean, I should be proud of it, right? There it is in big, bold letters: Bachelor of Arts. Magna Cum Laude. But I didn’t care. Journalism isn’t always what it should be, and I knew it.

I have a couple of disclaimers here before I get started. First of all, I have to credit the fantastic journalism program that I graduated from back in 2011. My decision to not go into journalism after graduation isn’t a reflection on the department or the brilliant professors I had the honor of learning from. I took a publication design class in my last semester there and my professor told the class on the first day that at the very least, by the end of the semester we'd be able to pick out bad design. That’s how I see my take on the media: I learned the way that honest, ethical journalism should be done, and because of that it’s easier to pick out the bad practices.

The second thing I want to say is that I of course realize that there are plenty of exceptions to what I’m about to write.

In the interest of not writing a novel, I’m going to break this into three parts, though I could go on longer. Let’s get started.

The live coverage of breaking news:

This is by far the epitome of my love/hate relationship with journalism. I consume these moments like a sponge, glued to my TV as I watch everything unfold in real time. And honestly, there are times when I do wish I was on the other side of the screen, fantasies that I could be the one to cut into your favorite television show to announce the breaking news. I guess you can take the girl out of journalism, but you can’t take the journalist out of the girl. Anyway, like I said, I’ve always consumed those moments. I vividly remember one instance when I was a senior in high school: I sat at my kitchen table taking notes on a breaking report, for no reason other than that I just felt it. I know how important those moments are. But the race to be first can easily change everything, and not in a good way.

As these events are happening, information is being thrown around left and right. It’s coming from everywhere, it’s being passed between people at lightning speed, and rarely is any of it 100% accurate. All fact-checking goes out the window for the sake of being able to say that one particular news station was the first to report something.

Take Sandy Hook, for example. Remember the earliest reports? Ryan Lanza was falsely identified as the shooter when in reality it was his brother, Adam. Of course you could defend this if you want to (but you shouldn’t), by saying that since Ryan Lanza was so outspoken about it on social media, you could quickly deduce that there was an inaccuracy there. Or by saying that since his brother was carrying his ID, it was an honest mistake to make. But to me it just confirms that even when you have every reason to believe something is true (I mean, why wouldn’t you believe that an ID on someone’s person is their ID?), you still have to fact-check. You still have to be certain beyond a doubt. You don’t race to the news desk and put someone’s life and reputation on the line.

Whether you want to give them a pass for an honest mistake or not, the fact remains the same: CNN spread false information to get a pat on the back for being first. In school, you can rush through a test and be the first to hand it in, but the one who took their time answering could be the one with the correct responses. You don’t get points on a test for being first.

And maybe those facts couldn’t immediately be fact-checked, but in that case just hold off on releasing the name. As many, many people have said after national tragedies, don’t give the shooter the attention they desire, anyway.

BuzzFeed, who also reported the misinformation, wrote an explanation here, and I think it's interesting to note the following passage from that article:

Reporters at Gawker, Mediaite, and BuzzFeed were among the first to find and publish information from a Facebook account, including photographs, of a man matching several details — name, approximate age, town of birth and town of residence — of the man named by police as the shooter.

So even as they're trying to explain their mistake, they made sure to slip in the fact that they were among the first. Congratulations, BuzzFeed, you were among the first to cause an innocent man to leave work and find out that he was being held responsible for the murder of countless children. You were among the first to literally turn this guy into a meme within seconds. Are those the kind of accolades you were hoping for?

These news outlets are like twelve-year-old boys who race to the comments section of a YouTube video in order to snatch the highly-coveted "First!" comment. What's the point? That person has contributed absolutely nothing to the topic at hand, and is more often than not met with ridicule at the sheer stupidity of it all.

Fantastic. Good for you.

My feature writing professor in college once handed a paper back to me with the words "fatal mistake" scrawled across the first sentence. I misspelled a last name. I couldn't (and probably still can't) spell that particular name from memory, and yes I should have double checked, but if a minor misspelling is a fatal mistake, I can only imagine what Professor Lord would call the announcement of blatantly false facts.

Sensationalism:

The screen goes dark.
Capitalized, bold, red letters race across the screen.
A few menacing notes play in the background.

It's the intro to whatever current event the media has chosen to beat to death.

Some updates are just not updates at all. They aren't. They're excuses for news stations to jump on whatever they think will give them viewers, even when the broadcast has nothing of value to add. It's a "Hey look at us. We're still here, still covering this. Nothing good to tell you, but don't forget about us." After a certain amount of time I start suffering from information overload. Back in 2009/2010 I didn't need to hear about Tiger Woods every day. A decade prior when I was in fifth grade I distinctly remember getting tired of hearing the name Elian Gonzalez every morning. My mother still jokes about how aggravating that used to be to 11-year-old me to hear the same story every day. Every day. Sometimes I take a step back and ask myself, "Are they doing this because they have information to get across, or are they doing this for ratings?" Or maybe it was both (no, it was never both). There's not much of a place for Morning Glory in the actual, daily news.

Citizen Journalism:

I have to admit, I used to not be such a fan of this concept. In college I saw it as people who just jumped online to do what I was spending four years and thousands of dollars to learn how to do correctly. But now I see it for exactly what it is: People who have no agenda, no news agency to report back to, no ratings to go after, spreading the news because it needs to be spread. They have nothing to lose, nothing to gain. 

The manhunt in Boston was one of the main events that shifted my perspective. I was glued to this, maybe even more so than I usually am to breaking news, sitting on the couch in front of the TV with my laptop open in front of me. My boyfriend had a local news stream from Boston playing on his laptop, and we watched that and CNN simultaneously. The local news was really the one to watch; they had it together far better than CNN did and reported the capture several moments earlier (one of those times when being "first" was not akin to prepubescent YouTubers). But there was another medium at play that until that day I hadn't given much consideration to: Twitter.

Yes, journalists used it. Yes, I followed along with them. But it was also the citizens of Boston and the surrounding area who were keeping people up to date. The night before the suspect was caught I stayed in front of the news as long as I possibly could. Afterward when I dragged myself to bed I still followed along on Twitter, and I don't feel as though I missed anything at all. In fact, it was a Twitter user who was arguably the first person to even announce the bombings to begin with.

As I was verifying a piece of information here, I came across this (regarding Twitter's coverage of the Boston Bombing) on the Twitter blog, which goes along with what I wrote earlier in this post:

And even more recently: one of the first people to report the US military's bombing in Syria was a Syrian man on Twitter.

The point I'm making with all of this is that until more people in the news industry adopt this same news-first, agenda-never attitude, I'll always have a skeptical eye on the news and the people who present it to me.

I've only scratched the surface of the aspects of the media that bother me, but for more information I suggest starting with this resource.