Sunday, May 9, 2010

Change is constant

"Once a new technology rolls over you, if you're not part of the steamroller, you're part of the road." - Stewart Brand

If you’ve been following at all, you know that this blog has been a discussion of technology: past technology, past predictions of technology, present technologies and future technologies. Oftentimes one post would include not one, not two, but all of these things. A discussion of cell phones began with telegraphs, evolved to rotary phones, took a detour through “futuristic cell phones of the past” and eventually ended up at BlackBerrys; this is the formula many of my posts followed.

Why did I approach topics this way? What did it teach me?

It’s simple, really: nothing lasts forever.

Today’s technological advances are often tomorrow’s Stone Age devices. Remember Windows Vista? And how that was around for oh, I don’t know, 2 seconds? (In case you haven’t picked up on this, I exaggerate things. It’s for effect.) How about the original iPod? If you still have one of those things, you might as well use a Walkman.

Things change. Personally, I think it’s scary.

I’m a communication major with a focus in public relations. Public relations, like many other concentrations, is constantly evolving due to changes in the field. The difference with public relations today is that social media has been adopted by PR pros (and aspiring pros like me) as our own. Twitter, LinkedIn, blogs, Foursquare: PR toys.

As a young PR student, I’m worried. Or, as I like to say, "optimistically aware." Should I bother with these “toys”? Should I learn them, become a pro at them, earn the title of “guru” and garner the respect of my peers…for a year? Will the industry still be using Twitter when I graduate? Or will I have experience in a useless tool? If someone has some insider info on that one, I would greatly appreciate it.

My point is we don’t know. We don’t know who will invent what and when. The social media tool of the future may already have been invented, may be in use right now, but it’s not widespread. Facebook and Twitter didn’t take off overnight. So what if it’s out there, waiting for me to use it, and I’ve got no clue it exists? It’s weird to think about, but it’s very possible.

I know that this occurs in almost every field of study: every day has the possibility of creating a new theory, a new tool, a new way of thinking that can revolutionize the way things work. But I’m in PR, so that’s what I worry about.

All I can say is learn to adapt. That’s what I’ve learned: to be open to learning. I know. It sounds really, really simple. And it is—somebody just needs to let you in on the secret simplicity of success.

So for now I’ll tweet, I’ll keep blogging, I’ll eventually figure out how to use Foursquare and I’ll continue to change the privacy settings on my Facebook until it hardly exists at all.

Or maybe I won’t—but that’s for tomorrow to tell me.

Happy Sunday, & Happy Mother’s Day to my Mom and to all other Moms.

Thursday, May 6, 2010

Internet Addicts Anonymous

"Getting information from the Internet is like taking a drink from a fire hydrant." -- Mitchell Kapor

My name is Kelly, and I'm addicted to the Internet.

Okay maybe not, but the Internet is awesome. Seriously.

Last night, my phone wasn’t working so I googled the problem I was having and voila, I fixed it. A couple of weeks ago I wanted to make a cheesecake but didn’t have a good recipe, so I googled it. I find out what’s going on at school, what’s going on at home, what’s going on in the world all in one place. Internet = awesome.

This semester, I had the pleasure and advantage of taking Jour289i: Information 3.0. This, of course, is the reason I’ve been blogging all semester. I’ve learned about Blogger, Twitter, Second Life, Picasa—so many social media sites and apps that I cannot remember all of them. But I think that one of the most important things we’ve learned in this class is fact checking and analysis.

Some of my favorite posts to write this semester were those where we read an article and were told to comment on it (think gender & video games, or cell phone use and dependencies). I never had enough info from my one reading or in my personal repertoire to successfully analyze and respond to these articles, so I was always forced to do further research. This not only expanded my knowledge of the specific topics, but it taught me to read critically. I’ve learned to find a second source, to compare and to contrast, and to learn from all of the available information on the web (which is a lot).

So what have I learned? To consider things from all points of view. To take what I know and use that as a launching point to learn more. To utilize the resources that I’ve been given to find and use new ones. This class was not only about using the Internet or about new technologies or social media; this class was about taking all of these things and expanding your knowledge, learning to use these things to think critically.

Citizen Journalists and Street Cred

"The smarter the journalists are, the better off society is. To a degree, people read the press to inform themselves-and the better the teacher, the better the student body." -- Warren Buffett

Students in Jour289i surveyed friends on technological use and opinions. One of the questions we asked was "Do citizen journalists have as much credibility as professionals?" The answer was an overwhelming "no." Later in the survey, the question "In the future, do you think citizen journalists will have as much credibility as professionals?" was raised. This time the answers skewed positive. So although our peers do not believe that citizen journalists currently have credibility, we do believe that things will change in the future.

This instilled some serious curiosity in this citizen. Why? What is going to change that is leading today’s students to believe that in the future, we will have just as much credibility as people like Stephen Colbert?

I think it’s technology. Technology is making the difference. Technology is driving this change in belief and technology is enabling this change.

For years, amateurs have been sending in videos to news stations that were unlucky enough to miss “the story of the century,” which, in all the novels I’ve read, appears to occur every week (a bit counterintuitive, but whatever). Back to my point. Amateurs who happened to be in the right place at the right time would catch the right video clip, whether on a professional video camera or your run-of-the-mill model. Whether they were looking for this opportunity or not, they happened upon it and they made news.

According to a 2009 Marist Poll, 87% of Americans have cell phones. How many of those cell phones have cameras capable of photo and video? I don’t have a number but I’m going to approximate: a lot. Every single one of these cell-phone owning Americans has the mobile ability to capture moments—to make news.

Here’s a personal example.

A recent Jour289i class assignment revolved around Maryland Day. For those of you who do not know, Maryland Day is an event, held at none other than the University of Maryland, where the University is open to visitors of all ages. Activities range from performances and campus tours to moon bounces and autograph sessions. It’s a blast and a half. This year, my classmates and I were given the task of travelling around campus, experiencing different aspects of Maryland Day, and reporting back on it. Live. All. Day.

I had an iPod touch, a camera, a cell phone and a rain coat, and I had a long day. But I learned something. My classmates and I provided (to my knowledge) the best available online coverage of Maryland Day. It may have been the only available online coverage of Maryland Day, but still, it counts for something, right? I was a citizen journalist. I tweeted, I uploaded pictures, I interviewed performers, visitors and volunteers. Isn’t that credible? So I’m not a professional, and Maryland Day may not be a big deal—but what’s preventing my (and my classmates’) coverage of Maryland Day from being credible? The fact that we’re students? The fact that we weren’t paid to do it? (Unless you count a pass/fail grade…) I’m not sure. I don’t really see the difference between the information and materials that I provided, and the information and materials that a professional could have provided.

Here’s another story, free of bias since I didn’t participate and I didn’t write it. I’m just reporting.

This CNN article, “Citizens monitor Gulf Coast after oil spill” tells the story of an MIT student trying to make a difference. Jeffrey Warren is walking up and down the Louisiana coast “holding a kite string that’s tethered to a helium-filled trash bag and a point-and-shoot camera.” Sounds ridiculous, but he’s doing it anyway. Why? Because the professional satellites that we have covering the oil spill are really freakin far away, and the pictures they take and the efficiency with which they take them are just not that helpful. So citizen journalist Warren is doing his part and documenting the accident himself. I give him credibility, but I’m not a pro.

To sum up my longest blog post in a while: This class has taught me to be a citizen journalist. To create news. To efficiently and effectively produce credible content. To utilize various social media tools in correlation with different technological toys like digital cameras and iPods. Am I credible?

Well, you just read this whole thing—didn’t you?

Happy Thursday :)

Thursday, April 22, 2010

Personalize me Cap'n!

"I wanna talk about me!" - Toby Keith

We’ve been talking about personalization quite often in class lately; probably because the rest of the world is talking about personalization, well, almost constantly.

But what is personalization? Is it being able to pick what fonts your website is displayed in? Is it making sure that your computer remembers your Facebook password for easy login? Or is it being able to download whichever app you want, when you want it straight to your handheld iPod?

D. None of the above.

Personalization “is a computation-based application that takes a well-defined set of inputs and returns one (or more) recommendations for a piece of content to be immediately served to an end user.” Or, personalization is a mouthful.

So let’s break it down. Google may be one of the best and most familiar examples of online personalization. Whether you’re aware of it or not, Google tracks everything you search. I have a gmail account, and I logged in to my account and went to my privacy settings to try to better “hide” myself from Google. What did I find? Every. Single. Thing. That I have googled since I bought this laptop. All of it. Scary, huh? Immensely.

So anyway, Google takes all of that information, stores it, and uses it to send you the best (according to its computer brain) results for you personally. I was trying to find a place to go out for lunch the other day, and I typed “lunch restaurants in” and what comes up? Bethesda. It read my mind. Scary again. Useful? You betcha. Still scary.

This web article, written on March 3, 2010, claims that Google personalizes about 20 percent of search results, usually the top results on your Google search.
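Under the hood, the idea is something like this toy sketch. To be clear: the data, the topic tags and the scoring rule here are all made up by me for illustration, not how Google actually ranks anything.

```python
# Toy sketch of search personalization: results whose topics match the
# user's past searches get a boost, so two users typing the same query
# can see different top results. All names and numbers are invented.

def personalize(results, history):
    """Re-rank results by base relevance plus a boost for topics
    the user has searched before."""
    def score(result):
        base = result["relevance"]
        boost = sum(history.get(topic, 0) for topic in result["topics"])
        return base + 0.1 * boost
    return sorted(results, key=score, reverse=True)

# A user who googles Bethesda restaurants a lot...
history = {"bethesda": 8, "restaurants": 5, "python": 1}

results = [
    {"url": "lunch-dc.example",       "relevance": 0.9, "topics": ["restaurants"]},
    {"url": "lunch-bethesda.example", "relevance": 0.8, "topics": ["restaurants", "bethesda"]},
]

# ...sees the Bethesda page first, even though its base relevance is lower.
ranked = personalize(results, history)
print(ranked[0]["url"])  # lunch-bethesda.example
```

The point of the sketch is just the mechanism: the engine never asks you what you want; it infers it from what you did before, which is exactly why my search autocompleted to Bethesda.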

Now this may lead you more quickly to the site you do actually want, but is it entirely a good thing? Is it making users close-minded? Is it preventing us from moving outside of our comfort zone? It’s helping prevent me from eating anywhere besides Bethesda, that’s for sure. So is it helping or hurting? Broadening our horizons or erasing them? I’m not so sure. I’m a googler; I google everything. I think it’s because I’m very inquisitive. But because I click the first site every time, am I only reading what Google wants me to read? Am I letting Google do some of my thinking for me? I hope not. I consider myself a pretty good thinker; I need to think about this one on my own & give Google the night off. I urge you to do the same!

Wednesday, April 21, 2010

I love Rock N' Roll. And Nintendo.

"Maaarriooooooooooooooooooo" -- Luigi
Two words: Super. Nintendo.

Selected by none other than me as 1991’s greatest invention. Possibly top 5 all time. It’s really that great.

My aunt and uncle still have one. Every time I visit their house, I plug in this game system, smack it a few times (the only way to turn it on) and make sure all the dust is out of the bottom of the video game itself (they’re kind of hollow). Totally. Worth it.

For those of you who don’t share my feelings or my extensive knowledge of Super Nintendo, here’s some background. In short, Nintendo released the Nintendo Entertainment System (NES) in 1985 and this device “single-handedly revitalized the video game industry.” There were over 60 million NES units sold and people could finally play new and exciting video games in the luxury of their own homes.

Super Nintendo was introduced in 1991 and featured 16-bit technology, which meant more processing power, which meant cooler games. Yeah, the graphics aren’t great, and okay, the games weren’t the greatest, but COME ON. They’re definitely fun, and they’re definitely a worthwhile waste of time. Old-school Mario, Clue, golf, even the shoot-em-up games that everyone seems to be obsessed with.

And then, Nintendo one-upped themselves.

That’s right. Nintendo 64. 1996.

It even has its own Facebook fan page, with 86,631 fans. This beats the majority of the President of the United States’ fan pages by approximately 80,000. Moving on.

This console was the beginning of 3D video gaming and introduced the gaming world to a “realer” look and feel. The controller had a joystick that allowed characters to move freely around the screen, as opposed to the standard up/down/left/right that fondly brings memories of Pac-Man to mind.

One downside: (this is a consideration of the gaming world, not me. I love Nintendo 64 and the guys down the hall have one which they play all the time. Still awesome, no matter what.) The Nintendo 64 is cartridge-based, meaning the games are not on CDs. This, to the critics, is a big mistake. CDs were considered the future, and Nintendo released their futuristic device, which was more or less anchored in the past due to these cartridge games.

Fast forward to 2006 and Nintendo releases the Wii, which quickly became the “best-selling latest generation console system in the world,” and let me tell you, it’s awesome. Practice your serve like Andy Roddick in Wii Tennis. Mix potions under Snape’s watchful eye in Harry Potter. Guide Mario through different galaxies in Super Mario Galaxy. These games are too. Fun. The graphics are great, they’re easy to play, and the wireless motion-sensing controllers are really just enjoyable to play with. There are no wires, they take standard batteries, and they’re MOTION SENSING. You move, your character moves; talk about personalization.

Overall, I disagree with the gaming industry. I think Nintendo rocks and they’ve always been on top—their games, their consoles—you think video games and you think Mario (which you can now conveniently play online), you think N64. They get my vote.

How do you guys feel?

Which company has the BEST video game console?
Playstation (Sony)
X-Box (Microsoft)
Other
Happy Wednesday!

Monday, April 19, 2010

Let's talk about games, baby!

This week's assignment was to come up with a creative, online game for children representing our topic. This took some serious deliberation because, as I believe I have mentioned, I am not overly creative. My topic is also very wide, and therefore a bit difficult to encapsulate into one, simple game.

BUT, I think I've got it.

My game is quite simply a blend of new and old. It would be a virtual reality game, so you'd get to wear really great-looking headsets and gloves and whatever else it is that you wear when you play virtual reality games. This would obviously be the new technology part. You would then enter a virtual world that would literally be half old and half new. Old games would be on one side, new games directly opposite. New technologies facing old, old methods being overshadowed by new.

Kids would "walk through" this virtual reality and try everything out. iTunes and record players, Skype and rotary phones, games like Candyland or electronic games like Catchphrase. There would be "levels" with different categories like technology, games, music, leisure, etc., and to pass each level (and get to the more fun levels like games & leisure at the top) you would have to take a quiz where you successfully utilized each old & new technology in a timed trial, or where you beat the computer in a new or old game--ya know, fun stuff.

Even better--since this is something I have seriously always wanted to do--instead of playing the games (like Candyland, or Super Mario! or something old school like that) you could actually be in the game--Jumanji style! Minus the animals trying to kill you. How cool would that be? Someone please invent this for me.

I really hope that the next part of this assignment is not "create this game!" 'cuz then I am flat out of luck. But it's definitely cool in theory, right?


Happy Monday!

Sunday, April 18, 2010

iTunes against the world

"I'll tell you about the magic, gonna free her soul
It's like trying to tell a stranger 'bout rock and roll" - The Lovin' Spoonful

It was new, it was cheap, it was TIME’s coolest invention of 2003.

It was iTunes.

Bono, Mick Jagger, and Dr. Dre were there when Steve Jobs introduced his newest brainchild. The iTunes Music Store was Jobs’ latest stroke of genius in a long line of right moves. It was a way to share music legally, to give back what was due to those who had created the music while still allowing users to download content. Songs were just 99 cents, albums were $9.99. Here’s the catch—when iTunes was created, it was only available on Macs. In 2003, Mac computers represented a meager 3 percent of the computer world.

That was when Jobs made the decision not only to sell music in an online format, but also to make the iTunes store available to the other 97 percent of the market: the PC users. Within the next three days, a million copies of iTunes had been downloaded by PC users.

The rest is history.

When this article in TIME magazine was written, Apple had the rights to sell 400,000 songs and iPods were $499. iTunes now features more than 11 million songs, and Apple now offers several different versions of the iPod with different features at different costs:

iPod shuffle: 2GB-$59/4GB-$79

iPod nano: 8GB-$149/16GB-$179

iPod classic: 160GB-$249

iPod touch: 8GB-$199/32GB-$299/64GB-$399

iTunes, at the time, seemed to be the perfect solution to the problems of illegal online downloading. Create a quick, easy and efficient way to download, store and listen to your music all in one convenient application right on your desktop. However, within a few years, fingers were starting to point back at Steve Jobs & Apple as creating problems—not solving them.

About two years after the initial launch of the iTunes store, record labels and their company executives began to worry that they had relinquished too much money, errr-power, to Steve Jobs and Apple. Apple “set the ground rules for their own business,” according to several music tycoons. Record labels argued that they wanted to set their own prices for songs—give discounts on older albums and bump up the price of newer, popular songs.

Record labels may have agreed to iTunes before they fully understood the immensity of the project, a smart move (or a sketchy one) by Jobs. When the iTunes deals were made, the store was available only on Macs. Of course, a few months later, Apple released the iTunes music store for download availability on all platforms—also releasing the cheap songs to the rest of the computer world. By 2005, iTunes was outselling traditional music stores, further upsetting the record labels and the music industry.

Now, 5 years later, the music industry is still not happy about iTunes’ more-or-less monopoly over the current music business. Apple is now the number one American music retailer, and the music industry is not enjoying it. According to the New York Times, “the relationship (between Apple and record labels) remains as tense and antagonistic as ever.”

Some labels argue that iTunes is not paying their “fair share” of the money they make off of songs. iTunes distributes them, but who do they really belong to? This debate was sparked 7 years ago when Jobs first introduced iTunes Music Store, and it does not seem to be going away anytime soon.

So what do you think? Is Apple killing the music industry, or saving it? Should Apple fork over more of the royalties on the downloaded songs, or are they just keeping what they deserve?

Maybe soon we'll be able to buy the answer on iTunes.

Saturday, April 17, 2010

8-Tracks to iPods

In the 1920s, the record player was king, kinda like Fonzie in 1950s Wisconsin. If you wanted to listen to your favorite band, all you needed to do was go to the record store, buy their album and pop it in your record player.

Then we had tapes. They were cool, right? Double-sided so you could put more music on them, cassette players eventually became portable, and eventually cars were made with tape decks in them. 1963 was when the Philips compact audio cassette made its debut.

In 1979 the Sony Walkman was introduced, and that thing had a long run. Look hard enough and you can still see people running with Walkmans, listening to them on the Metro, walking to class with theirs on. Of course the technology is now better than it was in 1979, but it’s worth noting that this technology has lasted even with the competition of CD players, mp3 players and iPods.

1982 is the year the first-ever CD was made, and 6 years later CD sales had already surpassed LP sales. Word on the street (according to Wikipedia, since that’s the only source I can find that seems to care about this topic) is that the first CD manufactured and released in the United States was Bruce Springsteen’s Born in the U.S.A. New Jersey, for the win! Definitely fitting for the first American CD, and definitely on my iPod. The first album to be released on CD (again, according to Wikipedia) was Billy Joel’s 52nd Street. Another obvious winner.

After the CD came mp3 players and the beginning of online music sharing, including sites like BearShare, Limewire, Napster and more. Of course, along came Apple and the iPod and the music biz was forever changed. One thing I did not realize is that the iPod is less than ten years old. October 23, 2001 is the release date for Apple’s first iPod music player. Today is April 17, 2010 and it’s becoming difficult to find someone who doesn’t own an iPod.

Photo credit: Kelly O

Detailing the past and the evolution of the music industry leads me to wonder—where is it going? How are we going to listen to music next year? In 5 years? Books used to be spoken, then they were printed on paper, now they’re available on somewhat futuristic devices like the Kindle and the iPad. How is music going to change? & more importantly, how is the music industry going to adapt to that? These questions are going to start being answered in the next few years, but for now I’m good with my iPod and my 6-disc CD changer.

If you’re at all interested in this, and would like to take a look at a more in-depth outline, check out this Web site, which chronicles music formats and recording history from Thomas Edison up to the iPod shuffle.

Saturday, April 3, 2010

Rule of Thirds 3.0: Don't say I didn't try

Caution. Broken stairs outside of Hagerstown.
WARNING: I have no photography experience. Bear with me.
This was one of the first shots I took while on my lovely Thursday afternoon photo-shoot. I left Hagerstown, my humble campus home, saw the broken bricks on the stairs, said "This is kind of artistic," and took a picture. I think I lack inspiration.
This week I got a different assignment in Info 3 pt. 0. I wasn't assigned a blog topic; I was given an assignment. We were sent out with our cameras & camera phones and told to learn something about photography. To take what we were taught in class and apply it in real-world settings.
I tried.
So the photo above is my first example. While not perfect, I think I did some good things here that I would like to point out. The rule of thirds, as I mentioned in my previous post, is a way of positioning the subjects of your photograph so that the most important, most eye-catching items are placed at the intersections of divisions of thirds. In my cleverly named "Traffic-Cone Photo," the black railing is roughly one-third of the way down the photo, marking the top third of the scene. The bottom third roughly coincides with one of the steps in the bottom of the photo. The black railing, which is a color that stands out against the reddish brick dominating the photo, leads the eye into the undoubtedly most exciting part of the photo--the orange traffic cone. The traffic cone is positioned in the top right third, and coincides with the top right intersection (of the rule of thirds lines). The lighting is a bit bright at the top of the photo, which I was going to edit out, but I decided to keep it because it makes the reflector strips on the traffic cone glow, which I feel adds more emphasis to the focal point of the picture.
Try #2.

I like this one. Probably because of my lovely model Jenn. Once again the black railing in this photo pops against the color. Whereas in the other photo, the horizontal divisions were extremely clear, the vertical divisions in this one almost draw themselves. Jenn is standing on the left division, the green tree takes up the middle, and the black railing and the red tree set the third division. Cool.
The horizontal divisions are drawn by the end of the pavement (and the beginning of the tree) and the top of the tree, which almost lines up perfectly with the bubble-wand and the bubble Jenn is blowing.

But Kelly, these pictures are not of technology!

I know. But I have a point.

Jenn's blowing bubbles. Jenn's also almost 20 years old. Some other friends of ours walked past while we were shooting these photos and amused themselves by chasing the bubbles around and popping them. Simple pleasures.

I've also got a friend with an iPhone. This friend has an iPhone app which, to the best of my knowledge, consists of nothing other than popping virtual bubbles on virtual bubble wrap. Seriously? Some new things (virtual bubble wrap) will just never replace old things (real bubble wrap, or real bubbles.) In short, technology is not always the answer. Sometimes it's better just to do things the simple way.

Some Others:

The red car is sitting on the top third intersection in this photo. I wanted to highlight the several aspects of technology in this photo--the cars, the brand new building and the crane in the background. I also enjoy the lighting in this photo.

Chinatown. I had to go see a play for another class near the Verizon Center, so I went up early and walked around a bit. This photo is taken at the intersection of the Verizon Center part & the Chinatown part. I know, that sounded intelligent. But hear me out--the left third of the photo contains the colorful Chinatown arch. The middle and top center are of an AT&T building, but they incorporate the same colors as the arch. The bottom center and the right third are full of gray cars and a gray building to the left. The bottom third is divided by the digital bulletin boards on the AT&T building, another immediate contrast to the gray cars and the boring building in the background.


I may not have followed all of the rules for this photo (AKA please don't grade me on it Prof Yaros!) but I had to share because it's downright adorable.

Back to regular blogging next week!!
Happy Saturday & Happy Easter =]

Friday, April 2, 2010

Photography Unit

"There are no rules for good photographs, there are only good photographs." -- Ansel Adams

Remember rotary phones? I don't. They're that old. Older than me, which, my friends like to remind me, is old.

This photo represents my topic in an almost perfect comic sense. The iPhone, one of the newest and most popular tech toys of today, with a rotary phone screen. A blend of new and old in a much more fashionable package.

It relates to my topic, but it may be one of the least visually stimulating photos I've ever seen. Definitely funny though, just boring.

So I've decided that my topic is not easily communicated visually. It's not exactly simple to capture the constant struggle between new and old technologies. It's not very easy to search for either. Technology images, technology photography, photos about technology--not a lot going on there.

So I've decided to stretch the limits of my topic a little for the sake of this project. Just FYI.

This photograph depicts a solar energy plant, a building covered with solar panels in Chicago. Though not the most exciting photo, it does follow several must-have photography rules. The "rule of thirds" is simply a way to split photos into nine different cells, with the most important pieces of the photo placed at the intersections. The photo is also in focus and well-lit, without being over- or underexposed.

In the photo above, the top third of the photo is designated by the top of the highest row of solar panels, coinciding with the bottom of the brick structure in the top right cell of the photograph. The bottom third is also separated by a solar panel. The vertical lines almost lie directly on the white pipes coming up and out of the surface of the building; all of the above culminate in an almost perfectly placed photograph.
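If you're wondering where those intersections actually fall on a given photo, the grid is easy to compute. Here's a toy Python helper (purely illustrative and mine, not part of any photography tool): divide the width and height into thirds and cross the two vertical lines with the two horizontal ones.

```python
# Toy sketch: compute the four "rule of thirds" intersection points
# for an image, i.e. where the two vertical and two horizontal
# dividing lines cross. Subjects placed near these points tend to
# read as more balanced than dead-center ones.

def thirds_intersections(width, height):
    """Return the four (x, y) points where the thirds lines cross."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for x in xs for y in ys]

# For a 3000x2100 photo, the four candidate spots are:
print(thirds_intersections(3000, 2100))
# [(1000.0, 700.0), (1000.0, 1400.0), (2000.0, 700.0), (2000.0, 1400.0)]
```

So "the traffic cone sits on the top right intersection" just means it sits near the point a third in from the right edge and a third down from the top.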

I like this photo the best. Cool red phone booths in a cool red line in cool London. The color is vivid and pops against the drab grey stone. However, the photographer did not consider the rule of thirds in this picture. The 'telephone labels' almost line up, creating somewhat of a top third division, but the rest of the photo is just there, which begs the question--does it really NEED to follow the rule of thirds? I like this photo, I think it's the most interesting, the most engaging, and it's fun. But I'm not a photographer, so maybe my opinion doesn't matter.

One negative point to ALL of these photos--they don't communicate anything. I don't get a message from any of them, no meaning, no story. I get phone, solar panels, phone booths. The latter makes me think of Superman and my own cheesy picture inside a London photobooth, but that's not really a story. Professional photographers strive to communicate meaning with their photographs, to literally portray "a thousand words." I got two words at best from any of these photos, as interesting or funny as they may be.

So what is it that makes a good photograph? Is it a good subject, nice lighting, the rule of thirds, or a good story? I would have to guess it's some perfect combination of all of them; I wish someone would let me in on the secret.

As always, Happy Friday!

Thursday, March 25, 2010

Online Dating: Phenomenon or Phreaky?

“Chivalry is dead.” – idk who originally said this.

Remember when people would meet their true love at the carnival, hop in their ferris wheel cart, write 365 letters to each other, swear “if you’re a bird, I’m a bird,” and then get married, write a book and live long happy lives?

Probably not, but that’s how it went down in The Notebook.

My point is, for most people, that’s just not how things happen anymore. There is no courting period, there’s no wooing, and things have just gotten a whole lot less romantic. Yes, The Notebook is set in the 40s, so things were bound to change, but in my opinion, somewhere along the line, a whole lot of dating how-to info got lost in translation.

Goodbye chivalry and old-fashioned romance, hello ease of online dating.

We all know the online dating stereotypes. Losers, computer nerds living in their mother's basement, creepy 80-year-old men posing as 24-year-old multi-millionaires. The usual. Online dating has long carried a stigma in our culture: those who cannot get a date turn to online dating, only desperate people date online, only weirdos would go out with someone they met on a Web site. But times have changed, and online dating has moved from savior of the awkward to matchmaker for the masses.

Several movies have poked fun at online dating. In the famous You've Got Mail, Meg Ryan and Tom Hanks begin an online relationship and end up falling in love, even though they despise each other in "real life." He's Just Not That Into You chronicles the difficulty of dating in today's technological age, where online dating, texting, emailing and Facebooking are common methods of pursuing (or avoiding) a relationship. And Must Love Dogs ends in love for two successful, attractive individuals for whom dating in the real world did not work out, but whose shared love of dogs, posted online, was a source of immediate attraction. Online dating also takes a less-than-pleasant starring role in many crime dramas, where, unfortunately, one of the daters almost always ends up dead. Online dating has invaded multiple areas of society, and it looks like it's here to stay.

There are hundreds of online dating sites, ranging from general sites targeted toward everyone to niche sites built around a particular background or interest. There are even online resources to teach you how to utilize and maximize your potential on online dating sites. One example actually has several video tutorials on topics like how to take the best profile picture, how to stay safe while dating online and which online dating site is right for you.

According to several online sources, more than 40 million people use online dating sites, and the average user spends $239 a year. Some of these dating sites claim to be the key to happiness. eHarmony says, "On average, 236 eHarmony members marry every day; that accounts for 2% of U.S. marriages." 2 percent? Other sites promise they'll be so helpful you'll feel like you already know the person you choose to date, claim to have helped redefine the way people meet and fall in love, or advertise "a rich tapestry of ethnicities, interests, goals, ambitions, quirks, looks and personalities from which to choose." But do these sites really hold the key? Is online dating really the answer?

Some research may suggest otherwise.

When you meet someone in person, your first impression is often formed by noticing things about their appearance: apparent age, height, attractiveness, weight, physical build and so on. This impression does not occur online, and what a profile tells you isn't always true: 1 in 10 users are scammers with fake profiles and fake pictures, and the other 9 out of 10 commonly lie. American men lie most about age, height and income, while women tend to lie about weight, physical build and age. 1 out of 10 of these real users then leaves within the first month.

According to a CNN article, "The type of people who misrepresented themselves online is the same type of people who do so face-to-face." Some people know what others like and configure their personal profiles to fit those stereotypes; they desire inclusion and acceptance, feel the need to be well liked, and those desires persist whether they're online or not. The same article says that while men are more likely to lie than women, gender is not the biggest indicator of lying. The study attributes a propensity for lying to being a "high self-monitor."

Self-monitoring is part of the impression management process. Impression management, as defined by Dale Leathers and Michael H. Eaves, authors of "Successful Nonverbal Communication," is an individual's conscious attempt to exercise control over selected communicative behaviors and cues. This happens in all walks of life, online and offline, but do we have more control over it online? We obviously use impression management often: when we're applying for jobs, meeting someone new, or trying to make friends. Offline, though, we cannot control our appearance. Online, we can control everything. Our name, our age, our height, our interests, our this, that and the other thing.

So in the end, are we all that different? Is there that much separating Noah from The Notebook and John Doe from an online dating profile?

I think so, but then again, that’s just my opinion.

(If you're considering online dating, check out a list of dating Web sites to find the one for you! BUT always be careful. Research shows that 1 in 10 sex offenders use online dating to meet people. Know your stuff and don't become a statistic!)

Happy Thursday.