Sunday, May 9, 2010

Change is constant

"Once a new technology rolls over you, if you're not part of the steamroller, you're part of the road." - Stewart Brand

If you've been following at all, you know that this blog has been a discussion of technology: past technology, past predictions of technology, present technologies and future technologies. Oftentimes one post would include not one, not two, but all of these things. A discussion of cell phones began with telegraphs, evolved to rotary phones, took a detour through "futuristic cell phones of the past" and eventually ended up at BlackBerrys; this is the formula many of my posts followed.

Why did I approach topics this way? What did it teach me?

It’s simple, really: nothing lasts forever.

Today's technological advances are often tomorrow's Stone Age devices. Remember Windows Vista? And how that was around for oh, I don't know, 2 seconds? (In case you haven't picked up on this, I exaggerate things. It's for effect.) How about the original iPod? If you still have one of those things, you might as well use a Walkman.

Things change. Personally, I think it’s scary.

I'm a communication major with a focus in public relations. Public relations, like many other concentrations, is constantly evolving due to changes in the field. The difference with public relations today is that social media has been adopted by PR pros (and aspiring pros like me) as our own. Twitter, LinkedIn, blogs, Foursquare: PR toys.

As a young PR student, I'm worried. Or, as I like to say, "optimistically aware." Should I bother with these "toys"? Should I learn them, become a pro at them, earn the title of "guru" and garner the respect of my peers…for a year? Will the industry still be using Twitter when I graduate? Or will I have experience in a useless tool? If someone has some insider info on that one, I would greatly appreciate it.

My point is, we don't know. We don't know who will invent what and when. The social media tool of the future may already have been invented; it may be in use right now, just not widespread yet. Facebook and Twitter didn't take off overnight. So what if it's out there, waiting for me to use it, and I've got no clue it exists? It's weird to think about, but it's very possible.

I know that this occurs in almost every field of study: every day has the possibility of creating a new theory, a new tool, a new way of thinking that can revolutionize the way things work. But I’m in PR, so that’s what I worry about.

All I can say is learn to adapt. That’s what I’ve learned: to be open to learning. I know. It sounds really, really simple. And it is—somebody just needs to let you in on the secret simplicity of success.

So for now I'll tweet, I'll keep blogging, I'll eventually figure out how to use Foursquare and I'll continue to change the privacy settings on my Facebook until it hardly exists at all.

Or maybe I won’t—but that’s for tomorrow to tell me.

Happy Sunday, & Happy Mother’s Day to my Mom and to all other Moms.

Thursday, May 6, 2010

Internet Addicts Anonymous

"Getting information from the Internet is like taking a drink from a fire hydrant." -- Mitchell Kapor

My name is Kelly, and I'm addicted to the Internet.

Okay maybe not, but the Internet is awesome. Seriously.

Last night, my phone wasn't working, so I googled the problem I was having and, voilà, I fixed it. A couple of weeks ago I wanted to make a cheesecake but didn't have a good recipe, so I googled it. I find out what's going on at school, what's going on at home and what's going on in the world, all in one place. Internet = awesome.

This semester, I had the pleasure and advantage of taking Jour289i: Information 3.0. This, of course, is the reason I've been blogging all semester. I've learned about Blogger, Twitter, Second Life, Picasa—so many social media sites and apps that I cannot remember all of them. But I think that one of the most important things we've learned in this class is fact checking and analysis.

Some of my favorite posts to write this semester were those where we read an article and were told to comment on it (think gender & video games, or cell phone use and dependencies). I never had enough info from my one reading or in my personal repertoire to successfully analyze and respond to these articles, so I was always forced to do further research. This not only expanded my knowledge of the specific topics, but it taught me to read critically. I've learned to find a second source, to compare and contrast, and to learn from all of the available information on the web (which is a lot).

So what have I learned? To consider things from all points of view. To take what I know and use that as a launching point to learn more. To utilize the resources that I’ve been given to find and use new ones. This class was not only about using the Internet or about new technologies or social media; this class was about taking all of these things and expanding your knowledge, learning to use these things to think critically.

Citizen Journalists and Street Cred

"The smarter the journalists are, the better off society is. To a degree, people read the press to inform themselves-and the better the teacher, the better the student body." -- Warren Buffett

Students in Jour289i surveyed friends on technology use and opinions. One of the questions we asked was "Do citizen journalists have as much credibility as professionals?" The answer was an overwhelming "no." Later in the survey, the question "In the future, do you think citizen journalists will have as much credibility as professionals?" was raised. The answers skewed much more positive. So although our peers do not believe that citizen journalists currently have credibility, we do believe that in the future, things will change.

This instilled some serious curiosity in this citizen. Why? What is going to change that leads today's students to believe that in the future, we will have just as much credibility as people like Stephen Colbert?

I think it’s technology. Technology is making the difference. Technology is driving this change in belief and technology is enabling this change.

For years, amateurs have been sending in videos to news stations that were unlucky enough to miss "the story of the century," which, in all the novels I've read, appears to occur every week, which seems a bit counterintuitive, but whatever. Back to my point. Amateurs who happened to be in the right place at the right time would catch the right video clip, whether on a professional video camera or your run-of-the-mill model. Whether they were looking for this opportunity or not, they happened upon it and they made news.

According to a 2009 Marist Poll, 87% of Americans have cell phones. How many of those cell phones have cameras capable of taking photos and video? I don't have a number, but I'm going to approximate: a lot. Every single one of these cell-phone-owning Americans has the mobile ability to capture moments—to make news.

Here’s a personal example.

A recent Jour289i class assignment revolved around Maryland Day. For those of you who do not know, Maryland Day is an event, held at none other than the University of Maryland, where the University is open to visitors of all ages. Activities range from performances and campus tours to moon bounces and autograph sessions. It's a blast and a half. This year, my classmates and I were given the task of traveling around campus, experiencing different aspects of Maryland Day, and reporting back on it. Live. All. Day.

I had an iPod touch, a camera, a cell phone and a raincoat, and I had a long day. But I learned something. My classmates and I provided (to my knowledge) the best available online coverage of Maryland Day. It may have been the only available online coverage of Maryland Day, but still, it counts for something, right? I was a citizen journalist. I tweeted, I uploaded pictures, I interviewed performers, visitors and volunteers. Isn't that credible? So I'm not a professional, and Maryland Day may not be a big deal—but what's preventing my (and my classmates') coverage of Maryland Day from being credible? The fact that we're students? The fact that we weren't paid to do it? (Unless you count a pass/fail grade…) I'm not sure. I don't really see the difference between the information and materials that I provided and the information and materials that a professional could have provided.

Here’s another story, free of bias since I didn’t participate and I didn’t write it. I’m just reporting.

This CNN article, "Citizens monitor Gulf Coast after oil spill," tells the story of an MIT student trying to make a difference. Jeffrey Warren is walking up and down the Louisiana coast "holding a kite string that's tethered to a helium-filled trash bag and a point-and-shoot camera." Sounds ridiculous, but he's doing it anyway. Why? Because the professional satellites we have covering the oil spill are really freakin' far away, and the pictures they take, and the efficiency with which they take them, are just not that helpful. So citizen journalist Warren is doing his part and documenting the accident himself. I give him credibility, but I'm not a pro.

To sum up my longest blog post in a while: this class has taught me to be a citizen journalist. To create news. To efficiently and effectively produce credible content. To utilize various social media tools in combination with different technological toys like digital cameras and iPods. Am I credible?

Well, you just read this whole thing—didn’t you?


Happy Thursday :)

Thursday, April 22, 2010

Personalize me Cap'n!

"I wanna talk about me!" - Toby Keith

We’ve been talking about personalization quite often in class lately; probably because the rest of the world is talking about personalization, well, almost constantly.

But what is personalization? Is it being able to pick what fonts your website is displayed in? Is it making sure that your computer remembers your Facebook password for easy login? Or is it being able to download whichever app you want, when you want it straight to your handheld iPod?

D. None of the above.

Personalization "is a computation-based application that takes a well-defined set of inputs and returns one (or more) recommendations for a piece of content to be immediately served to an end user." Or, personalization is a mouthful.

So let's break it down. Google may be one of the best and most familiar examples of online personalization. Whether you're aware of it or not, Google tracks everything you search. I have a Gmail account, and I logged in to my account and went to my privacy settings to try to better "hide" myself from Google. What did I find? Every. Single. Thing. That I have googled since I bought this laptop. All of it. Scary, huh? Immensely.

So anyway, Google takes all of that information, stores it, and uses it to send you the best (according to its computer brain) results for you personally. I was trying to find a place to go out for lunch the other day, and I typed “lunch restaurants in” and what comes up? Bethesda. It read my mind. Scary again. Useful? You betcha. Still scary.
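If you're curious what that boils down to, here's a tiny toy sketch I put together (in Python, and definitely not how Google actually does it): treat the stored stuff, like past searches and a home city, as that "well-defined set of inputs," and use it to re-order otherwise identical results. Every name, number and search result in it is made up purely for illustration.

```python
# Toy personalization sketch: inputs (a query plus what the site knows about you)
# go in, a re-ranked list of recommendations comes out. Purely illustrative.
from dataclasses import dataclass

@dataclass
class UserProfile:
    past_searches: list[str]  # e.g. everything I've googled on this laptop
    home_city: str            # e.g. the city Google keeps assuming I'm in

def personalize(query: str, candidates: list[str], profile: UserProfile) -> list[str]:
    """Re-rank generic results using stored signals about the user."""
    def score(result: str) -> int:
        text = result.lower()
        # Plain relevance: how many query words show up in the result.
        s = sum(1 for word in query.lower().split() if word in text)
        # Location signal: boost results that mention the user's city.
        if profile.home_city.lower() in text:
            s += 2
        # History signal: boost results that echo past searches.
        s += sum(1 for past in profile.past_searches if past.lower() in text)
        return s
    return sorted(candidates, key=score, reverse=True)

me = UserProfile(past_searches=["cheesecake recipe", "restaurants"],
                 home_city="Bethesda")
results = ["Lunch restaurants in Bethesda, MD",
           "Lunch restaurants in Dallas, TX",
           "The history of lunch"]
print(personalize("lunch restaurants in", results, me))
# The Bethesda result floats to the top -- which is basically my lunch story in code.
```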

This web article, written on March 3, 2010, claims that Google personalizes about 20 percent of search results, usually the top results on your Google search.

Now this may lead you more quickly to the site you do actually want, but is it entirely a good thing? Is it making users close-minded? Is it preventing us from moving outside of our comfort zones? It's helping prevent me from eating anywhere besides Bethesda, that's for sure. So is it helping or hurting? Broadening our horizons or erasing them? I'm not so sure. I'm a googler; I google everything. I think it's because I'm very inquisitive. But because I click the first site every time, am I only reading what Google wants me to read? Am I letting Google do some of my thinking for me? I hope not. I consider myself a pretty good thinker; I need to think about this one on my own & give Google the night off. I urge you to do the same!

Wednesday, April 21, 2010

I love Rock N' Roll. And Nintendo.

"Maaarriooooooooooooooooooo" -- Luigi

Two words: Super. Nintendo.

Selected by none other than me as 1991’s greatest invention. Possibly top 5 all time. It’s really that great.

My aunt and uncle still have one. Every time I visit their house, I plug in this game system, smack it a few times (the only way to turn it on) and make sure all the dust is out of the bottom of the video game itself (they're kind of hollow). Totally. Worth it.


For those of you who don't share my feelings or my extensive knowledge of Super Nintendo, here's some background. In short, Nintendo released the Nintendo Entertainment System (NES) in 1985 and this device "single-handedly revitalized the video game industry." There were over 60 million NES units sold, and people could finally play new and exciting video games in the comfort of their own homes.

Super Nintendo was introduced in 1991 and featured 16-bit technology, which meant more processing power, which meant cooler games. Yeah, the graphics aren't great, and okay, the games weren't the greatest, but COME ON. They're definitely fun, and they're definitely a worthwhile waste of time. Old-school Mario, Clue, golf, even the shoot-'em-up games that everyone seems to be obsessed with.

And then, Nintendo one-upped themselves.

That's right. Nintendo 64. 1996.

It even has its own Facebook fan page, with 86,631 fans. That beats the majority of the President of the United States' fan pages by approximately 80,000. Moving on.

This console marked the beginning of 3D video gaming for Nintendo and introduced the gaming world to a "realer" look and feel for its games. The controller has a joystick that allows characters to move freely around the screen, as opposed to the standard up/down/left/right that fondly brings memories of Pac-Man to mind.

One downside (a complaint from the gaming world, not me; I love the Nintendo 64, and the guys down the hall have one which they play all the time. Still awesome, no matter what): the Nintendo 64 is cartridge-based, meaning the games are not on CDs. This, to the critics, was a big mistake. CDs were considered the future, and Nintendo released their futuristic device more or less anchored in the past by these cartridge games.

Fast forward to 2006: Nintendo releases the Wii, which quickly became the "best-selling latest generation console system in the world," and let me tell you, it's awesome. Practice your serve like Andy Roddick in Wii Tennis. Mix potions under Snape's watchful eye in Harry Potter. Guide Mario through different galaxies in Super Mario Galaxy. These games are too. Fun. The graphics are great, they're easy to play, and the wireless motion-sensing controllers are really just enjoyable to play with. There are no wires, they take standard batteries, and they're MOTION SENSING. You move, your character moves; talk about personalization.

Overall, I disagree with the gaming industry. I think Nintendo rocks and they've always been on top—their games, their consoles—you think video games and you think Mario (which you can now conveniently play online), you think N64. They get my vote.

How do you guys feel?

Which company has the BEST video game console?
Nintendo
PlayStation (Sony)
Xbox (Microsoft)
Sega
Other
Happy Wednesday!

Monday, April 19, 2010

Let's talk about games, baby!


This week's assignment was to come up with a creative online game for children representing our topic. This took some serious deliberation because, as I believe I have mentioned, I am not overly creative. My topic is also very broad, and therefore a bit difficult to encapsulate in one simple game.

BUT, I think I've got it.

My game is quite simply a blend of new and old. It would be a virtual reality game, so you'd get to wear really great-looking headsets and gloves and whatever else it is that you wear when you play virtual reality games. This would obviously be the new technology part. You would then enter a virtual world that would literally be half old and half new. Old games would be on one side, new games directly opposite. New technologies facing old, old methods being overshadowed by new.

Kids would "walk through" this virtual reality and try everything out. iTunes and record players, skype and rotary phones, games like candyland or electronic games like catchphrase. There would be "levels" with different categories LIKE technology, games, music, leisure etc. and to pass each level (and get to the more fun levels like games & leisure at the top) you would have to take a quiz where you successfully utilized each old & new technology in a timed trial, or where you beat the computer in a new or old game--ya know, fun stuff.

Even better--since this is something I have seriously always wanted to do--instead of playing the games (like Candy Land, or Super Mario! or something old school like that) you could actually be in the game--Jumanji style! Minus the animals trying to kill you. How cool would that be? Someone please invent this for me.

I really hope that the next part of this assignment is not "create this game!" 'cuz then I am flat out of luck. But it's definitely cool in theory, right?

Right.

Happy Monday!

Sunday, April 18, 2010

iTunes against the world

"I'll tell you about the magic, gonna free her soul
It's like trying to tell a stranger 'bout rock and roll" - John Mellencamp


It was new, it was cheap, it was TIME’s coolest invention of 2003.

It was iTunes.

Bono, Mick Jagger, and Dr. Dre were there when Steve Jobs introduced his newest brainchild. The iTunes Music Store was Jobs' latest stroke of genius in a long line of right moves. It was a way to share music legally, to give back what was due to those who had created the music while still allowing users to download content. Songs were just 99 cents, albums were $9.99. Here's the catch: when iTunes was created, it was only available on Macs. In 2003, Mac computers represented a meager 3 percent of the computer world.

That was when Jobs made the decision: not only to sell music in an online format, but also to make the iTunes Store available to the other 97 percent of the market, the PC users. Within the next three days, a million copies of iTunes had been downloaded by PC users.

The rest is history.

When this article in TIME magazine was written, Apple had the rights to sell 400,000 songs and iPods were $499. iTunes now features more than 11 million songs, and Apple now offers several different versions of the iPod with different features at different costs:

iPod shuffle: 2GB-$59/4GB-$79

iPod nano: 8GB-$149/16GB-$179

iPod classic: 160GB-$249

iPod touch: 8GB-$199/32GB-$299/64GB-$399

iTunes, at the time, seemed to be the perfect solution to the problems of illegal online downloading: a quick, easy and efficient way to download, store and listen to your music, all in one convenient application right on your desktop. However, within a few years, fingers were starting to point back at Steve Jobs & Apple as creating problems—not solving them.

About two years after the initial launch of the iTunes Store, record labels and their executives began to worry that they had relinquished too much money, errr, power, to Steve Jobs and Apple. Apple "set the ground rules for their own business," according to several music tycoons. Record labels argued that they wanted to set their own prices for songs: give discounts on older albums and bump up the price of newer, popular songs.

Record labels may have agreed to iTunes before they fully understood the immensity of the project, a smart move (or a sketchy one) by Jobs. When the iTunes deals were made, the store was available only on Macs. Of course, a few months later, Apple released the iTunes Music Store for all platforms, opening up the cheap songs to the rest of the computer world. By 2005, iTunes was outselling traditional music stores, further upsetting the record labels and the music industry.

Now, five years later, the music industry is still not happy about iTunes' more-or-less monopoly over the current music business. Apple is now the number one American music retailer, and the music industry is not enjoying it. According to the New York Times, "the relationship (between Apple and record labels) remains as tense and antagonistic as ever."

Some labels argue that Apple is not paying its "fair share" of the money it makes off of songs. iTunes distributes the songs, but who do they really belong to? This debate was sparked 7 years ago when Jobs first introduced the iTunes Music Store, and it does not seem to be going away anytime soon.

So what do you think? Is Apple killing the music industry, or saving it? Should Apple fork over more of the royalties on the downloaded songs, or are they just keeping what they deserve?

Maybe soon we'll be able to buy the answer on iTunes.