
Surviving the Distraction of Shiny New Objects

Can a shiny new object put your carefully crafted content strategy in peril?

“Continuous partial attention is about scanning continuously for opportunities across a network, not solely about optimizing one’s time by multitasking.” – Wikipedia

Though Linda Stone coined the term in 1997, continuous partial attention has become more of an issue in the world of content than ever before. Real-time technologies allow greater access to more information at any given moment: a tweet, an email, a quick check of an RSS reader. Laura Miller wrote a book on the potential impact this may have on individuals. What about the impact on businesses?

David McCandless’s Hierarchy of Digital Distractions has made yet another round on the viral info-sharing circuit. It clearly illustrates what many super-connected people are cycling through every minute they are awake. Just as Miller’s book points out, the lack of focus has people bouncing back and forth between shiny new objects.

Information itself is not the only shiny new object poised to distract. The services and the hardware/gadgets that serve up the next hit garner the most acute attention. Apple’s most recent product unveiling is a…shining example.

What does all of this divided attention mean for content creators and publishers? What impact will it have on what was once a carefully crafted content strategy? Might the continuous partial attention distract to the point of sabotage?

A good content strategy will have workflows in place that address day-to-day activities.  However, there must also be a strategy and associated tactics and workflows that allow for consideration of new business opportunities as they arise.

I put the emphasis on workflow. Why? Because articles that pop up in the “Wall Street Journal” or “New York Times” get mainstream attention, and the echo chamber amplifies any potential into an unavoidable din. If a workflow is in place that allows each new shiny object to be carefully evaluated and vetted with stakeholders, each one can be implemented or tabled that much sooner. And the sooner that happens, the sooner you can return to the task at hand: content.

Managers will rejoice in the fact that such a workflow exists, eagerly anticipating changes in the marketplace and technology. It is in the strategy’s best interest to have something in place that can protect it from a bandwagon-jumping moment of viral hysteria.

Whether the shiny new object is a service, a network, or a new device (looking at you, iPad), there are several key questions to consider. At the very least, the vetting component should ask two questions of any shiny new object:

  • Does this serve our customers/clients?
  • Does this fulfill an unmet business need?

Aside from the standard questions of “does this have a solid business plan?” and “will this be around in the next 8 months?”, here are other questions to consider including in a shiny-new-object evaluation (a minimal sketch of such a checklist follows the list):

  • Would adopting a wait-and-see approach for the next quarter be appropriate?
  • What approach might our competitors adopt?
  • Which current content workflows would be impacted?
  • What is the potential return, aside from simply “being there”?
  • Does this follow our mission/tone/standards?
  • Would this cannibalize resources from things already established in the strategy?
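
To make that evaluation routine rather than a fire drill, the checklist can even be encoded into a lightweight tool. Here is a minimal sketch in Python, assuming hypothetical question lists and a simple yes/no scoring rule; it illustrates the vetting step, not a prescription:

```python
# A hypothetical vetting checklist: the two gate questions above must both
# be "yes" before the follow-ups are even counted.
GATE_QUESTIONS = [
    "Does this serve our customers/clients?",
    "Does this fulfill an unmet business need?",
]

FOLLOW_UP_QUESTIONS = [  # phrased so that "yes" is favorable
    "Is the potential return more than simply 'being there'?",
    "Does this follow our mission/tone/standards?",
    "Can current content workflows absorb the impact?",
    "Can we adopt this without cannibalizing established work?",
]

def vet(answers: dict) -> str:
    """Rough verdict: table the object unless both gate questions pass
    and at least half of the follow-ups look favorable."""
    if not all(answers.get(q, False) for q in GATE_QUESTIONS):
        return "table it"
    favorable = sum(bool(answers.get(q, False)) for q in FOLLOW_UP_QUESTIONS)
    return "evaluate further" if favorable * 2 >= len(FOLLOW_UP_QUESTIONS) else "table it"

# An object that serves customers but meets no unmet business need is tabled.
print(vet({"Does this serve our customers/clients?": True}))  # -> "table it"
```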

There may be an advantage to being first in line for a new technology; bragging rights can count for something in some arenas. But wouldn’t you rather be the one that does it right, not just first? Meet your business and customer needs rather than merely being able to chime in with “first”?

Most likely, new business opportunities will require SOME sort of addition to existing content workflows. Whether it’s one-time tasks (database configuration, appending metadata, modifying a taxonomy) or ongoing work (manual content ingestion, editing, transcoding), a disruption will probably be introduced. (Then again, sometimes disruptions become workflows.)

As with any component of a content strategy, the shiny-new-object-vetting portion must also maintain a razor-sharp focus on business objectives, the people using your product/service, and the practical realities of your current operation. What were once major distractions and time vampires will hopefully become ever-so-complete-able tasks.

(Cragars! image via Flickr user cindy47452 (cc: by-nc-sa))

Putting Metadata to Work for You

A poor metadata component in a content strategy can render precious content virtually nonexistent. Search engines cannot find it, recommendation engines cannot recommend it, and it may end up lost forever in a content purgatory of sorts. A strong metadata strategy will pay off when it considers both the limitations and opportunities found in its application.

Robert Stribley recently wrote a post on the Razorfish Scatter/Gather blog about metadata and tagging within the iTunes music library. This got me thinking about my own experience.

The music library is one of many components that make iTunes a kind of Swiss Army knife. There are movie rentals and purchases, TV shows, a photo and video library, iTunes U, the App Store, and the one I spend some quality time with each day: podcasting.

Public broadcasting has ridden the wave of podcasting’s immense growth over the past five years. Each month, millions of public media audio and video files are downloaded, delivering the content to the people who want it. iTunes is the most popular choice for accessing that content, so having the content appear properly in that space is very important.

As Mr. Stribley indicates in his article about the iTunes music library, there are some hurdles to clear. That holds true within the podcasting space as well. A few podcast-specific ones (a quick validation sketch follows the list):

  • Store displays only 24 characters of a title in certain areas
  • Search only examines the first 12 keywords in a podcast feed
  • Feed images display in many places at 50×50 resolution
  • Channel-only indexing for search, rather than by individual episode
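
Limits like these can be checked mechanically before a feed ever reaches the Store. Below is a minimal sketch in Python, assuming a local feed.xml and treating the 24-character and 12-keyword figures above as rules of thumb rather than a published spec:

```python
# A rough pre-flight check of a podcast feed against the limits above.
# The thresholds mirror the list in this post; they are not Apple's spec.
import xml.etree.ElementTree as ET

ITUNES_NS = "{http://www.itunes.com/dtds/podcast-1.0.dtd}"

def check_feed(path: str) -> list:
    warnings = []
    channel = ET.parse(path).getroot().find("channel")

    title = (channel.findtext("title") or "").strip()
    if len(title) > 24:
        warnings.append(f"Title is {len(title)} characters; the Store may truncate it at 24.")

    keywords = channel.findtext(ITUNES_NS + "keywords") or ""
    terms = [k.strip() for k in keywords.split(",") if k.strip()]
    if len(terms) > 12:
        warnings.append(f"{len(terms)} keywords; search may read only the first 12.")

    return warnings

for warning in check_feed("feed.xml"):
    print(warning)
```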


Apple has included a well-written summary in their “Making a Podcast” guide titled “Creating Your Feed and the Importance of Good Metadata.” Like any other space, iTunes provides its own set of limits and rules. Apple likely does this for a number of good reasons. If they let the public decide on the design, iTunes runs the risk of looking like a GeoCities site, not to mention search result speed concerns, intentional gaming of search algorithms, etc.

Reasons aside, the charge is the same – operate within the rules to get in front of that sizable arena of content-hungry people.  The key is providing the right information (directly or indirectly via search results) to those people, so they can accomplish their intended task: devouring your content.

However, there’s a compatibility balance to consider. While iTunes is the dominant player in the podcasting field, content published with it in mind often has multiple additional syndication outlets:

  • Text-based RSS with other attachments
  • Widgets
  • Mobile apps
  • Dynamic pages on other sites
  • (insert here a technology developed in the next 18 months)

With that in mind, content creators can no longer publish with a single set of guidelines. Outside of detailed specification documents, there are plenty of ways to make content outlet-friendly and, more importantly, user-friendly (a sketch of a fully fielded episode follows the list):

  • Use every field to its fullest, even those outside of iTunes spec
  • Be mindful of character limitations
  • Ensure that the text is as potent as possible
  • Create graphic identities that are powerful even as thumbnails
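
As a concrete, hypothetical illustration of “use every field to its fullest,” here is what a single episode record might look like with both the standard RSS fields and the iTunes-specific ones populated, so the same item degrades gracefully in outlets that ignore the iTunes tags (all values are invented):

```python
# One episode, fully fielded. Standard RSS fields are read by nearly every
# outlet; the iTunes-namespace fields are ignored elsewhere but essential
# in the Store. Values here are made-up examples.
episode = {
    # Standard RSS 2.0
    "title": "Ep. 12: Metadata at Work",  # front-load the important words
    "description": "How strong metadata keeps content findable.",
    "enclosure": {
        "url": "https://example.org/audio/ep12.mp3",
        "type": "audio/mpeg",
        "length": "23456789",
    },
    # iTunes podcast namespace
    "itunes:subtitle": "Metadata keeps content findable.",
    "itunes:summary": "A longer description for the Store page.",
    "itunes:keywords": "metadata, podcasting, content strategy",
    "itunes:image": "https://example.org/art.jpg",  # must still read at 50x50
}
```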

When strategy is drafted, consider the reality that many of these systems and components are connected, often in ways that may not be immediately evident. The environment may change along with technology and new business opportunities, but the core business aim should remain clear. As ever, provide the best experience possible with your content. And that includes helping people find it, with the help of metadata.

(“iPod Touch Back (2)” image from Flickr user Fr3d.org (cc: by-nc-sa))

URL Shorteners: Are They Part of Your Social Media & Content Strategy?

How can a URL shortener impact your overall content strategy? Don’t they just shrink URLs?

I remember the first URL shortener, TinyURL, debuting in 2002. It seemed like a novelty. There were times I wanted to share a link that would get broken by line wraps in emails. Beyond that, what was the use?

Then Twitter arrived: a world that lived and died by character counts. Every single character became precious real estate. Finally, the solution of URL shorteners had found its problem.

Tons of these services soon popped up, each with their own selling points and clever names. Several have already folded. Like many other web services, they are made or destroyed by The Default.

Until May 2009, the default URL shortener for Twitter was TinyURL. The new default, Bit.ly, has ridden Twitter’s wave of popularity to become the dominant service. To get an idea of the scale of the operation, ponder this: Bit.ly shortened 2.1 billion links in November 2009, as reported by TechCrunch.

At the end of 2009, Google got into the game by putting a URL shortener into Google Toolbar, FeedBurner, and its web browser, Chrome. (It’s not currently available outside of those places.) Other sites have exclusive shorteners, too, like Facebook (fb.me) and YouTube (youtu.be). When you share a link from those places, those short, branded domains will be used.

Not to be outdone by a Google announcement, Bit.ly announced that it was partnering with some serious heavyweights (New York Times, AOL, Wall Street Journal, Huffington Post, etc.) to offer a Pro service in limited private beta. Benefits include custom URLs (e.g., nytm.es) and real-time analytics.

There is also the option to bake a shortener into your CMS, letting you create short URLs of your choosing, domain and all. This is considerably more heavy lifting than using the free services already out there, but it offers its own set of benefits. A minimal sketch of the idea follows.
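
The core of a homegrown shortener is small: store each long URL under a numeric id and encode that id in base62 to keep the path short. Here is a minimal sketch in Python; the ex.mpl domain is invented, and a real build would persist the mapping in the CMS database and answer each short path with a permanent (301) redirect, as discussed below:

```python
# Base62-encode a numeric row id into a short path on your own domain.
# In a real CMS this id would come from the content database.
import string

ALPHABET = string.digits + string.ascii_letters  # 0-9a-zA-Z: 62 characters

def encode(n: int) -> str:
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, remainder = divmod(n, 62)
        chars.append(ALPHABET[remainder])
    return "".join(reversed(chars))

# Article id 125 maps to a two-character path on a (hypothetical) domain.
print("http://ex.mpl/" + encode(125))  # -> http://ex.mpl/21
```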

Why does any of this matter?

Something as seemingly innocuous as a URL shortener must figure into the social media portion of a comprehensive content strategy.  There are many, many options, and the decision on which one to use must be carefully considered.  While not as important as a content audit or the choice of a CMS, the benefits of choosing the right URL shortener are clear:

  • Solid, useful metrics
  • Support for (rather than harm to) your SEO strategy
  • Prevention of link rot
  • The best possible user experience
  • Reinforced branding
  • Security and confidence

The metrics component alone may be the make-or-break feature. Most site analytics suites can track inbound traffic from Twitter.com. But since 30% of Twitter-related link-clicking traffic comes from third-party applications like TweetDeck or Tweetie, tracking the links from the shortener’s end becomes considerably more important. Some of the services offer more analytics features, and they are quickly improving them.

Shorteners use different types of HTTP redirects to point to your original URL. Some properly refer the traffic with permanent (301) redirects and maintain your hard-fought SEO efforts; others do not. Most of the major players play by the rules: they redirect permanently and do not recycle links. Watch out for the ones that don’t. (A quick way to audit a shortener appears below.)
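
Auditing a shortener takes only a few lines. This sketch, using just the Python standard library, follows a short link hop by hop and prints each status code; the bit.ly URL is a made-up example, and 301 is the status you want to see:

```python
# Walk a short link's redirect chain manually so each hop's status code
# (301 vs. 302, etc.) can be inspected.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # stop auto-following so each hop surfaces as an HTTPError

def trace(url: str) -> None:
    opener = urllib.request.build_opener(NoRedirect)
    while True:
        try:
            response = opener.open(url)
        except urllib.error.HTTPError as err:
            if err.code in (301, 302, 303, 307, 308):
                print(err.code, url, "->", err.headers.get("Location"))
                url = err.headers["Location"]
                continue
            raise
        print(response.status, url)  # final destination reached
        break

trace("http://bit.ly/example")  # hypothetical short link
```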

What constitutes the best user experience is always up for debate. Certain shorteners like the Diggbar and HootSuite’s ow.ly shortener use frames.  While there may be some UX benefits with frames for users, they can also confuse and distract. There is the potential for the frame-based shorteners to cause metrics and SEO issues as well.

Shorteners are only as trustworthy as those using them. We’ve been taught to avoid unfamiliar links in emails for as long as phishing has been around. Selecting a service that uses a familiar or custom-branded URL domain will give people the confidence to click without fear of spam or malware.

So, indeed, it does matter.

What may have been something destined to be a footnote in the history of The Internets is now something vital to a comprehensive content strategy.

Speaking of shortening…


( “some people say I am obsessed with my lawn” image from Jez Page / Flickr (cc:by-nc-sa))

Privacy, Facebook, and Social Media Strategy

We’ve seen a bit of dust kicked up in the past couple of weeks surrounding three announcements from Google and Facebook about search, status updates, and an old favorite of The Internets: PRIVACY.

#1. TechCrunch reported that Google, a popular search engine, has chosen to include “public” Facebook status updates in the special real-time section of regular Google search results. MySpace and Twitter updates are already showing up there.

#2. Facebook just rolled out changes to its privacy settings. Upon login, people have been presented with a prompt asking them either to accept the new recommended default settings or to keep their old ones. The default setting for status updates is “everyone,” meaning that people’s updates of “I’m eating a waffle” are now set to be indexed by Google.

In an interview on ReadWriteWeb, Facebook claims that this is a reaction to the way people are using the service. Barry Schnitt, Director of Corporate Communications and Public Policy at Facebook, gives this somewhat veiled explanation: “Because the site is changing, our userbase is changing and the world changing.”

Whatever the official stance may be, it is clear that both companies stand to benefit from more updates being made publicly available, and from having those updates show up in search.

#3. The final announcement, as reported by MediaPost, will have an impact on those that use Facebook Pages. From the start, all status updates from a Page have shown up in the News Feed of that Page’s Fans. Soon, Facebook will insert an algorithm into that process, giving preference to Page owners that have a higher level of engagement with their Fans. (Engagement, in this case, means people “Liking” and commenting upon updates.) Simply stated, the more interaction a Page has, the more of its updates will appear in Fans’ News Feeds. (A toy illustration of the principle follows.)
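
Facebook has not published the formula, so the following is only a toy sketch of the stated principle, with invented weights: updates that earn more Likes and comments raise a Page’s score, and a higher score earns more News Feed placement.

```python
# A toy engagement score -- NOT Facebook's actual algorithm. Comments are
# weighted more heavily than Likes on the guess that they signal deeper
# engagement; both weights are invented for illustration.
def engagement_score(likes: int, comments: int, fans: int) -> float:
    if fans == 0:
        return 0.0
    return (likes + 2 * comments) / fans

# An update with 40 Likes and 10 comments on a 1,000-Fan Page:
print(engagement_score(40, 10, 1000))  # -> 0.06
```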

What does this mean for those that operate Fan Pages on Facebook?

If your strategy has been to set one up and wait for people to make it viral, then you have your work cut out for you.

If it weren’t already clear, it should be blindingly so now: your social media strategy is a content strategy. Status updates are content. Links and Notes in Facebook get indexed. Photos on your Page will soon show up in search.

No need for alarm.  If dealt with properly, it can be cause for celebration. Reach can now be extended outside of the walled garden of Facebook and into the wide open world of search. More traffic can potentially be driven to Pages and sections of Pages, like Notes, Photos, and Video.

There will be some work involved. Page owners may need to take a new approach to the content they publish to those pages.

Facebook’s new News Feed insertion rules depend on the mojo granted by comments and “Likes.” Page administrators will need to curry favor with those Fans. To do this, publishers and content creators should apply the same SEO-friendly rules to their Facebook content and updates that are likely in place for other content:

  • Clear, descriptive titles
  • Concise summaries
  • Relevant images, video, audio content
  • Proper metadata attribution
  • Hooks!

For other content creators and publishers, these changes may be a wake-up call. Maybe the updates that have been published up to this point haven’t been that appealing. Perhaps they were posted just to post something, to stay in front of Fans. Now may be the time to look at a larger content strategy. Or, ask bigger questions, such as “What are we doing here?”

As more outlets [and changes in those outlets’ policies] present themselves, they’ll need to be incorporated into strategies, workflows, governance, and metrics. Unlike other, more static components of a content strategy, the rules and tactics in social media are likely to change week to week. User agreement changes, acquisitions, and partnerships with competitors can all force a re-examination.

Even what seems to be a relatively minor change in a single channel may turn out to have a sizable impact.

[“Self Noir” image via Flickr user Jeremy Brooks (CC: by-nc)]

Is Your Content ‘Good Enough’?

A couple of weeks ago, the Geek Girls Guide featured a post about the relatively new and inevitable phenomenon of “Good Enough.” Sometimes the answer to a need is less about the most-est and more about the best-suited. The example is the handy Flip camera: not the most sophisticated, high-tech, hi-res device, but good enough for most people. (Myself included.)

Content creators and publishers enjoy a luxury that the hardware and device manufacturers do not: the flexibility of multiple versions of content. This concept can apply to many different kinds of content: feeds, documents, or other file types. It also applies to variations of those, like resolution and format.

Enter the stadium of rock-n-roll content resolution. Radiohead guitarist Jonny Greenwood and Nine Inch Nails founder Trent Reznor have opposing viewpoints on “Good Enough.” They shared their views in a recent set of “New Yorker” articles.

Greenwood argues that average mp3 files are fine:

We had a few complaints that the MP3s of our last record wasn’t encoded at a high enough rate. Some even suggested we should have used FLACs, but if you even know what one of those is, and have strong opinions on them, you’re already lost to the world of high fidelity and have probably spent far too much money on your speaker-stands.

Reznor, on the other hand, argues:

Walk into a Best Buy and everyone’s obsessed with the highest possible resolution for their TVs. 1080p versus 1080i resolution, hundred-dollar HDMI video cables…yet everyone still walks around with those terrible quality white iPod ‘earbuds’…I want you to have the same feeling I do sitting in the studio listening to a final mix, surrounded by sound, in heaven.

There are two clearly different attitudes here.

The first looks at the way things are: why go to so much trouble if most people are going to take the low-resolution road anyway? The second looks more at the way things could be: offer the high-resolution experience, but offer convenience as well.

This gets to the core of the issue: the user experience.

Lots of folks are happy with average-quality resolution mp3s. But what of the people that are looking for a better user experience?

Radiohead put their most recent full-length album, “In Rainbows,” up for pay-what-you-like downloading. But they used mediocre-quality mp3s, encoded at 160kbps. The album was eventually released on CD and vinyl LP.

Reznor, however, released a recent album, “Ghosts,” in a much different way. Five different ways, in fact:

  • First 9 tracks in high-quality 320kbps mp3 – FREE
  • All 36 tracks, in one of three hi-res formats – $5
  • 2-CD set, with access to above downloads – $10
  • Above, plus data DVD with files for remixing, plus Blu-Ray ultra-fidelity version and hardcover book – $75
  • Above, plus the album on 4-LPs and Reznor’s autograph, limited to 2,500 – $300 [sold out]

You can easily give people too many choices. Reznor didn’t release the album on cassette or 8-Track. [Even though some people would likely have bought them.] However, he did release it in a way that offered people a variety of attractive, relevant choices.

Consider the option of multiple versions early in the analysis phase. Know the client, know the audience. Keep the design and interface clean. Clearly explain what is available. People want choices, but not to be overwhelmed.
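
On the production side, offering tiers does not have to be onerous. Here is a minimal sketch in Python that shells out to ffmpeg (a real, widely used encoder) to transcode one master file into two bitrate tiers; the file names and tier labels are invented:

```python
# Transcode one master recording into multiple mp3 tiers with ffmpeg.
# "160k" mirrors the In Rainbows bitrate; "320k" mirrors the Ghosts one.
import subprocess

TIERS = {"good-enough": "160k", "hi-fi": "320k"}

for label, bitrate in TIERS.items():
    subprocess.run(
        ["ffmpeg", "-y", "-i", "master.wav",      # hypothetical master file
         "-codec:a", "libmp3lame", "-b:a", bitrate,
         f"album-{label}.mp3"],
        check=True,  # raise if an encode fails
    )
```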

People that choose their own version of “Good Enough” will think your company is more than just good enough. They may think it is awesome, even.

[“Like a record baby” image via Flickr user itchy73 (cc: by-nc-sa)]

Evergreen Content: Not All That Evergreen

The ever-wise and observant Jennifer Kane, consultant at Kane Consulting, sent a tweet yesterday that got me thinking about my own history with “evergreen content.”

In the radio world, an episode of a program with date-neutral content is produced to fill air time should calamity strike [snapped tapes, sunspots, old-fashioned user error, etc.]. These episodes are called “evergreens,” and they are used only as a last resort to avoid the dreaded “dead air.”

They are replaced often to keep things somewhat current, actually making them less “evergreen” than the name implies.

Evergreen content is thought to be a godsend to some content creators and publishers. It was relevant before it was even published.  It will be relevant for all eternity. It is EVER GREEN.

But what content is really evergreen? What content does not require some degree of maintenance? What content is not made out-of-date by SOMETHING? Only content that few will find really valuable or compelling.

What could possibly happen if content is dealt with in the Ronco Rotisserie manner (“set it and forget it”)? Lots of things. Your content might:

  • Become out-of-date
  • Gain a new context
  • Become unusable due to CMS updates
  • Expire, from a legal or rights standpoint
  • Confuse people with multiple versions
  • And more, unfortunately

All of these can lead to a terrible user experience and, even worse, legal liability.

Publishers and content creators are strapped for time and resources. Many are too busy pushing the content out the door, leaving no time to put a proper content maintenance strategy in place.

Having a strategy in place that considers the lifespan and life cycle of content can help avoid these issues. Good questions to ask when putting one together (a sketch of one possible trigger mechanism follows the list):

  • Is the content good for 6 hours or 6 months?
  • How do editorial considerations apply?
  • Are different versions of content tracked properly?
  • Should it be archived or deleted?
  • What triggers activities like archiving and relocation?
  • What stays on site, what shows up only in searches?
  • Are legal contracts, rights, and obligations in sync with content?
  • How are new contextual opportunities managed?
  • How are new business opportunities applied to existing content?
  • Plus many other considerations.
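
Several of these questions, especially the ones about triggers and expiration, can be answered with very little machinery. Here is a minimal sketch in Python, assuming invented field names rather than any particular CMS: each item carries a review date and an optional rights-expiry date, and a scheduled job flags what needs attention.

```python
# Flag content that is due for editorial review or whose rights have
# expired. Field names are hypothetical, not from any particular CMS.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class ContentItem:
    slug: str
    review_on: date                       # "good for 6 hours or 6 months?"
    rights_expire: Optional[date] = None  # legal/rights deadline, if any

def needs_attention(item: ContentItem, today: date) -> List[str]:
    actions = []
    if today >= item.review_on:
        actions.append("review for currency and new context")
    if item.rights_expire and today >= item.rights_expire:
        actions.append("archive or delete: rights expired")
    return actions

# A promo whose rights lapsed yesterday gets flagged for both actions.
item = ContentItem("pledge-drive-promo", date(2010, 3, 1), date(2010, 6, 1))
print(needs_attention(item, date(2010, 6, 2)))
```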

Plans take time.  Content maintenance strategies take time.  Workable content strategies take time.

Putting a strategy together may add more to existing workflows. Editorial oversight requires staff resources. You’ll confront long-term CMS issues. But a good maintenance strategy can also surface new opportunities as new business models and developments pop up. The final result? Better than evergreen.

You will soon find out that your content can remain vibrant and relevant long after it has been published.

[“Pine tree / 松(まつ” image by Flickr user TANAKA Juuyoh (田中十洋) (CC:at)]