Categories
content strategy

Content, Scalability, and Making More Pie

In the world of content, as in pies, more isn’t always better.

My mother is a good example of this; she is often charged with the task of making food for church events.  This works well, as she is fond of cooking and baking.  However, when the situation calls for 40 pounds of potato salad or pies baked at a two-per-week pace, things start to suffer.  She begins to enjoy it less than cooking for family.  She cuts some corners by buying pre-made pie crusts from the store just to keep up.  Both the process and end result are affected.

What was once sustainable in pie-making becomes unmanageable as the environment changes.  The same happens with the making of content.  In a terrific blog post titled, “Content Strategy Is About Publishing,” Erin Kissane writes:

“…the internet is made of publishing, and its new and often anarchic publishing models are messing with older models in all kinds of ways.”

This became clear in the content strategies of two major content producers: the British Broadcasting Corporation (BBC), and the Canadian Broadcasting Corporation (CBC).

The BBC announced significant budget cuts last week, including a 25% cut to the web budget. As part of this, “The Guardian” reports, “The BBC’s internet operation will see the number of web pages it publishes halve by 2013.” (emphasis mine)

Cutting back on projects/initiatives is something that the BBC has done many times in the past. In 2001, they ceased shortwave broadcasts to North America and Australasia.  They trimmed or cut entirely some of their language services in certain markets/regions later in the same decade.

The budget cuts will mean changes in the web staffing and managerial structure of the BBC.  It also means that there will be considerable changes in their content strategy in the next few years.

For a bit of history: broadcasters once enjoyed the luxury of creating content in an environment with great built-in features. Content could be created in a much different way. This is no longer the case. It is almost hard to imagine now:

  • Little or no public-facing archive
  • Automatic context provided via linear broadcast timeline
  • Massive reach in an uncrowded landscape

Capacity to produce raw content is only one part of the equation.  To be successful, the rest of the editorial workflow must be given the proper attention.  Getting the story on-air is no longer the sole aim. Editors, publishers, and those governing the long-term life cycle of the content share an equal seat at the table.  These positions need not be separate people, but each duty requires time, resources, and diligence.

As newsrooms and broadcasters look to make their content available on all platforms, additional hours are required (once people are trained) to translate the content into appropriate formats.  Translation in this case means that some things will need to be added or subtracted from the formerly-finished product in order to remain in-context and relevant to its surroundings:

  • Text version of audio content
  • Video to accompany audio content
  • Images to populate slideshows
  • Text transcripts of video content
  • Interactive/casual gaming features
  • Platform-specific metadata
  • Branding, rights management, and editing all of the above

Online video, for example, has been viable and mainstream for years.  Many content producers are only now beginning to incorporate it into their content production and editorial workflows.  The chorus has often been “all content to all platforms.”  The CBC recently said as much in a post on their “Inside the CBC” blog titled “The CBC’s Digital Content Strategy.”

“We don’t know what will work,” said Richard Stursberg, the executive vice president of English programming. “One of the big outstanding questions is how long content will live on various platforms.” But he reiterated his commitment to pushing content onto new platforms regardless: “We’re gonna have to absolutely be there,” he said; if we don’t move to these new platforms, “we just lost all our viewers.”

This brings about a question that the BBC may have asked themselves: If it takes longer to create content that is viable on a multitude of platforms, could the current page counts on the web be unsustainable? The answer was a budget cut and subsequent planned page count reduction.

Is publishing a story in audio form, with an image slideshow, a text version, and an interactive/gaming feature causing a change in focus?  Is the raw number of pages published no longer the benchmark?

A change of this magnitude allows for an alternative to the “content farm” model by offering an in-depth, robust slate of content; neither the BBC nor the CBC is a stranger to that.  As Erin Kissane writes, creation of content that fits the new modes of consumption is “…largely made up of new applications for old skills.”  That is good news for the news.  What is left is this: content producers must now reconcile the amount of time and resources required with changes in output volume.

If it means baking fewer (but better) pies (or content), then I am all for it.

[“Pie Chart” image via Flickr user net_efekt (cc: by)]

Compelling Content: Irresistibly Shareable vs. Content Farms

Does it pay to produce content that values craft, careful research, and proper grammar? Do people want content so irresistibly compelling that it must be shared? Two pieces this week out of the “New York Times” shed some light on this issue.

The first, titled “Plentiful Content, So Cheap,” says that content farms like Demand Media are capitalizing upon a near-loophole-like situation of content creation by low-paid writers. Demand Media discovers “needs” by parsing popular search requests and publishes 20,000 articles each week. Writers are paid $15-$20 for each article, and editors about $3.50 each for proofing and vetting. The texts are written to rank highly in search engine results.
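The economics described there can be sketched with quick arithmetic, using only the figures cited above. This is rough back-of-envelope math, not Demand Media’s actual budget:

```python
# Back-of-envelope weekly editorial spend at the per-article rates
# cited above (20,000 articles/week, $15-$20 per writer, $3.50 per
# editor). Rough arithmetic only -- not Demand Media's actual budget.

articles_per_week = 20_000
writer_fee_low, writer_fee_high = 15, 20  # dollars per article
editor_fee = 3.50                         # dollars per article

low = articles_per_week * (writer_fee_low + editor_fee)
high = articles_per_week * (writer_fee_high + editor_fee)
print(f"Weekly editorial spend: ${low:,.0f} to ${high:,.0f}")
# prints: Weekly editorial spend: $370,000 to $470,000
```

Even at those rock-bottom rates, the volume makes the spend substantial; the model only works because each article, however thin, earns back more in search-driven ad revenue than it cost.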

The articles produced on one content farm, eHow.com, show a definite flaw in this type of content strategy. A search on that site for clogged drain reveals articles titled “How to Clean a Clogged Drain Pipe,” “How to Open Clogged Drains,” “How to Open a Clogged Drain Full of Water,” and “How to Clear a Clogged Drain Easily,” among others. This reveals a strategy of “more is better.” An overabundance of similar/duplicative content will lead to confusion and, ultimately, a poor user experience.

Christine Anameier wrote about the end product of content farms on the Brain Traffic blog in a post titled “Sorting through the digital debris.”  She sums the situation up well:

“If the whole idea behind the site is “We know all sorts of stuff about everything,” beware. (Except for Wikipedia, which has enough critical mass to make its own rules much the way Amazon does.)…The content farms have learned to game the system, and dubious content is clogging up the works.”

The second “New York Times” article, titled “Will You Be E-Mailing This Column? It’s Awesome,” points out a different kind of content phenomenon. University of Pennsylvania researchers have been poring over the email-to-a-friend data from the “Times” itself, and have uncovered some interesting trends. Long-ish articles are popular (a surprise, there) as are articles about science (a surprise, too). Positive articles outnumber the negative ones. It appears that senders are not just trying to impress their friends with their acumen, but rather “seeking emotional communion,” according to one of the researchers, Dr. Jonah Berger.

The difference between the content featured in the two articles is not How-To guides versus canonical works, but rather a matter of intent. Quality versus quantity. Spartan, functional content has been around for ages. Lots of it. So has top-notch, compelling content. The new ingredient is the manipulation and overloading of the system in order to have content of a lesser quality supersede the real thing where it matters most: in search results.

What became apparent to me immediately was that these are two very different kinds of content. The stories topping the shared-with-a-friend lists in the “Times” are examples of content that affects people on a different level.

Content must be created and presented in a way that meets goals and objectives, rather than simply filling quotas, bloating site content holdings, and chasing search engine placement. A content farm might teach you four ways to remove that wookie from your shower drain, but it will not inspire you or fill you with awe, let alone meet a true need.

(image: “Field with farm equipment in the distance” via Flickr / Library of Congress (no known copyright restrictions))

Surviving the Distraction of Shiny New Objects

Can a shiny new object put your carefully crafted content strategy in peril?

Continuous partial attention is about scanning continuously for opportunities across a network, not solely about optimizing one’s time by multitasking. (Wikipedia)

Though Linda Stone coined the term in 1997, continuous partial attention has become more of an issue in the world of content than ever before. Real-time technologies allow greater access to more information at any moment. It could be a tweet, an email, a quick check of an RSS reader, etc. Laura Miller wrote a book on the potential impact this may have on individuals. What about the impact on businesses?

David McCandless’s Hierarchy of Digital Distractions has made yet another round on the viral info-sharing circuit. It clearly illustrates what many of the super-connected people are cycling through every minute they are awake. Just as Miller’s book points out, this lack of focus has people bouncing back and forth between the shiny new objects.

Information itself is not the only shiny new object poised to distract. The services and the hardware/gadgets that serve up the next hit garner the most acute attention. Apple’s most recent product unveiling is a…shining example.

What does all of this divided attention mean for content creators and publishers? What impact will it have on what was once a carefully crafted content strategy? Might the continuous partial attention distract to the point of sabotage?

A good content strategy will have workflows in place that address day-to-day activities.  However, there must also be a strategy and associated tactics and workflows that allow for consideration of new business opportunities as they arise.

I put the emphasis on workflow. Why? Because articles that pop up in the “Wall Street Journal” or “New York Times” get mainstream attention, and the echo chamber amplifies any potential into an unavoidable din. If a workflow is in place that allows each new shiny object to be carefully evaluated and vetted with stakeholders, it can be implemented or tabled that much sooner. And the sooner that happens, the sooner you can return to the task at hand: content.

Managers will rejoice in the fact that such a workflow exists, eagerly anticipating changes in the marketplace and technology. It is in the strategy’s best interest to have something in place that can protect it from a band-wagon jumping moment of viral hysteria.

Whether this shiny new object is a service, a network, or a new device (looking at you, iPad), there are several key questions to consider.  At the very least, the evaluation should ask two questions of a shiny new object:

  • Does this serve our customers/clients?
  • Does this fulfill an unmet business need?

Aside from the standard questions of “does this have a solid business plan” and “will this be around in the next 8 months,” here are other questions you may consider including in a shiny-new-object evaluation:

  • Would adopting a wait-and-see approach for the next quarter be appropriate?
  • What approach might our competitors adopt?
  • Which current content workflows would be impacted?
  • What is the potential return, aside from simply “being there”?
  • Does this follow our mission/tone/standards?
  • Would this cannibalize resources from things already established in the strategy?

There may be an advantage to being the first in line for a new technology. Bragging rights can count for something in some arenas.  But wouldn’t you rather be the one that does it right, rather than first? Meet your business and customer needs rather than being able to chime in with “first”?

Most likely, new business opportunities will require SOME sort of addition to existing content workflows. Be it one-time tasks of database configuration or appending metadata or modifying a taxonomy, or ongoing issues of manual content ingestion, editing, or transcoding, a disruption will probably be introduced. (Then again, sometimes disruptions become workflows.)

As with any component of a content strategy, the shiny-new-object-vetting portion must also maintain a razor-sharp focus on business objectives, the people using your product/service, and the practical realities of your current operation. What were once major distractions and time vampires will hopefully become ever-so-complete-able tasks.

(Cragars! image via Flickr user cindy47452 (cc: by-nc-sa))

Putting Metadata to Work for You

A poor metadata component in a content strategy can render precious content virtually nonexistent. Search engines cannot find it, recommendation engines cannot recommend it, and it may end up lost forever in a content purgatory of sorts. A strong metadata strategy will pay off when it considers both the limitations and opportunities found in its application.

Robert Stribley recently wrote a post on the Razorfish Scatter/Gather blog about metadata and tagging within the music library of iTunes. This got me thinking of my own experience.

The music library is one of many components that make iTunes like a Swiss Army Knife. There are movie rentals and purchases, TV shows, a photo and video library, iTunes U, the app store, and the one that I spend some quality time with each day: podcasting.

Public broadcasting has ridden the wave of podcasting’s immense growth in the past 5 years. Each month, millions of public media audio and video files are downloaded, delivering the content to the people that want it. iTunes is the most popular choice for accessing that content. Therefore, having the content appear properly in that space is very important.

As Mr. Stribley indicates in his article about the iTunes music library, there are some hurdles to clear. That holds true within the podcasting space as well. A few podcast-specific ones:

  • Store displays only 24 characters of a title in certain areas
  • Search only examines the first 12 keywords in a podcast feed
  • Feed images display in many places at 50×50 resolution
  • Channel-only indexing for search, rather than by individual episode
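A quick sanity check against those limits can be scripted. The helper below is hypothetical, not an Apple tool; the numeric limits are simply the ones listed above:

```python
# Flag podcast feed metadata that may be truncated or ignored in the
# iTunes store, using the limits cited in this post. Hypothetical
# helper -- not an Apple API.

TITLE_DISPLAY_CHARS = 24   # titles truncate in certain store views
SEARCH_KEYWORD_LIMIT = 12  # search examines only the first 12 keywords
IMAGE_DISPLAY_SIZE = 50    # feed images often display at 50x50

def check_podcast_metadata(title, keywords, image_px):
    """Return warnings for metadata likely to be cut off or skipped."""
    warnings = []
    if len(title) > TITLE_DISPLAY_CHARS:
        warnings.append(f"title truncates to {title[:TITLE_DISPLAY_CHARS]!r}")
    if len(keywords) > SEARCH_KEYWORD_LIMIT:
        warnings.append(
            f"{len(keywords) - SEARCH_KEYWORD_LIMIT} keyword(s) invisible to search")
    if image_px < IMAGE_DISPLAY_SIZE:
        warnings.append("feed image smaller than the 50x50 display size")
    return warnings
```

Running a feed’s title, keyword list, and artwork size through a check like this before publishing catches the silent truncations that would otherwise only show up once the content is live in the store.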


Apple has included a well-written summary in their “Making a Podcast” guide titled “Creating Your Feed and the Importance of Good Metadata.” Like any other space, iTunes provides its own set of limits and rules. Apple likely does this for a number of good reasons. If they let the public decide on the design, iTunes runs the risk of looking like a GeoCities site, not to mention search result speed concerns, intentional gaming of search algorithms, etc.

Reasons aside, the charge is the same – operate within the rules to get in front of that sizable arena of content-hungry people.  The key is providing the right information (directly or indirectly via search results) to those people, so they can accomplish their intended task: devouring your content.

However, there’s a compatibility balance to consider.  While it’s the dominant player in the podcasting field, content published with iTunes in mind often has multiple, additional syndication outlets:

  • Text-based RSS with other attachments
  • Widgets
  • Mobile apps
  • Dynamic pages on other sites
  • (insert here a technology developed in the next 18 months)

With that in mind, content creators can no longer publish with a single set of guidelines.  Outside of detailed specification documents, there are plenty of ways to make content as outlet-friendly and, more importantly, user-friendly:

  • Use every field to its fullest, even those outside of iTunes spec
  • Be mindful of character limitations
  • Ensure that the text is as potent as possible
  • Create graphic identities that are powerful even as thumbnails

When a strategy is drafted, consider the reality that many of these systems and components are connected, often in ways that may not be immediately evident.  The environment may change along with technology and new business opportunities, but the core business aim should remain clear.  As ever, provide the best experience possible with your content.  And that includes helping people find it, with the help of metadata.

(“iPod Touch Back (2)” image from Flickr user Fr3d.org (cc: by-nc-sa))

URL Shorteners: Are They Part of Your Social Media & Content Strategy?

How can a URL shortener impact your overall content strategy? Don’t they just shrink URLs?

I remember the appearance of the first URL shortener, TinyURL, shortly after its debut in 2002. It seemed like a novelty. There were times that I wanted to share a link that would get broken by line breaks in emails. Beyond that, what was the use?

Then Twitter arrived: a world that lived and died by character counts. Every single character became precious real estate.  Finally, the solution of URL shorteners had found its problem.

Tons of these services soon popped up, each with their own selling points and clever names. Several have already folded. Like many other web services, they are made or destroyed by The Default.

Until May 2009, the default URL shortener for Twitter was TinyURL. The new default, Bit.ly, has ridden Twitter’s wave of popularity to become the dominant service. To get an idea of the scale of the operation, ponder this: Bit.ly shortened 2.1 billion links in November 2009, as reported by TechCrunch.

At the end of 2009, Google got into the game by putting a URL shortener into Google Toolbar, FeedBurner, and their web browser, Chrome.  (It’s not currently available outside of those places.) Other sites have exclusive shorteners, too, like Facebook (fb.me) and YouTube (YouTu.be). When you share a link from those places, those short, branded domains will be used.

Not to be outdone by a Google announcement, Bit.ly announced that it was partnering with some serious heavyweights (New York Times, AOL, Wall Street Journal, Huffington Post, etc.) to offer a Pro service in limited private beta. Benefits include custom URLs (e.g., nytm.es) and real-time analytics.

There is also the option to bake a shortener into your CMS, allowing you to create a short URL of your choice, domain and all. This is considerably more heavy lifting than the free services already out there, but it offers its own set of benefits.

Why does any of this matter?

Something as seemingly innocuous as a URL shortener must figure into the social media portion of a comprehensive content strategy.  There are many, many options, and the decision on which one to use must be carefully considered.  While not as important as a content audit or the choice of a CMS, the benefits of choosing the right URL shortener are clear:

  • Solid, useful metrics
  • Contribution to (rather than hampering of) your SEO strategy
  • Prevention of link rot
  • The best possible user experience
  • Reinforced branding
  • Security and confidence

The metrics component alone may be the make-or-break feature. Most site analytics suites can track inbound traffic from Twitter.com.  But since 30% of Twitter-related link-clicking traffic comes from third-party applications like TweetDeck or Tweetie, tracking the links from the shortener end becomes considerably more important. Some of the services offer more analytics features, and they are quickly improving them.

Certain shorteners use different types of HTTP redirects to point to your original URL. Some properly refer the traffic and maintain your hard-fought SEO efforts; others do not. Most of the major players play by the rules, referring traffic on a permanent basis and never recycling links. Watch out for the ones that don’t.
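The distinction can be sketched in code. The groupings below reflect common SEO guidance about permanent versus temporary redirects; treat them as a rule of thumb, not a statement of how any particular search engine weights links:

```python
# Classify HTTP redirect status codes by whether they are generally
# understood to pass ranking signals ("link equity") to the target URL.
# Rule-of-thumb groupings, not a guarantee of search engine behavior.

PERMANENT_REDIRECTS = {301}        # safe: engines credit the destination URL
TEMPORARY_REDIRECTS = {302, 303, 307}  # risky: the short URL itself may get credited

def preserves_seo(status_code: int) -> bool:
    """True if the redirect type is generally considered SEO-safe."""
    return status_code in PERMANENT_REDIRECTS
```

Requesting a short URL without following redirects (for example, with urllib and automatic redirects disabled) reveals which status code a given service actually returns.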

What constitutes the best user experience is always up for debate. Certain shorteners, like the DiggBar and HootSuite’s ow.ly, use frames.  While frames may offer some UX benefits, they can also confuse and distract. There is also the potential for frame-based shorteners to cause metrics and SEO issues.

Shorteners are only as trustworthy as those using them. We’ve been taught to avoid unfamiliar links in emails for as long as phishing has been around. Selecting a service that uses a familiar or custom-branded URL domain will give people the confidence to click without fear of spam or malware.

So, indeed, it does matter.

What may have been something destined to be a footnote in the history of The Internets is now something vital to a comprehensive content strategy.

Speaking of shortening…


( “some people say I am obsessed with my lawn” image from Jez Page / Flickr (cc:by-nc-sa))

Privacy, Facebook, and Social Media Strategy

We’ve seen a bit of dust kicked up in the past couple of weeks surrounding three announcements from Google and Facebook about search, status updates, and an old favorite of The Internets, PRIVACY.

#1. TechCrunch reported that Google, a popular search engine, has chosen to include “public” Facebook status updates in the special real-time section of regular Google search results. MySpace and Twitter updates are already showing up there.

#2. Facebook just rolled out changes to its privacy settings. Upon login, people have been presented with a prompt asking them to either switch to the new default, recommended settings or to keep their old ones. The default setting for status updates is “everyone,” meaning that people’s updates of “I’m eating a waffle” are now set to be indexed by Google.

In an interview on ReadWriteWeb, Facebook claims that this is a reaction to the way people are using the service. Barry Schnitt, Director of Corporate Communications and Public Policy at Facebook, gives this somewhat veiled explanation: “Because the site is changing, our userbase is changing and the world changing.”

Whatever the official stance may be, it is clear that both companies stand to benefit from more updates being made publicly available, and from having those updates show up in search.

#3. The final announcement, as reported by MediaPost, will have an impact upon those that use Facebook Pages. From the start, all status updates from a Page have shown up in the news feed of Fans of that Page. Soon, Facebook will be inserting an algorithm into that process, giving preference to Page owners that have a higher level of engagement with their Fans. (Engagement, in this case, being people “Liking” and commenting upon updates.) Simply stated, the more interaction a Page has, the more of those updates will appear in Fans’ News Feeds.

What does this mean for those that operate Fan Pages on Facebook?

If your strategy has been to set one up and wait for people to make it viral, then you have your work cut out for you.

If it weren’t already clear, it should be blindingly so now: your social media strategy is a content strategy. Status updates are content. Links and Notes in Facebook get indexed. Photos on your Page will soon show up in search.

No need for alarm.  If dealt with properly, it can be cause for celebration. Reach can now be extended outside of the walled garden of Facebook and into the wide open world of search. More traffic can potentially be driven to Pages and sections of Pages, like Notes, Photos, and Video.

There will be some work involved. Page owners may need to take a new approach to the content they publish to those pages.

Facebook’s new News Feed insertion rules depend on the mojo granted from comments and “Likes.” Page administrators will need to curry the favor of those Fans. To do this, publishers and content creators should apply the same SEO-friendly rules to their Facebook content and updates that are likely in place for other content:

  • Clear, descriptive titles
  • Concise summaries
  • Relevant images, video, audio content
  • Proper metadata attribution
  • Hooks!

For other content creators and publishers, these changes may be a wake-up call. Maybe the updates that have been published up to this point haven’t been that appealing. Perhaps they were posted just to post something, to stay in front of Fans. Now may be the time to look at a larger content strategy. Or, ask bigger questions, such as “What are we doing here?”

As more outlets (and changes in those outlets’ policies) present themselves, they’ll need to be incorporated into strategies, workflows, governance, and metrics. Unlike other, more static components of a content strategy, the rules and tactics in social media are likely to change week to week. User agreement changes, acquisitions, and partnerships with competitors can all force a re-examination.

Even what seems to be a relatively minor change in a single channel may turn out to have a sizable impact.

[“Self Noir” image via Flickr user Jeremy Brooks (CC: by-nc)]