Press "Enter" to skip to content

A link too far?

So, it is going to be a spectacle, especially during a slow news week, when one media company sues another for hyperlinking:
Gatehouse sues NYT Co. over local Websites (Boston.com)
Gatehouse Media sues New York Times Co. over copyright issues (WickedLocal.com)

(For ongoing analysis, Dan Kennedy at Northeastern University is tracking the case at Media Nation – he has posted the complaint here.)

They say all politics is local, and even on the Web, most business is as well. So in this case there may be a disconnect between the open ideals of the Internet and the cold reality of Web site publishers trying to compete with giants such as Boston.com. To put it another way – it is easy to be high-minded about such things until it is your ox being gored.

The ‘what are they thinking’ perspective is ably represented by Jeff Jarvis and Mark Potts:
When did Gatehouse become clueless
Gatehousegate

The other side of the story (What they ARE thinking) is so far represented only by the GateHouse complaint.

As someone who competes for online readers in the broader Boston market, I can understand GateHouse’s concern. But I think (and hope) this might turn out to be a technological and design problem, not a legal one in the end.

After all, Boston.com is basically doing what most Websites do – they are aggregating content and linking to original sources. So, it is hard to imagine GateHouse winning outright with this complaint. And, if they did, it is hard to imagine that the case law thus created would be 100% beneficial to anyone. Let’s all hope they get some mediation and a settlement.

As to the merits, on a first read, the trademark dilution complaint appears most on target (to a non-lawyer, anyway). The design of the Boston.com Newton page seems to imply that WickedLocal.com – mixed in with Globe stories and blogs – is just another NYT property. A quick fix there might be to simply separate Globe and ‘other’ content into different news lists with different headers. Just make it clear what Boston.com owns and what they don’t. That is the design solution.

A larger problem (at least for a smaller media property competing in the Boston market) is Google juice. This is where we need a tech solution.

Many small papers have an ongoing complaint that any Web-first breaking news they publish shows up quickly on larger regional Web sites via sharing with AP. The issue is not that AP picks up Web stories, nor that Boston.com (among others) feeds them to their site. The problem is that Google gives big Web sites preference in their search rankings, regardless of whether or not they are the original source for the content.

This happens on a weekly basis when a murder or natural disaster story hits our Web site. If we publish at 9:00 a.m., it gets to AP by 9:30 a.m., and before 10:00 a.m. Google News has Boston.com, WCAX.com, BostonHerald.com, etc. at the top of the search results – while our original and ongoing reporting sits in the middle or bottom of the page.

Imagine this same scenario for WickedLocal. If “Your Town” eventually expands to 125 communities, who is going to get the search traffic for Newton TAB stories? One would assume Boston.com will get a high rank – and a potentially lopsided share of those first clicks. To be fair, I don’t see a strong indication of this effect yet, but check out this search result and you can see the beginning of it. So, if 100 readers click to Boston.com and 30 click through to the WickedLocal story, is that good? And, is that a gain of 30 for GateHouse or a loss of 70?

So – the ‘simple’ tech solution: Newspaper.coms, the Associated Press, and Google need to get together and agree on some ground rules. Newspapers would add metadata to links and external feeds indicating a URL for the original source material. AP would transmit this info with their wire stories, and Google would respect that metadata when crunching their Google News algorithms. This would allow everyone to link and excerpt to their heart’s content – but it would NOT reward aggregators with improved search engine rankings built on top of someone else’s content. It would basically be a sort of reverse ‘nofollow’ tag for news stories – one that gives credit where it is due.
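To make that concrete, here is a minimal sketch in Python of how such attribution metadata might flow from the originating paper, through a wire feed, to a search engine. The original-source element, the function names, and the example URLs are all hypothetical; this is just one way the reverse-nofollow idea could be expressed, not an actual AP or Google format.

```python
# A minimal sketch of the 'reverse nofollow' idea, using only the standard
# library. The <original-source> element and the ranking rule below are
# hypothetical illustrations, not a real AP or Google News format.
import xml.etree.ElementTree as ET


def wire_entry(title, aggregator_url, original_url):
    """Build a wire-style feed item that carries a pointer back to the
    paper that broke the story alongside the aggregator's own link."""
    entry = ET.Element("entry")
    ET.SubElement(entry, "title").text = title
    # The link readers would click on the aggregator's site.
    ET.SubElement(entry, "link", href=aggregator_url)
    # Hypothetical metadata naming the URL of the original reporting.
    ET.SubElement(entry, "original-source", href=original_url)
    return entry


def ranking_credit(entry):
    """What a search engine could do with that metadata: hand the ranking
    credit to the original source rather than to the republished copy."""
    source = entry.find("original-source")
    link = entry.find("link")
    # Fall back to the ordinary link when no attribution is present.
    return (source if source is not None else link).get("href")


if __name__ == "__main__":
    item = wire_entry(
        "Made-up Newton breaking news story",
        "https://www.boston.com/yourtown/newton/example-story",  # the copy
        "https://www.wickedlocal.com/newton/example-story",      # the original
    )
    # The credit goes to the WickedLocal URL, not the Boston.com copy.
    print(ranking_credit(item))
```

The point of the sketch is simply that the attribution travels with the story: the aggregator still gets the link and the reader, but the search engine knows which URL did the original reporting.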

AP already has a partnership with Google that is aimed at reducing duplicate wire stories in the index – would it really be too difficult to make this same concept serve individual newspapers? Technologically, probably not; politically, who knows?

UPDATE: Some more commentary from the blogosphere:
GateHouse Lawsuit vs. New York Times Co. has Dire Implications
A Danger to Journalism
GateHouse: O hai, internetz — we r fail
Gatehouse sues NYTCo over aggregation: But do they have a point?
Globe vs. Gatehouse Part I
Peeking inside Pandora’s Box
GateHouse v NY Times Co.: Not So Simple After All
