Trulia Caught Cloaking Red Handed

June 26th, 2008

Sorry for the slow load & lack of narrative on the last one. The video can still be found here (but be patient with it). Here’s a pictorial of what’s going on:

#1 – Go to

#2 – Click on Classifieds > Real Estate For Sale

#3 – It takes you to Trulia’s Seattle Weekly Partner Page

#4 – Using the user-agent switcher extension for FF, change your user agent to Googlebot

#5 – Refresh the page. It now reloads as the page Trulia optimizes for (currently #3 for “Seattle Real Estate”)
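The manual check in the steps above can also be scripted. Here is a minimal sketch (the function names and User-Agent strings are illustrative, not from the post) that flags a URL whose final destination changes when the User-Agent claims to be Googlebot:

```python
# Illustrative cloak check: fetch the same URL with two User-Agent headers
# and compare the final (post-redirect) addresses. Names and UA strings
# here are examples, not taken from the original post.
import urllib.request
from urllib.parse import urlsplit

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; rv:115.0) Gecko/20100101 Firefox/115.0"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def final_url(url, user_agent):
    """Fetch url with the given User-Agent; urllib follows 301/302
    redirects automatically, so geturl() is the post-redirect address."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.geturl()

def same_destination(url_a, url_b):
    """Treat two URLs as the same page if host (case-insensitive) and
    path (ignoring a trailing slash) match."""
    a, b = urlsplit(url_a), urlsplit(url_b)
    return (a.netloc.lower(), a.path.rstrip("/")) == \
           (b.netloc.lower(), b.path.rstrip("/"))

def looks_ua_cloaked(url):
    """True if a browser and Googlebot end up on different pages."""
    return not same_destination(final_url(url, BROWSER_UA),
                                final_url(url, GOOGLEBOT_UA))
```

A redirect that fires only for the crawler UA, as in step #5, would make `looks_ua_cloaked` return True.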

What are the implications of this? Basically that Trulia isn’t playing it as clean as their PR guys would tell you. They’re intentionally screwing their partners (they’re doing this to Parade, as well) out of links back, and they’re picking up some nice one-way sitewides as a result. The results they’re getting probably aren’t worth the risk of a cloaked 301, but they’re sure doing it. Oh…they’re also doing a pretty stupid, easily detectable cloak.


36 Responses to “Trulia Caught Cloaking Red Handed”

  1. Says:

    [...] do I bring this up? Because Eric Bramlett just dropped a video bombshell on Trulia. (Fair warning, the video takes a while to load and may not explain easily to the typical viewer [...]

  2. (4 comments.) Says:

    I checked two more of their partner pages and they are cloaking those as well:

  3. (1 comments.) Says:

    Thanks for the ammunition. I would encourage all to report this clear violation of Google’s policies in their Google Webmaster Tools account; see Google’s blog for more info.

    Maybe we can get t-r-u-l-i-a delisted.

  4. Says:

    [...] Bramlett, a local SEO expert, discovered that Trulia, a popular real estate search site, has been cloaking their partner pages. [...]

  5. dave g Says:

    do a google search for the URL “” and take a look at the URL Google has indexed.

  6. (4 comments.) Says:

    Oh my - Realtors will be pissed. This is not their first brush with shady SEO. Back in March they were no-following their own partners (you know, the Realtors and Brokers that gave them listings!). They are up to other stuff as well, but I think the cloaking pretty much sums up their SEO ethics.

  7. Andy Says:

    I think it is lame that Trulia gets all the page views from partner sites. It makes no sense to me how this benefits the partner. No idea why they don’t provide a snippet of JS to the provider to run listings. This basically means Trulia is looking to partners to get their pageviews up.

    Trulia has tons of shady SEO.

  8. (4 comments.) Says:

    I know all you agent types hate Trulia, but I’m not seeing what’s so evil here. The page on Seattle Weekly is Trulia’s content, and it is a duplication of the Seattle page on their site. The 301 prevents Google from indexing the duplication and lets Google know where the original version exists. It’s pretty much a win-win: Google doesn’t have to index yet another lame copy of one of Trulia’s pages, and Trulia gets the credit for their content. Sounds like smart SEO to me. (Although they are retarded for doing it via UA)

  9. (31 comments.) Says:

    Google doesn’t have to index the page if they simply noindex,nofollow it. I don’t think it’s necessarily evil, but it indicates they really don’t have any respect for the Google TOS. The UA cloak was stupid, and the benefit they’re getting from the cloak isn’t worth the risk, IMO.
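For reference, the noindex,nofollow alternative mentioned here is a single robots meta tag in the head of the duplicate page (a generic example, not taken from either site):

```html
<!-- Ask engines not to index this copy or follow its links -->
<meta name="robots" content="noindex,nofollow">
```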

  10. How Much Traffic Will Sphinn Drive? | Bramblog Says:

    [...] page on sphinn looks like it will drive ~200 visitors, based on the Trulia Cloaking [...]

  11. dan Says:

    I am also of the opinion that they are more incompetent than sinister. I think they are trying to avoid dupe penalties rather than obtain SEO benefits. As others have pointed out, they should have done this with bot control or scripting rather than cloaking.

  12. (4 comments.) Says:

    I’m with Dan - when you add up their various tactics, it boils down to incompetence. They seem to have a loose grasp of SEO, but not the skill set to implement it properly. They simply need to go out and hire a real SEO firm. If the intention was to avoid duplicate content, there were a number of other methods they could have used, cloaking not being one of them. Seems like a lot of effort and risk for such a simple task.

  13. (31 comments.) Says:

    I seriously doubt it’s incompetence. They’ve pulled off a fantastic widgetbait campaign, their internal linking structure is solid, and they’re refusing to link to any competitors. They’ve purchased some great stealth links, and put together deals w/ template providers that provide keyword-rich links to their specific pages. They’re killing it in the SERPs for broad terms, and for long tails. I would say that their SEO team is extremely competent.

    However, I think they’re a little overzealous. Anyone with the knowledge to put together a UA cloak knows how to simply “noindex,nofollow” a page. You learn the noindex,nofollow long before you even know what a UA or IP cloak is. So I don’t buy the “avoiding dupe content” excuse…it is extremely weak. It looks to me like their SEO team wants to squeeze every last bit of PR out of everything. I think they’re cloaking it to make sure that the links redirect to the proper pages, and to provide them with one-way links. It’s definitely not worth the risk, but I don’t think their SEO team thought it was much of a risk.

    Any team that can orchestrate the coup in the search engines that Trulia has is not incompetent (double neg – I know.)

  14. (4 comments.) Says:

    True - and they did say that putting nofollows on Trulia Voices links was a side effect of curbing spam (completely ridiculous - these were links to the agents that populated the whole network). That was a blatant lie - I believe they were trying to curb PR bleed and create a more powerful internal link structure. My only point is that if this was an effort to avoid dupe content, they are definitely incompetent. I agree, no novice could have pulled off the UA cloak, but no pro would have done it that way. Either way, their SEO is riddled with incompetence - there are tons of other ways they could have boosted their metro landing pages in the engines that would not have tipped their hand to their tactics. I would consider them less incompetent if they could cover their tracks better or at least come up with better excuses.

  15. dan Says:

    Eric, you’ve got to give up the cake! You can’t eat it and have it too.

    Either Trulia is attempting to avoid dupe content penalties in a dumb way or as you state “they’re also doing a pretty stupid, easily detectable cloak.” So, are they dumb or stupid?

    Why do you even assume the SEO folks are doing the cloak? I have often seen developers throw in cloaks because they think it is clever to test on something not observable to casual web visitors.

  16. (31 comments.) Says:

    The cloak isn’t being used to avoid dupe content penalties. That’s just the really bad PR spin that they put on it.

    I assume the SEO team implemented the cloak because there is no other reason for it (than SEO). They used a stupid TYPE of cloak - they should have used a less detectable cloak (like an IP cloak), and no one would have found it.
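For illustration, a UA cloak of the kind being described is only a few lines of server logic. This is a made-up sketch (the target URL and handler are hypothetical, not Trulia’s actual code):

```python
# Hypothetical sketch of a user-agent cloak: the partner page issues a
# permanent redirect only when the User-Agent claims to be Googlebot,
# while human visitors get the page as usual.
CANONICAL = "http://www.trulia.com/WA/Seattle/"  # illustrative target

def handle_request(user_agent):
    """Return (status, location) for the partner page, varying by UA."""
    if "googlebot" in user_agent.lower():
        return 301, CANONICAL   # crawler: cloaked permanent redirect
    return 200, None            # human: serve the partner page
```

An IP cloak would check the requesting address against known crawler ranges instead of trusting the self-reported UA header, which is why it is much harder to catch from a browser.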

  17. (4 comments.) Says:

    They are far from incompetent. The method they chose is absolutely by far the best approach to manage large scale duplication caused by partner content distribution. If you honestly think noindex/nofollow/robots.txt is the golden ticket, you have never dealt with a project of that magnitude. Maintaining the cleanest possible collection of indexed content in G’s database contributes greatly to improving the overall performance of a large site. And 301s are the best possible tool for getting that job done.

    But let’s look at the bigger picture, which is the intent of Google’s guidelines. Those guidelines are in place to govern the quality of experience for people searching Google. Evil cloaking involves intentionally feeding Google a page that you want to get indexed and then diverting traffic (Google users clicking on a search result). In Google’s opinion, that’s bad because you manipulated their users’ experience.

    That’s not even close to what Trulia has done. Trulia told Google that they didn’t want them to index the particular page because it was a copy of a page that existed somewhere else. They used the best possible method to give that message to Google. And at the same time, they let them know where the true, original, authoritative page existed. They weren’t redirecting Google visitors from the Seattle Weekly page to their page. And no one clicking on the Seattle Weekly link was deceived in any way.

    Not only is that not even close to being cloaking, it also happens to be a practice that is so common, I’d argue that the quality of search would diminish greatly without it. If you spend any amount of time surfing sites like Amazon, Ebay, and even Yahoo using search engine UA’s you will find all kinds of subtle differences between what Google sees and what humans see. While some of what you will find may provide some secondary SEO improvements, the majority of it happens as a result of trying to clean up indexing issues.

    The reality is, from a search engine perspective, Trulia has built a large, authoritative site that is filled with content hundreds of other large reputable sites want to use. And the fact that they have gained that status gives them higher algorithmic cred than other sites. That cred is why Trulia is kicking your ass, and they will continue to kick your ass until you are able to create a site where hundreds of people line up wanting to syndicate your content. And when/if that happens, I would hope that you would consider using conditional 301s to help prevent duplicate content and, at the same time, give your site the same kind of SEO benefits Trulia is getting. :)

  18. (4 comments.) Says:

    A user agent cloak is just one step below faking PageRank in my book - something some loser in their mama’s basement does when they think they are being clever. The best solution would have been a 301 - but that is beside the point.

    I also agree that they did this for the SEO - there really isn’t any other plausible explanation. Now for the completely unbelievable part - you out Trulia on this tactic, others outed them for invisible text (how lame…) and the nofollows, and yet agents are still facilitating their own SERP demise by adding content to Trulia communities and putting links and widgets on their pages back to Trulia. It’s hard for me to feel sorry for people crying poor mouth over Trulia jumping above them in the SERPs when they are part of the cause. It’s pretty cut and dry for me - if you don’t want som

  19. (31 comments.) Says:

    If you honestly think noindex/nofollow/robots.txt is the golden ticket, you have never dealt with a project of that magnitude.

    Absolutely. Can you please explain why a conditional redirect is superior to a noindex/nofollow?

  20. (31 comments.) Says:

    That cred is why Trulia is kicking your ass, and they will continue to kick your ass

    It’s pretty arguable whether or not Trulia is kicking my ass in the SERPs. Kicking the general real estate industry’s ass in the SERPs? Maybe.

  21. (4 comments.) Says:

    Yea, I meant “your” collectively. I think it’s now harder to find a real estate SERP that Trulia isn’t doing really well in than one that they are.

  22. (31 comments.) Says:

    Back to the subject, though. I was under the impression that serving different pages to users & bots was cut & dry cloaking, and against Google TOS. Even though the subject matter is similar, there’s a much simpler way to keep the index clean. Unless I’m missing something, I just don’t see how “noindex, nofollow,” or “noindex, follow” wouldn’t scale, while a cloaked 301 would. The only benefit I see to a cloaked 301 is that the juice passes through.

  23. (4 comments.) Says:

    Can you please explain why a conditional redirect is superior to a noindex/nofollow?

    The biggest issue has to do with the URLs staying in the database. So when Google collects URLs to a page as it crawls, but then can’t index that page because of robots.txt or noindex, it doesn’t flush the URLs it finds. If there are enough links pointing to those non-indexed URLs, Google will even on occasion start returning them as search results.

    Using a 301 fixes all of that. The URLs get purged, and Google is better able to understand what the original source is. And it isn’t just something that happens in relation to syndicated content. The same applies to internal structures where very similar content can be viewed from multiple URLs (things like product sort functions). You could add nofollow to those links, or use wildcard robots.txt, but that isn’t the best approach for purging crap URLs or strengthening internal link structures. The better approach is to pick one master page for Google and then 301 all the variations to that page. Since those pages do have value for a human visitor, you have to make the 301 conditional.

    Here’s another quick example:

    Client has to move to a new domain. The current site is and has been indexed in G for quite some time. The new site on the new domain is functional for humans, but not quite ready to be merged, so noindex,follow is applied to everything on the new domain because they can’t postpone the press release. The release hits the wires and the new domain gets a lot of press (and links). But Google can’t index the pages people are linking to, so it just hangs on to a bunch of URLs with no title and description. Now, all of a sudden, the original site vanishes for their brand name, and the SERPs end up listing the URLs that aren’t supposed to be indexed instead. So now their brand-name SERP looks like shit and they are losing traffic because they “followed the rules” and left everything up to Google to figure out.

    Doing that is the kiss of death. The less decision making you leave up to Google, the better off you will be. In this particular case a conditional 301 would have been a much better solution, but the client was too afraid to take the proper steps to have the smoothest transition possible given the set of circumstances, because they’ve spent too much time listening to all the “evil cloaking” bullshit being published by people who aren’t even close to being qualified to know what is or isn’t cloaking, or more importantly, how Google might feel/react to it. (Again Eric, I’m being general here, not directing that at you personally.)

    To me, that’s sad. Large scale projects like that are probably 90% of the kind of work I do. And when you spend that much time doing that kind of work, you constantly see examples of how Google doesn’t always work like they say it should, or even worse, is completely incapable of figuring out things they claim are easy. So to me, making decisions about what might be the best way to address a particular problem based completely on FUD is just a ridiculous thing to do.
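The master-page consolidation described above can be sketched as follows (the URL scheme and the bot check are illustrative, not from any particular site):

```python
# Illustrative conditional 301: crawlers requesting a sort-parameter
# variant are redirected to one master URL; humans keep the sorted view.
from urllib.parse import urlsplit, urlunsplit

def master_url(url):
    """Drop the query string (e.g. ?sort=price) to get the master page."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def respond(url, user_agent):
    """Return (status, location): crawlers get a 301 to the master URL
    when the requested URL is a variant; everyone else gets a 200."""
    is_bot = "googlebot" in user_agent.lower()
    if is_bot and master_url(url) != url:
        return 301, master_url(url)
    return 200, None
```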

  24. (31 comments.) Says:

    Thank you for the explanation. Tell me if I’m wrong, but it sounds like it’s arguable whether or not the practice is within Google’s TOS, and the reason UA redirects are retarded is b/c they can more easily be found & outed.

  25. LAW Says:

    This is a pretty lame attempt to “out” Trulia’s SEO strategy. They have essentially set up a distribution platform where they have dozens of themed versions of the same page. They have to do *something* for Googlebot or they will get hit with a duplicate content penalty. Now, you get all upset because they don’t use “noindex nofollow”, but why would they do that? There is legitimate content on the linked page, and they “deserve” the link. All they are doing is theming the content in a consistent manner so as to help search crawlers consolidate similar pages.

    It’s not a perfect solution because there is a bit of content on the themed seattleweekly page that isn’t on the corresponding Trulia page and vice versa, but if Google’s purpose is to understand the structure of a set of pages, the Trulia system is the closest thing that can be done to showing Googlebot the real pages (which, again, would probably incur a dupe penalty). I would think that this is how Google would want you to do it if you asked them.

  26. (31 comments.) Says:

    They’ve pulled down the UA cloak on . Good comments from Matt Cutts

  27. Jon Says:

    Isn’t Seattle Weekly cloaking too? When I visit as Googlebot, I don’t even see the navigation bar that would take me to Classifieds. When I go there as a regular FF browser, I see it no problem. Are they guilty of the same thing that Trulia is guilty of?

  28. (31 comments.) Says:

    Weird…I looked at it and saw the same thing. When you check the source, all of the navigation is still there.

  29. (1 comments.) Says:

    Seattle Weekly is not cloaking. It is a JavaScript issue. Their JS handles the nav, and since the bots can’t run JavaScript, the nav doesn’t appear. If you disable JavaScript in your browser and reload their page, you will not see the nav either.

  30. (31 comments.) Says:

    Thanks, Braxton.

  31. (1 comments.) Says:

    I am far from understanding this stuff enough to catch these folks. I am glad to see that you are, and can point out to the world what they are doing. Cheating sites should not be allowed to get in the way of hard-working people.

  32. Says:

    [...] of you guys may remember Trulia was caught back in June for cloaking pages. This is obviously against Google’s TOS but I wouldn’t think that something that happened last [...]

  33. (1 comments.) Says:

    Nice, Eric. This is a great tutorial on cloaking as well as site investigations.

  34. Says:

    [...] in Google.  Techy folks had complained that the Trulia real estate college project was playing Cloaks and Daggers in some sort of mischievous gray hat SEO strategy game.  Who can blame these college kids from [...]

  35. Says:

    [...] 4th, 2009 — Eric Bramlett I wrote a post on my other blog last summer about Trulia cloaking in order to improve their search results.  This morning, I was pinged by Brad Carroll’s blog [...]

  36. Says:

    [...] surprise if they were penalized.  As Brad pointed out yesterday, Blackwell points out today, and I’ve pointed out in the past, Trulia doesn’t always respect the Google Webmaster Guidelines, and sometimes don’t [...]
