Archive for April, 2012

Title: 5 Key Link Building Tactics in 2012 — A SPN Exclusive Article

Search Engine News, Search Engine Optimization

Article Source: Link Exchange Service Network

As the Web evolves, so does the sophistication of the top search engines and the algorithms they use to identify relevant content. Relevant is the key word here, meaning valuable to the searcher and most likely to be exactly what they are searching for. Rolling into 2012, search engines such as Yahoo have come to understand that keyword matching is only one part of the relevancy formula, and that the value and authority of a web page is becoming a larger determinant of search relevancy.

That brings us to your link building approach as a webmaster, small business owner, or online entrepreneur as we begin 2012. The following five link building suggestions will get you the backlinks, and the relevance, you will certainly need this year:

1. Multiple, Applicable Links

All of us want a million backlinks to our blogs, and trust me, with a little bit of cash you can get them. There are plenty of link building deals at Fiverr.com and sites such as The Hoist that will sell you literally hundreds of links. The trouble is, Google is looking for quality, which means the pages linking to yours must be complementary: in the same market space and in some way related to your content. Your landing page should be an offshoot of the ideas presented on the linking site. If that site is about gourmet foods and your page is aimed at unwed parents, the link will likely not help you much and can in fact harm your Google rank.

So just how can you get them?

* Get in touch with site owners who want to swap links, usually via a resources page. This is still a strong technique and has been around for years.

* Leave blog comments on posts covering the same subject matter as yours, especially on authority sites (more on this later).

* Distribute your original articles for publication on news and blog sites that are a natural match for yours.

* Post relevant links to your website on social media portals like Facebook, Twitter and LinkedIn. The best are links to blog posts or landing pages tied to Twitter trends, hashtag lists and LinkedIn discussion groups.

2. Neighbors Like You

The world of online search is going local and is placing more and more emphasis on local sites. Even if your company is not restricted to where you reside (like an online SEO firm that can work for clients globally), it is more useful than ever for local companies to be linking to yours. Network in your Chamber of Commerce and encourage link exchanges, keeping in mind that the more similar those companies are to yours (and the more local they are), the better off you are. If your SEO firm is in San Diego, encourage your local partners to use anchor text like “SEO San Diego” so your link is pertinent to their audience.

3. Submit Your Work

There are two primary kinds of authority sites on the web: big, highly trafficked, content-specific sites (think Inc.com or the Wall Street Journal), and educational and governmental sites. Search engines still give a lot of credit to university .edu sites and .gov pages, especially federal government pages (who knew?).

The main ways to obtain backlinks from these sites are to provide valuable, one-of-a-kind content in the form of articles, or to post comments on content already published there. If you’re lucky enough to submit great content and get published on a high-authority content site, that is worth its weight in gold to your link building efforts.

Don’t waste your time with auto-posting, scraping applications that write nonsense comments for you. They’ll post, but in most instances will be deleted by the site’s moderators. You can browse the web and find lists of high-PR .edu and .gov websites (ones with a PR of 4, 5, 6, 7) that allow forum postings with backlinks. Buyer beware: regularly verify those domains using a tool such as PR Checker.

4. Anchor Text Strategy

This is the element you can most easily control using article marketing tactics, where you submit content for syndication and publication on high-ranking sites. The actual words in your hyperlink, called anchor text, should be pertinent both to the page featuring your work and to the destination page the link points to. That means if your anchor text is “click here,” you’re giving up half the benefit of this linking technique.

So how does this work? Going back to our example of the gourmet cooking site: you publish an article on that blog about your favorite ways to use cooking sherry, and throughout the body of the article you link the words “red wine reduction recipe” to, you guessed it, a page on your site that contains exactly that. Give that guy a prize!
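To make this auditable, here is a minimal Python sketch (standard library only) that pulls the anchor text out of every link on a page and flags generic phrases such as “click here.” The sample HTML and the list of generic phrases are just assumptions for the demo:

```python
from html.parser import HTMLParser

GENERIC = {"click here", "here", "read more", "this link"}  # assumed list

class AnchorAudit(HTMLParser):
    """Collect (href, anchor text) pairs so generic anchors can be flagged."""
    def __init__(self):
        super().__init__()
        self.href = None
        self.text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.href is not None:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.href is not None:
            anchor = " ".join("".join(self.text).split())
            self.links.append((self.href, anchor))
            self.href = None

# Hypothetical page fragment for illustration.
html = ('<p>Try my <a href="/red-wine-reduction">red wine reduction recipe</a>'
        ' or <a href="/about">click here</a>.</p>')
audit = AnchorAudit()
audit.feed(html)
for href, anchor in audit.links:
    flag = "GENERIC - rewrite me" if anchor.lower() in GENERIC else "ok"
    print(f"{href}: {anchor!r} [{flag}]")
```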

5. Landing Pages

We just talked about how your landing pages should be pertinent to the link. You should also be linking to various pages on your website, commonly referred to as deep links, and not only to your homepage. This is easy if your website is structured properly, with a product page for each product and an article for each idea you might link to. It tells the search engines that your domain is content-rich across many pages, and that is vital to earning the relevance trophy.
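As a quick way to see where you stand, here is a short Python sketch that measures what share of a backlink list points at deep pages rather than the homepage; the URLs are made up for the example:

```python
from urllib.parse import urlparse

# Hypothetical list of inbound link targets on your domain.
backlinks = [
    "http://example.com/",
    "http://example.com/products/garlic-press",
    "http://example.com/articles/red-wine-reduction",
    "http://example.com/",
]

# A link is "deep" if it points anywhere other than the homepage.
deep = [u for u in backlinks if urlparse(u).path not in ("", "/")]
print(f"{len(deep)}/{len(backlinks)} backlinks are deep links "
      f"({len(deep) / len(backlinks):.0%})")
```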

Is doing these things simple? Nope, but that’s exactly the point. You are doing it the right way, and that practice is what will reward link builders in 2012.


Karl Walinskas is the CEO of Smart Company Growth, a business development firm that helps small to mid-size professional service firms build competitive advantage in an online world of sameness. His Smart Blog covers leadership, business communication, sales & service, online marketing and virtual business, and was recently named by Buyerzone as one of the Top 20 Business Blogs of 2011. He is the author of Getting Connected Through Exceptional Leadership, has been a featured expert for Inc.com with articles published in Selling Power, America Online, and SiteProNews to name a few, and blogs frequently for Rank My Website, a top San Diego SEO Services firm.


Title: Matt Cutts Talks About How Google Handles Ajax

Search Engine News, Search Engine Optimization

Article Source: Link Exchange Service Network

Google’s Matt Cutts put up a new Webmaster Help video, discussing how Google deals with Ajax. He takes on the following user-submitted question:

How effective is Google now at handling content supplied via Ajax? Is this likely to improve in the future?

“Well, let me take Ajax, which is Asynchronous Javascript, and make it just Javascript for the time being,” says Cutts. “Google is getting more effective over time, so we actually have the ability not just to scan in strings of Javascript to look for URLs, but to actually process some of the Javascript. And so that can help us improve our crawl coverage quite a bit, especially if people use Javascript to help with navigation or drop-downs or those kinds of things. So Asynchronous Javascript is a little bit more complicated, and that’s maybe further down the road, but the common case is Javascript.”

“And we’re getting better, and we’re continuing to improve how well we’re able to process Javascript,” he continues. “In fact, let me just take a little bit of time and mention, if you block Javascript or CSS in your robots.txt, where Googlebot can’t crawl it, I would change that. I would recommend making it so that Googlebot can crawl the Javascript and can crawl the CSS, because that makes it a lot easier for us to figure out what’s going on if we’re processing the Javascript or if we’re seeing and able to process and get a better idea of what the page is like.”

As a matter of fact, Cutts actually put out a separate video about this last month, in which he said, “If you block Googlebot from crawling javascript or CSS, please take a few minutes and take that out of the robots.txt and let us crawl the javascript. Let us crawl the CSS, and get a better idea of what’s going on on the page.”

“So I absolutely would recommend trying to check through your robots.txt, and if you have disallow slash Javascript, or star JS, or star CSS, go ahead and remove that, because that helps Googlebot get a better idea of what’s going on on the page,” he reiterates in the new video.
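As a quick self-check along the lines Cutts describes, Python’s standard-library robots.txt parser can tell you whether Googlebot is allowed to fetch your script and stylesheet URLs. The rules and URLs below are hypothetical, and note that this simple parser matches plain path prefixes rather than wildcard patterns:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind Cutts warns about: it blocks
# the directories that hold the site's Javascript and CSS.
rules = """User-agent: *
Disallow: /js/
Disallow: /css/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

for url in ("http://example.com/js/menu.js",
            "http://example.com/css/site.css",
            "http://example.com/index.html"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "BLOCKED")
```

If the first two lines print BLOCKED, removing the corresponding Disallow rules is exactly the change Cutts is asking for.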

In another new video, Cutts talks about why Google won’t remove pages from its index at your request.


Title: DaniWeb Hit By Google Again, Following Multiple Panda Recoveries

Search Engine News, Search Engine Optimization

Article Source: Link Exchange Service Network

IT discussion community site DaniWeb has had a rather hectic year or so. Hit by Google’s Panda update last year, the site has seen a series of ups and downs: hard hits from Google’s algorithm followed by tremendous recoveries. Now the site has been hit yet again, and Founder/CEO Dani Horowitz is telling us what’s going on this time. She’s not sure it’s the Panda update, though the whole thing happens to coincide with a recent iteration of it.

Have you seen traffic increase or decrease since the latest known Panda update? Let us know in the comments.

DaniWeb is one of those sites that, in the heart of the mad Panda scramble of 2011, seemed to be unjustly hit. It’s a forum with a solid user base, where people can discuss issues related to hardware, software, software development, web development, Internet marketing, etc. It’s the kind of site that often provides just the right answer for a troubled searcher.

We did an interview with Horowitz last year, in which she told us about some of the things she was doing to help the site recover from the Panda trauma. You can follow the link for more about that.

That was in May. In July, Horowitz claimed DaniWeb had made a 110% recovery from Google. In September, Panda appeared to have slapped the site again, causing it to lose over half of its traffic. Shortly thereafter, in early October, Horowitz announced that the site had managed to recover yet again. “Clearly Google admitted they screwed up with us,” she said at the time.

Now, six months later, DaniWeb has been hit yet again, but this time, Horowitz is taking at least part of the blame.

@DaniWeb (Dani of DaniWeb.com): “PLEASE PLEASE PLEASE RETWEET … I NEED HELP :( http://t.co/asnxaqAB”

The tweet links to this Google Groups forum discussion, where Horowitz describes her new issues in great depth, also noting that the site had eventually made a 130% recovery from its pre-Panda numbers. DaniWeb rolled out a new platform, coincidentally at the same time a Panda update was made in March, and she says the site’s been going downhill ever since.

Horowitz tells WebProNews she’s been “hibernating in a cave the past few months coding the new version of the site.”

“I do not believe that we were hit by Panda,” she says in the forum post. “Unlike Panda, which was an instantaneous 50-60% drop in traffic literally overnight, we’ve instead had a steady decrease in traffic every day ever since our launch. At this point, we’re down about 45%. We are using 301 redirects, but our site’s URL structure *DID* change. While we’re on an entirely new platform, the actual content is entirely the same, and there is a 1-to-1 relationship between each page in the old system and the new system (all being 301-redirected).”

Later in the post, she says, “This mess is partially my fault, I will have to admit. As mentioned, we changed our URL structure, and I am 301 redirecting the old URLs to the new URLs. However, we also changed our URL structure last February, right after Panda originally hit. I have to admit that when we first went live, I completely forgot about that. While I was 301 redirecting the old version to the new, I was *NOT* redirecting the old old version to the new for about 72 hours, until I remembered! However, by that time, it was too late, and we ended up with over 500,000 404 errors in Google Webmaster Tools. That has been fixed for quite a few weeks already though.”
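Her “old old” URL problem is, at bottom, a missing-mapping and redirect-chain problem. Here is a minimal Python sketch of the general fix: compose the two generations of redirect maps so every legacy URL 301s straight to its final destination. The old URL paths are invented for illustration; only the final URL comes from her post:

```python
# Hypothetical map from the pre-2011 ("old old") URLs to the 2011 structure.
redirects_2011 = {
    "/forums/thread420572.html": "/web-development/php/threads/420572",
}
# Hypothetical map from the 2011 structure to the 2012 relaunch URLs.
redirects_2012 = {
    "/web-development/php/threads/420572":
        "/web-development/php/threads/420572/php-apotrophe-issue",
}

def resolve(url, *maps):
    """Follow a URL through each redirect map in turn."""
    for redirect_map in maps:
        url = redirect_map.get(url, url)
    return url

# Collapse the chain: each legacy URL should issue a single 301
# to its final destination instead of hopping through 2011 URLs.
collapsed = {old: resolve(old, redirects_2011, redirects_2012)
             for old in redirects_2011}
print(collapsed)
```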

In between those two quotes, she details the observations in Google’s behavior with her site she’s not happy with. The first one:

If you visit a page such as: http://www.daniweb.com/web-development/php/17 you will see that the article titles have URLs in the format http://www.daniweb.com/web-development/php/threads/420572/php-apotrophe-issue … However, you can also click on the timestamp of the last post to jump to the last post in the article (a url such as http://www.daniweb.com/posts/jump/1794174)

The /posts/jump/ URLs will 301 redirect you to the full article pages. For example, in this specific example, to http://www.daniweb.com/web-development/php/threads/420572/php-apotrophe-issue/1#post1794174 (the first page of the thread, with an anchor to the specific post).

The page specifies rel=”canonical” pointing to http://www.daniweb.com/web-development/php/threads/420572/php-apotrophe-issue

Why then, does the /posts/jump/ URL show up in the Google search results instead of my preferred URL?? Not only am I doing a 301 redirect away from the /posts/jump/ format, but I am also specifying a rel=”canonical” of my preferred URL.

“I don’t like this at all for a few reasons,” she continues. “Firstly, the breadcrumb trail doesn’t show up in the SERPS. Secondly, there is no reason for Google to be sending everyone to shortened URLs, because now nearly every visitor coming in from Google has to go through a 301 redirect before seeing any content, which causes an unnecessary delay in page load time. Thirdly, the /posts/jump/ URLs all tack on a #post123 anchor to the end, meaning that everyone is being instantaneously jumped halfway down the page to a specific post, instead of getting the complete picture, where they can start reading from the beginning. This certainly isn’t desirable behavior!”
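Both mechanisms she mentions, the 301 and the rel="canonical" tag, can be verified from the outside. Here is a rough sketch in Python using the third-party requests library; the regex-based tag extraction is deliberately simplistic, and the URL is the one quoted above:

```python
import re
import requests  # third-party: pip install requests

def check_jump_url(jump_url):
    # Fetch without following redirects to confirm the 301 and its target.
    r = requests.get(jump_url, allow_redirects=False, timeout=10)
    print(jump_url, "->", r.status_code, r.headers.get("Location"))

    # Fetch the destination and look for its rel="canonical" tag
    # (assumes an absolute Location header and rel-before-href ordering).
    dest = requests.get(r.headers["Location"], timeout=10)
    match = re.search(r'<link[^>]+rel="canonical"[^>]+href="([^"]+)"',
                      dest.text)
    print("canonical:", match.group(1) if match else "none found")

check_jump_url("http://www.daniweb.com/posts/jump/1794174")
```

If Google still surfaces the /posts/jump/ URL despite a clean 301 and a matching canonical, the plumbing itself is not the problem, which is exactly her complaint.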

You can read the post for further elaboration.

Dani’s second observation:

After skimming the first 40 or 50 pages of the Google search results for site:daniweb.com, it’s essentially entirely a mix of two types of URLs. Those in the /posts/jump/ format, and links to member profiles. Essentially, two types of pages which are both not what I would consider putting our best foot forward.

We currently have nearly one million members, and therefore nearly one million member profiles. However, we choose to use the rel=”noindex” meta tag directive on about 850,000 of the member profiles, only allowing those by good contributors to be indexed. I think it’s a happy medium between allowing our good contributors to have their profiles found in Google by prospective employers and clients searching for their name, and not having one million member profiles saturate our search results. We allow just under 100,000 of our 950,000+ member profiles to be indexed.

However, as mentioned, it just seems as if member profiles are being ranked too high up and are just way too abundant when doing a site:daniweb.com, overshadowing our content. This was not the case before the relaunch, and nothing changed in terms of our noindex approach.

Based on prior experience, the quality of the results when I do a site:daniweb.com has a direct correlation to whether Google has a strong grasp of our navigation structure and is indexing our site the way that I want them to. I noticed when I was going through my Panda ordeal that, at the beginning, doing a site: query gave very random results, listing our non-important pages first and really giving very messy, non-quality results. Towards the end of our recovery, the results were really high quality, with our best content being shown on the first chunk of pages.
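For reference, the selective indexing she describes is typically implemented by emitting a meta robots tag per profile page. Here is a minimal sketch, assuming a hypothetical post_count threshold as the “good contributor” test; her actual criteria aren’t stated:

```python
def profile_robots_meta(member):
    """Return the meta robots tag for a member profile page.

    Profiles from active contributors are indexable; the long tail
    of empty profiles is kept out of the index."""
    if member.get("post_count", 0) >= 10:  # assumed threshold
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

print(profile_robots_meta({"name": "dani", "post_count": 5000}))
print(profile_robots_meta({"name": "lurker", "post_count": 0}))
```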

The bottom line, it seems, according to Horowitz, is that Google has “no grasp on the structure” of the site. Once again, you can read her post in its entirety for further details and explanation from Horowitz herself.

Until the most recent issue, DaniWeb was clearly having a lot of success in the post-Panda world. When asked what she attributes this success to, Horowitz tells WebProNews, “We were at an all-time high in terms of traffic, and there was still constant growth. I definitely don’t think it was just the Panda recovery but all of the other positive SEO changes I made when we were being Pandalized that contributed to our post-Panda success.”

It goes to show that Panda is just one of the many signals Google uses (over 200, in fact).

“I’ve already documented just about everything that I did along the way, so there’s not much that I can think of adding,” she says. You can go back through the other links in these articles for more discussion with Dani about all of that. “At the end of the day, I think it just comes down to Google having a really good grasp of your entire site structure.”

“Taking yet another massive hit was completely unexpected for us,” she says. “We launched at the exact same time as Panda rolled out (completely not planned), and therefore I don’t know which to attribute our latest round of issues to. It might be Panda, it might be issues with our new version, it might be a little of both, or it might be new signals that Google is now factoring into their algorithm.”

Google has, of course, been providing monthly updates on many of the new changes it has been making. You can see the list for March here.

There’s no question that search engines, including Google, are putting a lot more emphasis on social media these days. We asked Horowitz if she believes social media played a significant role in DaniWeb’s search visibility.

“Absolutely,” she says. “I can definitely see the value in Twitter and Facebook likes, recommendations, and mentions. I think it just all goes into building a solid brand on the web. I forget where I read somewhere recently about how Google is favoring big brands. I don’t think you need to be a fortune 500 company to have earned a reputation for yourself on the web.”

“While I personally still haven’t quite found the value in Google+, I’m not going to discount it for its part in building brand equity in the eyes of Google, either.”

When asked if Google’s “Search Plus Your World” has been a positive thing for Daniweb, and/or the Google user experience (it’s received a lot of criticism), she says, “I happen to be a fan of personalized search results. Am I the only one?”

Do you think Google’s results are better now in the post-Panda, “Search Plus Your World” era? Let us know what you think in the comments.
