
DMOZ Shut Down

21 Mar

Last August I wrote a blog post about how attention merchants were sucking the value out of online publishing. In it I noted how the Yahoo! Directory disappeared & how even DMOZ saw a sharp drop in traffic & rankings over the past few years.

The concept of a neutral web is dead. In its place is agenda-driven media.

  • Politically charged misinformed snippets.
  • Ads cloaked as content.
  • Public relations propaganda.
  • Mostly correct (but politically insensitive) articles being “fact checked” where a minor detail is disputed to label the entire piece as not credible.

As the tech oligarchs broadly defund publishing, the publishers still need to eat. Aggregate information quality declines to make the numbers work. Companies which see their ad revenues slide 20%, 30% or 40% year after year can’t justify maintaining the labor-intensive yet unmonetized side projects.

There is Wikipedia, but it is not without bias, and beyond the value lost to that hidden bias, most of its remaining value flows through to the attention merchant / audience aggregation / content scraper platforms.

Last month DMOZ announced, without much fanfare, that it would close on March 14th. On March 17th the directory went offline.

A number of people have pushed to preserve & archive the DMOZ data. Some existing DMOZ editors are planning to launch a new directory under a different name, and as of the 17th they put up a copy at dmoztools.net. Jim Boykin scraped DMOZ & uploaded his own copy. A couple other versions of DMOZ have been published at OpenDirectoryProject.org & Freemoz.org.

DMOZ was not without criticism or controversy,

Although site policies suggest that an individual site should be submitted to only one category, as of October 2007, Topix.com, a news aggregation site operated by DMOZ founder Rich Skrenta, has more than 17,000 listings.

Early in the history of DMOZ, its staff gave representatives of selected companies, such as Rolling Stone or CNN, editing access in order to list individual pages from their websites. Links to individual CNN articles were added until 2004, but were entirely removed from the directory in January 2008 due to the content being outdated and not considered worth the effort to maintain.

but by-and-large it added value to the structure of the web.

As search has advanced (algorithmic evolution, economic power, influence over publishers, enhanced bundling of distribution & user tracking) general web directories haven’t been able to keep pace. Ultimately the web is a web of links & pages rather than a web of sites. Many great sites span multiple categories. Every large quality site has some misinformation on it. Every well-known interactive site has some great user contributions & user generated spam on it. Search engines have better signals about what pages are important & which pages have maintained importance over time. As search engines have improved link filtering algorithms & better incorporated user tracking in rankings, broad-based manual web directories had no chance.

The web of pages vs web of sites concept can be easily observed in how some of the early successful content platforms have broken down their broad-based content portals into a variety of niche sites.

When links were (roughly) all that mattered, leveraging a website’s link authority meant it was far more profitable for a large entity to keep publishing more content on the one main site. That is how eHow became the core of a multi-billion-dollar company.

Demand Media showed other publishers the way. And if the other existing sites were to stay competitive, they also had to water down content quality to make the numbers back out. The problem was that the resulting glut of content lowered ad rates. And the decline in ad rates was coupled with a shift away from a links-only view of search relevancy toward a model weighting link profiles against user engagement metrics.

Websites with lots of links, lots of thin content & terrible engagement metrics were hit.

Kristen Moore, vp of marketing for Demand Media, explained what drove the most egregious aspects of eHow’s editorial strategy: “There’s some not very bright people out there.”

eHow improved their site design, drastically reduced their ad density, removed millions of articles from their site, and waited. However nothing they did on that domain name was ever going to work. They dug too deep of a hole selling the growth story to pump a multi-billion-dollar valuation. And they generated so much animosity from journalists who felt overworked & underpaid that even when they did rank, journalists would typically prefer to link to anything but them.

The flip side of that story is the newspaper chains, which rushed to partner with Demand Media to build eHow-inspired sections on their sites.

Brands which enjoy the Google brand subsidy are also quite hip to work with Demand Media, which breathes new life into once retired content: “Sometimes Demand will even dust off old content that’s been published but is no longer live and repurpose it for a brand.”

As Facebook & Google grew more dominant in the online ad ecosystem they aggressively moved to suck in publisher content & shift advertiser spend onto their core properties. The rise of time spent on social sites only made it harder for websites to be sought-out destinations. Google also effectively cut off direct distribution by consolidating & de-monetizing the RSS reader space, then shutting down a project they easily could have let run.

As the web got more competitive, bloggers & niche publications which were deeply specialized were able to steal marketshare in key verticals by leveraging a differentiated editorial opinion.

Even if they couldn’t necessarily afford to build strong brands via advertising, they were worthy of a follow on some social media channels & perhaps an email subscription. And the best niche editorial remains worthy of a direct visit:

Everything about Techmeme and its lingering success seems to defy the contemporary wisdom of building a popular website. It publishes zero original reporting and is not a social network. It doesn’t have a mobile app or a newsletter or even much of a social presence beyond its Twitter account, which posts dry commodity news with zero flair for clickability.

As a workaround to the Panda hits, sites like eHow are now becoming collections of niche-focused sites (Cuteness.com, Techwalla.com, Sapling.com, Leaf.tv, etc will join Livestrong.com & eHow.com). It appears to be working so far…

…but they may only be 1 Panda update away from finding out the new model isn’t sustainable either.

About.com has done the same thing (TheSpruce.com, Verywell.com, Lifewire.com, TheBalance.com). Hundreds of millions of dollars are riding on the hope that as the algorithms keep getting more granular they won’t discover moving the content to niche brands wasn’t enough.

As content moves around, search engines with billions of dollars in revenue can recalibrate rankings for each page & adjust rankings based on user experience. Did an influential “how to” guide become irrelevant after a software or hardware update? If so, they can see it didn’t solve the user’s problem and rank a more recent document which reflects the current software or hardware. Is a problem easy to solve with a short snippet of content? If so, that can get scraped into the search results.

Web directories which are built around sites rather than pages have no chance of competing against the billions of Dollars of monthly search ads & the full cycle user tracking search companies like Google & Bing can do with their integrated search engines, ad networks, web browsers & operating systems.

Arguably in most cases the idea of neutral-based publishing no longer works on the modern web. The shill gets exclusive stories. The political polemic gets automatic retweets from those who identify. The content which lacks agenda probably lacks the economics to pay for ads & buy distribution unless people can tell the creator loves what they do so much it influences them enough to repeatedly visit & perhaps pay for access.


New gTLDs are Like Used Cars

10 Mar

There may be a couple exceptions which prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

Here is the short version…

Imagine registering a domain for $10, building a business on it, and then learning the renewal fee will increase to hundreds of $ a year.
— Elliot Silver (@DInvesting) March 7, 2017

And the long version…

Diminishing Returns

About a half-decade ago I wrote about how Google devalued domain names from an SEO perspective. Since then a number of leading “category killer” domains have been repeatedly recycled, from startup to acquisition to shutdown to PPC park page to “buy now for this once-in-a-lifetime opportunity,” in an endless water cycle.

The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords ads eat up the entire interface for the first screen full of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

Proprietary, Closed-Ecosystem Roach Motels

The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

Both Google & Facebook are promoting scams where they feed on desperate publishers, sucking a copy of the publisher’s content into being hosted by the tech monopoly platform du jour & sprinkling a share of the revenues back to the content sources.

They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

If you ignore how Google’s AMP double, triple, or quadruple counts visitors in Google Analytics the visit numbers look appealing.

But the flip side of those fake metrics is actual revenues do not flow.

My own experience with amp is greatly reduced ad revenue. @Topheratl admits that weather dot com may be an anomaly in having higher ad $.
— Marie Haynes (@Marie_Haynes) February 22, 2017

Facebook has the same sort of issues, frequently needing to restate various metrics while partners fly blind.

These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

And the dehumanized “algorithm” is not above politics & public relations.

Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story had written far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They were still able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

In some emerging markets Facebook effectively *is* the Internet.

The Decline of Exact Match Domains

Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

$3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Big keywords like [payday loans] in core trusted extensions are not missed. So if the 98% decline in price were an anomaly, at least one of them would have bid more in that auction.

Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

Not all domains have fallen quite that hard in price, but the further into the tail you go the less the domain acts as a memorable differentiator. If the barrier to entry increases, then the justification for spending a lot on a domain name as part of a go-to-market strategy makes less sense.

Brandable Names Also Lost Value

Arguably EMDs have lost more value than brandable domain names, but even brandable names have sharply slid.

If you go back a decade or two tech startups would secure their name (say Snap.com or Monster.com or such) & then try to build a business on it.

But in the current marketplace with there being many paths to market, some startups don’t even have a domain name at launch, but begin as iPhone or Android apps.

Now people try to create success on a good enough, but cheap domain name & then as success comes they buy a better domain name.

Jelly was recently acquired by Pinterest. Rather than buying jelly.com they were still using AskJelly.com for their core site & Jelly.co for their blog.

As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.
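Mechanically, all that upgrade requires is a permanent redirect carrying every old path over to the new name. A minimal sketch using Python's standard library (the target domain here is a placeholder, not anyone's real deployment):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical upgraded domain -- a placeholder for illustration.
NEW_DOMAIN = "https://example.com"

def redirect_location(new_domain: str, path: str) -> str:
    """Map a request path on the old domain to the same path on the new one."""
    return new_domain.rstrip("/") + path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # 301 = permanent redirect, which search engines honor
        self.send_header("Location", redirect_location(NEW_DOMAIN, self.path))
        self.end_headers()

# To actually serve: HTTPServer(("", 8080), RedirectHandler).serve_forever()
print(redirect_location(NEW_DOMAIN, "/blog/post"))  # https://example.com/blog/post
```

In practice this is usually a one-line web server or registrar setting rather than custom code, but the mechanics are the same.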

Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

This in turn means that rather than 10,000s of startups all chasing their core .com domain name off the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.

Money isn’t spent on the domain names until the project has already shown market fit.

One in a thousand startups spending $1 million on a name adds up to far less than one in three startups spending $100,000 up front.
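The back-of-the-envelope arithmetic behind that claim, with the odds and price tags as purely illustrative assumptions rather than market data:

```python
# Expected domain spend per startup under each model.
# The probabilities and prices are illustrative assumptions, not measurements.

upfront_model = (1 / 3) * 100_000        # ~1 in 3 startups buying a ~$100k name at launch
upgrade_model = (1 / 1_000) * 1_000_000  # ~1 in 1,000 later upgrading to a ~$1M name

print(f"upfront model: ~${upfront_model:,.0f} per startup")  # ~$33,333
print(f"upgrade model: ~${upgrade_model:,.0f} per startup")  # ~$1,000
```

Even with generous numbers for the upgrade path, expected spend per startup falls by an order of magnitude.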

New TLDs Undifferentiated, Risky & Overpriced

No Actual Marketing Being Done

Some of the companies which are registries for new TLDs talk up investing in marketing & differentiation for the new TLDs, but very few of them are doing much on the marketing front.

You may see their banner ads on domainer blogs & they may even pay for placement with some of the registries, but there isn’t much going on in terms of cultivating a stable ecosystem.

When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

As far as I know, none of that stuff exists.

In fact, what is prevalent is the exact opposite.

Greed-Based Anti-Marketing

So many of them are short sighted greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which potentially might not be complete garbage so they can juice it for a premium ask price in the 10s of thousands of dollars.

While searching on GoDaddy Auctions for a client project I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

If those prices had any sort of legitimate foundation then the person asking $30,000 for a .link would have bulk bought all the equivalent .net and .org names which are listed for cheaper prices.

But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd tier domains … Domainers were angry when the first 2 Uniregistry’s New gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved despite Frank saying that Uniregistry would not reserve any domains.

Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?


And that’s why these new TLDs are a zero.

Defaults Have Value

Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

Almost all the category leading businesses which dominate aggregate usage are on .com domains.

Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

Hosing The Masses…

A decade ago domainers were frustrated Verisign increased the price of .com domains in ~ 5% increments:

Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving

Those 5% or 10% shifts were considered “hosing the masses.”

Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full time employee – say $25,000 to $50,000 per year.

And yet, the very people who complained about Verisign’s benign price increases, monopolistic abuses & rent extraction are now pushing massive price hikes:

.Hosting and .juegos are going up from about $10-$20 retail to about $300. Other domains will also see price increases.

Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

in its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

If a person promises…

  • no hold backs of premium domains, then reserves 10s of thousands of domains
  • no price hikes for 5 years, then hikes prices
  • the eventual price hikes being in line with inflation, then hikes prices 3,000%

That’s 3 strikes and the batter is out.

Doing the Math

The claim that the new TLDs need more revenues to exist is untrue. Running an extension costs maybe $50,000 per year. If a registry operator wanted to build a vibrant & stable ecosystem, the first step would be dumping the concept of premium domains to encourage wide usage & adoption.

There are hundreds of these new TLD extensions and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & ccTLDs like .co.uk or .fr.

There’s no renewal price protection & there’s no need, especially as prices on the core TLDs have sharply come down.

Domain Pricing Trends

Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

However domains have lost value for many reasons:

  • declining SEO-related value due to the search results becoming over-run with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
  • broad market consolidation in key markets like travel, ecommerce, search & social
    • Google & Facebook are eating OVER 100% of online advertising growth – the rest of industry is shrinking in aggregate
    • are there any major news sites which haven’t struggled to monetize mobile?
    • there is a reason there are few great indy blogs compared to a decade ago
  • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.) “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
  • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
  • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
  • decline in PPC park page ad revenues
    • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested in click quality than click quantity
    • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
    • most web browsers have replaced web address bars with dual function search boxes, drastically reducing direct navigation traffic

All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

If the domain aftermarket is as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple years ago.

RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

As the barrier to entry increases, many of the legacy domains which could have one day been worth developing have lost much of their value.

And the picked over new TLDs are an even worse investment due to the near infinite downside potential of price hikes, registries outright folding, etc.

In the face of this declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries which spend next to nothing on marketing can’t understand why their great new namespaces went nowhere.

As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

Losing Faith in the Zimbabwe Dollar

The real losers are those who read what these domain registry operators wrote & trusted them.

Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

How does one justify a 3,000% price hike after stating “Our prices are fixed and only indexed to inflation after 5 years”?

Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

Frank Schilling warned about the dangers of lifting price controls:

The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

This agreement as written leaves the door open to exactly that type of scenario

He didn’t believe the practice to be poor.

Rather he felt he would have been made poorer, unless he was the person doing it:

It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

It is a highly nuanced position.

Imagine registering a domain for $10, building a business on it, and then learning the renewal fee will increase to hundreds of $ a year.
— Elliot Silver (@DInvesting) March 7, 2017


Are You Serious? GSS Is Shutting Down? What Are The Alternatives?

10 Mar

Google Site Search

For business holders and e-commerce owners, Google has some bad news. In an e-mail delivered to search customers, Google laid out in clear words that it is going to close down Google Site Search (GSS) by April 1, 2018, slowly but surely. And to begin that process, Google will cease sales of Site Search on April 1, 2017. So, should you be bothered at all? Let’s find out.

What is Google Site Search (GSS)? How is it different from Google Custom Search?

Specifically designed for business and e-commerce sites, GSS is a modified version of Google Custom Search intended to provide a highly customized site search solution powered directly by Google. An additional fee ensured 24-hour technical support, a personalized look and feel, and the ability to search other websites in addition to your own. Moreover, Google Site Search allowed your custom search engine to be absolutely ad-free. You could personalize it so your customers felt the search results were coming from your own website.

While Google Custom Search can search the entire web for a particular result, GSS permits searching only within the websites specified on its list.
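For developers, the same site restriction is exposed programmatically through the Custom Search JSON API's `siteSearch` parameter. A minimal request-building sketch (the API key and engine ID below are placeholders you would replace with your own credentials):

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def build_site_search_url(api_key: str, engine_id: str, query: str, site: str) -> str:
    """Build a Custom Search JSON API request restricted to a single site."""
    params = {
        "key": api_key,          # placeholder API key
        "cx": engine_id,         # placeholder custom search engine ID
        "q": query,
        "siteSearch": site,      # restrict results to this site
        "siteSearchFilter": "i", # "i" = include only this site ("e" would exclude it)
    }
    return f"{API_ENDPOINT}?{urlencode(params)}"

# Fetch the resulting URL with any HTTP client to get JSON results.
print(build_site_search_url("MY_KEY", "MY_CX", "red shoes", "example.com"))
```

Note the API's query quotas apply here just as they do to the embedded widget.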

4 Best Alternatives available –

With so many features and privileges, GSS was, for many businesses, an important tool for answering their customers’ queries. So it is better to start looking for alternatives before you are left in the dust.

    1. Google Custom Search:-


As Google has still not expressed any plans of doing away with Custom Search, for now it seems to be a viable alternative to Google Site Search. But before you start using it, you should check out the conditions and limitations it comes with:-

  • Google branding will stick. Yes, you heard it right. While GSS provided the freedom to do away with the Google branding, making it your own personalized search engine, Google Custom Search does not offer the same. But in performance terms, there is no difference.
  • Custom Search includes query limits. This means that business owners who deal with a high volume of searches every day will face a tough time managing their search engines: your Google Custom Search will stop working once a certain number of queries has been reached.
  • Although Google Custom Search is ad-based, that does not cause any performance dip. The search engine works just fine for an individual business.
  • Google still hosts your search results. Be it ads or branding, everyone values Google search results.

    2. Cloud-Based Solutions:-

Cloud technology is the new craze, and a lot of you might be looking toward this option with bright eyes. Cloud search options are offered by Google as well as other providers such as Amazon Web Services (AWS) and Microsoft Azure.

Google Cloud Search is an internal search engine. It comes bundled with the Business and Enterprise editions of G Suite and is capable of searching through Gmail, Docs, Calendar, Slides and Sheets. For all of you who are new to Azure and Amazon Web Services, they are cloud-based search services that follow the same approach as Google.

Amazon has named its offering CloudSearch; it provides auto-complete, site-targeted results and multilingual support for 34 languages. Azure, on the other hand, builds on Microsoft’s Bing platform. To make it competitive in a world without GSS, Microsoft has enabled its cloud search to provide spell check, and it is also capable of understanding user intent.

Having mentioned the above options, you should also keep in mind that implementing either of them is not child’s play, and unless you are a large organization, you may require additional support to implement Azure or Amazon CloudSearch on your website.

    3. Server Side Solutions:-


Although a bit technical, server-side solutions offer a number of options to achieve the desired search results. Server-based search engines rely on your own purchased computing power, unlike cloud-based searches, which depend on the networks of other, comparatively larger organizations.

Server-side searches normally depend on the server you host your site on. While this is a feasible alternative to cloud-based searches, it does not provide certain functionalities such as language recognition, autocomplete and the ability to “think like a human”.
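What "relying on your own server" amounts to is maintaining the search index yourself. This toy inverted index is a sketch of the core data structure, nowhere near a production engine (no stemming, ranking, or phrase queries):

```python
from collections import defaultdict

def build_index(pages: dict) -> dict:
    """Map each word to the set of page URLs containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index: dict, query: str) -> set:
    """AND semantics: return pages containing every query term."""
    sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

# Hypothetical site content for illustration.
pages = {
    "/hosting": "cheap web hosting plans",
    "/domains": "cheap domain registration",
}
idx = build_index(pages)
print(search(idx, "cheap hosting"))  # {'/hosting'}
```

Everything a hosted or cloud search product adds (spell check, autocomplete, intent) sits on top of an index like this one.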

So unless circumstances demand it, such as very low traffic, it is advisable to prefer cloud-based solutions over server-side ones.

    4. SaaS Solutions:-

SaaS, short for Software as a Service, offers a combination of server-side and cloud-based solutions. Put simply, it is a cloud-based search solution where you hire a company to manage the search engine on your behalf. This leaves you hassle-free, provides your organization with a polished and dependable search product, and suits the requirements of a mid-size organization well.

A good and reliable name in this field is Swiftype. Although it promises a hassle-free, well-functioning custom search for your business, at $299 per month it is quite expensive. Compared with GSS's $100-per-year plan, the rate looks exorbitant.

Other service providers in this field are Cludo Site Search and Algolia. While Cludo Site Search is considered the nearest alternative to GSS, Algolia is closer to the cloud-based solutions, meaning its users need to maintain some functionality of the site search on their own.

Note that there are not many providers in the SaaS segment, primarily because of Google Site Search's dominance over the years. With GSS heading into retirement, however, a number of new competitors are likely to emerge.

A final verdict:-

While Google Custom Search is a great alternative to Google Site Search, factors such as ads and a limited query structure may prove a hindrance for many of you. Cloud-based search is efficient and delivers quick results, yet it can be hard to implement, as some of the search models require additional support to get up and running on your website.

Server-side solutions, on the other hand, are technically complex, require powerful and fast servers on the user's end, and cannot provide cloud-based features such as autocomplete and language recognition. Software-as-a-Service solutions look the most feasible: you simply hire a company to maintain a cloud-based search and free yourself from the hassle of software maintenance. But this convenience comes at a price.

So, in summary, it depends entirely on you as a user and owner: what you think would be best for your business. Be it cloud-based services, server-side solutions, Google Custom Search or SaaS services, the choice rests on your requirements. Before taking any step forward, it is advisable to look through each of the above in detail and then make a final decision.


11 Off-Page SEO Tips That You Must Start Employing Today

09 Mar

off page SEO

Every individual owning a website wants to see their web pages at the top of Google's search results. To achieve this, everyone strives to keep Panda and Penguin happy by setting every on-page SEO element right, hoping that Google acknowledges the effort and rewards the site with a promotion in the rankings. Yet it often happens that even with every on-page SEO element in place, your site does not rank as desired.

This is where off-page SEO comes into play. Off-page SEO strategies play a major role in promoting your site and in certain situations prove more vital than on-page strategies. You may be wondering what is so important about off-page SEO; a closer look will help you understand the strategy better.

Points to ponder on:-

Off-page SEO refers to activities you undertake outside the boundaries of your website that help your web pages rank higher in Google's search results. Below are 11 easy steps which, if followed, will not only keep the bird and the bear happy but also raise your SERP rankings considerably.

     1. Blogs:-      

One of the greatest ways to promote your website today: blogs are meant to be written. Posting blogs on your website at regular intervals engages Google more, as regular updates indicate that your site is under constant maintenance and activity. Since Google prefers active sites to dormant ones, this will give you a surge in SERP rankings. Moreover, regular blog posts give your visitors a reason to return to your site at regular intervals.

Blogs should preferably consist of unique content such as tutorials, question-and-answer forums and trending video links to keep your visitors engaged. In addition, you should comment on other blogs in your niche and participate in question-and-answer forums, which give you a chance to post a link to your blog in the comment or answer section. If visitors find it relevant, your site traffic is sure to increase.

     2. Social Bookmarking:-      

Penguin and Panda love popular bookmarking sites such as Reddit, StumbleUpon, etc. Posting your blog links on these websites can give you a ranking surge, as their content is updated regularly. If your blog has valid content related to the information on such a site, people may find it useful to click on your link, giving you that rise in rankings.

Bookmarking also helps promote an author's name to the world. If you post a link to your blog or website on Reddit and people there find it helpful and relevant to their needs, they are likely to share it further. This helps Google identify the site as genuine and relevant, which in turn helps it rank higher.

     3. Acquire backlinks:-      

best backlinks for seo

As a website owner you would surely love to receive valid links to your site from trusted sources, and so does Google. Receiving backlinks from higher-ranked, authentic sites puts your website in Google's good books. Wondering how that helps? Web crawlers then see your site as containing useful and relevant information, and useful content is always appreciated and rewarded by Google.

But be careful, as Penguin does not like spam links. Suppose your website sells clothes but receives a backlink from a blog post about cars. Penguin identifies such links as spam, which can result in the de-ranking of your site. Therefore, be sure to vet every link carefully.

     4. Social Media Promotion:-      

social media mobile icons

You are surely aware of all the major social networking sites such as Google+, Facebook, LinkedIn, Twitter and Instagram. Then you should also know how to use them to push your site's SERP rankings. If not, read on.

Sharing your website or blog on one of these social networking sites offers a chance of free promotion. Since Facebook and Twitter are considered the biggest online platforms today, sharing your work there is sure to attract more viewers than anywhere else.

     5. Forum Marketing:-      

Forum marketing involves getting involved in communities related to your niche. You can participate in online forums discussing topics relevant to your website or blog. In return, you can post “dofollow” links to your website, with a chance of increasing online traffic. This also helps search engines find your site more easily.

With the help of forum marketing, you can make yourself known to everybody. Moreover, if your site has unique and valuable content, visitors are likely to share it on other platforms, giving it all the exposure it needs.

     6. Local Listing techniques:-      

Instead of targeting a global audience, local listing is an important technique you can apply if it suits your website's niche. It also enables Google to find your site easily. A local listing is an online profile containing your company name, phone number, location and the services you provide. You can create local listings by submitting your site to Google Maps, Yahoo Local, Yellow Pages and Google+ Local.
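
Search engines can also read your listing details straight from your pages as structured data. A minimal sketch using schema.org's LocalBusiness vocabulary; every business detail below is a placeholder you would replace with your own:

```html
<!-- Hypothetical example: name, phone and address are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Clothing Store",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Chicago",
    "addressRegion": "IL"
  }
}
</script>
```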

     7. Guest Blogging:-      

Guest Blogging

If you can put in a little more labor for the good of your site, guest posting is a very effective method. All you need to do is write and publish content on other websites or blogs in your niche. When visitors see your website mentioned in several places on a trusted site, they will judge it a reliable source of information, which in turn will boost your website's traffic.

So how is it done? As mentioned, writing content and publishing it on another website is only the first step. What follows is adding a link to your website, sharing the post on social media and returning to answer queries and comments. Guest posting helps you build relations with your readers and is an effective way to get yourself known to others.

     8. Submit to search engines:-      

This is considered an effective internet marketing technique to increase the rankings of a website or web page. You can submit your website directly to search engines such as Google, Yahoo or Bing, in one of two ways: either submit one page at a time using webmaster tools, or submit your entire website by submitting its home page to as many search engines as possible.

     9. Directory Submission:-      

Listing your site in several directories or databases under the relevant categories or subcategories is known as directory submission. Proper directory submission gives you exposure, provides reliable backlinks and helps increase your blog's overall earnings. Chances are you might also get paid-post opportunities.

     10. Ask:-      

Simply asking for a link is often quite beneficial, something many bloggers forget to do. Suppose your blog has been mentioned in an article but without a link; you can simply ask the author to include one. You can also ask for a mention of your blog in return for a similar favor if both blogs are in the same niche. Both bloggers gain equally, and it also helps in building contacts.

     11. Link Baiting:-      

The process of getting your visitors to share your website's link is known as link baiting. The primary criterion of successful link baiting is creating quality, unique content. You should be able to convince your readers that your site has a piece of information worth sharing.

In addition, do not forget to come up with engaging and attractive content which compels a reader to click on your site. Keep in mind that one tactic is related to the other: without quality content, an attractive link has no value; similarly, without a catchy link, visitors are less likely to click through even if you have catchy content inside.


What are the benefits of using off-page SEO?

You can get the following advantages by using off-page SEO strategies the correct way:-

  • More traffic:
    As your page ranks higher, your website gets more visitors, followers and social media shares. This is a self-reinforcing process where the only requirement is to create good content and update your website regularly.
  • Online Branding:
    If your website manages to please Google through its off-page SEO strategies, you will be rewarded for your hard work with online branding opportunities from larger companies. In other words, large e-commerce sites will want to hire your page to advertise their products. This, in turn, not only increases your page's value but also generates extra income.

Final words:

Therefore, not only on-page but also off-page SEO strategies are very important. A survey from trusted sources has shown that people spend 70% of their time on on-page SEO and the remaining 30% on off-page SEO; experts recommend a more balanced approach to make your site more SEO-friendly. Remember, Google loves well-optimized pages, and that is what everyone is striving for today.


What? Is There Another Update From Google In February 2017?

10 Feb

If you turn the pages of Google's history, you will see that the search engine giant has changed its algorithms as many as 500 to 600 times in a single year. It has brought in major updates like Panda and Penguin that have had significant effects on search results, and consequently on web traffic and rankings in the SERPs.

Well, people, Google has a new update for us again in 2017. It is speculated that the company upgraded the Penguin algorithm, probably in February this year. Yes, speculated, because Google is an excellent secret keeper and is not going to reveal its strategies to us. The company has refrained from specific answers, neither denying nor confirming the news.

Google Algorithm

Then how did people get to know the change?

Through some signal changes! Webmasters have reported notable signal changes that fairly clearly indicate an update. This time, Google has targeted those engaged in aggressive link building; it is trying to pull down spamming of all sorts.

In fact, this change has also been noticed by SEO experts practicing black hat techniques. According to one of them, “the strategies we are applying are simply not bringing results. We believe there is obviously a change in Google's algorithms.”

A sneak peek into Google Penguin:

It was back in April 2012 that the search engine giant launched Penguin to track down sites that violate its Webmaster Guidelines. Its main aim is to remove spammy links that influence search results and to improve user experience. As a result, a lot of sites had to remove bad backlinks.

Coming back to the February update!

Throughout the month, tracking tools have also been in confusion regarding this speculated, unconfirmed change. Some of the major tools on this list are Algoroo, MozCast, RankRanger, AccuRanker and SERP Metrics.

Even major black hat communities like BlackHatWorld consider this a move targeting private blog networks, or PBNs. While one section says websites are being penalized, others are seeing Google pick up new links more slowly. Whatever it is, the search engine chooses to stay quiet for now.

Analysts opine that there are two possibilities behind these changes:

  • Maybe, it’s an entirely new algorithm, that is unknown to us
  • It’s an advancement of Google Penguin, how it detects spammy links and brings them down.

Google Algorithm Update

What’s a PBN?

Now, before moving on to the chatter in the SEO community, let us take a quick look at PBNs. Short for Private Blog Network, using a PBN is a black hat technique.

It involves –

  • Buying expired domains
  • Adding backlinks and content to them
  • Publishing posts there which ultimately link to the main website

But, quite obviously, our Google genius never appreciates this method (you may even be penalized!). In fact, it is continuously searching for PBNs, and its algorithm updates are aimed at tracking down such spam.

The gossip in major communities: What are they all saying?

Let’s check out some comments made on WebmasterWorld and BlackHatWorld regarding the latest unnamed, undeclared and speculated Google update –

Comment 1:

“Recently, we have noticed that things are moving quite slowly with G. In fact, to verify the speculations, we are conducting certain case studies. Those are for determining the time taken by different approaches to kick in, in contrast to any domain with no backlinks. We have chosen such links that do their own thing.”

Comment 2:

“This week, we have built 100 sites. Luckily, it took us a few minutes to successfully index 75 of them. But for the remaining 25, it’s like a ‘test of our patience.’ Things are going too slowly these days.”

Comment 3:

“We have a couple of PBN sites, but they are taking too long to index. Seems like Penguin is on the loose!”

Some users, however, did not agree that there was a slowdown in indexing. According to them, either it’s a Penguin update or it’s simply targeting the PBNs; the change in indexing wasn’t experienced by everyone. So, it’s the same buzz all around!

Rewinding the history a little:

If we rewind a little, we can see Google has a long history of rolling out such updates.

Google Algorithm Timeline Twitter

Let’s take 2015 –

  • In February, there was an unnamed update targeting mobile usability and e-commerce, although the search engine company did not give any confirmation.
  • Next, in April, another change arrived for mobile-friendly sites regarding mobile rankings.
  • In October came the revelation that Google would be using RankBrain, a machine-learning technology, in its search engine results.

Next in 2016 –

  • In February, there were updates to AdWords. Google completely removed the classic right-sidebar advertisements and brought in 4-ad top blocks on many commercial searches. The change had a significant impact on competitive keywords.
  • In September, Penguin was updated to reduce spamming, one example being keyword stuffing.
  • Again in the same month, tracking tools showed temperature extremes of 108 degrees due to a Google update; as per the data received, its impact on organic results was quite heavy.

Something happened in January 2017, and this we know:

If you blame the company for keeping secrets, we must not forget the recent January update regarding the mobile interstitial penalty, which Google confirmed, though its impact was minimal. What's more, while killing the link: operator, the company chose to keep the other search operators.

Google has also released infrastructure updates to its Search Console and has again offered an explanation regarding crawl budget.

Another recent update, around Japanese content:

Most of you must have heard about another Google update regarding Japanese content, and this one has been confirmed by the company. The algorithm change was targeted at lowering the rankings of poor-quality websites in Google's Japanese search results.

The company’s explanation –

“We focused on improvements in the quality evaluation of websites. The update will bring down those sites which focus more on ranking than on providing useful information to users. In return, sites with relevant content and high quality will be listed above them.”

However, all the theories regarding Penguin are still raw and new. So the question comes –

Didn’t anyone ask Google to debunk all these speculations?

Well, people, don’t be surprised, but the question was thrown at Google Webmaster Trends Analyst Gary Illyes on Twitter. He was asked whether they had any intention of revealing the truth, giving specifics and settling the volatility that has been going on since January.

“Absolutely not,” came the reply from Illyes. Disclosing things would give spammers excellent chances of avoiding the algorithms and finding loopholes.

This makes sense, though. Quite clearly, once spammers got access to the details of Google's link-blacklisting policy, they would change their black hat strategies proficiently and adapt to the situation. The company's 'no transparency' policy helps it clamp down on low-quality websites that hamper user experience.

Google ranking algorithm updates explained

As of now, the thread ends here!

Taking the data from all the major SERP trackers and comments from all sides, we can conclude that there has surely been some update in the Google mansion. The company is on a mission to constantly update its services and give good results to its customers, and hence it has hit the black hat techniques in major areas.

What’s in it for us? Well, we can create high-quality, value-based content, follow white hat methods and stay on the safe side. We can use the primary SEO tools to audit the links on our websites. Can’t we?


How Can Anchor Texts Help You Gain In SERP Rankings?

08 Feb

seo inbound anchor text

Let’s start the discussion with what Matt Cutts had to say regarding how Google functions. In one of his YouTube videos, he said that Google looks to rank only those sites which give users valuable information for every search query they make. It does this so that the same user comes back to Google the next time he has a query in mind. It’s all about holding on to your visitors and giving them more than what they were looking for.

Probably this one statement from the man himself defines the SEO world perfectly. If someone puts a query to Google, your site will only come up if you have the relevant information. If not, you should be linking to some other site which contains the relevant text. You thus create backlinks with the help of anchor texts, and Google rewards you by placing you high in the SERPs.

Anchor text and its relation with SERP

Anchor texts are words or phrases in your content that link to a particular website. These can point to your own internal pages or to other high-priority sites. Equally, someone else can write content and place anchor texts that link back to your website. With all this in place, Google sees that:

  • You linked out to other sites. This means you are trying to spread more information about that particular keyword.
  • Some other site linked back to you. This indicates that your site contains valuable information which users can read.

Both increase your user-satisfaction index in Google's algorithm. As with the link building principle, the more such links you have, the better your chances of ranking. But then again, the search engine giant has its Penguin watching over you and will penalize you if things get spammy.


The right way –

How does it feel if every line you read has an anchor text in it? Simply not acceptable. Again, if you get the number right but have simply stuffed them in, Google will send your site for manual review even if you escape Penguin. Hence, there is simply no escape unless you walk in line with the webmaster rules.

How to place them, then? It’s quite simple, actually. Place anchor texts as if they occurred there naturally. For instance, consider the following text.

“For more information, watch this video on YouTube.”

Here, YouTube can be your anchor text linking to the original high-priority site. It occurs naturally, and Google has nothing to penalize you for. This type is known as a brand anchor text. Similarly, there are other ways to place such texts in your content.

Types of Anchor texts

Although there is no rule book declaring these the main variations, the following are generally used in content across the internet to gain in the SERPs. The common ones are:

  • Naked Links:

    This is where you paste your URL directly, generally in the comment sections of social forums. It may not go directly into the content.

  • Generic texts:

    Click here, read more, additional information, over here: anchor texts that are framed to persuade without naming the brand.

  • Image anchors:

    Slowly gaining popularity and proving effective. Recent statistics have shown that websites with an optimal number of images gained quite a bit of traffic. So, with your link included, images can help you in a lot of ways.

  • Keyword:

    The old-school way of creating anchor text: just create the hyperlink on your target keyword. A variation is to lengthen it a bit, like “watch video on YouTube”, with the whole text hyperlinked.

  • Long tail:

    A relevant phrase is used instead of the keyword, so there is no question of framing your content around the anchor text. These occur more naturally than the rest on this list and do a lot for user satisfaction.
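
As a rough HTML sketch of the variations above (every URL here is a placeholder, not a real link target):

```html
<!-- Hypothetical examples; all URLs are placeholders -->

<!-- Naked link: the bare URL is the visible text -->
<a href="https://example.com/guide">https://example.com/guide</a>

<!-- Generic text -->
For more details, <a href="https://example.com/guide">click here</a>.

<!-- Brand anchor -->
Watch this video on <a href="https://www.youtube.com/">YouTube</a>.

<!-- Exact-match keyword -->
Learn <a href="https://example.com/link-building">link building</a> basics.

<!-- Long-tail phrase -->
Read <a href="https://example.com/link-building">how to build links that rank</a>.
```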

But there is one more way of using anchor text that has been a dark horse all along. People have used it for internal link building but never actually recognized its importance. Nearly 25% of rank-one sites have implemented this hidden technique and have gained rankings at a fascinating rate.

anchor text keywords

The Clear Winner!

Anchor texts built from your page title. Think of it like this: when you refer to a book or a movie while writing a review, you generally use its full name in the content. So, if you are referring to someone else's post or page, you can naturally place the whole title in your content. Google treats this as far more human-like and certainly values your website for it.

But then again, it should be placed in the content naturally. So, when you write your blog, frame the title so that it can occur in a sentence without seeming out of place. Check out the next phrase for a better understanding.

“These 5 ways of link building to gain in SERP rankings can help you out in your SEO venture.”

You can even use this same title as the meta title. Google has ranked such sites at the top and is likely to do so in the future as well.
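
A minimal HTML sketch of the idea; the title, description and URL are made-up examples:

```html
<!-- Hypothetical page: the title doubles as the meta title -->
<head>
  <title>5 Ways of Link Building to Gain in SERP Rankings</title>
  <meta name="description" content="Five link building tactics that help you climb the SERPs.">
</head>

<!-- Elsewhere, another page links using the full title as the anchor text -->
<a href="https://example.com/link-building-serp">These 5 ways of link building
to gain in SERP rankings</a> can help you out in your SEO venture.
```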

Where to link then?

With all that said, it ultimately comes down to the question: where do you get links from, and which sites should you link to? It is the same old story, actually: stick to the white hat techniques and avoid the black hats. No paid links, no quid-pro-quo link arrangements, and none of those places where Penguin simply doesn't want you to go.

  • Link to priority sites –
    Or stick to authentic domains. That does not mean you should never link to sites with a lower domain valuation than yours; stick mostly to the top few and you shall be fine. It's all about authentic sites with no spam content.
  • Monitor sites that link back to you –
    This is where you may get penalized through no fault of your own. You do not control such backlinks, so regular monitoring of every link you receive is almost a necessity.

  • Directories are all right –
    The idea that Google penalizes you for a backlink from a web directory is an old-school myth; not all directories are filled with spam content. If a site is listing the top restaurants in Chicago, why wouldn't you want your anchor text there? It will generate traffic, and you definitely want that.
  • Restrict inner-page links –
    While this is encouraged in on-page SEO, things should always be kept subtle. Too many links on internal pages are not viewed well in Google's manual checks, and reports may arrive from the Search Console.

    But then again, if another website links back to you from multiple internal pages of its own, that is all right. It actually helps raise your backlink count.
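
When monitoring turns up spammy backlinks you cannot get removed, Google's documented disavow-file format (a plain-text file uploaded through Search Console) lets you ask Google to ignore them. A hedged sketch, with placeholder domains and URLs:

```
# Disavow file sketch; every domain and URL below is a placeholder.
# Ignore all links from an entire domain:
domain:spammydirectory.example
# Ignore one specific linking page:
http://badblog.example/cheap-links.html
```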

Ask for backlinks and anchor texts if a site has mentioned you or your brand. This is not black hat, as you deserve the link. Think about Moz, for instance: how did its site gain in the SERPs? You will find its anchor text spread across numerous websites.

Indexing Backlink

Finally –

It’s all about relevance, and it’s all about keeping Google’s users with Google. Anchor texts are one of the top SEO techniques for making your site rank and increasing your brand’s online awareness. Mix and match the anchor text placement techniques and build backlinks. Things take time to have an effect, but once they do, there is no end to the traffic on your site. Take the right steps and climb that SERP ladder!