LG G5 and Samsung Galaxy S7 compared and reviewed by features
Sunday, February 28, 2016
5 kinds of linking strategies that can affect your seo rankings
Among the most crucial factors that can affect your website traffic is your linking strategy. At times we go overboard to get links and focus on quantity rather than quality. This affects your traffic and can change your SERP rankings. So how do you judge the quality of your links? Here are five things to look at:
1) Is the industry you are targeting complementary, or is it your competitor?
2) How many links does the website have? Are the links merely buried in a links page? If a page has more than 100 links, you can be assured the link will not pass on much link value.
3) Relevance and authority: Links related to the same topic are given more weight than random links from unrelated pages. Think of the relevance of each link as being evaluated in the context of a specific user search query. For example, for the search term "new cars in Arizona", if the publisher has received a link from the Arizona Chamber of Commerce, the search engine infers that the link is relevant and trustworthy because the site is about Arizona. Once you decide on the industry you are targeting, your focus on link building should be razor sharp on getting links from within that industry.
4) Nofollow links: Google recently claimed that the nofollow attribute is no longer as relevant and important as it was a couple of years ago. However, when you use a nofollow meta tag on a page, the search engine will still crawl the page and place it in its index, but all links (both external and internal) on the page will be prevented from passing link juice to other pages.
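As a rough sketch of how a crawler might read that page-level directive (the sample markup here is hypothetical, and the parser is my own illustration using Python's standard-library html.parser):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag on the page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def has_nofollow(html):
    """True if the page-level robots meta tag disables link following."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "nofollow" in parser.directives

page = '<html><head><meta name="robots" content="index, nofollow"></head><body></body></html>'
print(has_nofollow(page))  # -> True
```

Note that, as the paragraph above says, such a page can still be crawled and indexed; only the outgoing links are devalued.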
5) Anchor text: Avoid links where the anchor text reads "our links", "click here to know more", "read more", or "check out the full post here". Be very specific about the anchor text with which your site is linked. The best policy is to use the title of your website or webpage as the anchor text. The impact of anchor text is more powerful than you think. For example, if a link points to a page with minimal search-friendly content (a Flash site, for example), the search engine will look for signals to work out what the page is about; in such cases inbound anchor text becomes the primary driver in determining the relevance of the page.
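A small audit script can flag links carrying these generic phrases. This is an illustrative sketch (the markup, phrase list, and function names are my own), using Python's standard-library html.parser:

```python
from html.parser import HTMLParser

# Hypothetical list of low-value anchor phrases to flag
GENERIC = {"our links", "click here", "read more", "check out the full post here"}

class AnchorTextParser(HTMLParser):
    """Collects (href, anchor text) pairs from <a> tags."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def generic_anchors(html):
    """Return links whose anchor text is a generic, low-value phrase."""
    p = AnchorTextParser()
    p.feed(html)
    return [(href, text) for href, text in p.links if text.lower() in GENERIC]

html = '<a href="/cameras">Top 10 DSLR cameras</a> <a href="/post">click here</a>'
print(generic_anchors(html))  # -> [('/post', 'click here')]
```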
Friday, February 26, 2016
end of good times for swiss watches, as apple smartwatch is on a roll
In what could be a watershed in the history of the wristwatch, smartwatch sales have surpassed traditional Swiss watches, with 8.1 million units sold in Q4 2015; in contrast, 7.9 million Swiss watches were sold during the same period. Compared to Q4 2014, smartwatches have seen unprecedented growth of 316%, whereas Swiss watches managed to ship a mere 1.9 million more.
In terms of market share, the Apple Watch rules this segment with over 63%, far ahead of Samsung with a 16% share. Combined, Apple and Samsung make up 8 out of every 10 smartwatches sold. Apple sold 3.6 million smartwatches in Q2 2015 while Samsung managed to sell 600,000, which means Apple's sales were six times those of its nearest rival. Across wearable technology, which includes smartwatches and fitness trackers, Apple's 3.6m Watch sales rank 2nd, 0.8m behind market leader Fitbit, which sold 4.4m devices in Q2 2015.
[Chart: biggest markets for Swiss watches]
[Chart: Swiss watches' sales contribution by region]
Meanwhile, about 1.2 billion watches are produced annually, according to the Federation of the Swiss Watch Industry's estimates. At the company level, three Swiss watch and luxury groups – Swatch Group, Richemont, and Rolex – are the clear world market leaders. Together the three groups account for an estimated 45% of global Swiss watch sales.
Rolex leads the market with a share of 13.59%, followed by Omega and IWC with market shares of 9.82% and 7.23%, and Rado and Breitling with 5.6% and 8.08%.
Thursday, February 25, 2016
how search engines use historical and temporal link data for ranking
5 ways how search engines use historical and temporal data to determine SERP ranking
While we know that links are the basic bedrock of ranking a site, and search engines count each one as a vote for the site, this concept was actually based on citation: an established thesis is often cited while researching another topic, so the original thesis earns a vote for further research. But how do search crawlers use these links as information?
Index inclusion: Search engines need to decide what kind of information and pages to include in their index. They do this by discovering web pages through crawling links; if a site has more links, the crawler follows them to more pages, and this process continues as the crawler jumps from one link to another.
In short, links are used to index most of the world wide web; through them search spiders collect the relevant information that is stored in the index. The second way search engines discover webpages is through XML sitemaps.
One caveat here is that search engines do not include webpages they consider to be low-value pages. They do this because cluttering up the index with low-quality pages degrades search results for the user.
Crawl rate and frequency: The search spider crawls a portion of the world wide web every day. How do search engines decide which sites they need to visit, where to begin, and where to end?
Google has publicly stated that PageRank is an indicator of the order in which it crawls. According to Google, it starts its crawl in reverse PageRank order: it visits the PR 10 sites first, followed by PR 9, and so on. A page with higher PageRank also gets crawled faster and more deeply.
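A toy sketch of this reverse-PageRank crawl ordering (the URLs and scores are invented for illustration), using Python's heapq module as a max-priority queue:

```python
import heapq

def crawl_order(pages):
    """Yield URLs highest-rank-first. heapq is a min-heap, so scores are negated."""
    heap = [(-rank, url) for url, rank in pages.items()]
    heapq.heapify(heap)
    while heap:
        _, url = heapq.heappop(heap)
        yield url

# Hypothetical pages with hypothetical rank scores
pages = {"example.com/home": 9, "example.com/blog": 5, "example.com/old-post": 2}
print(list(crawl_order(pages)))
# -> ['example.com/home', 'example.com/blog', 'example.com/old-post']
```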
Ranking: Everything else being equal, the site with the highest number of backlinks will be ranked first.
Source independence: A link from your own site to another site you run is not an editorial vote for your site. A link from an independent third-party site is seen as an "actual link", which is a vote for your site.
Among the temporal factors used for ranking are:
1) When did the link first appear?
2) When did the link disappear or become a broken link?
3) How long has the link been there? A link that has been in place for a longer time is a ranking factor.
4) How quickly was the link added? Link growth should be organic and gradual.
5) The context of the page and the links pointing towards it.
6) Page placement: a link in the body content is considered more powerful, and has more impact on ranking, than a link on a links page at the bottom.
Wednesday, February 24, 2016
us states ranked by vc investments across saas start ups
US states Ranked by "SaaS" start up Venture Fund Deals
California VC firms invested $16 billion across SaaS start-ups from 2011 to 2014. VC deals in California roughly equalled all other US states combined: SaaS-focused VC firms invested a total of $11.7 billion across the rest of the country, compared with $16 billion in California alone.
Tuesday, February 23, 2016
these are the vc firms with the highest exposure across the fintech sector
The list of VC investors with the highest exposure across the fintech sector:
- Google Ventures
- Intel Capital
- Citi Ventures
- Mastercard Worldwide
- Funders Club
- American Express Ventures
- Ebay
- CyberAgent Ventures
- Renren
- Credit Saison
- Bitcoin Shop
- SK Telecom Ventures
Fintech companies started to grow big time after the 2008 financial market crash, as policymakers began concentrating on making finance safer. Apart from the regulatory spotlight, fintech companies have started to revolutionise financial services with innovative products and services across areas such as mobile payments, online transactions, peer-to-peer lending, and big data, using technology to change financial alternatives. These listings do not include conventional financial investors like banks, insurance firms, and other financial services companies. The list shows VC firms that have traditionally been active across tech and IT services and have recently started to invest in fintech. Google Ventures is the most active across the fintech industry, with investments across 25 start-ups, followed by Intel Capital with more than 10 investments in the space.
Goldman Sachs estimates the worldwide fintech pie to be worth $4.7 trillion, with North America projected to reach $19.9 billion in 2017.
According to Statista, around 29% of investments across fintech were in the banking & corporate finance space. Read more about fintech start-ups in Israel below.
Apart from the US, Israel is the next biggest and hottest market in the fintech ecosystem, and it has seen many start-ups go on to become leaders in their fintech verticals. Israel-based Actimize is the leader in fraud prevention, Fundtech in transaction banking solutions, Retalix in point-of-sale, Trusteer in cybercrime prevention, and Sapiens in insurance. The fintech disrupters will cut costs and improve the quality of financial services: they are unburdened by regulators, legacy IT systems, branch networks, or the need to protect existing businesses.
top 10 private equity firms investing across financial services
The list shows the top 10 PE firms across the fintech & financial sector with the highest fund size. The top 3 include HML Investments, New York Life Partners, and DB Capital.
the top 3 highest paid chief information officers
The comparison chart shows the most highly paid Chief Information Officers (including salaries, stocks, and shares).
With over $40 million in annual compensation, Timothy J. Theriault, the Global Chief Information Officer and Executive VP of Walgreens Boots Alliance, happens to be the highest paid CIO.
The second highest paid CIO, with total compensation of $36 million, is Mr. Robert B. Carter, the CIO and Divisional Executive VP at FedEx. A veteran at FedEx, he has been with the company for the last 16 years as EVP, FedEx Information Services and CIO.
Trent Taylor, the CIO at hhgregg, is at number 3 with annual compensation of over $12 million. Mr. Taylor has over 25 years of experience in the information technology field, including systems architecture, data center infrastructure, networking, applications, and e-commerce.
Monday, February 22, 2016
list of ad networks acquired by google
So far Google has purchased 16 ad networks, including online as well as mobile ad networks. Google's DoubleClick acquisition was the most expensive, followed by Admeld.
Google acquired DoubleClick for $3.1 billion, while Admeld was the second most expensive advertising network to be acquired by Google, at $400 million.
Google's purchase of ad networks gives it enormous reach in display advertising. While the AdSense network enables publishers to run relevant ads – including text, image, rich media, and video ads – across their websites, the ad exchange plays a different role: it aggregates a large number of ad networks (over 65 at last count, including more than half of the largest 20 networks in the US) which compete in a real-time auction, instead of using historical data or trying to negotiate prices upfront (usually at a discount), and it chooses the highest-value ad from these competing networks at each moment.
Google's acquisition of ad networks integrated its AdSense network with DoubleClick, Admeld, Invite Media, and other ad platforms, giving publishers more precise metrics with which to judge the effectiveness of their campaigns.
google's most expensive acquisition in united kingdom
[Chart: list of startups bought by Google in the United Kingdom]
DeepMind Technologies, a London-based artificial intelligence firm which specialises in machine learning, advanced algorithms, and systems neuroscience, was acquired by Google. Google paid a hefty £400m ($650m) for the acquisition of DeepMind.
DeepMind has created a neural network that may be able to access an external memory like a conventional Turing machine, resulting in a computer that appears to mimic the short-term memory of the human brain. DeepMind Technologies develops technologies for e-commerce and games, and plans to develop computers that think like humans.
DeepMind recently made headlines when it announced that its AlphaGo program had successfully beaten a human professional Go player.
The two-year-old artificial intelligence startup was founded by former child chess prodigy and neuroscientist Demis Hassabis alongside Shane Legg and Mustafa Suleyman.
Sunday, February 21, 2016
5 tips on how to make your content management system friendly for seo
5 Ways to ensure SEO benefits when choosing your content management system
While looking to publish a website, many webmasters might wonder whether the selection of a CMS plays a role in SEO, and how to fine-tune a CMS to make it SEO friendly.
The truth is that the CMS does play a huge role in SEO. The top 3 CMSs happen to be Joomla, Drupal, and WordPress, of which WordPress has the largest market share.
Let's take a look at the basic things you need to keep in mind while deciding on a CMS, and how to ensure your CMS functionality supports your search visibility.
TITLE TAG CUSTOMIZATION: A search engine friendly CMS has to ensure that title tags are customised per URL, not only at a page level but also by enabling rules for particular webpages. Sites that run on Blogger and WordPress often use the date in the URL, e.g. xyz.wordpress/post/21-02-2016. This is SEO unfriendly and should be avoided: replace the date in the URL with the post title. The biggest issue CMSs face is failing to customise title tags to match the URL or the theme of the post.
For example, if you have a site on cameras and your URL is www.a1cameras4you.com, and your CMS only allows you to create titles where the tag always has to start with your domain name, followed by a colon, followed by the article you post, you are on the brink of SEO disaster.
Let's see the example below. On the above site, a post on the top 10 cameras has the URL a1cameras4you.com/top-10/cameras. If your CMS only lets you create titles which start with your website name (in the above post, the title "A1 cameras for you" repeats for every URL and post), then you are treading dangerously. You should be able to customize each URL with a customized title and meta tags.
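A minimal sketch of generating a title-based slug to replace a date-based permalink (the function is my own illustration, not a feature of any particular CMS):

```python
import re

def slugify(title):
    """Turn a post title into a descriptive, SEO-friendly URL slug:
    lowercase, with runs of non-alphanumerics collapsed into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# A date-based permalink like /post/21-02-2016 becomes a descriptive one instead
print(slugify("Top 10 Cameras for 2016"))  # -> 'top-10-cameras-for-2016'
```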
PAGINATION CONTROLS: Pagination can be the bane of website search rankings, so controlling it by including more items per page and more contextually relevant anchor text is recommended. Instead of "next" or "previous page" at the bottom, you can use titles like "more eCommerce news" or "latest trends in online marketing".
301 FUNCTIONALITY: Many CMSs lack this critical feature, which plays a very crucial role in redirecting content when necessary. Using a 301 permanent redirect tells the search crawlers to treat a non-www version and a www version as the same URL, informing them to pass on the benefits and link juice to a single URL. A 301 redirect is also used when you have a new domain, or a newer version of a page, and wish to pass on the search benefits to the new one, thereby preserving the search equity of the older version. This also helps you dodge keyword cannibalization.
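A minimal sketch of the www-canonicalization decision behind such a redirect (the domain and function name are hypothetical), using Python's standard library:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"  # hypothetical canonical hostname

def canonical_redirect(url):
    """Return (301, canonical_url) when the host is the bare domain,
    else (None, url) meaning no redirect is needed."""
    parts = urlsplit(url)
    if parts.netloc == "example.com":
        target = urlunsplit((parts.scheme, CANONICAL_HOST,
                             parts.path, parts.query, parts.fragment))
        return 301, target
    return None, url

print(canonical_redirect("http://example.com/post"))
# -> (301, 'http://www.example.com/post')
```

In practice this logic usually lives in the web server or CMS configuration rather than application code; the point is that both hostnames resolve to one canonical URL.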
IMAGE HANDLING: Alt attributes are a must-have feature; the alt text serves as anchor text when you use an image link. (However, in terms of search preference, text links are more advisable than image links.) If you are using image links, ensure that the CMS has alt tag functionality, which helps search engines understand the relevance of the content of your image. Images in CMS navigational elements should preferably use CSS image replacement rather than mere alt attributes.
STATIC CACHING OPTIONS: Static caching is a must for the CMS you are considering. Many CMSs currently offer caching options, which make perfect sense if a page receives consistently high traffic from social or news portals. A bulky CMS often makes extraneous database connections, which can increase load and overwhelm the server if caching is not in place. This might affect and lessen your potential inbound links.
MULTILEVEL CATEGORIZATION STRUCTURE: If your CMS does not allow you to nest subcategories into categories, or sub-subcategories into subcategories, rethink your CMS options. This limited functionality will not allow you to make full use of your site structure and internal hierarchical linking structure.
META NOINDEX FOR LOW-VALUE PAGES: Even if you use rel="nofollow" on your internal links, other sites might still link to you, and some low-value pages might rank ahead of the pages you intend to optimize for. Check whether your CMS allows you to use noindex for pages which have low value, like "about us", "contact us", or FAQs.
This is a better way to handle low-value pages which you do not want to show up in the SERPs.
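As an illustrative sketch (the paths and function name are my own, not from any particular CMS), emitting a noindex directive for low-value pages might look like this:

```python
# Hypothetical list of low-value paths to keep out of the index
LOW_VALUE_PATHS = {"/about-us", "/contact-us", "/faq"}

def robots_meta_for(path):
    """Emit a robots meta tag: noindex for low-value pages, index otherwise.
    'follow' is kept in both cases so link equity still flows through the page."""
    if path in LOW_VALUE_PATHS:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("/faq"))  # -> '<meta name="robots" content="noindex, follow">'
```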
Britain's top 10 best-selling consumer electronics products on Amazon
Amazon's Fire TV Stick and Kindle Fire (7-inch display with 8GB), along with the Kindle Paperwhite (6-inch, high-resolution display), are among the highest selling products on Amazon UK.
The Fire TV Stick has been Amazon UK's best seller for the past 256 days, while the Kindle Fire has been the second most sought-after gadget, holding the number 2 spot for 184 days. Overall, the Fire TV Stick has been the highest selling consumer electronics item online for the last 8 months on Amazon.
The Fire TV Stick offers tens of thousands of TV episodes and movies from Amazon Video, Netflix, BBC iPlayer, and more, plus games, music, and apps. It packs 8 GB of storage and 1 GB of memory, plus a dual-core processor for fast streaming and smooth performance.
However, it's not all good news for tablet manufacturers: a new report published by IDC shows that worldwide tablet shipments have shrunk for the first time since 2010.
Saturday, February 20, 2016
why should you stop commenting on blogs
Many site owners use spam tactics, creating bots that crawl the web looking for open forums and blogs where it's easy to add a comment and get a backlink to their website. These bots leave behind automated comments bordering on spam, which most of the time are not related to the content of the site in question.
The great majority of these links are neutralised by the rel="nofollow" tag or deleted by the blog software's content management system; however, the spammers do not care, as they operate on a huge scale.
Reason number 2 why you should avoid commenting across blogs and forums is that, most of the time, the comments have no relevance to the topic. If you do have to comment, make sure you stick to the context and stay relevant. For example, sites like Quora and SEO forums like Webmaster Tools have user-generated content which is editorially very sound. While blog and forum owners might use the rel="nofollow" tag, as we saw in the previous post, search engines can judge the quality of the post and of the site being linked to; the nofollow tag only stops the passing of link juice to the linked page, not its indexing.
Link farms and number of links: Due to the nature of blogs and forums, anyone can leave a comment, irrespective of relevance or topicality. Over time, forums and blogs acquire a huge number of links, with search engines viewing those links as not relevant, or even ignoring them. Worse, a webmaster who runs many related sites might ask for a link back in exchange for links from 10 other sites he owns. This way all the links are interlinked, and your site might become part of a "link farm", which Google penalises very heavily.
5 facts about the rel=nofollow attribute to keep in mind before optimizing your search
In 2005, the three major search engines – Yahoo, Bing, and Google – agreed to support an initiative to reduce the effectiveness of automated spam.
Unlike the meta robots version of nofollow, the new directive was employed as an attribute within an <a> (link) tag to indicate that the linking site does not vouch for the quality of the linked page.
In short, the rel="nofollow" attribute was intended to tell search spiders not to pass link juice to the third-party page being linked to. Originally this served to stop automated links, appearing as comments on blogs, forums, and other user-generated content sites where links were liberally splashed around, from fooling the search engine into crawling them and passing on the usual search benefits.
In due course it was seen that most website owners used content from other sites but applied rel="nofollow" to stop link juice flowing to the linked page. However, Google's guidelines say that only paid links, or links attained through dubious methods, should carry the rel="nofollow" attribute. Google also says that when linking to a site which is editorially good, you should not be using rel="nofollow".
Please note that although rel="nofollow" tells search crawlers not to pass on linking benefits, it does not stop the linked page from being indexed (despite the lack of semantic logic).
You can implement a nofollow link as follows: <a href="http://www.onlinemarketing-trends.com/" rel="nofollow">Online Marketing Trends</a>
In 2009, Matt Cutts wrote a post suggesting that the link juice associated with a nofollowed link is discarded rather than reallocated. In theory you can still use rel="nofollow" as many times as you want; however, using it on internal links no longer brings the kind of benefit webmasters and SEOs once saw.
One word of caution: using nofollow on external links too many times can flag a site as over-optimized. The rule of thumb here is that out of 10 outbound links you might nofollow 7 of them, but for content you use from third parties, do not use rel="nofollow" on sites which are editorially seen as very strong.
santa barbara real estate median sales price in 2016
At $537,250, Santa Barbara County homes have the ninth-highest median sale price of all counties in California. With a 1-year price forecast of 6.30%, home values in Santa Barbara County are expected to have one of the lowest appreciation rates compared to counties in California, where prices are expected to rise 9.32% on average
Salesforce ups its arsenal with the acquisition of machine learning start-up PredictionIO
[Chart: machine learning and its applications across industries]
With its acquisition of machine learning start-up PredictionIO, Salesforce looks to stock up its cloud-based data science business and strengthen its arsenal further. Salesforce has recently acquired similar machine learning start-ups RelateIQ and Tempo AI, among other companies. Machine learning drives predictive and recommendation engines which use algorithms to crunch petabytes of unstructured data into meaningful insights, and its application in big data is immense.
The simplest examples of machine learning in action are aggregator sites like Reddit, Quora, or Google News, where user-generated questions are automatically grouped by algorithm, arranged by topic, and classified into meaningful categories. News aggregators are examples of real-time machine learning.
Amazon is one of the pioneers in using machine learning, with the concept of using social proof to entice users to buy related products.
the top 5 start ups that received venture funding this week
The following are the start ups that got funding this week
- Snapbizz: Retail technology firm
Investors: Ratan Tata, former Chairman of the Tata Group
Amount: Undisclosed
- LodgIQ: hotel revenue management systems
Investors:Highgate Ventures and Trilantic Capital Partners.
Amount :$5million
- Jugnoo: auto-rickshaw ride aggregator
Investors: led by TV and film actress Saumya Tandon
Amount: Undisclosed
- Diligent Corp: Maker of boardbook apps
Investors: Insight Venture
Amount :$624 million
- Qualia Media: SaaS start-up
Investors : 15 investors led by Verizon Ventures
Amount : $5.5million
The other VC deals this week:
- Hometeam, $5 million: seniors home care provider
Investors:Kaiser Permanente Ventures
Amount: $5 million
- Opternative: ophthalmic start-up that allows users to get an eye exam via its app
Investors: led by Jump Capital, along with Tribeca Venture Partners, Pritzker Group Venture Capital, and Chicago Ventures
Amount: $6 million
Thursday, February 18, 2016
5 best practices on creating spiderable link structure for search crawlers
Links are the bedrock of the world wide web. Search engines rely on links to rank websites, and search algorithms depend a great deal on the link graph created by human editors.
The quality of a site, and ultimately its chances of appearing in the SERPs, is determined to a large extent by the search spiders which crawl these sites, picking up linking signals about who links to them. Each link is used as a citation and a positive signal for the site being linked to.
This means your website needs to be search friendly to allow the crawlers to spider it. However, many site owners obfuscate their site's navigation, which in turn obfuscates the link structure to such an extent that search crawlers cannot find the links; this limits spider accessibility and thus impacts SERP rankings.
Described below are some of the best practices for creating a link structure for your website. Each of these factors affects the crawlers' ability to spider your site.
1) Links in submission-required forms: Search spiders cannot submit forms, so content and links accessible only via a form are invisible to search engines.
2) Links in hard-to-parse JavaScript: If you use JavaScript for links, you may find that search engines either do not crawl the embedded links or give them very little weight. (In June 2014, Google announced enhanced crawling of JavaScript and CSS. For a review of how your site may render, go to Google Search Console > Crawl > Fetch as Google; you need to log in to Google Webmaster Tools.)
3) Links in Flash, Java, and other plug-ins: Links embedded inside Java applets and plug-ins are invisible to the search engines. In theory the engines are making progress in detecting links within Flash, but don't rely too heavily on this.
4) Links in PowerPoint and PDF files are no different from Flash, Java, and other plug-ins. Search engines sometimes report links seen in PDFs and PowerPoints, but it's not yet clear how much weight they carry.
5) Avoid linking to pages blocked by "nofollow" or robots.txt: If your link points to a page blocked by a meta robots tag or robots.txt, or carries rel="nofollow", it is almost equal to a dead link. Both factors prevent search crawlers from passing PageRank juice to the linked pages, and negate the link's ability to serve as a citation for other websites.
6) Links on pages with hundreds of links: Google's guidelines on linking (according to Matt Cutts) state that its crawler may stop spidering a page that has more than 100 links, though this is just an indicative number. Limiting a web page to roughly 100-200 links will ensure that the crawlability of the page in question is not affected.
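The link-count and nofollow checks above are easy to run yourself. Here is a minimal sketch using Python's standard-library `html.parser` that counts the `<a href>` links on a page, counts how many carry rel="nofollow", and flags the page if it exceeds the indicative 100-link threshold (the class and function names are my own, not from any SEO tool):

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Counts <a href> links on a page and the subset marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return  # anchors without href are not links
        self.total += 1
        if "nofollow" in (attrs.get("rel") or ""):
            self.nofollow += 1

def audit(html, max_links=100):
    """Return (total links, nofollow links, over-the-indicative-limit flag)."""
    auditor = LinkAuditor()
    auditor.feed(html)
    return auditor.total, auditor.nofollow, auditor.total > max_links

page = '<a href="/a">A</a><a href="/b" rel="nofollow">B</a>'
print(audit(page))  # (2, 1, False)
```

Running this over your own pages gives a quick picture of which ones are at risk of being only partially spidered.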
Wednesday, February 17, 2016
LinkedIn and Pinterest fall from grace from top 5 social media sites
Snapchat and Twitter have surpassed LinkedIn in terms of traffic. Facebook leads social network market share with over 1.5 billion users, followed by Tumblr and Instagram with 500 million users each. Twitter and Snapchat share the honors for 4th and 5th place.
Pinterest and LinkedIn have seen their places taken by Instagram and Snapchat.
5 amazing things you can do with YouTube: from making a ringtone to creating a dubsmash
5 things you never knew you could do with YouTube: You can do some cool and amazing things with YouTube. Did you know you can create your own ringtones, become a virtual DJ by remixing multiple songs, create a dubsmash, get access to NSFW and flagged videos, save only a specific part of a video, bypass regional filtering, and even add your own special effects and subtitles, all without having to get off your chair for a break? You can do all of these amazing things with YouTube in just under 15 minutes.
1) Making ringtones from YouTube: The site MadRingtones enables you to make ringtones from YouTube videos or MP3 files on your computer. To do this, enter the URL of the video or MP3 file and click 'Load'. Listen through, specify the tone's start and end points, and then download it to your computer as an MP3, AMR, OGG or M4R file.
2) Playing only the interesting part of a video: You can use the '#t' URL trick to skip straight to the interesting part. Alternatively, check out TubeChop or Splicd, which let you input the start point and duration of the section you wish to save. You can then embed the chopped video on your blog or share it on the social web.
3) Create a remix or a dubsmash: Looking to create a DJ track by remixing your favorite songs? Don't worry. Just use the app called 'zYtDub', a video dubbing app that lets you easily dub any YouTube video with audio from another video.
Add the YouTube URL of the video you wish to dub, along with the audio or video to be used as the background score, and hit 'Dubbo'. This lets you play your favorite video with a different audio recording over it.
4) Watch removed, flagged or NSFW videos: NSFWYouTube is a site that lets you watch videos that have been flagged, removed or tagged as NSFW. Just replace the 'youtube' in the URL (only the 'youtube', not the entire URL) with 'nsfwyoutube' and watch to your heart's content.
5) Turning off the related videos section: Irritated at having to sit through those related videos? Just add '&rel=0' to the end of the URL in the embed code and you have turned off the related video suggestions!
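The URL tricks in items 2 and 5 are just query parameters, so they are easy to script. Below is a small sketch that builds a watch URL jumping to a given start time (the '#t'/'t=' trick) and an embed URL with related videos switched off via rel=0; the helper names and the sample video ID are illustrative, not from any official API:

```python
from urllib.parse import urlencode

def watch_url(video_id, start_seconds=None):
    """Build a watch URL that jumps straight to the interesting part."""
    params = {"v": video_id}
    if start_seconds is not None:
        params["t"] = f"{start_seconds}s"  # e.g. t=90s starts 90 seconds in
    return "https://www.youtube.com/watch?" + urlencode(params)

def embed_url(video_id, related=False):
    """Embed URL; rel=0 turns off the related-videos suggestions."""
    url = f"https://www.youtube.com/embed/{video_id}"
    if not related:
        url += "?rel=0"
    return url

print(watch_url("dQw4w9WgXcQ", 90))
# https://www.youtube.com/watch?v=dQw4w9WgXcQ&t=90s
print(embed_url("dQw4w9WgXcQ"))
# https://www.youtube.com/embed/dQw4w9WgXcQ?rel=0
```

Paste the embed URL into an iframe on your blog and the related-video grid at the end of playback stays hidden.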
Tuesday, February 16, 2016
best resolution vs price comparison: top set-top media players
The best streaming players compared by output resolution, rating and price
moto smartwatch vs one touch alcatel
Smartwatches comparison: Motorola Moto 360 compared with Alcatel One Touch
3 steps to get your site back after a google penalty
While getting penalized by Google is bad enough, getting your site back into Google after it has been flagged and penalized can be a daunting experience.
But you don't have too many reasons to worry if you follow the steps outlined below.
What you need:
1) Access to Google Webmaster Tools
2) Access to a bulk domain authority checker (the "Bulkdachecker" site used below)
3) Google Drive
4) An Excel sheet to map your progress
Penalties are of 2 types:
1) Manual penalty: due to unnatural linking
2) Algorithmic penalty: due to a conflict between your site's SEO and Google's ranking algorithm
Today we will cover only the manual penalty.
Google looks at your website almost like a credit report, checking who links to whom, how important each link is and who is referring you. To cut a long story short: if you've come to a party, Google just wants to know who brought you to the party.
A manual penalty is also called an unnatural link penalty, as it is caused purely by unnatural and artificial links that project your site in a certain way: links that are irrelevant, links meant to manipulate and artificially pass link juice, links that are broken, and bought links intended to manipulate PageRank.
HOW DO YOU KNOW IF IT'S A PENALTY: (1) If your site has been losing a lot of backlinks fast compared to earlier, it's time for you to check your link building.
(2) If your site has indeed been flagged by Google and this has been validated by the search team, you will get a mail from Google asking you to take another look at your unnatural or artificial linking, and asking you to resubmit a request to index your site once you have removed the links in question.
Step 1) Solving a manual penalty: Go to Webmaster Tools, click on Search Traffic and then "Links to Your Site". Download the entire list as a CSV.
Step 2) Now go to the site "Bulkdachecker". This tool will identify which of the links pointing at your site have the lowest domain authority. Upload your links one by one, or upload the entire list via Excel or CSV. Once you click "check", it shows you all the links with their domain authority. Now review each link and work on removing all the links with a low DA (a domain authority below 25).
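Once you have the domain-authority results back as a CSV, the triage in Step 2 is a simple filter. Here is a minimal sketch, assuming a CSV with "url" and "da" columns (these column names are an assumption about the export format, not a standard), that picks out the links below the DA-25 threshold for your removal list:

```python
import csv
import io

def low_authority_links(csv_text, da_threshold=25):
    """Return the URLs whose domain authority falls below the threshold.

    Assumes a CSV with 'url' and 'da' columns, as might be exported
    from a bulk domain authority checker (column names are assumed).
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["url"] for row in reader if int(row["da"]) < da_threshold]

sample = (
    "url,da\n"
    "http://spammy.example/page,12\n"
    "http://good.example/,55\n"
)
print(low_authority_links(sample))  # ['http://spammy.example/page']
```

Keeping the output list in your Excel tracking sheet makes it easy to record which webmasters you have contacted and which links are gone before you file the reconsideration request.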
MANUAL PENALTY: A manual penalty is a penalty you receive from a real live human, after the site in question is flagged by Google's spiders. This does not happen on the whims and fancies of the spiders, but is based on both spider signals and human review.
HOW IS THE PENALTY INFLICTED: Any website online that can be found by a search engine is being evaluated on who knows whom and who is recommending whom. I know "Content is King" has been a favorite of webmasters for the last 15 years, and it continues to reverberate across search engine forums and search gurus alike. But the fact of the matter is that your content alone does not dictate how well your site ranks on Google. In fact it can't; Google has made sure of that, because of all the gaming and content manipulation that happened earlier.
[Screenshot: This tool helps you to identify links with a low domain authority and remove them]
[Screenshot: This is the kind of email you will receive if you're guilty of unnatural linking]
Monday, February 15, 2016
worldwide pages indexed by Google stand at 50 billion vs Bing's 5 billion
[Chart: Bing vs Google's indexed pages]
The saffron color shows the number of pages indexed by Bing since 2014, which stands at 5 billion as of February 16, 2016. The Dutch Indexed Web contains at least 241.41 million pages (as of Monday, 15 February 2016).
[Chart: Number of pages indexed by Google in the last 3 months; a tool to check worldwide indexed pages by search engines in real time]