Google’s fluctuating search rankings can be confusing.
Your new site or page might rank well initially but, after a week, it begins to drop, declining over time to a middle-of-the-pack slot. Why?
Matt Cutts, head of Google’s Webspam team, recently put together a video to explain the phenomenon.
Cutts said Google’s algorithms can, initially, have a hard time figuring out the original source of new content. Time changes that, however.
“Writing a search engine is kind of a complex task,” Cutts said. “You’re trying to make sure you return the best quality result but you also have to do that with limited information.”
Cutts likened the ranking process to the reporting of breaking news during an earthquake.
For instance, he said, one minute after an earthquake occurs there is limited information about what happened; ten minutes later there is slightly more information; and an hour later, a lot more.
“With any event that has breaking news, it can be hard to know (what is accurate) even if multiple people are saying the same thing,” he said. “One person might be the original author; another person might be using an RSS” feed to relay information.
“It can be difficult to try to suss out … where this content was appearing originally. And, over time, over the course of hours, days or weeks, that gets easier. But it can be hard after just minutes or hours.”
Cutts said initial rankings are often Google’s “best guess” on how relevant a page or piece of content is.
As more information becomes available, Google incorporates its new knowledge into the mix and, “typically, things settle down into a steady state,” Cutts said. “When there is a steady state, we’re better able to guess how relevant something is.”
Armed with that information, Google can then determine whether the page or content is better served by QDF (query deserves freshness) treatment or treated as evergreen content.
QDF is a component of the Google algorithm for queries that need frequent updating, such as breaking news stories.
If Google determines the best results for a particular search will change daily, or even hourly, it will designate the search QDF. Such a designation means new and relevant content will make its way to the top of the search results. It also means, however, that content will quickly be bumped as newer relevant content is posted.
Evergreen, on the other hand, is a term Google applies to pages or sites that are frequently updated and are likely to attract both first-time and repeat visitors.
“A lot of people think there should be one set of rankings, it should be completely uniform, everybody in the world should see the exact same thing,” Cutts said. “The fact is, we have different results for people in different countries, even in different cities.
“And the results can change over time. Not just because links change or the content on the page changes, but because we are better able to assess which pages are more relevant.”
Mysteries buried deep underground in London’s labyrinth of Tube tunnels could soon be coming to the surface.
Exploring London Underground’s abandoned stations, or ‘ghost’ stations, is limited to just a couple of small TfL-led tours a year. Tickets sell out quickly despite virtually no publicity, as the curious clamour to see parts of the city which have been closed off to the public for decades.
Now, one of the most historically significant of London’s ghost stations, Down Street, is set to reopen as a tourist attraction – 80 years after closing to passengers.
The enterprise is the creation of Ajit Chambers, founder of The Old London Underground Company, and following four years of careful planning he now has a consortium in place to acquire the lease for the site.
During the Second World War, Down Street, which lies on the Piccadilly line between Green Park and Hyde Park Corner, served as an underground bunker for Winston Churchill and his war cabinet.
The plan is to recapture that moment in the station’s history in an interactive World War II exhibit and open it up to tourists.
But Down Street isn’t the only station on the project’s route map. Ajit has submitted proposals to bring 26 abandoned Underground stations back to life as money-making entertainment venues.
There are plans already in place to convert Brompton Road, another Tube station acquired by the MOD to help in the war effort, from an antique Underground station building into a heritage tourist attraction, climbing wall and roof-top restaurant included.
Ajit said: “In a credit crunch some projects will deliver revenue in a sustainable fashion, others will purely be political moves. This project is both – it delivers revenue created from state-owned assets, as outlined by David Cameron, and highlights any bad political behaviour hampering projects that are crucial to London’s growth.
“I based my model on Alcatraz, which is a tourist attraction that was assisted by the mayor of San Francisco, and designed the Ghost Station Project, with the assistance of our mayor Boris Johnson.”
Since setting out on the project in 2009, Ajit, a former City banker, has been in constant talks with TfL and the MOD about gaining access to the stations and has already secured £20 million from private sources to finance the scheme. But with many of the sites sitting alongside a live railway, both Boris and TfL have needed convincing that it can be done safely and without dipping into the public finances before fully backing the project.
News titles from around the world have contacted Ajit wanting to feature the project. The Old London Underground Company website received 700,000 views in just 48 hours after being featured in a BBC article online.
Project partners are also now queuing up to get involved. Ajit is currently in talks with the owners of the Dominion Theatre about a possible collaboration.
“I’ve just been into a site visit. I took my construction team and TfL left us the keys for six hours. They’re now saying yes and we’re about to take the lease on Down Street,” added Ajit.
“I have also found a section of track that was used to take people to the Dominion Theatre. This site is of particular interest to me as the nearby section holds the tunnels that secured the Elgin Marbles in the Second World War.
“The most exciting thing is that there will finally be a project that will harness the spirit of London – opening the ghost stations in the world’s oldest Underground system.”
Location: Solihull, United Kingdom
A forward thinking and dynamic marketing and events agency based near Solihull is seeking to appoint competent and professional event managers to manage a portfolio of high-profile leisure events for a prestigious client base.
This is a varied and diverse role that requires somebody with proven event-management skills who can work under pressure to tight deadlines.
Managing a portfolio of events...
We require a classified sales executive and event administrator for our niche, international B2B magazine, website and events. The role is 10-12 month maternity cover. Ideally you will be an experienced, assertive sales person with strong administration skills, looking for new challenges. While working as part of a team, you will have plenty of initiative and be able to work independently, with great...
- Conduct quality assurance audits of products and processes and review policies and procedures to ensure they are accurate and being followed. Confirm the product meets quality control standards and the company's expected level of quality.
- Work with VCE Supply Chain on resolving component supplier and contract manufacturer quality related issues and identifying opportunities to improve quality and reduce non-conformities. Ensure supplier Corrective Action Requests (CARs) are being properly issued and closed.
- Work closely with contract manufacturer’s quality group.
- Manage Field complaints and issues and Missing/Wrong/Damaged (MWD) process.
- Monitor and manage projects that directly impact company success while capturing metrics and statistics.
- Data collection and metric reporting.
- Ability to lead teams, manage projects, and coordinate and facilitate cross-functional meetings. Document and track action items and drive issues to closure.
- Must communicate effectively, with excellent verbal and written communication skills.
- A minimum of five years in quality assurance roles. Experience includes strong knowledge of quality and project management and the understanding and use of quality assurance system applications.
- Six Sigma and Lean certifications preferred.
- ISO certification experience and experience in performing ISO audits a plus.
- Education: Bachelor’s Degree in a quality-related field is preferred.
If you think you have what it takes to continue the remarkable growth and success of VCE, we invite you to explore a career with a recognized industry leader. We will give you the chance to learn more and accomplish more, faster than you ever thought possible. The Cloud is waiting. What are you waiting for?
ServiceSource is the global leader in recurring revenue management. The world's most successful companies rely on us to maximize subscription, maintenance and support revenue, improve customer retention and increase business predictability and insight. ServiceSource delivers results with Renew OnDemand, the world's only cloud application built specifically to manage and grow recurring revenue, which can be combined with our industry-leading services.
With over a decade of experience focused exclusively in growing recurring revenue, our services and applications are based on proven best practices and global benchmarks. The company is headquartered in San Francisco, and has over $8 billion under management for customers in more than 150 countries and 40 languages.
Under the direction of the Regional Manager, Global Customer Support, this position will be responsible for the support of ServiceSource’s Recurring Revenue Management application product suite. This role will assume the responsibility of inbound Incident triage, resolution, application configuration changes, and additional application administration activities as deemed necessary. This role will interact regularly with ServiceSource Contact Center personnel, Advanced Support Engineers, and Global Customer Support Regional Managers for the purposes of ensuring timely Incident resolution, high customer satisfaction, and the ongoing and efficient delivery of support services to ServiceSource’s customers. The role will be based out of ServiceSource’s Dublin Service Center and will require little to no business travel.
- Triage and resolution of Incidents originating from inbound Contact Center or Support Tools
- Efficient routing of Incidents to other business partners within the organization
- Effective routing and escalation of Incidents that require advanced support skills in a timely manner
- Adherence to Service Level Agreements (SLAs) as they apply to Incident triage, management, and resolution
- Function as liaison between internal and client application users and the Advanced Support Engineers, as necessary
- Identify the root cause of recurring Incidents and escalate for resolution
- Perform production environment updates while adhering to established policies and practices
- Contribute to and utilize existing documentation, websites, FAQs, wikis, and forums to resolve Incidents
- Document process steps for the resolution of known application issues
- Bachelor’s degree in Math/Science/Computer Science or related field
- 1-3 years of Information Technology experience
- 2+ years of experience in configuring, administering, and managing enterprise level applications
- 1+ year supporting a SaaS solution
- 1+ year experience in Customer Support that includes regular interactions with users and meeting stated Service Level Agreements
- Ability to analyze, identify and resolve basic application problems, as well as the judgment to determine when to escalate for additional support to ensure client requirements and objectives are met
- Must possess strong interpersonal skills and excellent verbal/written communication skills
- Able to manage multiple projects simultaneously and deal with conflicting priorities
- 2+ years' experience using the Microsoft Office suite of applications, including MS Word and MS Excel
- Experience using and/or administering Microsoft Dynamics CRM
- System development experience (.NET, SQL, Java, SharePoint)
- Understanding of dimensional models, drill paths, slowly changing dimensions and other data warehousing concepts
- Familiarity with Node.JS and MongoDB
- Familiarity with Incident Management tools such as ServiceNow, ServiceCloud, etc.
- Familiarity with Microsoft Visio or other process mapping applications
ServiceSource offers an attractive competitive salary and benefits package.
Cloud computing continues to be the great equalizer for small business by making technology more affordable and accessible than ever before. And with the recent availability of Office 2013 and revamped Office 365 plans, small business owners may want to take another look at the value proposition offered by Microsoft's subscription-based Office 365.
Here are three compelling reasons why an Office 365 subscription may make sense for smaller businesses.
3 Reasons to Choose Office 365
1. Low upfront cost
Small businesses can sign up for an Office 365 plan that meets their exact requirements for a predictable monthly fee. Compare that to setting aside funds to purchase new server hardware, software licenses and CALs for the requisite server operating system and Exchange Server. This hefty, upfront investment could set back a typical small business by thousands of dollars.
On Microsoft's cloud computing front, Office 365 subscription plans offer a monthly fee that can go as low as $4 per user/month for businesses interested only in hosted email. In addition, small businesses can buy subscription-based access to the desktop version of Office 2013 productivity suite for about $15 per user/month under the Office 365 Midsize Business plan.
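The trade-off can be sketched with a quick back-of-the-envelope calculation. The $15 per user/month rate comes from the Midsize Business plan mentioned above; the upfront on-premises figure and the head count below are illustrative assumptions, not published Microsoft pricing.

```python
# Back-of-the-envelope comparison of an Office 365 subscription versus an
# on-premises deployment. The upfront cost and user count are assumptions.

def months_to_break_even(upfront_cost: float, monthly_per_user: float, users: int) -> float:
    """Months until cumulative subscription fees equal the upfront outlay."""
    return upfront_cost / (monthly_per_user * users)

# Assumed: $6,000 for server hardware, licenses and CALs, versus the
# $15 per user/month Office 365 Midsize Business plan for a 10-person shop.
months = months_to_break_even(6000, 15, 10)
print(f"Subscription fees reach the upfront cost after {months:.0f} months")
```

Until that break-even point the subscription frees up cash, and the comparison ignores ongoing on-premises costs such as power and admin time, which would push break-even out even further.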
2. Multiple licenses
Another big plus that Office 365 plans offer, aside from the low upfront cost, is that small businesses can install the desktop apps on up to five PCs or Macs per user. This is perfect for businesses that embrace BYOD, as it allows employees to install the Office productivity suite on a second laptop or Windows 8 tablet such as the Dell Latitude 10 or Lenovo Tablet 2.
The Office 365 Midsize Business plan, for example, costs $15 per user/month and includes Word, Excel, PowerPoint, Outlook, OneNote, Access, Publisher, Lync and InfoPath. The slightly cheaper Office 365 Small Business Premium offers the same software minus InfoPath.
3. Exchange online
Hosted Exchange used to cost about $20-$25 per user/month when Microsoft first made it available. The mainstream popularity of cloud services today, however, has forced traditional software vendors such as Microsoft to redesign Exchange for the cloud. The market pressure resulted in a robust Microsoft-hosted Exchange Online service at a highly competitive price: as low as $4 per user/month.
Unless your company has compelling legal or compliance concerns, Exchange Online offers a far more appealing cost-of-ownership proposition than an on-site Exchange deployment. Moreover, Exchange Online also includes spam and malware protection, which costs extra if you buy an on-premises solution.
Cloud computing is considered the next big advancement in information and communication technology. Cloud technologies are characterized by hosting computing and storage services in a remote, virtual location, requiring little to no intervention on the user’s end in the implementation of their computing environments. The introduction of the cloud has caused a huge revolution in services and computing, creating vast possibilities for storage, computing power and virtual applications, among other fields. The benefits for small and large businesses alike are undeniable as enterprise-ready cloud products take hold in global computing.
Document Collaboration

This is one of the most popular uses of the cloud: productivity applications are hosted on a web-based portal for all the members of a business or organization to access and update in pursuit of the business’s objectives. Document collaboration gives a whole new meaning to the global office, where members of a team scattered all over the world can work on a critical document at the same time to increase their collective productivity.
Low Operational Cost
The initial investment is low, and operational costs are quite negligible, normally limited to managing the hardware used to access cloud services. This has encouraged many businesses to take up the service and develop their product and corporate operations around cloud technologies.
Unified Communication

Most cloud environments have real-time text, video and audio communication portals for all of the team members in a small or medium-sized company. This reduces the investment in proprietary voice and video communication systems to a bare minimum. The portal provides the subscribing company with the opportunity to access all communication methods from one location using a unified log-in service.
Data Security

Data is secured and encrypted behind industry-grade protocols operated by the companies that manage the cloud hosts. Access to information comes with attached privacy policies which bind the business owners and the cloud hosting services to an agreement to maintain the integrity of the data in all eventualities.
Automated Backups

Performing regular, real-time backups to protect data from loss or possible corruption has often been a challenge for many industries, companies and businesses. With cloud services, this process is automated and the availability of the data is guaranteed whenever access is needed.
A lot of site owners, especially SMBs, are still looking for the ‘Field of Dreams’ effect with their content marketing efforts. If you build it they will come… may have worked for Kevin Costner but, for the rest of us, waiting and hoping is no way to build your online brand presence. A lot of small business owners have a hard time wrapping their heads around the fact that Google isn’t there to help their business succeed. Google doesn’t really care about your business (as harsh as that sounds); they care about their customers, the searchers. If your content helps Google deliver the best possible search results to their users then it will do well organically. If someone else is doing a better job, your content will get pushed deeper and deeper into the SERPs.
Just because you write and publish a piece of content, even a great piece of content, that doesn’t automatically mean Google will love it or reward it. You have to earn the trust and respect of the search engines; something that comes with time and consistent effort.
In a recent interview I did with Ann Handley, the Chief Content Officer of MarketingProfs, she summed it up perfectly:
“Good content is only noticed if sharing is a key part of any content marketing effort. You can’t expect Google to do all the work for you — you’ve got to actively share and engage on social media channels as well.”
Are you actively promoting your content on social media? Obviously the big sites like Facebook, Twitter, Google+ and LinkedIn (for B2Bs) are a good place to start, but there are dozens of other social platforms you could be leveraging: Tumblr, Pinterest, and Vine are some of the other popular ones, but smaller, industry-specific social communities like Inbound.org (for SEOs) are great places to start building your social presence. Look for forums and communities that cater to your audience. If there is a niche, you can almost bet there is a social site somewhere that you can promote your content on (such as Ravelry, a site for knitters and crocheters).
By actively promoting your content on social networking sites, every time your content is shared, it creates a social signal — basically a thumbs up in the search engine algorithm. Although we don’t know for sure how influential social signals are, the search engines have admitted that shares are a factor — the idea being that often-shared content is more valuable and, therefore, worthwhile in the search results. The Bing-Facebook integration, for instance, is trying to tap into the social opinion of the Web, making search results more relevant and personalized for each user. A piece of content that has been shared by someone in that user’s network (generating a social signal) could show up in the SERPs on Page 1 while for another user it might still be bogged down on Page 3. The more times your content is shared, starting with your own profiles of course, the more valuable it becomes in the eyes of the search engines.
Both Bing and Google have confirmed they take into account a person’s authority when looking at social links. This means it’s far more valuable to have one real human with a real social presence sharing your content than to have 10 bot accounts Tweeting a dozen times a day. The search engines know that spammers are looking to take advantage of social signals for their own gain, and author authority is one way they can prevent that. Don’t waste your time amassing tons of “followers” that are spam or bot accounts. I’d rather have 100 dedicated, engaged social connections that actually share my content to their own networks than 1,000 dummy accounts that just make the numbers look good. Look to connect with real people with real social presence. Just about every real connection, no matter how small, is worthwhile.
No matter how great your content is you have to do some heavy lifting if you want to get the word out and get that piece of content doing well organically. Don’t wait for things to pick up—build the momentum yourself.
Just as Google’s latest Panda refresh is finishing up its 25th rinsing of online websites in Google’s main index, one has to seriously wonder if affiliate marketing on the Web is history. Again, according to feedback in many online forums and blogs, affiliate marketers seem to have suffered greatly under this whole series of Panda updates — which serve out “site-wide” penalties if your site is judged inferior or simply not worthy of the Google Index.
But Panda is just one of the many measuring rods Google has brought into play in the last few years. Let’s not forget its just as powerful cousin Penguin, which targets your site’s link profile. Then there is also the ‘Top-Heavy Update’ that penalizes your content if it has too many ads above the fold. Then there is the ‘EMD Update’ that penalized and down-graded sites that had the exact keyword match in the URL. Everyone knows this was an extremely effective way of targeting Google search traffic for your affiliate products.
We won’t even consider the upcoming “Merchant Quality Update” which Google’s Matt Cutts says is debuting in the near future, along with another “big” Penguin update. Will affiliate sites and affiliate marketing take more blows from the mighty G in the near future?
Now, we all know Google has stated it’s not really targeting affiliate sites per se, just those affiliate sites that have little content or don’t add any value to the whole mix. As always, user experience is all-important to Google — it wants anyone using its search engine to have the best experience possible. Sites or Web pages covered in ads are not exactly what the searcher is looking for on the Internet.
Supposedly, affiliate sites that offer valuable tips, advice, reviews and editorial content will not suffer in the rankings. As the broken record says, content should be king and any affiliate links should be an afterthought. In other words, add valuable content and one should have no problem with Google’s countless algorithm updates. Many webmasters and affiliate marketers are probably cringing at that statement, as many innocent sites have been down-graded (unfairly, in the eyes of those affected) by Google’s countless moves to improve its search results.
Whether intended or not, many affiliate marketers have been hit by these Google changes and are struggling to survive without valuable search engine traffic from Google. Considering Google controls the majority of this search traffic, affiliate marketing has indeed taken a major hit, and many have abandoned their sites. One simply has to ask the question: is Google really trying to wipe out affiliate marketing from the Web?
Not really. One does have to consider that Google has its own affiliate network, which it promotes alongside its AdSense program. Webmasters and marketers can join and promote companies and products within this network. While Google does favor its own products in its SERPs (anyone with YouTube videos can testify to that fact), the jury is still out on whether Google favors its own affiliate network in its results.
One major point I have learned from having many affiliate sites: Google likes Amazon ads or links and hasn’t penalized these (Amazon-only) sites as much with any of the Panda/Penguin changes. Other sites with non-Amazon ads, or even a mixture of affiliate ads, have not fared so well, and some have taken major hits. Now, to be quite fair, this can all be brought back to the “user’s experience” since Amazon is a trusted and popular online shopping site. This factor could result in content/sites being rated higher, or at least not penalized as heavily, when they carry these Amazon affiliate links.
It is going to be interesting to see how the ‘Merchant Quality Update’ plays out and if these Amazon links will survive. I am betting those links and the sites carrying them won’t be touched, but I have been wrong before, so it’s wait and see time. In the same light, used sparingly, I don’t believe having AdSense ads or other ad networks on your site will totally wipe your site out of the rankings in Google.
But if you’re an affiliate marketer, recovering from Panda is not an easy task, especially if you have an older site that needs a major overhaul to recover lost rankings. There have been 25 different Panda updates over the last two years. Google now says it won’t be announcing the next ones; they will just be quietly incorporated into the regular, ongoing changes to its index.
If you’re an affiliate marketer, how can you recover or make your site immune to these Panda updates? Many SEO experts give the same tips or advice:
- Remove or improve any low-quality pages and remove any duplicate content/pages;
- Make sure you have only original high-quality content presented above the fold and this content is not found anywhere else on the Web;
- Make sure you have a clean site with no broken links (interior and exterior) and redirect any “not found” pages via the 301 method;
- Avoid using a site which has a heavy template footprint — same keyword links on all the pages could be seen as duplicate and/or poor quality;
- Check all outbound links and make sure they’re not going to any bad neighborhoods;
- Make sure you reduce page load times by limiting graphics and scripts, and try limiting the number of links on a page to fewer than 100;
- Decrease your bounce rate and increase your page views per user to show your site offers a good user experience;
- Make sure your site and content have a heavy “social presence” and can be easily bookmarked in all the major social networks;
- Use sparingly the number of affiliate links you have on any one page (some super affiliate marketers use an interior “php redirect” with their affiliate links to hide them and to cut down on affiliate theft — use at your own discretion) and make sure you “nofollow” your affiliate links.
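The last tip in the list, nofollowing affiliate links, can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical affiliate URL: it simply renders an anchor tag carrying rel="nofollow" so search engines are told not to pass credit through the link.

```python
# Minimal sketch: render affiliate links with rel="nofollow" so search
# engines don't count them as endorsements. The URL below is hypothetical.
from html import escape

def affiliate_link(url: str, text: str) -> str:
    """Return an anchor tag that search engines are told not to follow."""
    return f'<a href="{escape(url, quote=True)}" rel="nofollow">{escape(text)}</a>'

print(affiliate_link("https://example.com/widget?aff=123", "Buy the widget"))
```

A server-side redirect (the “php redirect” some affiliates use) would go one step further and hide the affiliate URL entirely; the snippet above only covers the nofollow half of the tip.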
Regardless, affiliate marketers and sites that have suffered greatly with the Panda and Penguin updates in recent years must adjust their marketing tactics.
One major shift for me: I try to use the traffic I am receiving more wisely by first promoting affiliate products which have residual income, so one sale gets paid for years to come. Also, I am promoting products and services that have both higher commissions and longer cookie timeframes.
Here are a few other things I have done to make up for some of the lost Google traffic and which will keep your affiliate sites/links visible on the Web and in Google.
- Find other ways into Google’s index and search results. Try YouTube videos; this is a longer route, but it does eventually deliver search engine traffic: Google Index — YouTube Video — Your Site. Along the way, make sure you’re building your own contact list or lists.
- Same as above, but use press releases. I use PRweb to get my content into Google — this is especially effective with holiday shopping events such as Cyber Monday and Black Friday. The Google News listing only lasts a few days, but your release can show up in Google’s SERPs for much longer and bring in targeted keyword traffic from Google. But be warned, Google is rumored to be cracking down on these releases in the near future.
- Try promoting on social networks like Facebook, Twitter, Google+ and LinkedIn to get traffic flowing to your affiliate sites. Frankly, I don’t find social traffic and advertising to be half as effective as search traffic, but you will still make some sales. I am currently using Facebook and was quite taken aback by all the negative feedback until I removed my sponsored story ad — users don’t take kindly to having these ads in their news feeds. At least this has been my experience. Side ads are drawing traffic and no negative feedback.
- If your site or sites have been affected badly by the Panda/Penguin onslaught, make sure you have Authorship Markup in place within Google. This way you can promote your name and brand all across the Web where your content is featured. This will still give you online visibility, even if Google has down-graded your affiliate site or sites into oblivion.
- Don’t forget the other search engines such as Bing, Yahoo, Dogpile… these may not be the king of the hill, but they do bring in targeted traffic. While it is simply wishful thinking to believe Google will lose its search monopoly any time soon, the fall-out and uproar from all of Google’s changes and privacy issues could bring it down a few notches. Facebook has already knocked Google out of first place countless times as the top site on the Web, so anything may happen.
Lastly, if you’re like me and have more than a few sites, survival could simply mean promoting the affiliate sites that have not been affected by all these Google changes. The adage about baskets and eggs probably holds true: having several or more sites will increase your chances of not being totally wiped out by Google as it narrows its version of a quality Web down to a couple of hundred thousand sites or less.