Websnare Blog

Problem 1: What's it gonna cost?

Quotes are often underestimated. Every project is different, and designers must take many factors into account when gathering client requirements.

Solution 1: Agree to a budget beforehand.

We can tell you if a budget is reasonable for what you're trying to accomplish. If it's tight, we can help you prioritize features and make sure the critical ones are completed before the budget is exhausted.

Problem 2: Requirements are not specific.

You need to be extremely specific and detailed about what the finished site must look like and how it must operate. The overall cost of the project can change a lot based on seemingly minor requirements. You get partway through the project and realize the requirements overlooked some critical feature you really need, or didn't specify it clearly enough. Now all work comes to a halt while the developer renegotiates the contract. The client is unhappy because they're paying more, and the project is late.

Problem 3: Requirements prevent changing to a more suitable solution.

We get partway through building a site and realize that if we had chosen a different approach or platform, the end result would work much better for the client. But we're too far down the current development path to back up, and our original approach does fulfill the requirement. We're unhappy delivering a site that could be better, and our customers end up with a clunkier, less-than-optimal site, but it's easier than going back and renegotiating with the client.

Solution for Problem 2 & 3: Scrap the requirements.

Requirements almost always generate resentment, and they're also largely unnecessary for small web projects. It is important to have a clear agreement about what is being delivered. Unfortunately, there are a ton of variables, and many of them are not discovered until the project is well underway. Doing the groundwork to identify all the possible pitfalls of a project is probably about half the actual work of the project, and in most cases that's far more of an investment than the client wants to make without an actual result. Designers almost always put far more into discovery than planned.

Instead of having hard requirements, we help our customers identify goals and rank them by priority. We start with a previously-finished configuration, and use the budget to modify that configuration towards the goals.

If you read my last blog post, you probably know what Google indexing is: Google crawls your site and adds pages to its index. Today, all results are pulled from the desktop-specific index, but Google’s primary search index will soon be mobile, so now is the time to get a responsive design. If your site isn’t friendly to a mobile audience, you shouldn’t expect it to rank well for certain keywords. When the bot finds that your site is hostile to mobile users or loads very slowly, you’re definitely going to lose rank.

Want to see how your site “looks” to the Googlebot that uses a mobile user agent? Go to Google Search Console (formerly Google Webmasters) and click on the “Crawl” option in the left-hand sidebar. In the menu that appears, click on “Fetch as Google.” The screen that appears gives you the opportunity to crawl your site as the Googlebot and see what it sees. To see it as a Googlebot using a mobile user agent, select “Mobile: Smartphone” from the drop-down menu next to the “Fetch” button. Click the “Fetch and Render” button and the tool will show you exactly what the crawler sees as it touches your website. That will give you a good idea of how mobile-friendly your site is.
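If you'd rather check from the command line, the same idea can be sketched in a few lines of Python: fetch a page while presenting a mobile-crawler user agent and inspect the HTML it returns. This is a minimal sketch, not an official tool; the user-agent string is an illustrative approximation of Google's mobile bot (check Google's crawler documentation for the current official string), and `build_mobile_request` / `fetch_as_mobile` are hypothetical helper names.

```python
import urllib.request

# Illustrative mobile-bot user-agent string (assumption: see Google's
# crawler docs for the current official Googlebot smartphone UA).
MOBILE_BOT_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)


def build_mobile_request(url):
    """Build a request that presents a mobile-crawler user agent."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_BOT_UA})


def fetch_as_mobile(url):
    """Fetch a page roughly the way a mobile crawler would receive it."""
    with urllib.request.urlopen(build_mobile_request(url), timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Note that this only shows the raw HTML the server sends to a mobile bot; unlike Fetch and Render, it doesn't execute JavaScript or render the page.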

The bottom line: your prime directive needs to be optimizing your site for a mobile audience for better reach. Websnare can quickly build your responsive design!

When you Google something, you are searching Google’s index of the web. Google is constantly following links to see where they go. Its spiders start on major websites, work through each site, and follow links to other websites, where they repeat the process, going on and on until they reach presumably every website that is linked from another. As the spiders crawl over the web, they send information back to Google about what each website is about, based on the keywords they find on the site. That information is then stored in Google’s index. When people search Google, the Google algorithm ranks the best results from its index, from first to last.
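At its core, the crawling process described above is a breadth-first walk over links. Here's a minimal sketch in Python using only the standard library; `extract_links` and `crawl` are hypothetical names, and a real spider would also respect robots.txt, rate limits, and far more.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkCollector(HTMLParser):
    """Collects the href targets of <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return every link found in an HTML page, as absolute URLs."""
    collector = LinkCollector(base_url)
    collector.feed(html)
    return collector.links


def crawl(start_url, fetch, limit=100):
    """Breadth-first crawl: fetch a page, queue its links, repeat."""
    seen, queue, pages = {start_url}, [start_url], {}
    while queue and len(pages) < limit:
        url = queue.pop(0)
        pages[url] = fetch(url)          # download the page
        for link in extract_links(pages[url], url):
            if link not in seen:         # never visit a URL twice
                seen.add(link)
                queue.append(link)
    return pages
```

Here `fetch` is any callable that returns a page's HTML, so you can point it at a real downloader in practice or at a dictionary of canned pages for testing.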

Therefore, in order to have your website rank in Google, you must make sure your site is in the Google index. It will not matter if you have the right content management system or the right keywords in place; if your pages aren’t in the index, they will not show up in Google search results.

There are two ways to get indexed:

  1. Get a number of links to your website; eventually the Google spider will come to your site, crawl it, and add it to the index. The problem with this is that if certain pages aren’t linked from the main page, Google may not crawl or index them.
  2. Provide Google with a sitemap and ask them to crawl the site and submit it to their index. This is accomplished in Google webmaster tools and is the preferred way to get websites into the index. Further, as the website is updated, the sitemap is updated and tells Google there is a change that needs to be indexed again.
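A sitemap is just an XML file listing your pages in the sitemaps.org format. Here's a minimal sketch of generating one with Python's standard library; `build_sitemap` is a hypothetical helper name, and real sitemaps often also carry optional fields like `<lastmod>`.

```python
from xml.etree import ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(root, encoding="unicode")
```

Write the result to a `sitemap.xml` at your site root, then submit that URL in Google's webmaster tools.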

A Google penalty is the result of the constantly changing algorithm used to crawl and inspect websites. You may no longer be listed in search results, or your ranking for targeted keywords may have dropped dramatically. When your target audience can't find you, it affects revenue.


There are two main penalties. The first one is a manual action from Google’s spam team, and the second one is an algorithmic penalty.

1. Manual Action – handed down by a Google employee, most often when your website is found to be doing something against Google's Terms of Service. This may include a virus infection, cloaking, deceptive redirects, or buying links.

2. Algorithmic Penalty – may impact websites that have thin or duplicate content, keyword-stuffed copy, slow loading times, or a lack of incoming links. With this kind of penalty, you will still be ranking in search, but probably much lower.


To ensure your site doesn't receive a Google penalty, make sure your site does not have:

  • spyware, adware, or viruses
  • hidden links or text
  • cloaking (displaying a different version of a webpage to the search engine robots)
  • deceptive redirects (when a visitor is automatically taken to another page without clicking anything)
  • pages loaded with irrelevant keywords
  • keyword stuffing
  • a substantial amount of duplicate content


Getting rid of a Google penalty is doable. Numerous websites have recovered from all types of penalties.

The key to removing any Google penalty is to understand what caused it. Google’s Webmaster Guidelines can provide great guidance. Once you identify the reason for your penalty, you have to remove the backlinks that led to your rankings drop and disavow the ones you cannot remove. Download all of the backlinks from Google Webmaster Tools, and use an SEO tool to get more insights about your links.
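The disavow file itself is just a plain-text list: one full URL or `domain:` entry per line, with `#` lines for comments. Here's a small sketch of assembling one; `build_disavow_file` is a hypothetical helper name, so double-check Google's current disavow documentation for the exact format before uploading.

```python
def build_disavow_file(bad_urls, bad_domains):
    """Assemble a plain-text disavow file: '#' comment lines, then one
    'domain:' entry or full URL per line."""
    lines = ["# Links we asked to have removed but could not"]
    lines += ["domain:%s" % d for d in bad_domains]  # disavow whole domains
    lines += list(bad_urls)                          # disavow single URLs
    return "\n".join(lines) + "\n"
```

Save the result as a `.txt` file and upload it through Google's disavow links tool.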

Just because your site loses ranking in search engine results does not automatically mean you've received a Google penalty. Your competition is growing each day, and the way Google ranks websites is constantly changing. When you think you may have a Google penalty, in actuality, you may need improved search engine optimization methods. If you would like some help, contact Websnare.