SEO Tutorial for beginners that will get you
ranked
SEO In One Day
SEO is simply not as hard as
people pretend it is; you can get 95% of the results with 5% of the effort,
and you absolutely do not need to hire a professional SEO to do it, nor will it
be hard to start ranking for well-picked key terms.
Of all the channels we’ll be
discussing, SEO is the one that there is the most misinformation about. Some of
it is subtle, but some of it is widely spread and believed by so-called SEO
consultants who actually don’t know what they’re doing.
SEO is very simple, and unless
you’re a very large company it’s probably not worth hiring somebody else to do.
It’s also something with a thick veneer of complexity around it. Consultants want
to make it seem incredibly difficult so that they can charge you a lot, but
I'll show you exactly how to do it, step by step, and you'll win.
How Google Works
In order to understand what we need to do for SEO, let’s look back at how Google started,
how it’s evolving today, and develop a groundwork from which we can understand
how to get ranked on Google.
On-Page SEO
The first step in preparing our site to rank is making it clear to Google what our site is about. For now we’re going to focus our homepage (our landing page) on ranking for one keyword that isn’t our brand or company name. Once we do that and achieve that ranking, we can branch out into other keywords and start to dominate the search landscape, but for now we’ll stay laser-focused.
Keyword Research
The first thing we have to do is figure out what that keyword is. Depending on how popular our site is and how long it’s been around, the level of traffic and difficulty we’ll get from this effort will vary.
The Long Tail
There’s a concept we should be familiar with known as the
"long tail." If we were to graph the "popularity" of most things, with popularity on the Y axis and rank order on the X axis, we’d get something like a power-law graph: there are a few enormous hits that get the majority of attention, and after those hits the graph drops off sharply. The long-tail theory says that as we become more diverse as a society, the tail end of that graph will keep stretching out and getting taller.
Think of Amazon. They probably have a few best-selling products, but the majority of their retail revenue comes from a wide variety of items that aren’t bought nearly as often as their best sellers. Similarly, if we were to rank the popularity of the songs played over the last 10 years, a few hits would garner the majority of the plays, alongside a huge number of songs with only a few plays each. Those less popular products and songs are what we call the long tail.
In SEO this matters because, at least in the beginning, we’re going to chase long-tail keywords: very exact, intent-driven keywords with lower competition that we know we can win. Then gradually we’ll work our way toward the head of the curve. Our site won’t outrank ultra-competitive keywords at first, but by being more specific we can start winning highly targeted traffic with much less effort.
The keywords we’re looking for we will refer to as
"long-tail keywords."
Finding the Long Tail
In order to find our perfect
long-tail keywords, we’re going to use a combination of four tools, all of
which are free.
The process looks like this:
1. Use UberSuggest, KeywordShitter and a little bit of
brainstorming to come up with some keywords
2. Export those keywords to the Google Keyword Planner
to estimate traffic level
3. Search for those keywords with the SEOQuake Chrome
extension installed to analyze the true keyword difficulty
Don’t be intimidated — it’s
actually very simple. For this example we’ll pretend we’re finding a
keyword for this book (and we’ll probably have to build out a site so you can see
if we’re ranked there in a few months).
Step 1: Brainstorming and Keyword
Generating
In this step we’re simply going
to identify a few keywords that seem like they might work. Don’t concentrate
too much on culling the list at this point, as most bad keywords will be
automatically eliminated as a part of the process.
So since this is a book about
growth hacking, I’m going to list out a few keywords that would be a good fit:
·
Growth hacking
·
Growth marketing
·
Internet marketing
·
Growth hacking guide
·
Growth hacking book
·
Book about growth hacking
·
What is growth hacking
·
Growth hacking instructions
That’s a good enough list to
start. If you start running out of ideas go ahead and check out
keywordshitter.com. If you plug in one keyword it will start spitting out
thousands of variations in just a few minutes. Try to get a solid list of 5–10
to start with.
Now we’ll plug each keyword into
UberSuggest. When I plug the first one — “growth hacking” — in, I get 246
results.
Clicking “view as text” will let
us copy and paste all of our keywords into a text editor and create an enormous
list.
Step 2: Traffic Estimating
Now that we have a pretty good
list of keywords, our next step is to figure out whether they have enough search
volume to be worth our while.
You’ll likely notice that some
are so far down the long tail they wouldn’t do much for us. For example, my
growth hacking list came up with “5 internet marketing techniques.” We probably
won’t go after that one, but instead of guessing we can let Google do the work
for us. This will be our weeding out step.
Google Keyword Planner
The Google Keyword Planner is
a tool meant for advertisers, but it does give us some rough idea of traffic
levels.
Google doesn't make any guarantee of accuracy, so these numbers are likely only directionally correct, but they're enough to get us on the right track. You'll have to have an AdWords account to be able to use the tool, but you can create one for free if you haven't used AdWords before. Once you've logged in, select "Get search volume data and trends."
Paste in your enormous list of keywords and click "Get search volume." Once you've done so, you'll see a lot of graphs and data. Unfortunately the Keyword Planner interface is a bit of a nightmare to work within, so instead we're going to export our data to Excel with the "download" button and play with it there. Now we're going to decide what traffic we want to go after. This varies a bit based on how much authority your site has, so let's try to determine how easy it will be for you to rank.
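If Excel isn't your thing, the exported list can also be weeded out with a short script. Here's a minimal sketch in Python; it assumes the export has "Keyword" and "Avg. monthly searches" columns (the exact header names vary by Keyword Planner version, so adjust to match your file):

```python
import csv
import io

def filter_keywords(csv_text, min_volume=100):
    """Keep only keywords whose average monthly searches clear a threshold."""
    reader = csv.DictReader(io.StringIO(csv_text))
    kept = []
    for row in reader:
        # Volume cells sometimes contain thousands separators, e.g. "1,300"
        volume = int(row["Avg. monthly searches"].replace(",", ""))
        if volume >= min_volume:
            kept.append((row["Keyword"], volume))
    # Highest-volume keywords first
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

# Hypothetical numbers, stand-ins for a real Keyword Planner export
sample = """Keyword,Avg. monthly searches
growth hacking,9900
growth hacking book,320
5 internet marketing techniques,10
"""
print(filter_keywords(sample, min_volume=100))
```

Anything below your chosen floor, like "5 internet marketing techniques" above, drops out automatically.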
Keyword Trends
Unfortunately, the "keyword difficulty" that Google gives us is based on paid search traffic, not on organic search traffic.
To start, let's use Google Trends to see keyword volume and trajectory at the same time. You can enter all of the keywords at once and see them graphed against one another. For my keywords it looks like this:
The ones I'm most excited about are purple and red, which are "Growth hacking techniques" and "Growth hacking Twitter." Now we'll look at what the competition looks like for those two keywords.
Manual Keyword Difficulty Analysis
To analyze how difficult it will be to rank for a particular keyword, we're going to have to look at the keywords manually, one by one. That's why we started by finding some long-tail keywords and narrowing the list. This process gets significantly easier if you download the SEOQuake Chrome extension. Once you've done that, do a Google search and you'll notice a few changes: with SEOQuake turned on, the relevant SEO data of each site is displayed below each search result. We're going to change what is displayed, so in the left-hand sidebar click "parameters" and set them to the following:
The Google Index: This is how many pages from this base URL
Google has indexed
Page Links: The number of pages linking to the exact domain
that is ranking according to SEMrush’s index (usually very low compared to
reality, but since we’ll be using this number to compare it will be somewhat
apples to apples)
URL Links: The number of pages pointing to any page on the
base URL
Age: The first time the page was indexed by the
Internet Archive
Traffic: A very rough monthly traffic number for the
base URL
Looking at these we can try to
determine approximately what it would take to overtake the sites in these
positions.
You’ll notice that the weight of
the indicators varies: not all links come from equally good sources, direct page
links matter much more than URL links, and so on. But if you google around and play
with it for a while you’ll get a pretty good idea of what it takes.
On-Page SEO Checklist
☐ Your keyword is in the <title> tag, ideally at
the front (or close to the front) of the tag
☐ Your keyword is close to the beginning of the
<title> tag (ideally the first words)
☐ The title tag is under the viewable limit
of 65 characters (optional but recommended)
☐ Your keyword is in the first <h1> tag (and
your page has an <h1> tag)
☐ If your page contains additional header tags
(<h2>, <h3>, etc) your keyword or synonyms are in most of them
☐ Any images on the page have an alt attribute that
contains your chosen keyword
☐ Your keyword is in the meta description (and there
is a meta description)
☐ There are at least 300 words of text on the page
☐ Your keyword appears in the URL (if not the
homepage)
☐ Your keyword appears in the first paragraph of the
copy
☐ Your keyword (or synonyms — Google recognizes them
now) is used other times throughout the page
☐ Your keyword density is between .5% and 2.5%
☐ The page contains dofollow links to other pages
(this just means you’re not using nofollow links to every other page)
☐ The page is original content not taken from another
page and dissimilar from other pages on your site
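The keyword-density item in that checklist is easy to eyeball wrong, so here's a quick way to check it. This is a sketch using one common definition of keyword density (the share of the page's words accounted for by occurrences of the keyword phrase), which is close enough for the .5%–2.5% sanity check:

```python
import re

def keyword_density(text, keyword):
    """Percent of words in `text` taken up by occurrences of `keyword`.

    Density = (occurrences * words in the keyword phrase) / total words * 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw
    )
    return 100.0 * hits * n / len(words) if words else 0.0

# Toy copy for illustration, not real page text
copy = ("Growth hacking is a mindset. This growth hacking guide "
        "covers the basics of growth marketing.")
print(round(keyword_density(copy, "growth hacking"), 1))  # 26.7 on this tiny sample
```

On a real 300+ word page you'd want that number between 0.5 and 2.5; the toy copy above is far too dense.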
If you have all of that in place
you should be pretty well set from an on-page perspective. You’ll likely be the
best-optimized page for your chosen keyword unless you’re in a very competitive
space.
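For reference, here's what a page that ticks most of those checklist boxes might look like, using the hypothetical keyword "growth hacking book" (the copy, title, and file names are all placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keyword at the front of the title, under 65 characters -->
  <title>Growth Hacking Book: A Step-by-Step Guide</title>
  <!-- Keyword in the meta description -->
  <meta name="description" content="A growth hacking book that walks through keyword research, on-page SEO, and link building step by step.">
</head>
<body>
  <!-- Keyword in the page's single h1 -->
  <h1>The Growth Hacking Book</h1>
  <!-- Keyword in the first paragraph of the copy -->
  <p>This growth hacking book starts with keyword research and works up to link building.</p>
  <!-- Synonym of the keyword in the subheader -->
  <h2>Why Read a Growth Marketing Guide?</h2>
  <!-- Keyword in the image's alt attribute -->
  <img src="cover.jpg" alt="growth hacking book cover">
</body>
</html>
```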
All we have left now is off-page
optimization.
Off-Page SEO
Off-Page SEO is just a fancy way
to say links. (Sometimes we call them backlinks, but it’s really the same
thing.)
Google looks at each link on the
web as a weighted vote. If you link to something, in Google’s eyes you’re
saying, “This is worth checking out.” The more legit you are the more weight
your vote carries.
Link Structure
HTML links look something like
this:
<a href="http://www.somesite.com" title="keyword">Anchor text</a>
Where http://www.somesite.com is
the place the link directs you to, the title is largely a remnant of time gone
by, and the linked text — think the words that are blue and you click on — is
called the “anchor text.”
In addition to the amount of
link juice a page has, the relevance of the anchor text matters.
Generally speaking you want to
use your keyword as the anchor text for your internal linking whenever
possible. External linking (from other sites) shouldn’t be very heavily
optimized for anchor text. If 90% of your links all have the same anchor text
Google can throw a red flag, assuming that you’re doing something fishy.
Robots.txt, disavow, nofollow, and other minutiae
Most of SEO at this point is now
managing stuff that can go wrong. There is a lot of that, but we’ll go over
what will cover 99% of needs, and you can Google if there’s something really
crazy.
Robots.txt
Almost every site has a page at
url.com/robots.txt — even Google has one.
This is just a plain text file
that lets you tell search engine crawlers what to crawl and not to crawl. Most
are pretty good about listening, except the Bingbot, which pretty much does
whatever it wants no matter what you tell it. (I’m mostly kidding.)
If you don’t want Google to
crawl a page (maybe it’s a login page you don’t want indexed, a landing page,
etc.) you can just “disallow” it in your robots.txt by saying disallow:
/somepage.
If you add a trailing / to it
(e.g. disallow: /somepage/) it will also disallow all child pages.
Technically you can specify
different rules for different bots (or user agents), but it’s easiest to start
your file with “User-agent: *” if you don’t have a need for separate crawling
rules.
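Putting those pieces together, a simple robots.txt might look like this (the paths and domain are placeholders):

```
User-agent: *
Disallow: /login
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

The /admin/ rule, with its trailing slash, also blocks every child page under it; the Sitemap line is optional but widely supported by crawlers.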
Nofollow
A link can have a property
called “nofollow” such as this:
<a href="http://www.somesite.com" title="keyword"
rel="nofollow">Anchor text</a>.
If you want to link to somebody
but you don’t want it to count as a vote (you don’t want to pass link-juice),
or you support user-generated content and want to deter spammers, you can use a
nofollow link. Google says it discounts the value of those links. I’m not
convinced they discount them heavily, but other SEOs are so they seem to deter
spammers if nothing else.
Redirects
If you’re going to change a URL,
but you don’t want its link juice to disappear, you can use a 301 redirect. A
301 will pass a majority of the link juice.
Importantly, Google
views www.austenallred.com and austenallred.com as different sites. So decide on one, and
redirect all of one type to the other.
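As a sketch of that, here's how the bare-domain-to-www redirect might look on an Apache server with mod_rewrite enabled (example.com is a placeholder; nginx and most hosting dashboards have equivalents):

```
# .htaccess: 301-redirect the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```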
Link Building
Link building is where SEO
really starts to matter, and where a lot of people end up in a world of hurt.
The best way to build links is
to not build links. I’ve worked for companies in the past that don’t have to
ask for them, they just flow in from press, customer blogs, their awesome blog
posts, etc. If this is an option (and we’ll go over a couple of ways to make it
more likely) you’re in a great place.
If not, at least in the
beginning, we’re going to manually create just a few.
We’re going to create them in
legitimate ways and not hire somebody in India to do so. That is a recipe for
disaster, and I can’t even count the number of times I’ve seen that take down a
site.
Web 2.0s
The easiest way to build high-quality links is what SEOs call “web 2.0s.” That’s just a way to
say “social sites,” or sites that let you post stuff. Now, tweeting a link into
the abyss won’t do anything for you, but profiles, status pages, etc. do carry some
weight. And if they come from a popular domain, that counts as a link.
Some of the easiest are:
·
Twitter (in your bio)
·
Github (the readme of a repo)
·
YouTube (the description of a
video — it has to actually get views)
·
Wordpress (yes, you’ll have to
actually create a blog)
·
Blogger (same here)
·
Tumblr
·
Upvote-based sites (HackerNews,
GrowthHackers, Inbound.org, Reddit, etc.)
Expired Domains
Another way to get link juice is
by purchasing an expired domain. This is more difficult to do, but there are a lot
of options such as expireddomains.net. (Google “expired domains” and you’ll
find dozens of sites monitoring them.)
You’ll want to purchase a domain
that has expired and restore it as closely as you can to its original form
using an archive. These sites likely have some link juice to pass on and you
can pass it to yourself.