Technical SEO refers to the process of optimizing your website for the crawling and indexing phase. With technical SEO, you help search engines access, crawl, interpret, and index your website without any problems. You can help search engines find your website and index your content through the following:
1. Links – To index the content on your website, search engines follow links pointing to your landing pages, then interpret and index the content on each page they reach. Create links to every landing page, make sure each landing page is reachable from the other landing pages of your website, and keep navigation easy for both users and search engine crawlers. Also, create links to new content as you publish it and highlight them prominently on your landing pages.
2. XML Sitemaps – A sitemap is a file hosted alongside your website's other files that lists all the pages your website contains, in a format search engines can easily understand. Submit your sitemap to search engines so they can crawl your website more thoroughly. You can learn more about sitemaps at www.sitemaps.org and generate one at https://www.xml-sitemaps.com.
3. Robots.txt – A robots.txt file instructs search engine crawlers which pages and content of your website may be crawled and which should be ignored. This is especially useful when you don't want certain pages, such as "members only" areas, to be indexed and displayed in search results. This way, you control which pages of your website are visible to crawlers and which are not. You can read more about this at www.robotstxt.org.
4. Unique URLs – Use a unique URL for every landing page. Landing pages with identical names can confuse search engines.
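As a concrete illustration of the sitemap format described above, a minimal XML sitemap might look like the following (the URLs and dates are made-up examples; `<lastmod>` is optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per landing page; <loc> is required -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/seo-basics/</loc>
    <lastmod>2018-06-15</lastmod>
  </url>
</urlset>
```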
Let us see how you can submit an XML sitemap listing your page content to Google Search Console (formerly Google Webmaster Tools) using All In One SEO:
a) From the left-hand side of the Google Search Console dashboard, select Crawl > Sitemaps.
b) From the WordPress dashboard, select All In One SEO > XML Sitemap.
c) Select View your sitemap.
d) Copy the sitemap URL.
e) Return to Google Search Console and select Add/Test Sitemap.
f) Paste the sitemap URL (directory) and select Submit.
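Before submitting, you can sanity-check that your sitemap is well-formed XML. The following is a minimal Python sketch using only the standard library; the sitemap content is a made-up example, not taken from a real site:

```python
# Validate a sitemap's XML structure locally before submitting it to
# Google Search Console. A malformed file will raise ParseError here,
# which is exactly the kind of problem you want to catch early.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml: str) -> list:
    """Parse sitemap XML and return the listed page URLs."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical sitemap content for illustration.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog/</loc></url>
</urlset>"""

print(extract_urls(sample))
```

If the script prints the list of URLs you expect, the file parses cleanly and every page you meant to include is listed.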
Creating a robots.txt file:
a) In WordPress, select All In One SEO > Robots.txt.
b) Create the robots.txt file from this panel.
c) Use the Rule Builder feature to build the new robots.txt file.
d) Select Add Rule to define a new rule, then select Save Robots.txt file to save it.
e) Once you have made the changes, select Optimize.
f) Visit Google Search Console.
g) Select Crawl > robots.txt Tester from the Search Console dashboard.
h) Make sure there are no errors.
i) You can also configure page-level robots.txt settings from WordPress.
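In addition to the online tester, you can check your rules locally with Python's standard-library robots.txt parser. The following is a minimal sketch; the rules and the /members-only/ path are hypothetical examples:

```python
# Check which URLs a crawler may fetch under a given robots.txt,
# using Python's built-in urllib.robotparser.
import urllib.robotparser

# Hypothetical robots.txt blocking a "members only" section for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /members-only/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://www.example.com/blog/"))          # public page
print(rp.can_fetch("*", "https://www.example.com/members-only/"))  # blocked page
```

Running this confirms the public page is allowed and the members-only path is disallowed, matching what the robots.txt Tester should report.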
You may find the below resources helpful for learning more about technical SEO:
- Google’s SEO Starter Guide
- Simple ways to increase the visibility of your website
- Moz’s articles on technical SEO
- E-commerce SEO Guide
- BigCommerce SEO (2018)
Prepare with mock tests and preparation kits from the experts at Irine Digital Factory. Clear the SEMRush Content Marketing Course Certification, the SEMRush Social Media Marketing Certification Exam, Foundations of IBM Big Data & Analytics, and more.
Enroll in Digital Marketing Courses and kick-start your career with a bang.
Enhance your professional career by enrolling in these expert-led Corporate Trainings from Irine Digital Factory.