If you're a programmer like me who built your blog or website from scratch, you've focused a lot on design, performance, and code structure. But SEO often gets overlooked. It's a different kind of engineering that focuses on how your content is understood and ranked by search engines like Google.
Here are the seven SEO essentials I find most useful when building and maintaining my site.
1. Add Proper Meta Tags
Meta tags help search engines understand what your page is about. At a minimum, you should include a title and a description for each page.
Search engines use the <title> to show the page's main heading in search results. The <meta name="description"> tag offers a summary that can influence whether people click on your link.
Here's an example of what to add in your HTML head:
<title>7 SEO Tips for Programmers | My Blog</title>
<meta name= "description" content=" Learn how to optimize your personal website or blog for SEO with these seven beginner-friendly tips for developers.">
To see how your site appears in search results, tools like metatags.io or browser extensions like "SEO Meta in 1 Click" can preview your tags and show what's missing.
Avoid including outdated meta fields like keywords. Focus only on the title and description.
2. Set Up Social Media Tags
Social media previews are controlled by specific meta tags like Open Graph (for Facebook, LinkedIn) and Twitter cards. These don't directly affect your Google ranking but impact how your links look when shared.
Here's a minimal setup:
<meta property="og:title" content="7 SEO Tips for Programmers">
<meta property=" og:description" content= "Simple SEO tips to help programmers get their blog content indexed and ranked.">
<meta property="og:image" content="https://example.com/cover.jpg">
<meta name="twitter:card" content="summary_large_image">
If you don't add these, platforms will try to guess the title and image, which can lead to poor or inconsistent previews.
3. Connect Google Analytics
Google Analytics helps you understand how users interact with your site, what pages they visit, where they're coming from, and how long they stay.
Even if you don't plan to look at this data immediately, it's worth collecting it from day one. You'll thank yourself later when you want to see which content is working.
Here's the standard snippet for Google Analytics 4 (the older UA-XXXXXXX-X Universal Analytics IDs stopped collecting data in 2023):
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX');
</script>
Be sure to replace the ID with your own measurement ID. Privacy-focused tools like Plausible or Fathom are simpler alternatives if you'd rather not use Google Analytics.
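Plausible's embed, for example, is a single script tag. At the time of writing it looks like this (swap in your own domain):
<script defer data-domain="yourdomain.com" src="https://plausible.io/js/script.js"></script>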
4. Register with Google Search Console
Google Search Console lets you monitor how your site appears in search results. You can see which queries bring in traffic, which pages perform well, and whether Google is indexing your pages correctly.
It also helps you find errors, such as broken links or blocked resources, and lets you submit your sitemap.
It's best to start this process early, as verification and the first data can take several days to come through. Verification involves adding a meta tag, adding a DNS record, or uploading a file to your site.
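If you pick the meta tag method, Google gives you a snippet like the one below to place in your page's <head>. The content value here is a made-up placeholder; yours will be a unique token:
<meta name="google-site-verification" content="your-unique-verification-token">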
5. Create and Submit a Sitemap
A sitemap is an XML file listing all the URLs you want indexed. It helps search engines discover your pages more efficiently.
For static sites, you can generate one manually or use a tool like:
- The sitemap npm package for generating sitemap.xml
- next-sitemap for Next.js
- Built-in features in static site generators like Hugo or Astro
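Whichever tool you use, the output is plain XML following the sitemap protocol. A minimal sitemap with a single hypothetical URL looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/blog/my-first-post</loc>
    <lastmod>2025-01-01</lastmod>
  </url>
</urlset>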
Once generated, the file is typically served from the root of your domain:
https://yourdomain.com/sitemap.xml
Submit this URL in Google Search Console to help Google crawl your site faster.
6. Add a robots.txt File
The robots.txt file tells search engines what they can and can't crawl. It sits at the root of your domain.
Here's a basic one:
User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
If you're staging a site or working locally, you may want to block crawlers entirely:
User-agent: *
Disallow: /
Always double-check that you're not accidentally blocking pages you want indexed.
7. Prefer Static or Server-Side Rendered Pages
If your blog is a Single Page App (SPA) built with frameworks like React or Vue, search engines can still index your pages, but they have to render your JavaScript first, which adds delay and isn't always reliable.
Static or Server-Side Rendered (SSR) pages are preferred because the content is fully available in HTML when the page loads, making it easier for crawlers to parse it.
If you're using Next.js, you can generate static pages using getStaticProps(). Astro and Hugo are also great options for fully static websites with good SEO out of the box.
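Here's a minimal sketch of static generation with the Next.js pages router. getAllSlugs() and getPostBySlug() are hypothetical helpers standing in for your own data layer:
// pages/blog/[slug].js, pre-rendered at build time
export async function getStaticPaths() {
  const slugs = await getAllSlugs(); // hypothetical: returns every post slug
  return {
    paths: slugs.map((slug) => ({ params: { slug } })),
    fallback: false, // unknown slugs return a 404 instead of rendering on demand
  };
}

export async function getStaticProps({ params }) {
  const post = await getPostBySlug(params.slug); // hypothetical: loads one post
  return { props: { post } };
}

export default function BlogPost({ post }) {
  return <article><h1>{post.title}</h1></article>;
}
Because both functions run at build time, the crawler receives complete HTML with no client-side rendering required.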
A Quick Note on E-E-A-T
Google evaluates content using an E-E-A-T framework: Experience, Expertise, Authoritativeness, and Trust.
You can improve your site's E-E-A-T by:
- Writing about topics you've personally worked on with original research and findings
- Being informative about who you are, your background, and your expertise
- Using original examples, code snippets, or benchmarks
This often comes naturally for technical content: explain how you solved a problem and why it worked. I'll write a more detailed post on applying E-E-A-T to programming content; for now, this is just a quick introduction to the concept.
Conclusion
SEO doesn't need to be a black box. As a developer, you already have the technical skills required to build a well-optimized site; you just need to know where to apply them. With a few simple additions like meta tags, a sitemap, and analytics, you'll have a solid foundation. And once your content starts appearing in search results, it becomes a rewarding feedback loop.