
SEO

This guide will help you optimize your website's SEO using the Next.js Supabase SaaS template. The template includes customizable meta tags, a sitemap, and a robots.txt file; by tailoring these elements you can improve your website's visibility in search engines, boost rankings, and ensure efficient indexing.

Overview

Search Engine Optimization (SEO) is a critical part of building any website. It helps search engines understand your website content and rank it higher in search results, which ultimately improves visibility, increases click-through rates, and enhances the user experience.

The Next.js Supabase SaaS template is pre-configured and optimized for SEO. This template also offers a variety of customization options to tailor SEO elements, such as meta titles, descriptions, Open Graph images, and more, to your specific business needs.

Customizing SEO Metadata

Metadata plays a crucial role in helping search engines and users understand the purpose of each page. The template allows you to customize key SEO elements in the client.config.ts file. Below is an example of how you can structure this file:

// client.config.ts (excerpt)
{
    app: {
        name: "Your Business Name", // Default meta title and site name
        description: "Brief description of your business", // Default meta description
        template: `%s | ${NAME}`, // Dynamic page titles, e.g. "Pricing | Your Business Name"
        defaultLocal: "en" satisfies DictionariesKeys, // Default language
        defaultTheme: "light" satisfies Themes,
        emailLogoPath: "/logo.png",
        socialMedia: {
            linkedIn: "",
            facebook: "",
            instagram: "",
            twitter: "",
            github: "",
        },
    },
}

These fields are then utilized in the /app/layout.ts file to generate the required metadata for each page.

// app/layout.ts (excerpt)
import type { Metadata } from "next";
// ClientConfiguration is the object exported from client.config.ts

export const metadata: Metadata = {
    metadataBase: new URL(ClientConfiguration.env.SITE_BASE_URL),
    title: {
        default: ClientConfiguration.app.name,
        template: ClientConfiguration.app.template, // e.g. "%s | Your Business Name"
    },
    description: ClientConfiguration.app.description,
    openGraph: {
        title: ClientConfiguration.app.name,
        description: ClientConfiguration.app.description,
        url: ClientConfiguration.env.SITE_BASE_URL,
        siteName: ClientConfiguration.app.name,
        locale: ClientConfiguration.app.defaultLocal,
        type: "website",
    },
    twitter: {
        card: "", // e.g. "summary" or "summary_large_image"
        title: ClientConfiguration.app.name,
        description: ClientConfiguration.app.description,
        creator: ClientConfiguration.app.socialMedia.twitter,
    },
};

Ensure you add relevant business information in these fields for optimal SEO performance.
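Individual pages can then supply their own titles, which Next.js merges into the title template defined above. A minimal sketch, assuming a hypothetical pricing page at app/pricing/page.tsx:

// app/pricing/page.tsx (hypothetical page, shown only to illustrate the title template)
import type { Metadata } from "next";

// Next.js combines this title with the layout's title.template,
// producing e.g. "Pricing | Your Business Name".
export const metadata: Metadata = {
    title: "Pricing",
    description: "Plans and pricing overview",
};

export default function PricingPage() {
    return <h1>Pricing</h1>;
}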

Sitemap and Robots.txt

Both the sitemap and robots.txt files are crucial for guiding search engine crawlers and ensuring your website is indexed properly.

Sitemap

A sitemap lists all the pages of your website, which helps search engines index the site more efficiently. The Next.js Supabase template allows you to customize the sitemap by adding static paths and by handling dynamic content, such as blog posts.

const staticPaths = ["/", "/pricing", "/faqs", "/blog"]; // static routes included in the sitemap
Note: Add any additional static pages to the staticPaths array.

You can also add multiple language codes to handle translations for static pages:

const languages = ["en"]; // add more language codes here, e.g. ["en", "fr"]
const pathsWithTranslations = []; // static paths that have translated versions

Both static and dynamic pages will be covered by the sitemap, ensuring that all relevant content is indexed by search engines.
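A minimal sketch of how these pieces can fit together in an app/sitemap.ts route follows; the exact implementation in your copy of the template may differ, and the base URL shown here is a placeholder for your configured SITE_BASE_URL:

// app/sitemap.ts (illustrative sketch, not the template's exact implementation)
import type { MetadataRoute } from "next";

const SITE_BASE_URL = "https://example.com"; // assumption: use your configured base URL
const staticPaths = ["/", "/pricing", "/faqs", "/blog"];
const languages = ["en"];

export default function sitemap(): MetadataRoute.Sitemap {
    // One entry per language for every static path.
    return languages.flatMap((lang) =>
        staticPaths.map((path) => ({
            url: `${SITE_BASE_URL}/${lang}${path === "/" ? "" : path}`,
            lastModified: new Date(),
            changeFrequency: "weekly" as const,
            priority: path === "/" ? 1 : 0.8,
        })),
    );
}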

Robots.txt

The robots.txt file tells search engine crawlers which parts of your website should not be crawled. The template comes with a pre-built robots.ts file in which you can define the URL patterns to exclude from crawling.

const languages = ["en"];
const disallowedPaths = languages.flatMap((lang) => [`/${lang}/auth/*`, `/${lang}/dashboard/*`]);

By default, authentication-related and dashboard-related URLs are disallowed from being crawled, ensuring that sensitive parts of the site remain private.
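A minimal sketch of a matching app/robots.ts, assuming the disallowedPaths shown above and a placeholder base URL:

// app/robots.ts (illustrative sketch, not the template's exact implementation)
import type { MetadataRoute } from "next";

const SITE_BASE_URL = "https://example.com"; // assumption: use your configured base URL
const languages = ["en"];
const disallowedPaths = languages.flatMap((lang) => [`/${lang}/auth/*`, `/${lang}/dashboard/*`]);

export default function robots(): MetadataRoute.Robots {
    return {
        rules: {
            userAgent: "*",
            allow: "/",
            disallow: disallowedPaths, // keep auth and dashboard routes out of crawls
        },
        sitemap: `${SITE_BASE_URL}/sitemap.xml`,
    };
}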

Customization Tips

  • Add business-specific metadata (e.g., titles, descriptions) in the client.config.ts file.
  • Update the Sitemap to include all static and dynamic pages relevant to your website.
  • Modify the Robots.txt file to exclude any sensitive URLs, such as those related to authentication and dashboards.
