
gRPC between Go and SvelteKit backends

gRPC is a modern remote procedure call (RPC) framework developed by Google that uses HTTP/2 for transport. Since its binary Protocol Buffers encoding and HTTP/2 multiplexing can provide a significant performance boost over traditional REST APIs, it has become increasingly popular for building microservices and backend systems.

The reason I wanted to explore gRPC is the other great benefit it provides: code generation, and thus type safety, across multiple languages.

As I kept working on one of my personal projects, it became more and more cumbersome to maintain the REST API between my Go backend and SvelteKit frontend. Every time I made a change to the API, I risked breaking something on either side without a warning sign until runtime.

I figured gRPC could be the missing piece here, as it allowed me to define my API in a separate repository (single source of truth) using Protocol Buffers (protobufs), and then generate client and server code for both Go and TypeScript.

Goals

As a first-time user of gRPC, I decided to start small.

Currently I have a Go backend server, spawner, that polls outstanding jobs from a database and spawns worker processes to handle them.
There is also a SvelteKit frontend, vfront, that lets users control whether or not the spawner should process jobs.

When the user toggles the spawner switch in the frontend,

  1. it sends an HTTP request to the SvelteKit backend (this is type-safe already)
  2. which then sends an HTTP request to the Go spawner backend (this is currently NOT type-safe)

So the second step is what I want to convert to gRPC.

Setting up the dev environment

I installed the following on my system:

  1. buf - a protobuf build tool, works like make for protobufs
  2. cargo - the Rust package manager, needed to install protols
  3. protols - language server for protobufs, available in mason.nvim

buf:

sudo curl -sSL "https://github.com/bufbuild/buf/releases/download/v1.57.2/buf-$(uname -s)-$(uname -m)" -o /usr/local/bin/buf
sudo chmod +x /usr/local/bin/buf

Files

For my project, I created a separate repository for the protobuf definitions, which I named vault-api. The directory structure looks like this:

.
├── proto
│   └── spawner
│       └── spawner.proto
├── buf.yaml
├── buf.gen.yaml
└── protols.toml

spawner.proto defines the gRPC service and messages for the spawner backend. It’s like the API contract between the frontend and backend.

spawner.proto
syntax = "proto3";

package spawner;
// package will be available at git.junyi.me/vault/vault-api/gen/go/spawner, and named "spawner" in the generated Go code
option go_package = "git.junyi.me/vault/vault-api/gen/go/spawner;spawner";

import "google/protobuf/empty.proto";

// gRPC service to handle spawner configuration changes
service ConfigService {
    rpc GetConfig (google.protobuf.Empty) returns (ConfigResp);
    rpc UpdateConfig (ConfigReq) returns (google.protobuf.Empty);
}

// Request message to update spawner configuration
message ConfigReq {
    optional bool launch = 1;
}

// Response message to get spawner configuration
message ConfigResp {
    bool launch = 1;
}

buf.yaml is the configuration file for buf, specifying the version and modules. There is much more to this config file, documented here: buf.yaml v2 config file.

buf.yaml
version: v2
modules:
  - path: proto # path to proto files
lint:
  use:
    - STANDARD # standard linting rules
breaking:
  use:
    - FILE # breaking change detection

buf.gen.yaml is the generation configuration file for buf, specifying the plugins and output directories for generated code. Also documented here: buf.gen.yaml v2 config file.

buf.gen.yaml
version: v2
managed:
  enabled: true
plugins:
  - remote: buf.build/protocolbuffers/go
    out: gen/go
    opt: paths=source_relative
  - remote: buf.build/grpc/go
    out: gen/go
    opt: paths=source_relative
  - local: gen/ts/node_modules/.bin/protoc-gen-ts_proto
    out: gen/ts
    opt:
      - esModuleInterop=true
      - outputServices=nice-grpc # using nice-grpc for better developer experience
      - outputServices=generic-definitions
      - useExactTypes=false

I opted for nice-grpc instead of the more popular grpc-node, since it provided a smoother workflow. For example, with nice-grpc, I could use async/await syntax directly in my service implementations like:

// src/lib/server/grpc/spawner.ts
import { createChannel, createClient } from 'nice-grpc';
import { ConfigServiceDefinition } from '@vault/vault-grpc/spawner/spawner';

const channel = createChannel('localhost:8080'); // address of the Go gRPC server
export const spawner = createClient(ConfigServiceDefinition, channel);

// Usage
await spawner.updateConfig(cfg);

whereas with grpc-node, I would have to use callbacks or promisify the methods manually.

Lastly, protols.toml is the configuration file for protols. It's not necessary for the project so far, but I needed it to tell protols the import path for protobuf files once the project structure became more complex.

protols.toml
[config]
include_paths = ["proto"]

In this example, it allows me to do

import "spawner/job.proto";

to import a file at proto/spawner/job.proto. Protols would normally try to resolve imports relative to the current working directory, but with this config, it also knows to look inside the proto directory.

Code generation time

In this setup, code generation is done with the buf command. First, there are some preparation steps for SvelteKit.

SvelteKit (TypeScript):

mkdir -p gen/ts
cd gen/ts
npm init -y
npm install nice-grpc ts-proto # ts-proto provides the protoc-gen-ts_proto plugin referenced in buf.gen.yaml
cd -

Then run:

buf generate

This will generate the Go and TypeScript code in the specified output directories (gen/go and gen/ts).

.
├── gen
│   ├── go
│   │   └── spawner
│   │       ├── spawner.pb.go
│   │       └── spawner_grpc.pb.go
│   └── ts
│       ├── google
│       │   └── protobuf
│       │       └── empty.ts
│       ├── node_modules
│       │   └── ...
│       ├── package-lock.json
│       ├── package.json
│       └── spawner
│           └── spawner.ts

For Go, there is one more step:

cd gen/go
go mod init git.junyi.me/vault/vault-api/gen/go
go mod tidy
cd -

Using the generated code in SvelteKit

To test locally, I used the npm link method to symlink the generated TypeScript code into my SvelteKit project.

In vault-api repo:

cd gen/ts
npm link

In the consuming project,

package.json
"dependencies": {
    "@vault/vault-grpc": "^0.0.1"
}

Then

npm link @vault/vault-grpc

Now I could do something like this in the backend code:

src/lib/server/grpc/spawner.ts
import { createChannel, createClient } from 'nice-grpc';
import { ConfigServiceDefinition } from '@vault/vault-grpc/spawner/spawner';
import { env } from '$env/dynamic/private';

if (!env.SPAWNER_URL) {
  throw new Error('Missing SPAWNER_URL environment variable');
}

const channel = createChannel(env.SPAWNER_URL);
export const spawner = createClient(ConfigServiceDefinition, channel);

// Usage
await spawner.updateConfig(cfg);

SPAWNER_URL was set to localhost:8080 where the Go gRPC server was going to run.

Using the generated code in Go

In the Go spawner backend, I implemented the service like this:

go.mod
require (
	git.junyi.me/vault/vault-api/gen/go v0.0.1
)
// ...

replace git.junyi.me/vault/vault-api/gen/go => ../vault-api/gen/go

spawner service implementation (package grpcs)
package grpcs

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
	"google.golang.org/protobuf/types/known/emptypb"
	"jy.org/spawner/src/config"
	"jy.org/spawner/src/db"
	"jy.org/spawner/src/logging"

	spawner "git.junyi.me/vault/vault-api/gen/go/spawner"
)

var logger = logging.Logger

type ConfigServer struct {
	spawner.UnimplementedConfigServiceServer
}

func NewConfigServer() *ConfigServer {
	return &ConfigServer{}
}

func (s *ConfigServer) GetConfig(ctx context.Context, req *emptypb.Empty) (*spawner.ConfigResp, error) {
	return &spawner.ConfigResp{
		Launch: config.Cfg().Launch,
	}, nil
}

func (s *ConfigServer) UpdateConfig(ctx context.Context, req *spawner.ConfigReq) (*emptypb.Empty, error) {
	logger.INFO.Printf("UpdateConfig called with: %+v", req)

	if req.Launch != nil {
		if err := db.SetParameter(ctx, db.Pool(), db.LAUNCH_JOB_PARAM, fmt.Sprintf("%t", *req.Launch)); err != nil {
			logger.INFO.Printf("Error setting launch mode: %v", err)
			return nil, status.Errorf(codes.Internal, "failed to update config: %v", err)
		}
	}
	config.UpdateConfig(req)

	return &emptypb.Empty{}, nil
}
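
The service above still needs to be registered with a gRPC server that listens on the port SPAWNER_URL points at. A minimal sketch of that wiring, assuming the grpcs import path follows this project's layout (adjust it to your own module):

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"

	spawner "git.junyi.me/vault/vault-api/gen/go/spawner"
	"jy.org/spawner/src/grpcs" // assumed location of the ConfigServer implementation
)

func main() {
	// Listen on the port that SPAWNER_URL on the SvelteKit side points at.
	lis, err := net.Listen("tcp", ":8080")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	s := grpc.NewServer()
	// Register the generated service definition with our implementation.
	spawner.RegisterConfigServiceServer(s, grpcs.NewConfigServer())

	log.Println("spawner gRPC server listening on :8080")
	if err := s.Serve(lis); err != nil {
		log.Fatalf("serve error: %v", err)
	}
}
```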

That’s how I got everything working.

Publishing npm package

Of course, local testing is not the end goal. I needed to make it work in the production build as well.

For SvelteKit, I published the generated TypeScript code as a private npm package to GitLab package registry. This was done through a GitLab CI/CD pipeline in the vault-api repository.

.gitlab-ci.yml
workflow:
  rules:
    - if: $CI_COMMIT_BRANCH == "master"

variables:
  NPM_TOKEN: ${CI_JOB_TOKEN}

stages:
  - release

publish:
  stage: release
  image: node:latest
  before_script:
    - cd gen/ts
    - npm ci
    - |
      {
        echo "@${CI_PROJECT_ROOT_NAMESPACE}:registry=${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/npm/"
        echo "//${CI_SERVER_HOST}/api/v4/projects/${CI_PROJECT_ID}/packages/npm/:_authToken=${CI_JOB_TOKEN}"
      } | tee -a .npmrc      
  script:
    - npm publish
  rules:
    - changes:
        - gen/ts/**/*

One caveat here is that whenever there is a change to the generated TypeScript code, I need to manually bump the version number in gen/ts/package.json before pushing to master; GitLab does not allow overwriting an existing package version.

note

It is possible to let GitLab generate a new version number automatically using CI/CD variables, as documented in: Publish npm packages to the GitLab package registry using semantic-release.
However, I figured that since I had to manually tag releases for the Go package anyway, it was simpler to just bump the version number manually. It also makes it easier to keep version numbers in sync between the Go and TypeScript packages.

I pushed the changes to master, admired the pipeline run, and moved on.

Consuming npm package

Since the package is private, I needed to set up authentication in the consuming SvelteKit project by adding a .npmrc file with the following content:

.npmrc
@vault:registry=https://git.junyi.me/api/v4/projects/1/packages/npm/
//git.junyi.me/api/v4/projects/1/packages/npm/:_authToken=${REG_TOKEN}

where 1 is the project ID of the vault-api repository, and REG_TOKEN is a CI/CD variable containing a GitLab personal access token with read_package_registry scope.

package.json already contains the dependency I used for local testing, so I just needed to make sure the local link was removed, and install the package from GitLab:

export REG_TOKEN="your_personal_access_token"

npm unlink @vault/vault-grpc
npm install

The Dockerfile and pipeline manifest also needed updates to work with a private npm package:

Dockerfile
FROM node:22 AS builder
WORKDIR /app

COPY package*.json .npmrc ./
RUN --mount=type=secret,id=reg_token \
    export REG_TOKEN=$(cat /run/secrets/reg_token) && \
    npm ci && \
    rm -f .npmrc

COPY . .
RUN npm run build
RUN npm prune --production

FROM node:22
WORKDIR /app
COPY --from=builder /app/build build/
COPY --from=builder /app/node_modules node_modules/
COPY package.json .
ENV NODE_ENV=production
CMD [ "node", "build" ]

.gitlab-ci.yml
stages:
  - build

build:
  stage: build
  image: quay.io/podman/stable

  only:
    - master
    - stg

  before_script:
    - echo "$CI_REGISTRY_PASSWORD" | podman login -u "$CI_REGISTRY_USER" "$CI_REGISTRY" --password-stdin

  script:
    - |
      set -e
      if [ "$CI_COMMIT_REF_NAME" = 'master' ]; then
        branchTag=prd
      else
        branchTag=$CI_COMMIT_REF_NAME
      fi
      dateTag=$branchTag-$(date +'%Y%m%d')
      echo "Building images with tags: $dateTag and $branchTag"

      podman build \
        --secret id=reg_token,env=CI_JOB_TOKEN \
        -t "$CI_REGISTRY_IMAGE:$dateTag" -t "$CI_REGISTRY_IMAGE:$branchTag" \
        .

      podman push "$CI_REGISTRY_IMAGE:$dateTag"
      podman push "$CI_REGISTRY_IMAGE:$branchTag"      

With that, the SvelteKit app was ready, and I enjoyed another successful pipeline run.

Publishing Go package

Publishing the Go package was simpler. I just needed to tag a new release in the vault-api repository, and Go modules would take care of the rest.

One caveat is that, since the Go package is located under gen/go, the tag needs to include that path. For example, for version v0.0.1, I ran:

git tag gen/go/v0.0.1
git push origin gen/go/v0.0.1

(after pushing the changes to master, of course)

Consuming Go package

Although I could have taken a similar approach to SvelteKit and used a token to authenticate with GitLab, my Go project already contained some other private packages, so I opted to use go mod vendor to vendor all dependencies into the repository.

First, I removed the replace directive in go.mod:

-replace git.junyi.me/vault/vault-api/gen/go => ../vault-api/gen/go

Then I ran:

go mod tidy
go mod vendor

I did a test run and everything looked good, so I pushed the changes. No Dockerfile or pipeline changes were necessary here.

Updating proto files

When there are API changes and I need to update the proto files, the process is as follows:

  1. make changes to the proto files in vault-api repository
  2. run buf generate to regenerate the code
  3. bump version numbers in gen/ts/package.json and tag a new release for Go package
  4. push changes to master to trigger the CI/CD pipeline for publishing the packages
  5. update the consuming projects to use the new package versions

Conclusion

For me, the experience of using gRPC is quite positive so far. The type safety alone is worth the effort of setting it up. It’s so much better than having to define and maintain two sets of API contracts in REST.
