
Valkey and SvelteKit

Recently while developing a personal project with SvelteKit, I decided to explore some caching options.

The most well-known option by far is Redis, an in-memory data structure store known for its speed and versatility. However, there has been some drama concerning its licensing (which I definitely have to do some more homework on), so I decided to go with its open-source fork, Valkey.

Overview

Valkey + SvelteKit was pretty straightforward to set up, but there were some gotchas along the way, so this post will cover:

  1. Setting up Valkey on Kubernetes
  2. Using Valkey in SvelteKit

And as always, my environment:

  1. Kubernetes cluster: v1.33.3+k3s1
  2. ArgoCD: v3.0.6+db93798
  3. Valkey (helm chart by Bitnami): 3.0.31
  4. SvelteKit: 5.0.0

Setting up Valkey

I used ArgoCD to deploy Valkey with the Bitnami Helm chart.

Since it’s an OCI chart, a repository must be configured beforehand:

repo.yml
apiVersion: v1
kind: Secret
metadata:
  labels:
    argocd.argoproj.io/secret-type: repository
  name: bitnamicharts
  namespace: argocd
stringData:
  url: registry-1.docker.io/bitnamicharts
  name: bitnamicharts
  type: helm
  enableOCI: "true"

Then, the Application itself:

valkey.yml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: valkey
  namespace: argocd
spec:
  destination:
    namespace: cache
    server: https://kubernetes.default.svc
  source:
    path: ''
    repoURL: registry-1.docker.io/bitnamicharts
    targetRevision: 3.0.31
    chart: valkey
    helm:
      valuesObject:
        primary:
          persistence:
            enabled: true
            size: 10Gi
            storageClass: csi-rbd-sc
          resources:
            requests:
              memory: 2Gi
              cpu: 500m
            limits:
              memory: 4Gi
              cpu: 1
          service:
            type: LoadBalancer
            loadBalancerIP: 10.0.69.233
        replica:
          resources:
            requests:
              memory: 1Gi
              cpu: 250m
            limits:
              memory: 2Gi
              cpu: 500m
        auth:
          existingSecret: valkey
          existingSecretPasswordKey: password
        metrics:
          enabled: true
          service:
            annotations:
              prometheus.io/scrape: "true"
              prometheus.io/port: "9121"
  sources: []
  project: default
  syncPolicy:
    automated:
      prune: true
      selfHeal: true

In my case, there is already a secret named valkey in the cache namespace containing a password key for Valkey authentication.
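For reference, the chart values above expect that secret to look roughly like this — a minimal sketch, with a placeholder password:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: valkey      # matches auth.existingSecret
  namespace: cache  # same namespace as the chart
stringData:
  password: changeme  # placeholder; matches auth.existingSecretPasswordKey
```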

I also added a LoadBalancer service to expose Valkey for local testing outside of the cluster.

A few moments later, up come the Valkey pods:

jy:~$ k get po
NAME                READY   STATUS    RESTARTS      AGE
valkey-primary-0    2/2     Running   0             20h
valkey-replicas-0   2/2     Running   0             18h
valkey-replicas-1   2/2     Running   2 (20h ago)   31h
valkey-replicas-2   2/2     Running   2 (20h ago)   31h

Using Valkey in SvelteKit

valkey-glide

According to the docs, the official way to use Valkey in Node.js is through the valkey-glide package. Since I’m using the Node adapter for SvelteKit, it looked like the perfect fit.

So I installed it:

npm install @valkey/valkey-glide

and wrote some code:

valkey.ts
import { GlideClient, GlideClientConfiguration, TimeUnit } from "@valkey/valkey-glide";
import { env } from "$env/dynamic/private";

interface Cacher {
  getCachedJson<T>(key: string): Promise<T | null>;
  setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean>;
  deleteCached(key: string): Promise<boolean>;
  isHealthy(): Promise<boolean>;
  close(): Promise<void>;
}

class FakeCacher implements Cacher {
  async getCachedJson<T>(_key: string): Promise<T | null> {
    return null;
  }
  async setCachedJson<T>(_key: string, _value: T, _ttl: number): Promise<boolean> {
    return false;
  }
  async deleteCached(_key: string): Promise<boolean> {
    return false;
  }
  async isHealthy(): Promise<boolean> {
    return true;
  }
  async close(): Promise<void> {
  }
}

class GlideCacher implements Cacher {
  private client: GlideClient | null = null;
  private initPromise: Promise<GlideClient|null> | null = null;
  private initialized = false;

  private async initGlide(): Promise<GlideClient> {
    if (!env.CACHE_HOST) throw new Error("CACHE_HOST is not set");
    if (!env.CACHE_PORT) throw new Error("CACHE_PORT is not set");
    if (!env.CACHE_DB) throw new Error("CACHE_DB is not set");
    if (!env.CACHE_PASSWORD) throw new Error("CACHE_PASSWORD is not set");
    
    const config: GlideClientConfiguration = {
      addresses: [{ host: env.CACHE_HOST, port: Number(env.CACHE_PORT) }],
      requestTimeout: 500,
      clientName: "vault",
      databaseId: Number(env.CACHE_DB),
      credentials: {
        password: env.CACHE_PASSWORD
      }
    };
    
    this.client = await GlideClient.createClient(config);
    this.initialized = true;
    return this.client;
  }

  private async getClient(): Promise<GlideClient | null> {
    if (this.client && this.initialized) {
      return this.client;
    }

    if (!this.initPromise) {
      this.initPromise = this.initGlide().catch((error) => {
        console.error("Failed to initialize cache client:", error);
        this.initPromise = null; // Reset so we can retry
        return null;
      });
    }

    return await this.initPromise;
  }

  async getCachedJson<T>(key: string): Promise<T | null> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache read");
        return null;
      }

      const cached = await client.get(key);
      if (!cached) return null;
      
      return JSON.parse(cached.toString()) as T;
    } catch (error) {
      console.error(`Cache get error for ${key}:`, error);
      return null;
    }
  }

  async setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache write");
        return false;
      }

      await client.set(key, JSON.stringify(value), { 
        expiry: { type: TimeUnit.Seconds, count: ttl } 
      });
      return true;
    } catch (error) {
      console.error(`Cache set error for ${key}:`, error);
      return false;
    }
  }

  async deleteCached(key: string): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache delete");
        return false;
      }

      await client.del([key]);
      return true;
    } catch (error) {
      console.error(`Cache delete error for ${key}:`, error);
      return false;
    }
  }

  async isHealthy(): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) return false;

      // Simple ping to check if cache is responsive
      await client.ping();
      return true;
    } catch (error) {
      console.error("Cache health check failed:", error);
      return false;
    }
  }

  async close(): Promise<void> {
    if (this.client) {
      try {
        this.client.close();
        console.log("Cache client closed");
      } catch (error) {
        console.error("Error closing cache client:", error);
      } finally {
        this.client = null;
        this.initialized = false;
        this.initPromise = null;
      }
    }
  }
}

const cacheEnabled = env.CACHE_ENABLED === "true";
export const cacher: Cacher = cacheEnabled ? new GlideCacher() : new FakeCacher();

Everything worked on my local machine, but the pipeline blew up:

> [builder 6/7] RUN npm run build:
6.413 .svelte-kit/output/server/entries/pages/(protected)/job/_page.svelte.js                    39.73 kB
6.413 .svelte-kit/output/server/chunks/utils2.js                                                 46.31 kB
6.413 .svelte-kit/output/server/index.js                                                        123.68 kB
6.413 ✓ built in 5.48s
6.413 
6.413 Run npm run preview to preview your production build locally.
6.415 
6.415 > Using @sveltejs/adapter-node
8.424   ✔ done
8.557 Segmentation fault (core dumped)
------
Dockerfile:6
--------------------
   4 |     RUN npm ci
   5 |     COPY . .
   6 | >>> RUN npm run build
   7 |     RUN npm prune --production
   8 |     
--------------------
ERROR: failed to build: failed to solve: process "/bin/sh -c npm run build" did not complete successfully: exit code: 139

Trying to build it locally also resulted in a segmentation fault:

$ npm run build
...
Run npm run preview to preview your production build locally.
> Using @sveltejs/adapter-node
  ✔ done
Segmentation fault         (core dumped) npm run build

After some investigation, I found that the build failed whenever I imported anything from valkey-glide.

// example:
import { GlideClient, GlideClientConfiguration, TimeUnit } from "@valkey/valkey-glide";

However, the produced build/ directory worked perfectly when run with npm run preview.

I could work around this by dynamically importing valkey-glide only where it’s needed, but that would make my developer experience a little worse. Since developer experience is one of the most important things in my personal projects, I decided to look for alternatives.
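For the curious, the workaround I had in mind looks roughly like this — a generic lazy loader (my own sketch, not project code) that defers a dynamic import() until first use, so the build-time module graph never touches the native addon:

```typescript
// Wrap a dynamic import so the module is only loaded on first call;
// the resulting promise is cached for all subsequent calls.
function lazyImport<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | null = null;
  return () => (cached ??= load());
}

// Hypothetical usage; nothing is imported until the first request:
// const getGlide = lazyImport(() => import("@valkey/valkey-glide"));
```

Every call site then has to await the loader instead of using a plain import, which is exactly the ergonomics hit I wanted to avoid.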

redis

Luckily, since Valkey is a drop-in replacement for Redis, I could just use the redis client package instead.

npm uninstall @valkey/valkey-glide
npm install redis
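One difference worth noting: node-redis takes a single connection URL rather than separate host/port/db/password options, using the standard redis:// scheme. A small illustrative helper (names are my own) for assembling one:

```typescript
// Build a redis:// connection URL from discrete settings.
// Scheme: redis://[:password@]host[:port][/db]
function buildCacheUrl(
  host: string,
  port: number,
  db: number,
  password: string
): string {
  // encodeURIComponent keeps special characters in the password URL-safe
  return `redis://:${encodeURIComponent(password)}@${host}:${port}/${db}`;
}

// buildCacheUrl("10.0.69.233", 6379, 0, "s3cret")
//   → "redis://:s3cret@10.0.69.233:6379/0"
```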
redis.ts
import { createClient, type RedisClientType } from "redis";
import { env } from "$env/dynamic/private";
import { building } from "$app/environment";

interface Cacher {
  getCachedJson<T>(key: string): Promise<T | null>;
  setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean>;
  deleteCached(key: string): Promise<boolean>;
  isHealthy(): Promise<boolean>;
  close(): Promise<void>;
}

class FakeCacher implements Cacher {
  async getCachedJson<T>(_key: string): Promise<T | null> {
    return null;
  }
  async setCachedJson<T>(_key: string, _value: T, _ttl: number): Promise<boolean> {
    return false;
  }
  async deleteCached(_key: string): Promise<boolean> {
    return false;
  }
  async isHealthy(): Promise<boolean> {
    return true;
  }
  async close(): Promise<void> {
  }
}

class RedisCacher implements Cacher {
  private client: RedisClientType | null = null;
  private initPromise: Promise<RedisClientType | null> | null = null;
  private initialized = false;

  private async initRedis(): Promise<RedisClientType> {
    if (!env.CACHE_URL) throw new Error("Missing CACHE_URL");
    
    this.client = createClient({
      url: env.CACHE_URL,
      socket: {
        connectTimeout: 500,
      },
      name: "vault"
    });

    // Error handling
    this.client.on("error", (err) => {
      console.error("Redis client error:", err);
    });

    this.client.on("connect", () => {
      console.log("Redis client connected");
    });

    this.client.on("disconnect", () => {
      console.log("Redis client disconnected");
    });

    await this.client.connect();
    this.initialized = true;
    console.log("Redis client initialized successfully");
    return this.client;
  }

  private async getClient(): Promise<RedisClientType | null> {
    if (this.client && this.initialized && this.client.isOpen) {
      return this.client;
    }

    if (!this.initPromise) {
      this.initPromise = this.initRedis().catch((error) => {
        console.error("Failed to initialize Redis client:", error);
        this.initPromise = null; // Reset so we can retry
        return null;
      });
    }

    return await this.initPromise;
  }

  async getCachedJson<T>(key: string): Promise<T | null> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Redis client not available, skipping cache read");
        return null;
      }

      const cached = await client.get(key);
      if (!cached) return null;
      
      return JSON.parse(cached) as T;
    } catch (error) {
      console.error(`Cache get error for ${key}:`, error);
      return null;
    }
  }

  async setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Redis client not available, skipping cache write");
        return false;
      }

      await client.setEx(key, ttl, JSON.stringify(value));
      return true;
    } catch (error) {
      console.error(`Cache set error for ${key}:`, error);
      return false;
    }
  }

  async deleteCached(key: string): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Redis client not available, skipping cache delete");
        return false;
      }

      await client.del(key);
      return true;
    } catch (error) {
      console.error(`Cache delete error for ${key}:`, error);
      return false;
    }
  }

  async isHealthy(): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client || !client.isOpen) return false;

      // Simple ping to check if Redis is responsive
      const result = await client.ping();
      return result === "PONG";
    } catch (error) {
      console.error("Redis health check failed:", error);
      return false;
    }
  }

  async close(): Promise<void> {
    if (this.client) {
      try {
        await this.client.close();
        console.log("Redis client closed");
      } catch (error) {
        console.error("Error closing Redis client:", error);
      } finally {
        this.client = null;
        this.initialized = false;
        this.initPromise = null;
      }
    }
  }
}

const cacheEnabled = env.CACHE_ENABLED === "true" && !building;
export const cacher: Cacher = cacheEnabled ? new RedisCacher() : new FakeCacher();

And it worked seamlessly. Example usage:

export async function getUser(token?: string): Promise<User | undefined> {
  if (!token) return;

  // check cache
  const cacheKey = `user_session_${token}`;
  const cachedUser = await cacher.getCachedJson<User>(cacheKey);
  if (cachedUser) return cachedUser;

  const sess = await validateSessionToken(token);
  if (!sess) return;

  // Fetch from database
  const dbUser = await db.query.user.findFirst({
    where: eq(user.id, sess.userId)
  });
  // set cache
  if (dbUser) {
    await cacher.setCachedJson(cacheKey, dbUser, CACHE_USER_TTL);
  }

  return dbUser;
}
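The getUser flow above is the classic cache-aside pattern: check the cache, fall back to the source of truth, and populate the cache on a miss. It can be factored into a small generic helper — a sketch of mine, not project code; the KV interface mirrors the Cacher above:

```typescript
// Minimal key-value interface matching the Cacher's JSON methods.
interface KV {
  get<T>(key: string): Promise<T | null>;
  set<T>(key: string, value: T, ttlSeconds: number): Promise<boolean>;
}

// Cache-aside: return the cached value if present, otherwise load it
// from the source of truth and cache it for ttlSeconds before returning.
async function cacheAside<T>(
  kv: KV,
  key: string,
  ttlSeconds: number,
  load: () => Promise<T | undefined>
): Promise<T | undefined> {
  const hit = await kv.get<T>(key);
  if (hit !== null) return hit;

  const fresh = await load();
  if (fresh !== undefined) {
    await kv.set(key, fresh, ttlSeconds);
  }
  return fresh;
}
```

With this, getUser collapses to a single cacheAside call around the database lookup, and the same helper works for any other cached query.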