Recently, while developing a personal project with SvelteKit, I decided to explore some caching options.
The best-known option by far is Redis, an in-memory data structure store known for its speed and versatility. However, there has been some drama concerning its licensing (which I definitely have to do some more homework on), so I decided to go with its open-source fork, Valkey.
Overview
Valkey + SvelteKit was pretty straightforward to set up, but there were some gotchas along the way, so this post will cover:
- Setting up Valkey on Kubernetes
- Using Valkey in SvelteKit
And as always, my environment:
- Kubernetes cluster: v1.33.3+k3s1
- ArgoCD: v3.0.6+db93798
- Valkey (helm chart by Bitnami): 3.0.31
- SvelteKit: 5.0.0
Setting up Valkey
I used ArgoCD to deploy Valkey with the Bitnami Helm chart.
Since it's an OCI chart, the repository must be configured in ArgoCD beforehand:
repo.yml
apiVersion: v1
kind: Secret
metadata:
  labels:
    argocd.argoproj.io/secret-type: repository
  name: bitnamicharts
  namespace: argocd
stringData:
  url: registry-1.docker.io/bitnamicharts
  name: bitnamicharts
  type: helm
  enableOCI: "true"
Then, the Application itself:
valkey.yml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: valkey
  namespace: argocd
spec:
  destination:
    namespace: cache
    server: https://kubernetes.default.svc
  source:
    path: ''
    repoURL: registry-1.docker.io/bitnamicharts
    targetRevision: 3.0.31
    chart: valkey
    helm:
      valuesObject:
        primary:
          persistence:
            enabled: true
            size: 10Gi
            storageClass: csi-rbd-sc
          resources:
            requests:
              memory: 2Gi
              cpu: 500m
            limits:
              memory: 4Gi
              cpu: 1
          # Expose the primary outside the cluster for local testing
          service:
            type: LoadBalancer
            loadBalancerIP: 10.0.69.233
        replica:
          resources:
            requests:
              memory: 1Gi
              cpu: 250m
            limits:
              memory: 2Gi
              cpu: 500m
        # Password comes from an existing Secret in the target namespace
        auth:
          existingSecret: valkey
          existingSecretPasswordKey: password
        # Prometheus metrics exporter
        metrics:
          enabled: true
          service:
            annotations:
              prometheus.io/scrape: "true"
              prometheus.io/port: "9121"
  sources: []
  project: default
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
In my case, there is a Secret named valkey in the cache namespace containing a password key for Valkey authentication.
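If it doesn't exist yet, it can be created with something along these lines (the password value is a placeholder):
kubectl create secret generic valkey \
  --namespace cache \
  --from-literal=password='<a-strong-password>'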
I also added a LoadBalancer service to expose Valkey for local testing outside of the cluster.
A few moments later, the Valkey pods were up and running.
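They can be checked with kubectl, and since the primary service is exposed through a LoadBalancer, connectivity from outside the cluster can be verified with valkey-cli (the password placeholder below is the value stored in the valkey Secret):
kubectl get pods --namespace cache

valkey-cli -h 10.0.69.233 -p 6379 -a '<password-from-secret>' ping
# A healthy instance replies with PONG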
Using Valkey in SvelteKit
valkey-glide
According to the docs, the official way to use Valkey in Node.js is through the valkey-glide package. Since I'm using the Node adapter for SvelteKit, it looked like the perfect fit.
So I installed it:
npm install @valkey/valkey-glide
and wrote some code:
valkey.ts
import { GlideClient, GlideClientConfiguration, TimeUnit } from "@valkey/valkey-glide";
import { env } from "$env/dynamic/private";

interface Cacher {
  getCachedJson<T>(key: string): Promise<T | null>;
  setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean>;
  deleteCached(key: string): Promise<boolean>;
  isHealthy(): Promise<boolean>;
  close(): Promise<void>;
}

// No-op implementation used when caching is disabled
class FakeCacher implements Cacher {
  async getCachedJson<T>(_key: string): Promise<T | null> {
    return null;
  }

  async setCachedJson<T>(_key: string, _value: T, _ttl: number): Promise<boolean> {
    return false;
  }

  async deleteCached(_key: string): Promise<boolean> {
    return false;
  }

  async isHealthy(): Promise<boolean> {
    return true;
  }

  async close(): Promise<void> {}
}

class GlideCacher implements Cacher {
  private client: GlideClient | null = null;
  private initPromise: Promise<GlideClient | null> | null = null;
  private initialized = false;

  private async initGlide(): Promise<GlideClient> {
    if (!env.CACHE_HOST) throw new Error("CACHE_HOST is not set");
    if (!env.CACHE_PORT) throw new Error("CACHE_PORT is not set");
    if (!env.CACHE_DB) throw new Error("CACHE_DB is not set");
    if (!env.CACHE_PASSWORD) throw new Error("CACHE_PASSWORD is not set");

    const config: GlideClientConfiguration = {
      addresses: [{ host: env.CACHE_HOST, port: Number(env.CACHE_PORT) }],
      requestTimeout: 500,
      clientName: "vault",
      databaseId: Number(env.CACHE_DB),
      credentials: {
        password: env.CACHE_PASSWORD
      }
    };

    this.client = await GlideClient.createClient(config);
    this.initialized = true;
    return this.client;
  }

  // Lazily initialize the client, sharing a single in-flight init across callers
  private async getClient(): Promise<GlideClient | null> {
    if (this.client && this.initialized) {
      return this.client;
    }

    if (!this.initPromise) {
      this.initPromise = this.initGlide().catch((error) => {
        console.error("Failed to initialize cache client:", error);
        this.initPromise = null; // Reset so we can retry
        return null;
      });
    }

    return await this.initPromise;
  }

  async getCachedJson<T>(key: string): Promise<T | null> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache read");
        return null;
      }

      const cached = await client.get(key);
      if (!cached) return null;

      return JSON.parse(cached.toString()) as T;
    } catch (error) {
      console.error(`Cache get error for ${key}:`, error);
      return null;
    }
  }

  async setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache write");
        return false;
      }

      await client.set(key, JSON.stringify(value), {
        expiry: { type: TimeUnit.Seconds, count: ttl }
      });
      return true;
    } catch (error) {
      console.error(`Cache set error for ${key}:`, error);
      return false;
    }
  }

  async deleteCached(key: string): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) {
        console.warn("Cache client not available, skipping cache delete");
        return false;
      }

      await client.del([key]);
      return true;
    } catch (error) {
      console.error(`Cache delete error for ${key}:`, error);
      return false;
    }
  }

  async isHealthy(): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) return false;

      // Simple ping to check if cache is responsive
      await client.ping();
      return true;
    } catch (error) {
      console.error("Cache health check failed:", error);
      return false;
    }
  }

  async close(): Promise<void> {
    if (this.client) {
      try {
        this.client.close();
        console.log("Cache client closed");
      } catch (error) {
        console.error("Error closing cache client:", error);
      } finally {
        this.client = null;
        this.initialized = false;
        this.initPromise = null;
      }
    }
  }
}

const cacheEnabled = env.CACHE_ENABLED === "true";
export const cacher: Cacher = cacheEnabled ? new GlideCacher() : new FakeCacher();
Everything worked on my local machine, but the pipeline blew up during the build.
Trying to build the project locally also resulted in a segmentation fault.
After some investigating, I found that whenever I imported something from valkey-glide, the build would fail.
However, the produced build/ directory worked perfectly when run with npm run preview.
I could circumvent this with workarounds like dynamically importing valkey-glide only where it's needed… but that would make my developer experience a little worse. Since developer experience is one of the most important things in my personal projects, I decided to look for alternatives.
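For reference, that workaround would have looked roughly like this: deferring the import so valkey-glide is only loaded at runtime, never touched during the build (a sketch, not what I ended up shipping):
import { env } from "$env/dynamic/private";

// Sketch of the workaround: defer the import so valkey-glide is only loaded at
// runtime, when a client is actually needed
async function createGlideClient() {
  const { GlideClient } = await import("@valkey/valkey-glide");
  return GlideClient.createClient({
    addresses: [{ host: env.CACHE_HOST ?? "localhost", port: Number(env.CACHE_PORT ?? 6379) }],
    credentials: { password: env.CACHE_PASSWORD ?? "" }
  });
}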
redis
Luckily, since Valkey is a drop-in replacement for Redis, I could just use the redis client package instead.
npm install redis
redis.ts
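A minimal sketch of that file, assuming the same Cacher interface and CACHE_* environment variables as the Glide version above:
import { createClient } from "redis";
import { env } from "$env/dynamic/private";

// Same Cacher interface as in the Glide version
interface Cacher {
  getCachedJson<T>(key: string): Promise<T | null>;
  setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean>;
  deleteCached(key: string): Promise<boolean>;
  isHealthy(): Promise<boolean>;
  close(): Promise<void>;
}

class RedisCacher implements Cacher {
  private client: ReturnType<typeof createClient> | null = null;
  private initPromise: Promise<ReturnType<typeof createClient> | null> | null = null;

  private async init() {
    if (!env.CACHE_HOST || !env.CACHE_PORT || !env.CACHE_PASSWORD) {
      throw new Error("CACHE_HOST, CACHE_PORT and CACHE_PASSWORD must be set");
    }
    const client = createClient({
      socket: { host: env.CACHE_HOST, port: Number(env.CACHE_PORT) },
      password: env.CACHE_PASSWORD,
      database: Number(env.CACHE_DB ?? 0)
    });
    client.on("error", (error) => console.error("Cache client error:", error));
    await client.connect();
    this.client = client;
    return client;
  }

  // Same lazy, retry-friendly initialization as the Glide version
  private async getClient() {
    if (this.client) return this.client;
    if (!this.initPromise) {
      this.initPromise = this.init().catch((error) => {
        console.error("Failed to initialize cache client:", error);
        this.initPromise = null; // Reset so we can retry
        return null;
      });
    }
    return await this.initPromise;
  }

  async getCachedJson<T>(key: string): Promise<T | null> {
    try {
      const client = await this.getClient();
      if (!client) return null;
      const cached = await client.get(key);
      return cached ? (JSON.parse(cached) as T) : null;
    } catch (error) {
      console.error(`Cache get error for ${key}:`, error);
      return null;
    }
  }

  async setCachedJson<T>(key: string, value: T, ttl: number): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) return false;
      await client.set(key, JSON.stringify(value), { EX: ttl }); // TTL in seconds
      return true;
    } catch (error) {
      console.error(`Cache set error for ${key}:`, error);
      return false;
    }
  }

  async deleteCached(key: string): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) return false;
      await client.del(key);
      return true;
    } catch (error) {
      console.error(`Cache delete error for ${key}:`, error);
      return false;
    }
  }

  async isHealthy(): Promise<boolean> {
    try {
      const client = await this.getClient();
      if (!client) return false;
      await client.ping();
      return true;
    } catch {
      return false;
    }
  }

  async close(): Promise<void> {
    if (this.client) {
      await this.client.quit();
      this.client = null;
      this.initPromise = null;
    }
  }
}

// The CACHE_ENABLED toggle with FakeCacher from the Glide version can be reused here the same way
export const cacher: Cacher = new RedisCacher();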
And it worked seamlessly.
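For example, a server load function can try the cache first and fall back to the real data source on a miss. A quick sketch, where the route, cache key, Post type, and fetchPosts helper are all made up for illustration:
// src/routes/posts/+page.server.ts — illustrative only
import { cacher } from "$lib/server/redis"; // adjust to wherever the cacher is exported from

interface Post {
  id: number;
  title: string;
}

// Hypothetical stand-in for whatever expensive call is being cached
async function fetchPosts(): Promise<Post[]> {
  return [{ id: 1, title: "hello world" }];
}

export async function load() {
  const key = "posts:recent";

  // 1. Try the cache first
  const cached = await cacher.getCachedJson<Post[]>(key);
  if (cached) return { posts: cached };

  // 2. Cache miss: fetch from the source and store with a 5-minute TTL
  const posts = await fetchPosts();
  await cacher.setCachedJson(key, posts, 300);

  return { posts };
}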