Build a Git Repo API
So far the Cloudflare track has added stateful primitives one at a time. This tutorial pulls two of them together into something recognizable: a tiny GitHub. Cloudflare Artifacts stores the actual Git history (clone, push, pull all work against it), and a Durable Object per repo holds the metadata you don’t want inside the repo itself — description, topics, star count.
We’ll front the whole thing with Effect’s HttpApi so every route is schema-validated end-to-end and the integration test can call the worker through the same typed client a real consumer would use.
By the end you’ll have a Worker that lets a client create a repo, git clone against it, read combined info, star it, and update its description.
Declare the namespace
A Cloudflare Artifacts namespace is the top-level container for Git-compatible repos. Namespaces are implicit — there’s nothing to provision at deploy time, so the resource is a thin binding marker. Repos themselves are created at runtime through the binding.
Create src/Repos.ts:
```ts
import * as Cloudflare from "alchemy/Cloudflare";

export const Repos = Cloudflare.Artifacts("Repos");
```

That’s the whole declaration. The Worker provider will see this the moment we .bind(...) it.
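For a concrete picture of that binding, here is an illustrative sketch in plain TypeScript (the field values are invented; only the { type, name, namespace } shape comes from the description above):

```typescript
// Illustrative sketch of the binding record the Worker provider emits for
// the marker. Field values are made up; only the shape is specified above.
const reposBinding = {
  type: "artifacts",
  name: "Repos",      // the binding name the Worker code resolves
  namespace: "repos", // the implicit Artifacts namespace
} as const;
```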
Define the API
The schema and endpoint declarations live outside the Worker so the same file can be imported by clients and tests without pulling in any runtime code. Start with one endpoint — POST /repos:
```ts
import * as Schema from "effect/Schema";
import * as HttpApi from "effect/unstable/httpapi/HttpApi";
import * as HttpApiEndpoint from "effect/unstable/httpapi/HttpApiEndpoint";
import * as HttpApiGroup from "effect/unstable/httpapi/HttpApiGroup";

export class CreateRepoResponse extends Schema.Class<CreateRepoResponse>(
  "CreateRepoResponse",
)({
  name: Schema.String,
  remote: Schema.String,
  token: Schema.String,
  defaultBranch: Schema.String,
}) {}

export class RepoConflict extends Schema.TaggedErrorClass<RepoConflict>()(
  "RepoConflict",
  { message: Schema.String },
) {}

export const createRepo = HttpApiEndpoint.post("createRepo", "/repos", {
  payload: Schema.Struct({
    name: Schema.String,
    description: Schema.optional(Schema.String),
  }),
  success: CreateRepoResponse,
  error: RepoConflict,
});

export class ReposGroup extends HttpApiGroup.make("repos").add(createRepo) {}

export class RepoApi extends HttpApi.make("RepoApi").add(ReposGroup) {}
```

Schema.Class gives you a runtime-validated class with an inferred TypeScript type. Schema.TaggedErrorClass gives you a typed error that becomes a discriminated union member on the client.
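The discriminated-union point can be sketched in plain TypeScript (no effect imports; the shapes below are hand-written mirrors of the schemas, trimmed for illustration):

```typescript
// Plain-TypeScript sketch of the union a client ends up handling.
// The real types are inferred from the schemas; these mirror them.
type CreateRepoResponse = {
  name: string;
  remote: string;
  token: string;
  defaultBranch: string;
};
type RepoConflict = { _tag: "RepoConflict"; message: string };

type CreateResult = CreateRepoResponse | RepoConflict;

// Narrow on the _tag discriminator, the same field Effect.catchTag keys on.
function describe(result: CreateResult): string {
  if ("_tag" in result) {
    return `conflict: ${result.message}`;
  }
  return `created ${result.name} (default branch ${result.defaultBranch})`;
}
```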
Implement the Worker
Create src/Worker.ts. The handler group is constructed with HttpApiBuilder.group (pure — safe inside the Worker’s Init phase), and the fetch field is the result of layering the API into an HttpEffect:
```ts
import * as Cloudflare from "alchemy/Cloudflare";
import * as Effect from "effect/Effect";
import * as Layer from "effect/Layer";
import * as Path from "effect/Path";
import * as Etag from "effect/unstable/http/Etag";
import * as HttpPlatform from "effect/unstable/http/HttpPlatform";
import * as HttpRouter from "effect/unstable/http/HttpRouter";
import * as HttpApiBuilder from "effect/unstable/httpapi/HttpApiBuilder";
import { CreateRepoResponse, RepoApi, RepoConflict } from "./Api.ts";
import { Repos } from "./Repos.ts";

// Workers don't have a FileSystem, so HttpPlatform's file-response
// surface is stubbed. The repo API never serves files.
const HttpPlatformStub = Layer.succeed(HttpPlatform.HttpPlatform, {
  fileResponse: () => Effect.die("HttpPlatform.fileResponse not supported"),
  fileWebResponse: () =>
    Effect.die("HttpPlatform.fileWebResponse not supported"),
});

export default class Worker extends Cloudflare.Worker<Worker>()(
  "Api",
  {
    main: import.meta.path,
    compatibility: { flags: ["nodejs_compat"], date: "2026-03-17" },
  },
  Effect.gen(function* () {
    const artifacts = yield* Cloudflare.Artifacts.bind(Repos);

    const handlers = HttpApiBuilder.group(RepoApi, "repos", (h) =>
      h.handle("createRepo", ({ payload }) =>
        artifacts
          .create(payload.name, {
            description: payload.description,
            setDefaultBranch: "main",
          })
          .pipe(
            Effect.map(
              (c) =>
                new CreateRepoResponse({
                  name: c.name,
                  remote: c.remote,
                  token: c.token,
                  defaultBranch: c.defaultBranch,
                }),
            ),
            Effect.catchTag("ArtifactsError", (err) =>
              Effect.fail(new RepoConflict({ message: err.message })),
            ),
          ),
      ),
    );

    return {
      fetch: HttpApiBuilder.layer(RepoApi).pipe(
        Layer.provide(handlers),
        Layer.provide([Etag.layer, HttpPlatformStub, Path.layer]),
        HttpRouter.toHttpEffect,
      ),
    };
  }).pipe(Effect.provide(Cloudflare.ArtifactsBindingLive)),
) {}
```

The handler returns a CreateRepoResponse instance — Schema.Class expects an actual instance, not a plain object. Errors from artifacts.create (the only declared error path) are translated to RepoConflict; anything else dies and surfaces as a 500.
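That policy (translate the declared error, let everything else crash) can be sketched without any Effect machinery. The helper below is hypothetical, not part of the tutorial's code; it only illustrates the shape of the decision:

```typescript
// Hypothetical sketch of the handler's error policy: one declared error
// tag becomes a typed API error; anything else propagates as a defect
// (which the HTTP layer would surface as a generic 500).
class ApiConflict extends Error {
  readonly status = 409;
}

function translateDeclared<T>(run: () => T): T {
  try {
    return run();
  } catch (e) {
    const tag = (e as { _tag?: string })._tag;
    if (tag === "ArtifactsError") {
      // declared path: becomes the schema-typed conflict response
      throw new ApiConflict((e as Error).message);
    }
    throw e; // undeclared: dies, surfaces as 500
  }
}
```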
Set up the integration test
Same Test.make shape as Add a Durable Object, but this time the test calls the worker through HttpApiClient.make — the same RepoApi value drives the typed client:
import * as Cloudflare from "alchemy/Cloudflare";import * as Test from "alchemy/Test/Bun";import { expect } from "bun:test";import * as Effect from "effect/Effect";import * as HttpApiClient from "effect/unstable/httpapi/HttpApiClient";import { RepoApi } from "../src/Api.ts";import Stack from "../alchemy.run.ts";
const { test, beforeAll, afterAll, deploy, destroy } = Test.make({ providers: Cloudflare.providers(), state: Cloudflare.state(),});
const stack = beforeAll(deploy(Stack));afterAll.skipIf(!!process.env.NO_DESTROY)(destroy(Stack));
const repoName = `tutorial-${Date.now().toString(36)}`;Add the first assertion. client.repos.createRepo returns
Effect<CreateRepoResponse, RepoConflict | HttpClientError> — fields
are typed straight from the schema:
```ts
const repoName = `tutorial-${Date.now().toString(36)}`;

test(
  "repo lifecycle",
  Effect.gen(function* () {
    const { url } = yield* stack;
    const client = yield* HttpApiClient.make(RepoApi, { baseUrl: url });

    const created = yield* client.repos.createRepo({
      payload: { name: repoName, description: "tutorial repo" },
    });
    expect(created.name).toBe(repoName);
    expect(created.remote).toBeString();
    expect(created.token).toBeString();
  }),
  { timeout: 120_000 },
);
```

```sh
bun test
```

Alchemy deploys the Worker, the test posts to /repos, and you get back a remote and token. You could git clone against them right now.
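If you want to script that clone, one common convention is to splice a short-lived token into the remote URL as basic-auth credentials. The helper below is a hypothetical sketch — `cloneUrl` is a name invented here, and how Artifacts actually expects the token to be presented is an assumption, not something this tutorial's API guarantees:

```typescript
// Hypothetical helper: embed a short-lived token in a Git remote URL as
// basic-auth credentials. The auth scheme is an assumption — check how
// your Git server actually accepts tokens before relying on this.
function cloneUrl(remote: string, token: string): string {
  const u = new URL(remote);
  u.username = "token"; // placeholder username; many Git hosts ignore it
  u.password = token;
  return u.toString();
}

// e.g. cloneUrl("https://example.com/acme/repo.git", "s3cr3t")
//   → "https://token:s3cr3t@example.com/acme/repo.git"
```

A URL built this way can be passed straight to `git clone`, which is handy in CI where you mint a token, clone, and let it expire.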
Look up an existing repo
artifacts.get(name) returns an opaque RPC stub — useful for
createToken later, but its fields aren’t enumerable. To return
repo info as JSON, use artifacts.list(...) and find the entry by
name; every record in the list is a plain object.
Add a RepoInfo schema and a getRepo endpoint to the API:
```ts
import * as Schema from "effect/Schema";
import * as HttpApi from "effect/unstable/httpapi/HttpApi";
import * as HttpApiEndpoint from "effect/unstable/httpapi/HttpApiEndpoint";
import * as HttpApiGroup from "effect/unstable/httpapi/HttpApiGroup";

export class RepoInfo extends Schema.Class<RepoInfo>("RepoInfo")({
  id: Schema.String,
  name: Schema.String,
  description: Schema.NullOr(Schema.String),
  defaultBranch: Schema.String,
  remote: Schema.String,
  status: Schema.String,
  readOnly: Schema.Boolean,
  createdAt: Schema.String,
  updatedAt: Schema.String,
  lastPushAt: Schema.NullOr(Schema.String),
}) {}

export class CreateRepoResponse extends Schema.Class<CreateRepoResponse>(
  "CreateRepoResponse",
)({ /* … */ }) {}

export class RepoNotFound extends Schema.TaggedErrorClass<RepoNotFound>()(
  "RepoNotFound",
  { name: Schema.String },
) {}

export class RepoConflict extends Schema.TaggedErrorClass<RepoConflict>()(
  "RepoConflict",
  { message: Schema.String },
) {}

export const createRepo = HttpApiEndpoint.post("createRepo", "/repos", {
  /* … */
});

export const getRepo = HttpApiEndpoint.get("getRepo", "/repos/:name", {
  params: Schema.Struct({ name: Schema.String }),
  success: RepoInfo,
  error: RepoNotFound,
});

export class ReposGroup extends HttpApiGroup.make("repos")
  .add(createRepo)
  .add(getRepo) {}
```
```ts
export class RepoApi extends HttpApi.make("RepoApi").add(ReposGroup) {}
```

Implement the handler in the Worker, and factor out a findRepo helper since several handlers will need to look up a repo:
```ts
// src/Worker.ts — inside Effect.gen
const artifacts = yield* Cloudflare.Artifacts.bind(Repos);

const findRepo = (name: string) =>
  artifacts.list({ limit: 100 }).pipe(
    Effect.flatMap((res) => {
      const found = res.repos.find((r: { name: string }) => r.name === name);
      return found
        ? Effect.succeed(found)
        : Effect.fail(new RepoNotFound({ name }));
    }),
    Effect.catchTag("ArtifactsError", () =>
      Effect.fail(new RepoNotFound({ name })),
    ),
  );

const handlers = HttpApiBuilder.group(RepoApi, "repos", (h) =>
  h
    .handle("createRepo", ({ payload }) =>
      // …existing
    )
    .handle("getRepo", ({ params }) =>
      findRepo(params.name).pipe(
        Effect.map(
          (found) =>
            new RepoInfo({
              id: found.id,
              name: found.name,
              description: found.description ?? null,
              defaultBranch: found.defaultBranch,
              remote: found.remote,
              status: found.status,
              readOnly: found.readOnly,
              createdAt: found.createdAt,
              updatedAt: found.updatedAt,
              lastPushAt: found.lastPushAt ?? null,
            }),
        ),
      ),
    ),
);
```

Extend the test:
```ts
test(
  "repo lifecycle",
  Effect.gen(function* () {
    const { url } = yield* stack;
    const client = yield* HttpApiClient.make(RepoApi, { baseUrl: url });

    const created = yield* client.repos.createRepo({
      payload: { name: repoName, description: "tutorial repo" },
    });
    expect(created.name).toBe(repoName);
    expect(created.remote).toBeString();
    expect(created.token).toBeString();

    const info = yield* client.repos.getRepo({ params: { name: repoName } });
    expect(info.name).toBe(repoName);
    expect(info.defaultBranch).toBe("main");
    expect(info.description).toBe("tutorial repo");
  }),
  { timeout: 120_000 },
);
```

```sh
bun test
```

Mint a fresh clone token
The token returned by create expires. Clients that already know
a repo’s name should be able to ask for a new one without
recreating the repo. Add a cloneToken endpoint:
```ts
export class CloneToken extends Schema.Class<CloneToken>("CloneToken")({
  id: Schema.String,
  plaintext: Schema.String,
  scope: Schema.Literals(["read", "write"]),
  expiresAt: Schema.String,
}) {}

export const getRepo = HttpApiEndpoint.get("getRepo", "/repos/:name", {
  /* … */
});

export const cloneToken = HttpApiEndpoint.post(
  "cloneToken",
  "/repos/:name/clone-token",
  {
    params: Schema.Struct({ name: Schema.String }),
    payload: Schema.Struct({
      scope: Schema.optional(Schema.Literals(["read", "write"])),
      ttl: Schema.optional(Schema.Number),
    }),
    success: CloneToken,
    error: RepoNotFound,
  },
);

export class ReposGroup extends HttpApiGroup.make("repos")
  .add(createRepo)
  .add(getRepo)
  .add(cloneToken) {}
```

repo.createToken(scope, ttl) on the runtime stub returns
{ id, plaintext, scope, expiresAt } — wrap it in a CloneToken
instance:
```ts
.handle("getRepo", ({ params }) => /* … */)
.handle("cloneToken", ({ params, payload }) =>
  artifacts.get(params.name).pipe(
    Effect.flatMap((handle) =>
      handle.createToken(payload.scope ?? "read", payload.ttl ?? 3600),
    ),
    Effect.map(
      (t) =>
        new CloneToken({
          id: t.id,
          plaintext: t.plaintext,
          scope: t.scope as "read" | "write",
          expiresAt: t.expiresAt,
        }),
    ),
    Effect.catchTag("ArtifactsError", () =>
      Effect.fail(new RepoNotFound({ name: params.name })),
    ),
  ),
),
```

```ts
test(
  "repo lifecycle",
  Effect.gen(function* () {
    const { url } = yield* stack;
    const client = yield* HttpApiClient.make(RepoApi, { baseUrl: url });

    const created = yield* client.repos.createRepo({
      payload: { name: repoName, description: "tutorial repo" },
    });
    expect(created.name).toBe(repoName);
    expect(created.remote).toBeString();
    expect(created.token).toBeString();

    const info = yield* client.repos.getRepo({ params: { name: repoName } });
    expect(info.name).toBe(repoName);
    expect(info.defaultBranch).toBe("main");
    expect(info.description).toBe("tutorial repo");

    const token = yield* client.repos.cloneToken({
      params: { name: repoName },
      payload: { scope: "read", ttl: 600 },
    });
    expect(token.plaintext).toBeString();
    expect(token.scope).toBe("read");
  }),
  { timeout: 120_000 },
);
```

That covers the Git half. Artifacts is now storing history and handing out tokens. Next we’ll add the metadata that lives around the repo.
Add the Repo Durable Object
Artifacts owns commits, refs, and the clone protocol. It does not
store the things a GitHub-like API needs alongside that —
descriptions you can rename, topics, stars. The Repo Durable
Object represents one repository: a single addressable instance
per repo name with its own transactional storage.
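The addressing model can be sketched in plain TypeScript with no Cloudflare APIs: a registry that deterministically maps a name to exactly one stateful instance. This toy model (all names invented here) only illustrates the "one instance per repo name" idea — the real runtime does this routing globally and persists the state:

```typescript
// Toy model of per-name instance routing (not the real DO runtime):
// the same name always resolves to the same stateful object, so all
// writes for one repo are serialized through one instance.
class RepoStub {
  stars = 0;
  star(): number {
    return ++this.stars;
  }
}

const instances = new Map<string, RepoStub>();

function getByName(name: string): RepoStub {
  let inst = instances.get(name);
  if (inst === undefined) {
    inst = new RepoStub();
    instances.set(name, inst);
  }
  return inst;
}
```

Two lookups for the same name share state, while different names never touch each other's instance — the property the Durable Object gives you across a network rather than within one process.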
Start with the smallest possible DO — empty public API, no state:
```ts
import * as Cloudflare from "alchemy/Cloudflare";
import * as Effect from "effect/Effect";

export default class Repo extends Cloudflare.DurableObjectNamespace<Repo>()(
  "Repo",
  Effect.gen(function* () {
    return Effect.gen(function* () {
      return {};
    });
  }),
) {}
```

Persist the metadata
Each DO instance has its own SQLite-backed key/value storage. Pull the current metadata out of storage in the inner init so it survives restarts and hibernation:
```ts
export type Meta = {
  description: string;
  topics: string[];
  stars: number;
  createdAt: number;
};

export default class Repo extends Cloudflare.DurableObjectNamespace<Repo>()(
  "Repo",
  Effect.gen(function* () {
    return Effect.gen(function* () {
      const state = yield* Cloudflare.DurableObjectState;
      let meta = (yield* state.storage.get<Meta>("meta")) ?? null;
      return {};
    });
  }),
) {}
```

meta is null until the repo is initialized.
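Stripped of the Effect and Cloudflare machinery, this is a write-through cache: load once from durable storage into a local variable, then keep the two in sync on every mutation. A minimal stand-alone sketch of that pattern, using a Map in place of the DO's storage (the names here are invented for illustration):

```typescript
type RepoMeta = { description: string; topics: string[]; stars: number };

// `backing` stands in for the DO's persistent storage in this sketch.
function makeMetaStore(backing: Map<string, RepoMeta>) {
  // Read once at startup; `meta` stays null until init() runs.
  let meta: RepoMeta | null = backing.get("meta") ?? null;
  return {
    init(description: string): RepoMeta {
      if (meta === null) {
        meta = { description, topics: [], stars: 0 };
        backing.set("meta", meta); // write-through: persist every mutation
      }
      return meta;
    },
    get(): RepoMeta | null {
      return meta;
    },
  };
}
```

A restart is modeled by calling makeMetaStore again on the same backing map — the metadata survives because every write also went to storage, which is exactly why the real DO reads storage in its inner init.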
Add init and get
Any function returned from the inner Effect that produces an
Effect becomes a typed RPC method. Add one to seed the metadata
the first time a repo is created, and one to read it:
```ts
return Effect.gen(function* () {
  const state = yield* Cloudflare.DurableObjectState;
  let meta = (yield* state.storage.get<Meta>("meta")) ?? null;

  const ensure = Effect.gen(function* () {
    if (meta === null) {
      return yield* Effect.fail(new Error("repo not initialized"));
    }
    return meta;
  });

  return {
    init: (description: string) =>
      Effect.gen(function* () {
        if (meta !== null) return meta;
        meta = { description, topics: [], stars: 0, createdAt: Date.now() };
        yield* state.storage.put("meta", meta);
        return meta;
      }),
    get: () => ensure,
  };
});
```

Wire the DO into the API
Add a Metadata schema to Api.ts and extend RepoInfo with a
nullable metadata field:
```ts
export class Metadata extends Schema.Class<Metadata>("Metadata")({
  description: Schema.String,
  topics: Schema.Array(Schema.String),
  stars: Schema.Number,
  createdAt: Schema.Number,
}) {}

export class RepoInfo extends Schema.Class<RepoInfo>("RepoInfo")({
  id: Schema.String,
  name: Schema.String,
  description: Schema.NullOr(Schema.String),
  defaultBranch: Schema.String,
  remote: Schema.String,
  status: Schema.String,
  readOnly: Schema.Boolean,
  createdAt: Schema.String,
  updatedAt: Schema.String,
  lastPushAt: Schema.NullOr(Schema.String),
  metadata: Schema.NullOr(Metadata),
}) {}
```

Yield the Repo class in the Worker’s init phase. The handle is a
DO namespace — repos.getByName(name) returns a typed RPC stub for
that repo’s instance:
```ts
import Repo from "./Repo.ts";
import {
  CloneToken,
  CreateRepoResponse,
  Metadata,
  RepoApi,
  RepoConflict,
  RepoInfo,
  RepoNotFound,
} from "./Api.ts";

Effect.gen(function* () {
  const artifacts = yield* Cloudflare.Artifacts.bind(Repos);
  const repos = yield* Repo;
  // …findRepo helper
```

Now extend the handlers — createRepo calls init after the repo is created, getRepo reads metadata and merges it:

```ts
.handle("createRepo", ({ payload }) =>
  artifacts
    .create(payload.name, {
      description: payload.description,
      setDefaultBranch: "main",
    })
    .pipe(
      Effect.tap(() =>
        repos
          .getByName(payload.name)
          .init(payload.description ?? "")
          .pipe(Effect.orDie),
      ),
      Effect.map(
        (c) =>
          new CreateRepoResponse({
            name: c.name,
            remote: c.remote,
            token: c.token,
            defaultBranch: c.defaultBranch,
          }),
      ),
      Effect.catchTag("ArtifactsError", (err) =>
        Effect.fail(new RepoConflict({ message: err.message })),
      ),
    ),
)
.handle("getRepo", ({ params }) =>
  findRepo(params.name).pipe(
    Effect.flatMap((found) =>
      repos
        .getByName(params.name)
        .get()
        .pipe(
          Effect.catch(() => Effect.succeed(null)),
          Effect.map((m) => ({ found, meta: m })),
        ),
    ),
    Effect.map(
      ({ found, meta }) =>
        new RepoInfo({
          id: found.id,
          name: found.name,
          description: found.description ?? null,
          defaultBranch: found.defaultBranch,
          remote: found.remote,
          status: found.status,
          readOnly: found.readOnly,
          createdAt: found.createdAt,
          updatedAt: found.updatedAt,
          lastPushAt: found.lastPushAt ?? null,
          metadata: meta ? new Metadata(meta) : null,
        }),
    ),
  ),
),
```

The DO’s get() fails with a plain Error when the repo wasn’t
initialized — recover with Effect.catch so the route still
returns the Artifacts info even if the DO has no metadata yet.
Update the test — info.metadata is now typed as
Metadata | null:
```ts
const info = yield* client.repos.getRepo({ params: { name: repoName } });
expect(info.name).toBe(repoName);
expect(info.defaultBranch).toBe("main");
expect(info.description).toBe("tutorial repo");
expect(info.metadata?.description).toBe("tutorial repo");
expect(info.metadata?.stars).toBe(0);
```

```sh
bun test
```

Update description and topics
Add an update method to the DO:
```ts
return {
  init: (description: string) =>
    Effect.gen(function* () {
      if (meta !== null) return meta;
      meta = { description, topics: [], stars: 0, createdAt: Date.now() };
      yield* state.storage.put("meta", meta);
      return meta;
    }),
  get: () => ensure,
  update: (patch: Partial<Pick<Meta, "description" | "topics">>) =>
    Effect.gen(function* () {
      const current = yield* ensure;
      meta = { ...current, ...patch };
      yield* state.storage.put("meta", meta);
      return meta;
    }),
};
```

Add an updateRepo endpoint to the API:

```ts
export const updateRepo = HttpApiEndpoint.patch(
  "updateRepo",
  "/repos/:name",
  {
    params: Schema.Struct({ name: Schema.String }),
    payload: Schema.Struct({
      description: Schema.optional(Schema.String),
      topics: Schema.optional(Schema.Array(Schema.String)),
    }),
    success: Metadata,
    error: RepoNotFound,
  },
);

export class ReposGroup extends HttpApiGroup.make("repos")
  .add(createRepo)
  .add(getRepo)
  .add(cloneToken)
  .add(updateRepo) {}
```

And the handler:

```ts
.handle("cloneToken", ({ params, payload }) => /* … */)
.handle("updateRepo", ({ params, payload }) =>
  findRepo(params.name).pipe(
    Effect.flatMap(() =>
      repos
        .getByName(params.name)
        .update({
          description: payload.description,
          topics: payload.topics ? [...payload.topics] : undefined,
        })
        .pipe(Effect.orDie),
    ),
    Effect.map((m) => new Metadata(m)),
  ),
),
```

Test it:
```ts
expect(token.scope).toBe("read");

const updated = yield* client.repos.updateRepo({
  params: { name: repoName },
  payload: { description: "now with stars", topics: ["demo", "alchemy"] },
});
expect(updated.description).toBe("now with stars");
expect(updated.topics).toEqual(["demo", "alchemy"]);
```

Star a repo
Same pattern — add star to the DO, starRepo to the API, and
the handler:
```ts
return {
  // …
  update: (patch) => /* … */,
  star: () =>
    Effect.gen(function* () {
      const current = yield* ensure;
      meta = { ...current, stars: current.stars + 1 };
      yield* state.storage.put("meta", meta);
      return meta;
    }),
};
```

```ts
export const starRepo = HttpApiEndpoint.post(
  "starRepo",
  "/repos/:name/star",
  {
    params: Schema.Struct({ name: Schema.String }),
    success: Metadata,
    error: RepoNotFound,
  },
);

export class ReposGroup extends HttpApiGroup.make("repos")
  .add(createRepo)
  .add(getRepo)
  .add(cloneToken)
  .add(updateRepo)
  .add(starRepo) {}
```

```ts
.handle("updateRepo", ({ params, payload }) => /* … */)
.handle("starRepo", ({ params }) =>
  findRepo(params.name).pipe(
    Effect.flatMap(() =>
      repos.getByName(params.name).star().pipe(Effect.orDie),
    ),
    Effect.map((m) => new Metadata(m)),
  ),
),
```

```ts
expect(updated.topics).toEqual(["demo", "alchemy"]);

const starred = yield* client.repos.starRepo({
  params: { name: repoName },
});
expect(starred.stars).toBe(1);
```

Run the full suite
```sh
bun test
```

Each call round-trips through Artifacts, the Durable Object, or
both — created via artifacts.create, looked up via artifacts.list,
mutated via the DO’s init/update/star RPC methods. The whole
flow is type-checked end-to-end through the same RepoApi schema,
on both the server and the client.
Why this shape
Artifacts and the per-repo DO each do one thing well:
- Artifacts is the Git server — it owns commits, refs, and the clone/push protocol. Tokens are scoped and short-lived, so you mint them on demand instead of handing out long-lived secrets.
- The DO is the source of truth for everything that lives around the repo. Each repo gets its own instance, so a hot repo’s writes never contend with another’s.
- HttpApi ties the two together. The same RepoApi value drives the Worker, the integration test, and any external client — so contract drift between server and consumer is caught at compile time, not in production.
Combine more primitives the same way: a Workflow that runs CI on push, a Container that builds and publishes artifacts, an AI Gateway that summarizes diffs. The Worker stays a thin handler; each primitive owns its own state.